
Eighth Grade Access to Algebra I: A Study of Virtual Algebra

OMB Clearance Request

Supporting Statement
Part B

December 2007

Prepared For:

Institute of Education Sciences

United States Department of Education

Contract No. ED‑06‑CO‑0025

Prepared By:

Regional Educational Laboratory—Northeast and the Islands

55 Chapel Street

Newton, MA 02458-1060

Supporting Statement Part B

Request for Clearance of Information Collection Forms for

Eighth Grade Access to Algebra I: A Study of Virtual Algebra



Overview of study design

This study investigates the use of an online algebra course to expand access to Algebra I for 8th graders who are ready to take the course but often cannot, because they attend schools, frequently small and in rural locations, that do not offer the course until high school. The design is a randomized controlled trial with randomization at the school level. Schools that do not currently offer a full section of Algebra I to 8th graders will be randomly assigned either to receive a virtual algebra course (at no cost to them) or to continue without one. We will compare outcomes at the end of 8th grade, end of 9th grade, and beginning of 10th grade for students who attended schools that received virtual algebra to those of students who attended control schools.

Random Assignment Procedure

The study is a randomized controlled trial in which we will randomly assign schools to treatment (virtual algebra course) or control (no virtual algebra course). We will take the following steps to establish the treatment and control groups:

  1. We will identify all of the schools in Maine that comprise the target population.

  2. We will stratify schools by two blocking variables. First, we will use the Common Core Data (CCD) indicator for locale.1 Second, we will block schools by type of math curricula used (e.g., traditional text vs. nontraditional approach such as “integrated” math). These blocking variables will allow us to reduce variability across schools within blocks and will also guide decisions about sample adjustment if we do lose schools to attrition during the year of implementation.

  3. We will employ a recruitment strategy in which we invite schools within blocks to participate in the study until we achieve the number of schools required by our power analyses.

  4. For students in schools that agree to participate, we will use scores on the spring 2008 7th-grade Maine Educational Assessment (MEA) in mathematics as a pretest, to be used as a covariate in our analyses.2

  5. All schools will identify the pool of students—approximately 25%3—they deem to be “ready for Algebra I.” We will ask schools to record the mechanisms they use to decide eligibility. We will not set requirements for the number of students per school that must be considered eligible.

  6. We will then randomly assign schools to treatment or control within blocks defined by the blocking variables (locale, curriculum type). Within each block, our random assignment system will assign schools to either the treatment or the business-as-usual group. (A minimal sketch of this within-block assignment appears after this list.)

  7. Schools randomized into the treatment group will offer eligible students the option to take the course. Schools randomized to the control group will conduct business as usual (e.g., all students take 8th-grade “general” math).

  8. Schools that are randomized to the control group will be offered the opportunity to provide virtual algebra to their incoming 8th graders at no cost to them during the following academic year (2009–2010).
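
To make the within-block assignment in steps 2 and 6 concrete, the sketch below shows one straightforward way to randomize schools to condition inside each block. This is a minimal illustration rather than the study’s actual random assignment system; the block labels and school names are hypothetical.

```python
"""Illustrative sketch of random assignment of schools within blocks.

Not the study's actual randomization system; blocks and school names
are hypothetical.
"""
import random


def assign_within_blocks(schools_by_block, seed=None):
    """Randomly split each block's schools into treatment and control halves."""
    rng = random.Random(seed)
    assignments = {}
    for block, schools in schools_by_block.items():
        shuffled = list(schools)
        rng.shuffle(shuffled)
        half = len(shuffled) // 2
        # First half of the shuffled list becomes treatment, the rest control,
        # yielding (near-)equal group sizes within each block.
        for school in shuffled[:half]:
            assignments[school] = "treatment (virtual algebra)"
        for school in shuffled[half:]:
            assignments[school] = "control (business as usual)"
    return assignments


if __name__ == "__main__":
    # Hypothetical blocks defined by locale and general-math curriculum type.
    blocks = {
        ("rural", "traditional"): ["School A", "School B", "School C", "School D"],
        ("rural", "nontraditional"): ["School E", "School F"],
        ("other", "traditional"): ["School G", "School H"],
    }
    print(assign_within_blocks(blocks, seed=2008))
```

Because each block contributes (nearly) equal numbers of schools to the treatment and control groups, the blocking variables absorb between-block variability without unbalancing the design.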

Our power calculations (see item B2) suggest 60 schools are needed to achieve adequate statistical power. The numbers shown in Table 1 assume that each school will have approximately 60 students in 8th grade and that approximately 25% of them (15 students) are considered “eligible for Algebra I.” Based on these assumptions, we will include approximately 450 students in each condition for comparison of “eligibles” in treatment versus control schools, approximately 1,350 students in each condition for comparison of “noneligibles” in treatment vs. control schools, and approximately 1,800 students in each condition for comparison of ALL students in treatment schools to ALL students in control schools (see Part B, item B1 for a discussion of the planned impact analyses).

Table 1. School and Student Counts for Virtual Algebra Study

Condition                       Number of Schools   Number of “Eligibles”    Number of “Noneligibles”   Total Number of Students
                                                    (est. 15 per school)     (est. 45 per school)       (est. 60 per school)

Virtual algebra (treatment)     30                  450                      1,350                      1,800
No virtual algebra (control)    30                  450                      1,350                      1,800
Total                           60                  900                      2,700                      3,600



Randomization at the school level, though it requires a larger number of schools to achieve adequate statistical power than randomization at the classroom or student level, affords this study several benefits. Primarily, it avoids creating a control condition that does not represent the true counterfactual: adding a section of virtual algebra (the treatment) within a school would cause a redistribution of students and teachers that would change the composition of the control classrooms. With the random assignment of schools to condition, the composition of 8th-grade math classes in control schools is unchanged. Also, a focus on schools that do not currently offer the course avoids a situation in which students might receive an online algebra course instead of a face-to-face algebra course that they would otherwise have received.

Longitudinal Study Design

Because the goals of the study include determining whether taking virtual algebra in 8th grade makes a difference in achievement and knowledge at the end of 8th grade as well as in 9th- and 10th-grade course-taking patterns, we plan to track the students who attend participating schools longitudinally. The premise of “pushing down” Algebra I to 8th grade is that it prepares students for more rigorous course-taking in math, and even science, through high school, so that they are ultimately better prepared to succeed in college-level courses. Therefore, of critical interest from a policy perspective is the extent to which students who take virtual algebra in 8th grade enter high school with an advantage that lasts through 9th grade and beyond. To address these questions, we plan to work with the schools, districts, and staff at Maine’s SEA to obtain follow-up data on all of the students in the study. The follow-up data include transcript data for the students in the study, covering math and science courses taken by the end of 9th grade (spring 2010) and enrollments at the beginning of 10th grade (fall 2010). In addition, in fall 2010, all 10th graders in Maine will be required to take (without a fee) the Preliminary Scholastic Aptitude Test (PSAT), and we will collect the math section scores for all students in the study. More information about these outcome measures is provided in item B4, below. The longitudinal design allows us to test the impact of virtual algebra over time on outcomes in high school and facilitates our ability to address a fundamental part of the policy questions that frame this investigation.

Description of the Treatment Condition

Online Teacher Training

The intervention consists of the online course taught by a highly qualified mathematics teacher who is experienced in leading online courses and trained to deliver the specific content of the Class.com Algebra I course.

The online teachers will be recruited by Class.com, the vendor providing the online Algebra I course, from its existing network of mathematics teachers who are experienced leaders of online courses. Because Maine does not require that teachers of online courses be certified in or resident in Maine, Class.com will be able to draw from its national network of teachers. All teachers delivering the course for this study will be certified mathematics teachers who are highly qualified as defined by the provisions of NCLB. Online instructors who have not previously taught the Class.com Algebra I course will be trained by Class.com using its usual training methods, which combine face-to-face and online training experiences.

Trained observers will observe the training of the online teachers (both the face-to-face and online components) to ensure that the training is of high quality. These observations include the completion of detailed protocols specially designed to guide and focus the observations and to capture the key aspects of the content and instruction of the training. Among the fundamental aspects that the protocols and observations will address are (a) features associated with high-quality professional development and adult education, drawn in part from the American Institutes for Research (AIR)-led evaluation of the Eisenhower Professional Development Program and including participant engagement and opportunities for active learning, and (b) the inclusion of conceptually based mathematical content that incorporates such research-based components as connections, contexts, alternative approaches, and the use of multiple representations in ways that are expected to enhance the online delivery of the intervention.

Assignment of “Eligible” Students to Virtual Algebra

To form the virtual algebra classes, schools that are assigned to the treatment condition will offer the course to the approximately 25% of 8th-grade students they identified (prior to random assignment) as “eligible,” that is, ready to take Algebra I. The virtual algebra section will be an additional section in the 8th-grade mathematics programs at the treatment schools. That is, the regular 8th-grade math classes will be taught by the regular math teacher(s) in each school.

The diversion of some students to the virtual algebra course will have an impact on the structure of the treatment schools’ 8th-grade math programs. We do not anticipate a dramatic impact on staffing in these schools, however, and we expect the change to play out in a variety of ways. Some teachers might have one fewer class section than usual, which schools and teachers may view as an incentive (e.g., a free period for planning). In other cases, teachers may have the same number of sections as usual but with smaller class sizes. It is also possible that the 8th-grade math teacher could be displaced during the implementation and evaluation year if the school decides to place its “eligible” students in virtual algebra and its remaining 8th-grade students in a combined class with 7th graders. We will strongly discourage this arrangement, however, and may elect to drop from the study any schools that adopt it. The overarching goal is to ensure that all 8th-grade students in the study receive an 8th-grade math course that is at least as good as their school’s usual offering in the absence of the study. We will work with each of the approximately 30 treatment schools to understand how the introduction of the virtual algebra class and the diversion of students into it affect the rest of each school’s 8th-grade math program.

To implement the course, schools can opt to have students log in and do their coursework any time during the school day, or as a group, during a set class period. Class.com courses are typically used as supplemental programs (both enrichment and remediation) to students’ overall school program, into which students log in at specified periods during the school day—though they are free to log in at other times as well. We anticipate that schools will largely choose to schedule a daily class period for the virtual algebra class.

We will also work with schools to ensure that a monitor is available during these designated class periods. This individual will not be required to provide any instruction. His or her role is to resolve any problems that arise (such as a dropped Internet connection) and to ensure that students maintain time-on-task and work productively during their online course time. The monitor can be any member of the school staff (or a volunteer) except the regular 8th-grade math teacher. Again, we do not expect a major effect on staffing, but we will work with each school to understand its plans for implementing the virtual algebra course, and if staffing needs must be addressed, we will ensure that the schools incur no additional costs because of the study.

Programmatic Attributes of the Intervention

The online algebra course to be implemented and evaluated in this study is multi-dimensional, consisting of at least six important programmatic attributes that make it different from “business as usual” math instruction in control schools. These attributes and the ways in which they are likely to be different in treatment vs. control schools are shown in Table 2.

Table 2. Programmatic Attributes of Math Instruction in Treatment Schools vs. Control Schools

Mode of Delivery
  Treatment: Online
  Control: Standard face-to-face

Content
  Treatment: Algebra
  Control: Integrated eighth-grade math with some pre-algebra

Teacher Qualifications
  Treatment: Online teachers are required to be certified and trained in both the content and the delivery mode
  Control: Control teachers will have varied certification status

Staffing Levels
  Treatment: At least three school staff will be involved in 8th-grade mathematics (the online teacher, the classroom monitor, and the regular 8th-grade math teacher)
  Control: As few as one professional educator will be involved in 8th-grade mathematics

Class Size
  Treatment: Smaller than 8th-grade math classes in control schools
  Control: Larger than 8th-grade math classes in treatment schools

Ability Grouping
  Treatment: Separates students into ability groups
  Control: Does not (necessarily) separate students into ability groups



The primary goal for the study is to generate strong evidence about the impact of online algebra for eighth graders who are ready for the course. In describing the intervention and interpreting the findings, it is important to consider the multi-dimensional nature of the intervention, particularly in terms of the generalizability of findings.

Description of the Control Condition

The control schools will implement their usual 8th-grade math classes. This “business as usual” condition will vary across schools. Some math programs will follow an “integrated” approach in which some algebra is taught to all students. Others will use a more traditional textbook for “general 8th-grade math,” with some accelerated material (including Algebra I materials) available for higher performing students.

We will conduct classroom observations in control schools and collect course materials including syllabi and exams to get, among other measures, a sense of how much algebra (i.e., algebraic concepts) is taught to 8th graders (both the “eligibles” and “noneligibles”) in the control schools. It may be the case that “eligibles” even in control schools receive more algebra in 8th-grade math than “noneligibles” in control schools. We will primarily use this information descriptively to contextualize the findings of the study.



B1. Respondent Universe, Sampling Variables, and Approach

This study involves random assignment of schools into study conditions, but not random selection of schools into the sample. It relies on a purposive, volunteer sample of districts, schools, teachers, and students. This evaluation focuses on high internal validity, and we acknowledge that without random sampling, external validity is limited. Because this research project is driven by regional priorities, the need to distribute research activities across the REL-NEI states, the cost concerns of conducting experimental field trials across many sites, and the intentional focus on schools in Maine, a probability sample is not feasible. Should we find that substantially more than 60 schools are willing to participate in this study, we will use simple random sampling (SRS) to determine which 60 schools will be included. We will also use SRS to determine the subsample of 20 schools (10 treatment, 10 control) that will be targeted for classroom observations.

Target Population for the Study

The target population of schools consists of schools in Maine that serve students in grade 8 and below, but not grade 9 and above, and that do not offer a full section of Algebra I. (We exclude schools that serve grades above 8 because we assume that 8th-grade students in those schools will be able to take the 9th- or 10th-grade Algebra I class within the same building.) The target population of students is 8th graders who attend these schools and are considered to be “ready for algebra.”

By “ready for algebra” we mean those students who are considered by their schools (teachers, principals), their parents, and themselves to have sufficient mastery of pre-algebra concepts to take algebra I. Schools currently make decisions about which students are “ready” on the basis of teacher perceptions of preparedness, grades in prior math classes up through 7th-grade math, and, more rarely, scores on assessments such as algebra readiness tests (e.g., Iowa Algebra Aptitude Test, Orleans-Hanna Algebra Prognosis Test).

We have decided to focus this study on Maine because of the high degree of interest in virtual courses for students, the low overall enrollment in Algebra I among 8th graders across the state, and the strong state technology initiative that can support the infrastructure needed in schools to offer an online course. Eighth-grade students in Maine currently use laptop computers in the course of their daily instruction, and engaging with content delivered online is a familiar part of teaching and learning. Because this infrastructure is already in place, implementation of the study will be facilitated, and we anticipate a shorter start-up time than in states with more limited technology capacity. However, we will need to take this contextual factor into account when interpreting the findings, as it may affect the generalizability of the results. In locations where technology problems are more likely to occur, especially at start-up, educators should not expect the results we see in Maine until they achieve similar levels of technology integration.



B2. Statistical Methods for Sample Selection and Degree of Accuracy Needed

In this section, we present power analyses for the virtual algebra study. Our power calculations employ Bloom’s (2005) equation number 8 to establish the Minimum Detectable Effect Size (MDES). At this point, there is no good benchmark effect size target to use because of the lack of previous rigorous research—a problem that provides strong justification for conducting this research but little guidance on the size of treatment effect to anticipate. We have established a target effect size (ES) for the study of between 0.20 and 0.25, a policy-relevant range that we believe is conservative given that the treatment is designed to have a direct effect on students’ math knowledge and skills.4

The minimum detectable effect size is

\mathrm{MDES} = M_{J-K-2}\,\sqrt{\frac{\rho\,(1 - R_2^2)}{P(1-P)J} + \frac{(1-\rho)(1 - R_1^2)}{P(1-P)Jn}}

where

P = the proportion of the sample schools allocated to the treatment (assumed to be 0.5);

J = the total number of schools in the study sample;

K = the number of cluster-level covariates used;

n = the number of students per school at posttest and follow-up (assumed to be 12);

ρ = the intra-class correlation;

R_2^2 = the proportion of the random variance between schools that is reduced by the covariate (school-level explanatory power);

R_1^2 = the proportion of the student-level variance component explained by the student-level pretests; and

M_{J-K-2} = the multiplier that translates the standard error into a minimum detectable effect estimate. It is equal to the t critical value for α, the significance level of the intended statistical test, plus the t critical value for 1 − β, the likelihood of detecting significant effects given a true effect of a particular size (i.e., the power of the test).

Our calculations are based on the following assumptions:

  1. Statistical power: 80%.

  2. Statistical significance level: alpha of .025 for a two-tailed test, using the Bonferroni adjustment for two outcome domains (achievement and course-taking).5

  3. Number of students per school: We assume that each school serves an average of 60 8th graders, and of these, approximately 25% (15) will be considered “eligible” and will agree to participate in the virtual algebra course. Power calculations were conducted assuming 80% response rates (i.e., approximately 12 students per school at the posttest).

  4. Proportion of students in treatment condition: 50% under a balanced sample allocation.

  5. Covariate adjustment:

    1. School-level: The correlation of school-level average scores on Maine’s 7th- and 8th-grade math assessment from 2005–2006 to 2006–2007 was 0.81. Based on that correlation, we assume that 66% of the school-level variance in the outcome will be explained by the school-level average 7th grade assessment score from the previous year (i.e., R2 = 0.66).6

    2. Student-level: To further improve the precision of our impact estimates, we will use individual student-level scores on the 7th-grade state math assessment (MEA) as baseline pretest covariates in our analysis. We assume that this baseline measure of student achievement will explain 50% of the student-level variance in outcome measures (i.e., R2 = 0.50).

  6. Intraclass correlation (ICC; ρ): In studies of math interventions, the school-level ICC for mathematics varies between 0.03 and 0.24 (see Schochet, 2005), a wide range for which far more specificity is needed to establish assumptions with confidence. Recently, however, Hedges & Hedberg (2007) conducted an empirical analysis of ICCs using a nationally representative dataset from the Longitudinal Study of American Youth (LSAY). For mathematics achievement outcomes in grade 8 in the full population, with covariate adjustment by both achievement pretest scores and demographic variables (i.e., the “residualized conditional model”), they report an ICC of 0.106. In a follow-up communication with the first author, we learned that the unadjusted (unconditional) ICC for grade 8 mathematics for rural schools in the Northeast region of the country, specifically, is 0.12. Therefore, we assume an ICC value of 0.12 for our power calculations.

  7. Blocking: Blocking, that is, the random assignment of schools within homogeneous groups, can reduce the standard errors and MDES of estimated program effects by reducing the unexplained variance that must be accounted for by the experimental comparison. Our current research design (and our calculation of MDES) assumes that we will block by the following stratification variables: locale (as defined by CCD indicators: “rural” vs. “other”), poverty status (based on school percentage FSLP), and the mathematics curriculum used for general 8th-grade math (traditional vs. nontraditional7). That is, we will conduct random assignment of schools to treatment or control status separately within each block, resulting in an equal number of program and control schools within each block. The decision about the number of blocking variables is a tradeoff between increased explanatory power and reduced degrees of freedom, which reduces statistical power and increases the MDES.

Prior to and during recruitment of schools for the study, we will collect additional information that may guide the use of additional blocking characteristics. One potential blocking characteristic of particular interest is the “high school attendance area” or catchment area. If a large number of the K-8 or middle schools that are eligible for the study are clustered within high school attendance areas (i.e. more than two K-8 or middle schools per feeder high school), we could match schools by feeder high school and randomize by pairs. This pairwise matching design would improve study power and would ensure that balanced numbers of students who went to treatment and control middle schools attend the same high schools. However, it is possible that many of the schools that will be eligible for the study are the only middle or K-8 school in very small districts that feed into one high school. We will gather information about the high school feeder patterns of each of the study-eligible K-8 and middle schools prior to and during recruitment. Once we have the information, we will revisit the blocking and randomization plan with IES and ATS at least one month prior to the actual random assignment of schools to condition.

Based on this analysis and using the above assumptions, the minimum detectable effect size for a study that includes 60 schools (30 per condition) is 0.20, a policy-relevant value that is within the target range for the study.

We conducted additional power analyses to explore the effect on the MDES when the assumed sample sizes at the school and student levels are smaller. Table 3, below, shows the MDEs for calculations that hold all of the assumptions listed above constant, except for the number of schools and students. The MDEs range from 0.20 standard deviations to 0.34 standard deviations.

Table 3. MDEs for Varying Sample Size Assumptions

Number of Schools    Number of Students per School (No. of “Eligibles”)
                     60 (12)      50 (10)      35 (7)

60                   0.20         0.21         0.24
40                   0.25         0.27         0.29
30                   0.30         0.31         0.34
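
To make the MDES calculation concrete, the sketch below applies Bloom’s (2005) formula under the assumptions listed in item B2. It is an illustration rather than the study’s own computation: the number of school-level covariates (K = 3) and the exact handling of the critical value and degrees of freedom are assumptions made here, and under these choices the sketch reproduces the grid in Table 3 to within rounding.

```python
"""Illustrative MDES calculation based on Bloom's (2005) formula.

Not the study's own code; K and the handling of the critical value are
assumptions made for illustration.
"""
from math import sqrt

from scipy.stats import t


def mdes(J, n, rho=0.12, r2_school=0.66, r2_student=0.50,
         P=0.5, alpha=0.025, power=0.80, K=3):
    """Minimum detectable effect size for a school-randomized design."""
    df = J - K - 2
    # Multiplier M: t critical value leaving `alpha` in the upper tail plus
    # the t value corresponding to the desired power.
    multiplier = t.ppf(1 - alpha, df) + t.ppf(power, df)
    between = rho * (1 - r2_school) / (P * (1 - P) * J)            # school-level term
    within = (1 - rho) * (1 - r2_student) / (P * (1 - P) * J * n)  # student-level term
    return multiplier * sqrt(between + within)


if __name__ == "__main__":
    # Approximately reproduces the grid in Table 3 above.
    for schools in (60, 40, 30):
        row = [f"{mdes(schools, students):.2f}" for students in (12, 10, 7)]
        print(schools, row)
```

Varying J and n in the call shows directly how the MDES grows as the number of schools or the number of students per school shrinks.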





B3. Procedures to Maximize Response Rates

Our recruitment plans are informed by our own experience in other large-scale field trials, the extensive knowledge about Maine’s schools that REL-NEI already has, and the guidelines provided in Burghardt & Jackson’s (2007) Tips on Recruiting Schools and Teachers for Random Assignment Studies.

After we obtain OMB clearance, we will determine which schools do not currently offer algebra I at 8th grade by implementing a planned sequence of contacts at the state, district, and school levels. We will build off of the strong partnerships and working relationships that the REL-NEI has already established in Maine to facilitate these contacts. We will conduct preliminary meetings with state and district staff to help determine which schools we should target, because they will have first-hand knowledge of enrollments, algebra course offerings, and availability of technology. We will then call schools to arrange in-person visits, so that we may share with them the goals and structure of the study, and attempt to recruit them as research partners. To the extent possible, senior members of the study team from both EDC and AIR will attend all of these meetings.

To facilitate our communications about the study, we will prepare clear, simple, high-quality materials that explain the study to interested districts and schools and the public. These include a one-page summary, brochure, letterhead for communications, a schedule showing project milestones, and a list of frequently asked questions and responses. We will structure our in-person and telephone communications about the study with clear protocols that guide conversations about the study with talking points and scripts as well as a checklist of items that must be addressed.

Based on our previous experience working with policy makers in the region, we anticipate strong interest in participation at the local levels, as education decision-makers in the region are eager for information about online courses and are interested in increasing access to critical “gate-keeper” courses like algebra I to 8th graders. We will use this interest to gain entrée into schools across the state. Should it be necessary to expand into additional states, we will consult with IES to determine what the ramifications might be for the study (particularly with regard to the differences in state mathematics assessments).

There are several motivating factors that we expect will drive schools’ interest in the study. We anticipate that the ability to offer the online course to students who are considered ready will encourage schools to participate in the study. The online delivery of the instruction is not likely to be a deterrent, because technology is already a widespread, important, and highly relied-on resource. (Maine’s Learning Technology Initiative provides a laptop to every 7th- and 8th-grade student and teacher in the state.) During the recruitment process, we will inform schools that if they are randomized to the control condition, they will receive the online course the following year (2009–2010) at no cost. We will also offer both treatment and control schools the opportunity to have a teacher within their school trained to deliver the online algebra I course at the conclusion of their participation in the donated course, to support sustained implementation of the course if it is deemed effective in the study and beneficial to the individual school.

We will use memoranda of understanding (MOUs) to codify expectations and roles of study participants and research staff. No MOUs will be signed until OMB approval is secured. The elements of the MOU include the following:

  • an overview of the study, including the research questions to be addressed;

  • a listing of the participants and definitions of their roles;

  • a description of the intervention to be provided;

  • a description of the data to be collected by the REL-NEI field staff;

  • a description of the responsibilities of the participating schools, teachers, and the REL-NEI field staff;

  • a timeline for the study, including implementation, test administration, and classroom observations; and

  • signature pages, requiring the principal, all participating teachers, and REL-NEI co-principal investigators to sign.

To track and document the recruitment process rigorously, we will build on the database system used to identify the pool of eligible schools, adding a contacts database and a clear system for indicating the status of each school.


B4. Tests of Procedures and Methods to Be Undertaken

In choosing the instruments, we relied heavily on standardized achievement tests and questionnaires and protocols used successfully in previous studies. Consequently, the instruments and survey questions have been thoroughly tested on large samples with prior OMB approval. The measures are described in detail in our response to Part A, item A2, and are briefly summarized below.

  • Maine Educational Assessment (7th- and 8th-grade mathematics): This is a long-standing state assessment program, with established, reliable, and valid test forms.

  • NWEA-MAP: This math assessment is a computerized adaptive test with strong psychometric properties and is currently being used as an outcome measure in AIR’s evaluation of the impact of professional development in mathematics study, funded by IES. The purpose of administering an assessment at the end of 8th grade, in addition to using the state test scores, is that we have been advised that the MEA may not include enough algebra items to serve as the only outcome measure to address our research questions.

  • Student survey: The items on this survey are taken from surveys used in AIR’s evaluation of the impact of professional development in mathematics study, funded by IES.

  • Teacher survey: The items on this survey are taken from surveys used in AIR’s evaluation of the impact of professional development in mathematics study, funded by IES.

  • Classroom instruction: To document the nature of mathematics instruction in the two types of classrooms, we will draw on existing observation protocols for observing instruction in traditional (face-to-face) classrooms—such as those being used for the impact study of professional development in mathematics currently being conducted by AIR, as well as protocols for tracking online interactions.

  • 8th-grade math course grades and 9th-grade transcripts and credits earned: Student record data will be coded using the NCES Classification of Secondary School Courses (CSSC). Course credits will be converted to standardized Carnegie units, and letter grades (A–F) will be converted to a point system (0–4). Points will then be weighted by the number of Carnegie units earned by course type to yield each student’s score for the math class taken. (A brief worked sketch of this weighting appears after this list.)

  • 10th-grade enrollment: Student record data will be coded using the NCES Classification of Secondary School Courses (CSSC).

  • 10th-grade Preliminary Scholastic Aptitude Test (PSAT): This is a long-established, valid, and reliable test of academic aptitude.
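
As a brief illustration of the transcript coding described above, the sketch below maps letter grades to the 0–4 point scale and weights them by Carnegie units. The course records are hypothetical, and the unit-weighted average shown is one plausible reading of the coding scheme rather than the study’s exact algorithm.

```python
# Illustrative sketch of the grade-weighting described above; the course
# records are hypothetical and the unit-weighted average is an assumption.
GRADE_POINTS = {"A": 4, "B": 3, "C": 2, "D": 1, "F": 0}


def weighted_math_score(courses):
    """Return the Carnegie-unit-weighted grade points for a list of
    (letter grade, Carnegie units) course records."""
    total_points = sum(GRADE_POINTS[grade] * units for grade, units in courses)
    total_units = sum(units for _, units in courses)
    return total_points / total_units if total_units else 0.0


# Hypothetical 9th-grade math record: a full-credit B and a half-credit A.
print(round(weighted_math_score([("B", 1.0), ("A", 0.5)]), 2))  # -> 3.33
```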


B5. Names of Statistical and Methodological Consultants and Data Collectors

Teresa Duncan of AIR will serve as the Principal Investigator of this study and will be responsible for quality assurance of all aspects of the study, including design, implementation, analyses, and report-writing. Jessica Heppen of AIR will serve as the Evaluation Director. A senior EDC staff member to be identified will serve as the Implementation Manager, with overall responsibility for school recruitment and ongoing support to school sites, as well as overseeing the activities of the vendor providing the online course and instructors. Michael Russell and Thomas Hoffmann of Nimble Assessment Systems will be responsible for developing and maintaining the on-line data collection system.

Senior Advisors from EDC and AIR will help provide content-area and methodological expertise to the Virtual Algebra study. Lynn Goldsmith of EDC and Steve Leinwand of AIR will serve as mathematics experts, and Michael Garet and David Myers of AIR will serve as technical experts on the design, conduct, and analysis of randomized controlled trials.

Windwalker Corporation, Nimble Assessment Systems, and Class.com are three small businesses that are key partners of this project. Windwalker staff, led by Manya Walker, will help conduct classroom observations and gather administrative data from the schools. Nimble Assessment Systems will design, implement, and support the online data collection systems for the student achievement and survey data. Class.com will provide the online Algebra I course, and will recruit and train the instructors for the course.


Table 4. Key Study Staff

Name                  Role                     Title/Organization                              Telephone Number
Dr. Teresa Duncan     Principal Investigator   Principal Research Analyst/AIR                  (202) 403-6853
Dr. Jessica Heppen    Evaluation Director      Senior Research Analyst/AIR                     (202) 403-5347
Dr. Lynn Goldsmith    Senior Advisor           Senior Scientist/EDC                            (617) 969-7100
Dr. Steve Leinwand    Senior Advisor           Principal Research Analyst/AIR                  (202) 403-6926
Dr. David Myers       Senior Advisor           Senior Vice President/AIR                       (202) 403-5110
Dr. Michael Garet     Senior Advisor           Chief Scientist/AIR                             (202) 403-5345
Dr. Manya Walton      Task Leader              Project Manager/Windwalker                      (703) 970-3500
Dr. Michael Russell   Task Leader              Assessment Director/Nimble Assessment Systems   (781) 237-9417
Dr. Thomas Hoffman    Task Leader              Systems Engineer/Nimble Assessment Systems      (781) 237-9417



In addition to the persons listed above, members of the REL-NEI Technical Working Group (listed in Part A of this submission) provided substantial input to the study design.


1 We will collapse the CCD indicators for locale into two categories, rural versus other, though we will consider a more refined set of categories once the target population of schools has been identified and we have examined the variation in locale indicators across these schools.

2 We had planned to administer an “Algebra Readiness” test to all 7th-grade students in spring 2008 to use as a covariate and to help schools make decisions about student eligibility. However, because Office of Management and Budget and Institute of Education Sciences guidelines as well as methodological considerations restrict us from providing these data to the schools, and given that the 7th grade MEA scores will be available to us, we cannot justify the expense of administering the readiness test.

3 This proportion is consistent with the proportion of 8th graders that are taking Algebra I statewide in Maine. We assume that the percentage of 8th graders that are truly eligible will vary across schools, depending on factors such as demographics and on the quality and focus of math delivered in grades 6 and 7. We will offer the target 25% as a guideline but not a requirement.

4 This MDES range is established for the purpose of testing the effects of virtual algebra by comparing students who take virtual algebra (i.e., “eligible” students in treatment schools) to students who would have taken virtual algebra had their school offered it (i.e., “eligible” students in control schools); the study is powered for this comparison. However, we also plan to compare ALL students in treatment schools to ALL students in control schools, as well as “noneligible” students in treatment versus control schools. These analyses will include larger numbers of students and so, in that sense, have greater statistical power, but it is highly likely that the ES for these comparisons will be lower. Because we have no basis on which to establish an expected ES for the treatment effect of virtual algebra for students who do not take the course but attend schools where it is offered, we have opted to power the study for the comparison of “eligibles,” using a conservative target for the MDES.

5 Once the full set of achievement and course-taking outcomes has been assembled (fall 2010), we will conduct our final analyses by first performing an omnibus test: we will create a composite of the outcome measures and test the impact of virtual algebra on that composite. If the omnibus test is significant, we will then examine each outcome separately.

6 For the comparison of “eligible” students in treatment schools vs. “eligible” students in control schools, we will use the pretest scores for just this subset of students as covariates. When entered at the student level, these scores will explain both school- and student-level variance. We expect the correlation between pretest and outcomes to be even higher for this subsample of students; if so, precision will improve and power will be even higher.

7 We will obtain the information we need to establish the best levels for this blocking variable during the recruitment process. “Nontraditional” curricula are likely to include the “integrated” math curricula that infuse algebra concepts with other 8th-grade math material.
