OMB-ED Q&A

200710-1850-002: The Efficacy of the Measures of Academic Progress (MAP) and its Associated Training on Differentiated Instruction and Student Achievement

OMB: 1850-0850

January 2008


1. Part A, page 6, indicates that teachers within schools will be randomly assigned; however, elsewhere in the document schools are identified as the unit of random assignment. Please clarify.


Random assignment will happen at the school level. Supporting Statements A and B were revised accordingly. Part A, Item 1 (page 6) was revised as follows:


"This two-year study employs an experimental design in which half of the schools will be assigned to the treatment condition in grade 4 (with grade 5 continuing business as usual) and half of the schools will be assigned to the treatment condition in grade 5 (with grade 4 continuing business as usual). To enhance the likelihood of participation in the study, a delayed-treatment control design is being used. The control condition (grade 4 or 5 in each school) will be offered the MAP program at the end of two years. Although the intervention will be delivered, no data collection will be undertaken because the timing will be outside the scope of the study."
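The school-level assignment described above amounts to a simple random split of the participating schools into two arms. A minimal sketch in Python (the school names, the count of 32 schools, and the fixed seed are illustrative assumptions, not details of the study's actual randomization procedure):

```python
import random

def assign_schools(school_ids, seed=2008):
    """Randomly split schools into two arms: MAP in grade 4
    (grade 5 business as usual) vs. MAP in grade 5 (grade 4
    business as usual). Illustrative sketch only."""
    rng = random.Random(seed)
    shuffled = list(school_ids)
    rng.shuffle(shuffled)
    half = len(shuffled) // 2
    return {"map_grade_4": shuffled[:half],
            "map_grade_5": shuffled[half:]}

# Hypothetical roster of 32 participating schools
schools = [f"school_{i:02d}" for i in range(1, 33)]
arms = assign_schools(schools)  # 16 schools per arm
```

Because each school contributes both a treated grade and a business-as-usual grade, every school serves as its own comparison while random assignment still occurs at the school level.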


2. The schedule calls for the last report to be delivered in 2011. When does the REL-MW contract with NCEE end? How does this vary among RELs?


The REL-Midwest contract ends January 31, 2011. The last report from this study will be delivered before the end of the contract.


3. Is Part B, Item 1 indicating that after districts sign up for MAP training, half of them will be told they are in the control group? This approach is quite different from recruiting for most other REL RCTs. How will the REL address districts' willingness to participate in the study under these conditions?


The districts will be fully informed during recruitment about the details of the study design, as outlined in Part A, Item 1 and Part B, Item 1. The following was added to Part B, Item 1:


"Districts that sign up for MAP training are not automatically enrolled in the study; rather, they are given a detailed description of the study and asked if they would like to participate. At the time of recruitment, districts and schools will be fully informed of the features of the study design. The Memorandum of Understanding will describe the assignment process in detail and provide assurances that the delayed treatment for the control condition will be offered in year 3."


4. The plan for addressing the third “policy-relevant question” seems to compete with the preceding two in that it introduces the intervention to the control schools in year 2 whereas the others would seem to dictate waiting. Did the REL consider delaying introduction of the treatment to the control schools until year 3?


Introduction of the treatment to the control grade within each school will be delayed until year 3, after the study concludes. See Table 1 in Part B, Item 1.


5. How will a year 2 introduction impact the ability to measure “sustainability of…impact?” (question 2)


The revised design means that we will no longer introduce the MAP+Tr program to the control group in Year 2. The control group will not receive the treatment until year 3, after the study concludes. This revised design provides a means of examining whether teachers continue to use MAP-based strategies in their classes (testing, grouping, differentiated instruction) in Year 2. It is a clean control through the second year of the study.


6. In Part B, page 11, the section “testing ‘equivalence’ in…” seems to be missing a footnote. What is the hypothesis at work in this section that would require testing whether there are “effects of M+Tr replicated by teachers in the delayed treatment control?”


Because the revised design provides a control condition for two years, testing equivalence is no longer meaningful. Discussion of these analyses was deleted (along with the footnote).


7. In the intervention fidelity analysis, why does n=8 students per class? What is the rationale for sampling? Is there some additional data collection or follow-up for those students? How will they be selected?


The following was included in supporting statement B, page 12:


Sampling is being used in the fidelity study to minimize data collection burden for teachers. An objective of the Year 1 phase of the study is to examine the psychometric quality of several approaches (surveys, logs, observations) to assessing fidelity, and to examine the extent to which data from these methods converge. The sample size (n=8) will be sufficient to detect validity coefficients of 0.40 or greater. Assuming a boost in the variance accounted for at level 2, the sample size will be sufficient to assess the linkage between fidelity level and student outcomes.


There is no guidance in the literature on how to measure the fidelity with which the key component of the MAP program, differentiated instruction, is implemented in classroom settings. As such, this project is developing and pilot testing three approaches for literacy/reading. Sampling will be done within groups of students identified as scoring in the lower or upper quartile on state reading tests. This assumes that we will be able to locate and select classes with mixed ability levels; data from NWEA indicate that this is possible. Within the high and low reading-readiness subgroups, students will be randomly selected. Students with special needs (e.g., gifted students or students with IEPs) will not be included in these ability subgroup pools.
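The sampling rule described above (exclude students with special needs, stratify by state-test quartile, then draw at random within the low and high strata) can be sketched as follows. The roster field names, the 4-per-stratum split, and the example roster are illustrative assumptions, not the study's actual procedure or data:

```python
import random

def sample_fidelity_students(roster, per_stratum=4, seed=0):
    """Illustrative sketch of the described sampling: drop students
    with special needs (gifted or with IEPs), keep only lower- and
    upper-quartile readers on the state test, and randomly draw
    per_stratum students from each stratum.
    'quartile' and 'special_needs' are assumed roster fields."""
    rng = random.Random(seed)
    eligible = [s for s in roster if not s["special_needs"]]
    low = [s for s in eligible if s["quartile"] == 1]   # lower quartile
    high = [s for s in eligible if s["quartile"] == 4]  # upper quartile
    return rng.sample(low, per_stratum) + rng.sample(high, per_stratum)

# Hypothetical class roster of 24 students
roster = [{"id": i, "quartile": (i % 4) + 1, "special_needs": i % 10 == 0}
          for i in range(1, 25)]
picked = sample_fidelity_students(roster)  # 8 students: 4 low, 4 high
```

Sampling only from the tails of the reading distribution concentrates the n=8 observations where differentiated instruction should be most visible, which is consistent with the stated goal of validating the fidelity measures rather than surveying every student.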

File type: application/msword
File title: OMB-ED Q&A
Author: Bridget Dooling
Last modified by: Bridget Dooling
File modified: 2008-01-17
File created: 2008-01-17
