

NATIONAL ASSESSMENT OF EDUCATIONAL PROGRESS




Supporting Statement

for

Grade 8 Mathematics Block Difficulty Study



Request for Clearance for the Grade 8 Mathematics Block Difficulty Study


OMB# 1850-0803 v.26

(Generic Clearance for Cognitive, Pilot, and Field Test Studies)











March 26, 2010



Table of Contents

  1. Submittal-Related Information
  2. Background
  3. Design and Analysis
  4. Consultations outside the Agency
  5. Assurance of Confidentiality
  6. Justification for Sensitive Questions
  7. Estimate of Hour Burden
  8. Estimate of Costs for Recruiting and Paying Respondents
  9. Cost to the Federal Government
  10. Project Schedule


  1. Submittal-Related Information

This material is being submitted under the generic Institute of Education Sciences (IES) clearance agreement (OMB #1850-0803 v.26). This generic clearance provides for the National Center for Education Statistics (NCES) to conduct various procedures (such as field tests and cognitive interviews) to test new methodologies, question types, or delivery methods in order to improve survey and assessment instruments.


  2. Background

In the 2011 NAEP assessment, a special study (the Multi-Stage Testing (MST) study) will examine the possibility of using an adaptive testing algorithm in NAEP. The study, administered on computer, will be conducted in mathematics at grade 8. A sample of students will take one cognitive block (known as the routing block) and, depending on the student's individual performance on that block, the computer will then assign the student an easy, medium, or hard block in the second position. Another sample of students will take the same blocks via standard NAEP random assignment. Comparisons between the results from these two samples will determine whether tailoring block difficulty to student ability provides more accurate estimates of group ability, and is therefore a worthwhile endeavor for NAEP. To prepare for this study, a small-scale field test will be conducted in June 2010.


In this small-scale field test, NCES will compare the performance of randomly equivalent student groups on the easy, medium, and hard blocks assembled for the 2011 NAEP MST study. Although the blocks for the MST study are being assembled based on historical information about item difficulties, NCES has raised the question of whether those difficulties may change once the items are re-assembled into new blocks, due to context effects. The purpose of this pilot study is therefore to determine whether the re-assembled easy, medium, and hard blocks are in fact distinct in difficulty. If the difficulty of the blocks is distinct in practice, we expect the student groups to have different overall scores. Secondary goals may include one or more of the following: comparing the characteristics of the two routing blocks, examining context effects (related to which routing block is presented), and analyzing the speededness of each block based on the number of items reached.


  3. Design and Analysis

Each student will be randomly assigned a booklet that contains: a) either the Router 1A or the Router 1B block in the first position; and b) either the Easy, Medium, or Hard block in the second position. The blocks will be presented, and students will respond, via paper and pencil. This design results in a total of six groups (see Tables 1 and 2).


Table 1. Booklet Design

Group     Block 1     Block 2
Group 1   Router 1A   Easy
Group 2   Router 1A   Medium
Group 3   Router 1A   Hard
Group 4   Router 1B   Easy
Group 5   Router 1B   Medium
Group 6   Router 1B   Hard


Table 2. Treatment and Block Structure (1 = group receives the block; 0 = group does not)

Treatment (Difficulty Type)   Group 1   Group 2   Group 3   Group 4   Group 5   Group 6
Router 1A                        1         1         1         0         0         0
Router 1B                        0         0         0         1         1         1
Easy                             1         0         0         1         0         0
Medium                           0         1         0         0         1         0
Hard                             0         0         1         0         0         1


In this unbalanced, incomplete block design, each group is treated as a "block" (in experimental design terms) that can be assigned only 2 of the 5 treatments (i.e., the cognitive assessment blocks). The results of the analysis will show whether the cognitive assessment blocks differ in difficulty.


We will compare the easy, medium, and hard assessment blocks to each other, and we may also be able to compare the two routing blocks. This is made possible by employing a mixed model analysis, which combines intra- and inter-group information about the treatment (i.e., difficulty type) effects, thus allowing a comparison across all levels of the treatment (Littell et al., 2007).
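
As a rough illustration of this kind of analysis, the sketch below fits a mixed model in Python with statsmodels. The file name and column names (pilot_scores.csv, group, block, score) are placeholders rather than part of the study design, and the study's actual analysis would likely be carried out in SAS, per the cited reference.

    import pandas as pd
    import statsmodels.formula.api as smf

    # Hypothetical layout: two rows per student (one per cognitive block
    # taken), with the booklet group as the blocking factor and the block
    # identity (Router 1A, Router 1B, Easy, Medium, Hard) as the treatment.
    # File and column names are placeholders.
    df = pd.read_csv("pilot_scores.csv")  # columns: group, block, score

    # Fixed effect for block (treatment), random intercept for group; the
    # random effect lets the model combine intra- and inter-group
    # information about block difficulty, as in an incomplete block design.
    model = smf.mixedlm("score ~ C(block)", data=df, groups=df["group"])
    result = model.fit()
    print(result.summary())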


A total sample size of 900 (150 per group) will provide sufficient power to detect differences in block difficulty. The power analyses are based on the assumption that we examine main effects only (block difficulty). We will not examine interactions of block difficulty x subgroup (e.g., race/ethnicity, gender, SES, ELL status), and we will not perform any separate analyses by subgroup.
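
For intuition about the scale of the sample, a simplified power check along these lines can be run with statsmodels. The effect size below (Cohen's f = 0.15) is an assumed value for illustration, not a figure from the study's own power analysis, and the calculation treats the design as a one-way comparison of the three difficulty types.

    from statsmodels.stats.power import FTestAnovaPower

    # Simplified one-way view of the design: three difficulty types,
    # 900 students in total. The effect size is an assumed value.
    power = FTestAnovaPower().power(
        effect_size=0.15,  # Cohen's f (assumed, small-to-medium)
        nobs=900,          # total sample size across conditions
        alpha=0.05,
        k_groups=3,        # easy, medium, hard
    )
    print(f"Estimated power: {power:.2f}")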


This design will also provide descriptive information about how the routing blocks are functioning. That is, beyond comparing the difficulty of the two routing blocks, we could examine the percentage of students who would be routed to each subsequent block if the guidelines from the MST study were used. If the groups of students receiving each routing block are randomly equivalent, we would expect similar percentages to be routed to each subsequent block.
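
As a sketch of that descriptive check, the snippet below tabulates the share of students who would be routed to each second-position block. The routing cut scores and the simulated scores are hypothetical placeholders, since the MST study's actual routing guidelines are not reproduced here.

    import numpy as np

    # Simulated routing-block raw scores for one group of 150 students;
    # real scores would come from the scored field test booklets.
    rng = np.random.default_rng(0)
    router_scores = rng.integers(0, 21, size=150)

    # Hypothetical cut points separating easy/medium and medium/hard routes.
    low_cut, high_cut = 7, 14
    routed = np.select(
        [router_scores < low_cut, router_scores < high_cut],
        ["easy", "medium"],
        default="hard",
    )

    for label, count in zip(*np.unique(routed, return_counts=True)):
        print(f"routed to {label}: {100 * count / routed.size:.1f}%")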


A 5-7 minute post-study questionnaire (see Appendix A) will be administered following the field test.




  4. Consultations outside the Agency

None



  5. Assurance of Confidentiality

NAEP has policies and procedures that ensure privacy, security, and confidentiality, in compliance with the relevant legislation (the Confidential Information Protection provisions of Title V, Subtitle A, Public Law 107-347, and the Education Sciences Reform Act of 2002, Public Law 107-279, 20 U.S.C. §9622). Specifically, the NAEP project's privacy, security, and confidentiality policies and procedures comply with the Privacy Act of 1974 and its amendments, NCES Confidentiality Procedures, and the Department of Education ADP Security Manual. The federal authority mandating NAEP, 20 U.S.C. §9622, requires the confidentiality of personally identifiable information:

"(A) IN GENERAL.-- The Commissioner for Education Statistics shall ensure that all personally identifiable information about students, their academic achievement, and their families, and that information with respect to individual schools, remains confidential, in accordance with section 552a of title 5.


"(B) PROHIBITION.-- The Assessment Board, the Commissioner for Education Statistics, and any contractor or subcontractor shall not maintain any system of records containing a student's name, birth information, Social Security number, or parents' name or names, or any other personally identifiable information.

Participation is voluntary and personally identifiable information will not be maintained for the student participants. Participants will be provided with the following confidentiality pledge on the front of the test booklet:

The information you provide will be used for statistical purposes only. In accordance with the Confidential Information Protection provisions of Title V, Subtitle A, Public Law 107–347 and other applicable Federal laws, your responses will be kept confidential and will not be disclosed in identifiable form to anyone other than employees or agents. By law, every NCES employee as well as every agent, such as contractors and NAEP coordinators, has taken an oath and is subject to a jail term of up to 5 years, a fine of up to $250,000, or both if he or she willfully discloses ANY identifiable information about you.



  6. Justification for Sensitive Questions

No sensitive questions will be asked.


  7. Estimate of Hour Burden

Participating schools will be involved in coordinating and scheduling the field test administration. School burden is estimated at one to two hours per school (approximately 30-40 schools will participate). Student burden is estimated at 5-7 minutes for the post-assessment questionnaire. (The cognitive assessment itself comprises two 25-minute blocks.)


Therefore, the estimated respondent burden will be:

Respondent         Time per participant   Number of respondents   Total
Grade 8 students   5-7 minutes            900                     75-105 hours
School personnel   1-2 hours per school   40                      40-80 hours
Totals                                    940                     115-185 hours



  8. Estimate of Costs for Recruiting and Paying Respondents

Students - No remuneration will be given to students.

Schools - It is very challenging to recruit schools for such a study this late in the school year: schools are completing state assessments and are overburdened with testing in general, and participating schools will receive no information about how their students performed. Therefore, for this study, each participating school will receive a used laptop computer. The laptops are abandoned federal property. NCES (through Westat) purchases laptops for NAEP State Coordinators to use in training and in conducting their activities. These laptops are replaced every 3-4 years and are due to be replaced this fiscal year. When the new laptops are purchased for the NAEP State Coordinators, Westat will request permission for the old laptops to be "abandoned in place." The laptops will then revert to Westat, which will use them as incentives for schools participating in the study. The laptops are at least three years old, and their estimated value is in the $65-$100 range, depending on condition. Receiving even a used laptop can help increase response rates. This approach was used successfully for a similar small study (the 2006 NAEP Word Locator Study).


  9. Cost to the Federal Government

The cost of this project to the government includes:

Activity                                                  Provider                                                Estimated Cost
Design, preparation, analysis, reporting, and follow-up   NAEP Education Statistics Services Institute (NESSI)   $82,700
Printing, distribution, and scoring                       Pearson                                                 $175,000
Field test administration                                 Westat                                                  $150,000
Total                                                                                                             $407,700


  10. Project Schedule

Event                                            Date
Develop assessment                               March-April 2010
Submit OMB fast-track proposal                   March 31, 2010
Recruit schools                                  April-May 2010
Conduct studies at schools (upon OMB approval)   May-June 2010
Score assessment results                         June-July 2010
Analyze results and create recommendations       August 2010




Reference


Littell, R. C., Milliken, G. A., Stroup, W. W., Wolfinger, R. D., and Schabenberger, O. (2007). SAS for Mixed Models (2nd ed.). Cary, NC: SAS Institute Inc.
