

NATIONAL ASSESSMENT OF EDUCATIONAL PROGRESS



Volume 1


Supporting Statement

for

System Small-Scale Usability Study



Request for Clearance for System Small-Scale Usability Study for the 2010 NAEP Writing Assessment Pilot


OMB# 1850-0803

(Generic Clearance for Cognitive, Pilot, and Field Test Studies)



Grade 8 Student Computer-Based Writing Assessment Application Usability Study for the 2010 Computer-Delivered Pilot Writing Assessment







March 31, 2009



Table of Contents

  1. Submittal-Related Information
  2. Background
  3. Design
  4. Consultations Outside the Agency
  5. Assurance of Confidentiality
  6. Justification for Sensitive Questions
  7. Estimate of Hour Burden
  8. Estimate of Costs for Recruiting and Paying Respondents
  9. Cost to Federal Government
  10. Project Schedule
  Appendix A: Survey Process (Activities to Be Completed With User Participants)


  1. Submittal-Related Information

This material is being submitted under the generic Institute of Education Sciences (IES) clearance agreement (OMB #1850-0803 v.8) approved in July 2007. This generic clearance allows the National Center for Education Statistics (NCES) to conduct various procedures (e.g., field tests, cognitive interviews) to test new methodologies, question types, or delivery methods in order to improve survey and assessment instruments.


  2. Background

As required by the National Assessment Governing Board, the National Center for Education Statistics (NCES) of the U.S. Department of Education is implementing a nationwide computer-based writing assessment at grades 8 and 12 (and a paper-based assessment at grade 4). Both the Writing Framework for the 2011 National Assessment of Educational Progress (NAEP) and the Writing Specifications for the 2011 NAEP, published in fall 2007, require computer-based testing (CBT) for this assessment.


The NAEP computer-based writing assessment will provide the entire testing infrastructure for the 2011 writing assessment at grades 8 and 12. Between 30,000 and 100,000 students (depending on whether the assessment is conducted at the national or state level) will be assessed using NAEP-provided hardware and software. To accomplish this, field staff will deliver portable computers to each participating school for students to use during the assessment.


NCES has tasked Fulcrum IT Services Company (Fulcrum IT), under Contract No. ED-07-CO-0076, Task No. 7.2.4.2, with designing the computer applications that will operate the assessment, both for oversight by administrators and for use by students. The writing assessment is slated to be piloted in 2010 on Toshiba 15-inch laptop computers, and Fulcrum IT is designing administrator and student software to run on those laptops. This usability study will test the design and function of the application for 8th-grade students, looking for standard usability problems related to the prototype's interface, e.g., screens, buttons, instructions, navigation, and functionality. The study will also test the communication and interaction between the administrator software and the student software.


To conduct the software tests, Fulcrum IT proposes to provide 20 laptops with the prototype application installed for students to take a modified sample assessment. The assessment will include sample prompts as a means of testing how students use the application. The usability study is not designed to evaluate either the appropriateness of the prompts or the students' writing responses.


  3. Design

The evaluation is designed to record the following two types of information:

  1. The students’ ability to complete tasks that will be required in the assessment, and

  2. The students’ evaluation of the application via a survey of what they liked and disliked about the interface.

The information gathered from a series of representative writing tasks will be used to evaluate (1) the effectiveness of the application in facilitating students' performance, and (2) the effectiveness of the application in presenting tasks and capturing sample responses between the administrator laptop and the student laptops.


Two rating surveys will be used for each student. The first will be filled out by facilitators observing the students as they perform the tasks, rating the level of difficulty each student had with the operation of the software and noting indicators of user confusion or other difficulty with the computerized test environment. The second will record student opinions of the prototype software: their likes, dislikes, and recommendations for improvement. See Volume II of this clearance package for the actual survey instruments.
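
As a purely illustrative aid (the actual instruments appear in Volume II of this package), the two kinds of rating records could be modeled as simple data structures like the following minimal Python sketch; all field names and the rating scale here are hypothetical, not taken from the instruments themselves.

    from dataclasses import dataclass

    @dataclass
    class FacilitatorRating:
        # Hypothetical record of a facilitator's observation of one student.
        participant: int       # participant number only; no personal data
        task: str              # which assessment task was observed
        difficulty: int        # e.g., 1 (no difficulty) to 5 (unable to complete)
        confusion_noted: bool  # whether indicators of user confusion were seen
        notes: str = ""

    @dataclass
    class StudentSurveyResponse:
        # Hypothetical record of a student's opinion survey.
        participant: int
        likes: str
        dislikes: str
        recommendations: str

    # Example: a facilitator logs that participant 7 struggled with navigation.
    r = FacilitatorRating(participant=7, task="sample task 1",
                          difficulty=4, confusion_noted=True)
    print(r)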


The 20 students will be randomly assigned to four groups of five students each. No personal data will be collected or recorded on any of the students. No personal or sensitive questions will be asked, and therefore no parental permission will be required for the students to participate. Students will log in to the application using their participant number only.
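
For illustration only, the random assignment described above could be performed with a short script such as the following minimal sketch; the participant numbers and group size come from this section, while the function name and seeding are hypothetical and are not part of the study's actual tooling.

    import random

    # Minimal sketch (not the study's actual tooling): randomly assign
    # participant numbers 1-20 to four groups of five, as described above.
    PARTICIPANTS = list(range(1, 21))  # participant numbers only; no personal data
    GROUP_SIZE = 5

    def assign_groups(participants, group_size, seed=None):
        """Shuffle the participant numbers and split them into equal groups."""
        rng = random.Random(seed)
        shuffled = list(participants)
        rng.shuffle(shuffled)
        return [shuffled[i:i + group_size]
                for i in range(0, len(shuffled), group_size)]

    for n, group in enumerate(assign_groups(PARTICIPANTS, GROUP_SIZE), start=1):
        print(f"Group {n}: {sorted(group)}")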


The students will then attempt one 15-minute sample writing task. Depending on their group, they will be asked to perform certain tasks designed to exercise different aspects of the prototype as well as its communication with the administrator laptop. Facilitators will observe the students as they attempt the assessment tasks and rate the ease or difficulty with which they complete them.


After a short break, the students will attempt the second and final 15-minute sample writing task. At the end of the second period, the students will fill out a survey rating the software application they have just used with regard to its ease or difficulty of use and their general like or dislike of various features. A short discussion will end the session to gather any further comments from the students. Refer to Appendix A for a description of the survey process and detailed session process steps.


Fulcrum IT will use Westat, the regular NAEP contractor, for data collection activities and to contact and recruit an appropriate school in the Washington, DC, metropolitan area to participate in the study. Westat has experience recruiting schools to participate in many NAEP studies.



  4. Consultations Outside the Agency

None

  5. Assurance of Confidentiality

Participation is voluntary, and no personally identifiable information will be collected or maintained for the student participants. Students will be provided with the following confidentiality pledge: "The information you provide will be used for statistical purposes only. In accordance with the Confidential Information Protection provisions of Title V, Subtitle A, Public Law 107-347 and other applicable Federal laws, your responses will be kept confidential and will not be disclosed in identifiable form to anyone other than employees or agents. By law, every NCES employee as well as every agent, such as contractors and NAEP coordinators, has taken an oath and is subject to a jail term of up to 5 years, a fine of up to $250,000, or both if he or she willfully discloses ANY identifiable information about you."

  6. Justification for Sensitive Questions

No sensitive questions will be asked.


  7. Estimate of Hour Burden

The participating school will need to select and recruit students and provide a room in which the sessions can occur. Burden to the students is estimated to be 60 minutes per student; this includes the time on each computer, the time to answer the survey questions, and additional time for setup or delays in using the computers.

The overall estimated respondent burden is summarized in the table below.


Respondent                                    Hours per respondent   Number of respondents   Total
Grade 8 students                              1                      20                      20 hours
School personnel (for student recruitment)    1                       1                       1 hour
Totals                                                               21                      21 hours

  8. Estimate of Costs for Recruiting and Paying Respondents

Participating students will be offered light refreshments, such as individual juice bottles and pretzels or chips. The participating teacher will receive a $40 gift card to an office supply store to use for classroom materials. This practice has proven effective in recruiting subjects to participate in similar studies. The amount offered is consistent with NCES guidelines.

  9. Cost to Federal Government

The cost of this project to the government for materials and compensation will be $70 ($30 for refreshments and $40 for the gift card). An associated cost is the labor to prepare the survey instruments, arrange and complete the testing, and analyze and record the results. The estimated 120 labor hours, billed at $80/hour ($9,600), will be charged to the NCES contract. The computers and software have already been paid for.

Activity                                                            Cost
On-site materials/compensation                                      $70
Staff costs for survey preparation, administration, and analysis    $9,600
Total                                                               $9,670


  10. Project Schedule

Date                 Event
March 18, 2009       Submit draft test proposal to NCES.
March 31, 2009       Submit final test proposal for OMB review.
April 14–24, 2009    Contact public school(s) in the Washington, DC, metropolitan area to request participation and make arrangements for a June testing date in one school.
April 27, 2009       Finalize survey instruments and the participating school.
April 30, 2009       Upon OMB approval, finalize the school test schedule.
By June 1, 2009      Conduct the test at the school.
June 1–10, 2009      Analyze results of the surveys and user tests.
June 10, 2009        Disseminate analysis and test results to NCES.





Appendix A: Survey Process (Activities to Be Completed With User Participants)

Summary

Each participant will receive a participant number and use it to log in to the application.

Facilitators will remain at one computer station to observe students throughout the usability test.

Roles and Participant Groups

Test Round   Scenario*   Participants   Facilitators
1            A           1-10           Lori Burgess, Fulcrum IT; Matt Orban, Fulcrum IT
1            B           11-20          Paul Harder, Fulcrum IT; Saira Brenner, Fulcrum IT
2            C           1-10**         Lori Burgess; Matt Orban
2            D           11-20**        Paul Harder; Saira Brenner

*  A = laptops fully connected between the student and administrator machines
   B = laptops fully disconnected, with the administrator retrieving data from the student machine via USB and loading it onto the administrator machine
   C = laptops fully connected between the student and administrator machines; several machines lose connection and are then re-connected
   D = laptops begin fully connected, lose connection, and do not recover the connection for the remainder of the assessment

** The same 20 students will participate in both test rounds.
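
To make the four scenarios concrete, the following is a minimal Python sketch of the connectivity behavior they describe. It is illustrative only: the Fulcrum IT prototype's actual protocol and interfaces are not described in this document, so the class, method, and message names below are hypothetical.

    from dataclasses import dataclass, field

    @dataclass
    class StudentLaptop:
        # Hypothetical model of a student machine; not the actual prototype code.
        participant: int
        connected: bool = True
        pending: list = field(default_factory=list)  # responses awaiting transfer

        def submit(self, response: str) -> None:
            """Deliver a response live if connected; otherwise queue it locally."""
            if self.connected:
                print(f"P{self.participant}: response sent to administrator machine")
            else:
                self.pending.append(response)

        def transfer_via_usb(self) -> list:
            """Scenario B/D fallback: administrator retrieves queued data by USB."""
            data, self.pending = self.pending, []
            return data

    laptop = StudentLaptop(participant=7)
    laptop.submit("response 1")   # connected throughout: scenario A behavior
    laptop.connected = False      # link drops, as in scenarios C and D
    laptop.submit("response 2")   # queued locally while disconnected
    # Scenario C: the link recovers and queued work can again be sent live.
    # Scenarios B and D: the link never (re)connects, so the administrator
    # retrieves the queued responses over USB instead:
    print("USB pickup:", laptop.transfer_via_usb())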


Schedule

Test Round   Time      Activity
             10 min.   Introduction and assignment of participant numbers
1            15 min.   Log in and perform assessment tasks
             5 min.    Break
2            15 min.   Begin new tasks
             5 min.    Students take rating survey
             10 min.   Group discussion of process

Test Session Process Steps

  1. Greet students and introduce the facilitators.

  2. Briefly explain that we are testing a new type of assessment software, not the students' writing performance.

  3. Give students a participant number and group them for facilitator observation.

  4. Give instructions to log in to the application with their participant number.

  5. Students begin following the instructions for the first round of assessment tasks and scenarios. Facilitators observe and rate the ease or difficulty with which students complete the tasks.

  6. After 15 minutes, students take a 5-minute break.

  7. Students begin new round of assessment tasks/scenarios.

  8. After 15 minutes, students fill out a survey regarding their opinions of the software.

  9. Facilitators hold a short discussion to gather comments and observations from students as time permits.

