
NATIONAL ASSESSMENT OF

EDUCATIONAL PROGRESS



Volume I


Supporting Statement

for

Small Scale Usability Study



Request for Clearance for Small Scale Usability Study for Grade 4 Writing Computer-Based Assessment


OMB# 1850-0803 v.38

(Generic Clearance for Cognitive, Pilot and Field Test Studies)



Student Grade 4 Usability Evaluation of Writing Computer-Based Assessment












January 24, 2011

(revised February 22, 2011)




Contents


  1. Submittal-Related Information
  2. Background and Study Rationale
  3. Study Recruitment, Method, and Design
  4. Data Collection Instruments
  5. Consultations outside the Agency
  6. Paying Respondents
  7. Assurance of Confidentiality
  8. Justification for Sensitive Questions
  9. Estimate of Hourly Burden
  10. Cost to Federal Government
  11. Project Schedule


  1. Submittal-Related Information

This material is being submitted under the generic National Center for Education Statistics (NCES) clearance agreement (OMB #1850-0803). This generic clearance provides for NCES to conduct various procedures (such as field tests and cognitive interviews) to test new methodologies, question types, or delivery methods to improve survey and assessment instruments.

  2. Background and Study Rationale

The National Assessment Governing Board (Governing Board) has identified the goal of computer-based delivery of the writing assessment for the National Assessment of Educational Progress (NAEP) at grades 4, 8, and 12.1 As such, in 2011, NCES will operationalize a nationwide writing computer-based assessment at grades 8 and 12 for NAEP. In 2012, NCES will pilot a NAEP writing computer-based assessment (WCBA) at grade 4, with the intention of administering the assessment operationally in 2013.


The writing framework indicates the need to study grade 4 students’ facility with computers prior to the administration of the grade 4 assessment. The study described in this submission is intended to obtain specific information on how grade 4 students interact with computers, including their typing skills and overall familiarity with using computers, to inform the development of the grade 4 WCBA to be piloted in 2012. Specifically, this study will examine which aspects of the grades 8 and 12 WCBA platform will work for the grade 4 assessment and which will need modification. NCES has tasked Fulcrum IT, the NAEP technology provider, with the design and implementation of this study, as well as the design and development of the grade 4 WCBA system.


The purpose of the study described in this submission is threefold:

  • collect information about grade 4 students’ familiarity with writing on computers, their typing ability, and the specific keyboarding instruction they have received;

  • collect information about grade 4 students’ recognition of and familiarity with word processing elements, navigation, and interactive tools, along with their design preferences, in the timed testing environment of the existing WCBA platform used at grades 8 and 12, as well as in two modified design prototypes; and

  • use the information collected in the study to determine the best design modifications for the WCBA platform for grade 4 students.


  3. Study Recruitment, Method, and Design

Recruitment and Sample Characteristics

The study will include 60 students, drawn equally from five schools (12 students per school). The selection of schools will be representative in terms of location type (e.g., urban and rural), minority enrollment, percentage of students who receive free or reduced-price lunch, and whether the state test is delivered on computer. While no specific analysis will occur relating to student demographics, students will be selected with an eye to diversity with respect to gender, race, socio-economic status, learning disabilities, and English language proficiency.


NAEP State Coordinators (see Section 5) will leverage their relationships within the states to identify and contact five schools willing to participate in the study. Upon identifying the schools, the NAEP State Coordinators will forward the contact information to Fulcrum IT, which will contact each school to make the arrangements (see Appendix A). Our experience from prior studies is that schools will rely on the student assent and legal guardian consent forms completed at the beginning of each school year. We will only choose schools whose city or county policy provides such generic consent.


Study Method

Because of participants’ age and the need for close observation of their interaction with the computer, students in the study will be tested individually. Each student will be paired with a two-person facilitator/observer team from Fulcrum IT. We anticipate three teams, each consisting of one facilitator and one observer, working simultaneously, each with a different student. NCES personnel or designees may also observe the interview process.


The students will be welcomed and introduced to the facilitator and the observer, and then told that they are there to help answer questions about how fourth graders use computers to write. Students will be reassured that their participation is voluntary and that their names or other personally identifiable information will not be recorded (see Volume II).


Study Design

The study consists of several components:


  • First, students will be asked a limited number of questions regarding any typing instruction they have received and the amount of time they spend on a computer at home or at school.

  • Next, students will be shown paper designs of three versions of WCBA prompt/response screens and asked to choose their preference.

    • Design A is the existing WCBA platform used at grades 8 and 12, while Designs B and C modify that platform to include larger buttons for some controls, such as the highlighter and speak-aloud tools, as well as different locations or labels for navigational features, such as the arrow buttons that open and close the writing panel (the paper designs are provided in Volume II, Part III).

  • The students will then undertake a series of tasks on two of the WCBA computer systems. The first system will be the design selected by the student, and the second will be a design selected by the facilitator in order to achieve an even distribution of system testing.

    • The tasks are designed to test the features and functionality of the systems as the student interacts with the interface screens, buttons, word processing features, and background questions of the different designs.

    • As they complete the tasks, the students will be allowed to ask questions at any point; their questions are an important part of what the observer will capture in notes. The trained facilitator will encourage each student to explore and discover the system on his or her own, so that the team can observe and record how the student interacts with the design without being told how to do something. The facilitator will ask questions intended to reveal distinct user preferences or usability problems related to the interface design.

    • After each design try-out, the student will rate the design using a paper survey.

  • After the second design try-out and rating, the student will also complete a short paper survey comparing the two designs.

  • Finally, the study will conclude with a 5-minute typing test using TypingMaster Pro licensed software2 to determine the student’s typing speed and keystroke accuracy.


The three WCBA computer systems and the TypingMaster Pro licensed software will be installed on four Dell laptop computers like those used in the 2011 WCBA assessment. While each individual student will work with only two of the three designs, all designs will be tested equally across students.


The WCBA computer systems will include sample writing prompts and background questions from prior NAEP paper-based writing assessments as a means of testing how the students use the application and its features (refer to Volume II, Part IV-A1a for the sample background questions being used in the tryouts). The usability study will not collect any personally identifiable information, nor will it evaluate either the appropriateness or completeness of the responses.


Observers will rate each task on a scale of zero to three, documenting whether the student had no difficulty (0), few difficulties (1), many difficulties (2), or could not accomplish the task at all (3). These ratings will be summarized in a report to NCES, along with the observer notes about students’ behavior as they undertake the tasks, information from the computer usage surveys, the students’ ratings and comparison surveys, and the results of the typing tests.

  4. Data Collection Instruments

The following instruments, which will be used to gather data, are included in Volume II of this submittal and are referenced by part number below.

  1. Welcome Script and Background Information (Part I). The facilitator will welcome the student and fill out general information without input from the student.

  2. Student Oral Survey (Part II). Students will answer questions verbally related to their computer usage, keyboarding experience, and instruction received. The facilitator will ask the questions and the observer will record the student’s answers.

  3. Student Paper Design Preference (Part III). The students will be shown paper versions of the three designs and asked specific questions as to which one seems the most familiar to use. The facilitator will ask the questions and the observer will record the student’s answers. The writing prompt/typing response screen will be shown for each design (see Vol. II, Part III-A for screenshots of Design A, B, and C).

  4. Student Tasks on Interfaces (Part IV-A1 and Part IV-B1). Students will then be asked to perform various tasks, first on the design of their choice, and then on a second design, selected by the facilitator. The facilitator will direct the student, and the observer will document student behavior and difficulties. After each interface review, students will complete a Student Survey (Part IV-A2 and Part IV-B2) on paper to record their design preferences and comments, providing, on a scale of one (positive) to five (negative), their perceptions about the two designs they tested.

  5. Student Design Comparison Survey (Part V). After completion of tasks on both designs, students will fill out a final paper survey asking them to compare the two designs by checking boxes in a rating table and providing four short responses to indicate what they liked the best and what they liked the least about each of the two designs.

  6. Typing Test (Part VI). The students will then take a 5-minute test, typing a specific passage created for this study and using TypingMaster Pro licensed software on the laptops. The software will record keystrokes, error rates, and typing speed for all of the students using the same test.
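For reference, typing speed is conventionally reported in words per minute, with a "word" standardized to five keystrokes; under that convention, a student who typed, say, 750 keystrokes during the 5-minute test would have a gross speed of (750 ÷ 5) ÷ 5 = 30 words per minute, before accounting for errors. (The 750-keystroke figure is purely illustrative, and the specific metrics TypingMaster Pro reports may be defined somewhat differently.)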

Student information across the different instruments will be linked by a unique identifying number. As such, no personally identifiable information will be collected or stored for the students.


  5. Consultations outside the Agency

Fulcrum IT Services Company LLC (Fulcrum IT)

Fulcrum IT is the NAEP contractor responsible for the development and ongoing support of NAEP computer-based assessments for NCES.


NAEP State Coordinators

The NAEP State Coordinator serves as the liaison between the state education agency and NAEP, coordinating NAEP activities in his or her state.

  6. Paying Respondents

Students will be offered refreshments, such as individual juice bottles, pretzels, and pizza. Coordination with the school and teacher will ensure that no student receives food to which they are allergic. The five participating schools will each be offered a $50 gift card to a major office/school supply store as a token of appreciation. This practice has proven effective in recruiting subjects to participate in similar research.

  7. Assurance of Confidentiality

NCES has policies and procedures that ensure privacy, security, and confidentiality, in compliance with the Confidential Information Protection provisions of Title V, Subtitle A, of Public Law 107-347 and with the Education Sciences Reform Act of 2002 (20 U.S.C. §9573). This legislation ensures that the security and confidentiality policies and procedures of all NCES studies, including the NAEP project, comply with the Privacy Act of 1974 and its amendments, NCES confidentiality procedures, and the Department of Education ADP Security Manual.


No student personally identifiable information will be recorded during this study to ensure that the confidentiality of participants is maintained.

  8. Justification for Sensitive Questions

No sensitive questions will be asked.

  9. Estimate of Hourly Burden

Burden to the students is estimated to be 75 minutes per student, including the time to answer questions, type on the computer, finish specific tasks, and complete the student surveys. Burden to the school administrators is split between the initial recruitment (estimated at 15 minutes for each of 20 schools) and coordination of arrangements for the five participating schools (estimated at up to two administrators per school and one hour per administrator). Therefore, the estimated respondent burden is:


Respondent | Number of Responses | Hours per Respondent | Number of Respondents | Total
School administrators (all recruited schools) | 20 | 0.25 | 20 | 5 hours
School administrators (participating schools) | 10 | 1.00 | 10 | 10 hours
Grade 4 students | 60 | 1.25 | 60 | 75 hours
Total | 90 |  | 80 | 90 hours
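Each total is the number of responses multiplied by the hours per respondent: for example, the 60 grade 4 students at 1.25 hours (75 minutes) each account for 60 × 1.25 = 75 hours, and the 20 recruitment contacts at 0.25 hours each account for 20 × 0.25 = 5 hours, which together with the 10 coordination hours yield the 90-hour total.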

  10. Cost to Federal Government

The following table provides the overall project cost estimates:


Activity | Provider | Estimated Cost
Prepare the prototype interface designs and the survey instruments; recruit and coordinate with five participating schools; perform the testing; record, analyze, and report the results | Fulcrum IT | $86,400
Refreshments and school incentive | Fulcrum IT | $750
Total |  | $87,150

  11. Project Schedule


Date | Event
January-February 2011 | Upon OMB approval, recruit schools willing to participate in the study. Develop Designs B and C for use in the usability test.
February-March 2011 | Conduct usability study. Analyze results of surveys and typing tests. Submit report to NCES.



Appendix A: Sample Confirmation of Participation Letter to Schools

Hello [School Administrator],

[NAME of NAEP State Coordinator] forwarded your name to me so that we can work out arrangements for my colleagues and me to visit [NAME OF SCHOOL] on or about [DATES]. We would like to meet with twelve grade 4 students to let them test our new system for assessing writing on laptop computers.


We would like to meet with the students for up to 75 minutes, but we are flexible depending on your class schedules. We will bring the laptop computers and set up everything that we need. We will also bring snacks, drinks, and pizza if that is acceptable to the class and teacher.


While we will ask the students to do a practice writing response on the computer, we are not testing their writing abilities, evaluating their responses, or gathering any personal information from them.


The purpose of this study is to test our system, including how well students can use the laptops, mouse, headphones, etc., and to give the students a short typing test to see how quickly students at this grade level can type. As such, this is a usability test of our system and software, and is in no way a test of the students. We are collecting information for the purpose of revising our system design based on observation of and feedback from the students. The study results will have a significant impact on the design of the grade 4 computer-based assessments to be used with students across the country in the National Assessment of Educational Progress (NAEP).


Our company has been contracted by the National Center for Education Statistics (NCES), part of the Department of Education, to carry out this work. This study is authorized by law under the Education Sciences Reform Act of 2002, 20 U.S.C. §9543. Your participation is voluntary. All responses that relate to or describe identifiable characteristics of individuals may be used only for statistical purposes and may not be disclosed, or used, in identifiable form for any other purpose, unless otherwise compelled by law (20 U.S.C. §9573).


I will call you within the next few days to determine which dates might work for you and to answer any questions you might have. Feel free also to contact me by email or phone, as indicated below. We greatly appreciate your school’s cooperation, and will offer a $50 gift card to a school supply store as a token of our appreciation. Thank you for your help.

Respectfully,


[Fulcrum IT Representative]

1 Writing Framework for the 2011 National Assessment of Educational Progress. (2010, September). National Assessment Governing Board website: http://www.nagb.org/publications/frameworks/writing-2011.pdf

2 NCES owns a multiple-use license of this software and it has been used in previous studies.
