National Assessment of Educational Progress (NAEP) Digitally Based Assessments (DBA) Usability Study 2017-18

NCES Cognitive, Pilot, and Field Test Studies System

Volume II - NAEP DBA Usability Studies 2017-18

OMB: 1850-0803




National Center for Education Statistics

National Assessment of Educational Progress



Volume II

Interview and User Testing Instruments





National Assessment of Educational Progress (NAEP) Digitally Based Assessments (DBA) Usability Study 2017-18



OMB# 1850-0803 v.181











November 2016


Paperwork Burden Statement

The Paperwork Reduction Act burden statement and the NCES confidentiality statement appear below. Appropriate sections of this information are included in the consent forms and letters, and the statements will be included in the materials used in the study.



Paperwork Burden Statement, OMB Information

According to the Paperwork Reduction Act of 1995, no persons are required to respond to a collection of information unless such collection displays a valid OMB control number. The valid OMB control number for this voluntary information collection is 1850-0803. The time required to complete this information collection is estimated to average 75 minutes including the time to review instructions and complete and review the information collection. If you have any comments concerning the accuracy of the time estimate(s) or suggestions for improving this form, please write to: National Assessment of Educational Progress (NAEP), National Center for Education Statistics (NCES), 550 12th St., SW, 4th floor, Washington, DC 20202.



This is a project of the National Center for Education Statistics (NCES), part of the Institute of Education Sciences, within the U.S. Department of Education.



Your answers may be used only for research purposes and may not be disclosed, or used, in identifiable form for any other purpose except as required by law [Education Sciences Reform Act of 2002, 20 U.S.C. §9573]. OMB No. 1850-0803



OMB # 1850-0803 Approval Expires 07/31/2019

  1. Participant ID and Welcome Script

The Fulcrum interviewer will complete the participant ID information below prior to starting the user testing session:

Information        Observer Notes

Grade

School Name

Welcome Script

Text in italics is suggested content with which the facilitator should be thoroughly familiar in advance.

The facilitator should project a warm and reassuring manner toward the students and use conversational language to develop a friendly rapport.



Hello, my name is [Name of Facilitator] and I work for [Company]. We are carrying out a study for the National Center for Education Statistics (NCES) within the U.S. Department of Education. Thank you for helping us with our study today.

The Department of Education regularly conducts assessments for students in grades 4, 8, and 12 on subjects like mathematics and reading. These assessments are used to see how well our education system is doing, and allow us to compare performance between different states. These assessments used to be given using paper and pencil, then some were given on laptop computers, and now we are using touch-screen tablets. We are creating many new questions and tools for these assessments and need to know if they will work before we send them out to be used in an assessment. We don’t want students to start the assessment and then have no idea what to do! My job is to try out these new questions and tools with fourth-, eighth-, and twelfth-graders to find out if they are easy to understand and use.

This is where you come in. You were selected to participate in our study because you are a [grade level]-grader and will be able to give us honest and insightful feedback on these new items and tools. We have created a made-up test with all the different tools we need to evaluate. We are going to walk you through this practice test to see how easy it is to understand and use. We will ask you to perform specific tasks, and will write down what you do. We will be asking you questions like “What do you think this symbol means?” or “How would you go about doing this?” or “How easy do you think that task was?” Even though you will be answering test questions, we are not focusing on whether you get them right or not. We don’t even know the answers to some of them ourselves.

We won’t be scoring any of the questions, but we will use your opinions and results without your name, along with the ideas and answers of other students, to make changes and improve the system for [insert grade]-graders all across the country.

Your participation is voluntary and if at any time you decide you do not want to continue, you may stop. Do you have any questions?

[Answer questions as appropriate]

Let’s begin.

  2. Computer and Tablet Familiarity Survey

The Computer and Tablet Familiarity Survey will be given to student participants at the beginning of the user testing session and after the welcome. The facilitator will read the questions and answer options aloud. Clarifying language will be used to help guide the students to make an accurate response. Some examples of this type of language are “Well, do you use a tablet every day?” and “How about the last time you used a computer? When was that?” These questions are also utilized before testing to help build rapport between the facilitator and the participant.

I have a couple of questions for you about your experience with computers, tablets, and mobile phones, either in or out of school:

How often do you… (1 = Not at all, 2 = Rarely, 3 = Sometimes, 4 = Frequently, 5 = All the time)

...spend time on a computer (desktop or laptop) in a typical day?   1   2   3   4   5

...spend time on a tablet computer in a typical day?   1   2   3   4   5

...make or receive calls on a cellular telephone in a typical day?   1   2   3   4   5

...send or receive texts on a cellular telephone in a typical day?   1   2   3   4   5

...send or receive emails on a cellular telephone in a typical day?   1   2   3   4   5

...use the internet on a cellular telephone in a typical day?   1   2   3   4   5

...use apps on a cellular telephone in a typical day?   1   2   3   4   5







How much do you know about… (1 = Very little, 2 = Little, 3 = Some, 4 = More than most, 5 = Expert)

...using computers?   1   2   3   4   5

...using cellular telephones?   1   2   3   4   5



  3. Ease-of-Use Rating Survey/Comments

The following Likert-type item will be administered at the conclusion of each task during user testing:

1 = Very Difficult   2 = Somewhat Difficult   3 = Can’t Decide   4 = Somewhat Easy   5 = Very Easy

Initially, the facilitator will use the following script to prompt a response for the Ease-of-Use scale.

I’d like to ask you about what you just did. How easy was it for you to figure out what to do? Was it easy, difficult, or somewhere in the middle?

If the participant answers to the effect of “easy,” the facilitator will then ask, “Was it somewhat easy, or very easy?”

If the participant answers to the effect of “difficult,” the facilitator will then ask, “Was it somewhat difficult, or very difficult?”

If the participant answers to the effect of “in the middle,” the facilitator will then ask, “Was it a little bit easy, a little bit difficult, or is it hard to decide?”

These prompts will be used initially and as needed by participants, but may be omitted for some students as they become accustomed to responding to the Ease-of-Use scale.

  4. User Testing Scenarios

Description

The facilitator will set up items on the tablet between each task. Some tasks will be completed using the eNAEP system, while others will be completed using prototypes that require switching from the eNAEP system to a web browser.

Participants will be given specific tasks to complete using the tablet. They will be asked to explain what they are doing and why they are doing it while completing the tasks. If participants fail to provide sufficient narration of their actions, they will be prompted by the facilitator using questions such as, “What are you doing now?” and “Can you tell me why you did that?”

If a participant performs an action that did not have the desired effect, the facilitator may say, “Hmm, that didn’t work. What else could you try?” in order to get a deeper understanding of the student’s mindset regarding a task.

If participants are unable to complete a particular task after a couple of attempts, or are not willing to try because they don’t know how, they will be told how to perform that task element before moving on to the next item.

The task instructions for all items will fall into one of the four following types:

  1. Identify function of indicated control (e.g., “What do you think happens when you click on this?”)

  2. Identify control to achieve indicated function (e.g., “Tell me how you might change your answer.”)

  3. Perform task (e.g., “Please read the instructions and complete this item.”)

  4. Provide feedback (e.g., “We have tried doing this with two different buttons. Which one do you think we should use on our assessments?”)

The tasks described in this section are examples of the types of tasks that students will be performing, and do not constitute a comprehensive list. For each user testing session, the actual tasks, as well as their order of presentation, will be determined by the nature of the interactions being studied and by the counterbalancing needed for accurate interpretation of the results.


The following are descriptions of components of the user testing scenarios:

Task instructions –– The task instructions (shown in italic text) let the participant know what they should try to accomplish. For example, in the first sample task below, the participant is asked to get the tablet ready to take the test. These instructions are often intentionally vague to see how intuitive the process is for the student.

Task step –– A task step (also frequently called a “task element” and shown in the left column of the table below) is a specific action that is required for completion of a task. Students will not be told what task steps are required for completion of a task. As mentioned earlier, prompts may be used to facilitate task completion if the participant is unlikely to do so without them. However, the need for prompts is part of the data we will be collecting, so they will be provided only at certain points for each task step.

Control mechanism –– The control mechanism (shown in the right column of the task tables below) is a part of the interface on the tablet that the user interacts with to complete a task step. Examples of control mechanisms are the NEXT button, the volume control (both hardware and software), and the highlighter tool. The control mechanism, like the task step, is not described or mentioned to the participants. It is included here as an indicator of what is being tested. The task completion data, collected at the control mechanism level, will provide the high level of detail needed to give item developers sufficient feedback to maximize the usability of their items for actual NAEP assessments.

The tasks in the following scenarios represent a sample of the types of tasks that participants will be asked to complete using the tablet. Each user testing session will comprise a different series of tasks and instructions depending on the availability and development stage of different assessment mechanisms in the eNAEP system.


4.1 Sample Task 1 – Scratchwork

[Participant is presented with the following screen.]

Task Instructions: Let’s say you decided the word “homework” was really important, and wanted to circle it. Can you please circle the word, and then show me how you would erase your markings?



Task Step                       Control Mechanism

Circles word                    Pencil or highlighter

Selects eraser or erases all    Trackpad, stylus, or finger

Erases text                     Eraser or erase all

[Facilitator administers the Ease-of-Use scale.]

4.2 Sample Task 2 – Text to Speech

[Participant is presented with the following screen.]

Task Instructions: This one appears to be a math item. Let’s say you were confused about what exactly the question was asking, or didn’t know one of the words. Do you see a way to have the text be read to you in the controls?



Task Step                    Control Mechanism

Activates text-to-speech     Text-to-speech button

Activates text to be read    Box surrounding words



[Facilitator administers the Ease-of-Use scale.]

4.3 Sample Task 3 – Inline Choice

Task Instructions: Okay, on this screen, just go ahead and answer the question how you think it should be answered.

Task Step                          Control Mechanism

Selects answer from drop-downs     Inline choice mechanism



[Facilitator administers the Ease-of-Use scale.]

  5. Exit Questions

Questions like the examples below will be asked after completion of all tasks.



Instructions: I would like to get your opinion about everything we’ve done so far. (Responses will be transcribed by the researcher.)

  1. Now you’ve completed items using the mouse, the trackpad, a stylus, and your finger. Which one did you like best?

  2. Why?

  3. How about your second favorite?

  4. And your least favorite was?

  5. We used tabs to move through test items. Some were on the left side, some were on the right side, and others were on the bottom. Which side do you think makes the most sense?

  6. Did you like using the stylus?

    1. What kinds of things on the test was the stylus good for?

  7. How would you rate this test on a tablet? Remember, we are not thinking about how hard the questions are to answer, but how hard or easy it is to move around and do things required for the test on this tablet. Was it easy, difficult, or somewhere in the middle? (Continue the Ease-of-Use prompts to get final rating.)

  8. If you were going to be taking one of our tests, which way would you want to do it? On paper, on a tablet, or on a laptop computer?

  9. What would be the best thing about taking a test using the tablet?

  10. Would there be any bad things about taking a test using the tablet? (Prompt for what.)

  11. Do you have any other comments for us about giving our tests to kids using tablets?

Thank you for helping us to improve our test.

  6. Thanks and Dismissal

At the conclusion of all testing, students will be thanked for their time and told that they may keep the pair of earbuds as a small reward for participation. For testing sessions outside of schools, a gift card will also be given to participating students, and to parents/guardians if they provided transportation for the student. Students will then be dismissed according to the procedure at the school or testing location.


