

National Center for Education Statistics

National Assessment of Educational Progress



Volume I

Supporting Statement



NAEP Survey Assessments Innovations Lab (SAIL)

Pretesting Activities: Virtual World for English Language Arts Assessment


OMB# 1850-0803 v.170

Revision to a previously approved package (1850-0803 v.106)






June 19, 2014

(rev. July 15, 2014)

Revised September 2016




Table of Contents

1. Submittal-Related Information
2. Background and Study Rationale
3. Sampling and Recruitment Plans
4. Data Collection Process
5. Consultations Outside the Agency
6. Assurance of Confidentiality
7. Justification for Sensitive Questions
8. Estimate of Hourly Burden
9. Estimate of Costs for Recruiting and Paying Respondents
10. Costs to Federal Government
11. Schedule



  1. Submittal-Related Information

This material is being submitted under the generic National Center for Education Statistics (NCES) clearance agreement (OMB #1850-0803), which provides for NCES to conduct various procedures (such as field tests and cognitive interviews) to test new methodologies, question types, or delivery methods to improve assessment instruments.

The original request to conduct the NAEP SAIL Pretesting Activities: Virtual World for English Language Arts Assessment was approved on July 17, 2014 (OMB# 1850-0803 v.106). This updated submission requests an increase in the length of the cognitive interviews and an expanded timeframe in which the approved activities are to be conducted.

  2. Background and Study Rationale

The National Assessment of Educational Progress (NAEP) is a federally authorized survey of student achievement at grades 4, 8, and 12 in various subject areas, such as mathematics, reading, writing, science, U.S. history, civics, geography, economics, and the arts. NAEP is administered by NCES, part of the Institute of Education Sciences in the U.S. Department of Education. NAEP’s primary purpose is to assess student achievement in the various subject areas and to collect survey questionnaire (i.e., non-cognitive) data to provide context for the reporting and interpretation of assessment results.

As part of NAEP’s development process, systems of delivery and assessment items are pretested on smaller numbers of respondents before they are administered to larger samples in pilot or operational administrations. The NAEP Survey Assessments Innovations Lab (SAIL) initiative is a research program set up to explore the potential value to NAEP of developing innovative technology-based item types. This project, for which we seek approval to conduct empirical research studies, is the SAIL Virtual World for English Language Arts (ELA) Assessment. The project will help us develop systems or tools that can be used as a platform for assessment activities or tasks. Its goal is to examine measurement of new conceptions of combined ELA reading and writing proficiency through the use of a virtual, immersive “world” that allows assessment of information gathering, processing, and evaluation skills (i.e., multiple-source inquiry and research) in more naturalistic ways. Through the use of a rich, computer-simulated environment, problems can be framed that require students to “read” and evaluate multiple types/genres of information (e.g., reports, newspaper articles, blogs, emails, and videos) and to compose expressive communication (i.e., “writing” understood to include the integration of text, images, and data, directed to a specific audience). The aim is to leverage new technologies emerging from artificial intelligence (e.g., games and simulations) for an assessment task design that: 1) improves measurement of the information processing and evaluation components in the ELA-related frameworks; and 2) provides opportunities to measure more integrated constructs (e.g., reading, writing, and research skills) as they are ideally taught and applied in the real world.

The Virtual World for ELA is envisaged as a computer- or tablet-based interactive narrative environment in which students can gather, interact with, and analyze multiple types of information resources, and synthesize those resources into a coherent response to a question or problem. The response is addressed to a specific audience using information resources which can be obtained from environments with realistic qualities, such as a Web search engine, library catalog, historical archive, or conversations with various experts and laypeople. The research studies proposed here are designed to gather information about how this platform is used by students, the types of activities that we can have students perform with them, and the kinds of information they can provide for assessment purposes. The information gathered from the proposed research studies will feed into the iterative development of these interactive systems.

As part of the SAIL research and development process, the systems will be put through iterative testing activities, including play testing, cognitive interviews, and tryouts. These iterative testing phases are especially important given unknown factors associated with these platforms for innovative technology-based items. NCES has contracted with Educational Testing Service (ETS) to develop the platform and associated items and to carry out the necessary studies.

Volume I describes the design, data collection, burden, cost, and schedules of the research activities for the aforementioned projects; Volume I Appendices provide recruitment and communication materials; and Volume II provides protocols and questions used in the research sessions.


Types of Research Methods

The following sections describe the different types of research methodologies that will be used.

Play Testing

In play testing, a process adapted from the game-design industry, a diverse set of students in small teams of two to four will work through and discuss activities, problems, and tasks with one another. An observer/facilitator will give overviews of the activities to students and provide guidance on what students should reflect on. Play testing will take place early in the test development process using preliminary versions of the virtual systems. The purpose of play testing is to gather student views on early versions of the interactive technology and begin to understand the range of ways that students use them. The primary goal here is evaluating and refining the platform and activities.

During play testing, students will be encouraged to talk together about issues they confront, while observers note reactions to and potential problems with content or format. Observers will query students to draw them out, facilitate deeper reactions, or probe areas of possible confusion. Through play testing, researchers will be able to identify construct-irrelevant features in tasks, such as inaccessible language, difficult interactions, or uninteresting or unfamiliar scenarios or activities that result in poor student engagement. Play testing early in the research and development cycle allows for refinements to the system that can be tested in subsequent, more intensive cognitive interviews.

Cognitive Interviews (used in the middle phases of the research project)

In cognitive interviews (often referred to as a cognitive laboratory study or cog lab), an interviewer uses a structured protocol in a one-on-one interview drawing on methods from cognitive science. The objective is to explore how students are thinking and what cognitive processes they are using as they work through tasks. The primary goal here is to understand how students think with the systems, and explore what kinds of evidence of student cognition the systems can elicit.

Two methods will be combined: think-aloud interviewing and verbal probing techniques. With think-aloud interviewing, respondents are explicitly instructed to "think-aloud" (i.e., describe what they are thinking) as they work through questions or tasks. With verbal probing techniques, the interviewer asks probing questions, as necessary, to clarify points that are not evident from the “think-aloud” process, or to explore additional issues that have been identified a priori as being of particular interest. This combination of allowing students to verbalize their thought processes in an unconstrained way, supplemented by specific and targeted probes from the interviewer, has proven to be productive in previous NAEP pretesting1 and will be the primary approach in the NAEP cognitive interviews described in this package.

Cognitive interview studies produce largely qualitative data in the form of verbalizations made by students during the think-aloud phase or in response to the interviewer probes. Some informal observations of behavior are also gathered, since typically a second observer is involved, in addition to the interviewer. Behavioral observations may include such things as nonverbal indicators of affect, suggesting emotional states such as frustration or engagement, and interactions with the task, such as ineffectual or repeated actions suggesting misunderstanding or usability issues.

Small-Scale Tryouts (used in the last phase of the project)

During small-scale tryouts, students work uninterrupted through a selected set of draft activities, problems, or tasks. The strength of using a tryout methodology on a small scale is that it allows data to be gathered about student responses and actions during normal, uninterrupted performance. This approach provides a small-scale snapshot of the ranges of responses and actions that the systems are meant to elicit, but with fewer resource implications than formal piloting. Previous experience, for example with the NAEP Technology Engineering Literacy Assessment2, shows that tryout-based insights are very informative, especially for the refinement of scoring rubrics (e.g., for examining, characterizing, and grouping the types of actions and responses that students provide and allocating appropriate scoring levels accordingly) and for finalizing or revising decisions about student actions that are to be captured.

NAEP SAIL Research & Development: Technology-Based

Given that SAIL projects involve technology-based platforms, all of the research activities will be conducted using technology (e.g., tablet or computer; game-like control devices)3. Play testing will use preliminary versions of the systems, cognitive interviews will be conducted using interim versions, and small-scale tryouts will be conducted on more fully developed versions towards the end of the project.

  3. Sampling and Recruitment Plans

Play Testing Studies

Students will be recruited from districts that are located near the ETS campus, in Princeton, New Jersey, for scheduling efficiency and flexibility. ETS will recruit students, representing a range of demographic groups, using existing ETS contacts with individual parents/guardians, as well as administrators, teachers, and staff at local urban and suburban schools and afterschool programs for students. In some cases, ETS will directly contact parents/guardians of students who have previously participated in ETS research and who are known to fit the targeted range of grade level, gender, race/ethnicity, socioeconomic background, and district type (urban, suburban, rural). In other cases, flyers, e-mails, or letters will be used to contact parents/guardians and administrators or teachers/staff. School administrators/teachers may be asked to distribute paper flyers and consent forms to students and parents. During these communications, the parent/guardian will be informed about the objectives, purpose, and participation requirements of the data collection effort, as well as the activities that it entails. Confirmation e-mails and/or letters will be sent to participants. Only after ETS has obtained written consent from the parent/guardian will the student be allowed to participate in a play testing session. See appendices A-I for representative recruitment, confirmation, consent, and thank you materials.

A small number of students will participate in play testing. A small sample is sufficient at the play testing stage given that the key purpose is to identify usability errors and other construct-irrelevant issues.4 For the SAIL Virtual World for ELA, 25 students total from 8th or 9th grades will be recruited, some of whom will test the virtual world environment and tools in various locations, and some of whom will test a scenario-based task built in the virtual world.

Cognitive Laboratories

For the cognitive laboratories/cognitive interviews (sometimes called cog labs), ETS staff will recruit students from the following demographic populations:

  • A mix of race/ethnicity (Black, Asian, White, Hispanic, etc.);

  • A mix of socioeconomic background; and

  • A mix of urban/suburban/rural district types.

Although the sample will include a mix of student characteristics, due to the small sample sizes the results will not explicitly measure differences by those characteristics. The recruitment process for the cognitive interviews will be the same as described above for play testing. The materials provided in Appendices A-I will also be used in the cognitive interview recruitment process. For the SAIL Virtual World cognitive interviews, 15 students total from 8th/9th grades will be recruited.

Several researchers have confirmed that five participants is the minimum standard for analysis in exploratory cognitive interviewing.5 Thus, the number we propose should be sufficient for cognitive interviews, given that the tasks involve some complexity.

Small-scale Tryouts for Scenario-Based Tasks

Tryouts, toward the end of the project, will involve larger numbers of students. Recruitment for tryouts will include identifying a school or schools to work with for conducting the tryouts (see appendices J and K). However, if the larger number of students required cannot be obtained through school recruitment alone, we may also use some of the recruitment methods described for play testing and cog labs (see appendices L-O, P2, Q-S).

Although the sample will include a mix of student characteristics, the results will not explicitly measure differences by those characteristics, since even with the slightly higher sample numbers there would not be enough statistical power to do so. For the SAIL Virtual World for ELA tryouts, a minimum of 100 and a maximum of 200 students total from 8th/9th grades will be recruited. Because we anticipate that not all consent forms will be returned, we will attempt to recruit 200 students in order to achieve a minimum sample of 100. If data from 100 students cannot be obtained through these recruitment methods, we may use individual contacts with parents to recruit additional students for testing on campus. For example, if data can be collected from 80 students at a school, we may conduct 20 additional individual tryout sessions at the ETS campus.

Table 1 summarizes the number of students for the play testing, cognitive interviews, and tryout components of the cognitive pretesting activities.

Table 1. Sample Size: Cognitive Pretest Activities (Play Testing, Cognitive Interviews, Tryouts)6

SAIL: Virtual World for ELA Pretesting Activity    Number of Students (Grade 8/9)
Play Testing                                       25
Cognitive Interview                                15
Tryouts                                            200*
Total                                              240

* Maximum number

  4. Data Collection Process

Play Testing

Play testing will take place at the ETS campus, in one of three dedicated research laboratories that are set up with recording equipment and working space for observers, facilitators, and one or more students (suitable for individual or small group sessions). Participants will first be welcomed and introduced to the facilitators/observers (assessment specialists, cognitive scientists, research assistants/associates, or task designers), and will be assured that their participation is voluntary and that without the permission of their parent or guardian, their answers may be used only for research purposes and may not be disclosed, or used, in identifiable form for any other purpose except as required by law [Education Sciences Reform Act of 2002 (20 U.S.C. §9573)]. Participants will also be reminded that with their parent or guardian’s permission some responses or clips from videos may be selected for use in research reports or presentations. Observers will then give an overview of the planned activities to students and provide guidance about what students should focus on. Observers will take notes on what students say and the sessions will be audio recorded. In addition, where feasible, screen-capture will be used to record the actions occurring on the screen; note that this screen recording will not provide any identifiable data about the student. If log file capture is available, all student actions with the system will also be recorded in a data file; this will not provide any identifiable data, since students will be coded with an anonymous ID number. If the system is implemented on a tablet device, the project will also use digital video capture of the student interacting with the system, since their touch-based interactions with the system will be an important part of the data. Parents and students will be informed about the video recordings prior to the sessions and their informed consent will be part of the criteria for participation.

For the most part, students will be allowed to explore and interact with the mocked-up task and activities by themselves with little intrusion on the part of the observer. However, at a few strategic points, observers may introduce questions meant to explore students’ reactions to the task, areas of confusion, and ways of thinking about answers to the questions in the tasks and/or items. Examples of such questions are:

  • Did you find the problem in this task interesting – why or why not?

  • Are there any aspects of this that are confusing? Did you understand that part?

  • How would you answer this question/How would you do this activity? [Ask different group members if their approaches would differ].

  • How could this task/activity/system be improved? Could it be clearer? Could it be more interesting? Could it be easier to interact with?

Prior to each play testing session, ETS staff may identify some key focus areas for activity or for the system that students will be using. If students do not provide sufficient comments on targeted parts, an observer may ask a group of students if they had any thoughts about the particular sections, using questions such as those described above, but focused on specific places or issues in the task or activities or system. See Volume II, Part B for the protocol used in the study.

Analysis Plan

Feedback from a play testing session is immediate and can be evaluated after the session. Notes from the observers in each session will be aggregated; one aggregate document will be produced for each task or set of items that are observed, with all observers contributing their observations to this common document. Since play testing is a more informal process that generates relatively unstructured information, no formal quantitative analyses of these data will be performed, and the qualitative analyses will seek to pick out themes or individual observations that are important for the goals of developing the system or tasks going forward.

Cognitive Laboratories

Cognitive interviews will take place at the ETS campus, in one of the dedicated research laboratories described in the section on play testing above. All sessions will be individual, and will be attended by two ETS staff (typically a facilitator/interviewer and an additional observer).

Participants will be welcomed, introduced to the interviewer and the observer, and told they are there to help develop and try out new kinds of systems for gathering information about students. They will be assured that their participation is voluntary and that without the permission of their parent or guardian, their answers may be used only for research purposes and may not be disclosed, or used, in identifiable form for any other purpose except as required by law [Education Sciences Reform Act of 2002 (20 U.S.C. §9573)]. Participants will also be reminded that with their parent or guardian’s permission some responses or clips from videos may be selected for use in research reports or presentations. Interviewers will explain the think-aloud process (Volume II, Part C II a.) and conduct a practice session with a sample question.

On completion of the think-aloud component, the interviewer will proceed with follow-up questions (examples can be found in Volume II, Part C-III). In this verbal probing component, the interviewer asks the student targeted questions about specific aspects of knowledge, skill, or ability that the task is attempting to measure, so that the interviewer can collect more information on the strategies and reasoning that the student employed as he or she worked through the task. The targeted questions will be generated for each task prior to testing. The interviewer is also encouraged to raise additional issues that became evident during the course of the interview. For example, if a student paused for a long time over a particular section, appeared to be frustrated at any point, or indicated sudden realization, the interviewer might probe these kinds of observations further, to find out what was going on. The interviews will be based on the protocol structures described in Volume II, Part C. Students will also complete a brief questionnaire regarding demographic information, computer experience, and familiarity with the research skills tested in the assessment.

As with the play testing sessions, observers will take notes on what students say, and the student’s think-aloud verbalizations will be captured using digital audio recording. Where feasible, screen-capture software (e.g., Camtasia, CamStudio) will be used to record the actions occurring on the screen, and if a log file (i.e., a digital record of all interactions with the system) capture is available, student actions with the system will also be recorded in a data file (neither of which will produce identifiable data about the student). These recordings can be replayed or analyzed later, to see how a given student progressed through the task and what actions they took. Further, digital video capture may be used to record students’ touch-based interactions with a tablet device. The combination of the screen-capture and the video is important to determine all of the actions a student may have made that did not result in a change on the screen (e.g., unsuccessfully attempting to apply an interactive gesture that was not recognized by the system or attempting to interact with a non-interactive element). Interviewers will also record their own notes separately, including behaviors (e.g., the participant appeared confused), whether extra time was needed, whether prior knowledge was evident, and so on. Parents and students will be informed about the video recordings prior to the sessions and their informed consent to being recorded will be part of the criteria for participation.

Analysis Plans

For the cognitive interview data collections, documentation will be grouped at the activity level. The types of data collected about each activity will include

  • think-aloud verbal reports;

  • behavioral data (e.g., actions observable from screen-capture or video of student);

  • responses to generic questions prompting students to think out loud;

  • responses to targeted questions specific to the activity;

  • responses to post-task questionnaire;

  • additional volunteered participant comments; and

  • debriefing questions.

The general analysis approach will be to compile the different types of data to facilitate identification of patterns of responses for specific tasks or activities, such as patterns of frequency counts of verbal report codes and of responses to probes or debriefing questions, or types of actions observed from students at specific points in a given task. This overall approach will help to ensure that the data are analyzed in a thorough and systematic way that enhances identification of problems with the systems or tasks and supports recommendations for addressing those problems.
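To illustrate the kind of compilation described above, the following is a minimal Python sketch that tallies verbal-report codes by task. The task names, code labels, and records are hypothetical assumptions for illustration only and are not part of the approved protocol.

    from collections import Counter

    # Hypothetical coded segments from think-aloud transcripts: each record
    # pairs a task identifier with the verbal-report code assigned by an
    # analyst. Both the task names and code labels are illustrative only.
    coded_segments = [
        {"task": "library_search", "code": "evaluates_source"},
        {"task": "library_search", "code": "misreads_prompt"},
        {"task": "library_search", "code": "evaluates_source"},
        {"task": "expert_interview", "code": "integrates_sources"},
    ]

    # Frequency counts of verbal-report codes, grouped by task.
    counts_by_task = {}
    for segment in coded_segments:
        counts_by_task.setdefault(segment["task"], Counter())[segment["code"]] += 1

    for task, counts in counts_by_task.items():
        print(task)
        for code, count in counts.most_common():
            print(f"  {code}: {count}")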

Small-Scale Tryouts

These studies will be conducted by ETS in classroom settings during the school day7. If compatibility issues allow, we will capture student actions as they appear on screen using software such as CamStudio, a freely available program. The core strength of such screen recording is its facility for capturing a student’s interactive behaviors as they happen, while one or more observers can later record text comments that are time-locked to the student actions observed in a log file. Using screen capture is both a convenience for observers and a potential means of reducing student stress or distraction, which can detract from data quality. Student actions with the system will also be automatically recorded in a log file. Where there are discrete actions to be captured (e.g., taps on a tablet screen), the log file will capture and identify interactions with timestamps.
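As a concrete illustration of such a timestamped log file, the following minimal Python sketch appends one JSON line per discrete student action. The field names and event labels are assumptions for illustration, not the actual SAIL log schema.

    import json
    import time

    # Hypothetical log-file writer: each discrete student action (e.g., a tap
    # on a tablet screen) is appended as one JSON line with a timestamp, so
    # that observer comments can later be time-locked to recorded actions.
    def log_event(log_file, student_id, action, target):
        entry = {
            "timestamp": time.time(),  # seconds since the epoch
            "student_id": student_id,  # anonymous ID, never the student's name
            "action": action,          # e.g., "tap" or "open_resource"
            "target": target,          # the on-screen element acted upon
        }
        log_file.write(json.dumps(entry) + "\n")

    with open("session_log.jsonl", "a") as log:
        log_event(log, "S017", "tap", "search_button")
        log_event(log, "S017", "open_resource", "news_article_3")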

In contrast to the cognitive interviews, in the tryouts there will be no think-aloud or verbal probing component, although students will be asked to complete a questionnaire following use of the environment, and may be asked a general evaluative question to get their overall impressions of tasks or activities with the system after they have completed the session. Again, the goal of tryouts is to gather authentic, uncontaminated task performance and interaction data. Therefore, students will work through tasks and selected items at their own pace and without interruption. The protocol is described in Volume II, Part D.

Analysis Plan

Student responses to items will be compiled into spreadsheets to allow quantitative and descriptive analyses of the performance data. For the behavioral data, the screen captures will be used for qualitative analysis to characterize the range of behaviors observed for tryouts that are conducted with one student at a time. Once the coding is established, a basic quantitative analysis will provide frequency counts and, where relevant, order information, for different behaviors or actions observed from each student. These will also be compiled into spreadsheets, and the performance data and behavioral data for each student will be combined in the same document. The log files will be analyzed to examine frequencies, categories, and orders of actions as appropriate for the research and development goals at this stage of the projects; much of this analysis will use descriptive analyses such as graphs and tables, but some inferential statistical tests may also be used.
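For illustration, the following minimal Python sketch performs the kind of descriptive log-file analysis described above, computing frequencies of action types and the order in which each first occurred. It assumes the hypothetical JSON-lines log format sketched above, not the actual SAIL log schema.

    import json
    from collections import Counter

    # Summarize a session log: how often each action type occurred, and the
    # order in which the student first performed each action type.
    def summarize_log(path):
        frequencies = Counter()
        first_seen = {}  # action type -> index of its first occurrence
        with open(path) as log:
            for index, line in enumerate(log):
                event = json.loads(line)
                action = event["action"]
                frequencies[action] += 1
                first_seen.setdefault(action, index)
        order = sorted(first_seen, key=first_seen.get)
        return frequencies, order

    frequencies, order = summarize_log("session_log.jsonl")
    print("Action frequencies:", dict(frequencies))
    print("Order of first occurrence:", order)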

  5. Consultations Outside the Agency

ETS will develop the platforms and associated items, perform recruitment and data collection activities, and carry out the necessary research studies.

  6. Assurance of Confidentiality

Participants will be notified that their participation is voluntary and that without the permission of their parent or guardian, their answers may be used only for research purposes and may not be disclosed, or used, in identifiable form for any other purpose except as required by law [Education Sciences Reform Act of 2002 (20 U.S.C. §9573)]. Participants will also be notified that with their parent or guardian’s permission some responses or clips from videos may be selected for use in research reports or presentations8.

Written consent will be obtained from parents or legal guardians of students who are under the age of 18. Participants will be assigned a unique identifier (ID), which will be created solely for data file management and used to keep all participant materials together. The participant ID will not be linked to the participant name in any way or form. The consent forms, which include the participant name, will be separated from the participant interview files and secured for the duration of the study and will be destroyed after the final report is completed. Where sessions will be recorded9, the only identification included on the files will be the unique ID assigned to each participant by the interviewer. The recorded files will be secured for the duration of the study and will be destroyed when the research is complete.

  7. Justification for Sensitive Questions

Throughout the item and task development process, as well as the process of developing interview protocols, effort has been made to avoid asking for information that might be considered sensitive or offensive. Reviewers have attempted to identify and minimize potential bias in questions.

  8. Estimate of Hourly Burden

Play Testing Burden

The estimated burden for recruitment assumes attrition throughout the process.10 The anticipated number of student participants for play testing is 25. Around 15 teachers, school officials, and club and community center administrators will be contacted via e-mail and phone. Initial e-mail contact, response, and distribution of materials are estimated at 20 minutes or 0.33 hours. We anticipate distributing 300 flyers via these 15 contacts to parents and students. Time to review is estimated at 5 minutes or 0.08 hours per parent. For parents who are interested in having their child participate, time to fill out the online screening form or participate in a phone screener is estimated at 9 minutes or 0.15 hours. For those selected to participate and asked to fill out the consent form, the estimated time is 8 minutes or 0.13 hours. The follow-up e-mail or letter to confirm participation (or non-participation) for each session is estimated at 3 minutes or 0.05 hours. Play testing sessions are expected to last up to 90 minutes for all students. Table 2 details the estimated burden for play testing.

Table 2. Specific Burden for Play Testing Studies11

Respondent                                                   Hours per    Number of     Total burden
                                                             respondent   respondents   hours
Student Recruitment via Teachers, Staff, and Club or Community Center Administrators
  Initial contact with staff: e-mail, flyer distribution,
  and planning                                               0.33         15            5
Parent or Legal Guardian, and Student (18 or older)
  Flyer review                                               0.08         300           24
  Completion of online screening form or phone screening     0.15         50*           8
  Consent form completion and return                         0.13         25**          3
  Confirmation to parent via email or letter                 0.05         50*           3
  Recruitment Totals                                                      315           43
Student
  Phase 1 Virtual World for ELA                              1.5          25            38
  Interview Totals                                                        25            38
Total Burden (465 responses)                                              340           81

* Subset of initial contact group
** Subset of students with completed screening forms
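The burden-hour figures above follow from multiplying hours per respondent by the number of respondents and rounding half up to whole hours. The following minimal Python sketch reproduces the Table 2 arithmetic (row labels abbreviated); the same convention applies to Tables 3 and 4.

    from decimal import Decimal, ROUND_HALF_UP

    # Table 2 rows: (hours per respondent, number of respondents).
    rows = {
        "Initial contact with staff": (Decimal("0.33"), 15),
        "Flyer review": (Decimal("0.08"), 300),
        "Screening form or phone screening": (Decimal("0.15"), 50),
        "Consent form completion and return": (Decimal("0.13"), 25),
        "Confirmation to parent": (Decimal("0.05"), 50),
        "Play testing session": (Decimal("1.5"), 25),
    }

    total_hours = 0
    for label, (hours_each, respondents) in rows.items():
        # Round half up to whole hours, e.g., 0.33 x 15 = 4.95 -> 5 hours.
        burden = int((hours_each * respondents).quantize(Decimal("1"), rounding=ROUND_HALF_UP))
        total_hours += burden
        print(f"{label}: {burden} hours")

    print(f"Total burden: {total_hours} hours")  # 5 + 24 + 8 + 3 + 3 + 38 = 81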

Cognitive Interview Burden

The estimated burden for recruitment assumes attrition throughout the process.12 The anticipated number of student participants for these cognitive interviews is 15. Around 9 school administrators, staff, or club and community center administrators (and parents, if needed) will be contacted via e-mail and phone. Initial e-mail contact, response, and distribution of materials are estimated at 20 minutes or 0.33 hours. We anticipate distributing 180 flyers with consent forms via these 9 school and community group contacts to parents and students. Time to review is estimated at 5 minutes or 0.08 hours. For parents who are interested in having their child participate, time to fill out the online screening form or participate in a phone screener is estimated at 9 minutes or 0.15 hours. For those selected to participate and asked to fill out the consent form, the estimated time is 8 minutes or 0.13 hours. The follow-up e-mail or letter to confirm participation for each session is estimated at 3 minutes or 0.05 hours. Individual cognitive interviews are expected to last up to 120 minutes for all students. Table 3 details the estimated burden for the cognitive laboratories.

Table 3. Estimate of Hourly Burden for Cognitive Interviews

Respondent                                                   Hours per    Number of     Total burden
                                                             respondent   respondents   hours
Student Recruitment via School Administrators, Staff, and Club or Community Center Administrators
  Initial contact with staff: e-mail, flyer distribution,
  and planning                                               0.33         9             3
Parent or Legal Guardian
  Flyer review                                               0.08         180           15
  Completion of online screening form or phone screening     0.15         30*           5
  Consent form completion and return                         0.13         15**          2
  Confirmation to parent via email or letter                 0.05         30*           2
  Recruitment Totals                                                      189           27
Student
  Virtual World for ELA                                      2            15            30
  Interview Totals                                                        15            30
Total Burden (279 responses)                                              204           57

* Subset of initial contact group
** Subset of students with completed screening forms

Small-Scale Tryout Burden

The estimated burden for recruitment assumes attrition throughout the process.13 The anticipated number of student participants for these tryouts is a minimum of 100 and a maximum of 200. Around 15 school administrators, staff, or club and community center administrators (and parents, if needed) will be contacted via e-mail and phone. Initial e-mail contact, response, and discussion are estimated at 60 minutes or 1 hour. We anticipate pursuing testing at 2 of these schools, and spending an estimated 5 hours coordinating the testing session with a staff member from each school. A technology coordinator at each school will spend an estimated two hours installing the screen capture software on laptops or tablets. For the teachers whose classrooms will participate (an estimated four teachers), we estimate 1 hour each for distributing and collecting consent forms. Anticipated time for parents to read and fill out the consent form is 8 minutes or 0.13 hours. Tryout sessions are expected to last up to 90 minutes for all students. Table 4 details the hourly burden for tryouts if all testing is done in school classrooms.

We anticipate being able to test at least the minimum number of students through schools. However, if a low rate of return for consent forms results in testing fewer than the minimum of 100 students, the difference will be made up by testing students at ETS, and the recruitment methods and burden for those students would be the same as what is described for the play testing and cognitive interviews.

Table 4. Estimate of Hourly Burden for Tryouts

Respondent                                                   Hours per    Number of     Total burden
                                                             respondent   respondents   hours
School Recruitment via School Administrators
  Initial contact with staff: e-mail, flyer distribution,
  and planning                                               1            15            15
  Coordination and planning with participating schools       5            2*            10
School Staff
  Consent form distribution and collection by teachers       1            4             4
  Installation/un-installation of screen capture software    2            2             4
Parent or Legal Guardian
  Consent form completion and return                         0.13         200           26
  Recruitment Totals                                                      221           59
Student
  Virtual World for ELA                                      1.5          200**         300
  Tryout Totals                                                           200           300
Total Burden (423 responses)                                              421           359

* Subset of initial contact group
** Maximum potential participants

Total for All Pretesting Activities

The combined totals for all pretesting activities are listed in Table 5.

Table 5. Combined Burden for SAIL Virtual World Research Activities

Pretest Activity              Number of      Number of     Burden
                              respondents    responses     hours
Total Play Testing            340            465           81
Total Cognitive Interviews    204            279           57
Total Tryouts                 421            423           359
Overall Totals                965            1,167         497


  9. Estimate of Costs for Recruiting and Paying Respondents

For play testing, to encourage participation and thank participants for their time and effort, a $25 VISA gift card will be offered to each participating student, plus a $25 VISA gift card to the parent or legal guardian bringing the student to and from the testing site. For cognitive interviews, given the additional time spent in the study session (120 minutes), a $30 VISA gift card will be offered to each participating student, plus a $25 VISA gift card to the parent or legal guardian bringing the student to and from the testing site. To encourage school participation in small-scale tryouts, and to increase the efficiency of tryouts recruitment and data collection, an incentive of $7.00 per participating student will be offered to participating schools, up to a maximum of 200 students (assuming a single participating school; this total will be divided across multiple schools if more than one school is recruited).

A significant amount of time and work is required from the participating school staff to obtain consent forms from parents, coordinate the study, and install and uninstall software. In addition, the schools will be providing the space for the tryout and will be using time during school hours to conduct the study. Conducting the study during school hours in the classroom is preferred over after-school or individually recruited sessions in order to obtain a broader, more diverse sample. In other similar research studies, the school incentive per participating student may range as high as $15. Given the nature of this study and the requirements on the school staff, the incentive does not need to be as large. However, we believe the amount cannot be reduced below the recommended $7.00 per student without creating challenges in recruiting schools and/or decreasing student participation through schools (thereby requiring more students to be assessed at a testing site and raising the cost of the study, given the use of student incentives [$25 per student plus $25 per parent for tryouts, if conducted at ETS]).

  10. Costs to Federal Government

The estimated cost to the federal government for the virtual world play testing, cognitive interviews, and small-scale tryouts described in this submittal, including designing, preparing for, and conducting the sessions, recruitment, incentive costs, data collection, and summary of findings by ETS, is $489,604.


  11. Schedule

Table 6 depicts the high-level schedule for the various activities. Each activity includes recruitment, data collection, analyses, and reports.

Table 6. High-Level Schedule of Milestones

Activity          Dates
Play Testing      November 2014 – August 2016
Cognitive Labs    October 2016 – December 2016
Tryouts           December 2016 – May 2017


1 For example, NAEP Science Pretesting Activities (OMB #1850-0803 v.73, October 2012) and NAEP 2011 Cognitive Interview Studies of NAEP Cognitive Items (OMB #1850-0803 v.45, March 2011).

2 Technology and Engineering Literacy Pre-Assessment Studies: Tryout and Usability Studies (OMB #1850-0803 v.66, February 2012).

3 For the ease of description, the term “computer” has been used in the recruitment materials.

4 Nielsen, J. (1994). Estimating the number of subjects needed for a thinking aloud test. International Journal of Human-Computer Studies, 41, 385-397. Available at: http://www.idemployee.id.tue.nl/g.w.m.rauterberg/lecturenotes/DG308%20DID/nielsen-1994.pdf

5 Van Someren, M. W., Barnard, Y. F., & Sandberg, J. A. C. (1994). The think-aloud method: A practical guide to modeling cognitive processes. San Diego, CA: Academic Press. Available at: ftp://akmc.biz/ShareSpace/ResMeth-IS-Spring2012/Zhora_el_Gauche/Reading%20Materials/Someren_et_al-The_Think_Aloud_Method.pdf

6 This table represents the expected distribution across grades. Depending on the nature of the items and tasks and the specific recruitment challenges, the actual distribution may vary slightly. For burden purposes, the maximum number of students by pretesting activity will not exceed the total shown in the table.

7 However, if a low rate of return for consent forms results in testing fewer than the minimum of 100 students, the difference will be made up by testing students at ETS. For these sessions only, digital video capture may be used to record students’ touch-based interactions with the tablet device.

8 The clause regarding the potential use of some responses or clips from videos in research reports or presentations will be included only for students participating in play testing and cognitive interviews conducted at ETS. Any data collection conducted at schools will not be videotaped, and recordings of tryouts conducted in any location will not be used in research reports or presentations.

9 Recordings may be audio and/or video, as described in the specific interview sections.

10 Assumptions for approximate attrition rates are 83.3 percent from initial contact (flyer from teacher) to screening form completion and 50 percent from submission of screening form to participation.

11 The burden estimates in this table reflect the maximum burden for recruitment if students do not participate in multiple play testing sessions.

12 Assumptions for approximate attrition rates are 83.3 percent from initial contact (flyer from teacher) to screening form completion and 50 percent from submission of screening form to participation.

13 Assumptions for approximate attrition rates are 83 percent from initial contact (flyer from teacher) to screening form completion and 50 percent from submission of screening form to participation.
