
National Center for Education Statistics

National Assessment of Educational Progress



Volume I

Supporting Statement



National Assessment of Educational Progress

eNAEP Pretesting Study 2016



OMB# 1850-0803 v. 159









May 2016



Table of Contents

  1. Submittal-Related Information
  2. Background and Study Rationale
  3. Recruitment and Sample Characteristics
  4. Study Design and Data Collection
  5. Consultations Outside the Agency
  6. Justification for Sensitive Questions
  7. Paying Respondents
  8. Assurance of Confidentiality
  9. Estimate of Hourly Burden
  10. Cost to Federal Government
  11. Project Schedule

  1. Submittal-Related Information

This material is being submitted under the generic National Center for Education Statistics (NCES) clearance agreement (OMB #1850-0803) that provides for NCES to conduct various procedures (such as field tests, cognitive interviews, usability studies) to test new methodologies, question types, or delivery methods to improve survey and assessment instruments. This submittal is to pretest the National Assessment of Educational Progress (NAEP) test delivery system prior to its first operational use. The data generated by the study will help inform the development, refinement, and quality control (QC) of the NAEP student test delivery system.

  2. Background and Study Rationale

NAEP is a federally authorized survey of student achievement at grades 4, 8, and 12 in various subject areas, such as mathematics, reading, writing, science, U.S. history, civics, geography, economics, and the arts. NAEP is conducted by NCES, part of the Institute of Education Sciences in the U.S. Department of Education. NAEP’s primary purpose is to assess student achievement in various subject areas and to collect survey questionnaire (i.e., non-cognitive) data to provide context for the reporting and interpretation of assessment results.

Over the last few years, NAEP has been transitioning to digitally based assessments (DBA) that are administered on tablets using a test delivery system developed for NAEP (known as eNAEP). The eNAEP system was successfully used in the 2015 and 2016 pilot assessments.1 The first operational use of the eNAEP system will be in conjunction with the 2017 NAEP assessments. Enhancements have been made after each administration of eNAEP to address issues identified in the field, to make the system more user friendly, and to allow for the assessment of additional content and item types.

The purpose of this study is to conduct a real-world test of the eNAEP system with students, allowing the system to be tested in the manner that will be used in the national study to help identify system issues early in the software development process. The rationale for this study is based on lessons learned and issues encountered by students in the field during the 2016 pilot assessment that were not found during normal testing. It is believed that students use and interact with the system differently than adult QC testers. Therefore, including students as part of the pretesting and QC process should allow for issues to be identified and addressed prior to the operational use of the system.

For this study, a pretesting event with 30 students from grades 4 and 8 will be held in a simulated classroom after each preliminary/draft version of eNAEP (referred to as “builds”) is produced. Each event will have two sessions lasting approximately 100 minutes each. The same students will participate in both sessions. Three events will take place over the course of the eNAEP development, refinement, and QC.

  3. Recruitment and Sample Characteristics

An NCES subcontractor for NAEP, EurekaFacts, will recruit no more than 30 students (a mix of grade 4 and 8 students) for each pretesting event, for a total of 90 students for all three events. The studies will be held, likely on a Saturday, at the EurekaFacts facility in Rockville, Maryland.

EurekaFacts will recruit participants for the pretesting study from the District of Columbia, Maryland, Virginia, and West Virginia. Although the sample will include a mix of student characteristics, the results will not explicitly measure differences by those characteristics. Students will be recruited to meet the following criteria:

  • A mix of race/ethnicity (Black, Asian, White, Hispanic);

  • A mix of socioeconomic background;

  • A mix of urban/suburban/rural areas; and

  • A mix of students requiring accommodations.

While EurekaFacts will use various outreach methods (see Appendices A-J) to recruit students to participate, the bulk of the recruitment will be conducted by telephone, drawing on targeted mailing lists containing residential addresses and landline telephone listings. EurekaFacts will also use a participant recruitment strategy that integrates multiple outreach methods and resources, such as newspaper and internet ads, community organizations (e.g., Boys and Girls Clubs, Parent-Teacher Associations), and mass media recruitment (e.g., postings on the EurekaFacts website).

Interested students will be screened (see Appendix K) to ensure that they meet the criteria for participation in the pretesting study (i.e., the students are from the targeted demographic groups outlined above and their parents/guardians have given consent). When recruiting participants, EurekaFacts staff will speak to the parent/guardian of the interested minor before starting the screening process. During this communication, the parent/guardian will be informed about the objectives, purpose, and participation requirements of the data collection effort, as well as the activities it entails. After a participant is confirmed as qualified, willing, and available to participate in the study, he or she will receive a confirmation e-mail/letter and phone call. Written informed parental consent (see Appendix L) will be obtained for all respondents who are interested in participating in the data collection efforts.

  4. Study Design and Data Collection

Shortly after each of the preliminary/draft eNAEP builds are released, a pretesting event with 30 students from grades 4 and 8 will be held in a simulated classroom. Each event will have two sessions lasting approximately 100 minutes each; the same students will participate in both sessions. The events will be structured as follows:

  • During the first session, each student will be asked to take the assessment under standard NAEP assessment conditions (approximately 90 minutes). Westat, one of the NCES contractors for NAEP, will administer the session using standard procedures. Students will take the full assessment, including the tutorial, cognitive items and tasks2, and the survey questionnaires.3

  • To conclude the first session, a group debrief (limited to 10 minutes) will be conducted to solicit feedback from the students (see Volume II for the debriefing script).

  • After the first session, students will be given a break at which time students will be offered snacks and/or lunch.

  • At the beginning of the second session, students will be instructed to push the eNAEP system to the limits, with the intent to identify any flaws in eNAEP or with the functionality of the tablets. Again, students will take the full assessment (approximately 90 minutes), including the tutorial and the survey questionnaires (see Volume II for directions for the students).

  • To conclude the second session, a group debrief (limited to 10 minutes) will be conducted to solicit feedback from the students.

As part of the assessment administration in both sessions 1 and 2, students will take a set of survey questionnaires. The maximum time for the survey questionnaire component is 15 minutes (included in the 100-minute time estimation for each session). Students will take a “core” section regarding general student and contextual information and a subject-specific section. Volume II includes the library of possible student survey items to be administered.4 Not all of the items presented in Volume II will be administered in this eNAEP pretesting study. The number of items selected for each student will be appropriate to the time allocated. As the items for the 2017 administration are finalized throughout the development process, the final sub-set will be included in the eNAEP system for pretesting. As such, the earlier builds may include different items selected from the library than the final build.

Normal data collection will be enabled by the eNAEP system, and any errors generated will be collected automatically by the system. Note that student responses will not be scored. In addition to the information recorded by the eNAEP system, administrators and observers from NCES, Westat, Fulcrum, ETS, and/or EurekaFacts will monitor the assessments and record notes detailing any issues encountered by the students, as well as what the students were doing at the time each issue occurred. Observers may also ask individual students to clarify the actions they took before an issue or error occurred. For example, observers may ask questions such as, “What is the error?”; “What was the last thing you saw before the error?”; “What were you expecting to happen?”; or “What did you do right before the error happened?” Understanding and documenting what caused a system error is necessary so that staff have enough information to replicate the error and develop a fix for it.

The sessions will be audio and/or video recorded to capture information regarding any student actions that resulted in system errors or issues.

  5. Consultations Outside the Agency

Westat is the Sampling and Data Collection (SDC) contractor for NAEP. Westat will provide the tablets for the students’ use and carry out the pretesting study.

Fulcrum is the NAEP contractor responsible for the development and ongoing support of NAEP digitally based assessments for NCES, including the system to be used for the eNAEP pretesting study. Fulcrum will be onsite to assist Westat in the administration of the study.

ETS serves as the Planning and Coordination (PC), Item Development (ID), and Design, Analysis, and Reporting (DAR) contractor for NAEP. ETS staff may assist in administering and/or observing some sessions.

EurekaFacts is located in Rockville, Maryland. It is an established for-profit research and consulting firm, offering facilities, tools, and staff to collect and analyze both qualitative and quantitative data. EurekaFacts is working as a subcontractor for ETS to recruit participants and provide the facilities to be used for the study. In addition, EurekaFacts staff may assist in administering and/or observing some sessions.

  6. Justification for Sensitive Questions

Throughout the item and debriefing question development processes, effort has been made to avoid asking for information that might be considered sensitive or offensive.

  7. Paying Respondents

To encourage participation and to thank students for their time and effort, an incentive will be offered to each participating student. Because the total time at the testing site will approach 4 hours, accommodating two testing sessions and a break, we will offer each student a $50 incentive. A parent or legal guardian who brings their child to and from the testing site will also receive $25 as a thank-you for their time, effort, and transportation of their child. In addition, students will be offered snacks and/or lunch during the break between the two testing sessions.

  8. Assurance of Confidentiality

The eNAEP pretesting study will not collect any personally identifiable information. Prior to the start of the study, participants will be notified that their participation is voluntary and that their answers may be used only for research purposes and may not be disclosed, or used, in identifiable form for any other purpose except as required by law [Education Sciences Reform Act of 2002 (20 U.S.C. §9573)].

Written notification will be sent to the legal guardian(s) of students before testing is conducted. Participants will be assigned a unique identifier (ID), which will be created solely for data file management and used to keep all participant materials together. The participant ID will not be linked to the participant name in any way or form.

  9. Estimate of Hourly Burden

The estimated burden for recruitment assumes attrition throughout the process.5 Each pretesting session will last 100 minutes, including the assessment time and the debriefing session. Each student will participate in two sessions plus a 30-minute break, for a total of 230 minutes. Table 1, below, details the estimated burden.

Table 1. Estimate of Hourly Burden

Respondent                                       | Number of respondents | Number of responses | Hours per respondent | Total hours
-------------------------------------------------|-----------------------|---------------------|----------------------|------------
Parent or Legal Guardian for Student Recruitment |                       |                     |                      |
    Initial contact                              | 400                   | 400                 | 0.05                 | 20
    Follow-up via phone                          | 200*                  | 200                 | 0.15                 | 30
    Consent & confirmation                       | 100*                  | 100                 | 0.15                 | 15
Participation (Pretesting)                       |                       |                     |                      |
    Students                                     | 90**                  | 90                  | 3.83                 | 345
Total                                            | 490                   | 790                 |                      | 410

* Subset of initial contact group

** Estimated number of actual participants is expected to be approximately 90% of the confirmed cases.
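As a cross-check on Table 1 and footnote 5, the recruitment pipeline and burden hours can be reproduced with a short calculation. This is an illustrative sketch only; the variable names are ours, and the attrition rates are the approximate assumptions stated in footnote 5.

```python
# Illustrative cross-check of the Table 1 burden figures,
# using the approximate attrition rates from footnote 5.

initial_contacts = 400
followups = round(initial_contacts * 0.50)    # 50% of initial contacts -> 200
confirmed = round(followups * 0.50)           # 50% of follow-ups -> 100
participants = round(confirmed * 0.90)        # 90% of confirmed -> 90

# Per-student time: two 100-minute sessions plus a 30-minute break.
minutes_per_student = 2 * 100 + 30                       # 230 minutes
hours_per_student = round(minutes_per_student / 60, 2)   # ~3.83 hours

# Recruitment burden: initial contact, follow-up, consent/confirmation.
recruitment_hours = round(initial_contacts * 0.05
                          + followups * 0.15
                          + confirmed * 0.15)            # 20 + 30 + 15 = 65

student_hours = round(participants * hours_per_student)  # ~345
total_hours = recruitment_hours + student_hours          # 410
```

The result matches the table's bottom line: 20 + 30 + 15 + 345 = 410 total burden hours.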

  10. Cost to Federal Government

Table 2, below, provides the overall project cost estimates.

Table 2: Estimate of Costs

Activity                                                      | Provider    | Estimated Cost
--------------------------------------------------------------|-------------|---------------
Recruiting students and providing facilities for the study    | EurekaFacts | $60,000
Administering the study                                       | Westat      | $8,250
Assisting with administering the study; analyzing the results | Fulcrum     | $10,125
Total                                                         |             | $78,375

  11. Project Schedule

The schedule for each pretesting event is based on the eNAEP development schedule. The current schedule is as follows:

  • Event 1: May-June 2016

  • Event 2: July-August 2016

  • Event 3: September-October 2016

1 More information about NAEP DBAs can be found at http://nces.ed.gov/nationsreportcard/dba/default.aspx.

2 Mathematics, reading, writing, U.S. history, civics, and geography items and tasks will be administered as part of the pretesting study. In a session, each student will take items from only one subject area.

3 Draft content may be used in the earlier builds.

4 The final items will consist of those selected for NAEP 2017 administration (currently under review [OMB #1850-New v.1; previous OMB #1850-0790 v.43]). The questionnaire components in Volume II are a subset of the items in the 1850-New v.1 submittal.

5 Assumptions for approximate attrition rates are 50 percent from initial contact to follow-up, 50 percent from follow-up to confirmation, and 90 percent from confirmation to participation.
