
MEMORANDUM OMB # 1850-0803 v. 224

DATE: February 20, 2018

TO: Kashka Kubzdela

National Center for Education Statistics

FROM: Linda Hamilton

National Center for Education Statistics

SUBJECT: National Assessment of Educational Progress (NAEP) Grade 8 Social Sciences Interactive Item Components (IICs) Pretesting-Round 2 [Based on a previously approved package (1850-0803 v.197)]


The National Assessment of Educational Progress (NAEP) is a federally authorized survey of student achievement at grades 4, 8, and 12 in various subject areas, such as mathematics, reading, writing, science, U.S. history, civics, geography, economics, and the arts. NAEP is conducted by the National Center for Education Statistics (NCES).

In May 2017, as part of the NAEP assessment development process, OMB approved NCES's request to conduct an initial round (round 1) of pretesting activities (OMB #1850-0803, v. 197), consisting of cognitive interviews and tryouts to collect data on newly developed interactive item components (IICs) for the 2020 grade 8 social science assessments (civics, geography, and U.S. history). These IICs include an interactive timeline (U.S. history), a simulated web search (civics), a multimedia source container (civics and U.S. history), and a geographic information system (GIS) toolset (geography). The pretesting focused on whether these IICs elicit the targeted knowledge and skills, whether any item content, interaction, or presentation causes confusion or introduces construct-irrelevant errors, and how long students take to complete various IICs. A range of items using each new IIC was to be included in the pretesting activities.

Round 1 pretesting was conducted from August to September 2017. Cognitive interviews with 24 students from the Washington, D.C. metropolitan area were conducted to understand what reasoning processes students used as they worked through IICs and item sets. Tryouts were conducted with 75 students from the greater Washington, D.C. metropolitan area (including Washington, D.C., Maryland, and Northern Virginia) to collect information concerning students’ thoughts about the broader IIC tasks and their experiences with the scenarios in the item sets, and to provide a reasonable sample of quantitative data on student performance, including timing data.

The pretesting findings (a) indicated that the IICs elicit the targeted social science knowledge and skills and (b) provided valuable feedback on the effectiveness of item scoring guides. Findings also indicated that a number of software and user-interface updates were required to improve the student experience and reduce IIC load times in the assessment delivery system. Based on the pretesting findings, NAEP is implementing content, performance, and design changes for each IIC and, thus, an additional round of cognitive interviews and tryouts on the IICs is needed before piloting these items on a larger scale.

Consequently, this request is to conduct round 2 of pretesting activities on the grade 8 social science assessment IICs. The IIC revisions made based on the results of round 1 pretesting are not reflected in the ICR package materials because the assessment items, software, and interface are not subject to PRA. Given that round 2 will largely follow the recruitment and administration procedures used in round 1, the content of this request is very similar to that approved in May 2017 (OMB #1850-0803, v. 197), with only minor changes to reflect round 2 pretesting. This memo has been created to facilitate OMB's review; it lists the differences between the approved (v. 197) ICR documents and the documents in this request (the original, approved text is shown in red font and the revised text in blue font), followed by an explanation of each revision:

Revisions to Volume 1

  1. Cover page:

Added “Round 2” to the title of the package, updated the version number to “v. 224”, and changed the date to “February 2018”.


  2. Section 2: Background and Study Rationale:

Added the following paragraphs (the same text as this memo’s intro):

The initial round (round 1) of pretesting for the IICs was approved on May 22, 2017 (OMB #1850-0803, v. 197) and was conducted from August to September 2017. Cognitive interviews with 24 students from the Washington, D.C. metropolitan area were conducted to understand what reasoning processes students used as they worked through IICs and item sets. Tryouts were conducted with 75 students from the greater Washington, D.C. metropolitan area (including Washington, D.C., Maryland, and Northern Virginia) to collect information concerning students’ thoughts about the broader IIC tasks and their experiences with the scenarios in the item sets, and to provide a reasonable sample of quantitative data on student performance, including timing data.

The pretesting findings (a) indicated that the IICs elicit the targeted social science knowledge and skills and (b) provided valuable feedback on the effectiveness of item scoring guides. Findings also indicated that a number of software and user-interface updates were required to improve the student experience and reduce IIC load times in the assessment delivery system. Based on the pretesting findings, NAEP is implementing content, performance, and design changes for each IIC and, thus, an additional round of cognitive interviews and tryouts on the IICs is needed before piloting these items on a larger scale.

Consequently, this request is to conduct round 2 of pretesting activities on the grade 8 social science assessment IICs. The IIC revisions made based on the results of round 1 pretesting are not reflected in the ICR package materials because the assessment items, software, and interface are not subject to PRA. Given that round 2 will largely follow the recruitment and administration procedures used in round 1, the content of this request is very similar to that approved in May 2017 (OMB #1850-0803, v. 197) with only minor changes to reflect round 2 pretesting. An accompanying Changes Memo has been created to facilitate OMB’s review and to provide a listing of the differences between the approved (v.197) ICR documents and those in this request, including an explanation of each revision.

Explanation of revision:

These paragraphs were added to describe what was done in the first round of pretesting and to explain why round 2 is needed.


  3. Section 3: Recruitment and Data Collection:

Revised the number of students indicated in the second paragraph under the Cognitive Interviews "Sampling and Recruitment Plan."

Original Text: “…cognitive interviewing is expected to involve approximately 20-24 students.”

Revised Text: “…cognitive interviewing is expected to involve approximately 20 students.”

Explanation of revision:

For the first round of pretesting, 24 students participated in cognitive interviews. For round 2, 20 participating students will be sufficient.


  4. Section 8: Estimate of Hourly Burden:

Updated the burden table to reflect round 2 pretesting.

Table 1: Estimate of Hourly Burden for Pretesting Activities

Respondent | Original number of respondents | Revised number of respondents | Hours per respondent | Original total hours | Revised total hours

Student Recruitment via Teachers and Staff
Initial contact with staff: e-mail, flyer distribution, & planning | 14 | 13 | 0.33 | 5 | 5

Parent or Legal Guardian
Flyer and consent form review | 264 | 254 | 0.08 | 22 | 21
Consent form completion and return | 132* | 127* | 0.13 | 18 | 17
Confirmation to parent via email or letter | 132* | 95* | 0.05 | 7 | 5
Recruitment Totals | 278 | 267 | | 52 | 48

Student
Grade 8 Cognitive Interviews | 24 | 20 | 1.5 | 36 | 30
Grade 8 Tryouts | 75 | 75 | 1.5 | 113 | 113
Interview Totals | 99 | 95 | | 149 | 143

Total Burden | 377 | 362 | | 201 | 191

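In the table above, each total-hours figure is the number of respondents multiplied by the hours per respondent, rounded up to the next whole hour; for example, the revised Grade 8 Tryouts total is 75 × 1.5 = 112.5, reported as 113 hours. The starred respondent counts appear not to be added again into the respondent totals (e.g., the revised recruitment total is 13 + 254 = 267), presumably because those parents are a subset of the flyer and consent form reviewers, although their hours are included in the total-hours columns.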

Explanation of revision:

Round 2 will include slightly fewer student participants in the cognitive interviews.

  5. Section 9: Cost to the Federal Government:

Updated the costs to reflect the additional pretesting being requested.

Table 2: Cost to the Federal Government

Activity | Provider | Original estimated cost | Revised estimated cost

Cognitive Interviews
Design and prepare for cognitive interviews; analyze findings & prepare report | ETS | $134,644 | $106,501
Prepare for and administer cognitive interviews (including recruitment, incentive costs, data collection, analysis, & reporting) | EurekaFacts | $141,670 | $115,135

Tryouts
Design and prepare for task tryouts; analyze findings and prepare report | ETS | $151,140 | $131,538
Prepare for and administer task tryouts (including recruitment, incentive costs, data collection, & reporting) | EurekaFacts | $130,384 | $140,019


Explanation of revision:

The costs were updated to reflect round 2 needs.

  6. Section 10: Project Schedule:

Table 3: Schedule

Activity (each activity includes recruitment, data collection, and analyses) | Round 1 Dates | Round 2 Dates
Cognitive interviews | August-September 2017 | April-June 2018
Small-scale tryouts | August-September 2017 | April-June 2018
Pretesting reports submitted | October 2017 | July 2018


Explanation of revision:

Round 2 pretesting will take place in the spring and summer of 2018.

Revisions to Volume 2

  1. Cover page:

Added “Round 2” to the title of the package, updated the version number to “v. 224”, and changed the date to “February 2018”.


  2. An additional prompt was added to the protocol (page 7, section II.d) to instruct students to use either the touch screen or the stylus.

New text:

[If the student is taking Golden Gate Park] The first item set you see is called Golden Gate Park. Please use [your stylus, not your fingers]/[your fingers, not your stylus] when using the drawing and measurement tools that are part of this item set.


Explanation of revision:

Based on earlier pretesting results from the Golden Gate Park task and the recommendation of cognitive scientists with experience in students’ use of tablets, we decided to explore the impact of different user-interface approaches on student experience and performance. The change in the protocol was necessary to instruct students to use a specific interface approach, either stylus or finger.

Appendices

  1. Cover page:

Added “Round 2” to the title of the package, updated the version number to “v. 224”, and changed the date to “February 2018”.

No changes were made to the contents of the appendices document.
