
National Center for Education Statistics

National Assessment of Educational Progress


Volume I

Supporting Statement


National Assessment of Educational Progress (NAEP) Pretesting Visual Representations in 4th Grade Writing Tasks



OMB# 1850-0803 v. 196

May 2017

Table of Contents

1. Submittal-Related Information
2. Background and Study Rationale
3. Recruitment and Data Collection
4. Consultations Outside the Agency
5. Justification for Sensitive Questions
6. Paying Respondents
7. Assurance of Confidentiality
8. Estimate of Hourly Burden
9. Costs to Federal Government
10. Project Schedule

  1. Submittal-Related Information

This material is being submitted under the generic National Center for Education Statistics (NCES) clearance agreement (OMB# 1850-0803), which provides for NCES to conduct various procedures (such as pilot tests, cognitive interviews, and usability studies) to test new methodologies, question types, or delivery methods to improve survey and assessment instruments and procedures. This request is to conduct cognitive interviews and small scale tryouts to probe one aspect of test validity—the effects of visual representations and associated interactive features on student performance on the National Assessment of Educational Progress (NAEP) 4th grade writing tasks. The data obtained from the cognitive interviews and small scale tryouts are intended to inform guidelines for the development of more accessible writing tasks that include multimedia stimuli.

  2. Background and Study Rationale

NAEP is a survey of student achievement at grades 4, 8, and 12 in various subject areas, such as mathematics, reading, writing, science, U.S. history, civics, geography, economics, and the arts, federally authorized by the National Assessment of Educational Progress Authorization Act (20 U.S.C. §9622). NAEP is conducted by NCES, which is part of the Institute of Education Sciences within the U.S. Department of Education. NAEP’s primary purpose is to assess student achievement in the different subject areas and collect survey questionnaire (i.e., non-cognitive) data to provide context for the reporting and interpretation of assessment results.

This request is part of a study intended to investigate whether features of grade 4 writing tasks involving multimedia can be systematically manipulated to make the tasks more accessible to students, especially low-performing students (i.e., those in the bottom 20 percent), while remaining aligned with the NAEP writing framework. Previous work by the authors of this study has identified multimedia features that correlate with task difficulty. These features are particularly evident in persuasive writing tasks (one of the three types of writing tasks required by the 2011 NAEP Writing Framework). The ultimate goal of this study is to provide guidance for item development that would allow improved measurement at the lower end of the writing achievement distribution, thereby augmenting the validity and utility of the NAEP writing assessment.

In the study, we first created modified versions of two grade 4 writing tasks by manipulating visual and interactive features associated with task difficulty. Using cognitive interviews and small scale tryouts, we will evaluate the impact of these modifications on students’ ability to use and interpret the task stimuli, as well as their perception of task difficulty. To help isolate the contribution of the specified visual and interactive features to task difficulty, the writing tasks have been modified to remove another potential source of difficulty, particularly for fourth graders: poor keyboarding skills. Therefore, students will read the prompts and view the associated videos or graphics online, but they will write their responses on paper.

  3. Recruitment and Data Collection

Sampling and Recruitment Plan

NCES contracted the American Institutes for Research (AIR) to carry out the cognitive interview and small scale tryout testing activities described in this package. AIR subcontracted EurekaFacts Research to recruit students in the Washington DC, Maryland, and Virginia areas and to conduct the cognitive interview and small scale tryout testing activities.

A total of 60 fourth graders will participate in this study: 36 for the cognitive interviews and 24 for the small scale tryouts of writing tasks. Students will be recruited by EurekaFacts staff from the following demographic populations:

  • Mix of gender,

  • Mix of race (Black, Asian, White),

  • Mix of Hispanic ethnicity, and

  • Socioeconomic background: half of the sample will be limited to low socioeconomic status (SES) students, and the other half will include students with a mix of socioeconomic backgrounds.

Although the sample will include a mix of student characteristics, results will not explicitly measure differences by these characteristics. Additionally, while English Language Learners (ELLs) will be eligible for the study if their English language skills are sufficient to permit them to participate without a language accommodation, students with serious visual, hearing, or cognitive disorders will be excluded. Furthermore, there will be no assumption that the selected students will be representative of the subgroups from which they are drawn.

EurekaFacts will use a participant recruitment strategy that integrates multiple outreach/contact methods and resources (see Appendices), such as internet ads, individual emails, telephone recruiting, and on-site location-based recruiting. More specific methods and resources we anticipate using include the following:

  • utilizing EurekaFacts’ existing databases with students, parents, and individuals that expressed interest in participating in research studies;

  • outreach to EurekaFacts’ existing database of community organizations, independent clubs, and activity-centered groups;

  • sending emails or making telephone calls to leaders of youth oriented nonprofit community organizations and groups;

  • utilizing targeted contact lists purchased from reputable third-party vendors;

  • posting information on Craigslist sites for the Washington D.C. metro area; and

  • in-person posting and canvassing at retail outlets and local community stores.

EurekaFacts has a large existing database of diverse contacts that will assist in the recruitment effort. This includes relationships with several different types of community centers in the region, including the YMCA, Frederick Housing Authority, Latin American Youth Center (LAYC), Horton’s Kids, and others. In the past, EurekaFacts has also specifically targeted low-SES populations and recruited English Language Learner students. Existing relationships and diverse recruitment strategies are expected to result in a demographically mixed sample of both high- and low-performing students.

Although a considerable number of interviews/small scale tryouts will be conducted at the EurekaFacts interview site in Rockville, Maryland, EurekaFacts will also conduct these activities at organizations, community centers, and other interview sites. The need for parents to transport the students to interview/tryout sites can be a limitation especially when recruiting in low income areas. As a result, EurekaFacts will send personnel to conduct interviews/tryouts at the organization or community center, upon approval from the centers, in order to remove the travel burden from parents and students. Interviews/tryouts will be completed at a time and location that is most convenient for the parents and students. This approach is expected to help encourage participation of students in low income areas.

At community centers and organizations, initial contact will be made once via e-mail, most commonly with the head or leader of the organization, followed by up to three contact attempts via phone. Prior NAEP studies have shown that outreach is more effective and response rates improve when initial written contact is followed by a phone call, because the contact person already has some preliminary information about the research effort and the reason for the call.

When recruiting individual participants, EurekaFacts staff will first speak to the parent/legal guardian of the interested minor before starting a screening process (see Appendices). During this communication, the parent/legal guardian will be informed about the objectives, purpose, and participation requirements of the data collection effort as well as the activities that it entails. Interested participants will be screened to ensure that they meet the criteria for participation in the tryout (e.g., their parents/legal guardians have given consent and they are from the targeted demographic groups outlined above). After confirmation that participants are qualified, willing, and available to participate in the research project, they will receive a confirmation email/letter and phone call. Informed consent from parents/legal guardians will be obtained for all respondents who are interested in participating.

Data Collection Process

For the study, there will be two writing tasks, each with three versions: an original version, an enhanced version that removes as many of the hypothesized inhibiting features as possible, and a third version designed to test a specific hypothesis regarding inhibiting or facilitating multimedia features. The small scale tryouts will obtain responses to each of the six task versions that can be scored using standard NAEP guidelines and provide a (qualitative) indication of the relative difficulty of each version.

Cognitive Interviews

Interviewers will implement a script and a structured interview protocol. The interview will last up to one hour. At the beginning, the interviewer will provide an explanation of the study and the procedures to follow. The interviewer will demonstrate the think-aloud process and then the student will practice a think-aloud to ensure an understanding of the process.

The remainder of the cognitive interview will be organized around two main activities. First, the student will write a response to one of the three versions of one task, while reporting on their experience through a combination of concurrent think-aloud, retrospective think-aloud, and responses to probes. Second, the student will compare this version of the task with a second version of the same task and will be probed regarding their perceptions of the relative clarity/ease of interpretation of the two task versions.

The protocol also contains generic prompts that can be applied flexibly by the interviewer to facilitate and encourage students to verbalize their thoughts, such as “What are you thinking?”

Following the comparison of the two versions of the same task, the student will complete a short written survey about their experience with computer-based school tasks (see Volume II). The participant and parent(s) will then be thanked and will receive the promised incentive amount, and the parent(s) and the student participant will each sign a receipt for their incentive payment. The interview notes will be organized in the template provided by AIR and delivered to AIR for analysis. Both audio and video recordings of the cognitive interviews will be made. These data, along with the student’s written response, will be secured for the duration of the study and will be destroyed three months after the final report is submitted (see section 7 for confidentiality safeguards).

Small Scale Tryouts

During small scale tryouts, students will work uninterrupted through two tasks (one of three versions of each of the two writing tasks included in the cognitive interviews). This allows for data to be gathered quickly on normal and uninterrupted task performance. The tryout process in this study will follow a script and a proctoring protocol.

Each group session will be conducted by a trained proctor and, depending on the number of students scheduled, a proctor’s assistant. Students will work on the tasks in classroom-like settings, in groups ranging in size from 4 to 6 students per session. At the beginning of the session, the proctor will provide an explanation of the study and the procedure to be followed during the session. The students will then work individually on completing the two writing tasks assigned to them. Students will be given a maximum of 30 minutes to complete the first task, will be asked to stop after 30 minutes, and will then be instructed to move on to the second task, for which they will also be given 30 minutes. Finally, the students will complete the same brief written survey as those participating in the cognitive interviews. Overall, the session will take up to 90 minutes.

Again, the goal of the tryouts is to gather authentic, uncontaminated task performance data. Therefore, students will work through tasks at their own pace and without interruption (up to the 30-minute limit for each task). As with the cognitive interviews, students will read the writing prompt and view the associated video online, but write their responses on paper. The proctor and proctor’s assistants will be available to answer any questions the students may have about the procedure, but not about the substance of the writing tasks. The proctors will record any questions or technical problems the students experience, and will code observable behavioral proxies of boredom, inattentiveness, confusion, etc.

Finally, the participant and parent(s) will be thanked and will receive the promised incentive amount, and both the parent(s) and the student participant will be asked to sign a receipt for their incentive payment. The students’ written responses will be secured for the duration of the study and will be destroyed three months after the final report is submitted (see section 7 for confidentiality safeguards).

Analysis Plan

Cognitive Interviews

For the cognitive interview data collection, documentation will be grouped at the participant level. The types of data collected about writing prompts and components will include:

  • think-aloud verbal reports;

  • process/observable data, such as time spent writing responses and time spent planning;

  • responses to generic questions prompting students to think out loud;

  • responses to targeted questions specific to the assigned tasks;

  • additional volunteered participant comments; and

  • answers to contextual survey questions.

The data collected from the cognitive interviews will be compiled to identify patterns of responses for tasks, including patterns of responses to probes, or types of actions observed from students at specific points while working through the writing tasks.

This approach will help ensure that the data are analyzed thoroughly and systematically. In this way, the analysis strategy will enable the identification of visual and interactive features that facilitate or inhibit comprehension and performance, and the development of recommendations for addressing these types of features in tasks involving multimedia. In addition, students’ survey data will be analyzed and related to their reported task difficulty.

There will be no reporting of the performance data related to students’ responses to the one task; however, these responses will be scored to provide context to students’ comments and verify the validity of their verbal responses to the interviewer probing (e.g., did students say something was easy when they provided a written response that would earn a very low score?).

A summary report will be produced, which will include a description of participant characteristics, and positive and negative reactions to visual and interactive features within tasks. Principles drawn from the findings may be used to guide a future experimental study intended to provide confirmatory information about students’ writing performance as a function of the task modification.

Small Scale Tryouts

For the small scale tryout group administration data collections, documentation also will be grouped at the participant level. Scores of participants’ writing responses will be compiled along with their responses to survey questions on their computer-based learning experiences. Data will be analyzed to examine whether modifications of the writing tasks (that is, the introduction or removal of facilitating and/or inhibiting features) result in differences in performance.
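
As an illustration of the kind of comparison implied above, the minimal Python sketch below groups scored responses by task and version and reports simple descriptive statistics. It is hypothetical: the records, field names, and scores are invented for illustration only, and actual responses will be scored using standard NAEP scoring guidelines.

```python
# Hypothetical sketch of the descriptive comparison implied above: group
# scored responses by task and version, then report simple summary statistics.
# Records, field names, and scores are invented for illustration only.
from collections import defaultdict
from statistics import mean

# Each record: (task_id, version, score). Scores here are illustrative only.
scored_responses = [
    ("task_A", "original", 2), ("task_A", "enhanced", 3),
    ("task_A", "hypothesis", 3), ("task_B", "original", 1),
    ("task_B", "enhanced", 2), ("task_B", "hypothesis", 2),
]

by_version = defaultdict(list)
for task, version, score in scored_responses:
    by_version[(task, version)].append(score)

for (task, version), scores in sorted(by_version.items()):
    print(f"{task} / {version}: n={len(scores)}, mean score = {mean(scores):.2f}")
```

Given the small tryout sample (24 students spread across six task versions), such summaries would be read only as qualitative indications of the relative difficulty of each version, consistent with the study's stated purpose.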

  4. Consultations Outside the Agency

The American Institutes for Research (AIR), under contract to NCES, will analyze results and draft a report.

  5. Justification for Sensitive Questions

Throughout the development process, effort has been made to avoid asking for information that might be considered sensitive or offensive.

  6. Paying Respondents

To encourage student participation and thank students for their time and effort, a gift card from a major credit card company will be offered to each participating student. For the cognitive interviews, which will run for 60 minutes, we will offer each student a $25 gift card. For the small scale tryouts, which will run for 90 minutes, we will offer a $35 gift card. If a parent or legal guardian brings their student to and from the interview/tryout site, they will also receive a $25 gift card as a thank you for their time, effort, and transportation of their child.

  7. Assurance of Confidentiality

Students taking part in the cognitive interviews or tryouts will be notified that their participation is voluntary and that all the information they provide may be used only for statistical purposes and may not be disclosed, or used, in identifiable form for any other purpose except as required by law (20 U.S.C. §9573 and 6 U.S.C. §151). Written consent will be obtained from parents or legal guardians of student participants. Participants will be identified by unique identifiers, with the code book indicating their true identity kept under lock and key in a data storage area at AIR. The consent forms, which include the participant’s name, will be separated from the interview files and secured for the duration of the study. They will be destroyed after the final report is completed. The interviews will be recorded in both audio and video format. The only identification included on the recorded files will be the unique identifiers. The recorded files will be secured for the duration of the study and destroyed after the final report is completed.

  8. Estimate of Hourly Burden

The estimated burden for recruitment assumes attrition throughout the process. Assumptions for approximate attrition rates are 50 percent from initial parent contact to consent form completion and 50 percent from submission of consent form to participation. Cognitive interview sessions will be scheduled for no more than 60 minutes and small scale tryouts of the writing tasks will be scheduled for no more than 90 minutes.
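
For reference, the participant counts in Tables 1 through 3 follow directly from these assumptions: 36 cognitive interview participants imply 72 completed consent forms and 144 initial parent contacts, and 24 tryout participants imply 48 consents and 96 initial contacts. The Python sketch below is illustrative only (it is not part of the study materials); it reproduces the arithmetic behind the burden tables by multiplying the resulting counts by the per-activity time estimates shown there.

```python
# Illustrative sketch of the burden arithmetic behind Tables 1-3; all
# constants (attrition rates, per-activity hours, session lengths) come
# from this supporting statement. Each recruitment row is rounded to a
# whole hour, mirroring the rounding note under each table.

def burden(participants, session_hours, org_contacts):
    """Return (respondents, responses, total hours) for one study component."""
    consents = participants * 2   # 50% attrition from consent to participation
    contacts = consents * 2       # 50% attrition from initial contact to consent

    recruitment_hours = (
        round(org_contacts * 0.33)   # initial contact with organization staff
        + round(contacts * 0.08)     # initial contact and consent form review
        + round(consents * 0.13)     # consent form completion and return
        + round(consents * 0.05)     # confirmation email or letter to parent
    )
    respondents = org_contacts + contacts + participants
    responses = org_contacts + contacts + 2 * consents + participants
    total_hours = recruitment_hours + participants * session_hours
    return respondents, responses, round(total_hours)

# Cognitive interviews: 36 students, 1-hour sessions, 6 organizations contacted.
print(burden(36, 1.0, 6))   # (186, 330, 63), matching Tables 1 and 3
# Small scale tryouts: 24 students, 1.5-hour sessions, 4 organizations contacted.
print(burden(24, 1.5, 4))   # (124, 220, 53), matching Tables 2 and 3
```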

Table 1. Estimate of Hourly Burden for Cognitive Interviews

Respondent | Number of respondents | Number of responses | Hours per respondent | Total hours
Student Recruitment via Organizations
  Initial contact with staff, flyer distribution, & planning | 6 | 6 | 0.33 | 2
Parent or Legal Guardian
  Initial contact and consent form review | 144 | 144 | 0.08 | 12
  Consent form completion and return | 72* | 72 | 0.13 | 9
  Confirmation to parent via email or letter | 72* | 72 | 0.05 | 4
Recruitment Totals | 150 | 294 |  | 27
Participation (Cognitive Interviews)
  Students | 36 | 36 | 1.0 | 36
Cognitive Interview Totals | 36 | 36 |  | 36
Total Burden | 186 | 330 |  | 63

* Subset of initial contact group, not double counted in the total number of respondents.

Note: numbers have been rounded and therefore may affect totals.


Table 2. Estimate of Hourly Burden for Small Scale Tryouts

Respondent | Number of respondents | Number of responses | Hours per respondent | Total hours
Student Recruitment via Organizations
  Initial contact with staff, flyer distribution, & planning | 4 | 4 | 0.33 | 1
Parent or Legal Guardian
  Initial contact and consent form review | 96 | 96 | 0.08 | 8
  Consent form completion and return | 48* | 48 | 0.13 | 6
  Confirmation to parent via email or letter | 48* | 48 | 0.05 | 2
Recruitment Totals | 100 | 196 |  | 17
Participation (Small Scale Tryouts)
  Students | 24 | 24 | 1.5 | 36
Small Scale Tryout Total | 24 | 24 |  | 36
Total Burden | 124 | 220 |  | 53

* Subset of initial contact group, not double counted in the total number of respondents.

Note: numbers have been rounded and therefore may affect totals.

Table 3. Total Estimate of Burden Across All Study Activities

Pretesting activity | Number of respondents | Number of responses | Burden hours
Cognitive Interviews | 186 | 330 | 63
Small Scale Tryouts | 124 | 220 | 53
Total Burden | 310 | 550 | 116


  9. Costs to Federal Government

The estimated cost to the federal government for the NAEP grade 4 writing cognitive interview and small scale tryout activities is $349,088, as shown in Table 4.

Table 4. Estimate of Costs to Federal Government

Activity | Provider | Cost
Design, material preparation, incentives, coding, scoring, analysis, and reporting | AIR | $232,513
Prepare for and administer cognitive interviews (including recruitment & data collection) | EurekaFacts | $116,575
Total Cost Estimate |  | $349,088



  10. Project Schedule

Table 5 depicts the high-level schedule for the study. Both study components—cognitive interviews and small scale tryouts—will proceed on the same schedule.

Table 5. Timeline for NAEP Grade 4 Writing Cognitive Interviews and Small Scale Tryouts Study

Activity | Dates
Participant recruitment | June 2017 – August 2017
Data collection, preparation, and coding | June 2017 – August 2017
Data analysis | August 2017 – September 2017
Summary report | November 2017

