ATC21S Justification


NCES Cognitive, Pilot, and Field Test Studies System

OMB: 1850-0803


Memorandum United States Department of Education

Institute of Education Sciences

National Center for Education Statistics



DATE: February 1, 2011

TO: OMB

FROM: Daniel McGrath, NCES

THROUGH: Kashka Kubzdela, NCES

SUBJECT: ATC21S Phase 2.2 Cognitive Interviews (OMB No. 1850-0803 v.41)

Submission-Related Information

The following material is being submitted under the National Center for Education Statistics (NCES) clearance agreement (OMB #1850-0803), which provides for NCES to improve methodologies, question types, and/or delivery methods of its survey and assessment instruments by conducting field tests, focus groups, and cognitive interviews. The request for approval described in this memorandum is to conduct the second of four phases of development and testing of assessment tasks for 6th, 8th, and 10th grade students in what are considered to be 21st century skills, and to recruit participants for the third phase. The request for clearance for pilot test activities of the third phase will be submitted at a later date. The four phases of the project are task validation of the assessment prototype, cognitive labs, a pilot test, and a field test. The following supporting documents are attached to this memo:

  • Appendices:

    • A. Phase 2.2 Cognitive Lab recruitment materials, consent forms, protocol

    • B. Phase 2.3 Pilot Test recruitment materials, consent forms

    • C. Cognitive Lab Observation Notes with screenshots of ICT Assessment

  • WestEd Affidavit of Non-disclosure

Background

The Assessment and Teaching of 21st Century Skills (ATC21S) project is focused on defining what are considered to be 21st century skills and on developing and testing instruments to assess those skills. The two areas to be developed are literacy in information and communication technology (ICT) and collaborative problem solving, areas that are thought to play an important role in students’ preparation for 21st century workplaces. The project is coordinated by Cisco Systems Inc., Intel Corporation, and Microsoft Corp. and led by representatives of six “Founder” countries that have agreed to test the assessments developed in the ATC21S project: Australia, Finland, Portugal, Singapore, the United Kingdom, and the United States.

In 2009, ATC21S convened five working groups of experts to review the state of the art and identify key issues to be resolved in: conceptualizing the domain of 21st century skills; developing methodology and technology for assessing these skills; understanding the relationship among these skills, instruction, and other learning opportunities; and developing a policy framework for implementing assessments of the skills. This work resulted in five White Papers designed to inform the future stages of the project:


The 21st Century Skills white paper identified high-priority 21st century skills, defined them in operational terms, and gave examples both of these skills as they are enacted in a range of real-world situations and of assessment tasks and scoring rubrics that would provide evidence of students' level of mastery. However, for those definitions to be translated into assessments, they required further development, accompanied by formulation of hypotheses concerning the nature and characteristics of the developmental learning continua associated with each of the skills.


Methodological Issues white paper identified and addressed methodological problems in both summative and formative assessment of 21st century skills. It gave particular attention to ICT-enabled and large-scale assessment.


Technological Issues white paper identified and analyzed technological problems confronted in ICT-based assessment of 21st century skills and proposed solutions to these problems.


Classroom Learning Environments and Formative Evaluation white paper reviewed, synthesized and evaluated innovative ways to improve individual and group development of 21st century skills, considering both formal and informal learning opportunities. The latter are frequently excluded from official curricula and formal assessments, though they can be very important for future success.


Policy Frameworks for New Assessments white paper developed a framework for moving from small marginal pilot projects to implementing new forms of assessment within a coherent teaching and learning system. The paper focused on the systemic reform needed to achieve this shift.


In 2010, the ATC21S Executive Board (made up of senior officials from each of the founder countries and representatives from Cisco, Intel, and Microsoft) approved a focus on two 21st century skill domains promising for measurement: (1) Collaborative Problem Solving and (2) ICT Literacy – Learning in Digital Communities. Panels of experts were convened to define the constructs to be measured.


The Collaborative Problem Solving panel, under the leadership of Professor Friedrich Hesse (University of Tübingen/Knowledge Media Research Center, Germany) and Professor Eckhard Klieme (German Institute for International Educational Research), defined the construct of collaborative problem solving. The ICT Literacy – Learning in Digital Communities panel, under the leadership of Dr. John Ainley (Australian Council for Educational Research, Australia), defined the construct of digital literacy and social networking. The panels developed theoretical frameworks for their domains and proposed a series of proficiency levels to guide assessment task developers in the next phase of the project.


Web-based assessments designed to measure literacy in information and communication technology (ICT) were developed in draft form in the United States, and assessments to measure collaborative problem solving were developed in draft form in the United Kingdom. The development work is funded by the Cisco, Intel, and Microsoft consortium under a separate contract.


The end products of web-based assessments and supporting strategies will be made available in the public domain for classroom use. NCES plans to use the knowledge gained about web-based assessments of 21st century skills in development of items for its longitudinal surveys, international assessments, and the National Assessment of Educational Progress (NAEP).

Design and Context

NCES has contracted WestEd (henceforth, the contractor) to implement all phases of the U.S. National Feasibility Test project. Specifically, the contractor will work with the two development contractors, the Berkeley Evaluation and Assessment Research (BEAR) Center at the University of California, Berkeley, and World Class Arena Ltd. in the UK to test the ICT Literacy and Collaborative Problem Solving assessments in the field through different phases of implementation and report on the findings. The findings will be used to further refine the assessments until a final version is achieved. Eugene Owen and Dan McGrath are the project leads at NCES.


Phases completed to date


Phase 2.1 – Validation of Task Concepts involved the contractor recruiting 9 teachers (3 who taught 6th grade, 3 who taught 8th grade, and 3 who taught 10th grade) to preview the initial assessment prototype for the ICT Literacy assessment and to collect the teachers’ feedback on the task concepts as they relate to the grade level they teach. At the time of this submission, the Collaborative Problem Solving assessments are still under development by World Class Arena Ltd. and have not yet been through this Validation of Task Concepts phase (validation to be performed under a separate contract).


Phases for which OMB approval is sought


Phase 2.2 – Cognitive Lab think-aloud sessions involve the contractor observing a series of individual students as they work through each of the ICT Literacy prototype tasks and collecting metacognitive data during the process. The Collaborative Problem Solving assessments are not yet developed, and approval for them will be sought in a future submission. The cognitive lab think-aloud session is designed to reveal the cognitive processes that students use as they complete the ICT Literacy tasks. In particular, data will be gathered on students’ ability to understand the task instructions, navigate the computer interface, complete the tasks, respond to questions, and provide a valid response. The sessions are designed to identify usability problems with the interface and navigation, and to provide validity evidence that, in completing the tasks, students actually use the ICT Literacy skills the tasks are designed to assess, before the pilot test phase that follows. The results of the cognitive labs will inform the refinement of the interface and tasks before they are administered to a larger number of students in the pilot test, thus improving the quality of the assessment by reducing construct-irrelevant variance.


Sampling Plan and Recruitment

Individual students will be selected from the 6th, 8th, and 10th grades. A total of 36 students from the 3 grade levels will be recruited to participate. Recruitment will be done via the nine teachers who participated in the first phase of the project, in which they reviewed early versions of the assessments. Those teachers will pass recruitment information to students in their classes. If that initial recruitment effort does not produce the required sample, we will extend the recruitment to other teachers in the San Francisco Bay Area. The recruitment materials are provided in appendix A.


Overview of the Assessment Tasks

Three scenarios have been developed by the BEAR Center to assess the Information and Communications Technology (ICT) Literacy 21st century skills. Within each scenario, students undertake a series of tasks. The scenarios that BEAR has developed are:


• Scenario 1: Webspiration - a poetry-based environment

• Scenario 2: Arctic Trek - a natural adventure-based environment

• Scenario 3: Second Language Chat - a peer-based language learning environment


Although the ICT tasks are set in scenarios that relate to typical content areas such as English Language Arts, Science, Mathematics, and Languages, the assessment focus of the tasks themselves is the ICT Literacy skills domain. The ICT Literacy skills domain is broken out in the assessment into four sub-domains:

• Functioning as a consumer in networks

• Functioning as a producer in networks

• Participating in the development of social capital through networks

• Participating in intellectual capital (collective intelligence) in networks

Different tasks throughout the three scenarios map to these sub-domains.


Webspiration

The setting of this scenario is reading and analysis of well-known poems as part of a poetry work unit. The tasks in the scenario involve students in articulating the moods and meanings of particular poems. Students use Webspiration, a concept mapping tool, to formulate their own ideas on the poems, to create an idea map collaboratively, and to analyze each poem they read. Students submit their own ideas or build on classmates’ thoughts. Figure 2 shows a screenshot of a task from the scenario.


The task taps students’ skills in using computer interfaces, performing basic IT tasks, searching using common search engines, identifying tools for networking, producing simple representations from templates, reading simple displays, participating productively in social networking sites, and beginning to organize and reorganize information digitally. A complete set of screenshots from the tasks in the Webspiration scenario is shown in the cognitive laboratory protocol in appendix C.

Arctic Trek

One potential mechanism for assessing the learning-in-networks aspect of ICT literacy is to model assessment practice through a set of exemplary classroom materials. This scenario is based on the Go North/Polar Husky information website (www.polarhusky.com) run by the University of Minnesota (see Figure 3). The Go North website is an online adventure learning project based around arctic environmental expeditions. The website is a learning hub with a broad range of information and many different mechanisms to support networking with students, teachers, and experts. ICT literacy resources developed relating to this scenario focus on the Consumer in Networks sub-domain.



Figure 2. Screenshot of a sample task from the Webspiration scenario



Figure 3. Arctic Trek Resources


The tour through the site for the ATC21S demonstration task is conceived as a "collaboration contest," or virtual treasure hunt. The Arctic Trek task views social networks through ICT as an aggregation of different tools, resources, and people that together build community around areas of interest. In this task, students in small teams ponder tools and approaches to unravel clues while touring the scientific and mathematics expeditions of actual scientists. The task helps model for teachers how to integrate technology across subjects. It also shows how the Go North site uses spaces for self-representation, chat, and dialogue as forms of ICT literacy. A complete set of screenshots from the tasks in the Arctic Trek scenario will be shown in the cognitive laboratory protocol in appendix C when the task has been updated.


Second Language Chat

This scenario is developed as a peer-based second language learning environment through which students interact as they learn. There seems to be consensus that developing proficiency in a second language (or in one's mother tongue) requires ample opportunities to read, write, listen, and speak. The assessment tasks ask students to set up a technology/network-based chat room, invite participants, and facilitate a chat, in two languages. The tasks also involve evaluating the chat and working with virtual rating systems and online tools such as spreadsheets.


Worldwide, "conversation partner" language programs have sprung up in recent years. They bring together students wishing to practice a language with native speakers, often in far-flung parts of the world. The cultural and language exchanges that result demonstrate how schools can dissolve the physical boundaries of walls and classrooms. They also tap rich new learning spaces through the communication networks of ICT literacy. A complete set of screenshots from the tasks in the Conversation Partners scenario is shown in the cognitive laboratory protocol in appendix C.



Figure 4. Language learners will engage with native speakers in the "Conversation Partners" task


Data Collection Process

Cognitive labs are commonly used in the development of complex tasks in web-based assessments1. The contractor will conduct the cognitive labs in a process that reflects this approach. In cognitive labs, an interviewer uses a structured protocol in a one-on-one interview using two methods: think-aloud interviewing and verbal probing techniques. In think-aloud interviewing, students are explicitly instructed to “think aloud” (i.e., describe what they are thinking) as they interact with the tasks on screen and apply their knowledge and skills to solving the problems. The student works through each set of tasks while a software application captures the activities on the screen (mouse movements, clicks, and text entries) and simultaneously records the audio stream as the student speaks. In this way, a complete and synchronized record of the student’s actions and thought processes is captured and can be replayed after the session for more detailed analysis than is possible in the live session. With verbal probing techniques, the interviewer asks probing questions, as necessary, to clarify points that are not evident from the “think aloud” process. These probes might include, for example, asking a student who appears to be hesitating whether the hesitation reflects a cognitive issue with the task or confusion about the navigation on the screen.
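The synchronized screen-and-audio capture described above amounts to a timestamped event log that can be replayed segment by segment. As a purely illustrative sketch (the record layout and class names here are hypothetical, not the actual capture software), such a log might look like:

```python
import time
from dataclasses import dataclass, field

@dataclass
class Event:
    t: float        # seconds since session start
    kind: str       # e.g. "click", "keypress", "mouse_move"
    detail: str     # free-text description of the action

@dataclass
class SessionLog:
    """Timestamped record of on-screen activity for one lab session."""
    start: float = field(default_factory=time.monotonic)
    events: list = field(default_factory=list)

    def record(self, kind: str, detail: str) -> None:
        # Store the event with its offset from the session start, so it
        # can later be aligned with the same offset in the audio track.
        self.events.append(Event(time.monotonic() - self.start, kind, detail))

    def events_between(self, t0: float, t1: float) -> list:
        """Return events in [t0, t1), e.g. a segment the observer flagged."""
        return [e for e in self.events if t0 <= e.t < t1]

log = SessionLog()
log.record("click", "submit button")
log.record("keypress", "typed answer text")
flagged_segment = log.events_between(0.0, 10.0)
```

Keying every event to an offset from the session start is what makes post-session replay against the audio recording straightforward.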


At the same time as the student is working through the scenarios and the tasks within them, the interviewer uses a paper-based protocol to follow along with the student. The cognitive lab protocol is included in appendix A. For each screen of the scenario, the interviewer has one sheet that shows an image of the screen the student is working on and check boxes that list the tasks to be completed on that screen and the ICT Literacy skills that the task was designed to elicit. Other check boxes list interface usability aspects for recording if the student has problems interpreting or using the interface. This part of the protocol involves the interviewer observing as the student works and noting, using the check boxes, whether or not the student used the cognitive skills that the ICT Literacy task was designed to elicit and if there were usability issues. The interviewer can also make other relevant notes in a general comments section on each page and indicate on the picture of the screenshot where the student was having difficulty.


To be effective in administering the cognitive labs, the interviewer will need to develop a clear understanding of the scenarios and tasks and of the cognitive processes assumed to underpin the tasks, as well as employ good communication skills appropriate to interacting with the individuals taking the task. A student will take approximately 90 minutes to complete the tasks in the cognitive lab session, and with introduction and follow-up, the session length will be two hours. Students will be recruited for the sessions via their teachers, who will be provided with a flyer for students and their parents. As students will be giving up personal time after school to participate in the cognitive labs, each will be given a $10 gift card, and because parents will have to make arrangements to collect their child after the session, parents will be given a $15 gift card.


Data Analysis

The cognitive lab session produces a mix of quantitative data (e.g., counts of the number of students who used the skills the tasks were designed to test) and qualitative data (e.g., descriptions of the issues that students were having with the tasks and content or with the interface). The purpose is to identify and correct problems of ambiguity or misunderstanding, or other difficulties respondents have in performing the tasks and responding to questions. The screen and audio recordings are used to review particular parts flagged as important in the observer's handwritten notes. The unit of analysis will be the tasks within each scenario. For each task, spreadsheets will be compiled to summarize the data collected. These will include counts of each assessed skill that students demonstrated and counts of usability issues, together with qualitative descriptions of how students applied the skills being assessed and of the usability issues they encountered. These data summaries will allow for identification of patterns of responses and issues across the whole student sample. The results will enable the assessment and software developers to make changes to the scenarios and tasks to make them easier to understand, and therefore less burdensome for students, while also ensuring that the tasks target the skills of interest for the assessment rather than skills or knowledge that are extraneous to the ICT Literacy domain.
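The per-task tally described above can be illustrated with a small aggregation sketch (hypothetical: the record fields, task names, and skill labels below are invented for illustration and do not come from the protocol):

```python
from collections import defaultdict

def summarize_observations(observations):
    """Aggregate per-student checkbox records into per-task summaries:
    counts of each skill demonstrated, a count of usability issues,
    and the qualitative issue notes."""
    summary = defaultdict(lambda: {"skill_counts": defaultdict(int),
                                   "usability_issue_count": 0,
                                   "issue_notes": []})
    for obs in observations:
        task = summary[obs["task"]]
        for skill in obs.get("skills_used", []):
            task["skill_counts"][skill] += 1
        issues = obs.get("usability_issues", [])
        task["usability_issue_count"] += len(issues)
        task["issue_notes"].extend(issues)
    return summary

# Two invented student records for one task:
obs = [
    {"task": "Webspiration-1", "skills_used": ["search"],
     "usability_issues": []},
    {"task": "Webspiration-1", "skills_used": ["search", "navigate"],
     "usability_issues": ["unclear button label"]},
]
result = summarize_observations(obs)
```

Summaries of this shape make it easy to scan for tasks where few students demonstrated the targeted skill or where usability issues cluster.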


Phase 2.3 – Pilot Test (this submission seeks approval for recruitment only; approval for administration will be sought when pilot test materials are completed). The Pilot Test follows after the BEAR Center development team at UC Berkeley, based on findings from the cognitive labs phase, modifies the tasks and completes programming to produce a pilot version of the assessments. Pilot testing involves administering the pilot version of the assessments in the classrooms of 3 teachers, one for each targeted grade level (6th, 8th, and 10th grades). The Pilot Test will obtain data on how effectively the assessments measure the targeted skills, test the feasibility of administering complex web-based assessments in a school setting, and test the data capture and measurement methods involved in interpreting the complex responses that these types of assessment will generate. The contractor will recruit and train the 3 teachers to administer the assessment to their classes. Teachers will be recruited from contacts that the contractor has in various schools across the country. Teachers will be contacted initially via email and then provided with a document that describes the project and what participation involves. The contractor anticipates that teachers who teach English language arts, mathematics, science, or languages and have an interest in teaching 21st century skills or in the use of technology in education are the group most likely to volunteer to participate. The contractor will develop training guides, assessment administration procedures, and technical requirements for this phase. Training will be conducted online using webinar software so that the contractor can walk each teacher through the administration process.
Teachers will also have to book school computer rooms for the administration, talk with the school's computer network support person to ensure that the computers meet the minimum requirements for the administration, and make sure that student logins are set up prior to the administration. The assessments will be administered in two 45-minute class periods. Assuming each of the 3 teachers has a class of 25 students, an estimated 75 students will take the assessments. Student responses will be recorded and stored electronically for analysis of the whole data set. Because the 3 teachers will serve in the role of school and IT coordinators over a period of time, with responsibility for securing the space and computers, ensuring that the software to be tested runs adequately on all of the school computers to be used, recruiting students, collecting parental consent forms, participating in training, and implementing the assessment over two class periods, they will be paid $200 for their time and effort.


Phases for which OMB approval will be sought in future submissions under OMB# 1850-0803


Phase 2.4 – The Field Test is designed to provide sufficient data to establish empirically based scales that can indicate students’ place and progress on the developmental continua associated with each of the 21st century skill sets assessed. During the field test, the administration of many of the tasks will be fully automated and should not require teacher input beyond a supervisory role during the administration, although, as in the Pilot Test, teachers will be required to book school computer rooms for the administration and talk with the school's computer network support person to ensure that the computers meet the minimum requirements. The teacher will also need to make sure that student logins are set up prior to the administration. A web-based delivery system will deliver the field test version of the assessment.


Final Report

The contractor will produce and deliver a written report summarizing the project, the activities, and the findings by December 31, 2011.

Use of Results

  • The cognitive laboratories will collect information about the way students go about engaging with the task and identify any functional issues that need to be addressed in fine-tuning the task.

  • In the future phases (for which separate OMB submissions will be made), the pilot test will provide data on the feasibility of the assessments being administered in school settings, and the field test will gather enough data to calibrate the scales used to measure the skills. All personally identifying information will be removed from the resulting data set.

  • End products of the overall project will be made publicly available for use in classrooms and will inform NCES’s development of assessment tasks in its longitudinal studies, international assessments, and NAEP.


Results and data from both activities will be compiled by the contractor and provided in usable form to NCES and to the international research coordinator, who will direct the assessment developers to make appropriate changes to the assessments (NCES will provide direction through the international research coordinator).

Assurance of Confidentiality

Cognitive Lab “Think Aloud” participants are minors who will need to have consent letters (see the attached Parent Consent Letter) read and signed by a parent or guardian. Submission of these letters is a condition of participation. Students will be informed of the activity asked of them and can volunteer or stop participating at any point of the Cognitive Lab session. Any personally identifying information that students provide to facilitate engagement in the assessment (e.g., an email address) will be de-identified and not used after the cognitive lab session.


Pilot Test teacher participants will be asked to sign a Teacher Consent form provided by the contractor as a condition of participation. Student response data will be de-identified through use of a unique ID number so that individual students cannot be identified in the data set collected for analysis.


Project Schedule


Estimate of Hour Burden

Teachers who teach English language arts, mathematics, science, or languages in the 6th, 8th, and 10th grades are ideal recruits for the different phases of the project. By extension, their students can voluntarily participate from Phase 2.2 (Cognitive Labs) through Phase 2.4 (Field Test). However, this OMB package requests approval only for the time burden associated with Phase 2.2 (Cognitive Labs) and with recruitment for Phase 2.3 (Pilot Test).


Phase of Work | Activity | Number of Recruits | Estimated Hours | Total Burden Hours
Phase 2.2 Cognitive Labs | Recruit students and collect parent consents | 9 teachers | 1 hour | 9
Phase 2.2 Cognitive Labs | Read and sign parent permission letter | 36 parents | 10 minutes | 6
Phase 2.2 Cognitive Labs | “Think Aloud” lab participation | 36 students | 2 hours | 72
Phase 2.3 Pilot Test (recruitment approval only) | Recruiting process | 3 teachers | 1 hour | 3
Phase 2.3 Pilot Test (recruitment approval only) | Send and collect parent opt-out forms | 3 teachers | 2 hours | 6
Phase 2.3 Pilot Test (recruitment approval only) | Read, sign, and return opt-out forms | 75 parents | 15 minutes | 19


Estimate of Cost Burden

There is no direct cost to teachers and students.

Cost to the Federal Government

Phase 2.2 Cognitive Labs & Phase 2.3 Pilot Test: $127,741
Phase 2.4 Field Test: $75,201
Final Report: $32,185


1 See Forsyth, B., & Lessler, J. T. (1991). Cognitive laboratory methods: A taxonomy. In P. P. Biemer, R. M. Groves, L. E. Lyberg, N. A. Mathiowetz, & S. Sudman (Eds.), Measurement errors in surveys. New York: Wiley; Nolin, M. J., & Chandler, K. (1996). Use of cognitive laboratories and recorded interviews in the National Household Education Survey (NCES 96-332). Rockville, MD: Westat; Zucker, S., Sassman, C., & Case, B. J. (2004). Cognitive labs. San Antonio, TX: Harcourt Assessment.


File Type: application/vnd.openxmlformats-officedocument.wordprocessingml.document
File Title: Memorandum
Author: Mike Timms
