Justification

ATC21S Pilot Test Phase 2.3 Memo.docx

NCES Cognitive, Pilot, and Field Test Studies System


OMB: 1850-0803


Memorandum United States Department of Education

Institute of Education Sciences

National Center for Education Statistics


DATE: May 11, 2011

TO: Shelly Martinez, OMB

FROM: Daniel McGrath, NCES

THROUGH: Kashka Kubzdela, NCES

SUBJECT: ATC21S: US Pilot Test (OMB# 1850-0803 v.49)

Submittal-Related Information

The following material is being submitted under the National Center for Education Statistics (NCES) clearance agreement (OMB #1850-0803), which provides for NCES to improve methodologies, question types, and/or delivery methods of its survey and assessment instruments by conducting field tests, focus groups, and cognitive interviews. This memorandum requests approval to conduct the third of four phases of development and testing of assessment tasks in what are considered to be 21st century skills for 6th, 8th, and 10th grade students, and to recruit participants for the fourth phase. The request for clearance for field test data collection of the fourth phase will be submitted at a later date. The four phases of the study are task validation of the assessment prototype, cognitive labs, a pilot test, and a field test. The following supporting documents are attached to this memo:

  • Appendices:

    • A. Phase 2.3 Pilot test training and administration materials

    • B. Phase 2.4 Field Test recruitment materials and consent forms

    • C. Screenshots of ICT Assessment

    • D. WestEd Affidavit of Non-disclosure

Background

The Assessment and Teaching of 21st Century Skills (ATC21S) project is focused on defining what are considered to be 21st century skills. The two areas to be developed are literacy in information and computer technology (ICT) and collaborative problem solving, areas that are thought to play an important role in students’ preparation for 21st century work places. The project is coordinated by Cisco Systems Inc., Intel Corporation, and Microsoft Corp and led by representatives of six “Founder” countries that have agreed to test the assessments developed in the ATC21S project: Australia, Finland, Portugal, Singapore, the United Kingdom, and the United States.

In 2009, ATC21S convened five working groups of experts to review the state of the art and identify key issues to be resolved in conceptualizing the domain of 21st century skills; developing methodology and technology for assessing these skills; understanding the relationship between these skills, instruction, and other learning opportunities; and developing a policy framework for implementing assessments of the skills. Short descriptions of the resulting five White Papers, designed to inform the future stages of the project, were provided in the approved clearance package for ATC21S cognitive interviews (OMB# 1850-0803 v.41, approved on February 23, 2011).


In 2010, the ATC21S Executive Board (made up of senior officials from each of the founder countries and representatives from Cisco, Intel, and Microsoft) approved a focus on two 21st century skill domains promising for measurement: (1) Collaborative Problem Solving and (2) ICT Literacy – Learning in Digital Communities. Panels of experts were convened to define the constructs to be measured.


The Collaborative Problem Solving panel, under the leadership of Professor Friedrich Hesse (University of Tubingen/Knowledge Media Research Center, Germany) and Professor Eckhard Klieme (German Institute for International Educational Research), defined the construct of collaborative problem solving. The ICT Literacy – Learning in Digital Communities panel under the leadership of Dr. John Ainley (Australian Council for Educational Research, Australia) defined the construct of digital literacy and social networking. Both panels focused on the theoretical framework for their domain and proposed a series of proficiency levels to guide assessment task developers in the next phase of the project.


Web-based assessments designed to measure literacy in information and communication technology (ICT) were developed in draft form in the United States, and assessments to measure collaborative problem solving were developed in draft form in the United Kingdom. The development work is funded by the Cisco, Intel, and Microsoft consortium under a separate contract.


The end products, web-based assessments and supporting strategies, will be made available in the public domain for classroom use. NCES plans to use the web-based assessments in developing items for its longitudinal surveys, international assessments, and, potentially, NAEP.

Design and Context

NCES has contracted WestEd (henceforth, the contractor) to implement all phases of the U.S. National Feasibility Test project. Specifically, the contractor will work with the two development contractors, the Berkeley Evaluation and Assessment Research (BEAR) Center at the University of California, Berkeley, and World Class Arena Ltd. in the UK to test the ICT Literacy and Collaborative Problem Solving assessments in the field through different phases of implementation and report on the findings. The findings will be used to further refine the assessments until a final version is achieved. Eugene Owen and Dan McGrath are the project leads at NCES.


Phases completed to date


Phase 2.1 – Validation of Task Concepts involved the contractor recruiting 9 teachers (3 who teach 6th grade, 3 who teach 8th grade, and 3 who teach 10th grade) to preview the initial assessment prototype for the ICT Literacy assessment and to collect the teachers' feedback on the task concepts as they relate to the grade level they teach. To date, World Class Arena has produced eight collaborative problem solving task drafts. WestEd conducted a task concept review with 6 teachers (two each from grades 6, 8, and 10). Additional collaborative problem solving tasks are under development but had not been received as of this submission. All of the collaborative problem solving tasks will be pilot tested by other countries involved in the project, but not in the US, because they will not be available while US schools are in session. However, all of the collaborative problem solving tasks are expected to be available for the Field Trial in the fall, and further information on the nature of the tasks, along with screenshots from them, will be included in the OMB submission for the Field Trial.


Phase 2.2 – Cognitive Lab think-aloud sessions involved the contractor observing individual students as they worked through each of the ICT Literacy prototype tasks and collecting metacognitive data during the process. Twenty-one students from the 6th, 8th, and 10th grades participated. The Collaborative Problem Solving assessments were not included in the US cognitive lab sessions because they were not ready in time; they are instead being included in cognitive labs in other partner countries in the project. The think-aloud sessions were designed to reveal the cognitive processes that students used as they completed the ICT Literacy tasks. In particular, data were gathered on students' ability to understand the task instructions, navigate the computer interface, complete the tasks, respond to questions, and provide valid responses. The sessions were designed to identify usability problems with the interface and navigation prior to the pilot test and to provide validity evidence that, in completing the tasks, students used the ICT literacy skills the tasks were designed to assess.


The results of the cognitive labs have been used to inform the refinement of the interface and tasks before they are administered to a larger number of students in the pilot test, thus improving the quality of the assessment by reducing construct-irrelevant variance. Analysis of the results of the ATC21S 2011 Cognitive Laboratories with students in Australia, Singapore, and the United States for the Learning in Digital Communities (ICT Literacy) scenarios has led to revisions in the areas of system performance, task content, use of external software tools, and administration. The changes are summarized below.


System performance improvements. The Formative Assessment Delivery System (FADS) was amended to allow students to go back in a task using either the back button or paging icons so that they can easily view and modify their responses. Scroll bars have been provided on free-text response boxes so students can more easily see what they have entered. The text in the tasks will be moved into an external text file from which the task pages load, which makes it easier for non-English-speaking countries to create translations of the text so that it displays in their native language; this feature does not change the way the tasks run for English-speaking countries like the US. In addition, the load time of steps in the tasks has been improved to speed students' progress through the tasks. New benchmarking related to sizing/resolution of some elements, and possibly server location improvements for local contexts, has been implemented. Server timeout issues have been investigated to improve performance for schools connecting with lower internet bandwidths. Testing of system and object improvements will take place, following completion of the revisions, according to the platform/browser specification protocols.


Task content. Minor editing of some of the screen content has been implemented following detailed notes provided via the cognitive laboratory protocols. In particular, new clues have been developed for 6th grade students in the Arctic Trek scenario to simplify the reading for those younger students. In general, the scenario lengths have been reduced to fit into one class period for easier administration, and the scenarios have been divided into a 10-minute "practice" portion and a 45-minute assessment delivery.


External software tools. Updating of some of the external tools has taken place, including purchasing a server service for the Webspiration concept-mapping tool so that it runs more efficiently, and transferring to Microsoft Live@Edu tools where possible. The AudioPals screen may be substituted or shortened. Live@Edu accounts will be automated and provided as coded logins/passwords to countries. Some specific debugging issues in the tasks have been addressed (to date these include the Kodu graphic being screened back to be more transparent so text is readable, portions of the Scenario 1 concept-map object being re-engineered, the display on the Spinner object being updated, and a spacing overlap issue on a recent version of Mac Safari being corrected). A timing icon or other on-screen timing capability in the objects, which may be helpful for time management, will be implemented, funding permitting.


Administration. The Teacher Guide and Assessment Delivery Documents, including specifications and requirements, have been updated. The Assessment Blueprints, which map the tasks to the content domain, have been updated.



Phases for which OMB approval is sought


Phase 2.3 – Pilot Test (recruitment was approved on 2/23/2011; approval for administration is now sought) follows the modification of the tasks and completion of programming by the BEAR Center development team at UC Berkeley to produce a pilot version of the assessments based on findings from the cognitive labs phase. The pilot version of the assessments will be administered in the classrooms of three teachers, one for each grade level (6th, 8th, and 10th). The Pilot Test will obtain data on how effectively the assessments measure the targeted skills, test the feasibility of administering complex web-based assessments in a school setting, and test the data capture and measurement methods involved in interpreting the complex responses that these types of assessments generate. The three teachers will be trained to administer the assessment to their classes; they were recruited from contacts that the contractor has in various schools. The training guides, assessment administration procedures, and technical requirements for this phase are included in Appendix A. Training will be conducted online using webinar software so that the contractor can walk each teacher through the administration process. Teachers will also need to book school computer rooms for the administration, work with the school computer network support person to ensure that the computers meet the minimum requirements, and make sure that student logins are set up prior to the administration. The assessments will be administered in two 45-minute class periods. An estimated 25 students per teacher (75 students total) will take the assessments. Student responses will be recorded and stored electronically for analysis of the whole data set. Teachers will receive $200 for their participation, training, assessment administration, and fulfilling the role of school coordinator.


Overview of the Assessment Tasks

Three scenarios have been developed by the BEAR Center to assess the Information and Communications Technology (ICT) Literacy 21st century skills. Within each scenario, students undertake a series of tasks. The scenarios that BEAR has developed are:


• Scenario 1: Webspiration - a poetry-based environment

• Scenario 2: Arctic Trek - a natural adventure-based environment

• Scenario 3: Second Language Chat - a peer-based language learning environment


Although the ICT tasks are set in scenarios that relate to typical content areas such as English Language Arts, Science, Mathematics, and Languages, the assessment focus of the tasks themselves is the ICT Literacy skills domain. The ICT Literacy skills domain is broken out in the assessment into four sub-domains:

• Functioning as a consumer in networks

• Functioning as a producer in networks

• Participating in the development of social capital through networks

• Participating in intellectual capital (collective intelligence) in networks

Different tasks throughout the three scenarios map to these sub-domains.



Figure 1. Screenshot of a sample task from the Webspiration scenario (A complete set of screenshots from the tasks in the Webspiration scenario is shown in Appendix C)


Webspiration

The setting of this scenario is reading and analysis of well-known poems as part of a poetry work unit. The tasks in the scenario involve students in articulating the moods and meanings of particular poems. Students use Webspiration, a concept mapping tool, to formulate their own ideas on the poems, to create an idea map collaboratively, and to analyze each poem they read. Students submit their own ideas or build on classmates’ thoughts. Figure 1 shows a screenshot of a task from the scenario.


Some of the ICT skills involved in this module are:

• Use a computer interface

• Perform basic IT tasks

• Search for pieces of information using common search engines

• Know that tools exist for networking (e.g. Facebook, Concept Maps)

• Produce simple representations from templates

• Read and interpret simple displays

• Start an identity

• Log into an external website

• Post an artifact

• Participate in a social activity online

• Know about survey tools

• Make tags

• Post a question or answer one

• Begin to organize and reorganize information digitally




Figure 2. Arctic Trek Resources (A complete set of screenshots from the tasks in the Arctic Trek scenario is shown in Appendix C)


Arctic Trek

One potential mechanism for assessing student ability in the learning-network aspect of ICT literacy is to model assessment practice through a set of exemplary classroom materials. This scenario is based on the Go North/Polar Husky information website (www.polarhusky.com) run by the University of Minnesota (see Figure 2). The Go North website is an online adventure learning project based around arctic environmental expeditions. The website is a learning hub with a broad range of information and many different mechanisms to support networking with students, teachers, and experts. The ICT literacy resources developed for this scenario focus on the Consumer in Networks sub-domain.


The tour through the site for the ATC21S demonstration task is conceived as a "collaboration contest," or virtual treasure hunt. The Arctic Trek task views social networks through ICT as an aggregation of different tools, resources and people that together build community in areas of interest. In this task, students in small teams ponder tools and approaches to unravel clues, while touring scientific and mathematics expeditions of actual scientists. The task helps model for teachers how to integrate technology across the subjects. It also shows how the Go North site allows students to use chat and dialogue as forms of ICT literacy.




Figure 3. Language learners will engage with native speakers in the "Conversation Partners" task (A complete set of screenshots from the tasks in the Conversation Partners scenario is shown in Appendix C)

Second Language Chat

This scenario is developed as a peer-based second language learning environment through which students interact while learning. There seems to be consensus that developing proficiency in a second language (or the mother tongue) requires ample opportunities to read, write, listen, and speak. These assessment tasks ask students to set up a technology/network-based chat room, invite participants, and facilitate a chat in two languages. They also involve evaluating the chat and working with virtual rating systems and online tools such as spreadsheets.


Worldwide, "conversation partner" language programs have sprung up in recent years. They bring together students wishing to practice a language with native speakers, often in far-flung parts of the world. The cultural and language exchanges that result demonstrate how schools can dissolve the physical boundaries of walls and classrooms. They also tap rich new learning spaces through the communication networks of ICT literacy.



Data Analysis

The pilot test will produce a data set composed of the student actions and responses in each of the assessment tasks within the ICT Literacy scenarios. The purpose is to obtain data on how effectively the assessments measure the targeted skills, test the feasibility of administering complex web-based assessments in a school setting, and test the data capture and measurement methods involved in interpreting the complex responses that these types of assessments generate. The unit of analysis will be the tasks within each scenario. Spreadsheets will be compiled for each task to summarize the data collected. These data summaries will allow for identification of patterns of responses and issues across the whole student sample, including any difficulties in deployment or data capture. The results will enable the assessment and software developers to make changes to the scenarios and tasks prior to their inclusion in a larger field test in September 2011.


Phases for which OMB approval will be sought under a future memo


Phase 2.4 – Field Test (this package seeks approval for recruitment for this phase; approval for administration will be sought in a future package). The Field Test, which will take place in September and October of 2011, is designed to provide sufficient data to establish empirically based scales that can indicate students' place and progress on developmental continua associated with each of the 21st century skill sets assessed. The US sample will be part of an overall sampling design that spans all of the countries participating in the field test phase of the project and will involve field testing of tasks from both the ICT Literacy and Collaborative Problem Solving domains. Students will take three tasks (two from one strand and one from the other) so that psychometric equating of task difficulty and scaling of performances across domains can be achieved. During the field test, the administration of the tasks will be fully automated and should not require teacher input apart from a supervisory role during the administration. A web-based delivery system will deliver the field test version of the assessment. Teachers will fulfill the same role as in Phase 2.3 (described above). The assessments will be administered in three 45-minute class periods. Assuming each of the 21 teachers has a class of 25, an estimated 525 students from the US will take the assessments. Student responses will be recorded and stored electronically for analysis of the whole data set. Teachers will receive $200 for their participation, training, assessment administration, and fulfilling the role of school coordinator. Recruitment materials for the Field Test are provided in Appendix B.


Final Report

The contractor will produce and deliver a written report summarizing the project, the activities, and the findings by December 31, 2011.

Use of Results

  • The pilot test will provide data on the feasibility of the assessments being administered in school settings and on data collection being carried out through the test administration system. All personally identifying information will be removed.

  • The future field test (for which a separate OMB submission will be made) will gather enough data to calibrate the scales used to measure the skills. All personally identifying information will be removed from the collected data set.

  • End products of the overall project will be made publicly available for use in classrooms and will be used by NCES for the development of assessment tasks in its longitudinal studies, international assessments, and, potentially, NAEP.


Results and data from both activities will be compiled by the contractor and provided in usable form to NCES and the international research coordinator, through whom NCES will direct the assessment developers to make appropriate changes to the assessments.

Assurance of Confidentiality

Field Test teacher participants will be asked to sign a Teacher Consent form provided by the contractor as a condition of participation. Student response data will be de-identified through the use of unique ID numbers so that individual students cannot be identified in the data set collected for analysis.
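As a hypothetical illustration of that de-identification step (field names and the ID format are assumptions for this sketch, not specified in the memo), each student identifier could be replaced with a random unique ID, with the crosswalk kept separate from the analysis data set:

```python
import secrets

def deidentify(records):
    """Replace student names with unique random IDs.

    Returns the de-identified records plus a name-to-ID crosswalk,
    which would be stored separately from the analysis data set.
    Field names here are illustrative, not from the memo.
    """
    crosswalk = {}
    cleaned = []
    for rec in records:
        name = rec["student_name"]
        if name not in crosswalk:
            crosswalk[name] = f"S{secrets.token_hex(4)}"  # e.g. "S3fa1b2c9"
        cleaned.append({"student_id": crosswalk[name],
                        "responses": rec["responses"]})
    return cleaned, crosswalk

# Example: the released record carries only the opaque ID.
clean, key = deidentify([{"student_name": "Jane Doe", "responses": [1, 0, 1]}])
```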

Project Schedule


Estimate of Hour Burden

Teachers who teach English language arts, mathematics, science, or languages in the 6th, 8th, and 10th grades are ideal recruits for the different phases of the project. By extension, their students can voluntarily participate from Phase 2.2 Cognitive Labs through Phase 2.4 Field Test. However, this OMB package requests approval only for the time burden associated with Phase 2.3 – Pilot Test and Phase 2.4 – Field Test.


Burden Table

| Phase of Work | Activity | Number of Respondents | Number of Responses | Hours per Response | Total Burden Hours |
| --- | --- | --- | --- | --- | --- |
| Phase 2.3 Pilot Test | Teacher training, preparation, and assessment coordination | 3 | 3 | 4 | 12 |
| Phase 2.3 Pilot Test | Teacher: administer pilot test | 3 | 3 | 2 | 6 |
| Phase 2.4 Field Test (recruitment approval only) | Teacher Recruitment | 21 | 21 | 1 | 21 |
| Phase 2.4 Field Test | Principal Agreement | 21 | 21 | 0.25 | 6 |
| Phase 2.4 Field Test | Teachers: send and collect parent opt-out forms | 21 | 21 | 2 | 42 |
| Phase 2.4 Field Test | Parents: read, sign, & send back opt-out forms | 525 | 525 | 0.25 | 132 |
| Total | | 570 | 594 | | 219 |
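The row totals in the burden table follow from Number of Responses times Hours per Response, with fractional totals apparently rounded up to whole hours (e.g., 525 x 0.25 = 131.25, reported as 132). A minimal Python check of that arithmetic (row labels are shorthand for this sketch, not from the memo):

```python
import math

# Burden-table rows as (number of responses, hours per response).
rows = {
    "pilot: training/prep/coordination": (3, 4),
    "pilot: administer test": (3, 2),
    "field: teacher recruitment": (21, 1),
    "field: principal agreement": (21, 0.25),
    "field: opt-out forms (teachers)": (21, 2),
    "field: opt-out forms (parents)": (525, 0.25),
}

# Assumption: fractional row totals round up to whole hours, which
# reproduces the reported 6 (from 5.25) and 132 (from 131.25).
row_totals = {label: math.ceil(n * h) for label, (n, h) in rows.items()}
total_responses = sum(n for n, _ in rows.values())
total_hours = sum(row_totals.values())

print(total_responses, total_hours)  # 594 219
```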


Estimate of Cost Burden

There are no direct costs to teachers or students.

Cost to the Federal Government

| Activity | Cost |
| --- | --- |
| Phase 2.2, Cognitive Labs & Phase 2.3, Pilot Test | $127,741 |
| Phase 2.4, Field Test | $75,201 |
| Final Report | $32,185 |

