
November 21, 2011

MEMORANDUM



To: Shelly Martinez, OMB

From: Dan McGrath, NCES

Through: Kashka Kubzdela, NCES

Re: Response to OMB passback on ATC21S Project Phase 2.4 Field Test (OMB# 1850-0803 v.59)




  1. You note that the Collaborative Problem Solving tasks were not cognitively tested in the US because they were not developed in time, and make no mention of any field test involving the Collaborative Problem Solving tasks.  In your memo accompanying your previous ATC21S submission (dated May 11, 2011, for OMB# 1850-0803 v.49, page 3), you write that “All the collaborative problem solving tasks will be pilot tested by other countries involved in the project, but not in the US because they will not be available when US schools are in session. However, it is planned that all of the collaborative problem solving tasks will be available for the Field Trial in the fall and further information on the nature of the tasks and screenshots from them will be included in the OMB submission that will be submitted for the Field Trial.”  We would like further information on the international tests of these tasks, and how you can be sure that the items are valid in the United States without any cognitive or basic field tests.



The tasks were designed to conform to the theoretical framework developed by ATC21S. Not all tasks were pilot tested in all countries, but given the similar response patterns across the countries where the tasks have been administered to date (Australia, Finland, Costa Rica, the Netherlands, and Singapore), we are reasonably confident that the tasks elicit the skills they are intended to measure. In international studies it is common practice not to pilot test in all countries but to conduct field trials in all countries. For example, in PISA, small-scale trials are conducted in a limited number of countries, not including the U.S., and the instruments are then field tested in all countries, including the U.S.



  2. Regarding sampling, we are unsure of the frame and the sampling procedures.



    a. You indicate that you plan on using the results of this feasibility test to establish scales for both components that will apply across the US.  While we understand that a completely representative sample of the US student population is not possible here, we would like more information regarding the sample and how you plan on applying the results across the nation.  Are you still only using California schools as you did for the pilot test, or will you be selecting a geographically and socioeconomically diverse sample of schools, teachers, and students?



Recruitment for the field trials was approved in the OMB package (OMB# 1850-0803 v.49) on May 25, 2011. The recruitment activities were described on page 8 of the memo dated May 11, 2011, under the section titled 2.4 Phase—Field Test. We have not made any changes to that approved plan and, based on OMB approval, we sent recruitment fliers to agencies working with teachers in Washington, Oregon, Texas, Arizona, New Jersey, New York, Vermont, Nebraska, Maryland, Mississippi, and California. To date, we have received applications from teachers in rural, suburban, and urban schools. School enrollments range from 55 students in small rural schools to 1,273 students in large urban schools. Student demographics range as follows: Hispanic, 3% to 99%; White, not Hispanic, 1% to 97%; Black, not Hispanic, 0% to 66%; Asian, <1% to 10%; and Native American, <1% to 5%. The percentage of students eligible for free or reduced-price lunch (a proxy for socioeconomic status) ranges from 12% to 99%, and the percentage of English Language Learners ranges from 0% to 62%.



Seven teachers at the 6th grade level, seven at the 8th grade level, and seven at the 10th grade level will be selected to administer the assessments. We will ensure that the seven teachers at each grade level represent a geographically and socioeconomically diverse sample of schools drawn from the pool of applicants.



The intent of the field trials is to see how the assessments work in the classroom. The ATC21S project is not planning to report student results or apply them across the nation.



    b. Will students be randomly assigned to task strands, is assignment to the strands purposeful, or will students have the option of picking which strand to take?  If students are to be assigned, we would like more information regarding this process.



Students will be randomly assigned to task strands. A counterbalanced allocation process will be used to ensure that every task is administered and that all participating students are linked across tasks. The objective is not to assess individual students but to collect enough data to calibrate the tasks. Within each age level, students are randomly assigned to tasks in a manner that preserves the linking across all tasks, so that each strand yields sufficient data points for the development of robust scoring.
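To illustrate the allocation logic, the following is a minimal sketch, in Python, of one way a counterbalanced, linked assignment could be generated. It is illustrative only: the strand labels, the two-tasks-per-student overlap, and all names in the code are hypothetical assumptions, not the project's actual rotation design.

import random
from collections import defaultdict

# Hypothetical sketch of counterbalanced assignment with linking.
# Strand labels and the two-tasks-per-student overlap are assumptions;
# the actual ATC21S rotation design is not specified in this memo.
TASK_STRANDS = ["A", "B", "C", "D"]

def assign_students(student_ids, strands=TASK_STRANDS, tasks_per_student=2):
    """Rotate students through overlapping sets of strands so that every
    strand is administered and consecutive forms share a strand, which
    links all strands for joint calibration."""
    ids = list(student_ids)
    random.shuffle(ids)  # random assignment within an age level
    n = len(strands)
    return {
        student: [strands[(i + k) % n] for k in range(tasks_per_student)]
        for i, student in enumerate(ids)
    }

students = ["S%03d" % i for i in range(1, 26)]  # e.g., one class of 25
allocation = assign_students(students)

counts = defaultdict(int)  # verify that strand counts stay balanced
for taken in allocation.values():
    for strand in taken:
        counts[strand] += 1
print(sorted(counts.items()))  # roughly equal counts per strand

Because each student's set of strands overlaps the next student's, every strand shares respondents with its neighbors in the rotation, so all strands are connected through common students; that connectedness is what permits joint calibration without any one student taking every task.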



    c. Will the same sample of teachers/students take both the ICT and the Collaborative Problem Solving tests?



Yes, the same sample of students will take both the ICT and the Collaborative Problem Solving tasks.



  3. OMB is not comfortable with these tasks being conducted during instructional time.  Please clarify when teachers will administer the tests to the students.



We are recommending that teachers administer the assessments during other periods scheduled in the school day, such as computer labs, library time, activity periods (which usually occur in block scheduling), and intervention/enrichment periods. We are not recommending that teachers schedule the assessments during core instructional time.



  4. Why are the teachers being offered $200 to administer the test?  Please justify.



The $200 teacher incentive was approved in the OMB package (OMB# 1850-0803 v.49) on May 25, 2011, as part of the recruitment materials and activities. The recruitment activities were described on page 8 of the memo dated May 11, 2011, under the section titled 2.4 Phase—Field Test, and the approved communication materials for respondents (Appendix B) reflected the incentive amount. Teachers play a key role in making the field test possible in their schools, and each will devote several hours to it. As we described in the memo of the current submission:



“During the field trials, the administration of the tasks will be fully automated, and should not require teacher input apart from their fulfilling a supervisory role during the administration. A web-based delivery system will deliver the assessments. Training of teachers will be conducted online using webinar software so that the contractor can walk the teachers through the administration process. Prior to the training session, teachers will receive the ICT Literacy Administration Manual and the Collaborative Problem Solving Administration Manual (both can be found in Appendix A). Teachers will reserve school computer rooms for the administration, talk with the school computer network support person to ensure that the computers meet the minimum requirements for the administration, and make sure that student logins are set up prior to the administration. The assessments will be administered in three 45-minute class periods. An estimated 25 students per each of the 21 teachers (total 525 students) from the U.S. will take the assessments. Student responses will be recorded and stored electronically for analysis of the whole data set. Teachers will receive $200 for their participation, training, assessment administration, and fulfilling the role of school coordinator.”



  5. Comments on the manuals:

    a. Both the ICT and the Collaborative Problem Solving manuals should be formatted in the same style.  It is confusing to have formatting differences, especially since teachers will receive both manuals.



The ICT manual was reformatted to match the Collaborative Problem Solving manual.



    b. In the “Before the testing” section of the ICT checklist for test proctors, both “administrator” and “you” are used.  Are they the same person (i.e., the teacher administering the test)?  If so, the language should be consistent to avoid confusion.



Yes, it is the same person. The language was corrected to avoid confusion. Please see the ICT Manual, page 2, section 2.1, Roles.



    c. An introductory script is provided to be read before students begin the tasks, telling them that the test is not for a grade (see pages 4 and 12 of Appendix A).  A voluntary participation statement should be included.



The following script has been added to the ICT Manual; the voluntary participation statement has been added to both manuals.

Section 4: Administration of Tasks

Before administering the tasks, please tell students the following:


They will not receive a grade for taking the computer-based assessment tasks. The tasks are being field tested to see how well they work. The performance of the student is not being evaluated. Student names will not be associated with the assessments’ results. No identifying information will be recorded, and information will be stored securely. All information may only be used for statistical purposes and may not be disclosed, or used, in identifiable form for any other purpose.


Participation in the field trials is voluntary. If students do not wish to participate, they will be given another assignment.


