

MEMORANDUM

TO: Steph Tathum; Office of Information and Regulatory Affairs (OIRA), Office of Management and Budget (OMB)


FROM: Laura Hoard; Office of Planning, Research and Evaluation (OPRE), Administration for Children and Families (ACF)

DATE: 1/15/2016

SUBJECT: “Assessing Early Childhood Teachers’ Use of Child Progress Monitoring to Individualize Teaching Practices” project – revised materials (0970-0355)



The Administration for Children and Families (ACF) at the U.S. Department of Health and Human Services (HHS) seeks approval for non-substantive changes to our original OMB submission under the generic clearance for pre-testing (OMB control number 0970-0355) for the “Assessing Early Childhood Teachers’ Use of Child Progress Monitoring to Individualize Teaching Practices” project. As described in our original, approved justification package, the planned process is iterative: we incorporate changes based on pre-testing, submit revised materials to OMB as a non-substantive change, and then administer again.

The Examining Data Informing Teaching (EDIT) team is conducting data collection in nine classrooms to pretest the observational EDIT instrument (completed by EDIT team members) and the teacher interview. The primary goals of this pretest are to implement and refine the newest EDIT instrument rubrics and ratings and to continue assessing the overall feasibility of the EDIT protocols, procedures, and materials. Based on initial fielding, the team revised the EDIT instrument and the EDIT teacher interview. Additionally, we added a consent form for additional program staff whose image or voice may appear in the video recordings. We describe the changes below and include updated materials.

Refinements to the EDIT instrument

The EDIT instrument is being refined through an iterative process. The EDIT development team conducts classroom visits and then reviews and discusses field experiences in periodic team debriefings. The team has visited three classrooms thus far and recommends several refinements to the EDIT instrument. A “clean” version of the revised EDIT instrument is in Attachment K, and a tracked-changes version is in Attachment K-T. Recommended refinements fall into these categories:



  • Elimination of redundancy. For example, in item 1 (p. 3)[1], the criterion "A few targets can be changed with instruction or intervention" is being removed and folded into an existing criterion, "Targets are in an area in which children typically make progress within the program year with instruction or intervention." Additional items revised to eliminate redundancy include 4B-4H (pp. 18-19), 4N (p. 12), and 4O (p. 13).

  • Clarifications. Terms are revised to be better specified. For example, in item 2A (p. 5), “The evidence” becomes “The observed evidence.” Additional items revised to be better specified include items 2H (p. 7), 3I (p. 11), 4H-4M (p. 12), 6D and 6E (p. 18), and supplemental ratings 4, 10, 11 and 17 (pp. 22-23). Terms are also defined. For example, in 2B (p. 5), “progress” is defined as “at least three times per reporting period.” Finally, we clarified the prompt “for observed assessments only” to read “for video-based observations only” (items 2F-2H on p. 7 and item 2N on p. 9).

  • Increased consistency. Text is aligned across items and criteria.[2] For example, in 2B (p. 5), the criteria for a rating[3] of "7" now use the same text as the criteria for a rating of "5".

  • More accurate differentiation between levels of teacher implementation. During pretesting, we realized that some items needed revision to better differentiate among the rating levels at which teachers implement ongoing assessment for individualization. We revised item 6A by switching the placement of two items within the holistic rubric (p. 17). We also revised holistic rubric 7 to more accurately differentiate levels of teacher implementation (p. 20).

  • Not Applicable and Cannot Rate options specified. Some items apply only in specific circumstances, so rating options for "Not Applicable" and "Cannot Rate" were added to several items. For example, item 2H (p. 7) applies only when a teacher uses a standard task assessment, and it can be rated only when documentation for that standard task is available. Additional items revised to specify "Not Applicable" or "Cannot Rate" options include items 2F-2G (p. 7), 2I-2M (p. 9), and 6B-6H (p. 18).

  • Redundant child-specific ratings eliminated. We ask teachers for information about two focal children in the classroom. Although the children have varying skills, teachers often make decisions in the same way for both. As a result, items that asked EDIT team members to rate separately how the teacher makes decisions for each focal child produced the same rating for both children. To eliminate this redundancy, we edited items 2F-2H to ask for one overall rating (p. 7).

  • Fields added for researchers to enter ratings. The holistic rubrics lacked fields for entering a rating; we added rating fields to holistic rubrics 1, 5, and 6 (pp. 3, 15, and 17). We also added a field to record the date of the visit (p. 1).

  • Sources of evidence fields eliminated. The "sources of evidence" fields did not improve researchers’ ability to rate instrument items: they required site visitors to write out, from the larger "list of artifacts received" field, the subset of documents used to rate a group of items. To make better use of site visitors’ time, we eliminated all such fields (pp. 3, 9, 11, 13, 15, 16, 18, and 20) and will rely instead on the "list of artifacts received" field (p. 1).

  • List of artifacts adjusted to indicate age of documentation. To indicate that certain documents should serve as background information for current performance and progress, we added a prompt for researchers to place a "(B)" in front of artifacts collected more than two months ago in the "list of artifacts received" field (p. 1).

  • Prompt added. Previously, we did not explicitly describe the sources that could inform ratings for items 2I-2N. We added a prompt to these items (p. 9) indicating the sources that can inform the ratings (i.e., documents, observations, and the teacher interview).

  • Item added. One item was added to complete coverage of the construct, “interpreting the data.” This is item 5D (p. 16), “Teacher involves the other teachers/staff in interpreting and understanding the data.”

Refinements to the EDIT teacher interview

As explained in Supporting Statement A, EDIT team members conduct a 55-minute individual semi-structured interview with the lead classroom teacher. The interviewers probe for explanations of the documents and observations gathered and for information on the teacher’s planning and implementation of instructional adaptations and individualized teaching strategies. Because the interview questions are open-ended, respondents sometimes give responses that are pertinent to several interview questions and topics at once. We have added a two-page cover sheet to the interview protocol (Attachments L and L-T) that highlights the key topics the interviewer should touch on, ensuring that all essential information is covered during the interview while maintaining a semi-structured approach that eliminates redundancy. Interviewers will refer to the cover sheet throughout the interview, especially before moving on to the next section, to ensure that the most important topics have been addressed.

We also added one question to the teacher interview to help elicit information about reviewing data by subgroup, which is needed to rate EDIT instrument item 4J (p. 12) (“The teacher organizes the information to look at performance by subgroup for one or more assessment targets at a single timepoint”). The interview length remains 55 minutes. Attachment L is a “clean” copy of the interview, and Attachment L-T shows the change in tracked changes (see p. 7 of that attachment).

Additional teacher consent

Our previous submission included a consent form for each lead teacher participating in video recordings, as well as a consent form for students. Attachment O contains an additional consent form for non-lead teachers whose image or voice may appear in the video recordings. This consent form has been reviewed and approved by the New England Institutional Review Board and is consistent with the other OMB-approved forms under this clearance.

[1] Page numbers refer to the tracked-changes document in Attachment K-T.

[2] “Criteria” refers to the behavioral description of teachers performing at a given rating level.

[3] “Rating” refers to the score assigned for a particular item.

An Affirmative Action/Equal Opportunity Employer
