Response to Public Comments


National Assessment of Educational Progress (NAEP) 2014-2016 System Clearance

OMB: 1850-0790


Public Comments Received During the 30-day Comment Period and NCES Responses

Docket: ED-2015-ICCD-0089 Comment on FR Doc # 2015-16661
2016 Main National Assessment of Educational Progress (NAEP) Administration

Comments related to NAEP 2016 Assessments

Comments related to intrusiveness of student questions


Document: ED-2015-ICCD-0089-0005

Submitter Information

Name: Beth LaDuca, Oregon Department of Education

Comment

Thank you for the opportunity to comment on the proposed NAEP 2016 DBA student surveys. The Oregon Department of Education does not believe that some of the new student survey questions to be piloted in NAEP 2016 are necessary to the proper functions of the U.S. Department of Education. Furthermore, some of the new items appear to be at odds with P.L. 107-279, Sec. 303 (5) which states “. . . any academic assessment authorized under this section be tests that do not evaluate or assess personal or family beliefs and attitudes or publicly disclose personally identifiable information.” The National Assessment Governing Board’s Background Information Framework for the National Assessment of Educational Progress makes clear that prohibitions in the law apply to background (survey) questions as well as subject area test questions:

In their report on the bill, the House-Senate conference committee that negotiated its final form says the law “does not preclude the use of non-intrusive, non-cognitive questions, approved by the National Assessment Governing Board, whose direct relationship to academic achievement has been demonstrated and is being studied as part of [NAEP] for the purposes of improving such achievement.” The report language is not binding, but is intended to guide the implementation of the law. This framework emphasizes that the legal prohibitions must be followed in preparing background questions and collecting any other non-cognitive data for NAEP. (p. 16)

This paragraph from the Background Information Framework illustrates that legislators had serious concerns about including questions in NAEP that students or parents might find intrusive. Questions about “personal or family beliefs and attitudes” are clearly prohibited by law. Yet NAEP proposes to pilot survey questions in 2016 with 4th, 8th, and 12th graders such as:

Who are the adults that live in your home? You can write things like, for example, “Dad”, “Mom”, “Brother”, or “Grandma”.

Which of the following best describes where you are living?

A. A single family home

B. A townhouse

C. An apartment

D. A trailer or mobile home

E. A community home or shelter

F. Other (Please specify)

In this school year, how often have you felt any of the following ways about your school? Select one answer choice on each row. Examples of items:

I felt left out of things at school.

I felt awkward and out of place at school.

How much does each of the following statements describe a person like you? Select one answer choice for each row. Examples of items:

I want to hide how nervous I am about reading.

I want other students to think I am good at reading.

How much does each of the following statements describe a person like you? Select one answer choice for each row. Examples of items:

I like complex problems more than easy problems.

I like activities that challenge my thinking abilities.

I like to think of my life as a puzzle that I must solve.

A reasonable person easily could find either of the first two items intrusive. The next three items clearly evaluate “personal or family beliefs and attitudes.” The Oregon Department of Education requests that NCES revisit P.L. 107-279 Sec. 303 (5) and the National Assessment Governing Board’s Background Information Framework for the National Assessment of Educational Progress to ensure that the NAEP 2016 DBA student survey questions are consistent with NAEP law and policy.


Document: ED-2015-ICCD-0089-0007

Submitter Information

Name: Keric Ashley, California Department of Education

Comment

The California Department of Education (CDE) appreciates the opportunity to review and comment on the proposed 2016 Main NAEP Administration. The CDE’s comments address concerns over student questions as well as the burden estimate to teachers, schools, and school coordinators.


Intrusive Questions

The CDE asserts that several proposed student questions violate the legal requirement that the NAEP “… not evaluate or assess personal or family beliefs and attitudes…” [P.L. 107-110, Sec. 411 (b) (5)]. While non-cognitive or background questions are allowable, according to the National Assessment Governing Board (NAGB) Background Information Framework for the National Assessment of Educational Progress (available at: https://www.nagb.org/publications/frameworks/background-information/2003-background-information.html) “such information must be clearly related to student achievement, as shown by other research,” “questions shall be non-intrusive…,” and “questions should not address personal feelings and attitudes.” NAEP proposes to pilot survey questions in 2016 to students in grades 4, 8, and 12 that the CDE believes solicit personal feelings and attitudes. Please see Attachment A for comments on specific questions that the CDE finds intrusive.


During NAEP 2015, California parents expressed concern over NAEP questionnaires, resulting in students opting out of the student survey. If these questions are included in NAEP 2016, we anticipate more students in California will opt out from the survey. Currently, at least two other states refuse to allow NAEP student surveys to be administered to students in their state.


Burden Underestimated

[This portion of the comment appears below in the next section (“comments related to burden”)]


Summary

The CDE is concerned that proposed student questions are inconsistent with NAEP law and policy, and that the burden to teachers, principals, and school coordinators has been underestimated. The CDE requests that the NCES revisit P.L. 107-110 and the NAGB’s Background Information Framework for the National Assessment of Educational Progress to ensure that the 2016 Main NAEP Administration is consistent with NAEP law and policy.


If you have any questions regarding this subject, please contact Julie K. Williams, California NAEP State Coordinator, Assessment Development and Administration Division, by phone at 916-319-0408 or by e-mail at [email protected].

Response

We thank you for your thoughtful and careful review of the NAEP contextual questions submitted for piloting in 2016. We have carefully considered your comments and have provided further justification for why we have not made changes.

Thank you again for your helpful review.

Student Questionnaires:

The NAEP student questionnaires include items that ask students to respond to factual questions about their family’s socioeconomic background, their self-reported behaviors, and their perceptions of learning and learning contexts, both in the school setting and more generally. In compliance with legislation, student questionnaires do not include items about family or personal beliefs.1 The student questionnaires focus only on contextual factors that clearly relate to academic achievement (e.g., grit, self-efficacy, achievement motivation, school climate), which directly addresses the National Assessment Governing Board’s policy principles laid out in its 2012 policy statement that “NAEP reporting should be enriched by greater use of contextual data derived from background or non-cognitive questions asked of students, teachers, and schools” (National Assessment Governing Board, 2012, p. 2).


Educators, psychologists, economists, and others have called for the collection of noncognitive student information that can help explain why some students do better in school than others. Many students who do not achieve at their full potential struggle because of the contextual factors these items are intended to measure. Similar questions have been included in other NCES-administered assessments, such as TIMSS, PISA, and the National School Climate Survey, and in other federal questionnaires, including the U.S. Census. The insights gained from these well-established survey questions will help educators, policy makers, and other stakeholders make better-informed decisions about how best to help students develop the knowledge and skills they need to succeed.


While some of the noncognitive questions do use the pronouns “I” and “you,” the content of the questions focuses on observable actions, such as finishing what one begins or paying attention and resisting distractions. These are the sorts of behaviors that teachers observe in classrooms all over the country on a regular basis and that are acknowledged to be important correlates of academic success.


These questions also comply with the legislation, which indicates that the law “does not preclude the use of non-intrusive, non-cognitive questions, approved by the NAGB, whose direct relationship to academic achievement has been demonstrated and is being studied as part of [NAEP] for the purposes of improving such achievement.” In addition, NAEP is required by legislation to collect data on socioeconomic status (SES). The literature on SES distinguishes three main components of the construct: income, education, and occupation. To collect this information (particularly for income), NAEP relies on proxy measures because not all components can be measured directly in NAEP.


All questions proposed for piloting have gone through multiple rounds of review, including but not limited to reviews by NAEP subject-matter expert groups, an organizational Institutional Review Board (IRB), and the National Assessment Governing Board, and have successfully passed extensive pre-testing via cognitive interviews with all respondent groups. Further note that NAEP does not report student responses at the individual level, but strictly in aggregate form. Also, to reduce the impact of any individual question on NAEP reporting, the program has shifted to a balanced reporting approach that includes multi-item indices, where possible, to maximize robustness and validity. In addition, in compliance with legislation and with established practices from previous NAEP administrations, students can decide for each question whether or not they want to provide a response.


Lastly, the questions proposed for piloting include multiple alternative versions of the questions addressing each topic. For example, different questions about parental occupation and family composition are asked in the pilot to discern which version works best for NAEP in a large-scale operational setting. The same principle holds true for other core modules and subject-specific topic areas. That is, we also compare different versions of items for grit, desire for learning, school climate, technology use, self-efficacy, and classroom practices. Each student will only be given one of the alternative versions for a given topic. In order to test multiple versions, we are implementing a spiraling design for student questionnaires in the 2016 pilot that will allow us to keep individual student burden low and ensure that each student only answers a subset of the proposed contextual questions.
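As a rough illustration only (this is not actual NAEP tooling, and the module names and version labels below are hypothetical), a spiraled design of this kind can be sketched as a round-robin assignment that rotates through the alternative versions of each topic module, so each version reaches a roughly equal share of the sample and each student sees exactly one version per topic:

```python
from itertools import cycle

# Hypothetical topic modules, each with alternative question-set versions
# (labels are illustrative, not actual NAEP instruments).
modules = {
    "family_composition": ["version_A", "version_B"],
    "grit": ["version_A", "version_B"],
    "school_climate": ["version_A", "version_B", "version_C"],
}

def spiral_assign(student_ids, versions):
    """Round-robin ('spiraled') assignment of one version per student,
    so each alternative version is administered to a roughly equal
    share of the sampled students."""
    rotation = cycle(versions)
    return {sid: next(rotation) for sid in student_ids}

# Six sampled students receive alternating versions of the grit module.
students = ["S001", "S002", "S003", "S004", "S005", "S006"]
grit_forms = spiral_assign(students, modules["grit"])
```

Under this sketch, each version of the two-version grit module is assigned to half the students, which is the property that lets version comparisons be made without adding to any individual student's burden.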


After the pilot, a reduced set of questions will be selected for operational administration in 2017. This selection will be informed by results from the pilot, including statistics on each question’s predictive power for achievement, as well as test-taker and stakeholder experiences from the field.


Comments related to burden for teachers and school administrators


Document: ED-2015-ICCD-0089-0006

Submitter Information

Name: Beth LaDuca, Oregon Department of Education

Comment

Thank you for the opportunity to comment on the burden estimates for the NAEP 2016 DBA school and teacher questionnaires. The Oregon Department of Education (ODE) does not believe that the burden estimates for the school and teacher questionnaires are accurate. ODE received multiple complaints about the length of teacher and school questionnaires from principals and teachers in schools selected for the NAEP 2015 program. According to principals, the school questionnaire requested detailed data that took considerable time to gather. Several principals reported spending several hours, rather than the estimated 30 minutes, to complete the school questionnaire. Multiple teachers reported that the teacher questionnaires were simply too long. For 2016, the burden estimate for the teacher questionnaires is 30 minutes despite the fact that the National Assessment Governing Board Background Information Framework for the National Assessment of Educational Progress states that the “average individual response time to answer background questions for each assessment, as calculated in accordance with Office of Management and Budget (OMB) procedures, shall be limited as follows: . . . 20 minutes for each teacher. . .” (p. 10).

The contractors for NAEP 2015 have data that NCES should analyze to review the burden estimates for NAEP 2016 DBA teacher and school questionnaires, as the DBA questionnaires will be very similar to the questionnaires delivered for NAEP 2015 math and reading assessments. The majority of NAEP 2015 questionnaires were completed online, so the contractor should be able to provide response time estimates. Also, the contractor responsible for the online assessment planning system MyNAEP collected feedback from users, including feedback about school and teacher surveys.

Please review all the available data and, if needed, adjust the burden estimates for school questionnaires so that respondents are well informed about the time necessary to complete them. Please ensure that the teacher questionnaires adhere to the National Assessment Governing Board limit of 20 minutes.


Document: ED-2015-ICCD-0089-0007

Submitter Information

Name: Keric Ashley, California Department of Education

Comment

The California Department of Education (CDE) appreciates the opportunity to review and comment on the proposed 2016 Main NAEP Administration. The CDE’s comments address concerns over student questions as well as the burden estimate to teachers, schools, and school coordinators.


Intrusive Questions

[This portion of the comment appears above in the previous set (“Intrusiveness”)]


Burden Underestimated

The CDE believes the burden estimates are inaccurate for the school and teacher questionnaires and for the school coordinator tasks. During NAEP 2015, the CDE received numerous complaints from teachers and principals about the length of teacher and school questionnaires. According to principals, the school questionnaire requested detailed data that took considerable time to gather. Several principals reported spending several hours to complete the school questionnaire, rather than the estimated 30 minutes. Multiple teachers reported that the teacher questionnaires were too long. For 2016, the burden estimate for the teacher questionnaires is 30 minutes. Yet the NAGB Background Information Framework for the National Assessment of Educational Progress states the “average individual response time to answer background questions for each assessment, as calculated in accordance with Office of Management and Budget (OMB) procedures, shall be limited as follows: . . . 20 minutes for each teacher. . .” (p. 10).


School coordinators for NAEP 2015 experienced a much greater burden than the estimated two hours for NAEP 2016. The CDE received numerous complaints from schools and districts regarding the time needed to complete the NAEP pre-assessment tasks. One school coordinator reported that it took more than thirty hours of her time during a three-week window. The two-hour estimate for school coordinator tasks in NAEP 2016 grossly understates the actual burden. The CDE’s conservative estimate is ten hours of burden for school coordinators.

The CDE requests that the NCES analyze data from the NAEP 2015 contractors to review and revise the burden estimates for NAEP 2016 teacher and school questionnaires and school coordinators. The CDE recommends that the teacher questionnaires adhere to the NAGB limit of 20 minutes.


Summary

The CDE is concerned that proposed student questions are inconsistent with NAEP law and policy, and that the burden to teachers, principals, and school coordinators has been underestimated. The CDE requests that the NCES revisit P.L. 107-110 and the NAGB’s Background Information Framework for the National Assessment of Educational Progress to ensure that the 2016 Main NAEP Administration is consistent with NAEP law and policy.


If you have any questions regarding this subject, please contact Julie K. Williams, California NAEP State Coordinator, Assessment Development and Administration Division, by phone at 916-319-0408 or by e-mail at [email protected].

Response

We thank you for your thoughtful and careful review of the NAEP contextual questions submitted for piloting in 2016. We have carefully considered your comments and have provided further justification for why we have not made changes.

Thank you again for your helpful review.

Teacher/School Administrator Questionnaires

The burden for teachers and school administrators to respond to the questionnaires was estimated based on information from the most recent NAEP administrations. To facilitate gathering the information needed to complete the school questionnaire, we are exploring the possibility of providing a one-pager listing the information that the school administrator will be asked to provide but may not have readily available (e.g., the number of computers in the school). This will allow school administrators to fill out questionnaires more efficiently in one session, thereby reducing burden. In addition, please note that the times listed in NAGB’s Background Information Framework are guidelines only; NAEP does not have a policy that limits the amount of time to be spent on survey questionnaires.


School Coordinators

We understand that some school coordinators encountered technical difficulties while completing the pre-assessment tasks in 2015. The pre-assessment system, servers, and processes have been carefully reviewed and appropriately updated so that these challenges will not be encountered in 2016.

1 Examples of questions about family or personal beliefs are questions about religious beliefs or sexual orientation, or questions about personal relationships with, and attitudes toward, other family members.


File Type: application/vnd.openxmlformats-officedocument.wordprocessingml.document
Author: Bohrnstedt, George
File Created: 2021-01-30
