Public Comment and Response

NAEP 2015 Wave 3 Response to 30-Day Public Comment

National Assessment of Educational Progress (NAEP) 2014-2016 System Clearance


OMB: 1850-0790


Public Comments Received During the 30-day Comment Period and NCES Responses


Comments related to the NAEP-ECLS-K:2011 Link and Computer Familiarity Study

Comment Number: 1


Docket: ED-2014-ICCD-0115
National Assessment of Educational Progress (NAEP) 2015 Wave 3-ECLS-K:2011 Link and Computer Familiarity Study

Document: ED-2014-ICCD-0115-0004
Comment on FR Doc # 2014-18555

Submitter Information

Name: Beth LaDuca
Government Agency Type: State

Government Agency: Oregon Department of Education

Thank you for this opportunity for Information Technology and Educational Technology Specialists at the Oregon Department of Education to comment on how the Department might enhance the quality, utility, and clarity of the information to be collected through the NAEP 2015 Computer Familiarity Questionnaires to be administered to students in grades 4, 8, and 12. These comments apply to all of the grade level surveys.
1. Asking students about their understanding of terms (several that appear to be "made-up") prior to asking them to rate their skills/abilities will likely lower their self-evaluation of skills, because many of the terms will be things they have had no experience in learning, terms that we would not expect even a small percentage of students to know.
This set of questions is recall, the bottom level of Bloom's taxonomy, and seems to have NO relevant purpose.
2. Students will not differentiate between a physical keyboard (computer keyboard) and the keyboard available on a tablet. Has anyone conducted psychometric studies on students' ability to use a tablet keyboard with any less success than a physical keyboard? Many students using mobile devices for learning are more able and confident with the keyboard on a tablet than with a physical keyboard, yet many of the current testing models require a physical keyboard. This is an adult paradigm. We need to study this to understand impact.
3. What is the study trying to understand? This is not clear from the questioning strategy.
4. At the end you use computer interchangeably with tablet. This is confusing.
5. There are too many questions. We estimate this would take 4th graders closer to 45 minutes, and 8th and 12th graders 30 min.
6. Students often learn skills at home, not just at school. This is not accounted for in this survey.
7. Students are using computers and tablets for much more sophisticated tasks than just simple writing, looking things up, or reviewing math. What about creating movies to illustrate learning, transforming learning, and creating original products, activities much higher on Bloom's taxonomy? What about creating maps, maintaining a YouTube channel, social networking, etc.? There are endless things that could be represented here that more accurately reflect what students are doing with technology. See https://sites.google.com/a/msad60.org/technology-is-learning/samr-model
Again: What is your goal? What do you really need to find out?
8. Put positive answer first! Don't make the assumption with the first response that I'm dumb.
These comments apply to the grade 4 survey.
1. "Laptop or desktop computer" is dated language and does not take into account tablet devices. We recommend: At home, do you have your own computer or tablet device?
2. Organization of the questions by device not by the verb would help students progress through the survey in a more logical fashion. Presenting the questions to 4th graders in a table, as in the 8th grade survey, would be better. It is much easier to review and takes up less space/less time.
These comments apply to the grade 8 and grade 12 surveys.
1. Other sets of questions, such as 3-5, should also be presented in a table.
2. In question 8, change "were you taught" to "did you learn". Same for 4th grade.
Again -- table for 4th grade.
3. In question 10, why would you ask students if they can take a desktop home?
4. In question 12 and 13, the activities represent low level Bloom's activities. This creates the appearance that the end goal for students and teachers using technology for learning is only for low level tasks. Please do not underestimate the negative impact. Look at the National Educational Technology Standards for Students at: http://www.iste.org/docs/pdfs/20-14_ISTE_Standards-S_PDF.pdf. Also, there is no advance in expectations for 12th graders beyond those for 8th graders.
5. Combine 12.a and 12.b as "Have you used a computer to write a paper?" Why is length an issue?
6. For question 14, the real question being asked is: Do you feel comfortable with your ability to communicate quickly and effectively using a keyboard?
7. For question 17.g, use CPU - nobody says Central processing unit.
8. For question 16, students will not take this seriously. If they are asked ridiculous questions, you will get ridiculous answers.
9. For question 18, these activities are low level and do not align with the National Educational Technology Standards for Students (and teachers) that have been adopted by most states.
10. For question 19, let's be positive! 10 to 0.
11. For questions 20 - 23, ability to communicate is not based on the number of fingers you can use! These questions assume able-bodied students. They are discriminatory and inappropriate. We have elementary students creating applications. Students will rank skill sets on most of these questions as very low because they have greater expectations for uses of technology than are reflected in this survey.

NCES Responses

We thank you for your thoughtful and careful review of the questions that comprise the Computer Access and Familiarity Study. We have considered each comment carefully and responded to it below. In some cases we have made changes to the questions based on your suggestions, including adding items to better measure familiarity with more advanced technology skills. In other cases we argue that the questions should remain as written. In all cases we have provided justification for why we have or have not made changes.

Thank you again for your helpful review.

The received comments are copied below, each followed by the NCES response:

These comments apply to all of the grade level surveys.

1. Asking students about their understanding of terms (several that appear to be "made-up") prior to asking them to rate their skills/abilities will likely lower their self-evaluation of skills, because many of the terms will be things they have had no experience in learning, terms that we would not expect even a small percentage of students to know.
This set of questions is recall, the bottom level of Bloom's taxonomy, and seems to have NO relevant purpose.

The questions about students’ familiarity with concepts related to computers and digital technology were included to create a broad index of topic familiarity similar to the index of familiarity with mathematics terms used in PISA 2012. Even though the questions focus on recall only, PISA results have shown that a topic familiarity index can predict student achievement over and beyond most other constructs measured in the student questionnaires. The inclusion of foil concepts led to an increase in predictive validity in PISA and is based on the well-researched concept of “overclaiming,” which can help reduce social desirability effects in survey scales in general. We therefore think that these questions can add value to this NAEP special study and suggest keeping them. We agree, however, that these questions would be better placed at the end of the questionnaire to avoid unwanted priming effects, and we will make the suggested change.


2. Students will not differentiate between a physical keyboard (computer keyboard) and the keyboard available on a tablet. Has anyone conducted psychometric studies on students' ability to use a tablet keyboard with any less success than a physical keyboard? Many students using mobile devices for learning are more able and confident with the keyboard on a tablet than with a physical keyboard, yet many of the current testing models require a physical keyboard. This is an adult paradigm. We need to study this to understand impact.

To our knowledge there have not been any studies that examine whether there is a mode effect due to using a physical versus a virtual keyboard, but we agree that such studies should be done in the future. Importantly, in the cognitive interviews of our items, which were conducted with both 4th and 8th grade students, no one asked whether we meant a physical or virtual keyboard for those questions where the term “computer keyboard” appears (e.g., 8a, 14, and 15 in the Grade 8 and 12 questionnaires). That is, none of the participants in the cognitive interview study appeared confused about whether these questions referred to a physical or a virtual keyboard.


3. What is the study trying to understand? This is not clear from the questioning strategy.

NCES and NAGB have announced that NAEP will be a fully technology-based assessment by 2017. A concern is the degree to which all children are ready for a move to a technology-based assessment. In particular, do all students have the same access to and experience with the technologies that will be used to collect the data, and what is the relationship between access and experience with these technologies and performance on NAEP reading, mathematics, and science at grades 4, 8, and 12? In addition, is there less access to technology for disadvantaged students, and does this raise questions about bias in NAEP reporting for these students? The Computer Access and Familiarity Study is designed to answer both of these research questions. The two major constructs to be measured from selected items in the study, and for which we hope to construct indices, are access to and familiarity with technology. If these two measures prove useful in explaining performance on NAEP, the goal is to include them as measures in operational NAEP in the future.


4. At the end you use computer interchangeably with tablet. This is confusing.

Throughout the questionnaire we have quite consciously distinguished between computers and tablets. We did so because the 2015 pilot will be delivered by tablet. We were therefore especially concerned that we measure access and familiarity with tablets distinct from access and familiarity with computers. Those questions where “computer” is used without a qualifier (e.g., Q8b in the Grade 8 and 12 questionnaires: Were you taught any of the following at school? How to write sentences and paragraphs using a computer) are device independent. As mentioned in a previous response, all items were tested using cognitive interviews with 4th and 8th grade students, and none of the students had problems interpreting the questions. We agree, however, that the selected questions referred to could be further improved by changing the wording as follows: Question 1: “How much do you know about using computers and other digital devices?”; Questions 19-23: “…how familiar with using computers and other digital devices…” instead of “… how familiar with computers and digital technology …” and we will make these changes.


5. There are too many questions. We estimate this would take 4th graders closer to 45 minutes, and 8th and 12th graders 30 min.

This is a pilot study and therefore contains more items than would be contained in operational versions of the two measures that we propose to construct. The goal is to make the final measures as short as possible so long as they demonstrate predictive validity with respect to NAEP performance and meet accepted standards for internal consistency reliability. However, we cannot know in advance which items will work in these two measures and which will not. Therefore, the pilot study necessarily must include considerably more items than we expect the final measures to be used in operational NAEP will contain.


6. Students often learn skills at home, not just at school. This is not accounted for in this survey.

We agree that students learn computer skills at places other than school. For that reason, students are asked about computer and other technology access at home as well as at school, and Questions 17 and 18 are designed to capture information on the computer technology skills students have, regardless of where they learned them. But we also ask questions about what they have learned about computer technology and its uses in school. This information will be used to evaluate whether what is learned varies as a function of student socio-demographic characteristics, that is, to help answer our second research question.


7. Students are using computers and tablets for much more sophisticated tasks than just simple writing, looking things up, or reviewing math. What about creating movies to illustrate learning, transforming learning, and creating original products, activities much higher on Bloom's taxonomy? What about creating maps, maintaining a YouTube channel, social networking, etc.? There are endless things that could be represented here that more accurately reflect what students are doing with technology. See https://sites.google.com/a/msad60.org/technology-is-learning/samr-model
Again: What is your goal? What do you really need to find out?

Given that NAEP is moving to a technology-based assessment, our goal (as stated above) is to develop a set of items that concisely, but reliably and validly, assess students’ access to and familiarity with technology, and to build measures of each construct. The goal is not to create an inventory of all the ways in which technology is used in schools. That said, the types of questions you suggest would be valuable to ask, since we may not have enough “top” (items at the high end of the familiarity scale) for the measurement of familiarity with technology at Grades 8 and 12. For these two grade levels, we are therefore adding four sub-questions to Questions 12 and 13. The new sub-questions will ask about creating multimedia presentations (individually or as a collaborative group exercise), working on a website that the student maintains, and creating maps.



8. Put positive answer first! Don't make the assumption with the first response that I'm dumb.

Questions and response formats follow NAEP conventions, which list answers in ascending order from top to bottom or left to right. These questions will be administered with other NAEP contextual questions (both during the pilot and in subsequent uses). It would be confusing to students and raise the possibility of mistakes if response conventions differed from item to item.



These comments apply to the Grade 4 survey

1."Laptop or desktop computer" is dated language and does not take into account tablet devices. We recommend: At home, do you have your own computer or tablet device?

As noted above, we were careful to distinguish between types of devices in our questions because the NAEP 2015 pilot will be delivered by tablet. We are therefore especially concerned that we measure access to and familiarity with tablets as distinct from computers. For example, having used a tablet, as opposed to a computer, could be positively related to performance on NAEP.


2. Organization of the questions by device not by the verb would help students progress through the survey in a more logical fashion. Presenting the questions to 4th graders in a table, as in the 8th grade survey, would be better. It is much easier to review and takes up less space/less time.

We have placed the vast majority of questions at Grades 8 and 12 into a matrix (table) format, but NAEP does not use a matrix format at Grade 4 because analyses of earlier assessments showed that about one quarter of fourth-graders answer only the first item in a matrix, apparently failing to understand the format.

With regard to whether the items should be organized by device or by verb, we note that the current ordering of the items was used in the cognitive interviews, and no problems were discovered. Consequently, we do not recommend reorganizing the items in this manner.



These comments apply to the Grade 8 and Grade 12 surveys

1. Other sets of questions, such as 3-5, should also be presented in a table.

Whenever feasible, we put questions for eighth and twelfth graders in matrix (table) form. However, Qs 3, 4, and 5 each have six response categories and some of the response categories contain as many as nine words. With so many lengthy response options, the items cannot be properly formatted across the page, as required for matrix items. Importantly, the questions as currently formatted worked well in the cognitive interviews we did with 8th graders, and thus we plan to leave Qs 3, 4, and 5 as three separate questions.

2. In question 8, change "were you taught" to "did you learn". Same for 4th grade.
Again -- table for 4th grade.

We do not recommend making this wording change. The question is designed to measure instruction, not learning. Responses will be used to help establish whether there is equity in instruction for all students regardless of their socio-demographic backgrounds.


3. In question 10, why would you ask students if they can take a desktop home?

We agree and will change the response to include only whether students can take a laptop home.


4. In question 12 and 13, the activities represent low level Bloom's activities. This creates the appearance that the end goal for students and teachers using technology for learning is only for low level tasks. Please do not underestimate the negative impact. Look at the National Educational Technology Standards for Students at: http://www.iste.org/docs/pdfs/20-14_ISTE_Standards-S_PDF.pdf. Also, there is no advance in expectations for 12th graders beyond those for 8th graders.

As indicated in our response to your question 7 above, our goal for this study is to develop a set of items that concisely, but reliably and validly, assess students’ access to and familiarity with technology, and to build measures of each construct. The goal is not to create an inventory of all the ways in which technology is used in schools. However, we agree that we may not have enough “top” (items at the high end of the familiarity scale) for the measurement of familiarity with technology at Grades 8 and 12 and will therefore add four additional sub-questions to Questions 12 and 13. The new sub-questions will ask about creating multimedia presentations (individually or as a collaborative group exercise), working on a website that the student maintains, and creating maps.



5. Combine 12.a and 12.b as "Have you used a computer to write a paper?" Why is length an issue?

During the development process for our questionnaire, we drew on experts in the field to help us refine our questions. When it came to the question referenced above, using a computer to write a paper, the consensus was that we needed to distinguish students’ responses based on the types of writing assignments on computers to which they had been exposed. Experience with both short and long writing assignments is relevant to students’ ability to respond to NAEP assessments, but experience with longer writing assignments may be especially predictive of students’ ability to respond successfully to the longer constructed-response questions on NAEP.



6. For question 14, the real question being asked is: Do you feel comfortable with your ability to communicate quickly and effectively using a keyboard?

Question 14 was designed to assess students’ keyboarding skills and was tested and refined through cognitive interviews with students in Grades 4 and 8. The substitute question you suggest might be appropriate for those in Grade 12, less appropriate for Grade 8, and likely not understood by many if not most 4th grade students. Finally, since we would like to use the same question across all three grade levels, we will keep the question as originally stated.



7. For question 17.g, use CPU - nobody says Central processing unit.

We agree and will change the wording accordingly.



8. For question 16, students will not take this seriously. If they are asked ridiculous questions, you will get ridiculous answers.

We used this question in our cognitive interviews with 4th and 8th graders, and they answered it seriously. Given that half the sample will take the 2015 NAEP assessments using paper and pencil and the other half will take the assessments on tablets, it will be useful to determine whether having a preference for one or the other mode of administration relates to test performance, holding other factors constant (e.g., SES, gender, motivation). That is, does a student who states a preference for taking tests on a computer do better, given that he or she is taking the NAEP assessment on a tablet, than a student who has a preference for taking tests using paper and pencil, controlling for socio-demographics and motivation? Similarly, does a student who states a preference for taking tests using a computer, but who is assigned to the paper-and-pencil condition, do worse (with controls in place) than a student who states a preference for taking tests using paper and pencil? Because we think these are important questions to be able to answer with the move to a technology-based NAEP, we propose to leave the question in the survey.



9. For question 18, these activities are low level and do not align with the National Educational Technology Standards for Students (and teachers) that have been adopted by most states.

In line with the study rationale described in the previous responses, one main goal of the study is to evaluate whether students demonstrate basic proficiency with computers and technology as a possible performance moderator for the NAEP technology-based assessment. The behaviors chosen for Question 18 represent these types of basic behaviors. While we recommend keeping the current sub-questions, we will also include one additional sub-question, “Figure out how to use new functions of a digital device I am not yet familiar with,” to capture additional higher-level behaviors.


10. For question 19, let's be positive! 10 to 0.

Questions and response formats follow NAEP conventions, which list answers in ascending order from top to bottom or left to right. These questions will be administered with other NAEP contextual questions (both during the pilot and in subsequent uses). It would be confusing to students and raise the possibility of mistakes if response conventions differed from item to item.


11. For questions 20 - 23, ability to communicate is not based on the number of fingers you can use! These questions assume able-bodied students. They are discriminatory and inappropriate. We have elementary students creating applications. Students will rank skill sets on most of these questions as very low because they have greater expectations for uses of technology than are reflected in this survey.

In line with the study rationale described in the previous responses, one main goal of the study is to evaluate whether students demonstrate basic proficiency with computers and technology as a possible performance moderator for the NAEP technology-based assessment. Questions 20-23 are so-called anchoring vignettes that were specifically included to provide better frames of reference for students’ self-reported answers in the survey. Knowledge about students’ evaluation of computer skills captured in the vignettes can provide context for their rating standards for other questions and can increase the validity of the questionnaire overall. The four vignettes capture different levels of familiarity. Question 22 explicitly includes students’ experience with programming and creating their own apps. As noted in our response to your Comment 4, we will also change the wording of the question stems associated with the four vignettes to “…how familiar with using computers and other digital devices…” instead of “… how familiar with computers and digital technology …”



Thank you again for your interest in the NAEP Computer Familiarity Study, and for your comments and suggestions.

With regards,


Patricia M. Etienne

Program Director, Assessment Coordination
National Assessment of Educational Progress

