A-2 Comment Summaries and Responses

Computer and Internet Use Supplement to the Census Bureau's Current Population Survey

OMB: 0660-0021

November 2023 NTIA Internet Use Survey: Comments on Proposed Information Collection

On June 2, 2023, the National Telecommunications and Information Administration (NTIA) published a notice in the Federal Register requesting comments on the next edition of the NTIA Internet Use Survey, pursuant to the provisions of the Paperwork Reduction Act (PRA). NTIA received seventeen comments in response to this notice, seven of which were substantively similar. All comments contained valuable and thoughtful feedback that NTIA has considered in the course of finalizing its proposed information collection. NTIA summarizes and responds to each comment below.

  1. Lori Special – State Library of North Carolina

This comment asks about the provisions in the survey administration for people “with no or low reading literacy, blind or visually impaired, or those who have to use other locations like libraries to access the survey.” The author goes on to suggest that libraries could be used to deploy and collect this survey in these communities. However, the author notes, libraries, especially those in digital deserts, are often understaffed and have few resources, so any ask of them should not come at the last minute. “Staff is often willing, IF they are given time to plan and prepare. Make the ask early.” The comment also contains other administrative suggestions around having paper copies, paid postage, and assistive audio. We appreciate the thoughtful consideration of accessibility issues, but note that the Census Bureau selects a random sample of housing units, and employs live interviewers to contact the households residing in those units through a combination of home visits and live phone interviews. Because the NTIA Internet Use Survey is a supplement to the Current Population Survey, NTIA relies on, and benefits from, the Census Bureau’s expertise in collecting data from persons with disabilities, low levels of literacy, and other challenges.

  2. Shelia Corren – Clemson University Professor of Sociology

The comment starts by stating how excited the author is to see new NTIA data. The author suggests that the survey should go beyond yes/no responses for devices and online activity, and that data on the frequency of device and Internet use would be much more useful for researchers. Specifically on DEVQUA, the comment suggests that the survey should assess the working status of devices independently. The comment also recommends that a new question should be added that asks where the respondent gets help with their devices, noting that since people tend to get help from the same place, it is okay if all devices are grouped together under this question. The commenter provided this sample question:

Question: In general, when you need help fixing a computer or another digital device (such as a smartphone) issue, where did you turn to for help? (CHOOSE ONE)

  • I figured it out by myself

  • Online tutorials or videos

  • Friend or family member

  • Co-worker or supervisor

  • Community institution such as a school, library, or religious organization

  • Business help desk (such as Best Buy’s Geek Squad or Apple’s Genius Bar)

  • Other (record verbatim answer)

  • Don’t know/I can’t remember needing help

Lastly, the comment suggests that given how pervasive Internet access is now, it is less important to collect where exactly people use the Internet. The comment suggests that INCAFE, INTRAV, INELHO, INOTHER can all be dropped and that INWORK, INSCHL, INLICO should be kept.

We appreciate the suggestions to increase the depth of knowledge around frequency of Internet use, the quality of individual devices, and device repair options, though constraints on interview time make these infeasible to implement at this time. Further, we welcome the commenter’s prioritization of certain Internet use locations over others; while some testing would first be necessary to ensure we continue to capture the full Internet-using population, we may pursue a restructuring of that section for a future edition of the Survey.

  3. Wisconsin Regional Training Partnership, National Association of State Directors of Adult Education, Kentucky Digital Equity Initiative, National Skills Coalition, Stacey Wedlake, American Library Association, EveryoneOn

Each of these parties submitted a substantively similar comment. The comments begin by noting that the survey “is a unique and invaluable source of data about US residents’ digital access and adoption… No other data collection effort or vendor even comes close.” The authors then argue in great detail that demand for digital skills has increased rapidly and that the survey needs to collect information on the topic. This is especially urgent, they argue, because digital skills have not kept pace with demand, and data are needed to help allocate resources and develop policies to close this gap. As such, the comments support adding questions on specific tasks that might be carried out with digital technologies, recommend that questions be added on education and employment (just as questions were added specifically on health), and list three specific questions that could be added:

  • Education/workforce Q1: In the past 12 months, have difficulties in using a computer or other technology that requires digital skills stopped you from applying for a job offer or promotion? (YES/NO)

  • Education/workforce Q2: In the past 12 months, has there been a work task or responsibility that you were unable to complete because of difficulties in using a computer or other technology that requires digital skills? (YES/NO)

  • Education/workforce Q3: In the past 12 months, has there been an educational or learning opportunity that you were not able to participate in because of difficulties in using a computer or other technology that requires digital skills? (YES/NO)

While those questions are the higher priority, the authors suggest that, space permitting, additional questions could be added that are similar to the ones above but focus instead on “problems with internet access.” Lastly, these comments strongly recommend the same question on how people receive help with digital tasks that was recommended in the comment from Shelia Corren (see summary item #2, above).

We strongly agree that there is a need to improve understanding of the state of digital skills in America, and we will keep these important suggestions in mind for future editions of the Survey. We note that digital skills measurement is an active area of academic research, with different parties suggesting different vehicles for obtaining this information (e.g., qualitative interviews, or observation of subjects completing particular tasks). It is unclear whether a large national household survey like the NTIA Internet Use Survey is the best way to collect digital skills data, though there are certainly advantages to having such information available in the same dataset as other data on individual computer and Internet use. While substantial testing and collaboration with experts would be required, and there would be difficult questions about trade-offs required to keep interview times from rising, this is an area of great interest for future development.

  4. Pete Wilson

The comment states that the survey is too long and places too great a burden on respondents, and suggests that the survey be limited to no more than 5 minutes. Our partners at the Census Bureau have determined that supplements to the Current Population Survey, such as the NTIA Internet Use Survey, should aim for a maximum household interview time of 10 minutes. We defer to their expertise, and therefore respectfully disagree with this proposal.

  5. World Education

The comment expresses concerns about the accessibility of the survey for individuals with language barriers and/or disabilities, especially since NTIA is planning on using the data to inform the State Digital Equity Capacity Grant Program. To make sure the data are accurate for this purpose, the comment argues, the survey must be accessible to all Covered Populations named under the Digital Equity Act. The comment then goes on to support the National Skills Coalition’s suggestions (see summary item 3, above).

The comment then discusses accessibility for individuals facing a language barrier and notes two significant issues with the survey questions: the complexity of the language used, and the lack of non-English options. The comment suggests that the following remedies be pursued: rewrite the device and Internet definitions in plain English and/or include images of the items being defined; provide high-quality translations of the survey; ensure that survey staff are fluent in relevant languages; ensure that terms are understandable to respondents with limited familiarity with technology and that similar items, such as a modem and a router, are clearly distinguished; and add an “I don’t know” response to questions such as HOMTE and USEINT.

After noting a lack of guidance in the Interviewer Manual on working with respondents with disabilities, the comment suggests that work be done to ensure that accommodations are offered, that specific guidance is included in the Interviewer Manual, and that respondents have access to assistive technology.

Lastly, the comment notes that the expected completion time of 10 minutes might not be accurate for the aforementioned groups and that this higher burden might decrease response rates. The comment also discusses how involving local community groups (1) might reduce concerns about a respondent opening their front door to strangers and (2) could help adapt the survey to a respondent’s primary language.

We deeply appreciate the focus on accessibility in this comment. In particular, a recurring theme of our cognitive testing efforts with the Census Bureau has been ensuring we use clear and broadly understood language throughout the survey instrument, which can be challenging when asking questions about technology use. Some of the proposed changes to this information collection were made specifically to improve clarity and use of plain language, and we are always open to testing more such improvements in the future. Separately, while Census Bureau interviewers translate our questions into Spanish as needed during particular interviews, we understand the potential advantages of adopting tested and standardized translations of the survey questions, and will keep this in mind for future investigation. We further note that the Census Bureau does have mechanisms in place to address some of the mentioned concerns, such as the ability to record “don’t know” responses to any question.

  6. Information Technology and Innovation Foundation

The comment starts by recognizing the survey as an important resource for digital inclusion and endorsing the range of response options for reasons for Internet non-adoption. The comment then suggests adding more detailed questions on non-adoption. First, it suggests asking respondents whether they know about the Affordable Connectivity Program (ACP) and whether they have signed up for the program. These questions could help delineate who does not know about the program and who is facing other barriers to Internet access.

Other recommended additions include asking respondents about the disadvantages they have faced due to a lack of Internet service, and what benefits of Internet service would make them reconsider. The author argues that, since the “most effective digital inclusion efforts are targeted to specific needs,” these questions would help policymakers design effective policy. The last suggestion is that the survey should include an additional broad response category, “I don’t know how to use the Internet,” with follow-up questions on what interventions would help the respondent use the Internet.

We are constantly working to improve our understanding of the challenges faced by those who find themselves on the wrong side of the digital divide, including through survey questions. While concerns around interview times constrain our ability to add additional questions at this time, we note there is some potential for researchers to study these issues without adding more questions. For example, it may be possible to link CPS household records with ACP enrollment data, though researchers would need to go through the Census Bureau’s process for obtaining access to non-public datasets containing personally identifiable information in a secure environment.

  7. Mark Friedman

The comment lists specific questions and recommended wording tweaks:

  • On the cell phone use question, add references to “smartphone” throughout the question

  • On the social network use question, add reference to TikTok

  • On the video call question, add reference to Teams

  • On the smart home device question, add reference to “Door Ring”

  • On the “reasons why you do not use the internet at home?” question, add “Don’t know how?”

As part of the expert review and cognitive testing process, we and Census Bureau colleagues carefully considered the language and examples used to explain various technologies. At this time we do not believe it is necessary to add the additional examples, but we will continue to monitor the performance of questions in case more examples are needed in the future. We appreciate the suggestion to add lack of digital skills as a potential reason for non-use of the Internet at home, and will consider that moving forward as part of the larger discussion around measuring digital skills.

  8. Amy Gonzales – Associate Professor, Department of Communication, UC Santa Barbara

The commenter notes that DEVSTA does not act as expected when compared to DEVQUA. The commenter suspects that the question is capturing multiple phenomena and that it could be narrowed to potentially be more useful. We thank the commenter for highlighting this issue, which we also noticed in our own work. In the proposed revisions to this information collection, we revised DEVSTA to take the form of a simpler yes/no question, which we hope will improve its performance.

  9. Digitunity

The comment begins by praising the “robust inclusion of devices” included in the survey. The comment then makes the following suggestions for the survey:

  • On LAPTOP, a separate category for Chromebooks, whose capabilities are much more limited than those of most laptops

  • On MPHONE, reworded more succinctly as “What about a cell phone, such as a smartphone, which connects to the Internet?”

  • A general concern that the device questions should delineate whether a device has Internet capabilities, such as a Wi-Fi-enabled laptop

  • DEVQUA could be reworded to distinguish between circumstances in which (a) the device functions improperly and/or (b) the device is not suited for completing the work needed: “Thinking about all the different devices we just discussed, how well do the computers and other Internet-connected devices [you use/used by this household] function overall?”

  • For questions about specific use cases (pages 6-7), the comment encourages distinguishing between devices to help determine whether a given device is adequate and appropriate

  • NOHM could have an additional answer choice about networking equipment not working

  • NOHM, PRINOH response number 6 could be broken out into “No computing device,” “Device inadequate,” and “Broken” to enable more granular analysis

  • LOPRCE could have an additional question: “At what price, if any, would [you/your household] buy device?”

  • An additional question clarifying device issues could be added: What are the reasons that you are not able to use your device?

    • i. I am not able to troubleshoot or fix my device on my own.

    • ii. I don’t know where to find repair support that I can afford.

    • iii. It is broken and I can’t afford a replacement.

    • iv. My computer is not able to be repaired.

    • v. I lack a desk/headphones/camera to effectively use my device.

The comment also expresses support for the questions suggested by the National Skills Coalition (see summary item #3 above).

We appreciate the suggestions to improve our understanding of what devices Americans use and what challenges they face in attempting to fully utilize their capabilities. While the need to keep interview times down constrains our ability to add new questions, some of the additional answer choices could, after future testing, potentially be added with minimal impact on burden.

  10. CTIA

The comment begins by acknowledging the importance of the survey for programs such as the Affordable Connectivity Program. The comment then suggests that questions be added to assess awareness of affordability programs such as ACP and Lifeline, and that NTIA could coordinate with the FCC on surveys of ACP customers that the FCC has indicated it is undertaking.

The comment then makes suggestions around questions involving pricing. The comment argues that the proposed question about the monthly price a consumer would pay for Internet service is of limited use, given that the respondent may not be familiar with all available options or with programs like Lifeline and ACP. The comment suggests the question could be revised to ask whether a $30/month discount (equivalent to ACP) would change their decision.

The comment then argues that the proposed question regarding whether broadband service meets respondents’ needs should be revised because a respondent might not have the context needed to answer. The comment suggests that the question should instead ask whether the service enables respondents to complete the tasks for which they purchased it.

Lastly, the comment suggests that the question on Internet-connected devices should be clarified so that respondents can tell the question is specifically about the performance of the device, not the Internet connection.

We strongly agree that it is important to improve the state of knowledge around the Affordable Connectivity Program and similar initiatives, though given the difficulty of adding additional questions, we again suggest it may be possible to study the impact of ACP through other means, such as the record matching strategy described in our response to summary item #6, above. On the pricing question, we respectfully disagree with the assessment that LOPRCE is of limited use; the data generated by that question in the 2021 edition of the Survey have been helpful in shedding additional light on the challenges facing offline households, for example in a blog post NTIA published last year. Finally, we appreciate the suggestions for clarifying certain existing questions, and note that ensuring clarity and proper understanding of the survey questions was a main focus of the cognitive testing process that resulted in many of the proposed changes to the information collection.

  11. Benton Institute for Broadband and Society and Other Organizations

The comment begins by discussing the importance of the survey in developing and measuring the effectiveness of policy, particularly with respect to the Infrastructure Investment and Jobs Act. The comment goes on to make several recommendations for changing the survey to better fit this goal.

The first group of suggestions focuses on questions about challenges associated with use of devices and the Internet. For DEVQUA, the comment suggests either asking about a specific device or asking how often a device does not work properly. The next two points echo the recommendations made by the National Skills Coalition (see summary item #3, above). The comment then recommends that DEVSTA retain its previous format, as opposed to the proposed change to a yes/no form. Lastly on this point, the comment recommends that HNETST be reframed to also capture situations in which people cut or reduce their Internet service because it was too expensive.

The next group of recommendations focuses on the affordability and quality of Internet service. The first recommendation is to add questions about awareness of the ACP and whether the respondent has tried to enroll in the program. The comment also recommends that HNETQL ask how often the respondent’s Internet service is of poor quality. Lastly, the comment recommends that NOHM be updated because the reasons for not having Internet service may have changed since the question was first written, and it lists several options that could be offered as answers.

The comment then discusses adding questions around digital skills and literacy. The recommendations revolve around adding more attitudinal scales and more response options to questions. The comment suggests adding questions about how important Internet access was in performing tasks like finding jobs or receiving healthcare. The comment then suggests adding questions about the reliability of information found on the Internet and a self-assessment of media literacy skills. Lastly, the comment recommends that PSCON be revised to be more precise, using language such as “How concerned, if at all, are you by the following online privacy and security risks?” with answers on a 4-point scale.

The comment then says the survey should ensure that data can be disaggregated by covered populations under the Digital Equity NOFO and BEAD NOFO.

The comment then proposes several minor edits. For INLICO, the comment recommends removing “in other public places,” as the authors believe this phrase is unclear. For EGOVTS, the comment recommends adding specific examples of government services. For HOMIT, the comment argues the question could be simplified to simply asking respondents whether they own any smart appliances, since by definition such devices are continually connected to the network. Lastly, the comment discusses MEDREC and MEDDOC, arguing that the more useful distinction is between “live and interactive” health services and administrative tasks such as scheduling, billing, and record management. The comment suggests that MEDREC should include scheduling appointments and that MEDDOC should measure telehealth visits.

Lastly, the comment responds to the request for input on reducing the burden on respondents. First, the comment suggests deleting INCAFE, INTRAV, and INELHO, or combining them into a single question about the primary location of Internet use, arguing that these questions in particular offer little insight. The final suggestion is that some questions could be combined because the distinctions they draw are no longer as important; TEXTIM; DESKTP and LAPTOP; and AUDIO and VIDEO are listed as potential candidates for condensing or combination.

We appreciate this thorough set of recommendations. In particular, the similarities between the views of the authors of this comment and some of the other received comments suggest certain future directions for exploration. For example, while we do find value in tracking Internet use at a wide range of locations, the commonality across comments in the suggested locations to prioritize and to consider dropping could prove useful if we need to further reduce interview times in the future. We further note the desire here for more information about digital skills, the Affordable Connectivity Program, and reasons for non-use of the Internet from home. We also agree about the importance of understanding issues around media literacy and misinformation, but as with other suggestions, we currently lack the capacity to add more questions without risking increased interview times.
