A-2 Comment Summaries and Responses


Computer and Internet Use Supplement to the Census Bureau's Current Population Survey


OMB: 0660-0021


November 2021 Current Population Survey Computer and Internet Use Supplement
Comments on Proposed Information Collection

On May 25, 2021, the National Telecommunications and Information Administration (NTIA) published a notice in the Federal Register requesting comments on the next edition of its Computer and Internet Use Supplement to the Census Bureau’s Current Population Survey (CPS), pursuant to the provisions of the Paperwork Reduction Act (PRA). NTIA received eight comments in response to this notice. All eight comments contained valuable and thoughtful feedback that NTIA has considered in the course of finalizing its proposed information collection. NTIA summarizes and responds to each comment below.

1. Jean Public

In this comment, Jean Public expresses concern that the burdens of the survey (its cost, length, and need for personal visits) far outweigh its benefits. NTIA respectfully disagrees with the characterization of the survey instrument and with the suggested solution. We believe that fielding the NTIA Internet Use Survey as a CPS supplement is very efficient in terms of cost, resources, and convenience. Obtaining subscription information from ISPs, while also a valuable data collection, cannot shed light on how Americans make use of the Internet, other locations of Internet use, the demographics of non-users, or other important questions.

2. Kenneth Flamm

In his comment, Kenneth Flamm suggests revising HOMTE, which asks respondents how they access the Internet from their home, to provide clearer “guidance as to what choice to select if a household has fixed wireless service, which is a common ‘high speed’ broadband service option in many U.S. census blocks.” Flamm recommends adding “fixed wireless” as another example in option (1), or adding another category to gather this information. Currently, option (1) asks respondents if they access the Internet from their homes using “high-speed Internet service…such as cable, DSL, or fiber optic service.” Flamm states that the wording seems “to imply that technologies other than cable, DSL, or fiber are NOT high speed services.” Households with fixed wireless service must then choose between option (1), since fixed wireless can be considered “high-speed” service, or option (4), since it is not cable, DSL, or fiber optic service. This could potentially lead to inconsistencies and inaccurate estimation of broadband availability. Additionally, Flamm questions what “some other service” entails: would a “cellular hot spot with 3G or 4G LTE service” be covered, or does that fall under a “mobile data plan?”

The commenter notes that FCC data show that “20% of service providers in urban census blocks are fixed wireless service providers, and an even larger share of rural census blocks are served using fixed wireless service.” NTIA appreciates this suggestion and will monitor the adoption of this technology to consider listing it as an example in future editions of the survey. However, current FCC subscription data suggest that fixed wireless, while widely available for purchase, is not commonly used nationwide.1 Because fixed wireless service may be common only in certain areas, a nationwide survey may not capture enough of those households in our sample.

3. Nancy Gordon

As a researcher “who studies demographic and social factors…associated with internet use,” Nancy Gordon appreciates the importance of the survey data, particularly “for studying internet access equity.” However, she raises two issues, regarding the survey length and the unit of measurement. On the first matter, Gordon points out that the longer length of the survey might deter respondents from answering all of the questions, “especially if they are coming after a lot of other questions in CPS modules.” She alludes to an important point about the necessary balancing act between collecting rich data and keeping the survey efficient. NTIA agrees on the importance of ensuring that the survey does not become so long that we lose respondents. In fact, NTIA has been advised by the Census Bureau that response rates for CPS supplements may suffer when average response time exceeds ten minutes, so it is important to ensure that the length of the survey instrument does not exceed that threshold. Given this constraint, there is little space for adding questions without removing existing ones, and any new changes require extensive development and cognitive testing prior to use in production.

Second, Gordon challenges the usefulness of measuring survey results at the household level and emphasizes the value of “ha[ving] information about an individual respondent’s use of the internet.” We agree that for many purposes, individual-level data are ideal. We collect both household- and individual-level data in the NTIA Internet Use Survey, including on use of the Internet, different types of devices, and online activities. For other questions, such as those about the types of Internet connections that may be used at home, the household can be a more natural unit of measurement.

4. Murray Howard

Murray Howard recommends an additional question to gather information on “the number of successive billing cycles with internet response slowdowns that is most likely to prompt customers to upgrade from finite data plans to unlimited data plans.” Howard expresses concern that as “customers approach the maximum data usage allowed by a plan,” internet providers slow data speeds in order to compel customers into upgrading to unlimited data plans. While “there is nothing nefarious in their practices,” this can be “bad for trying to get work done online.”

This is an interesting idea, but it would require testing to ensure that respondents can consistently report on the varying nuances of data plans and that respondents are up to date on any throttling or surcharges imposed by internet providers. This information is also difficult to obtain, as it depends on respondents knowing how many billing cycles have passed and assumes that households can afford to upgrade to an “unlimited” plan if they wish to. Moreover, plans that are advertised as “unlimited” can still vary in terms of speed. This can lead to confusion and inconsistency in reported services, and survey respondents are unlikely to look at a bill from their service provider in the middle of an interview.

5 & 6. Karl Kolesnikoff

The commenter suggests improving the 2021 supplement by including more questions and examples to clarify the survey. In his first comment, Karl Kolesnikoff proposes three changes to HOMTE, which asks respondents how they access the Internet from their home. First, Kolesnikoff recommends removing cable and DSL as “high-speed” Internet options, as DSL “can barely give you 10/1 up and down speed,” and cable “is maybe good for 100/25 if your [sic] fortunate to pay for it and have the right ISP.” Second, he suggests separating option (1) into three separate questions about cable, DSL, and fiber optic service, given that they are “distinctly different modes of internet connection.” NTIA appreciates this suggestion; unfortunately, our experience suggests there may be substantial barriers to obtaining accurate data in this area. According to cognitive testing and reports from the field from previous editions of the NTIA Internet Use Survey,2 as well as testing of the computer and Internet use questions on the American Community Survey (ACS),3 many people have difficulty identifying the exact technologies they use to get on the Internet. This issue is what led us in the 2015 Survey to combine different types of fixed wired technologies (e.g., cable, DSL, and fiber optic service), and what led to a very similar change in the ACS. Third, he proposes clearly defining “some other service” in option (4), as there “is a finite number” of other services. Specifically, he points to the growth of WISPs as an example.

In his second comment, Kolesnikoff proposes additional questions regarding devices, particularly the number of devices each individual in the household has and the quality of each device. With regard to the quality of devices, Kolesnikoff suggests asking respondents if they “[keep] the software updated on the device and if not why,” and if they “know how to update applications…or the operating system.” This could provide “good insight into digital literacy.” NTIA agrees that understanding levels of knowledge with technology is an important area for research. Unfortunately, this particular proposal may be difficult to implement given the wide range of device types asked about in other areas of the survey, and it would either necessitate an expansion of the survey that may reduce response rates, or require NTIA to remove a number of other important survey questions. That said, NTIA will keep the need for more extensive digital literacy data in mind when contemplating future modifications to the Survey.

7. Rehabilitation Engineering Research Center for Wireless Inclusive Technology (Wireless RERC) and the Center for the Development and Application of Internet of Things Technologies (CDAIT)

Wireless RERC and CDAIT applaud the NTIA Internet Use Survey for providing “vital information on digital technology.” However, they express concern that the survey does not fully include “people with disabilities in the broader design of data collection,” which informs public policy. Incorporating such “data will allow for identifying barriers to internet use and technology adoption by people with disabilities, assisting” in “[creating] a more accessible and usable broadband environment.” Wireless RERC and CDAIT propose ensuring that “interviewers conducting personal visits are properly equipped to communicate with people who have disabilities” and that “additional information be provided or prepared on the procedures for conducting the in-person and telephone interviews in an inclusive manner.” Given that NTIA leverages the Census Bureau’s existing methodology for administering the CPS, we do not have the ability to change the way interviews take place for our supplement.

Wireless RERC and CDAIT recommend an additional question to “facilitate increased understanding of the various types of ICT people with disabilities use to access the internet and its content.” Specifically, they suggest asking if respondents “use any of the following aids to access online content or communicate via an internet-connected PC or laptop,” and provide the following options: screen reader, screen magnifier, I.P. relay, augmentative and alternative communication (AAC) device or software, text-to-speech technology, speech-to-text technology, braille display, and other. Similarly, Wireless RERC and CDAIT suggest adding options to NOHM, which asks respondents why they do not use the Internet at home: “Difficulty using the Internet (i.e., not accessible to my disability),” and/or “The cost of assistive technology/software to access online content.” The resulting data could provide more nuanced information regarding broadband affordability and adoption. NTIA appreciates the proposed questions and is very interested in fostering a better understanding of disability-related barriers to Internet access and adoption. While NTIA has not had space to add these questions and options, we recognize the potential value of gathering such data and will keep these comments in mind for future development.

Wireless RERC and CDAIT also propose an additional question following EGOVTS to “[probe] potential barriers to using the Internet to access government and public services” in order to assist in adjusting federal regulatory policies if needed. Next, Wireless RERC and CDAIT point out that the key to benefiting from Internet service is device affordability. While HNETST is “a good metric for service affordability,” additional questions regarding device affordability would provide a clearer understanding of the barriers to broadband adoption. They recommend asking respondents if they have “temporarily [lost] a home internet connection due to the inability to replace a broken, lost, or stolen personal computer or laptop,” as well as asking respondents if they have “temporarily [lost] a mobile Internet connection due to the inability to replace a broken, lost, or stolen personal smartphone.” We agree that data on barriers to accessing government and public services and device affordability could be very interesting; however, NTIA notes that the proposed survey instrument seeks to cover a wide range of topics, and that there is little space for adding additional questions without removing existing ones. That said, we will keep these comments in mind when contemplating future modifications to the CPS Supplement.

Additionally, Wireless RERC and CDAIT propose a change to survey flow. Specifically, they suggest placing EGOODS before ESRVCS as “they are related concepts, and presenting them in sequence is logical.” Moreover, they recommend adding alternative examples to ESRVCS given that the existing examples are “solely related to app-mediated services, which is fine if the goal of the question is to measure the app economy.” While cognitive testing did not reveal any understanding issues with the current placement and examples for these questions, we will keep these suggestions in mind in the future.

8. National Digital Inclusion Alliance (NDIA) and Public Knowledge

In their letter, NDIA and Public Knowledge break out their recommendations into two parts: additional areas of inquiry beyond NTIA’s proposed questions, and suggested changes to currently proposed questions. The first additional question recommended by NDIA and Public Knowledge is aimed at understanding digital literacy and skills, given that a “national, comprehensive dataset on U.S. residents['] levels of digital skills currently does not exist.” They advise that “NTIA begin collecting basic skills data” and offer sample questions and methodology by way of Eszter Hargittai’s comment in response to an NTIA RFC from last year, the North Carolina Broadband Infrastructure Office’s state survey, Dr. Roberto Gallardo’s research, and the National Skills Coalition. In particular, NDIA and Public Knowledge point to question 10.1 from the North Carolina Broadband Infrastructure Office’s state survey, which asks respondents to rate their skills in six activities on a scale of 1 (Would like to learn) to 5 (Not interested). NTIA agrees that understanding levels of skill with technology is an important area for research, and that a set of binary questions about online activities is an imperfect vehicle for understanding varying levels of digital literacy. Unfortunately, the NDIA and Public Knowledge proposal to improve in this area would either necessitate a drastic expansion of the survey that may reduce response rates, or require NTIA to remove a large number of other important survey questions. Additionally, the proposed scale would require significant cognitive testing prior to use in production. That said, while this particular proposal may be difficult to implement, NTIA will keep the need for more extensive digital literacy data in mind when contemplating future editions of the Survey, as well as other potential data collections.

The next additional question put forward relates to connectivity barriers and affordability. NDIA and Public Knowledge “applaud NTIA for incorporating revised questions in its latest draft of the Internet Use survey that pertain to cost and pricing…however NTIA could collect more specific data as it pertains to price from current broadband customers.” Specifically, they advocate for the addition of specific questions about “how much respondents pay for internet service,” such as “estimat[ing] a total monthly figure for expenses related to Internet access…or…breakdowns of services, and obtaining an estimate of the total monthly expenses so that the amount spent on Internet can be evaluated from the responses.” They note that this methodology has been used in state-level surveys. NTIA agrees that these data would be useful; in fact, questions about Internet access service pricing were included in the 2011 and 2013 editions of the NTIA Internet Use Survey. Unfortunately, that experience suggests it is difficult to obtain accurate pricing data through a household survey, when respondents are unlikely to have a copy of their broadband bill handy.

NDIA and Public Knowledge recommend three additional questions to determine whether existing connections are suitable for household needs. Given that “current broadband speeds often fail to meet modern consumer needs, especially over the past year as a result of increased strain on network infrastructure,” questions like “HNETQL are essential for assessing whether service providers are fulfilling their obligations to their customers.” However, three aspects of the issue do not seem to be addressed in the revised survey: “household simultaneous usage, the disparity between advertised and actual speeds, and service outages.” First, considering the prevalence of household members’ simultaneous use of high-bandwidth activities – especially during the pandemic – it is important to “understand whether households have access to broadband speeds that meet their needs.” NDIA and Public Knowledge propose asking respondents how often their “families engage in this kind of simultaneous use of their home internet connection, and whether they have noticed impacts or degradation in service quality when attempting to do so.” NTIA appreciates this suggestion, as the issue of satisfying household needs is important to understand. However, NTIA notes that the proposed follow-up question regarding impacted or degraded Internet service is at least generally addressed by HNETQL, which asks respondents whether their Internet connection at home fills the household’s needs, including for speed, reliability, and data caps.

Second, NDIA and Public Knowledge express concern regarding the “disparity between advertised speeds and actual speeds,” which became more apparent in the past year “as households became ever more tethered to their home Internet connections.” Since there is not currently a question on the revised survey that addresses this issue, it could “prove an effective and possibly revealing datapoint,” as well as allow NTIA “to gather data on whether customers are actually receiving the level of service they were advertised.” Accordingly, NDIA and Public Knowledge recommend additional questions about “their ability to perform bandwidth-intensive activities,” the “speed they believe they are paying for, and if they believe they are receiving that level of service.” They note that it would be important to include “I don’t know” as an answer choice, as it would expose an important knowledge gap, “providing practitioners an opportunity to educate the communities that they serve.” NTIA believes that measuring speeds directly is more likely to lead to a better understanding of the potential discrepancy between advertised and delivered speeds. For example, the FCC works in partnership with SamKnows to conduct automated, direct measurements of broadband performance.

The third additional question put forward “is directly tied to the question of whether or not connections are suiting household needs” and focuses on service outages. Outages can “severely impair a household’s ability to use the Internet,” are important indicators of network reliability, and are “more prevalent among those in the lowest income groups.” NDIA and Public Knowledge propose asking respondents about any service outages that have occurred, specifically about their length and frequency, as well as their impact on a “household’s ability to complete their online activities.” While more information about service reliability and outages would be useful for researchers, NTIA must balance the desire for more detail on a particular topic against the need to gather data on other topics related to computer and Internet use. There could be a variety of reasons for outages, including extreme weather, natural disasters, and accidents. This would also require extensive development and testing using resources not currently available for the November 2021 edition of the CPS Supplement.

Next, NDIA and Public Knowledge propose asking additional questions to assess Americans’ changing use of the Internet for research and news. There is an increasing reliance “on the Internet as a primary conduit for news and information,” and research “has shown that online sources of news have outpaced television and print sources.” Given that NTIA asks “numerous questions about the types of activities respondents engage in online,” NDIA and Public Knowledge recommend asking questions about respondents’ intentional and passive behavior in finding information and news online, including “Do you use the Internet to research and find information?” and “Do you get news from the Internet?” The passive form of the question attempts to capture specific behavior such as being “exposed to headlines or articles via social media or on an Internet homepage.” The information gathered “reflects on an essential component of how the Internet is used that directly affects our democracy and public life.” NTIA is always seeking a better understanding of households’ and individuals’ activities online, and this is an interesting idea that merits further consideration. Such a question would, however, require development and testing using resources not currently available for the November 2021 edition of the CPS Supplement.

In addition to recommending additional questions, NDIA and Public Knowledge suggest changes to DEVQUA, HOMTE, NOHM, and PRINOH. NDIA and Public Knowledge believe that the wording of DEVQUA, which asks respondents how well their computers and other Internet-connected devices work, makes it “unclear whether or not a user’s device is working as intended, or if the device is able to reliably connect to the Internet.” They propose the following change: “Thinking about all of the different devices we just discussed, how reliably do the computers and other devices [you use/used by this household] connect to the Internet? Please do not include home Wi-Fi routers or similar equipment.” NTIA respectfully disagrees with the proposed modification, as DEVQUA was developed and intended to assess the quality of the household’s devices overall, not just their ability to connect to the Internet. Further, we separately assess the quality of Internet connections in the Survey.

NDIA and Public Knowledge further advise breaking out cable, DSL, and fiber-optic service into three separate choices for respondents to choose from. This would help gather “more nuanced data, and…align with the practices of several state broadband offices (which ask a similar question).” NTIA appreciates this suggestion; unfortunately, our experience suggests there may be substantial barriers to obtaining accurate data in this area. Cognitive testing and reports from the field, as well as testing of the computer and Internet use questions on the American Community Survey (ACS), suggest that many people have difficulty identifying the exact technologies they use to get on the Internet. This issue is what led us in the 2015 Survey to combine different types of fixed wired technologies (e.g., cable, DSL, and fiber optic service), and what led to a similar change in the ACS.

Finally, NDIA and Public Knowledge propose a modification to option (3) in PRINOH and NOHM, which both ask about the reasons respondents do not go online or do not use the Internet at home. The current language for option (3) is “Not worth the cost,” which, in their view, may capture concerns about reliability rather than affordability. They therefore propose modifying the language to “Service is not reliable in my area.” NTIA respectfully disagrees with the proposed modification to option (3), as “Not worth the cost” is intended to capture household priorities in relation to cost and affordability, not service reliability.

2 Rodney L. Terry and Aleia Clark Fobia. (2017). Cognitive Pretesting of the 2015 CPS Computer and Internet Use Supplement. Research and Methodology Directorate, Center for Survey Measurement Study Series (Survey Methodology #2017-07). U.S. Census Bureau. Available online at <http://www.census.gov/srd/papers/pdf/rsm2017-07.pdf>.

3 Jamie M. Lewis and Dorothy A. Barth. (2017). 2016 American Community Survey Content Test Evaluation Report: Computer and Internet Use. American Community Survey Research and Evaluation Program. U.S. Census Bureau. Available online at <https://www.census.gov/content/dam/Census/library/working-papers/2017/acs/2017_Lewis_01.pdf>.



File Type: application/vnd.openxmlformats-officedocument.wordprocessingml.document
Author: Cao, Michelle - Intern
File Created: 2021-08-17
