
Connect America Fund – Performance Testing Measures 3060-1265 October 2022


SUPPORTING STATEMENT



This collection is being submitted to the Office of Management and Budget (OMB) to seek approval to extend the requirements contained in this information collection, with updates to the numbers of respondents.


This collection addresses the requirements for universal service high cost support recipients to test their networks for compliance with the FCC’s speed and latency requirements. See Connect America Fund, Order, WC Docket No. 10-90, 33 FCC Rcd 6509 (WCB/WTB/OET 2018) (Performance Measures Order); Connect America Fund, Order on Reconsideration, WC Docket No. 10-90, FCC 19-104 (2019) (Performance Measures Reconsideration Order); 47 C.F.R. § 54.313(a)(6). On June 13, 2022, OMB approved a non-substantive change clarifying that this collection’s requirements apply to recipients of Rural Digital Opportunity Fund support.


  A. Justification


    1. Circumstances that make the collection necessary. The Communications Act of 1934, as amended, requires the “preservation and advancement of universal service.” 47 U.S.C. § 254(b). The information collection requirements reported under this collection are the result of FCC actions to promote the Act’s universal service goals.


In the USF/ICC Transformation Order, the Commission laid the groundwork for today’s universal service programs providing $4.5 billion in support for broadband Internet deployment in high-cost areas. Connect America Fund, et al., Report and Order and Further Notice of Proposed Rulemaking, WC Docket No. 10-90, et al., 26 FCC Rcd 17663 (2011) (USF/ICC Transformation Order). The USF/ICC Transformation Order required, among other things, that high-cost universal service recipients “test their broadband networks for compliance with speed and latency metrics and certify to and report the results to the Universal Service Administrative Company (USAC) on an annual basis.” Id. at 17705, para. 109. Pursuant to the Commission’s direction in that Order, the Wireline Competition Bureau, the Wireless Telecommunications Bureau, and the Office of Engineering and Technology (the Bureaus and OET) adopted more specific methodologies for such testing in the Performance Measures Order. See generally Performance Measures Order. See also 47 C.F.R. § 54.313(a)(6) (requiring that recipients of high-cost support provide “[t]he results of network performance tests pursuant to the methodology and in the format determined by the Wireline Competition Bureau, Wireless Telecommunications Bureau, and Office of Engineering and Technology”). Addressing petitions for reconsideration, the Bureaus and OET more recently adopted certain modifications and clarifications to the requirements pertaining to high-latency bidders in the Connect America Fund (CAF) Phase II auction, and the Commission refined the general testing requirements further. See generally Connect America Fund, Order on Reconsideration, WC Docket No. 10-90, DA 19-911 (WCB/WTB/OET 2019) (Satellite-Related Performance Measures Order); Performance Measures Reconsideration Order. Accordingly, this collection includes the requirements for testing speed and latency to ensure that carriers are meeting the public interest obligations associated with their receipt of high-cost universal service support.


Carriers will identify, from among the locations they have already submitted and certified in USAC’s High Cost Universal Broadband (HUBB) portal, the locations where they have an active subscriber (deployment locations are reported under OMB Control Number 3060-1228, and active locations will be reported under this control number). From those subscriber locations, USAC will then select a random sample from which the carrier will be required to perform testing for speed and latency. Carriers that do not provide location information in the HUBB will use a randomization tool provided by USAC to select a random sample of locations for testing. The carrier will then be required to submit to USAC the results of the testing on an annual basis. The annual filing will include the testing results for each quarter from the prior year. The carrier’s sample for each service tier (e.g. 10 Mbps/1 Mbps, 25 Mbps/1 Mbps) shall be regenerated every two years. During the two-year cycle, carriers will have the ability to add and remove subscriber locations if necessary, e.g., as subscribership changes.


Currently approved requirements in this information collection (no changes to requirements):


  a. Selection of locations with active subscribers (See 12.a):


A maximum of 50 randomly selected subscribers in each state and each speed tier must be selected. The number of consumers to be tested will be based on the number of subscribers at high-cost-supported locations. Performance Measures Order, 33 FCC Rcd at 6522-24, paras. 36-40.


Through USAC’s HUBB, carriers report locations where they have deployed broadband using CAF or other high-cost support. See OMB Control No. 3060-1228. Every two years, pursuant to this information collection, carriers must identify locations where the carrier has an active subscriber (i.e., by assigning a unique identifier to a subscriber location) from among those deployed locations they report to the HUBB. USAC will use this information to generate random lists of locations with subscribers for carriers to test. By building on the existing reporting requirement, this requirement minimizes the burden on carriers while ensuring that providers cannot cherry-pick the subscribers likely to have the best performance. See Performance Measures Order, 33 FCC Rcd at 6524, para. 40. Screenshots provided with this submission show the interface through which carriers must identify subscriber locations. Carriers reporting in the HUBB can download their geocoded location data and will download the random sample of subscriber locations for testing.


For those few carriers that are not required to report locations in the HUBB (e.g., those carriers that have already deployed broadband at the required speed and latency to 100% of the locations in their service area), a randomization tool is available for carriers to select a random sample of locations for testing. Using this tool, carriers will input the number of high-cost-supported subscriber locations they have in a state and speed tier, and will receive a list of numbered locations they must test. Neither USAC nor the Commission collects information from these carriers for non-HUBB locations through the tool, but location information may be collected during a subsequent audit.
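To illustrate the concept, the sketch below shows how a random sample of numbered locations could be drawn from a carrier-supplied location count, capped at the 50-location maximum described above. This is not USAC’s randomization tool; the function name, the fixed cap, and the seed parameter are illustrative assumptions, and the actual sample sizes are governed by the Performance Measures Order based on subscriber counts.

```python
# Illustrative sketch only -- NOT USAC's randomization tool. It draws a random
# sample of numbered locations from a carrier-supplied count, capped at the
# 50-location maximum described above. Actual sample sizes are set by the
# Performance Measures Order based on subscriber counts.
import random


def select_test_locations(total_locations: int, max_sample: int = 50,
                          seed: int | None = None) -> list[int]:
    """Return a sorted random sample of location numbers (1..total_locations)."""
    rng = random.Random(seed)
    sample_size = min(max_sample, total_locations)
    return sorted(rng.sample(range(1, total_locations + 1), sample_size))


# Example: a carrier with 320 high-cost-supported subscriber locations in one
# state and speed tier receives a numbered list of up to 50 locations to test.
print(select_test_locations(320, seed=2022))
```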


  b. Speed and latency testing and submission of results (See 12.b):


Carriers receiving high-cost support to provide broadband service to fixed locations, including all providers with CAF Phase II, Alternative Connect America Cost Model (A-CAM), Connect America Fund Broadband Loop Support (CAF BLS), Rural Broadband Experiments (RBE), Alaska Plan (wireline), and Rural Digital Opportunity Fund (RDOF) obligations, are required to conduct these speed and latency tests. See 47 C.F.R. § 54.313(a)(6). Three testing options will be permitted: use of Measuring Broadband America (MBA) testing; off-the-shelf testing; and provider-developed self-testing. Performance Measures Order, 33 FCC Rcd at 6513, paras. 9-10.


A test is defined to be a single, discrete observation or measurement of speed or latency conducted from the customer premises of an active subscriber at a high-cost-supported location to a remote test server located at, or reached by, passing through an FCC-designated Internet exchange point (IXP). For providers serving non-contiguous areas more than 500 air miles from the contiguous United States, testing must be conducted from the customer premises of an active subscriber to the point in the non-contiguous area where all mainland traffic is aggregated for transport from the non-contiguous area. Performance Measures Order, 33 FCC Rcd at 6515-17, paras. 17-21.


Testing must be conducted for one week during each quarter of the year. In those weeks, testing must be performed between the hours of 6:00 pm and 12:00 am local time each day, including weekends (testing hours). For latency testing, a provider must conduct a minimum of one test per minute (sixty tests per hour) during each testing hour. For speed testing, a provider must conduct a minimum of one test per testing hour in each direction (download and upload). Performance Measures Order, 33 FCC Rcd at 6519-21, paras. 27-33.
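As a back-of-the-envelope illustration of this cadence, the sketch below computes the minimum number of tests implied per tested location during a quarterly testing week (7 days of 6 testing hours each). It is simple arithmetic from the stated minimums, not a compliance calculation.

```python
# Minimum tests per tested location implied by the schedule above:
# one week per quarter, 6:00 pm to 12:00 am (6 testing hours per day),
# at least one latency test per minute and one speed test per hour per direction.
DAYS_PER_TESTING_WEEK = 7
TESTING_HOURS_PER_DAY = 6
LATENCY_TESTS_PER_HOUR = 60
SPEED_TESTS_PER_HOUR_PER_DIRECTION = 1

latency_tests = DAYS_PER_TESTING_WEEK * TESTING_HOURS_PER_DAY * LATENCY_TESTS_PER_HOUR
speed_tests = (DAYS_PER_TESTING_WEEK * TESTING_HOURS_PER_DAY
               * SPEED_TESTS_PER_HOUR_PER_DIRECTION * 2)  # download + upload

print(f"Minimum latency tests per location per testing week: {latency_tests}")  # 2520
print(f"Minimum speed tests per location per testing week:   {speed_tests}")    # 84
```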


Providers must submit all test results in the HUBB. Carriers cannot delete, trim, edit or otherwise exclude any test measurements. However, if a provider knows or suspects that the testing infrastructure has failed or has negatively impacted test results, the provider may submit evidence of the test infrastructure failure with sufficiently detailed information for the Commission to understand its cause and determine the extent to which any test results should be discarded or adjusted when calculating compliance. Performance Measures Order, 33 FCC Rcd at 6532, para. 61. By conducting speed and latency tests and submitting test results, carriers will aid the Commission in ensuring that its $4.5 billion budget for high-cost universal service support brings to high-cost areas broadband service reasonably comparable to those services provided in urban areas. See 47 U.S.C. § 254(b)(3).


Performance measures data and certifications are due July 1 each year for all four quarters of the prior calendar year. See 47 C.F.R. § 54.313(j)(1) (setting a July 1 annual deadline for high-cost recipients’ reporting, including of performance test data). However, carriers that are found to be not fully compliant with the Commission’s speed and latency standards will be subject to quarterly, rather than annual, reporting of test results. Performance Measures Order, 33 FCC Rcd at 6532, para. 63. The Commission also established an initial trial testing period in which carriers are required to conduct testing and submit results within a short time (i.e., within one week of the end of each quarter of pre-testing), but during that period there are no penalties for noncompliance associated with a failure to meet the required standards. See Performance Measures Reconsideration Order at 32-33, paras. 82-83.


The attached template provides details on the formatting required for carriers’ submission of testing data to USAC. In brief, a carrier must provide the following information for each speed test result (an illustrative sketch follows the list):


  1. Carrier identifier

  2. Subscriber identifier

  3. HUBB location identifier

  4. Study Area Code

  5. Required speed tier

  6. Whether download or upload tested

  7. Test server location

  8. Test start time

  9. Test end time

  10. Length of test

  11. Total bytes transferred in test

  12. Number of threads in TCP connections used in test

  13. Speed test result in Mbps

  14. Whether test was successful or failed
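The sketch below is an illustrative record mirroring the speed-test fields listed above. The field names and types are assumptions for readability only; the attached template governs the actual column names, units, and formats.

```python
# Illustrative only: a record mirroring the speed-test fields listed above.
# Field names and types are assumptions; the attached template governs the
# actual column names, units, and formats required by USAC.
from dataclasses import dataclass
from datetime import datetime


@dataclass
class SpeedTestResult:
    carrier_id: str
    subscriber_id: str
    hubb_location_id: str
    study_area_code: str
    required_speed_tier: str      # e.g., "25/3"
    direction: str                # "download" or "upload"
    test_server_location: str
    test_start: datetime
    test_end: datetime
    test_length_seconds: float
    total_bytes_transferred: int
    tcp_threads: int
    speed_mbps: float
    successful: bool
```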


For latency test results, a carrier must provide the following information (again, an illustrative sketch follows the list):


  1. Carrier name

  2. Subscriber identifier

  3. HUBB location identifier

  4. Study Area Code

  5. Latency tier

  6. Start date for testing week

  7. Test start time

  8. Test end time

  9. Length of test

  10. Test server location

  11. Latency in microseconds

  12. Whether test was successful or failed
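Similarly, the sketch below mirrors the latency-test fields listed above, with assumed names and types; the attached template governs the actual format.

```python
# Illustrative only: a record mirroring the latency-test fields listed above.
# Field names and types are assumptions; the attached template governs the
# actual column names, units, and formats required by USAC.
from dataclasses import dataclass
from datetime import date, datetime


@dataclass
class LatencyTestResult:
    carrier_name: str
    subscriber_id: str
    hubb_location_id: str
    study_area_code: str
    latency_tier: str
    testing_week_start: date
    test_start: datetime
    test_end: datetime
    test_length_seconds: float
    test_server_location: str
    latency_microseconds: int     # unit as listed above; the template governs
    successful: bool
```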


  c. Voice testing for high-latency service providers and submission of results (See 12.c):


Certain technologies, such as satellite, are not capable of meeting the Commission’s 100 ms latency standard. Carriers using such technologies are referred to as high-latency providers. In addition to broadband speed and latency testing, high-latency service providers receiving high-cost support must also demonstrate a Mean Opinion Score (MOS) of 4 or above using a modified ITU Standard Recommendation P.800 conversational-opinion test conducted over the actual network by an independent testing organization. Performance Measures Order, 33 FCC Rcd at 6524-26, paras. 44-46. See also Satellite-Related Performance Measures Order at 5-12, paras. 12-27.


As with speed and latency testing for non-high-latency service providers, providers must submit all test results. In other words, providers cannot delete, trim, edit or otherwise exclude any test measurements. However, if a provider knows or suspects that the testing infrastructure has failed or has negatively impacted test results, the provider may submit evidence of the test infrastructure failure with sufficiently detailed information for the Commission to understand its cause and determine the extent to which any test results should be discarded or adjusted when calculating compliance. Performance Measures Order, 33 FCC Rcd at 6532, paras. 61-62.


Statutory authority for this information collection is contained in 47 U.S.C. sections 151-154, 155, 201-206, 214, 218-220, 251, 252, 254, 256, 303(r), 332, 403, 405, 410, and 1302. 

 

This information collection does not affect individuals or households; thus, there are no impacts under the Privacy Act.   




    2. Use of Information: The Commission will use the information to ensure that Connect America and RDOF funds, totaling over $20 billion, are spent in accordance with the rules of the program, and specifically to determine that carriers are meeting their speed and latency requirements.


    3. Technology collection techniques: Respondents will use USAC’s HUBB portal to facilitate the selection process for speed and latency testing. Using the HUBB portal, carriers will identify the locations with active subscribers. Carriers will also use the HUBB portal to submit the speed and latency test results. The interface is designed to provide online storage of applications and related information for carriers, with the potential to ease compliance with recordkeeping requirements and possible audits. Furthermore, where possible, information already provided by carriers can be carried forward to filings in later funding years (i.e., pre-populated data) to further reduce the filing burden.


    4. Efforts to Identify Duplication: There will be no duplication of information. The information sought is unique to each carrier or respondent, and similar information is not available.


    5. Impact on Small Entities: The collection of information may affect small entities as well as large entities. With multiple testing options, the process has been designed to limit the burden as much as possible on small entities. Testing sample sizes are set based on the number of subscribers, so smaller entities are required to test substantially fewer locations than larger entities. Filing guidance and training will be made available to assist small entities in understanding what type of information should be submitted and in what format.


    6. Consequences if information is not collected. The information collected is used to determine compliance with the rules and eligibility for high-cost universal service support. These requirements were put in place, in part, as a response to Government Accountability Office recommendations to increase the transparency and accountability of high cost program funding. Without the requested information, USAC will not be able to determine whether a carrier is entitled to all the support it seeks or is complying with its service obligations. Failure to file the necessary location information may result in partial or complete denial of high-cost universal service support for the carrier.


    7. Special Circumstances. We do not foresee any special circumstances with this information collection.


    8. Federal Register notice; efforts to consult with persons outside the Commission. A 60-day notice was published in the Federal Register pursuant to 5 C.F.R. § 1320.8(d) on August 8, 2022. See 87 FR 47208. No PRA comments were received.


    9. Payments or gifts to respondents. The Commission does not anticipate providing any payment or gifts to respondents.


    10. Assurances of confidentiality. The testing results for speed and latency for individual locations will remain confidential and will not be available to the public. Further, no customer privacy data is obtained.


    11. Questions of a sensitive nature. There are no questions of a sensitive nature with respect to the information collection requirements described herein.


    12. Estimates of the hour burden of the collection to respondents. The following represents the hour burden of the collection of information:


      a. Selection of locations with active subscribers (updated respondents):


        i. Number of Respondents: Approximately 1,677 eligible telecommunications carriers. This factors in one respondent for each deployment obligation (i.e., carrier/state combination). In many instances, the same company will have an obligation in different states. For instance, Company A has an obligation in Maryland and an obligation in Virginia. While this is one company, it is factored in twice in the number of respondents.


        ii. Frequency of Response: Once every two years.


        iii. Total number of responses per respondent: 1 every two years (0.5 per year).


        iv. Estimated time per response: 16 hours.


        v. Total annual hour burden: 13,416.


16 hours per response every two years equals 8 hours per year for each of the 1,677 respondents. Total annual hour burden is calculated as follows:


1,677 respondents x 1 response per respondent every two years = 1,677 responses x (16 hours / 2 years) = 13,416 total annual hours.


        vi. Total estimate of in-house cost to respondents: $536,640 (13,416 hours x $40/hour).


        vii. Explanation of calculation: We estimate that each carrier will take, on average, 16 hours to gather and submit the data for its active subscribers every two years, which means 8 hours per year.


1,677 (number of responses) x 16 (hours to prepare report per response) / 2 years x $40/hour = $536,640


      b. Speed and latency testing and submission of results (updated respondents):


        i. Number of Respondents: Approximately 1,677 eligible telecommunications carriers. This factors in one respondent for each deployment obligation (i.e., carrier/state combination). In many instances, the same company will have an obligation in different states. For instance, Company A has an obligation in Maryland and an obligation in Virginia. While this is one company, it is factored in twice in the number of respondents.


        ii. Frequency of Response: Once annually, with quarterly reporting for carriers that are not fully compliant. The Commission will have an initial trial testing period in which carriers will be required to conduct testing and submit results within a short time, but there will be no penalties for noncompliance associated with a failure to meet the required standards.


        iii. Total number of responses per respondent: Approximately 2 (on average).


        iv. Estimated time per response: 45 hours.


        v. Total annual hour burden: 150,930.


45 hours per response for 1,677 carriers. Total annual hour burden is calculated as follows:


1,677 respondents x 2 responses per respondent on average = 3,354 responses x 45 hours = 150,930 total annual hours.


        vi. Total estimate of in-house cost to respondents: $6,037,200 (150,930 hours x $40/hour).


        vii. Explanation of calculation: We estimate that each carrier will take, on average, 45 hours per response. This includes the time to install any necessary software or testing equipment at the customer premises, conduct the testing, gather the results, and submit the results in the HUBB portal annually or, if necessary, quarterly. Some carriers may require a company truck roll to physically install additional equipment at the customer premises, but many carriers will be able to set up testing software remotely if it is not already in place.

3,354 (number of responses) x 45 (hours to prepare response per year) x $40/hour = $6,037,200


      c. Voice testing for high-latency service providers and submission of results (updated respondents):


        i. Number of Respondents: Approximately 3 eligible telecommunications carriers providing high-latency service through the Connect America Fund or the Rural Digital Opportunity Fund.


        ii. Frequency of Response: Once annually, with quarterly reporting for carriers that are not fully compliant. As noted above, the Commission will have an initial trial testing period in which carriers will be required to conduct testing and submit results within a short time, but there will be no penalties for noncompliance associated with a failure to meet the required standards.


        iii. Total number of responses per respondent: 1.


        iv. Estimated time per response: 60 hours.


        v. Total annual hour burden: 180.


60 hours per response, once per year for 3 respondents. Total annual hour burden is calculated as follows:


3 respondents x 1 response per respondent = 3 responses x 60 hours = 180 total annual hours.


        vi. Total estimate of in-house cost to respondents: $7,200 (180 hours x $40/hour).


        vii. Explanation of calculation: We estimate that each carrier will take 60 hours per response. This includes the time to conduct the testing twice per year, gather the results, and submit the results in the HUBB portal annually.


3 (number of responses) x 60 (hours to prepare the response) x $40/hour = $7,200


The estimated respondents and responses and burden hours are listed below:




Information Collection Requirements | Number of Respondents | Number of Responses per Year | Estimated Time per Response (hours) | Total Burden Hours | In-house Cost to Respondents
a. Selection of locations with active subscribers | 1,677 | 0.5 | 16 | 13,416 | $536,640
b. Speed and latency testing and submission of results | 1,677 | 2 | 45 | 150,930 | $6,037,200
c. Voice testing for high-latency service providers and submission of results | 3 | 1 | 60 | 180 | $7,200


Total Number of Respondents: 1,677 unique respondents filing multiple times.


Total Number of Responses Annually: 4,196 (rounded up)

Total Annual Hour Burden for Requirements (a) – (c): 164,526


Total Annual In-House Costs to Respondents: $6,581,040


    13. Estimates for the cost burden of the collection to respondents. Carriers may use outside contracting to assist with the testing or procure the testing equipment, although it is not required. For high-latency voice testing, an independent agency or organization must determine the MOS for the carrier; the expected costs for contracting such outside parties are built into the in-house costs for respondents noted above.


    14. Estimates of the cost burden to the Commission. There will be few, if any, costs to the Commission because ensuring proper use of universal service support is already part of Commission duties. Furthermore, no new systems or programs will be acquired or developed to process the information collection.


    15. Program changes or adjustments. The Commission is reporting adjustments to the number of respondents in this information collection. The total number of respondents increased from 1,277 to 1,677 (+400), and accordingly the total annual responses increased from 3,195 to 4,196 (+1,001), and the total annual burden hours increased from 120,158 to 164,526 (+44,368).


There are no program changes.


    16. Collections of information whose results will be published. The Commission does not plan to make the individual testing information available to the public, but it plans to summarize that information and make the summary available.


    17. Display of expiration date for OMB approval of information collection. There is no paper form associated with this information collection; it is collected electronically through the portal described above. The Commission seeks approval to not display the expiration date for OMB approval of this information collection. The Commission will use an edition date in lieu of the OMB expiration date. This will prevent the Commission from having to repeatedly update the expiration date on the portal each time this collection is submitted to OMB for review and approval. The Commission publishes a list of all OMB-approved information collections in 47 C.F.R. § 0.408 of the Commission’s rules.


    18. Exceptions to certification for Paperwork Reduction Act Submissions. There are no exceptions to the certification statement.


  B. Collections of Information Employing Statistical Methods:


An explanation of the statistical methodology involved in this collection is provided in the separate Part B: Statistical Methodology document.



