
New Information Collection

Connect America Fund – Performance Testing Measures 3060-XXXX November 2019


SUPPORTING STATEMENT



This new information collection is being submitted to obtain Office of Management and Budget (OMB) approval for new information collection requirements resulting from recent Federal Communications Commission (Commission or FCC) orders, as explained below.


The new collection addresses the requirements for universal service high cost support recipients to test their networks for compliance with the FCC’s speed and latency requirements. See Connect America Fund, Order, WC Docket No. 10-90, 33 FCC Rcd 6509 (WCB/WTB/OET 2018) (Performance Measures Order); Connect America Fund, Order on Reconsideration, WC Docket No. 10-90, FCC 19-104 (2019) (Performance Measures Reconsideration Order); 47 C.F.R. § 54.313(a)(6).


  A. Justification


    1. Circumstances that make the collection necessary. The Communications Act of 1934, as amended, requires the “preservation and advancement of universal service.” 47 U.S.C. § 254(b). The information collection requirements reported under this collection are the result of FCC actions to promote the Act’s universal service goals.


In the USF/ICC Transformation Order, the Commission laid the groundwork for today’s universal service programs providing $4.5 billion in support for broadband Internet deployment in high-cost areas. Connect America Fund, et al., Report and Order and Further Notice of Proposed Rulemaking, WC Docket No. 10-90, et al., 26 FCC Rcd 17663 (2011) (USF/ICC Transformation Order). The USF/ICC Transformation Order required, among other things, that high-cost universal service recipients “test their broadband networks for compliance with speed and latency metrics and certify to and report the results to the Universal Service Administrative Company (USAC) on an annual basis.” Id. at 17705, para. 109. Pursuant to the Commission’s direction in that Order, the Wireline Competition Bureau, the Wireless Telecommunications Bureau, and the Office of Engineering and Technology (the Bureaus and OET) adopted more specific methodologies for such testing in the Performance Measures Order. See generally Performance Measures Order. See also 47 C.F.R. § 54.313(a)(6) (requiring that recipients of high-cost support provide “[t]he results of network performance tests pursuant to the methodology and in the format determined by the Wireline Competition Bureau, Wireless Telecommunications Bureau, and Office of Engineering and Technology”). Addressing petitions for reconsideration, the Bureaus and OET more recently adopted certain modifications and clarifications to the requirements pertaining to high-latency bidders in the Connect America Fund (CAF) Phase II auction, and the Commission refined the general testing requirements further. See generally Connect America Fund, Order on Reconsideration, WC Docket No. 10-90, DA 19-911 (WCB/WTB/OET 2019) (Satellite-Related Performance Measures Order); Performance Measures Reconsideration Order. Accordingly, this collection includes the requirements for testing speed and latency to ensure that carriers are meeting the public interest obligations associated with their receipt of high-cost universal service support.


Carriers will identify, from among the locations they have already submitted and certified in USAC’s High Cost Universal Broadband (HUBB) portal, the locations where they have an active subscriber (deployment locations are reported under OMB Control Number 3060-1228, and active locations will be reported under this control number). From those subscriber locations, USAC will then select a random sample from which the carrier will be required to perform testing for speed and latency. Carriers that do not provide location information in the HUBB will use a randomization tool provided by USAC to select a random sample of locations for testing. The carrier will then be required to submit to USAC the results of the testing on an annual basis. The annual filing will include the testing results for each quarter from the prior year. The carrier’s sample for each service tier (e.g. 10 Mbps/1 Mbps, 25 Mbps/1 Mbps) shall be regenerated every two years. During the two-year cycle, carriers will have the ability to add and remove subscriber locations if necessary, e.g., as subscribership changes.
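For illustration only, the following sketch outlines the sampling step described above. The location identifiers, field names, and helper function are hypothetical, and the simple random draw shown here is a simplified stand-in for the sampling USAC actually performs within the HUBB; the governing methodology is described in the separate Part B document.

    # Illustrative sketch only: a simplified stand-in for USAC's sampling in the HUBB.
    # The 50-location cap per state and speed tier follows the sample sizes discussed
    # below; all data and names here are hypothetical.
    import random

    def select_test_sample(active_subscriber_locations, cap=50, seed=None):
        """Draw a random test sample for one state/speed-tier combination."""
        rng = random.Random(seed)
        return rng.sample(active_subscriber_locations, min(cap, len(active_subscriber_locations)))

    # A carrier first flags which HUBB-reported locations have an active subscriber;
    # the sample is then regenerated once every two years for each state and speed tier.
    hubb_locations = {"LOC-0001": True, "LOC-0002": False, "LOC-0003": True}  # True = active subscriber
    active = [loc for loc, is_active in hubb_locations.items() if is_active]
    test_locations = select_test_sample(active, seed=2020)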



New requirements for which we are seeking OMB approval:


  a. Selection of locations with active subscribers (See 12.a):


A maximum of 50 subscribers must be randomly selected in each state and each speed tier. The number of consumers to be tested will be based on the number of subscribers at CAF-supported locations. Performance Measures Order, 33 FCC Rcd at 6522-24, paras. 36-40.


Through USAC’s HUBB, carriers report locations where they have deployed broadband using CAF support. See OMB Control No. 3060-1228. Every two years, pursuant to this information collection, carriers must identify locations where the carrier has an active subscriber (i.e., by assigning a unique identifier to a subscriber location) from among those deployed locations they report to the HUBB. USAC will use this information to generate random lists of locations with subscribers for carriers to test. By building on the existing reporting requirement, this new requirement will minimize the burden on carriers while ensuring that providers cannot cherry pick the subscribers likely to have the best performance. See Performance Measures Order, 33 FCC Rcd at 6524, para. 40. Screenshots provided with this submission show the interface through which carriers must identify subscriber locations. Carriers reporting in the HUBB can download their geocoded location data and will download the subscriber random sample for testing.


For those few carriers that are not required to report locations in the HUBB (e.g., those carriers that have already deployed broadband at the required speed and latency to 100% of the locations in their service area), a randomization tool will be available for carriers to select a random sample of locations for testing. Using this tool, carriers will input the number of CAF-supported subscriber locations they have in a state and speed tier, and will receive a list of numbered locations they must test. Neither USAC nor the Commission would collect information from these carriers for non-HUBB locations through the tool, but location information may be collected during a subsequent audit.
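A minimal sketch of how such a randomization tool might behave is shown below; the function name and output format are hypothetical. The carrier supplies only a count, and the output is a list of location numbers on the carrier's own list. The full sample-size schedule is set out in the Part B statistical methodology; only the 50-location maximum described above is applied here.

    # Hypothetical sketch of the randomization tool for carriers not reporting in the HUBB.
    # The carrier enters only the number of CAF-supported subscriber locations in a state
    # and speed tier; no location data is collected through the tool itself.
    import random

    def draw_numbered_locations(total_locations, cap=50, seed=None):
        """Return the location numbers (1..N on the carrier's own list) selected for testing."""
        rng = random.Random(seed)
        count = min(cap, total_locations)
        return sorted(rng.sample(range(1, total_locations + 1), count))

    # Example: a carrier with 120 subscriber locations in one state and speed tier.
    print(draw_numbered_locations(120, seed=1))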


  b. Speed and latency testing and submission of results (See 12.b):


Carriers receiving high-cost support to provide broadband service to fixed locations, including all providers with CAF Phase II, Alternative Connect America Cost Model (A-CAM), Connect America Fund Broadband Loop Support (CAF BLS), Rural Broadband Experiments (RBE), and Alaska Plan (wireline) obligations, are required to conduct these speed and latency tests. See 47 C.F.R. § 54.313(a)(6). Three testing options will be permitted: use of Measuring Broadband America (MBA) testing; off-the-shelf testing; and provider-developed self-testing. Performance Measures Order, 33 FCC Rcd at 6513, paras. 9-10.


A test is defined as a single, discrete observation or measurement of speed or latency conducted from the customer premises of an active subscriber at a CAF-supported location to a remote test server located at, or reached by passing through, an FCC-designated Internet exchange point (IXP). For providers serving non-contiguous areas more than 500 air miles from the contiguous United States, testing must be conducted from the customer premises of an active subscriber to the point in the non-contiguous area where all mainland traffic is aggregated for transport from the non-contiguous area. Performance Measures Order, 33 FCC Rcd at 6515-17, paras. 17-21.


Testing must be conducted for one week during each quarter of the year. In those weeks, testing must be performed between 6:00 p.m. and 12:00 a.m. local time each day, including weekends (testing hours). For latency testing, a provider must conduct a minimum of one test per minute (60 tests per hour) for each testing hour. For speed testing, a provider must conduct a minimum of one test per testing hour in each direction (download and upload). Performance Measures Order, 33 FCC Rcd at 6519-21, paras. 27-33.
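As a rough illustration of the minimum test volumes these rules imply for a single test location, the arithmetic below assumes a six-hour daily window (6:00 p.m. to 12:00 a.m.) on all seven days of a quarterly testing week.

    # Minimum tests implied by the schedule above for one location during one testing week
    # (illustrative arithmetic only).
    TESTING_DAYS = 7                        # testing weeks include weekends
    TESTING_HOURS_PER_DAY = 6               # 6:00 p.m. to 12:00 a.m. local time
    LATENCY_TESTS_PER_HOUR = 60             # at least one latency test per minute
    SPEED_TESTS_PER_HOUR_PER_DIRECTION = 1  # download and upload each tested at least hourly

    testing_hours = TESTING_DAYS * TESTING_HOURS_PER_DAY                      # 42 testing hours
    min_latency_tests = testing_hours * LATENCY_TESTS_PER_HOUR                # 2,520 latency tests
    min_speed_tests = testing_hours * SPEED_TESTS_PER_HOUR_PER_DIRECTION * 2  # 84 speed tests

    print(testing_hours, min_latency_tests, min_speed_tests)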


Providers must submit all test results in the HUBB. Carriers cannot delete, trim, edit or otherwise exclude any test measurements. However, if a provider knows or suspects that the testing infrastructure has failed or has negatively impacted test results, the provider may submit evidence of the test infrastructure failure with sufficiently detailed information for the Commission to understand its cause and determine the extent to which any test results should be discarded or adjusted when calculating compliance. Performance Measures Order, 33 FCC Rcd at 6532, para. 61. By conducting speed and latency tests and submitting test results, carriers will aid the Commission in ensuring that its $4.5 billion budget for high-cost universal service support brings broadband service to high-cost areas that is reasonably comparable to the services provided in urban areas. See 47 U.S.C. § 254(b)(3).


The first performance measures data and certification will be due by July 1, 2021, for price cap carriers receiving CAF Phase II model-based support and shall include data for the third and fourth quarters of 2020; recipients of other kinds of high-cost support will begin testing at later dates. Performance Measures Reconsideration Order at 32-33, paras. 81-84. Thereafter, data and certification will be due July 1 each year for all four quarters of the prior calendar year. See 47 C.F.R. § 54.313(j)(1) (setting a July 1 annual deadline for high-cost recipients' reporting, including of performance test data). However, carriers that are found to be not fully compliant with the Commission's speed and latency standards will be subject to quarterly, rather than annual, reporting of test results. Performance Measures Order, 33 FCC Rcd at 6532, para. 63. The Commission will also have an initial trial testing period in which carriers will be required to conduct testing and submit results within a short time (i.e., within one week after the end of each quarter of pre-testing), during which there will be no penalties for noncompliance associated with a failure to meet the required standards. See Performance Measures Reconsideration Order at 32-33, paras. 82-83.


The attached template provides details on the formatting required for carriers' submission of testing data to USAC. In brief, a carrier must provide the following information for each speed test result (an illustrative sketch of the speed and latency record layouts appears after the latency field list below):


  1. Carrier identifier

  2. Subscriber identifier

  3. HUBB location identifier

  4. Study Area Code

  5. Required speed tier

  6. Whether download or upload tested

  7. Test server location

  8. Test start time

  9. Test end time

  10. Length of test

  11. Total bytes transferred in test

  12. Number of threads in TCP connections used in test

  13. Speed test result in Mbps

  14. Whether the test was a success or a failure


For latency test results, a carrier must provide the following information:


  1. Carrier name

  2. Subscriber identifier

  3. HUBB location identifier

  4. Study Area Code

  5. Latency tier

  6. Start date for testing week

  7. Test start time

  8. Test end time

  9. Length of test

  10. Test server location

  11. Latency in microseconds

  12. Whether the test was a success or a failure
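The sketch below is one possible in-memory representation of the speed and latency records enumerated above. The field names and types are hypothetical; the attached template governs the actual file layout, naming, and formats.

    # Hypothetical record structures mirroring the fields listed above.
    # The attached template governs the actual submission format.
    from dataclasses import dataclass

    @dataclass
    class SpeedTestResult:
        carrier_id: str
        subscriber_id: str
        hubb_location_id: str
        study_area_code: str
        required_speed_tier: str      # e.g., "10 Mbps/1 Mbps"
        direction: str                # "download" or "upload"
        test_server_location: str
        test_start_time: str          # timestamps as specified in the template
        test_end_time: str
        test_length_seconds: float
        total_bytes_transferred: int
        tcp_threads: int
        speed_mbps: float
        successful: bool

    @dataclass
    class LatencyTestResult:
        carrier_name: str
        subscriber_id: str
        hubb_location_id: str
        study_area_code: str
        latency_tier: str
        testing_week_start_date: str
        test_start_time: str
        test_end_time: str
        test_length_seconds: float
        test_server_location: str
        latency_microseconds: int     # unit as listed above
        successful: bool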


  c. Voice testing for high-latency service providers and submission of results (See 12.c):


Certain technologies, such as satellite, are not capable of meeting the Commission’s 100 ms latency standard. Carriers using such technologies are referred to as high-latency providers. In addition to broadband speed and latency testing, high-latency service providers receiving high-cost support must also demonstrate a Mean Opinion Score (MOS) of 4 or above using a modified ITU-T Recommendation P.800 conversational-opinion test conducted over the actual network by an independent testing organization. Performance Measures Order, 33 FCC Rcd at 6524-26, paras. 44-46. See also Satellite-Related Performance Measures Order at 5-12, paras. 12-27.
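For illustration only, the sketch below averages listener opinion scores into a MOS and compares the result to the benchmark of 4; the actual test design, call samples, and scoring follow the modified ITU-T Recommendation P.800 conversational-opinion procedure administered by an independent testing organization, and the ratings shown are hypothetical.

    # Illustrative MOS check: opinion scores on the 1-5 scale are averaged and compared
    # with the benchmark of 4 or above. The ratings below are hypothetical.
    from statistics import mean

    MOS_BENCHMARK = 4.0

    def meets_mos_benchmark(opinion_scores):
        mos = mean(opinion_scores)
        return mos, mos >= MOS_BENCHMARK

    print(meets_mos_benchmark([4, 5, 4, 3, 4, 5, 4, 4]))  # (4.125, True)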


As with speed and latency testing for non-high-latency service providers, providers must submit all test results. In other words, providers cannot delete, trim, edit or otherwise exclude any test measurements. However, if a provider knows or suspects that the testing infrastructure has failed or has negatively impacted test results, the provider may submit evidence of the test infrastructure failure with sufficiently detailed information for the Commission to understand its cause and determine the extent to which any test results should be discarded or adjusted when calculating compliance. Performance Measures Order, 33 FCC Rcd at 6532, paras. 61-62.


    2. Use of Information: The Commission will use the information to ensure that Connect America funds, totaling approximately $4.5 billion, are spent in accordance with the rules of the program, and specifically to determine that carriers are meeting their speed and latency requirements.


    3. Technology collection techniques: Respondents will use USAC’s HUBB portal to facilitate the selection process for speed and latency testing. Using the HUBB portal, carriers will identify the locations with active subscribers. Carriers will also use the HUBB portal to submit the speed and latency test results. The interface is designed to provide online storage of applications and related information for carriers, with the potential to ease compliance with recordkeeping requirements and possible audits. Furthermore, where possible, information already provided by carriers can be carried forward to filings in later funding years (i.e., pre-populated data), to further reduce the filing burden.


    4. Efforts to Identify Duplication: This will be the first time that eligible telecommunications carriers receiving high-cost support will be required to test their networks for both speed and latency. There are no similar collection requirements.


    5. Impact on Small Entities: The collection of information may affect small entities as well as large entities. With multiple testing options, the process has been designed to limit the burden on small entities as much as possible. Testing sample sizes are set based on the number of subscribers, so smaller entities are required to test substantially fewer locations than larger entities. Filing guidance and training will be made available to assist small entities in understanding what type of information should be submitted and in what format.


    6. Consequences if information is not collected. The information collected is used to determine compliance with the rules and eligibility for high-cost universal service support. These requirements were put in place, in part, as a response to Government Accountability Office recommendations to increase the transparency and accountability of high cost program funding. Without the requested information, USAC will not be able to determine whether a carrier is entitled to all the support it seeks or is complying with its service obligations. Failure to file the necessary location information may result in partial or complete denial of high-cost universal service support for the carrier.


    7. Special Circumstances. We do not foresee any special circumstances with this information collection.


    8. Federal Register notice; efforts to consult with persons outside the Commission. A 60-day notice was published in the Federal Register pursuant to 5 C.F.R. § 1320.8(d) on June 7, 2019. See 84 FR 26677. We received one comment in response to this notice. See Comments of NTCA-The Rural Broadband Association (filed Aug. 8, 2019) (NTCA Aug. Comments). NTCA-The Rural Broadband Association (NTCA) argues that the burden hours per response for this collection are higher than we have estimated and asks that the Commission “reassess the estimated hours burden per response.” NTCA Aug. Comments at 1. Specifically, NTCA enumerates various steps its member companies have identified as necessary for conducting the required testing. See NTCA Aug. Comments at 2-3. NTCA argues that the Commission’s analysis has not fully considered the amount of time for each of the steps involved. See generally NTCA Aug. Comments. As discussed below, we have considered the factors raised by NTCA and have taken into account the necessary steps for conducting the performance measurement tests in the burden estimates.


First, we acknowledge that there are a number of steps involved in conducting the required testing. The supporting statement explains that the estimated 45 burden hours “includes the time to install any necessary software or testing equipment at the customer premises, conduct the testing, gather the results, and submit the results,” as well as “to physically install additional equipment at the customer premises” in cases where that is necessary. Conducting the tests and submitting the results requires little time and, in many cases, will be automated once set up for a period of two years. Thus, setting up the testing makes up the bulk of the 45-hour estimate.


Second, NTCA’s estimated burden hours rely on false assumptions and exaggerated generalizations. For example, NTCA’s analysis assumes that all carriers required to conduct performance testing face the same circumstances and must spend the same amount of time setting up testing. See, e.g., NTCA Aug. Comments at 8 (discussing how NTCA members expect to clean and re-package certain equipment). However, there are many stakeholders receiving high-cost support that must conduct testing; some carriers will require more time, and others will require less. The 45-hour figure is intended to be an average across the entire pool of respondents, not just for particular NTCA members. NTCA also expects one aspect of the setup process to require one hour per test location, or up to 50 hours per state and service tier combination, totaling 100 hours for carriers with two service tiers in a state or 150 hours for carriers with three. See NTCA Aug. Comments at 6. However, NTCA ignores that the 45-hour estimate is for each respondent, which the supporting statement defines as a carrier testing in a particular state and service tier. Thus, a carrier in a state with two service tiers is counted as two respondents, spending an estimated 45 hours for each service tier (90 hours total) for that requirement. Finally, NTCA ignores that the initial setup costs for test locations will generally cover two years of testing. Test locations need to be changed during the two-year period only if a subscriber cancels service at a test location, and we do not expect all of a carrier’s subscribers at test locations to cancel service during a two-year period.
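To make the per-respondent accounting concrete, the sketch below applies the 45-hour estimate to a hypothetical carrier, treating each state and service tier combination as a separate respondent as discussed above.

    # Illustrative respondent accounting: one respondent per carrier/state/service-tier
    # combination, each at the estimated 45 hours. The example carrier is hypothetical.
    HOURS_PER_RESPONDENT = 45

    def estimated_hours(state_tier_combinations):
        respondents = len(state_tier_combinations)
        return respondents, respondents * HOURS_PER_RESPONDENT

    # A carrier with two service tiers in one state counts as two respondents (90 hours total).
    print(estimated_hours([("MD", "10 Mbps/1 Mbps"), ("MD", "25 Mbps/1 Mbps")]))  # (2, 90)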


NTCA also ignores other factors that will reduce the burden on carriers. NTCA states that there will be a substantial need for initial “bench testing” of equipment, i.e., reviewing the testing equipment or software prior to installation. NTCA’s 40- to 50-hour estimate for bench testing is substantially overstated because this equipment is already in use and has been thoroughly tested by other parties. Moreover, NTCA fails to take into account that there are numerous software solutions on the market that will work with existing end-user equipment and are built into most new end-user equipment. Although not all customers will have equipment that meets this need, many will, precluding the need for any new customer hardware or a truck roll (i.e., sending a technician to the customer’s premises). Our estimate took into account that some truck rolls would be necessary, but over time, as older end-user equipment is retired and replaced by new equipment, fewer truck rolls will be required. Further, even today, many testing-specific hardware solutions are easily installed by the customer, so a technician’s assistance will be needed in only a limited number of situations. If the carrier does install new end-user equipment solely for testing purposes, that equipment will in most cases remain in place for the two-year testing cycle, reducing the number of truck rolls that may be required. Although NTCA argues that customer consent will be needed for testing, this will only be the case where new equipment must be installed at the customer premises. If the testing can be done solely via software, no customer proprietary network information is involved in the required testing or reporting, other than information for which the carrier likely would already have obtained customer consent. Carriers routinely perform network testing of speed and latency without specifically notifying the customer, and the existence of such testing is often noted in the terms of use for subscribing to broadband service; the performance measures testing is of a similar nature.


Accordingly, the estimated 45 burden hours per respondent per year—or 90 hours over a two-year period—is reasonable for the requirement.


A 30-day notice was published in the Federal Register pursuant to 5 C.F.R. § 1320.10(a) on August 20, 2019. See 84 FR 43130. We received three comments in response to this notice. See Comments of AT&T (filed Sept. 19, 2019) (AT&T Comments); Comments of Hughes Network Systems, LLC (filed Sept. 18, 2019) (Hughes Comments); Comments of NTCA-The Rural Broadband Association (filed Sept. 19, 2019) (NTCA Sept. Comments).


As an initial matter, the Commission recently addressed many of the arguments made in the comments in the Performance Measures Reconsideration Order. For example, AT&T raises concerns about the list of locations designated as FCC-designated IXPs in the Performance Measures Order. AT&T Comments at 8-10. But the Commission has redefined the locations that qualify as FCC-designated IXPs in response to such industry concerns. See Performance Measures Reconsideration Order at 4-8, paras. 12-19. Likewise, in that order, the Commission revised the schedule for starting the testing that would be required under this collection (providing significantly more time for the vast majority of carriers), and this supporting statement reflects those changes. See id. at 30-35, paras. 76-91. In response to AT&T’s contention that once-per-minute latency tests were overly burdensome, the Commission explained that MBA testing requires 2,000 latency tests per hour, which has not been found to pose any technical or other difficulties; moreover, the disparity in testing frequency between speed and latency reflects the different types of testing necessary to determine whether carriers are meeting the required benchmarks. See id. at 12-14, paras. 31-38. AT&T’s estimated costs, totaling $1.65 million for the first year of testing, appear grossly exaggerated in any case. See AT&T Comments at 7. If AT&T cannot reprogram its own systems, it can use off-the-shelf equipment from a vendor. Even if AT&T’s numbers are correct, they must be considered in context: AT&T has accepted $248 million in CAF support to serve over 2.2 million locations.


NTCA’s comments filed in response to the 30-day notice reiterate many of its earlier statements, already addressed above. NTCA also claims its estimates are based on interviews with some of its members, but NTCA continues to speak in generalizations and does not provide specific evidence as part of its comments. See NTCA Sept. Comments at 3-4. Further, NTCA claims that the “Supporting Statement itself illuminates the reasonableness of NTCA estimates, which contemplated the separate testing required for each service tier” and that “[t]he Supporting Statement appears to echo the NTCA understanding in these regards.” As noted above, NTCA mistakenly implied that the requirement for a carrier to test separate locations in each service tier and state was a sort of multiplying factor for the burden hours associated with each respondent. However, the supporting statement treats a carrier in a particular state and service tier as one respondent. In other words, a carrier with two service tiers in two states is treated as four respondents, and the burden hour estimate is per respondent. NTCA’s concern that the burden hour estimates do not take carriers’ states and service tiers into account was thus misplaced. See NTCA Sept. Comments at 6 n.19.


Hughes also does not provide adequate evidence for its assertion that the hour burden associated with speed and latency testing should be 120 hours per six months, rather than 45 hours per response. See Hughes Comments at 2-3. In any case, the vast majority of our estimated 1,277 respondents did not dispute the figure, which is intended to be an average estimate per response, not the figure that one carrier expects for its own operations. Even for a single respondent, as a carrier installs the necessary hardware and/or software, the testing process will become increasingly automated and take substantially less than 45 hours to complete. For MOS testing, Hughes estimates that it will take 62 hours to conduct phone calls for testing 370 locations, but testing 370 locations is required only for carriers with over 3,500 active subscribers at CAF-supported locations. It is unclear whether either of the two estimated respondents will have over 3,500 subscribers at such locations in the near future, and carriers with 3,500 or fewer active subscribers at CAF-supported locations must test only 100 such locations. Finally, regarding Hughes’ arguments that MOS testing, as well as speed and latency testing, does not “minimize the burden of information collection on respondents,” Hughes has since indicated to the Commission that “it is willing and able to comply with the testing framework set out in the MOS Reconsideration Order,” so we believe those concerns are moot. See Hughes Comments at 3-4; Withdrawal of Petition for Clarification or, in the Alternative, Reconsideration, Hughes Network Systems, LLC, WC Docket No. 10-90 (filed Oct. 10, 2019).


    9. Payments or gifts to respondents. The Commission does not anticipate providing any payment or gifts to respondents.


    10. Assurances of confidentiality. The speed and latency test results for individual locations will remain confidential and will not be available to the public. Further, no customer privacy data is obtained.


    11. Questions of a sensitive nature. There are no questions of a sensitive nature with respect to the information collection requirements described herein.


    12. Estimates of the hour burden of the collection to respondents. The following represents the estimated hour burden of the information collection:


      a. Selection of locations with active subscribers (new requirement):


        i. Number of Respondents: Approximately 1,277 eligible telecommunications carriers. This factors in one respondent for each deployment obligation (i.e., carrier/state combination). In many instances, the same company will have an obligation in different states. For instance, Company A has an obligation in Maryland and an obligation in Virginia. While this is one company, it is factored in twice in the number of respondents.


        ii. Frequency of Response: Once every two years.


        iii. Total number of responses per respondent: 1 every two years (0.5 per year).


        iv. Estimated time per response: 16 hours.


        v. Total annual hour burden: 10,216.


16 hours per response every two years equals 8 hours per year for 1,277 respondents filing. Total annual hour burden is calculated as follows:


1,277 respondents x 1 response per respondent every two years = 1,277 responses x (16 hours / 2 years) = 10,216 total annual hours.


        vi. Total estimate of in-house cost to respondents: $408,640 (10,216 hours x $40/hr.).


        vii. Explanation of calculation: We estimate that each carrier will take, on average, 16 hours to gather and submit the data for its active subscribers every two years, which means 8 hours per year.


1,277 (number of responses) x 16 (hours to prepare report per response) / 2 years x $40/hr. = $408,640


      b. Speed and latency testing and submission of results (new requirement):


        i. Number of Respondents: Approximately 1,277 eligible telecommunications carriers. This factors in one respondent for each deployment obligation (i.e., carrier/state combination). In many instances, the same company will have an obligation in different states. For instance, Company A has an obligation in Maryland and an obligation in Virginia. While this is one company, it is factored in twice in the number of respondents.


        ii. Frequency of Response: Once annually, with quarterly reporting for carriers that are not fully compliant. The Commission will have an initial trial testing period in which carriers will be required to conduct testing and submit results within a short time, but there will be no penalties for noncompliance associated with a failure to meet the required standards.


        iii. Total number of responses per respondent: Approximately 2 (on average).


        iv. Estimated time per response: 45 hours.


        v. Total annual hour burden: 114,930.


45 hours per response for 1,277 carriers. Total annual hour burden is calculated as follows:


1,277 respondents x 2 responses per respondent on average = 2,554 responses x 45 hours = 114,930 total annual hours.


        vi. Total estimate of in-house cost to respondents: $4,597,200 (114,930 hours x $40/hr.).


        vii. Explanation of calculation: We estimate that each carrier will take, on average, 45 hours per response. This includes the time to install any necessary software or testing equipment at the customer premises, conduct the testing, gather the results, and submit the results in the HUBB portal annually or, if necessary, quarterly. Some carriers may require a company truck roll to physically install additional equipment at the customer premises, but many carriers will be able to set up testing software remotely, if they do not have such software already set up.

2,554 (number of responses) x 45 (hours to prepare response per year) x $40/hr. = $4,597,200


      c. Voice testing for high-latency service providers and submission of results (new requirement):


        i. Number of Respondents: Approximately 2 eligible telecommunications carriers providing high-latency service through the Connect America Fund.


        ii. Frequency of Response: Once annually, with quarterly reporting for carriers that are not fully compliant. As noted above, the Commission will have an initial trial testing period in which carriers will be required to conduct testing and submit results within a short time, but there will be no penalties for noncompliance associated with a failure to meet the required standards.


        iii. Total number of responses per respondent: 1.


        iv. Estimated time per response: 60 hours.


        v. Total annual hour burden: 120.


60 hours per response, once per year for 2 respondents. Total annual hour burden is calculated as follows:


2 respondents x 1 response per respondent = 2 responses x 60 hours = 120 total annual hours.


        vi. Total estimate of in-house cost to respondents: $4,800 (120 hours x $40/hr.).


        vii. Explanation of calculation: We estimate that each carrier will take 60 hours per response. This includes the time to conduct the testing twice per year, gather the results, and submit the results in the HUBB portal annually.


2 (number of responses) x 60 (hours to prepare the response) x $40/hr. = $4,800


The estimated respondents and responses and burden hours are listed below:




Information Collection Requirements | Number of Respondents | Number of Responses Per Year | Estimated Time per Response (hours) | Total Burden Hours | In-house Cost to Respondents

a. Selection of locations with active subscribers | 1,277 | 0.5 | 16 | 10,216 | $408,640

b. Speed and latency testing and submission of results | 1,277 | 2 | 45 | 114,930 | $4,597,200

c. Voice testing for high-latency service providers and submission of results | 2 | 1 | 60 | 120 | $4,800


Total Number of Respondents: 1,277 unique respondents, each responding to more than one requirement.


Total Number of Responses Annually: 3,195 (rounded up)

Total Annual Hour Burden for Requirements (a) – (c): 125,266


Total Annual In-House Costs to Respondents: $5,010,640
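For reference, the totals above can be reproduced from the per-requirement figures with the short worked computation below (illustrative only).

    # Worked computation of the annual totals from the per-requirement figures above.
    import math

    requirements = {
        # name: (respondents, responses per respondent per year, hours per response)
        "a. Selection of locations with active subscribers": (1277, 0.5, 16),
        "b. Speed and latency testing":                      (1277, 2,   45),
        "c. Voice testing for high-latency providers":       (2,    1,   60),
    }
    HOURLY_RATE = 40  # $40/hr., per the estimates above

    total_responses = total_hours = 0
    for respondents, responses_per_year, hours_per_response in requirements.values():
        responses = respondents * responses_per_year
        total_responses += responses
        total_hours += responses * hours_per_response

    print(math.ceil(total_responses))      # 3,195 total annual responses (rounded up)
    print(int(total_hours))                # 125,266 total annual burden hours
    print(int(total_hours * HOURLY_RATE))  # 5,010,640 total annual in-house cost in dollars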


    13. Estimates for the cost burden of the collection to respondents. Carriers may use outside contracting to assist with the testing or procure the testing equipment, although it is not required. For high-latency voice testing, an independent agency or organization must determine the MOS for the carrier; the expected costs for contracting such outside parties are built into the in-house costs for respondents noted above.


    14. Estimates of the cost burden to the Commission. There will be few, if any, costs to the Commission because ensuring proper use of universal service support is already part of Commission duties. Furthermore, no new systems or programs will be acquired or developed to process the information collection.


    15. Program changes or adjustments. The Commission is reporting program changes/increases to this new information collection. These increases to the total number of respondents (+1,277), total annual responses (+3,195), and total annual burden hours (+125,266) will be added to OMB’s Active Inventory.


    16. Collections of information whose results will be published. The Commission does not plan to make the individual testing information available to the public but plans to summarize that information and make the summary available.


    17. Display of expiration date for OMB approval of information collection. There is no paper form associated with this information collection; it is collected electronically through the portal described above. The Commission seeks approval to not display the expiration date for OMB approval of this information collection. The Commission will use an edition date in lieu of the OMB expiration date. This will prevent the Commission from having to repeatedly update the expiration date on the portal each time this collection is submitted to OMB for review and approval. The Commission publishes a list of all OMB-approved information collections in 47 C.F.R. § 0.408 of the Commission’s rules.


    18. Exceptions to certification for Paperwork Reduction Act Submissions. There are no exceptions to the certification statement.


  B. Collections of Information Employing Statistical Methods:


An explanation of the statistical methodology involved in this collection is provided in the separate Part B: Statistical Methodology document.



