Connect America Fund, 42052-42061 [2018-17338]
‘‘Final Rules’’ section below, and that
this amendment shall be effective 30
days after publication of this Report and
Order in the Federal Register.
31. It is further ordered that the
Commission’s Consumer &
Governmental Affairs Bureau, Reference
Information Center, shall send a copy of
this Report and Order to Congress and
the Government Accountability Office
pursuant to the Congressional Review
Act, see 5 U.S.C. 801(a)(1)(A).
List of Subjects in 47 CFR Parts 51 and
52
Communications common carriers,
Telecommunications, Telephone.
Federal Communications Commission.
Katura Jackson,
Federal Register Liaison Officer, Office of the
Secretary.
Final Rules
For the reasons discussed in the
preamble, the Federal Communications
Commission amends 47 CFR parts 51
and 52 as follows:
PART 51—INTERCONNECTION
■ 1. The authority citation for part 51 is revised to read as follows:

    Authority: 47 U.S.C. 151–55, 201–05, 207–09, 218, 225–27, 251–52, 271, 332 unless otherwise noted.

■ 2. Revise § 51.205 to read as follows:

§ 51.205  Dialing parity: General.

    A local exchange carrier (LEC) shall provide local dialing parity to competing providers of telephone exchange service, with no unreasonable dialing delays. Dialing parity shall be provided for originating telecommunications services that require dialing to route a call.

§ 51.209  [Removed]

■ 3. Remove § 51.209.

§ 51.213  [Removed]

■ 4. Remove § 51.213.

§ 51.215  [Removed]

■ 5. Remove § 51.215.
PART 52—NUMBERING
■ 6. The authority citation for part 52 is revised to read as follows:

    Authority: 47 U.S.C. 151–55, 201–05, 207–09, 218, 225–27, 251–54, 271, 303(r), 332, 1302.

■ 7. Amend § 52.26 by:
■ a. Revising paragraph (a);
■ b. Redesignating paragraphs (b)(1) through (3) as paragraphs (b)(2) through (4);
■ c. Adding a new paragraph (b)(1); and
■ d. Revising paragraph (c).
    The revisions and addition read as follows:

§ 52.26  NANC Recommendations on Local Number Portability Administration.

    (a) Local number portability administration shall comply with the recommendations of the North American Numbering Council (NANC) as set forth in the report to the Commission prepared by the NANC’s Local Number Portability Administration Selection Working Group, dated April 25, 1997 (Working Group Report) and its appendices, which are incorporated by reference pursuant to 5 U.S.C. 552(a) and 1 CFR part 51. Except that: Sections 7.8 and 7.10 of Appendix D and the following portions of Appendix E: Section 7, Issue Statement I of Appendix A, and Appendix B in the Working Group Report are not incorporated herein.
    (b) * * *
    (1) Each designated N–1 carrier (as described in the Working Group Report) is responsible for ensuring number portability queries are performed on a N–1 basis where ‘‘N’’ is the entity terminating the call to the end user, or a network provider contracted by the entity to provide tandem access, unless another carrier has already performed the query;
* * * * *
    (c) The Director of the Federal Register approves this incorporation by reference in accordance with 5 U.S.C. 552(a) and 1 CFR part 51. Copies of the Working Group Report and its appendices can be inspected during normal business hours at the following locations: FCC Reference Information Center, 445 12th Street SW, Room CY–A257, Washington, DC 20554 or at the National Archives and Records Administration (NARA). For information on the availability of this material at NARA, call (202) 741–6030, or go to: https://www.archives.gov/federal-register/cfr/ibr-locations.html. The Working Group Report and its appendices are also available on the internet at https://docs.fcc.gov/public/attachments/DOC-341177A1.pdf.

[FR Doc. 2018–17843 Filed 8–17–18; 8:45 am]
BILLING CODE 6712–01–P

FEDERAL COMMUNICATIONS COMMISSION

47 CFR Part 54

[WC Docket No. 10–90; DA 18–710]

Connect America Fund

AGENCY: Federal Communications Commission.
ACTION: Final action.
SUMMARY: In this document, the Wireline Competition Bureau (WCB), the Wireless Telecommunications Bureau (WTB) (jointly referred to herein as the Bureaus), and the Office of Engineering and Technology (OET) adopt requirements promoting greater accountability for certain recipients of Connect America Fund (CAF) high-cost universal service support, including price cap carriers, rate-of-return carriers, rural broadband experiment (RBE) support recipients, Alaska Plan carriers, and CAF Phase II auction winners. Specifically, the Bureaus and OET establish a uniform framework for measuring the speed and latency performance for recipients of high-cost universal service support to serve fixed locations.
DATES: This final action is effective September 19, 2018.
FOR FURTHER INFORMATION CONTACT: Suzanne Yelen, Wireline Competition Bureau, (202) 418–7400 or TTY: (202) 418–0484.
SUPPLEMENTARY INFORMATION: This is a summary of the Commission’s Order in WC Docket No. 10–90; DA 18–710, adopted on July 6, 2018 and released on July 6, 2018. The full text of this document is available for public inspection during regular business hours in the FCC Reference Center, Room CY–A257, 445 12th Street SW, Washington, DC 20554 or at the following internet address: https://docs.fcc.gov/public/attachments/DA-18-710A1.pdf.
I. Introduction
1. In the Order, the Bureaus and OET
adopt requirements promoting greater
accountability for certain recipients of
CAF high-cost universal service
support, including price cap carriers,
rate-of-return carriers, RBE support
recipients, Alaska Plan carriers, and
CAF Phase II auction winners.
Specifically, the Bureaus and OET
establish a uniform framework for
measuring the speed and latency
performance for recipients of high-cost
universal service support to serve fixed
locations.
2. The Bureaus and OET also require
providers to submit testing results as
part of their annual compliance
certification. Carriers that do not
comply with the Bureaus and OET’s
speed and latency requirements will be
subject to a reduction in support,
commensurate with their level of
noncompliance. In addition, providers
will be subject to audit of all testing
data. With this testing and compliance
framework, the Bureaus and OET aim to
maximize the benefits consumers reap
from its high-cost universal service
programs in even the hardest-to-reach
areas, thus making the best use of its
Universal Service Fund (USF) dollars
and further closing the digital divide.
II. Choice of Testing Method
3. The Bureaus and OET provide
high-cost support recipients that serve
fixed locations three options to afford
flexibility in choosing solutions to
conduct required performance testing.
Specifically, the Bureaus and OET
conclude that eligible
telecommunications carriers (ETCs)
subject to fixed broadband performance
obligations may conduct required
testing by employing either (1)
Measuring Broadband America (MBA)
testing infrastructure (MBA testing), (2)
existing network management systems
and tools (off-the-shelf testing), or (3)
provider-developed self-testing
configurations (provider-developed self-testing or self-testing). Providers may
employ any of these three options as
long as the provider’s implementation
meets the testing requirements
established in this Order. The Bureaus
and OET define the three options as
follows:
• First, a high-cost support recipient
may use MBA testing by arranging with
entities that manage and perform testing
for the MBA program to implement
performance testing, as required, for
CAF. The provider is responsible for all
costs required to implement testing of
its network, including any costs
associated with obtaining and
maintaining Whiteboxes, to the extent
that any additional Whiteboxes are
employed as part of the MBA testing.
The Bureaus and OET note that the
MBA testing must occur in areas and for
the locations supported by CAF, e.g., in
CAF Phase II eligible areas for price cap
carriers and for specific built-out
locations for RBE, Alternative Connect
America Cost Model (A–CAM), and
legacy rate-of-return support recipients.
• Second, a high-cost support
recipient may elect to use existing
network management systems and tools,
ping tests, and other commonly
available performance measurement and
network management tools—off-the-
shelf testing—to implement
performance testing.
• Third, a high-cost support recipient
may implement a provider-developed
self-testing configuration using software
installed on residential gateways or in
equipment attached to residential
gateways to regularly initiate speed and
latency tests. Providers that implement
self-testing of their own networks may
make network performance testing
services available to other providers.
The Bureaus and OET continue to
consider whether the Universal Service
Administrative Company (USAC) may
have a role in offering server capacity at
an internet Exchange Point in an FCC-designated metropolitan area (FCC-designated IXP), without any oversight
role in conducting tests, to mitigate
smaller providers’ costs.
4. By providing these three options,
the Bureaus and OET ensure that there
is a cost-effective method for conducting
testing for providers of different sizes
and technological sophistication. The
Bureaus and OET do not require that
providers invest in and implement new
internal systems; instead, providers may
perform speed and latency tests with
readily-available, off-the-shelf solutions
or existing MBA infrastructure. On the
other hand, some providers may prefer
implementing their own self-testing
systems, especially if such testing
features are already built into CPE for
the carrier’s own network management
purposes. These three options allow the
provider to align required performance
testing with their established network
management systems and operations,
making it as easy as possible for carriers
to implement the required testing while
establishing rigorous testing parameters
and standards, based on real-world data.
5. The Bureaus and OET recognize
that self-testing using provider-developed software may create
opportunities for ‘‘manipulation or
gaming’’ by CAF recipients. However,
the Bureaus and OET believe that the
testing and compliance requirements
they adopt will minimize the possibility
of such behavior. First, as explained in
more detail in the following, the
Bureaus and OET will be requiring
providers to submit and certify testing
data annually. Second, USAC will be
verifying provider compliance and
auditing performance testing results.
6. The Bureaus and OET reject Alaska
Communications’ proposal that high-cost support recipients may submit
radio frequency propagation maps in
lieu of conducting speed tests to
demonstrate compliance with speed
obligations. Such maps are only
illustrative of planned, ‘‘theoretical’’
coverage and do not provide actual data
on what consumers experience. The
Bureaus and OET therefore require
providers to conduct the required
testing using one of the three options
identified in this document.
III. General Testing Parameters
7. All ETCs subject to fixed broadband
performance obligations must conduct
the required speed and latency testing
using the parameters in this Order,
regardless of which of the three testing
options the carrier selects. The Bureaus
and OET first define ‘‘test’’ and the
associated span of measurement, in the
context of these performance
measurements. Next, the Bureaus and
OET adopt requirements regarding
when tests must begin and when exactly
carriers may perform the tests, and they
set the number of active subscriber
locations carriers must test, with
variations depending on the size of the
carrier. Finally, the Bureaus and OET
address how high-latency bidders in the
CAF Phase II auction must conduct
required voice testing.
8. To maintain a stringent
performance compliance regime while
avoiding unnecessary burdens on
smaller carriers, the Bureaus and OET
allow flexibility concerning the specific
testing approach so that carriers can
select, consistent with its adopted
framework, the best and most efficient
testing methods for their particular
circumstances. The Bureaus and OET
encourage the use of industry testing
standards, such as the TR–143 Standard,
for conducting self-testing.
9. For reasons similar to those
outlined in the CAF Phase II Price Cap
Service Obligation Order, 78 FR 70881,
November 27, 2013, the Bureaus and
OET require that high-cost support
recipients serving fixed locations
perform these tests over the
measurement span already applicable to
price cap carriers receiving CAF Phase
II model-based support. ETCs must test
speed and latency from the customer
premises of an active subscriber to a
remote test server located at or reached
by passing through an FCC-designated
IXP. Accordingly, a speed test is a single
measurement of download or upload
speed of 10 to 15 seconds duration
between a specific consumer location
and a specific remote server location.
Similarly, a latency test is a single
measurement of latency, often
performed using a single User Datagram
Protocol (UDP) packet or a group of
three internet Control Message Protocol
(ICMP) or UDP packets sent at
essentially the same time, as is common
with ping tests.
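    Purely as an illustration of the kind of discrete measurement described above, and not as anything prescribed by the Order, a single latency test resembling a ping could be sketched as follows. The server name is a hypothetical placeholder standing in for a remote test server located at, or reached through, an FCC-designated IXP.

    import subprocess

    def single_latency_test(server="test-server.example.net", packets=3):
        # One discrete latency test: a group of three ICMP echo requests,
        # as is common with ping tests (flags shown are for Linux/macOS ping).
        result = subprocess.run(["ping", "-c", str(packets), server],
                                capture_output=True, text=True, check=False)
        return result.stdout  # each reply's round-trip time is reported separately

    if __name__ == "__main__":
        print(single_latency_test())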
10. Large and small ETCs alike
commit to providing a certain level of
service when accepting high-cost
support to deploy broadband. ‘‘Testing
. . . on only a portion of the network
connecting a consumer to the internet
core will not show whether that
customer is able to enjoy high-quality
real-time applications because it is
network performance from the
customer’s location to the destination
that determines the quality of the
service from the customer’s
perspective.’’ Although the
measurement span the Bureaus and OET
adopt may include transport (e.g.,
backhaul or transit) that a provider does
not control, the carrier can influence the
quality of transport purchased and can
negotiate with the transport provider for
a level of service that will enable it to
meet the Commission’s performance
requirements. This is true for both price
cap carriers and smaller carriers. The
Bureaus and OET therefore disagree
with suggestions that testing should
only occur within a provider’s own
network because providers do not
always control the portion of the
network reaching the nearest FCC-designated IXP.
11. Previously, the Bureaus and OET
designated the following ten locations
as FCC-designated IXPs: New York City,
NY; Washington, DC; Atlanta, GA;
Miami, FL; Chicago, IL; Dallas-Fort
Worth, TX; Los Angeles, CA; San
Francisco, CA; Seattle, WA; and Denver,
CO. All of these areas, except Denver,
are locations used by the MBA program,
which selected these locations because
they are geographically distributed
major U.S. Internet peering locations.
Denver was added to the list so that all
contiguous areas in the United States
are within 700 miles of an FCC-designated IXP. Because the Bureaus
and OET are expanding testing to
additional CAF recipients, they add the
following six metropolitan areas as
additional FCC-designated IXPs: Salt
Lake City, UT; St. Paul, MN; Helena,
MT; Kansas City, MO; Phoenix, AZ; and
Boston, MA. This expanded list ensures
that most mainland U.S. locations are
within 300 air miles of an FCC-designated IXP, and all are within
approximately 500 air miles of one.
Further, the Bureaus and OET find that
there is no reason to limit testing to the
provider’s nearest IXP; rather, providers
can use any FCC-designated IXP for
testing purposes.
12. Still, the Bureaus and OET
recognize that non-contiguous providers
face unique challenges in providing
service outside the continental U.S. The
distance between a carrier and its
nearest IXP affects latency and may
affect speed as well. At this time, the
Bureaus and OET do not have sufficient
data to determine the extent of the effect
of distance on speed performance
testing. Therefore, similar to the existing
exception for non-contiguous price cap
carriers accepting model-based CAF
Phase II support, the Bureaus and OET
permit all providers serving non-contiguous areas greater than 500 air
miles from an FCC-designated IXP to
conduct all required latency and speed
testing between the customer premises
and the point at which traffic is
aggregated for transport to the
continental U.S. The Bureaus and OET
have identified a sufficient number of
IXPs so that no point in the continental
U.S. is more than approximately 500
miles from an FCC-designated IXP.
Therefore, allowing non-contiguous
providers located more than 500 miles
from an FCC-designated IXP to test to
the point in the non-contiguous area
where traffic is aggregated for transport
to the mainland will prevent these
providers from being unfairly penalized
for failing to meet their performance
obligations solely because of the
location of the areas being served.
However, as the Commission gains
additional MBA and other data on speed
and latency from non-contiguous areas,
the Bureaus and OET may revisit this
conclusion.
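    The Order does not prescribe how a provider should measure the 500-air-mile distance; as a rough sketch only, great-circle (''air mile'') distance to the nearest FCC-designated IXP could be estimated with the haversine formula. The coordinates below are approximate and purely illustrative.

    from math import radians, sin, cos, asin, sqrt

    def air_miles(lat1, lon1, lat2, lon2):
        # Great-circle distance in statute miles (haversine; Earth radius ~3,958.8 mi).
        lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
        a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
        return 2 * 3958.8 * asin(sqrt(a))

    # Approximate coordinates for two FCC-designated IXP metro areas (illustrative only).
    ixps = {"Denver, CO": (39.74, -104.99), "Salt Lake City, UT": (40.76, -111.89)}
    subscriber_area = (43.61, -116.20)  # hypothetical service area near Boise, ID
    print(min(air_miles(*subscriber_area, *xy) for xy in ixps.values()))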
13. First, the Bureaus and OET
establish the specific test intervals
within the daily test period. For latency,
the Bureaus and OET require a
minimum of one discrete test per
minute, i.e., 60 tests per hour, for each
of the testing hours, at each subscriber
test location, with the results of each
discrete test recorded separately. The
Bureaus and OET note that intensive
consumer use of the network (such as
streaming video) during testing, referred
to as cross-talk, can influence both
consumer service and testing results.
The data usage load for latency testing
is minimal; sending 60 UDP packets of
64 bytes each in one hour is
approximately 4,000 bytes in total.
However, to prevent cross-talk from
negatively affecting both the consumer
experience and test results, the Bureaus
and OET adopt consumer load
thresholds—i.e., cross-talk thresholds—
similar to those used by the MBA
program. Accordingly, for latency
testing, if the consumer load exceeds 64
Kbps downstream, the provider may
cancel the test and reevaluate whether
the consumer load exceeds 64 Kbps
downstream before retrying the test in
the next minute. Providers who elect to
do more than the minimum required
number of latency tests at subscriber test
locations must include the results from
all tests performed during testing
periods in their compliance
calculations.
14. For speed, the Bureaus and OET
require a minimum of one download
test and one upload test per testing hour
at each subscriber test location. The
Bureaus and OET note that speed testing
has greater network impact than latency
testing. For speed testing, the Bureaus
and OET require providers to start
separate download and upload speed
tests at the beginning of each test hour
window. As with latency, the Bureaus
and OET adopt cross-talk thresholds
similar to those used in the MBA
program. If the consumer load is greater
than 64 Kbps downstream for download
tests or 32 Kbps upstream for upload
tests, the provider may defer the
affected download or upload test for one
minute and reevaluate whether the
consumer load exceeds the relevant 64
Kbps or 32 Kbps threshold before
retrying the test. This load check-and-retry must continue at one-minute
intervals until the speed test can be run
or the one-hour test window ends and
the test for that hour is canceled. Also
as with latency, providers who elect to
do more than the minimum required
number of speed tests at subscriber test
locations must include the results from
all tests performed during testing
periods for compliance calculations.
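    A minimal sketch of the check-and-retry logic described above, assuming hypothetical provider-supplied functions for measuring consumer load and running a test (neither function is defined by the Order):

    import time

    DOWNLOAD_LOAD_LIMIT_KBPS = 64  # cross-talk threshold for download (and latency) tests
    UPLOAD_LOAD_LIMIT_KBPS = 32    # cross-talk threshold for upload tests

    def run_test_with_retry(measure_load_kbps, run_test, load_limit_kbps, window_minutes=60):
        # Try the test at the start of the hour; if consumer load exceeds the
        # threshold, reevaluate each minute until the test runs or the one-hour
        # window ends, in which case the test for that hour is canceled.
        for _ in range(window_minutes):
            if measure_load_kbps() <= load_limit_kbps:
                return run_test()
            time.sleep(60)
        return None  # window ended without a test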
15. Second, to capture any seasonal
effects on a carrier’s broadband
performance, the Bureaus and OET
require that carriers subject to the
latency and speed testing requirements
conduct one week of testing in each
quarter of the calendar year.
Specifically, carriers must conduct one
week of testing in each of the following
quarters: January through March, April
through June, July through September,
and October through December. By
requiring measurements quarterly,
rather than in four consecutive weeks,
the Bureaus and OET expect test results
to reflect a carrier’s performance
throughout the year, including during
times of the year in which there is a
seasonal increase or decrease in network
usage. Although previously WCB
required price cap carriers receiving
CAF Phase II support to test latency for
two weeks each quarter, the Bureaus
and OET find that requiring testing one
week each quarter strikes a better
balance of accounting for seasonal
changes in broadband usage and
minimizing the burden on consumers
who may participate in testing.
16. Third, in establishing the daily
testing period, the Bureaus and OET
slightly expand the test period and
require that carriers conduct tests
between 6:00 p.m. and 12:00 a.m.
(testing hours), including on weekends.
The Bureaus and OET continue to find
that MBA data supports its conclusion
that there is a peak period of internet
usage every evening. However, the
Bureaus and OET intend to revisit this
requirement periodically to determine
whether peak internet usage times have
changed substantially.
17. The Bureaus and OET conclude
that requiring measurements over an
expanded period, by including one hour
before the peak period and one hour
after, will best ensure that carriers meet
the speed and latency obligations
associated with the high-cost support
they receive. MBA data shows that
broadband internet access service
providers that perform well during the
peak period tend to perform well
consistently throughout the day.
Further, the Bureaus and OET’s required schedule of testing is consistent with
the specific, realistic standards they set
forth which were developed using MBA
peak-period data. Thus, the Bureaus and
OET will be judging testing hours data
based on a standard developed using
MBA data from the same time period.
18. Additionally, the Bureaus and
OET disagree with assertions that
requiring speed testing during the peak
period will introduce problematic
network congestion over the provider’s
core network. Based on MBA speed test
data, a download service speed test for
10 Mbps requires approximately 624
MB combined downloaded data for 50
locations per hour. This is less traffic
than what would be generated by
streaming a little less than one-half of a
high-definition movie. A download
service speed test for 25 Mbps requires
approximately 1,841 MB combined
downloaded data for 50 locations,
which is about the same amount of
traffic as a little less than two high-definition movies. The small amount of
data should have no noticeable effect on
network congestion. Upload test data usage is even lower. Based upon MBA
speed test data, a one-hour upload
service speed test for 1 Mbps and 3
Mbps for 50 locations will be
approximately 57 MB and 120 MB,
respectively. This testing will use
bandwidth equivalent to uploading 12
photos to a social media website at 1
Mbps or 24 photos at 3 Mbps. To the
extent that a carrier is concerned about
possible impacts on the consumer
experience, the Bureaus and OET permit
carriers the flexibility to choose whether
to stagger their tests, so long as they do
not violate any other testing
requirements, as they explain in their
discussion of the testing intervals in the
following.
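    As a rough cross-check of the volumes cited above, assuming a 10-second test (the low end of the 10-to-15 second duration defined earlier), the arithmetic works out as follows; the sketch is illustrative only.

    # Hourly download-test volume for 50 locations at 10 Mbps, assuming 10-second tests.
    speed_mbps, test_seconds, locations = 10, 10, 50
    mb_per_test = speed_mbps * test_seconds / 8   # megabits to megabytes
    print(mb_per_test * locations)                # about 625 MB, in line with the ~624 MB cited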
19. Fourth, testing for all locations in
a single speed tier in a single state must
be done during the same week. If a
provider has more than one speed tier
in a state, testing for each speed tier can
be conducted during different weeks
within the quarter. For a provider
serving multiple states, testing of each
service tier does not need to be done
during the same week, i.e., a provider
may test its 10/1 Mbps customers in
New York one week and in
Pennsylvania during a different week.
The Bureaus and OET will generally
consider requests for waiver or
extension in cases where a major,
disruptive event (e.g., a hurricane)
negatively affects a provider’s
broadband performance. However, prior
to requesting a waiver, providers should
determine whether rescheduling testing
within the 3-month test window will be
sufficient to handle the disruptive
event.
20. The Bureaus and OET require that
carriers test up to 50 locations per CAF-required service tier offering per state,
depending on the number of subscribers
a carrier has in a state. The subscribers
eligible for testing must be at locations
that are reported in the HUBB where
there is an active subscriber. The
Bureaus and OET decline to adopt a
simple percentage-based alternative but,
instead, adopt the following scaled
requirements for each state and service
tier combination for a carrier:
REQUIRED TEST LOCATIONS FOR SPEED

Number of subscribers at CAF-supported locations
per state and service tier combination               Number of test locations
50 or fewer .......................................  5
51–500 ............................................  10% of total subscribers
Over 500 ..........................................  50

    The Bureaus and OET recognize that it is possible that a carrier serving 50 or fewer subscribers in a state and particular service tier cannot find the required number of five active subscribers for testing purposes. To the extent necessary, the Bureaus and OET permit such carriers to test existing, non-CAF-supported active subscriber locations within the same state and service tier to satisfy the requirement of testing five active subscriber locations. Carriers may voluntarily test the speed and/or latency of additional randomly selected CAF-supported subscribers over the minimum number of required test locations as part of their quarterly testing. However, data for all tested locations must be submitted for inclusion in the compliance calculations, i.e., carriers must identify the set of testing locations at the beginning of the testing and cannot exclude some locations during or after the testing.
    21. Carriers must test an adequate number of subscriber locations to provide a clear picture of the carrier’s performance and its customers’ broadband experience across a state. The Bureaus and OET find that 50 test locations, per speed tier per state, remains a good indicator as to whether providers are fulfilling their obligations. A sample size of 50 test locations out of 2,500 or more subscribers provides a picture of carriers’ performance with a ±11.5 percent margin of error and 90 percent confidence level. Testing 50 locations out of more than 500 subscribers yields a comparable picture of carriers’ performance. The Bureaus and OET acknowledge, however, that smaller carriers may find testing 50 locations burdensome. Below 2,500 CAF-supported subscribers, greater percentages of subscribers are necessary
to achieve the same margin of error and
confidence level, but below 500
subscribers the necessary percentage
rises quickly above 10 percent. Carriers
serving fewer subscribers would thus be
unable to provide test results achieving
the same margin of error and confidence
level without testing a more
proportionately burdensome percentage
of their subscribers.
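    The Order does not show the underlying calculation; one conventional way to arrive at roughly a ±11.5 percent margin of error at a 90 percent confidence level is the standard proportion formula with a finite-population correction, assuming a worst-case proportion of 0.5. These assumptions are illustrative and are not drawn from the Order.

    from math import sqrt

    z, n, N, p = 1.645, 50, 2500, 0.5   # 90% confidence, sample, population, assumed proportion
    fpc = sqrt((N - n) / (N - 1))       # finite-population correction
    moe = z * sqrt(p * (1 - p) / n) * fpc
    print(round(moe * 100, 1))          # about 11.5 (percent)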
22. The Bureaus and OET also now
find it preferable to use the number of
subscribers in a state and service tier,
rather than the number of lines for
which a provider is receiving support,
to determine the required number of test
locations. A carrier receiving support for
2,000 lines serving 100 subscribers
would find it much more difficult to test
50 active subscriber locations, compared
to a carrier receiving support for 2,000
lines but serving 1,500 subscribers, and
commenters have noted that providers
may find it difficult to find a sufficient
number of locations if they have
relatively few subscribers. Basing the
number of locations to be tested on the
number of subscribers, rather than the
number of lines, addresses this concern.
23. The Bureaus and OET therefore
require testing a specific number of
subscribers for carriers serving more
than 500 subscribers in a single service
tier and state, but require carriers
serving between 51 and 500 subscribers
in a single service tier and state to test
a fixed percentage of subscribers. For
carriers serving 50 or fewer subscribers
in a state and service tier, a percentage-based alternative may be insufficient; in
an extreme situation, data from a single
subscriber cannot clearly demonstrate a
carrier’s speed and latency performance.
Accordingly, the Bureaus and OET
require those providers to test a specific
number of active subscriber locations.
The Bureaus and OET conclude that this
scaled approach balances the need to
test a reasonable number of subscriber
locations within a state based on the
total number of subscribers and
performance tiers with minimizing the
burden on smaller providers to find
consumer locations to be tested. The
Bureaus and OET note, also, that a
carrier receiving different types of CAF
funding in the same state should
aggregate its customers in each speed
tier for purposes of testing. The
following examples illustrate how this
scaled approach should be
implemented:
• A carrier with 2,300 customers
subscribed to a single service tier of
10/1 Mbps in one state must test 50
locations in that state, while a carrier
providing solely 25/3 Mbps service to
over 2,500 subscribers in each of three
states must test 50 locations in each
state.
• A carrier providing 10/1 Mbps
service and 25/3 Mbps service to 100
subscribers each in a single state must
test 10 locations for each of the two
service tiers—20 locations in total.
• A carrier providing solely 10/1
Mbps service to 30 subscribers must test
five locations, and if that carrier is only
able to test three CAF-supported
locations, that carrier must test two non-CAF-supported locations receiving 10/1
Mbps service in the same state.
• A carrier with 2,000 customers
subscribed to 10/1 Mbps in one state
through CAF Phase II funding and 500
RBE customers subscribed to 10/1 Mbps
in the same state, and no other high-cost
support with deployment obligations,
must test a total of 50 locations in that
state for the 10/1 Mbps service tier.
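    The scaled requirements in the table and the examples above can be restated as a small helper. This is an illustrative sketch only; rounding the 10 percent figure up to a whole location is an assumption rather than something the Order specifies.

    import math

    def required_test_locations(subscribers):
        # Per state and service tier: 50 or fewer -> 5; 51-500 -> 10% of
        # total subscribers; over 500 -> 50.
        if subscribers <= 50:
            return 5
        if subscribers <= 500:
            return math.ceil(subscribers * 0.10)  # rounding up is assumed here
        return 50

    # Matches the examples in the text: 2,300 -> 50; 100 -> 10; 30 -> 5.
    print(required_test_locations(2300), required_test_locations(100), required_test_locations(30))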
24. Test subjects must be randomly
selected every two years from among the
provider’s active subscribers in each
service tier in each state. Subscribers for
latency testing may be randomly
selected from those subscribers being
tested for speed at all speed tiers or
randomly selected from all CAF-supported subscribers, every two years.
Any sample location lacking an active
subscriber 12 months after that location
was selected must be replaced by an
actively subscribed location, randomly
selected. Random selection will ensure
that providers cannot pick and choose
amongst subscribers so that only those
subscribers likely to have the best
performance (e.g., those closest to a
central office) are tested. Carriers may
use inducements to encourage
subscribers to participate in testing.
This may be particularly useful in cases
where support is tied to a particular
performance level for the network but
the provider does not have enough subscribers to the higher-performance service to meet the required testing sample sizes. However, to ensure that
the selection remains random, carriers
must offer the same inducement to all
randomly-selected subscribers in the
areas for which participating subscribers
are required for the carrier to conduct
testing. WCB will provide further
guidance regarding random selection by
public notice.
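    WCB guidance, not the sketch below, will govern how random selection must actually be performed; the following is only one illustration of an unbiased draw from the active subscribers in a state and service tier, with hypothetical location identifiers.

    import random

    def select_test_locations(active_subscriber_ids, sample_size, seed=None):
        # Uniform random sample without replacement; a seed could be retained for auditability.
        rng = random.Random(seed)
        return rng.sample(active_subscriber_ids, min(sample_size, len(active_subscriber_ids)))

    # Example with hypothetical location identifiers.
    print(select_test_locations([f"LOC-{i:04d}" for i in range(100)], 10, seed=2018))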
25. The Bureaus and OET reiterate the
Commission’s requirement that high-latency providers subject to testing must
demonstrate a Mean Opinion Score
(MOS) of four or higher. The Bureaus
and OET agree with ADTRAN, Inc.
(ADTRAN) that listening-opinion tests
would not suffice to demonstrate a high-quality consumer voice experience.
Latency only minimally affects
participants’ experiences and
evaluations in listening-opinion tests,
which involve passive listening to audio
samples. However, in the USF/ICC
Transformation Order, 76 FR 73830,
November 29, 2011, the Commission
required ‘‘ETCs to offer sufficiently low
latency to enable use of real-time
applications, such as VoIP.’’ Unlike a
listening-opinion test, in a conversation-opinion test, two participants actively
participate in a conversation. The back-and-forth of conversations highlights
delay, echo, and other issues caused by
latency in a way that one-way, passive
listening cannot. Therefore, the Bureaus
and OET require that high-latency
providers conduct an ITU–T
Recommendation P.800 conversational-opinion test.
26. Specifically, the Bureaus and OET
require the use of the underlying
conversational-opinion test
requirements specified by the ITU–T
Recommendation P.800, with testing
conditions as described in the
following. The Bureaus and OET believe
that MOS testing under these conditions
will ensure that the test results reflect
the consumer experience as accurately
as possible. First, high-latency providers
must use operational network
infrastructure, such as actual satellite
links, for conducting MOS testing, not
laboratory-based simulations intended
to reproduce service conditions.
Second, the tests must be implemented
using equipment, systems, and
processes that are used in provisioning
service to locations funded by high-cost
universal service support. Third, live
interviews and surveys must be
conducted by an independent agency or
organization (Reviewer) to determine
the MOS. Survey forms, mail-in
documentation, automated phone calls,
or other non-interactive and non-person-to-person interviews are not
permitted. Any organization or
laboratory with experience testing
services for compliance with
telecommunications industry-specified
standards and, preferably, MOS testing
experience, may be a Reviewer. Fourth,
testing must be conducted over a ‘‘single
hop’’ satellite connection with at least
one endpoint at an active subscriber
location using the subscriber’s end-user
equipment. Finally, the second
endpoint may be a centralized location
from which the Reviewer conducts live
interviews with the subscriber to
determine the subscriber’s MOS
evaluation.
27. To reduce the burden of the MOS
testing for high-latency bidders while
still ensuring high-quality voice service,
the Bureaus and OET adopt a separate
scaled table for the number of locations
that are subject to MOS testing.
Specifically, the Bureaus and OET will
determine the number of testing
locations based upon the number of
subscribers nationally for which CAF-supported service is provided. The
Bureaus and OET recognize that the
satellite infrastructures employed by
many high-latency bidders have
characteristics different from terrestrial
networks that make testing of satellite
service on a national, rather than state,
basis appropriate. That is, middle-mile/
backhaul for satellite networks are the
direct links from the consumer locations
to the satellite and then from the
satellite to selected downlink sites, so
there is unlikely to be significant
variability based on the state in which
the subscriber is located. The consumers
must be randomly selected from the
total CAF-supported subscriber base in
all applicable states to ensure that
different types of geographic locations
are tested.
REQUIRED TEST LOCATIONS FOR MOS TESTING

Number of subscribers at CAF-supported
locations nationally                        Number of MOS test locations
3500 or fewer ............................  100
Over 3500 ................................  370

This scaled, nationwide testing
requirement will reduce high-latency
bidders’ testing burden while ensuring a
sufficient testing sample to verify
compliance with voice performance
requirements.
IV. Compliance Framework
28. The Bureaus and OET extend the
existing standard for full compliance
with high-cost support recipients’
latency obligations and adopt a standard
for full compliance with speed
obligations. The Bureaus and OET also
establish a compliance framework
outlining specific actions for various
degrees of compliance that fall short of
those standards.
29. The Bureaus and OET reaffirm the
existing low-latency and high-latency
standards and establish a speed
standard for full compliance. The data
on round-trip latency in the United
States has not markedly changed since
the 2013 CAF Phase II Price Cap Service
Obligation Order, and no party has
challenged the Commission’s reasoning
for the existing 100 ms latency standard.
Accordingly, the Bureaus and OET
conclude that all high-cost support
recipients serving fixed locations,
except those carriers submitting high-latency bids in the CAF Phase II
auction, must certify that 95 percent or
more of all testing hours measurements
of network round-trip latency are at or
below 100 ms. High-latency bidders
must certify that 95 percent or more of
all testing hours measurements are at or
below 750 ms. Providers must record
the observed latency for all latency test
measurements, including all lost packet
tests. Thus, providers may not discard
lost-packet tests from their test results;
these tests count as discrete tests not
meeting the standard.
30. For speed, the Bureaus and OET
require that 80 percent of download and
upload measurements be at or above 80
percent of the CAF-required speed tier
(i.e., an 80/80 standard). For example, if
a carrier receives high-cost support for
10/1 Mbps service, 80 percent of the
download speed measurements must be
at or above 8 Mbps, while 80 percent of
the upload speed measurements must be
at or above 0.8 Mbps. The Bureaus and
OET require carriers to meet and test to
their CAF obligation speed(s) regardless
of whether their subscribers purchase
internet service offerings with
advertised speeds matching the CAF-required speeds at CAF-eligible
locations. Thus, carriers that have
deployed a network with the requisite
speeds must include all subscribers at
that level in their testing, but may still
find it necessary to upgrade individual
subscriber locations, at least
temporarily, to conduct speed testing.
For example, a carrier may be required
to deploy and offer 100/20 Mbps
service, but only 5 of its 550 subscribers
at CAF-supported locations take 100/20
Mbps service, with the remainder taking
20/20 Mbps service. To satisfy its testing
obligations, the carrier would be
required to (1) test all 5 of the 100/20
Mbps subscribers and (2) randomly
select 45 of its other CAF-supported
subscribers, raise those subscribers’
speed to 100/20 Mbps, at least
temporarily, and test those 45
subscribers.
31. The Bureaus and OET believe that
this standard best meets its statutory
requirement to ensure that high-cost-supported broadband deployments provide service reasonably comparable to that available in urban areas. The
most recent MBA report cites the 80/80
standard as a ‘‘key measure’’ of network
consistency. MBA data show that all
fixed terrestrial broadband technologies
that are included in the MBA program
can meet this standard. The Bureaus
and OET are confident that high-cost
support recipients’ newer fixed
broadband deployments will benefit
from more up-to-date technologies and
network designs that should provide
even better performance.
32. Further, the Bureaus and OET
expect that a realistic 80/80 standard
will provide a ‘‘cushion’’ to address
certain testing issues. The Bureaus and
OET noted in this document that some
commenters expressed concern that
they would be responsible for testing to
an IXP even though that involved the
use of backhaul that a provider may not
control. The Bureaus and OET believe
that the 80/80 standard allows sufficient
leeway to providers so that they will
meet performance standards as long as
they have reasonable backhaul
arrangements. In addition, commenters
have raised a concern that speed testing
could possibly show misleadingly low
results if the subscriber being tested is
using the connection at the time of the
testing. However, the testing
methodology addresses this concern. As
with the MBA, the Bureaus and OET
allow rescheduling of testing in
instances where the customer usage
exceeds MBA cross-talk thresholds.
Thus, the Bureaus and OET do not
anticipate that customer cross-talk will
affect CAF performance data any more
(or less) than the MBA program data on
which its standard is based. Customer
usage should not prevent carriers with
appropriately constructed networks
from meeting its requirements.
33. The Bureaus and OET find that a speed standard similar to the one they have adopted for latency, as proposed by ADTRAN, is not an appropriate measure of broadband speed performance. Staff analysis has found
CAF-supported service that is
comparable to that in urban areas. The
2016 MBA Report stated that
‘‘[c]onsistency of speed may be more
important to customers who are heavy
users of applications that are both high
bandwidth and sensitive to short
duration declines in actual speed, such
as streaming video.’’ A speed standard
relying on an average or median value
would not ensure consistency of speed
because the distribution of values
around the median may vary
significantly. A carrier could meet such
a standard by ensuring that the average
or median speed test meets a target
speed, while not providing sufficiently
fast service nearly half the time or to
nearly half its subscribers in locations
supported by universal service. The
Bureaus and OET therefore conclude
that the 80/80 standard they adopt
herein is a better measure of
comparability and high-quality service.
34. Finally, the Bureaus and OET
recognize that, because of technical
limitations, it is currently unrealistic to
expect that providers obligated to
provide gigabit service, i.e., speeds of
1,000 Mbps, achieve actual speeds of
1,000 Mbps download at the customer
premises. Typical customer premises
equipment, including equipment for
gigabit subscribers, permits a maximum
throughput of 1 Gbps, and the overhead
associated with gigabit internet traffic
(whether in urban or rural areas) can
reach up to 60 Mbps out of the
theoretical 1 Gbps. Customer premises
equipment with higher maximum
throughput are generally more costly
and not readily available. Thus, even if
a gigabit provider were to
‘‘overprovision’’ its gigabit service, the
subscriber would not experience speeds
of 1,000 Mbps. The Bureaus and OET do
not want to discourage carriers from
bidding in the upcoming CAF auction to
provide 1 Gbps service by requiring
unachievable service levels. The
Bureaus and OET note that the 80/80
standard they adopt requires gigabit
carriers to demonstrate that 80 percent
of their testing hours download speed
tests are at or above 80 percent of 1,000
Mbps, i.e., 800 Mbps. This standard
should not pose a barrier to carriers
bidding to provide 1 Gbps service.
35. Consistent with the Commission’s
universal service goals, the Bureaus and
OET adopt a compliance framework that
encourages ETCs to comply fully with
their performance obligations and
includes the potential for USAC to audit
test results. The Bureaus and OET
establish a four-level framework that
sets forth particular obligations and
automatic triggers based on an ETC’s
degree of compliance with its latency,
speed, and, if applicable, MOS testing
standards in each state and high-cost
support program. The Bureaus and OET
will determine a carrier’s compliance
for each standard separately. In each
case, the Bureaus and OET will divide
the percentage of its measurements
meeting the relevant standard by the
required percentage of measurements to
be in full compliance.
36. In other words, for latency, in
each state in which the carrier has CAF-supported locations, the Bureaus and
OET will calculate the percentage of
compliance using the 95-percent
standard, so they will divide the
percentage of the carrier’s testing hours’
latency measurements at or below the
required latency (i.e., 100 ms or 750 ms)
by 95. As an example, if a low-latency
provider observes that 90 percent of all
its testing hours measurements are at or
below 100 ms, then that provider’s
latency compliance percentage would
be 90/95 = 94.7 percent in that state. For
speed, for each speed tier and state the
Bureaus and OET will calculate the
percentage of compliance relative to the
80-percent-based standard, so they will
divide the percentage of the carrier’s
testing hours speed measurements at or
above 80 percent of the target speed by
80. Thus, if a provider observes that 65
percent of its testing hours speed
measurements meet 80 percent of the
required speed, the provider’s
compliance percentage would be 65/80
= 81.25 percent for the relevant speed
tier in that state. Carriers must include
and submit the results from all tests and
cannot exclude any tests conducted
beyond the minimum numbers of tests,
as outlined in this Order, for the
calculation of latency and speed
compliance percentages.
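    Restating the calculations above as a sketch (the measurement lists are hypothetical inputs; the 95 percent and 80/80 standards are those adopted in the Order):

    def latency_compliance_pct(latencies_ms, limit_ms=100.0):
        # Share of testing-hours measurements at or below the limit, divided by
        # the 95 percent full-compliance standard.
        meeting = 100.0 * sum(m <= limit_ms for m in latencies_ms) / len(latencies_ms)
        return 100.0 * meeting / 95.0

    def speed_compliance_pct(speeds_mbps, required_mbps):
        # Share of measurements at or above 80 percent of the required speed,
        # divided by the 80 percent full-compliance standard (the 80/80 rule).
        meeting = 100.0 * sum(m >= 0.8 * required_mbps for m in speeds_mbps) / len(speeds_mbps)
        return 100.0 * meeting / 80.0

    # E.g., 90 percent of latency tests at or below 100 ms -> 90/95 = about 94.7 percent;
    # 65 percent of speed tests meeting 80 percent of the target -> 65/80 = 81.25 percent.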
37. For MOS testing, the high-latency
bidder must demonstrate a MOS of 4 or
higher, so a high-latency bidder would
calculate its percentage of compliance
relative to 4. Thus, a provider
demonstrating a MOS of 3 would have
a compliance percentage of 3⁄4 = 75
percent. For a high-latency bidder
conducting MOS testing across its entire
network, rather than state-by-state, the
Bureaus and OET will calculate the
same MOS compliance percentage for
each state that it serves with CAF Phase
II support.
38. To avoid penalizing a provider for
failing to meet multiple standards for
the same locations, the Bureaus and
OET adopt a streamlined compliance
framework in which the lowest of a
carrier’s separate latency, speed, and, if
applicable, MOS compliance
percentages (including percentages for
each speed tier) determines its
obligations. All carriers not fully
compliant in a particular state must
submit quarterly reports providing one
week of testing hours test results,
subject to the same requirements the
Bureaus and OET establish in this
Order, and describing steps taken to
resolve the compliance gap, and USAC
will withhold a percentage of a non-compliant carrier’s monthly support.
Whenever a carrier in Levels 1 through
3 comes into a higher level of
compliance, that level’s requirements
will apply, and USAC will return the
withheld support up to an amount
reflecting the difference between the
levels’ required withholding but not
including any support withheld by
USAC for more than 12 months.
39. The Bureaus and OET define
Level 1 compliance to include carriers
with compliance percentages at or above
85 but below 100 percent, and they
direct USAC to withhold 5 percent of a
Level 1-compliant carrier’s monthly
support. Level 2 compliance includes
carriers with compliance percentages at
or above 70 but below 85 percent, and
the Bureaus and OET direct USAC to
withhold 10 percent of a Level
2-compliant carrier’s monthly support.
Level 3 compliance includes carriers
with compliance percentages at or above
55 but below 70 percent, and the
Bureaus and OET direct USAC to
withhold 15 percent of a Level
3-compliant carrier’s monthly support.
Level 4 compliance includes carriers
with compliance percentages below 55
percent, and the Bureaus and OET
direct USAC to withhold 25 percent of
a Level 4-compliant carrier’s monthly
support. The Bureaus and OET will also
refer Level 4-compliant carriers to
USAC for an investigation into the
extent to which the carrier has actually
deployed broadband in accordance with
its deployment obligations. The
following table provides a summary of
the compliance framework, where x is
the carrier’s compliance percentage:
COMPLIANCE LEVELS AND SUPPORT REDUCTIONS

                      Qualifying compliance     Required quarterly     Monthly support
                      percentage x              reporting              withheld (percent)
Full Compliance ....  x ≥ 100% ..............   No ..................  N/A
Level 1 ............  85% ≤ x < 100% ........   Yes .................  5
Level 2 ............  70% ≤ x < 85% .........   Yes .................  10
Level 3 ............  55% ≤ x < 70% .........   Yes .................  15
Level 4 ............  x < 55% ...............   Yes .................  25

    40. Similar to commenters’ proposals, the framework the Bureaus and OET adopt resembles the non-compliance framework for interim deployment milestones in section 54.320(d) of the Commission’s rules. The Bureaus and OET emphasize that the goal of this compliance framework is to provide incentives, rather than penalize. Balancing commenters’ concerns regarding the severity or leniency of such a framework, the Bureaus and OET conclude that its framework appropriately encourages carriers to come into full compliance and offer, in areas requiring high-cost support, broadband service meeting standards consistent with what consumers typically experience.
    41. Finally, the Bureaus and OET provide one exception to this non-compliance framework. As discussed in
this document, carriers that serve 50 or
fewer subscribers in a state and
particular service tier but cannot find
five active subscribers for conducting
the required testing may test non-CAF-supported active subscriber locations to the extent necessary. Because those carriers’ test results would not solely reflect the performance of CAF-supported locations, any such carriers not fully complying with the Bureaus and OET’s latency and speed standards
will be referred to USAC for further
investigation of the level of performance
at the CAF-supported locations.
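    A hypothetical helper mirroring the compliance table above (the Order prescribes no code; the ''N/A'' withholding for full compliance is represented here as zero):

    def compliance_level(x):
        # Map a compliance percentage x to (level, percent of monthly support withheld).
        if x >= 100:
            return "Full Compliance", 0   # table shows N/A withholding
        if x >= 85:
            return "Level 1", 5
        if x >= 70:
            return "Level 2", 10
        if x >= 55:
            return "Level 3", 15
        return "Level 4", 25

    # Example: a 94.7 percent compliance percentage falls in Level 1 (5 percent withheld).
    print(compliance_level(94.7))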
42. The Commission requires that
providers subject to these testing
requirements annually certify and report
the results to USAC, which may audit
the test results. To facilitate compliance
monitoring, the Bureaus and OET
require providers to submit speed and
latency test results, including the
technologies used to provide broadband
at the tested locations, for each state and
speed tier combination in addition to an
annual certification in a format to be
determined by WCB; high-latency
bidders conducting MOS testing across
their entire networks, rather than state-by-state, may submit and certify MOS
test results on a nationwide basis. To
minimize the burden on providers,
USAC will calculate the compliance
percentages required using the data
submitted. By requiring carriers to
submit test results annually, or quarterly
if they are not fully in compliance with
the Bureaus and OET standards, and
having USAC perform the compliance
calculations, the Bureaus and OET
minimize the potential for any
manipulation or gaming of the testing
regime, as providers will be required to
certify to a set of specific results rather
than to a general level of compliance.
Because of the need to develop a
mechanism for collecting the testing
data and obtain Paperwork Reduction
Act (PRA) approval, carriers will be
required to submit the first set of testing
data and accompanying certification by
July 1, 2020. This submission should
include data for at least the third and
fourth quarters of 2019. Subsequently,
data and certifications will be due by
July 1 of each year for the preceding
calendar year. WCB will provide further
guidance by public notice regarding
how carriers will submit their testing
data and certifications. Together with
USAC audits and possible withholding
of support, the Bureaus and OET believe
these measures will provide ample
incentives for carriers to comply with
their obligations.
V. Procedural Matters
A. Paperwork Reduction Act
43. This Order contains new or
modified information collection
requirements subject to the Paperwork
Reduction Act of 1995 (PRA), Public
Law 104–13. It will be submitted to the
Office of Management and Budget
(OMB) for review under section 3507(d)
of the PRA. OMB, the general public,
and other Federal agencies will be
invited to comment on the new or
modified information collection
requirements contained in this
proceeding. In addition, the
Commission notes that pursuant to the
Small Business Paperwork Relief Act of
2002, Public Law 107–198, see 44 U.S.C.
3506(c)(4), it previously sought specific
comment on how the Commission might
further reduce the information
collection burden for small business
concerns with fewer than 25 employees.
In this present document, the
Commission has assessed the effects of
the new and modified rules that might
impose information collection burdens
on small business concerns, and finds
that they either will not have a
significant economic impact on a
substantial number of small entities or
will have a minimal economic impact
on a substantial number of small
entities.
B. Congressional Review Act
44. The Commission will send a copy
of this Order to Congress and the
Government Accountability Office
pursuant to the Congressional Review
Act, see 5 U.S.C. 801(a)(1)(A).
45. As required by the Regulatory
Flexibility Act of 1980 (RFA), as
amended, an Initial Regulatory
Flexibility Analysis (IRFA) was
incorporated in the USF/ICC
Transformation FNPRM, 76 FR 78384,
December 16, 2011. The Commission
sought written public comment on the
proposals in the USF/ICC
Transformation FNPRM, including
comment on the IRFA. The Commission
did not receive any relevant comments
on the USF/ICC Transformation FNPRM
IRFA. This present Final Regulatory
Flexibility Analysis (FRFA) conforms to
the RFA.
46. As a condition of receiving high-cost universal service support, eligible
telecommunications carriers (ETCs)
must offer broadband service in their
supported areas that meets certain basic
performance requirements. ETCs subject
to broadband performance obligations
must currently offer broadband with
latency suitable for real-time
applications, such as VoIP, and meet a
minimum speed standard of 10 Mbps
downstream and 1 Mbps upstream or
greater. Recipients of high-cost support
must also test their broadband networks
for compliance with speed and latency
metrics and certify and report the
results to the Universal Service
Administrative Company (USAC) and
the relevant state or tribal government
on an annual basis, with those results
subject to audit.
47. In the Order, the Bureaus and OET
define how ETCs with Connect America
Fund (CAF) Phase II, Alternative
Connect America Cost Model (A–CAM),
rate-of-return mandatory buildout, rural
broadband experiment (RBE), or Alaska
Plan obligations must test speed and
latency and certify and report the
results. Specifically, the Bureaus and
OET establish a uniform framework for
measuring speed and latency
performance. The Bureaus and OET
permit three testing methods as options
for ETCs to conduct the required speed
and latency tests, and the Bureaus and
OET provide a definition for a ‘‘test’’ in
this context and specify the
measurement span associated with these
tests. The Bureaus and OET establish
specific test parameters for latency and
speed, including how often and how
many tests must be conducted and the
minimum test sample size. The Bureaus
and OET also establish voice testing
requirements for high-latency bidders in
the CAF Phase II auction. Finally, the
Bureaus and OET define compliance for
latency and speed standards and
establish the required certifications, as
well as a compliance framework
providing strong incentives for ETCs to
meet those standards.
48. With the testing framework the
Bureaus and OET have adopted herein,
they have provided maximum flexibility
to reduce the burden on smaller entities,
consistent with ensuring that these
carriers are meeting their latency and
speed requirements. Smaller entities
required to do testing can choose from
one of three methodologies to conduct
the required testing. All entities
providing broadband service should
already use testing mechanisms for
internal purposes, such as ensuring that
customers are receiving the appropriate
level of service and troubleshooting in
response to customer complaints. In
addition, the Bureaus and OET will be
providing an online portal so entities
can easily submit all of their test results
electronically and USAC will do all of
the necessary compliance calculations.
49. The RFA directs agencies to
provide a description of, and where
feasible, an estimate of the number of
small entities that may be affected by
the proposed rules, if adopted. The RFA
generally defines the term ‘‘small
entity’’ as having the same meaning as
the terms ‘‘small business,’’ ‘‘small
organization,’’ and ‘‘small governmental
jurisdiction.’’ In addition, the term
‘‘small business’’ has the same meaning
as the term ‘‘small-business concern’’
under the Small Business Act. A ''small-business concern'' is one which: (1) Is
independently owned and operated; (2)
is not dominant in its field of operation;
and (3) satisfies any additional criteria
established by the Small Business
Administration (SBA).
50. The Bureaus and OET's actions,
over time, may affect small entities that
are not easily categorized at present.
The Bureaus and OET therefore describe
here, at the outset, three broad groups of
small entities that could be directly
affected herein. First, while there are
industry specific size standards for
small businesses that are used in the
regulatory flexibility analysis, according
to data from the SBA’s Office of
Advocacy, in general a small business is
an independent business having fewer
than 500 employees. These types of
small businesses represent 99.9 percent
of all businesses in the United States
which translates to 28.8 million
businesses.
51. Next, the type of small entity
described as a ‘‘small organization’’ is
generally ‘‘any not-for-profit enterprise
which is independently owned and
operated and is not dominant in its
field.’’ Nationwide, as of August 2016,
there were approximately 356,494 small
organizations based on registration and
tax data filed by nonprofits with the
Internal Revenue Service (IRS).
52. Finally, the small entity described
as a ‘‘small governmental jurisdiction’’
is defined generally as ‘‘governments of
cities, counties, towns, townships,
villages, school districts, or special
districts, with a population of less than
fifty thousand.’’ U.S. Census Bureau
data from the 2012 Census of
Governments indicates that there were
90,056 local governmental jurisdictions
consisting of general purpose
governments and special purpose
governments in the United States. Of
this number there were 37,132 General
purpose governments (county,
municipal and town or township) with
populations of less than 50,000 and
12,184 Special purpose governments
(independent school districts and
special districts) with populations of
less than 50,000. The 2012 U.S. Census
Bureau data for most types of
governments in the local government
category shows that the majority of
these governments have populations of
less than 50,000. Based on this data the
Bureaus and OET estimate that at least
49,316 local government jurisdictions
fall in the category of ‘‘small
governmental jurisdictions.’’
53. In the Order, the Bureaus and OET
establish for high-cost support
recipients serving fixed locations a
uniform framework for measuring speed
and latency performance and define the
requisite standards for full compliance
with those providers’ speed and latency
obligations. The Commission’s existing
rules require that high-cost recipients
report ‘‘[t]he results of network
performance tests pursuant to the
methodology and in the format
determined by the Wireline Competition
Bureau, Wireless Telecommunications
Bureau, and the Office of Engineering
and Technology’’ and that ETCs retain
such records for at least ten years from
the receipt of funding.
54. The Bureaus and OET now
add further detail to this requirement;
they require providers to submit speed
and latency test results, including the
technologies used to provide broadband
at the tested locations, for each state and
speed tier combination in addition to an
annual certification in a format to be
determined by WCB. High-latency
bidders conducting mean opinion score
(MOS) testing across their entire
networks, rather than state-by-state, may
submit and certify MOS test results on
a nationwide basis. To minimize the
burden on providers, USAC will
calculate the compliance percentages
required using the data submitted. By
requiring carriers to submit test results
annually and having USAC perform the
compliance calculations, the Bureaus
and OET minimize the potential for any
manipulation or gaming of the testing
regime, as providers will be required to
certify to a set of specific results rather
than to a general level of compliance.
However, providers that are not fully
compliant with the speed and latency
standards must submit quarterly reports
including one week of test results and
describing steps taken to resolve the
compliance gap.
55. The RFA requires an agency to
describe any significant alternatives that
it has considered in reaching its
proposed approach, which may include
(among others) the following four
alternatives: (1) The establishment of
differing compliance or reporting
requirements or timetables that take into
account the resources available to small
entities; (2) the clarification,
consolidation, or simplification of
compliance or reporting requirements
under the rule for small entities; (3) the
use of performance, rather than design,
standards; and (4) an exemption from
coverage of the rule, or any part thereof,
for small entities. The Bureaus and OET
have considered all of these factors
subsequent to receiving substantive
comments from the public and
potentially affected entities. The
Wireline Competition Bureau, Wireless
Telecommunications Bureau, and Office
of Engineering and Technology have
considered the economic impact on
small entities, as identified in any
comments filed in response to USF/ICC
Transformation FNPRM and IRFA, in
reaching their final conclusions and taking
action in this proceeding.
56. In the Order, the Bureaus and OET
adopt a clear, uniform framework for
high-cost support recipients serving
fixed locations to test speed and latency
to meet the obligations associated with
the support they receive. The
requirements the Bureaus and OET
adopt provide flexibility for carriers to
choose between different testing
methods suitable for carriers of different
sizes and technological sophistication.
Instead of requiring providers to invest
in and implement new internal systems,
the Bureaus and OET permit providers
to perform speed and latency tests with
readily available off-the-shelf solutions
or existing MBA infrastructure. The
Bureaus and OET expect that carriers
with testing features built into customer
premises equipment for their own
network management purposes may
prefer using their own self-testing
systems, which they also permit.
57. The Bureaus and OET require that
carriers, regardless of their preferred
testing methods, conduct tests using the
same parameters they establish. These
parameters take into account smaller
carriers’ circumstances to avoid
disproportionately burdening them. For
example, the Bureaus and OET expand
the list of locations to which carriers
may conduct required tests—allowing
smaller carriers that are farther from the
largest metropolitan areas to test speed
and latency over shorter distances. The
Bureaus and OET also permit providers
to conduct tests to the designated area
of their choosing, rather than to the
nearest designated metropolitan area.
Further, carriers with fewer subscribers
in a state and broadband service tier
may test fewer locations. Greater
percentages of subscribers are necessary
to achieve the same margin of error and
confidence level in smaller sample
sizes, but the Bureaus and OET
recognize that, below 500 subscribers,
that necessary percentage rises quickly
above 10 percent. Accordingly, in the
Order, the Bureaus and OET allow
providers with between 51 and 500
subscribers in a particular state and
service tier combination to test 10
percent of total subscribers. The
Bureaus and OET require providers with
50 or fewer subscribers in a particular
state and service tier combination to test
five locations, but, to the extent
necessary, those carriers may test
existing, non-CAF-supported active
subscriber locations to satisfy that
requirement.
58. Finally, the Bureaus and OET
provide clarity regarding the
Commission’s existing requirement that
carriers must report the results of
network performance tests. Carriers
must annually (or, in some cases,
quarterly) submit detailed results of the
required tests, conducted pursuant to
the parameters the Bureaus and OET
establish. The Bureaus and OET hold all
carriers to the same speed and latency
test standards, but they recognize that
requiring carriers to take the additional
step of using their test results to
determine their level of compliance may
entail unnecessary burdens. Although
the Bureaus and OET anticipate that
carriers will find the adopted
compliance framework straightforward,
they conclude that requiring submission
of the actual test results and allowing
USAC to calculate the compliance
percentages lessens the burden on small
entities even further.
VI. Ordering Clauses
59. Accordingly, it is ordered that,
pursuant to sections 1, 4(i), 5(c), 201(b),
214, and 254 of the Communications
Act of 1934, as amended, and section
706 of the Telecommunications Act of
1996, 47 U.S.C. 151, 154(i), 155(c),
201(b), 214, 254, 1302, §§ 0.91 and 0.291
of the Commission’s rules, 47 CFR 0.91,
0.291, and the delegations of authority
in paragraph 170 of the USF/ICC
Transformation Order, FCC 11–161, this
Order is adopted, effective thirty (30)
days after publication of the text or
summary thereof in the Federal
Register, except for the requirements in
paragraphs 38 and 42 that are subject to
the PRA, which will become effective
upon announcement in the Federal
Register of OMB approval of the subject
information collection requirements.
Federal Communications Commission.
Kris A. Monteith,
Chief, Wireline Competition Bureau.
[FR Doc. 2018–17338 Filed 8–17–18; 8:45 am]
BILLING CODE 6712–01–P
FEDERAL COMMUNICATIONS COMMISSION
47 CFR Part 54
[WC Docket No. 10-90; DA 18-710]
Connect America Fund
AGENCY: Federal Communications Commission.
ACTION: Final action.
-----------------------------------------------------------------------
SUMMARY: In this document, the Wireline Competition Bureau (WCB), the
Wireless Telecommunications Bureau (WTB) (jointly referred to herein as
the Bureaus), and the Office of Engineering and Technology (OET) adopt
requirements promoting greater accountability for certain recipients of
Connect America Fund (CAF) high-cost universal service support,
including price cap carriers, rate-of-return carriers, rural broadband
experiment (RBE) support recipients, Alaska Plan carriers, and CAF
Phase II auction winners. Specifically, the Bureaus and OET establish a
uniform framework for measuring the speed and latency performance for
recipients of high-cost universal service support to serve fixed
locations.
DATES: This final action is effective September 19, 2018.
FOR FURTHER INFORMATION CONTACT: Suzanne Yelen, Wireline Competition
Bureau, (202) 418-7400 or TTY: (202) 418-0484.
SUPPLEMENTARY INFORMATION: This is a summary of the Commission's Order
in WC Docket No. 10-90; DA 18-710, adopted on July 6, 2018 and released
on July 6, 2018. The full text of this document is available for public
inspection during regular business hours in the FCC Reference Center,
Room CY-A257, 445 12th Street SW, Washington, DC 20554 or at the
following internet address: https://docs.fcc.gov/public/attachments/DA-18-710A1.pdf.
I. Introduction
1. In the Order, the Bureaus and OET adopt requirements promoting
greater accountability for certain recipients of CAF high-cost
universal service support, including price cap carriers, rate-of-return
carriers, RBE support recipients, Alaska Plan carriers, and CAF Phase
II auction winners. Specifically, the Bureaus and OET establish a
uniform framework for measuring the speed and latency performance for
recipients of high-cost universal service support to serve fixed
locations.
2. The Bureaus and OET also require providers to submit testing
results as
[[Page 42053]]
part of their annual compliance certification. Carriers that do not
comply with the Bureaus and OET's speed and latency requirements will
be subject to a reduction in support, commensurate with their level of
noncompliance. In addition, providers will be subject to audit of all
testing data. With this testing and compliance framework, the Bureaus
and OET aim to maximize the benefits consumers reap from the Commission's
high-cost universal service programs in even the hardest-to-reach areas, thus
making the best use of Universal Service Fund (USF) dollars and
further closing the digital divide.
II. Choice of Testing Method
3. The Bureaus and OET provide high-cost support recipients that
serve fixed locations three options to afford flexibility in choosing
solutions to conduct required performance testing. Specifically, the
Bureaus and OET conclude that eligible telecommunications carriers
(ETCs) subject to fixed broadband performance obligations may conduct
required testing by employing either (1) Measuring Broadband America
(MBA) testing infrastructure (MBA testing), (2) existing network
management systems and tools (off-the-shelf testing), or (3) provider-
developed self-testing configurations (provider-developed self-testing
or self-testing). Providers may employ any of these three options as
long as the provider's implementation meets the testing requirements
established in this Order. The Bureaus and OET define the three options
as follows:
First, a high-cost support recipient may use MBA testing
by arranging with entities that manage and perform testing for the MBA
program to implement performance testing, as required, for CAF. The
provider is responsible for all costs required to implement testing of
its network, including any costs associated with obtaining and
maintaining Whiteboxes, to the extent that any additional Whiteboxes
are employed as part of the MBA testing. The Bureaus and OET note that
the MBA testing must occur in areas and for the locations supported by
CAF, e.g., in CAF Phase II eligible areas for price cap carriers and
for specific built-out locations for RBE, Alternative Connect America
Cost Model (A-CAM), and legacy rate-of-return support recipients.
Second, a high-cost support recipient may elect to use
existing network management systems and tools, ping tests, and other
commonly available performance measurement and network management
tools--off-the-shelf testing--to implement performance testing.
Third, a high-cost support recipient may implement a
provider-developed self-testing configuration using software installed
on residential gateways or in equipment attached to residential
gateways to regularly initiate speed and latency tests. Providers that
implement self-testing of their own networks may make network
performance testing services available to other providers. The Bureaus
and OET continue to consider whether the Universal Service
Administrative Company (USAC) may have a role in offering server
capacity at an internet Exchange Point in an FCC-designated
metropolitan area (FCC-designated IXP), without any oversight role in
conducting tests, to mitigate smaller providers' costs.
4. By providing these three options, the Bureaus and OET ensure
that there is a cost-effective method for conducting testing for
providers of different sizes and technological sophistication. The
Bureaus and OET do not require that providers invest in and implement
new internal systems; instead, providers may perform speed and latency
tests with readily-available, off-the-shelf solutions or existing MBA
infrastructure. On the other hand, some providers may prefer
implementing their own self-testing systems, especially if such testing
features are already built into CPE for the carrier's own network
management purposes. These three options allow the provider to align
required performance testing with their established network management
systems and operations, making it as easy as possible for carriers to
implement the required testing while establishing rigorous testing
parameters and standards, based on real-world data.
5. The Bureaus and OET recognize that self-testing using provider-
developed software may create opportunities for ``manipulation or
gaming'' by CAF recipients. However, the Bureaus and OET believe that
the testing and compliance requirements they adopt will minimize the
possibility of such behavior. First, as explained in more detail in the
following, the Bureaus and OET will be requiring providers to submit
and certify testing data annually. Second, USAC will be verifying
provider compliance and auditing performance testing results.
6. The Bureaus and OET reject Alaska Communications' proposal that
high-cost support recipients may submit radio frequency propagation
maps in lieu of conducting speed tests to demonstrate compliance with
speed obligations. Such maps are only illustrative of planned,
``theoretical'' coverage and do not provide actual data on what
consumers experience. The Bureaus and OET therefore require providers
to conduct the required testing using one of the three options
identified in this document.
III. General Testing Parameters
7. All ETCs subject to fixed broadband performance obligations must
conduct the required speed and latency testing using the parameters in
this Order, regardless of which of the three testing options the
carrier selects. The Bureaus and OET first define ``test'' and the
associated span of measurement, in the context of these performance
measurements. Next, the Bureaus and OET adopt requirements regarding
when tests must begin and when exactly carriers may perform the tests,
and they set the number of active subscriber locations carriers must
test, with variations depending on the size of the carrier. Finally,
the Bureaus and OET address how high-latency bidders in the CAF Phase
II auction must conduct required voice testing.
8. To maintain a stringent performance compliance regime while
avoiding unnecessary burdens on smaller carriers, the Bureaus and OET
allow flexibility concerning the specific testing approach so that
carriers can select, consistent with the adopted framework, the best
and most efficient testing methods for their particular circumstances.
The Bureaus and OET encourage the use of industry testing standards,
such as the TR-143 Standard, for conducting self-testing.
9. For reasons similar to those outlined in the CAF Phase II Price
Cap Service Obligation Order, 78 FR 70881, November 27, 2013, the
Bureaus and OET require that high-cost support recipients serving fixed
locations perform these tests over the measurement span already
applicable to price cap carriers receiving CAF Phase II model-based
support. ETCs must test speed and latency from the customer premises of
an active subscriber to a remote test server located at or reached by
passing through an FCC-designated IXP. Accordingly, a speed test is a
single measurement of download or upload speed of 10 to 15 seconds
duration between a specific consumer location and a specific remote
server location. Similarly, a latency test is a single measurement of
latency, often performed using a single User Datagram Protocol (UDP)
packet or a group of three internet Control Message Protocol (ICMP) or
UDP packets sent at essentially the same time, as is common with ping
tests.
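    For illustration only, the following Python sketch shows one way a single
latency measurement of the kind described above could be implemented, assuming
a reachable UDP echo service at a remote test server; the server hostname is
hypothetical, and this Order does not prescribe any particular implementation.

    import socket
    import time

    def measure_latency_udp(server, port=7, timeout=1.0):
        # Send one 64-byte UDP packet to an echo service and time the round
        # trip.  Returns the round-trip time in milliseconds, or None if the
        # packet is lost (lost-packet tests are still recorded as discrete
        # tests that do not meet the latency standard).
        sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
        sock.settimeout(timeout)
        payload = b"\x00" * 64
        start = time.monotonic()
        try:
            sock.sendto(payload, (server, port))
            sock.recvfrom(1024)
        except socket.timeout:
            return None
        finally:
            sock.close()
        return (time.monotonic() - start) * 1000.0

    # Hypothetical usage: one discrete test per minute during each testing hour.
    # rtt_ms = measure_latency_udp("test-server.example.net")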
10. Large and small ETCs alike commit to providing a certain level
of
[[Page 42054]]
service when accepting high-cost support to deploy broadband. ``Testing
. . . on only a portion of the network connecting a consumer to the
internet core will not show whether that customer is able to enjoy
high-quality real-time applications because it is network performance
from the customer's location to the destination that determines the
quality of the service from the customer's perspective.'' Although the
measurement span the Bureaus and OET adopt may include transport (e.g.,
backhaul or transit) that a provider does not control, the carrier can
influence the quality of transport purchased and can negotiate with the
transport provider for a level of service that will enable it to meet
the Commission's performance requirements. This is true for both price
cap carriers and smaller carriers. The Bureaus and OET therefore
disagree with suggestions that testing should only occur within a
provider's own network because providers do not always control the
portion of the network reaching the nearest FCC-designated IXP.
11. Previously, the Bureaus and OET designated the following ten
locations as FCC-designated IXPs: New York City, NY; Washington, DC;
Atlanta, GA; Miami, FL; Chicago, IL; Dallas-Fort Worth, TX; Los
Angeles, CA; San Francisco, CA; Seattle, WA; and Denver, CO. All of
these areas, except Denver, are locations used by the MBA program,
which selected these locations because they are geographically
distributed major U.S. Internet peering locations. Denver was added to
the list so that all contiguous areas in the United States are within
700 miles of an FCC-designated IXP. Because the Bureaus and OET are
expanding testing to additional CAF recipients, they add the following
six metropolitan areas as additional FCC-designated IXPs: Salt Lake
City, UT; St. Paul, MN; Helena, MT; Kansas City, MO; Phoenix, AZ; and
Boston, MA. This expanded list ensures that most mainland U.S.
locations are within 300 air miles of an FCC-designated IXP, and all
are within approximately 500 air miles of one. Further, the Bureaus and
OET find that there is no reason to limit testing to the provider's
nearest IXP; rather, providers can use any FCC-designated IXP for
testing purposes.
12. Still, the Bureaus and OET recognize that non-contiguous
providers face unique challenges in providing service outside the
continental U.S. The distance between a carrier and its nearest IXP
affects latency and may affect speed as well. At this time, the Bureaus
and OET do not have sufficient data to determine the extent of the
effect of distance on speed performance testing. Therefore, similar to
the existing exception for non-contiguous price cap carriers accepting
model-based CAF Phase II support, the Bureaus and OET permit all
providers serving non-contiguous areas greater than 500 air miles from
an FCC-designated IXP to conduct all required latency and speed testing
between the customer premises and the point at which traffic is
aggregated for transport to the continental U.S. The Bureaus and OET
have identified a sufficient number of IXPs so that no point in the
continental U.S. is more than approximately 500 miles from an FCC-
designated IXP. Therefore, allowing non-contiguous providers located
more than 500 miles from an FCC-designated IXP to test to the point in
the non-contiguous area where traffic is aggregated for transport to
the mainland will prevent these providers from being unfairly penalized
for failing to meet their performance obligations solely because of the
location of the areas being served. However, as the Commission gains
additional MBA and other data on speed and latency from non-contiguous
areas, the Bureaus and OET may revisit this conclusion.
13. First, the Bureaus and OET establish the specific test
intervals within the daily test period. For latency, the Bureaus and
OET require a minimum of one discrete test per minute, i.e., 60 tests
per hour, for each of the testing hours, at each subscriber test
location, with the results of each discrete test recorded separately.
The Bureaus and OET note that intensive consumer use of the network
(such as streaming video) during testing, referred to as cross-talk,
can influence both consumer service and testing results. The data usage
load for latency testing is minimal; sending 60 UDP packets of 64 bytes
each in one hour is approximately 4,000 bytes in total. However, to
prevent cross-talk from negatively affecting both the consumer
experience and test results, the Bureaus and OET adopt consumer load
thresholds--i.e., cross-talk thresholds--similar to those used by the
MBA program. Accordingly, for latency testing, if the consumer load
exceeds 64 Kbps downstream, the provider may cancel the test and
reevaluate whether the consumer load exceeds 64 Kbps downstream before
retrying the test in the next minute. Providers who elect to do more
than the minimum required number of latency tests at subscriber test
locations must include the results from all tests performed during
testing periods in their compliance calculations.
14. For speed, the Bureaus and OET require a minimum of one
download test and one upload test per testing hour at each subscriber
test location. The Bureaus and OET note that speed testing has greater
network impact than latency testing. For speed testing, the Bureaus and
OET require providers to start separate download and upload speed tests
at the beginning of each test hour window. As with latency, the Bureaus
and OET adopt cross-talk thresholds similar to those used in the MBA
program. If the consumer load is greater than 64 Kbps downstream for
download tests or 32 Kbps upstream for upload tests, the provider may
defer the affected download or upload test for one minute and
reevaluate whether the consumer load exceeds the relevant 64 Kbps or 32
Kbps threshold before retrying the test. This load check-and-retry must
continue at one-minute intervals until the speed test can be run or the
one-hour test window ends and the test for that hour is canceled. Also
as with latency, providers who elect to do more than the minimum
required number of speed tests at subscriber test locations must
include the results from all tests performed during testing periods for
compliance calculations.
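    As a non-authoritative sketch of the check-and-retry logic described
above, the following Python outline assumes hypothetical helper callables for
measuring the consumer load and running a single speed test; it is
illustrative only.

    import time

    DOWNSTREAM_THRESHOLD_KBPS = 64   # cross-talk threshold for download tests
    UPSTREAM_THRESHOLD_KBPS = 32     # cross-talk threshold for upload tests

    def run_speed_test_with_load_check(consumer_load_kbps, run_speed_test,
                                       direction, window_minutes=60):
        # consumer_load_kbps(direction) and run_speed_test(direction) are
        # hypothetical functions supplied by the provider's test platform.
        threshold = (DOWNSTREAM_THRESHOLD_KBPS if direction == "download"
                     else UPSTREAM_THRESHOLD_KBPS)
        for _ in range(window_minutes):
            if consumer_load_kbps(direction) <= threshold:
                return run_speed_test(direction)   # load is low enough; run it
            time.sleep(60)                         # defer one minute, re-check
        return None   # one-hour window ended; the test for this hour is canceled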
15. Second, to capture any seasonal effects on a carrier's
broadband performance, the Bureaus and OET require that carriers
subject to the latency and speed testing requirements conduct one week
of testing in each quarter of the calendar year. Specifically, carriers
must conduct one week of testing in each of the following quarters:
January through March, April through June, July through September, and
October through December. By requiring measurements quarterly, rather
than in four consecutive weeks, the Bureaus and OET expect test results
to reflect a carrier's performance throughout the year, including
during times of the year in which there is a seasonal increase or
decrease in network usage. Although previously WCB required price cap
carriers receiving CAF Phase II support to test latency for two weeks
each quarter, the Bureaus and OET find that requiring testing one week
each quarter strikes a better balance of accounting for seasonal
changes in broadband usage and minimizing the burden on consumers who
may participate in testing.
16. Third, in establishing the daily testing period, the Bureaus
and OET slightly expand the test period and require that carriers
conduct tests between 6:00 p.m. and 12:00 a.m. (testing hours),
including on weekends.
[[Page 42055]]
The Bureaus and OET continue to find that MBA data supports their
conclusion that there is a peak period of internet usage every evening.
However, the Bureaus and OET intend to revisit this requirement
periodically to determine whether peak internet usage times have
changed substantially.
17. The Bureaus and OET conclude that requiring measurements over
an expanded period, by including one hour before the peak period and
one hour after, will best ensure that carriers meet the speed and
latency obligations associated with the high-cost support they receive.
MBA data shows that broadband internet access service providers that
perform well during the peak period tend to perform well consistently
throughout the day. Further, the Bureaus and OET's required schedule of
testing is consistent with the specific, realistic standards they set
forth, which were developed using MBA peak-period data. Thus, the
Bureaus and OET will be judging testing hours data based on a standard
developed using MBA data from the same time period.
18. Additionally, the Bureaus and OET disagree with assertions that
requiring speed testing during the peak period will introduce
problematic network congestion over the provider's core network. Based
on MBA speed test data, a download service speed test for 10 Mbps
requires approximately 624 MB combined downloaded data for 50 locations
per hour. This is less traffic than what would be generated by
streaming a little less than one-half of a high-definition movie. A
download service speed test for 25 Mbps requires approximately 1,841 MB
combined downloaded data for 50 locations, which is about the same
amount of traffic as a little less than two high-definition movies. The
small amount of data should have no noticeable effect on network
congestion. Upload test data-usage is even lower. Based upon MBA speed
test data, a one-hour upload service speed test for 1 Mbps and 3 Mbps
for 50 locations will be approximately 57 MB and 120 MB, respectively.
This testing will use bandwidth equivalent to uploading 12 photos to a
social media website at 1 Mbps or 24 photos at 3 Mbps. To the extent
that a carrier is concerned about possible impacts on the consumer
experience, the Bureaus and OET permit carriers the flexibility to
choose whether to stagger their tests, so long as they do not violate
any other testing requirements, as they explain in their discussion of
the testing intervals in the following.
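    The per-hour data volumes cited above follow from simple arithmetic; the
sketch below reproduces them approximately, assuming one download test of
roughly 10 to 15 seconds per location per hour.

    def hourly_download_volume_mb(speed_mbps, test_seconds, locations):
        # Megabits transferred = speed x test duration x number of locations;
        # divide by 8 to convert megabits to megabytes.
        return speed_mbps * test_seconds * locations / 8.0

    print(hourly_download_volume_mb(10, 10, 50))   # ~625 MB (about the ~624 MB cited)
    print(hourly_download_volume_mb(25, 12, 50))   # ~1,875 MB (about the ~1,841 MB cited)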
19. Fourth, testing for all locations in a single speed tier in a
single state must be done during the same week. If a provider has more
than one speed tier in a state, testing for each speed tier can be
conducted during different weeks within the quarter. For a provider
serving multiple states, testing of each service tier does not need to
be done during the same week, i.e., a provider may test its 10/1 Mbps
customers in New York one week and in Pennsylvania during a different
week. The Bureaus and OET will generally consider requests for waiver
or extension in cases where a major, disruptive event (e.g., a
hurricane) negatively affects a provider's broadband performance.
However, prior to requesting a waiver, providers should determine
whether rescheduling testing within the 3-month test window will be
sufficient to handle the disruptive event.
20. The Bureaus and OET require that carriers test up to 50
locations per CAF-required service tier offering per state, depending
on the number of subscribers a carrier has in a state. The subscribers
eligible for testing must be at locations that are reported in the HUBB
where there is an active subscriber. The Bureaus and OET decline to
adopt a simple percentage-based alternative but, instead, adopt the
following scaled requirements for each state and service tier
combination for a carrier:
                    Required Test Locations for Speed
------------------------------------------------------------------------
 Number of subscribers at CAF-supported
 locations per state and service tier          Number of test locations
 combination
------------------------------------------------------------------------
50 or fewer.................................  5.
51-500......................................  10% of total subscribers.
Over 500....................................  50.
------------------------------------------------------------------------
The Bureaus and OET recognize that it is possible that a carrier
serving 50 or fewer subscribers in a state and particular service tier
cannot find the required number of five active subscribers for testing
purposes. To the extent necessary, the Bureaus and OET permit such
carriers to test existing, non-CAF-supported active subscriber
locations within the same state and service tier to satisfy its
requirement of testing five active subscriber locations. Carriers may
voluntarily test the speed and/or latency of additional randomly
selected CAF-supported subscribers over the minimum number of required
test locations as part of their quarterly testing. However, data for
all tested locations must be submitted for inclusion in the compliance
calculations, i.e., carriers must identify the set of testing locations
at the beginning of the testing and cannot exclude some locations
during or after the testing.
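    A minimal Python sketch of the scaled requirement in the table above; the
rounding convention for the 10 percent tier is an assumption, as the Order
does not specify one.

    def required_speed_test_locations(subscribers):
        # Required test locations per state and service-tier combination.
        if subscribers <= 50:
            return 5
        if subscribers <= 500:
            return round(0.10 * subscribers)   # rounding convention assumed
        return 50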
21. Carriers must test an adequate number of subscriber locations
to provide a clear picture of the carrier's performance and its
customers' broadband experience across a state. The Bureaus and OET
find that 50 test locations, per speed tier per state, remains a good
indicator as to whether providers are fulfilling their obligations. A
sample size of 50 test locations out of 2,500 or more subscribers
provides a picture of carriers' performance with an 11.5
percent margin of error and 90 percent confidence level. Testing 50
locations out of more than 500 subscribers yields a comparable picture
of carriers' performance. The Bureaus and OET acknowledge, however,
that smaller carriers may find testing 50 locations burdensome. Below
2,500 CAF-supported subscribers, greater percentages of subscribers are
necessary to achieve the same margin of error and confidence level, but
below 500 subscribers the necessary percentage rises quickly above 10
percent. Carriers serving fewer subscribers would thus be unable to
provide test results achieving the same margin of error and confidence
level without testing a more proportionately burdensome percentage of
their subscribers.
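    The margin-of-error figure cited above can be reproduced with the standard
formula for a proportion with a finite-population correction; the sketch below
assumes a worst-case proportion of 0.5 and a 90 percent confidence level
(z = 1.645).

    import math

    def margin_of_error(sample_n, population_n, z=1.645):
        # Worst-case (p = 0.5) margin of error for a sample of sample_n drawn
        # from a population of population_n, with finite-population correction.
        p = 0.5
        se = math.sqrt(p * (1 - p) / sample_n)
        fpc = math.sqrt((population_n - sample_n) / (population_n - 1))
        return z * se * fpc

    print(round(margin_of_error(50, 2500) * 100, 1))   # ~11.5 percent, as cited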
22. The Bureaus and OET also now find it preferable to use the
number of subscribers in a state and service tier, rather than the
number of lines for which a provider is receiving support, to determine
the required number of test locations. A carrier receiving support for
2,000 lines serving 100 subscribers would find it much more difficult
to test 50 active subscriber locations, compared to a carrier receiving
support for 2,000 lines but serving 1,500 subscribers, and commenters
have noted that providers may find it difficult to find a sufficient
[[Page 42056]]
number of locations if they have relatively few subscribers. Basing the
number of locations to be tested on the number of subscribers, rather
than the number of lines, addresses this concern.
23. The Bureaus and OET therefore require testing a specific number
of subscribers for carriers serving more than 500 subscribers in a
single service tier and state, but require carriers serving between 51
and 500 subscribers in a single service tier and state to test a fixed
percentage of subscribers. For carriers serving 50 or fewer subscribers
in a state and service tier, a percentage-based alternative may be
insufficient; in an extreme situation, data from a single subscriber
cannot clearly demonstrate a carrier's speed and latency performance.
Accordingly, the Bureaus and OET require those providers to test a
specific number of active subscriber locations. The Bureaus and OET
conclude that this scaled approach balances the need to test a
reasonable number of subscriber locations within a state based on the
total number of subscribers and performance tiers with minimizing the
burden on smaller providers to find consumer locations to be tested.
The Bureaus and OET note, also, that a carrier receiving different
types of CAF funding in the same state should aggregate its customers
in each speed tier for purposes of testing. The following examples
illustrate how this scaled approach should be implemented:
A carrier with 2,300 customers subscribed to a single
service tier of 10/1 Mbps in one state must test 50 locations in that
state, while a carrier providing solely 25/3 Mbps service to over 2,500
subscribers in each of three states must test 50 locations in each
state.
A carrier providing 10/1 Mbps service and 25/3 Mbps
service to 100 subscribers each in a single state must test 10
locations for each of the two service tiers--20 locations in total.
A carrier providing solely 10/1 Mbps service to 30
subscribers must test five locations, and if that carrier is only able
to test three CAF-supported locations, that carrier must test two non-
CAF-supported locations receiving 10/1 Mbps service in the same state.
A carrier with 2,000 customers subscribed to 10/1 Mbps in
one state through CAF Phase II funding and 500 RBE customers subscribed
to 10/1 Mbps in the same state, and no other high-cost support with
deployment obligations, must test a total of 50 locations in that state
for the 10/1 Mbps service tier.
24. Test subjects must be randomly selected every two years from
among the provider's active subscribers in each service tier in each
state. Subscribers for latency testing may be randomly selected from
those subscribers being tested for speed at all speed tiers or randomly
selected from all CAF-supported subscribers, every two years. Any
sample location lacking an active subscriber 12 months after that
location was selected must be replaced by an actively subscribed
location, randomly selected. Random selection will ensure that
providers cannot pick and choose amongst subscribers so that only those
subscribers likely to have the best performance (e.g., those closest to
a central office) are tested. Carriers may use inducements to encourage
subscribers to participate in testing. This may be particularly useful
in cases where support is tied to a particular performance level for
the network but the provider does not have enough subscribers to its
higher-performance service tiers to satisfy the required testing sample sizes.
However, to ensure that the selection remains random, carriers must
offer the same inducement to all randomly-selected subscribers in the
areas for which participating subscribers are required for the carrier
to conduct testing. WCB will provide further guidance regarding random
selection by public notice.
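    Because WCB has not yet issued its guidance on random selection, the
following Python fragment is only a generic illustration of drawing a random
test sample from the active subscriber list for one state and service tier.

    import random

    def select_test_sample(active_subscriber_locations, sample_size, seed=None):
        # Randomly select the locations to be tested; the sample is re-drawn
        # every two years, per the requirement described above.
        rng = random.Random(seed)
        n = min(sample_size, len(active_subscriber_locations))
        return rng.sample(active_subscriber_locations, n)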
25. The Bureaus and OET reiterate the Commission's requirement that
high-latency providers subject to testing must demonstrate a Mean
Opinion Score (MOS) of four or higher. The Bureaus and OET agree with
ADTRAN, Inc. (ADTRAN) that listening-opinion tests would not suffice to
demonstrate a high-quality consumer voice experience. Latency only
minimally affects participants' experiences and evaluations in
listening-opinion tests, which involve passive listening to audio
samples. However, in the USF/ICC Transformation Order, 76 FR 73830,
November 29, 2011, the Commission required ``ETCs to offer sufficiently
low latency to enable use of real-time applications, such as VoIP.''
Unlike a listening-opinion test, in a conversation-opinion test, two
participants actively participate in a conversation. The back-and-forth
of conversations highlights delay, echo, and other issues caused by
latency in a way that one-way, passive listening cannot. Therefore, the
Bureaus and OET require that high-latency providers conduct an ITU-T
Recommendation P.800 conversational-opinion test.
26. Specifically, the Bureaus and OET require the use of the
underlying conversational-opinion test requirements specified by the
ITU-T Recommendation P.800, with testing conditions as described in the
following. The Bureaus and OET believe that MOS testing under these
conditions will ensure that the test results reflect the consumer
experience as accurately as possible. First, high-latency providers
must use operational network infrastructure, such as actual satellite
links, for conducting MOS testing, not laboratory-based simulations
intended to reproduce service conditions. Second, the tests must be
implemented using equipment, systems, and processes that are used in
provisioning service to locations funded by high-cost universal service
support. Third, live interviews and surveys must be conducted by an
independent agency or organization (Reviewer) to determine the MOS.
Survey forms, mail-in documentation, automated phone calls, or other
non-interactive and non-person-to-person interviews are not permitted.
Any organization or laboratory with experience testing services for
compliance with telecommunications industry-specified standards and,
preferably, MOS testing experience, may be a Reviewer. Fourth, testing
must be conducted over a ``single hop'' satellite connection with at
least one endpoint at an active subscriber location using the
subscriber's end-user equipment. Finally, the second endpoint may be a
centralized location from which the Reviewer conducts live interviews
with the subscriber to determine the subscriber's MOS evaluation.
27. To reduce the burden of the MOS testing for high-latency
bidders while still ensuring high-quality voice service, the Bureaus
and OET adopt a separate scaled table for the number of locations that
are subject to MOS testing. Specifically, the Bureaus and OET will
determine the number of testing locations based upon the number of
subscribers nationally for which CAF-supported service is provided. The
Bureaus and OET recognize that the satellite infrastructures employed
by many high-latency bidders have characteristics different from
terrestrial networks that make testing of satellite service on a
national, rather than state, basis appropriate. That is, middle-mile/
backhaul for satellite networks are the direct links from the consumer
locations to the satellite and then from the satellite to selected
downlink sites, so there is unlikely to be significant variability
based on the state in which the subscriber is located. The consumers
must be randomly selected from the total CAF-supported subscriber base
in all applicable states to ensure that different types of geographic
locations are tested.
[[Page 42057]]
                Required Test Locations for MOS Testing
------------------------------------------------------------------------
 Number of subscribers at CAF-supported            Number of MOS test
 locations nationally                                   locations
------------------------------------------------------------------------
3500 or fewer..............................................         100
Over 3500..................................................         370
------------------------------------------------------------------------
This scaled, nationwide testing requirement will reduce high-
latency bidders' testing burden while ensuring a sufficient testing
sample to verify compliance with voice performance requirements.
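    For illustration only, the nationwide scaling in the table above maps to
a simple lookup; the function name is hypothetical.

    def required_mos_test_locations(national_subscribers):
        # Number of MOS test locations based on CAF-supported subscribers
        # nationwide, per the table above.
        return 100 if national_subscribers <= 3500 else 370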
IV. Compliance Framework
28. The Bureaus and OET extend the existing standard for full
compliance with high-cost support recipients' latency obligations and
adopt a standard for full compliance with speed obligations. The
Bureaus and OET also establish a compliance framework outlining
specific actions for various degrees of compliance that fall short of
those standards.
29. The Bureaus and OET reaffirm the existing low-latency and high-
latency standards and establish a speed standard for full compliance.
The data on round-trip latency in the United States has not markedly
changed since the 2013 CAF Phase II Price Cap Service Obligation Order,
and no party has challenged the Commission's reasoning for the existing
100 ms latency standard. Accordingly, the Bureaus and OET conclude that
all high-cost support recipients serving fixed locations, except those
carriers submitting high-latency bids in the CAF Phase II auction, must
certify that 95 percent or more of all testing hours measurements of
network round-trip latency are at or below 100 ms. High-latency bidders
must certify that 95 percent or more of all testing hours measurements
are at or below 750 ms. Providers must record the observed latency for
all latency test measurements, including all lost packet tests. Thus,
providers may not discard lost-packet tests from their test results;
these tests count as discrete tests not meeting the standard.
30. For speed, the Bureaus and OET require that 80 percent of
download and upload measurements be at or above 80 percent of the CAF-
required speed tier (i.e., an 80/80 standard). For example, if a
carrier receives high-cost support for 10/1 Mbps service, 80 percent of
the download speed measurements must be at or above 8 Mbps, while 80
percent of the upload speed measurements must be at or above 0.8 Mbps.
The Bureaus and OET require carriers to meet and test to their CAF
obligation speed(s) regardless of whether their subscribers purchase
internet service offerings with advertised speeds matching the CAF-
required speeds at CAF-eligible locations. Thus, carriers that have
deployed a network with the requisite speeds must include all
subscribers at that level in their testing, but may still find it
necessary to upgrade individual subscriber locations, at least
temporarily, to conduct speed testing. For example, a carrier may be
required to deploy and offer 100/20 Mbps service, but only 5 of its 550
subscribers at CAF-supported locations take 100/20 Mbps service, with
the remainder taking 20/20 Mbps service. To satisfy its testing
obligations, the carrier would be required to (1) test all 5 of the
100/20 Mbps subscribers and (2) randomly select 45 of its other CAF-
supported subscribers, raise those subscribers' speed to 100/20 Mbps,
at least temporarily, and test those 45 subscribers.
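    For illustration, the 80/80 standard described above can be checked as in
the following Python sketch; the function and the measurement list are
hypothetical and not part of the Order.

    def meets_80_80_standard(measurements_mbps, required_mbps):
        # True if at least 80 percent of the speed measurements are at or
        # above 80 percent of the CAF-required speed.
        threshold = 0.8 * required_mbps
        passing = sum(1 for m in measurements_mbps if m >= threshold)
        return passing >= 0.8 * len(measurements_mbps)

    # Example: for a 10/1 Mbps obligation, 80 percent of download measurements
    # must be at or above 8 Mbps and 80 percent of uploads at or above 0.8 Mbps.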
31. The Bureaus and OET believe that this standard best meets the
statutory requirement to ensure that high-cost-supported broadband
deployments provide service reasonably comparable to that available in
urban areas. The most recent MBA report cites the 80/80 standard as a
``key measure'' of network consistency. MBA data show that all fixed
terrestrial broadband technologies that are included in the MBA program
can meet this standard. The Bureaus and OET are confident that high-
cost support recipients' newer fixed broadband deployments will benefit
from more up-to-date technologies and network designs that should
provide even better performance.
32. Further, the Bureaus and OET expect that a realistic 80/80
standard will provide a ``cushion'' to address certain testing issues.
The Bureaus and OET noted in this document that some commenters
expressed concern that they would be responsible for testing to an IXP
even though that involved the use of backhaul that a provider may not
control. The Bureaus and OET believe that the 80/80 standard allows
sufficient leeway to providers so that they will meet performance
standards as long as they have reasonable backhaul arrangements. In
addition, commenters have raised a concern that speed testing could
possibly show misleadingly low results if the subscriber being tested
is using the connection at the time of the testing. However, the
testing methodology addresses this concern. As with the MBA, the
Bureaus and OET allow rescheduling of testing in instances where the
customer usage exceeds MBA cross-talk thresholds. Thus, the Bureaus and
OET do not anticipate that customer cross-talk will affect CAF
performance data any more (or less) than the MBA program data on which
the standard is based. Customer usage should not prevent carriers with
appropriately constructed networks from meeting these requirements.
33. The Bureaus and OET find that a speed standard similar to what
they have adopted for latency to measure broadband speed performance,
as proposed by ADTRAN, is not appropriate. Staff analysis has found
that this standard would not ensure CAF-supported service that is
comparable to that in urban areas. The 2016 MBA Report stated that
``[c]onsistency of speed may be more important to customers who are
heavy users of applications that are both high bandwidth and sensitive
to short duration declines in actual speed, such as streaming video.''
A speed standard relying on an average or median value would not ensure
consistency of speed because the distribution of values around the
median may vary significantly. A carrier could meet such a standard by
ensuring that the average or median speed test meets a target speed,
while not providing sufficiently fast service nearly half the time or
to nearly half its subscribers in locations supported by universal
service. The Bureaus and OET therefore conclude that the 80/80 standard
they adopt herein is a better measure of comparability and high-quality
service.
34. Finally, the Bureaus and OET recognize that, because of
technical limitations, it is currently unrealistic to expect that
providers obligated to provide gigabit service, i.e., speeds of 1,000
Mbps, achieve actual speeds of 1,000 Mbps download at the customer
premises. Typical customer premises equipment, including equipment for
gigabit subscribers, permits a maximum throughput of 1 Gbps, and the
overhead associated with gigabit internet traffic (whether in urban or
rural areas) can reach up to 60 Mbps out of the theoretical 1 Gbps.
Customer premises equipment with higher maximum throughput are
generally more costly and not readily available. Thus, even if a
gigabit provider were to ``overprovision'' its gigabit service, the
subscriber would not experience speeds of 1,000 Mbps. The Bureaus and
OET do not want to discourage carriers from bidding in the upcoming CAF
auction to provide 1 Gbps service by requiring unachievable service
levels. The Bureaus and OET note that the 80/80 standard they adopt
requires gigabit carriers to demonstrate that 80 percent of their
testing hours download speed tests are at or above 80 percent of 1,000
Mbps, i.e., 800 Mbps. This standard
[[Page 42058]]
should not pose a barrier to carriers bidding to provide 1 Gbps
service.
35. Consistent with the Commission's universal service goals, the
Bureaus and OET adopt a compliance framework that encourages ETCs to
comply fully with their performance obligations and includes the
potential for USAC to audit test results. The Bureaus and OET establish
a four-level framework that sets forth particular obligations and
automatic triggers based on an ETC's degree of compliance with its
latency, speed, and, if applicable, MOS testing standards in each state
and high-cost support program. The Bureaus and OET will determine a
carrier's compliance for each standard separately. In each case, the
Bureaus and OET will divide the percentage of its measurements meeting
the relevant standard by the required percentage of measurements to be
in full compliance.
36. In other words, for latency, in each state in which the carrier
has CAF-supported locations, the Bureaus and OET will calculate the
percentage of compliance using the 95-percent standard, so they will
divide the percentage of the carrier's testing hours' latency
measurements at or below the required latency (i.e., 100 ms or 750 ms)
by 95. As an example, if a low-latency provider observes that 90
percent of all its testing hours measurements are at or below 100 ms,
then that provider's latency compliance percentage would be 90/95 =
94.7 percent in that state. For speed, for each speed tier and state
the Bureaus and OET will calculate the percentage of compliance
relative to the 80-percent-based standard, so they will divide the
percentage of the carrier's testing hours speed measurements at or
above 80 percent of the target speed by 80. Thus, if a provider
observes that 65 percent of its testing hours speed measurements meet
80 percent of the required speed, the provider's compliance percentage
would be 65/80 = 81.25 percent for the relevant speed tier in that
state. Carriers must include and submit the results from all tests and
cannot exclude any tests conducted beyond the minimum numbers of tests,
as outlined in this Order, for the calculation of latency and speed
compliance percentages.
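    A minimal sketch of the latency and speed compliance-percentage
calculations described above; USAC, not carriers, will perform these
calculations, and the function names are illustrative only.

    def latency_compliance_pct(latency_ms, required_ms=100.0):
        # Share of testing-hours measurements at or below the required latency
        # (100 ms, or 750 ms for high-latency bidders), divided by the
        # 95 percent full-compliance standard.
        meeting = 100.0 * sum(1 for m in latency_ms if m <= required_ms) / len(latency_ms)
        return meeting / 95.0 * 100.0

    def speed_compliance_pct(speeds_mbps, required_mbps):
        # Share of measurements at or above 80 percent of the required speed,
        # divided by the 80 percent full-compliance standard.
        meeting = 100.0 * sum(1 for m in speeds_mbps if m >= 0.8 * required_mbps) / len(speeds_mbps)
        return meeting / 80.0 * 100.0

    # From the text: 90 percent of latency measurements at or below 100 ms
    # yields 90/95 = 94.7 percent; 65 percent of speed measurements meeting
    # the 80 percent target yields 65/80 = 81.25 percent.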
37. For MOS testing, the high-latency bidder must demonstrate a MOS
of 4 or higher, so a high-latency bidder would calculate its percentage
of compliance relative to 4. Thus, a provider demonstrating a MOS of 3
would have a compliance percentage of \3/4\ = 75 percent. For a high-
latency bidder conducting MOS testing across its entire network, rather
than state-by-state, the Bureaus and OET will calculate the same MOS
compliance percentage for each state that it serves with CAF Phase II
support.
38. To avoid penalizing a provider for failing to meet multiple
standards for the same locations, the Bureaus and OET adopt a
streamlined compliance framework in which the lowest of a carrier's
separate latency, speed, and, if applicable, MOS compliance percentages
(including percentages for each speed tier) determines its obligations.
All carriers not fully compliant in a particular state must submit
quarterly reports providing one week of testing hours test results,
subject to the same requirements the Bureaus and OET establish in this
Order, and describing steps taken to resolve the compliance gap, and
USAC will withhold a percentage of a non-compliant carrier's monthly
support. Whenever a carrier in Levels 1 through 3 comes into a higher
level of compliance, that level's requirements will apply, and USAC
will return the withheld support up to an amount reflecting the
difference between the levels' required withholding but not including
any support withheld by USAC for more than 12 months.
39. The Bureaus and OET define Level 1 compliance to include
carriers with compliance percentages at or above 85 but below 100
percent, and they direct USAC to withhold 5 percent of a Level 1-
compliant carrier's monthly support. Level 2 compliance includes
carriers with compliance percentages at or above 70 but below 85
percent, and the Bureaus and OET direct USAC to withhold 10 percent of
a Level 2-compliant carrier's monthly support. Level 3 compliance
includes carriers with compliance percentages at or above 55 but below
70 percent, and the Bureaus and OET direct USAC to withhold 15 percent
of a Level 3-compliant carrier's monthly support. Level 4 compliance
includes carriers with compliance percentages below 55 percent, and the
Bureaus and OET direct USAC to withhold 25 percent of a Level 4-
compliant carrier's monthly support. The Bureaus and OET will also
refer Level 4-compliant carriers to USAC for an investigation into the
extent to which the carrier has actually deployed broadband in
accordance with its deployment obligations. The following table
provides a summary of the compliance framework, where x is the
carrier's compliance percentage:
Compliance Levels and Support Reductions
----------------------------------------------------------------------------------------------------------------
Monthly support
Qualifying compliance Required quarterly reporting withheld
percentage x (percent)
----------------------------------------------------------------------------------------------------------------
Full Compliance.................... x >= 100%.................. No.......................... N/A
Level 1............................ 85% <= x < 100%............ Yes......................... 5
Level 2............................ 70% <= x < 85%............. Yes......................... 10
Level 3............................ 55% <= x < 70%............. Yes......................... 15
Level 4............................ x < 55%.................... Yes......................... 25
----------------------------------------------------------------------------------------------------------------
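The thresholds in the table can be restated, for illustration only, as the sketch below. The mapping mirrors the table; the use of level 0 to denote full compliance is a convention of the sketch, not a term used in this Order.

    def compliance_level(x):
        # Maps a compliance percentage x to (level, percent of monthly support
        # withheld, whether quarterly reporting is required), per the table above.
        if x >= 100:
            return 0, 0, False    # full compliance: no withholding, no quarterly reports
        if x >= 85:
            return 1, 5, True
        if x >= 70:
            return 2, 10, True
        if x >= 55:
            return 3, 15, True
        # Level 4 also triggers referral to USAC for a deployment investigation.
        return 4, 25, True

    print(compliance_level(81.25))   # (2, 10, True)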
40. Similar to commenters' proposals, the framework the Bureaus and
OET adopt resembles the non-compliance framework for interim deployment
milestones in section 54.320(d) of the Commission's rules. The Bureaus
and OET emphasize that the goal of this compliance framework is to
provide incentives rather than to penalize. Balancing commenters'
concerns regarding the severity or leniency of such a framework, the
Bureaus and OET conclude that the framework appropriately encourages
carriers to come into full compliance and to offer, in areas requiring
high-cost support, broadband service meeting standards consistent with
what consumers typically experience.
41. Finally, the Bureaus and OET provide one exception to this non-
compliance framework. As discussed in this document, carriers that
serve 50 or fewer subscribers in a state and particular service tier
but cannot find five active subscribers for conducting the required
testing may test non-CAF-supported active subscriber locations to the
extent necessary. Because those carriers' test results would not solely
reflect the performance of CAF-supported locations, any such carriers
not fully complying with the Bureaus' and OET's latency and speed
standards will be referred to USAC for
further investigation of the level of performance at the CAF-supported
locations.
42. The Commission requires that providers subject to these testing
requirements annually certify and report the results to USAC, which may
audit the test results. To facilitate compliance monitoring, the
Bureaus and OET require providers to submit speed and latency test
results, including the technologies used to provide broadband at the
tested locations, for each state and speed tier combination in addition
to an annual certification in a format to be determined by WCB; high-
latency bidders conducting MOS testing across their entire networks,
rather than state-by-state, may submit and certify MOS test results on
a nationwide basis. To minimize the burden on providers, USAC will
calculate the compliance percentages required using the data submitted.
By requiring carriers to submit test results annually, or quarterly if
they are not fully in compliance with the Bureaus' and OET's standards,
and having USAC perform the compliance calculations, the Bureaus and
OET minimize the potential for any manipulation or gaming of the
testing regime, as providers will be required to certify to a set of
specific results rather than to a general level of compliance. Because
of the need to develop a mechanism for collecting the testing data and
to obtain Paperwork Reduction Act (PRA) approval, carriers will be
required to submit the first set of testing data and accompanying
certification by July 1, 2020. This submission should include data for
at least the third and fourth quarters of 2019. Subsequently, data and
certifications will be due by July 1 of each year for the preceding
calendar year. WCB will provide further guidance by public notice
regarding how carriers will submit their testing data and
certifications. Together with USAC audits and possible withholding of
support, the Bureaus and OET believe these measures will provide ample
incentives for carriers to comply with their obligations.
V. Procedural Matters
A. Paperwork Reduction Act
43. This Order contains new or modified information collection
requirements subject to the Paperwork Reduction Act of 1995 (PRA),
Public Law 104-13. It will be submitted to the Office of Management and
Budget (OMB) for review under section 3507(d) of the PRA. OMB, the
general public, and other Federal agencies will be invited to comment
on the new or modified information collection requirements contained in
this proceeding. In addition, the Commission notes that pursuant to the
Small Business Paperwork Relief Act of 2002, Public Law 107-198, see 44
U.S.C. 3506(c)(4), it previously sought specific comment on how the
Commission might further reduce the information collection burden for
small business concerns with fewer than 25 employees. In this present
document, the Commission has assessed the effects of the new and
modified rules that might impose information collection burdens on
small business concerns, and finds that they either will not have a
significant economic impact on a substantial number of small entities
or will have a minimal economic impact on a substantial number of small
entities.
B. Congressional Review Act
44. The Commission will send a copy of this Order to Congress and
the Government Accountability Office pursuant to the Congressional
Review Act, see 5 U.S.C. 801(a)(1)(A).
45. As required by the Regulatory Flexibility Act of 1980 (RFA), as
amended, an Initial Regulatory Flexibility Analysis (IRFA) was
incorporated in the USF/ICC Transformation FNPRM, 76 FR 78384, December
16, 2011. The Commission sought written public comment on the proposals
in the USF/ICC Transformation FNPRM, including comment on the IRFA. The
Commission did not receive any relevant comments on the USF/ICC
Transformation FNPRM IRFA. This present Final Regulatory Flexibility
Analysis (FRFA) conforms to the RFA.
46. As a condition of receiving high-cost universal service
support, eligible telecommunications carriers (ETCs) must offer
broadband service in their supported areas that meets certain basic
performance requirements. ETCs subject to broadband performance
obligations must currently offer broadband with latency suitable for
real-time applications, such as VoIP, and meet a minimum speed standard
of 10 Mbps downstream and 1 Mbps upstream or greater. Recipients of
high-cost support must also test their broadband networks for
compliance with speed and latency metrics and certify and report the
results to the Universal Service Administrative Company (USAC) and the
relevant state or tribal government on an annual basis, with those
results subject to audit.
47. In the Order, the Bureaus and OET define how ETCs with Connect
America Fund (CAF) Phase II, Alternative Connect America Cost Model (A-
CAM), rate-of-return mandatory buildout, rural broadband experiment
(RBE), or Alaska Plan obligations must test speed and latency and
certify and report the results. Specifically, the Bureaus and OET
establish a uniform framework for measuring speed and latency
performance. The Bureaus and OET permit three testing methods as
options for ETCs to conduct the required speed and latency tests, and
the Bureaus and OET provide a definition for a ``test'' in this context
and specify the measurement span associated with these tests. The
Bureaus and OET establish specific test parameters for latency and
speed, including how often and how many tests must be conducted and the
minimum test sample size. The Bureaus and OET also establish voice
testing requirements for high-latency bidders in the CAF Phase II
auction. Finally, the Bureaus and OET define compliance for latency and
speed standards and establish the required certifications, as well as a
compliance framework providing strong incentives for ETCs to meet those
standards.
48. With the testing framework the Bureaus and OET have adopted
herein, they have provided maximum flexibility to reduce the burden on
smaller entities, consistent with ensuring that these carriers are
meeting their latency and speed requirements. Smaller entities required
to do testing can choose from one of three methodologies to conduct the
required testing. All entities providing broadband service should
already use testing mechanisms for internal purposes, such as ensuring
that customers are receiving the appropriate level of service and
troubleshooting in response to customer complaints. In addition, the
Bureaus and OET will be providing an online portal so entities can
easily submit all of their test results electronically and USAC will do
all of the necessary compliance calculations.
49. The RFA directs agencies to provide a description of, and where
feasible, an estimate of the number of small entities that may be
affected by the proposed rules, if adopted. The RFA generally defines
the term ``small entity'' as having the same meaning as the terms
``small business,'' ``small organization,'' and ``small governmental
jurisdiction.'' In addition, the term ``small business'' has the same
meaning as the term ``small-business concern'' under the Small Business
Act. A ``small-business concern'' is one which: (1) Is independently
owned and operated; (2) is not dominant in its field of operation; and
(3) satisfies any additional criteria established by the Small Business
Administration (SBA).
50. The Bureaus' and OET's actions, over time, may affect small
entities that are not easily categorized at present. The Bureaus and
OET therefore describe here, at the outset, three broad groups of small
entities that could be directly affected herein. First, while there are
industry specific size standards for small businesses that are used in
the regulatory flexibility analysis, according to data from the SBA's
Office of Advocacy, in general a small business is an independent
business having fewer than 500 employees. These types of small
businesses represent 99.9 percent of all businesses in the United
States, which translates to 28.8 million businesses.
51. Next, the type of small entity described as a ``small
organization'' is generally ``any not-for-profit enterprise which is
independently owned and operated and is not dominant in its field.''
Nationwide, as of August 2016, there were approximately 356,494 small
organizations based on registration and tax data filed by nonprofits
with the Internal Revenue Service (IRS).
52. Finally, the small entity described as a ``small governmental
jurisdiction'' is defined generally as ``governments of cities,
counties, towns, townships, villages, school districts, or special
districts, with a population of less than fifty thousand.'' U.S. Census
Bureau data from the 2012 Census of Governments indicates that there
were 90,056 local governmental jurisdictions consisting of general
purpose governments and special purpose governments in the United
States. Of this number, there were 37,132 general purpose governments
(county, municipal, and town or township) with populations of less than
50,000 and 12,184 special purpose governments (independent school
districts and special districts) with populations of less than 50,000.
The 2012 U.S. Census Bureau data for most types of governments in the
local government category shows that the majority of these governments
have populations of less than 50,000. Based on this data, the Bureaus
and OET estimate that at least 49,316 local government jurisdictions
fall in the category of ``small governmental jurisdictions.''
53. In the Order, the Bureaus and OET establish for high-cost
support recipients serving fixed locations a uniform framework for
measuring speed and latency performance and define the requisite
standards for full compliance with those providers' speed and latency
obligations. The Commission's existing rules require that high-cost
recipients report ``[t]he results of network performance tests pursuant
to the methodology and in the format determined by the Wireline
Competition Bureau, Wireless Telecommunications Bureau, and the Office
of Engineering and Technology'' and that ETCs retain such records for
at least ten years from the receipt of funding.
54. The Bureaus and OET now elaborate on this requirement;
they require providers to submit speed and latency test results,
including the technologies used to provide broadband at the tested
locations, for each state and speed tier combination in addition to an
annual certification in a format to be determined by WCB. High-latency
bidders conducting mean opinion score (MOS) testing across their entire
networks, rather than state-by-state, may submit and certify MOS test
results on a nationwide basis. To minimize the burden on providers,
USAC will calculate the compliance percentages required using the data
submitted. By requiring carriers to submit test results annually and
having USAC perform the compliance calculations, the Bureaus and OET
minimize the potential for any manipulation or gaming of the testing
regime, as providers will be required to certify to a set of specific
results rather than to a general level of compliance. However,
providers that are not fully compliant with the speed and latency
standards must submit quarterly reports including one week of test
results and describing steps taken to resolve the compliance gap.
55. The RFA requires an agency to describe any significant
alternatives that it has considered in reaching its proposed approach,
which may include (among others) the following four alternatives: (1)
The establishment of differing compliance or reporting requirements or
timetables that take into account the resources available to small
entities; (2) the clarification, consolidation, or simplification of
compliance or reporting requirements under the rule for small entities;
(3) the use of performance, rather than design, standards; and (4) an
exemption from coverage of the rule, or any part thereof, for small
entities. The Bureaus and OET have considered all of these factors
subsequent to receiving substantive comments from the public and
potentially affected entities. The Wireline Competition Bureau,
Wireless Telecommunications Bureau, and Office of Engineering and
Technology have considered the economic impact on small entities, as
identified in any comments filed in response to the USF/ICC Transformation
FNPRM and IRFA, in reaching their final conclusions and taking action in
this proceeding.
56. In the Order, the Bureaus and OET adopt a clear, uniform
framework for high-cost support recipients serving fixed locations to
test speed and latency to meet the obligations associated with the
support they receive. The requirements the Bureaus and OET adopt
provide flexibility for carriers to choose between different testing
methods suitable for carriers of different sizes and technological
sophistication. Instead of requiring providers to invest in and
implement new internal systems, the Bureaus and OET permit providers to
perform speed and latency tests with readily available off-the-shelf
solutions or existing MBA infrastructure. The Bureaus and OET expect
that carriers with testing features built into customer premises
equipment for their own network management purposes may prefer using
their own self-testing systems, which they also permit.
57. The Bureaus and OET require that carriers, regardless of their
preferred testing methods, conduct tests using the same parameters they
establish. These parameters take into account smaller carriers'
circumstances to avoid disproportionately burdening them. For example,
the Bureaus and OET expand the list of locations to which carriers may
conduct required tests--allowing smaller carriers that are farther from
the largest metropolitan areas to test speed and latency over shorter
distances. The Bureaus and OET also permit providers to conduct tests
to the designated area of their choosing, rather than to the nearest
designated metropolitan area. Further, carriers with fewer subscribers
in a state and broadband service tier may test fewer locations. Greater
percentages of subscribers are necessary to achieve the same margin of
error and confidence level in smaller sample sizes, but the Bureaus and
OET recognize that, below 450 subscribers, that necessary percentage
rises quickly above 10 percent. Accordingly, in the Order, the Bureaus
and OET allow providers with between 51 and 450 subscribers in a
particular state and service tier combination to test 10 percent of
total subscribers. The Bureaus and OET require providers with 50 or
fewer subscribers in a particular state and service tier combination
to test five locations, but, to the extent necessary, those carriers
may test existing, non-CAF-supported active subscriber locations to
satisfy that requirement.
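For illustration, the subscriber-count brackets described in this paragraph can be sketched as follows. The function name is hypothetical, the rounding convention shown is an assumption of the sketch, and counts above 450 subscribers are governed by a rule stated elsewhere in this Order and are not restated here.

    def required_test_locations(subscriber_count):
        # Number of test locations for a given state and service tier combination.
        if subscriber_count <= 50:
            # Five locations; non-CAF-supported active subscriber locations may be
            # used to the extent necessary.
            return 5
        if subscriber_count <= 450:
            # Ten percent of total subscribers (rounded up here as an illustrative
            # assumption).
            return -(-subscriber_count // 10)
        raise ValueError("counts above 450 subscribers are outside this sketch")

    print(required_test_locations(40))    # 5
    print(required_test_locations(123))   # 13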
58. Finally, the Bureaus and OET provide clarity regarding the
Commission's existing requirement that carriers must report the results
of network performance tests. Carriers must annually (or, in some
cases,
quarterly) submit detailed results of the required tests, conducted
pursuant to the parameters the Bureaus and OET establish. The Bureaus
and OET hold all carriers to the same speed and latency test standards,
but they recognize that requiring carriers to take the additional step
of using their test results to determine their level of compliance may
entail unnecessary burdens. Although the Bureaus and OET anticipate
that carriers will find the adopted compliance framework
straightforward, they conclude that requiring submission of the actual
test results and allowing USAC to calculate the compliance percentages
lessens the burden on small entities even further.
VI. Ordering Clauses
59. Accordingly, it is ordered that, pursuant to sections 1, 4(i),
5(c), 201(b), 214, and 254 of the Communications Act of 1934, as
amended, and section 706 of the Telecommunications Act of 1996, 47
U.S.C. 151, 154(i), 155(c), 201(b), 214, 254, 1302, §§ 0.91 and
0.291 of the Commission's rules, 47 CFR 0.91, 0.291, and the
delegations of authority in paragraph 170 of the USF/ICC Transformation
Order, FCC 11-161, this Order is adopted, effective thirty (30) days
after publication of the text or summary thereof in the Federal
Register, except for the requirements in paragraphs 38 and 42 that are
subject to the PRA, which will become effective upon announcement in
the Federal Register of OMB approval of the subject information
collection requirements.
Federal Communications Commission.
Kris A. Monteith,
Chief, Wireline Competition Bureau.
[FR Doc. 2018-17338 Filed 8-17-18; 8:45 am]
BILLING CODE 6712-01-P