[Federal Register Volume 79, Number 224 (Thursday, November 20, 2014)]
[Proposed Rules]
[Pages 69091-69095]
From the Federal Register Online via the Government Printing Office [www.gpo.gov]
[FR Doc No: 2014-27429]
=======================================================================
-----------------------------------------------------------------------
FEDERAL COMMUNICATIONS COMMISSION
47 CFR Part 54
[WC Docket No. 10-90; DA 14-1499]
Proposed Methodology for Connect America High-Cost Universal
Service Support Recipients To Measure and Report Speed and Latency
Performance to Fixed Locations
AGENCY: Federal Communications Commission.
ACTION: Proposed rule.
-----------------------------------------------------------------------
SUMMARY: In this document, the Wireline Competition Bureau, the
Wireless Telecommunications Bureau, and the Office of Engineering and
Technology seek to further develop the record on how compliance with
speed obligations should be determined for recipients of high-cost
support that deploy broadband networks to serve fixed locations.
DATES: Comments due December 22, 2014.
ADDRESSES: Interested parties may file comments on or before December
22, 2014. All pleadings are to reference WC Docket No. 10-90. Comments
may be filed using the Commission's Electronic Comment Filing System
(ECFS) or by filing paper copies, by any of the following methods:
• Electronic Filers: Comments may be filed electronically
using the Internet by accessing the ECFS: https://fjallfoss.fcc.gov/ecfs2/.
• Paper Filers: Parties who choose to file by paper must
file an original and one copy of each filing.
• People with Disabilities: To request materials in
accessible formats for people with disabilities (Braille, large print,
electronic files, audio format), send an email to fcc504@fcc.gov or
call the Consumer & Governmental Affairs Bureau at (202) 418-0530
(voice), (202) 418-0432 (tty).
For detailed instructions for submitting comments and additional
information on the rulemaking process, see the SUPPLEMENTARY
INFORMATION section of this document.
FOR FURTHER INFORMATION CONTACT: Alexander Minard, Wireline Competition
Bureau at (202) 418-7400 or TTY (202) 418-0484.
SUPPLEMENTARY INFORMATION: This is a synopsis of the Wireline
Competition Bureau's Public Notice (Notice) in WC Docket No. 10-90; DA
14-1499, released October 16, 2014. The complete text of this document
is available for inspection and copying during normal business hours in
the FCC Reference Information Center, Portals II, 445 12th Street SW.,
Room CY-A257, Washington, DC 20554. The document may also be purchased
from the Commission's duplicating contractor, Best Copy and Printing,
Inc., 445 12th Street SW., Room CY-B402, Washington, DC 20554,
telephone (800) 378-3160 or (202) 863-2893, facsimile (202) 863-2898,
or via Internet at https://www.bcpiweb.com.
I. Introduction
1. In this document, the Wireline Competition Bureau, the Wireless
Telecommunications Bureau, and the Office of Engineering and Technology
(together, the Bureaus) seek to further develop the record on how
compliance with speed (also referred to as bandwidth) obligations
should be determined for recipients of high-cost support that deploy
broadband networks to serve fixed locations. In addition, the Bureaus
seek comment on whether the same testing methodologies adopted for
price cap carriers accepting model-based Phase II support should be
applied to other recipients of support to serve fixed locations, such
as rate-of-return providers and those that are awarded Connect America
support through a competitive bidding process. Finally, the Bureaus
seek comment on the circumstances that would trigger an audit of the
speed and latency metrics.
II. Measuring Compliance With Service Obligations
A. Speed Performance Measurement
2. The record received in response to the 2011 USF/ICC
Transformation Order and Further Notice of Proposed Rulemaking (FNPRM), 76 FR
73830, November 29, 2011 and 76 FR 78384, December 16, 2011, on the
methodology to be implemented for testing compliance with service
obligations was not well developed. The Bureaus now seek to refresh the
record on the methodology to be used for demonstrating compliance with
the speed obligation for eligible telecommunications carriers (ETCs) that receive high-cost support to deploy
broadband networks to fixed locations. Should internal network
management system (NMS) tools be used to measure speed performance?
Alternatively, should external measurement tools, such as Speedtest by
Ookla or the Network Diagnostic Test (NDT) by M-Lab, be used? Are there better and
more reliable methods of measuring speed?
3. Internal NMS tools vary among providers. How can the Commission
ensure that internal NMS tool measurements are valid? Will such tools
account for multiple transmission
[[Page 69092]]
control protocol (TCP) streams, TCP window sizes, TCP slow start, and
other factors in speed measurement? How would measurements from such
tools be verified? Are these types of tools too burdensome or complex
for speed measurements? Would such tools have any effect on customer
service if used during peak periods? If external testing is adopted,
how would measurements be verified? Are there better external
measurement tools than those identified above?
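To illustrate why these TCP factors matter, the following sketch (in
Python; the test URL is a placeholder assumption, not a Commission or
MBA endpoint) measures aggregate download throughput over several
parallel TCP streams, since a single stream can be limited by window
size and slow start rather than by the provisioned speed of the access
link:

    import time
    import urllib.request
    from concurrent.futures import ThreadPoolExecutor

    TEST_URL = "https://speedtest.example.net/100MB.bin"  # hypothetical endpoint
    STREAMS = 4  # several parallel TCP streams help fill the access link

    def fetch_bytes(url):
        """Download the resource once and return the number of bytes received."""
        total = 0
        with urllib.request.urlopen(url) as resp:
            while True:
                chunk = resp.read(64 * 1024)
                if not chunk:
                    return total
                total += len(chunk)

    start = time.monotonic()
    with ThreadPoolExecutor(max_workers=STREAMS) as pool:
        byte_counts = list(pool.map(fetch_bytes, [TEST_URL] * STREAMS))
    elapsed = time.monotonic() - start

    mbps = sum(byte_counts) * 8 / elapsed / 1e6
    print(f"{STREAMS} streams over {elapsed:.1f} s: {mbps:.1f} Mbps aggregate")

A single-stream variant of the same measurement will often report a
lower figure on high-bandwidth, high-latency paths, which is one reason
the number of concurrent streams would need to be fixed as a testing
parameter.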
4. What testing parameters should be used for speed testing? Should
they be different for internal and external testing?
5. What testing parameters should be used to measure broadband
performance for wireless providers offering service at a given address?
Should the testing parameters be different if the service utilizes a
fixed attachment to the building?
6. The Bureaus propose to require all ETCs subject to broadband
performance obligations to serve fixed locations to utilize testing
parameters for speed similar to those already adopted for latency for
price cap carriers. Specifically, the Bureaus propose to adopt a
methodology that would require measurements to be made once hourly
during peak periods, 7:00 p.m. to 11:00 p.m. daily local time, over
four consecutive weeks, require 95 percent of the observations to be at
or above the specified minimum speed, define the endpoints for the
measurement as the customer premises to Commission-designated Internet
exchange point (IXP) locations, require testing to occur at least
annually, and require a minimum of 50 randomly selected customer
locations to be tested within
the geographic area being funded in a given state. To the extent
parties argue that the process adopted for latency testing be adjusted
and used for speed testing, they should describe with specificity what
changes should be made. The Bureaus also seek comment on whether the
data usage in the proposed tests would have a significant effect on
consumers and, if so, how such effects could be mitigated. Should any
data caps or monthly usage limits be adjusted to prevent the testing
from affecting consumers?
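As a minimal sketch of the compliance arithmetic proposed above (the 10
Mbps floor is an assumed obligation used only for illustration; actual
minimums depend on the ETC's obligation), a location's peak-period
observations could be evaluated as follows:

    # Samples are (hour_of_day, measured_mbps) pairs collected hourly
    # between 7:00 p.m. and 11:00 p.m. local time over four consecutive weeks.
    MIN_SPEED_MBPS = 10.0          # assumed minimum; set by the ETC's obligation
    PEAK_HOURS = {19, 20, 21, 22}  # hourly samples within the 7-11 p.m. window
    REQUIRED_FRACTION = 0.95

    def location_compliant(samples):
        """True if at least 95 percent of peak-period observations are at
        or above the specified minimum speed."""
        peak = [mbps for hour, mbps in samples if hour in PEAK_HOURS]
        if not peak:
            return False  # without peak-period data, compliance is not shown
        passing = sum(1 for mbps in peak if mbps >= MIN_SPEED_MBPS)
        return passing / len(peak) >= REQUIRED_FRACTION

Under the proposal, this check would be applied, at least annually, to a
minimum of 50 randomly selected customer locations within the funded
area in a given state.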
7. The Bureaus propose to allow ETCs, including but not limited to
price cap carriers, the option of testing compliance with speed
requirements through the Measuring Broadband America (MBA) program, similar to what the Wireline Competition Bureau (WCB) adopted for
latency obligations. If the Bureaus were to do so, could they apply the
same conditions and parameters as adopted for latency testing? Would
any changes be needed?
8. Should the testing options and parameters be the same for rate-
of-return carriers and providers awarded support through the Phase II
competitive bidding process as for price cap carriers? If not, what
should they be and why?
9. The Bureaus seek to augment the record received in response to
the 2011 USF/ICC Transformation Order and FNPRM based on the
considerations outlined above. Specifically, parties such as AT&T and
Alaska Communications Systems argued that the testing mechanism should
not require measuring service at all end-user locations. A testing
mechanism for speed similar to that adopted for latency would only
require testing at a certain number of locations. Frontier advocated
that the Commission provide a choice of measurement test options. A
speed-testing mechanism similar to that adopted for latency would
provide two options for testing. A number of rural associations stated
that the Commission should not impose measurement requirements until
technically feasible, less burdensome testing procedures were
available. A speed testing mechanism similar to that adopted for
latency should be easily manageable for even very small carriers. The
Bureaus seek comment on these tentative conclusions.
B. Latency Performance Testing for Rate-of-Return Carriers and
Providers Awarded Connect America Support Through Competitive Bidding
10. The Bureaus seek comment on whether the two methods adopted to
test price cap carrier compliance with latency service obligations
should also be used to test compliance with latency service obligations
for other recipients of high-cost support with a broadband public
interest obligation to serve fixed locations. If so, should the testing
parameters be the same for rate-of-return providers and those that are
awarded Phase II support through a competitive bidding process as
adopted for price cap carriers? If not, what should those parameters be
and why?
11. The latency-testing options adopted for price cap carriers
should provide at least one readily achievable method suitable for
small, rural carriers. The Bureaus seek comment on this tentative
conclusion. In response to the 2011 USF/ICC Transformation FNPRM, rural
carriers argued that broadband performance should only be measured for
those portions of the network controlled by the provider or its
commonly-controlled affiliates. The Bureaus note that in the Phase II
Price Cap Order, 78 FR 70881, November 27, 2013, WCB rejected this
argument for price cap carriers because (1) testing only part of the
network will not demonstrate the quality of service being provided to
the end user and (2) carriers have a number of options to influence the
quality of service from their transit and/or peering providers. Would
that same reasoning be applicable to other providers, such as rate-of-
return carriers and non-traditional providers that may receive support
through a competitive bidding process?
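For context, a latency probe of the kind contemplated here can be
approximated end to end with a short sketch. TCP connection setup time
is used below as a rough proxy for round-trip time, since raw ICMP ping
requires elevated privileges, and the target host is a placeholder
standing in for a Commission-designated IXP endpoint:

    import socket
    import time
    from statistics import median

    def tcp_rtt_ms(host, port=443, samples=10):
        """Median TCP connection setup time in milliseconds, a rough proxy
        for round-trip latency from the customer side to the endpoint."""
        rtts = []
        for _ in range(samples):
            start = time.monotonic()
            with socket.create_connection((host, port), timeout=5):
                rtts.append((time.monotonic() - start) * 1000)
        return median(rtts)

    print(f"median RTT: {tcp_rtt_ms('ixp-test.example.net'):.1f} ms")  # placeholder host

Because such a probe runs from the customer premises to the designated
endpoint, it reflects the full path, including any transit or peering
segments outside the provider's own network, which is the point of the
end-to-end reasoning described above.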
C. Use of MBA Program for Testing and Reporting
12. The MBA program developed out of a recommendation by the
National Broadband Plan to improve the availability of information for
consumers about their broadband service. The program examines service
offerings from the largest broadband providers--which collectively
account for over 80 percent of all U.S. wireline broadband
connections--using automated, direct measurements of broadband
performance delivered to the homes of thousands of volunteer broadband
subscribers. The methodology for the program focuses on measuring
broadband performance of an Internet service provider's network,
specifically performance from the consumer Internet access point, or
consumer gateway, to a close major Internet gateway point. A
collaborative process involving Commission staff, industry
representatives, and academics was used to determine the test suite and
operations for the MBA program.
13. The MBA program uses whiteboxes deployed to individual
consumers, called panelists, to collect data on service levels. These
whiteboxes perform periodic tests to determine the speed and latency of
the service at a particular panelist's location, and the results of the
tests are automatically sent to and recorded by an independent vendor.
Panelists are selected via a process that allows for consumer
registration and verification by the service provider followed by
activation as a testing panelist. More than 13,000 whiteboxes have been
shipped since the MBA program began.
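The reporting step of such a whitebox can be sketched as follows (the
collector URL and payload fields are assumptions for illustration; the
actual vendor interface is not specified in this Notice):

    import json
    import time
    import urllib.request

    COLLECTOR_URL = "https://collector.example.org/results"  # hypothetical vendor endpoint

    def report_result(result):
        """POST one periodic test result to the independent vendor's collector."""
        req = urllib.request.Request(
            COLLECTOR_URL,
            data=json.dumps(result).encode("utf-8"),
            headers={"Content-Type": "application/json"},
        )
        with urllib.request.urlopen(req, timeout=10) as resp:
            resp.read()  # drain the acknowledgment

    # One test cycle: placeholder numbers stand in for an actual measurement.
    report_result({
        "panelist_id": "anon-0001",  # assigned when the panelist is activated
        "timestamp": time.time(),
        "download_mbps": 12.3,
        "latency_ms": 45.0,
    })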
14. Currently, the MBA program tests wireline offerings of 15 large
broadband providers and one satellite-based provider. If the Bureaus
were to adopt a regime in which ETCs subject to broadband public
interest obligations could demonstrate compliance with broadband
testing requirements through their MBA results, would that encourage
additional providers, including smaller providers, to seek to join the
MBA? Could the MBA accommodate a large
[[Page 69093]]
number of additional participants? Is it feasible for smaller providers
to participate in the MBA, particularly if they must pay the
administrative and hardware costs of the whiteboxes? Are these costs
likely to be greater or less than the cost of performing ping-type
tests from 50 locations for latency and the testing that will be
required to verify speed? Would allowing additional providers to join
the MBA provide more detailed and more accurate information on provider
performance at lower total cost?
15. If additional providers join the MBA program for performance
testing, should their data be made public and reported in the annual
MBA reports as is done for other MBA providers? Should the MBA program
consider creating a separate category of membership for providers that
want to limit testing to Connect America-supported areas?
16. The Bureaus seek comment on these and any other issues
surrounding additional provider participation in the MBA program.
D. Commission-Developed Testing Mechanism
17. In the event that joining the MBA program proves infeasible for
additional providers, the Bureaus seek comment on whether the
Commission should implement a performance testing platform specifically
for Connect America-supported broadband services. One possibility is to
implement an oversight mechanism that would be similar to the MBA
program. Like the MBA program, this could be a hardware-based test
infrastructure administered by one or more service vendors with
whiteboxes deployed to consumers throughout Connect America-supported
areas. Having a single entity, such as the Universal Service Administrative Company (USAC), procure the necessary
vendor and infrastructure to administer this program would minimize the
overall cost of the program as well as the costs to participating
providers. The Bureaus seek comment on whether such a program would be
feasible. If so, should it be similar to the MBA program, or is there a
better way to measure broadband performance?
18. If the Commission were to implement such a testing mechanism,
should all ETCs subject to broadband public interest obligations to
serve fixed locations be required to participate? To the extent
commenters argue that any ETCs should be exempt, they should identify
with specificity the costs and benefits of requiring them to
participate, and identify alternative means of achieving the
Commission's oversight objectives.
19. The Bureaus estimate that the total costs for an MBA-type
performance oversight program for ETCs receiving high-cost support to
serve fixed locations would be approximately $4.2 million in the first
year, which would include the necessary hardware and software as well
as an initial allocation of 5,000 whiteboxes, and approximately $3.9
million each year thereafter (which incorporates an additional 5,000
whiteboxes per year). Our total cost calculation was based on the
following estimates:
------------------------------------------------------------------------
                                           Year 1 expenses    Annual expenses
                                              (millions)       after year 1
                                                                 (millions)
------------------------------------------------------------------------
Whiteboxes (client testing devices)......        $1.2               $1
Core Servers.............................         1.7                1.65
Program Administrative Expenses (could be
 performed by USAC)......................         1.3                1.3
                                          -------------------------------------
    Total Cost...........................         4.2                3.9
------------------------------------------------------------------------
The cost estimates above are based on having a single entity contract
for the necessary hardware and services to minimize costs through
streamlined administration and bulk hardware purchases. If the
Commission were to implement such a centralized testing program, should
these costs be borne by participating providers or by USAC as part of
its oversight over the universal service fund? Should USAC pay the
costs of the core servers, with participating providers paying the
costs of the whiteboxes deployed in their service areas? If USAC were
to pay all of the equipment costs, including the whiteboxes, the
Bureaus anticipate that the only cost for providers would be primarily
to verify the services of the panelists selected in a particular
provider's service territory.
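A quick arithmetic check of the table above (all figures in millions of
dollars) confirms the first-year total and shows that the out-year line
items sum to $3.95 million, which the table's total row reports as $3.9
million:

    from decimal import Decimal

    year_1 = [Decimal("1.2"), Decimal("1.7"), Decimal("1.3")]        # whiteboxes, servers, admin
    later_years = [Decimal("1.0"), Decimal("1.65"), Decimal("1.3")]

    print(sum(year_1))       # 4.2, matching the stated first-year total
    print(sum(later_years))  # 3.95, reported in the table's total as 3.9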
20. If the Commission were to adopt such an approach, how many
whiteboxes should be deployed in each supported area? Should the number
be the same for all providers, vary based on the number of customers in
the supported area, or be based on some other calculation? Should
individual consumers or consumer groups located in areas served by a
Connect America-supported provider be allowed to participate in such an
MBA-type mechanism by purchasing their own whiteboxes? Such ``citizen
testing'' would allow interested individuals to evaluate the quality of
their services while providing additional testing data.
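For instance, one hypothetical allocation rule (not proposed in the
Notice; the floor, cap, and per-customer rate below are invented for
illustration) might scale the number of whiteboxes with the customer
count in each supported area:

    def whiteboxes_for_area(customers, floor=50, per_customer=0.005, cap=500):
        """Hypothetical sizing rule: one box per 200 customers, bounded by
        a per-area floor and cap."""
        return max(floor, min(cap, round(customers * per_customer)))

    for n in (1_000, 20_000, 150_000):
        print(n, whiteboxes_for_area(n))  # prints 50, 100, and 500 boxes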
21. The Bureaus seek comment on the initial performance measurement
test suite that should be used, if the Commission were to implement an
MBA-type testing mechanism. The MBA's current test suite includes 13
tests that measure various aspects of network performance with respect
to speed and latency and was developed on a consensus basis by
academics, regulators, and industry participants. Would the MBA's test
suite be appropriate for a Connect America testing mechanism, or
could it be modified in some fashion? What aspects of the MBA test
suite are necessary to meet the Commission's objectives that ETCs meet
their broadband public interest obligations?
22. The MBA program has found that allowing consumers with
whiteboxes (referred to as panelists) access to their testing data is
an incentive for obtaining a high number of volunteers. Should a
Commission-designed testing mechanism for high-cost recipients allow
end user participants access to their own testing data? MBA results are
currently made publicly available via the Commission's Web site.
Should the Commission publish test results? Making such data public
would allow consumers and policy makers to evaluate whether ETCs are
meeting their service obligations and allow comparisons of service
quality among providers. Is there any reason that such performance
results should be kept confidential? If so, should the results be
treated as confidential for a particular period of time?
[[Page 69094]]
III. Auditing Speed and Latency
23. In the USF/ICC Transformation Order, the Commission concluded
that the results of speed and latency metric testing ``will be subject
to audit.'' The Bureaus seek to further develop the record on
procedures for implementing this requirement for all recipients of
Connect America funding. In particular, the Bureaus seek comment on how
to incorporate this requirement into the existing Beneficiary
Compliance Audit Program (BCAP), and whether additional audits
specifically focused on broadband performance should be implemented
outside of BCAP.
24. High-cost recipients today are subject to random and for-cause
USAC audits. The Bureaus seek comment on the circumstances that would
warrant examining broadband performance for cause. In particular, what
events should trigger a for-cause audit of speed and latency metrics?
For example, failure to file a certification that service obligations
are being met or a certification that standards are not being met would
likely require an immediate audit. Similarly, because MBA results are
publicly available, should MBA test results that demonstrate a failure
to meet service obligations trigger an audit? Should consumer or other
credible complaints regarding the quality of service result in an
audit? If customer complaints are used to initiate an audit, the
Bureaus seek comment on how this should be done. Should complaints to
state/local regulatory agencies, the Commission, and/or public watchdog
organizations trigger audits? If so, how many complaints over what time
period and what type of complaints should be triggering events for a
performance audit? Should requests from local, state, or tribal
authorities be sufficient to trigger an audit? Are there other events
that should trigger an audit? Proposed audit triggers should address
both ensuring that performance standards are met and minimizing
administrative costs.
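One way to operationalize a complaint-based trigger, purely for
illustration (the complaint count and window below are hypothetical;
the Notice asks what the actual values should be), is a rolling-window
threshold:

    from datetime import date, timedelta

    WINDOW = timedelta(days=90)  # assumed rolling window
    THRESHOLD = 10               # assumed number of credible complaints

    def audit_triggered(complaint_dates, today):
        """True if enough credible complaints fall inside the rolling window."""
        recent = [d for d in complaint_dates if timedelta(0) <= today - d <= WINDOW]
        return len(recent) >= THRESHOLD

    # Example: three complaints in the recent window would not trigger an audit.
    history = [date(2014, 10, 1), date(2014, 10, 20), date(2014, 11, 10)]
    print(audit_triggered(history, date(2014, 11, 19)))  # False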
25. In addition, the Bureaus seek comment on whether a provider
whose audit demonstrates a need for ongoing monitoring should be required to
pay the costs of this additional monitoring. Should results of audits
be made publicly available? If not, what justifications support keeping
such results private and for how long?
IV. Procedural Matters
26. Initial Regulatory Flexibility Act Analysis. The USF/ICC
Transformation Order included an Initial Regulatory Flexibility
Analysis (IRFA) pursuant to 5 U.S.C. 603, exploring the potential
impact on small entities of the Commission's proposal. The Bureaus
invite parties to file comments on the IRFA in light of this additional
notice.
27. Initial Paperwork Reduction Act of 1995 Analysis. This document
seeks comment on a potential new or revised information collection
requirement. If the Commission adopts any new or revised information
collection requirement, the Commission will publish a separate notice
in the Federal Register inviting the public to comment on the
requirement, as required by the Paperwork Reduction Act of 1995, Public
Law 104-13 (44 U.S.C. 3501-3520). In addition, pursuant to the Small
Business Paperwork Relief Act of 2002, Public Law 107-198, the
Commission seeks specific comment on how it might ``further reduce the
information collection burden for small business concerns with fewer
than 25 employees.''
28. Filing Requirements. Pursuant to §§ 1.415 and 1.419 of
the Commission's rules, interested parties may file comments on or
before the dates indicated on the first page of this document. Comments
may be filed using the Commission's Electronic Comment Filing System
(ECFS).
• Electronic Filers: Comments may be filed electronically
using the Internet by accessing the ECFS: https://fjallfoss.fcc.gov/ecfs2/.
• Paper Filers: Parties who choose to file by paper must
file an original and one copy of each filing. If more than one docket
or rulemaking number appears in the caption of this proceeding, filers
must submit two additional copies for each additional docket or
rulemaking number.
29. Filings can be sent by hand or messenger delivery, by
commercial overnight courier, or by first-class or overnight U.S.
Postal Service mail. All filings must be addressed to the Commission's
Secretary, Office of the Secretary, Federal Communications Commission.
■ All hand-delivered or messenger-delivered paper filings for
the Commission's Secretary must be delivered to FCC Headquarters at 445
12th Street SW., Room TW-A325, Washington, DC 20554. The filing hours
are 8:00 a.m. to 7:00 p.m. All hand deliveries must be held together
with rubber bands or fasteners. Any envelopes and boxes must be
disposed of before entering the building.
■ Commercial overnight mail (other than U.S. Postal Service
Express Mail and Priority Mail) must be sent to 9300 East Hampton
Drive, Capitol Heights, MD 20743.
■ U.S. Postal Service first-class, Express, and Priority mail
must be addressed to 445 12th Street SW., Washington, DC 20554.
30. People with Disabilities: To request materials in accessible
formats for people with disabilities (Braille, large print, electronic
files, audio format), send an email to fcc504@fcc.gov or call the
Consumer & Governmental Affairs Bureau at (202) 418-0530 (voice), (202)
418-0432 (tty).
31. In addition, one copy of each pleading must be sent to each of
the following:
(1) Alexander Minard, Telecommunications Access Policy Division,
Wireline Competition Bureau, 445 12th Street SW., 5-B442, Washington,
DC 20554; email: alexander.minard@fcc.gov.
(2) Suzanne Yelen, Industry Analysis and Technology Division,
Wireline Competition Bureau, 445 12th Street SW., Room 6-B115,
Washington, DC 20554; email: suzanne.yelen@fcc.gov.
[[Page 69095]]
32. The proceeding shall be treated as a ``permit-but-disclose''
proceeding in accordance with the Commission's ex parte rules. Persons
making ex parte presentations must file a copy of any written
presentation or a memorandum summarizing any oral presentation within
two business days after the presentation (unless a different deadline
applicable to the Sunshine period applies). Persons making oral ex
parte presentations are reminded that memoranda summarizing the
presentation must (1) list all persons attending or otherwise
participating in the meeting at which the ex parte presentation was
made, and (2) summarize all data presented and arguments made during
the presentation. If the presentation consisted in whole or in part of
the presentation of data or arguments already reflected in the
presenter's written comments, memoranda or other filings in the
proceeding, the presenter may provide citations to such data or
arguments in his or her prior comments, memoranda, or other filings
(specifying the relevant page and/or paragraph numbers where such data
or arguments can be found) in lieu of summarizing them in the
memorandum. Documents shown or given to Commission staff during ex
parte meetings are deemed to be written ex parte presentations and must
be filed consistent with rule § 1.1206(b). In proceedings governed
by rule § 1.49(f) or for which the Commission has made available a
method of electronic filing, written ex parte presentations and
memoranda summarizing oral ex parte presentations, and all attachments
thereto, must be filed through the electronic comment filing system
available for that proceeding, and must be filed in their native format
(e.g., .doc, .xml, .ppt, searchable .pdf). Participants in this
proceeding should familiarize themselves with the Commission's ex parte
rules.
Federal Communications Commission.
Ryan B. Palmer,
Chief, Telecommunications Access Policy Division, Wireline Competition
Bureau.
[FR Doc. 2014-27429 Filed 11-19-14; 8:45 am]
BILLING CODE 6712-01-P