Revisions to Ambient Air Monitoring Regulations, 2710-2808 [06-179]
Federal Register / Vol. 71, No. 10 / Tuesday, January 17, 2006 / Proposed Rules
ENVIRONMENTAL PROTECTION AGENCY

40 CFR Parts 53 and 58

[EPA–HQ–OAR–2004–0018; FRL–8015–9]

RIN 2060–AJ25

Revisions to Ambient Air Monitoring Regulations

AGENCY: Environmental Protection Agency (EPA).

ACTION: Proposed rule; amendments.
SUMMARY: The EPA is proposing to
revise the ambient air monitoring
requirements for criteria pollutants.
This proposal establishes ambient air
monitoring requirements in support of
the proposed revisions to the National
Ambient Air Quality Standards
(NAAQS) for particulate matter
published elsewhere in today’s Federal
Register, including new minimum
monitoring network requirements for
PM10-2.5 and criteria for approval of
Federal reference and equivalent
methods for PM10-2.5 (to supplement the
Federal reference method for PM10-2.5
proposed elsewhere in today’s Federal
Register). This proposal also requires
each State to operate one to three
monitoring stations that take an
integrated, multipollutant approach to
ambient air monitoring. The proposed
amendments modify the requirements
for ambient air monitors by focusing
requirements on populated areas with
air quality problems and significantly
reducing the requirements for criteria
pollutant monitors that have measured
ambient air concentrations well below
the applicable NAAQS. Other proposed
amendments revise the requirements for
reference and equivalent method
determinations (including specifications
and test procedures) for fine particulate
monitors, monitoring network
descriptions and periodic assessments,
quality assurance, and data certification.
The purpose of the proposed
amendments is to enhance ambient air
quality monitoring to better serve
current and future air quality
management and research needs.
DATES: Comments must be received on
or before April 17, 2006.
ADDRESSES: Submit your comments,
identified by Docket ID No. EPA–HQ–
OAR–2004–0018, by one of the
following methods:
• https://www.regulations.gov: Follow
the on-line instructions for submitting
comments.
• E-mail: a-and-r-docket@epa.gov.
• Fax: (202) 566–1741.
• Mail: Revisions to Ambient Air
Monitoring Regulations, Docket No.
EPA–HQ–OAR–2004–0018,
Environmental Protection Agency,
Mailcode 6102T, 1200 Pennsylvania
Ave., NW., Washington, DC 20460.
Please include a total of two copies. In
addition, please mail a copy of your
comments on the information collection
provisions to the Office of Information
and Regulatory Affairs, Office of
Management and Budget (OMB), Attn:
Desk Officer for EPA, 725 17th St., NW.,
Washington, DC 20503.
• Hand Delivery: EPA Docket Center,
1301 Constitution Avenue, NW., Room
B102, Washington, DC 20460. Such
deliveries are only accepted during the
Docket’s normal hours of operation, and
special arrangements should be made
for deliveries of boxed information.
Instructions: Direct your comments to
Docket ID No. EPA–HQ–OAR–2004–
0018. EPA’s policy is that all comments
received will be included in the public
docket without change and may be
made available online at https://
www.regulations.gov, including any
personal information provided, unless
the comment includes information
claimed to be Confidential Business
Information (CBI) or other information
whose disclosure is restricted by statute.
Do not submit information that you
consider to be CBI or otherwise
protected through https://
www.regulations.gov or e-mail. The
https://www.regulations.gov Web site is
an ‘‘anonymous access’’ system, which
means EPA will not know your identity
or contact information unless you
provide it in the body of your comment.
If you send an e-mail comment directly
to EPA without going through https://
www.regulations.gov your e-mail
address will be automatically captured
and included as part of the comment
that is placed in the public docket and
made available on the Internet. If you
submit an electronic comment, EPA
recommends that you include your
name and other contact information in
the body of your comment and with any
disk or CD ROM you submit. If EPA
cannot read your comment due to
technical difficulties and cannot contact
you for clarification, EPA may not be
able to consider your comment.
Electronic files should avoid the use of
special characters, any form of
encryption, and be free of any defects or
viruses. For additional information
about EPA’s public docket, visit the EPA
Docket Center homepage at https://
www.epa.gov/epahome/dockets.htm.
Docket: All documents in the docket
are listed in the https://
www.regulations.gov index. Although
listed in the index, some information is
not publicly available, e.g., CBI or other
information whose disclosure is
restricted by statute. Certain other
material, such as copyrighted material,
will be publicly available only in hard
copy. Publicly available docket
materials are available either
electronically in https://
www.regulations.gov or in hard copy at
the Revisions to the Ambient Air
Monitoring Regulations Docket, EPA/
DC, EPA West, Room B102, 1301
Constitution Ave., NW., Washington,
DC. The Public Reading Room is open
from 8:30 a.m. to 4:30 p.m., Monday
through Friday, excluding legal
holidays. The telephone number for the
Public Reading Room is (202) 566–1744,
and the telephone number for the Air
Docket is (202) 566–1742.
FOR FURTHER INFORMATION CONTACT: For general questions concerning today’s proposed amendments, please contact Mr. Lewis Weinstock, U.S. EPA, Office of Air Quality Planning and Standards, Emissions Monitoring and Analysis Division, Ambient Air Monitoring Group (D243–02), Research Triangle Park, North Carolina 27711; telephone number: (919) 541–3661; fax number: (919) 541–1903; e-mail address: weinstock.lewis@epa.gov. For technical questions, please contact Mr. Tim Hanley, U.S. EPA, Office of Air Quality Planning and Standards, Emissions Monitoring and Analysis Division, Ambient Air Monitoring Group (D243–02), Research Triangle Park, North Carolina 27711; telephone number: (919) 541–4417; fax number: (919) 541–1903; e-mail address: hanley.tim@epa.gov.

SUPPLEMENTARY INFORMATION:
I. General Information
A. Does This Action Apply to Me?
Categories and entities potentially
regulated by this action include:
Category: Industry
NAICS code:1 334513, 541380
Examples of regulated entities: Manufacturer, supplier, distributor, or vendor of ambient air monitoring instruments; analytical laboratories or other monitoring organizations that elect to submit an application for a reference or equivalent method determination under 40 CFR part 53.

Category: Federal government
NAICS code:1 924110
Examples of regulated entities: Federal agencies (that conduct ambient air monitoring similar to that conducted by States under 40 CFR part 58 and that wish EPA to use their monitoring data in the same manner as State data) or that elect to submit an application for a reference or equivalent method determination under 40 CFR part 53.

Category: State/local/tribal government
NAICS code:1 924110
Examples of regulated entities: State, territorial, and local air quality management programs that are responsible for ambient air monitoring under 40 CFR part 58 or that elect to submit an application for a reference or equivalent method determination under 40 CFR part 53. The proposal also may affect Tribes that conduct ambient air monitoring similar to that conducted by States and that wish EPA to use their monitoring data in the same manner as State monitoring data.

1 North American Industry Classification System.
This table is not intended to be
exhaustive, but rather provides a guide
for readers regarding entities likely to be
regulated by this action. To determine
whether your facility or Federal, State,
local, or territorial agency would be
regulated by this action, you should
examine the requirements for reference
or equivalent method determinations in
40 CFR part 53, subpart A (General
Provisions) and the applicability criteria
in 40 CFR 51.1 of EPA’s requirements
for State implementation plans. If you
have any questions regarding the
applicability of this action to a
particular entity, consult the person
listed in the preceding FOR FURTHER
INFORMATION CONTACT section.
B. What Should I Consider as I Prepare
My Comments for EPA?
Do not submit information containing
Confidential Business Information (CBI)
to EPA through www.regulations.gov or
e-mail. Send or deliver information
identified as CBI only to the following
address: Roberto Morales, OAQPS
Document Control Officer (C404–02),
U.S. EPA, Office of Air Quality Planning
and Standards, Research Triangle Park,
North Carolina 27711, Attention Docket
ID EPA–HQ–OAR–2004–0018. Clearly
mark the part or all of the information
that you claim to be CBI. For CBI
information in a disk or CD ROM that
you mail to EPA, mark the outside of the
disk or CD ROM as CBI and then
identify electronically within the disk or
CD ROM the specific information that is
claimed as CBI. In addition to one
complete version of the comment that
includes information claimed as CBI, a
copy of the comment that does not
contain the information claimed as CBI
must be submitted for inclusion in the
public docket. Information so marked
will not be disclosed except in
accordance with procedures set forth in
40 CFR part 2.
C. Where Can I Get a Copy of This
Document and Other Related
Information?
In addition to being available in the
docket, an electronic copy of today’s
proposed amendments is also available
on the World Wide Web (WWW) through
the Technology Transfer Network
(TTN). Following the Administrator’s
signature, a copy of the proposed
amendments will be placed on the
TTN’s policy and guidance page for
newly proposed or promulgated rules at
https://www.epa.gov/ttn/oarpg. The TTN
provides information and technology
exchange in various areas of air
pollution control.
D. Will There Be a Public Hearing?
Public hearings will be held
concurrently with the public hearings
on the proposed amendments to the
NAAQS for particulate matter published
elsewhere in this Federal Register. The
EPA intends to hold public hearings
during February 2006 in Philadelphia,
Pennsylvania; Chicago, Illinois; and San
Francisco, California. The EPA will
announce the date, location, and time of
the public hearings in a separate
Federal Register notice.
E. Did EPA Conduct a Peer Review
Before Issuing This Notice?
The EPA sought expert scientific
review of the proposed methods,
technologies, and approach for ambient
air monitoring by the Clean Air
Scientific Advisory Committee
(CASAC). The CASAC is a Federal
advisory committee established to
review scientific and technical
information and make recommendations
to the EPA Administrator on issues
related to the air quality criteria and
corresponding NAAQS. CASAC
constituted a National Ambient Air
Monitoring Strategy (NAAMS)
Subcommittee in 2003 to provide advice
for a strategy for the national ambient
air monitoring programs. This
subcommittee, which operated over a
one-year period, and a new
subcommittee on Ambient Air
Monitoring and Methods (AAMM),
formed in 2004, provided the input for
CASAC on its consultations, advisories,
and peer-reviewed recommendations to
the EPA Administrator.
In July 2003, the CASAC NAAMS
Subcommittee held a public meeting to
review EPA’s draft National Ambient
Air Monitoring Strategy document
(dated September 6, 2002), which
contained technical information
underlying planned changes to the
ambient air monitoring networks. The
EPA continued to consult with the
CASAC AAMM Subcommittee
throughout the development of the
proposed amendments. Public meetings
were held in July 2004, December 2004,
and September 2005 to discuss the
CASAC review of nearly 20 documents
concerning methods and technology for
measurement of particulate matter (PM);
data quality objectives for PM
monitoring networks and related
performance-based standards for
approval of equivalent continuous PM
monitors; reconfiguration of ambient air
monitoring stations;1 and other
technical aspects of the proposed
amendments. These documents, along
with CASAC review comments and other information, are available at: http://www.epa.gov/ttn/amtic/casacinf.html.
F. How Is This Document Organized?
The information presented in this
preamble is organized as follows:
I. General Information
A. Does this action apply to me?
B. What should I consider as I prepare my
comments for EPA?
C. Where can I get a copy of this document
and other related information?
D. Will there be a public hearing?
E. Did EPA conduct a peer review before
issuing this notice?
F. How is this document organized?
II. Overview
1 ‘‘Station’’ and ‘‘site’’ are used somewhat interchangeably in this notice of proposed rulemaking. When there is a difference, ‘‘site’’ generally refers to the location of a monitor, while ‘‘station’’ refers to a suite of measurements at a particular site.
A. What is the purpose of today’s proposal?
B. What are the major changes proposed to
the ambient air monitoring regulations?
C. When would the proposed amendments
affect States, local governments, tribes,
and other stakeholders?
D. How would EPA implement the new
requirements?
III. Background
A. What is the role of ambient air
monitoring in air quality management?
B. What is the history of ambient air
monitoring?
C. What revisions to the National Ambient
Air Quality Standards for particulate
matter also are proposed today?
D. How do the monitoring data apply to
attainment or nonattainment
designations and findings?
IV. Proposed Monitoring Amendments
A. What are the proposed terminology
changes?
B. What are the proposed requirements for
approval of reference or equivalent
methods?
C. What are the proposed requirements for
quality assurance programs for the
National Ambient Air Monitoring
System?
D. What are the proposed monitoring
methods for the National Ambient Air
Monitoring System?
E. What are the proposed requirements for
the number and location of monitors to
be operated by State and local agencies?
F. What are the proposed probe and
monitoring path siting criteria?
G. What are the proposed data reporting,
data certification, and sample retention
requirements?
V. Statutory and Executive Order Reviews
A. Executive Order 12866: Regulatory
Planning and Review
B. Paperwork Reduction Act
C. Regulatory Flexibility Act
D. Unfunded Mandates Reform Act
E. Executive Order 13132: Federalism
F. Executive Order 13175: Consultation
and Coordination With Indian Tribal
Governments
G. Executive Order 13045: Protection of
Children From Environmental Health
and Safety Risks
H. Executive Order 13211: Actions that
Significantly Affect Energy Supply,
Distribution, or Use
I. National Technology Transfer and Advancement Act
J. Executive Order 12898: Federal Actions
to Address Environmental Justice in
Minority Populations and Low-Income
Populations
II. Overview

A. What Is the Purpose of Today’s Proposal?

The EPA is proposing a number of changes to the ambient air quality monitoring requirements of 40 CFR parts 53 and 58 to ensure that the national network of air monitors will meet the current and future data needs of EPA (and other Federal), State, local, and tribal air quality management agencies. While much of today’s proposed rule outlines changes to the monitoring requirements for particulate matter (PM), there are additional changes relating to all the other criteria pollutants (ozone (O3), carbon monoxide (CO), sulfur dioxide (SO2), nitrogen dioxide (NO2), and lead (Pb)) included in this proposal.

Some of these proposed changes are in support of the proposed revisions to the National Ambient Air Quality Standards (NAAQS) for PM in 40 CFR part 50 published elsewhere in today’s Federal Register.2 These changes are essential to implementation of the proposed NAAQS for PM. Included among these proposed PM-related changes are new provisions for addition to 40 CFR parts 53 and 58 which address approval of methods and PM10-2.5 monitoring requirements. The added provisions would address Federal reference method (FRM) equivalency determinations for continuous PM10-2.5 monitors and the requirements for the number of PM10-2.5 monitors a State must deploy. Another important element of the provisions for PM10-2.5 is a proposal for the conditions under which a PM10-2.5 monitor may be compared to the PM10-2.5 NAAQS.

2 The proposed amendments to the National Ambient Air Quality Standards include revised standards for PM2.5 (particulate matter with an aerodynamic diameter less than or equal to a nominal 2.5 micrometers) and new standards for PM10-2.5 (particulate matter with an aerodynamic diameter less than or equal to a nominal 10 micrometers and greater than or equal to a nominal 2.5 micrometers).

A number of amendments to existing provisions for PM2.5 monitoring are also proposed. These would be important to the implementation of the revised PM2.5 NAAQS because they take advantage of the experience and insight gained by EPA and the States during the past 7 years of PM2.5 monitoring. One of the proposed PM2.5 changes involves the criteria for FRM equivalency determinations for continuous PM2.5 monitors. We anticipate that this change would allow States to operate continuous monitors at more required monitoring sites, providing more robust data for the PM2.5 air quality program.

Other proposed changes are based on EPA’s assessment that the monitoring regulations are not fully aligned with current data needs and opportunities across all the NAAQS pollutants—including PM but also including O3, CO, SO2, NO2, and Pb. This misalignment has developed over time as ambient conditions have improved for some pollutants. Also, new monitoring technologies have been developed that provide attractive opportunities for
obtaining more robust and useful data.
The EPA recognized that changes were
needed several years ago and since then,
we have been developing the specifics
of these changes with States and other
stakeholders.3 This group of proposed
changes includes relaxation of some
long-standing monitoring requirements
which we believe are outdated or
unnecessarily inflexible. This group of
proposed changes also includes a new
requirement for States to operate a new
type of multipollutant monitoring
station, which we plan to call National
Core (NCore) stations. Other proposed
changes relate to quality assurance
requirements, monitor siting, special
purpose monitoring, and data
management.
We are proposing both the PM
NAAQS review-related changes as well
as the overarching NAAQS monitoring
system changes together because they
are strongly related in terms of
regulatory language and in terms of
implementation decision making.
Resources for ambient monitoring are
limited, and the cost of new types of
monitoring to meet new requirements
such as those for PM10-2.5 must be offset,
at least in part, by reducing resources
for lower value types of monitoring. The
proposed revisions to the monitoring
regulations, when finalized, will
improve EPA’s and our monitoring
partners’ abilities to manage available
funds to support monitoring activities
and create a coordinated, integrated,
multipurpose, and flexible monitoring
system. In addition, it will be easier for
the public to comment on the proposed
changes if they are presented together
rather than in sequential proposals.
The EPA notes that in the proposed
regulatory language for 40 CFR parts 53
and 58, we are reprinting a number of
existing provisions without change (for
example, a number of definitions in
current 58.1). We are doing so solely for
the readers’ convenience in order that
the provisions we are proposing can
appear in a single context. The EPA is
not reproposing, reconsidering, or
otherwise reopening any of these
reprinted provisions. We will regard any
comments as to these provisions as
outside the scope of this proposal.
3 Our work with States and other monitoring
program stakeholders has included the
development of successive versions of a draft
report, ‘‘National Ambient Air Monitoring
Strategy’’. The most recent version, dated December
2005, is available in the public docket. The
document describes in more depth the reasons for
proposing many of the changes presented in this
notice, excluding the changes related to PM10-2.5. It
also discusses strategy elements that are related to,
but separate from, the regulatory provisions in 40
CFR parts 53 and 58 such as funding, training, etc.
B. What Are the Major Changes
Proposed to the Ambient Air Monitoring
Regulations?
The summary of each proposed
change given here ends with a reference
to the part(s) of section IV of this
preamble that describes that change in
detail.
• We propose to require States to
operate from one to three National Core
(NCore) multipollutant monitoring
sites.4 Monitors at NCore multipollutant
sites would be required to measure
particles (PM2.5, speciated PM2.5,
PM10-2.5), O3, SO2, CO, nitrogen oxides
(NO/NO2/NOy), and basic meteorology.
Monitors for all the gases except for O3
would be required to be more sensitive
than standard Federal reference method
(FRM)/Federal equivalent method
(FEM) monitors, so they could
accurately report concentrations that are
well below the respective NAAQS but
that can be important in the formation
of O3 and PM. We are not proposing
specific locations for these sites, but
instead would collaborate on site
selection with States individually and
through multistate organizations. Our
objective is that sites be located in
broadly representative urban (about 55
sites) and rural (about 20 sites) locations
throughout the country to help
characterize regional and urban patterns
of air pollution. We expect that in many
cases States would collocate these new
stations with Photochemical
Assessment Monitoring Station (PAMS)
sites already measuring O3 precursors
4 The National Core (NCore) multi-pollutant
stations are part of an overall strategy to integrate
multiple monitoring networks and measurements,
including research grade sites and State and local
air monitoring stations (SLAMS). Research grade
sites would provide complex, research-grade
monitoring data for special studies; the proposed
amendments do not include requirements for these
sites. SLAMS would include sites needed for
National Ambient Air Quality Standard
comparisons and other data needs of monitoring
agencies. The number and placement of SLAMS
monitors would vary according to the pollutant,
population, and level of air quality problem. The
April 2004 draft version of the National Ambient
Air Monitoring Strategy presented a taxonomy in
which monitoring stations belonged to three levels,
called Level 1 (research sites), Level 2 (what are
called NCore multipollutant sites in this notice),
and Level 3 (what have been called SLAMS/NAMS
(national air monitoring stations) in the past). The
three Levels combined were referred to as the
NCore System. We have decided to dispense with
the three-level taxonomy because it does not
encompass all relevant monitoring efforts. We now
refer to the collection of all ambient air
monitoring—including research sites, all types of
monitoring by States and Tribes, and all types of
ambient monitoring by Federal agencies—as the
National Ambient Air Monitoring System
(NAAMS). We are retaining the ‘‘NCore’’ label for
the multipollutant sites in particular, because the
term with this meaning has become part of the
vocabulary of the State/local monitoring
community.
and/or National Air Toxic Trends
Station (NATTS) sites measuring air
toxics.
These sites would still create points of
integration among the existing networks
for criteria pollutants, each of which
was originally designed with only a
single pollutant in mind. Where
collocated with sites already measuring
O3 precursors or air toxics, the degree of
integration across pollutants of concern
would be even stronger. Data from these
NCore sites would be used for several
purposes that cannot be served as well
using only data available from existing
networks. Forecasting of the Air Quality
Index (AQI) would be improved by
feeding several collocated and
interdependent pollutant concentration
measurements into an air quality model
in near real-time to better represent
current conditions, from which the
model could provide an improved
forecast of O3 and particle levels for the
public. Studies that track long-term
trends of criteria pollutants, and thereby
help demonstrate the accountability of
implemented emissions control
programs, would be improved by
utilizing higher-sensitivity monitoring
equipment for pollutants whose
measured levels are well below the
NAAQS. Air quality model
development and validation efforts
would benefit by having a long-term
network of several important and
interdependent measurements at
improved time-scales (e.g., hourly
instead of daily sample concentrations
on PM methods) at a network of sites
expected to remain in place over many
years to allow testing of how well
models simulate co-pollutant
interactions. Where applicable siting
criteria for PM or O3 monitoring stations
are met, NCore sites could also be used
to satisfy minimum monitoring
requirements for PM and O3 and data
from these stations could be used in
designation decisions and in
development of control strategies.5 The
NCore proposals are described more
fully in section IV.E.1 of this preamble.
• We propose monitoring
requirements for PM10-2.5 which are
based on deploying a network of FEM
monitors that would be approved based
on criteria for comparability to monitors
utilizing the FRM proposed elsewhere
in today’s Federal Register.
Requirements for PM10-2.5 Class I, Class
II, and Class III candidate equivalent
methods would be established. The
definition of a ‘‘Class III equivalent method’’ would allow for designation of continuous and semi-continuous ambient air monitoring methods for PM10-2.5.6 Because we intend that most of the monitors used in the PM10-2.5 network will use continuous or semi-continuous equivalent methods, the proposal for Class III approval requirements is particularly important for PM10-2.5.

5 While not a part of our rationale for requiring States to operate these sites, we note that the data from them will also be of use in future health effects studies.
minimum requirements for a PM10-2.5
monitoring network, including criteria
for the number of FRM/FEM monitoring
sites in each metropolitan area (which
would vary from zero to five) and
criteria for how monitors should be
placed within an area. Closely linked to
the placement criteria is a proposed test
for the suitability of a PM10-2.5
monitoring site for comparison with the
PM10-2.5 NAAQS. We are also proposing
that speciation monitoring of PM10-2.5 be
required in some areas. These proposals
appear in sections IV.B.2, IV.B.3, IV.B.5,
and IV.B.6 (dealing with equivalent
methods) and section IV.E.2 (dealing
with number of monitors, their
placement, and the use of data from
them in comparisons to the NAAQS) of
this preamble.
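To illustrate the difference-based concept behind a PM10-2.5 measurement (a sketch only; the function name, the pairing of collocated samplers, and the handling of negative differences below are our illustrative assumptions, not the rule text or the proposed FRM procedure), the coarse fraction can be derived by subtracting a PM2.5 concentration from a collocated PM10 concentration:

```python
def pm_coarse(pm10_ugm3, pm25_ugm3):
    """Derive PM10-2.5 (coarse particulate matter) as the difference of
    collocated PM10 and PM2.5 concentrations, in micrograms per cubic meter.

    Illustrative convention only: both samplers must report for the
    difference to be valid, and small negative differences (possible from
    sampler noise near the detection limit) are floored at zero here; the
    regulatory data-handling rules may differ.
    """
    if pm10_ugm3 is None or pm25_ugm3 is None:
        return None  # an invalid pair yields no coarse-fraction value
    return max(0.0, pm10_ugm3 - pm25_ugm3)

# Hypothetical 24-hour averages from a collocated sampler pair
coarse = pm_coarse(42.0, 14.5)  # 27.5 ug/m3
```

The floor at zero is one of several plausible conventions for sub-detection-limit pairs; the key point is simply that the coarse fraction is not measured directly but computed from two size-selective measurements.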
• We propose amendments to
facilitate the wider use of continuous
PM2.5 monitors by revising performance-based FEM equivalence standards for
continuous PM2.5 monitors and allowing
for approved regional methods (ARM)
for continuous PM2.5 mass monitors.
Existing requirements for PM2.5 Class I
and Class II candidate equivalent
methods would be revised, and new
requirements for PM2.5 Class III
candidate equivalent methods would be
added. The definition of a Class III
equivalent method would be revised to
allow for designation of continuous and
semi-continuous ambient air monitoring
methods for PM2.5. These proposals
appear in sections IV.B.4, IV.B.5, and
IV.B.6 (FEM equivalence standards) and
in section IV.D.2 (approved regional
methods) of this preamble.
• In association with the proposed
requirements for new PM10-2.5 stations
and new NCore multipollutant stations,
we propose to remove the existing
requirements for certain numbers of
State and local air FRM/FEM
monitoring stations for CO, PM10, SO2,
and NO2, and reduce them for Pb.
6 Class I equivalent methods have only minor
deviations or modifications from the specified
reference method. Class II equivalent methods
include other filter-based, integrated, gravimetric-type methods similar to the specified reference
method but with greater deviations than allowed for
a Class I method. Class III equivalent methods
include all candidate PM2.5 and PM10-2.5 methods
not classified as Class I or Class II. We expect that
most candidate Class III equivalent methods will be
continuous or semi-continuous methods.
However, States would still need EPA
approval to move or remove existing
monitoring stations for these
pollutants.7 To expedite reviews and
provide more certainty to State
planning, a specific process and several
substantive criteria are proposed to
govern EPA approval actions. Also, the
requirement that EPA approval be
obtained at the Administrator level
(rather than the Regional Administrator
level) for the subset of these monitors
historically designated as NAMS would
be eliminated, and all changes would be
reviewed by the Regional
Administrator.8 In addition, the
requirements for monitoring of O3
precursors under the PAMS program
would be reduced by about 50 percent.
These proposed changes allow PAMS
monitoring to be more customized to
local data needs rather than meeting so
many specific requirements common to
all subject O3 nonattainment areas; the
PAMS changes would also give States
the flexibility to reduce the overall size
of their PAMS programs—within
limits—and to use the associated
resources for other types of monitoring
they consider more useful.
Requirements for minimum numbers of
O3 and PM2.5 monitors would be
retained, with small adjustments. The
overall impact of these changes would
be to retain comprehensive monitoring
networks for PM2.5 and O3, and to
reduce the number of SO2, CO, NO2, Pb,
and PM10 monitors in areas that do not
have air quality problems for these
pollutants. PM2.5 and O3 monitoring
would be mostly unaffected because
PM2.5 and O3 are current nonattainment
challenges and comprehensive
monitoring is needed to support efforts
to attain the NAAQS. Many existing
monitors for SO2, CO, NO2, Pb, and
PM10 can be discontinued because they
are now well below the applicable
NAAQS and the data from most of these
monitors have low value for air quality
management and research purposes. We
expect reductions in the number of
monitors for these pollutants nationally
to be in the range of about 33 percent
for SO2 to about 90 percent for NO2.9
This would free up resources to go
beyond minimum requirements for O3,
7 Where the PM10 annual and 24-hour NAAQS have both been revoked, the proposed rule does not require prior EPA approval for discontinuing a PM10 monitor.
8 EPA Administrator approval would continue to
be required for changes to some PM2.5 speciation
monitoring stations, to any required NCore
multipollutant station, and to any PAMS station.
9 Detailed estimates of the current and expected
future number of each type of monitor over the 3
years following promulgation are given in the
supporting statement to the Information Collection
Request for this action, available in the docket.
PM2.5, PM10-2.5, or other pollutants such
as air toxics in areas where there are
ongoing or new air quality management
challenges. These proposed changes are
described in sections IV.E.3 (number of
PM2.5 monitors), IV.E.4 (PM10 monitors),
IV.E.5 (number of O3 monitors), IV.E.6
(number of CO, SO2, NO2, and Pb
monitors), IV.E.7 (PAMS monitors), and
IV.E.8 (process and criteria for moving
or removing monitors) of this preamble.
• We propose updated quality
assurance (QA) requirements for all
NAAQS pollutants, emphasizing the
responsibility of each monitoring
program for its data quality based on the
use of data quality objectives for
monitoring precision, data
completeness, and bias. States would be
required to provide for adequate,
independent performance audits of
FRM/FEM monitoring stations. We
describe several options for how they
could meet this audit responsibility.
One way would be to agree to have
appropriated State and Tribal Assistance
Grant (STAG) funds retained by EPA to
cover the cost of performing these
audits; another option would be a
partnership between State/local
monitoring agencies (or independent
subunits within one agency). The
statistics for calculating precision and
bias would also be revised.
Quality assurance requirements would
be defined for PM10-2.5 monitoring. See
section IV.C of this preamble for details.
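For illustration only, the general form of such precision and bias statistics can be sketched as follows. The function names and the simple relative-percent-difference formulation below are assumptions made for this sketch; the exact regulatory estimators are defined in appendix A to 40 CFR part 58.

```python
# Illustrative sketch of QA statistics for collocated monitors.
# The regulatory estimators are defined in appendix A to 40 CFR
# part 58; the simplified forms below are assumptions.

def relative_percent_difference(primary, collocated):
    """Per-pair relative percent difference, a common precision indicator."""
    return [100.0 * (c - p) / ((c + p) / 2.0)
            for p, c in zip(primary, collocated)]

def percent_bias(measured, audit):
    """Mean percent difference of monitor readings against audit standards."""
    diffs = [100.0 * (m - a) / a for m, a in zip(measured, audit)]
    return sum(diffs) / len(diffs)

# Hypothetical collocated PM2.5 pairs (ug/m3) and audit checks:
rpd = relative_percent_difference([12.1, 8.4, 15.0], [11.8, 8.9, 14.6])
bias = percent_bias([10.2, 9.8], [10.0, 10.0])  # symmetric errors cancel
```

Statistics of this kind, aggregated across a monitoring network, are what the data quality objectives for precision, bias, and completeness would be evaluated against.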
• We propose to revise the provisions
regarding special purpose monitors
(SPM) for all NAAQS pollutants. In
certain restricted situations, data from
SPM would not be usable for
nonattainment designations. SPM that
are FRM, FEM, or ARM monitors would
be required to meet standard quality
assurance requirements for their
monitor type, and States would be
required to report data from such SPM
to the Air Quality System (AQS). See
section IV.E.9 of this preamble for
details.
• We propose to require that States
conduct in-depth network assessments
every 5 years. These assessments are
intended to ensure that future gaps
between data needs and monitoring
operations are identified and filled in a
timely manner. See section IV.E.11 of
this preamble for specifics.
• We propose to move requirements
for reporting certain operational data
from PM samplers from 40 CFR part 50
to 40 CFR part 58, and to reduce the
number of data elements required to be
reported. This would put all similar
data reporting requirements together in
40 CFR part 58 and allow them to apply
to both FRM and FEM monitors. See
section IV.G.1 of this preamble.
• We propose a new requirement for
the reporting of PM2.5 field blank data.10
Only the data from field blanks which
States are already taking into the field
and weighing in their laboratories
would be required to be reported under
this proposal. Having the data from
these field blanks available to the
national monitoring community would
help EPA and other researchers
understand the relationship between the
mass of PM that is sampled and
weighed on a regular PM filter and the
PM that is actually present in ambient
air. See section IV.G.2 of this preamble
for details.
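As a hypothetical sketch of how reported field blank data might be summarized (the values and the summary logic here are illustrative assumptions, not requirements of the proposal):

```python
# Sketch: summarizing PM2.5 field blank weighings. Each value is the
# post-deployment minus pre-deployment filter mass in micrograms;
# the numbers are hypothetical. A persistently positive mean would
# suggest a systematic handling or sampling artifact, while small
# scattered values suggest ordinary weighing noise.

blank_masses_ug = [3.1, -0.8, 1.5, 2.2, 0.4]

mean_blank = sum(blank_masses_ug) / len(blank_masses_ug)  # network mean
max_blank = max(blank_masses_ug)                          # worst single blank
```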
• We propose to require State or local
agencies to submit annual data
certification letters, by May 1 of each
year, to certify that the ambient air
concentration and QA data submitted to
EPA’s AQS for the previous year are
complete and accurate. These letters are
now required on July 1 of each year. See
section IV.G.3 of this preamble.
• We propose to require States to
archive PM2.5 and PM10-2.5 filters for one
year (the current requirement is only for
PM2.5 filters).11 See section IV.G.4 of
this preamble.
• We propose to increase the distance
that ozone monitors should be placed
downwind of roadways, to reduce the
possibility that ozone readings will be
artificially low due to ozone scavenging
by NO emitted by vehicles on roadways.
See section IV.F of this preamble.
C. When Would the Proposed
Amendments Affect State and Local
Governments, Tribes, and Other
Stakeholders?
1. State and Local Governments
Only State governments, and those
local governments that have been
assigned responsibility for ambient air
monitoring by their States, are subject to
the mandatory requirements of 40 CFR
part 58.12
The proposed compliance date for
deployment of PM10-2.5 monitors by
States is January 1, 2009. A plan for this
10 Field blanks are filters which are handled in
the field as much as possible like actual filters
except that ambient air is not pumped through
them, to help quantify contamination and sampling
artifacts.
11 A PM10-2.5 ‘‘filter’’ from a FRM monitor would
actually consist of the separate PM10 and PM2.5
filters. Some equivalent methods, if approved,
could involve a single PM10-2.5 filter. All filters from
both types of monitors would be subject to the
archiving requirement.
12 Throughout this preamble, ‘‘States’’ is meant to
also refer to local governments that have been
assigned responsibility for ambient air monitoring
within their respective jurisdiction by their States.
We also use ‘‘monitoring organization’’ to refer to
States, local agencies, and/or Tribes conducting
monitoring under or guided by the provisions of 40
CFR part 58.
deployment would be due January 1,
2008, unless an extension is granted to
July 1, 2008. These plans would be
subject to EPA approval at the Regional
Office level.
State (or local) agencies would also be
required to submit earlier annual data
certification letters and make electronic
reports of QA data to the AQS, starting
May 1, 2009.
The proposed amendments require
that State (or local) agencies fully
implement the required NCore
multipollutant sites by January 1, 2011
(more than 4 years after the expected
date of promulgation of the
amendments). A plan for this
implementation, including site
selection, would be due July 1, 2009.
Network assessments would be
required every 5 years starting July 1,
2009.
State and local agencies would be
required to comply with existing
requirements in 40 CFR part 58
(including annual network review and
data reporting), until the compliance
date for each new requirement is
reached.
Some provisions in the proposed
amendments to 40 CFR part 58 (those
that do not involve deployment of new
monitoring stations or new types of data
handling) would be effective as of the
effective date of the final rule.
2. Tribes
Under the Tribal Authority Rule
(TAR) (40 CFR part 49), which
implements section 301(d) of the CAA,
Tribes may elect to be treated in the
same manner as a State in implementing
sections of the CAA. However, the EPA
determined in the TAR that it was
inappropriate to treat Tribes in a
manner similar to a State with regard to
specific plan submittal and
implementation deadlines for NAAQS-related requirements, including, but not
limited to, such deadlines in CAA
sections 110(a)(1), 172(a)(2), 182, 187,
and 191. See 40 CFR 49.4(a). For
example, an Indian Tribe may choose,
but is not required, to submit
implementation plans for NAAQS-related
requirements, nor is it
required to monitor. If a Tribe elects to
do an implementation plan, the plan
can contain program elements to
address specific air quality problems in
a partial program. The EPA will work
with the Tribe to develop an appropriate
schedule which meets the needs of each
Tribe.
Indian tribes have the same rights and
responsibilities as States under the CAA
to implement elements of air quality
programs as they deem necessary.
Tribes can choose to engage in ambient
air monitoring activities. In many cases,
Indian tribes are required by EPA
regions to institute strict quality
assurance programs, utilize FRM or
FEM when comparing their data to the
NAAQS, and to ensure that the data
collected are of known quality and
representative of their respective
airsheds. For FRM and FEM monitors
used for NAAQS attainment or
nonattainment determinations, quality
assurance requirements of 40 CFR part
58 must be followed and would be
viewed by EPA as an indivisible
element of a regulatory air quality
monitoring program.
3. Other Stakeholders
Manufacturers of continuous PM2.5
and PM10-2.5 instruments would be able
to apply for designation of their
instruments as FEM as soon as the
notice of final rulemaking is signed. The
EPA is eager to receive such
applications as soon as manufacturers
can collect and analyze the necessary
supporting data.
D. How Would EPA Implement the New
Requirements?
After promulgation, we would
implement the new requirements using
several mechanisms. We expect to work
with each State to develop the
monitoring plans for their new PM10-2.5
and NCore multipollutant monitoring
stations. For example, we would
negotiate the selection of required new
monitoring sites (or new capabilities at
existing sites) and their schedules for
start up as well as plans to discontinue
sites that were no longer needed. The
EPA would negotiate with each State its
annual grants for air quality
management activities, including
ambient monitoring work. We would
negotiate grants that provide funding to
meet minimum requirements and which
have milestones for completion of
necessary changes. Once States have
established a new monitoring
infrastructure to meet the new
requirements, we would review State
monitoring activities, submitted data,
and plans for further changes on an
annual basis.
The EPA’s support for and
participation in enhancing the national
ambient air monitoring system to serve
current and future air quality
management and research needs will
extend beyond ensuring that States meet
the minimum requirements of the
monitoring rules, including the
proposed amendments. We will work
with each State or local air monitoring
agency to determine what affordable
monitoring activities above minimum
requirements would best meet the
diverse needs of the individual air
quality management program as well as
the needs of other data users. In
particular, we may negotiate with some
States, and possibly with some Tribes,
for the establishment and operation of
some additional rural NCore
multipollutant monitoring stations to
complement the multipollutant stations
that would be required by the proposed
changes to the monitoring regulations.
We also expect to work with the States,
and possibly with some Tribes, to
establish and operate more PM10-2.5
speciation sites than the minimums that
would be required by the proposed
amendments. We expect to work with
the States, and possibly with some
Tribes, to establish and operate rural
PM10-2.5 mass concentration sites in less
urbanized locations.
An important element of
implementing the new requirements
will be EPA’s role in encouraging the
development and application of Federal
equivalent methods (FEM), in particular
for continuous methods of measuring
PM2.5 and PM10-2.5. We have determined
that continuous monitoring of PM2.5 has
many advantages over the filter-based
Federal reference method. One of the
proposed changes makes it more
practical for manufacturers of
continuous PM2.5 instruments to obtain
designation for them as FEM or
approved regional methods. To ensure
objectivity and sound science, EPA’s
Office of Research and Development
would continue to review applications
for FEM designations based on the
criteria proposed today and would
recommend approval or disapproval to
the EPA Administrator.
We will also provide technical
guidance documents and training
opportunities for State, local, and Tribal
monitoring staff to help them select,
operate, and use the data from new
types of monitoring equipment. We
have already distributed a technical
assistance document on the precursor
gas monitors 13 that will be part of the
multipollutant sites and we have
conducted three training workshops on
these monitors. Additional guidance
will be developed and provided on
some other types of monitors with
which many State monitoring staff are
currently unfamiliar, and on network
design, site selection, quality assurance,
and other topics. While Tribes would not
be subject to the requirements of the
proposed monitoring amendments,
13 Technical Assistance Document (TAD) for
Precursor Gas Measurements in the NCore
Multipollutant Monitoring Network. Version 4. U.S.
Environmental Protection Agency. EPA–454/R–05–
003. September 2005. Available at: https://
www.epa.gov/ttn/amtic/pretecdoc.html.
these technical resources will also be
available to them directly from EPA and
via grantees, such as the Institute for
Tribal Environmental Professionals and
the Tribal Air Monitoring Support
Center.
In partnership with States, we will
also continue to plan and manage State
and Tribal Assistance Grant (STAG) funds to
support the National Park Service’s
operation of the IMPROVE monitoring
network, which provides important data
for implementing both regional haze
and PM2.5 attainment programs.14
We will also continue to operate the
Clean Air Status and Trends Network
(CASTNET), which monitors for O3, PM,
and chemical components of PM in
rural areas across the nation.15 We are
in the process of revising CASTNET to
upgrade its monitoring capabilities to
allow it to provide even more useful
data to multiple data users. We expect
that about 20 CASTNET sites will have
new capabilities at least equivalent to
the capabilities envisioned for NCore
multipollutant sites. Those sites would
complement, and reduce the needed
number of, rural multipollutant sites
funded with limited State/local grant funds.
We recognize that some air quality
management issues require ambient
concentration and deposition data that
cannot be provided by the types of
monitoring required by the proposed
monitoring amendments and other
activities addressed in today’s proposal.
These issues include near-roadway
exposures to emissions from motor
vehicles and mercury deposition. We
are actively researching these issues and
developing plans for monitoring
programs to address them, but these
issues are outside the scope of this
proposal.
III. Background
A. What Is the Role of Ambient Air
Monitoring in Air Quality Management?
Ambient air monitoring systems are a
critical part of the nation’s air quality
management program infrastructure. We
use the ambient air monitoring data for
a wide variety of purposes as part of an
iterative process in managing air
quality. This iterative process involves a
continuum of setting standards and
objectives, designing and implementing
control strategies, assessing the results
of those control strategies, and
measuring progress. The data have
14 Additional information on EPA/National Park
Service IMPROVE (Interagency Monitoring of
Protected Visual Environments) Visibility Program
is available at: https://www.epa.gov/ttn/amtic/
visdata.html.
15 Additional information on CASTNET is
available at: https://www.epa.gov/castnet/.
many uses throughout this system, such
as: Determining compliance with the
National Ambient Air Quality Standards
(NAAQS); characterizing air quality
status and trends; estimating health
risks and ecosystem impacts;
developing and evaluating emissions
control strategies; and measuring overall
progress for the air pollution control
program. Ambient air monitoring data
provide accountability for control
strategy reductions by tracking long-term trends of criteria and noncriteria
pollutants and their precursors. The
data also form the basis for air quality
forecasting and other public air quality
reports.
More detailed ambient monitoring
data are needed to meet current and
future program and research needs. The
data collected by State and local
agencies under the proposed monitoring
amendments would:
• Provide more timely Air Quality
Index reporting to the public by
supporting continuous particle
measurements needed for AIRNow air
quality forecasting and other public
reporting mechanisms;
• Improve the development of
emissions control strategies through
more effective air quality model
evaluation and other observational
methods; and
• Support long-term health
assessments that contribute to ongoing
reviews of the NAAQS and other
scientific studies ranging across
technological, health, and atmospheric
process disciplines.
B. What Is the History of Ambient Air
Monitoring?
1. Statutory Authority
The EPA rules for ambient air
monitoring are authorized under
sections 110, 301(a), and 319 of the
Clean Air Act (CAA). Section
110(a)(2)(B) of the CAA requires that
each State implementation plan (SIP)
provide for the establishment and
operation of devices, methods, systems,
and procedures needed to monitor,
compile, and analyze data on ambient
air quality and for the reporting of air
quality data to EPA. Section 301(a) of
the CAA authorizes EPA to develop
regulations needed to carry out the
Agency’s mission and establishes
rulemaking requirements. Uniform
criteria to be followed when measuring
air quality and provisions for daily air
pollution index reporting are required
by CAA section 319.
2. Ambient Air Monitoring Regulations
The EPA’s procedures for determining
and designating reference and
equivalent methods (40 CFR part 53)
have been in place since 1975 (40 FR
7049, February 18, 1975). Reference
methods for criteria pollutants provide
uniform, reproducible measurements of
concentrations in the ambient air.
Equivalent methods allow for the
introduction of new and innovative
technologies for the same purpose,
provided the technologies produce
measurements comparable to reference
methods under a variety of monitoring
conditions.
Subpart A of 40 CFR part 53 (General
Provisions) establishes definitions;
general requirements for designation of
Federal reference methods (FRM) and
Federal equivalent methods (FEM);
procedures for submitting, processing,
and approving applications; and
associated provisions. The general
requirements identify the applicable
requirements or tests that a candidate
method must meet to be approved as a
FRM or FEM. All manual or automated
methods must meet the applicable
requirements in 40 CFR part 53, subpart
C (Procedures for Determining
Comparability Between Candidate
Methods and Reference Methods).
Automated equivalent methods for
pollutants other than PM10 or PM2.5 also
must meet the requirements in 40 CFR
part 53, subpart B (Procedures for
Testing Performance Characteristics of
Automated Methods for SO2, CO, O3,
and NO2). A manual sampler or
automated method for PM10, Class I
equivalent method for PM2.5, or Class II
equivalent method for PM2.5 also must
meet the requirements in 40 CFR part
53, subpart D (Procedures for Testing
Performance Characteristics of Methods
for PM10), subpart E (Procedures for
Testing Physical (Design) and
Performance Characteristics of
Reference Methods and Class I
Equivalent Methods for PM2.5), or
subpart F (Procedures for Testing
Performance Characteristics of Class II
Equivalent Methods for PM2.5), as
applicable. The existing rule adopts a
case-by-case approach for PM2.5 Class III
candidate equivalent methods. The
regulations in 40 CFR part 53 have been
amended several times since 1975 to
reflect the addition of new and revised
reference methods and advances in
monitoring methods and technologies
for criteria pollutants.
In 1979 (44 FR 27558, May 10, 1979),
EPA issued the first regulations for
ambient air quality surveillance (40 CFR
part 58) for all pollutants subject to
NAAQS. Within 40 CFR part 58, subpart
A (General Provisions) establishes
definitions, and subpart B (Monitoring
Criteria) sets requirements for quality
assurance, methods, siting, operating
schedules, and special purpose
monitors. Subpart C (State and Local Air
Monitoring Stations), subpart D
(National Air Monitoring Stations), and
subpart E (Photochemical Assessment
Monitoring Stations) generally define
the current monitoring networks.
Appendices A through G to 40 CFR part
58 contain more detailed requirements
on quality assurance; monitoring
methods, network design, and siting
criteria; and air quality reporting.
Subpart F (Air Quality Index Reporting),
subpart G (Federal Monitoring), and
appendices F and G to 40 CFR part 58
define annual and daily reporting
requirements.
Most of the major amendments to the
monitoring regulations made after 1979
coincide with the NAAQS revisions and
include the addition of provisions for
PM10 (52 FR 24740, July 1, 1987) and
PM2.5 (62 FR 38833, July 18, 1997).
Photochemical assessment monitoring
stations (PAMS) were established in
1993 to monitor ozone and its precursors (58
FR 8468, February 12, 1993).
3. Monitoring Networks
More than 5,500 monitors at about
3,000 sites in the State and local air
monitoring stations (SLAMS) and
national air monitoring stations (NAMS)
networks comprise the majority of
monitors measuring criteria pollutants
using FRM or FEM for direct
comparison to the NAAQS. The NAMS
are a subset of SLAMS that are
designated as national trends sites. The
PM2.5 network consists of ambient air
monitoring sites that make mass or
chemical speciation measurements.
Within the PM2.5 network operated by
State and local agencies, there are
approximately 1,200 FRM filter-based
samplers and about 450 continuous
monitors for mass measurements.
Chemical speciation measurements are
made at 54 ‘‘Speciation Trends
Network’’ sites that are intended to
remain in operation indefinitely and
about 200 other, potentially less
permanent sites used to support SIP
development and other monitoring
objectives. These stations collect aerosol
samples and analyze the filters for trace
elements, major ions, and carbon
fractions.
Ambient air monitors in the PAMS
network measure ozone precursors at
109 stations in 25 serious, severe, or
extreme ozone nonattainment areas. The
PAMS monitors use near-research-grade
measurement technologies to produce
continuous data for more than 50
volatile organic compounds during
summer ozone seasons.
In addition to the NAMS/SLAMS/
PAMS sites, there are approximately
310 ambient air toxics monitoring sites,
the majority of which are Federally
funded and report data to EPA’s Air
Quality System (AQS).
Ambient air monitoring stations also
are operated by Indian Tribes. Thirty-one Tribes are currently making data
from 119 individual monitors available
to EPA and others. Approximately 73
Tribal sites monitor for PM10 and PM2.5,
and about 16 monitor for ozone.
The Clean Air Status and Trends
Network (CASTNET) is cooperatively
operated and funded by EPA with the
National Park Service. The EPA’s Office
of Air and Radiation operates a majority
of the monitoring stations with
contractor support; however, the
National Park Service operates
approximately 30 stations in
cooperation with EPA. It is the nation’s
primary source for data on dry acidic
deposition and rural, ground-level
ozone. Operating since 1987, CASTNET
is used in conjunction with other
national monitoring networks to provide
information for evaluating the
effectiveness of national emission
control strategies. CASTNET consists of
over 80 sites across the eastern and
western U.S. The longest data records
are primarily at eastern sites. CASTNET
provides atmospheric data on the dry
deposition component of total acid
deposition, ground-level ozone and
other forms of atmospheric pollution.
More information is available from the
CASTNET program Web site https://
www.epa.gov/castnet/.
The EPA is also one of many sponsors
of the National Atmospheric Deposition
Program/National Trends Network. The
National Atmospheric Deposition
Program/National Trends Network
(NADP/NTN) is a nationwide network
of precipitation monitoring stations. The
NADP/NTN has over 200 stations
spanning the continental U.S., Alaska,
Puerto Rico, and the Virgin Islands.
The purpose of the network is to collect
data on the chemistry of precipitation
for monitoring of geographical and
temporal long-term trends. While
distinct from ambient air monitoring,
precipitation monitoring is related in
that it shares some of the same
objectives, including tracking the effects
of emission reduction programs. More
information on NADP is available at its
Internet Web site, https://
nadp.sws.uiuc.edu/.
The EPA is a major funding sponsor
of the Interagency Monitoring of
Protected Visual Environments
(IMPROVE) program. IMPROVE is a
cooperative measurement effort
governed by a steering committee
composed of representatives from EPA,
National Park Service, other Federal
agencies, and Regional-State
organizations. A total of 110 monitoring
stations in Class I visibility areas have
particulate matter samplers to measure
speciated PM2.5 and PM10 mass. Select
stations also deploy transmissometers
and nephelometers to measure light
extinction and scattering, respectively,
as well as automatic camera systems.
Some IMPROVE stations include an O3
monitor. The objectives of IMPROVE
are: (1) To establish current visibility
and aerosol conditions in mandatory
Class I areas; (2) to identify chemical
species and emission sources
responsible for existing man-made
visibility impairment; (3) to document
long-term trends for assessing progress
towards the national visibility goal; and
(4) with the enactment of the Regional
Haze Rule, to provide regional haze
monitoring representing all visibility-protected Federal Class I areas where
practical. The IMPROVE stations
provide very useful information on
regional-scale particulate matter
concentrations which can help States
and EPA attribute urban concentrations
of PM2.5 to local versus regional sources
and to types of sources. More
information on the IMPROVE program
is available on its Internet Web site,
https://vista.cira.colostate.edu/improve/.
4. Data Storage and Dissemination
Systems
a. Air Quality System. The AQS stores
data collected from over 10,000
monitors, about 5,500 of which are
currently active for criteria pollutants.
The AQS also contains meteorological
data, air toxics data, descriptive
information about each monitoring
station (including its geographic
location and its operator), and data
quality assurance/quality control
information. The EPA and other AQS
users rely upon the system data to
assess air quality, assist in attainment
and non-attainment designations,
evaluate SIP, perform modeling for
permit review analysis, and other air
quality management functions. The
AQS information is also used to prepare
reports for Congress as mandated by the
CAA. The AQS Web site address is:
https://www.epa.gov/ttn/airs/airsaqs/
index.htm.
b. AIRNow. AIRNow is a cross-government Web site (https://airnow.gov/)
that provides the public with easy
access to national air quality
information. The Web site offers a daily
forecast of conditions and associated
health effects, known as the Air Quality
Index (AQI), as well as real-time
conditions for more than 300 cities
across the country. The AQI focuses on
health effects that may occur within a
few hours or days after breathing
polluted air. The EPA calculates the
AQI for ground-level ozone, particulate
matter, carbon monoxide, sulfur
dioxide, and nitrogen dioxide. The
AIRNow Web site displays nationwide
and regional real-time PM2.5 and ozone
air quality maps for 48 States and parts
of Canada. The air quality data used in
these maps and to generate forecasts are
collected using FRM, FEM, or
techniques approved by State
monitoring agencies.
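The AQI itself is computed by piecewise-linear interpolation between concentration breakpoints, per EPA's AQI reporting provisions in appendix G to 40 CFR part 58. A minimal sketch follows; the 8-hour ozone breakpoints shown are an illustrative subset, not the full regulatory table.

```python
# Sketch of the AQI piecewise-linear interpolation. The breakpoints
# below are an illustrative subset for 8-hour ozone (ppm); the
# authoritative table is in appendix G to 40 CFR part 58.

OZONE_8HR_BREAKPOINTS = [
    # (conc_lo, conc_hi, index_lo, index_hi)
    (0.000, 0.064,   0,  50),   # Good
    (0.065, 0.084,  51, 100),   # Moderate
    (0.085, 0.104, 101, 150),   # Unhealthy for Sensitive Groups
]

def aqi(conc):
    """Interpolate the index linearly within the matching breakpoint."""
    for c_lo, c_hi, i_lo, i_hi in OZONE_8HR_BREAKPOINTS:
        if c_lo <= conc <= c_hi:
            return round((i_hi - i_lo) / (c_hi - c_lo) * (conc - c_lo) + i_lo)
    raise ValueError("concentration outside tabulated breakpoints")
```

For example, under these illustrative breakpoints an 8-hour ozone concentration of 0.070 ppm falls in the Moderate range and maps to an index in the 51-100 band.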
c. Other existing data systems. Other
existing data systems for ambient air
quality-related data include EPA’s
National Emission Inventory (NEI) and
AirData. The NEI database at https://
www.epa.gov/ttn/chief/
eiinformation.html provides
information about sources that emit
criteria air pollutants and estimates of
annual air pollutant emissions from
point, nonpoint, and mobile sources.
The EPA compiles the NEI database
from emissions inventories compiled by
State and local environmental agencies
based on State reporting requirements in
40 CFR part 51, agency rulemaking
databases, and the Toxic Release
Inventory data from industry. The EPA
updates the NEI database every 3 years.
The AirData Web site at https://
www.epa.gov/air/data/ provides annual
summaries of ambient monitoring and
emissions inventory data from the AQS
and NEI. The database includes
emission estimates from all 50 States
plus the District of Columbia, Puerto
Rico, and the U.S. Virgin Islands, and
provides data in a variety of formats.
Other web-based data systems related to
ambient air concentration data include
VIEWS (https://vista.cira.colostate.edu/
views/) to support analysis of visibility-related data from the IMPROVE
network, and Web sites to support
analysis of CASTNET (https://
www.epa.gov/castnet/data.html) and
NADP (https://nadp.sws.uiuc.edu/) data
sets.
5. EPA Funding
The EPA has historically funded part
of the cost of installation and operation
of monitors to meet Federal monitoring
requirements to defray costs for State,
local, and tribal governments. Sections
103 and 105 of the CAA allow EPA to
provide grant funding for programs for
preventing and controlling air pollution
and for some research and development
efforts. States must apply for section 103
grants and State agencies must provide
nonfederal matching funds for section
105 grants.
C. What Revisions to the National
Ambient Air Quality Standards for
Particulate Matter Also Are Proposed
Today?
1. PM2.5: Primary Standards, Secondary
Standard, and Federal Reference
Method
Elsewhere in this Federal Register,
we are proposing revisions to the
National Ambient Air Quality Standards
(NAAQS) for particulate matter (PM).
Under the proposal, the 24-hour
primary standard for PM2.5 would be
reduced from the current level of 65
micrograms per cubic meter (µg/m3) to
35 µg/m3 (based on the three-year
average of the annual 98th percentile
concentrations). We also are proposing
to retain the level of the current annual
PM2.5 standard at 15 µg/m3 and to add
additional constraints to the use of
spatial averaging to demonstrate
compliance with that standard. The EPA
is also proposing to revise the current
secondary standards for PM2.5 by
making them identical to the suite of
proposed primary standards.
The NAAQS proposal would also
make several changes to the Federal
reference method (FRM) for PM2.5 in 40
CFR part 50, appendix L. These changes
would improve the operation and
maintenance aspects of the PM2.5
monitoring network. Specifically, we
are proposing to adopt the ‘‘very sharp
cut cyclone’’ (VSCC) as an approved
second-stage impactor. The performance
of the VSCC separator is equivalent to
that of the WINS (Well Impactor Ninety
Six) impactor currently specified in the
proposed reference method and has a
considerably longer service interval. We
also are proposing to approve dioctyl
sebacate as an alternative oil
for use in the WINS, to extend the
maximum allowed time to recover
filters from samplers, and to modify the
filter transport temperature and post-sampling time requirements for final
laboratory analysis.
2. PM10-2.5: Primary Standard,
Secondary Standard, and Federal
Reference Method
The NAAQS proposal would also
revise the current 24-hour primary
standard for PM10 by replacing the
indicator with a PM10-2.5 indicator. The
proposed PM10-2.5 indicator is qualified
so as to include any ambient mix of
PM10-2.5 that is dominated by
resuspended dust from high-density
traffic on paved roads and PM generated
by industrial sources and construction
sources, and exclude any ambient mix
of PM10-2.5 that is dominated by rural
windblown dust and soils and PM
generated by agricultural and mining
sources. This standard shall not require
control of agricultural sources and
mining sources. The proposed level of
the standard is 70 µg/m3, based on the
three-year average of the annual 98th
percentile concentrations.
Accordingly, the proposed revisions
to the NAAQS include a new FRM for
measuring PM10-2.5 (Reference Method
for the Determination of Coarse
Particulate Matter as PM10-2.5 in the
Atmosphere) to be codified in a new
appendix O to 40 CFR part 50. The
proposed FRM is based on the
combination of two low-volume, filter-based methods, one for measuring PM10
and the other for measuring PM2.5, and
determines the PM10-2.5 measurement by
subtracting the PM2.5 measurement from
the concurrent PM10 measurement. The
PM2.5 measurement method is identical
to the PM2.5 FRM currently specified in
40 CFR part 50, appendix L (Reference
Method for the Determination of Fine
Particulate Matter as PM2.5 in the
Atmosphere), with the proposed
changes described above. The PM10
measurement method is very similar
and utilizes a sampler that is the same
as the PM2.5 sampler, except that it has
no PM2.5 particle size separator
downstream of the PM10 separator.
Thus, this proposed PM10-2.5 FRM is
based on the same aerodynamic particle
size separation and filter-based,
gravimetric technology that is also the
basis of the FRM for PM2.5 (with the
proposed changes described above).
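The difference measurement at the heart of the proposed FRM can be sketched as follows. This is an illustration of the arithmetic only, not the text of proposed appendix O; in particular, the handling of a negative difference shown here is an assumption for illustration.

```python
# Sketch of the proposed PM10-2.5 FRM's difference measurement: the
# coarse fraction is the concurrent PM10 measurement minus the PM2.5
# measurement from the matched sampler pair. Negative-difference handling
# is an illustrative assumption, not the treatment in proposed appendix O.

def pm_coarse(pm10_ug_m3, pm2_5_ug_m3):
    """PM10-2.5 concentration by difference from a concurrent sampler pair."""
    if pm2_5_ug_m3 > pm10_ug_m3:
        # PM2.5 is a subset of PM10, so a negative difference indicates a
        # measurement problem; flag it rather than report negative mass.
        raise ValueError("PM2.5 exceeds PM10; check the sampler pair")
    return pm10_ug_m3 - pm2_5_ug_m3
```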
3. Data Handling Procedures for PM2.5
and PM10-2.5
In the PM NAAQS proposal published
elsewhere in today’s Federal Register,
EPA is also proposing to revise the
conditions under which spatial
averaging of the annual primary PM2.5
NAAQS would be permitted. We also
propose to move the criteria for
determining if spatial averaging is
acceptable from section 2.8.1.6.1 of
appendix D to 40 CFR part 58 to
appendix N of 40 CFR part 50
(Interpretation of the National Ambient
Air Quality Standards for PM2.5). We
also propose to add a new appendix P
to 40 CFR part 50 (Interpretation of the
National Ambient Air Quality Standards
for PM10-2.5) to provide data handling
procedures for PM10-2.5.
4. Revocation of National Ambient Air
Quality Standards for PM10
In the PM NAAQS proposal, we are
proposing to revoke the current annual
PM10 standard immediately should we
finalize the primary standards for
PM10-2.5 proposed in that notice.
Further, we propose that the current 24-hour PM10 standard be revoked in all
areas except for 20 areas listed in
section III of the NAAQS proposal
preamble.
D. How Do the Monitoring Data Apply
to Attainment or Nonattainment
Designations and Findings?
The criteria for determining when it is
appropriate to compare ambient
monitoring data from a specific monitor
and period to a National Ambient Air
Quality Standard (NAAQS) are an
important element of the air quality
management system because they can
identify which geographic areas have air
quality problems and may be designated
as nonattainment.
Later sections of this preamble,
discussing the proposed monitoring
requirements for the proposed PM10-2.5
NAAQS and the proposed provisions for
special purpose monitors (SPM), discuss
the use of monitoring data for
attainment or nonattainment
designations. We are also proposing a
change related to the required spacing
between ozone (O3) monitors and
roadways. Finally, we are proposing
changes to some quality assurance
requirements. This section of the
preamble provides background
information on current EPA policy and
regulations in order to facilitate
informed public comment on these
aspects of today’s proposal.
There are some preconditions to the use
of data from an ambient monitor for
comparison to an NAAQS that generally
apply to the current NAAQS for O3,
PM10, PM2.5, CO, SO2, NO2, and Pb, with
a few exceptions and/or the opportunity
for waiver by EPA.16 These include the
following:
• The monitoring site must represent
ambient air, as defined in 40 CFR 50.1
(i.e., ‘‘that portion of the atmosphere,
external to buildings, to which the
general public has access’’). In practical
terms, this means that data from
monitoring sites within the boundaries
of a privately-owned facility to which
public access is restricted, for example,
a storage yard of a factory, are not
eligible for comparison to the NAAQS.
(On occasion, EPA has relied on data
from such sites when the air sampled is
ambient air, even though the monitor
may be sited on a facility to which
public access is restricted (e.g., the
monitor is very close to a fence line and
is monitoring the conditions that are
present in the adjacent publicly
accessible property).) Data from a
monitor in ambient air as so defined can
be compared to the NAAQS even if
members of the public infrequently
16 Monitors that have received waivers are eligible
for comparison to their respective NAAQS.
come near the monitor’s location (e.g.,
O3 monitors that are located on the
ground on high elevation mountain
sites). However, data from monitors
located high above standing/walking
ground level, such as on a high roof or
tower, are not eligible for comparison to
an NAAQS. It should be noted that
although monitors are often sited with
the intention to represent an area of a
certain geographic scale, in general, a
monitor need not be representative of
the ambient air quality across an area of
any specific size to be eligible for
comparison to most NAAQS. However,
as described in section IV.E.2 of this
preamble, the current annual PM2.5
NAAQS is an exception, and the
proposed 24-hour PM10-2.5 NAAQS
would be an exception. (See also the
item in this list regarding proximity of
O3 and CO monitors to roadways.)
• The monitor must use a Federal
reference method (FRM) or Federal
equivalent method (FEM).
• The monitoring data must be
technically valid so as to be truly
representative of the actual air quality at
its location during the sampling period,
subject to the normal limitations of the
FRM or FEM when properly operating.
Generally, this means that the monitor’s
operation and subsequent sample
handling and laboratory analysis, if
applicable, must observe minimum
quality assurance (QA) procedures, as
set forth in 40 CFR 58.10 and 40 CFR
part 58, appendices A and B
(consolidated into a single appendix A
in the proposed amendments), to guard
against equipment malfunction,
miscalibration, drift, or operator error.
When States document that these
procedures have been followed, the data
are presumed to be valid although
specific evidence of instrument faults or
procedural errors can cause EPA to
disregard data from particular periods.
When documentation on whether these
specific procedures have been followed
is not available to EPA, as may be the
case if a State has not submitted QA
data to the Air Quality System (AQS) or
if the monitoring was performed by a
non-State organization not subject to the
QA requirements in 40 CFR part 58,
appendices A and B, the validity of data
is considered on a case-by-case basis if
the issue is raised by EPA, the State, or
another party during an NAAQS
designation process.
• The monitoring probe inlet (or open
path, for open path monitors) must meet
certain requirements for distance from
adjacent roadways. This is a feature of
the current monitoring requirements in
40 CFR part 58, appendix E (Probe and
Monitoring Path Siting Criteria for
Ambient Air Quality Monitoring) and
the proposed amendments.17 Ozone
monitors too close to a roadway may be
measuring air in which O3 has been
scavenged by nitric oxide (NO). Carbon
monoxide and NO2 monitors that are too
close to a roadway can measure
concentrations that do not represent
likely human exposures of any
significant frequency or duration.
Requirements regarding spacing from
roadways can be waived if no other
suitable site is available.
• The monitoring probe inlet (or open
path, for open path monitors) must meet
certain minimum distance limits for
proximity to nearby obstructions, such
as walls of buildings.
• The probe height above the surface
on which the public would stand or
walk nearby must be within a certain
range so that the air it samples is
reasonably representative of what the
public breathes when near the monitor.
This requirement can be waived for
practicality reasons.
• The monitoring data must be
sufficiently complete according to
requirements defined for each NAAQS
in 40 CFR part 50, appendices H, I, K,
and N (a new appendix P proposed
elsewhere in today’s Federal Register
would add completeness requirements
for PM10-2.5).18
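A data completeness check of the kind defined in those appendices can be sketched as follows. The 75 percent quarterly threshold is shown only as an example; the binding completeness criteria for each NAAQS are those in 40 CFR part 50.

```python
# Sketch of a per-quarter completeness check of the kind defined for each
# NAAQS in 40 CFR part 50 appendices. The 75 percent threshold is an
# illustrative example, not the text of any particular appendix.

def quarter_complete(valid_samples, scheduled_samples, threshold=0.75):
    """True if the fraction of valid scheduled samples meets the threshold."""
    return valid_samples / scheduled_samples >= threshold
```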
In addition to these generally
applicable preconditions or restrictions,
the current requirements of 40 CFR part
58 contain the following special
provisions for PM2.5:
• Data from a PM2.5 monitor can be
compared to the annual or 24-hour
PM2.5 NAAQS only if its location is
''population-oriented.'' 19 ''Population-
oriented monitoring or sites'' is
described in 40 CFR 50.1 as applying to
residential areas, commercial areas,
recreational areas, industrial areas, and
other areas where a substantial number
of people may spend a significant
fraction of their day.
17 Minimum separation distance requirements in
the current rule apply to O3, NO2, CO, Pb (for
stations designed to assess concentrations from
mobile sources) and PM (PM10 and PM2.5). Under
the proposed amendments, minimum separation
distance requirements would apply to O3, oxides of
nitrogen (NO, NO2, NOX, NOy), CO, PM (PM10,
PM2.5, PM10-2.5) and Pb for stations designed to
assess concentrations from stationary or mobile
sources.
18 Interpretation of the 1-Hour Primary and
Secondary National Ambient Air Quality Standards
for Ozone; Interpretation of the 8-Hour Primary and
Secondary National Ambient Air Quality Standards
for Ozone; Interpretation of the National Ambient
Air Quality Standards for PM10; Interpretation of
the National Ambient Air Quality Standards for
PM2.5; and Interpretation of the National Ambient
Air Quality Standards for PM10-2.5, respectively.
19 Section 2.8.1.2.3 of appendix D to 40 CFR part
58 states that PM2.5 data from state or local air
monitoring systems (SLAMS) and special purpose
monitors (SPM) that are ''* * * representative of
relatively unique population-oriented microscale or
localized hot spot or unique population-oriented
middle scale impact sites are only eligible for
comparison to the 24-hour PM2.5 NAAQS.''
However, under certain circumstances, the Regional
Administrator may approve population-oriented
microscale or middlescale impact sites for
comparison to the annual NAAQS.
• Data from a PM2.5 monitor that is
located in a ‘‘microscale’’ location,
meaning it is influenced by a nearby
emissions source while locations
somewhat further away would be much
less influenced, can be compared to the
annual PM2.5 NAAQS only if its location
is representative of many other locations
in the surrounding urban area, such that
significant numbers of people can be
expected to have similar PM2.5
concentration exposures as people
living, working, or visiting the location
of the monitor in question (section
2.8.1.2.3 of appendix D to 40 CFR part
58).
• Under certain conditions, a State
may, with the approval of EPA, average
data from specified monitors for
purposes of comparing the data to the
annual PM2.5 NAAQS. To be approved
for spatial averaging, as it is known,
monitors must meet certain
requirements for relative location and
measure concentrations as specified in
section 2.8 of appendix D to 40 CFR part
58 (section 4.7.5 of proposed appendix
D to 40 CFR part 58).20
• The first two complete calendar
years of data from an SPM for PM2.5 may
be excluded from comparisons to the
PM2.5 NAAQS, but only if the monitor
is not continued beyond those 2 years
(section 2.8.1.2.2 of appendix D to 40
CFR part 58).
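The spatial averaging provision above can be sketched as follows: the average of annual means from an EPA-approved group of monitors, rather than each monitor alone, is compared to the annual PM2.5 NAAQS. The monitor eligibility criteria (relative location and concentration requirements) are not modeled in this illustration.

```python
# Sketch of spatial averaging for the annual PM2.5 NAAQS: the average of
# annual mean concentrations across an approved monitor group is compared
# to the 15 ug/m3 level. Eligibility criteria for the group (section 2.8
# of appendix D to 40 CFR part 58) are not modeled here.

def spatial_average(annual_means_ug_m3):
    """Average of annual means from an approved group of monitors."""
    return round(sum(annual_means_ug_m3) / len(annual_means_ug_m3), 1)

def meets_annual_standard(avg_ug_m3, level=15.0):
    """Compare the spatial average to the annual PM2.5 NAAQS level."""
    return avg_ug_m3 <= level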
The first three of these four special
provisions for PM2.5 are tied to the
reliance by EPA on community
epidemiology studies in setting the form
and levels of the annual and 24-hour
PM2.5 NAAQS. In simple terms, EPA
determined that the levels of these
NAAQS would be appropriately
protective of public health based on a
presumption that NAAQS compliance
determinations would be made using
data only from monitors that
represented concentrations to which a
large portion of the population would be
exposed, even though some individuals
would have higher or lower exposures.
Finally, EPA has policies addressing
situations in which natural events and
exceptional events have, or may have,
influenced monitored concentrations.
Under these policies, States may make
the case that data from an otherwise
eligible monitor from a specific period
20 Changes to the requirements for spatial
averaging are proposed elsewhere in this Federal
Register.
should not be used in comparisons to
the NAAQS. We expect to revise these
policies and codify them in 40 CFR part
50 in a separate rulemaking.21
IV. Proposed Monitoring Amendments
A. What Are the Proposed Terminology
Changes?
In 40 CFR 58.1, we propose to replace
the definition of ‘‘National Air
Monitoring Stations (NAMS)’’ with a
new definition for the ‘‘National Core
(NCore)’’ network. The NCore
designation 22 structure would be based
on a tiered system of measurements
including complex research-oriented
stations,23 multipollutant stations
equipped to support a better
understanding of ozone, particulate
matter (PM), and PM precursors, and
sites with as few as one measured
pollutant identified as State and Local
Air Monitoring Stations (SLAMS) that
are primarily intended to support
compliance with the National Ambient
Air Quality Standards (NAAQS).
We are proposing to add a definition
for the term, ‘‘approved regional
methods’’ (ARM) to 40 CFR 58.1. This
term refers to alternative PM2.5 methods
that have been approved by EPA for use
specifically within a State, local, or
tribal air monitoring network for
purposes of comparison to the NAAQS
and to meet other monitoring objectives,
but which may not have been approved
as Federal equivalent methods (FEM) for
nationwide use. The proposed testing
criteria for approval of ARM are
specified in 40 CFR part 58, appendix
C (Ambient Air Monitoring
Methodology).
In 40 CFR 53.1, we are proposing to
revise the definition of the term ‘‘Class
III equivalent method’’ to apply only to
continuous or semi-continuous methods
having 1-hour (or less) measurement
resolution. The revised definition would
read:
* * * an equivalent method for PM2.5 or
PM10-2.5 that is an analyzer capable of
providing PM2.5 or PM10-2.5 ambient air
measurements representative of 1-hour
or less integrated PM2.5 or PM10-2.5
concentrations as well as 24-hour
measurements determined as, or
equivalent to, the mean of 24
consecutive 1-hour measurements.
21 These policies on natural and exceptional
events will be discussed in the preamble to the
Natural and Exceptional Events rule to be published
in the near future.
22 Because the terms, SLAMS and NAMS, are
used extensively throughout the current rules, this
terminology change results in numerous changes.
For clarity, we are publishing the entire text of 40
CFR part 58, appendix D (Network Design Criteria
for Ambient Air Quality Monitoring).
23 The NCore research grade station designation is
defined in the proposed amendments in
anticipation that these stations will be initiated at
some time in the future. We are not proposing to
require (or to fund) NCore research grade stations
in this notice.
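The 24-hour capability required of a Class III analyzer under the revised definition can be sketched as follows (an illustration of the arithmetic, not regulatory text):

```python
# Sketch: under the revised Class III definition, a continuous analyzer's
# 24-hour value is the mean of 24 consecutive 1-hour measurements.

def daily_mean_from_hourly(hourly_ug_m3):
    """24-hour concentration from 24 consecutive 1-hour measurements."""
    if len(hourly_ug_m3) != 24:
        raise ValueError("expected 24 consecutive 1-hour measurements")
    return sum(hourly_ug_m3) / 24.0
```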
Restricting the Class III definition as
proposed would offer a technical
advantage by allowing the establishment
of more tolerant minimum performance
limits than would be necessary if non-continuous methods were included.
We are also proposing to add a
definition of the term ‘‘PM10c’’ to 40 CFR
53.1. This term refers to PM10
measurements obtained with a
specially-approved sampler that meets
more demanding performance
specifications than high-volume PM10
samplers described in 40 CFR part 50,
appendix J (Reference Method for the
Determination of Particulate Matter as
PM10 in the Atmosphere).
Measurements obtained with PM10c
samplers are intended to be paired with
PM2.5 measurements from Federal
reference method (FRM) samplers as
part of the difference measurement
(PM10-2.5 equals PM10c minus PM2.5)
specified in the proposed appendix O to
40 CFR part 50 (Reference Method for
the Determination of Coarse Particulate
Matter as PM10-2.5 in the Atmosphere)
published elsewhere in today’s Federal
Register.
B. What Are the Proposed Requirements
for Approval of Reference or Equivalent
Methods?
The provisions of 40 CFR part 50 and
related appendices define certain
ambient air monitoring methods (or
methodology) as reference methods for
the purpose of determining attainment
of the National Ambient Air Quality
Standards (NAAQS). Under 40 CFR part
53, EPA designates specific commercial
instruments or other versions of
methods as Federal reference methods
(FRM). Furthermore, to foster the
development of improved alternative air
monitoring methods, EPA also
designates alternative methods that are
shown to have comparable performance
as Federal equivalent methods (FEM).
Explicit performance tests, performance
standards, and other requirements for
designation of both FRM and FEM are
provided in 40 CFR part 53 for each of
the criteria pollutants. Only designated
reference or equivalent methods may be
used in the States’ air surveillance
monitoring networks. A list of all
methods that EPA has designated as
either FRM or FEM for all criteria
pollutants is available at www.epa.gov/
ttn/amtic/criteria.html.
Elsewhere in this Federal Register,
EPA is proposing a new reference
method (40 CFR part 50, appendix O)
for the measurement of coarse
particulate matter (PM) in the ambient
air. Concurrent with the proposal of this
new reference method, EPA is also
proposing amendments to 40 CFR part
53 to extend the designation provisions
to methods for PM10-2.5. These proposed
amendments would set forth explicit
tests, performance standards, and other
requirements for designation of specific
commercial samplers, sampler
configurations, or analyzers as either
FRM or FEM for PM10-2.5, as appropriate.
The EPA recognizes that the PM10-2.5
reference method, while providing a
good standard of performance for
comparison to other methods, is not
itself optimal for routine use in large
PM10-2.5 monitoring networks.
Accordingly, EPA is specifically
encouraging the development of
alternative methods (and particularly
continuous monitoring methods) for
PM10-2.5 by focusing on the explicit test
and qualification requirements
necessary for designation of such types
of methods as equivalent methods for
PM10-2.5. Virtual-impactor technology
provides a more direct measurement of
PM10-2.5 and can provide an integrated
PM10-2.5 sample filter for chemical
species analyses that can be important
in the development of PM10-2.5 control
strategies. Continuous (or semi-continuous) methods for PM10-2.5
typically provide significant operational
advantages over 24-hour integrated
monitoring methods, such as a self-contained automatic measurement
process for output of nearly real-time
measurements, reduced on-site service
and off-site filter analysis and support
requirements, and measurement
resolution of one-hour or less. In
addition, corresponding provisions for
considering the designation of
continuous or semi-continuous
equivalent methods for PM2.5 are also
being proposed, since such provisions
are similar to those for PM10-2.5 and are
not currently included in 40 CFR part
53. The nature of the proposed new
provisions for automated methods,
which can accommodate a wide range of
potential PM10-2.5 or PM2.5 measurement
technologies, is based primarily on
ambient air testing at diverse monitoring
sites to demonstrate that the level of
comparability to collocated reference
method measurements is adequate to
meet established data quality objectives.
Furthermore, some existing requirements for designation of alternative, non-continuous methods for PM2.5 would be modified to be more consistent with the more advanced new requirements for non-continuous methods for PM10-2.5 and for continuous methods.24

1. Proposed Requirements for Candidate Reference Methods for PM10-2.5

Because of the nearly complete similarity between the specifications of the proposed PM10-2.5 reference method and the existing PM2.5 reference method, the proposed designation requirements for PM10-2.5 reference methods are essentially the same as those for PM2.5 reference methods.25 In fact, EPA proposes that a PM10-2.5 sampler pair consisting of samplers that have been shown to meet the PM2.5 reference method requirements (except for the PM2.5 particle size separator in the case of the PM10c sampler) may be designated as a PM10-2.5 reference method without further testing.

2. Proposed Requirements for Candidate Equivalent Methods for PM10-2.5

As noted, EPA will strive to encourage the development of improved alternative air monitoring methods by providing for their designation as equivalent methods. But developing suitable qualification requirements for equivalent methods for PM10-2.5 is complicated by the complex physical and chemical nature of PM, the definition of PM10-2.5 that to some extent incorporates the nature of the measurement technique defined in the reference method, and a wide variety of alternative PM2.5 measurement techniques that are or may become available or may be technically feasible. Alternative methods must be shown to provide concentration measurements closely comparable to those obtained with reference methods. Thus, the requirements established for designation of equivalent methods must identify candidate methods that can achieve that goal, while also having reasonable testing protocols that are not so extensive or burdensome as to effectively inhibit approval of adequate and suitable improved or alternative candidate methods.

In light of these constraints, EPA previously defined three classes of PM2.5 candidate equivalent methods in 40 CFR part 53 with progressively greater equivalent method qualification burdens. Class I equivalent methods are limited to methods having ''* * * only minor deviations or modifications * * *'' from the specified reference method and have the most modest requirements for equivalent method designation (in addition to the applicable reference method designation requirements). Class II equivalent methods include other filter-based, integrated, gravimetric-type methods similar to the reference method, but with greater deviation than allowed for Class I. Class III equivalent methods include all other candidate PM2.5 methods not classified as Class I or II. The proposed amendments would extend the definition of Class I, Class II, and Class III candidate equivalent methods to PM10-2.5.

Because Class I equivalent methods for PM10-2.5 differ only very modestly from PM10-2.5 reference methods, designation requirements would also be very similar. The EPA is proposing that PM10-2.5 Class I equivalent methods be designated if the samplers of the PM10-2.5 sampler pair are shown to meet all requirements for either PM2.5 reference methods or Class I equivalent methods. As for PM10-2.5 reference methods, no further tests would be required.

One type of Class II equivalent sampler for PM10-2.5 could be based on virtual impactor technology, which is designed to separate coarse mode aerosols from fine mode aerosols. The resulting size-segregated filter samples could be of great importance to State, local, and tribal agencies to obtain PM10-2.5 sample filters for chemical speciation analyses. Class II methods, having greater deviation from the reference method, would have more extensive designation requirements. These methods still typically have many similarities to the reference method, and therefore, many of the reference method designation requirements would apply to Class II candidate methods. Generally, these methods must be subject to extensive laboratory and wind-tunnel tests to determine their performance relative to the performance of the reference method. However, for methods that have only one substantial difference from the reference method specifications (such as a virtual impactor particle-size separator), only those laboratory tests pertaining to the performance of the deviating component would be required. Further, for methods that have more deviation from the reference method specifications, the proposed requirements would provide an option to substitute more extensive field comparison tests for some or all of the extensive laboratory tests that would otherwise be required. Since such additional field tests would be similar to field test requirements proposed for PM10-2.5 methods, concurrent field testing for PM2.5 and PM10-2.5 methods could be carried out. Concurrent testing would substantially reduce the testing burden for candidate equivalent methods that measure both PM2.5 and PM10-2.5 (such as a dichotomous, virtual impactor sampler), which could be tested simultaneously for designation as an equivalent method for both PM indicators.

24 For this reason, we view our proposal as consistent with the objectives of section 6102 of the Transportation Equity Act for the 21st Century. See section VI.5 of the preamble for the proposed amendments to the National Ambient Air Quality Standards for particulate matter published elsewhere in this Federal Register.

25 The proposed PM10-2.5 reference method specifies a pair of samplers consisting of a conventional PM2.5 sampler and a special PM10 sampler. The PM2.5 sampler must meet all requirements for a PM2.5 reference method in 40 CFR part 50, appendix L. However, the PM10 sampler required by the proposed method is not a conventional PM10 sampler as described in 40 CFR part 50, appendix J; rather, it is a sampler specified to be identical to the PM2.5 sampler of the pair, except that the PM2.5 particle size separator is removed. This special PM10 sampler is identified as a ''PM10c'' sampler to differentiate it from conventional PM10 samplers that meet the lesser requirements of 40 CFR part 50, appendix J.
3. Continuous Methods for PM10-2.5
The EPA recognizes that filter-based
measurement methods for either PM2.5
or PM10-2.5 that require manual
gravimetric analysis, as embodied in the
corresponding reference methods, as
well as Class I and Class II equivalent
methods, are by nature very labor
intensive. They are expensive to operate
in routine monitoring networks and can
generally provide only delayed
reporting of multiple-hour integrated
measurements. Self-contained,
continuous-type automated monitoring
methods (analyzers), such as those that
are commonly used for monitoring
various gaseous pollutants, overcome
many of these shortcomings. Various
types of continuous (or nearly
continuous) analyzers have been
developed or are under development for
PM2.5 and PM10-2.5 that offer substantial
advantages over manual methods for
implementation in routine air
monitoring. These advantages include
reduced operational cost, greater
practicality for daily operation,
availability of short-term measurements
such as one-hour averages, and the
possibility for near real-time,
telemetered measurement acquisition.
Accordingly, EPA is very interested in
encouraging the further development of
these continuous-type methods by
providing requirements for designating
such methods as Class III equivalent
methods, so that they can be used in
monitoring networks. Because no such
explicit requirements exist, EPA is
today proposing new Class III
designation requirements for both PM2.5
and PM10-2.5.
Unfortunately, the continuous-type
methods for PM2.5 and PM10-2.5 often
tend to have performance characteristics
somewhat different than those of the
corresponding reference method.
Consequently, adequate comparability
to the corresponding reference method
measurements may be technically
difficult to achieve. Thus, the
comparability testing requirements for
Class III candidate methods must be
sufficiently sophisticated to effectively
differentiate between a method that
shows adequate comparability and one
that does not. At the same time, the
designation qualification requirements
must not be impractically extensive or
burdensome, such that monitoring
instrument manufacturers seeking
designation for their analyzers cannot
afford or economically justify the testing
regimen.
We are proposing to narrow the
definition of Class III equivalent
methods to apply only to continuous or
semi-continuous analyzer methods
having one-hour (or less) measurement
resolution, because such methods are of
the most interest to the air quality
monitoring community. While it would
be possible to develop new,
noncontinuous (or non-semicontinuous)
PM2.5 or PM10-2.5 methods that would be
categorized as Class III as currently
defined, there is little, if any, technical
need or economic incentive for
instrument manufacturers to do so.
Restricting the Class III definition to
continuous analyzers, as proposed,
would offer a substantial technical
advantage by allowing the establishment
of somewhat more tolerant limits of
adequate comparability than would be
necessary if non-continuous methods
were included. This statistical
advantage arises because the analyzers
are operated continuously rather than
on an intermittent, one-in-six day or
one-in-three day schedule, which is
typical of manually operated sampler
methods.
Any of the currently existing or
proposed requirements for designation
of reference methods and Class I and
Class II equivalent methods for PM2.5 or
PM10-2.5 that would or should
reasonably apply to a specific Class III
candidate method would be required for
the candidate Class III equivalent
method, as well. But because of the
wide variety of measurement techniques
or technologies possible for a Class III
candidate method, many of these
existing requirements would not, or may
not, apply. Therefore, the proposed
requirements for PM2.5 and PM10-2.5
Class III candidate equivalent methods
are based largely on demonstrating
comparability between candidate
method measurements and concurrent
reference method measurements when
both methods are collocated at several
diverse monitoring sites and during different
seasonal periods. These proposed
requirements would be added to subpart
C of 40 CFR part 53. Because we intend
that most of the PM10-2.5 monitors in the
network use continuous or semi-continuous methods, the proposal of
Class III approval requirements is
particularly important for PM10-2.5.
Although candidate PM2.5 and
PM10-2.5 Class III equivalent methods
would have hourly measurement
resolution, this capability would not be
subject to comparability requirements
because both PM2.5 and PM10-2.5 FRM
have only 24-hour measurement
capability.
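The comparability demonstration described above can be sketched as a collocated comparison: candidate method measurements regressed against concurrent reference method measurements, summarized by slope, intercept, and correlation. The simple least-squares summary below is an illustration only; the actual statistics and acceptance limits are those proposed for subpart C of 40 CFR part 53.

```python
# Sketch of a collocated comparability summary for a candidate Class III
# method: ordinary least-squares slope, intercept, and correlation of
# candidate versus concurrent reference method measurements. The binding
# statistics and limits are in proposed subpart C of 40 CFR part 53.

def comparability_stats(candidate, reference):
    """Slope, intercept, and correlation of candidate vs. reference data."""
    n = len(candidate)
    mean_ref = sum(reference) / n
    mean_cand = sum(candidate) / n
    sxx = sum((x - mean_ref) ** 2 for x in reference)
    syy = sum((y - mean_cand) ** 2 for y in candidate)
    sxy = sum((x - mean_ref) * (y - mean_cand)
              for x, y in zip(reference, candidate))
    slope = sxy / sxx
    intercept = mean_cand - slope * mean_ref
    correlation = sxy / (sxx * syy) ** 0.5
    return slope, intercept, correlation
```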
In developing these proposed new
requirements for PM2.5 and PM10-2.5
Class III candidate equivalent methods,
EPA has attempted to provide
requirements that effectively reject
inadequately comparable methods while
minimizing the testing burden to the
extent possible. Because the
performance characteristics of Class III
methods are likely to vary at monitoring
sites having differing climatic and
aerosol conditions, comparison tests
would be required at sites in three
specified areas of the continental U.S.
during winter and summer seasons
(winter in only one of the areas). The
EPA believes these requirements would
provide the minimum of test venues
necessary to represent an adequate
degree of monitoring site diversity for
designation of a candidate equivalent
method. However, EPA specifically
solicits comments on the adequacy of
the proposed geographical test areas, the
appropriateness of the proposed
seasonal requirements, and whether an
additional test site may be needed
(including the nature of such an
additional site).
4. Specific Requirements for Class III
Equivalent Methods
The proposed amendments to 40 CFR
part 53 would revise the requirements
for comparison tests and the allowable
quantitative deviation from reference
method measurements that are based on
statistical analyses. The EPA has
previously used a documented
procedure 26 and a special computer
software aid 27 to establish data quality
objectives (DQO) for PM2.5 monitoring
data so that such data can be used
effectively in making decisions
regarding attainment of the NAAQS for
PM. Using these established DQO and
the software, statistical analyses of both
26 U.S. Environmental Protection Agency.
Guidance for the Data Quality Objectives Process.
EPA QA/G–4, EPA/600/R–96/055. August 2000.
27 U.S. Environmental Protection Agency (2004b)
DQO Companion Tool, Version 2.0. 2004. https://
www.epa.gov/ttn/amtic/dqotool.html.
actual and simulated PM2.5 monitoring
data 28 29 were carried out to confirm the
suitability of the statistical parameters
selected to describe a comparison
relationship between the candidate and
reference methods and to set
appropriate and optimal limits for their
values in the proposed Class III
equivalent method tests. These
quantitative requirements then define
the minimum candidate method
comparability performance that would
be necessary to provide PM2.5
monitoring data of sufficient quality to
meet the established DQO.30 The DQO
for PM10-2.5 monitoring data have
recently been developed and are
incorporated into 40 CFR part 58,
appendix A. These DQO are similar to
the DQO for PM2.5. Accordingly, the
requirements proposed for PM10-2.5
methods are similar to those proposed
for PM2.5 methods.31 Furthermore,
similar or parallel requirements are also
proposed for Class II equivalent
methods for PM10-2.5 as well as for PM2.5.
However, the proposed requirements for
Class II equivalent methods for PM10-2.5
are stricter with regard to additive bias
(intercept) since this method would also
support other monitoring objectives.
These latter requirements proposed for
PM2.5 Class II methods would replace
the existing test requirements with the
more advanced, DQO-based
requirements.
The parameters selected to estimate
the performance of PM2.5 and PM10-2.5
Class II and Class III candidate method
measurements relative to the
performance of the reference method in
the proposed field tests are precision,
correlation, and the linear regression
slope and intercept of a linear plot fitted
to corresponding candidate and
reference method mean measurement
data pairs. Statistical analyses based on
the DQO model show that the precision
of a candidate method is not,
statistically, very important to annual
concentration averages used for NAAQS
attainment decisions, but would be
28 Data Quality Objectives for PM Continuous
Methods. Prepared for U.S. Environmental
Protection Agency by ManTech Environmental
Technology, Inc. EPA Contract 68–D–00–206,
Report TR–4423–03–08, June 2003.
29 Data Quality Objectives for PM Continuous
Methods II. Prepared for U.S. Environmental
Protection Agency by ManTech Environmental
Technology, Inc. EPA Contract 68–D–00–206.
Report TR–CAN–04–02, June 2004.
30 Criteria for Designation of Equivalence
Methods for Continuous Surveillance of PM2.5
Ambient Air Quality. Prepared for U.S.
Environmental Protection Agency by B. Coutant
and J. Sanford, Battelle Columbus, EPA Contract
68–D–02–061, 2004.
31 Method Equivalency Development for PM10-2.5. Prepared for U.S. Environmental Protection Agency by B. Coutant, Battelle Columbus, 2005.
important for a daily standard. Precision
is also consequential for other important
aspects and applications of the PM2.5 or
PM10-2.5 monitoring data. Accordingly,
the proposed amendments would
include a minimum requirement for an
estimate of the candidate method
precision for 24-hour measurements.
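The comparability statistics named above (the correlation and the linear regression slope and intercept of candidate versus reference mean measurement pairs) can be illustrated with a short sketch. This is not part of the proposed regulatory text; the function and variable names are hypothetical, and it shows only the standard least-squares computation implied by the discussion:

```python
# Illustrative sketch (not regulatory text): comparability statistics
# for paired candidate and reference method mean concentrations.
def comparability_stats(candidate, reference):
    """Return (slope, intercept, correlation) for paired 24-hour means."""
    n = len(candidate)
    mean_c = sum(candidate) / n
    mean_r = sum(reference) / n
    sxx = sum((r - mean_r) ** 2 for r in reference)          # reference variance sum
    syy = sum((c - mean_c) ** 2 for c in candidate)          # candidate variance sum
    sxy = sum((r - mean_r) * (c - mean_c)                    # cross-product sum
              for r, c in zip(reference, candidate))
    slope = sxy / sxx                                        # multiplicative bias
    intercept = mean_c - slope * mean_r                      # additive bias
    correlation = sxy / (sxx * syy) ** 0.5
    return slope, intercept, correlation
```

A candidate method tracking the reference perfectly would yield a slope of 1, an intercept of 0, and a correlation of 1.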
A minimum requirement for an
estimate of reference method precision
in the tests, as well as a test for possible
anomalous reference method
measurement values, also are proposed
to ensure that the quality of the
reference method measurements used
for the test meets the expected reference
method performance. The proposed
numerical limits for the Class II and III
precision test requirements for both the
reference and candidate methods are
somewhat larger than those currently
prescribed for Class I PM2.5 methods
because the Class II and III precision
would be calculated as the root mean
square average, rather than the simple
average, of the daily precision values
determined from multiple samplers or
instruments. This more statistically
appropriate aggregation of precision is
consistent with the way precision
would be expressed under proposed
revisions to the data quality assessment
provisions in appendix A to 40 CFR part
58.
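The difference between the root mean square aggregation proposed here and the simple average it replaces can be sketched as follows (an illustration only, with hypothetical names):

```python
# Illustrative sketch: aggregating daily precision values by root mean
# square (RMS), as proposed for Class II and III tests, versus the
# simple arithmetic average used for Class I PM2.5 methods.
def rms_precision(daily_values):
    """RMS average of daily precision values (e.g., coefficients of variation)."""
    return (sum(v ** 2 for v in daily_values) / len(daily_values)) ** 0.5

def simple_average_precision(daily_values):
    return sum(daily_values) / len(daily_values)
```

Because the RMS average is never smaller than the simple average of the same values, somewhat larger numerical limits are consistent with the same underlying method performance.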
As noted above, the proposed revision
to the definition for Class III equivalent
methods would require such methods to
provide one-hour (or less) concentration
measurements, because such short-term
measurements are useful for a variety of
applications. The EPA proposes that
hourly measurements from Class III
comparability tests be recorded and
submitted as part of the required test
data. No requirement for the precision
of these hourly measurements is
included in the proposed amendments
because no one-hour DQO have been
established for either PM2.5 or PM10-2.5
measurements and neither of the PM2.5
or PM10-2.5 reference methods provide
one-hour data or performance goals.
Nevertheless, in view of the substantial
potential utility of one-hour PM2.5 and
PM10-2.5 measurements, EPA solicits
comments on whether requirements for
one-hour measurement precision should
be included in the Class III equivalent
method designation requirements. In
particular, comments are requested on
whether such requirements, if included,
should provide merely an assessment of
one-hour precision or a specified
standard of performance, and if the
latter, to what extent would it be
appropriate to reject a candidate method
that exhibited poor one-hour precision
but adequate 24-hour precision.
The regression comparability
parameters proposed for Class II and
Class III candidate methods would be
interpreted in ways somewhat different
from those now used for determining
candidate method comparability for
other types of candidate equivalent
methods for PM. The slope
(multiplicative bias) and intercept
(additive bias) are the performance
parameters most critical in achieving
the DQO for making correct attainment
decisions. However, these parameters
are interrelated, and statistical analyses
of simulated PM2.5 data 32 show that the
allowable limits for the intercept can be
somewhat less stringent if they are made
to be variable and related to the value
obtained for the slope. Accordingly,
EPA is proposing variable, slope-dependent limits for the intercept.
Further, because Class III PM2.5 and
PM10-2.5 equivalent methods would be
redefined as continuous or semi-continuous methods, such methods
would normally be operated
continuously, just as continuous
gaseous pollutant analyzers are, rather
than on a one-day-in-six sampling
schedule typically used for PM2.5
reference method sampling. Again,
statistical analyses 33 show that this
more frequent (daily) sampling allows
the intercept limits to be set even wider
than would be needed for one-in-six day
sampling and still meet the established
DQO. The actual intercept limits for
PM10-2.5 methods proposed today are
somewhat more restrictive than the
analyses would indicate to provide a
factor of safety to account for inherent
differences between the way candidate
methods would be operated in the
proposed equivalent method tests and
the way they would be operated
routinely in State monitoring networks.
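The idea of a slope-dependent intercept limit can be sketched in code. The actual numerical limits appear in the proposed tables in 40 CFR part 53, subpart C; the linear form and parameters below are purely hypothetical illustrations of how such a variable limit might be evaluated:

```python
# Hypothetical sketch of a slope-dependent intercept test. The true
# limits are those in the proposed 40 CFR part 53, subpart C tables;
# the base/relax parameters here are illustrative only.
def intercept_within_limits(slope, intercept, base=1.0, relax=10.0):
    """Return True if the additive bias falls inside a band whose width
    depends on how close the slope is to 1 (no multiplicative bias)."""
    half_width = base + relax * max(0.0, 0.1 - abs(slope - 1.0))
    return -half_width <= intercept <= half_width
```

Under this illustrative form, a candidate with negligible multiplicative bias is allowed a wider additive bias band than one whose slope already deviates from 1.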
Another difference in the way the
conventional comparison parameters
would be interpreted relates to the
proposed lower limit requirement for
the comparison correlation. The
correlation test is instrumental in
detecting longer-term method
variability, such as seasonal bias. By its
nature, the correlation value calculated
for the comparison is quite dependent
on the range of concentrations measured
in the tests. The comparison tests are
subject to the actual PM2.5 or PM10-2.5
concentrations available at the test site,
which are generally related to variable
atmospheric conditions during the test
period and consequently may
32 Battelle Columbus (2004).
33 ManTech Environmental Technology, Inc. (June 2003); ManTech Environmental Technology, Inc. (June 2004); Battelle Columbus (2004); Battelle Columbus (2005).
sometimes occur in a rather narrow
range. Therefore, the minimum value
proposed for this statistic is not a fixed
value but rather a variable that is related
to the concentration coefficient of
variation (CCV), which is a measure of
the range of the concentrations
measured in the test. This variable limit
for correlation would provide a more
effective test without unnecessarily
failing test data representative of an
unfortunately limited range of test
concentrations.
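The concentration coefficient of variation (CCV) and a variable correlation minimum tied to it can be sketched as follows. The CCV computation is the standard one (sample standard deviation divided by the mean); the functional form and parameters of the minimum-correlation rule are hypothetical, as the actual limit is specified in the proposed 40 CFR part 53, subpart C:

```python
# Illustrative sketch: CCV and a CCV-dependent correlation minimum.
# The rule's actual limit is in proposed 40 CFR part 53, subpart C;
# the floor/ceiling/scale parameters here are hypothetical.
def ccv(concentrations):
    """Concentration coefficient of variation: sample std dev / mean."""
    n = len(concentrations)
    mean = sum(concentrations) / n
    var = sum((x - mean) ** 2 for x in concentrations) / (n - 1)
    return (var ** 0.5) / mean

def min_correlation(ccv_value, floor=0.75, ceiling=0.97, scale=0.5):
    # A narrow concentration range (small CCV) earns a lower required
    # correlation, so a test is not failed merely because ambient
    # levels happened to vary little during the test period.
    return min(ceiling, floor + scale * ccv_value)
```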
One minor difference from the
reference method would be necessitated
by the proposed Class III comparison
tests. The proposed reference methods
for PM2.5 and PM10-2.5 specify a
sampling period tolerance of 23 to 25
hours. Experience has shown that in
multiple-sampler candidate method
tests, which may be frequently
combined with tests of additional
instruments to reduce overall testing
costs, the time required to properly
change sample filters and service the
samplers and other instruments between
sample periods often requires more than
one hour. Accordingly, the proposed
test protocol would allow a 22-hour
minimum sample period for the
reference method to allow complete
sample set acquisition within a 24-hour
period. This proposed revision in the
reference method protocol should have
very little, if any, adverse impact on the
results of the comparability tests.
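The proposed sampling-period tolerance can be stated compactly: 23 to 25 hours under the reference methods, with a 22-hour minimum allowed for reference samplers operated in comparison tests. A small hypothetical validity check illustrates it:

```python
# Illustrative sketch of the proposed sampling-period tolerance:
# 23-25 hours normally, with a proposed 22-hour minimum for reference
# samplers in multi-sampler comparison tests. Helper is hypothetical.
def sample_period_valid(hours, in_comparison_test=False):
    lower = 22.0 if in_comparison_test else 23.0
    return lower <= hours <= 25.0
```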
The proposed requirements for
PM10-2.5 and PM2.5 Class II and Class III
equivalent methods are the least
stringent requirements that would
provide reasonable assurance that
candidate methods meeting these
requirements will produce monitoring
data of quality commensurate with the
quality of reference method data and
that the data will meet the DQO
established for PM2.5 and the proposed
DQO for PM10-2.5. While recent field
studies suggest that some potential PM10-2.5
continuous methods look promising,34
it is not certain at this time whether any
current commercial continuous or
nearly continuous methods can yet meet
the proposed requirements for Class III
methods. However, EPA believes that
the establishment of these requirements
would provide a definitive goal which
instrument manufacturers could
achieve.
5. Proposed Changes to Requirements
for PM10 and PM2.5 Class I and Class II
Equivalent Methods
The proposed amendments would
revise the existing provisions for PM10
and PM2.5 Class I and II candidate
equivalent methods. These changes
would clarify or simplify current
provisions or implement minor
improvements to test protocols
suggested by experience and
information acquired in processing
equivalent method applications for
these methods. The proposed changes
would have very little, if any, impact on
the nature, efficacy, or extent of any of
the test requirements.
In the tests for PM10 and PM2.5 Class
I and II candidate equivalent methods,
the minimum separation distance
between sampler or analyzer inlets is
proposed to be reduced from 2 meters
to 1 meter for instruments having flow
rates less than 200 liters per minute.
One meter separation has been found to
be entirely adequate for such low-flow-rate instruments, and the change is
consistent with a similar minimum
separation allowance for audit samplers
used in assessing the precision of
network PM2.5 samplers.35 An identical
change is also proposed for appendix A
to 40 CFR part 58.
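The proposed separation rule reduces to a simple flow-rate threshold, sketched below with a hypothetical helper (the governing text is the proposed amendment itself):

```python
# Illustrative sketch: proposed minimum inlet separation distance.
# 1 meter for instruments with flow rates under 200 liters per minute,
# 2 meters otherwise. Function name is hypothetical.
def min_inlet_separation_m(flow_liters_per_min):
    return 1.0 if flow_liters_per_min < 200.0 else 2.0
```

For example, a 16.7 L/min PM2.5 sampler would need only 1 meter of separation, while a high-volume PM10 sampler at roughly 1,130 L/min would still need 2 meters.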
Another proposed change would
replace existing requirements for Class
II PM2.5 equivalent methods with similar
but new DQO-based requirements.
These proposed requirements are
similar to the Class III requirements and
would be based on daily sampling.
Therefore, PM10-2.5 and PM2.5 Class II
equivalent methods used for
determining compliance with the PM
NAAQS would generally be restricted to
daily operation. However, as discussed
previously, filter-based integrated
methods (such as Class II equivalent
methods) are not likely to be widely
used for compliance monitoring. These
methods would be used more for
chemical analysis of samples to
characterize the species of PM in a
monitoring area, which would not
require daily operation of the samplers.
For Class II methods (for either PM2.5
or PM10-2.5), the test sites
would be similar in character to those
for Class III methods, but only two test
sites (one eastern and one western)
rather than three, and tests in only one
season at any time of year rather than
two seasons, would be required. These
34 U.S. Environmental Protection Agency. Multi-Site Evaluations of Candidate Methodologies for Determining Coarse Particulate Matter (PM10-2.5) Concentrations: August 2005 Updated Report Regarding Second-generation and New PM10-2.5 Samplers.
35 Quality Assurance Guidance Document: Field
Standard Operating Procedures for the PM2.5
Performance Evaluation Program. U.S.
Environmental Protection Agency. Office of Air
Quality Planning and Standards, November 1998,
Section 4, page 8.
requirements would allow PM2.5 and
PM10-2.5 methods (or Class II and Class
III methods) to be tested
simultaneously, to reduce testing costs.
Samplers under the existing PM2.5 FRM
and proposed PM10-2.5 FRM would be
operated under conditions of actual
ambient temperature and barometric
pressure, ensuring compatibility of the
measured sample flows. The EPA
solicits comments on the adequacy and
appropriateness of these test
requirements for Class II methods.
In addition, the proposed
amendments would lower many of the
minimum concentration limit
specifications for various existing test
requirements for PM10 and PM2.5 Class
I and Class II candidate equivalent
methods. These minimum limits were
established either to avoid possible
difficulties with interpretation of test
results due to increased measurement
variability that often occurs at very low
concentrations or to require a wide
range of concentration measurements
for the test. However, experience has
shown that these lower limits are
unnecessarily conservative and can be
decreased considerably without
encountering undue variability in the
measurements or an insufficient range
of concentrations. Further, applicants
often have difficulty obtaining a
sufficient number of measurement sets
that meet some of these minimum
limits. The proposed decreases in these
minimum limits would reduce the
number of test measurement sets that
are rejected as unacceptable due to test
concentration levels failing to meet the
test requirements without
compromising the efficacy of the tests.
These changes would reduce the costs
to applicants of conducting the tests.
6. Other Proposed Changes
The proposed amendments would
make subpart C of 40 CFR part 53 easier
to understand by consolidating the
provisions for the various types of
candidate equivalent methods. This
reorganization results in numerous
minor editorial and section number
changes of no technical impact. The
entire text of 40 CFR part 53, subpart C
is reprinted in the proposed
amendments.
We are proposing numerous minor
changes which are needed to
incorporate new provisions for PM10-2.5
methods into subparts A, C, E, and F of
40 CFR part 53, as well as a few minor
changes that would apply to methods
for PM2.5 or other pollutants. As noted
above, the definition of a ‘‘Class III
equivalent method’’ in 40 CFR 53.1
would be modified to include only
methods that provide automated
continuous or semi-continuous
measurements of PM2.5 and PM10-2.5
with one-hour or less resolution. We are
also proposing definitions for the terms,
‘‘PM’’, ‘‘PM10-2.5 sampler’’, and ‘‘PM10C
sampler’’. Another proposed change, to
paragraph (4) of 40 CFR 53.3 (General
requirements for an equivalent method),
would clarify that Class III PM10-2.5 and
PM2.5 candidate equivalent methods
would be subject to applicable
requirements for PM10-2.5 or PM2.5
reference methods contained in those
reference methods (40 CFR part 50,
appendixes L and O) and applicable
requirements for Class I and Class II
equivalent methods contained in
subparts E and F of 40 CFR part 53, in
addition to the proposed amendments to
subpart C. The requirement in 40 CFR
53.5 (Processing of applications) to
publish a notice in the Federal Register
upon receipt of an application would be
deleted, as would the requirements in
40 CFR 53.51(f)(2) and 53.2(a) for
manufacturers of PM2.5 designated
method samplers to submit an annual
Product Manufacturing Checklist. These
requirements have proved to be of little
value, and the significant cost burden to
the Government and to applicants for
these activities can therefore be
eliminated. The proposed amendments
would also delete the requirement in 40
CFR 53.8 (Designation of reference and
equivalent methods) for publishing a
notice of designation in the Federal
Register no later than 15 days after the
date of the determination. We are
proposing to delete the 15-day
requirement because it is not achievable
within the confines of EPA’s internal
review process.
C. What Are the Proposed Requirements
for Quality Assurance Programs of the
National Ambient Air Monitoring
System?
A quality system provides a
framework for planning, implementing
and assessing work performed by an
organization and for carrying out
required quality assurance (QA) and
quality control (QC) activities. The
proposed amendments to 40 CFR part
58, appendix A would provide the
requirements necessary to develop
quality systems for the NCore, State and
Local Air Monitoring Stations (SLAMS),
and Prevention of Significant
Deterioration (PSD) networks. The
proposed revisions address
responsibilities for implementing the
quality system for both EPA and
monitoring organizations, as well as
adherence to the Agency’s QA policy,
data quality objectives (DQO), and the
minimum QC requirements and
performance evaluations needed to
assess the data quality indicators of
precision, bias, detectability, and
completeness. In addition, the proposed
amendments would describe the
required frequency of the QC
requirements and performance
evaluations, the data to be collected,
and the statistical calculations for
estimates of the data quality indicators
at various levels of aggregation. The
revised statistical calculations would be
used to determine attainment of the
DQO. The proposed amendments would
also identify national programs that
help determine data quality
comparability across individual
monitoring programs.
The EPA has not conducted a
thorough review of the quality system
for many years. Based on our review of
the existing QA program in 40 CFR part
58, appendices A and B, we are
proposing changes to make the
requirements consistent with our
current QA policy, meet the objectives
of the NCore, SLAMS, and PSD
monitoring networks, and make the
requirements more user-friendly. These
proposed changes would produce a
more consistent QA program across
pollutant categories that fosters use of
new technologies by more directly
linking instrument performance with
programmatic objectives. The proposed
revisions were developed with the
assistance of a stakeholder group (QA
Strategy Workgroup) composed of QA
representatives from EPA, State, local,
and tribal monitoring organizations.
Recommendations from the workgroup
are provided in one of the draft versions
of the National Ambient Air Monitoring
Strategy document.36 We solicit
comments on all of the following
proposed amendments to 40 CFR part
58, appendix A.
1. Consolidation of Quality Assurance
Requirements
The requirements for State and local
air monitoring stations (SLAMS) and
prevention of significant deterioration
(PSD) monitoring stations have been
combined from two separate
appendices, 40 CFR part 58, appendices
A and B, into one single appendix A
because both programs have similar QA
requirements.
36 The National Ambient Air Monitoring Strategy (Final Draft). U.S. Environmental Protection Agency. Office of Air Quality Planning and Standards, April 2004. Some of the detailed content of the April 2004 draft, including some of the workgroup recommendations, are not included in the subsequent December 2005 version.
2. Realignment to Current EPA Quality
Assurance Policies
EPA Order 5360.1 A2 requires
agencies that accept Federal grant
funding for their air monitoring
programs to have a QA program with
certain elements including quality
management plans (QMP), quality
assurance project plans (QAPP), and a
person designated as the quality
assurance manager. Many of these
elements are not in the existing
regulations, which predate EPA Order
5360.1 A2 (revised in 2000), but would
now be added under today’s proposal.
Grantee agencies have been following
the requirements of EPA Order 5360.1
A2 for several years, and as a result, we
do not expect these proposed revisions
would have a significant impact on
resources beyond the existing program.
Copies of EPA Order 5360.1 A2 are
available in the docket for this proposal
as well as on EPA’s Internet site
https://www.epa.gov/quality1.
A QMP is a document that describes
an organization’s quality system
including its policy and procedures,
functional responsibilities of
management and staff, and other general
practices of its data collection program.
Project-specific details are documented
in a QAPP. A QAPP would document,
for example, how the PM2.5 air
monitoring network will be operated
and how sampler performance will be
controlled and data quality evaluated.
EPA Order 5360.1 A2 requires grantee
agencies involved with data collection
activities to identify a quality assurance
manager. The proposed amendments to
40 CFR part 58, appendix A would
require each State (or delegated
monitoring agency) to identify and
maintain a ‘‘QA management function’’.
This proposed language captures the
essence of the requirements in EPA
Order 5360.1A2, while befitting the
nature of the ambient air monitoring
community which is made up of large
and small (local and tribal)
organizations.
The EPA also proposes to revise the
QA program by emphasizing the DQO
process. A DQO is a qualitative and
quantitative statement that defines the
appropriate quality of data needed for a
particular decision—for example, the
data quality necessary for EPA or a
monitoring organization to make data
comparisons against the National
Ambient Air Quality Standards
(NAAQS). The DQO help to establish
the requirements for precision, bias,
completeness, and detectability and the
rationale for their acceptance criteria.
The proposed amendments would
require monitoring organizations to
evaluate PM10-2.5 and ozone monitoring
system performance through the DQO
process. This is consistent with the
existing requirement for organizations to
evaluate their PM2.5 monitoring system
performance using the DQO process.
Priority for these evaluations is placed
on PM2.5, PM10-2.5, and ozone as these
are the pollutants of most concern
across the country. Quality assurance
procedures such as determining
precision through collocated sampling
and determining bias through an
independent performance evaluation
program for PM10-2.5 are proposed to
follow the same basic approach as the
PM2.5 monitoring network. The
proposed precision and bias
measurement uncertainty goals are
identified in 40 CFR part 58, appendix
A. The proposed amendments to
appendix A would also specify that EPA
is responsible for the development of
the DQO for NCore multi-pollutant
stations and State and local air
monitoring stations (SLAMS).
3. Quality Assurance Requirements for
PM10, PM10-2.5 and PM2.5
The proposed QA requirements for
PM10-2.5 would follow the same
approach as the requirements that
currently apply to both automated and
manual PM10 and PM2.5 monitors. These
requirements would include the
implementation of flow rate audits
conducted by the monitoring
organization, collocated monitoring, and
performance evaluations. Statistical
evaluations have allowed us to reduce
collocation and performance evaluation
sampling frequencies without
significant effects on data quality
assessments.
We are proposing to amend the PM2.5
and PM10 collocation sampling
frequency requirement. Statistical
assessments of the collocated PM2.5 and
PM10 data reveal that adequate estimates
of precision at the primary quality
assurance organization could be made at
a reduced sampling frequency.
Consequently, we are proposing to
reduce the frequency from every 6 days
to every 12 days. This change would
reduce the burden on the monitoring
organization without a significant effect
on precision estimates. This proposal
does not include a reduction in the
collocation requirements for total
suspended particulate (TSP) or PSD
monitors. In addition, we are proposing
to revise the concentration limits
applicable to collocated pairs of
monitors that are used to provide
precision estimates. The concentration
limits would be reduced from 6
micrograms per cubic meter (µg/m3) to
3 µg/m3 for PM2.5 and from 20 µg/m3 to
15 µg/m3 for PM10 (high-volume
samplers). Statistical evaluation of three
years of PM2.5 and PM10 data revealed
comparable estimates of precision using
data from both of these reduced
concentration ranges, and that the
addition of the data at these lower
ranges will increase the level of
confidence in the precision estimates.
This proposed change would make the
collocation sampling frequency
requirement consistent for PM2.5, PM10,
and PM10-2.5. A document describing the
possible new approach is available in
the docket.37
We are proposing to revise the
sampling frequency for the
implementation of the PM Performance
Evaluation Program (PEP). This
proposed approach used historical
PM2.5 precision and bias data to identify
the minimum number of performance
evaluations required for all primary
quality assurance organizations to
provide an adequate assessment of bias,
rather than the current requirement that
a uniform 25 percent of monitors in a
primary quality assurance organization
be evaluated each year. The revision
would establish an equitable sampling
frequency of five valid audits a year for
organizations with less than or equal to
five monitoring sites and eight valid
audits a year for those organizations
with greater than five monitoring sites.
A valid performance evaluation audit
means that both the primary monitor
and PEP audit concentrations are valid
and above 3 µg/m3. As an example, if a
primary quality assurance organization
had 20 monitoring sites, the current
requirement would require five sites (25
percent of network) to be audited four
times each year (one each quarter) for a
total of 20 audits. The new proposal
would simply require eight audits be
provided (distributed across each
quarter) and that all monitoring sites be
audited within a six year period in order
to provide a representative estimate of
bias for the monitoring network. This
would equate to distributing eight
audits (or five for networks less than or
equal to 5) at 15 percent of the
monitoring network sites. In addition,
each method designation must be
audited. Therefore, if a primary quality
assurance organization had two
different monitoring instruments in
their network, both would need PEP
audits each year. Since bias data quality
objectives are evaluated on 3 years of
PEP audits, both sampling frequencies
should provide us with reasonable
assessments of bias. Preliminary
37 Proposal to Change the PM2.5 and PM10 Collocation Sampling Frequency Requirement, https://www.epa.gov/ttn/amtic/pmqainf.html.
assessments of the impact of the
possible new method show that
organizations with smaller networks
would need more audits but fewer
audits would be needed at organizations
with larger networks. The net result
across all primary quality assurance
organizations would be fewer audits,
comparable bias results, and reduced
resource burden. A document
describing this possible approach is
available in the docket.38
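The arithmetic behind the proposed PEP audit frequency, including the 20-site example above, can be sketched directly (hypothetical helper names; the rule text in 40 CFR part 58, appendix A governs):

```python
# Illustrative sketch: annual PEP audit counts under the current
# 25-percent-of-sites-quarterly requirement versus the proposed
# fixed 5-or-8 audit schedule. Function names are hypothetical.
def current_annual_audits(num_sites):
    """25 percent of sites, each audited once per quarter (4x/year)."""
    return round(num_sites * 0.25) * 4

def proposed_annual_audits(num_sites):
    """Five valid audits for organizations with five or fewer sites,
    eight for larger organizations."""
    return 5 if num_sites <= 5 else 8
```

For the 20-site organization in the example, the current approach requires 20 audits a year while the proposal requires 8; a 4-site organization, by contrast, would go from 4 audits to 5.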
4. Requirements to Ensure Adequate
Independent Quality Assurance for All
Pollutants Subject to National Ambient
Air Quality Standards
We are proposing to revise the current
regulatory requirements dealing with
responsibilities for independent
assessments of monitoring system
performance. These evaluations are the
subject of sections 2.4 and 3.5.3.1 of the
current appendix A to 40 CFR part 58.
Section 2.4 of appendix A to 40 CFR
part 58 applies to all National Ambient
Air Quality Standards (NAAQS)
pollutants and section 3.5.3.1 is
applicable only to PM2.5. Currently,
section 2.4 of appendix A requires the
monitoring organization to ‘‘participate’’
in EPA’s National Performance Audit
Program (NPAP). For the last few years,
EPA has considered that monitoring
organizations are in compliance with
the requirements of section 2.4 if, at a
minimum, the organizations made their
monitoring sites and equipment
accessible to EPA or contractors for
conducting the performance
evaluations. For continuous gas
instruments, a performance evaluation
involves the introduction of a gas or
gases of independently known
concentration to determine the bias of
the local monitor.
Section 3.5.3.1 of appendix A to 40
CFR part 58 describes the Performance
Evaluation Program (PEP) for PM2.5. The
PEP requirements are functionally
similar to the NPAP requirements but
differ in their specifics because of the
nature of particulate matter sampling
(i.e., it is not possible to introduce air
with a known concentration of PM2.5
into a monitor). Under the PEP for
PM2.5, a local monitor is evaluated by
placing a second, independently-maintained Federal reference method
(FRM) monitor next to the local monitor
and allowing both monitors to sample
for 24 hours. The filter from the
independent FRM monitor is then
shipped to an independent laboratory
38 Review of the Potential to Reduce or Provide
a More Cost Efficient Means to Implement the PM2.5
Performance Evaluation Program, https://
www.epa.gov/ttn/amtic/pmpep.html.
where it is weighed and the resulting
independently calculated concentration
is compared to the concentration from
the local monitor. The resulting
difference in concentrations between
the independent FRM monitor and local
monitor is used to calculate the bias
between the sampler results.
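The comparison step described above can be pictured in a few lines. The sketch below is illustrative only, not a regulatory formula; the function name and concentrations are hypothetical, and it assumes relative bias is expressed as a simple percent difference for a single collocated 24-hour sample pair.

```python
# Illustrative sketch: percent-difference bias between a collocated
# independent FRM audit sampler and the local routine monitor for one
# 24-hour sample pair. Names and values are hypothetical.

def percent_difference(audit_conc, local_conc):
    """Percent difference of the local monitor relative to the audit FRM."""
    if audit_conc <= 0:
        raise ValueError("audit concentration must be positive")
    return 100.0 * (local_conc - audit_conc) / audit_conc

# Example: audit FRM reports 12.0 ug/m3, local monitor reports 13.2 ug/m3,
# so the local monitor reads 10 percent high for this pair.
d = percent_difference(12.0, 13.2)
```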
The current rule text does not clearly state whether the monitoring organization is responsible for having these PM2.5 performance evaluations take place, or only for giving access to its sites for EPA staff or contractors to perform them. In
practice, most monitoring organizations
comply with the requirements in section
3.5.3.1 by giving access to EPA staff or
contractors and by accepting that EPA
funds this activity by holding back part
of the grant funding that might
otherwise go directly to the monitoring
organization. One State complies with
requirements in section 3.5.3.1 by
having independent audits in one part
of the State performed by personnel and
laboratories from the monitoring
organization that is responsible for daily
operations in another part of the State.
The EPA proposes to revise the text of
40 CFR part 58, appendix A to clearly
provide that it is the responsibility of
each monitoring organization to make
arrangements for, and to provide any
necessary funding for, the conduct of
adequate independent performance
evaluations of all its FRM or Federal
equivalent method (FEM) criteria
pollutant monitors. The proposed
language would also clearly indicate
that it is the monitoring organization’s
choice whether to obtain its
independent performance evaluations
through EPA’s NPAP and PM2.5 PEP
programs, or from some other
independent organization. An
independent organization could be
another unit of the same agency that is
sufficiently separated in terms of
organizational reporting and which can
provide for independent filter weighing
and audit gas naming. This proposed
approach would ensure that adequate
and independent audits will be
performed but would provide flexibility
in the implementation approach.
Monitoring organizations that choose
to comply with the revised provisions of
appendix A to 40 CFR part 58 regarding
performance evaluations by relying on
EPA audits, for PM2.5, PM10-2.5, and/or
other NAAQS pollutants, would be
required to agree that EPA hold back
part of the grant funds they would
otherwise receive directly. The EPA
intends to develop guidance for
monitoring organizations that choose to
comply by obtaining audit services from
elsewhere. To ensure national
consistency and effective audits, this
guidance will include provisions for
EPA certification of data comparability
for audit services not provided by EPA
and for traceability of gases and other
audit standards to national standards
maintained by the National Institute of Standards and Technology.
5. Revisions to Precision and Bias
Statistics
We are also proposing to change the
statistics for assessment of precision and
bias for criteria pollutants. Two
important data quality indicators that
are needed to assess the achievement of
DQO are bias and precision. Statistics in
the current requirements of 40 CFR part
58, appendix A (with the exception of
PM2.5) combine precision and bias
together into a probability limit at the
primary quality assurance organization
level of aggregation. In addition, the
statistical calculations of precision and
bias vary among criteria pollutants and
between manual and automated
methods within the same pollutant.
Since the DQO process uses separate
estimates of precision and bias, we
examined assessment methods that were
statistically reasonable and simple. The
proposed assessment methods are based
on the QA measurements that are
currently required in 40 CFR part 58,
appendix A.
For sulfur dioxide (SO2), nitrogen
dioxide (NO2), carbon monoxide (CO),
and ozone (O3), we are proposing to
estimate precision and bias on
confidence intervals at the site level of
data aggregation rather than the primary
quality assurance organization.
Estimates at the site level can be
accomplished with the automated
methods for SO2, NO2, CO and O3
because there is sufficient QC
information collected at the site level to
perform adequate assessments. Since
the criteria pollutant data are used for
very important decisions (comparison to
the NAAQS), providing precision and
bias estimates at upper confidence
limits would provide a higher
probability of making appropriate
decisions. The intent of this proposed
change is to move organizations to a
‘‘performance-based’’ quality system.
Organizations that demonstrate
acceptable performance would be
allowed the flexibility to reduce the
frequency of certain QC checks. These
agencies are expected to shift resources
used for these QC checks into higher
priority QA work. A document
describing this possible new approach is
available in the docket.39
39 Proposal: New Method for Estimating Precision
and Bias for Gaseous Automated Methods for
Ambient Air Monitoring Program, https://
www.epa.gov/ttn/amtic/files/ambient/gagc/
proprecision.pdf.
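One way to picture a site-level bias estimate built from one-point QC check percent differences is sketched below. This is a simplified normal-approximation construction for illustration only, not the exact statistics proposed for appendix A (which, for small samples, would use a t-quantile rather than the normal quantile); the data and function name are hypothetical.

```python
# Illustrative sketch: upper confidence bound on the magnitude of the mean
# percent difference from a site's one-point QC checks, where each
# d_i = 100 * (measured - audit) / audit. Simplified normal approximation;
# not the regulatory formula.
from statistics import mean, stdev, NormalDist

def bias_upper_bound(percent_diffs, conf=0.95):
    """Upper confidence bound on |mean percent difference| at a site."""
    n = len(percent_diffs)
    m = mean(percent_diffs)
    s = stdev(percent_diffs)
    z = NormalDist().inv_cdf(conf)  # one-sided normal quantile
    return abs(m) + z * s / n ** 0.5

qc = [2.1, -1.4, 3.0, 0.8, 1.9, -0.5, 2.4, 1.1]  # hypothetical checks
upper = bias_upper_bound(qc)
```

Reporting the upper bound rather than the point estimate is what gives the "higher probability of making appropriate decisions" described above: a site is credited with no more accuracy than the data support.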
The precision and bias statistics for
PM measurements (PM10, PM10-2.5 and
PM2.5) would be generated at a primary
quality assurance organization level
because, unlike the gaseous pollutants,
only a percentage of the sites have
precision and bias checks performed in
any year. As with the gaseous
pollutants, the statistics would use the
confidence limit approach. Using a
consistent set of statistics would
simplify procedures by removing a
significant number of equations and
confusing language in the appendix.
We are also proposing to change the
precision and bias statistics for lead (Pb)
to provide a framework for developing
and assessing DQO. The QC checks for
Pb come in three forms: flow rate audits,
Pb audit strips, and collocation. The
EPA proposes to combine information
from the flow rate audits and the Pb
audit strips to provide an estimate of
bias. Precision estimates would still be
made using collocated sampling but the
estimates would be based on the upper
95 percent confidence limit of the
coefficient of variation, similar to the
method described for the automated
instruments.
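The upper 95 percent confidence limit on a coefficient of variation mentioned above can be sketched with the standard chi-square construction. The code below is an illustration under stated assumptions, not the appendix A formula: it treats the spread of collocated percent differences as the CV, and it uses a Wilson-Hilferty approximation to the chi-square quantile so that only the standard library is needed (in practice one would use, e.g., `scipy.stats.chi2.ppf`). All names and data are hypothetical.

```python
# Illustrative sketch: upper 95 percent confidence limit on a CV estimated
# from collocated-sampler percent differences. Chi-square construction with
# a Wilson-Hilferty quantile approximation; simplified stand-in only.
from statistics import stdev, NormalDist

def chi2_quantile(p, df):
    """Wilson-Hilferty approximation to the chi-square p-quantile."""
    z = NormalDist().inv_cdf(p)
    return df * (1 - 2 / (9 * df) + z * (2 / (9 * df)) ** 0.5) ** 3

def cv_upper_limit(percent_diffs, conf=0.95):
    n = len(percent_diffs)
    cv = stdev(percent_diffs)  # spread of percent differences, in percent
    # Dividing by the lower-tail chi-square quantile inflates the estimate,
    # giving a conservative (upper) bound on the true CV.
    return cv * ((n - 1) / chi2_quantile(1 - conf, n - 1)) ** 0.5

diffs = [4.0, -2.5, 3.1, 1.2, -0.8, 2.6, -1.9, 0.4, 3.3, -2.2]
```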
6. Program Updates
We are also proposing several QA
program changes to update the existing
requirements in 40 CFR part 58 to
reflect current program needs and
terminology:
• We are proposing to remove SO2
and NO2 manual audit checks. A review
of all SLAMS/NAMS/PAMS sites by
monitor type revealed that no
monitoring organizations are using
manual SO2 or NO2 methods, nor are
any monitoring organizations expected
to use these older technologies. Instead
of the old manual methods, monitoring
sites are using continuous methods to
perform these audit checks. We are
proposing to remove the manual method
QC checks because the continuous
check methods are covered by the
current QA procedures.
• We are proposing to change the
concentration ranges for QC checks and
annual audit concentrations. The one-point QC check concentrations for the
gaseous pollutants SO2, NO2, O3 and CO
would be expanded to include lower
concentrations. Lower audit ranges
would also be added to concentration
ranges in the annual audit
concentrations. Adding or expanding
the required range to lower
concentration ranges is appropriate due
to the lower measured concentrations at
many monitoring sites as well as the
potential for NCore stations to monitor
areas where concentrations are at trace
ranges. In addition, EPA proposes that
the selection of QC check gas
concentration must reflect the routine
concentrations normally measured at
sites within the monitoring network in
order to appropriately estimate the
precision and bias at these routine
concentration ranges.
• We are proposing to revise the PM10
collocation requirement. Currently, 15
percent of all PM2.5 sites are required to
maintain collocated samplers. For
consistency, the proposed amendments
would change the PM10 collocation
requirement to match the PM2.5
requirement. This proposed change
would make the collocation requirement
consistent for PM2.5, PM10, and PM10-2.5.
• We are proposing to amend the
PM2.5 and PM10 collocation sampling
frequency requirement. Statistical
assessments of the collocated PM2.5 and
PM10 data reveal that adequate estimates
of precision at the primary quality
assurance organization could be made at
a reduced sampling frequency.
Consequently, we are proposing to
reduce the frequency from every 6 days
to every 12 days. This change would
reduce the burden on the monitoring
organization without a significant effect
on precision estimates. This proposal
does not include a reduction in the
collocation requirements for total
suspended particulate (TSP) or PSD
monitors. In addition, we are proposing
to revise the concentration limits
applicable to collocated pairs of
monitors that are used to provide
precision estimates. The concentration
limits would be reduced from 6
micrograms per cubic meter (µg/m3) to
3 µg/m3 for PM2.5 and from 20 µg/m3 to
15 µg/m3 for PM10 (high-volume
samplers). Statistical evaluation of 3
years of PM2.5 and PM10 data revealed
comparable estimates of precision using
data from both of these reduced
concentration ranges, and that the
addition of the data at these lower
ranges will increase the level of
confidence in the precision estimates.
This proposed change would make the
collocation sampling frequency
requirement consistent for PM2.5, PM10, and PM10-2.5.
• We are proposing to revise the
requirements for PM2.5 flow rate audits.
Based on an evaluation of flow rate data
and discussions within the QA Strategy
Workgroup, we are proposing to reduce
the frequency of flow rate audits from
quarterly to semiannually and remove
the alternative method which allows for
obtaining the precision check from the
analyzer's internal flow meter without
the use of an external flow rate transfer
standard. Most monitoring organizations
participating in the QA Strategy
Workgroup considered auditing with an external transfer standard to be the
preferred method and believed that the
quarterly audit data demonstrates the
instruments are sufficiently stable to
reduce the audit frequency. The
proposed amendments would provide
an efficient and effective approach by
reducing audit frequency to an adequate
level while ensuring the use of a
preferred approach.
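The reduced concentration limits for collocated pairs, described in the bullets above, amount to a screening step before the precision calculation. The sketch below is an illustration only; the retention rule shown (both monitors at or above the cutoff) is an assumption made for the example, and the names are hypothetical.

```python
# Illustrative screening of collocated data pairs against the proposed
# reduced concentration limits before they enter the precision calculation.
# Assumption (for illustration only): a pair is retained when both monitors
# report at or above the cutoff for that PM indicator.

CUTOFFS = {"PM2.5": 3.0, "PM10": 15.0}  # ug/m3, proposed reduced limits

def valid_pairs(pairs, pollutant):
    cutoff = CUTOFFS[pollutant]
    return [(a, b) for a, b in pairs if a >= cutoff and b >= cutoff]

# Hypothetical collocated PM2.5 pairs (primary, duplicate) in ug/m3:
pm25_pairs = [(2.1, 2.4), (8.0, 7.6), (3.0, 3.3), (1.9, 3.5)]
kept = valid_pairs(pm25_pairs, "PM2.5")  # -> [(8.0, 7.6), (3.0, 3.3)]
```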
D. What Are the Proposed Monitoring
Methods for the National Ambient Air
Monitoring System?
1. Federal Reference Methods and
Federal Equivalent Methods
Monitoring methods used in the
multi-pollutant NCore and SLAMS
networks would include Federal
reference methods (FRM), Federal
equivalent methods (FEM), and other
methods designed to meet the data
quality objectives of the network being
deployed. When appropriate, the
proposed amendments place emphasis
on continuous methods over filter-based
methods to provide for highly time-resolved data for better characterization
of diurnal patterns of air pollution and
for timely public availability of data.
While more emphasis is placed on
continuous methods, a limited number
of filter-based methods would still be
retained in most networks to tie together
historical data sets with new monitoring
data. EPA’s strategy for selecting the
proposed monitoring methods for the
National ambient air monitoring system
was to select methods that meet data
quality objectives for each pollutant and
that have the most utility to support
multiple monitoring objectives.
Specifics on the monitoring methods
proposed for use at each type of site are
described below.
• A wide variety of research, FRM/
FEM or other routine methods could be
used at NCore research-grade stations.
Maximum flexibility is provided in the
proposed amendments for these sites
because they would be used to
investigate the atmospheric processes
and air chemistry that go beyond the
capabilities of characterizing the air
with routine monitoring methods.
• NCore multi-pollutant stations
would use FRM or FEM for criteria
pollutants when the expected concentrations of the pollutants are at or
near the level of the National Ambient
Air Quality Standards (NAAQS). For
criteria pollutant measurements of
carbon monoxide (CO) and sulfur
dioxide (SO2), where the level of the
pollutant is well below the NAAQS, it
may be more appropriate to operate
higher sensitivity monitors than FRM or
FEM. In these cases, the higher
sensitivity methods are expected to
support different monitoring objectives
than the FRM or FEM. In some limited
cases, higher-sensitivity gas monitors
have also been approved as FEM and
can serve both NAAQS and other
monitoring objectives. Options for high-sensitivity measurements of CO, SO2, and total reactive nitrogen (NOy) are
described in the report, ‘‘Technical
Assistance Document for Precursor Gas
Measurements in the NCore
Multipollutant Monitoring Network.’’
• State and local air monitoring
stations would use FRM or FEM for
criteria pollutants. For PM2.5, these sites
could also use approved regional
methods (ARM), which are described in
section IV.D.2 of this preamble.
• Photochemical assessment
monitoring stations (PAMS) would use
the ozone (O3) ultraviolet photometry
FEM and the nitric oxide (NO) and
nitrogen dioxide (NO2)
chemiluminescence FRM for criteria
pollutant measurements. Methods for
volatile organic compounds (VOC)
including carbonyls, additional
measurements of gaseous nitrogen, such
as NOy, and meteorological
measurements are routinely operated at
PAMS. Because these measurements are
not of criteria pollutants, the methods
are not subject to the requirements for
reference or equivalent methods.
However, these methods are described
in detail in the report, ‘‘Technical
Assistance Document (TAD) for
Sampling and Analysis of Ozone
Precursors.’’40
• Special purpose monitoring (SPM)
sites have no restrictions on the type of
method to be utilized. While FRM and
FEM can be employed at SPM sites,
other methods, including but not limited to continuous, high-sensitivity, and passive methods, may also be utilized.
Because SPM sites are designed to
encourage monitoring, agencies are
expected to design SPM sites with
methods to meet specific monitoring
objectives that may not be achievable
with FRM or FEM. For instance, a
community may be concerned with a
source impacting their neighborhood.
Because many PM FRMs are filter-based
manual methods, having a 24-hour
sample may not indicate if the source
impacted the neighborhood because of
the meteorological variability during the
sample collection period. However, a
continuous method may be able to
provide the high time resolution
40 Technical Assistance Document (TAD) for
Sampling and Analysis of Ozone Precursors. U.S.
Environmental Protection Agency. Human
Exposure and Atmospheric Sciences Division. EPA/
600–R–98/161. September 1998. Available at:
https://www.epa.gov/ttn/amtic/pams.html.
necessary to detect the short-term
impacts of a plume on a neighborhood.
Another example could be the
utilization of passive monitors deployed
at many locations to determine the
location of maximum concentrations
within a neighborhood. Additional
information on SPM is included in
section IV.E.9 of this preamble.
2. Approved Regional Methods for PM2.5
The proposed amendments also
expand the use of alternative PM2.5
measurement methods through
approved regional methods (ARM). The
proposed amendments to 40 CFR part
58, appendix C extend the existing
provisions for EPA approval of a
nondesignated PM2.5 method as a
substitute for a FRM or FEM at a
specific individual site to a network of
sites. This approval would be extended
on a network basis to allow for
flexibility in operating a hybrid network
of PM2.5 FRM and continuous monitors.
The size of the network, in which the
ARM could be approved, would be
based on the location of test sites
operated during the testing of the
candidate ARM. The proposed
amendments require that test sites be
located in urban and rural locations that
characterize a wide range of aerosols
expected across the network. A hybrid
network of monitors would be operated
to address monitoring objectives beyond
just determining compliance with
NAAQS. The hybrid network would
lead to a reduced number of existing
FRM samplers for direct comparison to
NAAQS and an increase in continuous
samplers that meet specified
performance criteria related to their
ability to produce sound comparisons to
FRM data. Those ARM that meet the
specified performance criteria would be
approved for direct comparison to PM2.5
NAAQS.
Performance criteria for approval of
ARM would be used to determine
whether the continuous measurements
are sufficiently comparable for
integration into the PM2.5 network used
in NAAQS decisions. These criteria are
the same criteria for precision,
correlation, and additive and
multiplicative bias that are proposed for
approval of continuous PM2.5 Class III
equivalent methods, described in
section IV.B.3 of this preamble. These
performance criteria would be
demonstrated by monitoring agencies
independently or in cooperation with
instrument manufacturers under actual
operational conditions using one to two
FRM and one to two candidate monitors
each. This would be a departure from
the very tightly-controlled approach
used for national equivalency
demonstration in which three FRM and
three candidate monitors are operated.
The ARM would be validated
periodically in recognition of changing
aerosol composition and instrument
performance. These validations would
be performed on at least two levels: (1)
Through yearly assessments of data
quality provided for as part of the ongoing quality assurance (QA)
requirements in 40 CFR part 58,
appendix A, and (2) through network
assessments conducted at least every 5
years as described in section IV.E.11 of
this preamble.
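The comparability demonstration against a collocated FRM can be pictured as an ordinary least-squares fit of the candidate monitor against the FRM: the slope approximates multiplicative bias, the intercept additive bias, and the correlation coefficient the strength of agreement. The sketch below is illustrative only; the data and function name are hypothetical, and the actual acceptance thresholds are those proposed for Class III equivalent methods in section IV.B.3.

```python
# Illustrative sketch: least-squares comparison of a candidate continuous
# monitor to a collocated FRM. Slope ~ multiplicative bias, intercept ~
# additive bias, r ~ correlation. Hypothetical data; thresholds not shown.

def compare_to_frm(frm, candidate):
    n = len(frm)
    mx, my = sum(frm) / n, sum(candidate) / n
    sxx = sum((x - mx) ** 2 for x in frm)
    syy = sum((y - my) ** 2 for y in candidate)
    sxy = sum((x - mx) * (y - my) for x, y in zip(frm, candidate))
    slope = sxy / sxx
    intercept = my - slope * mx
    r = sxy / (sxx * syy) ** 0.5
    return slope, intercept, r

frm = [10.0, 14.0, 8.0, 20.0, 12.0]       # FRM concentrations, ug/m3
cand = [10.5, 14.8, 8.2, 21.1, 12.4]      # candidate monitor, same days
slope, intercept, r = compare_to_frm(frm, cand)
```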
The testing criteria EPA is proposing
for approval of PM2.5 continuous
methods as ARM are intended to be
robust but not overly burdensome. The
two main facets of testing are the
duration and location(s) of testing. The
duration is expected to be one year to
provide understanding of the quality of
the data on a seasonal basis. The
locations for testing are expected to be
a subset of sites in a network where the
State desires the PM2.5 continuous
monitor to be approved as an ARM.
Testing would be carried out in multiple
locations to include up to two Core-based Statistical Area/Combined
Statistical Areas (CBSA/CSA) and one
rural area or small city for a new
method. For methods that have already
been approved by EPA in other
networks, one CBSA/CSA and one rural
area or small city would be required.
To ensure that approvals of new
methods are made consistently on a
national basis, the procedures for
approval of methods would be similar to
the requirements specified in 40 CFR
part 53, i.e., the EPA Administrator (or
delegated office) would approve the
application. However, to optimize
flexibility in the approval process, all
other monitoring agencies seeking
approval of a method that is already
approved in another agency’s
monitoring network may seek approval
through their own EPA Regional
Administrator. This approach should
provide a streamlined approval process,
as well as an incentive for consistency
in selection and operation of PM2.5
continuous monitors across various
monitoring agency networks.
The proposed QA requirements for
approval of continuous PM2.5 ARM at a
network of sites would be the same as
for FEM in 40 CFR part 58, appendix A,
except that 30 percent of the required
sites that utilize a PM2.5 ARM would be
collocated with an FRM and required to
operate at a sample frequency of at least
a one-in-six day schedule. The higher
collocation requirement would support
the main goal of the particulate matter
continuous monitoring implementation
plan, which is to have an optimized
FRM and PM2.5 continuous monitoring
network that can serve several
monitoring objectives. The current 15
percent collocation requirement in 40
CFR part 58, appendix A is adequate to
provide an estimate of site and network
precision; however, a higher amount of
collocation is necessary to retain a
minimum number of FRM for continued
validation of the ARM, direct
comparison to NAAQS, and for long-term trends that are consistent with the
historical data set archived in the Air
Quality System. The collocated sites are
to be located at the highest
concentration sites, starting with one
site in each of the largest population
CBSA or CSA in the network and
working to the next highest-population
CBSA or CSA with the second site and
so forth.
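The site-selection order described above (largest-population CBSA or CSA first, highest-concentration site within each) can be sketched as a simple greedy pass. This is an illustration only; the field names and data are hypothetical.

```python
# Illustrative sketch of the proposed ordering for ARM collocation sites:
# work down CBSAs/CSAs by population, taking the highest-concentration
# eligible site in each, until the required number of collocations is met.
# Field names and data are hypothetical, chosen for the example.

def pick_collocation_sites(areas, n_required):
    """areas: list of dicts like {"cbsa": str, "population": int,
    "sites": [(site_id, concentration), ...]}"""
    chosen = []
    for area in sorted(areas, key=lambda a: a["population"], reverse=True):
        if len(chosen) == n_required:
            break
        if area["sites"]:
            site_id, _ = max(area["sites"], key=lambda s: s[1])
            chosen.append(site_id)
    return chosen

areas = [
    {"cbsa": "A", "population": 5_000_000, "sites": [("A1", 14.2), ("A2", 16.8)]},
    {"cbsa": "B", "population": 9_000_000, "sites": [("B1", 12.1)]},
    {"cbsa": "C", "population": 1_200_000, "sites": [("C1", 18.0)]},
]
picked = pick_collocation_sites(areas, 2)  # -> ['B1', 'A2']
```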
E. What are the Proposed Requirements
for the Number and Locations of
Monitors To Be Operated by State and
Local Agencies?
The proposed amendments modify
the requirements in appendix D to 40
CFR part 58 for the number and
locations of monitors necessary to
support ambient air data objectives.
This proposal requires States to deploy
a new network of multipollutant
monitoring stations called the National
Core (NCore) network; requires States to
maintain robust networks for PM2.5 and
ozone (O3) and to establish a robust
monitoring network for PM10-2.5; allows
States to make major reductions in
monitoring for other criteria pollutants,
where concentration data are well below
the applicable National Ambient Air
Quality Standards (NAAQS) and are not
expected to pose future air quality
problems; and allows States to reduce
the number of stations required for the
NCore photochemical assessment
monitoring stations (PAMS) network.
We also propose to establish or modify
certain monitoring frequency
requirements.
This proposal allows for reductions in
air pollution monitoring for select
pollutants in geographic areas that do
not have or are not expected to have
related air quality problems, while
increasing or maintaining monitoring
sites in areas with continuing or new air
quality problems. The proposal allows
for reductions in the carbon monoxide
(CO), sulfur dioxide (SO2), nitrogen
dioxide (NO2), PM10, and lead (Pb) air
monitoring networks in geographic
areas with historically low
concentrations of these specific
pollutants, except in cases in which the
State implementation plan (SIP) or
source permits specifically require
certain monitoring. However,
monitoring requirements that are part of
a SIP or permit should be revisited as
part of the network assessments
described in section IV.E.11 of this
preamble. Overall, a limited number of
these monitors are still expected, but
not required, to be operated to support
studies of air quality trends, to allow
accountability for emissions control
programs, and for health effects studies.
This proposal also requires States to
increase or maintain monitoring sites in
most areas with continuing or new air
quality problems for O3 and PM2.5.
However, with EPA agreement, States
would be allowed to move some
monitors to better characterize the
spatial variability of these pollutants.
As discussed in section IV.E.2 of this
preamble, we also are proposing
requirements for the minimum
monitoring network for the proposed
PM10-2.5 NAAQS published elsewhere in
this Federal Register.41
Under the proposed monitoring
amendments, the PAMS network would
remain a requirement for serious,
severe, and extreme ozone
nonattainment areas. However, EPA is
promoting the development of more
individualized PAMS networks to suit
the specific data needs for a PAMS area.
We propose to make the PAMS
requirements more flexible to allow for
this redesign.
Minimum criteria pollutant
monitoring requirements, where
proposed for retention or addition,
would be based in part on population
statistics. The Office of Management
and Budget (OMB) has established
standards for defining metropolitan and
micropolitan statistical areas that
replace metropolitan statistical areas
defined in the 1990 standards (65 FR
82227, December 27, 2000). The EPA
has traditionally used the 1990
metropolitan statistical area definitions
within many of the air monitoring
requirements including the numbers of
monitoring sites within a network and
the Air Quality Index (AQI) reporting
requirements. The proposed
amendments use the new OMB
standards for defining metropolitan and
micropolitan areas, as well as the new
standards for Core-based Statistical
Areas (CBSA) and Combined Statistical
Areas (CSA).
41 Continuous PM2.5 and PM10-2.5 methods that
can meet multiple monitoring objectives are being
promoted by proposing new performance-based
criteria for approval of these methods. See section
IV.B of this preamble.
1. Proposed Requirements for Operation
of Multipollutant Monitoring Stations
Identified as the National Core Network
(NCore).
The EPA is proposing requirements
applicable to States individually that
may, in the aggregate, cause the
deployment of a new network of
monitors in approximately 60 mostly
urban multipollutant stations. Most
States would be required to operate at
least one urban station; however, rural
stations could be substituted in States
that have limited dense urban
exposures. States with Core-Based
Statistical Areas (CBSA) often also have
multiple air sheds with unique
characteristics and, often, elevated air
pollution. These States include, at a
minimum, California, Florida, Illinois,
Michigan, New York, North Carolina,
Ohio, Pennsylvania, and Texas. These
States would be required to identify one
to two additional NCore stations in
order to account for their unique
situations. These stations, combined
with about 20 multipollutant rural
stations, which are not specifically
being required of the States, would form
the new multipollutant NCore network.
The rural NCore stations will be
negotiated using grant authority as part
of an overall design of the network that
is expected to leverage existing rural
networks such as IMPROVE, CASTNET
and, in some cases, State-operated rural
sites.
These multipollutant NCore stations
are intended to track long-term trends
for accountability of emissions control
programs and health assessments that
contribute to ongoing reviews of the
NAAQS; support development of
emissions control strategies through air
quality model evaluation and other
observational methods; support
scientific studies ranging across
technological, health, and atmospheric
process disciplines; and support
ecosystem assessments. Of course, these
stations together with the more
numerous PM2.5 and O3 sites would also
provide data for use in the NAAQS
decision making process and for public
reporting and forecasting of the AQI.
The EPA proposes that these
multipollutant NCore stations be
required to measure O3; high-sensitivity
measurements, where appropriate, of
CO, SO2, and total reactive nitrogen
(NOy); PM2.5 with both a Federal
reference method (FRM) and a
continuous monitor, PM2.5 chemical
speciation, and PM10-2.5 with a
continuous FEM; and meteorological
measurements of temperature, wind
speed, wind direction, and relative
humidity. High-sensitivity
measurements are necessary for CO,
SO2, and NOy to adequately measure a
signal for these pollutants in most air
sheds for data purposes beyond NAAQS
attainment determinations. For the other
listed pollutants, conventional ambient
air monitoring methods could be used.
At least one NCore station would be
required in each State, unless a State
determines through the network design
process that a site which meets their
obligation can be reasonably
represented by a site in a second State,
and the second State has committed to
establishing and operating that site. Any
State, local, or tribal agency could
propose modifications to these
requirements for approval by the
Administrator. While the proposed
amendments do not specify the cities in
which the States must place their
multipollutant NCore Level 2
monitoring stations, EPA anticipates
that the overall result will be a network
that has a diversity of locations to
support the purposes listed earlier. For
example, there would be sites with
different levels and compositions of
PM2.5 and PM10-2.5, allowing air quality
strategies to be evaluated under a range
of conditions.
These sites would be located in a
manner that represents as large an area
of relatively uniform land use and
ambient air concentrations as possible
(i.e., out of the area of influence of
specific local sources, unless exposure
to the local source(s) is typical of
exposures across the urban area).
Neighborhood-scale sites may be
appropriate for multipollutant NCore
monitoring stations in cases where the
site is expected to be similar to many
other neighborhood scale locations
throughout the area. In some instances,
State and local agencies may have a
long-term record of several
measurements at an existing location
that deviates from the siting criteria in
the proposed amendments. The State or
local agency may propose utilizing these
kinds of sites as the multipollutant
NCore monitoring station to take
advantage of that record. The EPA will
approve these sites, considering both
existing and expected new users of the
data. The multipollutant NCore stations
should be collocated, when appropriate,
with other multipollutant air monitoring
stations including PAMS, National Air
Toxic Trends Station (NATTS) sites,
and the PM2.5 chemical Speciation
Trends Network (STN) sites. Collocation
would allow use of the same monitoring
platform and equipment to meet the
objectives of multiple programs where
possible and advantageous.
The proposed amendments would
require operation of the 60 NCore
stations by January 1, 2011. However,
up to 35 of these stations are already
being operated on a voluntary and EPA-funded basis with acquisition of high-sensitivity monitors for CO, SO2, and NOy. These three new measurements and other existing measurements for O3, PM2.5, and meteorology are the
foundation of this highly leveraged
network. PM10-2.5 measurements would
also be added to these stations once the
continuous technologies are approved
as FEM and are commercially available.
Once these multipollutant NCore
stations are established, it is EPA’s
intention that they operate for many
years in their respective locations.
Therefore, State and local agencies are
encouraged to ensure long-term accessibility to the sites proposed for
NCore monitoring stations. Relocating
these stations would require EPA
approval, which would be based on the
data needs of the host State and other
clients of the information.
We may negotiate with some States,
and possibly with some Tribes, for the
establishment and operation of some
additional rural NCore multipollutant
monitoring stations to complement the
multipollutant stations that would be
required by the proposed changes to the
monitoring regulations. We are in the
process of revising CASTNET to
upgrade its monitoring capabilities to
allow it to provide even more useful
data to multiple data users. We expect
that about 20 CASTNET sites will have
new capabilities at least equivalent to
the capabilities envisioned for NCore
multipollutant sites. Those sites would
reduce the number of, and complement,
rural multipollutant sites funded with
limited State/local grant funds.
2. Proposed Monitoring Requirements
for the Proposed Primary National
Ambient Air Quality Standard for
PM10-2.5
The EPA is proposing elsewhere in
today’s Federal Register a new primary
standard for coarse particulate matter
(PM), and a new indicator for that
standard: PM10-2.5, qualified so as to
include any mix of PM10-2.5 dominated
by resuspended dust from high-density
traffic on paved roads and PM generated
by industrial sources and construction
sources, and to exclude any ambient mix
of PM10-2.5 that is dominated by rural
windblown dust and soils and PM
generated by agricultural and mining
sources. See section III.D of the 40 CFR
part 50 proposal.42
42 As explained in section III of the NAAQS proposal (published elsewhere in this Federal Register), the focus on coarse particles associated with these source types is derived from the
Accordingly, EPA is proposing new
provisions in 40 CFR Part 58 to establish
the minimum requirements for States to
deploy and monitor for this proposed
NAAQS. A main goal of the minimum
required network will be the support of
NAAQS designation decisions. Other
data objectives include the improved
characterization of the composition of
coarse particles to support source
apportionment studies and the
development of control strategies;
support of epidemiological and
toxicological research efforts; public
reporting of real-time concentration
levels through the AQI and particle
pollution forecasting programs; the
quantification of coarse particle trends
over time; and identifying and
quantifying the factors that have
contributed to changes over time for
purposes of program accountability.
Specific requirements for monitor placement by States, for example requirements regarding the target distances of monitors from sources of concern, will also ensure a level of
consistency in network design that
allows monitoring results to be
generally comparable among areas
where minimum monitoring
requirements apply.
This section begins with a discussion
of the monitoring methods, types, and
sampling frequencies to be used in the
proposed network. We then turn to the
description of the proposed minimum
requirements for the PM10-2.5 monitoring
network including the proposed number
of monitors to be required in affected
areas and proposed requirements for
where those monitors should be located
within the areas. States would have the
discretion (and would be encouraged) to
place additional monitors to
supplement these minimum required
monitors.
Monitoring for an indicator described in qualified terms poses issues regarding how and when to determine the sites at which the ambient mix of PM10-2.5 would be dominated by resuspended dust from high-density traffic on paved roads and PM generated by industrial sources and construction sources, and where it would not be dominated by rural windblown dust and soils and PM generated by agricultural and mining sources. The proposed new provisions for 40 CFR part 58 described in this section address this issue.

42 (continued) available epidemiological studies that examined exposures to the ambient mix of PM10-2.5 in urban areas, and the study which examined exposure to unenriched natural crustal materials, as well as dosimetric evidence and toxicological studies. Adverse health effects associated with PM10-2.5 concentrations have been noted in studies conducted in urban areas, while limited evidence does not support the association of health effects with PM10-2.5 concentrations resulting from the suspension by wind of uncontaminated natural crustal materials of geologic origin. Furthermore, available evidence does not support either the existence or the lack of causative associations for community exposures to coarse particle emissions from agricultural or mining sources.
a. Monitor type, methods, and
frequency of sampling.
We are proposing a Federal reference
method (FRM) for PM10-2.5 in a new
appendix O to 40 CFR part 50 (Reference
Method for the Determination of Coarse
Particulate Matter in the Atmosphere),
in section VI of the preamble to the Part
50 proposal elsewhere in this Federal
Register. See also section IV.B above.
The proposed FRM for measuring
PM10-2.5 is based on the combination of
two conventional low-volume filter-based methods, one for measuring PM10 and the other for measuring PM2.5, and
determining the PM10-2.5 measurement
by subtracting the PM2.5 measurement
from the concurrent PM10
measurement.43
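The subtraction at the heart of the proposed FRM can be sketched in a few lines. This is an illustrative calculation only (the function name and the zero floor for negative differences are our assumptions); the FRM itself specifies samplers, filters, and operating conditions, not just arithmetic:

```python
def pm10_25(pm10, pm25):
    """Difference-method estimate of coarse PM (ug/m3): subtract the
    PM2.5 measurement from the concurrent, collocated PM10 measurement.

    Negative differences (possible when measurement noise makes PM2.5
    read higher than PM10) are floored at zero here; that floor is an
    assumption of this sketch, not part of the proposal text.
    """
    return max(0.0, pm10 - pm25)
```

For example, concurrent readings of 85 ug/m3 PM10 and 30 ug/m3 PM2.5 would yield a PM10-2.5 estimate of 55 ug/m3.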
The new filter-based FRM would not
be required to be widely deployed in the
operational PM10-2.5 network, but rather
would serve as the basis of comparison
for the equivalency procedures in 40
CFR part 53 described in section IV.B of
this preamble. The EPA intends (but
would not require) that the majority of
the monitors comprising the PM10-2.5
network be based on continuous
methods that will provide an hourly
time resolution. At sites with locally measured wind data and continuous PM10-2.5 monitors, hourly time resolution will help States and EPA understand the emission sources that are most important to control, by relating wind direction and source locations in particular hours with peaks, and/or by matching the hourly pattern of concentrations with known temporal patterns of sources such as traffic. It may also, in some cases, help in understanding whether natural events have influenced a day's 24-hour concentration. Whatever method a State chooses to deploy, all PM10-2.5 monitors counted by a State as part of its compliance with the required minimum number of PM10-2.5 monitoring sites (proposed below) would be required to sample every day. The EPA's data quality objective process has found daily sampling to be a key factor in reducing statistical uncertainty at concentration levels near the proposed daily PM10-2.5 NAAQS. The automation inherent in continuous methods would provide a more cost-effective alternative to manual filter-based sampling for achieving this daily sampling frequency.

43 As noted in section VI.A.5 ‘‘Relationship of Proposed FRM to Section 6012 of the Safe, Accountable, Flexible, Efficient Transportation Equity Act: A Legacy for Users (SAFETEA–LU) (PL 109–59)’’ of the part 50 NAAQS proposal, section 6012 of SAFETEA–LU requires the Administrator to ‘‘develop a Federal reference method to measure directly particles that are larger than 2.5 micrometers in diameter without reliance on subtracting from coarse particle measurements those particles that are equal to or smaller than 2.5 micrometers in diameter.’’

As explained above in section IV.B of this preamble and in the NAAQS proposal, EPA, consistent with Clean Air Scientific Advisory Committee (CASAC) peer review and recommendation, is proposing a difference method as the Federal reference method (FRM). We are doing so because other methods are not yet sufficiently developed to serve as an FRM. We have further explained, however, that we believe that other methods, notably certain types of continuous monitoring and dichotomous methods, are potential Federal equivalent methods, and indeed, that we expect actual monitoring networks to utilize these other means of monitoring. We are also continuing to investigate the possibility of promulgating the dichotomous method as an FRM, and if technically justified, will do so.

We view these actions as consistent with the new statutory provisions. We are taking the steps necessary to develop a compliance network using non-difference, continuous methods as the principal means of monitoring for PM10-2.5. We are further devoting substantial effort to the possibility of promulgating dichotomous methods as an alternative FRM. The EPA will also submit the required reports by August 10, 2007, the deadline specified by SAFETEA–LU.
The EPA is proposing January 1, 2009,
as the deadline for deployment of
PM10-2.5 monitors. This will provide
over 2 years from promulgation of the
final rule for one or more continuous
PM10-2.5 monitors to be approved by
EPA as meeting the proposed Class III
FEM requirements in 40 CFR part 53
and for the States to procure and deploy
those instruments. We believe this will
be sufficient time for the steps that are
required by monitor vendors, EPA, and
the States. At least two monitor vendors
have already developed prototype
continuous instruments expected to be
candidates for approval as equivalent
methods. These prototypes have already
been the subject of field trials in
cooperation with EPA. We expect
vendors to make improvements based
on this field experience so that final
designs can be field tested in the winter
of 2006/2007, after promulgation of the
final rule, and in the summer of 2007.
Under 40 CFR section 53.5, the
Administrator has up to 120 days to act
on equivalency applications. Thus, it is
feasible for applications to be submitted
and EPA to approve one or more
applications in late 2007 or early 2008
and for States (or EPA on behalf of
States) to place orders in time for
monitors to be manufactured, shipped,
and installed by January 1, 2009.
A small percentage of continuous
PM10-2.5 samplers (minimum of 15
percent) would be required to have a
collocated filter-based FRM sampler or
collocated continuous FEM monitor at
the same site for QA purposes (see
proposed 40 CFR part 58, appendix A,
Quality Assurance Requirements for
SLAMS, NCore, and Prevention of
Significant Deterioration (PSD) Air
Monitoring). While we have determined
that all of the PM10-2.5 monitors should
be of the continuous type, except for
these collocated FRM samplers, we are
not requiring the sole use of continuous
methods, in order to maintain flexibility
in the use of manual sampling
technology that can meet the proposed
PM10-2.5 FRM or FEM requirements, and
potentially address additional goals
such as speciation.
We have considered the issue of
whether a State should be allowed to
operate an appropriately sited PM10
monitor in lieu of a required PM10-2.5
monitor in a situation in which the
probability of a PM10-2.5 NAAQS
violation is small. Some State
monitoring officials have expressed
interest in such an option to save
resources or to spread the need for
monitor investments over time.44 We
expect that in the types of areas where
PM10-2.5 is dominated by emissions
generated from high density traffic on
paved roads, industrial sources, and
construction activity, a substantial
fraction of PM10 is likely to be PM2.5.
While a PM10 monitor will capture this
PM2.5 and thus would provide a
conservative estimate (i.e., an
overestimate) of PM10-2.5 concentrations,
there are complicating considerations.
Without data from FRM or FEM
PM10-2.5 monitors, an area would be
initially designated unclassifiable for
PM10-2.5.45 Some designated PM10 FRM
instruments have relatively poor
precision compared to the proposed
requirements for the PM10-2.5 FRM and
FEMs. It is possible that an area might
appear to meet the PM10-2.5 NAAQS
based on PM10 monitor readings but
actually not be in compliance. It is also
possible that a PM10 monitor might
unexpectedly indicate a high enough
concentration of PM10 as to suggest a
possible violation of the PM10-2.5
NAAQS. In such a situation, the result
could be a delay in efforts to meet the
PM10-2.5 NAAQS relative to what would
have been the case had an approved
FRM or FEM PM10-2.5 monitor been
deployed initially.
44 The Clean Air Scientific Advisory Committee
(CASAC) also supported this concept, although
without explicit discussion of the complicating
implementation considerations discussed here.
45 An area without a PM10-2.5 monitor could in concept be included in an adjacent nonattainment area because of its contribution to concentrations in the latter area. Given the typically short transport distance of PM10-2.5, this would be unusual.
On balance, EPA believes it is
appropriate to allow use of any PM10
FRM or FEM monitor in lieu of a
required PM10-2.5 monitor, with
restrictions, including the requirement
for daily sampling at such PM10
monitors. This could only be initiated at
monitoring sites where the 98th
percentile value for the most recent
complete calendar year of PM10 monitoring data,46 reported at local conditions of temperature and pressure as specified for PM10-2.5, is less than the proposed PM10-2.5 NAAQS.47 During any
calendar year of PM10 sampling in lieu
of a required PM10-2.5 sampler, if more
than seven 24-hour average PM10
concentrations exceed the numerical
value of the proposed PM10-2.5 NAAQS,
the State would have to deploy a FRM
or FEM PM10-2.5 monitor within a one
year period. We invite comment on this
subject, including other possible
provisions for more limited use of PM10
monitors in lieu of PM10-2.5 monitors,
such as limiting the use of PM10
monitors to a period of 3 years after the
first approval of a continuous FEM
PM10-2.5 method.
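The two numerical conditions above, the eligibility screen (98th percentile of a year of daily PM10 data below the proposed PM10-2.5 NAAQS) and the deployment trigger (more than seven daily exceedances in a calendar year), can be sketched as follows. The 70 ug/m3 level, the simple rank-based percentile, and all names here are our assumptions for illustration; the official data-handling conventions are those specified in 40 CFR part 50:

```python
import math

PROPOSED_PM10_25_NAAQS = 70.0  # proposed 24-hour level, ug/m3 (assumed)

def qualifies_for_pm10_substitution(daily_pm10):
    """Screen: 98th percentile of the most recent complete calendar
    year of daily PM10 data must fall below the proposed PM10-2.5
    NAAQS. Simple rank-based percentile, a stand-in for the official
    computation procedure."""
    vals = sorted(daily_pm10)
    p98 = vals[math.ceil(0.98 * len(vals)) - 1]
    return p98 < PROPOSED_PM10_25_NAAQS

def must_deploy_pm10_25_monitor(daily_pm10):
    """Trigger: more than seven 24-hour PM10 averages in a calendar
    year above the proposed PM10-2.5 NAAQS level would require the
    State to deploy an FRM or FEM PM10-2.5 monitor within one year."""
    return sum(1 for v in daily_pm10 if v > PROPOSED_PM10_25_NAAQS) > 7
```

A year with eight daily values above 70 ug/m3 fails the screen and trips the trigger; a year of uniformly low values passes the screen.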
b. Network design.
i. Number of required monitors. The
discussion of network design
requirements for PM10-2.5 begins with
the questions of how to define the
geographic units which should be
separately subject to minimum
monitoring requirements and how many
monitors should be required in each
such area. We propose that the
geographic unit for individual
application of monitoring requirements
be the Metropolitan Statistical Area
(MSA) (i.e., a Core Based Statistical Area (CBSA) which contains an
urbanized area with a population of at
least 50,000 persons).48 We also propose that only those MSAs that contain all or part of an urbanized area with a population of at least 100,000 persons be required to have monitors.
46 PM10 data used to qualify a site for PM10 monitoring in place of PM10-2.5 monitoring must be based on a 1-in-3 day sampling frequency, or more frequent sampling.
47 The EPA’s intention regarding the substitution
of PM10 monitors for required PM10-2.5 monitors is
that siting criteria would not be affected, i.e., the
PM10 monitor that will substitute for a PM10-2.5
monitor would have to be located at a site that
would be appropriate for a required PM10-2.5
monitor. (What sites are appropriate for required
PM10-2.5 monitors is addressed below.) Also, PM10
data used to qualify a site for PM10 monitoring in
place of PM10-2.5 monitoring must also be from—or
clearly representative of—the site where a PM10
monitor will substitute for a PM10-2.5 monitor.
48 Defined metropolitan and micropolitan statistical areas based on application of 2000 standards (which appeared in the Federal Register on December 27, 2000) to 2000 decennial census data. https://www.census.gov/population/www/estimates/00-32997.txt.
Some MSAs contain multiple
urbanized areas with populations of
100,000 people or more, each containing
emission sources of interest for PM10-2.5,
which could be separately subject to
monitoring requirements; however, we
believe applying minimums at the
urbanized area level is not necessary to
support implementation of the proposed
NAAQS.49 Where more than one MSA
is part of a Combined Statistical Area
(CSA), each MSA would be treated
separately. We believe separate
treatment of MSAs is appropriate in
light of the typically short transport
distance of PM10-2.5 and the diversity of
situations that can exist in a CSA. For
comparison, PM2.5 and O3 monitoring,
minimum requirements apply at the
CSA level, because a broader geographic
frame is appropriate for those
photochemically formed pollutants.
Consistent with both the current State
and Local Air Monitoring Station
(SLAMS) minimum requirements for
PM2.5 described in 40 CFR part 58,
appendix D and the proposed minimum
requirements for PM2.5 described in
section IV.E.3 of this preamble, EPA
proposes that States be required to have more PM10-2.5 monitors in higher-population MSAs than in lower-population MSAs. A higher-population
MSA typically has more total roadway
surface, higher traffic counts, more and
larger industrial sources, and more
ongoing construction at any given time,
all of which make it more likely that the
MSA contains more locations with high
concentrations of coarse particles
attributable to these sources. Also, a
higher-population MSA potentially
contains more distinct types of
emissions situations causing PM10-2.5
nonattainment, i.e., more distinct mixes
of emission sources affecting different
locations, such that separate monitoring
may be needed to identify these and to develop and track the success of control strategies for them. More monitors will also be useful in helping to define nonattainment boundaries in larger and potentially more complex MSAs.

49 Factors which contribute to this assessment include the consideration that multiple urbanized areas in a single Metropolitan Statistical Area (MSA) will tend to have similar situations affecting PM10-2.5 concentrations, for example similar meteorological conditions which can favor or suppress emissions of PM10-2.5 from paved roadways and construction sites. Also, applying monitoring requirements separately to urbanized areas would both increase the total number of required monitors and reduce State flexibility in siting the required monitors since any requirements would have to be met separately in each urbanized area.
Accordingly, we are proposing
minimum requirements for the number
of PM10-2.5 monitoring stations in each
MSA based, in part, on the total
population of the MSA.50
We are proposing that the actual or estimated PM10-2.5 design value (three-year average of 98th percentile 24-hour concentrations) of an MSA, where one can be calculated, be used as a second factor to increase the minimum number of monitors in MSAs with higher estimated ambient coarse particle levels and to reduce requirements in MSAs with lower estimated levels. Given the
imprecision of current estimates of
PM10-2.5 ambient concentrations and the
resulting non-robust design value
statistics that will initially be available
to States when they develop their
monitoring plans, we are proposing
three categories of design values defined
by percentages of the proposed 24-hour
PM10-2.5 NAAQS. The proposed
amendments categorize MSA design
values as either low (less than 50
percent of the proposed PM10-2.5
NAAQS), medium (50 percent to 80
percent), or high (greater than 80
percent).
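The three-way categorization can be sketched as a simple threshold function. The 70 ug/m3 default level and the boundary handling (exactly 50 percent and exactly 80 percent falling in "medium") are our assumptions, since the preamble states only "less than 50 percent," "50 percent to 80 percent," and "greater than 80 percent":

```python
def design_value_category(dv, naaqs=70.0):
    """Categorize an MSA's actual or estimated PM10-2.5 design value
    (ug/m3) against the proposed 24-hour NAAQS: 'low' below 50 percent
    of the NAAQS, 'medium' from 50 to 80 percent, 'high' above 80
    percent. Default NAAQS level of 70 ug/m3 is assumed from the
    proposal."""
    pct = 100.0 * dv / naaqs
    if pct < 50.0:
        return "low"
    if pct <= 80.0:
        return "medium"
    return "high"
```

Against a 70 ug/m3 standard, a 30 ug/m3 design value is "low" (about 43 percent), 45 ug/m3 is "medium" (about 64 percent), and 65 ug/m3 is "high" (about 93 percent).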
The EPA will assist States with the
development of PM10-2.5 design values
by analyzing the concentrations from
existing collocated or nearly collocated
PM10 and PM2.5 monitors in each MSA
and identifying which pairs meet the
proposed siting criteria appropriate for
comparison to the proposed PM10-2.5
NAAQS. Monitoring agencies may
propose other procedures for calculating
estimated PM10-2.5 design values as a
substitute for EPA-calculated values,
subject to Regional Office approval of
the monitoring methods, site
characteristics, and data handling
procedures being used to calculate
substitute estimated design values. PM10-2.5 design values for purposes of determining the number of required monitors would be calculated using data only from sites which are suitable for comparison to the NAAQS under the criteria presented later in this section. If no such sites exist, the minimum requirements for the medium design value category would apply. After actual data using FRM or FEM monitors are available to establish a true design value based on 3 years of data, a State would be allowed to reduce or be required to increase the number of monitors based on that design value. This process of adjustment would be ongoing, and would be a specific aspect of the periodic network assessment that would be required by the proposed amendments.

50 April 1, 2000 population in Metropolitan and Micropolitan Statistical Areas in Alphabetical Order and Numerical and Percent Change for the United States and Puerto Rico: 1990 and 2000. Source: U.S. Census Bureau, Census 2000 and 1990 Census. Internet release date: December 30, 2003. https://www.census.gov/population/cen2000/phc-t29/tab01a.xls.
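The design value statistic described above (a 3-year average of annual 98th percentiles of 24-hour concentrations) can be sketched as follows. The rank-based percentile used here is a simplified stand-in for the official data-handling procedure, and the function name is ours:

```python
import math

def design_value(three_years_of_daily_data):
    """Proposed 24-hour PM10-2.5 design value: the 3-year average of
    each year's 98th percentile of daily (24-hour) concentrations in
    ug/m3. The rank-based percentile below is a simplified stand-in
    for the official computation procedure."""
    assert len(three_years_of_daily_data) == 3
    annual_p98 = []
    for year in three_years_of_daily_data:
        vals = sorted(year)
        annual_p98.append(vals[math.ceil(0.98 * len(vals)) - 1])
    return sum(annual_p98) / 3.0
```

The resulting value, compared to the proposed NAAQS level, is what would place an MSA into the low, medium, or high category discussed above.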
Table 1 of this preamble presents the
specifics of the proposed requirements
for the minimum number of monitors in
an MSA, relating the minimum number
of PM10-2.5 monitors to total MSA
population and design value. For
example, an MSA with a total
population of between 1 million and 5
million people that contains all or part
of an urbanized area with a population
of at least 100,000 people, with an
actual or estimated PM10-2.5 design value
of between 50 percent and 80 percent of
the proposed PM10-2.5 NAAQS would be
required to have at least two monitors.
In another example, an MSA with a total
population between 100,000 and
500,000 people with an actual or
estimated PM10-2.5 design value of less
than 50 percent of the proposed PM10-2.5
NAAQS would not be required to have
any monitors, although States could
deploy discretionary monitors.
We invite comment on whether there
should be a different minimum size for
an MSA required to have monitors,
rather than applying the criteria in Table 1 of this preamble to all MSAs that
contain all or part of an urbanized area
with a population of at least 100,000
persons. We also invite comment on
whether factors in addition to MSA
population and estimated design value
should enter into the determination of
the number of required monitors, for
example, MSA or urbanized area(s)
population density, and if so, in what
way.
TABLE 1.—PM10-2.5 MINIMUM MONITORING REQUIREMENTS

                                     Most recent 3-year design value 2
MSA total population 1 5       >80% of PM10-2.5   50%–80% of PM10-2.5   <50% of PM10-2.5
                                  NAAQS 3            NAAQS 3 4            NAAQS 3
>5,000,000 ..................        5                   3                   2
1,000,000–<5,000,000 ........        4                   2                   1
500,000–<1,000,000 ..........        3                   1                   0
100,000–<500,000 ............        2                   1                   0

1 Metropolitan Statistical Area (MSA) as defined by the Office of Management and Budget. The requirements of this table apply only to MSAs that contain all or part of an urbanized area with a population of at least 100,000 persons. Metropolitan and micropolitan statistical areas based on application of 2000 standards (which appeared in the Federal Register on December 27, 2000) to 2000 decennial census data.
2 A database of estimated PM10-2.5 design values will be provided by EPA until the network is fully deployed for 3 years.
3 The proposed PM10-2.5 National Ambient Air Quality Standards (NAAQS) levels and forms are defined in 40 CFR part 50.
4 These minimum monitoring requirements would apply in the absence of a design value.
5 Population based on latest available census figures.
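The Table 1 lookup can be sketched as a two-key function of MSA population and design value category. The treatment of a population of exactly 5,000,000 (the row labels leave it between ranges) and all names here are our assumptions:

```python
def minimum_monitors(msa_population, dv_category):
    """Table 1 lookup: minimum required PM10-2.5 monitors from MSA
    total population and design-value category ('low', 'medium', or
    'high'). Applies only to MSAs containing all or part of an
    urbanized area of at least 100,000 persons; this sketch assumes
    the caller has checked that condition for MSAs above 100,000."""
    if msa_population < 100_000:
        return 0
    if msa_population > 5_000_000:
        row = {"high": 5, "medium": 3, "low": 2}
    elif msa_population >= 1_000_000:
        row = {"high": 4, "medium": 2, "low": 1}
    elif msa_population >= 500_000:
        row = {"high": 3, "medium": 1, "low": 0}
    else:
        row = {"high": 2, "medium": 1, "low": 0}
    return row[dv_category]
```

This reproduces the preamble's worked examples: an MSA of 1 to 5 million people in the medium category requires two monitors, and an MSA of 100,000 to 500,000 people in the low category requires none.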
The EPA estimates that the size of the
minimum required PM10-2.5 network
will be approximately 250 monitors
based on the proposed requirements and
our current estimates of PM10-2.5 design
values. Figure 1 of this preamble
illustrates our current estimates of how
many monitors would be required in
each MSA based on the criteria in Table
1, census data on MSA populations, and
current estimates of design value.51 We
are not proposing a specific number of
monitors for any MSA. The actual initial
number of monitors required in a given
MSA and the initial size of the
minimum required national network
may be different if monitoring agencies
propose and we approve alternate
approaches to estimating design values
for this purpose. Later review by States may determine that one or more of the PM10 monitors we have used to estimate PM10-2.5 design values are not appropriate. Also, consideration
of exceptional events may be
appropriate and may affect estimated
design values. The size of the required
network may vary after its startup
depending on long-term changes in total
MSA population and design values.
51 A document listing the current estimate of PM10-2.5 design values used in constructing figure 1 of this preamble is available in the docket.

[Figure 1 appears here.]
Figure 1 of this preamble shows that
the proposed minimum network criteria
could (depending on estimated design
values as of the time the States develop
their monitor siting plans) have the
effect of putting relatively more
monitors in the eastern States than in
western States. This occurs in part
because of currently estimated design
values but also in part because there are
so many individual MSAs in eastern
States compared to western States. In
western States, there are fewer small
and medium-sized cities which are in
separate MSAs and thus qualify for
separate monitoring under the proposed
criteria, because the larger size of
counties in the western States means
that many smaller cities are subsumed
within relatively few MSAs.
We request comment on whether the
proposed minimum requirements
appropriately address the need for
monitoring data in both eastern and
western States, whether additional or
fewer monitors could be needed, and
whether additional monitors in some
areas, if needed, should be required by
the regulations or deployed through
collaborative planning and grant
support. A possibility on which we
request comment is to not adhere to the
formal county-based definition of MSA
in the West and in some way to require
separate monitoring of more urbanized
areas that are not distinct MSAs and,
therefore, would not be separately
subject to the minimum monitoring
requirements as proposed. For example,
some MSAs in some western States are
divided into distinct nonattainment
areas for ozone, reflecting natural
barriers to transport between air basins.
This division or similar divisions of a
large MSA in a western state could
perhaps play a role in determining
which population centers should
require separate monitoring for PM10-2.5.
We also request comment on
approaches that would aggregate
officially distinct MSAs in eastern
States for the purpose of determining
the required number of monitors.
ii. Location of required monitors and
comparability to the NAAQS. We now
turn to the criteria that should be used
to locate required monitoring sites
within an MSA (the number of monitors
to be sited being determined by the total
MSA population and estimated design
value criteria as just described). As
stated in the introduction to this
section, a main goal of the minimum
required monitors in a given MSA will
be to support NAAQS designation
decisions, including decisions on
nonattainment area boundaries. As
detailed in the NAAQS proposal also in
today’s Federal Register, the purpose of
the proposed qualified coarse particle
indicator and standard is to protect
against coarse particle mixes that are
likely to be similar to those present in
the urban epidemiological studies upon
which the proposed standard is based.
The indicator for the NAAQS includes
any ambient mix of PM10-2.5 that is
dominated by resuspended dust from
high-density traffic on paved roads and
PM generated by industrial sources and
construction sources, and excludes any
ambient mix of PM10-2.5 that is
dominated by rural windblown dust and
soils and PM generated by agricultural
and mining sources. In order to
implement the proposed standard, it is
necessary to separate where the mix is
dominated by the emissions of PM from
listed sources and where it is not. We
have been mindful of this goal in
developing the following proposals
regarding monitor siting. In particular
we have been mindful that the strategy
for locating PM10-2.5 monitors must be
developed in light of the qualified
indicator for the NAAQS. Monitors
should therefore be placed in locations
where concentrations of PM10-2.5 are
dominated by PM emissions generated
from high density traffic on paved
roads, industrial sources, and
construction activities.
We have also been mindful that the
strategy for locating PM10-2.5 monitors
must be developed in light of the
approach used to set the level of the
proposed PM10-2.5 NAAQS. As
explained in the NAAQS proposal
notice elsewhere in today’s Federal
Register, the proposed level of 70 µg/m3
for PM10-2.5 (98th percentile form) was
selected to be of equivalent stringency
to the current 24-hour PM10 NAAQS of
150 µg/m3 (one-expected exceedance
form). As discussed below, the approach
used to determine that these levels are
equivalent in stringency has
implications for PM10-2.5 monitor
placement.
The EPA recognizes that each MSA
will be characterized by a unique mix of
moderate to highly populated areas
together with unique arrangements of
paved roads, areas of construction, and
industrial sources of coarse particles.
Therefore, we are proposing network
design requirements that leave room for
later agreement between EPA and each
State on specific sites but that provide
the binding principles for those
agreements.
We envision that a typical PM10-2.5
monitoring network in a large MSA
would include some sites with heavy
impacts from PM emissions generated
from highly traveled roadways and/or
major industrial sources, but with a
relatively small exposed population
because the area around the site is not
a dense residential or commercial area,
and some sites in densely populated
areas with somewhat less proximity to
such sources. It could also include some
sites in lower-density suburban-type
population areas that are nonetheless
affected by sources with emissions of
concern. Within each of these three
categories of sites, there are some sites
that are not suitable for required
monitors because the sites have a good
possibility of not being dominated by
PM emissions generated from high
density traffic on paved roads,
industrial sources, and construction
activities, or because placement of
monitors for comparison to the NAAQS
in those locations would be inconsistent
with the intended stringency of the
NAAQS. The following proposal
addresses both how the required
number of monitors should be assigned
to the three categories of sites, and what
types of sites are suitable or unsuitable
for placement of monitors.
We are proposing a five-part test of
whether a potential monitoring site is
suitable for comparison to the NAAQS,
and two rules for how required monitors
should be assigned among such suitable
sites. All five parts of the suitability test
must be met. The suitability test also
would be used to determine whether
non-required or special purpose
monitors are suitable for comparison
with the proposed PM10-2.5 NAAQS.
The first two parts of the five-part
suitability test are based on using
readily available Census data to help
ensure that PM10-2.5 monitoring sites are
located near and will be dominated by
PM emissions from paved roads,
construction, and industrial sources.
The first part is that a monitoring site
must be within a U.S. Census Bureau-defined urbanized area that has a
population of at least 100,000 persons.
Restricting suitable sites to only those
within an urbanized area of this size
increases the likelihood that the
ambient mix of PM10-2.5 will be
dominated by resuspended dust from
high-density traffic on paved roads and
PM generated by industrial sources and
construction sources, rather than rural
windblown dust and soils and PM
generated by agricultural and mining
sources which are more typical of rural
areas.
The second part of the suitability test
is a minimum threshold for the
population density of the block group
containing the monitoring site. This
provides more assurance that
resuspended dust from high-density
traffic on paved roads and PM generated
by industrial sources and construction
sources will dominate in the vicinity of
the monitoring site.
We propose to employ population
density in addition to simple presence
within an urbanized area because
population density is highly correlated
to traffic density and is available on a
relevant geographic scale. It is
appropriate to expect that mixes of
PM10-2.5 monitored at sites located in
areas of sufficiently high population
density are dominated by resuspended
dust from high-density traffic on paved
roads and PM generated by industrial
sources and construction sources.
Accordingly, we have based the
proposed suitability test for a candidate
monitoring site on the population
Federal Register / Vol. 71, No. 10 / Tuesday, January 17, 2006 / Proposed Rules
density of the census block group in
which the site is located. There is a
strong correlation of county-level
estimates of Vehicle Miles Traveled
(VMT) density with county-based
population density.52 It is reasonable to
presume that this county-level
correlation indicates an association
between population density and
vehicular traffic and resulting emissions
of resuspended dust at smaller
geographic scales also, although
exceptions to the association no doubt
become more common. To a lesser
extent, there may also be associations
between population density and the
presence of other industrial sources and
construction activities.53 It is thus
appropriate to expect that mixes of
PM10-2.5 monitored at sites located in
areas of sufficiently high population
density are dominated by resuspended
dust from high-density traffic on paved
roads and PM generated by industrial
sources and construction sources, and
are not dominated by rural windblown
dust and soils and PM generated by
agricultural and mining sources.
The available census geographic
entities for which population density is
published by the U.S. Census are
counties, urbanized areas, urban
clusters, census tracts, and block
groups. Block groups typically
encompass one-half to two square miles,
and thus they provide a spatial
resolution of about one mile. On
average, there are approximately 200
block groups for each of the 370 MSA
in the U.S. In a State such as Michigan,
for example, the average land area in a
county is 700 square miles as compared
to just over 20 square miles for a census
tract and to about 0.5 square miles for
a block group. A large-scale unit of
density analysis, say the urbanized area
level, would not be as helpful for
guiding monitor placement since it
would be a mix of low and high density
sub-units that could have quite different
source mixes.
We considered a range of block group
population density thresholds for use in
identifying block groups within an
urbanized area that may be suitable for
52 Review of the National Ambient Air Quality
Standards for Particulate Matter: Policy Assessment
of Scientific and Technical Information, OAQPS
Staff Paper, EPA–452/R–05–005, June 2005, p. 5–
59. Counties are the geographic unit at which
vehicle miles traveled (VMT) is most readily
available from State departments of transportation.
The Federal Highway Administration maintains
VMT statistics at a higher level of aggregation.
53 Manufacturing and service industry facilities,
and areas of long-term construction such as
commercial development and roadway
construction, tend—with exceptions—to be in the
general area of populated areas that create the
demand for such activities and provide their
workers.
comparison to the NAAQS, depending
on other parts of the suitability test. A
low population density threshold would
tend to identify as suitable low density
‘‘edge’’ block groups, which, because of
their proximity to surrounding non-urbanized lands, could tend to have
PM10-2.5 concentrations driven by
emission sources that are not of
concern, namely explicitly rural
sources (windblown rural dust and soil)
or sources more typical of rural
lands (agriculture and mining). A low
population density threshold would
also tend to identify internal or
‘‘enclave’’ low density block groups
which may well have significant paved
road, industrial, and construction
emission sources but happen not to
have many residences; later we return to
such ‘‘enclave’’ block groups as an
exceptional case. A population density
threshold that is too high could leave
out areas where PM10-2.5 concentrations
are dominated by PM emissions from
high density traffic on paved roads,
industrial sources, and construction
activities.
We first noted that the U.S. Census
Bureau uses a population density of 500
persons per square mile in one step of
defining the ‘‘Initial Core’’ of an
urbanized area. The initial core of an
urbanized area always includes core
census block groups or blocks with a
density of at least 1,000 persons per
square mile and contiguous block
groups that have a density of at least 500
persons per square mile.54
We have investigated for comparison
the population densities of block groups
in which States and EPA have agreed in
the past to place PM10 monitors. We
observe that States have typically
located PM10 monitors in block groups
of population densities that are higher
than 500 people per square mile. The
median block group population density
of the approximately 1,200 PM10
monitoring sites active in the U.S.
between 2002 and 2004 is 1,390 people
per square mile. Sixty-three percent of
the approximately 1,200 PM10
monitoring sites are in block groups
with a density higher than 500 persons
per square mile.
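The summary statistics cited in this paragraph (a median block group density and the share of sites above the 500 persons per square mile threshold) can be computed with a short function. The following is an illustrative sketch only; the sample densities in the usage line are invented for the example and are not the actual monitoring-site data analyzed in this rulemaking.

```python
from statistics import median

def density_summary(densities, threshold=500.0):
    """Summarize block group population densities (persons per square
    mile) for a set of monitoring sites: returns the median density and
    the fraction of sites whose block group exceeds the threshold."""
    frac_above = sum(d > threshold for d in densities) / len(densities)
    return median(densities), frac_above

# Hypothetical densities for five sites (illustration only; the
# rulemaking reports a median of 1,390 across ~1,200 PM10 sites).
med, frac = density_summary([120, 640, 1390, 2100, 4800])
```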
We have also investigated for
comparison the block group population
54 See Urban Area Criteria for Census 2000, March
15, 2002, 67 FR 11663. The Census Bureau adds to
each urbanized area additional non-contiguous
block groups below and above 500 persons per
square mile using detailed ‘‘hop’’ and ‘‘jump’’
criteria. Any additional block groups below 500
persons per square mile would not be included in
our proposed suitability test because such areas are
less likely to have a dense concentration of paved
roads, construction, and industrial sources and may
be in close proximity to sources of emissions that
are not of concern.
densities for those PM10 monitors which
are sited with or near a PM2.5 monitor.
The PM2.5 monitoring program was set
up to be more urban oriented than the
PM10 monitoring program. Thus, this
smaller set is of more relevance to the
structure of a PM10-2.5 monitoring
program. Among the 710 such monitors,
the median block group density is 2,306
persons per square mile. Seventy-eight
percent of the 710 monitoring sites are
in block groups with a density higher
than 500 persons per square mile.
After empirically examining, in a
sampling of MSA, the block groups
identified by population density
thresholds of 500 persons per square
mile as well as by values below and
above 500, and in light of the practices
of the U.S. Census Bureau, we selected
500 as the proposed threshold value for
the second part of the suitability test
because it appears to result in inclusion
of most of the related urbanized area
while omitting fringe areas where paved
roads, construction sites, and industrial
sources are few in number and/or low
in emissions mass, and whose emissions
and ambient impact could be exceeded
by the impact of rural soil, dust, and
emissions from agricultural and mining
sources.
Regarding the above-mentioned issue
of enclaves within an urbanized area,
we are concerned not to exclude low
population density block groups that
contain paved roads, construction sites,
and/or industrial sources and do not
contain significant agricultural or
mining sources. The Census
incorporates enclaves consisting of
block groups with population density
below 500 persons per square mile if
certain conditions are satisfied.
Enclaves of less than five square miles
are always incorporated. Even larger
enclaves can be included as well. We
are concerned that such large enclaves
may not be industrial zones or
transportation corridors that happen to
have little resident population (which
could be appropriate for monitoring) but
instead could contain agricultural or
mining operations (which could make
them inappropriate for monitoring).
Therefore, we propose that block
group(s) with population densities less
than 500 persons per square mile, even
if part of an urbanized area, will be
considered to pass the second part of
the suitability test if those block groups
comprise an enclave of less than five
square miles in land area. We invite
comment on this special exception.
We propose that the third necessary
condition for siting a required monitor
and comparing any PM10-2.5 monitor to
the PM10-2.5 NAAQS be that the monitor
be population-oriented. The term
‘‘population-oriented sites’’ is presently
defined in 40 CFR 58.1 as sites in
residential areas, recreational areas,
industrial areas, and other areas where
a substantial number of people may
spend a significant fraction of their
day.55 The concept plays an important
role in the PM2.5 monitoring network in
that a PM2.5 monitor must be
population-oriented to be appropriate
for comparison to either the annual or
24-hour PM2.5 NAAQS. We believe that
this restriction is also appropriate for
PM10-2.5 for the same reasons as for
PM2.5.
The fourth part of the five-part
suitability test is a restriction against
monitoring sites that are adjacent to a
large emissions source or otherwise
within the micro scale environment
affected by a large source.56 This
restriction is intended to help ensure
that monitor siting is consistent with the
intended stringency of the proposed
NAAQS. The relatively large size of
coarse particles and resulting high rate
of deposition under most weather
conditions, and the fact that nearly all
coarse particles are primary57, mean
that the ambient concentration of
PM10-2.5 measured in a specific location
will be more dependent on the distance
of that monitor from coarse particle
sources than would typically be the case
for ambient PM2.5 and associated
sources of fine particles.58 Monitors
55 Population density of a block group and
population-orientation of a monitoring site are
distinct concepts. A monitoring site may not be
population-oriented even though it is within a
block group of high population density. Population-orientation refers to the presence of people in a
geographic area around a monitoring site that may
be much smaller than the block group. If there is
not a substantial number of people spending a
significant fraction of their day in the area around
the monitor with ambient concentrations of about
the magnitude indicated by a monitor, the monitor
is not population oriented, regardless of the
population density of the surrounding census block
group. For example, there could be a
portion of a high-density block group
that is near a source but which has few
residents or visitors because of its land
use type.
56 A microscale environment is one in which
there are significant differences in concentrations
between locations that are 10 meters to 100 meters
apart, and generally are areas that are impacted by
immediately adjacent sources such as industrial
sites, roadways, or construction sites.
57 i.e., coarse particles typically are deposited in
the form most recently emitted by their original
source (or in the form they had when resuspended
after having deposited to a roadway or construction
site) rather than being created or modified by
atmospheric chemical reactions during their
generally short transport from the point of original
emission (or resuspension). Particles that have been
resuspended may have incorporated secondarily
formed compounds at some time in their prior
history.
58 Air Quality Criteria for Particulate Matter,
Volume I of II, EPA/600/P–99/002aF, October 2004,
p. 2–49. See also section III.G in the NAAQS
proposal elsewhere in today’s Federal Register.
placed adjacent to coarse particle
sources would typically measure higher
ambient concentrations than monitors
placed farther away. A PM10-2.5
monitoring site located adjacent to a
high emitting industrial source or a
heavily traveled highway, for example,
might measure high ambient
concentrations, but these concentrations
could be characteristic only of the
relatively small area around the
monitor, notably a smaller area than in
the case of a similarly sited PM2.5
monitor. Even if there are people living
or working at the monitor site, thus
qualifying it as population-oriented,
applying the proposed NAAQS level to
the concentration level measured at
such a monitor would be inconsistent
with the level of community protection
intended through the proposed NAAQS.
As explained in section III.G of the
NAAQS preamble, the EPA intends that
the proposed 24-hour PM10-2.5 NAAQS
be equivalent in stringency to the
current 24-hour PM10 NAAQS. In
determining the level for the PM10-2.5
NAAQS that would achieve this
equivalency, we relied on the
relationship between PM10-2.5 and PM10
observed at PM10 monitoring sites all or
most of which were not adjacent to large
emission sources. If PM10-2.5 monitors
were placed at sites that are adjacent to
emission sources, the effect would be to
make the proposed NAAQS less
community-oriented and more stringent
than intended. The EPA therefore
believes it is appropriate to have a
restriction that PM10-2.5 monitors in
source-influenced micro-environments,
such as on facility fence lines or along
the edge of traffic lanes, are not
appropriate for comparison to the
NAAQS even if there is some
population subject to exposure in that
location (even if EPA or the State
believes that there are other micro-environments similarly affected by other
sources of the same type). PM10-2.5
monitors placed in such micro
environment-types of situations thus
would not be eligible for comparison to
the NAAQS 59 and would not count
toward meeting minimum EPA
monitoring requirements.
The fifth part of the suitability test,
which would only need to be
considered for sites that satisfy all of the
59 We note that this proposed language is more
restrictive for the proposed 24-hour PM10-2.5
NAAQS than parallel language for the 24-hour
PM2.5 NAAQS (which allows such data to be used
for comparison with the 24-hour PM2.5 NAAQS, see
present 40 CFR part 58, appendix D, section
2.8.1.2.3). As explained in the text above, this is
because coarse PM is transported over shorter
distances such that a microscale PM10-2.5 monitor
would not be representative of community-wide
conditions.
first four parts, is that a site-specific
assessment shows that the ambient mix
of PM10-2.5 sampled at that site would be
dominated by resuspended dust from
high-density traffic on paved roads and
PM generated by industrial sources and
construction sources, and would not be
dominated by rural windblown dust and
soils and PM generated by agricultural
and mining sources. The first four parts
of the suitability test make it unlikely
that a candidate site would be
dominated by rural windblown dust
(other than perhaps during exceptional
events), but the site-specific assessment
may determine otherwise. The site-specific assessment may also reveal the
presence of a dominant agricultural or
mining operation, for example, a gravel
or sand extraction and material
handling operation.
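Taken together, the five parts described above form a simple checklist, which can be sketched as a single function. This is an illustrative sketch only: the data structure, its field names, and the way each part is reduced to a boolean flag are assumptions made for the example, not elements of the proposed regulatory text.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class CandidateSite:
    """Attributes of a candidate PM10-2.5 monitoring site.
    All field names are illustrative, not regulatory terms."""
    urbanized_area_population: int   # 0 if outside any urbanized area
    block_group_density: float       # persons per square mile
    enclave_area_sq_mi: Optional[float]  # low-density enclave area, if any
    population_oriented: bool
    in_source_microenvironment: bool
    assessment_shows_urban_pm_dominates: bool

def passes_suitability_test(site: CandidateSite) -> bool:
    # Part 1: within a Census-defined urbanized area of >= 100,000 persons.
    if site.urbanized_area_population < 100_000:
        return False
    # Part 2: block group density >= 500 persons/sq mi, or membership
    # in a low-density enclave of less than five square miles.
    if site.block_group_density < 500:
        if site.enclave_area_sq_mi is None or site.enclave_area_sq_mi >= 5:
            return False
    # Part 3: the site must be population-oriented.
    if not site.population_oriented:
        return False
    # Part 4: not in a microscale environment affected by a large source.
    if site.in_source_microenvironment:
        return False
    # Part 5: site-specific assessment confirms that urban PM sources
    # (paved roads, industry, construction) dominate the ambient mix.
    return site.assessment_shows_urban_pm_dominates
```

All five parts must be satisfied; failing any one disqualifies the site for comparison to the NAAQS.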
As an example of how this five-part
suitability test would work, consider the
Riverside-San Bernardino-Ontario,
California MSA. The first part of the test
excludes any site outside the Census-designated urbanized areas within the
MSA, of which there are several. The
second part of the test would indicate
that a monitoring site within a certain
boundary around the densest parts of
the Riverside-San Bernardino urbanized
area, the Indio-Cathedral City-Palm
Springs urbanized area, or any of the
other urbanized areas in the MSA that
have a population of at least 100,000
persons, is possibly suitable for
comparison with the NAAQS, while a
monitoring site in the small Yucca
Valley urban cluster would definitely
not be suitable. Each boundary would
follow block group borders, and would
leave out less dense parts of its
associated urbanized area. The third
part of the test (population-orientation)
would disqualify some sites within
these boundaries because of the small
number of people subject to exposure in
the vicinity that has concentrations
similar to what would be monitored at
the site. The fourth part would
disqualify sites adjacent to major
roadways (a source-influenced
microenvironment). The fifth part
would assess the remaining candidate
sites to verify that they are not exposed
to windblown rural dust and soils or PM
generated by agriculture and mining
sources to such an extent that emissions
from those sources would dominate the
mix of PM10-2.5 sampled at that site.
We invite comment on possible
variations of the proposed test for
suitability for comparison to the
NAAQS, for example the use of census
tracts in place of block groups or
different values for population density
or total population of an aggregation of
block groups or tract groups. Census
tracts are defined as combinations of
(usually a few) block groups, and would
provide a somewhat larger scale of
analysis around a candidate monitoring
site.
While the issue of setting boundaries
for nonattainment areas is not a subject
of this rulemaking, we note that the
considerations that underlie the
proposed suitability test, having to do
with the influence of sources on
measured concentrations, may also be
relevant to the setting of such
boundaries.
The five-part suitability test will leave
as suitable many sites in an MSA, falling
into the three broad categories described
earlier. We believe that States should be
given further direction on placement of
the required monitors among these sites.
A network design strategy should not
allow all required PM10-2.5 monitoring
sites to be located so far from large
emissions sources that they measure
ambient concentrations lower than
would be representative of the impact of
coarse particle sources on well
populated urban areas. We propose to
address this issue by adopting some of
the elements of the monitoring siting
approach that has been used for the
PM10 NAAQS. We propose that 50
percent of required PM10-2.5 monitors 60
be required to represent population-oriented middle scale-sized areas 61 62
near but not adjacent to large sources of
PM (i.e., heavily traveled paved
roadways, long-term construction sites,
large industrial sources) to characterize
air quality in significant-sized areas that
are affected by emissions from these
sources where people may spend a
greater part of their day.63 The
placement of a monitor on the grounds
of a school within a residential
60 Fractional monitor requirements would round
up. MSA with one, two, three, four, or five required
monitors would place one, one, two, two, or three
monitors in this manner, respectively.
61 A middle scale-sized area is one in which there
are significant differences in concentrations
between locations that are 100 meters to 500 meters
apart, and generally are areas that are impacted by
nearly adjacent (but not immediately adjacent)
sources, such as industrial sites, roadways, or
construction sites. Middle scale sites are common
in PM10 monitoring (see present 40 CFR part 58,
appendix D, section 2.8.0.2) and typical of the PM10
sites used to establish the equivalency of the
proposed PM10-2.5 NAAQS to the current PM10
NAAQS.
62 Additional information on middle-scale siting,
and on all such monitoring scales, can be found in
the document: Guidance For Network Design and
Optimum Site Exposure For PM2.5 and PM10. U.S.
Environmental Protection Agency. EPA–454/R–99–
022. December 1997. Available on the web at:
https://www.epa.gov/ttn/amtic/files/ambient/pm25/
network/r-99–022.pdf.
63 If only one monitor is required, then that
monitor would need to conform to this siting
requirement (if the monitor is to be considered as
part of the minimum network design).
community that is near but not adjacent
to an industrial facility would be an
example of such a site. With this
requirement for middle scale PM10-2.5
sites, EPA’s proposal provides the
intended degree of protection in
populated areas with high coarse
particle concentrations by requiring
sites that are likely to measure the
maximum concentrations (among sites
meeting the suitability test) in one or
more of the populated areas that are
impacted by the heaviest PM emissions
from roadways and/or industrial/
construction sources.
For those areas with monitoring
requirements greater than one required
monitor, we propose that at least one of
the required monitors must be sited in
a neighborhood scale-sized area 64 that
is highly populated and which may be
somewhat further away from emission
sources but is still expected to have
elevated levels of coarse particles of
concern. These sites would typically
still be impacted by roadway and/or
industrial/construction source
emissions, but to a lesser extent than
sites expected to measure maximum
concentrations. Among such sites, the
State should select a site characterized
by a very large number of people subject
to exposure; typically, this population
number would be higher than the
population at sites expected to record
maximum concentrations. A site located
within a heavily populated residential
and commercial area that is in
proximity to roadways with high
vehicular traffic would be an example of
this type of monitor placement. A site
of this type is useful for several reasons.
It will help define the spatial gradients
of PM10-2.5 concentrations, which may
be useful in setting nonattainment area
boundaries. It likely will provide
concentration data that are relevant for
informing a large segment of the
population of their exposure levels on a
given day. Also, areas of this type may
have PM10-2.5 nonattainment problems
that are caused by a different source mix
than problems found at the first type of
site, and require a different approach to
reducing concentrations. For example,
the mix of industrial and paved road
emissions may be different or the mix of
types of vehicles on paved roads may be
different.
For MSA with a requirement for one,
two, or three monitors, the above two
siting provisions address the siting of all
64 A neighborhood scale-sized area is one in which
there are not typically significant differences in
concentrations between locations that are 500
meters to four kilometers apart, and generally are
areas that are impacted by the more well-mixed
emissions of urban industrial and mobile sources in
the general vicinity of the site.
required monitors with respect to
proximity to specific sources and
populations. For MSA with a
requirement for four or five monitors,
there is one remaining required monitor
not yet addressed. We propose that the
siting of this monitor be left to the
discretion of the State or local
monitoring agency, subject to a
restriction that the site satisfy the
suitability test described above. This
site could be placed in locations similar
to those that would be eligible as
monitoring sites for the other required
monitors, i.e., at other sites that meet
one of the above two proposed siting
requirements. A State may also choose
to place the site in a location that is
somewhat more distant from downtown
areas, main industrial source regions, or
areas of highest traffic density, such as
in a highly populated suburban
residential community. The comparison
of ambient PM10-2.5 concentrations
between such suburban monitors and
those monitors located at the previously
described maximum exposure-type of
sites would provide comparative data
for assessing the spatial variation of
PM10-2.5 concentrations over a
metropolitan area.
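The allocation rules above (50 percent of required monitors at middle-scale near-source sites with fractions rounded up, at least one neighborhood-scale site when more than one monitor is required, and any remainder at State discretion) can be sketched as follows. The function name and return structure are assumptions for illustration; the rounding behavior follows footnote 60.

```python
from math import ceil

def allocate_required_monitors(n_required: int) -> dict:
    """Illustrative allocation of an MSA's required PM10-2.5 monitors
    among the proposed siting categories."""
    middle_scale = ceil(n_required / 2)        # 50 percent, rounded up
    neighborhood = 1 if n_required > 1 else 0  # at least one such site
    discretionary = n_required - middle_scale - neighborhood
    return {"middle_scale": middle_scale,
            "neighborhood": neighborhood,
            "discretionary": discretionary}
```

For one through five required monitors this yields 1, 1, 2, 2, and 3 middle-scale sites respectively, matching footnote 60, with a discretionary site appearing only for MSA requiring four or five monitors.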
While we expect the proposed
suitability test described above will
appropriately identify areas where the
ambient mix of PM10-2.5 is dominated by
resuspended dust from high-density
traffic on paved roads and PM generated
by industrial sources and construction
sources, it may not identify them all. We
recognize that it does not address the
possibility that high density traffic on
paved roads, large industrial emission
sources, and/or construction activities
may be located outside an urbanized
area (including outside any MSA) or in
parts of an urbanized area that do not
satisfy the second part of the suitability
test (related to population density) such
that monitoring sites near these sources
would not meet the proposed test, yet
persons living or working near the
source could be exposed to
concentrations of PM10-2.5 which are
dominated by the PM emissions from
these sources. We invite comment on
alternative approaches that would
examine areas where States may wish to
place non-required monitors that do not
meet the proposed suitability test, but
are locations of industrial emissions or
high traffic on paved roads which create
the potential for ambient mixes of
coarse particles of the type intended to
be included by the indicator. In
particular, EPA solicits comment on a
modification of the proposed test that
would specify that a site meeting only
the third, fourth, and fifth parts of the
suitability test could be compared to the
NAAQS if it were close enough to an
industrial source of coarse particles of a
defined high enough emissions level
(for example, 100 tons per year or more
of emissions) that the ambient mix
would be dominated by PM generated
by that industrial source. The term
‘‘industrial’’ would be made operational
by using a source’s assigned industry
code under the North American
Industry Classification System (NAICS)
and excluding sources with codes
corresponding to agricultural or mining
industries.65 As noted, the site would
have to be population-oriented and could
not be in the micro-scale environment
affected by a large source. A site-specific
assessment (the fifth part of the
suitability test) would still be required,
and would consider the local mix of
emission source types and sizes, their
relative locations to the potential
monitoring site, and local factors
affecting transport and deposition of
PM10-2.5. Such monitors, even if
determined to be comparable to the
NAAQS through the site-specific
assessment, would not count toward the
minimum number of monitors required
for each MSA.
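The NAICS-based screen solicited above can be sketched as a short function. The 100 tons per year figure is the example threshold floated in the text; treating agriculture and mining as the two-digit NAICS sectors 11 and 21 is an assumption made for this illustration, and a passing screen would still require the site-specific assessment.

```python
def is_qualifying_industrial_source(naics_code: str,
                                    annual_emissions_tpy: float,
                                    threshold_tpy: float = 100.0) -> bool:
    """Illustrative screen for the solicited modification: a source
    qualifies if its coarse-particle emissions meet the example
    threshold and its NAICS code is not in the agriculture (11) or
    mining (21) sectors."""
    excluded_sectors = ("11", "21")  # agriculture; mining
    if naics_code.startswith(excluded_sectors):
        return False
    return annual_emissions_tpy >= threshold_tpy
```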
We also invite comment on the
possibility of another, similar
modification to the proposed suitability
test as that just described for industrial
sources, but addressing emissions from
vehicle traffic on roadways. Non-required State sites otherwise excluded
from comparison to the NAAQS, based
on their location outside of a U.S.
Census Bureau-defined urbanized area
and/or their location in block groups
with population density below the
proposed threshold, but that are
population-oriented and within some
distance of a roadway with a certain
traffic volume per day, could be the
subject of site-specific analysis to
determine whether they are in fact
suitable for comparison to the
NAAQS based on the PM emissions
from sources that dominate PM10-2.5
concentrations at those sites. Such sites
would have to be population-oriented
and could not be in the micro-scale
environment affected by the roadway.
The site-specific assessment would
consider the local mix of emission
source types and sizes, their relative
locations to the potential monitoring
site, and local factors affecting transport
and deposition of PM10-2.5. We seek
comment on whether such sites would
be appropriate for comparison to the
NAAQS, and, if so, what levels of VMT
must occur and/or other conditions
exist before comparison to the NAAQS
65 Information on the NAICS is available at
https://www.census.gov/epcd/naics02/.
could be considered. We note that traffic
volume alone is not a direct predictor of
emissions of resuspended dust and
other PM10-2.5 emissions, since the load
of dust on the highway and the mix of
vehicle types matter also. Such
monitors, even if determined to be
comparable to the NAAQS through the
site-specific assessment, would not
count toward the minimum number of
monitors required for each MSA.
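The roadway-based screen described above can be sketched in the same way as the industrial-source screen. The proposal deliberately sets no distance or traffic-volume values, so the cutoffs below are pure placeholders, and, as the text notes, traffic volume alone would not settle the question: a passing screen would only trigger the site-specific assessment.

```python
def roadway_site_may_qualify(population_oriented: bool,
                             in_roadway_microscale: bool,
                             distance_to_roadway_m: float,
                             traffic_volume_per_day: float,
                             max_distance_m: float = 300.0,
                             min_volume_per_day: float = 50_000.0) -> bool:
    """Illustrative screen for the solicited roadway-based modification:
    the site must be population-oriented, outside the roadway's
    microscale environment, and within some distance of a roadway with
    a certain daily traffic volume. Both cutoff values are placeholders;
    the proposal specifies neither."""
    return (population_oriented
            and not in_roadway_microscale
            and distance_to_roadway_m <= max_distance_m
            and traffic_volume_per_day >= min_volume_per_day)
```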
iii. Non-required monitoring. States
may deploy PM10-2.5 monitors in
addition to those that would be
required. For example, additional
monitors in areas that are required to
have one or more monitors may be very
useful for determining nonattainment
area boundaries. States might also want
to site monitors near large point sources,
if the final rule provides for the
suitability of monitoring sites near such
sources. The EPA will work with States
as they consider what additional
monitors to deploy and operate.
The proposed suitability test for
comparison with the PM10-2.5 NAAQS
applies to all non-required monitors (as
well as all required monitors). Data from
monitors that do not meet the suitability
test could not be used for nonattainment
determinations. For example, as with
required monitors, non-required
monitors must also be population-oriented as defined above in order to be
used for nonattainment designations.
Also, as with required monitors, non-required monitors could not be
compared to the NAAQS if they are
located in source-influenced micro-environments, such as on facility fence
lines or along the edge of traffic lanes.
iv. Speciation monitoring. In addition
to sites measuring PM10-2.5 mass
concentration, our experience with
PM2.5 suggests that it would be useful to
have a long-term PM10-2.5 speciation
network of 50 to 100 sites to assess
physical and chemical characteristics at
a nationally diverse set of locations.
Speciation data would help identify the
specific source types, address the
relative contribution of anthropogenic
and natural sources to ambient
concentrations, and support future
research concerning the health risks of
coarse particles of various compositions
and source origins. We propose that one
speciation site be located in each of the
MSAs with total population greater than
500,000 people and that also have an
estimated PM10-2.5 design value greater
than 80 percent of the proposed PM10-2.5
NAAQS. We expect that approximately
25 MSAs will be required to have
speciation monitors based on these
proposed criteria. These sites will gather
data in areas that have a higher
probability of exceeding the proposed
NAAQS and also have larger exposed
populations at risk, and would support
the characterization of coarse particle
concentrations that control the
attainment/nonattainment status of the
area. States would be required to
operate any of these speciation sites that
were located inside their borders. In
some cases, monitors could be
collocated with PM2.5 speciation
monitors at urban NCore multipollutant
monitoring stations to provide
comparative chemical characterization
studies between fine and coarse
particles. The PM10-2.5 mass
concentration data obtained with
speciation monitors would be
comparable to the NAAQS only in
situations where the underlying
sampling method used to obtain the
filters was an approved FRM or FEM
and the site met the suitability test
described earlier in this section.
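The two-part eligibility test described above (MSA population greater than 500,000 and an estimated design value above 80 percent of the proposed NAAQS) can be sketched as follows. This is purely illustrative: the NAAQS level and the MSA figures below are hypothetical placeholders, not EPA data.

```python
# Illustrative sketch of the proposed speciation-site criterion: an MSA
# qualifies if its population exceeds 500,000 AND its estimated PM10-2.5
# design value exceeds 80 percent of the proposed NAAQS. The NAAQS level
# and MSA records below are hypothetical placeholders, not EPA data.

def requires_speciation_site(population, design_value, naaqs_level):
    """Return True if an MSA meets both proposed criteria."""
    return population > 500_000 and design_value > 0.8 * naaqs_level

NAAQS = 70.0  # hypothetical PM10-2.5 standard level, ug/m3

msas = [
    {"name": "MSA A", "population": 750_000, "design_value": 60.0},
    {"name": "MSA B", "population": 300_000, "design_value": 65.0},    # too small
    {"name": "MSA C", "population": 1_200_000, "design_value": 40.0},  # too clean
]

qualifying = [m["name"] for m in msas
              if requires_speciation_site(m["population"], m["design_value"], NAAQS)]
print(qualifying)  # ['MSA A']
```

Under these hypothetical inputs, only the MSA meeting both the population and the design-value threshold would be required to host a speciation site.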
We will collaborate with States to
select and fund additional sites based
on data requirements, individual State
needs, and availability of funds. The
EPA solicits comment on all aspects of
the PM10-2.5 speciation network
including the number of required sites,
the total size of the network, the criteria
for choosing the number of required
monitors in each area, the sampling
method used to obtain filters, and
frequency and types of analyses that
would be performed on those filters.
c. Monitoring plan requirements and
approval process.
We propose that each State be
required to submit to the respective EPA
Regional Administrator a plan
proposing how all affected monitoring
organizations within the State will
comply with the requirements described
above for the type, sampling schedule,
number, and location of PM10-2.5
monitoring stations. The plan would
also provide supporting information for
why each monitoring site which the
State proposes to count towards the
requirement for a minimum number of
monitors is suitable for comparison to
the PM10-2.5 NAAQS, based on the
criteria described above. In addition, for
each non-required monitoring site
which the State intends to deploy and
which the State considers would be
appropriate for comparison to the
proposed PM10-2.5 NAAQS, the plan
would also provide evidence that the
monitor is suitable for comparison,
based on the criteria described above.
The State would be required to make
this plan available for public inspection
for at least 30 days prior to submission
to EPA.
This plan would be due to EPA
January 1, 2008. The EPA Regional
Administrator may extend this due date
to July 1, 2008, for example to allow it
to be consolidated with the overall
annual monitoring review and plan due
at that time.
The EPA Regional Administrator will
review the submitted plan and approve
or disapprove it by a letter to the
submitting State official within 120 days
of submittal. The EPA Regional
Administrator will be required to invite
public comment; he/she must consider
relevant public comments, if any are
received in response to the invitation.
We are not proposing a specific
mechanism for the Regional
Administrator to make the plan
available for public comment, but we
invite comment now on mechanisms
that would be practical for the Regional
Administrators and effective for persons
likely to want to comment. The
approval, if given, will include
confirmation that EPA will treat each
planned monitoring site as suitable or
not suitable for comparison to the
PM10-2.5 NAAQS, along with the reasons
for each determination. This
confirmation will be a final EPA action
applicable to subsequent determinations
of attainment or nonattainment. This
status will then be recorded in AQS for
each monitor by the State.
Elsewhere in this notice (section
IV.E.11), we are proposing a new
requirement for States to conduct and
submit to EPA a comprehensive
monitoring system assessment at five-year intervals. The status of each
PM10-2.5 monitoring site with respect to
comparability to the NAAQS should be
re-examined during these assessments,
starting with the first assessment which
is submitted not less than 5 years after
EPA Regional Administrator approval of
the initial PM10-2.5 monitoring plan. The
State may also propose a change in the
status of a PM10-2.5 monitor whenever a
large existing source of PM10-2.5 near the
monitor ceases (or begins) operation and
is expected to remain shut down (or to
continue operation) for three or more
years, if the type of source involved is
such that its shut down or start up could
materially affect what types of
emissions dominate the PM10-2.5
measured at the site.
We invite comment on this proposed
process and possible alternatives or
additions to it, for example on whether
there should be review by the EPA
Administrator before the approval or
disapproval is considered a final
Agency action, or an opportunity for
appeal to the Administrator to alter the
final action.
3. Monitoring Requirements for the
Proposed Primary and Secondary
National Ambient Air Quality Standards
for PM2.5
The current PM2.5 network includes
over 1,200 FRM samplers at
approximately 900 sites that are
operated to determine compliance with
the NAAQS; track trends, development,
and accountability of emission control
programs; and provide data for health
and ecosystem assessments that
contribute to periodic reviews of the
NAAQS. Over 450 continuous PM2.5
monitors are operated to support public
reporting and forecasting of the AQI.
For PM2.5, EPA proposes to modify
the network minimum requirements for
PM2.5 monitoring so that multiple urban
monitors in the same CBSA are not
required if they are redundant or
measuring concentrations well below
the NAAQS. We propose to base
minimum monitoring requirements for
PM2.5 on PM2.5 concentrations as
represented by a design value, and on
the census population of the CBSA.
Overall, this is expected to result in a
lower number of required sites;
however, we recommend and anticipate
that States continue to operate a high
percentage of the existing sites now
utilizing FRM, but with FEM and ARM
continuous methods replacing the FRM
monitors at many of these sites.66
We are proposing to require that all
sites counted by a State towards meeting
the minimum requirement for the
number of PM2.5 sites have an FRM,
FEM, or ARM monitor. We are also
proposing that at least one-half of all the
required PM2.5 sites be required to
operate PM2.5 continuous monitors of
some type even if not an FEM or ARM.
This requirement would ensure that
continuous methods continue to be well
utilized throughout the network to
support monitoring objectives such as
public reporting and forecasting of the
AQI not readily addressed by FRM and
filter-based FEM.
As noted, EPA proposes to use design
value and population as inputs in
deciding the minimum required PM2.5
monitoring sites in each CSA/CBSA. We
are proposing these inputs so that
monitoring resources are prioritized
based on the number of people who may
be exposed to a problem and the level
of exposure of that population.
Metropolitan areas with smaller
populations would not be required to
66 An approved regional method (ARM) is a PM2.5 method that has been approved specifically within a State, local, or tribal air monitoring network for purposes of comparison to the National Ambient Air Quality Standards and to meet other monitoring objectives. See section IV.D.2 of this preamble.
perform as much monitoring as larger
areas. If ambient air concentrations as
indicated by historical monitoring are
low enough, these smaller population
areas would not be required to continue
to perform any PM2.5 monitoring.
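As a purely hypothetical sketch of how a design-value-and-population matrix of this kind might be structured (the tier boundaries and site counts below are illustrative inventions, not the values EPA is proposing):

```python
# Hypothetical sketch of a minimum-site count driven by CBSA population
# and the PM2.5 design value relative to the NAAQS. The tier boundaries
# and site counts are illustrative inventions, not the proposed values.

def minimum_pm25_sites(population, design_value, naaqs_level):
    """Return an illustrative minimum number of required PM2.5 sites."""
    ratio = design_value / naaqs_level
    if ratio < 0.85:                 # concentrations well below the NAAQS
        return 0 if population < 50_000 else 1
    if population >= 1_000_000:      # large area near or above the NAAQS
        return 3
    return 2

print(minimum_pm25_sites(2_000_000, 14.0, 15.0))  # 3
print(minimum_pm25_sites(40_000, 10.0, 15.0))     # 0
```

The point of the structure, whatever the actual thresholds turn out to be, is that both low concentrations and small populations push the minimum requirement down, with small, clean areas potentially required to do no PM2.5 monitoring at all.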
The proposed amendments would
require fewer sites when design values
are well above (rather than near) the
NAAQS to allow more flexibility in the
use of monitoring resources in these
areas where States and EPA are already
more certain of the severity and extent
of the PM2.5 problem and possibly in
more need of other types of data to
address it. For instance, an agency may
wish to operate more speciation
samplers rather than FRM to get a better
understanding of the atmospheric
chemistry of an area. We invite
comments on this approach, versus
requiring more FRM/FEM monitors in
areas well above the NAAQS.
The proposed siting criteria for PM2.5
monitors would remain the same as
current requirements, which have an
emphasis on population-oriented sites
at neighborhood scale and larger.
Population-oriented middle-scale sites
would remain a part of the network for
comparison to both the daily and annual
standard when a site can represent
many other middle-scale locations
where people are exposed. For middle-scale sites that are unique, only the
daily NAAQS would be considered
when comparing data to the standard.
Background and transport sites would
remain a required part of each State’s
network to support characterization of
regional transport and regional scale
episodes of PM2.5. To meet these
requirements, IMPROVE samplers may
be used even though they would not be
eligible for comparison to the PM2.5
NAAQS; these samplers are currently
used in visibility monitoring programs
in Class I areas and national parks. Sites
in other States which are located at
places that make them appropriate as
background and transport sites can also
fulfill these minimum siting
requirements.
The proposed change in the primary
24-hour PM2.5 NAAQS from 65 µg/m3 to
35 µg/m3 raises the issue of whether any
commensurate changes would be
needed in the PM2.5 ambient monitoring
network regulations. The current
specific network design criteria for
PM2.5 in appendix D to 40 CFR part 58
direct States to select sites mostly
representative of community-oriented
area-wide PM2.5 exposure levels at
locations of neighborhood or larger
scale, except in cases where a certain
population-oriented microscale or
middle-scale PM2.5 site is determined to
represent similar locations that
collectively form a larger region of
localized high ambient PM2.5
concentrations. The EPA believes that
these current design criteria remain
appropriate for implementation of the
proposed primary PM2.5 NAAQS. The
existing minimum requirements
effectively ensure that monitors are
placed in locations that appropriately
reflect the community-oriented area-wide concentration levels used in the
epidemiological studies that support the
proposed lowering of the 24-hour
NAAQS.
Most often, the current location of
maximum monitored annual PM2.5
concentrations is the same as the
location of maximum monitored 24-hour PM2.5 concentrations, suggesting
that no shifts in monitors would be
needed to implement the proposed 24-hour NAAQS. In a relatively small
number of cases,67 certain microscale
PM2.5 monitors that have not been
eligible for comparison to the annual
PM2.5 NAAQS and that have been
complying with the 24-hour PM2.5
NAAQS, and therefore have not
impacted the attainment status, may
become more influential to attainment
status under a more stringent 24-hour
form of the NAAQS. Some sites that
have not measured high concentrations
relative to the current 24-hour NAAQS
may also become more influential to
attainment status under the proposed
more stringent 24-hour NAAQS. In
these cases, States may choose to move
accompanying speciation and
continuous monitors to the new site of
particular interest to get a better
characterization of PM at that location.
States and EPA may also agree on
changing the location of some PM2.5
FRM/FEM sites to ensure measurements
at the population-oriented location(s) of
most interest.
In proposed changes to 40 CFR 58.10
(Monitoring Network Description and
Periodic Assessments), monitoring
agencies would be required to provide
a network plan that includes the
identification of any PM2.5 sites that are
not suitable for comparison against the
annual PM2.5 NAAQS. The proposed
requirements would also provide for a
public hearing and review of changes to
a PM2.5 monitoring network that impact
the location of a violating PM2.5
monitor, prior to requesting EPA
approval of the changes. Through this
process, monitoring agencies would be
able to consider changes to their PM2.5
monitoring networks made in response
to the proposed NAAQS, and inform the
67 EPA is presently aware of fewer than 10 PM2.5 monitors that are sited in a manner that is unsuitable for comparison to the annual NAAQS.
public about the potential implications
on design values and resulting
attainment and nonattainment
decisions.
In today’s NAAQS proposal
(published elsewhere in this Federal
Register), EPA requests comments on
the alternative of basing a PM2.5
secondary standard on a shorter-term
averaging interval of less than 24 hours
to provide protection against visibility
impairment primarily in urban areas.
If the alternative short-term secondary
standard is promulgated, EPA envisions
that compliance would be assessed with
data from continuous PM2.5 monitoring
methods capable of providing hourly
time resolution. Continuous monitors
would be required to comply with FEM
or ARM requirements. Hourly PM2.5
data values would be averaged over the
appropriate short-term averaging
interval (e.g., four to eight hours) to
assess compliance with the proposed
short-term secondary NAAQS. The
alternative short-term secondary
NAAQS would also require minor
additions to the current PM2.5 siting
requirements. Some continuous
monitors would likely be required to be
sited on a neighborhood and urban scale
to form the basis of a network
representing ambient PM2.5 conditions
along corridors that influence visibility
of important scenic resources in and
around urban areas. Monitoring agencies
might also consider collocating such
monitors with automated haze-cam
systems to quantify local relationships
between short-term PM2.5
concentrations and visual range.
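The averaging step EPA envisions for the alternative secondary standard (hourly PM2.5 values averaged over a four- to eight-hour interval) might be sketched as follows; the window length and the concentration values are illustrative:

```python
# Sketch of compliance averaging for the alternative short-term secondary
# NAAQS: hourly PM2.5 values averaged over a sub-daily interval (e.g.,
# four to eight hours). Window length and data here are illustrative.

def short_term_averages(hourly, window=4):
    """Running means over every consecutive `window`-hour span."""
    return [sum(hourly[i:i + window]) / window
            for i in range(len(hourly) - window + 1)]

hourly_pm25 = [30.0, 34.0, 38.0, 42.0, 40.0, 36.0]  # ug/m3, hypothetical
print(short_term_averages(hourly_pm25, window=4))  # [36.0, 38.5, 39.0]
```

Compliance assessment would then compare the resulting sub-daily averages, rather than 24-hour averages, against the level of the proposed short-term secondary standard.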
4. Proposed Monitoring Requirements
for PM10
In the PM NAAQS proposal published
elsewhere in this Federal Register, EPA
proposes to revoke the PM10 annual
standard. Further, consistent with the
more targeted nature of the proposed
new PM10-2.5 indicator, the
Administrator proposes to revoke the
current 24-hour PM10 standard
everywhere except in areas where there
is at least one monitor that violates the
24-hour PM10 standard. In areas where
both applicable PM10 NAAQS are
revoked, we propose to have no
minimum PM10 monitoring
requirements and to allow
discontinuation of PM10 monitors
without prior EPA approval, although
monitoring organizations would have
the option of funding and operating
PM10 monitors as needed to satisfy any
still-applicable SIP commitments or to
monitor compliance with non-Federal
air quality standards. In areas where the
PM10 NAAQS are not both revoked, we
propose to have no minimum
requirements, but to require prior EPA
approval for changes to existing
monitors. See also section IV.E.8 of this
preamble.
5. Proposed Requirements for Operation
of Ozone Monitoring Sites
Ozone (O3) monitoring sites are
operated to determine compliance with
the NAAQS; to track trends,
development, and accountability of
emission control programs; to provide
data for health and ecosystem
assessments that contribute to ongoing
reviews of the NAAQS; and to support
public reporting and forecasting of the
AQI. For O3, EPA proposes to change
the minimum network requirement from
at least two sites in ‘‘any urbanized area
having a population of more than
200,000’’ to an approach that considers
the level of O3 exposure, as indicated
by the design value and the census
population of an area. Larger population
CSA and CBSA with design values near
the O3 NAAQS would be required to
operate at least four sites. Smaller CSA
and CBSA would be required to operate
as few as one site, provided the design
values were sufficiently low.
Similar to the proposal for PM2.5, EPA
proposes that areas with measured
ambient concentrations significantly
above the NAAQS be required to
operate fewer sites than areas with
measured ambient concentrations near
the NAAQS to allow flexibility of
resources in those areas. We invite
comments on this approach.
The O3 monitoring network is
primarily based on continuous FEM
using ultraviolet analysis. The network
is well deployed throughout the country
at about 1,100 sites with most
metropolitan areas already operating
more O3 monitors than would be
required by today’s proposed
amendments. The EPA does not
anticipate or recommend significant
changes to the size of this network
because O3 remains a pollutant with
measured levels near or above the
NAAQS in many areas throughout the
country. However, the proposed
amendments would help to better
prioritize monitoring resources
depending on the population and
relative levels of O3 in an area.
6. Proposed Requirements for Operation
of Carbon Monoxide, Sulfur Dioxide,
Nitrogen Dioxide, and Lead Monitoring
Sites
Criteria pollutant monitoring
networks for the measurement of CO,
SO2, NO2, and Pb are primarily operated
to determine compliance with the
NAAQS and to track trends and
accountability of emission control
programs as part of a SIP. Because these
criteria pollutant concentrations are
typically well below the NAAQS, there
is limited use for public reporting to the
AQI, except for a very small number of
locations with on-going local air quality
issues.
Gas measurements of CO, SO2, and
NO2 utilize continuous technologies.
Lead (Pb) is sampled by collecting total
suspended particulates (TSP) on a high-volume sampler, with the collected
filters analyzed in a laboratory.
We are proposing to revoke all
minimum requirements for CO, SO2,
and NO2 monitoring networks, and
reduce the requirements for Pb. This
proposal allows for reductions in
ambient air monitoring for CO, SO2,
NO2, and Pb, particularly where
measured levels are well below the
applicable NAAQS and air quality
problems are not expected, except in
cases with ongoing regulatory
requirements for monitoring such as SIP
or permit provisions. In these cases,
EPA encourages States to comment on
ways to reduce these potentially
unnecessary monitors. We will also
work with some States on a voluntary
basis to make sure that at least some
monitors for these pollutants remain in
place in each EPA region. Measurements
of CO, SO2, and NOy are being proposed
as required measurements at NCore
sites. There may be little regulatory
purpose for keeping many other sites
showing low concentrations, other than
specific State, local, or tribal
commitments to do so. However, in
limited cases, some of these monitors
may be part of a long-term record
utilized in a health effects study. The
EPA expects State and local agencies to
seek input on which monitors are being
used for health effects studies prior to
shutting down a monitor. See also
section IV.E.8 of this preamble
(Proposed criteria and process for
discontinuing monitors).
7. Proposed Changes to Minimum
Requirements for Ozone Precursor
Monitoring
Section 182(c)(1) of the CAA required
us to promulgate rules requiring
enhanced monitoring of ozone, oxides
of nitrogen, and volatile organic
compounds in ozone nonattainment
areas classified as serious, severe, or
extreme. On February 12, 1993, we
promulgated requirements for State and
local monitoring agencies to establish
Photochemical Assessment Monitoring
Stations (PAMS) as part of their SIP
monitoring networks in ozone
nonattainment areas classified as
serious, severe, or extreme. During 2001,
we formed a workgroup consisting of
EPA, State, and local monitoring experts
to evaluate the existing PAMS network.
The PAMS workgroup recommended
that the existing PAMS requirements be
streamlined to allow for more
individualized PAMS networks to suit
the specific data needs for a PAMS area.
We are proposing changes to the
minimum PAMS monitoring
requirements in 40 CFR part 58 to
implement the recommendations of the
PAMS workgroup. Specifically, we are
proposing the following changes:
• The number of required PAMS sites
would be reduced. Only one Type 2 site
would be required per area regardless of
population and Type 4 sites would not
be required. Only one Type 1 or one
Type 3 site would be required per area.
• The requirements for speciated
VOC measurements would be reduced.
Speciated VOC measurements would
only be required at Type 2 sites and one
other site (either Type 1 or Type 3) per
PAMS area.
• Carbonyl sampling would only be
required in areas classified as serious or
above for the 8-hour O3 standard.
• NO2/NOX monitors would only be
required at Type 2 sites.
• NOy would be required at one site per
PAMS area (either Type 1 or Type 3).
• Trace level CO would be required at
Type 2 sites.
Note that on April 15, 2004, we
revised some O3 nonattainment
classifications under the 8-hour O3
standard (69 FR 23951). While the
number of areas classified as serious,
severe, or extreme ozone nonattainment
under the 8-hour O3 standard has been
greatly reduced (69 FR 23857), areas
that had previously been classified as
serious, severe, or extreme ozone
nonattainment under the 1-hour O3
standard are required to comply with
the PAMS monitoring requirements
until they achieve compliance with the
8-hour ozone standard. See 40 CFR
51.900(f)(9). In addition, the PAMS
requirements would apply to any new
areas that are classified or reclassified as
serious, severe, or extreme O3
nonattainment under the 8-hour O3
standard.
We solicit comments on the proposed
revisions to the PAMS monitoring
program requirements including the
measurements to be made, the sampling
frequencies, and the location and
numbers of required monitoring sites
proposed.
8. Proposed Criteria and Process for
Discontinuing Monitors
The EPA has determined that many
single-pollutant monitors operated by
State and local agencies, specifically
many of those measuring CO, Pb, PM10,
SO2, and NO2, are providing data that
have limited usefulness in air quality
management. This is likely the case for
monitors whose data indicate current
attainment of the corresponding
NAAQS with little prospect for future
nonattainment. Accordingly, consistent
with the draft National Ambient Air
Monitoring Strategy (NAAMS), we are
proposing to eliminate the current
requirements for operation of a certain
minimum number of monitors for CO,
PM10, SO2, and NO2, and to reduce the
requirements for Pb monitors, as
described in section IV.E.6 of this
preamble. We are also proposing
changes to loosen the minimum
requirements for monitoring of O3
precursors in the PAMS program, as
described in section IV.E.7 of this
preamble. We are also proposing
changes to the minimum requirements
for O3 and PM2.5 monitoring that may
have the effect of reducing the
minimum number of these monitors in
some areas. We note that the remaining
specific minimum requirements (limited
to O3, PM2.5, and PM10-2.5) are intended
to be necessary but are not always
sufficient to meet the requirement in
section 110(a)(2)(B) of the Clean Air Act
(CAA) that SIPs provide for operation of
appropriate systems to monitor,
compile, and analyze data on ambient
air quality. We intend to require many
States to operate some monitors for
these pollutants, but to determine what
monitoring is appropriate on a more
case-by-case basis. The EPA encourages,
and in fact the proposed amendments to
40 CFR part 58 would require, all States
to assess their monitoring networks
periodically to determine what changes
should be made, including which
monitors should be discontinued and
which retained. Local situations will
differ, and should be considered
individually. Reducing low-value
monitoring expenditures would allow
resources to be devoted to under-served
and new monitoring purposes.
Some monitors in excess of the
remaining minimums may be necessary
to the State/local air quality
management process, or for other uses,
such as development and validation of
air quality models. We are proposing to
continue to require States to propose
changes in their monitoring networks
and obtain EPA approval before making
changes, even when the remaining
minimum requirements for number of
monitors would still be met. This EPA
review and approval can take place
through the mechanism of the annual
monitoring plan. The current rule
already requires State agencies to
prepare and submit the plan on July 1
of each year for EPA approval at the
Regional Office level. We are proposing
to retain this current requirement. We
will approve proposed changes to a
monitoring plan provided the proposed
network will still meet any applicable
SIP provisions related to ambient
monitoring and will provide data
needed to support the air quality control
program. Based on assessments that we
and individual States have done to date,
we generally expect to find that a large
percentage—between 33 percent for SO2
and 90 percent for NO2—of current
monitors for CO, PM10, SO2, and NO2
can be removed; that most O3 monitors
should continue although some should
be moved to more productive locations;
that some filter-based PM2.5 monitors
can be removed; and that some filter-based PM2.5 monitors should be
replaced by continuous instruments
when models that have been approved
as FEM or ARM are available.
While local situations need to be
considered individually, we believe that
certain general principles can be
articulated regarding reductions in
monitoring networks. We have
incorporated these principles in the
proposed amendments to reduce
uncertainties in the process and thereby
facilitate an efficient and timely process
for review and approval or disapproval
of proposed changes. These principles
would apply independently. A monitor
meeting any one of them would qualify
for EPA approval for discontinuation.
Situations not addressed by these
criteria would be considered on a case-by-case basis. The EPA Regional Offices
would have more time to give this case-by-case consideration to the exceptional
cases because cases meeting one of the
following criteria could be disposed of
more quickly.
• Any PM2.5, O3, CO, PM10, SO2, Pb,
or NO2 monitor which has shown
attainment during the previous 5 years,
that has a probability of less than 10
percent of exceeding 80 percent of the
NAAQS during the next 3 years based
on the levels, trends, and variability
observed in the past, and which is not
specifically required by an attainment
plan or maintenance plan, can be
removed or moved to another
location.68, 69 Few if any O3 monitors in
68 The concept of using historical data to statistically predict the probability of a future violation is an element of EPA’s current policy memo on ‘‘Limited Maintenance Plan Option for Moderate PM10 Nonattainment Areas,’’ August 9, 2001. See https://www.epa.gov/ttn/oarpg/t1/fact_sheets/lmp_fs.pdf and https://www.epa.gov/ttn/oarpg/t1/memoranda/cdv.pdf. EPA believes that this concept can be generalized to the other pollutants listed in this paragraph, but the details of the probability estimation method(s) will likely differ.
urban areas would likely meet this
criterion, but some PM2.5 monitors may
do so. This criterion would not apply to
a PM2.5 monitor that is part of a spatial
averaging plan.
• A monitor for CO, PM10, SO2, or
NO2, which has consistently measured
lower concentrations than another
monitor for the same pollutant in the
same county and same nonattainment
area during the previous 5 years, and
which is not specifically required by an
attainment plan or maintenance plan,
could be removed or moved to another
location, if control measures scheduled
to be implemented or discontinued
during the next 5 years would apply to
the areas around both monitors and
have similar effects on measured
concentrations, such that the retained
monitor would remain the higher
reading of the two monitors being
compared.70
• For any pollutant, the highest
reading monitor (which may be the only
monitor) in a county (or portion of a
county within a distinct nonattainment
or maintenance area) could be removed
or moved to a new location provided the
monitor has not measured NAAQS
violations in the previous 5 years, the
CBSA within which the county lies (if
any) would still meet requirements
for the minimum number of monitors
for the applicable pollutant if any, and
the approved SIP provides for a specific,
reproducible approach to representing
the air quality of the affected county in
the absence of actual monitoring data.
For example, the SIP could provide that
a continuing monitor in a neighboring
county will always be taken by the State
and EPA to represent both counties for
purposes of nonattainment and other
regulatory determinations. Because EPA
would review and approve any SIP
revision that provides such an approach
to representing air quality in the
affected county, EPA can ensure its
technical validity and protectiveness.
We intend to take a cautious approach
to allowing removal of such monitors,
particularly in urban areas. While
approval of such SIP revisions would be
69 Five years of historical data means five
successive calendar years of data sufficient for
making an attainment determination.
70 PM2.5 and O3 are not included in this proposed
criterion because of the value of even low-reading
monitors in understanding the causes of
nonattainment and in informing the public about
potential exposures. Lead (Pb) is not included
because Pb concentrations are often very dependent
on effective control of Pb emissions of individual
sources very close to the monitor and we believe
it would be too risky to depend on area-wide
generalizations about the effect of scheduled
controls. Also, we believe the effectiveness of
emission controls on Pb sources may be more
variable over time than of CO, SO2, PM10, and NO2
emission controls on sources of those pollutants.
delegated to the Regional Offices, EPA
Headquarters officials would participate
in the review of proposed revisions that
present the first instance of specific
approaches, and would resolve issues of
national consistency if such issues arise.
• A monitor that EPA has determined
cannot be compared to the relevant
NAAQS because of its siting could be
moved or
removed. For example, a PM2.5 monitor
must be population-oriented to be
comparable to the daily or annual
NAAQS, and one that is not population-oriented could be removed.71
• A monitor that is designed to
measure concentrations upwind of an
urban area for purposes of
characterizing transport into the area
and that has not recorded violations of
the relevant NAAQS in the previous 5
years could be moved to another
location where information on transport
will be more useful to SIP development.
• A monitor not eligible for removal
under any of the above criteria could be
moved to a nearby location with the
same scale of representation if logistical
problems beyond the State’s control
make it impossible to continue
operation at its current site. For
example, the State may lose access to a
monitoring site not owned by the State
itself, and this criterion would ensure
approval of a new site that was nearby
and that had the same scale of
representation (e.g., middle-scale or
neighborhood-scale). A move to a more
distant site would require case-by-case
EPA review of the appropriateness of
the new location compared to other
alternatives.
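The first criterion above turns on the estimated probability that a monitor will exceed 80 percent of the NAAQS, based on historical levels, trends, and variability. The proposal does not specify the estimation method (footnote 68 notes the details will likely differ by pollutant), but one simple illustration, fitting a normal distribution to a 5-year record and estimating the chance a single future value exceeds the 80-percent threshold, might look like this; all data shown are hypothetical:

```python
import math

# Hedged illustration only: the proposal does not specify how the
# probability of exceeding 80 percent of the NAAQS is to be estimated.
# This sketch fits a normal distribution to a short historical record
# and returns the probability that a single future value exceeds the
# 80-percent threshold. All data below are hypothetical.

def exceedance_probability(history, naaqs_level):
    """P(next value > 0.8 * NAAQS) under a normal fit to `history`."""
    n = len(history)
    mean = sum(history) / n
    variance = sum((x - mean) ** 2 for x in history) / (n - 1)
    z = (0.8 * naaqs_level - mean) / math.sqrt(variance)
    # Standard normal survival function via the complementary error function.
    return 0.5 * math.erfc(z / math.sqrt(2))

# A hypothetical 5-year record well below a NAAQS of 35 ug/m3.
history = [18.0, 19.5, 17.0, 20.0, 18.5]
print(exceedance_probability(history, 35.0) < 0.10)  # True: could qualify
```

A monitor with such a record, if not otherwise required by an attainment or maintenance plan, would fall under the 10-percent test in the first criterion; EPA's actual method would need to account for trends and multi-year design-value forms, which this sketch does not.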
In the situations covered by these
proposed criteria, the State would need
to make a factual showing that the
specified conditions are met. Once the
EPA Regional Office accepts that
showing, the proposed amendments
would require approval of the State’s
request as part of the Regional Office
action on the annual monitoring plan.
We may issue guidance suggesting
appropriate ways these showings can be
made.
We invite comments on the specific
details of these proposed criteria, and
on other criteria that would be
appropriate.
To help make available to the State and
to EPA information that could be
relevant to the appropriateness of
monitoring network changes, we
propose that each State be required to
make available for public inspection its
draft annual monitoring plan for a
71 Section 2.8.1.2.3 of appendix D to 40 CFR part
58 (Network Design for State and Local Air
Monitoring Stations (SLAMS)).
Federal Register / Vol. 71, No. 10 / Tuesday, January 17, 2006 / Proposed Rules
period of at least 30 days prior to
submitting it to the EPA Regional Office
for approval. The State could, for
example, satisfy this proposed
requirement by making the draft plan
available for download via the air
agency’s Internet Web site. We also
propose that when submitting the
annual monitoring plan for EPA
approval, the State provide evidence
that: (1) The State has considered the
ability of the proposed network to
support air quality characterization for
areas with relatively high populations of
susceptible individuals (e.g., children
with asthma); and (2) if the State
proposes to discontinue any monitoring
sites, the State has considered how
discontinuing monitoring sites would
affect data users other than the
monitoring agency itself, such as nearby
States and tribes or health effects
research studies. We invite comment on
where EPA should provide opportunity
to examine and comment on monitoring
plans after they are reviewed by the
Regional Office.
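The proposed plan-submission requirements above amount to a short checklist: a 30-day public inspection period plus two items of evidence. A minimal sketch, assuming hypothetical field names (the 30-day period and the two evidence items come from the preamble; everything else here is illustrative):

```python
from datetime import date, timedelta

def plan_ready_for_submission(draft_public_since, submission_date,
                              considered_susceptible_populations,
                              discontinuations,
                              discontinuation_impacts_assessed):
    """Return a list of problems; an empty list means the draft plan
    meets the proposed pre-submission checks."""
    problems = []
    # Draft plan must be available for public inspection at least 30 days
    # before submission to the EPA Regional Office.
    if submission_date - draft_public_since < timedelta(days=30):
        problems.append("draft not publicly available for at least 30 days")
    # Evidence item (1): consideration of susceptible populations.
    if not considered_susceptible_populations:
        problems.append("no evidence of considering susceptible populations")
    # Evidence item (2): if any sites are discontinued, impacts on other
    # data users (nearby States, tribes, researchers) must be considered.
    if discontinuations and not discontinuation_impacts_assessed:
        problems.append("impacts on other data users not assessed")
    return problems
```

This is only a schematic of the proposed review logic, not regulatory text.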
9. Special Purpose Monitors
The development of today’s proposed
amendments has given EPA occasion to
re-examine the longstanding issue of
whether the ambient air monitoring
rules and current policies regarding use
of monitoring data for regulatory
determinations have the effect of
creating undue and counterproductive
disincentives to States and other
organizations deploying discretionary
monitors that overall and in the long
run would benefit air quality
management efforts. The EPA is
proposing a limited change in the
monitoring rules on this issue.
At present, each State at any given
time is required to operate a certain set
of monitors under the monitoring
regulations and its own approved
monitoring plan, or to meet
commitments it has made in its SIP and/
or grant agreement(s) with EPA. If a
State chooses to deploy an additional
monitor, it may designate it as a special
purpose monitor (SPM). Such
designation can afford the State certain
flexibility it would not have if the
monitor were designated as an NCore
station or State and local air monitoring
station (SLAMS).72 However, regardless
72 A special purpose monitor (SPM) is one which
the State does not count when showing compliance
with the minimum requirements for the number
and siting of monitors and which it has designated
as an SPM by so labeling it in the Air Quality
System (AQS) data system and/or in its monitoring
plan. In common practice EPA does not overrule
such designations provided the rest of the
monitoring network meets minimum requirements.
Monitors carrying special purpose status need not
use Federal reference or equivalent methods, are
of whether a monitor is designated as an
SPM, if it is an appropriately-sited FRM
or FEM monitor and if its operation
meets the QA requirements of 40 CFR
part 58, or if the data are otherwise
determined to be technically valid, EPA
considers all available data from that
monitor whenever we make a
determination of attainment or
nonattainment. The possibility that data
from an SPM could result in a
nonattainment designation of an area
that would otherwise not be so
designated may discourage the State
from deploying a new monitor or
supporting the deployment of a monitor
by another organization, such as a
university, even when the monitor
would provide useful information for
determining the extent, severity, causes,
and possible solutions of a known or
suspected air quality problem. Thus, a
State that might have voluntarily
addressed a nonattainment problem
may never become aware of the
problem. Also, affected persons may
also be left unaware and unable to
reduce their own exposures by
modifying their behavior or to advocate
for State action to address the problem.
We addressed this issue in the 1997
rulemaking that established the current
requirements for PM2.5 monitoring, and
created a narrow exception to the
practice that all known, good air quality
data be considered in such
determinations. (See preamble
discussion at 62 FR 38770, July 18, 1997
and in existing 40 CFR 58.14(b).) That
narrow exception addressed only new
SPM for PM2.5 concentrations. It
provides that PM2.5 NAAQS violation
determinations shall not be exclusively
made based on data produced at a
population-oriented SPM site during the
first two complete years of its operation,
but only if monitoring is not continued
beyond those 2 years. More recently,
during the development of the draft
NAAMS and today’s proposal, EPA has
received input from various parties,
including the Clean Air Act Advisory
Committee, to the effect that EPA
‘‘should promote policies to avoid
disincentives to monitoring’’ by limiting
not subject to the quality system requirements of 40
CFR part 58 that apply to State and local air
monitoring stations (SLAMS), and are not subject to
siting requirements such as probe height or distance
from nearby obstructions (or, in this proposal, the
proposed siting suitability requirements for
monitors which can be used for comparison with
the proposed 24-hour PM10-2.5 standard). Their data
are not required to be submitted to AQS, and they
may be discontinued at will by the State (assuming
no grant commitment exists for their continued
operation). States start up and designate monitors
as special purpose as a flexible and economical way
to meet various local monitoring objectives, such as
exploring a possible air quality problem in response
to citizen concerns.
the regulatory use of data from such
monitoring.73 A moratorium on any use
of data from the first 3 years after the
deployment of a discretionary monitor,
applicable to all NAAQS pollutants, was
a specific approach discussed in some
of our consultations with State and local
monitoring officials during the
development of this proposal. Such a
moratorium would give States time to
address the air quality problem with
more flexibility than it would have if
the area were designated nonattainment
and subject to CAA requirements for
nonattainment areas.
We understand and, to some degree,
sympathize with the States’ perception
that the current requirements create
disincentives to monitoring. We agree
that it is conceivable, and perhaps
likely, that it might ultimately be more
protective of public health to have more
monitoring data in hand even if the
early years of data from each additional,
discretionary monitor could not be used
for regulatory purposes, compared to
never having that data at all. However,
we believe we may not ignore
technically valid air quality data from
FRM and FEM monitors when making
attainment or nonattainment
determinations. If we know that an area
is actually not meeting an NAAQS
based on valid data, we cannot ignore
those data. This is premised on the
provisions of the CAA that the Agency
must follow in determining whether an
area is attainment or nonattainment.
Section 107(d)(1)(A)(i) of the CAA
defines ‘‘nonattainment’’ as ‘‘any area
that does not meet’’ an NAAQS and
CAA section 107(d)(1)(A)(ii) defines
‘‘attainment’’ as any area ‘‘that meets’’
an NAAQS. In light of this explicit
language, EPA does not believe we
could affirmatively determine an area to
be an attainment area for a particular
criteria pollutant (i.e., an area ‘‘that
attains’’ the NAAQS) if we had the
requisite years of valid data from
appropriately sited FRM or FEM
monitors showing that the area was in
fact not attaining the standard.
In light of this legal requirement, we
believe that two limited exclusions on
use of data from SPM are possible. We
are proposing that: (1) The limited two-year moratorium on the use of data from
SPM in determinations of NAAQS
violations established in the 1997
rulemaking for PM2.5 be extended to the
annual PM10 NAAQS (if it is retained
rather than revoked as proposed
73 See recommendation 1.4 in Recommendations
to the Clean Air Act Advisory Committee (CAAAC),
Air Quality Management Workgroup, January 2005,
transmitted by the CAAAC as a Committee
recommendation to Administrator Michael O.
Leavitt on January 19, 2005.
elsewhere in today’s Federal Register),
the O3 NAAQS, and the proposed 24-hour PM10-2.5 NAAQS, rather than any
more extensive data exclusion
approach; and (2) for CO, SO2, NO2, Pb,
and 24-hour PM10, that data from the
first 2 years of a SPM would not be used
for nonattainment designations but
would be used in making findings of
whether a nonattainment area has
attained the NAAQS. In both cases, data
from the first 2 years of operation of a
new SPM would not be used provided
the monitor does not continue operation
beyond those 2 years. If the monitor
does continue operation beyond 2 years,
all years of data will be given full
consideration. This policy would in
some situations facilitate special
purpose monitoring that would
otherwise be discouraged by the risk of
a nonattainment finding, but we
acknowledge that these situations will
be limited.
This proposed approach would have
no practical effect for those NAAQS for
which three consecutive years of data
are always required before a
determination of attainment/
nonattainment can be made, i.e., the 24-hour and annual PM2.5 NAAQS, the
annual PM10 NAAQS, the proposed
PM10-2.5 NAAQS, and the O3 NAAQS.
For these NAAQS, the proposed rule
provision would make it clear that there
is no risk of a nonattainment outcome
based on a two-year period of SPM
operation.
The CO, SO2, NO2, 24-hour PM10, and
Pb NAAQS present a different issue,
because under the form of these NAAQS
a single year of data can be sufficient to
make a finding of nonattainment. We
note that until such time as we revise
one of these NAAQS, we are under no
mandatory duty to redesignate an area
from attainment or unclassifiable to
nonattainment, so it is within our
discretion to simply not take such an
action if the critical data indicating
nonattainment is from the first 2 years
of an SPM.
However, if we are requested by a
State to redesignate a nonattainment
area to attainment, we do have a
mandatory duty to act on that request.
Consequently, we cannot overlook some
SPM data that is contrary to the
redesignation request by simply not
taking an action. We must respond to a
request for redesignation from
nonattainment to attainment, and if
there are valid data indicating that
nonattainment still exists we could not
approve the redesignation request.
Therefore, we can use the fact that
future designation of any new CO, SO2,
NO2, 24-hour PM10, or Pb nonattainment
areas is discretionary to protect States
from use of 2 years of data from a new
SPM for one of these pollutants
resulting in a nonattainment
designation, but we cannot protect an
area from use of such data in a finding
on whether an already designated
nonattainment area has subsequently
attained the relevant NAAQS.
Consequently, the proposed two-year
data moratorium should remove the
disincentive to place new monitors in
attainment areas for CO, SO2, NO2, 24-hour PM10, or Pb, but may leave in place
disincentives to add monitors in
nonattainment areas that may appear to
have reached attainment or be
approaching attainment.
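The proposed data-use rules in the preceding paragraphs can be summarized schematically. A minimal sketch, assuming illustrative function and category names (the pollutant groupings and the 2-year window come from the preamble; nothing here is regulatory text):

```python
# NAAQS for which three consecutive years of data are always required, so
# the proposed two-year moratorium carries no nonattainment risk at all.
THREE_YEAR_NAAQS = {"PM2.5-24hr", "PM2.5-annual", "PM10-annual",
                    "PM10-2.5", "O3"}

# NAAQS for which a single year of data can show nonattainment. The first
# 2 years of SPM data would not be used for new nonattainment designations,
# but would be used when judging whether an existing nonattainment area
# has since attained.
SHORT_FORM_NAAQS = {"CO", "SO2", "NO2", "Pb", "PM10-24hr"}

def spm_data_usable(naaqs, years_operated, purpose):
    """Return True if data from a special purpose monitor may be used.

    purpose: "designation" (attainment/nonattainment designation) or
             "redesignation_finding" (has a nonattainment area attained?).
    """
    if years_operated > 2:
        # Beyond 2 years of operation, all data get full consideration.
        return True
    if naaqs in THREE_YEAR_NAAQS:
        return False  # moratorium applies to all determinations
    if naaqs in SHORT_FORM_NAAQS:
        # New designations are discretionary and would be withheld, but EPA
        # must still act on a State's redesignation request using all
        # valid data.
        return purpose == "redesignation_finding"
    raise ValueError(f"unknown NAAQS category: {naaqs}")
```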
Despite the limited nature of the
proposed moratorium, States and other
organizations would still be able to
perform many useful types of
discretionary monitoring without fear of
triggering a near-term nonattainment
designation. In the case of PM2.5, PM10,
and the proposed PM10-2.5 NAAQS,
many of the most useful types of
monitors for purposes of understanding
the causes and possible solutions to a
nonattainment problem are not FRM,
FEM, or ARM monitors, and therefore
these monitors can be deployed for two
or even more years without any concern
about use of the data in nonattainment
designations. This includes a number of
filter-based sampler models including
the samplers used in the IMPROVE
program, all types of speciation
samplers for PM2.5, PM10, and the
proposed PM10-2.5, and all existing
continuous monitors for PM2.5. There
are also non-FRM/FEM for some of the
other NAAQS that currently can be
deployed indefinitely to characterize air
quality problems better without fear of
nonattainment designation
consequences (e.g., passive monitors).
Another situation in which the
limited nature of the proposed two-year
moratorium would have no practical
disincentive effect is when the siting of
a monitor precludes comparison to the
applicable NAAQS, even though it is an
FRM or FEM monitor that meets quality
system requirements. It could, for
example, be placed in a location that
is not ambient air and does not
represent ambient air. It could also be
placed inconsistently with siting criteria
found in the rules which specify when
monitoring data can be used for
comparison with the NAAQS. See
existing 40 CFR part 58, appendix D,
section 2.8.1.2.3 and the suitability
criteria proposed for the PM10-2.5
monitoring network discussed in
section IV.E.2 of this preamble.
The limited nature of the moratorium
would have a disincentive effect on
discretionary monitoring relative to a
hypothetically more encompassing
moratorium. For example, a State could
still be discouraged from operating an
O3 or PM2.5 monitor beyond 2 years, and
thus may miss becoming aware of an
actual public health problem. Therefore,
we invite comment on the Agency’s
legal interpretation, which has shaped
today’s proposal for the described
limited moratorium, and on what
provisions for SPM data we should
adopt if EPA was to change the legal
interpretation in light of public
comments. In particular, we invite
comments on an approach in which the
first 3 years of data from any SPM
would be permanently protected from
use in nonattainment determinations
regardless of whether it operates beyond
3 years, but any monitor showing a
violation in the first 3 years would be
required to continue operation unless its
discontinuation is approved as part of
EPA’s review of the State’s annual
monitoring plan. This approach would
result in the State having some time to
address the NAAQS violation before
three usable years of data became
available to make an official
nonattainment/attainment
determination from the fourth through
sixth year of operation.
Special purpose monitors are
presently not subject to the quality
system requirements of 40 CFR part 58.
With respect to data quality, EPA
wishes to encourage all State and local
monitoring agencies to adhere to the
quality system requirements of 40 CFR
part 58 for all FRM, FEM, and ARM
monitors (the monitor types to which
such requirements are applicable).
Substandard quality system practices
should not be deliberately used as a way
to prevent EPA from using data from an
SPM beyond the protection offered by
the proposed two-year moratorium.
However, under the current monitoring
rules, States may do so and some have
done so. Accordingly, EPA proposes to
amend 40 CFR part 58 to require that all
FRM, FEM, and ARM monitors operated
by States (or delegated local agencies)
comply with the quality system
requirement in 40 CFR part 58 relevant
to the monitor type(s) being used. We
propose that this requirement take effect
2 years after the date of publication of
the final rule, to provide States time to
prepare to meet the requirement and to
choose transition dates that fit with
other network plans. We also invite
comment on the alternative of using
grant agreements to attempt to achieve
quality system objectives for SPM
instead of including a specific
requirement in the proposed
amendments.
We also propose that States be
required to submit to the Air Quality
System (AQS) all data collected by all
FRM, FEM, and ARM special purpose
monitors, starting no later than 2 years
after the date of publication of the final
amendments. In the past, when SPM
were not required to follow quality
system requirements, the uncertain data
quality from such monitors was a reason
to allow States discretion regarding
submission of data to AQS. With the
proposed requirement that FRM, FEM,
and ARM special purpose monitors
follow quality system requirements,
there is no rationale for their data not
being submitted to AQS to provide
transparency in the air quality
management process.
We propose to retain and clarify that
a State may discontinue use of an SPM
at any time, without need for EPA
approval. However, we encourage States
to continue the use of monitors that
have gone beyond the two-year point of
operation if they have recorded a
violation of a NAAQS. Otherwise, EPA
may designate the area as nonattainment
and the State would lack clear evidence
to show subsequent attainment.
10. Flexibility and Resources for Non-Required Monitoring
The EPA wishes to clarify that while
40 CFR part 58, including the proposed
amendments, contains a number of
minimum requirements for States to
operate ambient monitors, ensure data
quality, and report data, these
requirements are not a complete
blueprint for the monitoring networks
that we believe should and we hope will
be operated by State and local agencies.
Many specific features of minimum
requirements for these networks, such
as selection of specific monitoring sites
for PM10-2.5, are left to be made later at
the State level with EPA Regional Office
approval, so that the best information
and local insights can be applied to
deciding those features. Also, not every
type of monitoring that is needed can be
required through the provisions of 40
CFR part 58 in this rulemaking because,
in some cases, the specific State that
should be responsible for a monitoring
activity cannot be identified with
confidence at this time. For example,
the proposed amendments to 40 CFR
part 58 do not require any State to
operate a rural NCore multipollutant monitoring station, even though
we estimate that the Nation needs about
20 such sites, because it would be
premature and too rigid at this time to
select those sites. Instead, we will work
with States as they determine the
location of their required urban NCore
multipollutant site or sites, and we will
most likely negotiate for the voluntary
operation of some rural sites as well.
The provisions of 40 CFR part 58 can
and should only require the number and
types of monitoring activities that will
surely be needed in any State over a
reasonably long time period, to avoid
the need for frequent amendments to
allow States to stop the use of obsolete
monitors. However, aggregation of
hypothetical State networks that just
met the minimum requirements of 40
CFR part 58, including the proposed
amendments, would be inadequate to
meet the needs of air quality
management at the State and national
levels. We will negotiate with States for
monitoring activities that go beyond the
minimum requirements of 40 CFR part
58 using the draft National Ambient Air
Monitoring Strategy as a starting point
for those negotiations. The EPA will
generally provide at least partial
funding for such additional monitoring
through grants, sometimes very
specifically and sometimes through more
general air quality management support
grants. Where current monitoring
activities by a State exceed the final
minimum requirements in 40 CFR part
58, EPA may need to negotiate
reductions in its funding for those
activities if the data they produce are
not sufficiently valuable to the air
quality management process.
In particular, we anticipate that we
will be negotiating with States in the
next several years the specifics of the
following directional changes in their
networks:
• Creation and operation of rural
NCore multipollutant stations. We
expect that some of the need for rural
monitoring data can be met by required
stations that some states choose to place
in suitable rural areas and/or by
planned federally-operated rural
monitoring stations. We will identify
the remaining needed sites and recruit
and fund specific States to establish and
operate them.
• Creation and operation of more
PM10-2.5 speciation sites than the
minimum required in the proposed
amendments.
• Creation and operation of rural
PM10-2.5 mass concentration sites. In
addition to the urban PM10-2.5 sites
required by this proposal, having some
PM10-2.5 mass concentration sites in
rural areas may be useful to provide
ambient data to compare with the higher
coarse particle concentrations that are
typically found in urban locations.
Since these rural sites would typically
be located outside of any MSA and
would be characterized by lower
population densities than in
metropolitan areas, most would likely
not be appropriate for NAAQS
comparisons. We may work with
selected States to establish such rural
sites, taking into account existing siting
opportunities such as the CASTNET and
IMPROVE networks, and we solicit
comment on the need for and siting
strategy for such rural monitors. We
note that monitoring sites in rural areas
may be useful in future health effects
research.
• Reduction in the number of PM2.5
filter-based monitors and replacement of
some such monitors with continuous
instruments.
• Reduction in the number of CO,
SO2, NO2, PM10, and Pb monitoring
sites.
• Changes in the number and/or
locations of PM2.5 speciation monitoring
sites. The EPA and the States have been
assessing these sites in the last year or
so, and some changes are underway. A
new factor to consider will be the
speciation data needs of areas that may
now be attaining the current PM2.5
NAAQS but appear likely to be
nonattainment with the proposed
NAAQS.
• Changes in PAMS networks. The
proposed minimum requirements for
PAMS monitoring would mean that
many current State networks exceed
minimum requirements, providing the
opportunity for reassessment and
redesign to better meet local conditions
and data needs.
• Other changes that would result in
networks that better meet State data
goals, which can be so individualistic
that they cannot be given consideration
in a rulemaking such as this, or even in
a nonbinding national strategy.
11. Proposed Requirements for Network
Assessments
In addition to annual network
reviews, EPA proposes to require
periodic and detailed network
assessments as a way to maintain
relevancy of ambient air monitoring to
emerging air program needs and
scientific findings. The EPA proposes
that State and local agencies conduct a
technical network assessment every 5
years to consider whether stations
should be removed or added, or whether
new program elements should be
adopted to account for changes in air
quality, population growth, emission
sources, and other parameters. The first
assessments would be due July 1, 2009.
These assessments would also evaluate
the adequacy of existing technologies
deployed in the network compared to
commercially available methods that
could potentially be deployed to
improve the network. Network
assessments are intended to probe the
current and expected relevancy of air
monitoring networks through a
combination of stakeholder
participation and technical analyses.
This would be accomplished, in part, by
periodically questioning the overall
usefulness of the existing sites and
identifying locations where additional
monitoring may be necessary. Typical
topics addressed in network
assessments would include reviewing
data objectives and data quality,
prioritizing measurement needs,
identifying redundant monitoring, and
identifying specific gaps in location and
measurement parameters. The EPA
anticipates developing non-binding
guidance on how to conduct these
proposed network assessments. We
solicit comment on the proposed
requirements and schedule for network
assessments.
12. Related Federal Monitoring
The EPA directly conducts or supports three ambient monitoring programs that are related to, but separate from, the State, local, and tribal monitoring programs that are the subject of today’s proposal. These are the CASTNET, NADP, and IMPROVE programs, described in
section III.B.3 of this preamble. Today’s
proposals do not apply to these
programs, but the following brief
description of these programs may assist
the public in commenting on today’s
proposal.
The EPA plans to upgrade the
monitoring capabilities of many of the
CASTNET sites in the next couple of
years in ways that would allow them to
meet the same multipollutant
monitoring objectives as the proposed
State-operated rural NCore stations. As
these plans become more developed,
EPA expects to adjust its targets for the
number of rural NCore stations that are
voluntarily operated by States under
grant agreements with EPA.
The EPA is exploring with the
National Atmospheric Deposition
Network (NADP) sponsors the
possibility of expanding NADP’s
objectives and monitoring infrastructure
to investigate measurement of spatial
monitoring concentrations, from which
dry deposition could be estimated. Also,
NADP stations potentially provide
efficient opportunities to site ambient
air monitors for other purposes.
At present, the IMPROVE program
employs different sampling hardware
and laboratory analytical procedures to
measure speciated PM2.5 compared to
most PM2.5 speciation monitoring in
urban areas. The EPA is working to
achieve more consistency between the
two programs, so that monitoring results
at the two types of stations are more
directly comparable. We are also
reviewing the current IMPROVE site list
to determine which are of higher versus
lower priority for long-term
continuation.
F. What Are the Proposed Probe and
Monitoring Path Siting Criteria?
The EPA is proposing minor
organizational changes to 40 CFR 58,
appendix E (Probe and Monitoring Path
Siting Criteria for Ambient Air Quality
Monitoring). The EPA also is proposing
specific criteria for the placement of
PM10-2.5 samplers. Current vertical
placement requirements permit
microscale PM10 and PM2.5 monitors to
be located 2 to 7 meters above ground
level to allow for security, instrument
servicing, and operator safety, as well as
sampling particulate matter at the
breathing height. The EPA is proposing
that the same 2- to 7-meter vertical
placement requirements apply to
microscale PM10-2.5 sites.74 The EPA is
also proposing that the 2- to 7-meter
vertical placement requirement apply to
middle-scale PM10-2.5 sites, which
differs from the existing PM2.5 vertical
placement requirement permitting
middle-scale sites to have samplers
placed 2 to 15 meters above ground. We
recognize that significant PM10-2.5
vertical concentration gradients may
exist due to re-entrainment of coarse
particles from the surfaces that typically
surround monitoring sites, such as
adjacent streets, parking lots, and
landscaped surfaces, and such vertical
gradients may introduce additional
complexities in the comparison of data
from samplers at widely varying
heights. The EPA seeks to reduce this
variability by restricting the vertical
placement of PM10-2.5 samplers at
middle-scale sites to the 2 to 7 meter
requirement while recognizing that
PM10-2.5 monitors that would have been
at a higher level (e.g., 15 meters above
ground) would have likely measured
lower ambient concentrations. The EPA
proposes that PM10-2.5 sites with
neighborhood, urban, and regional
scales have identical horizontal and
vertical requirements with PM2.5 sites in
consideration of the lesser gradients of
coarse particle ambient concentrations
likely with sites representing larger,
more homogeneous conditions. The
EPA acknowledges the logistical
complexity of having different vertical
placement requirements for middle-scale PM10-2.5 and PM2.5 sites, and
74 The proposed network design criteria for
PM10-2.5 would consider such data to be ineligible
for comparison to the NAAQS (see preamble section
IV.E.2.B.ii).
solicits comment on all aspects of
PM10-2.5 probe siting criteria.
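The vertical placement ranges discussed above can be laid out as a small lookup table. A minimal sketch, assuming hypothetical names (the height ranges come from the preamble; the table layout and helper function are illustrative only):

```python
# (pollutant, spatial scale) -> (min height, max height), meters above ground
VERTICAL_RANGES_M = {
    ("PM10",     "microscale"): (2, 7),
    ("PM2.5",    "microscale"): (2, 7),
    ("PM10-2.5", "microscale"): (2, 7),   # proposed
    ("PM2.5",    "middle"):     (2, 15),  # existing PM2.5 requirement
    ("PM10-2.5", "middle"):     (2, 7),   # proposed, narrower than PM2.5
}

def placement_ok(pollutant, scale, height_m):
    """Check a sampler inlet height against the applicable vertical range."""
    lo, hi = VERTICAL_RANGES_M[(pollutant, scale)]
    return lo <= height_m <= hi
```

The asymmetry the preamble acknowledges is visible in the table: a collocated middle-scale pair could place its PM2.5 sampler at 12 meters but not its PM10-2.5 sampler.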
Motor vehicle nitric oxide emissions
are known to scavenge ozone, and EPA
recognizes the difficulty that monitoring
agencies face when trying to locate
ozone air monitors in areas with
multiple roadways and streets. Based
upon concern about the scavenging
effects of motor vehicle emissions on
ozone, EPA proposes to increase the
minimum distances between ozone
monitors and roadways in certain cases.
Recent field studies have shown
significant effects of roadway emissions
at the distances currently listed in 40
CFR part 58, appendix E. Summary
information on this work is included in
the docket for this proposal. The EPA
solicits comments on these proposed
minimum distance requirements.
G. What Are the Proposed Data
Reporting, Data Certification, and
Sample Retention Requirements?
1. Reduction of PM2.5 Supplemental
Data Reporting Requirements
The EPA is proposing to reduce the
data reporting requirements associated
with PM2.5 Federal Reference Methods
(FRM) to reduce the data management
burden for monitoring agencies. The
following Air Quality System (AQS)
reporting requirements are proposed for
elimination: Maximum and minimum
ambient temperature, maximum and
minimum ambient pressure, flow rate
coefficient of variation (CV), total
sample volume, and elapsed sample
time. AQS reporting requirements are
being retained for average ambient
temperature and average ambient
pressure, and any applicable sampler
flags.
Supplemental monitoring parameters
were required to be reported to AQS
along with FRM mass concentration
data to evaluate the performance of the
FRM as implemented through the newly
developed sampler hardware that was
purchased by EPA for State and local
agencies at the beginning of the PM2.5
monitoring program. Since that time,
these supplemental data, along with
statistical analyses conducted on data
from collocated sampling and
independent Performance Evaluation
Program (PEP) audits, have confirmed
that the PM2.5 FRM samplers are
producing data that meet or exceed the
data quality objectives developed for the
method. As a result, the AQS reporting
requirement for many of the
supplemental data parameters can be
discontinued with no adverse effect on
PM2.5 data quality. Monitoring agencies
would still be expected to retain
supplemental data as required by their
approved Quality Assurance Program
Plans (QAPP).
AQS reporting requirements for
average ambient temperature and
average ambient pressure are being
retained to provide data useful for the
comparison of mass concentrations
based on actual and standard operating
conditions.
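The proposed split of supplemental parameters into eliminated and retained sets can be expressed directly. A minimal sketch, assuming illustrative parameter keys (the parameter list follows the preamble text; the set-and-filter structure is not part of the proposal):

```python
# Parameters proposed for elimination from AQS reporting.
ELIMINATED = {
    "max_ambient_temperature", "min_ambient_temperature",
    "max_ambient_pressure", "min_ambient_pressure",
    "flow_rate_cv", "total_sample_volume", "elapsed_sample_time",
}

# Parameters retained: averages support actual-vs-standard-conditions
# comparisons, and sampler flags document data quality.
RETAINED = {"avg_ambient_temperature", "avg_ambient_pressure",
            "sampler_flags"}

def fields_to_report(record):
    """Keep only the supplemental parameters still required for AQS."""
    return {k: v for k, v in record.items() if k in RETAINED}
```

Parameters in neither set (the mass concentration itself, for example) would be handled by their own reporting requirements; this sketch covers only the supplemental fields named above.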
EPA is also proposing amendments to
40 CFR 58.16 (Data submittal) to add the
remaining PM2.5 supplemental data
reporting requirements, which presently
are only found in the FRM requirements
(Table L–1 of appendix L of part 50).
This change will ensure that
supplemental data are reported for
future PM2.5 samplers designated as a
Class I or Class II Federal equivalent
method under the proposed
amendments to 40 CFR part 53.
2. PM2.5 Field Blank Data Reporting
Requirement
We are proposing amendments to 40
CFR part 58.16 to require the
submission of data on PM2.5 field blank
mass in addition to PM2.5 filter-based
measurements. Field blanks are filters
that are handled in the field as much as
possible like actual sample filters,
except that ambient air is not pumped
through them; they help quantify
contamination and sampling artifacts.
Under this proposal, only data from the
field blanks that States already take into
the field and weigh in their laboratories
would be required to be
reported. Quantifying field blank mass
is important in order to complete the
material balance of the major
components of sampled PM2.5. In
addition, fluctuations of the field blank
value are a useful quality control metric
which can be used to help evaluate the
performance of filter-based samplers
and the quality of the sampled PM2.5
values. However, there is currently
limited information available to EPA
and other users of ambient air quality
data on the magnitude and trends in the
blank concentrations from PM2.5 Federal
reference method (FRM) samplers.
These data are produced by State and
local air pollution agencies on a regular
basis throughout the year, but the data
are not currently submitted to EPA.
Having the data from these field blanks
available to the national monitoring
community would help EPA and other
researchers better understand the
relationship between the mass of PM
that is sampled and weighed on a
regular PM filter and the PM that is
actually present in ambient air. The EPA
solicits comment on this additional
PM2.5 reporting requirement.
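As a sketch of how reported field blank masses could serve as a quality control metric, blank values might be screened against warning and action limits. The function and limit values below are hypothetical illustrations, not requirements of this proposal.

```python
# Illustrative sketch (not part of the proposed rule): screening field
# blank masses, in micrograms, against hypothetical QC control limits.

def blank_control_check(blank_masses_ug, warning_ug=15.0, action_ug=30.0):
    """Classify each field blank mass against warning/action limits."""
    flags = []
    for mass in blank_masses_ug:
        if mass > action_ug:
            flags.append("action")    # likely contamination; investigate
        elif mass > warning_ug:
            flags.append("warning")   # elevated blank; watch for a trend
        else:
            flags.append("ok")
    return flags

blanks = [4.2, 11.8, 31.5, 7.0]
print(blank_control_check(blanks))  # ['ok', 'ok', 'action', 'ok']
```

Reporting blank data to AQS would let EPA and other researchers apply screening of this kind nationally rather than agency by agency.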
3. Data Certification Schedule
To allow more timely reporting to the
public and more timely regulatory
findings and actions based on each
year’s air quality data, EPA proposes to
move the annual data certification date
from July 1 to May 1 of each year. We
believe the earlier date can be met
through more expeditious
administrative clearance
processes with State/local agencies and
will not require significant changes in
monitoring practices or equipment. The
EPA solicits comments on this proposed
change to the certification schedule. The
EPA solicits comments identifying
possible barriers to meeting the
proposed certification date and
information on how agencies that
presently certify their data ahead of the
current schedule accomplish this.
4. Particulate Matter Filter Archive
During the regulatory development
process, various governmental agencies
and health scientists indicated that
archiving particulate matter filters for
FRM and Federal equivalent methods
would be useful for later chemical
speciation analyses, mass analyses, or
other analyses. Therefore, we propose to
require archiving PM2.5, PM10-2.5, and
PM10C filters for one year (the current
requirement is only for PM2.5 filters).
The EPA solicits comment on this
proposed requirement, specifically from
those agencies or scientists interested in
using these filters.
V. Statutory and Executive Order
Reviews
A. Executive Order 12866: Regulatory
Planning and Review
Under Executive Order 12866 (58 FR
51735, October 4, 1993), EPA must
determine whether the regulatory action
is ‘‘significant’’ and therefore subject to
review by the Office of Management and
Budget (OMB) and to the requirements
of the Executive Order. The Executive
Order defines a ‘‘significant regulatory
action’’ as one that is likely to result in
a rule that may:
(1) Have an annual effect on the
economy of $100 million or more or
adversely affect in a material way the
economy, a sector of the economy,
productivity, competition, jobs, the
environment, public health or safety, or
State, local, or tribal governments or
communities;
(2) Create a serious inconsistency or
otherwise interfere with an action taken
or planned by another agency;
(3) Materially alter the budgetary
impact of entitlements, grants, user fees,
or loan programs or the rights and
obligations of recipients thereof; or
(4) Raise novel legal or policy issues
arising out of legal mandates, the
President’s priorities, or the principles
set forth in the Executive Order.
Pursuant to the terms of Executive
Order 12866, OMB has notified EPA
that it considers this a ‘‘significant
regulatory action’’ within the meaning
of the Executive Order. EPA has
submitted this action to OMB for
review. Changes made in response to
OMB suggestions or recommendations
will be documented in the public
record.
B. Paperwork Reduction Act
The information collection
requirements in the proposed rule have
been submitted for approval to the
Office of Management and Budget
(OMB) under the Paperwork Reduction
Act, 44 U.S.C. 3501 et seq. The
Information Collection Request (ICR)
documents prepared by EPA have been
assigned EPA ICR No. 0559.09 (2080–
0005) for 40 CFR part 53 and 0940.19
(2060–0084) for 40 CFR part 58. The
provisions in 40 CFR parts 53 and 58
have been previously approved by OMB
under control numbers 2080–0005 (EPA
ICR number 0559.07) and 2060–0084
(EPA ICR number 0940.17),
respectively.
The monitoring, record keeping, and
reporting requirements in 40 CFR parts
53 and 58 are specifically authorized by
section 319 of the CAA (42 U.S.C. 7619).
All information submitted to EPA
pursuant to the monitoring, record
keeping, and reporting requirements for
which a claim of confidentiality is made
is safeguarded according to Agency
policies in 40 CFR part 2, subpart B.
The information collected under 40
CFR part 53 (e.g., test results,
monitoring records, instruction manual,
and other associated information) is
needed to determine whether a
candidate method intended for use in
determining attainment of the National
Ambient Air Quality Standards
(NAAQS) in 40 CFR part 50 will meet
the design, performance, and/or
comparability requirements for
designation as a Federal reference
method (FRM) or Federal equivalent
method (FEM). The proposed
amendments would add requirements
for PM10-2.5 FEM and FRM
determinations, Class II equivalent
methods for PM10-2.5 and Class III
equivalent methods for PM2.5 and
PM10-2.5; reduce certain monitoring and
data collection requirements; and
streamline EPA administrative
requirements.
The incremental annual reporting and
record keeping burden for this
collection of information under 40 CFR
part 53 (averaged over the first 3 years
of this ICR) for one additional
respondent per year is estimated to
increase by a total of 2,774 labor hours
per year with an increase in costs of
$32,000/year. The capital/startup costs
for test equipment and qualifying tests
are estimated at $3,832 with operation
and maintenance costs of $27,772.
The information collected and
reported under 40 CFR part 58 is needed
to determine compliance with the
NAAQS, to characterize air quality and
associated health and ecosystems
impacts, to develop emission control
strategies, and to measure progress for
the air pollution program. The proposed
amendments would revise the technical
requirements for certain types of sites,
add provisions for monitoring of
PM10-2.5, and reduce certain monitoring
requirements for criteria pollutants
other than particulate matter and ozone.
Monitoring agencies would be required
to submit annual monitoring network
plans, establish PM2.5 sites by January 1,
2009, establish NCore sites by January 1,
2011, conduct network assessments
every 5 years, and perform quality
assurance activities.
The annual average reporting burden
for the collection under 40 CFR part 58
(averaged over the first 3 years of this
ICR) for 168 respondents is estimated to
decrease by a total of 336,650 labor
hours per year with a decrease in costs
of $31,600,362. State, local, and tribal
entities are eligible for State assistance
grants provided by the Federal
government under the CAA for monitors
and related activities.
Burden means the total time, effort, or
financial resources expended by persons
to generate, maintain, retain, or disclose
or provide information to or for a
Federal agency. This includes the time
needed to review instructions; develop,
acquire, install, and utilize technology
and systems for the purposes of
collecting, validating, and verifying
information, processing and
maintaining information, and disclosing
and providing information; adjust the
existing ways to comply with any
previously applicable instructions and
requirements; train personnel to be able
to respond to a collection of
information; search data sources;
complete and review the collection of
information; and transmit or otherwise
disclose the information.
An agency may not conduct or
sponsor, and a person is not required to
respond to, a collection of information
unless it displays a currently valid OMB
control number. The OMB control
numbers for EPA’s regulations in 40
CFR parts 53 and 58 are listed in 40 CFR
part 9.
To comment on the Agency’s need for
the information, the accuracy of the
provided burden estimates, and any
suggested methods for minimizing
respondent burden, including the use of
automated collection techniques, EPA
has established a public docket for the
proposed amendments, which includes
the ICR for 40 CFR part 58, under
Docket ID number EPA–HQ–OAR–
2004–0018. Submit any comments
related to the ICR for the proposed
amendments to 40 CFR part 58 to EPA
and OMB. See the ADDRESSES section at
the beginning of this notice for where to
submit comments to EPA. Send
comments to OMB at the Office of
Information and Regulatory Affairs,
Office of Management and Budget, 725
17th Street, NW., Washington, DC
20503, Attention: Desk Office for EPA.
Since OMB is required to make a
decision concerning the ICR between 30
and 60 days after January 17, 2006, a
comment to OMB is best assured of
having its full effect if OMB receives it
by February 16, 2006. The final
amendments will respond to any OMB
or public comments on the information
collection requirements for 40 CFR part
58 contained in this proposal.
C. Regulatory Flexibility Act
The Regulatory Flexibility Act
generally requires an agency to prepare
a regulatory flexibility analysis of any
rule subject to notice and comment
rulemaking requirements under the
Administrative Procedure Act or any
other statute unless the agency certifies
that the rule will not have a significant
economic impact on a substantial
number of small entities. Small entities
include small businesses, small
not-for-profit enterprises, and small
governmental jurisdictions.
For the purposes of assessing the
impacts of today’s proposed
amendments on small entities, small
entity is defined as: (1) A small business
as defined by the Small Business
Administration; (2) a government
jurisdiction that is a government of a
city, county, town, school district or
special district with a population of less
than 50,000; and (3) a small
organization that is any not-for-profit
enterprise which is independently
owned and operated and that is not
dominant in its field.
After considering the economic
impacts of today’s proposed
amendments on small entities, I certify
that this action will not have a
significant economic impact on a
substantial number of small entities.
The proposed requirements in 40 CFR
part 53 for applications for designation
of equivalent methods do not address
small entities. The requirement to apply
is voluntary, and the criteria for
approval are the minimum necessary to
ensure that alternative methods meet
the same technical standards as the
proposed federal method. The proposed
amendments to 40 CFR part 58 would
reduce annual ambient air monitoring
costs for State and local agencies by
approximately $8.5 million and 40,000
labor hours from present levels. State
assistance grant funding provided by the
federal government can be used to
defray the costs of new or upgraded
monitors for the NCore and PM10-2.5
networks. We continue to be interested
in the potential impacts of the proposed
amendments on small entities and
welcome comments on issues related to
such impacts.
D. Unfunded Mandates Reform Act
Title II of the Unfunded Mandates
Reform Act of 1995 (UMRA), Public
Law 104–4, establishes requirements for
Federal agencies to assess the effects of
their regulatory actions on State, local,
and Tribal governments and the private
sector. Under section 202 of the UMRA,
EPA generally must prepare a written
statement, including a cost-benefit
analysis, for proposed and final rules
with ‘‘Federal mandates’’ that may
result in expenditures to State, local,
and Tribal governments, in the
aggregate, or to the private sector, of
$100 million or more in any one year.
Before promulgating an EPA rule for
which a written statement is needed,
section 205 of the UMRA generally
requires EPA to identify and consider a
reasonable number of regulatory
alternatives and adopt the least costly,
most cost-effective or least burdensome
alternative that achieves the objectives
of the rule. The provisions of section
205 do not apply when they are
inconsistent with applicable law.
Moreover, section 205 allows EPA to
adopt an alternative other than the least
costly, most cost-effective or least
burdensome alternative if the
Administrator publishes with the final
rule an explanation why that alternative
was not adopted. Before EPA establishes
any regulatory requirements that may
significantly or uniquely affect small
governments, including Tribal
governments, it must have developed
under section 203 of the UMRA a small
government agency plan. The plan must
provide for notifying potentially
affected small governments, enabling
officials of affected small governments
to have meaningful and timely input in
the development of EPA regulatory
proposals with significant Federal
intergovernmental mandates, and
informing, educating, and advising
small governments on compliance with
the regulatory requirements.
EPA has determined that the
proposed rule does not contain a
Federal mandate that may result in
expenditures of $100 million or more
for State, local, and Tribal governments,
in the aggregate, or the private sector in
any one year. The proposed
amendments to 40 CFR part 58 would
reduce annual ambient air monitoring
costs for State and local agencies by
approximately $8.5 million and 40,000
labor hours from present levels. The
costs for reconfiguring the existing
ambient air monitoring requirements to
implement the NCore network would be
borne by the Federal government in the
form of State assistance grants. Thus,
the proposed amendments are not
subject to the requirements of sections
202 and 205 of the UMRA.
EPA has determined that the
proposed rule contains no regulatory
requirements that might significantly or
uniquely affect small governments.
Small governments that may be affected
by the proposed amendments are
already meeting similar requirements
under the existing rules, the proposed
amendments would substantially reduce
the costs of the existing rules, and the
costs of changing the network design
requirements would be borne by the
Federal government through State
assistance grants. Therefore, the
proposed rule is not subject to the
requirements of section 203 of the
UMRA.
E. Executive Order 13132: Federalism
Executive Order 13132 (64 FR 43255,
August 10, 1999), requires EPA to
develop an accountable process to
ensure ‘‘meaningful and timely input by
State and local officials in the
development of regulatory policies that
have federalism implications.’’ ‘‘Policies
that have federalism implications’’ is
defined in the Executive Order to
include regulations that have
‘‘substantial direct effects on the States,
on the relationship between the national
government and the States, or on the
distribution of power and
responsibilities among the various
levels of government.’’
This proposed rule does not have
federalism implications. It will not have
substantial direct effects on the States,
on the relationship between the national
government and the States, or on the
distribution of power and
responsibilities among the various
levels of government, as specified in
Executive Order 13132. States currently
implement similar ambient air
monitoring requirements under 40 CFR
parts 53 and 58, and the costs of
implementing new requirements would
be borne by the Federal government
through State assistance grants. Thus,
Executive Order 13132 does not apply
to this proposed rule.
Although section 6 of the Executive
Order does not apply to this proposed
rule, EPA did consult with
representatives of State and local
governments early in the process of
developing this proposed rule. In 2001,
EPA organized a National Monitoring
Steering Committee (NMSC) to provide
oversight and guidance in reviewing the
existing air pollution monitoring
program and in developing a
comprehensive national ambient air
monitoring strategy. The NMSC
membership includes representatives of
EPA, State and local agencies, State and
Territorial Air Pollution Program
Administrators/Association of Local Air
Pollution Control Officials (STAPPA/
ALAPCO), and tribal governments to
reflect the partnership between EPA and
governmental agencies that collect and
use ambient air data. The NMSC formed
workgroups to address quality
assurance, technology, and regulatory
review of the draft National Ambient
Air Monitoring Strategy (NAAMS). These
workgroups met several times by phone
and at least once in a face-to-face
workshop to develop detailed
recommendations for improving the
ambient air monitoring program. A
record of the Steering Committee
members, workgroup members, and
workshop is available on the web at:
https://www.epa.gov/ttn/amtic/
monitor.html.
In the spirit of Executive Order 13132,
and consistent with EPA policy to
promote communications between EPA
and State and local governments, EPA
specifically solicits comments on the
proposed rule from State and local
officials.
F. Executive Order 13175: Consultation
and Coordination With Indian Tribal
Governments
Executive Order 13175, entitled
‘‘Consultation and Coordination with
Indian Tribal Governments’’ (65 FR
67249, November 9, 2000), requires EPA
to develop an accountable process to
ensure ‘‘meaningful and timely input by
tribal officials in the development of
regulatory policies that have tribal
implications.’’ This proposed rule does
not have tribal implications, as specified
in Executive Order 13175. The proposed
amendments would not directly apply
to Tribal governments. However, a tribal
government may elect to conduct
ambient air monitoring and report the
data to AQS. Since it is possible that
tribal governments may choose to
establish and operate NCore sites as part
of the national monitoring program,
EPA consulted with tribal officials early
in the process of developing the
proposed rule to permit them to have
meaningful and timely input into its
development. As discussed in section
V.E of this preamble, tribal agencies
were represented on both the NMSC
and the workgroups that developed the
NAAMS document and proposed
monitoring requirements. Tribal
monitoring programs were represented
on both the Quality Assurance and
Technology work groups. Participation
was also open to tribal monitoring
programs on the regulatory review
workgroup. EPA specifically solicits
additional comment on the proposed
amendments from tribal officials.
G. Executive Order 13045: Protection of
Children From Environmental Health
and Safety Risks
Executive Order 13045 (62 FR 19885,
April 23, 1997) applies to any rule that:
(1) Is determined to be ‘‘economically
significant’’ as defined under Executive
Order 12866, and (2) concerns an
environmental health or safety risk that
EPA has reason to believe may have a
disproportionate effect on children. If
the regulatory action meets both criteria,
the Agency must evaluate the
environmental health or safety effects of
the planned rule on children, and
explain why the planned regulation is
preferable to other potentially effective
and reasonably feasible alternatives
considered by the Agency.
EPA interprets Executive Order 13045
as applying only to those regulatory
actions that are based on health or safety
risks, such that the analysis required
under section 5–501 of the Order has
the potential to influence the regulation.
The proposed rule is not subject to
Executive Order 13045 because it is
based on technology and not on health
or safety risks.
H. Executive Order 13211: Actions That
Significantly Affect Energy Supply,
Distribution, or Use
The proposed rule is not a
‘‘significant energy action’’ as defined in
Executive Order 13211, ‘‘Actions
Concerning Regulations That
Significantly Affect Energy Supply,
Distribution or Use’’ (66 FR 28355, May
22, 2001) because it is not likely to have
a significant adverse effect on the
supply, distribution, or use of energy.
No significant change in the use of
energy is expected because the total
number of monitors for ambient air
quality measurements will not increase
above present levels. Further, we have
concluded that this proposed rule is not
likely to have any adverse energy
effects.
I. National Technology Transfer and
Advancement Act
Section 12(d) of the National
Technology Transfer and Advancement
Act of 1995 (NTTAA), Public Law
104–113 (15 U.S.C. 272 note), directs
EPA to use voluntary consensus
standards in its regulatory activities
unless to do so would be inconsistent
with applicable law or otherwise
impractical. Voluntary consensus
standards are technical standards (e.g.,
materials specifications, test methods,
sampling procedures, and business
practices) that are developed or adopted
by voluntary consensus standards
bodies. The NTTAA directs EPA to
provide Congress, through OMB,
explanations when the Agency decides
not to use available and applicable
voluntary consensus standards.
The proposed amendments involve
environmental monitoring and
measurement. Ambient air
concentrations of PM2.5 are currently
measured by the Federal reference
method in 40 CFR part 50, appendix L
(Reference Method for the
Determination of Fine Particulate as
PM2.5 in the Atmosphere) or by a
Federal reference or equivalent method
that meets the requirements in 40 CFR
part 53. Ambient air concentrations of
PM10-2.5 would be measured by the
proposed Federal reference method in
40 CFR part 50, appendix O (Reference
Method for the Determination of Coarse
Particulate Matter as PM10-2.5 in the
Atmosphere) published elsewhere in
this Federal Register or by a Federal
reference or equivalent method that
meets the requirements in 40 CFR part
53. As discussed in section IV.B of this
preamble, the proposed Federal
reference method for PM10-2.5 is similar
to the existing methods for PM2.5 and
PM10.
In the preamble to the proposed
NAAQS revisions published elsewhere
in this Federal Register, EPA requests
comments on selection of an alternative
filter-based dichotomous sampler as the
Federal reference method for PM10-2.5.
Procedures are included in the proposed
monitoring amendments that would
allow for approval of a candidate
equivalent method for PM10-2.5 that is
similar to the proposed Federal
reference method or to the alternative
method proposed for comment. Any
method that meets the performance
criteria for a candidate equivalent
method could be approved for use as a
Federal reference or equivalent method.
This approach is consistent with the
Agency’s Performance-Based
Measurement System (PBMS). The
PBMS approach is intended to be more
flexible and cost effective for the
regulated community; it is also intended
to encourage innovation in analytical
technology and improved data quality.
EPA is not precluding the use of any
method, whether it constitutes a
voluntary consensus standard or not, as
long as it meets the specified
performance criteria. EPA welcomes
comments on this aspect of the
proposed amendments and, specifically
invites the public to identify potentially
applicable voluntary consensus
standards and to explain why such
standards should be used in the
regulation.
J. Executive Order 12898: Federal
Actions To Address Environmental
Justice in Minority Populations and
Low-Income Populations
Executive Order 12898 (59 FR 7629,
February 16, 1994) requires that each
Federal agency make achieving
environmental justice part of its mission
by identifying and addressing, as
appropriate, disproportionately high
and adverse human health or
environmental effects of its programs,
policies, and activities on minorities
and low-income populations. These
requirements have been addressed to
the extent practicable in the Regulatory
Impact Analysis for the proposed
revisions to the NAAQS for particulate
matter.
List of Subjects in 40 CFR Parts 53 and
58
Environmental protection,
Administrative practice and procedure,
Air pollution control, Intergovernmental
relations, Reporting and recordkeeping
requirements.
Dated: December 20, 2005.
Stephen L. Johnson,
Administrator.
For the reasons set out in the
preamble, title 40, chapter I, parts 53
and 58 of the Code of Federal
Regulations are proposed to be amended
as follows:
PART 53—[AMENDED]
1. The authority citation for part 53
continues to read as follows:
Authority: Sec. 301(a) of the Clean Air Act
(42 U.S.C. sec. 1857g(a)), as amended by sec.
15(c)(2) of Pub. L. 91–604, 84 Stat. 1713,
unless otherwise noted.
Subpart A—[Amended]
2. Revise §§ 53.1 through 53.5 to read
as follows:
§ 53.1 Definitions.
Terms used but not defined in this
part shall have the meaning given them
by the Act.
Act means the Clean Air Act (42
U.S.C. 1857–1857l), as amended.
Additive and multiplicative bias
means the linear regression intercept
and slope of a linear plot fitted to
corresponding candidate and reference
method mean measurement data pairs.
Administrator means the
Administrator of the Environmental
Protection Agency (EPA) or his or her
authorized representative.
Agency means the Environmental
Protection Agency.
Applicant means a person or entity
who submits an application for a
reference or equivalent method
determination under § 53.4, or a person
or entity who assumes the rights and
obligations of an applicant under § 53.7.
Applicant may include a manufacturer,
distributor, supplier, or vendor.
Automated method or analyzer means
a method for measuring concentrations
of an ambient air pollutant in which
sample collection (if necessary),
analysis, and measurement are
performed automatically by an
instrument.
Candidate method means a method
for measuring the concentration of an
air pollutant in the ambient air for
which an application for a reference
method determination or an equivalent
method determination is submitted in
accordance with § 53.4, or a method
tested at the initiative of the
Administrator in accordance with
§ 53.7.
Class I equivalent method means an
equivalent method for PM2.5 or PM10-2.5
which is based on a sampler that is very
similar to the sampler specified for
reference methods in appendix L or
appendix O (as applicable) of part 50 of
this chapter, with only minor deviations
or modifications, as determined by EPA.
Class II equivalent method means an
equivalent method for PM2.5 or PM10-2.5
that utilizes a PM2.5 sampler or PM10-2.5
sampler in which integrated PM2.5
samples or PM10-2.5 samples are
obtained from the atmosphere by
filtration and subjected to a subsequent
filter conditioning process followed by
a gravimetric mass determination, but
which is not a Class I equivalent method
because of substantial deviations from
the design specifications of the sampler
specified for reference methods in
appendix L or appendix O (as
applicable) of part 50 of this chapter, as
determined by EPA.
Class III equivalent method means an
equivalent method for PM2.5 or PM10-2.5
that is an analyzer capable of providing
PM2.5 or PM10-2.5 ambient air
measurements representative of one-hour or less integrated PM2.5 or PM10-2.5
concentrations as well as 24-hour
measurements determined as, or
equivalent to, the mean of 24 one-hour
consecutive measurements.
CO means carbon monoxide.
Collocated means two or more air
samplers, analyzers, or other
instruments that are operated
simultaneously while located side by
side, separated by a distance that is
large enough to preclude the air
sampled by any of the devices from
being affected by any of the other
devices, but small enough so that all
devices obtain identical or uniform
ambient air samples that are equally
representative of the general area in
which the group of devices is located.
Equivalent method means a method
for measuring the concentration of an
air pollutant in the ambient air that has
been designated as an equivalent
method in accordance with this part; it
does not include a method for which an
equivalent method designation has been
canceled in accordance with § 53.11 or
§ 53.16.
ISO 9001-registered facility means a
manufacturing facility that is either:
(1) An International Organization for
Standardization (ISO) 9001-registered
manufacturing facility, registered to the
ISO 9001 standard (by the Registrar
Accreditation Board (RAB) of the
American Society for Quality Control
(ASQC) in the United States), with
registration maintained continuously.
(2) A facility that can be
demonstrated, on the basis of
information submitted to the EPA, to be
operated according to an EPA-approved
and periodically audited quality system
which meets, to the extent appropriate,
the same general requirements as an ISO
9001-registered facility for the design
and manufacture of designated reference
and equivalent method samplers and
monitors.
ISO-certified auditor means an
auditor who is either certified by the
Registrar Accreditation Board (in the
United States) as being qualified to
audit quality systems using the
requirements of recognized standards
such as ISO 9001, or who, based on
information submitted to the EPA,
meets the same general requirements as
provided for ISO-certified auditors.
Manual method means a method for
measuring concentrations of an ambient
air pollutant in which sample
collection, analysis, or measurement, or
some combination thereof, is performed
manually. A method for PM10 or PM2.5
which utilizes a sampler that requires
manual preparation, loading, and
weighing of filter samples is considered
a manual method even though the
sampler may be capable of
automatically collecting a series of
sequential samples.
NO means nitric oxide.
NO2 means nitrogen dioxide.
NOX means oxides of nitrogen and is
defined as the sum of the concentrations
of NO2 and NO.
O3 means ozone.
Operated simultaneously means that
two or more collocated samplers or
analyzers are operated concurrently
with no significant difference in the
start time, stop time, and duration of the
sampling or measurement period.
Pb means lead.
PM means PM10, PM10C, PM2.5,
PM10-2.5, or particulate matter of
unspecified size range.
PM10 means particulate matter as
defined in section 1.1 of appendix J to
part 50 of this chapter.
PM2.5 means particulate matter as
defined in section 1.1 of appendix L to
part 50 of this chapter.
PM10-2.5 means particulate matter as
defined in section 1.1 of appendix O to
part 50 of this chapter.
PM10C means PM10 particulate matter
or PM10 measurements obtained with a
PM10C sampler.
PM2.5 sampler means a device,
associated with a manual method for
measuring PM2.5, designed to collect
PM2.5 from an ambient air sample, but
lacking the ability to automatically
analyze or measure the collected sample
to determine the mass concentrations of
PM2.5 in the sampled air.
PM10 sampler means a device,
associated with a manual method for
measuring PM10, designed to collect
PM10 from an ambient air sample, but
lacking the ability to automatically
analyze or measure the collected sample
to determine the mass concentrations of
PM10 in the sampled air.
PM10C sampler means a PM10 sampler
that meets the special requirements for
a PM10C sampler that is part of a PM10-2.5
reference method sampler, as specified
in appendix O to part 50 of this chapter,
or a PM10 sampler that is part of a
PM10-2.5 sampler that has been
designated as an equivalent method for
PM10-2.5.
PM10-2.5 sampler means a sampler, or
a collocated pair of samplers, associated
with a manual method for measuring
PM10-2.5 and designed to collect either
PM10-2.5 directly or PM10C and PM2.5
separately and simultaneously from
concurrent ambient air samples, but
lacking the ability to automatically
analyze or measure the collected
sample(s) to determine the mass
concentrations of PM10-2.5 in the
sampled air.
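Since such a sampler collects PM10C and PM2.5 separately, the PM10-2.5 mass concentration itself is obtained by difference from the two simultaneous measurements. A minimal sketch of that arithmetic (the function name and units are illustrative assumptions; appendix O to part 50, not this snippet, governs the actual computation):

```python
def pm_coarse_by_difference(pm10c_ug_m3: float, pm2_5_ug_m3: float) -> float:
    """PM10-2.5 (coarse-fraction) mass concentration, in ug/m3, computed as
    the difference of simultaneous collocated PM10C and PM2.5 measurements.
    Small negative results can occur from measurement error and are
    returned as-is rather than clamped."""
    return pm10c_ug_m3 - pm2_5_ug_m3
```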
Reference method means a method of
sampling and analyzing the ambient air
for an air pollutant that is specified as
a reference method in an appendix to
part 50 of this chapter, or a method that
has been designated as a reference
method in accordance with this part; it
does not include a method for which a
reference method designation has been
canceled in accordance with § 53.11 or
§ 53.16.
Sequential samples for PM samplers
means two or more PM samples for
sequential (but not necessarily
contiguous) time periods that are
collected automatically by the same
sampler without the need for
intervening operator service.
SO2 means sulfur dioxide.
Test analyzer means an analyzer
subjected to testing as part of a
candidate method in accordance with
subparts B, C, D, E, or F of this part, as
applicable.
Test sampler means a PM10 sampler,
PM2.5 sampler, or PM10-2.5 sampler
subjected to testing as part of a
candidate method in accordance with
subparts C, D, E, or F of this part.
Ultimate purchaser means the first
person or entity who purchases a
reference method or an equivalent
method for purposes other than resale.
§ 53.2 General requirements for a
reference method determination.
The following general requirements
for a reference method determination
are summarized in table A–1 of this
subpart.
(a) Manual methods. (1) Sulfur
dioxide (SO2) and lead. For measuring
SO2 and lead, appendices A and G of
part 50 of this chapter specify unique
manual reference methods for
measuring these pollutants. Except as
provided in § 53.16, other manual
methods for SO2 and lead will not be
considered for reference method
determinations under this part.
(2) PM10. A reference method for
measuring PM10 must be a manual
method that meets all requirements
specified in appendix J of part 50 of this
chapter and must include a PM10
sampler that has been shown in
accordance with this part to meet all
requirements specified in this subpart A
and subpart D of this part.
(3) PM2.5. A reference method for
measuring PM2.5 must be a manual
method that meets all requirements
specified in appendix L of part 50 of
Federal Register / Vol. 71, No. 10 / Tuesday, January 17, 2006 / Proposed Rules
this chapter and must include a PM2.5
sampler that has been shown in
accordance with this part to meet the
applicable requirements specified in
this subpart A and subpart E of this part.
Further, reference method samplers
must be manufactured in an ISO
9001-registered facility, as defined in § 53.1
and as set forth in § 53.51.
(4) PM10-2.5. A reference method for
measuring PM10-2.5 must be a manual
method that meets all requirements
specified in appendix O of part 50 of
this chapter and must include PM10C
and PM2.5 samplers that have been
shown in accordance with this part to
meet the applicable requirements
specified in this subpart A and subpart
E of this part. Further, PM10-2.5 reference
method samplers must be manufactured
in an ISO 9001-registered facility, as
defined in § 53.1 and as set forth in
§ 53.51.
(b) Automated methods. An
automated reference method for
measuring CO, O3, or NO2 must utilize
the measurement principle and
calibration procedure specified in the
appropriate appendix to part 50 of this
chapter and must have been shown in
accordance with this part to meet the
requirements specified in this subpart A
and subpart B of this part.
§ 53.3 General requirements for an
equivalent method determination.
(a) Manual methods. A manual
equivalent method must have been
shown in accordance with this part to
satisfy the applicable requirements
specified in this subpart A and subpart
C of this part. In addition, a PM sampler
associated with a manual equivalent
method for PM10, PM2.5, or PM10-2.5 must
have been shown in accordance with
this part to satisfy the following
additional requirements, as applicable:
(1) PM10. A PM10 sampler associated
with a manual method for PM10 must
satisfy the requirements of subpart D of
this part.
(2) PM2.5 Class I. A PM2.5 Class I
equivalent method sampler must also
satisfy all requirements of subpart E of
this part, which shall include
appropriate demonstration that each
and every deviation or modification
from the reference method sampler
specifications does not significantly
alter the performance of the sampler.
(3) PM2.5 Class II. (i) A PM2.5 Class II
equivalent method sampler must also
satisfy the applicable requirements of
subparts E and F of this part or the
alternative requirements in paragraph
(a)(3)(ii) of this section.
(ii) In lieu of the applicable
requirements specified for Class II PM2.5
methods in subparts C and F of this
part, a Class II PM2.5 equivalent method
sampler may alternatively meet the
applicable requirements in paragraphs
(b)(3)(i) through (iii) of this section and
the testing, performance, and
comparability requirements specified
for Class III equivalent methods for
PM2.5 in subpart C of this part.
(4) PM10-2.5 Class I. A PM10-2.5 Class I
equivalent method sampler must also
satisfy the applicable requirements of
subpart E of this part (there are no
additional requirements specifically for
Class I PM10-2.5 methods in subpart C of
this part).
(5) PM10-2.5 Class II. (i) A PM10-2.5
Class II equivalent method must also
satisfy the applicable requirements of
subpart C of this part and also the
applicable requirements and provisions
of paragraphs (b)(3)(i) through (iii) of
this section, or the alternative
requirements in paragraph (a)(5)(ii) of
this section.
(ii) In lieu of the applicable
requirements specified for Class II
PM10-2.5 methods in subpart C of this
part and in paragraph (b)(3)(iii) of this
section, a Class II PM10-2.5 equivalent
method sampler may alternatively meet
the applicable requirements in
paragraphs (b)(3)(i) and (ii) of this
section and the testing, performance,
and comparability requirements
specified for Class III equivalent
methods for PM10-2.5 in subpart C of this
part.
(6) ISO 9001. All designated
equivalent methods for PM2.5 or PM10-2.5
must be manufactured in an ISO
9001-registered facility, as defined in § 53.1
and as set forth in § 53.51.
(b) Automated methods. All types of
automated equivalent methods must
have been shown in accordance with
this part to satisfy the applicable
requirements specified in this subpart A
and subpart C of this part. In addition,
an automated equivalent method must
have been shown in accordance with
this part to satisfy the following
additional requirements, as applicable:
(1) An automated equivalent method
for pollutants other than PM must be
shown in accordance with this part to
satisfy the applicable requirements
specified in subpart B of this part.
(2) An automated equivalent method
for PM10 must be shown in accordance
with this part to satisfy the applicable
requirements of subpart D of this part.
(3) A Class III automated equivalent
method for PM2.5 or PM10-2.5 must be
shown in accordance with this part to
satisfy the requirements in paragraphs
(b)(3)(i) through (iii) of this section, as
applicable.
(i) All pertinent requirements of 40
CFR part 50, appendix L, including
sampling height, range of operational
conditions, ambient temperature and
pressure sensors, outdoor enclosure,
electrical power supply, control devices
and operator interfaces, data output
port, operation/instruction manual, data
output and reporting requirements, and
any other requirements that would be
reasonably applicable to the method,
unless adequate (as determined by the
Administrator) rationale can be
provided to support the contention that
a particular requirement does not or
should not be applicable to the
particular candidate method.
(ii) All pertinent tests and
requirements of subpart E of this part,
such as instrument manufacturing
quality control; final assembly and
inspection; manufacturer’s audit
checklists; leak checks; flow rate
accuracy, measurement accuracy, and
flow rate cut-off; operation following
power interruptions; effect of variations
in power line voltage, ambient
temperature and ambient pressure; and
aerosol transport; unless adequate (as
determined by the Administrator)
rationale can be provided to support the
contention that a particular test or
requirement does not or should not be
applicable to the particular candidate
method.
(iii) Candidate methods shall be tested
for and meet any performance
requirements, such as inlet aspiration,
particle size separation or selection
characteristics, change in particle
separation or selection characteristics
due to loading or other operational
conditions, or effects of surface
exposure and particle volatility,
determined by the Administrator to be
necessary based on the nature, design,
and specifics of the candidate method
and the extent to which it deviates from
the design and performance
characteristics of the reference method.
These performance requirements and
the specific test(s) for them will be
determined by the Administrator for each
specific candidate method or type of
candidate method and may be similar to
or based on corresponding tests and
requirements set forth in subpart F of
this part or may be special requirements
and tests tailored by the Administrator
to the specific nature, design, and
operational characteristics of the
candidate method. For example, a
candidate method with an inlet design
deviating substantially from the design
of the reference method inlet would
likely be subject to an inlet aspiration
test similar to that set forth in § 53.63.
Similarly, a candidate method having an
inertial fractionation system
substantially different from that of the
reference method would likely be
subject to a static fractionation test and
a loading test similar to those set forth
in §§ 53.64 and 53.65, respectively. A
candidate method with more extensive
or profound deviations from the design
and function of the reference method
may be subject to other tests, such as full wind-tunnel tests similar to those described in
§ 53.62, or to special tests adapted or
developed individually to accommodate
the specific type of measurement or
operation of the candidate method.
(4) All designated equivalent methods
for PM2.5 or PM10-2.5 must be
manufactured in an ISO 9001-registered
facility, as defined in § 53.1 and as set
forth in § 53.51.
§ 53.4 Applications for reference or
equivalent method determinations.
(a) Applications for reference or
equivalent method determinations shall
be submitted in duplicate to: Director,
National Exposure Research Laboratory,
Reference and Equivalent Method
Program (MD–D205–03), U.S.
Environmental Protection Agency,
Research Triangle Park, North Carolina
27711 (Commercial delivery address:
4930 Old Page Road, Durham, North
Carolina 27703).
(b) Each application shall be signed
by an authorized representative of the
applicant, shall be marked in
accordance with § 53.15 (if applicable),
and shall contain the following:
(1) A clear identification of the
candidate method, which will
distinguish it from all other methods
such that the method may be referred to
unambiguously. This identification
must consist of a unique series of
descriptors such as title, identification
number, analyte, measurement
principle, manufacturer, brand, model,
etc., as necessary to distinguish the
method from all other methods or
method variations, both within and
outside the applicant’s organization.
(2) A detailed description of the
candidate method, including but not
limited to the following: The
measurement principle, manufacturer,
name, model number and other forms of
identification, a list of the significant
components, schematic diagrams,
design drawings, and a detailed
description of the apparatus and
measurement procedures. Drawings and
descriptions pertaining to candidate
methods or samplers for PM2.5 or
PM10-2.5 must meet all applicable
requirements in reference 1 of appendix
A of this subpart, using appropriate
graphical, nomenclature, and
mathematical conventions such as those
specified in references 3 and 4 of
appendix A of this subpart.
(3) A copy of a comprehensive
operation or instruction manual
providing a complete and detailed
description of the operational,
maintenance, and calibration
procedures prescribed for field use of
the candidate method and all
instruments utilized as part of that
method (under § 53.9(a)).
(i) At a minimum, this manual shall
include:
(A) Description of the method and
associated instruments.
(B) Explanation of all indicators,
information displays, and controls.
(C) Complete setup and installation
instructions, including any additional
materials or supplies required.
(D) Details of all initial or startup
checks or acceptance tests and any
auxiliary equipment required.
(E) Complete operational instructions.
(F) Calibration procedures and
descriptions of required calibration
equipment and standards.
(G) Instructions for verification of
correct or proper operation.
(H) Troubleshooting guidance and
suggested corrective actions for
abnormal operation.
(I) Required or recommended routine,
periodic, and preventative maintenance
and maintenance schedules.
(J) Any calculations required to derive
final concentration measurements.
(K) Appropriate references to any
applicable appendix of part 50 of this
chapter; reference 6 of appendix A of
this subpart; and any other pertinent
guidelines.
(ii) The manual shall also include
adequate warning of potential safety
hazards that may result from normal use
and/or malfunction of the method and
a description of necessary safety
precautions. (See § 53.9(b).) However,
the previous requirement shall not be
interpreted to constitute or imply any
warranty of safety of the method by
EPA. For samplers and automated
methods, the manual shall include a
clear description of all procedures
pertaining to installation, operation,
preventive maintenance, and
troubleshooting and shall also include
parts identification diagrams. The
manual may be used to satisfy the
requirements of paragraphs (b)(1) and
(2) of this section to the extent that it
includes information necessary to meet
those requirements.
(4) A statement that the candidate
method has been tested in accordance
with the procedures described in
subparts B, C, D, E, and/or F of this part,
as applicable.
(5) Descriptions of test facilities and
test configurations, test data, records,
calculations, and test results as
specified in subparts B, C, D, E, and/or
F of this part, as applicable. Data must
be sufficiently detailed to meet
appropriate principles described in part
B, sections 3.3.1 (paragraph 1) and 3.5.1
and part C, section 4.6 of reference 2 of
appendix A of this subpart; and in
paragraphs 1 through 3 of section 4.8
(Records) of reference 5 of appendix A
of this subpart. Salient requirements
from these references include the
following:
(i) The applicant shall maintain and
include records of all relevant
measuring equipment, including the
make, type, and serial number or other
identification, and most recent
calibration with identification of the
measurement standard or standards
used and their National Institute of
Standards and Technology (NIST)
traceability. These records shall
demonstrate the measurement capability
of each item of measuring equipment
used for the application and include a
description and justification (if needed)
of the measurement setup or
configuration in which it was used for
the tests. The calibration results shall be
recorded and identified in sufficient
detail so that the traceability of all
measurements can be determined and
any measurement could be reproduced
under conditions close to the original
conditions, if necessary, to resolve any
anomalies.
(ii) Test data shall be collected
according to the standards of good
practice and by qualified personnel.
Test anomalies or irregularities shall be
documented and explained or justified.
The impact and significance of the
deviation on test results and
conclusions shall be determined. Data
collected shall correspond directly to
the specified test requirement and be
labeled and identified clearly so that
results can be verified and evaluated
against the test requirement.
Calculations or data manipulations must
be explained in detail so that they can
be verified.
(6) A statement that the method,
analyzer, or sampler tested in
accordance with this part is
representative of the candidate method
described in the application.
(c) For candidate automated methods
and candidate manual methods for
PM10, PM2.5, and PM10-2.5 the
application shall also contain the
following:
(1) A detailed description of the
quality system that will be utilized, if
the candidate method is designated as a
reference or equivalent method, to
ensure that all analyzers or samplers
offered for sale under that designation
will have essentially the same
performance characteristics as the
analyzer(s) or samplers tested in
accordance with this part. In addition,
the quality system requirements for
candidate methods for PM2.5 and
PM10-2.5 must be described in sufficient
detail, based on the elements described
in section 4 of reference 1 (Quality
System Requirements) of appendix A of
this subpart. Further clarification is
provided in the following sections of
reference 2 of appendix A of this
subpart: part A (Management Systems),
sections 2.2 (Quality System and
Description), 2.3 (Personnel
Qualification and Training), 2.4
(Procurement of Items and Services), 2.5
(Documents and Records), and 2.7
(Planning); part B (Collection and
Evaluation of Environmental Data),
sections 3.1 (Planning and Scoping), 3.2
(Design of Data Collection Operations),
and 3.5 (Assessment and Verification of
Data Usability); and part C (Operation of
Environmental Technology), sections
4.1 (Planning), 4.2 (Design of Systems),
and 4.4 (Operation of Systems).
(2) A description of the durability
characteristics of such analyzers or
samplers (see § 53.9(c)). For methods for
PM2.5 and PM10-2.5 the warranty program
must ensure that the required
specifications (see Table A–1 to this
subpart) will be met throughout the
warranty period and that the applicant
accepts responsibility and liability for
ensuring this conformance or for
resolving any nonconformities,
including all necessary components of
the system, regardless of the original
manufacturer. The warranty program
must be described in sufficient detail to
meet appropriate provisions of the
ANSI/ASQC and ISO 9001 standards
(references 1 and 2 in appendix A of
this subpart) for controlling
conformance and resolving
nonconformance, particularly sections
4.12, 4.13, and 4.14 of reference 1 in
appendix A of this subpart.
(i) Section 4.12 in reference 1 of
appendix A of this subpart requires the
manufacturer to establish and maintain
a system of procedures for identifying
and maintaining the identification of
inspection and test status throughout all
phases of manufacturing to ensure that
only instruments that have passed the
required inspections and tests are
released for sale.
(ii) Section 4.13 in reference 1 of
appendix A of this subpart requires
documented procedures for control of
nonconforming product, including
review and acceptable alternatives for
disposition; section 4.14 in reference 1
of appendix A of this subpart requires
documented procedures for
implementing corrective (4.14.2) and
preventive (4.14.3) action to eliminate
the causes of actual or potential
nonconformities. In particular, section
4.14.3 requires that potential causes of
nonconformities be eliminated by using
information such as service reports and
customer complaints.
(d) For candidate reference or
equivalent methods for PM2.5 and Class
II or Class III equivalent methods for
PM10-2.5, the applicant, if requested by
EPA, shall provide to EPA for test
purposes one sampler or analyzer that is
representative of the sampler or
analyzer associated with the candidate
method. The sampler or analyzer shall
be shipped FOB destination to Director,
National Exposure Research Laboratory,
Reference and Equivalent Method
Program (MD–D205–03), U.S.
Environmental Protection Agency, 4930
Old Page Road, Durham, North Carolina
27703, scheduled to arrive concurrent
with or within 30 days of the arrival of
the other application materials. This
analyzer or sampler may be subjected to
various tests that EPA determines to be
necessary or appropriate under § 53.5(f),
and such tests may include special tests
not described in this part. If the
instrument submitted under this
paragraph malfunctions, becomes
inoperative, or fails to perform as
represented in the application before the
necessary EPA testing is completed, the
applicant shall be afforded an
opportunity to repair or replace the
device at no cost to EPA. Upon
completion of EPA testing, the analyzer
or sampler submitted under this
paragraph shall be repacked by EPA for
return shipment to the applicant, using
the same packing materials used for
shipping the instrument to EPA unless
alternative packing is provided by the
applicant. Arrangements for, and the
cost of, return shipment shall be the
responsibility of the applicant. EPA
does not warrant or assume any liability
for the condition of the analyzer or
sampler upon return to the applicant.
§ 53.5 Processing of applications.
After receiving an application for a
reference or equivalent method
determination, the Administrator will,
within 120 calendar days after receipt of
the application, take one or more of the
following actions:
(a) Send notice to the applicant, in
accordance with § 53.8, that the
candidate method has been determined
to be a reference or equivalent method.
(b) Send notice to the applicant that
the application has been rejected,
including a statement of reasons for
rejection.
(c) Send notice to the applicant that
additional information must be
submitted before a determination can be
made and specify the additional
information that is needed (in such
cases, the 120-day period shall
commence upon receipt of the
additional information).
(d) Send notice to the applicant that
additional test data must be submitted
and specify what tests are necessary and
how the tests shall be interpreted (in
such cases, the 120-day period shall
commence upon receipt of the
additional test data).
(e) Send notice to the applicant that
the application has been found to be
substantially deficient or incomplete
and cannot be processed until
additional information is submitted to
complete the application and specify
the general areas of substantial
deficiency.
(f) Send notice to the applicant that
additional tests will be conducted by
the Administrator, specifying the nature
of and reasons for the additional tests
and the estimated time required (in such
cases, the 120-day period shall
commence 1 calendar day after the
additional tests have been completed).
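The running of the 120-day period under paragraphs (c), (d), and (f) reduces to date arithmetic. A sketch under the assumption that only the most recent triggering event controls (the helper name and parameters are invented for illustration, not drawn from the rule):

```python
from datetime import date, timedelta
from typing import Optional

REVIEW_PERIOD = timedelta(days=120)

def decision_due(application_received: date,
                 additional_material_received: Optional[date] = None,
                 epa_tests_completed: Optional[date] = None) -> date:
    """Latest date for an EPA action under this section: 120 calendar days
    from receipt of the application, restarted upon receipt of requested
    additional information or test data (paragraphs (c) and (d)), or
    1 calendar day after EPA-conducted additional tests are completed
    (paragraph (f))."""
    start = application_received
    if additional_material_received is not None:
        start = max(start, additional_material_received)
    if epa_tests_completed is not None:
        start = max(start, epa_tests_completed + timedelta(days=1))
    return start + REVIEW_PERIOD
```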
2a. Revise §§ 53.8 and 53.9 to read as
follows:
§ 53.8 Designation of reference and
equivalent methods.
(a) A candidate method determined
by the Administrator to satisfy the
applicable requirements of this part
shall be designated as a reference
method or equivalent method (as
applicable) by and upon publication of
a notice of the designation in the
Federal Register.
(b) Upon designation, a notice
indicating that the method has been
designated as a reference method or an
equivalent method shall be sent to the
applicant.
(c) The Administrator will maintain a
current list of methods designated as
reference or equivalent methods in
accordance with this part and will send
a copy of the list to any person or group
upon request. A copy of the list will be
available for inspection or copying at
EPA Regional Offices and may be
available via the Internet or other
sources.
§ 53.9 Conditions of designation.
Designation of a candidate method as
a reference method or equivalent
method shall be conditioned on the
applicant’s compliance with the
following requirements. Failure to
comply with any of the requirements
shall constitute a ground for
cancellation of the designation in
accordance with § 53.11.
(a) Any method offered for sale as a
reference or equivalent method shall be
accompanied by a copy of the manual
referred to in § 53.4(b)(3) when
delivered to any ultimate purchaser, and
an electronic copy of the manual
suitable for incorporating into user
specific standard operating procedure
documents shall be readily available to
any users.
(b) Any method offered for sale as a
reference or equivalent method shall
generate no unreasonable hazard to
operators or to the environment during
normal use or when malfunctioning.
(c) Any analyzer, PM10 sampler, PM2.5
sampler, or PM10-2.5 sampler offered for
sale as part of a reference or equivalent
method shall function within the limits
of the performance specifications
referred to in § 53.20(a), § 53.30(a),
§ 53.50, or § 53.60, as applicable, for at
least 1 year after delivery and
acceptance when maintained and
operated in accordance with the manual
referred to in § 53.4(b)(3).
(d) Any analyzer, PM10 sampler, PM2.5
sampler, or PM10-2.5 sampler offered for
sale as a reference or equivalent method
shall bear a prominent, permanently
affixed label or sticker indicating that
the analyzer or sampler has been
designated by EPA as a reference
method or as an equivalent method (as
applicable) in accordance with this part
and displaying any designated method
identification number that may be
assigned by EPA.
(e) If an analyzer is offered for sale as
a reference or equivalent method and
has one or more selectable ranges, the
label or sticker required by paragraph
(d) of this section shall be placed in
close proximity to the range selector and
shall indicate clearly which range or
ranges have been designated as parts of
the reference or equivalent method.
(f) An applicant who offers analyzers,
PM10 samplers, PM2.5 samplers, or
PM10-2.5 samplers for sale as reference or
equivalent methods shall maintain an
accurate and current list of the names
and mailing addresses of all ultimate
purchasers of such analyzers or
samplers. For a period of 7 years after
publication of the reference or
equivalent method designation
applicable to such an analyzer or
sampler, the applicant shall notify all
ultimate purchasers of the analyzer or
sampler within 30 days if the
designation has been canceled in
accordance with § 53.11 or § 53.16 or if
adjustment of the analyzer or sampler is
necessary under § 53.11(b).
(g) If an applicant modifies an
analyzer, PM10 sampler, PM2.5 sampler,
or PM10-2.5 sampler that has been
designated as a reference or equivalent
method, the applicant shall not sell the
modified analyzer or sampler as a
reference or equivalent method nor
attach a label or sticker to the modified
analyzer or sampler under paragraph (d)
or (e) of this section until the applicant
has received notice under § 53.14(c) that
the existing designation or a new
designation will apply to the modified
analyzer or sampler or has applied for
and received notice under § 53.8(b) of a
new reference or equivalent method
determination for the modified analyzer
or sampler.
(h) An applicant who has offered
PM2.5 or PM10-2.5 samplers or analyzers
for sale as part of a reference or
equivalent method may continue to do
so only so long as the facility in which
the samplers or analyzers are
manufactured continues to be an ISO
9001-registered facility, as set forth in
subpart E of this part. In the event that
the ISO 9001 registration for the facility
is withdrawn, suspended, or otherwise
becomes inapplicable, either
permanently or for some specified time
interval, such that the facility is no
longer an ISO 9001-registered facility,
the applicant shall notify EPA within 30
days of the date the facility becomes
other than an ISO 9001-registered
facility, and upon such notification,
EPA shall issue a preliminary finding
and notification of possible cancellation
of the reference or equivalent method
designation under § 53.11.
(i) An applicant who has offered PM2.5
or PM10-2.5 samplers or analyzers for sale
as part of a reference or equivalent
method may continue to do so only so
long as updates of the Product
Manufacturing Checklist set forth in
subpart E of this part are submitted
annually. In the event that an annual
Checklist update is not received by EPA
within 12 months of the date of the last
such submitted Checklist or Checklist
update, EPA shall notify the applicant
within 30 days that the Checklist update
has not been received and shall, within
30 days from the issuance of such
notification, issue a preliminary finding
and notification of possible cancellation
of the reference or equivalent method
designation under § 53.11.
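The deadlines in paragraph (i) amount to a simple timeline: an update is due within 12 months of the last one, EPA's notice follows within 30 days, and a preliminary cancellation finding within 30 days after that. A sketch that treats "12 months" as 365 days (an interpretive assumption; the rule itself speaks in months, and the status strings are invented labels):

```python
from datetime import date, timedelta

def checklist_status(last_checklist: date, today: date) -> str:
    """Classify an applicant's annual Product Manufacturing Checklist
    status under paragraph (i). The 365-day reading of '12 months' is an
    illustrative assumption."""
    update_due = last_checklist + timedelta(days=365)
    if today <= update_due:
        return "current"
    if today <= update_due + timedelta(days=30):
        return "overdue; EPA notification window open"
    return "overdue; preliminary cancellation finding may issue"
```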
Table A–1 to subpart A of part 53 is
revised to read as follows:
TABLE A–1 TO SUBPART A OF PART 53.—SUMMARY OF APPLICABLE REQUIREMENTS FOR REFERENCE AND EQUIVALENT METHODS FOR AIR MONITORING OF CRITERIA POLLUTANTS

Pollutant   Ref. or equivalent     Manual or     Applicable part   Applicable subparts
                                   automated     50 appendix       of part 53
--------------------------------------------------------------------------------------
SO2         Reference              Manual        A                 ..............
            Equivalent             Manual        ...............   A, C
            Equivalent             Automated     ...............   A, B, C
CO          Reference              Automated     C                 A, B
            Equivalent             Manual        ...............   A, C
            Equivalent             Automated     ...............   A, B, C
O3          Reference              Automated     D                 A, B
            Equivalent             Manual        ...............   A, C
            Equivalent             Automated     ...............   A, B, C
NO2         Reference              Automated     F                 A, B
            Equivalent             Manual        ...............   A, C
            Equivalent             Automated     ...............   A, B, C
Pb          Reference              Manual        G                 ..............
            Equivalent             Manual        ...............   A, C
PM10        Reference              Manual        J                 A, D
            Equivalent             Manual        ...............   A, C, D
            Equivalent             Automated     ...............   A, C, D
PM2.5       Reference              Manual        L                 A, E
            Equivalent Class I     Manual        L                 A, C, E
            Equivalent Class II    Manual        L ¹               A, C ², E, F ¹ ²
            Equivalent Class III   Automated     L ¹               A, C, E ¹, F ¹
PM10-2.5    Reference              Manual        O ²               A, E
            Equivalent             ...........   ...............   ..............
            Equivalent Class II    Manual        O ²               A, C ², E ¹, F ¹ ²
            Equivalent Class III   Automated     L ¹, O ¹ ²        A, C, E ¹, F ¹

¹ Some requirements may apply, based on the nature of each particular candidate method, as determined by the Administrator.
² Alternative Class III requirements may be substituted.
4. Paragraph (6) of appendix A to
subpart A of part 53 is revised to read
as follows:
Appendix A to Subpart A of Part 53—
References
* * * * *
(6) Quality Assurance Guidance
Document 2.12. Monitoring PM2.5 in
Ambient Air Using Designated
Reference or Class I Equivalent
Methods. U.S. EPA, National Exposure
Research Laboratory, Research Triangle
Park, NC, November 1998 or later
edition. Currently available at https://www.epa.gov/ttn/amtic/pmqainf.html.
Subpart C—[Amended]
5. Section 53.30 is revised to read as
follows:
§ 53.30 General provisions.
(a) Determination of comparability.
The test procedures prescribed in this
subpart shall be used to determine if a
candidate method is comparable to a
reference method when both methods
measure pollutant concentrations in
ambient air. Minor deviations from the testing and acceptance requirements set forth in this subpart, in connection with any documented extenuating circumstances, may be determined to be acceptable at the discretion of the Administrator.
(b) Selection of test sites. (1) Each test
site shall be in an area which can be
shown to have at least moderate
concentrations of various pollutants.
Each site shall be clearly identified and
shall be justified as an appropriate test
site with suitable supporting evidence
such as a description of the surrounding
area, characterization of the sources and
pollutants typical in the area, maps,
population density data, vehicular
traffic data, emission inventories,
pollutant measurements from previous
years, concurrent pollutant
measurements, meteorological data, and
other information useful in supporting
the suitability of the site for the
comparison test or tests.
(2) If approval of one or more
proposed test sites is desired prior to
conducting the tests, a written request
for approval of the test site or sites must
be submitted to the address given in
§ 53.4. The request should include
information identifying the type of
candidate method and one or more
specific proposed test sites along with a
justification for each proposed specific
site as described in paragraph (b)(1) of
this section. The EPA will evaluate each
proposed site and approve the site,
disapprove the site, or request more
information about the site. Any such
pre-test approval of a test site by the
EPA shall indicate only that the site
meets the applicable test site
requirements for the candidate method
type; it shall not indicate, suggest, or
imply that test data obtained at the site
will necessarily meet any of the
applicable data acceptance
requirements. The Administrator may
exercise discretion in selecting a
different site (or sites) for any additional
tests the Administrator decides to
conduct.
(c) Test atmosphere. Ambient air
sampled at an appropriate test site or
sites shall be used for these tests.
Simultaneous concentration
measurements shall be made in each of
the concentration ranges specified in
tables C–1, C–3, or C–4 of this subpart,
as appropriate.
(d) Sampling or sample collection. All
test concentration measurements or
samples shall be taken in such a way
that both the candidate method and the
reference method obtain air samples
that are alike or as nearly identical as
practical.
(e) Operation. Set-up and start-up of
the test analyzer(s), test sampler(s), and
reference method analyzers or samplers
shall be in strict accordance with the
applicable operation manual(s).
(f) Calibration. The reference method
shall be calibrated according to the
appropriate appendix to part 50 of this
chapter (if it is a manual method) or
according to the applicable operation
manual(s) (if it is an automated
method). A candidate method (or
portion thereof) shall be calibrated
according to the applicable operation
manual(s), if such calibration is a part
of the method.
(g) Submission of test data and other
information. All recorder charts,
calibration data, records, test results,
procedural descriptions and details, and
other documentation obtained from (or
pertinent to) these tests shall be
identified, dated, signed by the analyst
performing the test, and submitted. For
candidate methods for PM2.5 and
PM10-2.5, all submitted information must
meet the requirements of the ANSI/
ASQC E4 Standard, sections 3.3.1,
paragraphs 1 and 2 (reference 1 of
appendix A of this subpart).
§ 53.31 [Removed]
6. Section 53.31 is removed and
reserved.
7. Section 53.32 is revised to read as
follows:
§ 53.32 Test procedures for methods for
SO2, CO, O3, and NO2.
(a) Comparability. Comparability is
shown for SO2, CO, O3, and NO2
methods when the differences between:
(1) Measurements made by a candidate manual method or by a test analyzer representative of a candidate automated method; and
(2) Measurements made
simultaneously by a reference method
are less than or equal to the values for
maximum discrepancy specified in table
C–1 of this subpart.
(b) Test measurements. All test
measurements are to be made at the
same test site. If necessary, the
concentration of pollutant in the
sampled ambient air may be augmented
with artificially generated pollutant to
facilitate measurements in the specified
ranges, as described under paragraph
(f)(4) of this section.
(c) Requirements for measurements or
samples. All test measurements made or
test samples collected by means of a
sample manifold as specified in
paragraph (f)(4) of this section shall be
at a room temperature between 20° and
30°C, and at a line voltage between 105
and 125 volts. All methods shall be
calibrated as specified in § 53.30(f) prior
to initiation of the tests.
(d) Set-up and start-up. (1) Set-up and
start-up of the test analyzer, test
sampler(s), and reference method shall
be in strict accordance with the
applicable operation manual(s). If the
test analyzer does not have an integral
strip chart or digital data recorder,
connect the analyzer output to a suitable
strip chart or digital data recorder. This
recorder shall have a chart width of at
least 25 centimeters, a response time of
1 second or less, a deadband of not more
than 0.25 percent of full scale, and
capability of either reading
measurements at least 5 percent below
zero or offsetting the zero by at least 5
percent. Digital data shall be recorded at
appropriate time intervals such that
trend plots similar to a strip chart
recording may be constructed with a
similar or suitable level of detail.
(2) Other data acquisition components
may be used along with the chart
recorder during the conduct of these
tests. Use of the chart recorder is
intended only to facilitate visual
evaluation of data submitted.
(3) Allow adequate warmup or
stabilization time as indicated in the
applicable operation manual(s) before
beginning the tests.
(e) Range. (1) Except as provided in
paragraph (e)(2) of this section, each
method shall be operated in the range
specified for the reference method in the
appropriate appendix to part 50 of this
chapter (for manual reference methods),
or specified in table B–1 of subpart B of
this part (for automated reference
methods).
(2) For a candidate method having
more than one selectable range, one
range must be that specified in table B–
1 of subpart B of this part, and a test
analyzer representative of the method
must pass the tests required by this
subpart while operated on that range.
The tests may be repeated for a broader
range (i.e., one extending to higher
concentrations) than the one specified
in table B–1 of subpart B of this part,
provided that the range does not extend
to concentrations more than two times
the upper range limit specified in table
B–1 of subpart B of this part and that the
test analyzer has passed the tests
required by subpart B of this part (if
applicable) for the broader range. If the
tests required by this subpart are
conducted or passed only for the range
specified in table B–1 of subpart B of
this part, any equivalent method
determination with respect to the
method will be limited to that range. If
the tests are passed for both the
specified range and a broader range (or
ranges), any such determination will
include the broader range(s) as well as
the specified range. Appropriate test
data shall be submitted for each range
sought to be included in such a
determination.
(f) Operation of automated methods.
(1) Once the test analyzer has been set
up and calibrated and tests started,
manual adjustment or normal periodic
maintenance, as specified in the manual
referred to in § 53.4(b)(3), is permitted
only every 3 days. Automatic
adjustments which the test analyzer
performs by itself are permitted at any
time. The submitted records shall show
clearly when manual adjustments were
made and describe the operations
performed.
(2) All test measurements shall be
made with the same test analyzer; use
of multiple test analyzers is not
permitted. The test analyzer shall be
operated continuously during the entire
series of test measurements.
(3) If a test analyzer should
malfunction during any of these tests,
the entire set of measurements shall be
repeated, and a detailed explanation of
the malfunction, remedial action taken,
and whether recalibration was necessary
(along with all pertinent records and
charts) shall be submitted.
(4) Ambient air shall be sampled from
a common intake and distribution
manifold designed to deliver
homogenous air samples to both
methods. Precautions shall be taken in
the design and construction of this
manifold to minimize the removal of
particulate matter and trace gases, and
to ensure that identical samples reach
the two methods. If necessary, the
concentration of pollutant in the
sampled ambient air may be augmented
with artificially generated pollutant.
However, at all times the air sample
measured by the candidate and
reference methods under test shall
consist of not less than 80 percent
ambient air by volume. Schematic
drawings, physical illustrations,
descriptions, and complete details of the
manifold system and the augmentation
system (if used) shall be submitted.
(g) Tests. (1) Conduct the first set of
simultaneous measurements with the
candidate and reference methods:
(i) Table C–1 of this subpart specifies
the type (1- or 24-hour) and number of
measurements to be made in each of the
three test concentration ranges.
(ii) The pollutant concentration must
fall within the specified range as
measured by the reference method.
(iii) The measurements shall be made
in the sequence specified in table C–2
of this subpart, except for the 1-hour
SO2 measurements, which are all in the
high range.
(2) For each pair of measurements,
determine the difference (discrepancy)
between the candidate method
measurement and reference method
measurement. A discrepancy which
exceeds the discrepancy specified in
table C–1 of this subpart constitutes a
failure. Figure C–1 of this subpart
contains a suggested format for
reporting the test results.
(3) The results of the first set of
measurements shall be interpreted as
follows:
(i) Zero failures: The candidate
method passes the test for
comparability.
(ii) Three or more failures: The
candidate method fails the test for
comparability.
(iii) One or two failures: Conduct a
second set of simultaneous
measurements as specified in table C–1
of this subpart. The results of the
combined total of first-set and second-set measurements shall be interpreted as
follows:
(A) One or two failures: The candidate
method passes the test for
comparability.
(B) Three or more failures: The
candidate method fails the test for
comparability.
(iv) For SO2, the 1-hour and 24-hour
measurements shall be interpreted
separately, and the candidate method
must pass the tests for both 1- and 24-hour measurements to pass the test for
comparability.
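For illustration only (not part of the proposed rule text), the interpretation rules of paragraphs (g)(2) and (g)(3) can be sketched in Python; `max_discrepancy` stands in for the applicable table C–1 limit, which is not reproduced here:

```python
def count_failures(candidate, reference, max_discrepancy):
    """Count measurement pairs whose discrepancy exceeds the table C-1 limit."""
    return sum(
        1 for c, r in zip(candidate, reference)
        if abs(c - r) > max_discrepancy
    )

def comparability(first_set_failures, second_set_failures=None):
    """Apply the interpretation rules of paragraph (g)(3).

    Returns True (pass), False (fail), or None when one or two failures
    in the first set require a second set of measurements.
    """
    if first_set_failures == 0:
        return True
    if first_set_failures >= 3:
        return False
    # One or two failures: judge the combined total of first-set and
    # second-set failures.
    if second_set_failures is None:
        return None
    return first_set_failures + second_set_failures <= 2
```

For example, two failures in the first set followed by none in the second set is a pass; a combined total of three or more failures fails the test for comparability.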
(4) A 1-hour measurement consists of
the integral of the instantaneous
concentration over a 60-minute
continuous period divided by the time
period. Integration of the instantaneous
concentration may be performed by any
appropriate means such as chemical,
electronic, mechanical, visual judgment,
or by calculating the mean of not less
than 12 equally-spaced instantaneous
readings. Appropriate allowances or
corrections shall be made in cases
where significant errors could occur due
to characteristic lag time or rise/fall time
differences between the candidate and
reference methods. Details of the means
of integration and any corrections shall
be submitted.
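As a minimal arithmetic sketch of the averaging option in paragraph (g)(4) (illustrative only; lag-time and rise/fall-time corrections are not modeled):

```python
def one_hour_measurement(readings):
    """Average at least 12 equally spaced instantaneous readings taken
    over a continuous 60-minute period, per paragraph (g)(4)."""
    if len(readings) < 12:
        raise ValueError("at least 12 equally spaced readings are required")
    return sum(readings) / len(readings)
```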
(5) A 24-hour measurement consists
of the integral of the instantaneous
concentration over a 24-hour
continuous period divided by the time
period. This integration may be
performed by any appropriate means
such as chemical, electronic,
mechanical, or by calculating the mean
of twenty-four (24) sequential 1-hour measurements.
(6) For O3 and CO, no more than six 1-hour measurements shall be made per day. For SO2, no more than four 1-hour measurements or one 24-hour measurement shall be made per day. One-hour measurements may be made concurrently with 24-hour measurements if appropriate.
(7) For applicable methods, control or calibration checks may be performed once per day without adjusting the test analyzer or method. These checks may be used as a basis for a linear interpolation-type correction to be applied to the measurements to correct for drift. If such a correction is used, it shall be applied to all measurements made with the method, and the correction procedure shall become a part of the method.
8. Section 53.33 is revised to read as follows:

§ 53.33 Test procedure for methods for Pb.

(a) Comparability. Comparability is shown for Pb methods when the differences between:
(1) Measurements made by a candidate method, and
(2) Measurements made by the reference method on simultaneously collected Pb samples (or the same sample, if applicable), are less than or equal to the value specified in table C–3 of this subpart.
(b) Test measurements. Test measurements may be made at any number of test sites. Augmentation of pollutant concentrations is not permitted; hence, an appropriate test site or sites must be selected to provide Pb concentrations in the specified range.
(c) Collocated samplers. The ambient air intake points of all the candidate and reference method collocated samplers shall be positioned at the same height above the ground level, and between 2 meters (1 meter for samplers with flow rates less than 200 liters per minute (L/min)) and 4 meters apart. The samplers shall be oriented in a manner that will minimize spatial and wind directional effects on sample collection.
(d) Sample collection. Collect simultaneous 24-hour samples (filters) of Pb at the test site or sites with both the reference and candidate methods until at least 10 filter pairs have been obtained. A candidate method which employs a sampler and sample collection procedure that are identical to the sampler and sample collection procedure specified in the reference method, but uses a different analytical procedure, may be tested by analyzing common samples. The common samples shall be collected according to the sample collection procedure specified by the reference method, and each shall be divided for respective analysis in accordance with the analytical procedures of the candidate method and the reference method.
(e) Audit samples. Three audit samples must be obtained from the address given in § 53.4(a). The audit samples are 3/4 × 8-inch glass fiber strips containing known amounts of Pb at the following nominal levels: 100 micrograms per strip (µg/strip); 300 µg/strip; 750 µg/strip. The true amount of Pb, in total µg/strip, will be provided with each audit sample.
(f) Filter analysis. (1) For both the reference method samples and the audit samples, analyze each filter extract three times in accordance with the reference method analytical procedure. The analysis of replicates should not be performed sequentially, i.e., a single sample should not be analyzed three times in sequence. Calculate the indicated Pb concentrations for the reference method samples in micrograms per cubic meter (µg/m3) for each analysis of each filter. Calculate the indicated total Pb amount for the audit samples in µg/strip for each analysis of each strip. Label these test results as R1A, R1B, R1C, R2A, R2B, * * *, Q1A, Q1B, Q1C, * * *, where R denotes results from the reference method samples; Q denotes results from the audit samples; 1, 2, 3 indicate the filter number; and A, B, C indicate the first, second, and third analysis of each filter, respectively.
(2) For the candidate method samples, analyze each sample filter or filter extract three times and calculate, in accordance with the candidate method, the indicated Pb concentration in µg/m3 for each analysis of each filter. Label these test results as C1A, C1B, C1C, * * *, where C denotes results from the candidate method. For candidate methods which provide a direct measurement of Pb concentrations without a separable procedure, C1A=C1B=C1C, C2A=C2B=C2C, etc.
(g) Average Pb concentration. For the reference method, calculate the average Pb concentration for each filter by averaging the concentrations calculated from the three analyses using equation 1 of this section:

Equation 1

    Ri ave = (RiA + RiB + RiC) / 3

where i is the filter number.
(h) Accuracy. (1)(i) For the audit samples, calculate the average Pb concentration for each strip by averaging the concentrations calculated from the three analyses using equation 2 of this section:

Equation 2

    Qi ave = (QiA + QiB + QiC) / 3

where i is the audit sample number.
(ii) Calculate the percent difference (Dq) between the indicated Pb concentration for each audit sample and the true Pb concentration (Tq) using equation 3 of this section:

Equation 3

    Dqi = [(Qi ave − Tqi) / Tqi] × 100%

(2) If any difference value (Dqi) exceeds ±5 percent, the accuracy of the reference method analytical procedure is out-of-control. Corrective action must be taken to determine the source of the error(s) (e.g., calibration standard discrepancies, extraction problems, etc.), and the reference method and audit sample determinations must be repeated according to paragraph (f) of this section, or the entire test procedure (starting with paragraph (d) of this section) must be repeated.
(i) Acceptable filter pairs. Disregard all filter pairs for which the Pb concentration, as determined in paragraph (g) of this section by the average of the three reference method determinations, falls outside the range of 0.5 to 4.0 µg/m3. All remaining filter pairs must be subjected to the tests for precision and comparability in paragraphs (j) and (k) of this section. At least five filter pairs must be within the 0.5 to 4.0 µg/m3 range for the tests to be valid.
(j) Test for precision. (1) Calculate the precision (P) of the analysis (in percent) for each filter and for each method, as the maximum minus the minimum divided by the average of the three concentration values, using equation 4 or equation 5 of this section:

Equation 4

    PRi = [(Ri max − Ri min) / Ri ave] × 100%

or

Equation 5

    PCi = [(Ci max − Ci min) / Ci ave] × 100%

where i indicates the filter number.
(2) If any reference method precision value (PRi) exceeds 15 percent, the precision of the reference method analytical procedure is out-of-control. Corrective action must be taken to determine the source(s) of imprecision, and the reference method determinations must be repeated according to paragraph (f) of this section, or the entire test procedure (starting with paragraph (d) of this section) must be repeated.
(3) If any candidate method precision value (PCi) exceeds 15 percent, the candidate method fails the precision test.
(4) The candidate method passes this test if all precision values (i.e., all PRi's and all PCi's) are less than 15 percent.
(k) Test for comparability. (1) For each
filter or analytical sample pair, calculate
all nine possible percent differences (D)
between the reference and candidate
methods, using all nine possible
combinations of the three
determinations (A, B, and C) for each
method using equation 6 of this section:
Equation 6
Din =
Cij − R ik
R ik
×100%
where, i is the filter number, and n
numbers from 1 to 9 for the nine
possible difference combinations for the
three determinations for each method (j
= A, B, C, candidate; k = A, B, C,
reference).
(2) If none of the percent differences
(D) exceeds ±20 percent, the candidate
method passes the test for
comparability.
(3) If one or more of the percent
differences (D) exceed ±20 percent, the
candidate method fails the test for
comparability.
(4) The candidate method must pass
both the precision test (paragraph (j) of
this section) and the comparability test
(paragraph (k) of this section) to qualify
for designation as an equivalent method.
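For illustration only (not part of the proposed rule text), the calculations in paragraphs (g) through (k) reduce to simple arithmetic on the triplicate analyses; the following sketch applies equations 1 through 6 and the 15 percent and ±20 percent limits (the triplicate values in the usage note are invented):

```python
def average(triplicate):
    """Equations 1 and 2: mean of the three analyses of one filter or strip."""
    a, b, c = triplicate
    return (a + b + c) / 3

def precision_pct(triplicate):
    """Equations 4 and 5: (max - min) / average, in percent."""
    return (max(triplicate) - min(triplicate)) / average(triplicate) * 100

def comparability_differences(candidate, reference):
    """Equation 6: all nine percent differences between the three candidate
    analyses (j = A, B, C) and the three reference analyses (k = A, B, C)."""
    return [(c - r) / r * 100 for c in candidate for r in reference]

def filter_pair_passes(candidate, reference):
    """Apply the 15 percent precision limit of paragraph (j) and the
    +/-20 percent comparability limit of paragraph (k) to one filter pair."""
    if precision_pct(candidate) >= 15 or precision_pct(reference) >= 15:
        return False
    return all(abs(d) <= 20
               for d in comparability_differences(candidate, reference))
```

For instance, candidate analyses (2.2, 2.0, 2.1) against reference analyses (2.0, 2.1, 1.9) µg/m3 pass both tests, while a uniform candidate reading of 3.0 against a uniform reference reading of 2.0 fails comparability (all nine differences are 50 percent).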
9. Section 53.34 is revised to read as
follows:
§ 53.34 Test procedure for methods for
PM10 and Class I methods for PM2.5.
(a) Comparability. Comparability is
shown for PM10 methods and for Class
I methods for PM2.5 when the
relationship between:
(1) Measurements made by a
candidate method, and
(2) Measurements made by a
corresponding reference method on
simultaneously collected samples (or
the same sample, if applicable) at each
of one or more test sites (as required) is
such that the linear regression
parameters (slope, intercept, and
correlation coefficient) describing the
relationship meet the requirements
specified in table C–4 of this subpart.
(b) Methods for PM10. Test
measurements must be made, or derived
from particulate samples collected, at
not less than two test sites, each of
which must be located in a geographical
area characterized by ambient
particulate matter that is significantly
different in nature and composition
from that at the other test site(s).
Augmentation of pollutant
concentrations is not permitted, hence
appropriate test sites must be selected to
provide the minimum number of test
PM10 concentrations in the ranges
specified in table C–4 of this subpart.
The tests at the two sites may be
conducted in different calendar seasons,
if appropriate, to provide PM10
concentrations in the specified ranges.
(c) PM10 methods employing the same
sampling procedure as the reference
method but a different analytical
method. Candidate methods for PM10
which employ a sampler and sample
collection procedure that are identical
to the sampler and sample collection
procedure specified in the reference
method, but use a different analytical
procedure, may be tested by analyzing
common samples. The common samples
shall be collected according to the
sample collection procedure specified
by the reference method and shall be
analyzed in accordance with the
analytical procedures of both the
candidate method and the reference
method.
(d) Methods for PM2.5. Augmentation
of pollutant concentrations is not
permitted, hence appropriate test sites
must be selected to provide the
minimum number of test measurement
sets to meet the requirements for PM2.5
concentrations in the ranges specified in
table C–4 of this subpart. Only one test
site is required, and the site need only
meet the PM2.5 ambient concentration
levels required by table C–4 of this
subpart. A total of 10 valid
measurement sets is required.
(e) Collocated measurements. (1) Set
up three reference method samplers
collocated with three candidate method
samplers or analyzers at each of the
number of test sites specified in table C–
4 of this subpart.
(2) The ambient air intake points of all
the candidate and reference method
collocated samplers or analyzers shall
be positioned at the same height above
the ground level, and between 2 meters
(1 meter for samplers or analyzers with
flow rates less than 200 L/min) and 4
meters apart. The samplers shall be
oriented in a manner that will minimize
spatial and wind directional effects on
sample collection.
(3) At each site, obtain as many sets
of simultaneous PM10 or PM2.5
measurements as necessary (see table C–
4 of this subpart), each set consisting of
three reference method and three
candidate method measurements, all
obtained simultaneously.
(4) Candidate PM10 method
measurements shall be nominal 24-hour
(±1 hour) integrated measurements or
shall be averaged to obtain the mean
concentration for a nominal 24-hour
period. PM2.5 measurements may be
either nominal 24- or 48-hour integrated
measurements. All collocated
measurements in a measurement set
must cover the same nominal 24- or 48-hour time period.
(5) For samplers, retrieve the samples
promptly after sample collection and
analyze each sample according to the
reference method or candidate method,
as appropriate, and determine the PM10
or PM2.5 concentration in µg/m3. If the
conditions of paragraph (c) of this
section apply, collect sample sets only
with the three reference method
samplers. Guidance for quality
assurance procedures for PM2.5 methods
is found in ‘‘Quality Assurance
Document 2.12’’ (reference (2) in
appendix A to this subpart).
(f) Sequential samplers. For
sequential samplers, the sampler shall
be configured for the maximum number
of sequential samples and shall be set
for automatic collection of all samples
sequentially such that the test samples
are collected equally, to the extent
possible, among all available sequential
channels or utilizing the full available
sequential capability.
(g) Calculation of reference method
averages and precisions. (1) For each of
the measurement sets, calculate the
average PM10 or PM2.5 concentration
obtained with the reference method
samplers, using equation 7 of this
section:
Equation 7

    R̄j = (R1,j + R2,j + R3,j) / 3

Where:
R = The concentration measurements from the reference method samplers;
i = The measurement number in the set; and
j = The measurement set number.
(2) For each of the measurement sets, calculate the precision, PRj, of the three reference method PM10 or PM2.5 measurements using equation 8 of this section:

Equation 8

    PRj = sqrt{ [ (R1,j² + R2,j² + R3,j²) − (R1,j + R2,j + R3,j)² / 3 ] / 2 }

(3) For each measurement set, also calculate the precision of the reference method PM10 or PM2.5 measurements as the relative standard deviation, RPRj, in percent, using equation 9 of this section:

Equation 9

    RPRj = (PRj / R̄j) × 100%
(h) Acceptability of measurement sets. Each measurement set is acceptable and valid only if the three reference method measurements and the three candidate method measurements are obtained and are valid, R̄j falls within the acceptable concentration range specified in table C–4 of this subpart, and either PRj or RPRj is within the corresponding limit for reference method precision specified in table C–4 of this subpart. For each site, table C–4 of this subpart specifies the minimum number of measurement sets required having R̄j above and below specified concentrations for 24- or 48-hour samples.
sets shall be obtained, as necessary, to
provide the minimum number of
acceptable measurement sets for each
category and the minimum total number
of acceptable measurement sets for each
test site. If more than the minimum
number of measurement sets are
collected that meet the acceptability
criteria, all such measurement sets shall
be used to demonstrate comparability.
(i) Candidate method average
concentration measurement. For each of
the acceptable measurement sets,
calculate the average PM10 or PM2.5
concentration measurements obtained
with the candidate method samplers,
using equation 10 of this section:
Equation 10

    C̄j = (C1,j + C2,j + C3,j) / 3
Where:
C = The concentration measurements
from the candidate methods;
i = The measurement number in the set;
and
j = The measurement set number.
(j) Test for comparability. (1) For each site, plot all of the average PM10 or PM2.5 measurements obtained with the candidate method (C̄j) against the corresponding average PM10 or PM2.5 measurements obtained with the reference method (R̄j). For each site, calculate and record the linear regression slope and intercept, and the correlation coefficient.
(2) To pass the test for comparability,
the slope, intercept, and correlation
coefficient calculated under paragraph
(j)(1) of this section must be within the
limits specified in table C–4 of this
subpart for all test sites.
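For illustration only (not part of the proposed rule text), the set statistics of paragraphs (g) through (j) can be sketched as follows; the table C–4 slope, intercept, and correlation limits are not reproduced here:

```python
import math

def set_average(values):
    """Equations 7 and 10: mean of the three collocated measurements."""
    return sum(values) / len(values)

def set_precision(values):
    """Equation 8: sample standard deviation of the three reference
    method measurements in one measurement set."""
    n = len(values)
    return math.sqrt((sum(v * v for v in values)
                      - sum(values) ** 2 / n) / (n - 1))

def relative_precision_pct(values):
    """Equation 9: relative standard deviation, in percent."""
    return set_precision(values) / set_average(values) * 100

def linear_regression(x, y):
    """Paragraph (j): slope, intercept, and correlation coefficient of the
    candidate set averages (y) regressed on the reference set averages (x)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    syy = sum((yi - my) ** 2 for yi in y)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    slope = sxy / sxx
    intercept = my - slope * mx
    r = sxy / math.sqrt(sxx * syy)
    return slope, intercept, r
```

For example, reference measurements of (10.0, 12.0, 11.0) µg/m3 in one set give R̄j = 11.0, PRj = 1.0, and RPRj ≈ 9.1 percent.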
10. Section 53.35 is added to read as
follows:
§ 53.35 Test procedure for Class II and
Class III methods for PM2.5 and PM10-2.5.
(a) Overview. Class II and Class III
candidate equivalent methods shall be
tested for comparability of PM2.5 or
PM10-2.5 measurements to corresponding
collocated PM2.5 or PM10-2.5 reference
method measurements at each of
multiple field sites, as required.
Comparability is shown for the
candidate method when simultaneous
collocated measurements made by
candidate and reference methods meet
the comparability requirements
specified in this section and in
table C–4 of this subpart at each of the
required test sites.
(b) Test sites and seasons. (1) Test
sites. Comparability testing is required
at each of the applicable test sites
required by this paragraph (b). Each test
site must also meet the general test site
requirements specified in § 53.30(b).
(i) PM2.5 Class II and Class III
candidate methods. Test sites should be
chosen to provide representative
chemical and meteorological
characteristics with respect to nitrates,
sulfates, organic compounds, and
various levels of humidity, wind, and
elevation. For Class III methods, one test
site shall be selected in each of the
following general locations. For Class II
methods, two test sites, one eastern site
and one western site, shall be selected
from these locations. Test site A shall be
in the Los Angeles basin area in a
location that is characterized by
relatively high PM2.5, nitrates, and
semi-volatile organic pollutants. Test site B
shall be in a northeastern or
mid-Atlantic U.S. city that is seasonally
characterized by high sulfate
concentrations, high relative humidity,
and wintertime conditions. Test site C
shall be in a western U.S. city such as
Denver, Salt Lake City, or Albuquerque
in a location that is in an area
characterized by cold weather, higher
elevation, winds, and dust.
(ii) PM10-2.5 Class II and Class III
candidate methods. Test sites shall be
chosen to provide modest to high levels
of PM10-2.5 representative of locations in
proximity to urban sources of PM10-2.5
such as high-density traffic on paved
roads, industrial sources, and
construction activities. For Class III
methods, one test site shall be selected
in each of the following general
locations. At least one of the test sites
shall have characteristic wintertime
temperatures of 0°C or lower. For Class
II methods, two test sites, one eastern
site and one western site, shall be
selected from these locations. Test site
A shall be in the Los Angeles basin or
the California Central Valley area. Test
site B shall be in a large U.S. city east
of the Mississippi River, having
characteristically high humidity levels.
Test site C shall be in a western U.S.
city characterized by a high ratio of
PM10-2.5 to PM2.5, with exposure to rural
windblown dust, such as Las Vegas or
Phoenix.
(2) Test seasons. (i) For PM2.5 and
PM10-2.5 Class III candidate methods, test
campaigns are required in both summer
and winter seasons at test sites A and B.
A test campaign is required only in the
winter season at test site C. (A total of
5 test campaigns is required.) The
summer season shall be defined as the
typically warmest 3 or 4 months of the
year at the site; the winter season shall
be defined as the typically coolest 3 or
4 months of the year at the site.
(ii) For Class II PM2.5 and PM10-2.5
candidate methods, only one test
campaign is required at each site, at any
time of year (total of 2 test campaigns).
(3) Test concentrations. The test sites
should be selected to provide ambient
concentrations within the concentration
limits specified in table C–4 of this
subpart, and also to provide a wide
range of test concentrations. A narrow
range of test concentrations may result
in a low concentration coefficient of
variation statistic for the test
measurements, making the test for
correlation coefficient more difficult to
pass (see paragraph (h) of this section,
test for comparison correlation).
R = The concentration measurements
from the reference methods;
i = The sampler number; and
j = The measurement set number.
(2) For each of the measurement sets,
calculate the precision of the reference
method PM10 or PM2.5 measurements as
the standard deviation, PRj, using
equation 8 of this section:

Equation 8: PRj = √{[Σ(i=1 to n) Ri,j² − (1/n) × (Σ(i=1 to n) Ri,j)²]/(n − 1)}
Federal Register / Vol. 71, No. 10 / Tuesday, January 17, 2006 / Proposed Rules
more representative conditions,
additional higher or lower
measurements, or to otherwise improve
the comparison of the methods. All
valid data sets obtained during each test
campaign shall be submitted and shall
be included in the analysis of the data.
(4) The integrated-sample reference
method measurements shall be of at
least 22 hours and not more than 25
hours duration. Each reference method
sample shall be retrieved promptly after
sample collection and analyzed
according to the reference method to
determine the PM2.5 or PM10-2.5
measured concentration in µg/m3.
Guidance and quality assurance
procedures applicable to PM2.5 or
PM10-2.5 reference methods are found in
‘‘Quality Assurance Document 2.12’’
(reference (2) in appendix A to this
subpart).
(5) Candidate method measurements
shall be timed or processed and
averaged as appropriate to determine an
equivalent mean concentration
representative of the same time period
as that of the concurrent integrated-sample reference method
measurements, such that all
measurements in a measurement set
shall be representative of the same time
period. In addition, hourly average
concentration measurements shall be
obtained from each of the Class III
candidate method analyzers for each
valid measurement set and submitted as
part of the application records.
(6) In the following tests, all
measurement sets obtained at a
particular test site, from both seasonal
campaigns if applicable, shall be
combined and included in the test data
analysis for the site. Data obtained at
different test sites shall be analyzed
separately. All measurements should be
reported as normally obtained, and no
measurement values should be rounded
or truncated prior to data analysis. In
particular, no negative measurement
value, if otherwise apparently valid,
should be modified, adjusted, replaced,
or eliminated merely because its value
is negative. Calculated mean
concentrations or calculated
intermediate quantities should retain at
least one order-of-magnitude greater
resolution than the input values. All
measurement data and calculations
shall be recorded and submitted in
accordance with § 53.30(g), including
hourly test measurements obtained from
Class III candidate methods.
(d) Calculation of mean
concentrations. (1) Reference method
outlier test. For each of the
measurement sets for each test site,
check each reference method
measurement to see if it might be an
anomalous value (outlier) as follows,
where Ri,j is the measurement of
reference method sampler i on test day
j. In the event that one of the reference
method measurements is missing or
invalid due to a specific, positively-identified physical cause (e.g., sampler
malfunction, operator error, accidental
damage to the filter, etc.; see paragraph
(c)(2) of this section), then substitute
zero for the missing measurement, for
the purposes of this outlier test only.
(i) Calculate the quantities 2 × R1,j/(R1,j + R2,j) and 2 × R1,j/(R1,j + R3,j). If
both quantities fall outside the interval (0.93, 1.07), then R1,j is an outlier.
(ii) Calculate the quantities 2 × R2,j/(R2,j + R1,j) and 2 × R2,j/(R2,j + R3,j). If
both quantities fall outside the interval (0.93, 1.07), then R2,j is an outlier.
(iii) Calculate the quantities 2 × R3,j/(R3,j + R1,j) and 2 × R3,j/(R3,j + R2,j). If
both quantities fall outside the interval (0.93, 1.07), then R3,j is an outlier.
(iv) If this test indicates that one of
the reference method measurements in
the measurement set is an outlier, the
outlier measurement shall be eliminated
from the measurement set, and the other
two measurements considered valid. If
the test indicates that more than one
reference method measurement in the
measurement set is an outlier, the entire
measurement set (both reference and
candidate method measurements) shall
be excluded from further data analysis
for the tests of this section.
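The paragraph (d)(1) outlier screen can be sketched as follows; the None-for-missing convention and the function name are illustrative, while the ratio test and the (0.93, 1.07) interval come from the rule text:

```python
def reference_outliers(measurements):
    """Flag outliers in one 3-sampler reference measurement set.

    `measurements` holds the three reference values R1,j..R3,j.  A missing
    or invalid value (see paragraph (c)(2)) is passed as None and replaced
    by zero for the purposes of this outlier test only, as the rule directs.
    A value is an outlier when BOTH ratios 2*Ri/(Ri + Rk), formed against
    the other two samplers, fall outside the interval (0.93, 1.07).
    """
    r = [0.0 if m is None else m for m in measurements]
    outliers = []
    for i in range(3):
        others = [r[k] for k in range(3) if k != i]
        ratios = [2 * r[i] / (r[i] + rk) for rk in others]
        if all(not (0.93 < q < 1.07) for q in ratios):
            outliers.append(i)
    return outliers  # indices of outlier measurements (0-based)
```

If the returned list has one index, that measurement is dropped and the other two kept; if it has more than one, the entire measurement set is excluded, per paragraph (d)(1)(iv).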
(2) For each of the measurement sets
for each test site, calculate the mean
concentration for the reference method
measurements, using equation 11 of this
section:
Equation 11
Rj =
1 n
∑ R i, j
n i =1
Where:
¯
Rj = The mean concentration measured
by the reference method for the
measurement set;
Ri,j = The measurement of reference
method sampler i on test day j; and
n =The number of valid reference
method measurements in the
measurement set (normally 3).
(3) Any measurement set for which R̄j
does not fall in the acceptable
concentration range specified in table
C–4 of this subpart is not valid, and the
entire measurement set (both reference
and candidate method measurements)
must be eliminated from further data
analysis.
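The equation 11 mean and the paragraph (d)(3) range screen can be sketched together; the function name and the None return convention are illustrative, and the range bounds come from table C–4 rather than being fixed here:

```python
def mean_if_acceptable(ref_values, lo, hi):
    """Equation 11 mean with the paragraph (d)(3) range screen.

    Returns the mean of the valid reference measurements, or None when
    the mean falls outside the acceptable concentration range [lo, hi]
    taken from table C-4 (the whole measurement set is then eliminated
    from further data analysis).
    """
    n = len(ref_values)
    r_bar = sum(ref_values) / n
    return r_bar if lo <= r_bar <= hi else None
```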
(4) Pre-approval of test sites. The EPA
recommends that the applicant seek
EPA approval of each proposed test site
prior to conducting test measurements
at the site. To do so, the applicant
should submit a request for approval as
described in § 53.30(b)(2).
(c) Collocated measurements. (1) For
each test campaign, three reference
method samplers and three candidate
method samplers or analyzers shall be
installed and operated concurrently at
each test site within each required
season (if applicable), as specified in
paragraph (b) of this section. All
reference method samplers shall be of
single-filter design (not multi-filter,
sequential sample design). Each
candidate method shall be set up and
operated in accordance with its
associated manual referred to in
§ 53.4(b)(3) and in accordance with
applicable guidance in ‘‘Quality
Assurance Document 2.12’’ (reference
(2) in appendix A to this subpart). All
samplers or analyzers shall be placed so
that they sample or measure air
representative of the surrounding area
(within one kilometer) and are not
unduly affected by adjacent buildings,
air handling equipment, industrial
operations, traffic, or other local
influences. The ambient air inlet points
of all samplers and analyzers shall be
positioned at the same height above the
ground level and between 2 meters (1
meter for instruments having sample
inlet flow rates less than 200 L/min) and
4 meters apart.
(2) A minimum of 23 valid and
acceptable measurement sets of PM2.5 or
PM10-2.5 24-hour (nominal) concurrent
concentration measurements shall be
obtained during each test campaign at
each test site. To be considered
acceptable for the test, each
measurement set shall consist of at least
two valid reference method
measurements and at least two valid
candidate method measurements, and
the PM2.5 or PM10-2.5 measured
concentration, as determined by the
average of the reference method
measurements, must fall within the
acceptable concentration range specified
in table C–4 of this subpart. Each
measurement set shall include all valid
measurements obtained. For each
measurement set containing fewer than
three reference method measurements
or fewer than three candidate method
measurements, an explanation and
appropriate justification shall be
provided to account for the missing
measurement or measurements.
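The paragraph (c)(2) acceptability screen for a measurement set can be sketched as a small predicate; representing invalid measurements as None is an illustrative convention, not rule text, and the concentration bounds are those of table C–4:

```python
def acceptable_set(ref_values, cand_values, lo, hi):
    """Paragraph (c)(2) screen for one 24-hour measurement set.

    A set is acceptable when it has at least two valid reference and two
    valid candidate measurements, and the average of the valid reference
    measurements falls in the acceptable range [lo, hi] from table C-4.
    Invalid measurements are represented here as None (illustrative).
    """
    ref = [v for v in ref_values if v is not None]
    cand = [v for v in cand_values if v is not None]
    if len(ref) < 2 or len(cand) < 2:
        return False
    avg = sum(ref) / len(ref)
    return lo <= avg <= hi
```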
(3) More than 23 valid measurement
sets may be obtained during a particular
test campaign to provide a more
advantageous range of concentrations,
(4) For each of the valid measurement
sets at each test site, calculate the mean
concentration for the candidate method
measurements, using equation 12 of this
section. (The outlier test in paragraph
(d)(1) of this section shall not be applied
to the candidate method measurements.)

Equation 12: C̄j = (1/m) × Σ(i=1 to m) Ci,j

Where:
C̄j = The mean concentration measured
by the candidate method for the
measurement set;
Ci,j = The measurement of candidate
method analyzer i on test day j; and
m = The number of valid candidate
method measurements in the
measurement set (normally 3).
(e) Test for reference method
precision. (1) For each of the
measurement sets for each site, calculate
an estimate for the relative precision of
the reference method measurements,
RPj, using equation 13 of this section:

Equation 13: RPj = (1/R̄j) × √{[Σ(i=1 to n) Ri,j² − (1/n) × (Σ(i=1 to n) Ri,j)²]/(n − 1)} × 100%

(2) For each site, calculate an estimate
of reference method relative precision
for the site, RP, using the root mean
square calculation of equation 14 of this
section:

Equation 14: RP = √[(1/J) × Σ(j=1 to J) (RPj)²]

where J is the total number of valid
measurement sets for the site.
(3) Verify that the estimate for
reference method relative precision for
the site, RP, is not greater than the value
specified for reference method precision
in table C–4 of this subpart. A reference
method relative precision greater than
the value specified in table C–4 of this
subpart indicates that quality control for
the reference method is inadequate, and
corrective measures must be
implemented before proceeding with
the test.
(f) Test for candidate method
precision. (1) For each of the
measurement sets, for each site,
calculate an estimate for the relative
precision of the candidate method
measurements, CPj, using equation 15 of
this section:

Equation 15: CPj = (1/C̄j) × √{[Σ(i=1 to m) Ci,j² − (1/m) × (Σ(i=1 to m) Ci,j)²]/(m − 1)} × 100%

(2) For each site, calculate an estimate
of candidate method relative precision
for the site, CP, using the root mean
square calculation of equation 16 of this
section:

Equation 16: CP = √[(1/J) × Σ(j=1 to J) (CPj)²]

where J is the total number of valid
measurement sets for the site.
(3) To pass the test for precision, the
mean candidate method relative
precision at each site must not be
greater than the value for candidate
method precision specified in table C–
4 of this subpart.
(g) Test for additive and
multiplicative bias (comparative slope
and intercept). (1) For each test site,
calculate the mean concentration
measured by the reference method, R̄,
using equation 17 of this section:

Equation 17: R̄ = (1/J) × Σ(j=1 to J) R̄j

(2) For each test site, calculate the
mean concentration measured by the
candidate method, C̄, using equation 18
of this section:

Equation 18: C̄ = (1/J) × Σ(j=1 to J) C̄j

(3) For each test site, calculate the
linear regression slope and intercept of
the mean candidate method
measurements (C̄j) against the mean
reference method measurements (R̄j),
using equations 19 and 20 of this
section, respectively:

Equation 19: Slope = [Σ(j=1 to J) (R̄j − R̄)(C̄j − C̄)]/[Σ(j=1 to J) (R̄j − R̄)²]

Equation 20: Intercept = C̄ − slope × R̄

(4) To pass this test, at each test site:
(i) The slope must be in the interval
specified for regression slope in table C–
4 of this subpart; and
(ii) The intercept must be in the
interval specified for regression
intercept in table C–4 of this subpart.
(iii) The slope and intercept limits are
illustrated in figures C–2 and C–3 of this
subpart.
(h) Tests for comparison correlation.
(1) For each test site, calculate the
(Pearson) correlation coefficient, r (not
the coefficient of determination, r²),
using equation 21 of this section:

Equation 21: r = [Σ(j=1 to J) (R̄j − R̄)(C̄j − C̄)]/√{[Σ(j=1 to J) (R̄j − R̄)²] × [Σ(j=1 to J) (C̄j − C̄)²]}

(2) For each test site, calculate the
concentration coefficient of variation,
CCV, using equation 22 of this section:

Equation 22: CCV = (1/R̄) × √{[Σ(j=1 to J) (R̄j − R̄)²]/(J − 1)}

(3) To pass the test, the correlation
coefficient, r, for each test site must not
be less than the values, for various
values of CCV, specified for correlation
in table C–4 of this subpart. These limits
are illustrated in figure C–4 of this
subpart.
11. Tables C–1, C–2, C–3, and C–4 to
subpart C are revised to read as follows:
TABLE C–1 TO SUBPART C OF PART 53.—TEST CONCENTRATION RANGES, NUMBER OF MEASUREMENTS REQUIRED, AND
MAXIMUM DISCREPANCY SPECIFICATION

[For each pollutant and concentration range: simultaneous measurements required (first set/second set) and maximum discrepancy specification, parts per million]

Ozone:
  Low, 0.06 to 0.10: 5/5; maximum discrepancy 0.02
  Med, 0.15 to 0.25: 5/5; maximum discrepancy .03
  High, 0.35 to 0.45: 4/4; maximum discrepancy .04
  Total: 14/14
Carbon monoxide:
  Low, 7 to 11: 6/6; maximum discrepancy 1.5
  Med, 20 to 30: 6/6; maximum discrepancy 2.0
  High, 35 to 45: 6/6; maximum discrepancy 3.0
  Total: 18/18
Sulfur dioxide:
  Low, 0.02 to 0.05: 3/3; maximum discrepancy 0.02
  Med, 0.10 to 0.15: 2/2; maximum discrepancy .03
  High, 0.30 to 0.50: 2/2; maximum discrepancy .04
  Total: 7/7
Nitrogen dioxide:
  Low, 0.02 to 0.08: 3/3; maximum discrepancy 0.02
  Med, 0.10 to 0.20: 3/3; maximum discrepancy .03
  High, 0.25 to 0.35: 2/2; maximum discrepancy .03
  Total: 8/8

TABLE C–2 TO SUBPART C OF PART 53.—SEQUENCE OF TEST MEASUREMENTS

[Measurement number: concentration range, first set/second set]

1: Low/Medium
2: High/High
3: Medium/Low
4: High/High
5: Low/Medium
6: Medium/Low
7: Low/Medium
8: Medium/Low
9: High/High
10: Medium/Low
11: High/Medium
12: Low/High
13: Medium/Medium
14: Low/High
Measurements 15 through 18 (second set only): Low, Medium, Low, High.

TABLE C–3 TO SUBPART C OF PART 53.—TEST SPECIFICATIONS FOR PB METHODS

Concentration range, µg/m3: 0.5–4.0
Minimum number of 24-hr measurements: 5
Maximum analytical precision, percent: 15
Maximum analytical accuracy, percent: ±5
Maximum difference, percent of reference method: ±20
TABLE C–4 TO SUBPART C.—TEST SPECIFICATIONS FOR PM10, PM2.5 AND PM10-2.5 CANDIDATE EQUIVALENT METHODS

[Values are listed in the column order: PM10; PM2.5 Class I; PM2.5 Class II; PM2.5 Class III; PM10-2.5 Class II; PM10-2.5 Class III]

Acceptable concentration range (Rj), µg/m3: 15–300; 3–200; 3–200; 3–200; 3–200; 3–200.
Minimum number of test sites: 2; 1; 2; 3; 2; 3.
Minimum number of candidate method samplers or analyzers per site: 3; 3; 3 1; 3 1; 3 1; 3 1.
Number of reference method samplers per site: 3; 3; 3 1; 3 1; 3 1; 3 1.
Minimum number of acceptable sample sets per site for PM10 methods (PM10 column only): Rj < 60 µg/m3: 3; Rj > 60 µg/m3: 3; total: 10.
Minimum number of acceptable sample sets per site for PM2.5 and PM10-2.5 candidate equivalent methods:
  Rj < 30 µg/m3 for 24-hr or Rj < 20 µg/m3 for 48-hr samples: 3 (PM2.5 Class I).
  Rj > 30 µg/m3 for 24-hr or Rj > 20 µg/m3 for 48-hr samples: 3 (PM2.5 Class I).
  Each season: 23 (Class III methods).
  Total, each site: 10 (PM2.5 Class I); 23 (Class II); 46 (23 for single season site) (Class III).
Precision of replicate reference method measurements, PRj or RPRj, respectively; RP for Class II or III PM2.5 or PM10-2.5, maximum: 5 µg/m3 or 7%; 2 µg/m3 or 5%; 10% 2; 10% 2; 10% 2; 10% 2.
Precision of PM2.5 or PM10-2.5 candidate method, CP, each site: 10% 2 (PM2.5 Class II and III); 15% 2 (PM10-2.5 Class II and III).
Slope of regression relationship: 1±0.1; 1±0.05; 1±0.10; 1±0.10; 1±0.10; 1±0.12.
Intercept of regression relationship, µg/m3: 0±5; 0±1; PM2.5 Class II: between 13.55 - (15.05 × slope), but not less than -1.5, and 16.56 - (15.05 × slope), but not more than +1.5; PM2.5 Class III: between 15.05 - (17.32 × slope) and 15.05 - (13.20 × slope); PM10-2.5 Class II: between 59.93 - (70.50 × slope), but not less than -7.0, and 81.08 - (70.50 × slope), but not more than +7.0; PM10-2.5 Class III: between 70.50 - (82.93 × slope) and 70.50 - (61.16 × slope).
Correlation of reference method and candidate method measurements: ≥0.97 (PM10 and PM2.5 Class I); for Class II and Class III methods: ≥0.93 for CCV ≤ 0.4; ≥0.85 + 0.2 × CCV for 0.4 ≤ CCV ≤ 0.5; ≥0.95 for CCV ≥ 0.5.

1 Some missing daily measurement values may be permitted; see test procedure.
2 Calculated as the root mean square over all measurement sets.
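For illustration, the per-site statistics of § 53.35 paragraphs (e) through (h) and the CCV-dependent correlation limit can be sketched in Python. The function names are illustrative; the rule's equations and table C–4 govern:

```python
# Illustrative sketches of the Sec. 53.35 site statistics (equations 13-16
# and 22) and the CCV-dependent minimum correlation from table C-4.

def relative_precision(values):
    """CV of one measurement set, in percent (form of equations 13 and 15)."""
    n = len(values)
    mean = sum(values) / n
    var = (sum(v * v for v in values) - (sum(values) ** 2) / n) / (n - 1)
    return 100.0 * var ** 0.5 / mean

def rms(per_set_precisions):
    """Site-level precision as a root mean square (equations 14 and 16)."""
    j = len(per_set_precisions)
    return (sum(p * p for p in per_set_precisions) / j) ** 0.5

def ccv(ref_set_means):
    """Concentration coefficient of variation (equation 22)."""
    j = len(ref_set_means)
    grand = sum(ref_set_means) / j
    return (sum((r - grand) ** 2 for r in ref_set_means)
            / (j - 1)) ** 0.5 / grand

def min_correlation(c):
    """Minimum acceptable r per table C-4 for Class II/III methods:
    0.93 for CCV <= 0.4; 0.85 + 0.2*CCV for 0.4 <= CCV <= 0.5;
    0.95 for CCV >= 0.5.  The branches agree at the boundaries."""
    if c <= 0.4:
        return 0.93
    if c <= 0.5:
        return 0.85 + 0.2 * c
    return 0.95
```

A narrow spread of test concentrations drives `ccv` down and thus, below CCV of 0.5, raises nothing, but it makes the fixed 0.93 floor harder to meet with noisy data, which is why paragraph (b)(3) recommends a wide range of test concentrations.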
12. Figure C–1 to subpart C is revised
to read as follows:
BILLING CODE 6560–50–U
[Figure C–1 to Subpart C of Part 53]
13. Figures C–2, C–3, and C–4 are
added to subpart C to read as follows:
[Figures C–2, C–3, and C–4 to Subpart C of Part 53]
BILLING CODE 6560–50–C
14. Appendix A to subpart C is
amended by adding reference (2) to read
as follows:
Appendix A to Subpart C—References
* * * * *
(2) Quality Assurance Guidance
Document 2.12. Monitoring PM2.5 in
Ambient Air Using Designated
Reference or Class I Equivalent
Methods. U.S. EPA, National Exposure
Research Laboratory, Research Triangle
Park, NC, November 1998 or later
edition. Currently available at https://www.epa.gov/ttn/amtic/pmqainf.html.
Subpart E—Procedures for Testing
Physical (Design) and Performance
Characteristics of Reference Methods
and Class I and Class II Equivalent
Methods for PM2.5 or PM10-2.5
15. The heading for subpart E is
revised as set out above.
16. Section 53.50 is revised to read as
follows:
§ 53.50 General provisions.
(a) A candidate method for PM2.5 or
PM10-2.5 described in an application for
a reference or equivalent method
determination submitted under § 53.4
shall be determined by the EPA to be a
reference method or a Class I, II, or III
equivalent method on the basis of the
definitions for such methods given in
§ 53.1. This subpart sets forth the
specific tests that must be carried out
and the test results, evidence,
documentation, and other materials that
must be provided to EPA to demonstrate
that a PM2.5 or PM10-2.5 sampler
associated with a candidate reference
method or Class I or Class II equivalent
method meets all design and
performance specifications set forth in
appendix L or O, respectively, of part 50
of this chapter as well as additional
requirements specified in this subpart E.
Some or all of these tests may also be
applicable to a candidate Class III
equivalent method or analyzer, as may
be determined under § 53.3(b)(3).
(b) PM2.5 methods. (1) Reference
method. A sampler associated with a
candidate reference method for PM2.5
shall be subject to the provisions,
specifications, and test procedures
prescribed in §§ 53.51 through 53.58.
(2) Class I method. A sampler
associated with a candidate Class I
equivalent method for PM2.5 shall be
subject to the provisions, specifications,
and test procedures prescribed in all
sections of this subpart.
(3) Class II method. A sampler
associated with a candidate Class II
equivalent method for PM2.5 shall be
subject to the provisions, specifications,
and test procedures prescribed in all
applicable sections of this subpart, as
specified in subpart F of this part or as
specified in § 53.3(a)(3).
(c) PM10-2.5 methods. (1) Reference
method. A sampler associated with a
reference method for PM10-2.5, as
specified in appendix O to part 50 of
this chapter, shall be subject to the
requirements in this paragraph (c)(1).
(i) The PM2.5 sampler of the PM10-2.5
sampler pair shall be verified to be
either currently designated under this
part 53 as a reference method for PM2.5,
or shown to meet all requirements for
designation as a reference method for
PM2.5, in accordance with this part 53.
(ii) The PM10c sampler of the PM10-2.5
sampler pair shall be verified to be of
like manufacturer, design,
configuration, and fabrication to the
PM2.5 sampler of the PM10-2.5 sampler
pair, except for replacement of the
particle size separator specified in
section 7.3.4 of appendix L to part 50 of
this chapter with the downtube
extension as specified in Figure O–1 of
appendix O to part 50 of this chapter.
(iii) For samplers that meet the
provisions of paragraphs (c)(1)(i) and (ii)
of this section, the candidate PM10-2.5
reference method may be determined to
be a reference method without further
testing.
(2) Class I method. A sampler
associated with a Class I candidate
equivalent method for PM10-2.5 shall
meet the requirements in this paragraph
(c)(2).
(i) The PM2.5 sampler of the PM10-2.5
sampler pair shall be verified to be
either currently designated under this
part 53 as a reference method or Class
I equivalent method for PM2.5, or shown
to meet all requirements for designation
as a reference method or Class I
equivalent method for PM2.5, in
accordance with this part 53.
(ii) The PM10c sampler of the PM10-2.5
sampler pair shall be verified to be of
similar design to the PM10-2.5 sampler
and to meet all requirements for
designation as a reference method or
Class I equivalent method for PM2.5, in
accordance with this part 53, except for
replacement of the particle size
separator specified in section 7.3.4 of
appendix L to part 50 of this chapter
with the downtube extension as
specified in Figure O–1 of appendix O
to part 50 of this chapter.
(iii) For samplers that meet the
provisions of paragraphs (c)(2)(i) and (ii)
of this section, the candidate PM10-2.5
method may be determined to be a Class
I equivalent method without further
testing.
(3) Class II method. A sampler
associated with a Class II candidate
equivalent method for PM10-2.5 shall be
subject to the applicable requirements of
this subpart E, as described in
§ 53.3(a)(5).
(d) The provisions of § 53.51 pertain
to test results and documentation
required to demonstrate compliance of a
candidate method sampler with the
design specifications set forth in 40 CFR
part 50, appendix L or O, as applicable.
The test procedures prescribed in
§§ 53.52 through 53.59 pertain to
performance tests required to
demonstrate compliance of a candidate
method sampler with the performance
specifications set forth in 40 CFR part
50, appendix L or O, as applicable, as
well as additional requirements
specified in this subpart E. These latter
test procedures shall be used to test the
performance of candidate samplers
against the performance specifications
and requirements specified in each
procedure and summarized in table E–1 of this subpart.
(e) Test procedures prescribed in
§ 53.59 do not apply to candidate
reference method samplers. These
procedures apply primarily to candidate
Class I or Class II equivalent method
samplers for PM2.5 or PM10-2.5 that have
a sample air flow path configuration
upstream of the sample filter that is
modified from that specified for the
reference method sampler, as set forth
in 40 CFR part 50, appendix L, Figures
L–1 to L–29 or 40 CFR part 50 appendix
O, Figure O–1, if applicable, such as
might be necessary to provide for
sequential sample capability. The
additional tests determine the adequacy
of aerosol transport through any altered
components or supplemental devices
that are used in a candidate sampler
upstream of the filter. In addition to the
other test procedures in this subpart,
these test procedures shall be used to
further test the performance of such an
equivalent method sampler against the
performance specifications given in the
procedure and summarized in table E–1 of this subpart.
(f) A 10-day operational field test of
measurement precision is required
under § 53.58 for both reference and
Class I equivalent method samplers for
PM2.5. This test requires collocated
operation of 3 candidate method
samplers at a field test site. For
candidate equivalent method samplers,
this test may be combined and carried
out concurrently with the test for
comparability to the reference method
specified under § 53.34, which requires
collocated operation of three reference
method samplers and three candidate
equivalent method samplers.
(g) All tests and collection of test data
shall be performed in accordance with
the requirements of reference 1, section
4.10.5 (ISO 9001) and reference 2, part
B, section 3.3.1, paragraphs 1 and 2 and
Part C, section 4.6 (ANSI/ASQC E4) in
appendix A of this subpart. All test data
and other documentation obtained
specifically from or pertinent to these
tests shall be identified, dated, signed
by the analyst performing the test, and
submitted to EPA in accordance with
subpart A of this part.
17. Section 53.51 is revised to read as
follows:
§ 53.51 Demonstration of compliance with
design specifications and manufacturing
and test requirements.
(a) Overview. (1) The subsequent
paragraphs of this section specify
certain documentation that must be
submitted and tests that are required to
demonstrate that samplers associated
with a designated reference or
equivalent method for PM2.5 or PM10-2.5
are properly manufactured to meet all
applicable design and performance
specifications and have been properly
tested according to all applicable test
requirements for such designation.
Documentation is required to show that
instruments and components of a PM2.5
or PM10-2.5 sampler are manufactured in
an ISO 9001-registered facility under a
quality system that meets ISO 9001
requirements for manufacturing quality
control and testing.
(2) In addition, specific tests are
required by paragraph (d) of this section
to verify that critical features of
reference method samplers—the particle
size separator and the surface finish of
surfaces specified to be anodized—meet
the specifications of 40 CFR part 50,
appendix L or appendix O, as
applicable. A checklist is required to
provide certification by an ISO-certified
auditor that all performance and other
required tests have been properly and
appropriately conducted, based on a
reasonable and appropriate sample of
the actual operations or their
documented records. Following
designation of the method, another
checklist is required initially to provide
an ISO-certified auditor’s certification
that the sampler manufacturing process
is being implemented under an
adequate and appropriate quality
system.
(3) For the purposes of this section,
the definitions of ISO 9001-registered
facility and ISO-certified auditor are
found in § 53.1. An exception to the
reliance by EPA on ISO-certified
auditors is the requirement for the
submission of the operation or
instruction manual associated with the
candidate method to EPA as part of the
application. This manual is required
under § 53.4(b)(3). The EPA has
determined that acceptable technical
judgment for review of this manual may
not be assured by ISO-certified auditors,
and approval of this manual will
therefore be performed by EPA.
(b) ISO registration of manufacturing
facility. The applicant must submit
documentation verifying that the
samplers identified and sold as part of
a designated PM2.5 or PM10-2.5 reference
or equivalent method will be
manufactured in an ISO 9001-registered
facility and that the manufacturing
facility is maintained in compliance
with all applicable ISO 9001
requirements (reference 1 in appendix A
of this subpart). The documentation
shall indicate the date of the original
ISO 9001 registration for the facility and
shall include a copy of the most recent
certification of continued ISO 9001
facility registration. If the manufacturer
Federal Register / Vol. 71, No. 10 / Tuesday, January 17, 2006 / Proposed Rules
does not wish to initiate or complete
ISO 9001 registration for the
manufacturing facility, documentation
must be included in the application to
EPA describing an alternative method to
demonstrate that the facility meets the
same general requirements as required
for registration to ISO 9001. In this
case, the applicant must provide
documentation in the application to
demonstrate, by required ISO-certified
auditor’s inspections, that a quality
system is in place which is adequate to
document and monitor that the sampler
system components and final assembled
samplers all conform to the design,
performance and other requirements
specified in this part and in 40 CFR part
50, appendix L.
(c) Sampler manufacturing quality
control. The manufacturer must ensure
that all components used in the
manufacture of PM2.5 or PM10-2.5
samplers to be sold as part of a reference
or equivalent method and that are
specified by design in 40 CFR part 50,
appendix L or O (as applicable), are
fabricated or manufactured exactly as
specified. If the manufacturer’s quality
records show that its quality control
(QC) and quality assurance (QA) system
of standard process control inspections
(of a set number and frequency of
testing that is less than 100 percent)
complies with the applicable QA
provisions of section 4 of reference 4 in
appendix A of this subpart and prevents
nonconformances, 100 percent testing
shall not be required until that
conclusion is disproved by customer
return or other independent
manufacturer or customer test records. If
problems are uncovered, inspection to
verify conformance to the drawings,
specifications, and tolerances shall be
performed. Refer also to paragraph (e) of
this section, Final assembly and
inspection requirements.
(d) Specific tests and supporting
documentation required to verify
conformance to critical component
specifications. (1) Verification of PM2.5
(WINS) impactor jet diameter. For
samplers utilizing the WINS impactor
particle size separator specified in
paragraphs 7.3.4.1, 7.3.4.2, and 7.3.4.3
of appendix L to part 50 of this chapter,
the diameter of the jet of each impactor
manufactured for a PM2.5 or PM10-2.5
sampler under the impactor design
specifications set forth in 40 CFR part
50, appendix L, shall be verified against
the tolerance specified on the drawing,
using standard, NIST-traceable ZZ
go/no-go plug gages. This test shall be a final
check of the jet diameter following all
fabrication operations, and a record
shall be kept of this final check. The
manufacturer shall submit evidence that
this procedure is incorporated into the
manufacturing procedure, that the test is
or will be routinely implemented, and
that an appropriate procedure is in
place for the disposition of units that
fail this tolerance test.
(2) VSCC separator. For samplers
utilizing the BGI VSCCTM Very Sharp
Cut Cyclone particle size separator
specified in paragraph 7.3.4.4 of
appendix L to part 50 of this chapter,
the VSCC manufacturer shall identify
the critical dimensions and
manufacturing tolerances for the device,
develop appropriate test procedures to
verify that the critical dimensions and
tolerances are maintained during the
manufacturing process, and carry out
those procedures on each VSCC
manufactured to verify conformance of
the manufactured products. The
manufacturer shall also maintain
records of these tests and their results
and submit evidence that this procedure
is incorporated into the manufacturing
procedure, that the test is or will be
routinely implemented, and that an
appropriate procedure is in place for the
disposition of units that fail this
tolerance test.
(3) Verification of surface finish. The
anodization process used to treat
surfaces specified to be anodized shall
be verified by testing treated specimen
surfaces for weight and corrosion
resistance to ensure that the coating
obtained conforms to the coating
specification. The specimen surfaces
shall be finished in accordance with
military standard specification 8625F,
Type II, Class I (reference 4 in appendix
A of this subpart) in the same way the
sampler surfaces are finished, and
tested, prior to sealing, as specified in
section 4.5.2 of reference 4 in appendix
A of this subpart.
(e) Final assembly and inspection
requirements. Each sampler shall be
tested after manufacture and before
delivery to the final user. Each
manufacturer shall document its post-manufacturing test procedures. As a
minimum, each test shall consist of the
following: Tests of the overall integrity
of the sampler, including leak tests;
calibration or verification of the
calibration of the flow measurement
device, barometric pressure sensor, and
temperature sensors; and operation of
the sampler with a filter in place over
a period of at least 48 hours. The results
of each test shall be suitably
documented and shall be subject to
review by an ISO-certified auditor.
(f) Manufacturer’s audit checklists.
Manufacturers shall require an ISO-certified auditor to sign and date a
statement indicating that the auditor is
aware of the appropriate manufacturing
specifications contained in 40 CFR part
50, appendix L or O (as applicable), and
the test or verification requirements in
this subpart. Manufacturers shall also
require an ISO-certified auditor to
complete the checklists, shown in
figures E–1 and E–2 of this subpart,
which describe the manufacturer’s
ability to meet the requirements of the
standard for both designation testing
and product manufacture.
(1) Designation testing checklist. The
completed statement and checklist as
shown in figure E–1 of this subpart shall
be submitted with the application for
reference or equivalent method
determination.
(2) Product manufacturing checklist.
Manufacturers shall require an ISO-certified auditor to complete a Product
Manufacturing Checklist (figure E–2 of
this subpart), which evaluates the
manufacturer on its ability to meet the
requirements of the standard in
maintaining quality control in the
production of reference or equivalent
devices. The completed checklist shall
be submitted with the application for
reference or equivalent method
determination.
18. Section 53.52 is amended by
revising paragraph (e)(1) to read as
follows:
§ 53.52 Leak check test.
* * * * *
(e) Test setup. (1) The test sampler
shall be set up for testing as described
in the sampler’s operation or instruction
manual referred to in § 53.4(b)(3). The
sampler shall be installed upright and
set up in its normal configuration for
collecting PM samples, except that the
sample air inlet shall be removed and
the flow rate measurement adaptor shall
be installed on the sampler’s downtube.
* * * * *
19. Section 53.53 is amended by
revising paragraph (e)(1) to read as
follows:
§ 53.53 Test for flow rate accuracy,
regulation, measurement accuracy, and cut-off.
* * * * *
(e) Test setup. (1) Setup of the
sampler shall be as required in this
paragraph (e) and otherwise as
described in the sampler’s operation or
instruction manual referred to in
§ 53.4(b)(3). The sampler shall be
installed upright and set up in its
normal configuration for collecting PM
samples. A sample filter and (or) the
device for creating an additional 55 mm
Hg pressure drop shall be installed for
the duration of these tests. The
sampler’s ambient temperature, ambient
pressure, and flow rate measurement
systems shall all be calibrated per the
sampler's operation or instruction
manual within 7 days prior to this test.
* * * * *
20. Section 53.54 is amended by
revising paragraph (d)(1) to read as
follows:
§ 53.54 Test for proper sampler operation
following power interruptions.
* * * * *
(d) Test setup. (1) Setup of the
sampler shall be performed as required
in this paragraph (d) and otherwise as
described in the sampler’s operation or
instruction manual referred to in
§ 53.4(b)(3). The sampler shall be
installed upright and set up in its
normal configuration for collecting PM
samples. A sample filter and (or) the
device for creating an additional 55 mm
Hg pressure drop shall be installed for
the duration of these tests. The
sampler’s ambient temperature, ambient
pressure, and flow measurement
systems shall all be calibrated per the
sampler’s operating manual within 7
days prior to this test.
* * * * *
21. Section 53.55 is amended as
follows:
a. By revising paragraphs (a)(1)
introductory text and (a)(2).
b. By revising paragraph (e)(1).
c. By revising paragraph (g)(5)(i).
§ 53.55 Test for effect of variations in
power line voltage and ambient
temperature.
(a) Overview. (1) This test procedure
is a combined procedure to test various
performance parameters under
variations in power line voltage and
ambient temperature. Tests shall be
conducted in a temperature controlled
environment over four 6-hour time
periods during which reference
temperature and flow rate
measurements shall be made at intervals
not to exceed 5 minutes. Specific
parameters to be evaluated at line
voltages of 105 and 125 volts and
temperatures of −20 °C and +40 °C are
as follows:
* * * * *
(2) The performance parameters tested
under this procedure, the corresponding
minimum performance specifications,
and the applicable test conditions are
summarized in table E–1 of this subpart.
Each performance parameter tested, as
described or determined in the test
procedure, must meet or exceed the
associated performance specification
given. The candidate sampler must meet
all specifications for the associated
PM2.5 or PM10-2.5 method (as applicable)
to pass this test procedure.
* * * * *
(e) * * * (1) Setup of the sampler
shall be performed as required in this
paragraph (e) and otherwise as
described in the sampler’s operation or
instruction manual referred to in
§ 53.4(b)(3). The sampler shall be
installed upright and set up in the
temperature-controlled chamber in its
normal configuration for collecting PM
samples. A sample filter and (or) the
device for creating an additional 55 mm
Hg pressure drop shall be installed for
the duration of these tests. The
sampler’s ambient temperature, ambient
pressure, and flow measurement
systems shall all be calibrated per the
sampler’s operating manual within 7
days prior to this test.
* * * * *
(g) * * *
(5) * * * (i) Calculate the absolute
value of the difference between the
mean ambient air temperature indicated
by the test sampler and the mean
ambient (chamber) air temperature
measured with the ambient air
temperature recorder as:
Equation 16
Tdiff = Tind,ave − Tref,ave
Where:
Tind,ave = mean ambient air temperature
indicated by the test sampler, °C; and
Tref,ave = mean ambient air temperature
measured by the reference
temperature instrument, °C.
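For illustration only (this sketch is not part of the proposed rule text), the calculation in equation 16, checked against the 2 °C temperature measurement accuracy specification of table E–1 of this subpart, can be written as follows; the function name and the example readings are hypothetical:

```python
# Illustrative sketch of equation 16: mean indicated ambient temperature
# minus mean reference (chamber) temperature, compared with the 2 deg C
# accuracy specification in table E-1. Example readings are hypothetical.

def temperature_difference(indicated_degC, reference_degC):
    """Return Tdiff = Tind,ave - Tref,ave for paired temperature readings."""
    t_ind_ave = sum(indicated_degC) / len(indicated_degC)
    t_ref_ave = sum(reference_degC) / len(reference_degC)
    return t_ind_ave - t_ref_ave

# Hypothetical 5-minute readings from part of a 6-hour test period:
indicated = [21.4, 21.6, 21.5, 21.7]
reference = [21.0, 21.1, 21.0, 21.2]
t_diff = temperature_difference(indicated, reference)
print(abs(t_diff) <= 2.0)  # True: within the 2 deg C specification
```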
* * * * *
22. Section 53.56 is amended by
revising paragraphs (a)(2) and (e)(1) to
read as follows:
§ 53.56 Test for effect of variations in
ambient pressure.
(a) * * *
(2) The performance parameters tested
under this procedure, the corresponding
minimum performance specifications,
and the applicable test conditions are
summarized in table E–1 of this subpart.
Each performance parameter tested, as
described or determined in the test
procedure, must meet or exceed the
associated performance specification
given. The candidate sampler must meet
all specifications for the associated
PM2.5 or PM10-2.5 method (as applicable)
to pass this test procedure.
* * * * *
(e) * * * (1) Setup of the sampler
shall be performed as required in this
paragraph (e) and otherwise as
described in the sampler’s operation or
instruction manual referred to in
§ 53.4(b)(3). The sampler shall be
installed upright and set up in the
pressure-controlled chamber in its
normal configuration for collecting PM
samples. A sample filter and (or) the
device for creating an additional 55 mm
Hg pressure drop shall be installed for
the duration of these tests. The
sampler’s ambient temperature, ambient
pressure, and flow measurement
systems shall all be calibrated per the
sampler’s operating manual within 7
days prior to this test.
* * * * *
23. Section 53.57 is amended by
revising paragraphs (a), (b), and (e)(1) to
read as follows:
§ 53.57 Test for filter temperature control
during sampling and post-sampling
periods.
(a) Overview. This test is intended to
measure the candidate sampler’s ability
to prevent excessive overheating of the
PM sample collection filter (or filters)
under conditions of elevated solar
insolation. The test evaluates radiative
effects on filter temperature during a
4-hour period of active sampling as well
as during a subsequent 4-hour
non-sampling time period prior to filter
retrieval. Tests shall be conducted in an
environmental chamber which provides
the proper radiant wavelengths and
energies to adequately simulate the
sun’s radiant effects under clear
conditions at sea level. For additional
guidance on conducting solar radiative
tests under controlled conditions,
consult military standard specification
810–E (reference 6 in appendix A of this
subpart). The performance parameters
tested under this procedure, the
corresponding minimum performance
specifications, and the applicable test
conditions are summarized in table E–
1 of this subpart. Each performance
parameter tested, as described or
determined in the test procedure, must
meet or exceed the associated
performance specification to
successfully pass this test.
(b) Technical definition. Filter
temperature control during sampling is
the ability of a sampler to maintain the
temperature of the particulate matter
sample filter within the specified
deviation (5 °C) from ambient
temperature during any active sampling
period. Post-sampling temperature
control is the ability of a sampler to
maintain the temperature of the
particulate matter sample filter within
the specified deviation from ambient
temperature during the period from the
end of active sample collection by the
sampler until the filter is retrieved from
the sampler for laboratory analysis.
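For illustration only (not part of the proposed rule text), the filter temperature control criterion in paragraph (b) and table E–1 (filter temperature not more than 5 °C above ambient for more than 30 minutes) can be sketched as a check over logged readings; the reading interval, function name, and data are hypothetical assumptions:

```python
# Illustrative check of the filter temperature control criterion: the
# filter temperature may not exceed ambient by more than 5 deg C for
# more than 30 consecutive minutes. The fixed reading interval and the
# example data are hypothetical, not specified by this rule.

def longest_excess_minutes(filter_degC, ambient_degC, interval_min):
    """Longest continuous run, in minutes, with filter > ambient + 5 deg C."""
    longest = run = 0
    for tf, ta in zip(filter_degC, ambient_degC):
        run = run + interval_min if tf - ta > 5.0 else 0
        longest = max(longest, run)
    return longest

# Hypothetical 10-minute readings during a 4-hour solar radiation test:
filter_t  = [30.1, 34.2, 36.0, 36.4, 33.9, 30.5]
ambient_t = [30.0, 30.0, 30.1, 30.2, 30.1, 30.0]
passes = longest_excess_minutes(filter_t, ambient_t, 10) <= 30
print(passes)
```

Here the excess lasts two consecutive readings (20 minutes), so the criterion is met.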
* * * * *
(e) * * * (1) Setup of the sampler
shall be performed as required in this
paragraph (e) and otherwise as
described in the sampler’s operation or
instruction manual referred to in
§ 53.4(b)(3). The sampler shall be
installed upright and set up in the solar
radiation environmental chamber in its
normal configuration for collecting PM
samples (with the inlet installed). The
sampler’s ambient and filter
temperature measurement systems shall
be calibrated per the sampler’s operating
manual within 7 days prior to this test.
A sample filter shall be installed for the
duration of this test. For sequential
samplers, a sample filter shall also be
installed in each available sequential
channel or station intended for
collection of a sequential sample (or at
least 5 additional filters for magazine-type sequential samplers) as directed by
the sampler’s operation or instruction
manual.
* * * * *
24. Section 53.58 is revised to read as
follows:
§ 53.58 Operational field precision and
blank test.
(a) Overview. This test is intended to
determine the operational precision of
the candidate sampler during a
minimum of 10 days of field operation,
using three collocated test samplers.
Measurements of PM are made at a test
site with all of the samplers and then
compared to determine replicate
precision. Candidate sequential
samplers are also subject to a test for
possible deposition of particulate matter
on inactive filters during a period of
storage in the sampler. This procedure
is applicable to both reference and
equivalent methods. In the case of
equivalent methods, this test may be
combined and conducted concurrently
with the comparability test for
equivalent methods (described in
subpart C of this part), using three
reference method samplers collocated
with three candidate equivalent method
samplers and meeting the applicable
site and other requirements of subpart C
of this part.
(b) Technical definition. (1) Field
precision is defined as the standard
deviation or relative standard deviation
of a set of PM measurements obtained
concurrently with three or more
collocated samplers in actual ambient
air field operation.
(2) Storage deposition is defined as
the mass of material inadvertently
deposited on a sample filter that is
stored in a sequential sampler either
prior to or subsequent to the active
sample collection period.
(c) Test site. Any outdoor test site
having PM2.5 (or PM10-2.5, as applicable)
concentrations that are reasonably
uniform over the test area and that meet
the minimum level requirement of
paragraph (g)(2) of this section is
acceptable for this test.
(d) Required facilities and equipment.
(1) An appropriate test site and suitable
electrical power to accommodate three
test samplers are required.
(2) Teflon sample filters, as specified
in section 6 of 40 CFR part 50, appendix
L, conditioned and preweighed as
required by section 8 of 40 CFR part 50,
appendix L, as needed for the test
samples.
(e) Test setup. (1) Three identical test
samplers shall be installed at the test
site in their normal configuration for
collecting PM samples in accordance
with the instructions in the associated
manual referred to in § 53.4(b)(3) and
also in accordance with applicable
supplemental guidance provided in
reference 3 in appendix A of this
subpart. The test samplers’ inlet
openings shall be located at the same
height above ground and between 2 (1
for samplers with flow rates less than
200 L/min.) and 4 meters apart
horizontally. The samplers shall be
arranged or oriented in a manner that
will minimize the spatial and wind
directional effects on sample collection
of one sampler on any other sampler.
(2) Each test sampler shall be
successfully leak checked, calibrated,
and set up for normal operation in
accordance with the instruction manual
and with any applicable supplemental
guidance provided in reference 3 in
appendix A of this subpart.
(f) Test procedure. (1) Install a
conditioned, preweighed filter in each
test sampler and otherwise prepare each
sampler for normal sample collection.
Set identical sample collection start and
stop times for each sampler. For
sequential samplers, install a
conditioned, preweighed specified filter
in each available channel or station
intended for automatic sequential
sample filter collection (or at least 5
additional filters for magazine-type
sequential samplers), as directed by the
sampler’s operation or instruction
manual. Since the inactive sequential
channels are used for the storage
deposition part of the test, they may not
be used to collect the active PM test
samples.
(2) Collect either a nominal 24-hour or
48-hour atmospheric PM sample
simultaneously with each of the three
test samplers.
(3) Following sample collection,
retrieve the collected sample from each
sampler. For sequential samplers,
retrieve the additional stored (blank,
unsampled) filters after at least 5 days
(120 hours) storage in the sampler if the
active samples are 24-hour samples, or
after at least 10 days (240 hours) if the
active samples are 48-hour samples.
(4) Determine the measured PM mass
concentration for each sample in
accordance with the applicable
procedures prescribed for the candidate
method in appendix L or appendix O,
as applicable, of part 50 of this chapter,
or in accordance with the associated
manual referred to in § 53.4(b)(3) and
supplemental guidance in reference 2 in
appendix A of this subpart. For
sequential samplers, also similarly
determine the storage deposition as the
net weight gain of each blank,
unsampled filter after the 5-day (or 10-day) period of storage in the sampler.
(5) Repeat this procedure to obtain a
total of 10 sets of any combination of
(nominal) 24-hour or 48-hour PM
measurements over 10 test periods. For
sequential samplers, repeat the 5-day (or
10-day) storage test of additional blank
filters once for a total of two sets of
blank filters.
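For illustration only (not part of the proposed rule text), the storage deposition determination for a stored set of blank filters, judged against the 50 µg maximum average weight gain per blank filter in table E–1, can be sketched as follows; the function name and the pre/post filter weights are hypothetical:

```python
# Illustrative sketch of the storage deposition calculation for
# sequential samplers: average net weight gain per blank filter in a
# stored set, compared with the 50 microgram acceptance limit.
# Pre- and post-storage weights (micrograms) are hypothetical.

def average_net_gain_ug(pre_weights_ug, post_weights_ug):
    """Average net weight gain per blank filter for one stored set."""
    gains = [post - pre for pre, post in zip(pre_weights_ug, post_weights_ug)]
    return sum(gains) / len(gains)

# Hypothetical set of 5 blank filters (magazine-type sequential sampler):
pre  = [148200.0, 147950.0, 148100.0, 148030.0, 147880.0]
post = [148215.0, 147970.0, 148118.0, 148052.0, 147902.0]
gain = average_net_gain_ug(pre, post)
print(gain <= 50.0)  # True: the set meets the 50 microgram criterion
```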
(g) Calculations. (1) Record the PM
concentration for each test sampler for
each test period as Ci, j, where i is the
sampler number (i = 1,2,3) and j is the
test period (j = 1,2, * * * 10).
(2)(i) For each test period, calculate
and record the average of the three
measured PM concentrations as Cave, j
where j is the test period using equation
26 of this section:
Equation 26
Cave,j = (1/3) × Σ(i=1 to 3) Ci,j
(ii) If Cave, j <3 µg/m3 for any test
period, data from that test period are
unacceptable, and an additional sample
collection set must be obtained to
replace the unacceptable data.
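For illustration only (not part of the proposed rule text), the per-period average of equation 26, the precision statistics of paragraph (g)(3), and the acceptance criteria of paragraph (h) and table E–1 can be sketched as follows; the function name and the example concentrations are hypothetical:

```python
import math

# Illustrative sketch of equations 26-28: per-period average (Cave,j),
# standard deviation (Pj), and relative standard deviation (RPj) for
# three collocated samplers. Example concentrations (ug/m3) are
# hypothetical.

def period_statistics(c):
    """Return (Cave_j, Pj, RPj) for one test period's 3 measurements."""
    n = len(c)                       # n = 3 collocated samplers
    c_ave = sum(c) / n               # equation 26
    # Equation 27: sample standard deviation, n - 1 = 2 in denominator.
    p_j = math.sqrt((sum(x * x for x in c) - (sum(c) ** 2) / n) / (n - 1))
    rp_j = 100.0 * p_j / c_ave       # equation 28, in percent
    return c_ave, p_j, rp_j

c_ave, p_j, rp_j = period_statistics([12.1, 12.5, 11.8])
valid = c_ave >= 3.0                 # paragraph (g)(2)(ii) minimum level
# Table E-1 acceptance for each period: Pj < 2 ug/m3 or RPj < 5 percent.
print(valid and (p_j < 2.0 or rp_j < 5.0))
```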
(3)(i) Calculate and record the
precision for each of the 10 test periods,
as the standard deviation, using
equation 27 of this section:
Equation 27
Pj = √{[Σ(i=1 to 3) Ci,j² − (1/3)(Σ(i=1 to 3) Ci,j)²] / 2}
(ii) For each of the 10 test periods,
also calculate and record the precision
as the relative standard deviation, in
percent, using equation 28 of this
section:
Equation 28
RPj = 100% × Pj / Cave,j
(h) Test results. (1) The candidate
method passes the precision test if
either Pj or RPj is less than or equal to
the corresponding specification in table
E–1 of this subpart for all 10 test
periods.
(2) The candidate sequential sampler
passes the blank filter storage deposition
test if the average net storage deposition
weight gain of each set of blank filters
(total of the net weight gain of each
blank filter divided by the number of
filters in the set) from each test sampler
(six sets in all) is less than 50 µg.
25. Section 53.59 is amended by
revising paragraphs (a) and (b)(5) to
read as follows:
§ 53.59 Aerosol transport test for Class I
equivalent method samplers.
(a) Overview. This test is intended to
verify adequate aerosol transport
through any modified or air flow
splitting components that may be used
in a Class I candidate equivalent method
sampler such as may be necessary to
achieve sequential sampling capability.
This test is applicable to all Class I
candidate samplers in which the aerosol
flow path (the flow path through which
sample air passes upstream of sample
collection filter) differs significantly
from that specified for reference method
samplers as specified in 40 CFR part 50,
appendix L or appendix O, as
applicable. The test requirements and
performance specifications for this test
are summarized in table E–1 of this
subpart.
(b) * * *
(5) An added component is any
physical part of the sampler which is
different in some way from that
specified for a reference method
sampler in 40 CFR part 50, appendix L
or appendix O, as applicable, such as a
device or means to allow or cause the
aerosol to be routed to one of several
channels.
* * * * *
26. Table E–1 to subpart E is revised
to read as follows:
TABLE E–1 TO SUBPART E.—SUMMARY OF TEST REQUIREMENTS FOR REFERENCE AND CLASS I EQUIVALENT METHODS FOR PM2.5 AND PM10-2.5

§ 53.52 Sample leak check test:
  Performance test and specification: External leakage: 80 mL/min, max. Internal leakage: 80 mL/min, max.
  Test conditions: Sampler leak check facility; controlled leak flow rate of 80 mL/min.
  Part 50, Appendix L reference: Sec. 7.4.6.

§ 53.53 Base flow rate test:
  Performance test and specification: Sample flow rate: 1. Mean: 16.67 ±5% L/min. 2. Regulation: 2%, max. 3. Meas. accuracy: 2%, max. 4. CV accuracy: 0.3%, max. 5. Cut-off: flow rate cut-off if flow rate deviates more than 10% from design flow rate for >60±30 seconds.
  Test conditions: (a) 6-hour normal operational test plus flow rate cut-off test. (b) Nominal conditions. (c) Additional 55 mm Hg pressure drop to simulate loaded filter. (d) Variable flow restrictions used for cut-off test.
  Part 50, Appendix L reference: Sec. 7.4.1, Sec. 7.4.2, Sec. 7.4.3, Sec. 7.4.4, Sec. 7.4.5.

§ 53.54 Power interruption test:
  Performance test and specification: Sample flow rate: 1. Mean: 16.67 ±5% L/min. 2. Regulation: 2%, max. 3. Meas. accuracy: 2%, max. 4. CV accuracy: 0.3%, max. 5. Occurrence time of power interruptions: ±2 min if >60 seconds. 6. Elapsed sample time: ±20 seconds. 7. Sample volume: ±2%, max.
  Test conditions: (a) 6-hour normal operational test. (b) Nominal conditions. (c) Additional 55 mm Hg pressure drop to simulate loaded filter. (d) 6 power interruptions of various durations.
  Part 50, Appendix L reference: Sec. 7.4.1, Sec. 7.4.2, Sec. 7.4.3, Sec. 7.4.5, Sec. 7.4.12, Sec. 7.4.13, Sec. 7.4.15.4, Sec. 7.4.15.5.

§ 53.55 Temperature and line voltage test:
  Performance test and specification: Sample flow rate: 1. Mean: 16.67 ±5% L/min. 2. Regulation: 2%, max. 3. Meas. accuracy: 2%, max. 4. CV accuracy: 0.3%, max. 5. Temperature meas. accuracy: 2 °C. 6. Proper operation.
  Test conditions: (a) 6-hour normal operational test. (b) Normal conditions. (c) Additional 55 mm Hg pressure drop to simulate loaded filter. (d) Ambient temperature at −20 and +40 °C. (e) Line voltage: 105 Vac to 125 Vac.
  Part 50, Appendix L reference: Sec. 7.4.1, Sec. 7.4.2, Sec. 7.4.3, Sec. 7.4.5, Sec. 7.4.8, Sec. 7.4.15.1.

§ 53.56 Barometric pressure effect test:
  Performance test and specification: Sample flow rate: 1. Mean: 16.67 ±5% L/min. 2. Regulation: 2%, max. 3. Meas. accuracy: 2%, max. 4. CV accuracy: 0.3%, max. 5. Pressure meas. accuracy: 10 mm Hg. 6. Proper operation.
  Test conditions: (a) 6-hour normal operational test. (b) Normal conditions. (c) Additional 55 mm Hg pressure drop to simulate loaded filter. (d) Barometric pressure at 600 and 800 mm Hg.
  Part 50, Appendix L reference: Sec. 7.4.1, Sec. 7.4.2, Sec. 7.4.3, Sec. 7.4.5, Sec. 7.4.9.
TABLE E–1 TO SUBPART E.—SUMMARY OF TEST REQUIREMENTS FOR REFERENCE AND CLASS I EQUIVALENT METHODS FOR PM2.5 AND PM10-2.5—Continued

§ 53.57 Filter temperature control test:
  Performance test and specification: 1. Filter temp. meas. accuracy: 2 °C. 2. Ambient temp. meas. accuracy: 2 °C. 3. Filter temp. control accuracy, sampling and non-sampling: not more than 5 °C above ambient temp. for more than 30 min.
  Test conditions: (a) 4-hour simulated solar radiation, sampling. (b) 4-hour simulated solar radiation, non-sampling. (c) Solar flux of 1000 ±50 W/m2.
  Part 50, Appendix L reference: Sec. 7.4.8, Sec. 7.4.10, Sec. 7.4.11.

§ 53.58 Field precision test:
  Performance test and specification: 1. Measurement precision: Pj <2 µg/m3 or RPj <5%. 2. Storage deposition test for sequential samplers: 50 µg max. average weight gain/blank filter.
  Test conditions: (a) 3 collocated samplers at 1 site for at least 10 days. (b) PM2.5 conc. >3 µg/m3. (c) 24- or 48-hour samples. (d) 5- or 10-day storage period for inactive stored filters.
  Part 50, Appendix L reference: Sec. 5.1, Sec. 7.4.5, Sec. 8, Sec. 9, Sec. 10.

The Following Requirement Is Applicable to Class I Candidate Equivalent Methods Only

§ 53.59 Aerosol transport test:
  Performance test and specification: Aerosol transport: 97%, min. for all channels.
  Test conditions: Determine aerosol transport through any new or modified components with respect to the reference method sampler before the filter for each channel.
27. References (3) and (5) in appendix
A to subpart E of part 53 are revised to
read as follows:
Appendix A to Subpart E of Part 53—
References
* * * * *
(3) Quality Assurance Guidance
Document 2.12. Monitoring PM2.5 in
Ambient Air Using Designated
Reference or Class I Equivalent
Methods. U.S. EPA, National Exposure
Research Laboratory, Research Triangle
Park, NC, November 1998 or later
edition. Currently available at
https://www.epa.gov/ttn/amtic/pmgainf.html.
* * * * *
(5) Quality Assurance Handbook for
Air Pollution Measurement Systems,
Volume IV: Meteorological
Measurements. Revised March, 1995.
EPA–600/R–94–038d. Available from
National Technical Information Service,
Springfield, VA 22161, (800–553–6847,
https://www.ntis.gov). NTIS number
PB95–199782INZ.
* * * * *
Subpart F—[Amended]
28. Section 53.60 is amended by
revising paragraphs (b), (c), (d)
introductory text, and (f)(4) to read as
follows:
§ 53.60 General provisions.
* * * * *
(b) A candidate method described in
an application for a reference or
equivalent method determination
submitted under § 53.4 shall be
determined by the EPA to be a Class II
candidate equivalent method on the
basis of the definition of a Class II
equivalent method given in § 53.1.
(c) Any sampler associated with a
Class II candidate equivalent method
(Class II sampler) must meet all
applicable requirements for reference
method samplers or Class I equivalent
method samplers specified in subpart E
of this part, as appropriate. Except as
provided in § 53.3(a)(3), a Class II PM2.5
sampler must meet the additional
requirements as specified in paragraph
(d) of this section.
(d) Except as provided in paragraphs
(d)(1), (2), and (3) of this section, all
Class II samplers are subject to the
additional tests and performance
requirements specified in § 53.62 (full
wind tunnel test), § 53.65 (loading test),
and § 53.66 (volatility test). Alternative
tests and performance requirements, as
described in paragraphs (d)(1), (2), and
(3) of this section, are optionally
available for certain Class II samplers
which meet the requirements for
reference method or Class I equivalent
method samplers given in 40 CFR part
50, appendix L, and in subpart E of this
part, except for specific deviations of
the inlet, fractionator, or filter.
* * * * *
(f) * * *
(4) Loading test. The loading test is
conducted to ensure that the
performance of a candidate sampler is
not significantly affected by the amount
of particulate deposited on its interior
surfaces between periodic cleanings.
The candidate sampler is artificially
loaded by sampling a test environment
containing aerosolized, standard test
dust. The duration of the loading phase
is dependent on both the time between
cleaning as specified by the candidate
method and the aerosol mass
concentration in the test environment.
After loading, the candidate’s
performance must then be evaluated by
§ 53.62 (full wind tunnel evaluation),
§ 53.63 (wind tunnel inlet aspiration
test), or § 53.64 (static fractionator test).
If the results of the appropriate test meet
the criteria presented in table F–1 of this
subpart, then the candidate sampler passes the loading test, provided that it is cleaned at least as often as the cleaning frequency proposed by the candidate method and demonstrated to be acceptable by this test.
* * * * *
29. The section heading of § 53.61 is revised to read as follows:

§ 53.61 Test conditions.
* * * * *
30. Section 53.66 is amended by
revising paragraph (e)(2)(iii) to read as
follows:
§ 53.66 Test procedure: Volatility test.
* * * * *
(e) * * *
(2) * * *
(iii) Operate the candidate and the reference samplers such that they simultaneously sample the test aerosol for 2 hours for a candidate sampler operating at 16.7 L/min or higher, or proportionately longer for a candidate sampler operating at a lower flow rate.
* * * * *
31. Table F–1 to subpart F is revised
to read as follows:
TABLE F–1 TO SUBPART F.—PERFORMANCE SPECIFICATIONS FOR PM2.5 CLASS II EQUIVALENT SAMPLERS

Performance test | Specifications | Acceptance criteria
§ 53.62 Full Wind Tunnel Evaluation | Solid VOAG produced aerosol at 2 km/hr and 24 km/hr | Dp50 = 2.5 µm ± 0.2 µm; Numerical Analysis Results: 95% ≤ Rc ≤ 105%.
§ 53.63 Wind Tunnel Inlet Aspiration Test | Liquid VOAG produced aerosol at 2 km/hr and 24 km/hr | Relative Aspiration: 95% ≤ A ≤ 105%.
§ 53.64 Static Fractionator Test | Evaluation of the fractionator under static conditions | Dp50 = 2.5 µm ± 0.2 µm; Numerical Analysis Results: 95% ≤ Rc ≤ 105%.
§ 53.65 Loading Test | Loading of the clean candidate under laboratory conditions | Acceptance criteria as specified in the post-loading evaluation test (§ 53.62, § 53.63, or § 53.64).
§ 53.66 Volatility Test | Polydisperse liquid aerosol produced by air nebulization of A.C.S. reagent grade glycerol, 99.5% minimum purity | Regression Parameters: Slope = 1 ± 0.1, Intercept = 0 ± 0.15 mg, r ≥ 0.97.
32. In Figure E–1 to subpart F, the
figure number ‘‘E–1’’ is revised to read
‘‘F–1.’’
PART 58—[AMENDED]
33. The authority citation for part 58
continues to read as follows:
Authority: 42 U.S.C. 7410, 7601(a), 7613,
and 7619.
Subpart A—[Amended]
34. Sections 58.1, 58.2, and 58.3 are
revised to read as follows:
§ 58.1 Definitions.
As used in this part, all terms not
defined herein have the meaning given
them in the Act.
Act means the Clean Air Act as
amended (42 U.S.C. 7401, et seq.).
Additive and multiplicative bias
means the linear regression intercept
and slope of a linear plot fitted to
corresponding candidate and reference
method mean measurement data pairs.
Administrator means the
Administrator of the Environmental
Protection Agency (EPA) or his or her
authorized representative.
Air Quality System (AQS) means
EPA’s computerized system for storing
and reporting of information relating to
ambient air quality data.
Approved regional method (ARM)
means a continuous PM2.5 method that
has been approved specifically within a
State or local air monitoring network for
purposes of comparison to the NAAQS
and to meet other monitoring objectives.
AQCR means air quality control
region.
CO means carbon monoxide.
Combined statistical area (CSA) is
defined by the U.S. Office of
Management and Budget as a
geographical area consisting of two or
more adjacent Core Based Statistical
Areas (CBSA) with employment
interchange of at least 15 percent.
Combination is automatic if the
employment interchange is 25 percent
and determined by local opinion if more
than 15 but less than 25 percent
(https://www.census.gov/population/
estimates/metro-city/List6.txt).
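The employment-interchange thresholds above amount to a simple decision rule. The following sketch is illustrative only; the function name and category labels are informal, not OMB terminology, and it assumes the 15 and 25 percent figures quoted above:

```python
# Illustrative decision rule for combining adjacent CBSAs into a CSA,
# based on the employment interchange measure described above.
def csa_combination(interchange_pct):
    """Classify a pair of adjacent CBSAs by their employment interchange (%)."""
    if interchange_pct >= 25:
        return "automatic combination"         # 25 percent or more
    elif interchange_pct >= 15:
        return "combination by local opinion"  # at least 15 but under 25 percent
    else:
        return "no combination"                # below the 15 percent threshold
```

Under this reading, for example, an interchange of 20 percent would be decided by local opinion.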
Community monitoring zone (CMZ)
means an optional averaging area with
established, well defined boundaries,
such as county or census block, within
an MPA that has relatively uniform
concentrations of annual PM2.5 as
defined by appendix N of part 50 of this
chapter. Two or more community-oriented SLAMS monitors within a
CMZ that meet certain requirements as
set forth in appendix N of part 50 of this
chapter may be averaged for making
comparisons to the annual PM2.5
NAAQS.
Core-based statistical area (CBSA) is
defined by the U.S. Office of
Management and Budget, as a statistical
geographic entity consisting of the
county or counties associated with at
least one urbanized area/urban cluster
of at least 10,000 population, plus
adjacent counties having a high degree
of social and economic integration.
Metropolitan and micropolitan
statistical areas (MSA) are the two
categories of CBSA (metropolitan areas
have populations greater than 50,000;
and micropolitan areas have
populations between 10,000 and
50,000). In the case of very large cities
where two or more CBSA are combined,
these larger areas are referred to as
combined statistical areas (https://
www.census.gov/population/estimates/
metro-city/List1.txt).
Corrected concentration pertains to
the result of an accuracy or precision
assessment test of an open path analyzer
in which a high-concentration test or
audit standard gas contained in a short
test cell is inserted into the optical
measurement beam of the instrument.
When the pollutant concentration
measured by the analyzer in such a test
includes both the pollutant
concentration in the test cell and the
concentration in the atmosphere, the
atmospheric pollutant concentration
must be subtracted from the test
measurement to obtain the corrected
concentration test result. The corrected
concentration is equal to the measured
concentration minus the average of the
atmospheric pollutant concentrations
measured (without the test cell)
immediately before and immediately
after the test.
Design value means the calculated
concentration according to the
applicable appendix of part 50 of this
chapter for the highest site in an
attainment or nonattainment area.
EDO means environmental data
operations.
Effective concentration pertains to
testing an open path analyzer with a
high-concentration calibration or audit
standard gas contained in a short test
cell inserted into the optical
measurement beam of the instrument.
Effective concentration is the equivalent
ambient-level concentration that would
produce the same spectral absorbance
over the actual atmospheric monitoring
path length as produced by the high-concentration gas in the short test cell.
Quantitatively, effective concentration
is equal to the actual concentration of
the gas standard in the test cell
multiplied by the ratio of the path
length of the test cell to the actual
atmospheric monitoring path length.
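Both open path definitions above reduce to simple arithmetic. The sketch below is illustrative; the function names and numeric values are hypothetical and are not part of the rule:

```python
# Illustrative arithmetic for the corrected and effective concentration
# definitions above. All numbers are hypothetical.

def effective_concentration(cell_conc_ppm, cell_path_m, monitoring_path_m):
    """Effective concentration: test-cell gas concentration multiplied by the
    ratio of the test cell path length to the atmospheric monitoring path."""
    return cell_conc_ppm * (cell_path_m / monitoring_path_m)

def corrected_concentration(measured_ppm, ambient_before_ppm, ambient_after_ppm):
    """Corrected concentration: measured value minus the average of the ambient
    readings taken immediately before and immediately after the test."""
    return measured_ppm - (ambient_before_ppm + ambient_after_ppm) / 2.0

# A 500 ppm audit gas in a 0.05 m test cell on a 250 m monitoring path:
eff = effective_concentration(500.0, 0.05, 250.0)    # 0.1 ppm equivalent ambient level
# Analyzer reads 0.145 ppm during the test; ambient was 0.042 and 0.046 ppm:
corr = corrected_concentration(0.145, 0.042, 0.046)  # 0.101 ppm
```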
Equivalent method means a method of
sampling and analyzing the ambient air
for an air pollutant that has been
designated as an equivalent method in
accordance with part 53 of this chapter;
it does not include a method for which
an equivalent method designation has
been canceled in accordance with
§ 53.11 or § 53.16 of this chapter.
HNO3 means nitric acid.
Local agency means any local
government agency, other than the State
agency, which is charged by a State with
the responsibility for carrying out a
portion of the plan.
Meteorological measurements means
measurements of wind speed, wind
direction, barometric pressure,
temperature, relative humidity, solar
radiation, ultraviolet radiation, and
precipitation.
Metropolitan Statistical Area (MSA)
means a CBSA associated with at least
one urbanized area of at least 50,000
population. The central county plus
adjacent counties with a high degree of
integration comprise the area.
Monitor means an instrument,
sampler, analyzer, or other device that
measures or assists in the measurement
of atmospheric air pollutants and which
is acceptable for use in ambient air
surveillance under the applicable
provisions of appendix C to this part.
Monitoring agency means a State or
local agency responsible for meeting the
requirements of this part.
Monitoring organization means a
State, local, or other monitoring
organization responsible for operating a
monitoring site for which the quality
assurance regulations apply.
Monitoring path for an open path
analyzer means the actual path in space
between two geographical locations over
which the pollutant concentration is
measured and averaged.
Monitoring path length of an open
path analyzer means the length of the
monitoring path in the atmosphere over
which the average pollutant
concentration measurement (path-averaged concentration) is determined.
See also, optical measurement path
length.
Monitoring planning area (MPA)
means a contiguous geographic area
with established, well defined
boundaries, such as a CBSA, county or
State, having a common area that is
used for planning monitoring locations
for PM2.5. An MPA may cross State
boundaries, such as the Philadelphia
PA-NJ MSA, and be further subdivided
into community monitoring zones. MPA
are generally oriented toward CBSA or
CSA with populations greater than
200,000, but for convenience, those
portions of a State that are not
associated with CBSA can be considered
as a single MPA.
NATTS means the national air toxics trends stations. This network provides ambient hazardous air pollutant data.
NCore means the National Core
multipollutant monitoring stations.
Monitors at these sites are required to
measure particles (PM2.5, speciated
PM2.5, PM10-2.5), O3, SO2, CO, nitrogen
oxides (NO/NO2/NOY), and basic
meteorology.
Network means all stations of a given
type or types.
NH3 means ammonia.
NO2 means nitrogen dioxide. NO
means nitrogen oxide. NOX means
oxides of nitrogen and is defined as the
sum of the concentrations of NO2 and
NO.
NOy means the sum of all total
reactive nitrogen oxides, including NO,
NO2, and other nitrogen oxides referred
to as NOZ.
O3 means ozone.
Open path analyzer means an
automated analytical method that
measures the average atmospheric
pollutant concentration in situ along
one or more monitoring paths having a
monitoring path length of 5 meters or
more and that has been designated as a
reference or equivalent method under
the provisions of part 53 of this chapter.
Optical measurement path length
means the actual length of the optical
beam over which measurement of the
pollutant is determined. The path-integrated pollutant concentration
measured by the analyzer is divided by
the optical measurement path length to
determine the path-averaged
concentration. Generally, the optical
measurement path length is:
(1) Equal to the monitoring path
length for a (bistatic) system having a
transmitter and a receiver at opposite
ends of the monitoring path;
(2) Equal to twice the monitoring path
length for a (monostatic) system having
a transmitter and receiver at one end of
the monitoring path and a mirror or
retroreflector at the other end; or
(3) Equal to some multiple of the
monitoring path length for more
complex systems having multiple passes
of the measurement beam through the
monitoring path.
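The path length cases (1) through (3) above can be expressed as a single pass multiplier, and the path-averaged concentration follows from the division described earlier in this definition. This sketch is illustrative, with hypothetical names and values:

```python
# Illustrative arithmetic for the optical measurement path length definition.

def optical_path_length(monitoring_path_m, passes):
    """Bistatic systems make 1 pass; monostatic systems (mirror or
    retroreflector at the far end) make 2; multi-pass systems make more."""
    return monitoring_path_m * passes

def path_averaged_concentration(path_integrated_ppm_m, optical_path_m):
    """Path-averaged concentration: the path-integrated concentration divided
    by the optical measurement path length."""
    return path_integrated_ppm_m / optical_path_m

# A 200 m monitoring path read monostatically (the beam traverses it twice):
optical_m = optical_path_length(200.0, 2)           # 400 m
avg = path_averaged_concentration(20.0, optical_m)  # 0.05 ppm
```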
PAMS means photochemical
assessment monitoring stations.
Pb means lead.
Plan means an implementation plan
approved or promulgated pursuant to
section 110 of the Act.
PM2.5 means particulate matter with
an aerodynamic diameter less than or
equal to a nominal 2.5 micrometers as
measured by a reference method based
on appendix L of part 50 of this chapter
and designated in accordance with part
53 of this chapter, by an equivalent
method designated in accordance with
part 53 of this chapter, or by an
approved regional method designated in
accordance with appendix C to this part.
PM10 means particulate matter with
an aerodynamic diameter less than or
equal to a nominal 10 micrometers as
measured by a reference method based
on appendix J of part 50 of this chapter
and designated in accordance with part
53 of this chapter or by an equivalent
method designated in accordance with
part 53 of this chapter.
PM10C means particulate matter with
an aerodynamic diameter less than or
equal to a nominal 10 micrometers as
measured by a reference method based
on appendix O of part 50 of this chapter
and designated in accordance with part
53 of this chapter or by an equivalent
method designated in accordance with
part 53 of this chapter.
PM10-2.5 means particulate matter with
an aerodynamic diameter less than or
equal to a nominal 10 micrometers and
greater than a nominal 2.5 micrometers
as measured by a reference method
based on appendix O to part 50 of this
chapter and designated in accordance
with part 53 of this chapter or by an
equivalent method designated in
accordance with part 53 of this chapter.
Point analyzer means an automated
analytical method that measures
pollutant concentration in an ambient
air sample extracted from the
atmosphere at a specific inlet probe
point and that has been designated as a
reference or equivalent method in
accordance with part 53 of this chapter.
Population-oriented monitoring (or
sites) means residential areas,
commercial areas, recreational areas,
industrial areas where workers from
more than one company are located, and
other areas where a substantial number
of people may spend a significant
fraction of their day.
Primary quality assurance
organization means a monitoring
organization or other organization that
is responsible for a set of stations that
monitors the same pollutant and for
which data quality assessments can be
pooled. Each criteria pollutant sampler/
monitor at a monitoring station in the
SLAMS and SPM networks must be
associated with one, and only one,
primary quality assurance organization.
Probe means the actual inlet where an
air sample is extracted from the
atmosphere for delivery to a sampler or
point analyzer for pollutant analysis.
PSD station means any station
operated for the purpose of establishing
the effect on air quality of the emissions
from a proposed source for purposes of
prevention of significant deterioration
as required by § 51.24(n) of this chapter.
Reference method means a method of
sampling and analyzing the ambient air
for an air pollutant that is specified as
a reference method in an appendix to
part 50 of this chapter, or a method that
has been designated as a reference
method in accordance with this part; it
does not include a method for which a
reference method designation has been
canceled in accordance with § 53.11 or
§ 53.16 of this chapter.
Regional Administrator means the
Administrator of one of the ten EPA
Regional Offices or his or her authorized
representative.
Reporting organization means an
entity, such as a State, local, or Tribal
monitoring agency, that collects and
reports air quality data to EPA.
Site means a geographic location. One
or more stations may be at the same site.
SLAMS means State or local air
monitoring stations. The SLAMS make
up the ambient air quality monitoring
sites that are primarily needed for
NAAQS comparisons, but may serve
other data purposes. SLAMS exclude
special purpose monitor (SPM) stations
and include NCore, PAMS, and all other
State or locally operated stations that
have not been designated as SPM
stations.
SO2 means sulfur dioxide.
Special purpose monitor (SPM)
station means a monitor included in an
agency’s monitoring network that the
agency has designated as a special
purpose monitor station in its
monitoring network plan and in the Air
Quality System, and which the agency
does not count when showing
compliance with the minimum
requirements of this subpart for the
number and siting of monitors of
various types.
State agency means the air pollution
control agency primarily responsible for
development and implementation of a
plan under the Act.
State speciation site means a
supplemental PM2.5 speciation station
that is not part of the speciation trends
network.
Station means a single monitor, or a
group of monitors with a shared
objective, located at a particular site.
STN station means a PM2.5 speciation
station designated to be part of the
speciation trends network. This network
provides chemical species data of fine
particulate.
Traceable means that a local standard
has been compared and certified, either
directly or via not more than one
intermediate standard, to a National
Institute of Standards and Technology
(NIST)-certified primary standard such
as a NIST-traceable Reference Material
(NTRM) or a NIST-certified Gas
Manufacturer’s Internal Standard
(GMIS).
TSP (total suspended particulates)
means particulate matter as measured
by the method described in appendix B
of part 50 of this chapter.
Urbanized area means an area with a
minimum residential population of at
least 50,000 people and which generally
includes core census block groups or
blocks that have a population density of
at least 1,000 people per square mile
and surrounding census blocks that
have an overall density of at least 500
people per square mile. The Census
Bureau notes that under certain
conditions, less densely settled territory
may be part of each Urbanized Area.
VOC means volatile organic
compounds.
§ 58.2 Purpose.
(a) This part contains requirements for
measuring ambient air quality and for
reporting ambient air quality data and
related information. The monitoring
criteria pertain to the following areas:
(1) Quality assurance procedures for
monitor operation and data handling.
(2) Methodology used in monitoring
stations.
(3) Operating schedule.
(4) Siting parameters for instruments
or instrument probes.
(5) Minimum ambient air quality
monitoring network requirements used
to provide support to the State
implementation plans (SIP), national air
quality assessments, and policy
decisions. These minimums are
described as part of the network design
requirements, including minimum
numbers and placement of monitors of
each type.
(6) Air quality data reporting, and
requirements for the daily reporting of
an index of ambient air quality.
(b) The requirements pertaining to
provisions for an air quality surveillance
system in the SIP are contained in this
part.
(c) This part also acts to establish a
national ambient air quality monitoring
network for the purpose of providing
timely air quality data upon which to
base national assessments and policy
decisions.
§ 58.3 Applicability.
This part applies to:
(a) State air pollution control
agencies.
(b) Any local air pollution control
agency to which the State has delegated
authority to operate a portion of the
State’s SLAMS network.
(c) Owners or operators of proposed
sources.
Subpart B—Monitoring Network
35. The heading for subpart B is
revised as set forth above.
36. Sections 58.10 through 58.14 are
revised and §§ 58.15 and 58.16 are
added to read as follows:
§ 58.10 Annual monitoring network plan
and periodic network assessment.
(a)(1) Beginning July 1, 2007, the
State, or where applicable local, agency
shall adopt and submit to the Regional
Administrator an annual monitoring
network plan which shall provide for
the establishment and maintenance of
an air quality surveillance system that
consists of a network of monitoring
stations including Federal reference
method (FRM), Federal equivalent
method (FEM), and approved regional
method (ARM) monitors that are part of
SLAMS, NCore stations, STN stations,
State speciation stations, SPM stations,
and/or, in serious, severe and extreme
ozone nonattainment areas, PAMS
stations. The plan shall include a
statement of purpose for each monitor
and evidence that siting and operation
of each monitor meets the requirements
of appendices A, C, D, and E of this part,
where applicable. The annual
monitoring network plan must be made
available for public inspection for at
least 30 days prior to submission to
EPA.
(2) Any annual monitoring network
plan that proposes SLAMS network
modifications including new monitoring
sites is subject to the approval of the
EPA Regional Administrator, who shall
provide opportunity for public comment
and shall approve or disapprove the
plan and schedule within 120 days.
(3) PM10-2.5 stations.
(i) The plan for establishing a network
of PM10-2.5 stations is due not later than
January 1, 2008, as an addendum to the
annual monitoring network plan
required to be submitted July 1, 2007,
unless the Regional Administrator
extends this due date to July 1, 2008, in
which case it shall be part of the annual
monitoring network plan due by that
date.
(ii) The plan shall provide for
required PM10-2.5 stations to be
operational by January 1, 2009.
(iii) The plan shall identify whether
each planned PM10-2.5 station is suitable
for comparison with the PM10-2.5
NAAQS under the criteria of § 58.30(b),
and shall include evidence for that
identification including the information
obtained and conclusions reached in
each site-specific assessment.
(iv) Identification of existing and
proposed sites as suitable for
comparison against the 24-hour PM10-2.5
NAAQS are subject to approval by the
EPA Regional Administrator as part of
the approval of the plan for the PM10-2.5
monitoring network. Such approval will
constitute a final action by EPA.
(4) The plan for establishing required
NCore multipollutant stations is due
July 1, 2009. The plan shall provide for
all required stations to be operational by
January 1, 2011.
(b) The annual monitoring network
plan must contain cost information for
the network and the following
information for each existing and
proposed site:
(1) The AQS site identification
number.
(2) The location, including street
address and geographical coordinates.
(3) The sampling and analysis
method(s) for each measured parameter.
(4) The operating schedules for each
monitor.
(5) Any proposals to remove or move
a monitoring station within a period of
18 months following plan submittal.
(6) The monitoring objective and
spatial scale of representativeness for
each monitor as defined in appendix D
to this part.
(7) The identification of any sites that
are suitable and sites that are not
suitable for comparison against the
annual PM2.5 NAAQS or 24-hour
PM10-2.5 NAAQS as described in § 58.30.
(8) Information supporting the basis
for determining that PM10-2.5 sites are
either suitable or not suitable for
comparison to the 24-hour PM10-2.5
NAAQS as described in § 58.30(b).
(9) The MSA, CBSA, CSA or other
area represented by the monitor.
(c) The annual monitoring network
plan must consider the ability of
existing and proposed sites to support
air quality characterization for areas
with relatively high populations of
susceptible individuals (e.g., children
with asthma), and, for any sites that are
being proposed for discontinuance, the
effect on data users other than the
agency itself, such as nearby States and
Tribes or health effects studies.
(d) The annual monitoring network
plan must document how States and
local agencies provide for the review of
changes to a PM2.5 monitoring network
that impact the location of a violating
PM2.5 monitor or the creation/change to
a community monitoring zone,
including a description of the proposed
use of spatial averaging for purposes of
making comparisons to the annual PM2.5
NAAQS as set forth in appendix N to
part 50 of this chapter. The affected
State or local agency must document the
process for providing public hearings
and include any comments received
through the public notification process
within their submitted plan.
(e) The State, or where applicable
local, agency shall perform and submit
to the EPA Regional Administrator an
assessment of the air quality
surveillance system every 5 years to
determine, at a minimum, if the network
meets the monitoring objectives defined
in appendix D to this part, whether new
sites are needed, whether existing sites
are no longer needed and can be
terminated, and whether new
technologies are appropriate for
incorporation into the ambient air
monitoring network. For PM2.5, the
assessment also must identify needed
changes to population-oriented sites.
The State, or where applicable local,
agency must submit a copy of this 5-year assessment, along with a revised
annual network plan, to the Regional
Administrator. The first assessment is
due July 1, 2009. For PM10-2.5, each
assessment due on or after July 1, 2014
must identify needed changes to the
identification of whether each site is
suitable or unsuitable for comparison to
the NAAQS under the criteria of
§ 58.30(b), based on changes in
emissions sources affecting the site or
better information about these sources.
(f) All proposed additions and
discontinuations of monitors in annual
monitoring network plans and periodic
network assessments are subject to
approval according to § 58.14.
§ 58.11 Network technical requirements.
(a) State and local governments shall
follow the applicable quality assurance
criteria contained in appendix A to this
part when operating the SLAMS and
SPM networks. The owner or operator of
an existing or a proposed source shall
follow the quality assurance criteria in
appendix A to this part that apply to
PSD monitoring when operating a PSD
site.
(b) State and local governments must
follow the criteria in appendix C to this
part to determine acceptable monitoring
methods or instruments for use in
SLAMS networks. Appendix C criteria
are optional at SPM stations.
(c) State and local governments must
follow the network design criteria
contained in appendix D to this part in
designing and maintaining the SLAMS
stations. The final network design and
all changes in design are subject to
approval of the Regional Administrator.
NCore, STN, and PAMS network design
and changes are also subject to approval
of the Administrator. Changes in SPM
stations do not require approvals, but a
change in the designation of a
monitoring site from SLAMS to SPM
requires approval of the Regional
Administrator.
(d) State and local governments must
follow the criteria contained in
appendix E to this part for siting
monitor inlets, paths or probes at
SLAMS stations. Appendix E adherence
is optional for SPM stations that do not
use appendix C methods.
§ 58.12 Operating schedules.
State and local governments shall
collect ambient air quality data at any
SLAMS station on the following
operational schedules:
(a) For continuous analyzers,
consecutive hourly averages must be
collected except during:
(1) Periods of routine maintenance,
(2) Periods of instrument calibration,
or
(3) Periods or monitoring seasons
exempted by the Regional
Administrator.
(b) For Pb and PM10 manual methods,
at least one 24-hour sample must be
collected every 6 days except during
periods or seasons exempted by the
Regional Administrator.
(c) For PAMS VOC samplers, samples
must be collected as specified in section
5 of appendix D to this part. Area-specific PAMS operating schedules
must be included as part of the PAMS
network description and must be
approved by the Regional
Administrator.
(d) For manual PM2.5 samplers:
(1) Manual PM2.5 samplers at other
SLAMS stations must operate on at least
a 1-in-3 day schedule at sites without a
collocated continuously operating PM2.5
monitor. For SLAMS PM2.5 sites with
both manual and continuous PM2.5
monitors operating, the PM2.5 manual
sampler may be operated with a 1-in-6
day sampling frequency under certain
conditions. A monitoring agency may
request approval for a reduction to 1-in-6 day PM2.5 sampling at SLAMS stations
or for seasonal sampling from the EPA
Regional Administrator. The EPA
Regional Administrator may grant
sampling frequency reductions after
consideration of the historical PM2.5
data quality assessments, the location of
current PM2.5 design value sites, and
their regulatory data needs. Sites with design values within ±10 percent of the NAAQS, and sites where the 24-hour values exceed the NAAQS during a period of 3 years, are required to maintain at least a 1-in-3 day sampling frequency.
(2) Manual PM2.5 samplers at NCore
stations and required regional
background and regional transport sites
must operate on at least a 1-in-3 day
sampling frequency.
(3) Manual PM2.5 speciation samplers at STN stations must operate on a 1-in-3 day sampling frequency.
(e) Manual PM10-2.5 samplers at
SLAMS stations must operate on a daily
schedule at sites without a collocated
continuously operating equivalent
PM10-2.5 method that has been
designated in accordance with part 53 of
this chapter.
§ 58.13 Monitoring network completion.
(a) The network of PM10-2.5 sites must
be physically established no later than
January 1, 2009, and at that time,
operating under all of the requirements
of this part, including the requirements
of appendices A, C, D, E, and G to this
part.
(b) The network of NCore
multipollutant sites must be physically
established no later than January 1,
2011, and at that time, operating under
all of the requirements of this part,
including the requirements of
appendices A, C, D, E, and G to this
part.
§ 58.14 System modification.
(a) The State, or where appropriate
local, agency shall develop and
implement a plan and schedule to
modify the ambient air quality
monitoring network that complies with
the findings of the network assessments
required every 5 years by § 58.10(e). The
State or local agency shall consult with
the EPA Regional Administrator during
the development of the schedule to
modify the monitoring program, and
shall make the plan and schedule
available to the public for 30 days prior
to submission to the EPA Regional
Administrator. The final plan and
schedule are subject to the approval of
the EPA Regional Administrator, who
shall provide opportunity for public
comment and shall approve or
disapprove the plan and schedule
within 120 days.
(b) Nothing in this section shall
preclude the State, or where appropriate
local, agency from making modifications
to the SLAMS network for reasons other
than those resulting from the periodic
network assessments. These
modifications must be reviewed and
approved by the Regional
Administrator. Each monitoring
network may make or be required to
make changes between the 5-year
assessment periods, including for
example, site relocations or the addition
of PAMS networks in bumped-up ozone
nonattainment areas. These
modifications must address changes
invoked by a new census and changes
due to changing air quality levels. The
State, or where appropriate local,
agency shall provide written
communication describing the network
changes to the Regional Administrator
for review and approval as these
changes are identified.
(c) State, or where appropriate, local
agency requests for monitor station
discontinuation, subject to the review of
the Regional Administrator, will be
approved if any of the following criteria
are met. Other requests for
discontinuation may also be approved
on a case-by-case basis if discontinuance
does not compromise data collection
needed for implementation of a
NAAQS.
(1) Any PM2.5, O3, CO, PM10, SO2, Pb,
or NO2 monitor which has shown
attainment during the previous five
years, that has a probability of less than
10 percent of exceeding 80 percent of
the applicable NAAQS during the next
three years based on the levels, trends,
and variability observed in the past, and
which is not specifically required by an
attainment plan or maintenance plan.
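The screening test in paragraph (c)(1) combines recent levels, trend, and variability into a probability estimate. The sketch below shows one hypothetical way such an estimate could be computed, fitting a linear trend with normally distributed residuals to the previous five annual design values; the rule does not prescribe a statistical model, and the function name and model choice are illustrative assumptions only.

```python
# Hypothetical implementation of the paragraph (c)(1) screening test: estimate
# the probability that a monitor's annual design value exceeds 80 percent of
# the NAAQS in any of the next three years. The linear-trend-plus-normal-
# residuals model is an assumption, not one prescribed by the rule.
import statistics

def may_discontinue(design_values, naaqs_level, horizon_years=3):
    """design_values: annual design values for the previous five years, oldest first."""
    n = len(design_values)
    years = list(range(n))
    mean_x = statistics.mean(years)
    mean_y = statistics.mean(design_values)
    num = sum((x - mean_x) * (y - mean_y) for x, y in zip(years, design_values))
    den = sum((x - mean_x) ** 2 for x in years)
    slope = num / den
    intercept = mean_y - slope * mean_x
    residuals = [y - (intercept + slope * x) for x, y in zip(years, design_values)]
    resid_sd = statistics.stdev(residuals)
    threshold = 0.8 * naaqs_level
    # Probability of exceeding the threshold in ANY of the next horizon_years
    # years, assuming independent normal residuals about the trend line.
    p_none = 1.0
    for k in range(1, horizon_years + 1):
        predicted = intercept + slope * (n - 1 + k)
        if resid_sd == 0:
            p_exceed = 1.0 if predicted > threshold else 0.0
        else:
            p_exceed = 1.0 - statistics.NormalDist(predicted, resid_sd).cdf(threshold)
        p_none *= 1.0 - p_exceed
    return (1.0 - p_none) < 0.10
```

Under this illustrative model, a monitor measuring stably at 40 percent of the standard would qualify, while one trending upward near 80 percent of the standard would not.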
(2) Any monitor for CO, PM10, SO2, or
NO2 which has consistently measured
lower concentrations than another
monitor for the same pollutant in the
same county and same nonattainment
area during the previous five years, and
which is not specifically required by an
attainment plan or maintenance plan, if
control measures scheduled to be
implemented or discontinued during
the next five years would apply to the
areas around both monitors and have
similar effects on measured
concentrations, such that the retained
monitor would remain the higher
reading of the two monitors being
compared.
(3) For any pollutant, the highest
reading monitor (which may be the only
monitor) in a county (or portion of a
county within a distinct nonattainment
or maintenance area) provided the
monitor has not measured violations of
the applicable NAAQS in the previous
five years, the MSA or CSA within
which the county lies (if any) would
still meet the requirements for the
minimum number of monitors for the
applicable pollutant (if any), and the
approved SIP provides for a specific,
reproducible approach to representing
the air quality of the affected county in
the absence of actual monitoring data.
(4) A monitor which EPA has
determined cannot be compared to the
relevant NAAQS because of the siting of
the monitor, in accordance with § 58.30.
(5) A monitor that is designed to
measure concentrations upwind of an
urban area for purposes of
characterizing transport into the area
and that has not recorded violations of
the relevant NAAQS in the previous five
years, if discontinuation of the monitor
is tied to start-up of another station also
characterizing transport.
§ 58.15 Annual air monitoring data
certification.
(a) Beginning May 1, 2009, the State,
or where appropriate local, agency shall
submit to the EPA Regional
Administrator an annual air monitoring
data certification letter to certify data
collected at all SLAMS and at all SPM
stations that meet appendix C and
appendix E criteria from January 1 to
December 31 of the previous year. The
senior air pollution control officer in
each agency, or their designee, shall
certify that the previous year of ambient
concentration and quality assurance
data are completely submitted to AQS
and that the ambient concentration data
are accurate to the best of her or his
knowledge, taking into consideration
the quality assurance findings.
(b) Along with each certification
letter, the State shall submit to the
Administrator (through the appropriate
Regional Office) an annual summary
report of all the ambient air quality data
from all monitoring stations designated
as SLAMS. The State also shall submit
an annual summary to the appropriate
Regional Administrator of all the
ambient air quality monitoring data
from all FRM, FEM, and ARM at SPM
stations that are described in the State’s
current monitoring network description.
The annual report(s) shall be submitted
for data collected from January 1 to
December 31 of the previous year. The
annual summary report(s) must contain
all information and data required by the
State’s approved plan and be submitted
by July 1 of each year, unless an
approved alternative date is included in
the plan. The annual summary serves as
the record of the specific data that is the
object of the certification letter.
§ 58.16 Data submittal.
(a) The State, or where appropriate,
local agency, shall report to the
Administrator, via AQS all ambient air
quality data and associated quality
assurance data for SO2, CO, O3, NO2,
NO, NOY, Pb, PM10, PM2.5 mass
concentration, for filter-based PM2.5
FRM/FEM (field blank mass, sampler-generated average daily temperature,
sampler-generated average daily
pressure), chemically speciated PM2.5
mass concentration data, PM10-2.5 (mass
concentration and chemically speciated
data), meteorological data from NCore
and PAMS sites, and metadata records
and information specified by the AQS
Data Coding Manual (https://
www.epa.gov/ttn/airs/airsaqs/manuals/
manuals.htm). Such air quality data and
Federal Register / Vol. 71, No. 10 / Tuesday, January 17, 2006 / Proposed Rules
information must be submitted directly
to the AQS via electronic transmission
on the specified quarterly schedule
described in paragraph (b) of this
section.
(b) The specific quarterly reporting
periods are January 1–March 31, April
1–June 30, July 1–September 30, and
October 1–December 31. The data and
information reported for each reporting
period must contain all data and
information gathered during the
reporting period, and be received in the
AQS within 90 days after the end of the
quarterly reporting period. For example,
the data for the reporting period January
1–March 31 are due on or before June
30 of that year.
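The quarterly schedule in paragraphs (a) and (b) can be sketched as a small date calculation: find the reporting period covering a sample date, then add 90 days to the quarter's end. Function names are illustrative, not from the rule.

```python
# Sketch of the § 58.16 quarterly submittal schedule: the reporting period
# covering a sample date, and the AQS receipt deadline 90 days after the end
# of that quarter.
from datetime import date, timedelta

def reporting_period(sample_date):
    q = (sample_date.month - 1) // 3          # quarter index 0..3
    start = date(sample_date.year, 3 * q + 1, 1)
    end_month = 3 * q + 3
    # Last day of the quarter's final month.
    if end_month == 12:
        end = date(sample_date.year, 12, 31)
    else:
        end = date(sample_date.year, end_month + 1, 1) - timedelta(days=1)
    return start, end

def aqs_deadline(sample_date):
    _, quarter_end = reporting_period(sample_date)
    return quarter_end + timedelta(days=90)
```

For data collected in February, this yields the January 1–March 31 period; a literal 90-day count from March 31 lands on June 29, essentially the same window as the on-or-before June 30 example the rule gives for that quarter.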
(c) Air quality data submitted for each
reporting period must be edited,
validated, and entered into the AQS
(within the time limits specified in
paragraph (b) of this section) pursuant
to appropriate AQS procedures. The
procedures for editing and validating
data are described in the AQS Data
Coding Manual and in each monitoring
agency’s quality assurance project plan.
(d) The State shall report VOC and if
collected, carbonyl, NH3, and HNO3
data, from PAMS sites to AQS within 6
months following the end of each
quarterly reporting period listed in
paragraph (b) of this section.
(e) The State shall also submit any
portion or all of the SLAMS and SPM
data to the appropriate Regional
Administrator upon request.
Subpart C—Special Purpose Monitors
37. The heading for subpart C is
revised as set forth above.
38. Section 58.20 is revised to read as
follows:
§ 58.20 Special purpose monitors (SPM).
(a) An SPM is defined as any monitor
included in an agency’s monitoring
network that the agency has designated
as a special purpose monitor in its
annual monitoring network plan and in
AQS, and which the agency does not
count when showing compliance with
the minimum requirements of this
subpart for the number and siting of
monitors of various types. Any SPM
operated by an air monitoring agency
must be included in the periodic
assessments and annual monitoring
network plan required by § 58.10. The
plan shall include a statement of
purpose for each SPM and evidence that
the siting and operation of each monitor
meet the requirements of
appendix A where applicable. The
monitoring agency may designate a
monitor as an SPM after January 1, 2007
only if it is a new monitor not
previously included in the monitoring
plan.
(b) Any SPM data collected by an air
monitoring agency using a Federal
reference method (FRM), Federal
equivalent method (FEM), or approved
regional method (ARM) must meet the
requirements of § 58.11, § 58.12, and
appendices A and C to this part.
Compliance with appendix E to this part
is optional but encouraged except when
the monitoring agency’s data objectives
are inconsistent with those
requirements. Data collected at an SPM
meeting these requirements must be
submitted to AQS according to the
requirements of § 58.16. The monitoring
agency must also submit to AQS an
indication of whether the monitor meets
the requirements of appendix E to this
part.
(c) All data from an SPM using an
FRM, FEM, or ARM which has operated
for more than 24 months are eligible
comparison to the relevant NAAQS,
subject to the conditions of § 58.30,
unless the air monitoring agency
demonstrates in the documentation
required in paragraph (a) of this section
that the data from a particular period
do not meet the requirements in
paragraph (b) of this section.
(d) If an SPM using an FRM, FEM, or
ARM is discontinued within 24 months
of start-up, the Administrator will not
use data from the SPM for NAAQS
violation determinations for the PM2.5,
PM10-2.5, ozone, or the annual PM10
NAAQS.
(e) If an SPM using an FRM, FEM, or
ARM is discontinued within 24 months
of start-up, the Administrator will not
use data from the SPM for NAAQS
violation determinations for purposes of
designating an area as nonattainment,
for the CO, SO2, NO2, Pb, or 24-hour
PM10 NAAQS. Such data are eligible for
use in determinations of whether a
nonattainment area has attained one of
these NAAQS.
(f) Prior approval from EPA is not
required for discontinuance of an SPM.
39. Sections 58.21 through 58.28 are
removed.
Subpart D—Comparability of Ambient
Data to NAAQS
40. The heading for subpart D is
revised as set forth above.
41. Section 58.30 is revised to read as
follows:
§ 58.30 Special considerations for data
comparisons to the NAAQS.
(a) Comparability of PM2.5 data. (1)
There are two forms of the PM2.5
NAAQS described in part 50 of this
chapter. The PM2.5 monitoring site
characteristics (see appendix D, section
4.7.1) impact how the resulting PM2.5
data can be compared to the annual
PM2.5 NAAQS form. PM2.5 data that are
representative not of areawide conditions
but rather of relatively unique
population-oriented microscale, localized hot
spot, or unique population-oriented
middle-scale impact sites are only
eligible for comparison to the 24-hour
PM2.5 NAAQS. For example, if the PM2.5
monitoring site is adjacent to a unique
dominating local PM2.5 source or can be
shown to have average 24-hour
concentrations representative of a
smaller than neighborhood spatial scale,
then data from a monitor at the site
would only be eligible for comparison to
the 24-hour PM2.5 NAAQS.
(2) There are cases where certain
population-oriented, microscale or
middle scale PM2.5 monitoring sites are
determined by the Regional
Administrator to collectively identify a
larger region of localized high ambient
PM2.5 concentrations. In those cases,
data from these population-oriented
sites would be eligible for comparison to
the annual PM2.5 NAAQS.
(b) Comparability of PM10-2.5 data. To
be eligible (or suitable) for comparison
to the PM10-2.5 NAAQS, PM10-2.5 data
must be from a monitoring site that
meets all five of the following
conditions.
(1) The site must be within the
boundaries of an urbanized area as
defined by the U.S. Bureau of the
Census which has a population of at
least 100,000 persons.
(2) The site must be in a census block
group with a population density of 500
or more persons per square mile.
Alternatively, the site may be in a
census block group with a lower
population density if the block group is
part of an enclave that is not more than
five square miles in land area.
(3) The site must be population-oriented.
(4) The site may not be in source-influenced microenvironments (such as
a microscale or localized hot spot site)
not eligible for comparison to the
annual PM2.5 NAAQS under the
conditions of paragraph (a) of this
section. For example, if the PM10-2.5
monitoring site is located on the
fenceline of a dominating local PM10-2.5
source, then data from a monitor at the
site would not be eligible for
comparison to the 24-hour PM10-2.5
NAAQS.
(5) PM10-2.5 concentrations at the site
must be dominated by resuspended dust
from high-density traffic on paved roads
and PM generated by industrial sources
and construction sources, and must not
be dominated by rural windblown dust
and soils and PM generated by
agricultural and mining sources, as
determined by the State (and approved
by the Regional Administrator) in a site-specific assessment. The site-specific
assessment shall consider the types and
sizes of sources that may impact the
site, the impact of meteorological
conditions on site-source relationships,
verification that the site is not exposed
to windblown rural dust and soil or
emissions from agriculture and mining
to such an extent that those sources
would dominate the mix of PM10-2.5
sampled at that site, and other factors
necessary for completing the
assessment.
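The five comparability conditions in paragraph (b) amount to a checklist. The following is a hypothetical encoding of that checklist; the dataclass and field names are illustrative, and the underlying determinations (population figures, dust dominance) come from census data and the State's approved site-specific assessment, not from code.

```python
# Hypothetical checklist for the five PM10-2.5 comparability conditions of
# paragraph (b). All names are illustrative assumptions.
from dataclasses import dataclass
from typing import Optional

@dataclass
class Pm10_25Site:
    urbanized_area_population: int       # condition (1): urbanized-area population
    block_group_density: float           # condition (2): persons per square mile
    enclave_area_sq_mi: Optional[float]  # alternative to (2): enclave land area, if any
    population_oriented: bool            # condition (3)
    source_influenced_microscale: bool   # condition (4): e.g., a fenceline site
    dominated_by_urban_sources: bool     # condition (5): per the approved assessment

def eligible_for_naaqs_comparison(site: Pm10_25Site) -> bool:
    density_ok = (site.block_group_density >= 500
                  or (site.enclave_area_sq_mi is not None
                      and site.enclave_area_sq_mi <= 5))
    return (site.urbanized_area_population >= 100_000
            and density_ok
            and site.population_oriented
            and not site.source_influenced_microscale
            and site.dominated_by_urban_sources)
```

All five conditions must hold; a single failing condition, such as a source-influenced fenceline siting, makes the site's data ineligible for comparison to the PM10-2.5 NAAQS.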
42. Sections 58.31 through 58.36 are
removed.
Subpart E—[Removed and Reserved]
43. Subpart E of part 58 is removed
and reserved.
Subpart F—[Amended]
44. Section 58.50 is revised to read as
follows:
§ 58.50 Index reporting.
(a) The State or, where applicable,
local agency shall report to the general
public on a daily basis through
prominent notice an air quality index
that complies with the requirements of
appendix G to this part.
(b) Reporting is required for all
individual MSAs with a population
exceeding 350,000.
(c) The population of an MSA for
purposes of index reporting is the most
recent decennial U.S. census
population.
Subpart G—[Amended]
45. Sections 58.60 and 58.61 are
revised to read as follows:
§ 58.60 Federal monitoring.
The Administrator may locate and
operate an ambient air monitoring site if
the State or local agency fails to locate,
or schedule to be located, during the
initial network design process, or as a
result of the 5-year network assessments
required by § 58.10, a SLAMS station at
a site that is necessary, in the judgment
of the Regional Administrator, to meet
the objectives defined in appendix D to
this part.
§ 58.61 Monitoring other pollutants.
The Administrator may promulgate
criteria similar to those referenced in
subpart B of this part for monitoring a
pollutant for which an NAAQS does not
exist. Such an action would be taken
whenever the Administrator determines
that a nationwide monitoring program is
necessary to monitor such a pollutant.
49. Appendix A to part 58 is revised
to read as follows:
Appendix A to Part 58—Quality
Assurance Requirements for SLAMS,
NCore, and PSD Air Monitoring
1. General Information.
2. Quality System Requirements.
3. Measurement Quality Check
Requirements.
4. Calculations for Data Quality
Assessments.
5. Reporting Requirements.
6. References.
1. General Information.
This appendix specifies the minimum
quality system requirements applicable to
SLAMS air monitoring data and PSD data
submitted to EPA. In this section, NCore
stations and SPM stations (using FRM, FEM,
or ARM methods) are considered a subset of
the SLAMS network. Monitoring
organizations are encouraged to develop and
maintain quality systems more extensive
than the required minimums. The permit-granting authority for PSD may impose more
frequent or more stringent requirements.
Monitoring organizations may, based on their
quality objectives, be required to develop and
maintain quality systems beyond the
required minimum. Additional guidance for
the requirements reflected in this appendix
can be found in the ‘‘Quality Assurance
Handbook for Air Pollution Measurement
Systems’’, volume II, part 1 (see reference 10
of this appendix) and at a national level in
references 1, 2, and 3 of this appendix.
1.1 Similarities and Differences Between
SLAMS and PSD Monitoring. In most cases,
the quality assurance requirements for
SLAMS and PSD are the same. Table A–1 of
this appendix summarizes the major
similarities and differences of the
requirements for SLAMS and PSD. Both
programs require:
(a) The development, documentation, and
implementation of an approved quality
system;
(b) The assessment of data quality;
(c) The use of reference, equivalent, or
approved methods (optional for SPM);
(d) The use of calibration standards
traceable to NIST or other primary standard;
(e) Performance evaluations and systems audits.
1.1.1 The monitoring and quality
assurance responsibilities for SLAMS are
with the State or local agency, hereafter
called the monitoring organization, whereas
for PSD they are with the owner/operator
seeking the permit. The monitoring duration
for SLAMS is indefinite, whereas for PSD the
duration is usually 12 months. Whereas the
reporting period for precision and accuracy
data is on an annual or calendar quarter basis
for SLAMS, it is on a continuing sampler
quarter basis for PSD—since the monitoring
may not commence at the beginning of a
calendar quarter.
1.1.2 The performance evaluations for
PSD must be conducted by personnel
different from those who perform routine
span checks and calibrations, whereas for
SLAMS, it is the preferred but not the
required condition. For PSD, the evaluation
rate is 100 percent of the sites per reporting
quarter whereas for SLAMS it is 25 percent
of the sites or instruments quarterly. Note
that monitoring for sulfur dioxide (SO2) and
nitrogen dioxide (NO2) for PSD must be done
with automated analyzers—the manual
bubbler methods are not permitted.
1.1.3 The requirements for precision
assessment for the automated methods are
the same for both SLAMS and PSD. However,
for manual methods, only one collocated site
is required for PSD.
1.1.4 The precision, accuracy and bias
data for PSD are reported separately for each
sampler (site), whereas for SLAMS, the report
may be by sampler (site) or primary quality
assurance organization, depending on the
pollutant. SLAMS data are required to be
reported to the AQS, PSD data are required
to be reported to the permit-granting
authority. The requirements in this appendix,
with the exception of the differences
discussed in this section and in Table A–1
of this appendix, are expected to be
followed by both SLAMS and PSD networks
unless otherwise specified in a particular
section.
1.2 Measurement Uncertainty.
Measurement uncertainty is a term used to
describe deviations from a true concentration
or estimate that are related to the
measurement process and not to spatial or
temporal population attributes of the air
being measured. Monitoring organizations
must develop quality assurance project plans
(QAPP) which describe how the organization
intends to control measurement uncertainty
to an appropriate level in order to achieve the
data quality objectives. Data quality
indicators associated with measurement
uncertainty include:
(a) Precision. A measure of mutual
agreement among individual measurements
of the same property usually under
prescribed similar conditions, expressed
generally in terms of the standard deviation.
(b) Bias. The systematic or persistent
distortion of a measurement process which
causes errors in one direction.
(c) Accuracy. The degree of agreement
between an observed value and an accepted
reference value. Accuracy includes a
combination of random error (imprecision)
and systematic error (bias) components
which are due to sampling and analytical
operations.
(d) Completeness. A measure of the
amount of valid data obtained from a
measurement system compared to the
amount that was expected to be obtained
under correct, normal conditions.
(e) Detectability. The low critical range
value of a characteristic that a method
specific procedure can reliably discern.
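The indicator definitions in section 1.2 can be illustrated with elementary calculations: precision as a standard deviation of repeated measurements, bias as the mean signed deviation from a reference value, and completeness as the ratio of valid to expected data. These are generic textbook forms for illustration only, not the specific estimators this appendix prescribes in its calculation sections; accuracy, as defined above, combines the precision and bias components.

```python
# Illustrative calculations for the section 1.2 data quality indicators.
# Generic forms only, not the appendix's prescribed estimators.
import statistics

def precision(values):
    """Precision: standard deviation of repeated measurements of one property."""
    return statistics.stdev(values)

def bias(values, reference):
    """Bias: systematic offset of measurements from an accepted reference value."""
    return statistics.mean(values) - reference

def completeness(valid_count, expected_count):
    """Completeness: percent of expected observations that were valid."""
    return 100.0 * valid_count / expected_count
```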
1.3 Measurement Quality Checks. The
SLAMS measurement quality checks
described in sections 3.2 and 3.3 of this
appendix shall be reported to AQS and are
included in the data required for
certification. The PSD network is required to
implement the measurement quality checks
and submit this information quarterly along
with assessment information to the permit-granting authority.
1.4 Assessments and Reports. Periodic
assessments and documentation of data
quality are required to be reported to EPA or
to the permit-granting authority (PSD). To
provide national uniformity in this
assessment and reporting of data quality for
all networks, specific assessment and
reporting procedures are prescribed in detail
in sections 3, 4, and 5 of this appendix. On
the other hand, the selection and extent of
the quality assurance and quality control
activities used by a monitoring organization
depend on a number of local factors such as
field and laboratory conditions, the
objectives for monitoring, the level of data
quality needed, the expertise of assigned
personnel, the cost of control procedures,
pollutant concentration levels, etc. Therefore,
quality system requirements in section 2 of
this appendix are specified in general terms
to allow each monitoring organization to
develop a quality system that is most
efficient and effective for its own
circumstances while achieving the data
quality objectives required for the SLAMS
sites.
2. Quality System Requirements.
A quality system is the means by which an
organization manages the quality of the
monitoring information it produces in a
systematic, organized manner. It provides a
framework for planning, implementing,
assessing and reporting work performed by
an organization and for carrying out required
quality assurance and quality control
activities.
2.1 Quality Management Plans and
Quality Assurance Project Plans. All
monitoring organizations must develop a
quality system that is described and
approved in quality management plans
(QMP) and quality assurance project plans
(QAPP) to ensure that the monitoring results:
(a) Meet a well-defined need, use, or
purpose;
(b) Provide data of adequate quality for the
intended monitoring objectives;
(c) Satisfy stakeholder expectations;
(d) Comply with applicable standards
specifications;
(e) Comply with statutory (and other)
requirements of society; and
(f) Reflect consideration of cost and
economics.
2.1.1 The QMP describes the quality
system in terms of the organizational
structure, functional responsibilities of
management and staff, lines of authority, and
required interfaces for those planning,
implementing, assessing and reporting
activities involving environmental data
operations (EDO). The QMP must be suitably
documented in accordance with EPA
requirements (reference 2 of this appendix),
and approved by the appropriate Regional
Administrator, or Regional Administrator’s
designee. The quality system will be
reviewed during the systems audits described
in section 2.5 of this appendix. Organizations
that implement long-term monitoring
programs with EPA funds should have a
separate QMP document. Smaller
organizations or organizations that do
infrequent work with EPA funds may
combine the QMP with the QAPP based on
negotiations with the funding agency.
Additional guidance on this process can be
found in reference 10 of this appendix.
Approval of the recipient’s QMP by the
appropriate Regional Administrator, or the
Regional Administrator’s designee, may
allow delegation of the authority to review
and approve QAPP to the recipient, based on
adequacy of quality assurance procedures
described and documented in the QMP. The
QAPP will be reviewed by EPA during
systems audits or circumstances related to
data quality.
2.1.2 The QAPP is a formal document
describing, in sufficient detail, the quality
system that must be implemented to ensure
that the results of work performed will satisfy
the stated objectives. The quality assurance
policy of the EPA requires every EDO to have
a written and approved QAPP prior to the start
of the EDO. It is the responsibility of the
monitoring organization to adhere to this
policy. The QAPP must be suitably
documented in accordance with EPA
requirements (reference 3 of this appendix).
2.1.3 The monitoring organization’s
quality system must have adequate resources
both in personnel and funding to plan,
implement, assess and report on the
achievement of the requirements of this
appendix and its approved QAPP.
2.2 Independence of Quality Assurance.
The monitoring organization must provide
for a quality assurance management function,
that is, the aspect of the overall management system
of the organization that determines and
implements the quality policy defined in a
monitoring organization’s QMP. Quality
management includes strategic planning,
allocation of resources and other systematic
planning activities (e.g. planning,
implementation, assessing and reporting)
pertaining to the quality system. The quality
assurance management function must have
sufficient technical expertise and
management authority to conduct
independent oversight and assure the
implementation of the organization’s quality
system relative to the Ambient Air Quality
Monitoring Program and should be
organizationally independent of
environmental data generation activities.
2.3 Data Quality Performance
Requirements.
2.3.1 Data Quality Objectives. Data
quality objectives (DQO) or the results of
other systematic planning processes are
statements that define the appropriate type of
data to collect and specify the tolerable levels
of potential decision errors that will be used
as a basis for establishing the quality and
quantity of data needed to support the
objectives of the SLAMS stations. DQO will
be developed by EPA to support the primary
SLAMS objectives for each criteria pollutant.
As they are developed they will be added to
the regulation. DQO or the results of other
systematic planning processes for PSD or
other monitoring will be the responsibility of
the monitoring organizations. The quality of
the conclusions made from data
interpretation can be affected by population
uncertainty (spatial or temporal uncertainty)
and measurement uncertainty (uncertainty
associated with collecting, analyzing,
reducing and reporting concentration data).
This appendix focuses on assessing and
controlling measurement uncertainty.
2.3.1.1 Measurement Uncertainty for
Automated and Manual PM2.5 Methods. The
goal for acceptable measurement uncertainty
is defined as 10 percent coefficient of
variation (CV) for total precision and ± 10
percent for total bias.
2.3.1.2 Measurement Uncertainty for
Automated Ozone Methods. The goal for
acceptable measurement uncertainty is
defined for precision as an upper 90 percent
confidence limit for the coefficient of variation
(CV) of 7 percent and for bias as an upper 95
percent confidence limit for the absolute bias
of 7 percent.
2.3.1.3 Measurement Uncertainty for
PM10-2.5 Methods. The goal for acceptable
measurement uncertainty is defined for
precision as an upper 90 percent confidence
limit for the coefficient of variation (CV) of 15
percent and for bias as an upper 95 percent
confidence limit for the absolute bias of 15
percent.
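Sections 2.3.1.1 through 2.3.1.3 state goals in terms of a coefficient of variation for precision and an absolute percent bias. As a simplified illustration, collocated-sampler data can be screened against the PM2.5 goals using the percent difference of each collocated pair; the exact estimators, including the upper confidence limits in sections 2.3.1.2 and 2.3.1.3, are defined in section 4 of this appendix, so the statistics below are proxies, and the function names are illustrative.

```python
# Simplified screen of collocated PM2.5 data against the section 2.3.1.1 goals
# (10 percent CV for precision, +/-10 percent for bias). Proxy statistics only;
# the prescribed estimators are in section 4 of this appendix.
import statistics

def percent_differences(primary, collocated):
    # Signed percent difference of each pair, relative to the pair's mean.
    return [200.0 * (y - x) / (y + x) for x, y in zip(primary, collocated)]

def meets_pm25_goals(primary, collocated, cv_goal=10.0, bias_goal=10.0):
    d = percent_differences(primary, collocated)
    # Standard deviation of the differences as a precision proxy,
    # their mean as a bias proxy.
    return statistics.stdev(d) <= cv_goal and abs(statistics.mean(d)) <= bias_goal
```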
2.4 National Performance Evaluation
Programs. Monitoring plans or QAPP shall
provide for the implementation of a program
of independent and adequate audits of all
monitors providing data for SLAMS and PSD
including the provision of adequate resources
for such audit programs. A monitoring plan
(or QAPP) which provides for monitoring
organization participation in EPA’s National
Performance Audit Program (NPAP) and the
PM Performance Evaluation Program (PEP)
program and which indicates the consent of
the monitoring organization for EPA to apply
an appropriate portion of the grant funds,
which EPA would otherwise award to the
monitoring organization for monitoring
activities, will be deemed by EPA to meet
this requirement. For clarification and to
participate, monitoring organizations should
contact either the appropriate EPA Regional
Quality Assurance (QA) Coordinator at the
appropriate EPA Regional Office location, or
the NPEP Coordinator, Emissions Monitoring
and Analysis Division (D205–02), U.S.
Environmental Protection Agency, Research
Triangle Park, NC 27711.
2.5 Technical Systems Audit Program.
Technical systems audits of each ambient air
monitoring organization shall be conducted
at least every 3 years by the appropriate EPA
Regional Office and reported to the AQS.
Systems audit programs are described in
reference 10 of this appendix. For further
instructions, monitoring organizations
should contact the appropriate EPA Regional
QA Coordinator.
2.6 Gaseous and Flow Rate Audit
Standards.
2.6.1 Gaseous pollutant concentration
standards (permeation devices or cylinders of
compressed gas) used to obtain test
concentrations for carbon monoxide (CO),
sulfur dioxide (SO2), nitrogen oxide (NO),
and nitrogen dioxide (NO2) must be traceable
to either a National Institute of Standards and
Technology (NIST) Traceable Reference
Material (NTRM) or a NIST-certified Gas
Manufacturer’s Internal Standard (GMIS),
certified in accordance with one of the
procedures given in reference 4 of this
appendix. Vendors advertising certification
with the procedures provided in reference 4
of this appendix and distributing gases as
‘‘EPA Protocol Gas’’ must participate in the
EPA Protocol Gas Verification Program or not
use ‘‘EPA’’ in any form of advertising.
2.6.2 Test concentrations for ozone (O3)
must be obtained in accordance with the
ultraviolet photometric
procedure specified in appendix D to part 50
of this chapter, or by means of a certified O3
transfer standard. Consult references 7 and 8
of this appendix for guidance on primary and
transfer standards for O3.
2.6.3 Flow rate measurements must be
made by a flow measuring instrument that is
traceable to an authoritative volume or other
applicable standard. Guidance for certifying
some types of flowmeters is provided in
reference 10 of this appendix.
2.7 Primary Requirements and Guidance.
Requirements and guidance documents for
developing the quality system are contained
in references 1 through 10 of this appendix,
which also contain many suggested
procedures, checks, and control
specifications. Reference 10 of this appendix
describes specific guidance for the
development of a quality system for SLAMS.
Many specific quality control checks and
specifications for methods are included in
the respective reference methods described
in part 50 of this chapter or in the respective
equivalent method descriptions available
from EPA (reference 6 of this appendix).
Similarly, quality control procedures related
to specifically designated reference and
equivalent method analyzers are contained in
the respective operation or instruction
manuals associated with those analyzers.
3. Measurement Quality Check
Requirements.
This section provides the requirements for
performing the measurement quality checks
that can be used to assess data quality. With
the exception of the flow rate verifications
(sections 3.2.3 and 3.3.2 of this appendix),
the results of these checks are required to be
submitted to the AQS within the same time
frame as routine data.
this appendix describes checks of automated
or continuous instruments, while section 3.3
describes checks associated with manual
sampling instruments. Other quality control
samples are identified in the various
references described earlier and can be used
to control certain aspects of the measurement
system.
3.1 Primary Quality Assurance
Organization. Estimates of data quality will
be calculated on the basis of single monitors
and primary quality assurance organizations.
A primary quality assurance organization is
defined as a monitoring organization or other
organization that is responsible for a set of
stations that monitors the same pollutant and
for which data quality assessments can be
pooled. Each criteria pollutant sampler/
monitor at a monitoring station in the
SLAMS network must be associated with
one, and only one, primary quality assurance
organization.
3.1.1 Each primary quality assurance
organization shall be defined such that
measurement uncertainty among all stations
in the organization can be expected to be
reasonably homogeneous, as a result of
common factors. Common factors that should
be considered by monitoring organizations in
defining primary quality assurance
organizations include:
(a) Operation by a common team of field
operators according to a common set of
procedures;
(b) Use of a common QAPP or standard
operating procedures;
(c) Common calibration facilities and
standards;
(d) Oversight by a common quality
assurance organization; and
(e) Support by a common management,
laboratory or headquarters.
3.1.2 Primary quality assurance
organizations are not necessarily related to
the organization reporting data to the AQS.
Monitoring organizations having difficulty in
defining the primary quality assurance
organizations or in assigning specific sites to
primary quality assurance organizations
should consult with the appropriate EPA
Regional Office. All definitions of primary
quality assurance organizations shall be
subject to final approval by the appropriate
EPA Regional Office during scheduled
network reviews or systems audits.
3.1.3 Assessment results shall be reported
as specified in section 5 of this appendix.
3.2 Measurement Quality Checks of
Automated Methods. Table A–2 of this
appendix provides a summary of the types
and frequency of the measurement quality
checks that will be described in this section.
3.2.1 One-Point Quality Control Check for
SO2, NO2, O3, and CO. A one-point quality
control (QC) check must be performed at
least once every 2 weeks on each automated
analyzer used to measure SO2, NO2, O3 and
CO. The frequency of QC checks may be
reduced based upon review, assessment and
approval of the EPA Regional Administrator.
However, with the advent of automated
calibration systems, more frequent checking is
encouraged. See Reference 10 of this
appendix for guidance on the review
procedure. The QC check is made by
challenging the analyzer with a QC check gas
of known concentration (effective
concentration for open path analyzers)
between 0.01 and 0.10 parts per million
(ppm) for SO2, NO2, and O3, and between 1
and 10 ppm for CO analyzers. The ranges
allow for appropriate check gas selection for
SLAMS sites that may be sampling for
different objectives, i.e., trace gas monitoring
vs. comparison to National Ambient Air
Quality Standards (NAAQS). It is suggested
that the QC check gas concentration selected
should be related to the routine
concentrations normally measured at sites
within the monitoring network in order to
appropriately reflect the precision and bias at
these routine concentration ranges. To check
the precision and bias of SLAMS analyzers
operating at ranges either above or below the
levels identified, use check gases of
appropriate concentrations as approved by
the appropriate EPA Regional Administrator
or their designee. The standards from which
check concentrations are obtained must meet
the specifications of section 2.6 of this
appendix.
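The check gas ranges in section 3.2.1 can be expressed as a small lookup. This is a minimal Python sketch, not part of the rule text; the helper name and data structure are illustrative only.

```python
# One-point QC check gas ranges from section 3.2.1: 0.01-0.10 ppm for
# SO2, NO2, and O3, and 1-10 ppm for CO.
QC_CHECK_RANGES_PPM = {
    "SO2": (0.01, 0.10),
    "NO2": (0.01, 0.10),
    "O3": (0.01, 0.10),
    "CO": (1.0, 10.0),
}

def check_gas_in_range(pollutant: str, concentration_ppm: float) -> bool:
    """Return True if the QC check gas concentration falls in the
    standard range; concentrations outside these ranges require
    Regional Administrator approval per section 3.2.1."""
    low, high = QC_CHECK_RANGES_PPM[pollutant]
    return low <= concentration_ppm <= high
```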
3.2.1.1 Except for certain CO analyzers
described below, point analyzers must
operate in their normal sampling mode
during the QC check, and the test atmosphere
must pass through all filters, scrubbers,
conditioners and other components used
during normal ambient sampling and as
much of the ambient air inlet system as is
practicable. If permitted by the associated
operation or instruction manual, a CO point
analyzer may be temporarily modified during
the QC check to reduce vent or purge flows,
or the test atmosphere may enter the analyzer
at a point other than the normal sample inlet,
provided that the analyzer’s response is not
likely to be altered by these deviations from
the normal operational mode. If a QC check
is made in conjunction with a zero or span
adjustment, it must be made prior to such
zero or span adjustments.
3.2.1.2 Open path analyzers are tested by
inserting a test cell containing a QC check gas
concentration into the optical measurement
beam of the instrument. If possible, the
normally used transmitter, receiver, and as
appropriate, reflecting devices should be
used during the test and the normal
monitoring configuration of the instrument
should be altered as little as possible to
accommodate the test cell for the test.
However, if permitted by the associated
operation or instruction manual, an alternate
local light source or an alternate optical path
that does not include the normal atmospheric
monitoring path may be used. The actual
concentration of the QC check gas in the test
cell must be selected to produce an effective
concentration in the range specified earlier in
this section. Generally, the QC test
concentration measurement will be the sum
of the atmospheric pollutant concentration
and the QC test concentration. If so, the
result must be corrected to remove the
atmospheric concentration contribution. The
corrected concentration is obtained by
subtracting the average of the atmospheric
concentrations measured by the open path
instrument under test immediately before
and immediately after the QC test from the
QC check gas concentration measurement. If
the difference between these before and after
measurements is greater than 20 percent of
the effective concentration of the test gas,
discard the test result and repeat the test. If
possible, open path analyzers should be
tested during periods when the atmospheric
pollutant concentrations are relatively low
and steady.
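The ambient-contribution correction and the 20 percent stability check described above can be sketched as follows. This is an illustrative Python fragment, not part of the rule text; the function name and units are assumptions.

```python
def corrected_qc_concentration(measured_ppm, ambient_before_ppm,
                               ambient_after_ppm, effective_conc_ppm):
    """Correct an open path QC check reading for the ambient
    contribution (section 3.2.1.2): subtract the average of the
    ambient concentrations measured immediately before and after the
    test. Returns None when the before/after difference exceeds 20
    percent of the effective test gas concentration, in which case the
    test result must be discarded and the test repeated."""
    if abs(ambient_before_ppm - ambient_after_ppm) > 0.20 * effective_conc_ppm:
        return None  # ambient levels not steady: discard and repeat
    return measured_ppm - (ambient_before_ppm + ambient_after_ppm) / 2.0
```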
3.2.1.3 Report the audit concentration
(effective concentration for open path
analyzers) of the QC gas and the
corresponding measured concentration
(corrected concentration, if applicable, for
open path analyzers) indicated by the
analyzer. The percent differences between
these concentrations are used to assess the
precision and bias of the monitoring data as
described in sections 4.1.2 (precision) and
4.1.3 (bias) of this appendix.
3.2.2 Performance evaluation for SO2,
NO2, O3, or CO. Each calendar quarter
(during which analyzers are operated),
evaluate at least 25 percent of the SLAMS
analyzers that monitor for SO2, NO2, O3, or
CO such that each analyzer is evaluated at
least once per year. If there are fewer than
four analyzers for a pollutant within a
primary quality assurance organization, it is
suggested to randomly evaluate one or more
analyzers so that at least one analyzer for that
pollutant is evaluated each calendar quarter.
Where possible, EPA strongly encourages
more frequent evaluations, up to a frequency
of once per quarter for each SLAMS analyzer.
It is also suggested that the evaluation be
conducted by a trained experienced
technician other than the routine site
operator.
3.2.2.1 (a) The evaluation is made by
challenging the analyzer with audit gas
standard of known concentration (effective
concentration for open path analyzers) from
at least three consecutive ranges that are
applicable to the analyzer being evaluated:
Concentration range, ppm

Audit level      O3            SO2             NO2             CO
1                0.02–0.05     0.0003–0.005    0.0002–0.002    0.08–0.10
2                0.06–0.10     0.006–0.01      0.003–0.005     0.50–1.00
3                0.11–0.20     0.02–0.10       0.006–0.10      1.50–4.00
4                0.21–0.30     0.11–0.40       0.11–0.30       5–15
5                0.31–0.90     0.41–0.90       0.31–0.60       20–50
(b) An additional fourth range is encouraged
for those monitors that have the potential to
exceed the concentration ranges covered by
the initial three selected.
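The audit level table above can be encoded as a lookup from a measured concentration to its audit level. This is a minimal Python sketch for illustration only; the names are assumptions, and the range values are taken directly from the table.

```python
# Audit level ranges (ppm) from the table in section 3.2.2.1.
AUDIT_LEVELS_PPM = {
    "O3":  [(0.02, 0.05), (0.06, 0.10), (0.11, 0.20), (0.21, 0.30), (0.31, 0.90)],
    "SO2": [(0.0003, 0.005), (0.006, 0.01), (0.02, 0.10), (0.11, 0.40), (0.41, 0.90)],
    "NO2": [(0.0002, 0.002), (0.003, 0.005), (0.006, 0.10), (0.11, 0.30), (0.31, 0.60)],
    "CO":  [(0.08, 0.10), (0.50, 1.00), (1.50, 4.00), (5.0, 15.0), (20.0, 50.0)],
}

def audit_level(pollutant: str, concentration_ppm: float):
    """Return the 1-based audit level whose range contains the given
    concentration, or None if it falls between or outside the ranges."""
    for level, (low, high) in enumerate(AUDIT_LEVELS_PPM[pollutant], start=1):
        if low <= concentration_ppm <= high:
            return level
    return None
```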
3.2.2.2 (a) NO2 audit gas for
chemiluminescence-type NO2 analyzers must
also contain at least 0.08 ppm NO. NO
concentrations substantially higher than 0.08
ppm, as may occur when using some gas
phase titration (GPT) techniques, may lead to
evaluation errors in chemiluminescence
analyzers due to inevitable minor NO–NOX
channel imbalance. Such errors may be
atypical of routine monitoring errors to the
extent that such NO concentrations exceed
typical ambient NO concentrations at the
site. These errors may be minimized by
modifying the GPT technique to lower the
NO concentrations remaining in the NO2
audit gas to levels closer to typical ambient
NO concentrations at the site.
(b) To evaluate SLAMS analyzers operating
on ranges higher than 0 to 1.0 ppm for SO2,
NO2, and O3 or 0 to 50 ppm for CO, use audit
gases of appropriately higher concentration
as approved by the appropriate EPA Regional
Administrator or the Administrator's
designee.
3.2.2.3 The standards from which audit
gas test concentrations are obtained must
meet the specifications of section 2.6 of this
appendix. The gas standards and equipment
used for evaluations must not be the same as
the standards and equipment used for
calibration or calibration span adjustments.
For SLAMS sites, the auditor should not be
the operator or analyst who conducts the
routine monitoring, calibration, and analysis.
For PSD sites the auditor must not be the
operator or analyst who conducts the routine
monitoring, calibration, and analysis.
3.2.2.4 For point analyzers, the
evaluation shall be carried out by allowing
the analyzer to analyze the audit gas test
atmosphere in its normal sampling mode
such that the test atmosphere passes through
all filters, scrubbers, conditioners, and other
sample inlet components used during normal
ambient sampling and as much of the
ambient air inlet system as is practicable. The
exception provided in section 3.2.1 of this
appendix for certain CO analyzers does not
apply for evaluations.
3.2.2.5 Open path analyzers are evaluated
by inserting a test cell containing the various
audit gas concentrations into the optical
measurement beam of the instrument. If
possible, the normally used transmitter,
receiver, and, as appropriate, reflecting
devices should be used during the
evaluation, and the normal monitoring
configuration of the instrument should be
modified as little as possible to accommodate
the test cell for the evaluation. However, if
permitted by the associated operation or
instruction manual, an alternate local light
source or an alternate optical path that does
not include the normal atmospheric
monitoring path may be used. The actual
concentrations of the audit gas in the test cell
must be selected to produce effective
concentrations in the evaluation level ranges
specified in this section of this appendix.
Generally, each evaluation concentration
measurement result will be the sum of the
atmospheric pollutant concentration and the
evaluation test concentration. If so, the result
must be corrected to remove the atmospheric
concentration contribution. The corrected
concentration is obtained by subtracting the
average of the atmospheric concentrations
measured by the open path instrument under
test immediately before and immediately
after the evaluation test (or preferably before
and after each evaluation concentration level)
from the evaluation concentration
measurement. If the difference between the
before and after measurements is greater than
20 percent of the effective concentration of
the test gas standard, discard the test result
for that concentration level and repeat the
test for that level. If possible, open path
analyzers should be evaluated during periods
when the atmospheric pollutant
concentrations are relatively low and steady.
Also, the monitoring path length must be
reverified to within ±3 percent to validate the
evaluation, since the monitoring path length
is critical to the determination of the effective
concentration.
3.2.2.6 Report both the evaluation
concentrations (effective concentrations for
open path analyzers) of the audit gases and
the corresponding measured concentration
(corrected concentrations, if applicable, for
open path analyzers) indicated or produced
by the analyzer being tested. The percent
differences between these concentrations are
used to assess the quality of the monitoring
data as described in section 4.1.4 of this
appendix.
3.2.3 Flow Rate Verification for
Particulate Matter. A one-point flow rate
verification check must be performed at least
once every month on each automated
analyzer used to measure PM10, PM10-2.5 and
PM2.5. The verification is made by checking
the operational flow rate of the analyzer. If
the verification is made in conjunction with
a flow rate adjustment, it must be made prior
to such flow rate adjustment. Randomization
of the flow rate verification with respect to
time of day, day of week, and routine service
and adjustments is encouraged where
possible. For the standard procedure, use a
flow rate transfer standard certified in
accordance with section 2.6 of this appendix
to check the analyzer’s normal flow rate. Care
should be used in selecting and using the
flow rate measurement device such that it
does not alter the normal operating flow rate
of the analyzer. Report the flow rate of the
transfer standard and the corresponding flow
rate measured (indicated) by the analyzer.
The percent differences between the audit
and measured flow rates are used to assess
the bias of the monitoring data as described
in section 4.2.2 of this appendix (using flow
rates in lieu of concentrations).
3.2.4 Semi-Annual Flow Rate Audit for
Particulate Matter. Every 6 months, audit the
flow rate of the PM10, PM10-2.5 and PM2.5
particulate analyzers. Where possible, EPA
strongly encourages more frequent auditing.
It is also suggested that the audit be
conducted by a trained experienced
technician other than the routine site
operator. The audit is made by measuring the
analyzer’s normal operating flow rate using a
flow rate transfer standard certified in
accordance with section 2.6 of this appendix.
The flow rate standard used for auditing
must not be the same flow rate standard used
to calibrate the analyzer. However, both the
calibration standard and the audit standard
may be referenced to the same primary flow
rate or volume standard. Great care must be
used in auditing the flow rate to be certain
that the flow measurement device does not
alter the normal operating flow rate of the
analyzer. Report the audit flow rate of the
transfer standard and the corresponding flow
rate measured (indicated) by the analyzer.
The percent differences between these flow
rates are used to validate the one-point flow
rate verification checks used to estimate bias
as described in section 4.2.3 of this appendix.
3.2.5 Collocated Procedures for PM10-2.5
and PM2.5. For each pair of collocated
monitors, designate one sampler as the
primary monitor whose concentrations will
be used to report air quality for the site, and
designate the other as the audit monitor.
3.2.5.1 Each EPA designated Federal
reference method (FRM) or Federal
equivalent method (FEM) within a primary
quality assurance organization must:
(a) Have 15 percent of the monitors
collocated (values of 0.5 and greater round
up); and
(b) Have at least 1 collocated monitor (if
the total number of monitors is less than 3).
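The collocation count in 3.2.5.1 can be computed directly from the two clauses above. A minimal Python sketch, assuming standard half-up rounding for the "values of .5 and greater round up" rule; the function name is illustrative.

```python
import math

def required_collocated(total_monitors: int) -> int:
    """Collocated monitors required for a method designation within a
    primary quality assurance organization (section 3.2.5.1): 15
    percent of the monitors, with fractions of .5 and greater rounded
    up, and at least 1 when fewer than 3 monitors are operated."""
    count = math.floor(0.15 * total_monitors + 0.5)  # round half up
    if total_monitors < 3:
        count = max(count, 1)
    return count
```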
The first collocated monitor must be a
designated FRM monitor.
3.2.5.2 In addition, monitors selected for
collocation must also meet the following
requirements:
(a) A primary monitor designated as an
EPA FRM shall be collocated with an audit
monitor having the same EPA FRM method
designation.
(b) For each primary monitor designated as
an EPA FEM, 50 percent of the monitors
designated for collocation shall be collocated
with an audit monitor having the same
method designation and 50 percent of the
monitors shall be collocated with an FRM
audit monitor. If the primary quality
assurance organization only has one FEM
monitor it shall be collocated with an FRM
audit monitor. If there are an odd number of
collocated monitors required, the additional
monitor shall be an FRM audit monitor. An
example of this procedure is found in Table
A–3 of this appendix.
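The 50/50 split rule in 3.2.5.2(b) can be sketched in a few lines. This is an illustrative Python fragment only (Table A–3 gives the authoritative example); the function name is an assumption.

```python
def fem_collocation_split(fem_collocated: int):
    """Split the audit monitors for FEM primary monitors designated
    for collocation (section 3.2.5.2(b)): half share the primary
    monitor's method designation and half are FRMs, with any odd
    remainder (including a single FEM monitor) assigned an FRM audit
    monitor. Returns (same_designation_count, frm_audit_count)."""
    same_designation = fem_collocated // 2
    frm_audit = fem_collocated - same_designation
    return same_designation, frm_audit
```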
3.2.5.3 The collocated monitors should be
deployed according to the following protocol:
(a) 80 percent of the collocated audit
monitors should be deployed at sites with
annual average or daily concentrations
estimated to be within ± 20 percent of the
applicable NAAQS and the remainder at
what the monitoring organizations designate
as high value sites;
(b) If an organization has no sites with
annual average or daily concentrations
within ± 20 percent of the annual NAAQS (or
24-hour NAAQS if that is affecting the area),
60 percent of the collocated audit monitors
should be deployed at those sites with the
annual mean concentrations (or 24-hour
NAAQS if that is affecting the area) among
the highest 25 percent for all sites in the
network.
3.2.5.4 In determining the number of
collocated sites required for PM2.5,
monitoring networks for visibility
assessments should not be treated
independently from networks for particulate
matter, as the separate networks may share
one or more common samplers. However, for
Class I visibility areas, EPA will accept
visibility aerosol mass measurement instead
of a PM2.5 measurement if the latter
measurement is unavailable. Any PM2.5
monitoring site which does not have a
monitor which is an EPA FRM or FEM is not
required to be included in the number of
sites which are used to determine the number
of collocated monitors.
3.2.5.5 For each PSD monitoring network,
one site must be collocated. A site with the
predicted highest 24-hour pollutant
concentration must be selected.
3.2.5.6 The two collocated monitors must
be within 4 meters of each other and at least
2 meters apart for flow rates greater than 200
liters/min or at least 1 meter apart for
samplers having flow rates less than 200
liters/min to preclude airflow interference.
Calibration, sampling, and analysis must be
the same for both collocated samplers and
the same as for all other samplers in the
network.
3.2.5.7 Sample the collocated audit
monitor for SLAMS sites on a 12-day
schedule; sample PSD sites on a 6-day
schedule or every third day for PSD daily
monitors. If a primary quality assurance
organization has only one collocated
monitor, higher sampling frequencies than
the 12-day schedule may be needed in order
to produce approximately 25 valid sample pairs a year.
Report the measurements from both primary
and collocated audit monitors at each
collocated sampling site. The calculations for
evaluating precision between the two
collocated monitors are described in section
4.3.1 of this appendix.
3.2.6 Performance Evaluation Procedures
for PM10-2.5 and PM2.5. (a) The performance
evaluation is an independent assessment
used to estimate total measurement system
bias. These evaluations will be performed
under the PM Performance Evaluation
Program (PEP) (section 2.4 of this appendix)
or a comparable program. Performance
evaluations will be performed on the SLAMS
monitors annually within each primary
quality assurance organization. For primary
quality assurance organizations with less
than or equal to five monitoring sites, five
valid performance evaluation audits must be
collected and reported each year. For primary
quality assurance organizations with greater
than five monitoring sites, eight valid
performance evaluation audits must be
collected and reported each year. A valid
performance evaluation audit means that
both the primary monitor and PEP audit
concentrations are valid and above 3 µg/m3.
Additionally, each year, every designated
FRM or FEM within a primary quality
assurance organization must:
(1) Have each method designation
evaluated each year; and,
(2) Have all FRM or FEM samplers subject
to a PEP audit at least once every six years,
which equates to approximately 15 percent of
the monitoring sites audited each year.
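The annual PEP audit requirements in paragraph (a) reduce to a site-count threshold plus a validity check. A minimal Python sketch; the function names are illustrative, not part of the rule text.

```python
def required_pep_audits(num_sites: int) -> int:
    """Valid PM performance evaluation audits required per year within
    a primary quality assurance organization (section 3.2.6(a)): five
    for organizations with five or fewer monitoring sites, eight for
    more than five."""
    return 5 if num_sites <= 5 else 8

def is_valid_pep_audit(primary_ugm3: float, pep_ugm3: float) -> bool:
    """An audit is valid only when both the primary monitor and PEP
    audit concentrations are valid and above 3 ug/m3."""
    return primary_ugm3 > 3.0 and pep_ugm3 > 3.0
```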
(b) Additional information concerning the
Performance Evaluation Program is contained
in reference 10 of this appendix. The
calculations for evaluating bias between the
primary monitor and the performance
evaluation monitor for PM2.5 are described in
section 4.3.2 of this appendix. The
calculations for evaluating bias between the
primary monitor(s) and the performance
evaluation monitors for PM10-2.5 are described
in section 4.1.3 of this appendix.
3.3 Measurement Quality Checks of
Manual Methods. Table A–2 of this appendix
provides a summary of the types and
frequency of the measurement quality checks
that will be described in this section.
3.3.1 Collocated Procedures for PM10. For
each network of manual PM10 methods,
select 15 percent (or at least one) of the
monitoring sites within the primary quality
assurance organization for collocated
sampling. For purposes of precision
assessment, networks for measuring total
suspended particulate (TSP) and PM10 shall
be considered separately from one another.
PM10 and TSP sites having annual mean
particulate matter concentrations among the
highest 25 percent of the annual mean
concentrations for all the sites in the network
must be selected or, if such sites are
impractical, alternative sites approved by the
EPA Regional Administrator may be selected.
3.3.1.1 In determining the number of
collocated sites required for PM10,
monitoring networks for lead (Pb) should be
treated independently from networks for
particulate matter (PM), even though the
separate networks may share one or more
common samplers. However, a single pair of
samplers collocated at a common-sampler
monitoring site that meets the requirements
for both a collocated Pb site and a collocated
PM site may serve as a collocated site for
both networks.
3.3.1.2 The two collocated monitors must
be within 4 meters of each other and at least
2 meters apart for flow rates greater than 200
liters/min or at least 1 meter apart for
samplers having flow rates less than 200
liters/min to preclude airflow interference.
Calibration, sampling, analysis and
verification/validation procedures must be
the same for both collocated samplers and
the same as for all other samplers in the
network.
3.3.1.3 For each pair of collocated
samplers, designate one sampler as the
primary sampler whose samples will be used
to report air quality for the site, and designate
the other as the audit sampler. Sample
SLAMS sites on a 12-day schedule; sample
PSD sites on a 6-day schedule or every third
day for PSD daily samplers. If a primary
quality assurance organization has only one
collocated monitor, higher sampling
frequencies than the 12-day schedule may be
needed in order to produce 25 valid sample
pairs a year. Report the measurements from
both samplers at each collocated sampling
site. The calculations for evaluating precision
between the two collocated samplers are
described in section 4.2.1 of this appendix.
3.3.2 Flow Rate Verification for
Particulate Matter. Follow the same
procedure as described in section 3.2.3 of
this appendix for PM2.5, PM10, PM10-2.5 and
TSP instruments. The percent differences
between the audit and measured flow rates
are used to assess the bias of the monitoring
data as described in section 4.2.2 of this
appendix.
3.3.3 Semi-Annual Flow Rate Audit for
Particulate Matter. Follow the same
procedure as described in section 3.2.4 of
this appendix for PM2.5, PM10, PM10-2.5 and
TSP instruments. The percent differences
between these flow rates are used to validate
the one-point flow rate verification checks
used to estimate bias as described in section
4.2.3 of this appendix. Great care must be
used in auditing high-volume particulate
matter samplers having flow regulators
because the introduction of resistance plates
in the audit flow standard device can cause
abnormal flow patterns at the point of flow
sensing. For this reason, the flow audit
standard should be used with a normal filter
in place and without resistance plates in
auditing flow-regulated high-volume
samplers, or other steps should be taken to
assure that flow patterns are not perturbed at
the point of flow sensing.
3.3.4 Pb Methods.
3.3.4.1 Annual Flow Rate. For the Pb
Reference Method (40 CFR part 50, appendix
G), the flow rates of the high-volume Pb
samplers shall be verified and audited using
the same procedures described in sections
3.3.2 and 3.3.3 of this appendix.
3.3.4.2 Pb Strips. Each calendar quarter or
sampling quarter (PSD), audit the Pb
Reference Method analytical procedure using
glass fiber filter strips containing a known
quantity of Pb. These audit sample strips are
prepared by depositing a Pb solution on
unexposed glass fiber filter strips of
dimensions 1.9 centimeters (cm) by 20.3 cm
(3/4 inch by 8 inch) and allowing them to dry
thoroughly. The audit samples must be
prepared using batches of reagents different
from those used to calibrate the Pb analytical
equipment being audited. Prepare audit
samples in the following concentration
ranges:
Range    Pb concentration, µg/strip    Equivalent ambient Pb concentration, µg/m3 1
1        100–300                       0.5–1.5
2        400–1000                      3.0–5.0

1 Equivalent ambient Pb concentration in µg/m3 is based on sampling at 1.7 m3/min for 24 hours on a 20.3 cm x 25.4 cm (8 inch x 10 inch)
glass fiber filter.
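The strip-to-ambient equivalence in the table footnote can be sketched numerically. This is an illustrative Python fragment, not part of the rule text: the 24-hour sample volume follows from the footnote (1.7 m3/min for 24 hours), but the strips-per-filter count used here (12) is an assumption for illustration and is not stated in the rule.

```python
# Equivalence from the table footnote: sampling at 1.7 m3/min for 24
# hours on a 20.3 cm x 25.4 cm glass fiber filter.
FLOW_M3_PER_MIN = 1.7
SAMPLE_MINUTES = 24 * 60
STRIPS_PER_FILTER = 12  # ASSUMED yield of 1.9 cm strips per filter

def equivalent_ambient_pb(ug_per_strip: float) -> float:
    """Convert a Pb audit strip loading (ug/strip) to the approximate
    equivalent ambient concentration (ug/m3) over the 24-hour sample
    volume (1.7 m3/min x 1440 min = 2448 m3)."""
    total_volume_m3 = FLOW_M3_PER_MIN * SAMPLE_MINUTES
    return ug_per_strip * STRIPS_PER_FILTER / total_volume_m3
```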
4.1 Statistics for the Assessment of QC
Checks for SO2, NO2, O3 and CO.
4.1.1 Percent Difference. All
measurement quality checks start with a
comparison of an audit concentration or
value (flowrate) to the concentration/value
measured by the analyzer and use percent
difference as the comparison statistic as
described in equation 1 of this section.
For each single point check, calculate the
percent difference, di, as follows:
Equation 3
bias = AB + t 0.95, n −1 ⋅
Equation 4
meas − audit
⋅100
audit
CV =
where, X 0.1,n¥1 is the 10th percentile of a
chi-squared distribution with n–1 degrees of
freedom.
4.1.3 Bias Estimate. The bias estimate is
calculated using the one-point QC checks for
SO2, NO2, O3, or CO described in section
3.2.1 of this appendix and the performance
evaluation program for PM10-2.5 described in
section 3.2.6 of this appendix. The bias
estimator is an upper bound on the mean
absolute value of the percent differences as
described in equation 3 of this section:
PO 00000
Frm 00080
Fmt 4701
Sfmt 4700
and the quantity AS is the standard deviation
of the absolute value of the di’s and is
calculated using equation 5 of this section:
Equation 5
AS =
n
n
2
n ⋅ ∑ di − ∑ di
i =1
i =1
n ( n − 1)
2
4.1.3.1 Assigning a sign (positive/
negative) to the bias estimate. Since the bias
statistic as calculated in equation 3 of this
appendix uses absolute values, it does not
have a tendency (negative or positive bias)
associated with it. A sign will be designated
by rank ordering the percent differences of
the QC check samples from a given site for
a particular assessment interval.
4.1.3.2 Calculate the 25th and 75th
percentiles of the percent differences for each
site. The absolute bias upper bound should
be flagged as positive if both percentiles are
positive and negative if both percentiles are
negative. The absolute bias upper bound
would not be flagged if the 25th and 75th
percentiles are of different signs.
4.1.4 Validation of Bias Using
Performance Evaluations. The annual
performance evaluations for SO2, NO2, O3, or
CO described in section 3.2.2 of this
appendix are used to verify the results
obtained from the one-point QC checks and
to validate those results across a range of
concentration levels. To quantify this
annually at the site level and at the 3-year
E:\FR\FM\17JAP3.SGM
17JAP3
EP17JA06.035
2
n
n ⋅ ∑ d − ∑ di
n −1
i =1
i =1 ⋅
n ( n − 1)
X 2 0.1, n −1
2
i
1 n
⋅ ∑ di
n i =1
EP17JA06.034
Equation 2
AB =
EP17JA06.033
where, meas is the concentration indicated
by the monitoring organization’s instrument
and audit is the audit concentration of the
standard used in the QC check being
measured.
4.1.2 Precision Estimate. The precision
estimate is used to assess the one-point QC
checks for SO2, NO2, O3, or CO described in
section 3.2.1 of this appendix. The precision
estimator is the coefficient of variation upper
bound and is calculated using equation 2 of
this section:
n
n
where, n is the number of single point checks
being aggregated; t0.95,n¥1 is the 95th quantile
of a t-distribution with n-1 degrees of
freedom; the quantity AB is the mean of the
absolute values of the di’s and is calculated
using equation 4 of this section:
Equation 1
di =
AS
EP17JA06.031 EP17JA06.032
(a) Audit samples must be extracted using
the same extraction procedure used for
exposed filters.
(b) Analyze three audit samples in each of
the two ranges each quarter samples are
analyzed. The audit sample analyses shall be
distributed as much as possible over the
entire calendar quarter.
(c) Report the audit concentrations (in µg
Pb/strip) and the corresponding measured
concentrations (in µg Pb/strip) using AQS
unit code 077. The relative percent
differences between the concentrations are
used to calculate analytical accuracy as
described in section 4.4.2 of this appendix.
(d) The audits of an equivalent Pb method
are conducted and assessed in the same
manner as for the reference method. The flow
auditing device and Pb analysis audit
samples must be compatible with the specific
requirements of the equivalent method.
3.3.5 Collocated Procedures for PM10-2.5
and PM2.5. Follow the same procedure as
described in section 3.2.5 of this appendix.
3.3.6 Performance Evaluation Procedures
for PM10-2.5 and PM2.5. Follow the same
procedure as described in section 3.2.6 of
this appendix.
4. Calculations for Data Quality
Assessment.
(a) Calculations of measurement
uncertainty are carried out by EPA according
to the following procedures. Primary quality
assurance organizations should report the
data for all appropriate measurement quality
checks as specified in this appendix even
though they may elect to perform some or all
of the calculations in this section on their
own.
(b) The EPA will provide annual
assessments of data quality aggregated by site
and primary quality assurance organization
for SO2, NO2, O3 and CO and by primary
quality assurance organization for PM10,
PM2.5, PM10-2.5 and Pb.
(c) At low concentrations, agreement
between the measurements of collocated
samplers, expressed as relative percent
difference or percent difference, may be
relatively poor. For this reason, collocated
measurement pairs are selected for use in the
precision and bias calculations only when
both measurements are equal to or above the
following limits:
(1) TSP: 20 µg/m3.
(2) Pb: 0.15 µg/m3.
(3) PM10 (Hi-Vol): 15 µg/m3.
(4) PM10 (Lo-Vol): 3 µg/m3.
(5) PM10-2.5 and PM2.5: 3 µg/m3.
For the one-point QC checks, probability limits are calculated at the primary quality assurance organization level using equations 6 and 7 of this appendix:
Equation 6
Upper probability limit = m + 1.96 · S
Equation 7
Lower probability limit = m − 1.96 · S
where, m is the mean (equation 8 of this appendix):
Equation 8
m = (1/k) · Σ(i=1 to k) di
where, k is the total number of one-point QC checks for the interval being evaluated, and S is the standard deviation of the percent differences (equation 9 of this appendix) as follows:
Equation 9
S = sqrt[ (k · Σ(i=1 to k) di² − (Σ(i=1 to k) di)²) / (k(k−1)) ]
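For reference, the probability-limit calculation in equations 6 through 9 can be sketched in Python (an illustrative sketch only; the function name and sample values are hypothetical, not part of the regulatory text):

```python
import math

def qc_probability_limits(d):
    """Mean, standard deviation, and probability limits (equations
    6-9 of this appendix) from one-point QC check percent differences."""
    k = len(d)
    m = sum(d) / k                                   # Equation 8
    # Equation 9: S = sqrt((k*sum(di^2) - (sum di)^2) / (k*(k-1)))
    s = math.sqrt((k * sum(x * x for x in d) - sum(d) ** 2) / (k * (k - 1)))
    upper = m + 1.96 * s                             # Equation 6
    lower = m - 1.96 * s                             # Equation 7
    return m, s, lower, upper

# Example: percent differences from eight one-point QC checks
m, s, lo, hi = qc_probability_limits([1.2, -0.8, 0.5, 2.1, -1.4, 0.3, 0.9, -0.2])
```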
4.1.5 Percent Difference. Percent
differences for the performance evaluations,
calculated using equation 1 of this appendix,
can be compared to the probability intervals
for the respective site or at the primary
quality assurance organization level. Ninety-five percent of the individual percent
differences (all audit concentration levels) for
the performance evaluations should be
captured within the probability intervals for
the primary quality assurance organization.
4.2 Statistics for the Assessment of PM10.
4.2.1 Precision Estimate from Collocated
Samplers. Precision is estimated via
duplicate measurements from collocated
samplers of the same type. It is recommended
that the precision be aggregated at the
primary quality assurance organization level
quarterly, annually, and at the 3-year level.
The data pair would only be considered valid
if both concentrations are greater than the
minimum values specified in section 4(c) of
this appendix. For each collocated data pair,
calculate the relative percent difference, di,
using equation 10 of this appendix:
Equation 10
di = [ (Xi − Yi) / ((Xi + Yi)/2) ] · 100
where, Xi is the concentration from the primary sampler and Yi is the concentration value from the audit sampler. The coefficient of variation upper bound is calculated using equation 11 of this appendix:
Equation 11
CV = sqrt[ (n · Σ(i=1 to n) di² − (Σ(i=1 to n) di)²) / (2n(n−1)) ] · sqrt[ (n−1) / χ²(0.1, n−1) ]
where, n is the number of valid data pairs
being aggregated, and χ²(0.1, n−1) is the 10th
percentile of a chi-squared distribution with
n−1 degrees of freedom. The factor of 2 in the
denominator adjusts for the fact that each di
is calculated from two values with error.
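Equations 10 and 11 can be sketched in Python, assuming the chi-squared quantile is supplied externally (for example, from a statistical table or scipy.stats.chi2.ppf(0.10, n-1)); the function name is hypothetical:

```python
import math

def cv_upper_bound(x, y, chi2_10th):
    """Coefficient-of-variation upper bound for collocated pairs
    (equations 10 and 11 of this appendix). chi2_10th is the 10th
    percentile of a chi-squared distribution with n-1 degrees of
    freedom."""
    # Equation 10: relative percent difference for each collocated pair
    d = [(xi - yi) / ((xi + yi) / 2) * 100 for xi, yi in zip(x, y)]
    n = len(d)
    # Equation 11: sample CV term times the chi-squared adjustment
    base = math.sqrt((n * sum(di * di for di in d) - sum(d) ** 2)
                     / (2 * n * (n - 1)))
    return base * math.sqrt((n - 1) / chi2_10th)

# Example with three collocated pairs (chi-squared 10th percentile, df=2)
cv = cv_upper_bound([9.8, 12.1, 13.9], [10.0, 12.0, 14.0], 0.211)
```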
4.2.2 Bias Estimate Using One-Point Flow
Rate Verifications. For each one-point flow
rate verification described in sections 3.2.3
and 3.3.2 of this appendix, calculate the
percent difference in volume using equation
1 of this appendix where meas is the value
indicated by the sampler’s volume
measurement and audit is the actual volume
indicated by the auditing flow meter. The
absolute volume bias upper bound is then
calculated using equation 3, where n is the
number of flow rate audits being aggregated;
t0.95,n−1 is the 95th quantile of a t-distribution
with n−1 degrees of freedom, the quantity AB
is the mean of the absolute values of the di’s
and is calculated using equation 4 of this
appendix, and the quantity AS in equation 3
of this appendix is the standard deviation of
the absolute values of the di’s and is
calculated using equation 5 of this appendix.
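Equations 3 through 5 are defined earlier in the appendix and are only referenced here; assuming the conventional form of equation 3, |bias| upper bound = AB + t0.95,n−1 · AS/sqrt(n), the calculation in section 4.2.2 can be sketched as follows (function name hypothetical):

```python
import math

def abs_bias_upper_bound(meas, audit, t95):
    """Absolute volume bias upper bound from one-point flow rate
    verifications. Assumes equation 3 has the form AB + t*AS/sqrt(n),
    where AB and AS are the mean and standard deviation of the absolute
    percent differences; t95 is the 95th t-quantile with n-1 degrees of
    freedom (e.g., from a table or scipy.stats.t.ppf(0.95, n-1))."""
    # Equation 1: percent difference for each verification
    d = [(m - a) / a * 100 for m, a in zip(meas, audit)]
    n = len(d)
    ad = [abs(x) for x in d]
    ab = sum(ad) / n                                 # Equation 4: mean of |di|
    # Equation 5: standard deviation of the |di|
    as_ = math.sqrt((n * sum(x * x for x in ad) - sum(ad) ** 2)
                    / (n * (n - 1)))
    return ab + t95 * as_ / math.sqrt(n)             # Equation 3 (assumed form)

# Example: four flow rate checks against a 16.7 L/min audit standard
b = abs_bias_upper_bound([16.6, 16.8, 16.7, 16.9], [16.7, 16.7, 16.7, 16.7], 2.353)
```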
4.2.3 Assessment Semi-Annual Flow Rate
Audits. The flow rate audits described in
sections 3.2.4 and 3.3.3 of this appendix are
used to assess the results obtained from the
one-point flow rate verifications and to
provide an estimate of flow rate acceptability.
For each flow rate audit, calculate the
percent difference in volume using equation
1 of this appendix where meas is the value
indicated by the sampler’s volume
measurement and audit is the actual volume
indicated by the auditing flow meter. To
quantify this annually and at the 3-year
primary quality assurance organization level,
probability limits are calculated from the
percent differences using equations 6 and 7
of this appendix where m is the mean
described in equation 8 of this appendix and
k is the total number of one-point flow rate
verifications for the year and S is the
standard deviation of the percent differences
as described in equation 9 of this appendix.
4.2.4 Percent Difference. Percent
differences for the annual flow rate audit
concentration, calculated using equation 1 of
this appendix, can be compared to the
probability intervals for the one-point flow
rate verifications for the respective primary
quality assurance organization. Ninety-five
percent of the individual percent differences
(all audit concentration levels) for the
performance evaluations should be captured
within the probability intervals for the
primary quality assurance organization.
4.3 Statistics for the Assessment of PM2.5
and PM10-2.5.
4.3.1 Precision Estimate. Precision for
collocated instruments for PM2.5 and PM10-2.5
may be estimated where both the primary
and collocated instruments are the same
method designation and when the method
designations are not similar. Follow the
procedure described in section 4.2.1 of this
appendix. In addition, one may want to
estimate bias when the primary monitor is an
FEM and the collocated monitor is an FRM.
Follow the procedure
described in section 4.1.3 of this appendix in
order to provide an estimate of bias using the
collocated data.
4.3.2 Bias Estimate. Follow the procedure
described in section 4.1.3 of this appendix
for the bias estimate of PM10-2.5. The PM2.5
bias estimate is calculated using the paired
routine and the PEP monitor data described
in section 3.2.6 of this appendix. Calculate
the percent difference, di, using equation 1 of
this appendix, where meas is the measured
concentration from the agency's primary monitor
and audit is the concentration from the PEP
monitor. The data pair would only be
considered valid if both concentrations are
greater than the minimum values specified in
section 4(c) of this appendix. Estimates of
bias are presented for various levels of
aggregation, sometimes aggregating over time,
sometimes aggregating over samplers, and
sometimes aggregating over both time and
samplers. These various levels of aggregation
are achieved using the same basic statistic.
4.3.2.1 This statistic averages the
individual biases described in equation 1 of
this appendix to the desired level of
aggregation using equation 12 of this
appendix:
Equation 12
D = (1/nj) · Σ(i=1 to nj) di
where, nj is the number of pairs and d1, d2, . . ., dnj are the biases for each of the pairs to be averaged.
Federal Register / Vol. 71, No. 10 / Tuesday, January 17, 2006 / Proposed Rules
4.3.2.2 Confidence intervals can be
constructed for these average bias estimates
in equation 12 of this appendix using
equations 13 and 14 of this appendix:
Equation 13
Lower 90% Confidence Interval = D − t0.95,df · s/sqrt(nj)
Equation 14
Upper 90% Confidence Interval = D + t0.95,df · s/sqrt(nj)
where, t0.95,df is the 95th quantile of a t-distribution with degrees of freedom df = nj − 1,
and s is an estimate of the variability of the
average bias calculated using equation 15 of
this appendix:
Equation 15
s = sqrt[ Σ(i=1 to nj) (di − D)² / (nj − 1) ]
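The bias aggregation and its 90 percent confidence interval (equations 12 through 15) can be sketched in Python as follows (illustrative only; function name and sample values are hypothetical):

```python
import math

def bias_confidence_interval(d, t95):
    """Average bias D (equation 12) and its 90% confidence interval
    (equations 13-15). d holds the individual percent-difference
    biases; t95 is the 95th t-quantile with len(d)-1 degrees of
    freedom."""
    nj = len(d)
    D = sum(d) / nj                                  # Equation 12
    # Equation 15: variability of the average bias
    s = math.sqrt(sum((di - D) ** 2 for di in d) / (nj - 1))
    half = t95 * s / math.sqrt(nj)
    return D - half, D, D + half                     # Equations 13, 12, 14

# Example: four paired-sampler biases, t-quantile for df=3
lo90, D, hi90 = bias_confidence_interval([1.0, 3.0, 2.0, 2.0], 2.353)
```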
4.4 Statistics for the Assessment of Pb.
4.4.1 Precision Estimate. Follow the same
procedures as described for PM10 in section
4.2.1 of this appendix using the data from the
collocated instruments. The data pair would
only be considered valid if both
concentrations are greater than the minimum
values specified in section 4(c) of this
appendix.
4.4.2 Bias Estimate. In order to estimate
bias, the information from the flow rate
audits and the Pb strip audits needs to be
combined as described below. To be
consistent with the formulas for the gases,
the recommended procedure is to work
with relative errors of the lead
measurements. The relative error in the
concentration is related to the relative error
in the volume and the relative error in the
mass measurements using equation 16 of this
appendix:
Equation 16
rel. error = (measured concentration − audit concentration) / audit concentration = (rel. mass error − rel. volume error) / (1 + rel. volume error)
As with the gases, an upper bound for the
absolute bias is desired. Using equation 16
above, the absolute value of the relative
(concentration) error is bounded by equation
17 of this appendix:
Equation 17
|rel. error| ≤ (|rel. mass error| + |rel. volume error|) / (1 − |rel. volume error|)
The quality indicator data collected are then
used to bound each part of equation 17
separately.
4.4.2.1 Flow rate calculations. For each
flow rate audit, calculate the percent
difference in volume by equation 1 of this
appendix where meas is the value indicated
by the sampler’s volume measurement and
audit is the actual volume indicated by the
auditing flow meter. The absolute volume
bias upper bound is then calculated using
equation 3 of this appendix where n is the
number of flow rate audits being aggregated;
t0.95,n−1 is the 95th quantile of a t-distribution
with n−1 degrees of freedom; the quantity
AB is the mean of the absolute values of the
di’s and is calculated using equation 4, and
the quantity AS in equation 3 of this
appendix is the standard deviation of the
absolute values of the di’s and is calculated
using equation 5 of this appendix.
4.4.2.2 Lead strip calculations. Similarly
for each lead strip audit, calculate the
percent difference in mass by equation 1
where meas is the value indicated by the
mass measurement and audit is the actual
lead mass on the audit strip. The absolute
mass bias upper bound is then calculated
using equation 3 of this appendix where n is
the number of lead strip audits being
aggregated; t0.95,n−1 is the 95th quantile of a
t-distribution with n−1 degrees of freedom;
the quantity AB is the mean of the absolute
values of the di’s and is calculated using
equation 4 of this appendix and the quantity
AS in equation 3 of this appendix is the
standard deviation of the absolute values of
the di’s and is calculated using equation 5 of
this appendix.
4.4.2.3 Final bias calculation. Finally, the
absolute bias upper bound is given by
combining the absolute bias estimates of the
flow rate and Pb strips using equation 18 of
this appendix:
Equation 18
bias = [ (mass bias + vol. bias) / (100 − vol. bias) ] · 100
where, the numerator and denominator have
been multiplied by 100 since everything is
expressed as a percentage.
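Equation 18 is a one-line computation combining the mass-bias and volume-bias upper bounds; a sketch with illustrative values (function name hypothetical):

```python
def pb_abs_bias_upper_bound(mass_bias, vol_bias):
    """Combine the absolute mass-bias and volume-bias upper bounds
    (both in percent) into the Pb absolute bias upper bound using
    equation 18 of this appendix."""
    return (mass_bias + vol_bias) / (100.0 - vol_bias) * 100.0

# Example: 4% mass bias bound and 2% volume bias bound
pb_bound = pb_abs_bias_upper_bound(4.0, 2.0)
```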
4.5 Time Period for Audits. The statistics
in this section assume that the mass and flow
rate audits represent the same time period.
Since the two types of audits are not
performed at the same time, the audits need
to be grouped by common time periods.
Consequently, the absolute bias estimates
should be done on annual and 3-year levels.
The flow rate audits are site-specific, so the
absolute bias upper bound estimate can be
done and treated as a site-level statistic.
5. Reporting Requirements.
5.1 SLAMS Reporting Requirements. For
each pollutant, prepare a list of all
monitoring sites and their AQS site
identification codes in each primary quality
assurance organization and submit the list to
the appropriate EPA Regional Office, with a
copy to AQS. Whenever there is a change in
this list of monitoring sites in a primary
quality assurance organization, report this
change to the EPA Regional Office and to
AQS.
5.1.1 Quarterly Reports. For each quarter,
each primary quality assurance organization
shall report to AQS directly (or via the
appropriate EPA Regional Office for
organizations not direct users of AQS) the
results of all valid measurement quality
checks it has carried out during the quarter.
The quarterly reports must be submitted
consistent with the data reporting
requirements specified for air quality data as
set forth in § 58.16. EPA strongly encourages
early submission of the quality assurance
data in order to assist the monitoring
organizations in controlling and evaluating the
quality of the ambient air data.
5.1.2 Annual Reports.
5.1.2.1 When the monitoring organization
has certified its data for the calendar year,
EPA will calculate and report the
measurement uncertainty for the entire
calendar year. These limits will then be
associated with the data submitted in the
annual report required by § 58.15.
5.1.2.2 Each primary quality assurance
organization shall submit, along with its
annual report, a listing by pollutant of all
monitoring sites in the primary quality
assurance organization.
5.2 PSD Reporting Requirements. At the
end of each sampling quarter, the
organization must report the appropriate
statistical assessments in section 4 of this
appendix for the pollutants measured. All
data used to calculate reported estimates of
precision and bias, including span checks,
collocated sampler results, and audit results, must be
made available to the permit granting
authority upon request.
6.0 References.
(1) American National Standard—
Specifications and Guidelines for Quality
Systems for Environmental Data Collection
and Environmental Technology Programs.
ANSI/ASQC E4–2004. February 2004.
Available from American Society for Quality
Control, 611 East Wisconsin Avenue,
Milwaukee, WI 53202.
(2) EPA Requirements for Quality
Management Plans. EPA QA/R–2. EPA/240/
B–01/002. March 2001. Office of
Environmental Information, Washington DC
20460. https://www.epa.gov/quality/qs-docs/
r2-final.pdf.
(3) EPA Requirements for Quality
Assurance Project Plans for Environmental
Data Operations. EPA QA/R–5. EPA/240/B–
01/003. March 2001. Office of Environmental
Information, Washington DC 20460. https://
www.epa.gov/quality/qs-docs/r5-final.pdf.
(4) EPA Traceability Protocol for Assay and
Certification of Gaseous Calibration
Standards. EPA–600/R–97/121. September
1997. Available from U.S. Environmental
Protection Agency, ORD Publications Office,
Center for Environmental Research
Information (CERI), 26 W. Martin Luther
King Drive, Cincinnati, OH 45268.
(5) Guidance for the Data Quality
Objectives Process. EPA QA/G–4. EPA/600/
R–96/055. August 2000. Office of
Environmental Information, Washington DC
20460. https://www.epa.gov/quality/qs-docs/
g4-final.pdf.
(6) List of Designated Reference and
Equivalent Methods. Available from U.S.
Environmental Protection Agency, National
Exposure Research Laboratory, Human
Exposure and Atmospheric Sciences
Division, MD–D205–03, Research Triangle
Park, NC 27711. https://www.epa.gov/ttn/
amtic/criteria.html.
(7) McElroy, F.F. Transfer Standards for the
Calibration of Ambient Air Monitoring
Analyzers for Ozone. EPA–600/4–79–056.
U.S. Environmental Protection Agency,
Research Triangle Park, NC 27711,
September, 1979. https://www.epa.gov/ttn/
amtic/cpreldoc.html.
(8) Paur, R.J. and F.F. McElroy. Technical
Assistance Document for the Calibration of
Ambient Ozone Monitors. EPA–600/4–79–
057. U.S. Environmental Protection Agency,
Research Triangle Park, NC 27711,
September, 1979. https://www.epa.gov/ttn/
amtic/cpreldoc.html.
(9) Quality Assurance Handbook for Air
Pollution Measurement Systems, Volume 1—
A Field Guide to Environmental Quality
Assurance. EPA–600/R–94/038a. April 1994.
Available from U.S. Environmental
Protection Agency, ORD Publications Office,
Center for Environmental Research
Information (CERI), 26 W. Martin Luther
King Drive, Cincinnati, OH 45268. https://
www.epa.gov/ttn/amtic/qabook.html.
(10) Quality Assurance Handbook for Air
Pollution Measurement Systems, Volume II:
Part 1—Ambient Air Quality Monitoring
Program Quality System Development. EPA–
454/R–98–004. https://www.epa.gov/ttn/
amtic/qabook.html.
TABLE A–1 OF APPENDIX A TO PART 58.—DIFFERENCES AND SIMILARITIES BETWEEN SLAMS AND PSD REQUIREMENTS

Requirements (SLAMS):
1. The development, documentation, and implementation of an approved quality system.
2. The assessment of data quality.
3. The use of reference, equivalent, or approved methods.
4. The use of calibration standards traceable to NIST or other primary standard.
5. The participation in EPA performance evaluations and the permission for EPA to conduct system audits.

Monitoring and QA Responsibility. SLAMS: State/local agency via the ‘‘primary quality assurance organization’’. PSD: Source owner/operator.
Monitoring Duration. SLAMS: Indefinitely. PSD: Usually up to 12 months.
Annual Performance Evaluation (PE). SLAMS: Standards and equipment different from those used for spanning, calibration, and verifications; different personnel preferred. PSD: Personnel, standards, and equipment different from those used for spanning, calibration, and verifications.
PE audit rate—Automated. SLAMS: 100% per year. PSD: 100% per quarter.
PE audit rate—Manual. SLAMS: Varies depending on pollutant; see Table A–2 of this appendix. PSD: 100% per quarter.
Precision Assessment—Automated. SLAMS: One-point QC check biweekly, but data quality dependent. PSD: One-point QC check biweekly.
TABLE A–1 OF APPENDIX A TO PART 58.—DIFFERENCES AND SIMILARITIES BETWEEN SLAMS AND PSD REQUIREMENTS—Continued

Precision Assessment—Manual. SLAMS: Varies depending on pollutant; see Table A–2 of this appendix. PSD: One site: 1 every 6 days, or every third day for daily monitoring (TSP and Pb).
Reporting—Automated. SLAMS: By site—EPA performs calculations annually. PSD: By site—source owner/operator performs calculations each sampling quarter.
Reporting—Manual. SLAMS: By reporting organization—EPA performs calculations annually. PSD: By site—source owner/operator performs calculations each sampling quarter.
TABLE A–2 OF APPENDIX A TO PART 58.—MINIMUM DATA ASSESSMENT REQUIREMENTS FOR SLAMS SITES

Automated Methods

1-Point QC for SO2, NO2, O3, CO. Assessment method: Response check at concentration 0.01–0.1 ppm SO2, NO2, O3, and 1–10 ppm CO. Coverage: Each analyzer. Minimum frequency: Once per 2 weeks. Parameters reported: Audit concentration 1 and measured concentration 2.
Performance Evaluation for SO2, NO2, O3, CO. Assessment method: See section 3.2.2 of this appendix. Coverage: Each analyzer. Minimum frequency: Once per year. Parameters reported: Audit concentration 1 and measured concentration 2 for each level.
Flow rate verification PM10, PM2.5, PM10-2.5. Assessment method: Check of sampler flow rate. Coverage: Each sampler. Minimum frequency: Once every month. Parameters reported: Audit flow rate and measured flow rate indicated by the sampler.
Semi-annual flow rate audit PM10, PM2.5, PM10-2.5. Assessment method: Check of sampler flow rate using independent standard. Coverage: Each sampler. Minimum frequency: Once every 6 months. Parameters reported: Audit flow rate and measured flow rate indicated by the sampler.
Collocated Sampling PM2.5, PM10-2.5. Assessment method: Collocated samplers. Coverage: 15%. Minimum frequency: Every twelve days. Parameters reported: Primary sampler concentration and duplicate sampler concentration.
Performance Evaluation PM2.5, PM10-2.5. Assessment method: Collocated samplers. Coverage: 1. 5 valid audits for primary QA orgs with ≤5 sites. 2. 8 valid audits for primary QA orgs with >5 sites. 3. All samplers in 6 years. Minimum frequency: Over all 4 quarters. Parameters reported: Primary sampler concentration and performance evaluation sampler concentration.

Manual Methods

Collocated Sampling PM10, TSP, PM10-2.5, PM2.5. Assessment method: Collocated samplers. Coverage: 15%. Minimum frequency: Every 12 days; TSP—every 6 days. Parameters reported: Primary sampler concentration and duplicate sampler concentration.
Flow rate verification PM10, TSP, PM10-2.5, PM2.5. Assessment method: Check of sampler flow rate. Coverage: Each sampler. Minimum frequency: Once every month. Parameters reported: Audit flow rate and measured flow rate indicated by the sampler.
Semi-annual flow rate audit PM10, TSP, PM10-2.5, PM2.5. Assessment method: Check of sampler flow rate using independent standard. Coverage: Each sampler, all locations. Minimum frequency: Once every 6 months. Parameters reported: Audit flow rate and measured flow rate indicated by the sampler.
Manual Methods Lead. Assessment method: 1. Check of sample flow rate as for TSP. 2. Check of analytical system with Pb audit strips. Coverage: 1. Each sampler. 2. Analytical system. Minimum frequency: 1. Include with TSP. 2. Each quarter. Parameters reported: 1. Same as for TSP. 2. Actual concentration and measured (indicated) concentration of audit samples (µg Pb/strip).
Performance Evaluation PM2.5, PM10-2.5. Assessment method: Collocated samplers. Coverage: 1. 5 valid audits for primary QA orgs with ≤5 sites. 2. 8 valid audits for primary QA orgs with >5 sites. 3. All samplers in 6 years. Minimum frequency: Over all 4 quarters. Parameters reported: Primary sampler concentration and performance evaluation sampler concentration.

1 Effective concentration for open path analyzers.
2 Corrected concentration, if applicable, for open path analyzers.
TABLE A–3 TO APPENDIX A OF PART 58.—SUMMARY OF PM2.5 OR PM10-2.5 NUMBER AND TYPE OF COLLOCATION (15% COLLOCATION REQUIREMENT) NEEDED AS AN EXAMPLE OF A PRIMARY QUALITY ASSURANCE ORGANIZATION THAT HAS 54 MONITORS AND PROCURED FRMS AND THREE OTHER EQUIVALENT METHOD TYPES

Primary sampler method designation: Total number of monitors; Number of collocated FRM; Number of collocated monitors of same method designation as primary; Total number collocated
FRM: 20; 3; N/A; 3
FEM (A): 20; 2; 1; 3
FEM (C): 2; 1; 0; 1
FEM (D): 12; 1; 1; 2
50. Appendix C is revised to read as
follows:
Appendix C to Part 58—Ambient Air Quality
Monitoring Methodology
1.0 Purpose.
2.0 SLAMS Ambient Air Monitoring
Stations.
3.0 NCore Ambient Air Monitoring
Stations.
4.0 Photochemical Assessment
Monitoring Stations (PAMS).
5.0 Particulate Matter Episode
Monitoring.
6.0 References.
1.0 Purpose.
This appendix specifies the criteria
pollutant monitoring methods (manual
methods or automated analyzers) which must
be used in the State and local air monitoring
stations (SLAMS) and the National Core
(NCore) stations that are a subset of SLAMS.
2.0 SLAMS Ambient Air Monitoring
Network.
2.1 Except as otherwise provided in this
appendix, a criteria pollutant monitoring
method used for making NAAQS decisions at
a SLAMS site must be a reference or
equivalent method as defined in § 50.1 of this
chapter.
2.2 Through December 31, 2012, data
produced from any PM10 method approved
under part 53 of this chapter may be used in
lieu of a required PM10-2.5 monitor to
determine attainment of the PM10-2.5 NAAQS
according to the following stipulations.
2.2.1 At any sites proposed for
monitoring in lieu of PM10-2.5 monitoring, the
98th percentile value for the most recent
complete calendar year of PM10 monitoring
data must be less than the PM10-2.5 NAAQS,
based on a sample frequency of at least 1 in
3 sample days, and reported at local
conditions of temperature and pressure.
2.2.2 PM10 data used in lieu of required
PM10-2.5 monitoring must be based on a daily
sampling frequency.
2.2.3 During any calendar year of
sampling in lieu of a required PM10-2.5
sampler, if more than seven 24-hour average
PM10 concentrations exceed the numerical
value of the PM10-2.5 NAAQS, as reported at
local conditions of temperature and pressure,
the State must deploy a Federal reference
method (FRM) or Federal equivalent method
(FEM) PM10-2.5 monitor within a 1-year
period.
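The section 2.2 screening tests above can be sketched in Python as follows (illustrative only; the function name, the simple nearest-rank percentile, and the NAAQS level passed in are stand-ins, not the regulatory definitions):

```python
import math

def pm10_in_lieu_ok(daily_pm10, pm10_25_naaqs):
    """Sketch of the section 2.2 screening tests for using PM10 data in
    lieu of a required PM10-2.5 monitor. daily_pm10 holds one calendar
    year of 24-hour PM10 averages at local conditions; pm10_25_naaqs is
    a hypothetical PM10-2.5 NAAQS level in ug/m3. Returns a tuple:
    (98th-percentile test passed, exceedance-count trigger hit)."""
    ranked = sorted(daily_pm10)
    # Simple nearest-rank 98th percentile for illustration; AQS applies
    # its own prescribed percentile convention.
    p98 = ranked[max(0, math.ceil(0.98 * len(ranked)) - 1)]
    # Section 2.2.3: more than seven 24-hour averages above the level
    # triggers deployment of an FRM/FEM PM10-2.5 monitor within a year.
    exceedances = sum(1 for c in daily_pm10 if c > pm10_25_naaqs)
    return p98 < pm10_25_naaqs, exceedances > 7
```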
2.3 Any manual method or analyzer
purchased prior to cancellation of its
reference or equivalent method designation
under § 53.11 or § 53.16 of this chapter may
be used at a SLAMS site following
cancellation for a reasonable period of time
to be determined by the Administrator.
2.4 Approval of Non-designated
Continuous PM2.5 Methods as Approved
Regional Methods (ARM) Operated Within a
Network of Sites. A method for PM2.5 that has
not been designated as an FRM or FEM as
defined in § 50.1 of this chapter may be
approved as an approved regional method
(ARM) for purposes of section 2.1 of this
appendix at a particular site or network of
sites under the following stipulations.
2.4.1 The candidate ARM must be
demonstrated to meet the requirements for
PM2.5 Class III equivalent methods as defined
in subpart C of part 53 of this chapter.
Specifically the requirements for precision,
correlation, and additive and multiplicative
bias apply. For purposes of this section 2.4,
the following requirements shall apply:
2.4.1.1 The candidate ARM shall be
tested at the site(s) in which it is intended
to be used. For a network of sites operated
by one reporting agency, the testing shall
occur at a subset of sites, to include one site
in each MSA/CSA (up to the two highest-population
MSAs/CSAs) and at least one rural
area or Micropolitan Statistical Area site. If
the candidate ARM for a network is already
approved for purposes of this section in
another agency’s network, subsequent testing
shall minimally occur at one site in a MSA/
CSA and one rural area or Micropolitan
Statistical Area. There shall be no
requirement for tests at any other sites.
2.4.1.2 For purposes of this section, a full
year of testing may begin and end in any
season, so long as all seasons are covered.
2.4.1.3 No PM10 samplers shall be
required for the test, as determination of the
PM2.5/PM10 ratio at the test site shall not be
required.
2.4.1.4 The test specification for PM2.5
Class III equivalent method precision defined
in subpart C of part 53 of this chapter
applies; however, there is no specific
requirement that collocated continuous
monitors be operated for purposes of
generating a statistic for coefficient of
variation (CV). To provide an estimate of
precision that meets the requirement
identified in subpart C of part 53 of this
chapter, agencies may cite peer-reviewed
published data or data in AQS that can be
presented demonstrating the candidate ARM
operated will produce data that meets the
specification for precision of Class III PM2.5
methods.
2.4.1.5 A minimum of 90 valid sample
pairs per site for the year with no less than
20 valid sample pairs per season must be
generated for use in demonstrating that
additive bias, multiplicative bias and
correlation meet the comparability
requirements specified in subpart C of part
53 of this chapter. A valid sample pair may
be generated with as little as one valid FRM
and one valid candidate ARM measurement
per day.
2.4.1.6 For purposes of determining bias,
FRM data with concentrations less than 3
micrograms per cubic meter (µg/m3) may be
excluded. Exclusion of data does not result
in failure of the sample completeness requirement
specified in this section.
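The completeness requirement in section 2.4.1.5 can be sketched as follows (function name hypothetical):

```python
def arm_pairs_complete(pairs_per_season):
    """Check the section 2.4.1.5 completeness requirement for candidate
    ARM testing: at least 90 valid FRM/ARM sample pairs for the year and
    no fewer than 20 valid pairs in any season. pairs_per_season maps
    the four seasons to their valid-pair counts."""
    total = sum(pairs_per_season.values())
    return total >= 90 and all(v >= 20 for v in pairs_per_season.values())
```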
2.4.2 The monitoring agency wishing to
use an ARM must develop and implement
appropriate quality assurance procedures for
the method. Additionally, the following
procedures are required for the method:
2.4.2.1 The ARM must be consistently
operated throughout the network. Exceptions
to a consistent operation must be approved
according to section 2.8 of this appendix;
2.4.2.2 The ARM must be operated on an
hourly sampling frequency capable of
providing data suitable for aggregation into
daily 24-hour average measurements;
2.4.2.3 The ARM must use an inlet and
separation device, as needed, that are already
approved in either the reference method
identified in appendix L to part 50 of this
chapter or under part 53 of this chapter as
approved for use on a PM2.5 reference or
equivalent method. The only exceptions to
this requirement are those methods that by
their inherent measurement principle may
not need an inlet or separation device that
segregates the aerosol; and
2.4.2.4 The ARM must be capable of
providing for flow audits, unless by its
inherent measurement principle, measured
flow is not required. These flow audits are to
be performed on the frequency identified in
appendix A to this part.
2.4.3 The monitoring agency wishing to
use the method must develop and implement
appropriate procedures for assessing and
reporting the precision and accuracy of the
method comparable to the procedures set
forth in appendix A of this part for
designated reference and equivalent
methods.
2.4.4 Assessments of data quality shall
follow the same frequencies and calculations
E:\FR\FM\17JAP3.SGM
17JAP3
as required under section 3 of appendix A to
this part with the following exceptions:
2.4.4.1 Collocation of ARM with FRM/
FEM samplers must be maintained at a
minimum of 30 percent of the SLAMS sites
with a minimum of 1 per network;
2.4.4.2 All collocated FRM/FEM samplers
must maintain a sample frequency of at least
1 in 6 sample days;
2.4.4.3 Collocated FRM/FEM samplers
shall be located at the design value site, with
the required FRM/FEM samplers deployed
among the largest MSA/CSA in the network,
until all required FRM/FEM are deployed;
and
2.4.4.4 Data from collocated FRM/FEM
are to be substituted for any calendar quarter
that an ARM method has incomplete data.
2.4.4.5 Collocation with an ARM under
this part for purposes of determining the
coefficient of variation of the method shall be
conducted at a minimum of 7.5 percent of the
sites with a minimum of 1 per network. This
is consistent with the requirements in
appendix A to this part for one-half of the
required collocation of FRM/FEM (15
percent) to be collocated with the same
method.
2.4.4.6 Assessments of bias with an
independent audit of the total measurement
system shall be conducted with the same
frequency as an FEM as identified in
appendix A to this part.
2.4.5 Requests for approval of a candidate
ARM that is not already approved in another
agency’s network under this section must
meet the general submittal requirements of
section 2.7 of this appendix. Requests for
approval under this section when an ARM is
already approved in another agency’s
network are to be submitted to the EPA
Regional Administrator. Requests for
approval under section 2.4 of this appendix
must include the following requirements:
2.4.5.1 A clear and unique description of
the site(s) at which the candidate ARM will
be used and tested, and a description of the
nature or character of the site and the
particulate matter that is expected to occur
there.
2.4.5.2 A detailed description of the
method and the nature of the sampler or
analyzer upon which it is based.
2.4.5.3 A brief statement of the reason or
rationale for requesting the approval.
2.4.5.4 A detailed description of the
quality assurance procedures that have been
developed and that will be implemented for
the method.
2.4.5.5 A detailed description of the
procedures for assessing the precision and
accuracy of the method that will be
implemented for reporting to AQS.
2.4.5.6 Test results from the
comparability tests as required in section
2.4.1 through 2.4.1.4 of this appendix.
2.4.5.7 Such further supplemental
information as may be necessary or helpful
to support the required statements and test
results.
2.4.6 Within 120 days after receiving a
request for approval of the use of an ARM at
a particular site or network of sites under
section 2.4 of this appendix, the
Administrator will approve or disapprove the
method by letter to the person or agency
VerDate Aug<31>2005
18:15 Jan 13, 2006
Jkt 208001
requesting such approval. When appropriate
for methods that are already approved in
another SLAMS network, the EPA Regional
Administrator has approval/disapproval
authority. In either instance, additional
information may be requested to assist with
the decision.
2.5 [Reserved]
2.6 Use of Methods With Higher,
Nonconforming Ranges in Certain
Geographical Areas.
2.6.1 [Reserved]
2.6.2 An analyzer may be used
(indefinitely) on a range which extends to
concentrations higher than two times the
upper limit specified in table B–1 of part 53
of this chapter if:
2.6.2.1 The analyzer has more than one
selectable range and has been designated as
a reference or equivalent method on at least
one of its ranges, or has been approved for
use under section 2.5 (which applies to
analyzers purchased before February 18,
1975);
2.6.2.2 The pollutant intended to be
measured with the analyzer is likely to occur
in concentrations more than two times the
upper range limit specified in table B–1 of
part 53 of this chapter in the geographical
area in which use of the analyzer is
proposed; and
2.6.2.3 The Administrator determines
that the resolution of the range or ranges for
which approval is sought is adequate for its
intended use. For purposes of this section
(2.6), ‘‘resolution’’ means the ability of the
analyzer to detect small changes in
concentration.
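The screening condition in section 2.6.2 reduces to a single comparison against two times the table B–1 upper limit. The following Python sketch illustrates that check; the function name is illustrative, and the 0.5 ppm limit used in the example is a placeholder, not a value taken from table B–1 of part 53.

```python
def needs_section_2_6_approval(range_upper, table_b1_upper_limit):
    """Section 2.6.2 screen: approval under section 2.6 is needed when
    a selectable range extends to concentrations higher than two times
    the table B-1 upper limit for the pollutant (both arguments in the
    same concentration units)."""
    return range_upper > 2 * table_b1_upper_limit

# Illustrative placeholder limit of 0.5 ppm -- consult table B-1 of
# part 53 of this chapter for the actual per-pollutant value.
example = needs_section_2_6_approval(1.2, 0.5)
```

A range topping out at exactly twice the table B–1 limit does not trigger this section; only ranges extending beyond that point do.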
2.6.3 Requests for approval under section
2.6.2 of this appendix must meet the
submittal requirements of section 2.7. Except
as provided in section 2.7.3 of this appendix,
each request must contain the information
specified in section 2.7.2 in addition to the
following:
2.6.3.1 The range or ranges proposed to
be used;
2.6.3.2 Test data, records, calculations,
and test results as specified in section 2.7.2.2
of this appendix for each range proposed to
be used;
2.6.3.3 An identification and description
of the geographical area in which use of the
analyzer is proposed;
2.6.3.4 Data or other information
demonstrating that the pollutant intended to
be measured with the analyzer is likely to
occur in concentrations more than two times
the upper range limit specified in table B–1
of part 53 of this chapter in the geographical
area in which use of the analyzer is
proposed; and
2.6.3.5 Test data or other information
demonstrating the resolution of each
proposed range that is broader than that
permitted by section 2.5 of this appendix.
2.6.4 Any person who has obtained
approval of a request under this section
(2.6.2) shall assure that the analyzer for
which approval was obtained is used only in
the geographical area identified in the
request and only while operated in the range
or ranges specified in the request.
2.7 Requests for Approval; Withdrawal of
Approval.
2.7.1 Requests for approval under
sections 2.4, 2.6.2, or 2.8 of this appendix
must be submitted to: Director, National
Exposure Research Laboratory, (MD–D205–
03), U.S. Environmental Protection Agency,
Research Triangle Park, North Carolina
27711. For ARM that are already approved in
another agency’s network, subsequent
requests for approval under section 2.4 are to
be submitted to the applicable EPA Regional
Administrator.
2.7.2 Except as provided in section 2.7.3
of this appendix, each request must contain:
2.7.2.1 A statement identifying the
analyzer (e.g., by serial number) and the
method of which the analyzer is
representative (e.g., by manufacturer and
model number); and
2.7.2.2 Test data,
records, calculations, and test results for the
analyzer (or the method of which the
analyzer is representative) as specified in
subpart B, subpart C, or both (as applicable)
of part 53 of this chapter.
2.7.3 A request may concern more than
one analyzer or geographical area and may
incorporate by reference any data or other
information known to EPA from one or more
of the following:
2.7.3.1 An application for a reference or
equivalent method determination submitted
to EPA for the method of which the analyzer
is representative, or testing conducted by the
applicant or by EPA in connection with such
an application;
2.7.3.2 Testing of the method of which
the analyzer is representative at the initiative
of the Administrator under § 53.7 of this
chapter; or
2.7.3.3 A previous or concurrent request
for approval submitted to EPA under this
section (2.7).
2.7.4 To the extent that such
incorporation by reference provides data or
information required by this section (2.7) or
by sections 2.4, 2.5, or 2.6 of this appendix,
independent data or duplicative information
need not be submitted.
2.7.5 After receiving a request under this
section (2.7), the Administrator may request
such additional testing or information or
conduct such tests as may be necessary in his
judgment for a decision on the request.
2.7.6 If the Administrator determines, on
the basis of any available information, that
any of the determinations or statements on
which approval of a request under this
section was based are invalid or no longer
valid, or that the requirements of section 2.4,
2.5, or 2.6, as applicable, have not been met,
he/she may withdraw the approval after
affording the person who obtained the
approval an opportunity to submit
information and arguments opposing such
action.
2.8 Modifications of Methods by Users.
2.8.1 Except as otherwise provided in this
section, no reference method, equivalent
method, or ARM may be used in a SLAMS
network if it has been modified in a manner
that could significantly alter the performance
characteristics of the method without prior
approval by the Administrator. For purposes
of this section, ‘‘alternative method’’ means
an analyzer, the use of which has been
approved under section 2.4, 2.5, or 2.6 of this
appendix or some combination thereof.
2.8.2 Requests for approval under this
section (2.8) must meet the submittal
requirements of sections 2.7.1 and 2.7.2.1 of
this appendix.
2.8.3 Each request submitted under this
section (2.8) must include:
2.8.3.1 A description, in such detail as
may be appropriate, of the desired
modification;
2.8.3.2 A brief statement of the purpose(s)
of the modification, including any reasons for
considering it necessary or advantageous;
2.8.3.3 A brief statement of belief
concerning the extent to which the
modification will or may affect the
performance characteristics of the method;
and
2.8.3.4 Such further information as may
be necessary to explain and support the
statements required by sections 2.8.3.2 and
2.8.3.3.
2.8.4 The Administrator will approve or
disapprove the modification by letter to the
person or agency requesting such approval
within 75 days after receiving a request for
approval under this section and any further
information that the applicant may be asked
to provide.
2.8.5 A temporary modification that
could alter the performance characteristics of
a reference, equivalent, or ARM may be made
without prior approval under this section if
the method is not functioning or is
malfunctioning, provided that parts
necessary for repair in accordance with the
applicable operation manual cannot be
obtained within 45 days. Unless such
temporary modification is later approved
under section 2.8.4 of this appendix, the
temporarily modified method shall be
repaired in accordance with the applicable
operation manual as quickly as practicable
but in no event later than 4 months after the
temporary modification was made, unless an
extension of time is granted by the
Administrator. Unless and until the
temporary modification is approved, air
quality data obtained with the method as
temporarily modified must be clearly
identified as such when submitted in
accordance with § 58.16 and must be
accompanied by a report containing the
information specified in section 2.8.3 of this
appendix. A request that the Administrator
approve a temporary modification may be
submitted in accordance with sections 2.8.1
through 2.8.4 of this appendix. In such cases
the request will be considered as if a request
for prior approval had been made.
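The timing rule in section 2.8.5 can be expressed as simple date arithmetic. This Python sketch computes the repair deadline; it assumes "4 months" means the same day of the month four months later, which is an interpretation for illustration only and may not match how the Administrator would evaluate a particular extension request.

```python
from datetime import date

def repair_deadline(modified_on: date) -> date:
    """Deadline under section 2.8.5: absent an approved extension, a
    temporarily modified method must be repaired per the applicable
    operation manual no later than 4 months after the temporary
    modification was made. "4 months" is read here as the same day of
    the month, four months later (an assumption; a start day past the
    end of the target month would raise ValueError in this sketch)."""
    month = modified_on.month + 4
    year = modified_on.year + (month - 1) // 12
    month = (month - 1) % 12 + 1
    return modified_on.replace(year=year, month=month)
```

Recall that the temporary modification is permitted in the first place only when repair parts cannot be obtained within 45 days; that availability test precedes the 4-month clock.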
2.9 Use of IMPROVE Samplers at a
SLAMS Site. ‘‘IMPROVE’’ samplers may be
used in SLAMS for monitoring of regional
background and regional transport
concentrations of fine particulate matter. The
IMPROVE samplers were developed for use
in the Interagency Monitoring of Protected
Visual Environments (IMPROVE) network to
characterize all of the major components and
many trace constituents of the particulate
matter that impair visibility in Federal Class
I Areas. Descriptions of the IMPROVE
samplers and the data they collect are
available in references 4, 5, and 6 of this
appendix.
3.0 NCore Ambient Air Monitoring
Stations.
3.1 Methods employed in NCore
multipollutant sites used to measure SO2,
CO, NO2, O3, PM2.5, or PM10-2.5 must be
reference or equivalent methods as defined in
§ 50.1 of this chapter, or an ARM as defined
in section 2.4 of this appendix, for any
monitors intended for comparison with
applicable NAAQS.
3.2 If alternative SO2, CO, NO2, O3, PM2.5,
or PM10-2.5 monitoring methodologies are
proposed for monitors not intended for
NAAQS comparison, such techniques must
be detailed in the network description
required by § 58.10 and subsequently
approved by the Administrator.
4.0 Photochemical Assessment
Monitoring Stations (PAMS).
4.1 Methods used for O3 monitoring at
PAMS must be automated reference or
equivalent methods as defined in § 50.1 of
this chapter.
4.2 Methods used for NO, NO2 and NOX
monitoring at PAMS should be automated
reference or equivalent methods as defined
for NO2 in § 50.1 of this chapter. If alternative
NO, NO2 or NOX monitoring methodologies
are proposed, such techniques must be
detailed in the network description required
by § 58.10 and subsequently approved by the
Administrator.
4.3 Methods for meteorological
measurements and speciated VOC
monitoring are included in the guidance
provided in references 2 and 3 of this
appendix. If alternative VOC monitoring
methodology (including the use of new or
innovative technologies), which is not
included in the guidance, is proposed, it
must be detailed in the network description
required by § 58.10 and subsequently
approved by the Administrator.
5.0 Particulate Matter Episode
Monitoring.
5.1 For short-term measurements of PM10
during air pollution episodes (see § 51.152 of
this chapter) the measurement method must
be:
5.1.1 Either the ‘‘Staggered PM10’’ method
or the ‘‘PM10 Sampling Over Short Sampling
Times’’ method, both of which are based on
the reference method for PM10 and are
described in reference 1; or
5.1.2 Any other method for measuring
PM10:
5.1.2.1 Which has a measurement range
or ranges appropriate to accurately measure
air pollution episode concentration of PM10,
5.1.2.2 Which has a sample period
appropriate for short-term PM10
measurements, and
5.1.2.3 For which a
quantitative relationship to a reference or
equivalent method for PM10 has been
established at the use site. Procedures for
establishing a quantitative site-specific
relationship are contained in reference 1.
5.2 PM10 methods other than the
reference method are not covered under the
quality assessment requirements of appendix
A to this part. Therefore, States must develop
and implement their own quality assessment
procedures for those methods allowed under
this section 5. These quality assessment
procedures should be similar or analogous to
those described in section 3 of appendix A
to this part for the PM10 reference method.
6.0 References.
1. Pelton, D. J. Guideline for Particulate
Episode Monitoring Methods, GEOMET
Technologies, Inc., Rockville, MD. Prepared
for U.S. Environmental Protection Agency,
Research Triangle Park, NC. EPA Contract
No. 68–02–3584. EPA 450/4–83–005.
February 1983.
2. Technical Assistance Document For
Sampling and Analysis of Ozone Precursors.
Atmospheric Research and Exposure
Assessment Laboratory, U.S. Environmental
Protection Agency, Research Triangle Park,
NC 27711. EPA 600/8–91–215. October 1991.
3. Quality Assurance Handbook for Air
Pollution Measurement Systems: Volume IV.
Meteorological Measurements. Atmospheric
Research and Exposure Assessment
Laboratory, U.S. Environmental Protection
Agency, Research Triangle Park, NC 27711.
EPA 600/4–90–0003. August 1989.
4. Eldred, R.A., Cahill, T.A., Wilkenson,
L.K., et al., Measurements of fine particles
and their chemical components in the
IMPROVE/NPS networks, in Transactions of
the International Specialty Conference on
Visibility and Fine Particles, Air and Waste
Management Association: Pittsburgh, PA,
1990; pp 187–196.
5. Sisler, J.F., Huffman, D., and Latimer,
D.A.; Spatial and temporal patterns and the
chemical composition of the haze in the
United States: An analysis of data from the
IMPROVE network, 1988–1991, ISSN No.
0737–5253–26, National Park Service, Ft.
Collins, CO, 1993.
6. Eldred, R.A., Cahill, T.A., Pitchford, M.,
and Malm, W.C.; IMPROVE—a new remote
area particulate monitoring system for
visibility studies, Proceedings of the 81st
Annual Meeting of the Air Pollution Control
Association, Dallas, Paper 88–54.3, 1988.
51. Appendix D to part 58 is revised
to read as follows:
Appendix D to Part 58—Network
Design Criteria for Ambient Air Quality
Monitoring
1. Monitoring Objectives and Spatial
Scales.
2. General Monitoring Requirements.
3. Design Criteria for NCore Sites.
4. Pollutant-Specific Design Criteria for
SLAMS Sites.
5. Design Criteria for Photochemical
Assessment Monitoring Stations (PAMS).
6. References.
1. Monitoring Objectives and Spatial
Scales.
The purpose of this appendix is to describe
monitoring objectives and general criteria to
be applied in establishing the required
SLAMS ambient air quality monitoring
stations and for choosing general locations
for additional monitoring sites. This
appendix also describes specific
requirements for the number and location of
FRM, FEM, and ARM sites for specific
pollutants, NCore multipollutant sites,
PM10-2.5 mass sites, chemically-speciated
PM10-2.5 sites, continuous PM2.5 mass sites,
chemically-speciated PM2.5 sites, and O3
precursor measurements sites (PAMS). These
criteria will be used by EPA in evaluating the
adequacy of the air pollutant monitoring
networks.
1.1 Monitoring Objectives. The ambient
air monitoring networks must be designed to
meet three basic monitoring objectives. These
basic objectives are listed below. The
appearance of any one objective in the order
of this list is not based upon a prioritized
scheme. Each objective is important and
must be considered individually.
(a) Provide air pollution data to the general
public in a timely manner. Data can be
presented to the public in a number of
attractive ways including through air quality
maps, newspapers, Internet sites, and as part
of weather forecasts and public advisories.
(b) Support compliance with ambient air
quality standards and emissions strategy
development. Data from FRM, FEM, and
ARM monitors will be used for comparing an
area’s air pollution levels against the
National Ambient Air Quality Standards
(NAAQS). Data from monitors of various
types can be used in the development of
attainment and maintenance plans. SLAMS,
and especially NCore station data, will be
used to evaluate the regional air quality
models used in developing emission
strategies, and to track trends in air pollution
abatement control measures’ impact on
improving air quality. In monitoring
locations near major air pollution sources,
source-oriented monitoring data can provide
insight into how well industrial sources are
controlling their pollutant emissions.
(c) Support for air pollution research
studies. Air pollution data from the NCore
network can be used to supplement data
collected by researchers working on health
effects assessments and atmospheric
processes, or for monitoring methods
development work.
1.1.1 In order to support the air quality
management work indicated in the three
basic air monitoring objectives, a network
must be designed with a variety of types of
monitoring sites. Monitoring sites must be
capable of informing managers about many
things including the peak air pollution levels,
typical levels in populated areas, air
pollution transported into and outside of a
city or region, and air pollution levels near
specific sources. To summarize some of these
sites, here is a listing of six general site types:
(a) Sites located to determine the highest
concentrations expected to occur in the area
covered by the network.
(b) Sites located to measure typical
concentrations in areas of high population
density.
(c) Sites located to determine the impact of
significant sources or source categories on air
quality.
(d) Sites located to determine general
background concentration levels.
(e) Sites located to determine the extent of
regional pollutant transport among
populated areas; and in support of secondary
standards.
(f) Sites located to measure air pollution
impacts on visibility, vegetation damage, or
other welfare-based impacts.
1.1.2 This appendix contains criteria for
the basic air monitoring requirements. The
total number of monitoring sites that will
serve the variety of data needs will be
substantially higher than these minimum
requirements provide. The optimum size of
a particular network involves trade-offs
among data needs and available resources.
This regulation intends to provide for
national air monitoring needs, and to lend
support for the flexibility necessary to meet
data collection needs of area air quality
managers. EPA, State, and local agencies will
periodically collaborate on network design
issues through the network assessment
process outlined in § 58.10.
1.1.3 This appendix focuses on the
relationship between monitoring objectives,
site types, and the geographic location of
monitoring sites. Included are a rationale and
set of general criteria for identifying
candidate site locations in terms of physical
characteristics which most closely match a
specific monitoring objective. The criteria for
more specifically locating the monitoring
site, including spacing from roadways and
vertical and horizontal probe and path
placement, are described in appendix E to
this part.
1.2 Spatial Scales. (a) To clarify the
nature of the link between general
monitoring objectives, site types, and the
physical location of a particular monitor, the
concept of spatial scale of representativeness
is defined. The goal in locating monitors is
to correctly match the spatial scale
represented by the sample of monitored air
with the spatial scale most appropriate for
the monitoring site type, air pollutant to be
measured, and the monitoring objective.
(b) Thus, spatial scale of representativeness
is described in terms of the physical
dimensions of the air parcel nearest to a
monitoring site throughout which actual
pollutant concentrations are reasonably
similar. The scales of representativeness of
most interest for the monitoring site types
described above are as follows:
(1) Microscale—defines the concentrations
in air volumes associated with area
dimensions ranging from several meters up to
about 100 meters.
(2) Middle scale—defines the concentration
typical of areas up to several city blocks in
size with dimensions ranging from about 100
meters to 0.5 kilometer.
(3) Neighborhood scale—defines
concentrations within some extended area of
the city that has relatively uniform land use
with dimensions in the 0.5 to 4.0 kilometers
range. The neighborhood and urban scales
listed below have the potential to overlap in
applications that concern secondarily formed
or homogeneously distributed air pollutants.
(4) Urban scale—defines concentrations
within an area of city-like dimensions, on the
order of 4 to 50 kilometers. Within a city, the
geographic placement of sources may result
in there being no single site that can be said
to represent air quality on an urban scale.
(5) Regional scale—defines usually a rural
area of reasonably homogeneous geography
without large sources, and extends from tens
to hundreds of kilometers.
(6) National and global scales—these
measurement scales represent concentrations
characterizing the nation and the globe as a
whole.
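The distance bands in paragraphs (1) through (6) amount to a lookup on the representative dimension of the monitored air parcel. The following Python sketch maps that dimension to a scale name; the function name is illustrative, the band edges follow the ranges in the text, and the 1,000 km cap on "regional" is a stand-in for "hundreds of kilometers" (an assumption). As the text notes, real scale assignment also weighs land use and pollutant behavior, not dimension alone.

```python
def spatial_scale(dimension_km):
    """Map the representative dimension of the air parcel (kilometers)
    to the spatial scale names defined in section 1.2(b)."""
    if dimension_km < 0.1:
        return "microscale"        # several meters up to about 100 m
    if dimension_km <= 0.5:
        return "middle"            # about 100 m to 0.5 km
    if dimension_km <= 4.0:
        return "neighborhood"      # 0.5 to 4.0 km
    if dimension_km <= 50.0:
        return "urban"             # on the order of 4 to 50 km
    if dimension_km <= 1000.0:
        return "regional"          # tens to hundreds of km (assumed cap)
    return "national/global"
```

Note the potential overlap the text describes between neighborhood and urban scales for secondarily formed or homogeneously distributed pollutants; a hard band boundary like this one cannot capture that nuance.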
(c) Proper siting of a monitor requires
specification of the monitoring objective, the
types of sites necessary to meet the objective,
and then the desired spatial scale of
representativeness. For example, consider the
case where the objective is to determine
NAAQS compliance by understanding the
maximum ozone concentrations for an area.
Such areas would most likely be located
downwind of a metropolitan area, quite
likely in a suburban residential area where
children and other susceptible individuals
are likely to be outdoors. Sites located in
these areas are most likely to represent an
urban scale of measurement. In this example,
physical location was determined by
considering ozone precursor emission
patterns, public activity, and meteorological
characteristics affecting ozone formation and
dispersion. Thus, spatial scale of
representativeness was not used in the
selection process but was a result of site
location.
(d) In some cases, the physical location of
a site is determined from joint consideration
of both the basic monitoring objective and
the type of monitoring site desired, or
required by this appendix. For example, to
determine PM2.5 concentrations which are
typical over a geographic area having
relatively high PM2.5 concentrations, a
neighborhood scale site is more appropriate.
Such a site would likely be located in a
residential or commercial area having a high
overall PM2.5 emission density but not in the
immediate vicinity of any single dominant
source. Note that in this example, the desired
scale of representativeness was an important
factor in determining the physical location of
the monitoring site.
(e) In either case, classification of the
monitor by its type and spatial scale of
representativeness is necessary and will aid
in interpretation of the monitoring data for a
particular monitoring objective (e.g., public
reporting, NAAQS compliance, or research
support).
(f) Table D–1 of this appendix illustrates
the relationship between the various site
types that can be used to support the three
basic monitoring objectives, and the scales of
representativeness that are generally most
appropriate for that type of site.
TABLE D–1 OF APPENDIX D TO PART 58.—RELATIONSHIP BETWEEN SITE TYPES AND SCALES OF REPRESENTATIVENESS

Site type                                        Appropriate siting scales
1. Highest concentration ....................    Micro, middle, neighborhood (sometimes urban or regional for secondarily formed pollutants).
2. Population oriented ......................    Neighborhood, urban.
3. Source impact ............................    Micro, middle, neighborhood.
TABLE D–1 OF APPENDIX D TO PART 58.—RELATIONSHIP BETWEEN SITE TYPES AND SCALES OF REPRESENTATIVENESS—Continued

Site type                                        Appropriate siting scales
4. General/background & regional transport ..    Urban, regional.
5. Welfare-related impacts ..................    Urban, regional.
2. General Monitoring Requirements.
(a) The National ambient air monitoring
system includes several types of monitoring
stations, each targeting a key data collection
need and each varying in technical
sophistication.
(b) Research grade sites are platforms for
scientific studies, either involved with health
or welfare impacts, measurement methods
development, or other atmospheric studies.
These sites may be collaborative efforts
between regulatory agencies and researchers
with specific scientific objectives for each.
Data from these sites might be collected with
both traditional and experimental
techniques, and data collection might involve
specific laboratory analyses not common in
routine measurement programs. The research
grade sites are not required by regulation;
however, they are mentioned here due to
their important role in supporting the air
quality management program.
(c) The NCore multipollutant sites are sites
that measure multiple pollutants in order to
provide support to integrated air quality
management data needs. NCore sites include
urban scale measurements in general, in a
selection of metropolitan areas and a limited
number of more rural locations. Continuous
monitoring methods are to be used at the
NCore sites when available for a pollutant to
be measured, as it is important to have data
collected over common time periods for
integrated analyses. NCore multipollutant
sites are intended to be long-term sites useful
for a variety of applications including air
quality trends analyses, model evaluation,
and tracking metropolitan area statistics. As
such, the NCore sites should be placed away
from direct emission sources that could
substantially impact the ability to detect area-wide concentrations. NCore sites will also
supplement other SLAMS sites in reporting
to the public in major metropolitan areas. It
is not the intent of the NCore sites to monitor
in every area where the NAAQS are violated,
rather they provide only a subset of the total
monitoring effort necessary to accomplish air
quality management goals. The total number
of monitoring sites that will serve the variety
of national, State, and local governmental
needs will be substantially higher than these
NCore requirements. The Administrator must
approve the NCore sites.
(d) Monitoring sites designated as SLAMS
sites, but not as NCore sites, are intended to
address specific air quality management
interests, and as such, are frequently single-pollutant measurement sites. The EPA
Regional Administrator must approve the
SLAMS sites.
(e) This appendix uses the statistical-based
definitions for metropolitan areas provided
by the Office of Management and Budget and
the Census Bureau. These areas are referred
to as metropolitan statistical areas (MSA),
micropolitan statistical areas, core-based
statistical areas (CBSA), and combined
statistical areas (CSA). A CBSA associated
with at least one urbanized area of at least
50,000 population is termed a Metropolitan
Statistical Area. A CBSA associated with at
least one urbanized cluster of at least 10,000
population is termed a Micropolitan
Statistical Area. CSA consist of two or more
adjacent CBSA. In this appendix, the term
MSA is used to refer to a Metropolitan
Statistical Area. By definition, both MSA and
CSA have a high degree of integration;
however, many such areas cross State or
other political boundaries. MSA and CSA
may also cross more than one air shed. EPA
recognizes that State or local agencies must
consider MSA/CSA boundaries and their
own political boundaries and geographical
characteristics in designing their air
monitoring networks. EPA recognizes that
there may be situations where the EPA
Regional Administrator and the affected State
or local agencies may need to augment or to
divide the overall MSA/CSA monitoring
responsibilities and requirements among
these various agencies to achieve an effective
network design. Full monitoring
requirements apply separately to each
affected State or local agency in the absence
of an agreement between the affected
agencies and the EPA Regional
Administrator.
3. Design Criteria for NCore Sites.
(a) Each State is required to operate one
NCore site. States may delegate this
requirement to a local agency. States with
many MSA often also have multiple air sheds
with unique characteristics and, often,
elevated air pollution. These States include,
at a minimum, California, Florida, Illinois,
Michigan, New York, North Carolina, Ohio,
Pennsylvania, and Texas. These States are
required to identify one to two additional
NCore sites in order to account for their
unique situations. Any State or local agency
can propose additional candidate NCore sites
or modifications to these requirements for
approval by the Administrator. The NCore
locations should be leveraged with other
multipollutant air monitoring sites including
PAMS sites, NATTS sites, CASTNET sites,
and STN sites. Site leveraging includes using
the same monitoring platform and equipment
to meet the objectives of the variety of
programs where possible and advantageous.
(b) The NCore sites must measure, at a
minimum, PM2.5 particle mass using
continuous and integrated/filter-based
samplers, speciated PM2.5, PM10-2.5 particle
mass using continuous samplers, O3, SO2,
CO, NO/NOy, wind speed, wind direction,
relative humidity, and ambient temperature.
EPA recognizes that, in some cases, the
physical location of the NCore site may not
be suitable for representative meteorological
measurements due to the site’s physical
surroundings. It is also possible that nearby
meteorological measurements may be able to
fulfill this data need. In these cases, the
requirement for meteorological monitoring
can be waived by the Administrator.
(c) In addition to the continuous
measurements listed above, 10 of the NCore
locations (either at the same sites or
elsewhere within the MSA/CSA boundary)
must also measure lead (Pb). These ten Pb
sites are included within the NCore networks
because they are intended to be long-term in
operation, and not impacted directly from a
single lead source. These locations for Pb
monitoring must be located in the most
populated MSA/CSA in each of the 10 EPA
Regions. Alternatively, it is also acceptable to
use the Pb concentration data provided at
urban air toxics sites. In approving any
substitutions, the Administrator must
consider whether these alternative sites are
suitable for collecting long-term lead trends
data for the broader area.
4. Pollutant-Specific Design Criteria for
SLAMS Sites.
4.1 Ozone (O3) Design Criteria. (a) State,
and where appropriate, local Agencies must
operate O3 sites for various locations
depending upon area size (in terms of
population and geographic characteristics)
and typical peak concentrations (expressed
in percentages above, below, or near the O3
NAAQS). Specific SLAMS O3 site minimum
requirements are included in Table D–2 of
this appendix. Typically, most of these
required ozone sites will be SLAMS. The
NCore sites are expected to complement the
O3 data collection that takes place at SLAMS
sites, and both types of sites can be used to
meet the network minimum requirements.
The total number of O3 sites needed to
support the basic monitoring objectives of
public data reporting, air quality mapping,
compliance, and understanding O3-related
atmospheric processes will include more
sites than these minimum numbers required
in Table D–2 of this appendix. The EPA
Regional Administrator and the responsible
State or local air monitoring agency must
work together to design and/or maintain the
most appropriate O3 network to service the
variety of data needs in an area.
TABLE D–2 OF APPENDIX D TO PART 58.—SLAMS MINIMUM O3 MONITORING REQUIREMENTS

                                          Most recent 3-year design value concentrations
MSA or CSA population 3, 5          >115% of any          ±15% of any           <85% of any
                                    O3 NAAQS 1            O3 NAAQS 1            O3 NAAQS 1, 2

>10 million .....................        3                     4                     2
4–10 million ....................        2                     3                     1
1–4 million .....................        2                     2                     1
350,000–1 million ...............        2                     2                     1
200,000–350,000 .................        1                     1                     0
50,000–<200,000 4 ...............        1                     1                     0

1 The ozone (O3) National Ambient Air Quality Standards (NAAQS) levels and forms are defined in 40 CFR part 50.
2 These minimum monitoring requirements apply in the absence of a design value.
3 Minimum monitoring requirements apply to the Combined Statistical Area (CSA) as a whole, if applicable.
4 Metropolitan statistical areas (MSA) must contain an urbanized area of 50,000 or more population.
5 Population based on latest available census figures.
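The population and design-value thresholds above amount to a two-way table lookup. The following Python sketch is purely illustrative and not part of the rule; the function and variable names are invented, and boundary values at exactly 85 or 115 percent are read here as falling in the ±15% column:

```python
# Illustrative encoding of Table D-2: each row is
# (population lower bound, sites if design value >115% of the NAAQS,
#  sites if within +/-15%, sites if <85% or no design value).
TABLE_D2 = [
    (10_000_000, 3, 4, 2),
    (4_000_000, 2, 3, 1),
    (1_000_000, 2, 2, 1),
    (350_000, 2, 2, 1),
    (200_000, 1, 1, 0),
    (50_000, 1, 1, 0),
]


def min_o3_sites(population, dv_percent_of_naaqs=None):
    """Minimum SLAMS O3 sites for an MSA/CSA under Table D-2.

    dv_percent_of_naaqs is the most recent 3-year design value as a
    percent of the O3 NAAQS; None means no design value exists, in
    which case footnote 2 applies the <85% column.
    """
    if population < 50_000:
        return 0  # Table D-2 lists no requirement below 50,000
    for lower_bound, high, near, low in TABLE_D2:
        if population >= lower_bound:
            if dv_percent_of_naaqs is None or dv_percent_of_naaqs < 85:
                return low
            if dv_percent_of_naaqs > 115:
                return high
            return near  # design value within +/-15% of the NAAQS
```

Under this reading, a 5-million person CSA with a design value at 110 percent of the NAAQS would need at least 3 sites, while the same area with no design value would need 1.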
(b) At least one O3 site in each MSA/CSA’s
O3 network must be designed to record the
maximum concentration for that particular
metropolitan area. More than one maximum
concentration site may be necessary in some
areas. Table D–2 of this appendix does not
account for the full breadth of additional
factors that would be considered in designing
a complete ozone monitoring program for an
area. Some of these additional factors include
geographic size, population density,
complexity of terrain and meteorology,
adjacent ozone monitoring programs, air
pollution transport from neighboring areas,
and measured air quality in comparison to all
forms of the O3 NAAQS (i.e., 8-hour and 1-hour forms). Networks must be designed to
account for all of these area characteristics.
Network designs must be re-examined in
periodic network assessments. Deviations
from the above O3 requirements are allowed
if approved by the EPA Regional
Administrator.
(c) The appropriate spatial scales for ozone
sites are neighborhood, urban, and regional.
Since ozone requires appreciable formation
time, the mixing of reactants and products
occurs over large volumes of air, and this
reduces the importance of monitoring small
scale spatial variability.
(1) Neighborhood scale—Measurements in
this category represent conditions throughout
some reasonably homogeneous urban
subregion, with dimensions of a few
kilometers. Homogeneity refers to pollutant
concentrations. Neighborhood scale data will
provide valuable information for developing,
testing, and revising concepts and models
that describe urban/regional concentration
patterns. These data will be useful to the
understanding and definition of processes
that take periods of hours to occur and hence
involve considerable mixing and transport.
Under stagnation conditions, a site located in
the neighborhood scale may also experience
peak concentration levels within a
metropolitan area.
(2) Urban scale—Measurement in this scale
will be used to estimate concentrations over
large portions of an urban area with
dimensions of several kilometers to 50 or
more kilometers. Such measurements will be
used for determining trends, and designing
area-wide control strategies. The urban scale
sites would also be used to measure high
concentrations downwind of the area having
the highest precursor emissions.
(3) Regional scale—This scale of
measurement will be used to typify
concentrations over large portions of a
metropolitan area and even larger areas with
dimensions of as much as hundreds of
kilometers. Such measurements will be
useful for assessing the ozone that is
transported to and from a metropolitan area,
as well as background concentrations. In
some situations, particularly when
considering very large metropolitan areas
with complex source mixtures, regional scale
sites can be the maximum concentration
location.
(d) EPA’s technical guidance documents on
ozone monitoring network design should be
used to evaluate the adequacy of each
existing O3 monitor, to relocate an existing
site, or to locate any new O3 sites.
(e) For locating a neighborhood scale site
to measure typical city concentrations, a
reasonably homogeneous geographical area
near the center of the region should be
selected which is also removed from the
influence of major NOX sources. For an urban
scale site to measure the high concentration
areas, the emission inventories should be
used to define the extent of the area of
important nonmethane hydrocarbons and
NOX emissions. The meteorological
conditions that occur during periods of
maximum photochemical activity should be
determined. These periods can be identified
by examining the meteorological conditions
that occur on the highest ozone air quality
days. Trajectory analyses, an evaluation of
wind and emission patterns on high ozone
days, can also be useful in evaluating an
ozone monitoring network. In areas without
any previous ozone air quality
measurements, meteorological and ozone
precursor emissions information would be
useful.
(f) Once the meteorological and air quality
data are reviewed, the prospective maximum
concentration monitor site should be selected
in a direction from the city that is most likely
to observe the highest ozone concentrations,
more specifically, downwind during periods
of photochemical activity. In many cases,
these maximum concentration ozone sites
will be located 10 to 30 miles or more
downwind from the urban area where
maximum ozone precursor emissions
originate. The downwind direction and
appropriate distance should be determined
from historical meteorological data collected
on days which show the potential for
producing high ozone levels. Monitoring
agencies are to consult with their EPA
Regional Office when considering siting a
maximum ozone concentration site.
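One way to derive the downwind direction and distance described in paragraph (f) from historical meteorological data is a vector average of wind directions observed on high-ozone days. The sketch below is an illustration only, not EPA guidance; the function name and data handling are assumptions:

```python
import math


def prevailing_downwind(wind_dirs_deg):
    """Vector-average wind directions (degrees the wind blows FROM,
    meteorological convention) observed on high-ozone days, then
    return the opposite bearing, i.e. the downwind direction in
    which to search for a maximum-concentration O3 site.
    """
    # Sum unit vectors so that, e.g., 350 and 10 degrees average to 0,
    # which a simple arithmetic mean would get wrong.
    x = sum(math.cos(math.radians(d)) for d in wind_dirs_deg)
    y = sum(math.sin(math.radians(d)) for d in wind_dirs_deg)
    from_dir = math.degrees(math.atan2(y, x)) % 360  # mean "from" bearing
    return (from_dir + 180) % 360  # downwind "toward" bearing
```

For example, winds consistently from the southwest (around 225 degrees) on high-ozone days would point toward siting a monitor to the northeast of the urban core, 10 to 30 miles or more downwind as described above.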
(g) In locating a neighborhood scale site
which is to measure high concentrations, the
same procedures used for the urban scale are
followed except that the site should be
located closer to the areas bordering on the
center city or slightly further downwind in
an area of high density population.
(h) For regional scale background
monitoring sites, similar meteorological
analysis as for the maximum concentration
sites may also inform the decisions for
locating regional scale sites. Regional scale
sites may be located to provide data on ozone
transport between cities, as background sites,
or for other data collection purposes.
Consideration of both area characteristics,
such as meteorology, and the data collection
objectives, such as transport, must be jointly
considered for a regional scale site to be
useful.
(i) Since ozone levels decrease significantly
in the colder parts of the year in many areas,
ozone is required to be monitored at SLAMS
monitoring sites only during the ‘‘ozone
season’’ as designated in the AQS files on a
State-by-State basis and described below in
Table D–3 of this appendix. Deviations from
the ozone monitoring season must be
approved by the EPA Regional
Administrator, documented within the
annual monitoring network plan, and
updated in AQS. Information on how to
analyze ozone data to support a change to the
ozone season in support of the 8-hour
standard for a specific State can be found in
reference 8 to this appendix.
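The seasonal requirement in paragraph (i) reduces to a month-range check against Table D–3. The sketch below uses only a few illustrative entries from that table (the table itself is authoritative), and a real implementation would need day-level handling for partial-month seasons such as Wisconsin's April 15 start:

```python
# Illustrative subset of Table D-3 ozone seasons,
# as (begin month, end month) numbers, 1-12.
OZONE_SEASON = {
    "Alabama": (3, 10),   # March-October
    "Arizona": (1, 12),   # January-December (year-round)
    "Montana": (6, 9),    # June-September
}


def in_ozone_season(state, month):
    """True if SLAMS O3 monitoring is required in the given month."""
    begin, end = OZONE_SEASON[state]
    return begin <= month <= end
```
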
TABLE D–3 TO APPENDIX D OF PART 58.—OZONE MONITORING SEASON BY STATE

State                                                 Begin month      End month
Alabama ...........................................   March            October.
Alaska ............................................   April            October.
Arizona ...........................................   January          December.
Arkansas ..........................................   March            November.
California ........................................   January          December.
Colorado ..........................................   March            September.
Connecticut .......................................   April            September.
Delaware ..........................................   April            October.
District of Columbia ..............................   April            October.
Florida ...........................................   March            October.
Georgia ...........................................   March            October.
Hawaii ............................................   January          December.
Idaho .............................................   May              September.
Illinois ..........................................   April            October.
Indiana ...........................................   April            September.
Iowa ..............................................   April            October.
Kansas ............................................   April            October.
Kentucky ..........................................   March            October.
Louisiana AQCR 019, 022 ...........................   March            October.
Louisiana AQCR 106 ................................   January          December.
Maine .............................................   April            September.
Maryland ..........................................   April            October.
Massachusetts .....................................   April            September.
Michigan ..........................................   April            September.
Minnesota .........................................   April            October.
Mississippi .......................................   March            October.
Missouri ..........................................   April            October.
Montana ...........................................   June             September.
Nebraska ..........................................   April            October.
Nevada ............................................   January          December.
New Hampshire .....................................   April            September.
New Jersey ........................................   April            October.
New Mexico ........................................   January          December.
New York ..........................................   April            October.
North Carolina ....................................   April            October.
North Dakota ......................................   May              September.
Ohio ..............................................   April            October.
Oklahoma ..........................................   March            November.
Oregon ............................................   May              September.
Pennsylvania ......................................   April            October.
Puerto Rico .......................................   January          December.
Rhode Island ......................................   April            September.
South Carolina ....................................   April            October.
South Dakota ......................................   June             September.
Tennessee .........................................   March            October.
Texas AQCR 106, 153, 213, 214, 216 ................   January          December.
Texas AQCR 022, 210, 211, 212, 215, 217, 218 ......   March            October.
Utah ..............................................   May              September.
Vermont ...........................................   April            September.
Virginia ..........................................   April            October.
Washington ........................................   May              September.
West Virginia .....................................   April            October.
Wisconsin .........................................   April 15         October 15.
Wyoming ...........................................   April            October.
American Samoa ....................................   January          December.
Guam ..............................................   January          December.
Virgin Islands ....................................   January          December.

4.2 Carbon Monoxide (CO) Design
Criteria. (a) There are no minimum
requirements for the number of CO
monitoring sites. Continued operation of
existing SLAMS CO sites using FRM or FEM
methods is required until discontinuation is
approved by the EPA Regional
Administrator. Where SLAMS CO monitoring
is required, at least one site must be a
maximum concentration site for the area
under investigation.
(b) Microscale and middle scale
measurements are useful site classifications
for SLAMS sites since most people have the
potential for exposure on these scales.
Carbon monoxide maxima occur primarily in
areas near major roadways and intersections
with high traffic density and often poor
atmospheric ventilation.
(1) Microscale—This scale applies when air
quality measurements are to be used to
represent distributions within street canyons,
over sidewalks, and near major roadways. In
the case of carbon monoxide, microscale
measurements in one location can often be
considered as representative of other similar
locations in a city.
(2) Middle scale—Middle scale
measurements are intended to represent areas
with dimensions from 100 meters to 0.5
kilometer. In certain cases, middle scale
measurements may apply to areas that have
a total length of several kilometers, such as
‘‘line’’ emission source areas. This type of
emission source area would include air
quality along a commercially developed
street or shopping plaza, freeway corridors,
parking lots, and feeder streets.
(c) After the spatial scale and type of site
has been determined to meet the monitoring
objective for each location, the technical
guidance in reference 2 of this appendix
should be used to evaluate the adequacy of
each existing CO site and must be used to
relocate an existing site or to locate any new
sites.
4.3 Nitrogen Dioxide (NO2) Design
Criteria. (a) There are no minimum
requirements for the number of NO2
monitoring sites. Continued operation of
existing SLAMS NO2 sites using FRM or FEM
methods is required until discontinuation is
approved by the EPA Regional
Administrator. Where SLAMS NO2
monitoring is required, at least one NO2 site
in the area must be located to measure the
maximum concentration of NO2.
(b) NO/NOY measurements are included
within the NCore multipollutant site
requirements and the PAMS program. These
NO/NOY measurements will produce
conservative estimates for NO2 that can be
used to track continued compliance with the
NO2 NAAQS. NO/NOY monitors are used at
these sites because it is important to collect
data on total reactive nitrogen species for
understanding ozone photochemistry.
4.4 Sulfur Dioxide (SO2) Design Criteria.
(a) There are no minimum requirements for
the number of SO2 monitoring sites.
Continued operation of existing SLAMS SO2
sites using FRM or FEM methods is required
until discontinuation is approved by the EPA
Regional Administrator. Where SLAMS SO2
monitoring is required, at least one of the
SLAMS SO2 sites must be a maximum
concentration site for that specific area.
(b) The appropriate spatial scales for SO2
SLAMS monitoring are the microscale,
middle, and possibly neighborhood scales.
The multi-pollutant NCore sites can provide
for metropolitan area trends analyses and
general control strategy progress tracking.
Other SLAMS sites are expected to provide
data that are useful in specific compliance
actions, for maintenance plan agreements, or
for measuring near specific stationary sources
of SO2.
(1) Micro and middle scale—Some data
uses associated with microscale and middle
scale measurements for SO2 include
assessing the effects of control strategies to
reduce concentrations (especially for the 3-hour and 24-hour averaging times) and
monitoring air pollution episodes.
(2) Neighborhood scale—This scale applies
where there is a need to collect air quality
data as part of an ongoing SO2 stationary
source impact investigation. Typical
locations might include suburban areas
adjacent to SO2 stationary sources for
example, or for determining background
concentrations as part of these studies of
population responses to exposure to SO2.
(c) Technical guidance in reference 1 of
this appendix should be used to evaluate the
adequacy of each existing SO2 site, to
relocate an existing site, or to locate new
sites.
4.5 Lead (Pb) Design Criteria. (a) State,
and where appropriate, local agencies are
required to conduct Pb monitoring for all
areas where Pb levels have been shown or are
expected to be of concern over the most
recent 2 years. As a minimum, there must be
two SLAMS sites in any area where Pb
concentrations currently exceed or have
exceeded the Pb NAAQS in the most recent
2 years, and at least one of these two required
sites must be a maximum concentration site.
Where the Pb air quality violations are
widespread or the emissions density,
topography, or population locations are
complex and varied, the EPA Regional
Administrator may require more than two Pb
ambient air monitoring sites.
(b) The most important spatial scales to
effectively characterize the emissions from
point sources are the micro, middle, and
neighborhood scales.
(1) Microscale—This scale would typify
areas in close proximity to lead point
sources. Emissions from point sources such
as primary and secondary lead smelters, and
primary copper smelters may under
fumigation conditions likewise result in high
ground level concentrations at the
microscale. In the latter case, the microscale
would represent an area impacted by the
plume with dimensions extending up to
approximately 100 meters. Data collected at
microscale sites provide information for
evaluating and developing ‘‘hot-spot’’ control
measures.
(2) Middle scale—This scale generally
represents Pb air quality levels in areas up to
several city blocks in size with dimensions
on the order of approximately 100 meters to
500 meters. The middle scale may for
example, include schools and playgrounds in
center city areas which are close to major Pb
point sources. Pb monitors in such areas are
desirable because of the higher sensitivity of
children to exposures of elevated Pb
concentrations (reference 3 of this appendix).
Emissions from point sources frequently
impact on areas at which single sites may be
located to measure concentrations
representing middle spatial scales.
(3) Neighborhood scale—The
neighborhood scale would characterize air
quality conditions throughout some
relatively uniform land use areas with
dimensions in the 0.5 to 4.0 kilometer range.
Sites of this scale would provide monitoring
data in areas representing conditions where
children live and play. Monitoring in such
areas is important since this segment of the
population is more susceptible to the effects
of Pb. Where a neighborhood site is located
away from immediate Pb sources, the site
may be very useful in representing typical air
quality values for a larger residential area,
and therefore suitable for population
exposure and trends analyses.
(c) Technical guidance is found in
references 4 and 5 of this appendix. These
documents provide additional guidance on
locating sites to meet specific urban area
monitoring objectives and should be used in
locating new sites or evaluating the adequacy
of existing sites.
4.6 Particulate Matter (PM10) Design
Criteria. (a) There are no minimum
requirements for the number of PM10
monitoring sites. In areas where the PM10
NAAQS has not been revoked, continued
operation of existing SLAMS PM10 sites using
FRM or FEM methods is required until
discontinuation is approved by the EPA
Regional Administrator. In areas where
the PM10 NAAQS has been revoked, there is
no requirement for continued operation of
existing sites.
(b) The most important spatial scales to
effectively characterize the emissions of PM10
from both mobile and stationary sources are
the middle scales and neighborhood scales.
For purposes of establishing monitoring sites
to represent large homogeneous areas other
than the above scales of representativeness
and to characterize regional transport, urban
or regional scale sites would also be needed.
(1) Microscale—This scale would typify
areas such as downtown street canyons,
traffic corridors, and fence line stationary
source monitoring locations where the
general public could be exposed to maximum
PM10 concentrations. Microscale particulate
matter sites should be located near inhabited
buildings or locations where the general
public can be expected to be exposed to the
concentration measured. Emissions from
stationary sources such as primary and
secondary smelters, power plants, and other
large industrial processes may, under certain
plume conditions, likewise result in high
ground level concentrations at the
microscale. In the latter case, the microscale
would represent an area impacted by the
plume with dimensions extending up to
approximately 100 meters. Data collected at
microscale sites provide information for
evaluating and developing hot spot control
measures.
(2) Middle scale—Much of the short-term
public exposure to coarse fraction particles
(PM10) is on this scale and on the
neighborhood scale. People moving through
downtown areas or living near major
roadways or stationary sources, may
encounter particulate pollution that would be
adequately characterized by measurements of
this spatial scale. Middle scale PM10
measurements can be appropriate for the
evaluation of possible short-term exposure
public health effects. In many situations,
monitoring sites that are representative of
micro-scale or middle-scale impacts are not
unique and are representative of many
similar situations. This can occur along
traffic corridors or other locations in a
residential district. In this case, one location
is representative of a neighborhood of small
scale sites and is appropriate for evaluation
of long-term or chronic effects. This scale
also includes the characteristic
concentrations for other areas with
dimensions of a few hundred meters such as
the parking lot and feeder streets associated
with shopping centers, stadia, and office
buildings. In the case of PM10, unpaved or
seldom swept parking lots associated with
these sources could be an important source
in addition to the vehicular emissions
themselves.
(3) Neighborhood scale—Measurements in
this category represent conditions throughout
some reasonably homogeneous urban
subregion with dimensions of a few
kilometers and of generally more regular
shape than the middle scale. Homogeneity
refers to the particulate matter
concentrations, as well as the land use and
land surface characteristics. In some cases, a
location carefully chosen to provide
neighborhood scale data would represent not
only the immediate neighborhood but also
neighborhoods of the same type in other
parts of the city. Neighborhood scale PM10
sites provide information about trends and
compliance with standards because they
often represent conditions in areas where
people commonly live and work for extended
periods. Neighborhood scale data could
provide valuable information for developing,
testing, and revising models that describe the
larger-scale concentration patterns, especially
those models relying on spatially smoothed
emission fields for inputs. The neighborhood
scale measurements could also be used for
neighborhood comparisons within or
between cities.
(4) Urban scale—This class of
measurement would be made to characterize
the particulate matter concentration over an
entire metropolitan or rural area ranging in
size from 4 to 50 kilometers. Such
measurements would be useful for assessing
trends in area-wide air quality, and hence,
the effectiveness of large scale air pollution
control strategies.
(5) Regional scale—These measurements
would characterize conditions over areas
with dimensions of as much as hundreds of
kilometers. As noted earlier, using
representative conditions for an area implies
some degree of homogeneity in that area. For
this reason, regional scale measurements
would be most applicable to sparsely
populated areas. Data characteristics of this
scale would provide information about larger
scale processes of particulate matter
emissions, losses and transport.
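The five spatial scales used throughout this section are distinguished by characteristic dimension. The sketch below maps a dimension to the scale named in this appendix; it is illustrative only, since in practice the appropriate scale also depends on homogeneity of concentrations and the monitoring objective, not dimension alone:

```python
def spatial_scale(dimension_km):
    """Map a characteristic dimension (kilometers) to the monitoring
    scale named in this appendix. Boundaries are approximate by design.
    """
    if dimension_km < 0.1:
        return "microscale"  # up to about 100 meters
    if dimension_km <= 0.5:
        return "middle scale"  # about 100 meters to 0.5 kilometer
    if dimension_km <= 4.0:
        return "neighborhood scale"  # 0.5 to 4.0 kilometers
    if dimension_km <= 50.0:
        return "urban scale"  # 4 to 50 kilometers
    return "regional scale"  # up to hundreds of kilometers
```
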
4.7 Fine Particulate Matter (PM2.5) Design
Criteria.
4.7.1 General Requirements. (a) State, and
where applicable local, agencies must
operate the minimum number of required
PM2.5 SLAMS sites listed in Table D–4 of this
appendix. The NCore sites are expected to
complement the PM2.5 data collection that
takes place at non-NCore SLAMS sites, and
both types of sites can be used to meet the
minimum PM2.5 network requirements.
Deviations from these PM2.5 monitoring
requirements must be approved by the EPA
Regional Administrator.
TABLE D–4 OF APPENDIX D TO PART 58.—PM2.5 MINIMUM MONITORING REQUIREMENTS

                                     Most recent 3-year design value
MSA or CSA population 3, 5      ≥115% of any       ±15% of            ≤85% of any
                                PM2.5 NAAQS 1      PM2.5 NAAQS 1, 2   PM2.5 NAAQS 1
>1,000,000 ...................        2                  3                  2
500,000–1,000,000 .............       1                  2                  1
250,000–500,000 ...............       1                  1                  0
100,000–250,000 ...............       1                  1                  0
50,000–<100,000 4 .............       1                  1                  0

1 The PM2.5 National Ambient Air Quality Standards (NAAQS) levels and forms are defined in 40 CFR part 50.
2 These minimum monitoring requirements apply in the absence of a design value.
3 Minimum monitoring requirements apply to the Combined Statistical Area (CSA) as a whole, where applicable.
4 Metropolitan statistical areas (MSA) must contain an urbanized area of 50,000 or more population.
5 Population based on latest available census figures.
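The lookup that Table D–4 embodies can be sketched in code. The following is an illustrative reading of the table, not regulatory text: the function name is invented, the column values are taken from the table as reconstructed above, and treating a missing design value like the middle column (per footnote 2) is an interpretive assumption.

```python
from typing import Optional

def min_pm25_sites(population: int, dv_ratio: Optional[float]) -> int:
    """Minimum required PM2.5 SLAMS sites per Table D-4 (illustrative sketch).

    dv_ratio is the most recent 3-year design value divided by the NAAQS
    level (e.g. 1.10 means 110% of the standard). None means no design
    value exists, which footnote 2 treats like the middle column.
    """
    if population < 50_000:
        return 0  # Table D-4 starts at MSA/CSA populations of 50,000
    if population > 1_000_000:
        high, mid, low = 2, 3, 2
    elif population >= 500_000:
        high, mid, low = 1, 2, 1
    else:  # 50,000 to 500,000: the three remaining rows share the same values
        high, mid, low = 1, 1, 0
    if dv_ratio is None or 0.85 < dv_ratio < 1.15:
        return mid  # within +/-15% of the NAAQS, or no design value
    return high if dv_ratio >= 1.15 else low
```

For example, an MSA of 1.2 million people with a design value at 100% of the NAAQS would need 3 sites under this reading.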
(b) The technical guidance in references 6
and 7 of this appendix should be used for
siting PM2.5 monitors.
(c) The most important spatial scale to
effectively characterize the emissions of
particulate matter from both mobile and
stationary sources is the neighborhood scale
for PM2.5. For purposes of establishing
monitoring sites to represent large
homogeneous areas other than the above
scales of representativeness and to
characterize regional transport, urban or
regional scale sites would also be needed.
Most PM2.5 monitoring in urban areas should
be representative of a neighborhood scale.
(1) Microscale—This scale would typify
areas such as downtown street canyons and
traffic corridors where the general public
would be exposed to maximum
concentrations from mobile sources. In some
circumstances, the microscale is appropriate
for particulate sites; community-oriented
SLAMS sites measured at the microscale
level should, however, be limited to urban
sites that are representative of long-term
human exposure and of many such
microenvironments in the area. In general,
microscale particulate matter sites should be
located near inhabited buildings or locations
where the general public can be expected to
be exposed to the concentration measured.
Emissions from stationary sources such as
primary and secondary smelters, power
plants, and other large industrial processes
may, under certain plume conditions,
likewise result in high ground level
concentrations at the microscale. In the latter
case, the microscale would represent an area
impacted by the plume with dimensions
extending up to approximately 100 meters.
Data collected at microscale sites provide
information for evaluating and developing
hot spot control measures. Unless these sites
are indicative of population-oriented
monitoring, they may be more appropriately
classified as special purpose monitors
(SPMs). Microscale PM2.5 sites would be
excluded from comparison with the annual
PM2.5 NAAQS in accordance with
§ 58.30(a)(1).
(2) Middle scale—People moving through
downtown areas, or living near major
roadways, encounter particle concentrations
that would be adequately characterized by
this spatial scale. Thus, measurements of this
type would be appropriate for the evaluation
of possible short-term exposure public health
effects of particulate matter pollution. In
many situations, monitoring sites that are
representative of microscale or middle-scale
impacts are not unique and are representative
of many similar situations. This can occur
along traffic corridors or other locations in a
residential district. In this case, one location
is representative of a number of small scale
sites and is appropriate for evaluation of
long-term or chronic effects. This scale also
includes the characteristic concentrations for
other areas with dimensions of a few
hundred meters such as the parking lot and
feeder streets associated with shopping
centers, stadia, and office buildings.
(3) Neighborhood scale—Measurements in
this category would represent conditions
throughout some reasonably homogeneous
urban subregion with dimensions of a few
kilometers and of generally more regular
shape than the middle scale. Homogeneity
refers to the particulate matter
concentrations, as well as the land use and
land surface characteristics. Much of the
PM2.5 exposure is expected to be associated
with this scale of measurement. In some
cases, a location carefully chosen to provide
neighborhood scale data would represent the
immediate neighborhood as well as
neighborhoods of the same type in other
parts of the city. PM2.5 sites of this kind
provide good information about trends and
compliance with standards because they
often represent conditions in areas where
people commonly live and work for periods
comparable to those specified in the NAAQS.
In general, most PM2.5 monitoring in urban
areas should have this scale.
(4) Urban scale—This class of
measurement would be used to characterize
the particulate matter concentration over an
entire metropolitan or rural area ranging in
size from 4 to 50 kilometers. Such
measurements would be useful for assessing
trends in area-wide air quality, and hence,
the effectiveness of large scale air pollution
control strategies. Community-oriented PM2.5
sites may have this scale.
(5) Regional scale—These measurements
would characterize conditions over areas
with dimensions of as much as hundreds of
kilometers. As noted earlier, using
representative conditions for an area implies
some degree of homogeneity in that area. For
this reason, regional scale measurements
would be most applicable to sparsely
populated areas. Data characteristics of this
scale would provide information about larger
scale processes of particulate matter
emissions, losses and transport. PM2.5
transport contributes to elevated particulate
concentrations and may affect multiple urban
and State entities with large populations
such as in the eastern United States.
Development of effective pollution control
strategies requires an understanding at
regional geographical scales of the emission
sources and atmospheric processes that are
responsible for elevated PM2.5 levels and may
also be associated with elevated ozone and
regional haze.
4.7.2 Requirement for Continuous PM2.5
Monitoring. State, or where appropriate, local
agencies must operate continuous fine
particulate analyzers at one-half (round up)
of the minimum required sites listed in Table
D–4 of this appendix. State and local air
monitoring agencies must use methodologies
and quality assurance/quality control (QA/
QC) procedures approved by the EPA
Regional Administrator for these sites.
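The one-half (round up) requirement in section 4.7.2 is simple ceiling arithmetic. A minimal sketch (the function name is illustrative, not from the rule):

```python
import math

def required_continuous_analyzers(min_required_sites: int) -> int:
    """Continuous PM2.5 analyzers required under section 4.7.2:
    one-half of the minimum required SLAMS sites, rounded up."""
    return math.ceil(min_required_sites / 2)
```

So an area with a 3-site minimum would need 2 continuous analyzers, and a 1-site minimum would still need 1.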
4.7.3 Requirement for PM2.5 Background
and Transport Sites. Each State shall install
and operate at least one PM2.5 site to monitor
for regional background and at least one
PM2.5 site to monitor regional transport.
These monitoring sites may be at community-oriented sites and this requirement may be
satisfied by a corresponding monitor in an
area having similar air quality in another
State. State and local air monitoring agencies
must use methodologies and QA/QC
procedures approved by the EPA Regional
Administrator for these sites. Methods used
at these sites may include non-federal
reference method samplers such as IMPROVE
or continuous PM2.5 monitors.
4.7.4 PM2.5 Chemical Speciation Site
Requirements. Each State shall continue to
conduct chemical speciation monitoring and
analyses at sites designated to be part of the
PM2.5 Speciation Trends Network (STN). The
selection and modification of these STN sites
must be approved by the Administrator. The
PM2.5 chemical speciation urban trends sites
shall include analysis for elements, selected
anions and cations, and carbon. Samples
must be collected using the monitoring
methods and the sampling schedules
approved by the Administrator. Chemical
speciation is encouraged at additional sites
where the chemically resolved data would be
useful in developing State implementation
plans and supporting atmospheric or health
effects related studies.
4.7.5 Special Network Considerations
Required When Using PM2.5 Spatial
Averaging Approaches. (a) The PM2.5
NAAQS, specified in 40 CFR 50, provides
State and local air monitoring agencies with
an option for spatially averaging PM2.5 air
quality data. More specifically, two or more
community-oriented (i.e., sites in populated
areas) PM2.5 monitors may be averaged for
comparison with the annual PM2.5 NAAQS.
This averaging approach is directly related to
epidemiological studies used as the basis for
the PM2.5 annual NAAQS. Spatial averaging
does not apply to comparisons with the daily
PM2.5 NAAQS.
(b) State and local agencies must carefully
consider their approach for PM2.5 network
design when they intend to spatially average
the data for compliance purposes. These
State and local air monitoring agencies must
define the area over which they intend to
average PM2.5 air quality concentrations. This
area is defined as a Community Monitoring
Zone (CMZ), which characterizes an area of
relatively similar annual average air quality.
State and local agencies can define a CMZ in
a number of ways, including as part or all of
a metropolitan area. These CMZs must be
defined within a State or local agency's
network description, as required in § 58.10 of
this part and approved by the EPA Regional
Administrator. When more than one CMZ is
described within an agency’s network design
plan, CMZs must not overlap in their
geographical coverage. The criteria that must
be used for evaluating the acceptability of
spatial averaging are defined in Appendix N
of 40 CFR Part 50.
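Conceptually, spatial averaging compares the mean of the annual averages from a CMZ's eligible monitors against the annual NAAQS. The sketch below illustrates only that averaging step; the binding eligibility and acceptability criteria live in Appendix N of 40 CFR part 50, and the function and variable names here are invented for illustration.

```python
def cmz_annual_average(annual_means_ug_m3: list) -> float:
    """Average the annual mean PM2.5 concentrations (ug/m3) across the
    monitors in one Community Monitoring Zone (illustrative sketch)."""
    if not annual_means_ug_m3:
        raise ValueError("a CMZ needs at least one eligible monitor")
    return sum(annual_means_ug_m3) / len(annual_means_ug_m3)
```

For instance, two community-oriented monitors with annual means of 14.0 and 16.0 ug/m3 would yield a CMZ average of 15.0 ug/m3 for comparison with the annual NAAQS.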
4.8 Coarse Particulate Matter (PM10-2.5)
Design Criteria.
4.8.1 General Monitoring Requirements.
(a) Consistent with the indicator for the
proposed PM10-2.5 NAAQS, required PM10-2.5
monitoring will address areas where the mix
of PM10-2.5 is dominated by resuspended dust
from high-density traffic on paved roads and
PM generated by industrial sources and
construction sources, and will not address
areas where it is dominated by rural
windblown dust and soils and PM generated
by agricultural and mining sources.
(b) State, and where applicable, local
agencies must operate, at a minimum, the
number of required PM10-2.5 SLAMS sites
listed in Table D–5 of this appendix. The
minimum requirements of Table D–5 apply
only to MSAs that contain all or part of an
urbanized area with a population of at least
100,000 persons. NCore sites are expected to
complement the PM10-2.5 data collection that
takes place at SLAMS sites. Data from urban
NCore sites can be used to meet minimum
PM10-2.5 network requirements if those sites
meet the NAAQS comparability criteria in
§ 58.30(b). Modifications from the PM10-2.5
monitoring requirements must be approved
by the Regional Administrator.
TABLE D–5 OF APPENDIX D TO PART 58.—PM10-2.5 MINIMUM MONITORING REQUIREMENTS

                                     Most recent 3-year design value 2
MSA population 1, 5             ≥80% of            50%–80% of         <50% of
                                PM10-2.5 NAAQS 3   PM10-2.5 NAAQS 3,4 PM10-2.5 NAAQS 3
>5,000,000 ....................       5                  3                  2
1,000,000–<5,000,000 ..........       4                  2                  1
500,000–<1,000,000 ............       3                  1                  0
100,000–<500,000 ..............       2                  1                  0

1 Metropolitan Statistical Area (MSA) as defined by the Office of Management and Budget. The minimum requirements of this table apply only to MSAs that contain all or part of an urbanized area with a population of at least 100,000 persons. Multiple MSAs in a Combined Statistical Area (CSA) are separately subject to these requirements based on their population and design value.
2 A database of estimated PM10-2.5 design values will be provided by EPA until the network is fully deployed for three years. States may propose alternate estimates for EPA Regional Administrator approval.
3 The PM10-2.5 National Ambient Air Quality Standards (NAAQS) levels and forms are defined in part 50 of this chapter.
4 These minimum monitoring requirements apply in the absence of a design value.
5 Population based on latest available census figures.
(c) Middle and neighborhood scale
measurements are the most important station
classifications for PM10-2.5 to assess the
variation in coarse particle concentrations
that would be expected across populated
areas that are in proximity to large emissions
sources. Sites that represent larger spatial
scales would characterize concentrations in
the suburban, highly populated areas of
larger MSAs that are more distant from the
zones of most concentrated industrial
activity.
(1) Microscale—This scale would typify
relatively small areas immediately adjacent
to: Industrial sources; locations experiencing
ongoing construction, redevelopment, and
soil disturbance; and heavily traveled
roadways. Data collected at microscale
stations would characterize exposure over
areas of limited spatial extent and population
exposure, and may provide information
useful for evaluating and developing source-oriented control measures. Microscale sites
would be excluded from comparison with the
NAAQS in accordance with § 58.30(b)(4), and
may be more appropriately classified as
SPMs.
(2) Middle scale—People living or working
near major roadways or industrial districts
encounter particle concentrations that would
be adequately characterized by this spatial
scale. Thus, measurements of this type would
be appropriate for the evaluation of public
health effects of coarse particle exposure.
Monitors located in populated areas that are
nearly adjacent to large industrial point
sources of coarse particles provide suitable
locations for assessing maximum population
exposure levels and identifying areas of
potentially poor air quality. Similarly,
monitors located in populated areas that
border dense networks of heavily traveled
roadways are appropriate for assessing the
impacts of resuspended road dust. This scale
also includes the characteristic
concentrations for other areas with
dimensions of a few hundred meters such as
school grounds and parks that are nearly
adjacent to major roadways and industrial
point sources, locations exhibiting mixed
residential and commercial development,
and downtown areas featuring office
buildings, shopping centers, and stadiums.
(3) Neighborhood scale—Measurements in
this category would represent conditions
throughout some reasonably homogeneous
urban subregion with dimensions of a few
kilometers and of generally more regular
shape than the middle scale. Homogeneity
refers to the particulate matter
concentrations, as well as the land use and
land surface characteristics. This category
includes suburban neighborhoods dominated
by residences that are somewhat distant from
major roadways and industrial districts but
still impacted by urban sources, and areas of
diverse land use where residences are
interspersed with commercial and industrial
neighborhoods. In some cases, a location
carefully chosen to provide neighborhood
scale data would represent the immediate
neighborhood as well as neighborhoods of
the same type in other parts of the city. The
comparison of data from middle scale and
neighborhood scale sites would provide
valuable information for determining the
variation of PM10-2.5 levels across urban areas
and assessing the spatial extent of elevated
concentrations caused by major industrial
point sources and heavily traveled roadways.
Neighborhood scale sites would provide
concentration data that are relevant to
informing a large segment of the population
of their exposure levels on a given day.
4.8.2 PM10-2.5 Specific Siting
Requirements.
4.8.2.1 A minimum of 50 percent of the
PM10-2.5 sites required in Table D–5 of this
appendix must characterize middle scale-sized areas (values of 0.5 monitors and
greater round up). Middle-scale sites must be
situated in areas of expected maximum
concentration among sites eligible for
comparison to the NAAQS.
4.8.2.2 For those areas with monitoring
requirements greater than one required
monitor, at least one of the required monitors
must be at a population-oriented site in a
neighborhood scale-sized area that is highly
populated and which may be somewhat
further away from emission sources than the
required middle-scale sites, subject to the
requirement that the site must meet the
comparability criteria in § 58.30(b). Among
such sites, the State should select a site
characterized by a large number of people
subject to exposure; typically, this
population number would be higher than the
population at middle-scale sites expected to
record maximum concentrations.
4.8.2.3 For MSAs with a requirement for
four or five monitors, the siting of the
remaining unspecified monitor is left to the
discretion of the State or local monitoring
agency, subject to the requirement that the
site must meet the comparability criteria in
§ 58.30(b). This site could be placed in
middle-scale or neighborhood scale locations
similar to those that would be eligible as
monitoring sites for the other required
monitors. A State may also choose to place
the site in a location that is somewhat more
distant from downtown areas, main
industrial source regions, or areas of highest
traffic density, such as in a suburban
residential community.
4.8.3 PM10-2.5 Chemical Speciation Site
Requirements. One chemical speciation
monitoring site is required in each MSA with
total population over 500,000 people that
also has an estimated PM10-2.5 design value
greater than 80% of the NAAQS. These sites
will gather data in areas that have a higher
probability of exceeding the proposed
NAAQS and also have larger exposed
populations at risk, and will support the
characterization of coarse particle
concentrations that control the attainment/
nonattainment status of the area. Samples
must be collected using monitoring methods
and the sampling schedules approved by the
EPA Regional Administrator. Chemical
speciation is encouraged at additional sites to
support development of State
implementation plans and atmospheric or
health effects related studies. These
additional locations may include STN,
NCore, CASTNET, and IMPROVE sites to
provide coverage of sources typical of urban
core locations, suburban regions typified by
predominantly residential districts, and less
densely-settled rural locations that may be
characterized by naturally occurring geologic
materials. The selection and modification of
PM10-2.5 chemical speciation sites must be
approved by the EPA Regional
Administrator.
4.9 Filter Archive Requirements for
PM2.5, PM10, and PM10-2.5. Air pollution
control agencies shall archive PM2.5, PM10,
and PM10-2.5 filters from all SLAMS sites for
1 year after collection. These filters shall be
made available during the course of that year
for supplemental analyses at the request of
EPA or to provide information to State and
local agencies on PM2.5 composition. Other
Federal Agencies may request access to filters
for purposes of supporting air quality
management or community health—such as
biological assay—through the applicable EPA
Regional Administrator. The filters shall be
archived according to procedures approved
by the Administrator. EPA recommends that
particulate matter filters be archived for
longer periods, especially for key sites in
making NAAQS related decisions or for
supporting health-related air pollution
studies.
5. Network Design for Photochemical
Assessment Monitoring Stations (PAMS).
The PAMS program provides more
comprehensive data on O3 air pollution in
areas classified as serious, severe, or extreme
nonattainment for ozone than would
otherwise be achieved through the NCore and
SLAMS sites. More specifically, the PAMS
program includes measurements for ozone,
oxides of nitrogen, volatile organic
compounds, and meteorology.
5.1 PAMS Monitoring Objectives. PAMS
design criteria are site specific. Concurrent
measurements of O3, oxides of nitrogen,
speciated VOC, CO, and meteorology are
obtained at PAMS sites. Design criteria for
the PAMS network are based on locations
relative to O3 precursor source areas and
predominant wind directions associated with
high O3 events. Specific monitoring
objectives are associated with each location.
The overall design should enable
characterization of precursor emission
sources within the area, transport of O3 and
its precursors, and the photochemical
processes related to O3 nonattainment.
Specific objectives that must be addressed
include assessing ambient trends in O3,
oxides of nitrogen, VOC species, and
determining spatial and diurnal variability of
O3, oxides of nitrogen, and VOC species.
Specific monitoring objectives associated
with each of these sites may result in four
distinct site types. Detailed guidance for the
locating of these sites may be found in
reference 9 of this appendix.
(a) Type 1 sites are established to
characterize upwind background and
transported O3 and its precursor
concentrations entering the area and will
identify those areas which are subjected to
transport.
(b) Type 2 sites are established to monitor
the magnitude and type of precursor
emissions in the area where maximum
precursor emissions are expected to impact
and are suited for the monitoring of urban air
toxic pollutants.
(c) Type 3 sites are intended to monitor
maximum O3 concentrations occurring
downwind from the area of maximum
precursor emissions.
(d) Type 4 sites are established to
characterize the downwind transported O3
and its precursor concentrations exiting the
area and will identify those areas which are
potentially contributing to overwhelming
transport in other areas.
5.2 Monitoring Period. PAMS precursor
monitoring must be conducted annually
throughout the months of June, July and
August (as a minimum) when peak O3 values
are expected in each area. Alternate
precursor monitoring periods may be
submitted for approval to the Administrator
as a part of the annual monitoring network
plan required by § 58.10.
5.3 Minimum Monitoring Network
Requirements. A Type 2 site is required for
each area. Overall, only two sites are required
for each area, provided all chemical
measurements are made. For example, if a
design includes two Type 2 sites, then a third
site will be necessary to capture the NOy
measurement. The minimum required
number and type of monitoring sites and
sampling requirements are listed in Table D–
6 of this appendix. Any alternative plans may
be put in place in lieu of these requirements,
if approved by the Administrator.
TABLE D–6 OF APPENDIX D TO PART 58.—MINIMUM REQUIRED PAMS MONITORING LOCATIONS AND FREQUENCIES
(Sampling frequency is daily 1 for all measurements except upper air meteorology.)

Speciated VOC 2
  Where required: Two sites per area, one of which must be a Type 2 site.
  Sampling frequency: During the PAMS monitoring period: (1) Hourly auto GC, or (2) Eight 3-hour canisters, or (3) 1 morning and 1 afternoon canister with a 3-hour or less averaging time plus Continuous Total Non-methane Hydrocarbon measurement.
Carbonyl Sampling
  Where required: Type 2 site in areas classified as serious or above for the 8-hour ozone standard.
  Sampling frequency: 3-hour samples every day during the PAMS monitoring period.
NOX
  Where required: All Type 2 sites.
  Sampling frequency: Hourly during the ozone monitoring season.3
NOY
  Where required: One site per area at the Type 3 or Type 1 site.
  Sampling frequency: Hourly during the ozone monitoring season.
CO (ppb level)
  Where required: One site per area at a Type 2 site.
  Sampling frequency: Hourly during the ozone monitoring season.
Ozone
  Where required: All sites.
  Sampling frequency: Hourly during the ozone monitoring season.
Surface met
  Where required: All sites.
  Sampling frequency: Hourly during the ozone monitoring season.
Upper air meteorology
  Where required: One representative location within PAMS area.
  Sampling frequency: Must be approved as part of the PAMS Network Description described in 40 CFR 58.41.

1 Daily or with an approved alternative plan.
2 Speciated VOC is defined in the ‘‘Technical Assistance Document for Sampling and Analysis of Ozone Precursors’’, EPA/600–R–98/161, September 1998.
3 Approved ozone monitoring season as stipulated in 40 CFR part 58, Table D–3 of this appendix.
5.4 Transition Period. A transition period
is allowed for phasing in the operation of
newly required PAMS programs (due
generally to reclassification of an area into
serious, severe, or extreme nonattainment for
ozone). Following the date of redesignation
or reclassification of any existing O3
nonattainment area to serious, severe, or
extreme, or the designation of a new area and
classification to serious, severe, or extreme
O3 nonattainment, a State is allowed one year
to develop plans for its PAMS
implementation strategy. Subsequently, a
minimum of one Type 2 site must be
operating by the first month of the following
approved PAMS season. Operation of the
remaining site(s) must, at a minimum, be
phased in at the rate of one site per year
during subsequent years as outlined in the
approved PAMS network description
provided by the State.
6. References.
1. Ball, R.J. and G. E. Anderson. Optimum
Site Exposure Criteria for SO2 Monitoring.
The Center for the Environment and Man,
Inc., Hartford, CT. Prepared for U.S.
Environmental Protection Agency, Research
Triangle Park, NC. EPA Publication No. EPA–
450/3–77–013. April 1977.
2. Ludwig, F.F., J.H.S. Kealoha, and E.
Shelar. Selecting Sites for Carbon Monoxide
Monitoring. Stanford Research Institute,
Menlo Park, CA. Prepared for U.S.
Environmental Protection Agency, Research
Triangle Park, NC. EPA Publication No. EPA–
450/3–75–077, September 1975.
3. Air Quality Criteria for Lead. Office of
Research and Development, U.S.
Environmental Protection Agency,
Washington, DC. EPA Publication No. 600/8–
89–049F. August 1990. (NTIS document
numbers PB87–142378 and PB91–138420.)
4. Optimum Site Exposure Criteria for Lead
Monitoring. PEDCo Environmental, Inc.
Cincinnati, OH. Prepared for U.S.
Environmental Protection Agency, Research
Triangle Park, NC. EPA Contract No. 68–02–
3013. May 1981.
5. Guidance for Conducting Ambient Air
Monitoring for Lead Around Point Sources.
Office of Air Quality Planning and Standards,
U.S. Environmental Protection Agency,
Research Triangle Park, NC. EPA–454/R–92–
009. May 1997.
6. Koch, R.C. and H.E. Rector. Optimum
Network Design and Site Exposure Criteria
for Particulate Matter. GEOMET
Technologies, Inc., Rockville, MD. Prepared
for U.S. Environmental Protection Agency,
Research Triangle Park, NC. EPA Contract
No. 68–02–3584. EPA 450/4–87–009. May
1987.
7. Watson et al. Guidance for Network
Design and Optimum Site Exposure for PM2.5
and PM10. Prepared for U.S. Environmental
Protection Agency, Research Triangle Park,
NC. EPA–454/R–99–022, December 1997.
8. Guideline for Selecting and Modifying
the Ozone Monitoring Season Based on an 8-Hour Ozone Standard. Prepared for U.S.
Environmental Protection Agency, RTP, NC.
EPA–454/R–98–001, June 1998.
9. Photochemical Assessment Monitoring
Stations Implementation Manual. Office of
Air Quality Planning and Standards, U.S.
Environmental Protection Agency, Research
Triangle Park, NC. EPA–454/B–93–051.
March 1994.
52. Appendix E to part 58 is revised
to read as follows:
Appendix E to Part 58—Probe and
Monitoring Path Siting Criteria for
Ambient Air Quality Monitoring
1. Introduction.
2. Horizontal and Vertical Placement.
3. Spacing from Minor Sources.
4. Spacing From Obstructions.
5. Spacing From Trees.
6. Spacing From Roadways.
7. Cumulative Interferences on a
Monitoring Path.
8. Maximum Monitoring Path Length.
9. Probe Material and Pollutant Sample
Residence Time.
10. Waiver Provisions.
11. Summary.
12. References.
1. Introduction.
(a) This appendix contains specific
location criteria applicable to SLAMS,
NCore, and PAMS ambient air quality
monitoring probes, inlets, and optical paths
after the general location has been selected
based on the monitoring objectives and
spatial scale of representation discussed in
appendix D to this part. Adherence to these
siting criteria is necessary to ensure the
uniform collection of compatible and
comparable air quality data.
(b) The probe and monitoring path siting
criteria discussed in this appendix must be
followed to the maximum extent possible. It
is recognized that there may be situations
where some deviation from the siting criteria
may be necessary. In any such case, the
reasons must be thoroughly documented in a
written request for a waiver that describes
how and why the proposed siting deviates
from the criteria. This documentation should
help to avoid later questions about the
validity of the resulting monitoring data.
Conditions under which the EPA would
consider an application for waiver from these
siting criteria are discussed in section 11 of
this appendix.
(c) The pollutant-specific probe and
monitoring path siting criteria generally
apply to all spatial scales except where noted
otherwise. Specific siting criteria that are
phrased with a ‘‘must’’ are defined as
requirements and exceptions must be
approved through the waiver provisions.
However, siting criteria that are phrased with
a ‘‘should’’ are defined as goals to meet for
consistency but are not requirements.
2. Horizontal and Vertical Placement.
The probe or at least 80 percent of the
monitoring path must be located between 2
and 15 meters above ground level for all
ozone, sulfur dioxide and nitrogen dioxide
monitoring sites, and for neighborhood scale
Pb, PM10, PM10-2.5, PM2.5, and carbon
monoxide sites. Middle scale PM10-2.5 sites
are required to have sampler inlets between
2 and 7 meters above ground level.
Microscale Pb, PM10, and PM2.5 sites are
required to have sampler inlets between 2
and 7 meters above ground level. The inlet
probes for microscale carbon monoxide
monitors that are being used to measure
concentrations near roadways must be 3±½
meters above ground level. The probe or at
least 90 percent of the monitoring path must
be at least 1 meter vertically or horizontally
away from any supporting structure, walls,
parapets, penthouses, etc., and away from
dusty or dirty areas. If the probe or a
significant portion of the monitoring path is
located near the side of a building, then it
should be located on the windward side of
the building relative to the prevailing wind
direction during the season of highest
concentration potential for the pollutant
being measured.
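The height rules in this section reduce to a few numeric ranges. The following is a hedged sketch, assuming pollutant and scale are passed as plain strings; these labels are illustrative, not regulatory identifiers, and the function does not encode every combination in the rule text.

```python
def inlet_height_ok(pollutant: str, scale: str, height_m: float) -> bool:
    """Check a probe/inlet height against the section 2 placement rules (sketch)."""
    if pollutant == "CO" and scale == "microscale":
        return 2.5 <= height_m <= 3.5   # 3 m plus or minus 1/2 m near roadways
    if scale == "microscale" and pollutant in ("Pb", "PM10", "PM2.5"):
        return 2.0 <= height_m <= 7.0   # microscale particulate sites
    if scale == "middle" and pollutant == "PM10-2.5":
        return 2.0 <= height_m <= 7.0   # middle-scale PM10-2.5 sites
    return 2.0 <= height_m <= 15.0      # general 2-15 m rule
```

For example, a microscale CO inlet at 3.0 m passes, while a microscale PM2.5 inlet at 8.0 m fails the 2-7 m window.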
3. Spacing from Minor Sources.
(a) It is important to understand the
monitoring objective for a particular location
in order to interpret this particular
requirement. Local minor sources of a
primary pollutant, such as SO2, lead, or
particles, can cause high concentrations of
that particular pollutant at a monitoring site.
If the objective for that monitoring site is to
investigate these local primary pollutant
emissions, then the site is likely to be
properly located nearby. This type of
monitoring site would in all likelihood be a
microscale type of monitoring site. If a
monitoring site is to be used to determine air
quality over a much larger area, such as a
neighborhood or city, a monitoring agency
should avoid placing a monitor probe, path,
or inlet near local, minor sources. The plume
from the local minor sources should not be
allowed to inappropriately impact the air
quality data collected at a site. Particulate
matter sites should not be located in an
unpaved area unless there is vegetative
ground cover year round, so that the impact
of windblown dust is kept to a minimum.
(b) Similarly, local sources of nitric oxide
(NO) and ozone-reactive hydrocarbons can
have a scavenging effect causing
unrepresentatively low concentrations of O3
in the vicinity of probes and monitoring
paths for O3. To minimize these potential
interferences, the probe or at least 90 percent
of the monitoring path must be away from
furnace or incineration flues or other minor
sources of SO2 or NO. The separation
distance should take into account the heights
of the flues, type of waste or fuel burned, and
the sulfur content of the fuel.
4. Spacing From Obstructions.
(a) Buildings and other obstacles may
possibly scavenge SO2, O3, or NO2, and can
act to restrict airflow for any pollutant. To
avoid this interference, the probe, inlet, or at
least 90 percent of the monitoring path must
have unrestricted airflow and be located
away from obstacles. The distance from the
obstacle to the probe, inlet, or monitoring
path must be at least twice the height that the
obstacle protrudes above the probe, inlet, or
monitoring path. An exception to this
requirement can be made for measurements
taken in street canyons or at source-oriented
sites where buildings and other structures are
unavoidable.
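The 2:1 spacing requirement in paragraph (a) can be sketched numerically. This helper is illustrative only, not rule text; the parameter names and the reading that obstacles at or below probe height impose no distance requirement under this particular criterion are assumptions.

```python
def obstruction_spacing_ok(distance_m, obstacle_height_m, probe_height_m):
    """Check the 2:1 obstruction rule from paragraph 4(a): the distance
    from the obstacle to the probe, inlet, or monitoring path must be at
    least twice the height the obstacle protrudes above the probe.
    Obstacles at or below probe height protrude by zero (illustrative
    reading of this criterion)."""
    protrusion_m = max(0.0, obstacle_height_m - probe_height_m)
    return distance_m >= 2.0 * protrusion_m
```

Under this sketch, an obstacle rising 5 meters above the probe requires at least 10 meters of separation; street canyon and source-oriented sites are exempt, as noted above.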
(b) Generally, a probe or monitoring path
located near or along a vertical wall is
undesirable because air moving along the
wall may be subject to possible removal
mechanisms. A probe, inlet, or monitoring
path must have unrestricted airflow in an arc
of at least 180 degrees. This arc must include
the predominant wind direction for the
season of greatest pollutant concentration
potential. For particle sampling, a minimum
of 2 meters of separation from walls,
parapets, and structures is required for
rooftop site placement.
(c) Special consideration must be devoted
to the use of open path analyzers due to their
inherent potential sensitivity to certain types
of interferences, or optical obstructions. A
monitoring path must be clear of all trees,
brush, buildings, plumes, dust, or other
optical obstructions, including potential
obstructions that may move due to wind,
human activity, growth of vegetation, etc.
Temporary optical obstructions, such as rain,
particles, fog, or snow, should be considered
when siting an open path analyzer. Any of
these temporary obstructions that are of
sufficient density to obscure the light beam
will affect the ability of the open path
analyzer to continuously measure pollutant
concentrations. Transient but significant
obscuration, especially of longer
measurement paths, could occur as a result of
certain meteorological conditions (e.g., heavy
fog, rain, snow) and/or aerosol levels that are
of a sufficient density to prevent the open
path analyzer’s light transmission. If certain
compensating measures are not otherwise
implemented at the onset of monitoring (e.g.,
shorter path lengths, higher light source
intensity), data recovery during periods of
greatest primary pollutant potential could be
compromised. For instance, if heavy fog or
high particulate levels are coincident with
periods of projected NAAQS-threatening
pollutant potential, the representativeness of
the resulting data record in reflecting
maximum pollutant concentrations may be
substantially impaired despite the fact that
the site may otherwise exhibit an acceptable,
even exceedingly high overall valid data
capture rate.
5. Spacing From Trees.
(a) Trees can provide surfaces for SO2, O3,
or NO2 adsorption or reactions, and surfaces
for particle deposition. Trees can also act as
obstructions in cases where they are located
between the air pollutant sources or source
areas and the monitoring site, and where the
trees are of a sufficient height and leaf
canopy density to interfere with the normal
airflow around the probe, inlet, or monitoring
path. To reduce this possible interference/
obstruction, the probe, inlet, or at least 90
percent of the monitoring path must be at
least 10 meters from the drip line of trees.
(b) The scavenging effect of trees is greater
for O3 than for other criteria pollutants.
Monitoring agencies must consider the
impact of trees on ozone monitoring sites
and take steps to avoid this problem.
(c) For microscale sites of any air pollutant,
no trees or shrubs should be located between
the probe and the source under investigation,
such as a roadway or a stationary source.
6. Spacing From Roadways.
6.1 Spacing for Ozone and Oxides of
Nitrogen Probes and Monitoring Paths. In
siting an O3 analyzer, it is important to
minimize destructive interferences from
sources of NO, since NO readily reacts with
O3. In siting NO2 analyzers for neighborhood
and urban scale monitoring, it is important
to minimize interferences from automotive
sources. Table E–1 of this appendix provides
the required minimum separation distances
between a roadway and a probe or, where
applicable, at least 90 percent of a monitoring
path for various ranges of daily roadway
traffic. A sampling site having a point
analyzer probe located closer to a roadway
than allowed by the Table E–1 requirements
should be classified as middle scale rather
than neighborhood or urban scale, since the
measurements from such a site would more
closely represent the middle scale. If an open
path analyzer is used at a site, the monitoring
path(s) must not cross over a roadway with
an average daily traffic count of 10,000
vehicles per day or more. For those situations
where a monitoring path crosses a roadway
with fewer than 10,000 vehicles per day, one
must consider the entire segment of the
monitoring path in the area of potential
atmospheric interference from automobile
emissions. Therefore, this calculation must
include the length of the monitoring path
over the roadway plus any segments of the
monitoring path that lie in the area between
the roadway and the minimum separation
distance, as determined from Table E–1 of
this appendix. The sum of these distances
must not be greater than 10 percent of the
total monitoring path length.
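The 10 percent calculation described above for open path analyzers crossing lower-volume roadways can be sketched as follows. This helper is illustrative, not rule text; the parameter names are assumptions, and the minimum separation distance would come from Table E–1 of this appendix.

```python
def open_path_crossing_ok(total_path_m, over_roadway_m,
                          within_min_separation_m):
    """For an open path analyzer whose path crosses a roadway carrying
    fewer than 10,000 vehicles/day: the path length directly over the
    roadway, plus any path length lying between the roadway and the
    Table E-1 minimum separation distance, must not exceed 10 percent
    of the total monitoring path length."""
    affected_m = over_roadway_m + within_min_separation_m
    return affected_m <= 0.10 * total_path_m
```

For example, on a 1,000-meter path, 15 meters over the roadway plus 60 meters inside the separation buffer (75 meters total) passes; 40 plus 70 meters (110 meters total) does not.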
TABLE E–1 TO APPENDIX E OF PART 58.—MINIMUM SEPARATION DISTANCE BETWEEN ROADWAYS AND PROBES OR MONITORING PATHS FOR MONITORING NEIGHBORHOOD AND URBAN SCALE OZONE (O3) AND OXIDES OF NITROGEN (NO, NO2, NOX, NOY)

Roadway average daily traffic,      Minimum distance 1
vehicles per day                    (meters)
≤1,000 ...........................        10
10,000 ...........................        20
15,000 ...........................        30
20,000 ...........................        40
40,000 ...........................        60
70,000 ...........................       100
110,000 ..........................       250

1 Distance from the edge of the nearest traffic lane. The distance for intermediate traffic counts should be interpolated from the table values based on the actual traffic count.
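As an illustration only (not part of the rule text), the footnote's interpolation of minimum separation distances for intermediate traffic counts might be computed as below; the function name and the linear-interpolation reading of "interpolated from the table values" are assumptions.

```python
# Table E-1 rows: (average daily traffic, minimum separation in meters).
TABLE_E1 = [
    (1_000, 10), (10_000, 20), (15_000, 30), (20_000, 40),
    (40_000, 60), (70_000, 100), (110_000, 250),
]

def minimum_separation_m(adt):
    """Minimum probe-to-roadway distance (meters) for a given average
    daily traffic count, linearly interpolated between table rows."""
    if adt <= TABLE_E1[0][0]:
        return float(TABLE_E1[0][1])
    if adt >= TABLE_E1[-1][0]:
        return float(TABLE_E1[-1][1])
    for (lo_adt, lo_d), (hi_adt, hi_d) in zip(TABLE_E1, TABLE_E1[1:]):
        if lo_adt <= adt <= hi_adt:
            frac = (adt - lo_adt) / (hi_adt - lo_adt)
            return lo_d + frac * (hi_d - lo_d)
```

Under this reading, a road carrying 30,000 vehicles per day (halfway between the 20,000 and 40,000 rows) would require a 50-meter separation.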
6.2 Spacing for Carbon Monoxide Probes
and Monitoring Paths. (a) Street canyon and
traffic corridor sites (microscale) are intended
to provide a measurement of the influence of
the immediate source on the pollution
exposure of the population. In order to
provide some reasonable consistency and
comparability in the air quality data from
microscale sites, a minimum distance of 2
meters and a maximum distance of 10 meters
from the edge of the nearest traffic lane must
be maintained for these CO monitoring inlet
probes. This should give consistency to the
data, yet still allow flexibility in finding
suitable locations.
(b) Street canyon/corridor (microscale)
inlet probes must be located at least 10
meters from an intersection and preferably at
a midblock location. Midblock locations are
preferable to intersection locations because
intersections represent a much smaller
portion of downtown space than do the
streets between them. Pedestrian exposure is
probably also greater in street canyon/
corridors than at intersections.
(c) In determining the minimum separation
between a neighborhood scale monitoring
site and a specific roadway, the presumption
is made that measurements should not be
substantially influenced by any one roadway.
Computations were made to determine the
separation distance, and Table E–2 of this
appendix provides the required minimum
separation distance between roadways and a
probe or 90 percent of a monitoring path.
Probes or monitoring paths that are located
closer to roads than this criterion allows
should not be classified as a neighborhood
scale, since the measurements from such a
site would more closely represent the middle scale.
Therefore, sites not meeting this criterion
should be classified as middle scale.
TABLE E–2 TO APPENDIX E OF PART 58.—MINIMUM SEPARATION DISTANCE BETWEEN ROADWAYS AND PROBES OR MONITORING PATHS FOR MONITORING NEIGHBORHOOD SCALE CARBON MONOXIDE

Roadway average daily traffic,      Minimum distance 1
vehicles per day                    (meters)
≤10,000 ..........................        10
15,000 ...........................        25
20,000 ...........................        45
30,000 ...........................        80
40,000 ...........................       115
50,000 ...........................       135
≥60,000 ..........................       150

1 Distance from the edge of the nearest traffic lane. The distance for intermediate traffic counts should be interpolated from the table values based on the actual traffic count.

6.3 Spacing for Particulate Matter (PM2.5, PM10, Pb) Inlets. (a) Since emissions associated with the operation of motor vehicles contribute to urban area particulate matter ambient levels, spacing from roadway criteria are necessary for ensuring national consistency in PM sampler siting.

(b) The intent is to locate localized hot-spot sites in areas of highest concentrations, whether from mobile or multiple stationary sources. If the area is primarily affected by mobile sources and the maximum concentration area(s) is judged to be a traffic corridor or street canyon location, then the monitors should be located near roadways with the highest traffic volume and at separation distances most likely to produce the highest concentrations. For the microscale traffic corridor site, the location must be between 5 and 15 meters from the major roadway. For the microscale street canyon site, the location must be between 2 and 10 meters from the roadway. For the middle scale site, a range of acceptable distances from the roadway is shown in Figure E–1 of this appendix. This figure also includes separation distances between a roadway and neighborhood or larger scale sites by default. Any site, 2 to 15 meters high, and farther back than the middle scale requirements will generally be neighborhood, urban, or regional scale. For example, according to Figure E–1 of this appendix, if a PM sampler is primarily influenced by roadway emissions and that sampler is set back 10 meters from a 30,000 ADT (average daily traffic) road, the site should be classified as microscale if the sampler height is between 2 and 7 meters. If the sampler height is between 7 and 15 meters, the site should be classified as middle scale. If the sampler is 20 meters from the same road, it will be classified as middle scale; if 40 meters, neighborhood scale; and if 110 meters, urban scale.

[Figure E–1 to Appendix E of Part 58 (graphic not reproduced).]

7. Cumulative Interferences on a Monitoring Path.
(This paragraph applies only to open path analyzers.) The cumulative length or portion of a monitoring path that is affected by minor sources, trees, or roadways must not exceed 10 percent of the total monitoring path length.

8. Maximum Monitoring Path Length.
(This paragraph applies only to open path analyzers.) The monitoring path length must not exceed 1 kilometer for analyzers in neighborhood, urban, or regional scale. For middle scale monitoring sites, the monitoring path length must not exceed 300 meters. In areas subject to frequent periods of dust, fog, rain, or snow, consideration should be given to a shortened monitoring path length to minimize loss of monitoring data due to these temporary optical obstructions. For certain ambient air monitoring scenarios using open path analyzers, shorter path lengths may be needed in order to ensure that the monitoring site meets the objectives and spatial scales defined in appendix D to this part. The Regional Administrator may require shorter path lengths, as needed on an individual basis, to ensure that the SLAMS sites meet the appendix D requirements. Likewise, the Administrator may specify the maximum path length used at NCore monitoring sites.
9. Probe Material and Pollutant Sample
Residence Time.
(a) For the reactive gases, SO2, NO2, and O3,
special probe material must be used for point
analyzers. Studies (references 20–24) have been
conducted to determine the suitability of
materials such as polypropylene,
polyethylene, polyvinyl chloride, Tygon,
aluminum, brass, stainless steel, copper,
Pyrex glass and Teflon for use as intake
sampling lines. Of the above materials, only
Pyrex glass and Teflon have been found to
be acceptable for use as intake sampling lines
for all the reactive gaseous pollutants.
Furthermore, the EPA (reference 25) has specified
borosilicate glass or FEP Teflon as the only
acceptable probe materials for delivering test
atmospheres in the determination of
reference or equivalent methods. Therefore,
borosilicate glass, FEP Teflon, or their
equivalent must be used for existing and new
NCore monitors.
(b) For volatile organic compound (VOC)
monitoring at PAMS, FEP Teflon is
unacceptable as the probe material because of
VOC adsorption and desorption reactions on
the FEP Teflon. Borosilicate glass, stainless
steel, or their equivalent are the acceptable
probe materials for VOC and carbonyl
sampling. Care must be taken to ensure that
the sample residence time is kept to 20
seconds or less.
(c) No matter how nonreactive the
sampling probe material is initially, after a
period of use reactive particulate matter is
deposited on the probe walls. Therefore, the
time it takes the gas to transfer from the
probe inlet to the sampling device is also
critical. Ozone in the presence of nitrogen
oxide (NO) will show significant losses even
in the most inert probe material when the
residence time exceeds 20 seconds (reference
26). Other studies (references 27 and 28)
indicate that a 10-second or less
residence time is easily achievable.
Therefore, sampling probes for reactive gas
monitors at NCore must have a sample
residence time less than 20 seconds.
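Because the 20-second limit is in effect a volume-over-flow calculation, it can be checked with simple geometry. This sketch is illustrative only; the function name and the assumption of a uniform cylindrical sampling line are mine.

```python
import math

def probe_residence_time_s(length_m, inner_diameter_m, flow_l_per_min):
    """Estimate sample residence time in a cylindrical probe line:
    internal volume divided by volumetric flow rate."""
    volume_m3 = math.pi * (inner_diameter_m / 2.0) ** 2 * length_m
    volume_l = volume_m3 * 1000.0          # 1 m^3 = 1000 L
    flow_l_per_s = flow_l_per_min / 60.0
    return volume_l / flow_l_per_s
```

Under these assumptions, a 5-meter line with a 4 mm inside diameter sampled at 1 L/min holds about 0.063 L and yields a residence time near 3.8 seconds, comfortably under the 20-second limit.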
10. Waiver Provisions.
Most sampling probes or monitors can be
located so that they meet the requirements of
this appendix. New sites, with rare
exceptions, can be located within the limits
of this appendix. However, some existing
sites may not meet these requirements and
yet still produce useful data for some
purposes. EPA will consider a written
request from the State agency to waive one
or more siting criteria for some monitoring
sites provided that the State can adequately
demonstrate the need (purpose) for
monitoring or establishing a monitoring site
at that location.
10.1 For establishing a new site, a waiver
may be granted only if both of the following
criteria are met:
10.1.1 The site can be demonstrated to be
as representative of the monitoring area as it
would be if the siting criteria were being met.
10.1.2 The monitor or probe cannot
reasonably be located so as to meet the siting
criteria because of physical constraints (e.g.,
inability to locate the required type of site the
necessary distance from roadways or
obstructions).
10.2 However, for an existing site, a
waiver may be granted if either of the criteria
in sections 10.1.1 and 10.1.2 of this appendix
is met.
10.3 Cost benefits, historical trends, and
other factors may be used to add support to
the criteria in sections 10.1.1 and 10.1.2 of
this appendix; however, they will not, in
themselves, be acceptable reasons for granting a
waiver. Written requests for waivers must be
submitted to the Regional Administrator.
11. Summary.
Table E–4 of this appendix presents a
summary of the general requirements for
probe and monitoring path siting criteria
with respect to distances and heights. It is
apparent from Table E–4 that different
elevation distances above the ground are
shown for the various pollutants. The
discussion in this appendix for each of the
pollutants describes reasons for elevating the
monitor, probe, or monitoring path. The
differences in the specified range of heights
are based on the vertical concentration
gradients. For CO, the gradients in the
vertical direction are very large for the
microscale, so a small range of heights is
used. The upper limit of 15 meters is
specified for consistency between pollutants
and to allow the use of a single manifold or
monitoring path for monitoring more than
one pollutant.
TABLE E–4 TO APPENDIX E OF PART 58.—SUMMARY OF PROBE AND MONITORING PATH SITING CRITERIA

For each pollutant, the table gives: Scale (maximum monitoring path length); Height from ground to probe, inlet, or 80% of monitoring path 1 (meters); Horizontal and vertical distance from supporting structures 2 to probe, inlet, or 90% of monitoring path 1 (meters); Distance from trees to probe, inlet, or 90% of monitoring path 1 (meters); and Distance from roadways to probe, inlet, or monitoring path 1 (meters).

SO2 3, 4, 5, 6
  Scale: Middle (300 m); Neighborhood, Urban, and Regional (1 km).
  Height from ground: 2–15.
  Distance from supporting structures: >1.
  Distance from trees: >10.
  Distance from roadways: N/A.

CO 4, 5, 7
  Scale: Micro; middle (300 m); Neighborhood (1 km).
  Height from ground: 3 ± 1/2 (micro); 2–15 (middle and neighborhood scales).
  Distance from supporting structures: >1.
  Distance from trees: >10.
  Distance from roadways: 2–10 (micro); see Table E–2 of this appendix for middle and neighborhood scales.

NO2, O3 3, 4, 5
  Scale: Middle (300 m); Neighborhood, Urban, and Regional (1 km).
  Height from ground: 2–15.
  Distance from supporting structures: >1.
  Distance from trees: >10.
  Distance from roadways: See Table E–1 of this appendix for all scales.

Ozone precursors (for PAMS) 3, 4, 5
  Scale: Neighborhood and Urban (1 km).
  Height from ground: 2–15.
  Distance from supporting structures: >1.
  Distance from trees: >10.
  Distance from roadways: See Table E–4 of this appendix for all scales.

PM, Pb 3, 4, 5, 6, 8
  Scale: Micro; Middle, Neighborhood, Urban, and Regional.
  Height from ground: 2–7 (micro); 2–7 (middle PM10-2.5); 2–15 (all other scales).
  Distance from supporting structures: >2 (all scales, horizontal distance only).
  Distance from trees: >10 (all scales).
  Distance from roadways: 2–10 (micro); see Figure E–1 of this appendix for all other scales.

N/A—Not applicable.
1 Monitoring path for open path analyzers is applicable only to middle or neighborhood scale CO monitoring and all applicable scales for monitoring SO2, O3, O3 precursors, and NO2.
2 When probe is located on a rooftop, this separation distance is in reference to walls, parapets, or penthouses located on roof.
3 Should be >20 meters from the dripline of tree(s) and must be 10 meters from the dripline when the tree(s) act as an obstruction.
4 Distance from sampler, probe, or 90% of monitoring path to obstacle, such as a building, must be at least twice the height the obstacle protrudes above the sampler, probe, or monitoring path. Sites not meeting this criterion may be classified as middle scale (see text).
5 Must have unrestricted airflow 270 degrees around the probe or sampler; 180 degrees if the probe is on the side of a building.
6 The probe, sampler, or monitoring path should be away from minor sources, such as furnace or incineration flues. The separation distance is dependent on the height of the minor source's emission point (such as a flue), the type of fuel or waste burned, and the quality of the fuel (sulfur, ash, or lead content). This criterion is designed to avoid undue influences from minor sources.
7 For microscale CO monitoring sites, the probe must be >10 meters from a street intersection and preferably at a midblock location.
8 Collocated monitors must be within 4 meters of each other and at least 2 meters apart for flow rates greater than 200 liters/min, or at least 1 meter apart for samplers having flow rates less than 200 liters/min, to preclude airflow interference.
12. References.
1. Bryan, R.J., R.J. Gordon, and H. Menck.
Comparison of High Volume Air Filter
Samples at Varying Distances from Los
Angeles Freeway. University of Southern
California, School of Medicine, Los Angeles,
CA. (Presented at 66th Annual Meeting of Air
Pollution Control Association. Chicago, IL.,
June 24–28, 1973. APCA 73–158.)
2. Teer, E.H. Atmospheric Lead
Concentration Above an Urban Street. Master
of Science Thesis, Washington University, St.
Louis, MO. January 1971.
3. Bradway, R.M., F.A. Record, and W.E.
Belanger. Monitoring and Modeling of
Resuspended Roadway Dust Near Urban
Arterials. GCA Technology Division,
Bedford, MA. (Presented at 1978 Annual
Meeting of Transportation Research Board,
Washington, DC. January 1978.)
4. Pace, T.G., W.P. Freas, and E.M. Afify.
Quantification of Relationship Between
Monitor Height and Measured Particulate
Levels in Seven U.S. Urban Areas. U.S.
Environmental Protection Agency, Research
Triangle Park, NC. (Presented at 70th Annual
Meeting of Air Pollution Control Association,
Toronto, Canada, June 20–24, 1977. APCA
77–13.4.)
5. Harrison, P.R. Considerations for Siting
Air Quality Monitors in Urban Areas. City of
Chicago, Department of Environmental
Control, Chicago, IL. (Presented at 66th
Annual Meeting of Air Pollution Control
Association, Chicago, IL., June 24–28, 1973.
APCA 73–161.)
6. Study of Suspended Particulate
Measurements at Varying Heights Above
Ground. Texas State Department of Health,
Air Control Section, Austin, TX. 1970. p. 7.
7. Rodes, C.E. and G.F. Evans. Summary of
LACS Integrated Pollutant Data. In: Los
Angeles Catalyst Study Symposium. U.S.
Environmental Protection Agency, Research
Triangle Park, NC. EPA Publication No. EPA–
600/4–77–034. June 1977.
8. Lynn, D.A., et al. National Assessment
of the Urban Particulate Problem: Volume 1,
National Assessment. GCA Technology
Division, Bedford, MA. U.S. Environmental
Protection Agency, Research Triangle Park,
NC. EPA Publication No. EPA–450/3–75–
024. June 1976.
9. Pace, T.G. Impact of Vehicle-Related
Particulates on TSP Concentrations and
Rationale for Siting Hi-Vols in the Vicinity of
Roadways. OAQPS, U.S. Environmental
Protection Agency, Research Triangle Park,
NC. April 1978.
10. Ludwig, F.L., J.H. Kealoha, and E.
Shelar. Selecting Sites for Monitoring Total
Suspended Particulates. Stanford Research
Institute, Menlo Park, CA. Prepared for U.S.
Environmental Protection Agency, Research
Triangle Park, NC. EPA Publication No. EPA–
450/3–77–018. June 1977, revised December
1977.
11. Ball, R.J. and G.E. Anderson. Optimum
Site Exposure Criteria for SO2 Monitoring.
The Center for the Environment and Man,
Inc., Hartford, CT. Prepared for U.S.
Environmental Protection Agency, Research
Triangle Park, NC. EPA Publication No. EPA–
450/3–77–013. April 1977.
12. Ludwig, F.L. and J.H.S. Kealoha.
Selecting Sites for Carbon Monoxide
Monitoring. Stanford Research Institute,
Menlo Park, CA. Prepared for U.S.
Environmental Protection Agency, Research
Triangle Park, NC. EPA Publication No. EPA–450/3–
75–077. September 1975.
13. Ludwig, F.L. and E. Shelar. Site
Selection for the Monitoring of
Photochemical Air Pollutants. Stanford
Research Institute, Menlo Park, CA. Prepared
for U.S. Environmental Protection Agency,
Research Triangle Park, NC. EPA Publication
No. EPA–450/3–78–013. April 1978.
14. Lead Analysis for Kansas City and
Cincinnati, PEDCo Environmental, Inc.,
Cincinnati, OH. Prepared for U.S.
Environmental Protection Agency, Research
Triangle Park, NC. EPA Contract No. 66–02–
2515, June 1977.
15. Barltrap, D. and C. D. Strelow. Westway
Nursery Testing Project. Report to the Greater
London Council. August 1976.
16. Daines, R.H., H. Moto, and D. M.
Chilko. Atmospheric Lead: Its Relationship to
Traffic Volume and Proximity to Highways.
Environ. Sci. and Technol., 4:318, 1970.
17. Johnson, D.E., et al. Epidemiologic
Study of the Effects of Automobile Traffic on
Blood Lead Levels, Southwest Research
Institute, Houston, TX. Prepared for U.S.
Environmental Protection Agency, Research
Triangle Park, NC. EPA–600/1–78–055,
August 1978.
18. Air Quality Criteria for Lead. Office of
Research and Development, U.S.
Environmental Protection Agency,
Washington, DC EPA–600/8–83–028 aF–dF,
1986, and supplements EPA–600/8–89/049F,
August 1990. (NTIS document numbers
PB87–142378 and PB91–138420.)
19. Lyman, D.R. The Atmospheric
Diffusion of Carbon Monoxide and Lead from
an Expressway, Ph.D. Dissertation,
University of Cincinnati, Cincinnati, OH.
1972.
20. Wechter, S.G. Preparation of Stable
Pollutant Gas Standards Using Treated
Aluminum Cylinders. ASTM STP. 598:40–
54, 1976.
21. Wohlers, H.C., H. Newstein and D.
Daunis. Carbon Monoxide and Sulfur
Dioxide Adsorption On and Desorption
From Glass, Plastic and Metal Tubings. J. Air
Poll. Con. Assoc. 17:753, 1976.
22. Elfers, L.A. Field Operating Guide for
Automated Air Monitoring Equipment. U.S.
NTIS. p. 202, 249, 1971.
23. Hughes, E.E. Development of Standard
Reference Material for Air Quality
Measurement. ISA Transactions, 14:281–291,
1975.
24. Altshuller, A.D. and A.G. Wartburg.
The Interaction of Ozone with Plastic and
Metallic Materials in a Dynamic Flow
System. Intern. Jour. Air and Water Poll.,
4:70–78, 1961.
25. Code of Federal Regulations. Title 40
part 53.22, July 1976.
26. Butcher, S.S. and R.E. Ruff. Effect of
Inlet Residence Time on Analysis of
Atmospheric Nitrogen Oxides and Ozone,
Anal. Chem., 43:1890, 1971.
27. Slowik, A.A. and E.B. Sansone.
Diffusion Losses of Sulfur Dioxide in
Sampling Manifolds. J. Air. Poll. Con. Assoc.,
24:245, 1974.
28. Yamada, V.M. and R.J. Charlson. Proper
Sizing of the Sampling Inlet Line for a
Continuous Air Monitoring Station. Environ.
Sci. and Technol., 3:483, 1969.
29. Koch, R.C. and H.E. Rector. Optimum
Network Design and Site Exposure Criteria
for Particulate Matter, GEOMET
Technologies, Inc., Rockville, MD. Prepared
for U.S. Environmental Protection Agency,
Research Triangle Park, NC. EPA Contract
No. 68–02–3584. EPA 450/4–87–009. May
1987.
30. Burton, R.M. and J.C. Suggs.
Philadelphia Roadway Study. Environmental
Monitoring Systems Laboratory, U.S.
Environmental Protection Agency, Research
Triangle Park, N.C. EPA–600/4–84–070
September 1984.
31. Technical Assistance Document For
Sampling and Analysis of Ozone Precursors.
Atmospheric Research and Exposure
Assessment Laboratory, U.S. Environmental
Protection Agency, Research Triangle Park,
NC 27711. EPA 600/8–91–215. October 1991.
32. Quality Assurance Handbook for Air
Pollution Measurement Systems: Volume IV.
Meteorological Measurements. Atmospheric
Research and Exposure Assessment
Laboratory, U.S. Environmental Protection
Agency, Research Triangle Park, NC 27711.
EPA 600/4–90–0003. August 1989.
33. On-Site Meteorological Program
Guidance for Regulatory Modeling
Applications. Office of Air Quality Planning
and Standards, U.S. Environmental
Protection Agency, Research Triangle Park,
NC 27711. EPA 450/4–87–013. June 1987.
[FR Doc. 06–179 Filed 1–13–06; 8:45 am]
Agencies
[Federal Register Volume 71, Number 10 (Tuesday, January 17, 2006)]
[Proposed Rules]
[Pages 2710-2808]
From the Federal Register Online via the Government Printing Office [www.gpo.gov]
[FR Doc No: 06-179]
[[Page 2709]]
-----------------------------------------------------------------------
Part III
Environmental Protection Agency
-----------------------------------------------------------------------
40 CFR Parts 53 and 58
Revisions to Ambient Air Monitoring Regulations; Proposed Rule
Federal Register / Vol. 71, No. 10 / Tuesday, January 17, 2006 /
Proposed Rules
[[Page 2710]]
-----------------------------------------------------------------------
ENVIRONMENTAL PROTECTION AGENCY
40 CFR Parts 53 and 58
[EPA-HQ-OAR-2004-0018; FRL-8015-9]
RIN 2060-AJ25
Revisions to Ambient Air Monitoring Regulations
AGENCY: Environmental Protection Agency (EPA).
ACTION: Proposed rule; amendments.
-----------------------------------------------------------------------
SUMMARY: The EPA is proposing to revise the ambient air monitoring
requirements for criteria pollutants. This proposal establishes ambient
air monitoring requirements in support of the proposed revisions to the
National Ambient Air Quality Standards (NAAQS) for particulate matter
published elsewhere in today's Federal Register, including new minimum
monitoring network requirements for PM10-2.5 and criteria
for approval of Federal reference and equivalent methods for
PM10-2.5 (to supplement the Federal reference method for
PM10-2.5 proposed elsewhere in today's Federal Register).
This proposal also requires each State to operate one to three
monitoring stations that take an integrated, multipollutant approach to
ambient air monitoring. The proposed amendments modify the requirements
for ambient air monitors by focusing requirements on populated areas
with air quality problems and significantly reducing the requirements
for criteria pollutant monitors that have measured ambient air
concentrations well below the applicable NAAQS. Other proposed
amendments revise the requirements for reference and equivalent method
determinations (including specifications and test procedures) for fine
particulate monitors, monitoring network descriptions and periodic
assessments, quality assurance, and data certification. The purpose of
the proposed amendments is to enhance ambient air quality monitoring to
better serve current and future air quality management and research
needs.
DATES: Comments must be received on or before April 17, 2006.
ADDRESSES: Submit your comments, identified by Docket ID No. EPA-HQ-
OAR-2004-0018, by one of the following methods:
https://www.regulations.gov: Follow the on-line
instructions for submitting comments.
E-mail: a-and-r-docket@epa.gov.
Fax: (202) 566-1741.
Mail: Revisions to Ambient Air Monitoring Regulations,
Docket No. EPA-HQ-OAR-2004-0018, Environmental Protection Agency,
Mailcode 6102T, 1200 Pennsylvania Ave., NW., Washington, DC 20460.
Please include a total of two copies. In addition, please mail a copy
of your comments on the information collection provisions to the Office
of Information and Regulatory Affairs, Office of Management and Budget
(OMB), Attn: Desk Officer for EPA, 725 17th St., NW., Washington, DC
20503.
Hand Delivery: EPA Docket Center, 1301 Constitution
Avenue, NW., Room B102, Washington, DC 20460. Such deliveries are only
accepted during the Docket's normal hours of operation, and special
arrangements should be made for deliveries of boxed information.
Instructions: Direct your comments to Docket ID No. EPA-HQ-OAR-
2004-0018. EPA's policy is that all comments received will be included
in the public docket without change and may be made available online at
https://www.regulations.gov, including any personal information
provided, unless the comment includes information claimed to be
Confidential Business Information (CBI) or other information whose
disclosure is restricted by statute. Do not submit information that you
consider to be CBI or otherwise protected through
https://www.regulations.gov or e-mail. The https://www.regulations.gov Web site
is an ``anonymous access'' system, which means EPA will not know your
identity or contact information unless you provide it in the body of
your comment. If you send an e-mail comment directly to EPA without
going through https://www.regulations.gov, your e-mail address will be
automatically captured and included as part of the comment that is
placed in the public docket and made available on the Internet. If you
submit an electronic comment, EPA recommends that you include your name
and other contact information in the body of your comment and with any
disk or CD ROM you submit. If EPA cannot read your comment due to
technical difficulties and cannot contact you for clarification, EPA
may not be able to consider your comment. Electronic files should avoid
the use of special characters and any form of encryption, and should be
free of any defects or viruses. For additional information about EPA's
public docket, visit the EPA Docket Center homepage at
https://www.epa.gov/epahome/dockets.htm.
Docket: All documents in the docket are listed in the
https://www.regulations.gov index. Although listed in the index, some
information is not publicly available, e.g., CBI or other information
whose disclosure is restricted by statute. Certain other material, such
as copyrighted material, will be publicly available only in hard copy.
Publicly available docket materials are available either electronically
in https://www.regulations.gov or in hard copy at the Revisions to the
Ambient Air Monitoring Regulations Docket, EPA/DC, EPA West, Room B102,
1301 Constitution Ave., NW., Washington, DC. The Public Reading Room is
open from 8:30 a.m. to 4:30 p.m., Monday through Friday, excluding
legal holidays. The telephone number for the Public Reading Room is
(202) 566-1744, and the telephone number for the Air Docket is (202)
566-1742.
FOR FURTHER INFORMATION CONTACT: For general questions concerning
today's proposed amendments, please contact Mr. Lewis Weinstock, U.S.
EPA, Office of Air Quality Planning and Standards, Emissions Monitoring
and Analysis Division, Ambient Air Monitoring Group (D243-02), Research
Triangle Park, North Carolina 27711; telephone number: (919) 541-3661;
fax number: (919) 541-1903; e-mail address: weinstock.lewis@epa.gov.
For technical questions, please contact Mr. Tim Hanley, U.S. EPA,
Office of Air Quality Planning and Standards, Emissions Monitoring and
Analysis Division, Ambient Air Monitoring Group (D243-02), Research
Triangle Park, North Carolina 27711; telephone number: (919) 541-4417;
fax number: (919) 541-1903; e-mail address: hanley.tim@epa.gov.
SUPPLEMENTARY INFORMATION:
I. General Information
A. Does This Action Apply to Me?
Categories and entities potentially regulated by this action
include:
----------------------------------------------------------------------------------------------------------------
Category NAICS code 1 Examples of regulated entities
----------------------------------------------------------------------------------------------------------------
Industry...................................................... 334513 Manufacturer, supplier,
541380 distributor, or vendor of
ambient air monitoring
instruments; analytical
laboratories or other
monitoring organizations that
elect to submit an application
for a reference or equivalent
method determination under 40
CFR part 53.
Federal government............................................ 924110 Federal agencies (that conduct
ambient air monitoring similar
to that conducted by States
under 40 CFR part 58 and that
wish EPA to use their
monitoring data in the same
manner as State data) or that
elect to submit an application
for a reference or equivalent
method determination under 40
CFR part 53.
State/local/tribal government................................. 924110 State, territorial, and local
air quality management programs
that are responsible for
ambient air monitoring under 40
CFR part 58 or that elect to
submit an application for a
reference or equivalent method
determination under 40 CFR part
53. The proposal also may
affect Tribes that conduct
ambient air monitoring similar
to that conducted by States and
that wish EPA to use their
monitoring data in the same
manner as State monitoring
data.
----------------------------------------------------------------------------------------------------------------
1 North American Industry Classification System.
This table is not intended to be exhaustive, but rather provides a
guide for readers regarding entities likely to be regulated by this
action. To determine whether your facility or Federal, State, local, or
territorial agency would be regulated by this action, you should
examine the requirements for reference or equivalent method
determinations in 40 CFR part 53, subpart A (General Provisions) and
the applicability criteria in 40 CFR 51.1 of EPA's requirements for
State implementation plans. If you have any questions regarding the
applicability of this action to a particular entity, consult the person
listed in the preceding FOR FURTHER INFORMATION CONTACT section.
B. What Should I Consider as I Prepare My Comments for EPA?
Do not submit information containing Confidential Business
Information (CBI) to EPA through www.regulations.gov or e-mail. Send or
deliver information identified as CBI only to the following address:
Roberto Morales, OAQPS Document Control Officer (C404-02), U.S. EPA,
Office of Air Quality Planning and Standards, Research Triangle Park,
North Carolina 27711, Attention Docket ID EPA-HQ-OAR-2004-0018. Clearly
mark the part or all of the information that you claim to be CBI. For
CBI information in a disk or CD ROM that you mail to EPA, mark the
outside of the disk or CD ROM as CBI and then identify electronically
within the disk or CD ROM the specific information that is claimed as
CBI. In addition to one complete version of the comment that includes
information claimed as CBI, a copy of the comment that does not contain
the information claimed as CBI must be submitted for inclusion in the
public docket. Information so marked will not be disclosed except in
accordance with procedures set forth in 40 CFR part 2.
C. Where Can I Get a Copy of This Document and Other Related
Information?
In addition to being available in the docket, an electronic copy of
today's proposed amendments is also available on the World Wide Web
(WWW) through the Technology Transfer Network (TTN). Following the
Administrator's signature, a copy of the proposed amendments will be
placed on the TTN's policy and guidance page for newly proposed or
promulgated rules at https://www.epa.gov/ttn/oarpg. The TTN provides
information and technology exchange in various areas of air pollution
control.
D. Will There Be a Public Hearing?
Public hearings will be held concurrently with the public hearings
on the proposed amendments to the NAAQS for particulate matter
published elsewhere in this Federal Register. The EPA intends to hold
public hearings during February 2006 in Philadelphia, Pennsylvania;
Chicago, Illinois; and San Francisco, California. The EPA will announce
the date, location, and time of the public hearings in a separate
Federal Register notice.
E. Did EPA Conduct a Peer Review Before Issuing This Notice?
The EPA sought expert scientific review of the proposed methods,
technologies, and approach for ambient air monitoring by the Clean Air
Scientific Advisory Committee (CASAC). The CASAC is a Federal advisory
committee established to review scientific and technical information
and make recommendations to the EPA Administrator on issues related to
the air quality criteria and corresponding NAAQS. CASAC constituted a
National Ambient Air Monitoring Strategy (NAAMS) Subcommittee in 2003
to provide advice for a strategy for the national ambient air
monitoring programs. This subcommittee, which operated over a one-year
period, and a new subcommittee on Ambient Air Monitoring and Methods
(AAMM), formed in 2004, provided the input for CASAC on its
consultations, advisories, and peer-reviewed recommendations to the EPA
Administrator.
In July 2003, the CASAC NAAMS Subcommittee held a public meeting to
review EPA's draft National Ambient Air Monitoring Strategy document
(dated September 6, 2002), which contained technical information
underlying planned changes to the ambient air monitoring networks. The
EPA continued to consult with the CASAC AAMM Subcommittee throughout
the development of the proposed amendments. Public meetings were held
in July 2004, December 2004, and September 2005 to discuss the CASAC
review of nearly 20 documents concerning methods and technology for
measurement of particulate matter (PM); data quality objectives for PM
monitoring networks and related performance-based standards for
approval of equivalent continuous PM monitors; reconfiguration of
ambient air monitoring stations; \1\ and other technical aspects of the
proposed amendments. These documents, along with CASAC review comments
and other information, are available at http://www.epa.gov/ttn/amtic/
casacinf.html.
---------------------------------------------------------------------------
\1\ ``Station'' and ``site'' are used somewhat interchangeably
in this notice of proposed rulemaking. When there is a difference,
``site'' generally refers to the location of a monitor, while
``station'' refers to a suite of measurements at a particular site.
---------------------------------------------------------------------------
F. How Is This Document Organized?
The information presented in this preamble is organized as follows:
I. General Information
A. Does this action apply to me?
B. What should I consider as I prepare my comments for EPA?
C. Where can I get a copy of this document and other related
information?
D. Will there be a public hearing?
E. Did EPA conduct a peer review before issuing this notice?
F. How is this document organized?
II. Overview
A. What is the purpose of today's proposal?
B. What are the major changes proposed to the ambient air
monitoring regulations?
C. When would the proposed amendments affect States, local
governments, tribes, and other stakeholders?
D. How would EPA implement the new requirements?
III. Background
A. What is the role of ambient air monitoring in air quality
management?
B. What is the history of ambient air monitoring?
C. What revisions to the National Ambient Air Quality Standards
for particulate matter also are proposed today?
D. How do the monitoring data apply to attainment or
nonattainment designations and findings?
IV. Proposed Monitoring Amendments
A. What are the proposed terminology changes?
B. What are the proposed requirements for approval of reference
or equivalent methods?
C. What are the proposed requirements for quality assurance
programs for the National Ambient Air Monitoring System?
D. What are the proposed monitoring methods for the National
Ambient Air Monitoring System?
E. What are the proposed requirements for the number and
location of monitors to be operated by State and local agencies?
F. What are the proposed probe and monitoring path siting
criteria?
G. What are the proposed data reporting, data certification, and
sample retention requirements?
V. Statutory and Executive Order Reviews
A. Executive Order 12866: Regulatory Planning and Review
B. Paperwork Reduction Act
C. Regulatory Flexibility Act
D. Unfunded Mandates Reform Act
E. Executive Order 13132: Federalism
F. Executive Order 13175: Consultation and Coordination With
Indian Tribal Governments
G. Executive Order 13045: Protection of Children From
Environmental Health and Safety Risks
H. Executive Order 13211: Actions that Significantly Affect
Energy Supply, Distribution, or Use
I. National Technology Transfer Advancement Act
J. Executive Order 12898: Federal Actions to Address
Environmental Justice in Minority Populations and Low-Income
Populations
II. Overview
A. What Is the Purpose of Today's Proposal?
The EPA is proposing a number of changes to the ambient air quality
monitoring requirements of 40 CFR parts 53 and 58 to ensure that the
national network of air monitors will meet the current and future data
needs of EPA (and other Federal), State, local, and tribal air quality
management agencies. While much of today's proposed rule outlines
changes to the monitoring requirements for particulate matter (PM),
there are additional changes relating to all the other criteria
pollutants (ozone (O3), carbon monoxide (CO), sulfur dioxide
(SO2), nitrogen dioxide (NO2), and lead (Pb))
included in this proposal.
Some of these proposed changes are in support of the proposed
revisions to the National Ambient Air Quality Standards (NAAQS) for PM
in 40 CFR part 50 published elsewhere in today's Federal Register.\2\
These changes are essential to implementation of the proposed NAAQS for
PM. Included among these proposed PM-related changes are new provisions
for addition to 40 CFR parts 53 and 58 which address approval of
methods and PM10-2.5 monitoring requirements. The added
provisions would address federal reference method (FRM) equivalency
determinations for continuous PM10-2.5 monitors and the
requirements for the number of PM10-2.5 monitors a State
must deploy. Another important element of the provisions for
PM10-2.5 is a proposal for the conditions under which a
PM10-2.5 monitor may be compared to the PM10-2.5
NAAQS.
---------------------------------------------------------------------------
\2\ The proposed amendments to the National Ambient Air Quality
Standards include revised standards for PM2.5
(particulate matter with an aerodynamic diameter less than or equal
to a nominal 2.5 micrometers) and new standards for
PM10-2.5 (particulate matter with an aerodynamic diameter
less than or equal to a nominal 10 micrometers and greater than or
equal to a nominal 2.5 micrometers).
---------------------------------------------------------------------------
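The size-range definitions in the footnote imply that the coarse fraction is obtained by difference between a collocated PM10 and PM2.5 measurement. As a minimal illustrative sketch of that arithmetic (the function name and validation check are ours, not part of the rule):

```python
# Illustrative only: PM10-2.5 ("coarse") mass concentration is defined as
# the PM10 concentration minus the PM2.5 concentration measured at the
# same site over the same period.

def coarse_fraction(pm10_ugm3: float, pm25_ugm3: float) -> float:
    """Return the PM10-2.5 concentration (ug/m^3) computed by difference."""
    if pm25_ugm3 > pm10_ugm3:
        # A fine-fraction reading above the PM10 reading indicates an
        # invalid sample pair rather than negative coarse mass.
        raise ValueError("PM2.5 cannot exceed PM10 in a valid sample pair")
    return pm10_ugm3 - pm25_ugm3

print(coarse_fraction(42.0, 15.5))  # -> 26.5
```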
A number of amendments to existing provisions for PM2.5
monitoring are also proposed. These would be important to the
implementation of the revised PM2.5 NAAQS because they take
advantage of the experience and insight gained by EPA and the States
during the past 7 years of PM2.5 monitoring. One of the
proposed PM2.5 changes involves the criteria for FRM
equivalency determinations for continuous PM2.5 monitors. We
anticipate that this change would allow States to operate continuous
monitors at more required monitoring sites, providing more robust data
for the PM2.5 air quality program.
Other proposed changes are based on EPA's assessment that the
monitoring regulations are not fully aligned with current data needs
and opportunities across all the NAAQS pollutants--including PM but
also including O3, CO, SO2, NO2, and
Pb. This misalignment has developed over time as ambient conditions
have improved for some pollutants. Also, new monitoring technologies
have been developed that provide attractive opportunities for obtaining
more robust and useful data. The EPA recognized that changes were
needed several years ago and since then, we have been developing the
specifics of these changes with States and other stakeholders.\3\ This
group of proposed changes includes relaxation of some long-standing
monitoring requirements which we believe are outdated or unnecessarily
inflexible. This group of proposed changes also includes a new
requirement for States to operate a new type of multipollutant
monitoring station, which we plan to call National Core (NCore)
stations. Other proposed changes relate to quality assurance
requirements, monitor siting, special purpose monitoring, and data
management.
We are proposing both the PM NAAQS review-related changes as well
as the overarching NAAQS monitoring system changes together because
they are strongly related in terms of regulatory language and in terms
of implementation decision making. Resources for ambient monitoring are
limited, and the cost of new types of monitoring to meet new
requirements such as those for PM10-2.5 must be offset, at
least in part, by reducing resources for lower value types of
monitoring. The proposed revisions to the monitoring regulations, when
finalized, will improve EPA's and our monitoring partners' abilities to
manage available funds to support monitoring activities and create a
coordinated, integrated, multipurpose, and flexible monitoring system.
In addition, it will be easier for the public to comment on the
proposed changes if they are presented together rather than in
sequential proposals.
---------------------------------------------------------------------------
\3\ Our work with States and other monitoring program
stakeholders has included the development of successive versions of
a draft report, ``National Ambient Air Monitoring Strategy''. The
most recent version, dated December 2005, is available in the public
docket. The document describes in more depth the reasons for
proposing many of the changes presented in this notice, excluding
the changes related to PM10-2.5. It also discusses
strategy elements that are related to, but separate from, the
regulatory provisions in 40 CFR parts 53 and 58 such as funding,
training, etc.
---------------------------------------------------------------------------
The EPA notes that in the proposed regulatory language for 40 CFR
parts 53 and 58, we are reprinting a number of existing provisions
without change (for example, a number of definitions in current § 58.1).
We are doing so solely for the readers' convenience in order that the
provisions we are proposing can appear in a single context. The EPA is
not reproposing, reconsidering, or otherwise reopening any of these
reprinted provisions. We will regard any comments as to these
provisions as outside the scope of this proposal.
B. What Are the Major Changes Proposed to the Ambient Air Monitoring
Regulations?
The summary of each proposed change given here ends with a
reference to the part(s) of section IV of this preamble that describes
that change in detail.
We propose to require States to operate from one to three
National Core (NCore) multipollutant monitoring sites.\4\ Monitors at
NCore multipollutant sites would be required to measure particles
(PM2.5, speciated PM2.5, PM10-2.5),
O3, SO2, CO, nitrogen oxides (NO/NO2/
NOY), and basic meteorology. Monitors for all the gases
except for O3 would be required to be more sensitive than
standard Federal reference method (FRM)/Federal equivalent method (FEM)
monitors, so they could accurately report concentrations that are well
below the respective NAAQS but that can be important in the formation
of O3 and PM. We are not proposing specific locations for
these sites, but instead would collaborate on site selection with
States individually and through multistate organizations. Our objective
is that sites be located in broadly representative urban (about 55
sites) and rural (about 20 sites) locations throughout the country to
help characterize regional and urban patterns of air pollution. We
expect that in many cases States would collocate these new stations
with Photochemical Assessment Monitoring Station (PAMS) sites already
measuring O3 precursors and/or National Air Toxic Trends
Station (NATTS) sites measuring air toxics.
---------------------------------------------------------------------------
\4\ The National Core (NCore) multi-pollutant stations are part
of an overall strategy to integrate multiple monitoring networks and
measurements, including research grade sites and State and local air
monitoring stations (SLAMS). Research grade sites would provide
complex, research-grade monitoring data for special studies; the
proposed amendments do not include requirements for these sites.
SLAMS would include sites needed for National Ambient Air Quality
Standard comparisons and other data needs of monitoring agencies.
The number and placement of SLAMS monitors would vary according to
the pollutant, population, and level of air quality problem. The
April 2004 draft version of the National Ambient Air Monitoring
Strategy presented a taxonomy in which monitoring stations belonged
to three levels, called Level 1 (research sites), Level 2 (what are
called NCore multipollutant sites in this notice), and Level 3 (what
have been called SLAMS/NAMS (national air monitoring stations) in
the past). The three Levels combined were referred to as the NCore
System. We have decided to dispense with the three-level taxonomy
because it does not encompass all relevant monitoring efforts. We
now refer to the collection of all ambient air monitoring--including
research sites, all types of monitoring by States and Tribes, and
all types of ambient monitoring by Federal agencies--as the National
Ambient Air Monitoring System (NAAMS). We are retaining the
``NCore'' label for the multipollutant sites in particular, because
the term with this meaning has become part of the vocabulary of the
State/local monitoring community.
---------------------------------------------------------------------------
These sites would create points of integration among the
existing networks for criteria pollutants, each of which was originally
designed with only a single pollutant in mind. Where collocated with
sites already measuring O3 precursors or air toxics, the
degree of integration across pollutants of concern would be even
stronger. Data from these NCore sites would be used for several
purposes that cannot be served as well using only data available from
existing networks. Forecasting of the Air Quality Index (AQI) would be
improved by feeding several collocated and interdependent pollutant
concentration measurements into an air quality model in near real-time
to better represent current conditions, from which the model could
provide an improved forecast of O3 and particle levels for
the public. Studies that track long-term trends of criteria pollutants,
and thereby help demonstrate the accountability of implemented
emissions control programs, would be improved by utilizing higher-
sensitivity monitoring equipment for pollutants whose measured levels
are well below the NAAQS. Air quality model development and validation
efforts would benefit by having a long-term network of several
important and interdependent measurements at improved time-scales
(e.g., hourly instead of daily sample concentrations from PM methods) at
a network of sites expected to remain in place over many years to allow
testing of how well models simulate co-pollutant interactions. Where
applicable siting criteria for PM or O3 monitoring stations
are met, NCore sites could also be used to satisfy minimum monitoring
requirements for PM and O3 and data from these stations
could be used in designation decisions and in development of control
strategies.\5\ The NCore proposals are described more fully in section
IV.E.1 of this preamble.
---------------------------------------------------------------------------
\5\ While not a part of our rationale for requiring States to
operate these sites, we note that the data from them will also be of
use in future health effects studies.
---------------------------------------------------------------------------
We propose monitoring requirements for PM10-2.5
which are based on deploying a network of FEM monitors that would be
approved based on criteria for comparability to monitors utilizing the
FRM proposed elsewhere in today's Federal Register. Requirements for
PM10-2.5 Class I, Class II, and Class III candidate
equivalent methods would be established. The definition of a ``Class
III equivalent method'' would allow for designation of continuous and
semi-continuous ambient air monitoring methods for
PM10-2.5.\6\ Because we intend that most of the monitors
used in the PM10-2.5 network will use continuous or semi-
continuous equivalent methods, the proposal for Class III approval
requirements is particularly important for PM10-2.5. We are
also proposing minimum requirements for a PM10-2.5
monitoring network, including criteria for the number of FRM/FEM
monitoring sites in each metropolitan area (which would vary from zero
to five) and criteria for how monitors should be placed within an area.
Closely linked to the placement criteria is a proposed test for the
suitability of a PM10-2.5 monitoring site for comparison
with the PM10-2.5 NAAQS. We are also proposing that
speciation monitoring of PM10-2.5 be required in some areas.
These proposals appear in sections IV.B.2, IV.B.3, IV.B.5, and IV.B.6
(dealing with equivalent methods) and section IV.E.2 (dealing with
number of monitors, their placement, and the use of data from them in
comparisons to the NAAQS) of this preamble.
---------------------------------------------------------------------------
\6\ Class I equivalent methods have only minor deviations or
modifications from the specified reference method. Class II
equivalent methods include other filter-based, integrated,
gravimetric-type methods similar to the specified reference method
but with greater deviations than allowed for a Class I method. Class
III equivalent methods include all candidate PM2.5 and
PM10-2.5 methods not classified as Class I or Class II.
We expect that most candidate Class III equivalent methods will be
continuous or semi-continuous methods.
---------------------------------------------------------------------------
We propose amendments to facilitate the wider use of
continuous PM2.5 monitors by revising performance-based FEM
equivalence standards for continuous PM2.5 monitors and
allowing for approved regional methods (ARM) for continuous
PM2.5 mass monitors. Existing requirements for
PM2.5 Class I and Class II candidate equivalent methods
would be revised, and new requirements for PM2.5 Class III
candidate equivalent methods would be added. The definition of a Class
III equivalent method would be revised to allow for designation of
continuous and semi-continuous ambient air monitoring methods for
PM2.5. These proposals appear in sections IV.B.4, IV.B.5,
and IV.B.6 (FEM equivalence standards) and in section IV.D.2 (approved
regional methods) of this preamble.
In association with the proposed requirements for new
PM10-2.5 stations and new NCore multipollutant stations, we
propose to remove the existing requirements for certain numbers of
State and local air FRM/FEM monitoring stations for CO,
PM10, SO2, and NO2, and reduce them
for Pb.
However, States would still need EPA approval to move or remove
existing monitoring stations for these pollutants.\7\ To expedite
reviews and provide more certainty to State planning, a specific
process and several substantive criteria are proposed to govern EPA
approval actions. Also, the requirement that EPA approval be obtained
at the Administrator level (rather than the Regional Administrator
level) for the subset of these monitors historically designated as NAMS
would be eliminated, and all changes would be reviewed by the Regional
Administrator.\8\ In addition, the requirements for monitoring of
O3 precursors under the PAMS program would be reduced by
about 50 percent. These proposed changes allow PAMS monitoring to be
more customized to local data needs rather than meeting so many
specific requirements common to all subject O3 nonattainment
areas; the PAMS changes would also give States the flexibility to
reduce the overall size of their PAMS programs--within limits--and to
use the associated resources for other types of monitoring they
consider more useful. Requirements for minimum numbers of O3
and PM2.5 monitors would be retained, with small
adjustments. The overall impact of these changes would be to retain
comprehensive monitoring networks for PM2.5 and
O3, and to reduce the number of SO2, CO,
NO2, Pb, and PM10 monitors in areas that do not
have air quality problems for these pollutants. PM2.5 and
O3 monitoring would be mostly unaffected because
PM2.5 and O3 are current nonattainment challenges
and comprehensive monitoring is needed to support efforts to attain the
NAAQS. Many existing monitors for SO2, CO, NO2,
Pb, and PM10 can be discontinued because measured concentrations are now
well below the applicable NAAQS and the data from most of these monitors
have low value for air quality management and research purposes. We
expect reductions in the number of monitors for these pollutants
nationally to be in the range of about 33 percent for SO2 to
about 90 percent for NO2.\9\ This would free up resources to
go beyond minimum requirements for O3, PM2.5,
PM10-2.5, or other pollutants such as air toxics in areas
where there are ongoing or new air quality management challenges. These
proposed changes are described in sections IV.E.3 (number of
PM2.5 monitors), IV.E.4 (PM10 monitors), IV.E.5
(number of O3 monitors), IV.E.6 (number of CO,
SO2, NO2, and Pb monitors), IV.E.7 (PAMS
monitors), and IV.E.8 (process and criteria for moving or removing
monitors) of this preamble.
---------------------------------------------------------------------------
\7\ Where the PM10 annual and 24-hour NAAQS have both
been revoked, the proposed rule does not require prior EPA approval
for discontinuing a PM10 monitor.
\8\ EPA Administrator approval would continue to be required for
changes to some PM2.5 speciation monitoring stations, to
any required NCore multipollutant station, and to any PAMS station.
\9\ Detailed estimates of the current and expected future number
of each type of monitor over the 3 years following promulgation are
given in the supporting statement to the Information Collection
Request for this action, available in the docket.
---------------------------------------------------------------------------
We propose updated quality assurance (QA) requirements for
all NAAQS pollutants, emphasizing the responsibility of each monitoring
program for its data quality based on the use of data quality
objectives for monitoring precision, data completeness, and bias.
States would be required to provide for adequate, independent
performance audits of FRM/FEM monitoring stations. We describe several
options for how they could meet this audit responsibility. One way
would be to agree to have appropriated State and Territorial Air Grant
(STAG) funds retained by EPA to cover the cost of performing these
audits; another option would be a partnership between State/local
monitoring agencies (or independent subunits within one agency). The
statistics for calculating precision and bias would also be
revised. Quality assurance requirements would be defined for
PM10-2.5 monitoring. See section IV.C of this preamble for
details.
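The preamble does not reproduce the revised precision and bias formulas (those are defined in 40 CFR part 58, appendix A). As a purely hypothetical illustration of the kind of statistic involved, one can compute relative percent differences between a monitor and a collocated audit (check) measurement, then summarize their central tendency (bias) and scatter (precision):

```python
import statistics

def percent_differences(monitor, audit):
    """Relative percent difference of each monitor value against its
    collocated audit value. Illustrative only; not the rule's formulas."""
    return [100.0 * (m - a) / a for m, a in zip(monitor, audit)]

def bias_estimate(monitor, audit):
    """Mean signed percent difference: a simple bias indicator."""
    return statistics.mean(percent_differences(monitor, audit))

def precision_estimate(monitor, audit):
    """Sample standard deviation of the percent differences: a simple
    precision (scatter) indicator."""
    return statistics.stdev(percent_differences(monitor, audit))

mon = [10.2, 9.8, 10.5, 10.1]  # routine monitor readings (hypothetical)
aud = [10.0, 10.0, 10.0, 10.0]  # collocated audit values (hypothetical)
print(round(bias_estimate(mon, aud), 2))  # -> 1.5
```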
We propose to revise the provisions regarding special
purpose monitors (SPM) for all NAAQS pollutants. In certain restricted
situations, data from SPM would not be usable for nonattainment
designations. SPM that are FRM, FEM, or ARM monitors would be required
to meet standard quality assurance requirements for their monitor type,
and States would be required to report data from such SPM to the Air
Quality System (AQS). See section IV.E.9 of this preamble for details.
We propose to require that States conduct in-depth network
assessments every 5 years. These assessments are intended to ensure
that future gaps between data needs and monitoring operations are
identified and filled in a timely manner. See section IV.E.11 of this
preamble for specifics.
We propose to move requirements for reporting certain
operational data from PM samplers from 40 CFR part 50 to 40 CFR part
58, and to reduce the number of data elements required to be reported.
This would put all similar data reporting requirements together in 40
CFR part 58 and allow them to apply to both FRM and FEM monitors. See
section IV.G.1 of this preamble.
We propose a new requirement for the reporting of
PM2.5 field blank data.\10\ Only the data from field blanks
which States are already taking into the field and weighing in their
laboratories would be required to be reported under this proposal.
Having the data from these field blanks available to the national
monitoring community would help EPA and other researchers understand
the relationship between the mass of PM that is sampled and weighed on
a regular PM filter and the PM that is actually present in ambient air.
See section IV.G.2 of this preamble for details.
---------------------------------------------------------------------------
\10\ Field blanks are filters that are handled in the field as much
as possible like actual sample filters, except that ambient air is not
pumped through them; they help quantify contamination and sampling
artifacts.
---------------------------------------------------------------------------
We propose to require State or local agencies to submit
annual data certification letters, by May 1 of each year, to certify
that the ambient air concentration and QA data submitted to EPA's AQS
for the previous year are complete and accurate. These letters are
currently required by July 1 of each year. See section IV.G.3 of this preamble.
We propose to require States to archive PM2.5
and PM10-2.5 filters for one year (the current requirement
is only for PM2.5 filters).\11\ See section IV.G.4 of this
preamble.
---------------------------------------------------------------------------
\11\ A PM10-2.5 ``filter'' from a FRM monitor would
actually consist of the separate PM10 and
PM2.5 filters. Some equivalent methods, if approved,
could involve a single PM10-2.5 filter. All filters from
both types of monitors would be subject to the archiving
requirement.
---------------------------------------------------------------------------
We propose to increase the distance that ozone monitors
should be placed downwind of roadways, to reduce the possibility that
ozone readings will be artificially low due to ozone scavenging by NO
emitted by vehicles on roadways. See section IV.F of this preamble.
C. When Would the Proposed Amendments Affect State and Local
Governments, Tribes, and Other Stakeholders?
1. State and Local Governments
Only State governments, and those local governments that have been
assigned responsibility for ambient air monitoring by their States, are
subject to the mandatory requirements of 40 CFR part 58.\12\
---------------------------------------------------------------------------
\12\ Throughout this preamble, ``States'' is meant to also refer
to local governments that have been assigned responsibility for
ambient air monitoring within their respective jurisdiction by their
States. We also use ``monitoring organization'' to refer to States,
local agencies, and/or Tribes conducting monitoring under or guided
by the provisions of 40 CFR part 58.
---------------------------------------------------------------------------
The proposed compliance date for deployment of PM10-2.5
monitors by States is January 1, 2009. A plan for this
deployment would be due January 1, 2008, unless an extension is granted
to July 1, 2008. These plans would be subject to EPA approval at the
Regional Office level.
State (or local) agencies would also be required to submit earlier
annual data certification letters and make electronic reports of QA
data to the AQS, starting May 1, 2009.
The proposed amendments require that State (or local) agencies
fully implement the required NCore multipollutant sites by January 1,
2011 (more than 4 years after the expected date of promulgation of the
amendments). A plan for this implementation, including site selection,
would be due July 1, 2009.
Network assessments would be required every 5 years starting July
1, 2009.
State and local agencies would be required to comply with existing
requirements in 40 CFR part 58 (including annual network review and
data reporting), until the compliance date for each new requirement is
reached.
Some provisions in the proposed amendments to 40 CFR part 58 (those
that do not involve deployment of new monitoring stations or new types
of data handling) would be effective as of the effective date of the
final rule.
2. Tribes
Under the Tribal Authority Rule (TAR) (40 CFR part 49), which
implements section 301(d) of the CAA, Tribes may elect to be treated in
the same manner as a State in implementing sections of the CAA.
However, the EPA determined in the TAR that it was inappropriate to
treat Tribes in a manner similar to a State with regard to specific
plan submittal and implementation deadlines for NAAQS-related
requirements, including, but not limited to, such deadlines in CAA
sections 110(a)(1), 172(a)(2), 182, 187, and 191. See 40 CFR 49.4(a).
For example, an Indian Tribe may choose, but is not required, to submit
implementation plans for NAAQS-related requirements, nor is it
required to monitor. If a Tribe elects to do an implementation plan,
the plan can contain program elements to address specific air quality
problems in a partial program. The EPA will work with the Tribe to
develop an appropriate schedule which meets the needs of each Tribe.
Indian tribes have the same rights and responsibilities as States
under the CAA to implement elements of air quality programs as they
deem necessary. Tribes can choose to engage in ambient air monitoring
activities. In many cases, Indian Tribes are required by EPA Regions to
institute strict quality assurance programs, to utilize FRM or FEM when
comparing their data to the NAAQS, and to ensure that the data
collected are of known quality and representative of their respective
airsheds. For FRM and FEM monitors used for NAAQS attainment or
nonattainment determinations, quality assurance requirements of 40 CFR
part 58 must be followed and would be viewed by EPA as an indivisible
element of a regulatory air quality monitoring program.
3. Other Stakeholders
Manufacturers of continuous PM2.5 and
PM10-2.5 instruments would be able to apply for designation
of their instruments as FEM as soon as the notice of final rulemaking
is signed. The EPA is eager to receive such applications as soon as
manufacturers can collect and analyze the necessary supporting data.
D. How Would EPA Implement the New Requirements?
After promulgation, we would implement the new requirements using
several mechanisms. We expect to work with each State to develop the
monitoring plans for their new PM10-2.5 and NCore
multipollutant monitoring stations. For example, we would negotiate the
selection of required new monitoring sites (or new capabilities at
existing sites) and their schedules for start up as well as plans to
discontinue sites that were no longer needed. The EPA would negotiate
with each State its annual grants for air quality management
activities, including ambient monitoring work. We would negotiate
grants that provide funding to meet minimum requirements and which have
milestones for completion of necessary changes. Once States have
established a new monitoring infrastructure to meet the new
requirements, we would review State monitoring activities, submitted
data, and plans for further changes on an annual basis.
The EPA's support for and participation in enhancing the national
ambient air monitoring system to serve current and future air quality
management and research needs will extend beyond ensuring that States
meet the minimum requirements of the monitoring rules, including the
proposed amendments. We will work with each State or local air
monitoring agency to determine what affordable monitoring activities
above minimum requirements would best meet the diverse needs of the
individual air quality management program as well as the needs of other
data users. In particular, we may negotiate with some States, and
possibly with some Tribes, for the establishment and operation of some
additional rural NCore multipollutant monitoring stations to complement
the multipollutant stations that would be required by the proposed
changes to the monitoring regulations. We also expect to work with the
States, and possibly with some Tribes, to establish and operate more
PM10-2.5 speciation sites than the minimums that would be
required by the proposed amendments. We expect to work with the States,
and possibly with some Tribes, to establish and operate rural
PM10-2.5 mass concentration sites in less urbanized
locations.
An important element of implementing the new requirements will be
EPA's role in encouraging the development and application of Federal
equivalent methods (FEM), in particular for continuous methods of
measuring PM2.5 and PM10-2.5. We have determined
that continuous monitoring of PM2.5 has many advantages over
the filter-based Federal reference method. One of the proposed changes
makes it more practical for manufacturers of continuous
PM2.5 instruments to obtain designation for them as FEM or
approved regional methods. To ensure objectivity and sound science,
EPA's Office of Research and Development would continue to review
applications for FEM designations based on the criteria proposed today
and would recommend approval or disapproval to the EPA Administrator.
We will also provide technical guidance documents and training
opportunities for State, local, and Tribal monitoring staff to help
them select, operate, and use the data from new types of monitoring
equipment. We have already distributed a technical assistance document
on the precursor gas monitors \13\ that will be part of the
multipollutant sites and we have conducted three training workshops on
these monitors. Additional guidance will be developed and provided on
some other types of monitors with which many State monitoring staff are
currently unfamiliar, and on network design, site selection, quality
assurance, and other topics. While Tribes would not be subject to the
requirements of the proposed monitoring amendments,
these technical resources will also be available to them directly from
EPA and via grantees, such as the Institute for Tribal Environmental
Professionals and the Tribal Air Monitoring Support Center.
---------------------------------------------------------------------------
\13\ Technical Assistance Document (TAD) for Precursor Gas
Measurements in the NCore Multipollutant Monitoring Network. Version
4. U.S. Environmental Protection Agency. EPA-454/R-05-003. September
2005. Available at: https://www.epa.gov/ttn/amtic/pretecdoc.html.
---------------------------------------------------------------------------
In partnership with States, we will also continue to plan and
manage State and Tribal Assistance Grants (STAG) to support the National
Park Service's operation of the IMPROVE monitoring network, which
provides important data for implementing both regional haze and
PM2.5 attainment programs.\14\
---------------------------------------------------------------------------
\14\ Additional information on EPA/National Park Service IMPROVE
(Interagency Monitoring of Protected Visual Environments) Visibility
Program is available at: https://www.epa.gov/ttn/amtic/visdata.html.
---------------------------------------------------------------------------
We will also continue to operate the Clean Air Status and Trends
Network (CASTNET), which monitors for O3, PM, and chemical
components of PM in rural areas across the nation.\15\ We are in the
process of revising CASTNET to upgrade its monitoring capabilities to
allow it to provide even more useful data to multiple data users. We
expect that about 20 CASTNET sites will have new capabilities at least
equivalent to the capabilities envisioned for NCore multipollutant
sites. Those sites would reduce the number of, and complement, rural
multipollutant sites funded with limited State/local grant funds.
---------------------------------------------------------------------------
\15\ Additional information on CASTNET is available at: https://
www.epa.gov/castnet/.
---------------------------------------------------------------------------
We recognize that some air quality management issues require
ambient concentration and deposition data that cannot be provided by
the types of monitoring required by the proposed monitoring amendments
and other activities addressed in today's proposal. These issues
include near-roadway exposures to emissions from motor vehicles and
mercury deposition. We are actively researching these issues and
developing plans for monitoring programs to address them, but these
issues are outside the scope of this proposal.
III. Background
A. What Is the Role of Ambient Air Monitoring in Air Quality
Management?
Ambient air monitoring systems are a critical part of the nation's
air quality management program infrastructure. We use the ambient air
monitoring data for a wide variety of purposes as part of an iterative
process in managing air quality. This iterative process involves a
continuum of setting standards and objectives, designing and
implementing control strategies, assessing the results of those control
strategies, and measuring progress. The data have many uses throughout
this system, such as: Determining compliance with the National Ambient
Air Quality Standards (NAAQS); characterizing air quality status and
trends; estimating health risks and ecosystem impacts; developing and
evaluating emissions control strategies; and measuring overall progress
for the air pollution control program. Ambient air monitoring data
provide accountability for control strategy reductions by tracking
long-term trends of criteria and noncriteria pollutants and their
precursors. The data also form the basis for air quality forecasting
and other public air quality reports.
More detailed ambient monitoring data are needed to meet current
and future program and research needs. The data collected by State and
local agencies under the proposed monitoring amendments would:
Provide more timely Air Quality Index reporting to the
public by supporting continuous particle measurements needed for AIRNow
air quality forecasting and other public reporting mechanisms;
Improve the development of emissions control strategies
through more effective air quality model evaluation and other
observational methods; and
Support long-term health assessments that contribute to
ongoing reviews of the NAAQS and other scientific studies ranging
across technological, health, and atmospheric process disciplines.
B. What Is the History of Ambient Air Monitoring?
1. Statutory Authority
The EPA rules for ambient air monitoring are authorized under
sections 110, 301(a), and 319 of the Clean Air Act (CAA). Section
110(a)(2)(B) of the CAA requires that each State implementation plan
(SIP) provide for the establishment and operation of devices, methods,
systems, and procedures needed to monitor, compile, and analyze data on
ambient air quality and for the reporting of air quality data to EPA.
Section 301(a) of the CAA authorizes EPA to develop regulations needed
to carry out the Agency's mission and establishes rulemaking
requirements. Uniform criteria to be followed when measuring air
quality and provisions for daily air pollution index reporting are
required by CAA section 319.
2. Ambient Air Monitoring Regulations
The EPA's procedures for determining and designating reference and
equivalent methods (40 CFR part 53) have been in place since 1975 (40
FR 7049, February 18, 1975). Reference methods for criteria pollutants
provide uniform, reproducible measurements of concentrations in the
ambient air. Equivalent methods allow for the introduction of new and
innovative technologies for the same purpose, provided the technologies
produce measurements comparable to reference methods under a variety of
monitoring conditions.
Subpart A of 40 CFR part 53 (General Provisions) establishes
definitions; general requirements for designation of Federal reference
methods (FRM) and Federal equivalent methods (FEM); procedures for
submitting, processing, and approving applications; and associated
provisions. The general requirements identify the applicable
requirements or tests that a candidate method must meet to be approved
as a FRM or FEM. All manual or automated methods must meet the
applicable requirements in 40 CFR part 53, subpart C (Procedures for
Determining Comparability Between Candidate Methods and Reference
Methods). Automated equivalent methods for pollutants other than
PM10 or PM2.5 also must meet the requirements in
40 CFR part 53, subpart B (Procedures for Testing Performance
Characteristics of Automated Methods for SO2, CO,
O3, and NO2). A manual sampler or automated
method for PM10, Class I equivalent method for
PM2.5, or Class II equivalent method for PM2.5
also must meet the requirements in 40 CFR part 53, subpart D
(Procedures for Testing Performance Characteristics of Methods for
PM10), subpart E (Procedures for Testing Physical (Design)
and Performance Characteristics of Reference Methods and Class I
Equivalent Methods for PM2.5), or subpart F (Procedures for
Testing Performance Characteristics of Class II Equivalent Methods for
PM2.5), as applicable. The existing rule adopts a case-by-
case approach for PM2.5 Class III candidate equivalent
methods. The regulations in 40 CFR part 53 have been amended several
times since 1975 to reflect the addition of new and revised reference
methods and advances in monitoring methods and technologies for
criteria pollutants.
In 1979 (44 FR 27558, May 10, 1979), EPA issued the first
regulations for ambient air quality surveillance (40 CFR part 58) for
all pollutants subject to NAAQS. Within 40 CFR part 58, subpart A
(General Provisions) establishes definitions, and subpart B (Monitoring
Criteria) sets requirements for quality assurance, methods, siting,
operating
schedules, and special purpose monitors. Subpart C (State and Local Air
Monitoring Stations), subpart D (National Air Monitoring Stations), and
subpart E (Photochemical Assessment Monitoring Stations) generally
define the current monitoring networks. Appendices A through G to 40
CFR part 58 contain more detailed requirements on quality assurance;
monitoring methods, network design, and siting criteria; and air
quality reporting. Subpart F (Air Quality Index Reporting), subpart G
(Federal Monitoring), and appendices F and G to 40 CFR part 58 define
annual and daily reporting requirements.
Most of the major amendments to the monitoring regulations made
after 1979 coincide with the NAAQS revisions and include the addition
of provisions for PM10 (52 FR 24740, July 1, 1987) and
PM2.5 (62 FR 38833, July 18, 1997). Photochemical assessment
monitoring stations (PAMS) were established in 1993 to monitor ozone
and ozone precursors (58 FR 8468, February 12, 1993).
3. Monitoring Networks
More than 5,500 monitors at about 3,000 sites in the State and
local air monitoring stations (SLAMS) and national air monitoring
stations (NAMS) networks comprise the majority of monitors measuring
criteria pollutants using FRM or FEM for direct comparison to the
NAAQS. The NAMS are a subset of SLAMS that are designated as national
trends sites. The PM2.5 network consists of ambient air
monitoring sites that make mass or chemical speciation measurements.
Within the PM2.5 network operated by State and local
agencies, there are approximately 1,200 FRM filter-based samplers and
about 450 continuous monitors for mass measurements. Chemical
speciation measurements are made at 54 ``Speciation Trends Network''
sites that are intended to remain in operation indefinitely and about
200 other, potentially less permanent sites used to support SIP
development and other monitoring objectives. These stations collect
aerosol samples and analyze the filters for trace elements, major ions,
and carbon fractions.
Ambient air monitors in the PAMS network measure ozone precursors
at 109 stations in 25 serious, severe, or extreme ozone nonattainment
areas. The PAMS monitors use near-research-grade measurement
technologies to produce continuous data for more than 50 volatile
organic compounds during summer ozone seasons.
In addition to the NAMS/SLAMS/PAMS sites, there are approximately
310 ambient air toxics monitoring sites, the majority of which are
Federally funded and report data to EPA's Air Quality System (AQS).
Ambient air monitoring stations also are operated by Indian Tribes.
Thirty-one Tribes are currently making data from 119 individual
monitors available to EPA and others. Approximately 73 Tribal sites
monitor for PM10 and PM2.5, and about 16 monitor
for ozone.
The Clean Air Status and Trends Network (CASTNET) is cooperatively
operated and funded by EPA with the National Park Service. The EPA's
Office of Air and Radiation operates a majority of the monitoring
stations with contractor support; however, the National Park Service
operates approximately 30 stations in cooperation with EPA. It is the
nation's primary source for data on dry acidic deposition and rural,
ground-level ozone. Operating since 1987, CASTNET is used in
conjunction with other national monitoring networks to provide
information for evaluating the effectiveness of national emission
control strategies. CASTNET consists of over 80 sites across the
eastern and western U.S. The longest data records are primarily at
eastern sites. CASTNET provides atmospheric data on the dry deposition
component of total acid deposition, ground-level ozone and other forms
of atmospheric pollution. More information is available from the
CASTNET program Web site https://www.epa.gov/castnet/.
The EPA is also one of many sponsors of the National Atmospheric
Deposition Program/National Trends Network. The National Atmospheric
Deposition Program/National Trends Network (NADP/NTN) is a nationwide
network of precipitation monitoring stations. The NADP/NTN has over 200
stations spanning the continental U.S., Alaska, Puerto Rico, and the
Virgin Islands. The purpose of the network is to collect data on
the chemistry of precipitation for monitoring of geographical and
temporal long-term trends. While distinct from ambient air monitoring,
precipitation monitoring is related in that it shares some of the same
objectives, including tracking the effects of emission reduction
programs. More information on NADP is available at its Internet Web
site, https://nadp.sws.uiuc.edu/.
The EPA is a major funding sponsor of the Interagency Monitoring of
Protected Visual Environments (IMPROVE) program. IMPROVE is a
cooperative measurement effort governed by a steering committee
composed of representatives from EPA, National Park Service, other
Federal agencies, and Regional-State organizations. A total of 110
monitoring stations in Class I visibility areas have particulate matter
samplers to measure speciated PM2.5 and PM10
mass. Select stations also deploy transmissometers and nephelometers to
measure light extinction and scattering res