Revisions to Ambient Monitoring Quality Assurance and Other Requirements, 17247-17299 [2016-06226]
Vol. 81, No. 59
Monday, March 28, 2016
Part II
Environmental Protection Agency
40 CFR Part 58
Revisions to Ambient Monitoring Quality Assurance and Other Requirements; Final Rule
ENVIRONMENTAL PROTECTION AGENCY

40 CFR Part 58

[EPA–HQ–OAR–2013–0619; FRL–9942–91–OAR]

RIN 2060–AS00

Revisions to Ambient Monitoring Quality Assurance and Other Requirements

AGENCY: Environmental Protection Agency (EPA).

ACTION: Final rule.
SUMMARY: This action promulgates revisions to ambient air monitoring requirements for criteria pollutants. These revisions include adding and harmonizing definitions; clarifying annual monitoring network plan public notice requirements; revising network design requirements, system modifications, and operating schedules; clarifying data certification, data submittal, and archiving procedures; reorganizing and clarifying quality assurance requirements; and revising certain network design criteria for non-source oriented lead monitoring. These revisions also address other issues in the Ambient Air Quality Surveillance Requirements to help reduce the compliance burden of monitoring agencies operating ambient monitoring networks.
DATES: This final rule is effective on April 27, 2016.
ADDRESSES: The EPA has established a
docket for this action under Docket ID
No. EPA–HQ–OAR–2013–0619. All
documents in the docket are listed on
the https://www.regulations.gov Web
site. Although listed in the index, some
information is not publicly available,
e.g., Confidential Business Information
(CBI) or other information whose
disclosure is restricted by statute.
Certain other material, such as
copyrighted material, is not placed on
the Internet and will be publicly
available only in hard copy form.
Publicly available docket materials are
available either electronically through
https://www.regulations.gov or in hard
copy at Docket ID No. EPA–HQ–OAR–
2013–0619, EPA Docket Center, EPA
WJC West Building, Room 3334, 1301
Constitution Ave. NW., Washington,
DC. The Docket Facility is open from
8:30 a.m. to 4:30 p.m. Monday through
Friday, excluding legal holidays. The
docket telephone number is (202) 566–
1742. The Public Reading Room is open
from 8:30 a.m. to 4:30 p.m., Monday
through Friday, excluding legal
holidays. The telephone number for the
Public Reading Room is (202) 566–1744.
FOR FURTHER INFORMATION CONTACT: Mr.
Lewis Weinstock, Air Quality
Assessment Division, Office of Air
Quality Planning and Standards, U.S.
Environmental Protection Agency, Mail
code C304–06, Research Triangle Park,
NC 27711; telephone: (919) 541–3661;
fax: (919) 541–1903; email:
weinstock.lewis@epa.gov.
SUPPLEMENTARY INFORMATION:
A. Does this action apply to me?
This action applies to state, territorial,
and local air quality management
programs that are responsible for
ambient air monitoring under 40 CFR
part 58. Categories and entities
potentially regulated by this action
include:
Category                                          NAICS code a

State/territorial/local/tribal government ......  924110

a North American Industry Classification System.
B. Where can I get a copy of this
document?
In addition to being available in the
docket, an electronic copy of this action
will also be available on the World Wide
Web (WWW) through the Technology
Transfer Network (TTN). Following
signature, a copy of this action will be
posted at the TTN’s Ambient
Monitoring Technology Information
Center at the following address: https://
www3.epa.gov/ttnamti1/monregs.html.
The TTN provides information and
technology exchange in various areas of
air pollution control.
C. Judicial Review
This rule is nationally applicable and,
furthermore, the Administrator finds
that it is of nationwide scope and effect.
Under section 307(b)(1) of the Clean Air
Act (CAA), judicial review of this final
rule is available by filing a petition for
review in the U.S. Court of Appeals for
the District of Columbia Circuit by May
27, 2016. Moreover, under section
307(b)(2) of the CAA, the requirements
established by this action may not be
challenged separately in any civil or
criminal proceedings brought by the
EPA to enforce these requirements.
Table of Contents
The following topics are discussed in this
preamble:
I. Background
II. Amendments to the Ambient Monitoring
Requirements
A. General Information
B. Definitions
C. Annual Monitoring Network Plan and
Periodic Network Assessment
D. Network Technical Requirements
E. Operating Schedules
F. System Modification
G. Annual Air Monitoring Data
Certification
H. Data Submittal and Archiving
Requirements
I. Network Design Criteria (Appendix D)
III. Amendments to Quality Assurance
Requirements
A. Quality Assurance Requirements for
Monitors Used in Evaluations for
National Ambient Air Quality
Standards—Appendix A
1. General Information
2. Quality System Requirements
3. Measurement Quality Checks for Gases
4. Measurement Quality Checks for
Particulate Monitors
5. Calculations for Data Quality
Assessment
B. Quality Assurance Requirements for
Monitors Used in Evaluations of
Prevention of Significant Deterioration
Projects—Appendix B
1. General Information
2. Quality System Requirements
3. Measurement Quality Checks for Gases
4. Measurement Quality Checks for
Particulate Monitors
5. Calculations for Data Quality
Assessment
IV. Statutory and Executive Order Reviews
A. Executive Order 12866: Regulatory
Planning and Review and Executive
Order 13563: Improving Regulation and
Regulatory Review
B. Paperwork Reduction Act (PRA)
C. Regulatory Flexibility Act (RFA)
D. Unfunded Mandates Reform Act
E. Executive Order 13132: Federalism
F. Executive Order 13175: Consultation
and Coordination With Indian Tribal
Governments
G. Executive Order 13045: Protection of
Children From Environmental Health
and Safety Risks
H. Executive Order 13211: Actions
Concerning Regulations That
Significantly Affect Energy Supply,
Distribution, or Use
I. National Technology Transfer and
Advancement Act
J. Executive Order 12898: Federal Actions
To Address Environmental Justice in
Minority Populations and Low-Income
Populations
K. Congressional Review Act
I. Background
On September 11, 2014, the EPA
proposed revisions to its ambient air
monitoring requirements for criteria
pollutants to provide clarifications to
existing requirements and to reduce the
compliance burden of monitoring
agencies operating ambient monitoring
networks (79 FR 54356). The proposal
focused on ambient monitoring
requirements that are found in 40 CFR
part 58 and the associated appendices
(A, D, and new Appendix B), including
issues such as operating schedules, the
development of annual monitoring
network plans, data reporting and
certification requirements, and the
operation of the required quality
assurance (QA) program. These
revisions were proposed to maintain the
robust nature of the ambient monitoring
networks while identifying efficiencies
and flexibilities that would help ensure
the successful operation of the national
monitoring system.
The EPA last completed a
comprehensive revision of its ambient
air monitoring regulations in a final rule
published on October 17, 2006 (71 FR
61236). Minor revisions were completed
in a direct final rule published on June
12, 2007 (72 FR 32193). Periodic
pollutant-specific monitoring updates
have occurred in conjunction with
revisions to the National Ambient Air
Quality Standards (NAAQS). In such
cases, the monitoring revisions were
typically finalized as part of the NAAQS
final rules.1
II. Amendments to the Ambient
Monitoring Requirements
A. General Information
This section describes revisions to the
EPA’s ambient air monitoring
requirements found in 40 CFR part 58—
Ambient Air Quality Surveillance:
Subpart A—General Provisions, Subpart
B—Monitoring Network, and Appendix
D—Network Design Criteria for Ambient
Air Quality Monitoring.
The EPA received public comments
on its September 2014 proposal from 31
respondents including 15 state agencies,
12 local agencies, two
multijurisdictional organizations (MJO),
one consulting firm, and one
environmental organization whose
comments represented two
organizations. Due to the relatively large
number of individual revisions
contained in the proposal, commenters
typically focused their attention on
particular items of interest while
occasionally providing a more general,
overarching statement of support for the
remaining provisions. In some cases,
commenters remained silent on other
provisions of the proposal and the level
of support for those provisions cannot
be ascertained. In the following
sections, the specific comments will be
noted as they pertain to each particular
proposed revision. This preamble will
summarize the affected regulation,
proposed changes, public comments
that were received, the EPA’s analysis of
those comments where applicable, and
EPA’s final decision concerning the
revisions. A detailed description of
changes to Quality Assurance Requirements is contained in section III of the preamble.

1 Links to the NAAQS final rules are available at: https://www3.epa.gov/ttn/naaqs/criteria.html.
B. Definitions
The presence of a definitions section
in the regulation ensures a consistent
interpretation of technical terminology
across the various parts of the CFR that
pertain to ambient air monitoring, as
well as in supporting guidance
documents, databases, and outreach
materials that support the monitoring
community.
The EPA proposed to add and revise
several terms to ensure consistent
interpretation within the monitoring
regulations and to harmonize usage of
terms with the definition of key
metadata fields that are important
components of the Air Quality System
(AQS).2
The EPA proposed to add the term
‘‘Certifying Agency’’ to the list of
definitions. The certifying agency field
was added to the AQS in 2013 as part
of the development of a revised process
for states and the EPA Regions to meet
the data certification requirements
described in 40 CFR 58.15. The new
term specifically describes any
monitoring agency that is responsible
for meeting data certification
requirements for a set of monitors. In
practice, a certifying agency is typically
a state, local, or tribal agency depending
on the particular data reporting
arrangements that have been approved
by an EPA Regional Office for a given
state. A list of certifying agencies by
individual monitor is available on the
AQS–TTN Web site.3
The term ‘‘Chemical Speciation
Network,’’ or CSN, was proposed for
addition to the definition list. The CSN
has been functionally defined as being
composed of the Speciation Trends
Network (STN) sites and the
supplemental speciation sites that are
collectively operated by monitoring
agencies to obtain particulate matter up
to 2.5 micrometers (PM2.5) chemical
species data.
The term ‘‘Implementation Plan’’ was
proposed for addition to provide more
specificity to current definitions that
reference the word ‘‘plan’’ in their
description. The EPA wishes to ensure
that references to State Implementation
Plans (SIPs) are not confused with
references to Annual Monitoring Network Plans that are described in 40 CFR 58.10.

2 The AQS is the EPA's repository of ambient air quality data. The AQS stores data from over 10,000 monitors, 5,000 of which are currently active. State, local and tribal agencies collect the data and submit it to the AQS on a periodic basis. See https://www.epa.gov/aqs/aqs-obtaining-aqs-data for additional information.

3 https://aqs.epa.gov/aqsweb/codes/data/CertifyingAgenciesByMonitor.html.
The EPA proposed to revise the term
‘‘Local Agency’’ to clarify that such
organizations are responsible for
implementing portions of Annual
Monitoring Network Plans. The current
definition refers to the carrying out of a
plan that is not specifically defined,
leading to possible confusion with SIPs.
The EPA proposed to revise the term
‘‘Meteorological Measurements’’ to
clarify that such measurements refer to
required parameters at the National Core
Monitoring Program (NCore) and
photochemical assessment monitoring
stations (PAMS).
The terms ‘‘Monitoring Agency’’ and
‘‘Monitoring Organization’’ were
proposed for clarification to include
tribal monitoring agencies and to
simplify the definition of monitoring
organization to reference the definition
of monitoring agency.
The term ‘‘NCore’’ was proposed for
revision to remove nitrogen dioxide
(NO2) and lead in PM10 (Pb-PM10) as
required measurements and to expand
the definition of basic meteorology to
specifically reference the required
measurements: Wind speed, wind
direction, temperature, and relative
humidity. The EPA clarifies that NO2
was never a required NCore
measurement and that the current
definition was erroneous on this issue.
Additionally, the requirement to
measure Pb-PM10 at NCore sites in areas
over 500,000 population was proposed
for elimination due to the extremely low
concentrations being measured at these
sites.
The term ‘‘Near-road NO2 Monitor’’
was proposed for revision to ‘‘Near-road
Monitor.’’ This revision is being made to
broaden the definition of near-road
monitors to include all monitors
operating under the specific
requirements described in 40 CFR part
58, appendix D (sections 4.2.1, 4.3.2,
4.7.1(b)(2)) and appendix E (section
6.4(a), Table E–4) for near-road
measurement of PM2.5 and carbon
monoxide (CO) in addition to NO2.
The term ‘‘Network Plan’’ was
proposed for addition to clarify that any
such references in 40 CFR part 58 refer
to the annual monitoring network plan
required in 40 CFR 58.10.
The term ‘‘Plan’’ was proposed for
deletion as its usage has been replaced
with more specific references to either
the annual monitoring network plan
required in 40 CFR 58.10 or the SIP
approved or promulgated pursuant to
CAA section 110.
The term ‘‘Population-oriented
Monitoring (or sites)’’ was proposed for
deletion. This term, along with the
related concept of population-oriented
monitoring, was deleted from 40 CFR
part 58 in the 2013 PM2.5 NAAQS final
rule (78 FR 3235–3236). This was to
ensure consistency with the
longstanding definition of ambient air
applied to the other NAAQS pollutants.
The term ‘‘Primary Monitor’’ was
proposed for addition to the definition
list. The use of this term has become
important in AQS to better define the
processes used to calculate NAAQS
design values when more than one
monitor is being operated by a
monitoring agency for a given pollutant
at the same site. This term identifies the
primary monitor used as the default
data source in AQS for creating a
combined site record for pollutants that
allow site combinations per 40 CFR part
50.
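Because the primary monitor concept is essentially a tie-breaking rule for building a combined site record, a small illustration may help. The following Python sketch is purely illustrative and is not the AQS implementation; the function and data layout are invented for the example. It takes the primary monitor's value for each day and falls back to a collocated monitor when the primary monitor has no sample, in the spirit of the combined site record described in appendix N to part 50.

```python
# Illustrative sketch only (not the AQS algorithm): build a combined site
# record, using the primary monitor as the default data source and a
# collocated monitor to fill days the primary monitor did not sample.

def combine_site_record(primary, collocated):
    """primary, collocated: dicts mapping 'YYYY-MM-DD' -> 24-hour value."""
    combined = {}
    for day in sorted(set(primary) | set(collocated)):
        # The primary monitor wins whenever it reported a value.
        combined[day] = primary.get(day, collocated.get(day))
    return combined

primary = {"2016-01-01": 8.4, "2016-01-04": 7.9}
collocated = {"2016-01-01": 8.6, "2016-01-04": 8.1, "2016-01-07": 9.2}
print(combine_site_record(primary, collocated))
# {'2016-01-01': 8.4, '2016-01-04': 7.9, '2016-01-07': 9.2}
```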
The term ‘‘Primary Quality Assurance
Organization’’ was proposed for revision
to include the use of the acronym,
‘‘PQAO,’’ and to note that a PQAO
could include a group of monitoring
organizations.
The terms ‘‘PSD Monitoring
Organization’’ and ‘‘PSD Monitoring
Network’’ were proposed for addition to
support the proposed new appendix B
that will pertain specifically to QA
requirements for prevention of
significant deterioration (PSD)
networks.
The term ‘‘PSD Reviewing Authority’’
was proposed for addition to support
the addition of appendix B to the part
58 appendices and to clarify the
identification of the lead authority in
determining the applicability of QA
requirements for PSD monitoring
projects.
The term ‘‘Reporting Organization’’
was proposed for revision to clarify that
the term refers specifically to the
reporting of data as defined in AQS. The
AQS does allow the distinct designation
of agency roles that include analyzing,
certifying, collecting, reporting, and
PQAO.
The term ‘‘SLAMS’’ (state and local
air monitoring stations) was proposed
for clarification to indicate that the
designation of a monitor as SLAMS
generally refers to a monitor required
under appendix D of part 58 and is
needed to meet monitoring objectives.
The SLAMS monitors make up
networks that include NCore, PAMS,
CSN, and other state or local agency
sites that have been so designated in
annual monitoring network plans.
The terms ‘‘State Agency’’ and ‘‘STN’’
were proposed for minor wording
changes for purposes of clarity only.
The term ‘‘State Speciation Site’’ was
proposed for deletion given the
proposed addition of ‘‘Supplemental
Speciation Station’’ to better describe
the distinct elements of the CSN, which
includes the STN stations that are
required under section 4.7.4 of
appendix D of part 58, and
supplemental speciation stations that
are operated for specific monitoring
agency needs and are not considered to
be required monitors under appendix D.
We received relatively few comments
on the proposed revisions to definitions.
One commenter noted that the
clarification of Meteorological
Measurements should specify that those
parameters are also required at SLAMS
sites, which include both the NCore and
PAMS sites. They noted the use of the
undefined phrase ‘‘combined data
record’’ in the Primary Monitor
definition and recommended that a
definition be provided. They also
recommended that the EPA include an
explanation of the term ‘‘Special
Purpose Monitor’’ (SPM) in the
definitions section of the preamble and
not rely solely on the amended
regulatory text. A commenter from a
state air program noted that the
proposed definition for ‘‘Monitoring
Organization’’ includes the phrase ‘‘or
other monitoring organization.’’ They
believe the phrase is ambiguous and
could extend the applicability of
requirements such as technical systems
audits to universities, contractors, and
other government organizations. This
commenter was concerned that the
phrasing could expand the applicability
of regulations, and that the phrase
should be either defined or removed
from the final definition verbiage.
The EPA has made several revisions
to definitions in response to these
comments. The Meteorological
Measurements definition has been
amended to include a clarifying
reference that SLAMS stations include
sites that comprise the NCore and
PAMS networks. Additionally, the
words ‘‘or other monitoring
organization’’ have been removed from
the definition for Monitoring
Organization to remove any ambiguity
that monitoring regulations apply to
entities other than state, local, or tribal
agencies.4 The EPA does not believe
that the definition for Primary Monitor
needs to be amended as the term
‘‘combined data record’’ is already
defined as part of appendix N to Part 50
(Interpretation of the National Ambient
Air Quality Standards for PM2.5). The
EPA acknowledges that the preamble to
the proposal inadvertently failed to
discuss a clarification to the Special
Purpose Monitor definition included in
the proposal. The proposed revision to
this definition was the addition of two
sentences that merely restated existing
requirements already established in 40
CFR 58.10 with regard to annual
monitoring network plans and network
assessments. The EPA believes that the
proposed definition is a useful but
minor revision that should be retained
as proposed. No other comments were
received on the proposed revisions to
definitions and they will be finalized as
proposed.

4 The EPA does note that other mechanisms can be used to extend the applicability of monitoring requirements to sites operated by other entities, e.g., industrial monitors. For example, states can develop Memoranda of Understanding (MOUs) with the operators of such sites to ensure that the monitors are operated according to part 58 requirements and that the resulting data are of known quality.
C. Annual Monitoring Network Plan and
Periodic Network Assessment
The annual monitoring network plan
process provides an important
communications and planning pathway
between monitoring agencies, EPA
Regional Offices, and the general public.
The network assessment process,
required every 5 years, provides an
opportunity to conduct more in-depth
planning and analyses of current and
future ambient monitoring needs and
objectives to help ensure that
monitoring programs respond to
changing requirements, demographics,
air quality trends, and updated
technology.
The EPA proposed several changes to
the annual monitoring network plan
process and related requirements. We
received significant comment on these
changes. Therefore, each individual
proposed revision is discussed below
along with relevant comments.
Since the revision of the annual
monitoring network plan process in
2006, the EPA has received feedback
about confusion concerning the
difference between the process of
obtaining public inspection versus
comment, the responsibility of
monitoring agencies to respond to
public comment in their submitted
annual monitoring network plans, and
the responsibility of the EPA Regional
Offices to obtain public comment
depending on a monitoring agency’s
prior action, as well as whether the
annual monitoring network plan was
modified based on discussions with the
monitoring agency following plan
submission. Accordingly, we proposed
that the public inspection aspect of the
requirement contained in 40 CFR
58.10(a)(1) be revised to clearly indicate
that obtaining public comment is a
required part of the process, and that
plans that are submitted to the EPA
Regional Offices should address such
comments that were received during the
public notice period. A related part of
the annual monitoring network plan
process is described in 40 CFR
58.10(a)(2) with the distinction that this
section pertains specifically to plans
that propose SLAMS modifications and,
thereby, also require specific approval
from the EPA Regional Administrator.
Consistent with the proposed change
to the comment process described
above, the EPA proposed changes to the
text in 40 CFR 58.10(a)(1) to reflect the
fact that public comments will have
been required to be obtained by
monitoring agencies prior to
submission, and that the role of the EPA
Regional Office would be to review the
submitted plan together with public
comments and any modifications to the
plan based on these comments.
A number of state monitoring
agencies and two MJOs commented that
the proposed requirement to solicit and
address comments during the public
inspection period would impose
additional burden, inflexibility, and
delays on the process by requiring that
the comments be addressed before the
original plan is submitted to the EPA.
Some of these commenters estimated
that it would take an additional two
months compared with the current
process to handle comments in this
manner, and that they could only
support the proposed change if the
deadline for submittal was revised as
well. They requested that the EPA waive
this proposed requirement or make the
procedure more flexible by allowing
comments to be submitted later, perhaps
as an amendment before the plan is
approved, or even with the next year’s
plan. Four state programs supported the
proposed revision noting the
importance of soliciting public input on
the content of the plan and the
perspective that states should take the
lead in responding to comments versus
the EPA. One of these states noted that
they attempt to schedule a public
comment period for every SLAMS
modification. They also noted that
flexibility would be needed in
emergency situations that demand
immediate changes to their network.
Another of these states requested that
the term ‘‘address’’ be clarified and
noted that the timeliest way to handle
comments and responses would be to
include this information in an appendix
to the plan when submitted to the EPA.
A different perspective was offered by
comments received from a joint
environmental group submission. They
commented that the proposed changes
did not go far enough to ensure a
meaningful public comment
opportunity. They noted that annual
monitoring network plans are integral
parts of SIPs and that the CAA requires
that SIP submittals and revisions be
more formally publicly noticed. They
suggested that the EPA require states to
prominently advertise monitoring plans,
allow at least 30 days for public
comment, then either hold a public
hearing or provide such an opportunity
if requested. They also added that a
separate notice and comment
opportunity must be required on the
EPA’s proposed action on a submitted
plan or a related amendment to an
approved plan, and that all of the
suggested public comment requirements
must also be applicable to the 5-year
network assessment.
The EPA recognizes the diversity of
comments on this aspect of the
proposal. Nearly all commenters
recognized that fostering public
involvement in the annual monitoring
network plan is important and
desirable. Those commenters supporting
the proposal noted that their existing
procedures already address the
proposed requirements and that they
found it desirable to be able to respond
directly to stakeholders. Adverse
comment was related to the implied
additional burden of obtaining comment
versus the current requirement of
posting for public inspection, concern
about limiting the flexibility to
subsequently modify the plan following
submission to the EPA, and the
perceived impracticality of adequately
responding to public comments in a
timely manner.
The EPA does not agree with the
comments received from the joint
environmental group submission on this
aspect of the proposal. First, the final
rule text requires annual monitoring
network plans to be made available for
at least 30 days of public inspection and
comment and further requires
monitoring agencies to address, as
appropriate, any significant issues
raised in public comment. Requiring at
least 30 days of public participation and
consideration of significant comments is
consistent with the CAA and the
Administrative Procedure Act (APA)
and, at the same time, affords
monitoring agencies with the flexibility
and discretion to provide for additional
time and public participation
procedures.
Second, the EPA disagrees that state
action on an annual monitoring network
plan triggers the same public
participation requirements applicable to
SIP adoption and revision. Section
110(a)(2)(B) of the CAA provides that
each SIP shall ‘‘provide for
establishment and operation of
appropriate devices, methods, systems,
and procedures necessary to (i) monitor,
compile, and analyze data on ambient
air quality, and (ii) upon request, make
such data available to the
Administrator.’’ To meet these
requirements, our September 2013
Guidance on Infrastructure State
Implementation Plan (SIP) Elements
under Clean Air Act Sections 110(a)(1)
and 110(a)(2) states that ‘‘the best
practice for an air agency submitting an
infrastructure SIP would be to submit,
for inclusion into the SIP . . . , the
statutory or regulatory provisions that
provide the air agency or official with
the authority and responsibility to
perform’’ certain actions required under
40 CFR part 58. (See 2013 iSIP
Guidance, p. 22.) In other words, CAA
section 110(a)(2)(B) simply requires that
monitoring agencies have the legal
authority to implement 40 CFR part 58;
it does not treat annual monitoring
network plans required under 40 CFR
part 58 as ‘‘integral parts’’ of a SIP
subject to public participation whenever
such network plans are established or
modified.
Third, the EPA disagrees that EPA
action on an annual monitoring network
plan requires a separate notice and
comment opportunity. The EPA reviews
and acts on network plans through
informal adjudications in which the
EPA determines whether such network
plans satisfy the requirements in 40 CFR
58.10. Such adjudications are not
rulemakings subject to the public
participation requirements of the APA
(see 5 U.S.C. 553), although they are
final agency actions subject to judicial
review (see 5 U.S.C. 706). The EPA’s
decision to treat network plan decisions
as case-by-case adjudications rather
than ‘‘rules’’ reflects the fact that the
EPA simply compares the information
supplied in the network plan with the
requirements of 40 CFR part 58 and
notifies the relevant monitoring
agencies that design and operate the
corresponding networks whether their
particular networks satisfy Part 58 or
need further revision.
Finally, the EPA disagrees that public
notice and comment is required ‘‘at both
the state and federal levels on the 5-year
monitoring network assessments
required at 40 CFR 58.10(d).’’ To the
extent that the EPA takes ‘‘substantive
action’’ on such assessments, such
actions are not rulemakings subject to
public participation requirements under
the CAA or the APA.
Given the relatively broad support for
the concept of soliciting public
comment as part of the annual
monitoring network plan posting
process, as well as the concern for the
implied logistical challenge of both
obtaining comment and developing (and
getting management approval for)
adequate responses, while still meeting
the required submission deadline of July
1, the EPA believes that some
modification of the proposed language
is appropriate. As noted by several
commenters, the implied burden to
‘‘reference and address any such
received comments’’ as described in the
proposed regulatory language may be
too difficult to achieve. As suggested by
one commenter, it may be more
practical for monitoring agencies to
review and consider the comments, and
only to modify the plan when
‘‘appropriate and feasible.’’ By
modifying the proposed language to
provide more flexibility and discretion
in addressing comments based on each
agency’s technical evaluation of
received comments and the associated
management review chain, the EPA can
finalize the generally supported goal of
increasing public involvement in the
process while lessening the burden on
agencies that have not previously
included the solicitation of public
comment in their process. Accordingly,
the EPA is revising the regulatory
language in the last sentence of 40 CFR
58.10(a)(1) from ‘‘The annual
monitoring network plan must be made
available for public inspection and
comment for at least 30 days prior to
submission to the EPA and the
submitted plan shall reference and
address any received comments’’ to
‘‘The annual monitoring network plan
must be made available for public
inspection and comment for at least 30
days prior to submission to the EPA and
the submitted plan shall include and
address, as appropriate, any received
comments.’’ The EPA believes that this
revised language, including the
clarification that the plan ‘‘address, as
appropriate, any received comments,’’
provides sufficient flexibility to
monitoring agencies and ensures
adequate public participation practices.
Under this approach, all agencies will
review public comments and make
changes to the plan as appropriate in
light of public comments, taking into
account the requirement for timely
submission of network plans. The EPA
encourages states to provide responses
to significant comments but
understands that developing formal
responses may potentially delay
submission of the plan beyond the July
1 deadline, in light of internal timelines
and management review procedures. To
avoid such delays, it would also be
acceptable for states to submit the
proposed plan with comments and any
resulting changes, and where the EPA
finds it necessary to discuss how the
state considered and addressed specific
comments, the EPA will follow up as
part of our process for reviewing the
plan for approval.
Another aspect of the annual
monitoring network plan requirements
is the listing of required elements and
site information in 40 CFR 58.10. The
EPA proposed to add two requirements
to this list as described below. First, the
EPA proposed to require that a PAMS
network description be specifically
included in the 40 CFR 58.10(a)
requirements for any monitoring
agencies affected by PAMS
requirements. The requirements for
such a plan are already referenced in
appendix D, sections 5.2 and 5.4 of this
part. Second, the EPA proposed that
‘‘long-term’’ SPMs, i.e., those SPMs
operating for longer than 24 months
whose data could be used to calculate
design values for NAAQS pollutants in
cases where the EPA-approved methods
are being employed, should be
identified in the 40 CFR 58.10(b)
requirements along with a discussion of
the rationale for keeping the monitor(s)
as SPMs or potentially reclassifying to
SLAMS. The EPA did not propose that
such monitors must become SLAMS,
only that the ongoing operation of such
monitors and the rationale for retaining
them as SPMs be explicitly discussed to
avoid confusion, particularly because
the monitoring data could be used to
calculate design values regardless of
whether the monitors are designated
SPMs or SLAMS. Thus, there is
potential for unintended complexities in
the designations process if any design
value SPMs would be discontinued
without adequate discussion.
Nine commenters addressed the above
issues. Only one commenter specifically
addressed the addition of the PAMS
network description and that comment
was ‘‘Support this action.’’ The
remainder of comments addressed the
issue of requiring an annual monitoring
network plan discussion and rationale
for whether longer-term SPMs should be
retained as SPMs or reclassified to
SLAMS. Three of these commenters
were supportive of the proposed
revision with several noting that they
expected that monitoring agencies
would still be granted discretion on the
issues by the EPA Regional Offices. Two
commenters suggested revised language
to limit the proposed SPM discussion to
only criteria pollutant monitors and also
only those monitors utilizing federal
reference methods (FRM) or federal
equivalent methods (FEM). One
commenter only supported the revision
if the EPA could provide grant funding.
Three commenters did not support the proposed revision: because they interpreted the provision as meaning that the EPA was proposing that such longer-term SPMs be automatically converted to SLAMS in the absence of a justification; because of the belief that such a rationale would create a burden for monitoring agencies and that such a discussion is misplaced in the annual monitoring network plan; or because of the belief that ongoing discussions between the states and EPA Regional Offices are already sufficient to handle such issues, and that the additional requirement is an unnecessary limit on monitoring network flexibility.
After consideration of these
comments, the addition of the PAMS
network description to the list of
requirements in 40 CFR 58.10(a) will be
finalized as proposed due to general
support and lack of comment on this
revision.
The EPA will not finalize the
proposed changes to 40 CFR 58.10(b).
The EPA believes that some
misunderstanding still exists as to the
intent of the proposed addition of a
required discussion and rationale
concerning longer-term SPM monitors.
Although preamble language explicitly
stated that the EPA was not intending to
propose an automatic conversion
process for such SPMs, several
commenters interpreted the proposal in
that way. One commenter noted, ‘‘Also
the mechanism is unclear for how SPMs
not granted approval will convert to a
SLAMS monitor.’’ It was not the EPA’s
intention to imply any limitations on
monitoring agency discretion to employ
SPMs as part of their network design
strategy, only to raise the awareness
among all stakeholders of such
situations when they occur, particularly
with longer-term SPMs that may have
design values approaching or exceeding
the NAAQS. Comments regarding the
need to limit the proposed requirement
to FRMs or FEMs also indicate a
misunderstanding of the proposed
language as this limitation was already
included in the regulatory language in
the proposal. Given these apparent areas
of confusion and the concern about
additional burden that the inclusion of
such a rationale would place on plan
submitters, the EPA will not finalize
this proposed change to 58.10(b).
Nevertheless, we continue to believe
that an open and robust discussion
about such longer-term SPMs is an
important part of interactions between
monitoring agencies and EPA Regional
Offices, particularly in the context of
monitors utilizing EPA-approved
methods that are measuring
concentrations near the level of
applicable NAAQS. While continuing to
support the use of SPMs to provide
flexible options for investigating air
quality problems, we encourage
reference to these situations in annual
monitoring network plans and
thoughtful consideration of the pros and
cons of converting such monitors to
SLAMS, particularly to avoid potential
disruption of implementation actions
due to discontinuance of important
SPMs.
The EPA proposed a minor edit to the
annual monitoring network plan
requirements to revise terminology
referring to PM2.5 speciation monitoring.
No comments were received on this
issue and the change will be finalized as
proposed.
The EPA received comments on a
general rewording of regulatory
language that was included as part of
the revisions to 40 CFR 58.10(a).
Specifically, we revised the sentence
‘‘The plan shall include a statement of
purposes for each monitor and evidence
that siting and operation of each
monitor meets the requirements of
appendices A, C, D, and E of this part,
where applicable’’ to ‘‘The plan shall
include a purpose statement for each
monitor along with a statement of
whether the operation of each monitor
meets the requirements of appendices
A, B, C, D, and E of this part, where
applicable.’’ Additionally, the proposed
language added the following sentence:
‘‘The Regional Administrator may
require the submission of additional
information as needed to evaluate
compliance with applicable
requirements of Part 58 and its
appendices.’’
One state monitoring agency noted
that there was overlap between the
monitoring objective and the purpose of
a monitor as referenced in the regulatory
language. They suggested that the terms
be defined in the definitions section of
the rule. They also suggested removing
the purpose statement entirely as it
appears duplicative with other annual
monitoring plan requirements that are
already present. Two MJOs referenced
the statement concerning the Regional
Administrator’s discretion to require the
submission of additional information to
evaluate the compliance of the
submitted plan with part 58 and
appendices. They commented that the
proposed language was ‘‘vague and
open-ended’’ and that the presence of
this requirement would lead to
significant differences among the EPA
Regions concerning the level of detail
needed to evaluate plan submittals. It
was suggested that the EPA consider
amending the language to more clearly
define the circumstances when
additional information would be
needed.
The EPA believes that some revision
of the referenced language is
appropriate to achieve the goal of
providing monitoring agencies with a
more explicit description of the
documentation that is required in the
plans as well as providing the EPA
Regional Offices with a clear basis for
review and approval. We agree with the
comment that the requirement for a
‘‘purpose statement’’ is vaguely worded
and duplicative of existing requirements
(in 40 CFR 58.10(b)) that pertain to
factors such as monitoring objective and
spatial scale. We also note the
comments concerning the open-ended
nature of the statement that the Regional
Administrator has discretion to require
the submission of additional
information to evaluate the compliance
of the submitted plan with Part 58 and
appendices. The EPA observes that this
type of statement is not unusual in the
context of various monitoring
requirements, particularly in the
Network Design Criteria described in
appendix D. We do not anticipate
frequent requests for additional
information in the context of the Annual
Monitoring Network Plan requirements,
but we would anticipate that additional
information would be needed by
Regional Offices when the reasons
supporting compliance with the
applicable requirements of part 58 and
its appendices have changed from the
previous year’s plan, or when a monitor
has been added since the previous
year’s plan was approved.
Accordingly, the EPA is revising the
proposed language by deleting the
words ‘‘a purpose statement for each
monitor along with’’ from the second
sentence of 40 CFR 58.10(a)(1) and also
revising the sentence ‘‘The Regional
Administrator may require the
submission of additional information as
needed to evaluate compliance with
applicable requirements of Part 58 and
its appendices’’ to ‘‘The Regional
Administrator may require additional
information in support of this
statement,’’ which is a somewhat
narrower framing of the need for
Regional Administrator discretion in the
context of assuring whether the
operation of each monitor meets the
requirements of appendices A, B, C, D,
and E of this part, as described in the
submitted Annual Monitoring Network
Plan.
Finally, two public comments were
received on preamble language in the
proposal pertaining to the EPA’s
discussion about the ability of Regional
Offices to handle partial approvals of
annual monitoring network plans in
cases where one or more of the required
elements is problematic. A joint
environmental organization comment
noted that the EPA’s discussion did not
indicate a timeframe for the correction
of deficiencies and, hence, the described
partial approval process was unlawful
and arbitrary. They further suggested
that an appropriate time limit for the
correction of deficiencies would be 90
days. An MJO comment noted that a
partial approval process is not an
appropriate strategy for the longer term,
although the process as it exists now has
been found to be useful in some cases.
This commenter supported language in
the preamble discussion relating to an
approval process while noting technical
deficiencies, as long as such
deficiencies were related to required
elements of the plan.
The EPA notes that the preamble
discussion (79 FR 54360) was not tied
to any proposed revisions to
requirements or regulatory language, but
was intended as an articulation of what
we believe to be currently available
flexibility in the handling of annual
monitoring network plan submissions.
The EPA agrees that deficiencies should
be corrected and intends to work with
monitoring agencies to address
deficiencies in a timely manner.
However, the EPA does not believe that
the lack of a regulatory schedule for
correcting deficiencies is unlawful or
that it would be appropriate to establish
one without having solicited comment
on the topic in the proposal.
Accordingly, no additional action was
taken within the context of this
rulemaking.
D. Network Technical Requirements
The Network Technical Requirements
section provides a place for cross-referencing and clarifying the
applicability of the various
requirements that are described in the
appendices to part 58.
The EPA proposed to revise the
language in 40 CFR 58.11(a)(3) to note
the proposed addition of appendix B, which contains the QA requirements that would pertain to PSD monitoring sites. One supportive
comment was received on this issue and
the revision will be finalized as
proposed.
E. Operating Schedules
The operating schedule requirements
described in 40 CFR 58.12 pertain to the
minimum required frequency of
sampling for continuous analyzers (for
example, hourly averages) and manual
methods for particulate matter (PM) and
Pb sampling (typically 24-hour averages
for manual methods).
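
As a point of reference for the schedule terminology used below, a "1-in-3 day" or "1-in-6 day" schedule means samples are collected every third or sixth calendar day on a nationally coordinated sampling calendar. The short Python sketch below illustrates the arithmetic of such a schedule; the anchor date is an assumption chosen for this example and is not the EPA's published sampling calendar.

```python
from datetime import date

# Illustrative sketch: test whether a day falls on a 1-in-N day schedule.
# ANCHOR is an assumed reference date for this example only; the official
# national sampling calendar is published by the EPA.
ANCHOR = date(2016, 1, 1)

def is_sampling_day(day, every_n_days):
    """Return True if `day` is a scheduled day on a 1-in-N day schedule."""
    return (day - ANCHOR).days % every_n_days == 0

# On a 1-in-3 day schedule anchored at Jan 1, samples fall on Jan 1, 4, 7, ...
print([d for d in range(1, 10) if is_sampling_day(date(2016, 1, d), 3)])
# [1, 4, 7]
```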
The EPA proposed to revise these
requirements by (1) adding flexibility in
the minimum required sampling for
PM2.5 mass sampling and for PM2.5
speciation sampling; (2) modifying
language pertaining to continuous mass
monitoring to reflect revisions in
regulatory language that were finalized
in the 2013 PM NAAQS final rule; and
(3) clarifying the applicability of certain
criteria that can lead to an increase in
the required sampling frequency, for
example, to a daily schedule. Ten
commenters responded to these
proposed changes. Most of the
comments were generally supportive of
these changes as they provide additional
flexibility and potential burden
reductions for monitoring agencies.
Some comments noted concern with
specific changes to the period of time
that a PM2.5 sampler would have to
utilize an increased sampling frequency
if triggered by design values. Additional
details on these generally supportive
comments are discussed below in the
relevant sections. A joint environmental
organization comment opposed all the
sampling frequency changes; they noted
concern for the increased risk of not
detecting daily variations in PM2.5 by
allowing samplers to follow reduced
sampling schedules and also noted the
lack of a cost analysis documenting the
burden of monitoring as well as the fact
that the EPA was not requiring
additional monitoring to compensate for
the reduced sampling frequency.
With regard to the minimum required
sampling frequency for manual PM2.5
samplers, current requirements state
that at least a 1-in-3 day frequency is
mandated for required SLAMS monitors
without a collocated continuous
monitor. The EPA believes that some
regulatory flexibility is appropriate in
situations where a particular monitor is
highly unlikely to record a violation of
the PM2.5 NAAQS, such as in areas with
very low PM2.5 concentrations relative
to the NAAQS and/or in urban areas
with many more monitors than are
required by appendix D (when a subset
of those monitors is reading lower than
other monitors in the area). The EPA
specifically proposed that the required
sampling frequency could be reduced to
1-in-6 day sampling or another alternate
schedule through a case-by-case
approval by the EPA Regional
Administrator. Such approvals could be
based on factors that are already
described in 40 CFR 58.12(d)(1)(ii) such
as historical PM2.5 data assessments, the
attainment status of the area, the
location of design value sites, and the
presence of continuous PM2.5 monitors
at nearby locations. The EPA noted that
the request for such reductions in
sampling frequency would occur as part
of the annual monitoring network plan
process as operating schedules are a
required part of the plans as stated in 40
CFR 58.10(b)(4). For sites with a
collocated continuous monitor, the EPA
also proposed that the current
regulatory flexibility to reduce to 1-in-6 day sampling or a seasonal sampling
schedule is appropriate based on factors
described above and, in certain cases,
may also be applicable to lower-reading
SLAMS sites without a collocated
continuous monitor, for example, to
reduce frequency from 1-in-6 day
sampling to a seasonal schedule. Such
flexibility was proposed through
changes in the regulatory language in 40
CFR 58.12(d)(1)(i) and (ii).
With the one exception noted earlier,
supportive comments were received on
this specific proposed revision. One
MJO commented that flexibility is
needed in specifying operating
schedules, and that it is preferable to
retain lower reading sites with a
reduced sampling frequency rather than
close them completely. Similar
comments included ‘‘Support this
action’’ and the observation that the
proposed changes should reduce
monitoring burden. Concerning the joint
environmental organization comment
noting the potential increased risk of not
characterizing the risk from PM2.5 levels
that might be missed when sampling
frequency is reduced, the EPA notes that
these case-by-case situations would be
reviewed by EPA Regional Offices for
approval, and that the pertinent
approval criteria would include an
assessment of prevailing PM2.5
concentrations and the availability of
other manual or continuous monitors
that would provide characterization in
the general area. As stated in the
proposal, we expect these sampling
reduction requests to be made for lower
reading sites so the impact on area
design values would be negligible. We
also note that the requests would be
made through the annual monitoring
network plan process and, therefore,
would be open for public inspection
and comment prior to potential
approval by the EPA. On an overall
basis, the EPA believes that it is
important to have operational
flexibilities with regard to sampling
frequency to permit monitoring agencies
to shift resources (e.g., higher sampling
frequency samplers) to high priority
areas; this flexibility supports the ability
of the monitoring network to react to
changing air quality trends and
problems in a manner most protective of
public health. Concerning the
observation that the EPA has not
provided an analysis of relevant costs,
we note the public availability of such
financial information in information
collection request documents that are
regularly updated and submitted for
public comment according to Office of
Management and Budget regulation.5
In consideration of the comments
above, the EPA is finalizing the
revisions to add flexibility to sampling
frequency requirements for PM2.5 mass
samplers as proposed.
The EPA also proposed added
flexibility in sampling frequency for
PM2.5 CSN sites, specifically the STN
sites that are currently operated at
approximately 53 locations.6 The STN
stations are currently required to sample
on at least a 1-in-3 day frequency with
no opportunity for flexibility.
Justifications for the proposed
additional flexibility include the
conservation of resources for
reinvestment in other needs within the
CSN, rising analytical costs, and the
availability of new technologies that
provide continuous measurement of
PM2.5 species. Accordingly, the EPA
proposed that a reduction in sampling
frequency from 1-in-3 day be
permissible for manual PM2.5 samplers
at STN stations, for example, to a 1-in-6 day frequency. The approval for such
changes at STN stations, on a case-by-case basis, would be made by the EPA
Administrator as the authority for
changes to STN has been retained at the
Administrator level per appendix D of
this part, section 4.7.4.7 Factors that
would be considered as part of the
decision would include an area’s design
value, the role of the particular site in
national health studies, the correlation
of the site’s species data with nearby
sites, and presence of other leveraged
measurements.
Few commenters specifically
addressed this proposed change as the
aforementioned comments pertaining to
changes in sampling frequency for PM2.5
mass samplers were likely deemed
pertinent to the CSN. Where this
proposed change was mentioned
specifically, monitoring agency
comments noted support as a means of
increasing flexibility and potentially
protecting sites by reducing sampling
frequency versus eliminating sites
completely. The joint environmental
organization comment stated that a
reasoned justification for the change
was not provided, and noted that
speciation data are critical in
development of SIP control strategies,
health studies, modeling exercises, and
investigation of air pollution episodes.

5 See https://www.regulations.gov/#!documentDetail;D=EPA-HQ-OAR-2002-0091-0017.

6 https://www.epa.gov/ttn/amtic/specgen.html.

7 The approval process has been delegated, in practice, to the Director of the Air Quality Assessment Division within the Office of Air Quality Planning and Standards.
The EPA notes the supportive
comments from monitoring agencies
and agrees that increasing flexibility
with respect to sampling frequency as
an alternative to site elimination was a
motivation for the revision. With respect
to the environmental organization
comment noting concern about the
additional flexibility and the potential
for reduced sampling frequency, the
EPA agrees with the observation that
PM2.5 speciation data are critical to
supporting many different monitoring
objectives. Because we believe that
PM2.5 speciation data are critical for the
objectives noted above, we recently
completed an in-depth assessment of
the CSN with the goal of protecting, to
the greatest extent possible, the long-term operation of the network.8 In the
face of rising analytical costs and
unchanging budgets, the EPA
considered factors such as site
reductions, changes in sampling
frequency, and alterations in operational
procedures to support long-term
viability of the CSN. The results of the
assessment were implemented in late
2014 and early 2015, and the EPA
believes the revised CSN continues to
provide strong support for key
monitoring objectives noted by the
commenter and would do so even if
sampling frequency were selectively
reduced at a small number of STN sites
based on substantive and suitable
criteria. The EPA notes that a proposal
to reduce sampling frequency would
need to be accompanied by a technical
rationale justifying the request and
evaluating the impact on data users and
the ability of the site to meet the
aforementioned key objectives, for
example, by employing new technology
such as continuous monitoring of PM2.5
species, in lieu of the reduced number
of filter samples.

8 https://www.sdas.battelle.org/CSNAssessment/html/Default.html.
In consideration of the comments and
detailed network assessment described
above, the EPA is finalizing the
revisions to add flexibility to sampling
frequency requirements for the PM2.5
STN sites as proposed.
The EPA proposed editorial revisions
to 40 CFR 58.12(d)(1)(ii) to harmonize
the language regarding the use of
continuous FEM or approved regional
methods (ARM) monitors to support
sampling frequency flexibility for
manual PM2.5 samplers with the current
language in 40 CFR 58.12(d)(1)(iii) that
was revised as part of 2013 PM NAAQS
final rule. Specifically, the phrase
‘‘unless it is identified in the monitoring
agency’s annual monitoring network
plan as not appropriate for comparison
to the NAAQS and the EPA Regional
Administrator has approved that the
data from that monitor may be excluded
from comparison to the NAAQS’’ was
proposed for appending to the current
regulatory language to reflect the new
process that was finalized in the 2013
PM NAAQS final rule that allows
monitoring agencies to request that
continuous PM2.5 FEM data be excluded
from NAAQS comparison based on
technical criteria described in 40 CFR
58.11(e). We also proposed the addition
of the phrase ‘‘and the EPA Regional
Administrator has approved that the
data from that monitor may be excluded
from comparison to the NAAQS’’ to the
revisions that were made with the 2013
PM NAAQS. This revision was
proposed to clearly indicate that two
distinct actions are necessary for the
data from a continuous PM2.5 FEM to be
considered not comparable to the
NAAQS; first, the identification of the
relevant monitor(s) in an agency’s
annual monitoring network plan, and,
second, the approval by the EPA
Regional Administrator of that request
to exclude data. The language used by
the EPA in the relevant sections of 40
CFR 58.12 related to the initial request
by monitoring agencies but did not
specifically address the needed
approval by the EPA.
No comments specifically addressed
these editorial changes in regulatory
language and they will be finalized as
proposed.
Finally, the EPA proposed to clarify
the applicability of statements in 40
CFR 58.12(d)(1)(ii) and (iii) that
reference the relationship of sampling
frequency to site design values.
Specifically, we proposed clarifications
and revisions affecting the following
statements: (1) ‘‘Required SLAMS
stations whose measurements determine
the design value for their area and that
are within ±10 percent of the NAAQS;
and all required sites where one or more
24-hour values have exceeded the
NAAQS each year for a consecutive
period of at least 3 years are required to
maintain at least a 1-in-3 day sampling
frequency,’’ and (2) ‘‘Required SLAMS
stations whose measurements determine
the 24-hour design value for their area
and whose data are within ±5 percent of
the level of the 24-hour PM2.5 NAAQS
must have a FRM or FEM operate on a
daily schedule.’’ These revisions were
proposed to avoid confusion among
monitoring agencies and Regional
Offices concerning the applicability of
the sampling frequency adjustments
since design values are recalculated
annually and, in some situations, such
revised design values can either fall
below the comparative criteria or rise
above the criteria. To provide some
clarity to this situation as well as to
provide a framework where changes in
sampling frequency occur on a more
consistent and predictable basis, the
EPA proposed that design value-driven
sampling frequency changes be
maintained for a minimum 3-year
period once such a change is triggered.
Additionally, such changes in sampling
frequency would be required to be
implemented no later than January 1 of
the year that follows the recalculation
and certification of a triggering design
value.
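For illustration only, the decision logic described above can be sketched in a few lines of Python. The thresholds (within ±5 percent of the 24-hour PM2.5 NAAQS for daily sampling; within ±10 percent of the NAAQS for 1-in-3 day sampling) and the 3-year maintenance period come from the rule text; the function and variable names are hypothetical, the sketch omits the separate trigger for sites with 24-hour exceedances in each of 3 consecutive years, and it is not a substitute for the controlling text in 40 CFR 58.12(d).

NAAQS_24HR_PM25 = 35.0  # ug/m3, level of the 24-hour PM2.5 NAAQS

def minimum_schedule(design_value, determines_area_dv, current_schedule):
    """Illustrative check of the design value-driven sampling
    frequency triggers; see 40 CFR 58.12(d)(1)(ii) and (iii)."""
    if not determines_area_dv:
        return current_schedule
    ratio = design_value / NAAQS_24HR_PM25
    if 0.95 <= ratio <= 1.05:
        return "daily"       # within +/-5 percent: daily FRM/FEM sampling
    if 0.90 <= ratio <= 1.10:
        return "1-in-3 day"  # within +/-10 percent: at least 1-in-3 day
    return current_schedule

# Example: a design value of 34 ug/m3 (about 97 percent of the NAAQS)
# triggers daily sampling, which under the final rule is then held for
# at least 3 years and implemented no later than January 1 of the year
# after the triggering design value is certified.
print(minimum_schedule(34.0, True, "1-in-3 day"))  # -> "daily"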
A number of supportive comments
were received on this specific issue
from monitoring agencies. These
comments ranged from unqualified
support to more conditional support
based on concerns related to funding
levels and the overall burden of
analyzing more PM2.5 filters when
sampling frequency is increased. One
agency commented that the proposed
change ‘‘makes sense where the
concentrations have reached a plateau
or fluctuate back and forth from year to
year.’’ However, concern was noted
about waiting for 3 years to decrease
sampling frequency when design values
are clearly trending downward. Another
state agency generally agreed with the
proposed approach but requested
clarifying language that the same criteria
that would require an increase in
sampling frequency for a 3-year period
due to an increase in design values
would also allow a decrease in sampling
frequency for a 3-year period if the
corresponding site design value
decreased below a threshold. Other
commenters expressed concern about
the associated resource burdens noting
that their gravimetric laboratories are
already operating at full capacity and
that an increase from 1-in-3 day
sampling to daily sampling would triple
the number of filters to be weighed.
Accordingly, these commenters
requested that the EPA allow the
affected design value sampler to drop
back to a reduced sampling frequency as
soon as a design value fell out of the
specific range and not be required to
wait for the proposed 3-year period. One
commenter expressed concern that the
provision could trigger daily sampling
even if the higher values were caused by
a rare or exceptional event, and
requested that the proposed revision be
omitted. Finally, one state monitoring
agency expressed concern about the
apparent deletion of PM10 monitoring
requirements from 40 CFR 58.12, and
also offered suggested revisions to the
current requirements in 40 CFR
58.12(e).
The EPA notes the range of responses
on this issue and acknowledges that in
cases where the sampling frequency for
a PM2.5 sampler is increased, for
example from 1-in-3 day to daily
sampling, the associated burden, which
includes field support and gravimetric
lab support, would increase for a
minimum period of 3 years based on the
proposed change. After that 3-year
period of increased sampling, the
sampling frequency would be eligible to
be reduced if the triggering design value
was no longer in the specified range
(e.g., ±5 percent of the 24-hour PM2.5
NAAQS). The EPA agrees that the
treatment of sampling frequency in
situations where a sampler is no longer
in the specific triggering range after a 3-year period of increased sampling
should be analogous to the treatment of
sampling frequency in situations where
a sampler first enters into the specific
triggering range, for purposes of
providing predictability to monitoring
agencies in terms of anticipating
operational burden. In other words,
where the sampling frequency is
reduced at a sampler after a 3-year
period of increased sampling frequency
(for example, where the design value
falls out of the ±5 percent range), that
sampler should not be subject to an
increased sampling frequency
requirement for at least 3 years. With
regard to the concern that an
exceptional event could trigger the
increased burden of operating a higher
sampling frequency sampler, we believe
that this is a plausible situation that
deserves additional consideration.
Rather than trying to account for this
situation in this rule, however, we
believe it is best dealt with in the
context of the ongoing process of
developing guidance and proposed
revisions to the Exceptional Events
rule.9 Once those actions are finalized,
the EPA will work with Regional Offices
to clarify how to address this situation.
On the related concern of a ‘‘rare’’ event
triggering increased sampling frequency,
the EPA notes that the form of the PM2.5
NAAQS is intended to address such
year-to-year variations such that design
values should not be overly affected by
‘‘rare’’ occurrences of PM2.5
concentrations in any given year. With
regard to the comment indicating an
9 https://www2.epa.gov/air-quality-analysis/
treatment-data-influenced-exceptionalevents#Proposed%20EE%20Rule.
apparent deletion of the PM10 sampling
frequency requirements in 40 CFR
58.12(e), we note that such changes
were not included as part of the
proposal and those requirements
remain.
The EPA believes that this proposed
revision to sampling frequency
procedures is a necessary clarification to
the regulatory change that was finalized
in 2006, and will provide a more
predictable and statistically robust
process for making design value driven
changes in sampling frequency. Based
on the unqualified and qualified
supportive comments, we are finalizing
the regulatory language as proposed.
While we are mindful of the potential
for added burden in cases where PM2.5
samplers must move to a more frequent
sampling frequency for a longer period
of time based on this revision, we also
note that the likelihood of such
occurrences affecting monitoring
agencies is relatively small. Based on an
AQS retrieval conducted in August
2014, fewer than ten PM2.5 monitors out
of a pool of 980 FRM monitors were
required to operate on a daily sampling
frequency based on the rule provisions
of 40 CFR 58.12(d)(1)(iii).10 While this
analysis is not predictive in nature, we
believe the overall risk of increasing
burden on monitoring programs is quite
small and an acceptable consequence of
providing a more specific way of
implementing an important aspect of
the sampling frequency requirements.
Alternatively, as noted in the regulatory
text, monitoring agencies have the
option of installing a continuous PM2.5
FEM monitor to satisfy this requirement
and, thereby, avoid the consequence of
handling an increased number of filters.
F. System Modification
The System Modification section
pertains to the specific requirements
that must be followed when monitoring
agencies request changes to the SLAMS
portion of their networks.
In the 2006 monitoring amendments,
the EPA finalized a requirement in 40
CFR 58.14(a) for monitoring agencies to
‘‘develop and implement a plan and
schedule to modify the ambient air
quality network that complies with the
finding of the network assessments
required every 5 years by 58.10(e).’’
Since 2006, there has been confusion
between the EPA and monitoring
agencies as to whether a separate plan
was required to be submitted by 40 CFR
58.14(a) relative to the annual
10 Hanley, T. (2015). Assessment of PM2.5 data to
determine the number of sites that would be
potentially required to increase their sample
frequency to daily. Memorandum to the Docket,
EPA–HQ–OAR–2013–0619.
monitoring network plan, with that
separate plan devoted specifically to
discussing the results of the 5-year
network assessment. As explained in
the monitoring proposal, the EPA did
not intend for the submission of a
distinct plan devoted specifically to the
implementation of the 5-year network
assessment. Accordingly, the EPA
proposed to revise the regulatory
language in 40 CFR 58.14(a) to clearly
indicate that a separate plan is not
needed to account for the findings of the
5-year network assessment, and that the
information concerning the
implementation of the 5-year
assessment, referred to in the proposed
regulatory language as a ‘‘network
modification plan,’’ shall be submitted
as part of the annual monitoring
network plan that is due no later than
the year after the network assessment is
due.11 According to the proposed
schedule, the annual monitoring
network plans that are due in 2016,
2021, etc., would contain the
information referencing the network
assessments.
A number of comments were received
on this issue. Most of the commenters
provided the perspective that the
clarification in the regulatory text was
useful but that additional clarification
was needed to address how the phrase
‘‘implement the findings’’ was used in
the language. Five of these commenters
noted that states should only have to
address those changes in the network
assessments that are specifically
required by regulation, and that the EPA
should clarify that monitoring agencies
have the flexibility to discuss what
findings they intend to implement and
which findings they do not intend to
implement. Two commenters noted that
monitoring agencies should not have to
summarize the findings of their network
assessment in a network modification
plan that is due one year after the
assessment, but rather should have the
flexibility to address and implement
those findings that are appropriate
based on available resources and
changing priorities over some period of
time. Two commenters supported the
proposed language without additional
elaboration.
The EPA agrees with the comments
requesting additional clarification. The
intention of the proposed revision was
to clarify the process for how and when
monitoring agencies should deal with
11 Monitoring agencies, at their discretion, could
submit the network modification plan in the year
that the assessment is due if sufficient feedback had
been received. On balance, the EPA believes that
the extra year following the completion of the
network assessment would be valuable to assure a
productive outcome from the assessment process.
the results from these important
network assessments, not to imply that
all the results should be implemented or
were necessarily required. The network
assessment requirements detailed in 40
CFR 58.10(d) reference a mix of required
elements (e.g., meeting the monitoring
objectives of appendix D) as well as
useful but non-required elements such
as evaluation of new technologies and
the evaluation of the impact on data
users of site discontinuance. To the
extent that the EPA used the phrase
‘‘implements the findings of the
network assessment’’ in the proposed
regulatory language of 40 CFR 58.14(a),
the concern from monitoring agencies
about specifying which results from the
network assessment are required and
not required is understandable. The
EPA always intended that the results of
the network assessments should be used
as a flexible planning tool for informing
the next 5 years of monitoring network
operations, and the specificity being
implied by the monitoring agency
comments reflects a misreading of those
intentions.12 The EPA disagrees with
the comments suggesting that a network
modification plan is unnecessary. Such
a requirement has been a part of the
monitoring regulations since the
inception of the network assessment,
and having the network modification
plan submitted as part of the annual
monitoring network plan ensures public
involvement in a key process that
occurs on a relatively infrequent basis.
To address the concerns noted above,
the proposed regulatory language is
being revised to replace ‘‘implements’’
with ‘‘addresses,’’ as follows: ‘‘The state,
or where appropriate local, agency shall
develop a network modification plan
and schedule to modify the ambient air
quality monitoring network that
addresses the findings of the network
assessment required every 5 years by
§ 58.10(d).’’ With this revision, the EPA
is indicating that the network
modification plan should reference or
‘‘address’’ the findings of the network
assessment without the unintended
implication that some of the findings are
required network changes that must be
implemented. The correct vehicle for
the discussion of required elements that
must be implemented is the annual
monitoring network plan that is
required to be submitted each year, as
discussed earlier in section II.C of this
preamble.
The EPA also proposed to revise an
incorrect cross-reference in the current
text of 40 CFR 58.14(a) in which the
network assessment requirement is
12 See https://www.epa.gov/ttn/amtic/files/
2014conference/monnaweinstock.pdf.
noted as being contained in 40 CFR
58.10(e) when the correct cross-reference is 40 CFR 58.10(d). One
supportive comment addressed this
issue, and the revision will be finalized
as proposed.
G. Annual Air Monitoring Data
Certification
The data certification requirement is
intended to provide ambient air quality
data users with an indication that all
required validation and reporting steps
have been completed, and that the
certified data sets are now considered
final and appropriate for all uses
including the calculation of design
values and the determination of NAAQS
attainment status. Current requirements
include the certification of data
collected at all monitors at SLAMS and
monitors at SPMs using FRM, FEM, or
ARM methods. In practice, this
requirement includes a very wide range
of measurements that are not limited to
criteria pollutants but also extend to
non-criteria pollutant measurements at
PAMS stations, meteorological
measurements at PAMS and NCore
stations, and PM2.5 chemical speciation
parameters.
The EPA proposed several changes in
the data certification requirements to
streamline this
important process. First, to support the
focus on certification of criteria
pollutant measurements, the EPA
proposed to revise relevant sections of
40 CFR 58.15 to focus the requirement
on FRM, FEM, and ARM monitors at
SLAMS and at SPM stations rather than
at all SLAMS, which also include PAMS
and CSN measurements that may not
utilize federally approved methods.
Second, the EPA proposed that the
required AQS reports be submitted to
the Regional Administrator rather than
through the Regional Administrator to
the Administrator as is currently
required. Finally, minor editorial
changes were proposed in 40 CFR 58.15
to generalize the title of the official
responsible for data certification (senior
official versus senior air pollution
control officer) and to remove an
outdated reference to the former due
date for the data certification letter (July
1 versus the current due date of May 1).
Seven commenters specifically
addressed the proposed changes to data
certification. Three monitoring agencies,
one MJO, and one consulting firm were
supportive of the changes. One of these
commenters also noted that the data
certification and QA report hosted on
the AQS system, the AMP600 report,
should be modified to provide more
useful data certification flag
recommendations for regions and states.
Another of these supportive
commenters also stated that the EPA
should ensure that QA practices and
responsibilities remain in place to
validate PAMS and PM2.5 chemical
speciation data. A joint environmental
group comment stated that the EPA had
not provided a rational basis for the
proposed changes, and that an
inconsistency exists between proposing
to retain the data certification process
for criteria pollutants while stating that
existing QA plans and procedures
would be sufficient to validate non-criteria pollutant measurements. In this
commenter’s view, the data certification
process, as it exists today, appears to
delay the availability of data for use in
computing criteria pollutant design
values, so perhaps the agency should
consider eliminating the process
entirely if it is deemed unnecessary.
Finally, one commenter asked that the
EPA consider moving the data
certification deadline from May 1 back
to July 1, and also to consider not
requiring chemical speciation data to be
certified.
With regard to the adverse comment,
the EPA notes that the proposed
changes were made to protect the
viability of the process in the face of a
rapidly increasing volume of data
subject to certification requirements
relative to the available resources at the
monitoring agency and EPA level
needed to meet the requirements and
deadline. We continue to believe that
the data certification process adds the
greatest degree of value when focused
on criteria pollutants that support the
calculation of design values and the
mandatory designations process. The
review of design values occurs on an
annual basis and there is a longstanding practice of waiting for criteria
pollutant data to be certified before such
calculations are completed.13 This
process provides a basis for
documenting that a state’s review of
their data is complete and that the data
are considered final for key purposes
such as comparison to the NAAQS. The
same annual pattern of regular data
usage and oversight does not exist for
non-criteria pollutants such as PAMS,
PM2.5 chemical species, and air toxics
data, and these data are not directly
compared to the NAAQS. Therefore, the
EPA believes that the applicability and
visibility of the data certification
process for these measurements is less
critical. As stated in the proposal, there
are existing standardized procedures
and QA documents that provide a
framework for assuring the quality of
13 See 40 CFR part 50, appendix N, section 3.0(a)
as revised on January 15, 2013 (78 FR 3278).
non-criteria pollutants,14 and we believe
that the resulting quality of such data
will not be compromised by their
removal from the data certification
process. With regard to the comment
requesting that the data certification
deadline be pushed back to July 1, the
EPA notes that this deadline was not
proposed for revision and, therefore, is
not being considered in this final
rulemaking. With regard to the comment
about excluding chemical speciation
data from the certification process, the
EPA notes that this procedural change
would occur as a result of the proposed
revisions as explained above.
After reviewing the comments, the
EPA is finalizing the changes to data
certification requirements as proposed.
The EPA agrees with commenters that
efforts to improve the validation
procedures for non-criteria data should
continue and the agency has invested in
revised tools, such as the recently
launched Data Analysis and Reporting
Tool (DART) web resource that can
assist monitoring agencies with the
validation of data including PAMS and
air toxics data.15 Improvements are also
being made to the AMP600 report to
improve the utility of the program for
generating recommended certification
flags for consideration by monitoring
agencies and EPA Regional Offices
during the annual review process.
H. Data Submittal and Archiving
Requirements
The requirements described in 40 CFR
58.16 address the specific
measurements that must be reported to
AQS as well as the relevant schedule for
doing so. Required measurements
include criteria pollutants in support of
NAAQS monitoring objectives and
public reporting; specific ozone (O3) and
PM2.5 precursor measurements such as
those obtained at PAMS, NCore, and
CSN stations; selected meteorological
measurements at PAMS and NCore
stations; and associated QA data that
support the assessment of precision and
bias. In 1997, an additional set of
required supplemental measurements
was added to 40 CFR 58.16 in support
of the newly promulgated FRM for
PM2.5, described in 40 CFR part 50,
appendix L. In the 2006 monitoring
amendments, many of these
supplemental measurements were
removed from the requirements based
14 See https://www.epa.gov/ttn/amtic/
specguid.html and https://www.epa.gov/ttn/amtic/
airtoxqa.html.
15 See https://www.epa.gov/ttn/amtic/files/
2014conference/mondatdewinter.pdf or access
DART at https://www.airnowtech.org/dart/
dartwelcome.cfm (username and password
required).
on the EPA’s confidence that the PM2.5
FRM was meeting data quality
objectives (see 71 FR 2748). At that
time, reporting requirements were
retained for average daily ambient
temperature and average daily ambient
pressure, as well as any applicable
sampler flags, in addition to PM2.5 mass
and field blank mass.
The EPA believes that it is no longer
necessary to require agencies to report
the average daily temperature and
average daily pressure from manual
PM2.5 samplers, given the long-standing
experience with the FRM and the
ubiquitous availability of meteorological
data, and these specific AQS reporting
requirements were proposed for removal
in the monitoring proposal. The EPA
also proposed to remove similar
language referenced elsewhere in 40
CFR 58.16 that pertains to
measurements at Pb sites as well as to
other average temperature and average
pressure measurements recorded by
samplers or from nearby airports. For
the reasons noted above, the EPA
believes that meteorological data are
more than adequately available from a
number of sources, and that the removal
of specific requirements for such data to
be reported to AQS represents an
opportunity for burden reduction. The
EPA notes that the requirement to report
specific meteorological data for NCore
and PAMS stations remains unchanged
given the importance of having on-site
meteorological data to correlate with
PM2.5 and O3 precursor measurements.
The EPA also proposed a change to the
data reporting schedule described in 40
CFR 58.16(b) and (d) to provide
additional flexibility for reporting PM2.5
chemical speciation data measured at
CSN stations. Specifically, we proposed
that such data be required to be reported
to AQS within 6 months following the
end of each quarterly reporting period,
as is presently required for certain
PAMS measurements such as volatile
organic compounds. This change would
provide an additional 90 days for PM2.5
chemical speciation data to be reported
compared with the current requirement
of reporting 90 days after the end of
each quarterly reporting period. This
change was proposed to provide both
the EPA and monitoring agencies with
potential data reporting flexibility as
technological and procedural revisions
are considered for the national
analytical frameworks that support the
CSN network.
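As a rough illustration of the schedule change described above, the following sketch computes the two deadlines being compared for a single calendar quarter: 90 days after the end of the quarterly reporting period (the current requirement) versus 6 months after the end of the period (the revised requirement for CSN speciation data). The helper function and the example dates are assumptions for illustration only.

from datetime import date, timedelta
from calendar import monthrange

def add_months(d, months):
    """Return the same day-of-month `months` later, clamped to month end."""
    m = d.month - 1 + months
    y, mo = d.year + m // 12, m % 12 + 1
    return date(y, mo, min(d.day, monthrange(y, mo)[1]))

q_end = date(2016, 3, 31)                      # end of Q1 2016
current_deadline = q_end + timedelta(days=90)  # 90 days: June 29, 2016
revised_deadline = add_months(q_end, 6)        # 6 months: September 30, 2016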
Seven commenters specifically
addressed the proposed changes to data
submittal and archiving requirements.
One state monitoring agency, one MJO,
and one consulting firm were
supportive of all of the proposed
changes in this rule section, with the
consulting firm comment also noting
that average temperature and pressure
information should still be archived
within monitoring programs for data
validation purposes. Two state
monitoring agencies expressed concerns
about the proposed change in the
reporting deadline for PM2.5 chemical
speciation data, citing impacts on their
use of the data: one agency noted that
efforts to submit timely exceptional
event demonstrations would be
hindered by the longer reporting period,
and the other noted that its use of the
speciation data to validate PM2.5 FRM
and ion (e.g., sulfate, nitrate) data
would be affected.
With specific regard to the impact on
state submissions of exceptional event
data exclusion determinations, the EPA
understands the impact of the
additional 90-day delay in gaining
access to PM2.5 chemical speciation
data, but also notes that the relatively
long timelines that currently exist
within the exceptional events rule
framework can typically accommodate
an additional delay of 90 days without
significant impact on the submitting
agency. Accordingly, we do not believe
that the additional 90 days being
proposed for reporting PM2.5 chemical
speciation data should materially
impact the ability of submitters to
develop exceptional event data
exclusion determinations within
allowable timeframes.16 Concerning the
comment relating to the availability of
PM2.5 chemical speciation data to QA
practices for PM2.5 FRM data, the EPA
acknowledges the comparative value of
such data but believes that the existing
availability of PM2.5 sampler diagnostic
records, collocated FRM data, as well as
the potential availability of continuous
monitoring data from collocated
monitors and/or nearby sites, should be
more than sufficient to validate PM2.5
FRM data in the absence of more timely
reported speciation data.
In consideration of the comments
noted above, the EPA is finalizing the
changes to data submittal and archiving
requirements as proposed.
I. Network Design Criteria (Appendix D)
Appendix D to part 58 contains
important information about ambient
monitoring objectives, site types, spatial
scales, as well as other general and
specific minimum requirements
16 The EPA expects chemical speciation data to be
reported within 30 days of PM2.5 mass data based
on the revised analytical framework that took effect
in late 2015.
concerning network size and design
criteria.
The EPA proposed two changes that
affect the required suite of
measurements in the NCore network.
This multi-pollutant network became
operational on January 1, 2011, and
includes approximately 80 stations that
are located in both urban and rural
areas.17
The EPA proposed a minor change to
section 3 of appendix D to part 58, the
design criteria for NCore sites,
specifically, the deletion of the
requirement to measure speciated
PM10–2.5 from the list of measurements
in section 3(b). An identical revision
was finalized in the text of 40 CFR
58.16(a) in the 2013 PM NAAQS final
rule (see 78 FR 3244). During this
process, the EPA inadvertently failed to
complete a similar change that was
required in the language of section 3 of
appendix D. Accordingly, we proposed
this change to align the NCore
monitoring requirements between the
two sections noted above.
The EPA also proposed to delete the
requirement to measure Pb at urban
NCore sites, either as Pb in Total
Suspended Particles (Pb-TSP) or as Pb-PM10. This requirement was finalized as
part of the reconsideration of Pb
monitoring requirements that occurred
in 2010 (see 75 FR 81126). Since that
time, non-source oriented Pb data have
been measured at 50 urban NCore sites,
with the majority of sites having already
collected at least 2 years of data. In all
cases, valid ambient Pb readings have
been low, with maximum 3-month
rolling averages typically reading
around 0.01 micrograms per cubic meter
as compared to the NAAQS level of 0.15
micrograms per cubic meter.18 This is
an expected result given the elimination
of Pb from gasoline and the refocusing
of the ambient network to characterize
emissions at sites that have been placed
in relatively close proximity to the
remaining industrial sources with
emissions above a given threshold. We expect the vast
majority of non-source sites to have the
3 years of data necessary to calculate a
design value following the completion
of monitoring in 2015. Given the
uniformly low readings being measured
at these NCore sites, we believe it is
appropriate to consider eliminating this
requirement. As noted in the associated
docket memo, non-source oriented Pb
17 See https://www3.epa.gov/ttn/amtic/ncore.html
for more information.
18 See supporting information for reconsideration
of existing requirements to monitor for lead at
urban NCore site, Kevin Cavender, Docket number
EPA–HQ–OAR–2013–0619, https://
www.regulations.gov/#!documentDetail;D=EPA–
HQ–OAR–2013–0619–0002.
data will continue to be measured (as
Pb-PM10) at the 27 National Air Toxics
Trends Sites (NATTS) and at hundreds
of PM2.5 speciation stations that
comprise the CSN and IMPROVE
networks.
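For reference, the NAAQS comparison described above uses a maximum 3-month rolling average of Pb concentrations. A minimal sketch of that calculation follows; the NAAQS level of 0.15 micrograms per cubic meter comes from the rule text, while the monthly values are made up to reflect the roughly 0.01 microgram per cubic meter readings noted above.

PB_NAAQS = 0.15  # ug/m3, evaluated as a maximum 3-month rolling average

def max_3_month_average(monthly_means):
    """Maximum of all averages over 3 consecutive months."""
    return max(sum(monthly_means[i:i + 3]) / 3.0
               for i in range(len(monthly_means) - 2))

monthly = [0.008, 0.012, 0.010, 0.009, 0.011, 0.010]  # hypothetical ug/m3
print(max_3_month_average(monthly) < PB_NAAQS)        # True, by a wide margin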
Accordingly, the EPA proposed to
delete the requirement to monitor for
non-source oriented Pb at NCore sites
from appendix D of 40 CFR part 58.19
Given the requirement to collect a
minimum of 3 years of Pb data in order
to support the calculation of design
values, the EPA proposed that
monitoring agencies would be able to
request permission to discontinue non-source oriented monitoring following
the collection of at least 3 years of data
at each urban NCore site.20
Eight commenters specifically
addressed the proposed changes to
network design criteria. Five state or
local monitoring agencies, one MJO, and
one consulting firm were supportive of
all of the proposed changes in this
appendix, with several of the
monitoring agencies characterizing their
measurements of Pb at urban NCore
sites as either ‘‘extremely low’’ or
between 3 percent and 5 to 7 percent of
the Pb NAAQS. One joint
environmental group comment
disagreed with the proposed change to
Pb monitoring, noting the perspective
that there is no safe level of Pb and that
data even well below the level of the
NAAQS could assist communities with
finding ways of reducing Pb exposure
and that such data would also assist
researchers investigating the risks of Pb
exposure for children. This commenter
also noted that the EPA might propose
to lower the Pb NAAQS in an upcoming
rulemaking that was pending at the time
when the comment was submitted.
With regard to the adverse comment,
the EPA notes in the referenced docket
memo that well over 300 monitoring
sites for Pb would remain in operation
following the proposed termination of
monitoring at urban NCore sites. These
remaining sites would provide
characterization of Pb in TSP, PM10, and
PM2.5 in a variety of urban and rural
locations including source oriented
sites, neighborhood/community
locations, and background areas. We
also note that the EPA retains the
authority to require additional Pb
19 Specific revisions are proposed in 40 CFR part
58, appendix D, section 3(b) and sections 4.5(b) and
4.5(c).
20 The EPA will review requests for shutdown
under the provisions of 40 CFR 58.14. Although the
EPA anticipates that these non-source oriented
monitors will have design values well below the
NAAQS and will be eligible to be discontinued after
3 years of data have been collected, in the event that
a monitor records levels approaching the NAAQS,
it may not qualify to be discontinued.
monitoring as determined by Regional
Administrators per the rule language in
appendix D, section 4.5(c). With regard
to the reference to the EPA’s upcoming
decision on the Pb NAAQS, we note
that on December 19, 2014, based on a
review of the full body of evidence, the
EPA proposed to retain, without
revision, the current NAAQS of 0.15
micrograms per cubic meter (as a 3-month average in TSP) as requisite to
protect public health and welfare.21
In consideration of the supportive
comments noted above, the EPA is
finalizing the changes to network design
criteria as proposed. With specific
regard to Pb monitoring at urban NCore
sites, monitoring agencies should
request permission from the EPA
Regional Administrator to discontinue
non-source oriented monitoring
following the collection of at least 3
years of complete data at each affected
site. Monitoring agencies should work
closely with their respective EPA
Regional Offices to ensure review and
coordination of these changes to the
network and inclusion of such changes
in annual monitoring network plans.
III. Amendments to Quality Assurance
Requirements
A. Quality Assurance Requirements for
Monitors Used in Evaluations for
National Ambient Air Quality
Standards—Appendix A
1. General Information
The following changes to monitoring
requirements relate to appendix A to
part 58. Changes that affect the overall
appendix are discussed in this section
of the preamble while changes specific
to the various sections of the appendix
will be addressed in subsequent
sections of the preamble. The EPA notes
that the entire regulatory text section for
appendix A will be reprinted since this
section is being reorganized for clarity
as well as being selectively revised as
described in detail below. Additionally,
although the EPA proposed a new
appendix B to apply to PSD monitors,
much of the proposed content of
appendix B was taken directly from the
existing requirements for these monitors
set forth in appendix A. It should be
noted that a number of provisions from
appendix A were reprinted in the
regulatory text for appendix B solely for
clarity, to assist the public in
understanding the changes being
proposed. The EPA did not solicit
comment on those provisions and did
not make any changes to those
provisions in this rulemaking.
21 https://www.epa.gov/airquality/lead/
actions.html#dec2014.
The QA requirements in appendix A
have been developed for measuring the
criteria pollutants of O3, NO2, sulfur
dioxide (SO2), CO, Pb and PM (PM10
and PM2.5), and are minimum
requirements for monitoring these
ambient air pollutants for use in
NAAQS attainment demonstrations. To
emphasize the objective of this
appendix, the EPA proposed to change
the title of appendix A to ‘‘Quality
Assurance Requirements for Monitors
used in Evaluations of National
Ambient Air Quality Standards,’’ and
remove the terms SLAMS and SPMs
from the title. We do, however, in the
applicability paragraph, indicate that
any monitor identified as SLAMS must
meet the appendix A criteria in order to
avoid any confusion about SLAMS
monitors measuring criteria pollutants.
Special purpose monitors may in fact be
monitoring for a criteria pollutant for
other objectives besides making
comparisons to the NAAQS. Therefore,
appendix A clarifies in the title and the
applicability section that the QA
requirements specified in this appendix
are for criteria pollutant monitors that
are designated, through the Part 58
ambient air regulations and monitoring
organization annual monitoring network
plans, as eligible to be used for NAAQS
evaluation purposes. The applicability
section also provides a reporting
mechanism in AQS to identify any
criteria pollutant monitors that are not
used for NAAQS evaluations. The
criteria pollutant monitors identified for NAAQS
exclusion will require review and
approval by the EPA Regional Offices
and will increase transparency and
efficiencies in the NAAQS designation,
data quality evaluation and data
certification processes. There were no
adverse comments to the change in the
title and, therefore, the title will be
changed as proposed.
The previous appendix A regulation
had separate sections for automated
(continuous) and manual method types.
The EPA proposed to reformat the
document by pollutant rather than by
method type. The four gaseous
pollutants (CO, NO2, SO2 and O3) will
be contained in one section since the
quality control (QC) requirements are
very similar, and separate sections will
be provided for PM10, PM2.5, and Pb.
The EPA received one supportive
comment from a consulting firm
on the proposed reformatting and no
adverse comments. Therefore, appendix
A and appendix B will be reformatted
as proposed.
In the 2006 monitoring rule revisions,
the PSD QA requirements, which were
previously in appendix B, were added
to appendix A and appendix B was
reserved. The PSD requirements, in
most cases, mimicked appendix A in
structure but because PSD monitoring is
often operated only for a period of 1
year, some of the frequencies of
implementation of the PSD
requirements are higher than the
appendix A requirements. In addition,
the agencies governing the
implementation, assessment and
approval of the QA requirements are
different for PSD and ambient air
monitoring for NAAQS decisions (i.e.,
the EPA Regions for appendix A versus
PSD reviewing authorities for PSD). The
combined regulations have caused
confusion among monitoring
organizations and those implementing
PSD requirements, so the EPA proposed
that the PSD requirements be moved
back to a separate appendix B. This
change would also provide more
flexibility for revision if changes in
either appendix are needed.
The EPA received one supportive
comment to adopt this change and
received no adverse comments.
Therefore, PSD QA requirements will be
placed into appendix B as proposed.
Finally, the EPA proposed that
appendix A emphasize the use of PQAO
and moved the definition and
explanation to the beginning of the
regulation in order to ensure that the
application and use of PQAO in
appendix A is clearly understood. The
definition for PQAO was not proposed
for change. Since the PQAO can be a
consolidation of a number of local
monitoring organizations, the EPA
proposed to add a sentence clarifying
that the agency identified as the PQAO
(usually the state agency) will be
responsible for overseeing that the
appendix A requirements are being met
by all local agencies within the PQAO.
The current appendix A regulation requires
PQAOs to be approved by the EPA
Regions during network reviews or
audits. The EPA believes this approval
can occur at any time and proposed to
eliminate wording that suggests that
PQAO approvals can only occur during
events like network reviews or audits.
The EPA received one comment
supporting the clarifying language
suggesting it will reduce unnecessary
work on the part of the monitoring
agencies by combining and
consolidating QA/QC activities and also
fostering a unified approach to air
monitoring across an entire state’s
PQAO. The EPA received no adverse
comments. Therefore, the EPA is
finalizing the language as proposed.
2. Quality System Requirements
The EPA proposed to remove the QA
requirements for PM10-2.5 (see current
sections 3.2.6, 3.2.8, 3.3.6, 3.3.8, 4.3).
Appendix A has traditionally been used
to describe the QA requirements of the
criteria pollutants used in making
NAAQS attainment decisions. While the
part 58 Ambient Air Monitoring
regulation requires monitoring for the
CSN, PAMS, and total oxides of
nitrogen (NOy) for NCore, the QA
requirements for these networks are
found in technical assistance documents
and not in appendix A. In 2006, the EPA
proposed a PM10-2.5 NAAQS along with
requisite QA requirements in appendix
A. While the PM10-2.5 NAAQS was not
promulgated, PM10-2.5 monitoring was
required to be performed at NCore sites
and the EPA proposed requisite QA
requirements in appendix A. Some of
the PM requirements, like collocation
for precision and the performance
evaluation programs for bias, are
accomplished on a percentage of
monitoring sites within a PQAO. For
example, collocated sampling for PM2.5
and PM10 is required at approximately
15 percent of the monitoring sites
within a PQAO. Since virtually every
NCore site is the responsibility of a
different PQAO, the appendix A
requirements for PM10-2.5, if
implemented at the PQAO level, would
have been required to be implemented
at almost every NCore site, which would
have been expensive and an unintended
burden. Therefore, the EPA required the
implementation of the PM10-2.5 QC
requirements at a national level and
worked with the EPA Regions and
monitoring organizations to identify the
sites that would implement the
requirements. The implementation of
the PM10-2.5 QC requirements at NCore
sites fundamentally changed how QC is
implemented in appendix A and has
been a cause of confusion. Since
PM10-2.5 is not a NAAQS pollutant and
the QC requirements cannot be cost-effectively implemented at a PQAO
level, the EPA proposed to eliminate the
PM10-2.5 requirements including flow
rate verifications, semi-annual flow rate
audits, collocated sampling procedures,
and the PM10-2.5 Performance Evaluation
Program (PEP). Similar to the technical
assistance documents associated with the
CSN 22 and PAMS 23 networks, the EPA
will develop QA guidance for the
PM10-2.5 network which will afford more
flexibility for implementation and
revision of QC activities for PM10-2.5.
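The approximately-15-percent collocation requirement mentioned above translates into site counts in a straightforward way. A hedged sketch follows; the round-up convention and the function name are assumptions for illustration, and the actual counting and method-designation rules are set out in appendix A.

import math

def approx_collocated_sites(n_sites, fraction=0.15):
    """Approximate number of collocated PM2.5 or PM10 monitoring
    sites within a PQAO, assuming a round-up convention."""
    return max(1, math.ceil(n_sites * fraction))

print(approx_collocated_sites(20))  # -> 3 of 20 sites collocated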
The EPA received comments from a
state and a consulting firm in support of
22 See https://www.epa.gov/ttn/amtic/
specguid.html for CSN quality assurance project
plan.
23 See https://www.epa.gov/ttn/amtic/
pamsguidance.html for PAMS technical assistance
document.
the removal of these requirements and
no adverse comments. Therefore, the
EPA will remove the PM10-2.5 QA
requirements as proposed.
The EPA proposed that the QA Pb
requirements of collocated sampling
(see current section 3.3.4.3) and Pb
performance evaluation procedures (see
current section 3.3.4.4) for non-source
oriented NCore sites be eliminated. The
2010 Pb rule in 40 CFR part 58,
appendix D, section 4.5(b), added a
requirement to conduct non-source
oriented Pb monitoring at each NCore
site in a core based statistical area
(CBSA) with a population of 500,000 or
more. This requirement had some
monitoring organizations implementing
Pb monitoring at only their NCore sites.
Since the appendix A requirements are
focused on PQAOs, the QC
requirements would increase at PQAOs
that were required to implement Pb
monitoring at their NCore site. Similar
to the PM10-2.5 QA requirements, the
requirement for Pb at NCore sites forced
the EPA away from a focus on PQAOs
to working with the EPA Regions and
monitoring organizations for
implementation of the Pb-PEP at NCore
sites at national levels. Therefore, the
EPA proposed to eliminate the
collocation requirement and the Pb-PEP
requirements at NCore sites while
retaining the requirements for flow rate
verifications and flow rate audits, which
do not require additional monitors or
independent sampling and analysis.
Similar to the CSN and PAMS programs,
the EPA will develop QA guidance for
Pb monitoring in the NCore network,
which will afford more flexibility for
change/revision to accommodate Pb
monitoring at non-source oriented
NCore sites. Additionally, the EPA
proposed to delete the requirement to
measure Pb at these specific NCore sites,
either as Pb-TSP or as Pb-PM10 (see
section II.I). Such a revision would
eliminate the need for any associated
QA requirements including collocation,
Pb-PEP or any QC requirements for
these monitors.
The EPA received two state comments
and one MJO comment in support of the
removal of this requirement and no
adverse comments. Therefore, the EPA
will remove the Pb QA requirements at
non-source oriented NCore sites as
proposed. As noted earlier in section
II.I, the EPA is also finalizing the
proposed deletion of Pb monitoring
requirements at NCore sites from
appendix D.
The EPA proposed that quality
management plan (QMP) (current
section 2.1.1) and quality assurance
project plan (QAPP) (current section
2.1.2) submission and approval dates be
reported by monitoring organizations
and the EPA. This will allow for timely
and accurate reporting of this
information. From 2007 to 2011, the
EPA tracked the submission and
approval of QMPs and QAPPs by
polling the EPA Regions each year and
updating a spreadsheet that was posted
on the Ambient Monitoring Technical
Information Center (AMTIC) Web site.
The development of the annual
spreadsheet was time-consuming on the
part of monitoring organizations and the
EPA and, due to polling delays, took a
significant amount of time to assemble
a final version for posting. It is expected
that simplified reporting by monitoring
organizations and EPA to AQS will
reduce entry errors and the burden of
incorporating this information into
annual spreadsheets, and increase
transparency of this important quality
system documentation. In order to
reduce the initial burden of this data
entry activity, the EPA populated AQS
with the last set of updated QMP and
QAPP data from the 2011 listing.
Monitoring organizations will need to
update AQS only when submitting new
or revised versions of QAPP or QMPs
(one or two fields) and the EPA can then
add approval dates.
The EPA received one state comment
in support of this proposal, and two
states, a consulting firm and one MJO
expressed concern. One
state commenter mentioned that the
preamble indicates that the monitoring
organizations would be responsible for
submitting the dates associated with
QMP and QAPP submittals and
approvals and, if this was the intent of
the proposed rule, AQS must be
modified to allow monitoring
organizations the ability to enter this
data. The commenter also mentioned
that the EPA’s AQS web application
only allows monitoring organizations to
view QAPP and QMP dates, but the
functionality to enter or revise those
dates is unavailable. The commenter
mentioned other issues related to the
current functionality of the system but
not a disagreement with the proposed
requirement to report the data.
The MJO commenter mentioned that
reporting to AQS was an unnecessary
burden on state air monitoring agencies
because the EPA Regional Offices
receive these reports and the
information is available to the public on
the EPA AMTIC Web site. The
consulting firm did not understand how
shifting this burden to ‘‘monitoring
organizations’’ would relieve the
reporting burden on any organization
other than the EPA.
As mentioned in the proposal, the
approach of reporting QAPP and QMP
information to AMTIC was not only
time-consuming for monitoring
organizations but also for the EPA, which
would work for 2 to 3 months to pull
together this annual report. By reporting
the information directly to AQS, the
monitoring organization’s requirements
are also reduced since they do not need
to be polled every year to gather this
information, review it for accuracy and
completeness, and transmit it to the
EPA Regional Office. The monitoring
organizations will only need to report
updates to AQS when they occur and
will not be burdened with this request/
review process every year.
In regard to the comment on the
current functionality of AQS, which did
not allow agency reporting of the QMP/
QAPP information, the EPA notes that
AQS now allows monitoring
organizations and EPA Regional Offices
to report this information, which until
now has been entered and revised by
the EPA.
Therefore, rather than posting a static
table on AMTIC each year (which could
change throughout the time period
between updates), AMTIC can host a
link to the most up-to-date information
in AQS, which is a much more efficient
method than the cumbersome annual
collection and reporting method
described above. Therefore, the EPA is
finalizing the requirement as proposed.
The EPA proposed that if a PQAO or
monitoring organization has been
delegated authority to review and
approve their QAPP, an electronic copy
must be submitted to the EPA Regional
Office at the time it is submitted to the
PQAO/monitoring organization’s QAPP
approving authority. Submission of an
electronic version to the EPA at the time
of completion is not considered an
added burden on the monitoring
organization because such submission is
already a standard practice as part of the
review process for technical systems
audits (TSA).
The EPA did not receive any
supporting or adverse comments to this
proposal, but did receive a state
comment suggesting that a copy of all
approved QAPP’s be submitted annually
rather than at the time when a QAPP is
submitted or approved. The EPA notes
that during recent systems audits, EPA
auditors have found language in
approved QAPPs that does not meet
ambient air regulatory requirements.
Non-conformance with a regulatory
requirement can lead to data
invalidation. In an effort to identify any
non-conformance with regulatory
requirements as early as possible,
especially with monitoring
organizations that have been delegated
responsibility to approve their own
QAPPs, the EPA believes it is important
to have the opportunity to review these
documents as early as possible to
eliminate potential data invalidation
issues. Therefore, the EPA is finalizing
this language as proposed.
In the QAPP requirement language,
the EPA proposed to clarify that the
QAPP include a list of sites and
monitors associated with the QAPP.
The EPA received a state comment
that considered it a burden to update
the QAPP every time a site or monitor
is changed or is added. The commenter
suggested adding that this information
can be referenced in other publicly
available documents. Since this section
allows standard operating procedures to
be referenced in the QAPP, the EPA will
also allow the referencing of monitors
and sites.
The requirement to identify the sites/
monitors in a QAPP is a standard QAPP
requirement, which is why it is included in
the regulation. However, the QAPP can
refer to an official table that is updated
annually that may be on a Web site or
other official documentation (e.g.,
annual network plan). In addition, if the
QAPP does contain this information, an
addendum to the QAPP modifying this
information (with reference to the
QAPP) can be accomplished without
having to physically edit the document
each time a monitoring site is added
because the addition of the site does not
affect how the quality system is
implemented.
The EPA is finalizing the requirement
as proposed, but is also clarifying that
sites and monitors may be allowed to be
referenced from other up-to-date
sources.
The EPA proposed to add some
clarifying language to the section
describing the National Performance
Evaluation Program (NPEP) (current
section 2.4) explaining self-implementation of the performance
evaluation by the monitoring
organization. The clarification also adds
the definition of ‘‘independent
assessment’’ which is included in the
PM2.5-PEP, Pb-PEP and National
Performance Audit Program (NPAP)
QAPPs, and is included in the self-implementation memo sent to the
monitoring organizations on an annual
basis and posted on the AMTIC Web
site.24 The clarification codifies in
regulation what was in guidance, and
provides a better reference for this
information in addition to the annual
memo sent to the monitoring
organizations.
24 See https://www.epa.gov/ttn/amtic/
npepqa.html.
The EPA received one state comment
in support of the addition of the
independent assessment definition and
one state comment noting concern.
The state comment of concern
included a reference to the NPAP
revisions that are proposed below
(section 3.1.3) and does not appear to be
related to the actual definition that was
proposed in this section. Further, we
note that the state that made the
comment qualifies as eligible to conduct
an ‘‘independent assessment’’ under the
current definition that was proposed
and has been defined in this way in
annual self-implementation decision
memorandums that have been sent to
monitoring organizations since 2008.
This definition has not changed and was
expected to be achieved by monitoring
organizations in order to self-implement
the various performance evaluations
defined in this section. Therefore, the
EPA is finalizing the requirement as
proposed.
The EPA proposed to add clarifying
language to the TSA section (current
section 2.4). As described in more detail
below, the current TSA requirements
are clearly intended to be performed at
the monitoring organization level.
The EPA proposed a TSA frequency
of 3 years for each PQAO, but included
language that if a PQAO is made up of
a number of monitoring organizations,
all monitoring organizations within the
PQAO should be audited within 6 years.
This proposed language maintains the 3
year TSA requirement as it applies to
PQAOs but provided additional
flexibility for the EPA Regions to audit
every monitoring organization within
the PQAO every 6 years. This revision
was made to address logistical concerns
at the EPA Regions, particularly for
those Regions with very large PQAOs
composed of many monitoring
organizations. In the EPA’s view, the
proposed revision did not materially
affect the burden on monitoring
organizations.
The EPA received one state comment
supporting the proposed revision as
written, one comment by a joint
environmental organization suggesting
that we maintain the current
requirement to audit each monitoring
organization on a 3-year basis, and two
state comments that suggested that the
proposed revision was a burden to
monitoring organizations.
The comment from the joint
environmental organization expressed
concern with the potential for reduced
frequency of the TSAs for monitoring
organizations in consolidated PQAOs
(proposed 6-year frequency versus
current 3-year frequency). The
commenter believed such a change
could seriously jeopardize
implementation of the Act and threaten
public health by delaying NAAQS
decisions. The commenter cited
examples of recent invalidation of PM2.5
data that were based on findings from
TSAs. In their view, delaying audit
frequencies to once every 6 years (for a
monitoring organization) raises the risk
of even greater delay and disruption of
nonattainment designations in areas that
are violating NAAQS and have data
quality issues at the pertinent
monitoring organizations.
Two commenters from state agencies
felt that the proposed language would
treat these monitoring organizations
(within a PQAO) as individual entities,
causing an increase in the number of
TSAs and difficulty in ensuring
consistency among monitoring
organizations within the PQAO, and
would disrupt monitoring organizations
with the scheduling of these audits. The
PQAO staff would be required to
oversee the changes throughout the
monitoring organizations, participate in
each of the TSAs, track all corrective
actions, verify implementation, and
ensure consistency of implementation
across all monitoring organizations.
Commenters who were concerned
with the proposed language to audit
individual monitoring organizations
within a PQAO may have been
interpreting the current and earlier
appendix A requirements somewhat
differently than the original intent of the
EPA. Since 1996, the TSA language in
appendix A has been associated with
auditing monitoring agencies or
monitoring organizations, not PQAOs
(note—the PQAO term was promulgated
in 2006). For additional context, the
following rule excerpts provide a
chronological history of the TSA
language in appendix A.
Prior to 1998: ‘‘Agencies operating
SLAMS network stations shall be subject
to annual EPA systems audits of their
ambient air monitoring program and are
required to participate in EPA’s
National Performance Audit Program.’’
1998: ‘‘Systems audits of the ambient
air monitoring programs of agencies
operating SLAMS shall be conducted at
least every 3 years by the appropriate
EPA Regional Office.’’
2005: ‘‘Systems audits of the ambient
air monitoring programs of agencies
operating SLAMS shall be conducted at
least every 3 years by the appropriate
Regional Office.’’
2006–2014 (prior to this proposed
change): ‘‘Technical systems audits of
each ambient air monitoring
organization shall be conducted at least
every 3 years by the appropriate EPA
Regional Office and reported to the
AQS.’’
The EPA notes that the current
definition (40 CFR 58.1) for a
monitoring agency (prior to this
proposal) was defined as ‘‘a state or
local agency responsible for meeting the
requirements of this part.’’ Monitoring
organization was defined as a ‘‘state,
local, or other monitoring organization
responsible for operating a monitoring
site for which the quality assurance
regulations apply.’’ Neither definition
described any consolidation of agencies
into a PQAO; therefore, individual
monitoring agencies or organizations
were to receive a TSA by the EPA
Region annually prior to 1998 and every
3 years after 1998.
As indicated by one of the
commenters who suggested that the
proposed language would treat
monitoring organizations as individual
entities, the TSA language was, in fact,
defined to treat the monitoring agencies
as individual entities. The value of this
approach has been reaffirmed by recent
TSAs where Regional Office auditors
have found that monitoring
organizations within consolidated
PQAOs, in some cases, did not operate
consistent quality systems.
A commenter expressing concern
about the proposed revision made the
point that all monitoring organizations
covered under the umbrella of the
PQAO’s quality system would have to
make changes in their operation each
time a TSA at any of the monitoring
organizations indicates an issue with
that monitoring organization’s quality
system. This comment reflects a concern
(and a tacit acknowledgement) that
monitoring organizations within a
PQAO do not necessarily implement a
consistent quality system and need to be
audited at some frequency. The
commenter is correct and the EPA
agrees that an issue identified by a TSA
at one monitoring organization within
the PQAO should be reviewed by the
PQAO to determine if corrective action
should be instituted for all monitoring
organizations operating in the PQAO.
That is the specific concern that has
driven the EPA’s regulations to
consistently require TSAs at the
monitoring organization level. The
proposed TSA language provides for
this review of the PQAO every 3 years
and of all monitoring organizations
within the PQAO within 6 years.
A state agency commenter was also
concerned that TSAs could affect the
data certification process. The
commenter was concerned that EPA
concurrence with a PQAO’s data
certification could be prohibited due to
the lack of a TSA within the appropriate
time frame. The EPA notes that TSA
completeness requirements are reported
on certification reports but do not affect
the concurrence process itself and,
therefore, do not penalize the PQAO if
the TSA is not performed at the required
frequency.
In response to the comment from the
joint environmental organization and
based on the recent findings in the
TSAs,25 the EPA Regions are applying
more scrutiny to the PQAO
requirements to ensure that monitoring
organizations consolidated in PQAOs
develop and document consistent
quality practices. The EPA Headquarters
and Regions are working together to
develop a more consistent TSA process
based on ‘‘lessons learned’’ from the
PM2.5 TSAs findings identified in the
joint environmental organization
comment. In addition, Regions are
scrutinizing PQAO quality systems to
ensure a level of QA consistency of
monitoring organizations within a
PQAO and, where there are issues,
either taking corrective actions or
suggesting that monitoring organizations
within a PQAO disaggregate. The EPA
has also seen PQAOs developing better
documents and training for monitoring
organizations within PQAOs to improve
quality system consistency. Based on
the information presented above, the
EPA believes that the proposal to allow
monitoring organizations within a
PQAO to be audited within a 6-year
period is reasonable and is finalizing the
requirement as proposed.
In summary, the revised regulation
specifies that EPA Regional Offices
conduct TSAs of every PQAO at a 3-year
frequency and that they should also
perform a TSA on all monitoring
organizations within the PQAO within 6
years. Where resources permit, the EPA
encourages the adoption of the practice
of some PQAOs to perform their own
agency-specific TSAs and monitoring
site visits on member monitoring
agencies in the intervening years
between required EPA Regional Office
TSAs. Such visits can help to
proactively identify potential QA
deficiencies before situations involving
long-term data loss occur and can also
serve to assure uniformity in procedures
across PQAOs through periods of
changing personnel, equipment, or EPA
requirements.
The EPA proposed to require
monitoring organizations to complete an
annual survey for the Ambient Air
Protocol Gas Verification Program (AA–
PGVP) (current section 2.6.1). Since
2009, the EPA has had a separate
information collection request 26
requiring monitoring organizations to
complete an annual survey of the
producers that supply their gas
standards (for calibrations and QC) in
order to be able to select standards from
these producers for verification. The
survey generally takes less than 10
minutes to complete. The EPA proposed
to add the requirement to complete the
survey to appendix A.
The EPA received one consulting firm
comment suggesting that entry of data in
the annual survey was a modest burden
and another state comment of support
without additional comment. There
were no adverse comments on
completing the annual survey.
Therefore, the EPA is finalizing the
language as proposed.
In addition, the EPA proposed to add
language that monitoring organizations
participate, at the request of the EPA, in
the AA–PGVP by sending a gas standard
to one of the verification laboratories no
more frequently than every 5 years.
Since many monitoring organizations
already volunteer to send in cylinders,
this proposed new requirement is not
expected to materially affect most
agencies and will not affect those
agencies that do not run gaseous
ambient air monitors and, therefore, do
not use gas standards.
The EPA received three state
comments supporting and one MJO and
two state comments expressing concern
about this aspect of the AA–PGVP
requirement. The supportive responses
included one organization already
participating in the program and
another that mentioned that the
independent verification of cylinder
contents has value for monitoring
groups especially with respect to the
lower target gas concentrations now
employed in QA procedures. A third
response supported the action with no
additional comments. Comments
expressing concern about the proposal
were related to the extra cost associated
with shipping a cylinder to the
verification laboratory and the
Department of Transportation (DOT)
training required for shipping the
cylinder. One commenter mentioned
that the organizations are already
required to use traceable or certified
gases and another suggested that the
EPA could also consider working with
the standard gas vendors directly,
potentially through a federally funded
gas certification and verification
program. A commenter suggested the
requirement is resource intensive
because additional standard gases will
need to be maintained for use while the
audited cylinder is not in use.
25 McCabe, Janet G. (2014). Particle Pollution Quality Assurance. Memorandum to the Docket, EPA–HQ–OAR–2013.
26 See https://www.reginfo.gov/public/Forward?SearchTarget=PRA&textfield=ambient+air+protocol+gas.
By way of background relating to the
genesis of the AA–PGVP, the EPA notes
that the Office of Research and
Development (ORD) operated a protocol
gas audit program that was discontinued
in 1997. In the mid-2000s timeframe, the
EPA received a number of comments
from monitoring organizations that the
program was needed and the current
program (implemented in 2010) was
created based on those comments. The
monitoring organizations were
concerned that they were receiving
cylinders that were not meeting the
protocol gas specifications even though
the producers, as one commenter
mentioned, are required to use traceable
or certified gases. Information from a
2009 Office of Inspector General report
indicated some failures to meet protocol
gas requirements by some protocol gas
producers.27 Gas producers were also
sharing concerns with the EPA that
some producers were selling cylinders
that were not properly verified.
Although the EPA initially tried to
develop a program that would be
funded by the gas vendors, many of
whom agreed to fund it, one producer
lodged a protest and the EPA could not
implement the program in this manner.
In addition, the AA–PGVP is intended
to be a blind verification of the
producers, meaning it would be most
advantageous for the producer not to
know a cylinder is being sent to a
verification lab and, therefore, the EPA
tries not to request cylinders directly
from gas producers. Although one
commenter suggested that the EPA
receive cylinders directly from the
producer, this would defeat the purpose
of the blind verification and the
producers would have the opportunity
to send a cylinder that may have had
additional testing against its certified
value. The AA–PGVP has been
implemented since 2010 and the EPA is
starting to see a drop in monitoring
organization participation, yet we also
received positive comments that the
program is valuable in keeping the
producers aware of the need to maintain
the quality of their gas standards.
In response to the comment
expressing concern about the cost of
participating in the program and the
logistical difficulty of properly being
certified to ship cylinders, the EPA
clarifies that with the current program,
the EPA covers the cost of shipping the
cylinders to and from the regional AA–
PGVP verification laboratory. Online
DOT training is offered to monitoring
organizations and is valid for 3 years. So
although the monitoring organization
incurs an expense for the time spent in
training, there is limited burden related to
the rest of the program. The EPA is
aware that additional standard gases
will need to be maintained for use while
the new cylinder is being sent for
verification. Most monitoring
organizations order new cylinders prior
to expiration of older cylinders or before
they run out of gas supply. There is
normally a transition period where new
cylinders are on hand and checked
against the current cylinder before
retiring the older cylinder. The AA–
PGVP Implementation Plan 28 describes
that during this change-out process, if
the new cylinder is ordered with
enough lead time (AA–PGVP estimates
30–45 days from shipping through
verification and cylinder return), it
could be sent to the AA–PGVP
verification laboratory and verified prior
to use by monitoring organizations
before it needed to be exchanged with
an older cylinder.
Based on the comments received and
the EPA’s clarifications of the need for
the current program, the EPA will
codify the ICR requiring monitoring
organizations to report the gas standard
producers they use on an annual basis and
also finalize the proposed language
allowing the agency to request cylinders
from monitoring organizations no more
frequently than every 5 years.
3. Measurement Quality Checks for
Gases
The EPA proposed to lower the audit
concentrations (current section 3.2.1) of
the one-point QC checks to between
0.005 and 0.08 parts per million (ppm)
for SO2, NO2, and O3 (currently 0.01 to
0.1 ppm), and to between 0.5 and 5 ppm
for CO monitors (currently 1 and 10
ppm). With the development of more
sensitive monitoring instruments with
lower detection limits, technical
improvements in calibrators, and lower
ambient air concentrations in general,
the EPA felt this revision would better
reflect the precision and bias of the
ambient air data being measured at the
site. Since the QC check concentrations
are selected using the mean or median
concentration of typical ambient air
concentrations (guidance on this is
provided in the QA Handbook 29), the
EPA proposed to add some clarification
to the current language by requiring
monitoring organizations to select either
the highest or lowest concentration in
the ranges identified if their mean or
median concentrations are above or
below the prescribed range.
27 U.S. Environmental Protection Agency. ‘‘EPA Needs an Oversight Program for Protocol Gases,’’ Office of Inspector General Report No. 09–P–0235, 2009.
28 https://www.epa.gov/ttnamti1/files/ambient/qaqc/aapgvpimpplan.pdf.
29 QA Handbook for Air Pollution Measurement Vol. II Ambient Air Quality Monitoring Program at: https://www.epa.gov/ttn/amtic/qalist.html.
The majority of the comments (19 of
26 responding to the quality assurance
proposal) received on appendix A were
related to this proposed change. One
state and one consulting firm
commenter expressed support for the
change but the majority of commenters
expressed concern (16 state commenters
and one MJO). Most of the commenters
expressed similar technical concerns
that:
• The SLAMS network is in place
mainly for decisions related to the
NAAQS, so QC checks should be at the
levels approximating the NAAQS
values.
• Some of the FRMs or FEMs that are
still in use may operate acceptably at
concentrations around the NAAQS, but
the older versions of the approved
monitors are not as sensitive at lower
concentrations (i.e., mean or median
concentrations), so QC checks at these
lower levels are beyond the operational
limits of the instrumentation.
• The instrumentation necessary to
challenge the monitors at the lower
concentrations (calibrators with
additional mass flow controllers or gas
cylinders of lower concentrations)
would be required to comply and,
therefore, represent an added expense
and burden.
• The lower concentrations affect the
percent difference statistic (see the
sketch following this list), so there is
more chance that the QC check will fail
the acceptance requirements and,
therefore, invalidate data that the
monitoring organization feels is of
acceptable quality.
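The percent difference arithmetic behind this last concern is easy to see with a small numeric illustration; this is a sketch with made-up values rather than anything drawn from the rule:

```python
# A fixed absolute error looks worse in percent terms as the audit
# concentration drops. All values here are illustrative only.
def pct_diff(measured_ppm, audit_ppm):
    """Percent difference of a monitor reading against the QC audit gas."""
    return (measured_ppm - audit_ppm) / audit_ppm * 100.0

for audit in (0.10, 0.04, 0.01):
    print(audit, round(pct_diff(audit + 0.002, audit), 1))
# 0.1 -> 2.0, 0.04 -> 5.0, 0.01 -> 20.0: the same 0.002 ppm error grows
# from 2 percent to 20 percent as the check concentration falls.
```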
The EPA acknowledges these
comments and has performed some
evaluations on 2013 hourly gaseous data
that are summarized in a memo placed
in the docket.30 As summarized in the
memo, the EPA generally believes that
challenging ambient air analyzers with
a one-point QC check at the level of the
NAAQS provides an incomplete and
potentially inaccurate representation of
the precision and bias of the data
actually reported to the AQS since, in
most cases, the precision and bias
estimates are performed at levels that
are above 99 percent of the actual
SLAMS data reported to AQS.
30 Papp, M. (2015). Assessments of One-Point QC Data in Response to Comments on Revisions to the Ambient Air Quality Assurance Regulation contained in 40 CFR part 58, appendix A. Memorandum to the Docket, EPA–HQ–OAR–2013–0619.
The
EPA’s analysis of QC check data shows
that many monitoring agencies are
successfully meeting measurement
quality objectives at lower
concentrations that are closer to the
routine ambient data being reported to
AQS. We recognize that some of these
QC checks may be reported by
monitoring organizations that have
invested in the technology (i.e.,
analyzers, calibration devices and
standards at NCore sites) necessary to
adequately calibrate and estimate
precision and bias at the concentrations
measured at ambient levels. This
analysis demonstrates that the
technology is available to measure and
report precision and bias at mean/
median ambient air concentration
levels.
At the same time, the EPA is aware
that there are monitoring agencies that
have not yet invested in some of these
newer technologies and/or may not
believe that the operation of more
sensitive instrumentation and
associated calibration equipment
outside of the NCore framework is
necessary to meet their monitoring
objectives. In light of the comments
received on this issue, the EPA will
modify the proposed changes to QC
check requirements. Specifically, we are
finalizing the lower concentration
ranges as proposed: 0.005 to 0.08 ppm
for SO2, NO2, and O3, and 0.5 to 5 ppm
for CO monitors. Additionally, rather than
requiring that the selected concentration
be at the mean or median concentration
at the site or in the agency's network of
sites, the current flexibility to select the
QC check gas concentration within the
prescribed range will remain
unchanged. Specifically, monitoring
agencies should relate the concentration
of the QC check to the monitoring
objective of the site: SLAMS monitors
primarily intended for NAAQS
compliance should use concentrations at
or near the level of the NAAQS (higher
end of the required range), while trace gas
monitors operating at NCore,
background, or trends sites should use
the mean or median of the ambient air
concentrations normally measured at
those sites in order to appropriately
reflect the precision and bias at these
routine concentration ranges. The EPA
also clarifies that if the mean or median
concentrations at trace gas sites are
below the method detection limits
(MDL) of the instrument, or if
concentrations are above the prescribed
range, the agency can select the lowest
or highest concentration in the range
that can be practically achieved. In
addition, the EPA will keep language
suggesting that an additional QC check
point is encouraged for those
organizations that may have occasional
high values or would like to confirm
monitor linearity at the higher end of
the operational range. It will also
encourage monitoring organizations that
are operating NAAQS compliance sites
to include additional QC checks around
the mean or median values.
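As a concrete illustration of the selection logic described above, a monitoring agency's choice might be sketched as follows; the ranges come from the final rule, but the function, its inputs, and the clamping behavior are illustrative assumptions rather than regulatory text:

```python
# Hypothetical sketch of one-point QC concentration selection under the
# finalized ranges. Function names and inputs are illustrative only.
QC_RANGES_PPM = {
    "SO2": (0.005, 0.08),
    "NO2": (0.005, 0.08),
    "O3": (0.005, 0.08),
    "CO": (0.5, 5.0),
}

def select_qc_concentration(pollutant, target_ppm):
    """Clamp a target concentration (e.g., a site mean/median for trace
    gas sites, or a value near the NAAQS for SLAMS compliance sites) to
    the prescribed range, using the lowest or highest point in the range
    when the target falls outside it, as the rule allows."""
    low, high = QC_RANGES_PPM[pollutant]
    return min(max(target_ppm, low), high)

# An O3 trends site with a median below the range is checked at 0.005 ppm;
# a CO site targeting a level above the range is checked at 5 ppm.
print(select_qc_concentration("O3", 0.002))  # -> 0.005
print(select_qc_concentration("CO", 9.0))    # -> 5.0
```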
The EPA believes that providing
monitoring organizations some
flexibility in determining the QC check
concentration range based on site
monitoring objective and the sensitivity
of its monitors should address the
concerns that were noted in the
comments on this aspect of the
proposed requirement. However, the
EPA reiterates that our analysis of
reported data has shown that
monitoring agencies can test and
achieve acceptable precision and bias
results at these lower concentration
levels. Providing data users with
estimates of precision and bias where
the majority of our ambient air data are
measured is an EPA programmatic goal
and monitoring organizations should be
working with the EPA Regional Offices
to develop the budgets necessary for
purchasing the updated equipment and
revising related procedures. The EPA
will continue to endorse this approach
to make the QC checks more meaningful
and we will consider future revisions to
appendix A to either require QC checks
at two concentration levels (i.e., one
around the mean concentrations and
one related to the NAAQS) or require
the span check 31 to be reported to AQS.
In addition, to alleviate concerns about
failing the acceptance criteria at lower
QC concentrations, the EPA will
evaluate suggestions by monitoring
organizations to raise acceptance criteria
or look at alternative acceptance criteria
(e.g., difference instead of percent
difference). Since acceptance criteria are
included in guidance, the EPA will have
the opportunity to perform the
evaluations without affecting the
regulation. In 2011, the EPA developed
similar guidance for lower
concentration levels of the annual
performance evaluation audits.32
The EPA proposed to remove
reference to zero and span adjustments
(current section 3.2.1.1) and revise the
one-point QC language to simply require
that the QC check be conducted before
any calibration or adjustment to the
monitor. Recent revisions of the QA
Handbook discourage the
implementation of frequent span
adjustments, so the proposed language
helps to clarify that no adjustment be
made prior to implementation of the
one-point QC check.
31 A check similar to the QC check but implemented at a concentration closer to the higher end of the calibration range of the monitor.
32 https://www.epa.gov/ttnamti1/files/ambient/pm25/datamang/20110217lowlevelstatmemo.pdf.
There were no comments made on
this proposed revision so the EPA is
finalizing this revision as proposed.
The EPA proposed to remove the
requirement (current section 3.2.2) to
implement an annual performance
evaluation for one monitor in each
calendar quarter when monitoring
organizations have fewer than four
monitoring instruments. The minimum
requirement for the annual performance
evaluation for the primary monitor at a
site is one per year. The current
regulation requires evaluation of 25
percent of the monitors per quarter so
that the performance evaluations are
performed in all four quarters. There are
cases where some monitoring
organizations have fewer than four
primary monitors for a gaseous
pollutant, and the current language
suggests that a monitor already
receiving a performance evaluation be
re-audited to provide for performance
evaluations in all four quarters. This
proposed removal of the requirement for
evaluation in every quarter reduces the
burden for monitoring agencies
operating smaller networks and does not
change the requirement of an annual
performance evaluation for each
primary monitor.
The EPA received one state comment
in support of this revision and no
adverse comments. Therefore, the EPA
is finalizing this revision as proposed.
The current annual performance
evaluation language (current section
3.2.2.1) requires that the audits be
conducted by selecting three
consecutive audit levels (currently five
audit levels are provided in appendix
A). Due to the implementation of the
NCore network, the inception of trace
gas monitors, and generally lower
ambient air concentrations being
measured, there is a need for audit
levels at lower concentrations to more
accurately represent the uncertainties
present in much of the ambient data.
The EPA proposed to expand the audit
levels from five to ten and remove the
requirement to audit three consecutive
levels. The previous regulation
suggested that the three audit levels
bracket 80 percent of the ambient air
concentrations measured by the
analyzer, and monitoring organizations
have requested the use of an audit point
to establish monitor accuracy around
the NAAQS levels. Therefore, the EPA
proposed to revise the language so that
two of the audit levels selected
represent 10–80 percent of routinely
collected ambient concentrations either
measured by the monitor or in the
PQAO's network of monitors. The
proposed revision allowed the third
point to be selected at the NAAQS level
(e.g., 75 ppb for SO2) or above the
highest 3-year routine hourly
concentration, whichever was greater.
One state commenter and a consulting
firm supported this proposal while six
state commenters voiced concern. The
comments expressing concern were
similar to comments made on the one-point QC check proposal described
earlier, including:
• The SLAMS network is in place
mainly for decisions related to the
NAAQS, so QC checks should be at the
levels approximating the NAAQS
values.
• Some of the FRMs or FEMs that are
still in use may operate acceptably at
concentrations around the NAAQS, but
these older methods are not as sensitive
at lower concentrations (i.e., mean or
median concentrations), so QC checks at
these lower levels are beyond the limits
of the instrumentation.
• The instrumentation necessary to
challenge the monitors at the lower
concentrations (calibrators with
additional mass flow controllers or gas
cylinders of lower concentrations)
would be required to comply and,
therefore, represent an added expense
and burden.
• The lower concentrations affect the
percent difference statistic so there is
more chance that the QC check will fail
the acceptance requirements and,
therefore, invalidate data that the
monitoring organization feels is of
acceptable quality.
The EPA believes that there are some
distinctions between the annual
performance evaluations and the one-point QC checks, and although the
comments on the proposed revisions are
similar, a different response to the
comments is appropriate as explained
below.
Where monitoring organizations
typically utilize standards and
equipment at each site to run one-point
QC checks, the annual performance
evaluations require less equipment
since, in many cases, one set (or a few
sets) of independent equipment is
used to audit all sites in a network.
Accordingly, the EPA believes that it is
practical for monitoring agencies to
procure and utilize audit equipment,
including calibrators and gas standards
that are capable of generating the lower
concentrations that are typically
measured at most sites in the U.S.
Indeed, all monitoring agencies that
operate NCore multi-pollutant stations
should already own and be proficient in
the operation of such equipment as the
objectives of the NCore stations and the
technology used (i.e., trace level gas
monitors) are oriented to characterizing
typical ambient concentrations.
In order to make the requirements
easier to comprehend and allow for
more flexibility in audit point selection,
the EPA will revise the proposed
language to require three points to be
selected: One point around two to three
times the method detection limit of the
instruments within the PQAO network,
a second point less than or equal to the
99th percentile of the data at the site or
the network of sites within a PQAO or
the next highest audit concentration
level, and the third point around the
primary NAAQS or the highest 3-year
concentration at the site or the network
of sites in the PQAO. This framework
provides two audit points that reflect 99
percent of the monitoring data and a
third point at the highest 3-year
concentration or the level of the
NAAQS, whichever concentration the
monitoring organization chooses. Since
performance evaluation audits are only
performed once a year at each site, the
burden to perform these audits at
suitable concentrations is reduced
relative to the QC checks. Therefore, the
revised audit approach should provide
the flexibility requested by the
commenters. Also, in 2011, the EPA
adopted more flexible acceptance
criteria for the two lower concentration
audit levels (option to use difference
instead of percent difference) 33 that is
not influenced by concentration, which
should alleviate commenters’ concerns
about acceptance criteria at the lower
audit levels. Accordingly, the EPA is
finalizing the changes to performance
audit requirements as described above.
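A rough sketch of the three-point selection just described follows; the audit-level list and all names are placeholder assumptions, since appendix A defines ten discrete audit levels per pollutant that are not reproduced here:

```python
# Minimal sketch of the finalized three-point audit selection. The
# `audit_levels` list, the data, and the tie-breaking choices are
# illustrative assumptions, not values from appendix A.
import numpy as np

def select_audit_points(hourly_ppm, mdl_ppm, naaqs_ppm, audit_levels):
    """Pick one point near 2-3x the MDL, one at or below the 99th
    percentile of the data (or the next highest audit level), and one
    around the NAAQS or the highest 3-year concentration."""
    data = np.asarray(hourly_ppm)
    p99 = np.percentile(data, 99)
    point1 = min(l for l in audit_levels if l >= 2 * mdl_ppm)
    at_or_below = [l for l in audit_levels if l <= p99]
    point2 = max(at_or_below) if at_or_below else min(l for l in audit_levels if l > p99)
    point3 = max(naaqs_ppm, float(data.max()))
    return point1, point2, point3

levels = [0.0002, 0.0005, 0.001, 0.002, 0.005, 0.01, 0.02, 0.05, 0.1, 0.3]
fake_hourly = np.random.default_rng(0).uniform(0.001, 0.04, 8760)
print(select_audit_points(fake_hourly, mdl_ppm=0.0003, naaqs_ppm=0.075,
                          audit_levels=levels))
```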
The EPA proposed to revise the
language (current section 3.2.2.2(a))
addressing the limits on excess nitric
oxide (NO) that must be followed during
gas phase titration (GPT) procedures
involving NO2 audits. The previous NO
limit (maintaining at least 0.08 ppm NO)
was restrictive and required auditors to
make numerous mid-audit adjustments
during a GPT, which made
the NO2 audit a time-consuming
procedure. Accordingly, we proposed a
more general statement regarding GPT
that acknowledges the ongoing usage of
monitoring agency procedures and
guidance documents that have
successfully supported NO2 calibration
activities.
The EPA received one state comment
in support of the proposed revision to
the language on excess NO and no
adverse comments. Therefore, the EPA
is finalizing this revision as proposed.
33 https://www.epa.gov/ttnamti1/files/ambient/pm25/datamang/20110217lowlevelstatmemo.pdf.
The EPA proposed to remove
language (current section 3.2.2.2(b)) in
the annual performance evaluation
section that required Regional approval
for audit gases for any monitors
operating at ranges higher than 1.0 ppm
for O3, SO2 and NO2 and greater than 50
ppm for CO. The EPA does not need to
approve a monitoring organization’s use
of audit gases to audit above proposed
concentration levels. Since data
reported to AQS above the highest level
may be flagged or rejected, the EPA
proposed that PQAOs notify the EPA
Regional Office of sites being audited at
concentrations above level 10 so that
reporting accommodations can be made.
The EPA did not receive any
comments on this proposed change.
Therefore, the EPA is finalizing this
revision as proposed.
The EPA proposed to provide
additional explanatory language in
appendix A to describe the NPAP. The
NPAP has been a long-standing program
for the ambient air monitoring
community. Since 2007, the EPA has
distributed an annual decision
memorandum to all monitoring
organizations in order to determine
whether the monitoring organization
plans to self-implement the NPAP
program or utilize the federally
implemented program. In order to make
this decision, the NPAP adequacy and
independence requirements are
described in this annual decision
memorandum. The EPA proposed to
include these same requirements in
appendix A in a separate section for
NPAP. In addition, the annual decision
memorandum stated that 20 percent of
the sites would be audited each year so
that all sites would be audited in a 5-year period. Since there is a possibility
that monitoring organizations may want
certain higher priority sites audited
more frequently, the EPA proposed to
revise the language to require all sites to
be audited within a 6-year period to
provide more flexibility and discretion
for monitoring agencies. This revision
does not change the number of sites
audited in any given year, but allows for
increased frequency in auditing sites
deemed as high priority.
The EPA received one state comment
and one consulting firm comment
supporting this action and two state
comments expressing concern. One
commenter supported it without any
additional comment while another
made the point that the clarification
simply added the definition of an
‘‘independent assessment,’’ which has
been widely circulated and understood
by state, local and tribal monitoring
organizations for several years and is
neutral with respect to burden. One
state commenter mentioned that the
proposed additions have changed the
requirements for demonstrating
independence and adequacy that were
originally outlined in the memorandum,
‘‘National Performance Audit Program/
PM2.5 Performance Evaluation Program
Implementation Decision Memorandum
for Calendar Year 2008,’’ by
implementing training requirements,
requiring separate audit equipment, and
adding a requirement to perform a
whole system check tested against an
independent and qualified lab. The
commenter suggested that the proposed
changes impact the costs for the PQAO
to implement the NPAP.
A state commenter suggested that the
description for NPAP was ‘‘inconsistent
with what had been conveyed in the
past and is more pertinent for the
performance audit.’’ The commenter
also suggested that proposed sections
3.1.3.4(a)–(f) be removed and retained in
guidance (annual memorandum).
However, the 2008 version of the QA
Handbook, as well as the current 2013
version, provides the same definition of
a Performance Evaluation as a type of
audit in which the quantitative data
generated in a measurement system are
obtained independently and compared
with routinely obtained data to evaluate
the proficiency of an analyst or a
laboratory, and both versions include
NPAP in this definition. Another state
commenter also raised questions as to
the objective of the program and
suggested that the NPAP objective is
already being accomplished with the
annual performance evaluation.
In response to the comment on changes
in the NPAP requirements since the
2008 NPAP memo, each year the EPA
requests that monitoring organizations
decide, based on the current year’s
decision memorandum, whether to
self-implement the NPAP program or
allow for federal implementation of the
program. The decision memorandums
for the past several years have included
the proposed regulatory language that
the EPA expected self-implementing
monitoring organizations to follow.
The EPA disagrees that the NPAP
objectives have changed since the
inception of the program. Early versions
of NPAP included cylinders of
unknown concentration being sent to
monitoring organizations (mailed
audits) who would challenge the
analyzers with these standards and send
the results back to the EPA for
evaluation. This process was ‘‘blind,’’
meaning that the monitoring
organization did not know the
concentration of the standard they were
auditing. It was completely independent
of monitoring organization
implementation and also established
independence of the concentration
being audited. At the same time the
NPAP mailed audits were conducted,
monitoring organizations continued to
implement their annual performance
evaluations. So, both NPAP and the
annual performance programs have been
implemented at the same time and
NPAP, having a different objective,
allowed for a level of independent
auditing by the EPA. Due to complaints
lodged on the length of time required to
get results back from the NPAP
‘‘mailable’’ program, the EPA instituted
the current NPAP through the probe
program while continuing its primary
objective: providing independent,
quantitative evaluations of data quality.
Since the majority of monitoring
organizations allow for federal
implementation, which is reliably
independent of monitoring organization
implementation (only two monitoring
organizations in the country self-implement
NPAP), the EPA identified
the requirements necessary for
self-implementing monitoring organizations
to maintain a level of independence and
data quality consistency as close as
possible to that of federal implementation.
Therefore, while one commenter
suggested that the training requirements
be revised to ensure that auditors have
been trained in the procedures that
PQAOs actually employ to satisfy this
requirement, the EPA believes that the
training must reflect
consistency with the federal program in
order to establish consistent data
quality across the NPAP program. The
EPA provides the opportunity for
monitoring organizations to make the
self-implementation decision each year
based on the requirements in the
decision memorandum, which ensures
the NPAP program is equitably and
consistently implemented across all
monitoring organizations. Therefore, the
EPA is finalizing this revision as
proposed, but is also providing some
flexibility as requested in a state
comment by inserting the following
language into the relevant section of
appendix A:
OAQPS, in consultation with the relevant
EPA Regional Office, may approve the
PQAO’s plan to self-implement NPAP if the
OAQPS determines that the PQAO’s self-implementation plan is equivalent to the
federal programs and adequate to meet the
objectives of national consistency and data
quality.
4. Measurement Quality Checks for
Particulate Monitors
The EPA proposed to require that
flow rate verifications (current section
3.2.3) be reported to AQS. Particulate
matter concentrations (e.g., PM2.5, PM10,
Pb) are reported in mass per unit of
volume (µg/m3). Flow rate verifications
are implemented at required frequencies
in order to ensure that the PM sampler
is providing an accurate and repeatable
measure of volume that is critical for the
determination of concentration. If a
given flow rate verification does not
meet acceptance criteria, the EPA
guidance suggests that data may be
invalidated back to the most recent
acceptable verification, which is why
these checks are performed at higher
frequencies. Implementation of the flow
rate verification is currently a
requirement, but reporting to AQS has
only been a requirement for PM10
continuous instruments. This is the only
QC requirement in appendix A that was
not fully required for reporting for all
PM pollutants and has been a cause of
confusion. When performing TSAs, the
EPA Regional Offices review the flow
rate verification information. There are
cases where it is difficult to find the
flow rate verification information to
ascertain completeness, data quality,
and whether corrective actions have
been implemented in the case of flow
rate verification failures. In addition, the
EPA Regions have mentioned that some
of the monitoring organizations have
been voluntarily reporting these data to
AQS in an effort to increase
transparency and reliability in data
quality. In a recent review of 2012 data,
out of the 1,110 SLAMS PM2.5 samplers
providing flow rate audit data (which
are required to be reported), flow rate
verification data were also reported for
543 samplers, or about 49 percent of
the samplers with flow rate audit data. With
the development of a new QA
transaction in AQS, we believe that the
reporting of flow rate verification data
would improve the evaluation of data
quality for data certification and at
national levels, and provide consistent
interpretation in the regulation for all
PM pollutants without being overly
burdensome (approximately 12 data
points per sampler per year).
The EPA received one state comment
in support of this revision and no
adverse comments. Therefore, the EPA
is finalizing this revision as proposed.
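For readers unfamiliar with the statistic involved, a flow rate verification reduces to a simple percent difference of the sampler-indicated flow against a flow standard; the sketch below uses an illustrative ±4 percent acceptance limit, which is an assumption here and not a value quoted from the rule:

```python
# Illustrative flow rate verification check. The percent difference form
# is standard practice; the ±4 percent limit is an example value only.
def flow_rate_percent_diff(sampler_flow_lpm, standard_flow_lpm):
    """Percent difference of sampler-indicated flow vs. the flow standard."""
    return (sampler_flow_lpm - standard_flow_lpm) / standard_flow_lpm * 100.0

def verification_passes(sampler_flow_lpm, standard_flow_lpm, limit_pct=4.0):
    return abs(flow_rate_percent_diff(sampler_flow_lpm, standard_flow_lpm)) <= limit_pct

# A PM2.5 sampler reading 16.0 L/min against a 16.67 L/min standard
# differs by about -4 percent, just outside the example limit; a failure
# like this is what can trigger invalidation back to the last good check.
print(round(flow_rate_percent_diff(16.0, 16.67), 2))   # -> -4.02
print(verification_passes(16.0, 16.67))                # -> False
```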
In addition, the flow rate verification
requirements for all the particulate
monitors suggest randomization of the
implementation of flow rate
verifications with respect to time of day,
day of the week and routine service and
adjustments. Since this is a suggestion,
the EPA proposed to remove this
language from the regulation and
instead include it in QA guidance.
The EPA noted that one consulting
firm voiced concern about removing the
suggestion for randomizing flow rate
verifications. They stated that the
‘‘randomization of QC procedures is a
critical aspect of QA currently
unacknowledged by the EPA, and that
single point (precision) checks of
gaseous monitors and flow rate
verification checks on PM samplers are
crucial to characterizing the precision,
bias and accuracy of the data arising
from those instruments. Diurnal and
weekly rhythms exist in solar radiation,
temperature, humidity, electrical power
and traffic patterns. As standards
decrease and monitoring
instrumentation becomes more
sensitive, the likelihood increases that
interferences will occur in those
instruments. One means of detecting
such biases involves randomized QC
checks since they occur out-of-sync
with daily/weekly rhythms.’’
The EPA agrees with the technical
rationale for randomization provided by
the commenter, but also received
comments that the regulation should
provide requirements and that suggested
practices should be referenced in
guidance documents. Therefore, the
EPA is finalizing this revision as
proposed and will include the
randomization suggestion in the next
revision of the QA Handbook and in the
PM2.5 method.
The EPA proposed to add clarifying
language to the PM2.5 collocation
requirements (current section 3.2.5) that
a site can only count for the collocation
of the method designation of the
primary monitor at that site. Precision is
estimated at the PQAO level and
required at 15 percent of the primary
monitor sites for each method
designation. When developing the
collocation requirements, the EPA
intended to have the collocated
monitors distributed to as many sites as
possible in order to capture as much of
the temporal and spatial variability in
the PQAO given that only 15 percent of
the primary monitors within a method
designation are collocated. Therefore,
since there can be only one primary
monitor at a site for any given time
period, it was originally intended that
the primary monitor and the QA
collocated monitor (for the primary) at
a monitoring site count as one
collocation. This revision does not
change the current regulation and does
not increase or decrease burden, but is
intended to provide clarity on how the
PQAO identifies the number and types
of monitors needed to achieve the
collocation requirements.
The EPA received one state and one
consulting firm comment supporting
this clarification and two state
comments expressing concern.
One commenter expressing concern
did not support specifically forbidding
collocation of multiple particulate
monitors at a single site and made the
following points. As the NCore sites
were designed to provide a large suite
of monitoring, the commenter felt it was
an ideal location to deploy a range of
instruments. The commenter
mentioned, ‘‘where the array of PM10–2.5
monitors at a monitoring site include a
PM2.5 FRM as the primary monitor, the
operation of the continuous PM2.5 FEM
is advantageous for collocation across
the network. For the EPA not to allow
this collocation directly contradicts the
goal of the proposed rule by placing
additional compliance and operating
burdens on monitoring organizations
and network operators.’’ A second
commenter mentioned that the
proposed ‘‘new requirement could
result with the discontinuing a sampler
at one location and creating more
upkeep and maintenance for the
samplers at different locations.’’
The EPA notes that the proposed
language does not represent a new
requirement, is not a revision to the
current requirement, and merely
represents a needed clarification of the
current language because some
monitoring organizations were
misinterpreting the original language by
allowing one site to provide multiple
collocations. Since the original language
identified that collocation for appendix
A purposes requires the QA collocated
monitor to be compared against the
primary monitor at a site, and since
there can only be one primary monitor
at a site at any particular time, the EPA
believes that the original language and
intent were clear. Based on data
assessments of collocated data in AQS,
most monitoring organizations follow
this requirement. Since the current
requirement states that 15 percent of the
primary monitors in each method
designation must be collocated, and
there can only be one primary monitor
at a site, the current regulation (without
the clarifying language) allows only one
collocation to count for a given site.
When the EPA became aware of
potential confusion on this issue in
2010, we provided guidance to both the
EPA Regions and monitoring
community through the QA EYE
newsletter (Issue 9, page 3).34 The
article and the table, which was based
on the number of sites in a monitoring
organization, were developed to
articulate the intent of the regulation.
34 https://www.epa.gov/ttnamti1/qanews.html.
The EPA supports the use of multiple
monitors at sites like NCore, as one
commenter suggested, for testing and
evaluation purposes but not for
conforming to the appendix A original
requirements. However, as articulated in
the current appendix A regulation, a
collocated monitor can be used to
achieve collocation requirements for
more than one pollutant. For example,
collocated manual PM10–2.5 monitors
could be used to satisfy PM2.5
collocation, PM10 collocation, as well as
PM10–Pb collocation. Therefore, the
EPA is adding the clarification as
proposed to ensure that the current
requirement is not misinterpreted.
The EPA proposed to provide more
flexibility to monitoring organizations
when selecting sites for collocation.
Appendix A (current section 3.2.5.3)
had required that 80 percent of the
collocated monitors be deployed at sites
within ±20 percent of the NAAQS and
if the monitoring organization did not
have sites within that range, then 60
percent of the sites were to be deployed
among the highest 25 percent of all sites
within the network. Monitoring
organizations found this difficult to
achieve. Some monitoring organizations
did not have many sites and, at times,
due to permission, access, and limited
space issues, the requirement was not
always achievable.
Realizing that the collocated monitors
provide precision estimates for the
PQAO (since only 15 percent of the sites
for each method designation are
collocated), while also acknowledging
that sites that measure concentrations
close to the NAAQS are important, the
EPA proposed to require that 50 percent
(down from 80 percent) of the
collocated monitors be deployed at sites
within ±20 percent of the NAAQS and,
if the PQAO did not have sites within
that range, then 50 percent of the sites
are to be deployed among the highest
sites within the network. Although this
requirement does not change the
number of sites requiring collocation, it
does provide the PQAO additional
flexibility in its choice of collocated
sites.
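The counting implied by these requirements can be sketched as follows; the round-up convention and the function names are assumptions made for illustration, not language from appendix A:

```python
# Sketch of collocation counting for one method designation in a PQAO,
# assuming the 15 percent requirement rounds up to whole sites.
import math

def collocation_plan(n_primary_monitors, n_sites_near_naaqs):
    """15 percent of primary monitors collocated, with at least 50 percent
    of the collocated monitors at sites within +/-20 percent of the NAAQS
    when such sites exist."""
    n_collocated = math.ceil(0.15 * n_primary_monitors)
    near_naaqs = min(math.ceil(0.5 * n_collocated), n_sites_near_naaqs)
    return {"collocated": n_collocated, "near_naaqs": near_naaqs}

# Example: 20 primary PM2.5 monitors of one method designation -> 3
# collocated sites, at least 2 of them within +/-20 percent of the NAAQS
# if such sites are available.
print(collocation_plan(20, 5))  # -> {'collocated': 3, 'near_naaqs': 2}
```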
The EPA received three state
comments and one consulting firm
comment in general support of this
proposal and no comments expressing
concern.
As with the previous requirement, the
EPA has a cut-off value of 3 µg/m3 for
data used in evaluations of precision
and bias, meaning that only data equal
to or greater than 3 µg/m3 are used in
estimates of precision and bias. This did
not change in the proposed regulation.
Our expectation is that monitoring
organizations will site collocated
monitors in such a manner that they
will likely collect collocated samples
from sites that have values equal to or
greater than 3 µg/m3. One commenter
was concerned about ‘‘clean’’ days that
are below the 3 µg/m3 threshold since
the employment of this threshold would
affect data completeness by excluding
pairs on cleaner days. The EPA notes,
however, that completeness is not
calculated solely on data pairs with
concentrations equal to or greater than
3 µg/m3, but on all valid collocated
pairs (valid pairs below 3 µg/m3 are
expected to be reported to AQS).
Therefore, as long as the monitoring
agency collects and reports all
collocated data at the required
frequency, data completeness is not an
issue.
Another state commenter, in support
of the proposal, suggested that the
highest concentration site be selected
for the first collocation and, if a second
site is needed, then the second highest
site be selected, and so on. While this
is an alternative approach, the initial
rationale for the revision was to provide
more flexibility in site selection in cases
where some sites (for example the
highest concentration site) had access
problems or some other issue that did
not make it a good candidate for
collocation. The wording in the
proposed regulation is meant to ensure
that some of the sites selected for
collocation represent the locations with
the highest concentrations in the
respective monitoring agencies network
while providing the flexibility to choose
among those sites.
Since there was general support for
the proposal with no adverse comments,
the EPA is finalizing this revision as
proposed.
5. Calculations for Data Quality
Assessment
In order to provide reasonable
estimates of data quality, the EPA uses
data above an established threshold
concentration usually related to the
detection limits of the measurement.
Measurement pairs are selected for use
in the precision and bias calculations
only when both measurements are
greater than or equal to a threshold
concentration.
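A simplified sketch of this pair selection, and of the coefficient of variation (CV) statistic discussed below, may help; the CV shown is the basic root-mean-square form of the paired percent differences, without the chi-square-based 90 percent upper-bound adjustment that appendix A applies to the full statistic:

```python
# Simplified precision sketch: drop pairs where either value is below the
# threshold, then compute a CV from paired percent differences. This is a
# sketch of the idea, not the full appendix A formulation.
import math

def precision_cv(pairs, threshold):
    """pairs: (primary, collocated) concentrations in ug/m3."""
    used = [(x, y) for x, y in pairs if x >= threshold and y >= threshold]
    if not used:
        return None, 0  # no usable pairs, as in the Pb example below
    d = [(x - y) / ((x + y) / 2) * 100.0 for x, y in used]
    cv = math.sqrt(sum(di * di for di in d) / (2 * len(d)))
    return cv, len(used)

# Lowering the Pb threshold from 0.02 to 0.002 ug/m3 lets low-concentration
# pairs count toward the precision estimate instead of being discarded.
pairs = [(0.004, 0.005), (0.010, 0.011), (0.015, 0.014)]
print(precision_cv(pairs, 0.02))    # -> (None, 0)
print(precision_cv(pairs, 0.002))   # -> (about 10.3, 3)
```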
For many years, the threshold
concentration for Pb precision and bias
data was 0.02 µg/m3. The EPA
promulgated a new Pb FRM (78 FR
40000) utilizing the Inductively
Coupled Plasma Mass Spectrometry
(ICP–MS) analysis technique in 2013 as
a revision to appendix G of 40 CFR part
50.35 This new FRM demonstrated
MDLs 36 below 0.0002 µg/m3, which is
well below the EPA requirement of 5
percent of the current Pb NAAQS level
of 0.15 µg/m3, or 0.0075 µg/m3. As a
result of the increased sensitivity
inherent in this new FRM, the EPA
proposed to lower the acceptable Pb
concentration (current section 4) from
the current value of 0.02 µg/m3 to 0.002
µg/m3 for measurements obtained using
the new Pb FRM and other more
recently approved equivalent methods
that have the requisite increased
sensitivity.37 The current 0.02 µg/m3
value will be retained for the previous
Pb FRM that has subsequently been redesignated as FEM EQLA–0813–803, as
well as older equivalent methods that
were approved prior to the more recent
work on developing more sensitive
methods. Since ambient Pb
concentrations are lower and methods
more sensitive, lowering the threshold
concentration will allow more
collocated data to be evaluated, which
will provide more representative
estimates of precision and bias at
current ambient Pb levels.
The EPA received one state comment
and one consulting firm comment in
support of the proposal and one state
comment expressing concern.
The comment expressing concern
related to a perception that data would
be lost due to the increased possibility
that data quality objectives (DQO)
would not be met with the decreased
threshold concentration. The
commenter believed the change would
increase the likelihood that collocated
data would not meet the 20 percent
coefficient of variation (CV) limit for
precision as specified in appendix A,
section 2.3.1.3. This would in turn
decrease data completeness and, if data
loss is great enough, could potentially
render the data from an entire
monitoring location useless for NAAQS
compliance determinations.
The EPA notes that invalidation of
routine data based solely on the
variability of collocated monitoring data
is not required or recommended. The
data validation guidance in the QA
Handbook, which many monitoring
organizations use to develop validation
criteria, allows for these data to be
reviewed in the context of other QC
samples before decisions to invalidate
data are made.
35 See 78 FR 40000, July 3, 2013.
36 MDL is described as the minimum concentration of a substance that can be measured and reported with 99 percent confidence that the analyte concentration is greater than zero.
37 FEMs approved on or after March 4, 2010, have the required sensitivity to utilize the 0.002 µg/m3 reporting limit with the exception of manual equivalent method EQLA–0813–803, the previous FRM based on flame atomic absorption spectroscopy.
Since the collocated data
are only collected at approximately 15
percent of the monitoring sites, the data
set is meant to reflect the precision of
the PQAO monitoring network and not
to evaluate the validity of data from
individual sites. Site data can be used
to troubleshoot causes of variability and
to take corrective actions, but is not
intended to invalidate routine
monitoring data unless a significant
systemic issue is discovered.
Based on the comment noted above,
the EPA performed an evaluation of
collocated Pb data collected in calendar
years 2011–2013 to evaluate the amount
of collocation information available
when using the two reporting
thresholds. In that time period, 7,063
collocated measurements were taken.
Within this data set, there were 2,521
data pairs where both values were equal
to or greater than 0.02 µg/m3 (i.e., only
about 35 percent of the information
collected could be used to estimate
precision). In the most pertinent
examples, there were cases where
monitoring organizations collected valid
ambient data and no collocated data
could be used due to the current higher
threshold. For example, one monitoring
organization collected 173 collocated
measurements and no value was equal
to or greater than 0.02 µg/m3 and,
therefore, there was no estimate of
precision reported for this monitoring
organization for a 3-year period. There
were eight monitoring organizations that
could not use any collocated results for
2011–2013 and 22 monitoring
organizations (about 50 percent of the
monitoring organizations) that had less
than 25 percent of their data used. In
contrast, if the same data set is used, but
the threshold is reduced to the proposed
value of greater than or equal to 0.002
µg/m3, then 6,418 measurements are
available, which increases precision
data availability from 35 percent to 91
percent. As an example, the monitoring
organization that had no collocated
values (173 measurements) equal to or
greater than 0.02 µg/m3 had the number
of available pairs increased to 172 with
the lower 0.002 µg/m3 threshold and
had a precision estimate CV of 16.43,
which is within the 3-year DQO goal of
20 percent.
The EPA acknowledges that using a
lower threshold concentration will
increase the estimate of precision since
the required CV statistic is a derivation
of the percent difference. When EPA
evaluated the Pb data quality objectives
to determine acceptable precision and
bias for the new standard, we evaluated
all collocated data in AQS including the
lower concentration data.38 Since the
collocated data are actual samples, they
include measurement uncertainty for all
phases of the measurement system
including variability in EPA-provided
filters, sampling handling, sampler flow
differences, plumes from sources,
laboratory contamination, as well as
other types of measurement uncertainty
mentioned by one commenter. In fact,
the goal of the collocation is to provide
an estimate of overall measurement
imprecision between two sampling
systems that are, in theory, sampling the
same air. So although the commenter
identifies this as a concern, providing a
measure of the overall precision of the
measurement system is what the
collocated data are intended to evaluate.
The commenter mentioned that
changing the threshold based solely on
the estimated FRM detection limit may
not translate to other FEMs that may
have different detection limits. At a
minimum, all approved Pb methods are
required to meet the method detection
limit to be approved as equivalent.
Therefore, the 0.002 µg/m3 threshold
should be applicable to the newer
methods and is the reason for the dual
thresholds.
Based on our review and evaluations,
the EPA set the precision goal of a 90
percent confidence limit for the CV of
20 percent as mentioned by the
commenter. This CV estimate is
determined by aggregating 3 years of
collocated data. In the evaluation of the
2011–2013 data, the EPA evaluated data
down to the lower threshold with the
new methods capable of more
sensitivity. The average 3-year precision
estimate (2011–2013) for all monitoring
organizations using the approved FRM
and FEM methods and a threshold of
0.002 µg/m3 was 16.31. The average 3-year CV for a threshold of 0.02 µg/m3
was 11.09. This is an increase of
imprecision on average of 5 percent, but
a significant increase in data availability
from 35 percent to 90 percent.
The commenter also suggested that
the current threshold should remain in
effect until a limit of quantitation (LOQ)
test can be performed. Although there
are a number of definitions for LOQ,
some have defined it to be three times
(3x) to ten times (10x) the MDL. The
new Pb FRM by ICP–MS promulgated in
2013 in 40 CFR part 50, appendix G,
showed that the MDLs were below
0.0002 µg/m3. Therefore, the EPA took
the 10x definition of LOQ and
calculated 0.002 µg/m3 as the level of
the new threshold.
38 https://www.epa.gov/ttnamti1/files/ambient/pb/QAQA.pdf.
Two commenters made similar points
that, because the CV is based
on individual sample pair percent
differences, the CV tends to increase at
lower concentrations for a constant
absolute difference. The EPA
acknowledges this fact. On a related
issue, when developing the 10 audit
levels for annual performance
evaluation checks, the EPA provided
guidance on the two lower audit levels
allowing for an absolute difference
criterion as well as a percent difference criterion. Rather than eliminate close to 65 percent of the collocated data, which
is what is occurring now with the higher
threshold, the EPA is finalizing the two
thresholds as proposed and will also
evaluate the use of an absolute difference acceptance criterion at lower
concentration levels.
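The role of the threshold in these statistics can be sketched as follows. This is a minimal sketch, assuming the conventional appendix A collocated-precision statistic (per-pair percent differences aggregated into a 90 percent upper confidence limit on the CV through a chi-squared quantile); function and variable names are illustrative.

import math
from scipy.stats import chi2

def collocated_cv(pairs, threshold):
    # Keep only pairs where both the primary (x) and collocated QC (y)
    # measurements are at or above the threshold concentration.
    d = [(y - x) / ((x + y) / 2) * 100
         for x, y in pairs if x >= threshold and y >= threshold]
    n = len(d)
    # Variance of the paired percent differences.
    var = (n * sum(di * di for di in d) - sum(d) ** 2) / (2 * n * (n - 1))
    # 90 percent upper confidence limit on the coefficient of variation.
    return math.sqrt(var) * math.sqrt((n - 1) / chi2.ppf(0.1, n - 1))

Lowering the threshold from 0.02 to 0.002 µg/m3 admits many more low-concentration pairs, where a constant absolute difference produces a larger percent difference, which is consistent with the 3-year CV rising from 11.09 to 16.31 while data availability rose from 35 to 90 percent.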
The EPA proposed to remove the TSP
threshold concentration for precision
and bias since TSP is no longer a
NAAQS-required pollutant and the EPA
no longer has QC requirements for it.
The EPA received one comment in
support of this proposal and no adverse
comments and is finalizing this revision
as proposed.
The EPA proposed to remove the
statistical check currently described in
section 4.1.5 of appendix A. The check
was developed to perform a comparison
of the one-point QC checks and the
annual performance evaluation data
performed by the same PQAO on
gaseous instruments. The section
suggests that 95 percent of all the bias
estimates from the annual performance
evaluation (reported as a percent
difference) should fall within the 95
percent probability interval developed
using the one-point QC checks. The
problem with this specific statistical
check is that PQAOs with very good
repeatability on the one-point QC check
data had a hard time meeting this
requirement since the probability
interval became very tight, making it
more difficult for better performing
PQAOs to meet the requirement when
comparing the one-point QC checks and
performance evaluation data. Separate
statistics to evaluate the one-point QC
checks and the performance evaluations
are already promulgated, so the removal
of this check does not affect data quality
assessments.
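The mechanics of the removed check, and why tight QC repeatability worked against well-performing PQAOs, can be sketched as follows; this assumes the 95 percent probability interval is formed as the mean plus or minus 1.96 standard deviations of the one-point QC percent differences, and all names are illustrative.

from statistics import mean, stdev

def pe_bias_within_qc_interval(qc_pct_diffs, pe_pct_diffs):
    # Probability interval derived from the one-point QC check data.
    m, s = mean(qc_pct_diffs), stdev(qc_pct_diffs)
    lo, hi = m - 1.96 * s, m + 1.96 * s
    # The check asked that 95 percent of PE bias estimates fall inside.
    inside = sum(lo <= b <= hi for b in pe_pct_diffs)
    return inside / len(pe_pct_diffs) >= 0.95

Very repeatable QC data shrink s, and with it the interval [lo, hi], so an organization with excellent one-point QC performance could fail the comparison even when its performance evaluation results were equally good.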
The EPA received one comment in
support of this proposal and no adverse
comments and is finalizing this revision
as proposed.
Similar to the statistical comparison
of performance evaluations data, the
EPA proposed to remove the statistical
check (current section 4.2.4) to compare
the flow rate audit data and flow rate
verification data for PM monitors. The
existing language suggests that 95
percent of all the flow rate audit data
results (reported as percent difference)
should fall within the 95 percent
probability interval developed from the
flow rate verification data for the PQAO.
The problem, as with the one-point QC
check comparison requirement for
gaseous monitors, was that monitoring
organizations with very good
repeatability on the flow rate
verifications had a hard time meeting
this requirement since the probability
interval became very tight, making it
difficult for better performing PQAOs to
meet the requirement. Separate statistics
to evaluate the flow rate verifications
and flow rate audits are already
promulgated, so the removal of this
check does not affect data quality
assessments.
The EPA received one comment in
support of this proposal and no adverse
comments and is finalizing this revision
as proposed.
B. Quality Assurance Requirements for
Monitors Used in Evaluations of
Prevention of Significant Deterioration
Projects—Appendix B
The EPA proposed to create appendix
B to specify the minimum quality
assurance requirements for the control
and assessment of the quality of the
ambient air monitoring data submitted
to a PSD reviewing authority or the EPA
by an organization operating an air
monitoring station, or network of
stations, operated in order to comply
with Part 51 New Source Review—
Prevention of Significant Deterioration
(PSD). These proposed revisions to the
quality assurance requirements
applicable to PSD are, in the majority of
cases, identical to the revisions
proposed in appendix A. The majority
of comments received for this rule
focused on the appendix A
requirements and were discussed in the
previous section. Due to the similarity
of the proposed changes for appendix A
and appendix B, the EPA assumes that
comments submitted in response to
proposed appendix A revisions also
reflect the sentiment of commenters
concerning the proposed language in
appendix B. Therefore, the preamble
discussions that include responses to
comments for appendix A should, in
most cases, also apply to appendix B.
Accordingly, the EPA will not duplicate
those discussions in the following
sections pertaining to appendix B, and
we refer the reader back to the relevant
appendix A discussions in section III.A.
of the preamble, above. In the few cases
where comments were made specifically
for appendix B sections, those
comments are discussed in the
appropriate sections below.
1. General Information
The following changes to monitoring
requirements impact Part 58—Ambient
Air Quality Surveillance; Appendix B—
Quality Assurance Requirements for
Prevention of Significant Deterioration
(PSD) Air Monitoring. Changes that
affect the overall appendix are
discussed in this section of the
preamble while changes specific to the
various sections of the appendix will be
addressed in subsequent sections of the
preamble. Since the PSD QA
requirements have been included in
appendix A since 2006, section
headings refer to the current appendix
A sections.
The QA requirements in appendix B
have been developed for measuring the
criteria pollutants of O3, NO2, SO2, CO,
PM2.5, PM10 and Pb and are minimum
QA requirements for the control and
assessment of the quality of the PSD
ambient air monitoring data submitted
to the PSD reviewing authority 39 or the
EPA by an organization operating a
network of PSD stations.
39 Permitting authority and reviewing authority are often used synonymously in PSD permitting. Since reviewing authority has been defined in 40 CFR 51.166(b), it is used throughout appendix B.
In the 2006 monitoring rule revisions,
the PSD QA requirements, which were
previously in appendix B, were
consolidated with appendix A and
appendix B was reserved. The PSD
requirements, in most cases, parallel
appendix A in structure and content but
because PSD monitoring is only
required for a period of 1 year or less,
some of the frequencies of
implementation of the QC requirements
for PSD are higher than the
corresponding appendix A
requirements. In addition, the agencies
governing the implementation,
assessment and approval of the QA
requirements can be different: The PSD
reviewing authorities for PSD
monitoring and the EPA Regions for
ambient air monitoring for NAAQS
decisions. Since 2006, the combined
regulations have caused confusion or
misinterpretations of the regulations
among the public and monitoring
organizations implementing NAAQS or
PSD requirements, and have resulted in
failure, in some cases, to perform the
necessary QC requirements.
Accordingly, the EPA proposed that the
PSD QA requirements be removed from
appendix A and returned to appendix B.
Separating the two sets of QA
requirements would clearly distinguish
the PSD QA requirements and allow
more flexibility for future revisions to
either monitoring program.
With this final rule, the EPA would
not change most of the QA requirements
for PSD. Therefore, the discussion that
follows will cover those sections of the
PSD requirements that the EPA
proposed to change from the current
appendix A requirements.
Commenters supported moving the
PSD QA requirements to a distinct
section with no adverse comments
received, so the EPA is finalizing as
proposed.
The applicability section of appendix
B clarifies that the PSD QA
requirements are not assumed to be
minimum requirements for data use in
NAAQS attainment decisions. One
reason for this distinction is in the
flexibility allowed in PSD monitoring
for the NPEP (current appendix A,
section 2.4). The proposed PSD
requirements allow the PSD reviewing
authority to decide whether
implementation of the NPEP will be
performed. The NPEP, which is
described in appendix A, includes the
NPAP, the PM2.5 Performance
Evaluation Program (PM2.5–PEP), and
the Pb–PEP. Accordingly, under the
proposed revision, if a PSD reviewing
authority intended to use PSD data for
any official comparison to the NAAQS
beyond the permitting application, such
as for attainment/nonattainment
designations or clean data
determinations, then all requirements in
appendix B including implementation
of the NPEP would apply. In this case,
monitoring would more closely conform
to the appendix A requirements. The
EPA proposed this flexibility for PSD
because the NPEP requires either federal
implementation or implementation by a
qualified individual, group or
organization that is not part of the
organization directly performing and
accountable for the work being assessed.
The NPEP may require specialized
equipment, certified auditors and a
number of activities which are
enumerated in the sections associated
with these programs. Arranging this
type of support service may be more
difficult for the operator of a single or
small number of PSD monitoring
stations operating for only a year or less.
The EPA cannot accept funding from
private contractors or industry, and
federal implementation of the NPEP for
PSD would face several funding and
logistical hurdles. This creates an
inequity in the NPEP implementation
options available to the PSD monitoring
organizations compared to the state/
local/tribal monitoring organizations for
NAAQS compliance. The EPA has had
success in training and certifying
private contractors in various categories
of performance evaluations conducted
under NPEP, but many have not made
the necessary investments in capital
equipment to implement all categories
of the performance evaluations. Since
the monitoring objectives for the
collection of data for PSD are not
necessarily the same as the appendix A
monitoring objectives, the EPA
proposed to allow the PSD reviewing
authority to determine whether a PSD
monitoring project must implement the
NPEP.
The EPA only received comments in
support of this proposed change, and is
finalizing the change as proposed.
The EPA proposed to clarify the
definition of PSD PQAO. The PQAO
was first defined in appendix A in 2006
(current appendix A, section 3.1.1),
when the PSD requirements were
combined with appendix A. The
definition is not substantially changed
for PSD, but the EPA proposed to clarify
that a PSD PQAO can only be associated
with one PSD reviewing authority.
Distinguishing among the PSD PQAOs
that coordinate with a PSD reviewing
authority would be consistent with
discrete jurisdictions for PSD
permitting, and it would simplify
oversight of the QA requirements for
each PSD network.
Given that companies may apply for
PSD permits throughout the U.S., it is
expected that some PSD monitoring
organizations will work with multiple
reviewing authorities. The PSD PQAO
code that may appear in the AQS data
base and other records defines the PSD
monitoring organization or a
coordinated aggregation of such
organizations that is responsible for a
set of stations within one PSD reviewing
authority that monitors the same
pollutant and for which data quality
assessments will be pooled. The PSD
monitoring organizations that work with
multiple PSD reviewing authorities
would have individual PSD PQAO
codes for each PSD reviewing authority.
This approach will allow flexibility to
develop appropriate quality systems for
each PSD reviewing authority.
The EPA did not receive any
comment on this process and is
finalizing the requirement as proposed.
The EPA proposed to add definitions
of ‘‘PSD monitoring organization’’ and
‘‘PSD monitoring network’’ to 40 CFR
58.1. The definitions have been
developed to improve understanding of
the appendix B regulations.
Because the EPA uses the term
‘‘monitoring organization’’ frequently in
the NAAQS-associated ambient air
regulations, the EPA wanted to provide
a better definition of the term in the PSD
QA requirements. Therefore, the EPA
proposed the term ‘‘PSD monitoring
organization’’ to identify ‘‘a source
owner/operator, a government agency,
or a contractor of the source or agency
that operates an ambient air pollution
monitoring network for PSD purposes.’’
The EPA also proposed to define
‘‘PSD monitoring network’’ in order to
distinguish ‘‘a set of stations that
provide concentration information for a
specific PSD permit.’’ The EPA will
place both definitions in 40 CFR 58.1.
The EPA did not receive any comment
on these changes and is finalizing them
as proposed.
2. Quality System Requirements
The EPA proposed to remove the
PM10–2.5 requirements for flow rate
verifications, semi-annual flow rate
audits, collocated sampling procedures
and PM10–2.5 PEP from appendix B
(current appendix A, sections 3.2.6,
3.2.8, 3.3.6, 3.3.8, 4.3). In 2006, the EPA
proposed a PM10–2.5 NAAQS along with
requisite QA requirements in appendix
A. While the PM10–2.5 NAAQS was not
promulgated, PM10–2.5 monitoring was
required to be performed at NCore sites
and the EPA proposed requisite QA
requirements in appendix A. Since PSD
monitoring is distinct from monitoring
at NCore sites and PM10–2.5 is not a
criteria pollutant, it will be removed
from the PSD QA requirements. The
EPA did not receive any comment on
this proposed revision and is finalizing
the requirement as proposed.
The EPA proposed that the Pb QA
requirements of collocated sampling
(current appendix A, section 3.3.4.3)
and Pb performance evaluation
procedures (current appendix A, section
3.3.4.4) for non-source oriented NCore
sites be eliminated for PSD. The 2010 Pb
rule in 40 CFR part 58, appendix D,
section 4.5(b) added a requirement to
conduct non-source oriented Pb
monitoring at each NCore site in a CBSA
with a population of 500,000 or more.
Since PSD does not implement NCore
sites, the EPA proposed to eliminate the
Pb QA language specific to non-source
oriented NCore sites from PSD while
retaining the PSD QA requirements for
routine Pb monitoring.
The EPA received three supportive
comments for the removal of this
requirement and no adverse comments.
Therefore, the EPA is finalizing the
requirement as proposed.
The EPA proposed that elements of
QMPs and QAPPs which are separate
documents described in appendix A,
sections 2.1.1 and 2.1.2, can be
combined into a single document for
PSD monitoring networks. The QMP
provides a ‘‘blueprint’’ of a PSD
monitoring organization’s quality
system. It includes quality policies and
describes how the organization as a
whole manages and implements its
quality system regardless of what
monitoring is being performed. The
QAPP includes details for implementing
a specific PSD monitoring activity. For
PSD monitoring, the EPA believes the
project-specific QAPP takes priority, but
there are important aspects of the QMP
that could be incorporated into the
QAPP. The current appendix A
requirements allow smaller
organizations or organizations that do
infrequent work with EPA to combine
the QMP with the QAPP based on
negotiations with the funding agency
and provided guidance 40 on a graded
approach to developing these
documents. In the case of PSD QMPs
and QAPPs, the EPA proposed that the
PSD reviewing authority, which has the
approval authority for these documents,
also have the flexibility for allowing the
PSD PQAO to combine pertinent
elements of the QMP into the QAPP
rather than requiring the submission of
both QMP and QAPP documents
separately. The EPA did not receive any
comment on this and is finalizing the
requirement as proposed.
40 Graded approach to Tribal QAPP and QMPs: https://www.epa.gov/ttn/amtic/cpreldoc.html.
The EPA proposed to add language to
the appendix B version of the DQO
section (current appendix A, section
2.3.1) which allows flexibility for the
PSD reviewing authority and the PSD
monitoring organization to determine if
adherence to the DQOs specified in
appendix A, which are the DQO goals
for NAAQS decisions, are appropriate or
whether project-specific goals are
necessary. Allowing the PSD reviewing
authority and the PSD monitoring
organization flexibility to change the
DQOs does not change the
implementation requirements for the
types and frequency of the QC checks in
appendix B, but does give some
flexibility in the acceptance of data for
use in specific projects for which the
PSD data are collected. As an example,
the goal for acceptable measurement
uncertainty for the collection of O3 data
for NAAQS determinations is defined
for precision as an upper 90 percent
confidence limit for CV of 7 percent and
for bias as an upper 95 percent
confidence limit for the absolute bias of
7 percent. The precision and bias
estimates are made with 3 years of one-point QC check data. A single or a few
one-point QC checks over 7 percent
would not have a significant effect on
meeting the DQO goal. The PSD
monitoring DQO, depending on the
objectives of the PSD monitoring
network, may require a stricter DQO
goal or one less restrictive. Since PSD
monitoring covers a period of 1 year or
less, one-point QC checks over 7 percent
will increase the likelihood of failing to
meet the DQO goal since there would be
fewer QC checks available in the
monitoring period to estimate precision
and bias. With fewer checks, any
individual check will statistically have
more influence over the precision or
bias estimate. Realizing that PSD
monitoring may have different
monitoring objectives, the EPA
proposed to add language that would
allow decisions on DQOs to be
determined through consultation
between the appropriate PSD reviewing
authority and PSD monitoring
organization. The EPA did not receive
any comment on this and is finalizing
the requirement as proposed.
The EPA proposed to add some
clarifying language to the section
describing the NPEP (current appendix
A, section 2.4) to explain self-implementation of the performance
evaluation by the PSD monitoring
organization. Self-implementation of
NPEP has always been an option for
monitoring organizations but the
requirements for self-implementation
were described in the technical
implementation documents (i.e.,
implementation plans and QAPPs) for
the program and in an annual self-implementation decision memo that is
distributed to monitoring
organizations.41 These major
requirements for self-implementation
are proposed to be included in the
appendix B sections pertaining to the
NPEP program (NPAP, PM2.5–PEP and
Pb–PEP).
41 https://www.epa.gov/ttn/amtic/npepqa.html.
The NPEP clarification also adds a
definition of ‘‘independent assessment.’’
The proposed definition is derived from
the NPEP (NPAP, PM2.5–PEP, and Pb–
PEP) QAPPs and guidance; it also
appears in the annual self-implementation memo described above.
The clarification is not a new
requirement but consolidates this
information.
Refer to comments related to NPEP in
appendix A in III.A. As there were no
comments specifically related to PSD,
the EPA is finalizing the requirement as
proposed.
The EPA proposed to require PSD
PQAOs to provide information to the
PSD reviewing authority on the vendors
of gas standards that they use (or will
use) for the duration of the PSD
monitoring project. A QAPP or
monitoring plan may incorporate this
information. However, that document
must then be updated if there is a
change in the vendor used. The current
regulation (current appendix A, section
2.6.1) requires any gas vendor
advertising and distributing ‘‘EPA
Protocol Gas’’ to participate in the AA–
PGVP. The EPA posts a list of these
vendors on the AMTIC Web site.42 This
is not expected to be a burden since
information of this type is normally
included in a QAPP or standard
operating procedure for a monitoring
activity.
42 https://www.epa.gov/ttn/amtic/aapgvp.html.
There were no adverse comments in
appendix A or appendix B related to
identifying vendors used to supply
monitoring organization with gas
standards. Therefore, the EPA is
finalizing the requirement as proposed.
3. Measurement Quality Checks for
Gases
The EPA proposed to lower the audit
concentrations (current appendix A,
section 3.2.1) of the one-point QC
checks to between 0.005 and 0.08 ppm for SO2,
NO2, and O3 (currently 0.01 to 0.1 ppm),
and to between 0.5 and 5 ppm for CO
monitors (currently 1 and 10 ppm).
With the development of more sensitive
monitoring instruments with lower
detection limits, technical
improvements in calibrators, and lower
ambient air concentrations in general,
the EPA believes this revision will
better reflect the precision and bias of
the routinely-collected ambient air data.
Because the audit concentrations are
selected using the mean or median
concentration of typical ambient air data
(guidance on this is provided in the QA
Handbook 43), the EPA proposed to add
some clarification to the current
language by requiring PSD monitoring
organizations to select either the highest
or lowest concentration in the ranges
identified if the mean or median values
of the routinely-collected concentrations
are above or below the prescribed range.
43 QA Handbook for Air Pollution Measurement Vol. II Ambient Air Quality Monitoring Program at: https://www.epa.gov/ttn/amtic/qalist.html.
The EPA received a number of
comments on this proposed
requirement. Please refer to the
appendix A comments in III.A. In light
of the comments received, the EPA will
maintain the concentration ranges as
proposed: 0.005 to 0.08 ppm for SO2,
NO2, and O3, and between the
prescribed range of 0.5 and 5 ppm for
CO monitors. However, rather than
requiring that the range selected be at
the mean or median concentration range
at the site or the agency's network of
sites, the QC check gas concentration
selected within the prescribed range can
be related to the monitoring objective of
the site, with those monitors primarily
intended for NAAQS compliance
utilizing concentrations at or near the
level of the NAAQS (higher end of the
required range), and trace gas monitors
operating at background or trends sites
related to the mean or median of the
ambient air concentrations normally
measured at those sites in order to
appropriately reflect the precision and
bias at these routine concentration
ranges. If the mean or median
concentrations at trace gas sites are
below the MDL of the instrument or
above the prescribed range, the agency
can select the lowest or highest
concentration in the range that can be
practically achieved. In the case of PSD
monitoring, the EPA will add language
requiring the PSD monitoring
organization to consult with the PSD
reviewing authority on the most
appropriate one-point QC concentration
based on the objectives of the
monitoring activity. In addition, the
EPA will keep language suggesting that
an additional QC check point is
encouraged for those organizations that
may have occasional high values or
would like to confirm the monitors’
linearity at the higher end of the
operational range.
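A minimal sketch of this selection logic follows; the range bounds and the clamp-to-range behavior come from the discussion above, while the function and parameter names are illustrative.

def one_point_qc_conc(objective, naaqs_ppm=None, typical_ppm=None,
                      lo=0.005, hi=0.08):
    # lo/hi give the prescribed range for SO2, NO2 and O3 (0.5 to 5 ppm
    # would apply for CO).
    if objective == "naaqs_compliance":
        target = naaqs_ppm    # at or near the level of the NAAQS
    else:                     # background or trends (trace gas) sites
        target = typical_ppm  # mean or median of routine concentrations
    # If the target falls outside the prescribed range (for example, below
    # the instrument MDL), select the nearest achievable bound.
    return min(max(target, lo), hi)

For example, a trends site whose routine mean is 0.002 ppm would audit at 0.005 ppm, the bottom of the prescribed range.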
In addition, to alleviate concerns
about failing the acceptance criteria at
lower QC concentrations, the EPA will
evaluate suggestions by monitoring
organizations to raise acceptance criteria
or look at alternative acceptance criteria
(e.g., difference instead of percent
difference). Since acceptance criteria are included in guidance, the EPA will have the opportunity to perform the evaluations without affecting the regulation.
The EPA proposed to remove the
existing reference to zero and span
adjustments (current appendix A,
section 3.2.1.1) and to revise the one-point QC language to simply require
that the QC check be conducted before
making any calibration or adjustment to
the monitor. Recent revisions of the QA
Handbook discourage the practice of
making frequent span adjustments, so
the proposed language helps to clarify
that no adjustment be made prior to
implementation of the one-point QC
check. There were no comments made
on this proposed revision, so the EPA is
finalizing this revision as proposed.
The current annual performance
evaluation language (current appendix
A, section 3.2.2.1) requires that the
audits be conducted by selecting three
consecutive audit levels (currently,
appendix A recognizes five audit
levels). Due to the implementation of
the NCore network, the inception of
trace gas monitors, and lower ambient
air concentrations being measured
under typical circumstances, there is a
need for audit levels at lower
concentrations to more accurately
represent the uncertainties present in
the ambient air data. The EPA proposed
to expand the audit levels from five to
ten and remove the requirement to audit
three consecutive levels. The current
regulation also requires that the three
audit levels should bracket 80 percent of
the ambient air concentrations
measured by the analyzer. This current
‘‘bracketing language’’ has caused some
confusion, and monitoring organizations
have requested the use of an audit point
to establish monitor accuracy around
the NAAQS levels. Therefore, the EPA
proposed to revise the language so that
two of the audit levels selected
represent 10 to 80 percent of routinely-collected ambient concentrations either
measured by the monitor or in the PSD
PQAO's network of monitors. The
proposed revision allows the third point
to be selected at a concentration that is
consistent with PSD-specific DQOs (e.g.,
the 75 ppb NAAQS level for SO2).
The EPA received a number of
comments on this proposal. Please refer
to the appendix A comments in III.A.
In addition to comments related to
appendix A, the EPA received
comments specific to PSD on this
section. A commenter mentioned that
for PSD, the performance evaluation
(PE) is performed quarterly since PSD
monitoring may occur for only 1 year.
The current language required the audit
to occur each calendar quarter and since
PSD monitoring does not necessarily
follow calendar quarters, it was
suggested to revise the term ‘‘calendar
quarter’’ to ‘‘quarterly.’’ The EPA will
revise the PSD language to reflect
implementing the quarterly PE on a
quarter or 90-day frequency. A
commenter felt that the requirement that PE personnel meet PE training and certification requirements was in error because the
requirement for certification applies
only to NPEP audits, not to quarterly
performance evaluation audits, and
there is no further regulatory discussion
to support such an assertion. Because
the EPA has provided more flexibility
on implementing NPEP at PSD sites, we
believed there needed to be an
additional requirement that the
personnel implementing these audits be
trained and certified. However, as the
commenter mentioned, there is no
additional instruction on this, nor is
there any mention of the organization
required to do this training and
certification. It is expected that any
entity performing this activity would be
trained and capable of performing these
audits. Therefore, the EPA will remove
the last sentence requiring training and
certification.
The EPA received a comment that
suggested the PE language was not
consistent with an earlier section (2.7)
that only required the use of reference
and equivalent method monitors as
opposed to trace gas analyzers
regardless of the concentrations
measured. The commenter’s contention
was that based upon the proposed
language related to the selection of PE
concentration, the PSD monitoring
agency would be required to acquire
trace gas instruments due to their
sensitivity and the fact that their
ambient air concentrations were low.
They used examples of annual mean
NO2 values around 1.9 ppb and SO2
concentrations of 1.0 ppb. However, the
proposed PE language is consistent with
the reference and equivalent language
described in section 2.7 since trace gas
analyzers are in fact reference and
equivalent instruments and, therefore,
are included in that description.
Regardless of the proposed PE
concentration range, it would seem that
PSD monitoring organizations that are
required to monitor at the low
concentration ranges would want to
select FRM or FEM instruments more
capable of reliably measuring these
concentrations.
Based on the comments received
related to appendices A and B, the EPA
will revise the proposed language to
require three points to be selected: One
point around two to three times the
method detection limit of the
instruments within the PQAO network,
a second point less than the 99th
percentile of the data at the site or the
network of sites within a PQAO or the
next highest audit concentration level,
and the third point around the primary
NAAQS or the highest 3-year
concentration at the site or the network
of sites in the PQAO. This provides two
audit points that reflect 99 percent of
the monitoring data and a third point at
the highest 3-year concentration or the
NAAQS, whichever concentration the
PSD monitoring organization chooses.
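A minimal sketch of this three-point selection, assuming the site or network data are available as a 3-year concentration series; the mapping of each target onto the nearest of the ten audit levels is omitted, and all names are illustrative.

import statistics

def pe_audit_targets(three_year_data, mdl, naaqs, use_naaqs=True):
    # First point: around two to three times the method detection limit.
    point1 = 2.5 * mdl
    # Second point: at or below the 99th percentile of the data.
    point2 = statistics.quantiles(three_year_data, n=100)[98]
    # Third point: the primary NAAQS or the highest 3-year concentration,
    # whichever the PSD monitoring organization chooses.
    point3 = naaqs if use_naaqs else max(three_year_data)
    return point1, point2, point3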
The EPA proposed to revise the
language (current appendix A, section
3.2.2.2(a)) addressing the limits on
excess NO that must be followed during
GPT procedures involving NO2 audits.
The current NO limit (maintaining at
least 0.08 ppm) is very restrictive and
requires auditors to make numerous
mid-audit adjustments during a GPT
that result in making the NO2 audit a
very time-consuming procedure.
Monitoring agency staff have advised us
that the observance of such excess NO
limits has no apparent effect on NO2
calibrations being conducted with
modern-day GPT-capable calibration
equipment and, therefore, the requirements in the context of performing audits are unnecessary.44 We
also note the increasing availability of
the EPA-approved direct NO2 methods
that do not utilize converters, rendering
the use of GPT techniques that require
the output of NO and NOX to be a
potentially diminishingly used
procedure in the future. Accordingly,
we have proposed a more general
statement regarding GPT that
acknowledges the ongoing usage of
monitoring agency procedures and
guidance documents that have
successfully supported NO2 calibration
activities. The EPA believes that if such
procedures have been successfully used
during calibrations when instrument
adjustments are potentially being made,
then such procedures are appropriate
for audit use when instruments are not
subject to adjustment.
44 See supporting information in Excess NO Issue paper, Mike Papp and Lewis Weinstock, Docket number EPA–HQ–OAR–2013–0619.
The EPA received only supportive
comments endorsing the proposed
revision to the language on excess NO.
Therefore, the EPA is finalizing this
revision as proposed.
The EPA proposed to remove
language (current appendix A, section
3.2.2.2(b)) in the annual performance
evaluation section that requires
Regional approval for audit gases for
any monitors operating at ranges higher
than 1.0 ppm for O3, SO2 and NO2 and
greater than 50 ppm for CO. The EPA
does not need to approve a monitoring
organization’s use of audit gases to audit
above proposed concentration levels
since the EPA has identified the
requirements for all audit gases used in
the program in current appendix A,
section 2.6.1. There should be very few
cases where a PE needs to be performed
above level 10, but there may be some
legitimate instances (e.g., an SO2 audit
in areas impacted by volcanic
emissions). Since data reported to AQS
above the highest level may be rejected
(if PSD PE data are reported to AQS),
the EPA proposes that PQAOs notify the
PSD reviewing authority of sites
auditing at concentrations above level
10 so that reporting accommodations
can be made. There were no comments
made on this proposed revision, so the
EPA is finalizing this revision as
proposed.
The EPA proposed to describe the
NPAP (current appendix A, section 2.4)
in more detail. The NPAP is a long-standing program for the ambient air
monitoring community. The NPAP is a
performance evaluation, which is a type
of audit where quantitative data are
collected independently in order to
evaluate the proficiency of an analyst,
monitoring instrument or laboratory.
This program has been briefly
mentioned in section 2.4 of the current
appendix A requirements. In appendix
A, the EPA proposed to add language
consistent with an annual decision
memorandum 45 distributed to all state
and local monitoring organizations in
order to determine whether the
monitoring organization plans to self-implement the NPAP program or utilize
the federally implemented program. In
order to make this decision, the NPAP
adequacy and independence
requirements are described in the
decision memorandum. The EPA
proposed to include these same
requirements in appendix B in a
separate section for NPAP. As described
in the applicability section, the
implementation of NPAP is at the
discretion of the PSD reviewing
authority but must be implemented if
data are used in any NAAQS
determinations. Since PSD monitoring
is implemented at shorter intervals
(usually a year) and with fewer
monitors, if NPAP is performed, it is
required to be performed annually on
each monitor operated in the PSD
network.
45 https://www3.epa.gov/ttn/amtic/npepqa.html.
See appendix A for comments and
discussions related to this section. The
EPA is finalizing this revision as
proposed.
4. Measurement Quality Checks for
Particulate Monitors
The EPA proposed to have one flow
rate verification frequency requirement
for all PM PSD monitors. The current
regulations (current appendix A, table
A–2) provide for monthly flow rate
verifications for most samplers used to
monitor PM2.5, PM10 and Pb and
quarterly flow rate verifications for
high-volume PM10 or TSP samplers (for
Pb). With longer duration NAAQS
monitoring, the quarterly verification
frequencies are adequate for these high-volume PM10 or TSP samplers.
However, with the short duration of
PSD monitoring, the EPA believes that
monthly flow rate verifications are more
appropriate to ensure that any sampler
flow rate problems are identified more
quickly and to reduce the potential for
a significant amount of data invalidation
that could extend monitoring activities.
The EPA received one comment in
support of this revision and no adverse
comments. Therefore, the EPA is
finalizing this revision as proposed.
The EPA proposed to grant more
flexibility to PSD monitoring
organizations when selecting PM2.5
method designations for sites that
require collocation. Appendix A
(current section 3.2.5.2(b)) requires that
if a primary monitor is a FEM, then the
first QC collocated monitor must be a
FRM monitor. Most of the FEM
monitors are continuous monitors while
the FRM monitors are filter-based.
Continuous monitors (which are all
FEMs) may be advantageous for use at
the more remote PSD monitoring
locations, since the site operator would
not need to visit a site as often to
retrieve filters (current FRMs are filter-based). The current collocation requirements for FEMs require a filter-based FRM for collocation, which
would mean a visit to retrieve the FRM
filters at least 1 week after the QC
collocated monitor operated. Therefore,
the EPA proposed that the FRM be
selected as the QC collocated monitor
unless the PSD PQAO submits a waiver
request to the PSD reviewing authority
to allow for collocation with a FEM. If
the request for a waiver is approved,
then the QC monitor must be the same
method designation as the primary FEM
monitor. The EPA did not receive any
comments on this proposal and is
finalizing this revision as proposed.
The EPA proposed to allow the PSD
reviewing authority to waive the PM2.5
3 µg/m3 concentration validity
threshold for implementation of the
PM2.5–PEP in the last quarter of PSD
monitoring. The PM2.5–PEP (current
appendix A, section 3.2.7) requires five
valid PM2.5–PEP audits per year for
PM2.5 monitoring networks with less
than or equal to five sites and eight
valid PM2.5–PEP audits per year with
PM2.5 monitoring networks greater than
five sites. Any PEP samples collected
with a concentration less than 3 µg/m3
are not considered valid, since they
cannot be used for bias estimates, and
re-sampling is required at a later date.
With NAAQS-related monitoring, which
aggregates the PM2.5–PEP data over a 3-year period, re-sampling is easily
accomplished. Due to the relatively
short-term nature of most PSD
monitoring, the likelihood of measuring
low concentrations in many areas
attaining the PM2.5 standard and the
time required to weigh filters collected
in performance evaluations, a PSD
monitoring organization’s QAPP may
contain a provision to waive the 3 µg/m3 threshold for validity of performance
evaluations conducted in the last
quarter of monitoring, subject to
approval by the PSD reviewing
authority. The EPA did not receive any
comments on this proposed waiver and
is finalizing this revision as proposed.
5. Calculations for Data Quality
Assessment
In order to allow reasonable estimates
of data quality, the EPA uses data above
an established threshold concentration
usually related to the detection limits of
the measurement method. Measurement
pairs are selected for use in the
precision and bias calculations only
when both measurements are above a
threshold concentration.
For many years, the threshold concentration for Pb precision and bias data has been 0.02 µg/m3. The EPA promulgated a new Pb FRM utilizing the ICP–MS analysis technique in 2013 as a revision to appendix G of 40 CFR part 50.46 This new FRM demonstrated MDLs 47 below 0.0002 µg/m3, which is well below the EPA requirement of five percent of the current Pb NAAQS level of 0.15 µg/m3, or 0.0075 µg/m3. As a result of the increased sensitivity inherent in this new FRM, the EPA proposed to lower the acceptable Pb concentration (current section 4) from the current value of 0.02 µg/m3 to 0.002 µg/m3 for measurements obtained using the new Pb FRM and other more recently approved equivalent methods that have the requisite increased sensitivity.48 The current 0.02 µg/m3 value will be retained for the previous Pb FRM that has subsequently been redesignated as FEM EQLA–0813–803 as
well as older equivalent methods that
were approved prior to the more recent
work on developing more sensitive
methods. Since ambient Pb
concentrations are lower and methods
more sensitive, lowering the threshold
concentration will allow much more
collocated information to be evaluated,
which will provide more representative
estimates of precision and bias.
46 See 78 FR 40000, July 3, 2013.
47 MDL is described as the minimum concentration of a substance that can be measured and reported with 99 percent confidence that the analyte concentration is greater than zero.
48 FEMs approved on or after March 4, 2010, have the required sensitivity to utilize the 0.002 µg/m3 reporting limit with the exception of manual equivalent method EQLA–0813–803, the previous FRM based on flame atomic absorption spectroscopy.
See comments related to this proposal
in the appendix A section. The EPA will
establish two thresholds as proposed
and will evaluate the use of an absolute
difference acceptance criterion at lower
concentration levels.
The EPA also proposed to remove the
TSP threshold concentration since TSP
is no longer a NAAQS-required
pollutant and the EPA no longer has QC
requirements for it. The EPA received
one comment in support of this
proposed change and no adverse
comments and is finalizing this revision
as proposed.
The EPA proposed to remove the
statistical check currently described in
section 4.1.5 of appendix A. The check
was developed to perform a comparison
of the one-point QC checks and the
annual performance evaluation data
performed by the same PQAO. The
section suggests that 95 percent of all
the bias estimates of the annual
performance evaluations (reported as a
percent difference) should fall within
the 95 percent probability interval
developed using the one-point QC
checks. The problem with this check is
that PQAOs with very good repeatability
on the one-point QC check data had a
hard time meeting this requirement
since the probability interval became
very tight, making it more difficult for
better performing PQAOs to meet the
requirement. Separate statistics to
evaluate the one-point QC checks and
the performance evaluations are already
promulgated, so the removal of this
check does not affect data quality
assessments. The EPA received one
comment in support of this proposal
and no adverse comments and is
finalizing this revision as proposed.
Similar to the statistical comparison
of performance evaluation data, the EPA
proposed to remove the statistical check
(current appendix A, section 4.2.4) to
compare the flow rate audit data and
flow rate verification data. The existing
language suggests that 95 percent of all
the flow rate audit data (reported as
percent difference) should fall within
the 95 percent probability interval
developed from the flow rate
verification data for the PQAO. The
problem, as with the one-point QC
check, was that monitoring
organizations with very good
repeatability on the flow rate
verifications had a hard time meeting
this requirement since the probability
interval became very tight, making it
difficult for better performing PQAOs to
meet the requirement. Separate statistics
to evaluate the flow rate verifications
and flow rate audits are already
promulgated, so the removal of this
check does not affect data quality
assessments. The EPA received one
comment in support of this proposal
and no adverse comments and is
finalizing this revision as proposed.
The EPA proposed to remove the
reporting requirements that are
currently in section 5 of appendix A
because they do not pertain to PSD
monitoring (current sections 5.1, 5.1.1
and 5.1.2.1). Since PSD organizations
are not required to certify their data to
the EPA or to report to AQS, the EPA will
remove language related to these
requirements and language that required
the EPA to calculate and report the
measurement uncertainty for the entire
calendar year. The EPA will retain the
quarterly PSD reporting requirements
(current section 5.2 in appendix A) and
require that those requirements be
consistent with 40 CFR 58.16 as it
pertains to PSD ambient air quality data
and QC data, as described in appendix
B. The EPA did not receive any
comment on this revision and is
finalizing this revision as proposed.
IV. Statutory and Executive Order
Reviews
A. Executive Order 12866: Regulatory
Planning and Review and Executive
Order 13563: Improving Regulation and
Regulatory Review
This action is not a significant
regulatory action and was, therefore, not
submitted to the Office of Management
and Budget (OMB) for review.
B. Paperwork Reduction Act (PRA)
This action does not impose any new
information collection burden under the
PRA. OMB has previously approved the
information collection activities
contained in the existing regulations
and has assigned OMB control number
2060–0084. While the EPA believes that
the net effect of the requirement changes
is a decrease in overall burden, the
current information collection request
calculation tools examine key air
monitoring tasks on somewhat of a
macro level and are therefore not
sufficiently detailed to show a material
change in burden compared with the
existing requirements.
C. Regulatory Flexibility Act (RFA)
I certify that this action will not have
a significant economic impact on a
substantial number of small entities
under the RFA. This action will not
impose any requirements on small
entities. This action finalizes minor
changes and clarifications to existing
monitoring requirements and
definitions.
D. Unfunded Mandates Reform Act
This action does not contain an
unfunded federal mandate of $100
million or more as described in UMRA,
2 U.S.C. 1531–1538, and does not
significantly or uniquely affect small
governments. The revisions to the
monitoring requirements impose no
enforceable duty on any state, local, or
tribal governments or the private sector
beyond those duties already established
in the CAA.
E. Executive Order 13132: Federalism
This action does not have federalism
implications. It will not have substantial
direct effects on the states, on the
relationship between the national
government and the states, or on the
distribution of power and
responsibilities among the various
levels of government.
F. Executive Order 13175: Consultation
and Coordination With Indian Tribal
Governments
This action does not have tribal
implications, as specified in Executive
Order 13175 (65 FR 67249, November 9,
2000). Tribes have the opportunity to
seek treatment in a manner similar to a
state for the purpose of installing and
operating a monitoring network
consisting of one or more monitors and
to then install and operate such a
network, but are not required to do so.
With regard to any tribes that may
currently be operating a monitoring
network, as well as any tribes that may
operate a monitoring network in the
future, this action finalizes minor
changes and clarifications to existing
monitoring requirements and will not
materially impact the time required to
operate monitoring networks. Thus,
consultation under the Executive Order
13175 is not required for this action.
The EPA will work through tribal
resources such as the Tribal Air
Monitoring Support Center to ensure a
complete understanding of these
revisions.
G. Executive Order 13045: Protection of
Children From Environmental Health
and Safety Risks
The EPA interprets Executive Order
13045 as applying only to those
regulatory actions that concern
environmental health or safety risks that
the EPA has reason to believe may
disproportionately affect children, per
the definition of ‘‘covered regulatory
action’’ in section 2–202 of the
Executive Order. This action is not
subject to Executive Order 13045
because it does not concern an
environmental health risk or safety risk.
H. Executive Order 13211: Actions
Concerning Regulations That
Significantly Affect Energy Supply,
Distribution, or Use
This action is not subject to Executive
Order 13211, because it is not a
significant regulatory action under
Executive Order 12866.
I. National Technology Transfer and
Advancement Act
This action does not involve technical
standards.
J. Executive Order 12898: Federal
Actions to Address Environmental
Justice in Minority Populations and
Low-Income Populations
The EPA believes the human health or
environmental risk addressed by this
action will not have potential
disproportionately high and adverse
human health or environmental effects
on minority, low-income or indigenous
populations. This action finalizes minor
changes and clarifications to existing
monitoring requirements and
definitions.
K. Congressional Review Act
This action is subject to the CRA, and
the EPA will submit a rule report to
each House of the Congress and to the
Comptroller General of the United
States. This action is not a ‘‘major rule’’
as defined by 5 U.S.C. 804(2).
List of Subjects in 40 CFR Part 58
Environmental protection,
Administrative practice and procedure,
Air pollution control, Intergovernmental
relations.
Dated: March 10, 2016.
Gina McCarthy,
Administrator.
Part 58, chapter I, title 40 of the Code
of Federal Regulations is amended as
follows:
PART 58—AMBIENT AIR QUALITY
SURVEILLANCE
■ 1. The authority citation for part 58 continues to read as follows:
Authority: 42 U.S.C. 7403, 7405, 7410, 7414, 7601, 7611, 7614, and 7619.
■ 2. Revise § 58.1 to read as follows:
§ 58.1 Definitions.
As used in this part, all terms not
defined herein have the meaning given
them in the Clean Air Act.
AADT means the annual average daily
traffic.
Act means the Clean Air Act as
amended (42 U.S.C. 7401, et seq.).
Additive and multiplicative bias
means the linear regression intercept
and slope of a linear plot fitted to
corresponding candidate and reference
method mean measurement data pairs.
Administrator means the
Administrator of the Environmental
Protection Agency (EPA) or his or her
authorized representative.
Air quality system (AQS) means the
EPA’s computerized system for storing
and reporting of information relating to
ambient air quality data.
Approved regional method (ARM)
means a continuous PM2.5 method that
has been approved specifically within a
state or local air monitoring network for
purposes of comparison to the NAAQS
and to meet other monitoring objectives.
AQCR means air quality control
region.
Area-wide means all monitors sited at
neighborhood, urban, and regional
scales, as well as those monitors sited at
either micro- or middle-scale that are
representative of many such locations in
the same CBSA.
Certifying agency means a state, local,
or tribal agency responsible for meeting
the data certification requirements in
accordance with § 58.15 for a unique set
of monitors.
Chemical Speciation Network (CSN)
includes Speciation Trends Network
stations (STN) as specified in paragraph
4.7.4 of appendix D of this part and
supplemental speciation stations that
provide chemical species data of fine
particulate.
CO means carbon monoxide.
Combined statistical area (CSA) is
defined by the U.S. Office of
Management and Budget as a
geographical area consisting of two or
more adjacent Core Based Statistical
Areas (CBSA) with employment
interchange of at least 15 percent.
Combination is automatic if the employment interchange is at least 25 percent and is determined by local opinion if the interchange is more than 15 but less than 25 percent.
Core-based statistical area (CBSA) is
defined by the U.S. Office of
Management and Budget, as a statistical
geographic entity consisting of the
county or counties associated with at
least one urbanized area/urban cluster
of at least 10,000 population, plus
adjacent counties having a high degree
of social and economic integration.
Metropolitan Statistical Areas (MSAs)
and micropolitan statistical areas are the
two categories of CBSA (metropolitan
areas have populations greater than
50,000; and micropolitan areas have
populations between 10,000 and
50,000). In the case of very large cities
where two or more CBSAs are
combined, these larger areas are referred
to as combined statistical areas (CSAs).
Corrected concentration pertains to
the result of an accuracy or precision
assessment test of an open path analyzer
in which a high-concentration test or
audit standard gas contained in a short
test cell is inserted into the optical
measurement beam of the instrument.
When the pollutant concentration
measured by the analyzer in such a test
includes both the pollutant
concentration in the test cell and the
concentration in the atmosphere, the
atmospheric pollutant concentration
must be subtracted from the test
measurement to obtain the corrected
concentration test result. The corrected
concentration is equal to the measured
concentration minus the average of the
atmospheric pollutant concentrations
measured (without the test cell)
immediately before and immediately
after the test.
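Written as a formula (the symbols are ours, not the rule's), with C_before and C_after the atmospheric concentrations measured without the test cell:

\[ C_{\mathrm{corrected}} = C_{\mathrm{measured}} - \frac{C_{\mathrm{before}} + C_{\mathrm{after}}}{2} \]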
Design value means the calculated
concentration according to the
applicable appendix of part 50 of this
chapter for the highest site in an
attainment or nonattainment area.
EDO means environmental data
operations.
Effective concentration pertains to
testing an open path analyzer with a
high-concentration calibration or audit
standard gas contained in a short test
cell inserted into the optical
measurement beam of the instrument.
Effective concentration is the equivalent
ambient-level concentration that would
produce the same spectral absorbance
over the actual atmospheric monitoring
path length as produced by the high-concentration gas in the short test cell.
Quantitatively, effective concentration
is equal to the actual concentration of
the gas standard in the test cell
multiplied by the ratio of the path
length of the test cell to the actual
atmospheric monitoring path length.
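Quantitatively, with C_cell the actual concentration of the gas standard, l_cell the path length of the test cell, and L the atmospheric monitoring path length (symbols ours):

\[ C_{\mathrm{effective}} = C_{\mathrm{cell}} \times \frac{\ell_{\mathrm{cell}}}{L} \]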
Federal equivalent method (FEM)
means a method for measuring the
concentration of an air pollutant in the
ambient air that has been designated as
an equivalent method in accordance
with part 53 of this chapter; it does not
include a method for which an
equivalent method designation has been
canceled in accordance with § 53.11 or
§ 53.16.
Federal reference method (FRM)
means a method of sampling and
analyzing the ambient air for an air
pollutant that is specified as a reference
method in an appendix to part 50 of this
chapter, or a method that has been
designated as a reference method in
accordance with this part; it does not
include a method for which a reference
method designation has been canceled
in accordance with § 53.11 or § 53.16 of
this chapter.
HNO3 means nitric acid.
Implementation plan means an
implementation plan approved or
promulgated by the EPA pursuant to
section 110 of the Act.
Local agency means any local
government agency, other than the state
agency, which is charged by a state with
the responsibility for carrying out a
portion of the annual monitoring
network plan required by § 58.10.
Meteorological measurements means
measurements of wind speed, wind
direction, barometric pressure,
temperature, relative humidity, solar
radiation, ultraviolet radiation, and/or
precipitation that occur at SLAMS
stations including the NCore and PAMS
networks.
Metropolitan Statistical Area (MSA)
means a CBSA associated with at least
one urbanized area of 50,000 population
or greater. The central county, plus
adjacent counties with a high degree of
integration, comprise the area.
Monitor means an instrument,
sampler, analyzer, or other device that
measures or assists in the measurement
of atmospheric air pollutants and which
is acceptable for use in ambient air
surveillance under the applicable
provisions of appendix C to this part.
Monitoring agency means a state,
local or tribal agency responsible for
meeting the requirements of this part.
Monitoring organization means a
monitoring agency responsible for
operating a monitoring site for which
the quality assurance regulations apply.
Monitoring path for an open path
analyzer means the actual path in space
between two geographical locations over
which the pollutant concentration is
measured and averaged.
Monitoring path length of an open
path analyzer means the length of the
monitoring path in the atmosphere over
which the average pollutant
concentration measurement (path-averaged concentration) is determined.
See also, optical measurement path
length.
Monitoring planning area (MPA)
means a contiguous geographic area
with established, well-defined
boundaries, such as a CBSA, county or
state, having a common area that is used
for planning monitoring locations for
PM2.5. A MPA may cross state
boundaries, such as the Philadelphia
PA–NJ MSA, and be further subdivided
into community monitoring zones. The
MPAs are generally oriented toward
CBSAs or CSAs with populations
greater than 200,000, but for
convenience, those portions of a state
that are not associated with CBSAs can
be considered as a single MPA.
NATTS means the national air toxics
trends stations. This network provides
hazardous air pollution ambient data.
NCore means the National Core
multipollutant monitoring stations.
Monitors at these sites are required to
measure particles (PM2.5, speciated PM2.5, PM10–2.5), O3, SO2, CO, nitrogen
oxides (NO/NOy), and meteorology
(wind speed, wind direction,
temperature, relative humidity).
Near-road monitor means any
approved monitor meeting the
applicable specifications described in
40 CFR part 58, appendix D (sections
4.2.1, 4.3.2, 4.7.1(b)(2)) and appendix E
(section 6.4(a), Table E–4) for near-road
measurement of PM2.5, CO, or NO2.
Network means all stations of a given
type or types.
Network Plan means the Annual
Monitoring Network Plan described in
§ 58.10.
NH3 means ammonia.
NO2 means nitrogen dioxide.
NO means nitrogen oxide.
NOX means the sum of the
concentrations of NO2 and NO.
NOy means the sum of all total
reactive nitrogen oxides, including NO,
NO2, and other nitrogen oxides referred
to as NOZ.
O3 means ozone.
Open path analyzer means an
automated analytical method that
measures the average atmospheric
pollutant concentration in situ along
one or more monitoring paths having a
monitoring path length of 5 meters or
more and that has been designated as a
reference or equivalent method under
the provisions of part 53 of this chapter.
Optical measurement path length
means the actual length of the optical
beam over which measurement of the
pollutant is determined. The path-integrated pollutant concentration
measured by the analyzer is divided by
the optical measurement path length to
determine the path-averaged
concentration. Generally, the optical
measurement path length is:
(1) Equal to the monitoring path
length for a (bistatic) system having a
transmitter and a receiver at opposite
ends of the monitoring path;
(2) Equal to twice the monitoring path
length for a (monostatic) system having
a transmitter and receiver at one end of
the monitoring path and a mirror or
retroreflector at the other end; or
(3) Equal to some multiple of the
monitoring path length for more
complex systems having multiple passes
of the measurement beam through the
monitoring path.
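For illustration, the three cases above reduce to a single pass-count multiplier on the monitoring path length. A minimal sketch in Python (the helper name and units are hypothetical, not part of the rule):

    def path_averaged_concentration(path_integrated_ppm_m, monitoring_path_m, passes=1):
        """Illustrative sketch: path-averaged concentration for an open path analyzer.

        passes=1: bistatic system (optical path equals the monitoring path);
        passes=2: monostatic system (transmitter/receiver at one end, reflector at the other);
        passes=n: multi-pass systems (optical path is n times the monitoring path).
        """
        optical_measurement_path_m = passes * monitoring_path_m
        return path_integrated_ppm_m / optical_measurement_path_m

    # Example: a monostatic system over a 250 m monitoring path has a 500 m
    # optical measurement path; a path-integrated reading of 50 ppm-m yields 0.1 ppm.
    print(path_averaged_concentration(50.0, 250.0, passes=2))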
PAMS means photochemical
assessment monitoring stations.
Pb means lead.
PM means particulate matter,
including but not limited to PM10,
PM10C, PM2.5, and PM10–2.5.
PM2.5 means particulate matter with
an aerodynamic diameter less than or
equal to a nominal 2.5 micrometers as
measured by a reference method based
on appendix L of part 50 and designated
in accordance with part 53 of this
chapter, by an equivalent method
designated in accordance with part 53,
or by an approved regional method
designated in accordance with appendix
C to this part.
PM10 means particulate matter with
an aerodynamic diameter less than or
equal to a nominal 10 micrometers as
measured by a reference method based
on appendix J of part 50 of this chapter
and designated in accordance with part
53 of this chapter or by an equivalent
method designated in accordance with
part 53.
PM10C means particulate matter with
an aerodynamic diameter less than or
equal to a nominal 10 micrometers as
measured by a reference method based
on appendix O of part 50 of this chapter
and designated in accordance with part
53 of this chapter or by an equivalent
method designated in accordance with
part 53.
PM10–2.5 means particulate matter
with an aerodynamic diameter less than
or equal to a nominal 10 micrometers
and greater than a nominal 2.5
micrometers as measured by a reference
method based on appendix O to part 50
of this chapter and designated in
accordance with part 53 of this chapter
or by an equivalent method designated
in accordance with part 53.
Point analyzer means an automated
analytical method that measures
pollutant concentration in an ambient
air sample extracted from the
atmosphere at a specific inlet probe
point, and that has been designated as
a reference or equivalent method in
accordance with part 53 of this chapter.
Primary monitor means the monitor
identified by the monitoring
organization that provides concentration
data used for comparison to the
NAAQS. For any specific site, only one
monitor for each pollutant can be
designated in AQS as primary monitor
for a given period of time. The primary
monitor identifies the default data
source for creating a combined site
record for purposes of NAAQS
comparisons.
Primary quality assurance
organization (PQAO) means a
monitoring organization, a group of
monitoring organizations or other
organization that is responsible for a set
of stations that monitor the same
pollutant and for which data quality
assessments can be pooled. Each criteria
pollutant sampler/monitor at a
monitoring station must be associated
with only one PQAO.
Probe means the actual inlet where an
air sample is extracted from the
atmosphere for delivery to a sampler or
point analyzer for pollutant analysis.
PSD monitoring network means a set
of stations that provide concentration
information for a specific PSD permit.
PSD monitoring organization means a
source owner/operator, a government
agency, or a contractor of the source or
agency that operates an ambient air
pollution monitoring network for PSD
purposes.
PSD reviewing authority means the
state air pollution control agency, local
agency, other state agency, tribe, or
other agency authorized by the
Administrator to carry out a permit
program under §§ 51.165 and 51.166 of
this chapter, or the Administrator in the
case of EPA-implemented permit
programs under § 52.21 of this chapter.
PSD station means any station
operated for the purpose of establishing
the effect on air quality of the emissions
from a proposed source for purposes of
prevention of significant deterioration
as required by § 51.24(n) of this chapter.
Regional Administrator means the
Administrator of one of the ten EPA
Regional Offices or his or her authorized
representative.
Reporting organization means an
entity, such as a state, local, or tribal
monitoring agency, that reports air
quality data to the EPA.
Site means a geographic location. One
or more stations may be at the same site.
SLAMS means state or local air
monitoring stations. The SLAMS
include the ambient air quality
monitoring sites and monitors that are
required by appendix D of this part and
are needed for the monitoring objectives
of appendix D, including NAAQS
comparisons, but may serve other data
purposes. The SLAMS includes NCore,
PAMS, CSN, and all other state or
locally operated criteria pollutant
monitors, operated in accordance with this
part, that have not been designated and
approved by the Regional Administrator
as SPM stations in an annual monitoring
network plan.
SO2 means sulfur dioxide.
Special purpose monitor (SPM)
station means a monitor included in an
agency’s monitoring network that the
agency has designated as a special
purpose monitor station in its annual
monitoring network plan and in the
AQS, and which the agency does not
count when showing compliance with
the minimum requirements of this
subpart for the number and siting of
monitors of various types. Any SPM
operated by an air monitoring agency
must be included in the periodic
assessments and annual monitoring
network plan required by § 58.10 and
approved by the Regional
Administrator.
State agency means the air pollution
control agency primarily responsible for
development and implementation of a
State Implementation Plan under the
Act.
Station means a single monitor, or a
group of monitors, located at a
particular site.
STN station means a PM2.5 chemical
speciation station designated to be part
of the speciation trends network. This
network provides chemical species data
of fine particulate.
Supplemental speciation station
means a PM2.5 chemical speciation
station that is operated for monitoring
agency needs and not part of the STN.
Traceable means that a local standard
has been compared and certified, either
directly or via not more than one
intermediate standard, to a National
Institute of Standards and Technology
(NIST)-certified primary standard such
as a NIST-traceable Reference Material
(NTRM) or a NIST-certified Gas
Manufacturer’s Internal Standard
(GMIS).
TSP (total suspended particulates)
means particulate matter as measured
by the method described in appendix B
of Part 50.
Urbanized area means an area with a
minimum residential population of at
least 50,000 people and which generally
includes core census block groups or
blocks that have a population density of
at least 1,000 people per square mile
and surrounding census blocks that
have an overall density of at least 500
people per square mile. The Census
Bureau notes that under certain
conditions, less densely settled territory
may be part of each Urbanized Area.
VOCs means volatile organic
compounds.
■ 3. In § 58.10:
■ a. Revise paragraphs (a)(1) and (a)(2).
■ b. Add paragraph (a)(12).
The revisions and addition read as
follows:
§ 58.10 Annual monitoring network plan
and periodic network assessment.
(a)(1) Beginning July 1, 2007, the
state, or where applicable local, agency
shall submit to the Regional
Administrator an annual monitoring
network plan which shall provide for
the documentation of the establishment
and maintenance of an air quality
surveillance system that consists of a
network of SLAMS monitoring stations
that can include FRM, FEM, and ARM
monitors that are part of SLAMS, NCore,
CSN, PAMS, and SPM stations. The
plan shall include a statement of
whether the operation of each monitor
meets the requirements of appendices
A, B, C, D, and E of this part, where
applicable. The Regional Administrator
may require additional information in
support of this statement. The annual
monitoring network plan must be made
available for public inspection and
comment for at least 30 days prior to
submission to the EPA and the
submitted plan shall include and
address, as appropriate, any received
comments.
(2) Any annual monitoring network
plan that proposes network
modifications (including new or
discontinued monitoring sites, new
determinations that data are not of
sufficient quality to be compared to the
NAAQS, and changes in identification
of monitors as suitable or not suitable
for comparison against the annual PM2.5
NAAQS) to SLAMS networks is subject
to the approval of the EPA Regional
Administrator, who shall approve or
disapprove the plan within 120 days of
submission of a complete plan to the
EPA.
* * * * *
(12) A detailed description of the
PAMS network being operated in
accordance with the requirements of
appendix D to this part shall be
submitted as part of the annual
monitoring network plan for review by
the EPA Administrator. The PAMS
Network Description described in
section 5 of appendix D may be used to
meet this requirement.
* * * * *
■ 4. In § 58.11, revise paragraph (a)(3) to
read as follows:
§ 58.11 Network technical requirements.
(a) * * *
(3) The owner or operator of an
existing or a proposed source shall
follow the quality assurance criteria in
appendix B to this part that apply to
PSD monitoring when operating a PSD
site.
* * * * *
■ 5. In § 58.12:
■ a. Revise paragraph (d)(1).
■ b. Revise paragraph (d)(3).
The revisions read as follows:
§ 58.12 Operating schedules.
* * * * *
(d) * * *
(1)(i) Manual PM2.5 samplers at
required SLAMS stations without a
collocated continuously operating PM2.5
monitor must operate on at least a 1-in-3 day schedule unless a waiver for an
alternative schedule has been approved
per paragraph (d)(1)(ii) of this section.
(ii) For SLAMS PM2.5 sites with both
manual and continuous PM2.5 monitors
operating, the monitoring agency may
request approval for a reduction to 1-in-6 day PM2.5 sampling or for seasonal
sampling from the EPA Regional
Administrator. Other requests for a
reduction to 1-in-6 day PM2.5 sampling
or for seasonal sampling may be
approved on a case-by-case basis. The
EPA Regional Administrator may grant
sampling frequency reductions after
consideration of factors (including but
not limited to the historical PM2.5 data
quality assessments, the location of
current PM2.5 design value sites, and
their regulatory data needs) if the
Regional Administrator determines that
the reduction in sampling frequency
will not compromise data needed for
implementation of the NAAQS.
Required SLAMS stations whose
measurements determine the design
value for their area and that are within
±10 percent of the annual NAAQS, and
all required sites where one or more 24-hour values have exceeded the 24-hour
NAAQS each year for a consecutive
period of at least 3 years are required to
maintain at least a 1-in-3 day sampling
frequency until the design value no
longer meets these criteria for 3
consecutive years. A continuously
operating FEM or ARM PM2.5 monitor
satisfies this requirement unless it is
identified in the monitoring agency’s
annual monitoring network plan as not
appropriate for comparison to the
NAAQS and the EPA Regional
Administrator has approved that the
data from that monitor may be excluded
from comparison to the NAAQS.
(iii) Required SLAMS stations whose
measurements determine the 24-hour
design value for their area and whose
data are within ±5 percent of the level
of the 24-hour PM2.5 NAAQS must have
an FRM or FEM operate on a daily
schedule if that area’s design value for
the annual NAAQS is less than the level
of the annual PM2.5 standard. A
continuously operating FEM or ARM
PM2.5 monitor satisfies this requirement
unless it is identified in the monitoring
agency’s annual monitoring network
plan as not appropriate for comparison
to the NAAQS and the EPA Regional
Administrator has approved that the
data from that monitor may be excluded
from comparison to the NAAQS. The
daily schedule must be maintained until
the referenced design value no longer
meets these criteria for 3 consecutive
years.
(iv) Changes in sampling frequency
attributable to changes in design values
shall be implemented no later than
January 1 of the calendar year following
the certification of such data as
described in § 58.15.
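Read together, paragraphs (d)(1)(ii) and (iii) amount to a decision rule for required SLAMS stations. A minimal sketch of one reading of that logic (illustrative only; the function name is hypothetical, and the numeric levels of 12.0 μg/m3 annual and 35 μg/m3 24-hour are assumed for concreteness because the rule text references the NAAQS generically):

    # Assumed NAAQS levels for illustration only (ug/m3).
    ANNUAL_NAAQS_UGM3 = 12.0
    DAILY_NAAQS_UGM3 = 35.0

    def pm25_sampling_frequency(annual_dv, daily_dv, determines_area_dv,
                                exceeded_daily_naaqs_3_consecutive_years=False):
        """Sketch of the Sec. 58.12(d)(1)(ii)-(iii) frequency rules for a required
        SLAMS PM2.5 station (a continuously operating FEM/ARM can satisfy them
        unless excluded from NAAQS comparison)."""
        # (iii): daily sampling when the site sets the area's 24-hour design value,
        # that value is within +/-5 percent of the 24-hour NAAQS, and the annual
        # design value is below the annual standard.
        if (determines_area_dv
                and abs(daily_dv - DAILY_NAAQS_UGM3) <= 0.05 * DAILY_NAAQS_UGM3
                and annual_dv < ANNUAL_NAAQS_UGM3):
            return "daily"
        # (ii): at least 1-in-3 day sampling when the site sets the area's design
        # value within +/-10 percent of the annual NAAQS, or 24-hour exceedances
        # occurred in each of at least 3 consecutive years.
        if ((determines_area_dv
                and abs(annual_dv - ANNUAL_NAAQS_UGM3) <= 0.10 * ANNUAL_NAAQS_UGM3)
                or exceeded_daily_naaqs_3_consecutive_years):
            return "at least 1-in-3 (reduction not allowed)"
        return "1-in-3 by default; a reduction may be requested per (d)(1)(ii)"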
* * * * *
(3) Manual PM2.5 speciation samplers
at STN stations must operate on at least
a 1-in-3 day sampling frequency unless
a reduction in sampling frequency has
been approved by the EPA
Administrator based on factors such as the area's design value, the role of the
particular site in national health studies,
the correlation of the site’s species data
with nearby sites, and presence of other
leveraged measurements.
* * * * *
■ 6. In § 58.14, revise paragraph (a) to read as follows:
§ 58.14 System modification.
(a) The state, or where appropriate
local, agency shall develop a network
modification plan and schedule to
modify the ambient air quality
monitoring network that addresses the
findings of the network assessment
required every 5 years by § 58.10(d). The
network modification plan shall be
submitted as part of the Annual
Monitoring Network Plan that is due no
later than the year after submittal of the
network assessment.
* * * * *
■ 7. Revise § 58.15 to read as follows:
§ 58.15 Annual air monitoring data
certification.
(a) The state, or where appropriate
local, agency shall submit to the EPA
Regional Administrator an annual air
monitoring data certification letter to
certify data collected by FRM, FEM, and
ARM monitors at SLAMS and SPM sites
that meet criteria in appendix A to this
part from January 1 to December 31 of
the previous year. The head official in
each monitoring agency, or his or her
designee, shall certify that the previous
year of ambient concentration and
quality assurance data are completely
submitted to AQS and that the ambient
concentration data are accurate to the
best of her or his knowledge, taking into
consideration the quality assurance
findings. The annual data certification
letter is due by May 1 of each year.
(b) Along with each certification
letter, the state shall submit to the
Regional Administrator an annual
summary report of all the ambient air
quality data collected by FRM, FEM,
and ARM monitors at SLAMS and SPM
sites. The annual report(s) shall be
submitted for data collected from
January 1 to December 31 of the
previous year. The annual summary
serves as the record of the specific data
that is the object of the certification
letter.
(c) Along with each certification
letter, the state shall submit to the
Regional Administrator a summary of
the precision and accuracy data for all
ambient air quality data collected by
FRM, FEM, and ARM monitors at
SLAMS and SPM sites. The summary of
precision and accuracy shall be
submitted for data collected from
January 1 to December 31 of the
previous year.
■ 8. In § 58.16, revise paragraphs (a), (c), and (d) to read as follows:
§ 58.16 Data submittal and archiving
requirements.
(a) The state, or where appropriate,
local agency, shall report to the
Administrator, via AQS all ambient air
quality data and associated quality
assurance data for SO2; CO; O3; NO2;
NO; NOy; NOX; Pb–TSP mass
concentration; Pb–PM10 mass
concentration; PM10 mass concentration;
PM2.5 mass concentration; for filter-based PM2.5 FRM/FEM, the field blank
mass; chemically speciated PM2.5 mass
concentration data; PM10–2.5 mass
concentration; meteorological data from
NCore and PAMS sites; and metadata
records and information specified by the
AQS Data Coding Manual (https://www.epa.gov/sites/production/files/2015-09/documents/aqs_data_coding_manual_0.pdf). Air quality data and
information must be submitted directly
to the AQS via electronic transmission
on the specified schedule described in
paragraphs (b) and (d) of this section.
* * * * *
(c) Air quality data submitted for each
reporting period must be edited,
validated, and entered into the AQS
(within the time limits specified in
paragraphs (b) and (d) of this section)
pursuant to appropriate AQS
procedures. The procedures for editing
and validating data are described in the
AQS Data Coding Manual and in each
monitoring agency’s quality assurance
project plan.
(d) The state shall report VOC and, if
collected, carbonyl, NH3, and HNO3
data from PAMS sites, and chemically
speciated PM2.5 mass concentration data
to AQS within 6 months following the
end of each quarterly reporting period
listed in paragraph (b) of this section.
* * * * *
■ 9. Revise Appendix A to part 58 to
read as follows:
Appendix A to Part 58—Quality
Assurance Requirements for Monitors
used in Evaluations of National
Ambient Air Quality Standards
1. General Information
2. Quality System Requirements
3. Measurement Quality Check Requirements
4. Calculations for Data Quality Assessments
5. Reporting Requirements
6. References
1. General Information
1.1 Applicability. (a) This appendix
specifies the minimum quality system
requirements applicable to SLAMS and other
monitor types whose data are intended to be
used to determine compliance with the
NAAQS (e.g., SPMs, tribal, CASTNET,
NCore, industrial, etc.), unless the EPA
Regional Administrator has reviewed and
approved the monitor for exclusion from
NAAQS use and these quality assurance
requirements.
(b) Primary quality assurance organizations
are encouraged to develop and maintain
quality systems more extensive than the
required minimums. Additional guidance for
the requirements reflected in this appendix
can be found in the ‘‘Quality Assurance
Handbook for Air Pollution Measurement
Systems,’’ Volume II (see reference 10 of this
appendix) and at a national level in
references 1, 2, and 3 of this appendix.
1.2 Primary Quality Assurance
Organization (PQAO). A PQAO is defined as
a monitoring organization or a group of
monitoring organizations or other
organization that is responsible for a set of
stations that monitor the same pollutant and
for which data quality assessments will be
pooled. Each criteria pollutant sampler/
monitor must be associated with only one
PQAO. In some cases, data quality is assessed
at the PQAO level.
1.2.1 Each PQAO shall be defined such
that measurement uncertainty among all
stations in the organization can be expected
to be reasonably homogeneous as a result of
common factors. Common factors that should
be considered in defining PQAOs include:
(a) Operation by a common team of field
operators according to a common set of
procedures;
(b) Use of a common quality assurance
project plan (QAPP) or standard operating
procedures;
(c) Common calibration facilities and
standards;
(d) Oversight by a common quality
assurance organization; and
(e) Support by a common management
organization (i.e., state agency) or laboratory.
Since data quality assessments are made
and data certified at the PQAO level, the
monitoring organization identified as the
PQAO will be responsible for the oversight
of the quality of data of all monitoring
organizations within the PQAO.
1.2.2 Monitoring organizations having
difficulty describing their PQAO or assigning
specific monitors to primary quality
assurance organizations should consult with
the appropriate EPA Regional Office. Any
consolidation of monitoring organizations to
PQAOs shall be subject to final approval by
the appropriate EPA Regional Office.
1.2.3 Each PQAO is required to
implement a quality system that provides
sufficient information to assess the quality of
the monitoring data. The quality system
must, at a minimum, include the specific
requirements described in this appendix.
Failure to conduct or pass a required check
or procedure, or a series of required checks
or procedures, does not by itself invalidate
data for regulatory decision making. Rather,
PQAOs and the EPA shall use the checks and
procedures required in this appendix in
combination with other data quality
information, reports, and similar
documentation that demonstrate overall
compliance with Part 58. Accordingly, the
EPA and PQAOs shall use a ‘‘weight of
evidence’’ approach when determining the
suitability of data for regulatory decisions.
The EPA reserves the authority to use or not
use monitoring data submitted by a
monitoring organization when making
regulatory decisions based on the EPA’s
assessment of the quality of the data.
Consensus-built validation templates or
validation criteria already approved in
QAPPs should be used as the basis for the
weight of evidence approach.
1.3 Definitions.
(a) Measurement Uncertainty. A term used
to describe deviations from a true
concentration or estimate that are related to
the measurement process and not to spatial
or temporal population attributes of the air
being measured.
(b) Precision. A measurement of mutual
agreement among individual measurements
of the same property usually under
prescribed similar conditions, expressed
generally in terms of the standard deviation.
(c) Bias. The systematic or persistent
distortion of a measurement process which
causes errors in one direction.
(d) Accuracy. The degree of agreement
between an observed value and an accepted
reference value. Accuracy includes a
combination of random error (imprecision)
and systematic error (bias) components
which are due to sampling and analytical
operations.
(e) Completeness. A measure of the amount
of valid data obtained from a measurement
system compared to the amount that was
expected to be obtained under correct,
normal conditions.
(f) Detection Limit. The lowest
concentration or amount of target analyte that
can be determined to be different from zero
by a single measurement at a stated level of
probability.
1.4 Measurement Quality Checks. The
measurement quality checks described in
section 3 of this appendix shall be reported
to AQS and are included in the data required
for certification.
1.5 Assessments and Reports. Periodic
assessments and documentation of data
quality are required to be reported to the
EPA. To provide national uniformity in this
assessment and reporting of data quality for
all networks, specific assessment and
reporting procedures are prescribed in detail
in sections 3, 4, and 5 of this appendix. On
the other hand, the selection and extent of
the quality assurance and quality control
activities used by a monitoring organization
depend on a number of local factors such as
field and laboratory conditions, the
objectives for monitoring, the level of data
quality needed, the expertise of assigned
personnel, the cost of control procedures,
pollutant concentration levels, etc. Therefore,
quality system requirements in section 2 of
this appendix are specified in general terms
to allow each monitoring organization to
develop a quality system that is most
efficient and effective for its own
circumstances while achieving the data
quality objectives described in this appendix.
2. Quality System Requirements
A quality system (reference 1 of this
appendix) is the means by which an
organization manages the quality of the
monitoring information it produces in a
systematic, organized manner. It provides a
framework for planning, implementing,
assessing and reporting work performed by
an organization and for carrying out required
quality assurance and quality control
activities.
2.1 Quality Management Plans and
Quality Assurance Project Plans. All PQAOs
must develop a quality system that is
described and approved in quality
management plans (QMP) and QAPPs to
ensure that the monitoring results:
(a) Meet a well-defined need, use, or
purpose (reference 5 of this appendix);
(b) Provide data of adequate quality for the
intended monitoring objectives;
(c) Satisfy stakeholder expectations;
(d) Comply with applicable standards and specifications;
(e) Comply with statutory (and other legal)
requirements; and
(f) Reflect consideration of cost and
economics.
2.1.1 The QMP describes the quality
system in terms of the organizational
structure, functional responsibilities of
management and staff, lines of authority, and
required interfaces for those planning,
implementing, assessing and reporting
activities involving environmental data
operations (EDO). The QMP must be suitably
documented in accordance with EPA
requirements (reference 2 of this appendix),
and approved by the appropriate Regional
Administrator, or his or her representative.
The quality system described in the QMP
will be reviewed during the systems audits
described in section 2.5 of this appendix.
Organizations that implement long-term
monitoring programs with EPA funds should
have a separate QMP document. Smaller
organizations, organizations that do
infrequent work with the EPA or have
monitoring programs of limited size or scope
may combine the QMP with the QAPP if
approved by, and subject to any conditions of, the EPA. Additional guidance on this
process can be found in reference 10 of this
appendix. Approval of the recipient’s QMP
by the appropriate Regional Administrator or
his or her representative may allow
delegation of authority to the PQAO's
independent quality assurance function to
review and approve environmental data
collection activities adequately described and
covered under the scope of the QMP and
documented in appropriate planning
documents (QAPP). Where a PQAO or
monitoring organization has been delegated
authority to review and approve their QAPP,
an electronic copy must be submitted to the
EPA region at the time it is submitted to the
PQAO/monitoring organization’s QAPP
approving authority. The QAPP will be
reviewed by the EPA during systems audits
or circumstances related to data quality. The
QMP submission and approval dates for
PQAOs/monitoring organizations must be
reported to AQS either by the monitoring
organization or the EPA Region.
2.1.2 The QAPP is a formal document
describing, in sufficient detail, the quality
system that must be implemented to ensure
that the results of work performed will satisfy
the stated objectives. PQAOs must develop
QAPPs that describe how the organization
intends to control measurement uncertainty
to an appropriate level in order to achieve the
data quality objectives for the EDO. The
quality assurance policy of the EPA requires
every EDO to have a written and approved
QAPP prior to the start of the EDO. It is the
responsibility of the PQAO/monitoring
organization to adhere to this policy. The
QAPP must be suitably documented in
accordance with EPA requirements (reference
3 of this appendix) and include standard
operating procedures for all EDOs either
within the document or by appropriate
reference. The QAPP must identify each
PQAO operating monitors under the QAPP as
well as generally identify the sites and
monitors to which it is applicable either
within the document or by appropriate
reference. The QAPP submission and
approval dates must be reported to AQS
either by the monitoring organization or the
EPA Region.
2.1.3 The PQAO/monitoring
organization’s quality system must have
adequate resources both in personnel and
funding to plan, implement, assess and
report on the achievement of the
requirements of this appendix and it’s
approved QAPP.
2.2 Independence of Quality Assurance.
The PQAO must provide for a quality
assurance management function, that aspect
of the overall management system of the
organization that determines and implements
the quality policy defined in a PQAO’s QMP.
Quality management includes strategic
planning, allocation of resources and other
systematic planning activities (e.g., planning,
implementation, assessing and reporting)
pertaining to the quality system. The quality
assurance management function must have
sufficient technical expertise and
management authority to conduct
independent oversight and assure the
implementation of the organization’s quality
system relative to the ambient air quality
monitoring program and should be
organizationally independent of
environmental data generation activities.
2.3. Data Quality Performance
Requirements.
2.3.1 Data Quality Objectives. The DQOs,
or the results of other systematic planning
processes, are statements that define the
appropriate type of data to collect and
specify the tolerable levels of potential
decision errors that will be used as a basis
for establishing the quality and quantity of
data needed to support the monitoring
objectives (reference 5 of this appendix). The
DQOs will be developed by the EPA to
support the primary regulatory objectives for
each criteria pollutant. As they are
developed, they will be added to the
regulation. The quality of the conclusions
derived from data interpretation can be
affected by population uncertainty (spatial or
temporal uncertainty) and measurement
uncertainty (uncertainty associated with
collecting, analyzing, reducing and reporting
concentration data). This appendix focuses
on assessing and controlling measurement
uncertainty.
2.3.1.1 Measurement Uncertainty for
Automated and Manual PM2.5 Methods. The
goal for acceptable measurement uncertainty
is defined for precision as an upper 90
percent confidence limit for the coefficient of
variation (CV) of 10 percent and ±10 percent
for total bias.
2.3.1.2 Measurement Uncertainty for
Automated O3 Methods. The goal for
acceptable measurement uncertainty is
defined for precision as an upper 90 percent
confidence limit for the CV of 7 percent and
for bias as an upper 95 percent confidence
limit for the absolute bias of 7 percent.
2.3.1.3 Measurement Uncertainty for Pb
Methods. The goal for acceptable
measurement uncertainty is defined for
precision as an upper 90 percent confidence
limit for the CV of 20 percent and for bias
as an upper 95 percent confidence limit for
the absolute bias of 15 percent.
2.3.1.4 Measurement Uncertainty for
NO2. The goal for acceptable measurement
uncertainty is defined for precision as an
upper 90 percent confidence limit for the CV
of 15 percent and for bias as an upper 95
percent confidence limit for the absolute bias
of 15 percent.
2.3.1.5 Measurement Uncertainty for SO2.
The goal for acceptable measurement
uncertainty for precision is defined as an
upper 90 percent confidence limit for the CV
of 10 percent and for bias as an upper 95
percent confidence limit for the absolute bias
of 10 percent.
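For reference, the goals stated in sections 2.3.1.1 through 2.3.1.5 can be tabulated and compared against pooled PQAO statistics; a minimal sketch (hypothetical structure; values transcribed from the text above):

    # Goals from sections 2.3.1.1-2.3.1.5: precision as an upper 90 percent
    # confidence limit for the CV (percent); bias as an upper 95 percent
    # confidence limit for absolute bias (percent), stated as +/-10 percent
    # total bias for PM2.5.
    UNCERTAINTY_GOALS = {
        "PM2.5": {"cv_pct": 10, "bias_pct": 10},
        "O3":    {"cv_pct": 7,  "bias_pct": 7},
        "Pb":    {"cv_pct": 20, "bias_pct": 15},
        "NO2":   {"cv_pct": 15, "bias_pct": 15},
        "SO2":   {"cv_pct": 10, "bias_pct": 10},
    }

    def meets_goals(pollutant, cv_upper_pct, abs_bias_upper_pct):
        """Sketch: check pooled precision/bias statistics against the stated goals."""
        goal = UNCERTAINTY_GOALS[pollutant]
        return cv_upper_pct <= goal["cv_pct"] and abs_bias_upper_pct <= goal["bias_pct"]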
2.4 National Performance Evaluation
Programs. The PQAO shall provide for the
implementation of a program of independent
and adequate audits of all monitors providing
data for NAAQS compliance purposes
including the provision of adequate resources
for such audit programs. A monitoring plan
(or QAPP) which provides for PQAO
participation in the EPA’s National
Performance Audit Program (NPAP), the
PM2.5 Performance Evaluation Program
(PM2.5-PEP) program and the Pb Performance
Evaluation Program (Pb-PEP) and indicates
the consent of the PQAO for the EPA to apply
an appropriate portion of the grant funds,
which the EPA would otherwise award to the
PQAO for these QA activities, will be
deemed by the EPA to meet this requirement.
For clarification and to participate, PQAOs
should contact either the appropriate EPA
regional quality assurance (QA) coordinator
at the appropriate EPA Regional Office
location, or the NPAP coordinator at the EPA
Air Quality Assessment Division, Office of
Air Quality Planning and Standards, in
Research Triangle Park, North Carolina. The
PQAOs that plan to implement these
programs (self-implement) rather than use
the federal programs must meet the adequacy
requirements found in the appropriate
sections that follow, as well as meet the
definition of independent assessment that
follows.
2.4.1 Independent assessment. An
assessment performed by a qualified
individual, group, or organization that is not
part of the organization directly performing
and accountable for the work being assessed.
This auditing organization must not be
involved with the generation of the ambient
air monitoring data. An organization can
conduct the performance evaluation (PE) if it
can meet this definition and has a
management structure that, at a minimum,
will allow for the separation of its routine
sampling personnel from its auditing
personnel by two levels of management. In
addition, the sample analysis of audit filters
must be performed by a laboratory facility
and laboratory equipment separate from the
facilities used for routine sample analysis.
Field and laboratory personnel will be
required to meet PE field and laboratory
training and certification requirements to
establish comparability to federally
implemented programs.
2.5 Technical Systems Audit Program.
Technical systems audits of each PQAO shall
be conducted at least every 3 years by the
appropriate EPA Regional Office and
reported to the AQS. If a PQAO is made up
of more than one monitoring organization, all
monitoring organizations in the PQAO
should be audited within 6 years (two TSA
cycles of the PQAO). As an example, if a state
has five local monitoring organizations that
are consolidated under one PQAO, all five
local monitoring organizations should
receive a technical systems audit within a 6-year period. Systems audit programs are
described in reference 10 of this appendix.
2.6 Gaseous and Flow Rate Audit
Standards.
2.6.1 Gaseous pollutant concentration
standards (permeation devices or cylinders of
compressed gas) used to obtain test
concentrations for CO, SO2, NO, and NO2
must be traceable to either a National
Institute of Standards and Technology (NIST)
Traceable Reference Material (NTRM) or a
NIST-certified Gas Manufacturer’s Internal
Standard (GMIS), certified in accordance
with one of the procedures given in reference
4 of this appendix. Vendors advertising
certification with the procedures provided in
reference 4 of this appendix and distributing
gases as ‘‘EPA Protocol Gas’’ for ambient air
monitoring purposes must participate in the
EPA Ambient Air Protocol Gas Verification
Program or not use ‘‘EPA’’ in any form of
advertising. Monitoring organizations must
provide information to the EPA on the gas
producers they use on an annual basis and
those PQAOs purchasing standards will be
obligated, at the request of the EPA, to
participate in the program at least once every
5 years by sending a new unused standard to
a designated verification laboratory.
2.6.2 Test concentrations for O3 must be
obtained in accordance with the ultraviolet
photometric calibration procedure specified
in appendix D to Part 50 of this chapter and
by means of a certified NIST-traceable O3
transfer standard. Consult references 7 and 8
of this appendix for guidance on transfer
standards for O3.
2.6.3 Flow rate measurements must be
made by a flow measuring instrument that is
NIST-traceable to an authoritative volume or
other applicable standard. Guidance for
certifying some types of flowmeters is
provided in reference 10 of this appendix.
2.7 Primary Requirements and Guidance.
Requirements and guidance documents for
developing the quality system are contained
in references 1 through 11 of this appendix,
which also contain many suggested
procedures, checks, and control
specifications. Reference 10 describes
specific guidance for the development of a
quality system for data collected for
comparison to the NAAQS. Many specific
quality control checks and specifications for
methods are included in the respective
reference methods described in Part 50 of
this chapter or in the respective equivalent
method descriptions available from the EPA
(reference 6 of this appendix). Similarly,
quality control procedures related to
specifically designated reference and
equivalent method monitors are contained in
the respective operation or instruction
manuals associated with those monitors.
3. Measurement Quality Check Requirements
This section provides the requirements for
PQAOs to perform the measurement quality
checks that can be used to assess data
quality. Data from these checks are required
to be submitted to the AQS within the same
time frame as routinely-collected ambient
concentration data as described in 40 CFR
58.16. Table A–1 of this appendix provides
a summary of the types and frequency of the
measurement quality checks that will be
described in this section.
3.1. Gaseous Monitors of SO2, NO2, O3,
and CO.
3.1.1 One-Point Quality Control (QC)
Check for SO2, NO2, O3, and CO. (a) A one-point QC check must be performed at least
once every 2 weeks on each automated
monitor used to measure SO2, NO2, O3 and
CO. With the advent of automated calibration
systems, more frequent checking is strongly
encouraged. See Reference 10 of this
appendix for guidance on the review
procedure. The QC check is made by
challenging the monitor with a QC check gas
of known concentration (effective
concentration for open path monitors)
between the prescribed range of 0.005 and
0.08 parts per million (ppm) for SO2, NO2,
and O3, and between the prescribed range of
0.5 and 5 ppm for CO monitors. The QC
check gas concentration selected within the
prescribed range should be related to the
monitoring objectives for the monitor. If
monitoring at an NCore site or for trace level
monitoring, the QC check concentration
should be selected to represent the mean or
median concentrations at the site. If the mean
or median concentrations at trace gas sites
are below the MDL of the instrument, the
agency can select the lowest concentration in
the prescribed range that can be practically
achieved. If the mean or median
concentrations at trace gas sites are above the
prescribed range, the agency can select the
highest concentration in the prescribed
range. An additional QC check point is
encouraged for those organizations that may
have occasional high values or would like to
confirm the monitors’ linearity at the higher
end of the operational range or around
NAAQS concentrations. If monitoring for
NAAQS decisions, the QC concentration can
be selected at a higher concentration within
the prescribed range but should also consider
precision points around mean or median
monitor concentrations.
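One reading of the selection logic in this paragraph, sketched in Python (hypothetical helper; the prescribed ranges are those stated above):

    PRESCRIBED_RANGE_PPM = {
        "SO2": (0.005, 0.08),
        "NO2": (0.005, 0.08),
        "O3":  (0.005, 0.08),
        "CO":  (0.5, 5.0),
    }

    def select_qc_concentration(pollutant, site_mean_or_median_ppm):
        """Sketch: pick a one-point QC check concentration per section 3.1.1(a).

        For NCore or trace-level monitoring the check should represent the mean
        or median site concentration, clamped to the prescribed range when the
        site statistic falls below or above that range.
        """
        lo, hi = PRESCRIBED_RANGE_PPM[pollutant]
        return min(max(site_mean_or_median_ppm, lo), hi)

    # Example: a trace-level SO2 site with a median of 0.001 ppm uses the lowest
    # practically achievable concentration in the prescribed range (0.005 ppm).
    print(select_qc_concentration("SO2", 0.001))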
(b) Point analyzers must operate in their
normal sampling mode during the QC check
and the test atmosphere must pass through
all filters, scrubbers, conditioners and other
components used during normal ambient
sampling and as much of the ambient air
inlet system as is practicable. The QC check
must be conducted before any calibration or
adjustment to the monitor.
(c) Open path monitors are tested by
inserting a test cell containing a QC check gas
concentration into the optical measurement
beam of the instrument. If possible, the
normally used transmitter, receiver, and as
appropriate, reflecting devices should be
used during the test, and the normal
monitoring configuration of the instrument
should be altered as little as possible to
accommodate the test cell for the test.
However, if permitted by the associated
operation or instruction manual, an alternate
local light source or an alternate optical path
that does not include the normal atmospheric
monitoring path may be used. The actual
concentration of the QC check gas in the test
cell must be selected to produce an effective
concentration in the range specified earlier in
this section. Generally, the QC test
concentration measurement will be the sum
of the atmospheric pollutant concentration
and the QC test concentration. As such, the
result must be corrected to remove the
atmospheric concentration contribution. The
corrected concentration is obtained by
subtracting the average of the atmospheric
concentrations measured by the open path
instrument under test immediately before
and immediately after the QC test from the
QC check gas concentration measurement. If
the difference between these before and after
measurements is greater than 20 percent of
the effective concentration of the test gas,
discard the test result and repeat the test. If
possible, open path monitors should be
tested during periods when the atmospheric
pollutant concentrations are relatively low
and steady.
(d) Report the audit concentration of the
QC gas and the corresponding measured
concentration indicated by the monitor to
AQS. The percent differences between these
concentrations are used to assess the
precision and bias of the monitoring data as
described in sections 4.1.2 (precision) and
4.1.3 (bias) of this appendix.
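The reported pair reduces to a per-check percent difference; a minimal sketch, assuming the conventional form d = (measured − audit) / audit × 100 that section 4 of this appendix pools into precision and bias statistics:

    def percent_difference(measured, audit):
        """Sketch: per-check percent difference, d = (measured - audit) / audit * 100.

        Section 4 of this appendix pools these values to estimate precision
        (coefficient of variation) and bias at the PQAO level.
        """
        return (measured - audit) / audit * 100.0

    # Example: a monitor reading 0.081 ppm against a 0.080 ppm audit gas -> +1.25 percent.
    print(percent_difference(0.081, 0.080))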
3.1.2 Annual performance evaluation for
SO2, NO2, O3, or CO. A performance
evaluation must be conducted on each
primary monitor once a year. This can be
accomplished by evaluating 25 percent of the
primary monitors each quarter. The evaluation should be conducted by a trained, experienced technician other than the
routine site operator.
3.1.2.1 The evaluation is made by
challenging the monitor with audit gas
standards of known concentration from at
least three audit levels. One point must be
within two to three times the method
detection limit of the instruments within the
PQAOs network, the second point will be
less than or equal to the 99th percentile of
the data at the site or the network of sites in
the PQAO or the next highest audit
concentration level. The third point can be
around the primary NAAQS or the highest 3-year concentration at the site or the network
of sites in the PQAO. An additional 4th level
is encouraged for those agencies that would
like to confirm the monitors’ linearity at the
higher end of the operational range. In rare
circumstances, there may be sites measuring
concentrations above audit level 10. Notify
the appropriate EPA region and the AQS
program in order to make accommodations
for auditing at levels above level 10.
Concentration Range, ppm

Audit level         O3              SO2             NO2              CO
1 ............  0.004–0.0059   0.0003–0.0029   0.0003–0.0029    0.020–0.059
2 ............  0.006–0.019    0.0030–0.0049   0.0030–0.0049    0.060–0.199
3 ............  0.020–0.039    0.0050–0.0079   0.0050–0.0079    0.200–0.899
4 ............  0.040–0.069    0.0080–0.0199   0.0080–0.0199    0.900–2.999
5 ............  0.070–0.089    0.0200–0.0499   0.0200–0.0499    3.000–7.999
6 ............  0.090–0.119    0.0500–0.0999   0.0500–0.0999    8.000–15.999
7 ............  0.120–0.139    0.1000–0.1499   0.1000–0.2999   16.000–30.999
8 ............  0.140–0.169    0.1500–0.2599   0.3000–0.4999   31.000–39.999
9 ............  0.170–0.189    0.2600–0.7999   0.5000–0.7999   40.000–49.999
10 ...........  0.190–0.259    0.8000–1.000    0.8000–1.000    50.000–60.000
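For illustration, the O3 column of the table can be expressed as a lookup (hypothetical helper; ranges copied from the table above):

    # O3 audit level concentration ranges (ppm), copied from the table above.
    O3_AUDIT_LEVELS = {
        1: (0.004, 0.0059), 2: (0.006, 0.019), 3: (0.020, 0.039),
        4: (0.040, 0.069),  5: (0.070, 0.089), 6: (0.090, 0.119),
        7: (0.120, 0.139),  8: (0.140, 0.169), 9: (0.170, 0.189),
        10: (0.190, 0.259),
    }

    def o3_audit_level(concentration_ppm):
        """Return the audit level whose range contains the given O3 concentration."""
        for level, (lo, hi) in O3_AUDIT_LEVELS.items():
            if lo <= concentration_ppm <= hi:
                return level
        raise ValueError("Concentration outside audit levels 1-10; "
                         "notify the EPA region per section 3.1.2.1.")

    print(o3_audit_level(0.065))  # -> 4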
3.1.2.2 The NO2 audit techniques may
vary depending on the ambient monitoring
method. For chemiluminescence-type NO2
analyzers, gas phase titration (GPT)
techniques should be based on EPA guidance
documents and monitoring agency
experience. The NO2 gas standards may be
more appropriate than GPT for direct NO2
methods that do not employ converters. Care
should be taken to ensure the stability of
such gas standards prior to use.
3.1.2.3 The standards from which audit
gas test concentrations are obtained must
meet the specifications of section 2.6.1 of this
appendix. The gas standards and equipment
used for the performance evaluation must not
be the same as the standards and equipment
used for one-point QC, calibrations, span
evaluations or NPAP.
3.1.2.4 For point analyzers, the
evaluation shall be carried out by allowing
the monitor to analyze the audit gas test
atmosphere in its normal sampling mode
such that the test atmosphere passes through
all filters, scrubbers, conditioners, and other
sample inlet components used during normal
ambient sampling and as much of the
ambient air inlet system as is practicable.
3.1.2.5 Open-path monitors are evaluated
by inserting a test cell containing the various
audit gas concentrations into the optical
measurement beam of the instrument. If
possible, the normally used transmitter,
receiver, and, as appropriate, reflecting
devices should be used during the
evaluation, and the normal monitoring
configuration of the instrument should be
modified as little as possible to accommodate
the test cell for the evaluation. However, if
permitted by the associated operation or
instruction manual, an alternate local light
source or an alternate optical path that does
not include the normal atmospheric
monitoring path may be used. The actual
concentrations of the audit gas in the test cell
must be selected to produce effective
concentrations in the evaluation level ranges
specified in this section of this appendix.
Generally, each evaluation concentration
measurement result will be the sum of the
atmospheric pollutant concentration and the
evaluation test concentration. As such, the
result must be corrected to remove the
atmospheric concentration contribution. The
corrected concentration is obtained by
subtracting the average of the atmospheric
concentrations measured by the open path
instrument under test immediately before
and immediately after the evaluation test (or
preferably before and after each evaluation
concentration level) from the evaluation
concentration measurement. If the difference
between the before and after measurements is
greater than 20 percent of the effective
concentration of the test gas standard,
discard the test result for that concentration
level and repeat the test for that level. If
possible, open path monitors should be
evaluated during periods when the
atmospheric pollutant concentrations are
relatively low and steady. Also, if the open-path instrument is not installed in a
permanent manner, the monitoring path
length must be reverified to be within ±3
percent to validate the evaluation since the
monitoring path length is critical to the
determination of the effective concentration.
3.1.2.6 Report both the evaluation
concentrations (effective concentrations for
open-path monitors) of the audit gases and
the corresponding measured concentration
(corrected concentrations, if applicable, for
open path monitors) indicated or produced
by the monitor being tested to AQS. The
percent differences between these
concentrations are used to assess the quality
of the monitoring data as described in section
4.1.1 of this appendix.
3.1.3 National Performance Audit
Program (NPAP).
The NPAP is a performance evaluation
which is a type of audit where quantitative
data are collected independently in order to
evaluate the proficiency of an analyst,
monitoring instrument or laboratory. Due to
the implementation approach used in the
program, NPAP provides a national
independent assessment of performance
while maintaining a consistent level of data
quality. Details of the program can be found
in reference 11 of this appendix. The
program requirements include:
3.1.3.1 Performing audits of the primary
monitors at 20 percent of monitoring sites per
year, and 100 percent of the sites every 6
years. High-priority sites may be audited
more frequently. Since not all gaseous
criteria pollutants are monitored at every site
within a PQAO, it is not required that 20
percent of the primary monitors for each
pollutant receive an NPAP audit each year, only that 20 percent of the PQAO's
monitoring sites receive an NPAP audit. It is
expected that over the 6-year period all
primary monitors for all gaseous pollutants
will receive an NPAP audit.
3.1.3.2 Developing a delivery system that
will allow for the audit concentration gases
to be introduced to the probe inlet where
logistically feasible.
3.1.3.3 Using audit gases that are verified
against the NIST standard reference methods
or special review procedures and validated
annually for CO, SO2 and NO2, and at the
beginning of each quarter of audits for O3.
3.1.3.4 As described in section 2.4 of this
appendix, the PQAO may elect, on an annual
basis, to utilize the federally implemented
NPAP program. If the PQAO plans to selfimplement NPAP, the EPA will establish
training and other technical requirements for
PQAOs to establish comparability to
federally implemented programs. In addition
to meeting the requirements in sections
3.1.3.1 through 3.1.3.3 of this appendix, the
PQAO must:
(a) Utilize an audit system equivalent to
the federally implemented NPAP audit
system that is separate from equipment used
in annual performance evaluations.
(b) Perform a whole system check by
having the NPAP system tested against an
independent and qualified EPA lab, or
equivalent.
(c) Evaluate the system with the EPA NPAP
program through collocated auditing at an
acceptable number of sites each year (at least
one for an agency network of five or less
sites; at least two for a network with more
than five sites).
(d) Incorporate the NPAP in the PQAO’s
quality assurance project plan.
(e) Be subject to review by independent,
EPA-trained personnel.
(f) Participate in initial and update
training/certification sessions.
3.1.3.5 OAQPS, in consultation with the
relevant EPA Regional Office, may approve
the PQAO’s plan to self-implement NPAP if
the OAQPS determines that the PQAO’s selfimplementation plan is equivalent to the
federal programs and adequate to meet the
objectives of national consistency and data
quality.
3.2 PM2.5.
3.2.1 Flow Rate Verification for PM2.5. A
one-point flow rate verification check must
be performed at least once every month (each
verification minimally separated by 14 days)
on each monitor used to measure PM2.5. The
verification is made by checking the
operational flow rate of the monitor. If the
verification is made in conjunction with a
flow rate adjustment, it must be made prior
to such flow rate adjustment. For the
standard procedure, use a flow rate transfer
standard certified in accordance with section
2.6 of this appendix to check the monitor’s
normal flow rate. Care should be used in
selecting and using the flow rate
measurement device such that it does not
alter the normal operating flow rate of the
monitor. Report the flow rate of the transfer
standard and the corresponding flow rate
measured by the monitor to AQS. The
percent differences between the audit and
measured flow rates are used to assess the
bias of the monitoring data as described in
section 4.2.2 of this appendix (using flow
rates in lieu of concentrations).
3.2.2 Semi-Annual Flow Rate Audit for
PM2.5. Audit the flow rate of the particulate
monitor twice a year. The two audits should
ideally be spaced between 5 and 7 months
apart. The EPA strongly encourages more
frequent auditing. The audit should
(preferably) be conducted by a trained, experienced technician other than the
routine site operator. The audit is made by
measuring the monitor’s normal operating
flow rate(s) using a flow rate transfer
standard certified in accordance with section
2.6 of this appendix. The flow rate standard
used for auditing must not be the same flow
rate standard used for verifications or to
calibrate the monitor. However, both the
calibration standard and the audit standard
may be referenced to the same primary flow
rate or volume standard. Care must be taken
in auditing the flow rate to be certain that the
flow measurement device does not alter the
normal operating flow rate of the monitor.
Report the audit flow rate of the transfer
standard and the corresponding flow rate
measured by the monitor to AQS. The
percent differences between these flow rates
are used to evaluate monitor performance.
3.2.3 Collocated Quality Control
Sampling Procedures for PM2.5. For each pair
of collocated monitors, designate one
sampler as the primary monitor whose
concentrations will be used to report air
quality for the site, and designate the other
as the quality control monitor. There can be
only one primary monitor at a monitoring
site for a given time period.
3.2.3.1 For each distinct monitoring
method designation (FRM or FEM) that a
PQAO is using for a primary monitor, the
PQAO must have 15 percent of the primary
monitors of each method designation
collocated (values of 0.5 and greater round
up); and have at least one collocated quality
control monitor (if the total number of
monitors is less than three). The first
collocated monitor must be a designated
FRM monitor.
3.2.3.2 In addition, monitors selected for
collocation must also meet the following
requirements:
(a) A primary monitor designated as an
EPA FRM shall be collocated with a quality
control monitor having the same EPA FRM
method designation.
(b) For each primary monitor designated as
an EPA FEM used by the PQAO, 50 percent
of the monitors designated for collocation, or
the first if only one collocation is necessary,
shall be collocated with a FRM quality
control monitor and 50 percent of the
monitors shall be collocated with a monitor
having the same method designation as the
FEM primary monitor. If an odd number of
collocated monitors is required, the
additional monitor shall be a FRM quality
control monitor. An example of the
distribution of collocated monitors for each
unique FEM is provided below. Table A–2 of
this appendix demonstrates the collocation
procedure with a PQAO having one type of
primary FRM and multiple primary FEMs.
#Primary FEMs of a unique method designation   #Collocated   #Collocated with an FRM   #Collocated with same method designation
1–9 ..........................................      1                  1                           0
10–16 ........................................      2                  1                           1
17–23 ........................................      3                  2                           1
24–29 ........................................      4                  2                           2
30–36 ........................................      5                  3                           2
37–43 ........................................      6                  3                           3
3.2.3.3 Since the collocation requirements
are used to assess precision of the primary
monitors and there can only be one primary
monitor at a monitoring site, a site can only
count for the collocation of the method
VerDate Sep<11>2014
15:02 Mar 25, 2016
Jkt 238001
designation of the primary monitor at that
site.
3.2.3.4 The collocated monitors should be
deployed according to the following protocol:
(a) Fifty percent of the collocated quality
control monitors should be deployed at sites
PO 00000
Frm 00038
Fmt 4701
#Collocated
with an FRM
#Collocated
Sfmt 4700
1
2
3
4
5
6
#Collocated
with same
method
designation
1
1
2
2
3
3
with annual average or daily concentrations
estimated to be within plus or minus 20
percent of either the annual or 24-hour
NAAQS and the remainder at the PQAOs
discretion;
E:\FR\FM\28MRR2.SGM
28MRR2
0
1
1
2
2
3
Lhorne on DSK5TPTVN1PROD with RULES2
Federal Register / Vol. 81, No. 59 / Monday, March 28, 2016 / Rules and Regulations
(b) If an organization has no sites with
annual average or daily concentrations
within ±20 percent of the annual NAAQS or
24-hour NAAQS, 50 percent of the collocated
quality control monitors should be deployed
at those sites with the annual mean
concentrations or 24-hour concentrations
among the highest for all sites in the network
and the remainder at the PQAO's discretion.
(c) The two collocated monitors must be
within 4 meters (inlet to inlet) of each other
and at least 2 meters apart for flow rates
greater than 200 liters/min or at least 1 meter
apart for samplers having flow rates less than
200 liters/min to preclude airflow
interference. A waiver allowing up to 10
meters horizontal distance and up to 3 meters
vertical distance (inlet to inlet) between a
primary and collocated sampler may be
approved by the Regional Administrator for
sites at a neighborhood or larger scale of
representation during the annual network
plan approval process. Sampling and analytical methodologies must be consistently implemented for both primary
and collocated quality control samplers and
for all other samplers in the network.
(d) Sample the collocated quality control
monitor on a 1-in-12 day schedule. Report
the measurements from both primary and
collocated quality control monitors at each
collocated sampling site to AQS. The
calculations for evaluating precision between
the two collocated monitors are described in
section 4.2.1 of this appendix.
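For PQAOs scripting their network reviews, the arithmetic of sections 3.2.3.1 and 3.2.3.2 (and of the table above) can be sketched as follows. This is a minimal illustration, not part of the rule; the function name, the dictionary layout, and the minimum-of-one reading applied to very small networks are ours.

```python
import math

def pm25_collocation_split(n_primary: int) -> dict:
    """Collocation counts per sections 3.2.3.1-3.2.3.2: 15 percent of the
    primary monitors of a method designation are collocated (0.5 and
    greater rounds up; assumed minimum of one), and for FEMs the quality
    control monitors are split 50/50 between FRM and same-designation
    monitors, with any odd monitor assigned to the FRM side."""
    total = max(1, math.floor(0.15 * n_primary + 0.5))  # half-up rounding
    with_frm = math.ceil(total / 2)                     # odd count -> extra FRM
    return {"collocated": total,
            "with_FRM": with_frm,
            "same_designation": total - with_frm}

# Reproduces the table rows above, e.g. 12 primary FEM monitors:
print(pm25_collocation_split(12))  # {'collocated': 2, 'with_FRM': 1, 'same_designation': 1}
```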
3.2.4 PM2.5 Performance Evaluation
Program (PEP) Procedures. The PEP is an
independent assessment used to estimate
total measurement system bias. These
evaluations will be performed under the
NPEP as described in section 2.4 of this
appendix or a comparable program.
Performance evaluations will be performed
annually within each PQAO. For PQAOs
with less than or equal to five monitoring
sites, five valid performance evaluation
audits must be collected and reported each
year. For PQAOs with greater than five
monitoring sites, eight valid performance
evaluation audits must be collected and
reported each year. A valid performance
evaluation audit means that both the primary
monitor and PEP audit concentrations are
valid and above 3 µg/m3. Siting of the PEP
monitor must be consistent with section
3.2.3.4(c). However, any horizontal distance
greater than 4 meters and any vertical
distance greater than one meter must be
reported to the EPA regional PEP
coordinator. Additionally, for every monitor
designated as a primary monitor, a primary
quality assurance organization must:
3.2.4.1 Have each method designation
evaluated each year; and,
3.2.4.2 Have all FRM, FEM or ARM
samplers subject to a PEP audit at least once
every 6 years, which equates to
approximately 15 percent of the monitoring
sites audited each year.
3.2.4.3 Additional information
concerning the PEP is contained in reference
10 of this appendix. The calculations for
evaluating bias between the primary monitor
and the performance evaluation monitor for
PM2.5 are described in section 4.2.5 of this
appendix.
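The audit counts and the validity floor in section 3.2.4 reduce to two small checks. A sketch under that reading, with illustrative names:

```python
def pm25_pep_audits_required(n_sites: int) -> int:
    """Section 3.2.4: five valid PEP audits per year for PQAOs with five or
    fewer monitoring sites, eight for PQAOs with more than five sites."""
    return 5 if n_sites <= 5 else 8

def pm25_pep_audit_is_valid(primary_ugm3: float, pep_ugm3: float) -> bool:
    """A valid audit requires both the primary monitor and the PEP audit
    concentration to be valid and above 3 micrograms per cubic meter."""
    return primary_ugm3 > 3.0 and pep_ugm3 > 3.0
```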
3.3 PM10.
3.3.1 Flow Rate Verification for PM10 Low
Volume Samplers (less than 200 liter/
minute). A one-point flow rate verification
check must be performed at least once every
month (each verification minimally separated
by 14 days) on each monitor used to measure
PM10. The verification is made by checking
the operational flow rate of the monitor. If
the verification is made in conjunction with
a flow rate adjustment, it must be made prior
to such flow rate adjustment. For the
standard procedure, use a flow rate transfer
standard certified in accordance with section
2.6 of this appendix to check the monitor’s
normal flow rate. Care should be taken in
selecting and using the flow rate
measurement device such that it does not
alter the normal operating flow rate of the
monitor. The percent differences between the
audit and measured flow rates are reported
to AQS and used to assess the bias of the
monitoring data as described in section 4.2.2
of this appendix (using flow rates in lieu of
concentrations).
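The comparison statistic referenced above is the percent difference of equation 1 of this appendix, applied to flow rates rather than concentrations. A minimal sketch; the names and example values are ours:

```python
def flow_percent_difference(measured_lpm: float, audit_lpm: float) -> float:
    """Equation 1 with flow rates in lieu of concentrations: percent
    difference between the sampler's indicated flow rate and the flow
    rate transfer standard."""
    return (measured_lpm - audit_lpm) / audit_lpm * 100.0

# Example: a sampler indicating 16.9 L/min against a 16.67 L/min standard
# yields a percent difference of about +1.4, reported to AQS.
print(flow_percent_difference(16.9, 16.67))
```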
3.3.2 Flow Rate Verification for PM10
High Volume Samplers (greater than 200
liters/minute). For PM10 high volume
samplers, the verification frequency is one
verification every 90 days (quarter) with four in
a year. Other than verification frequency,
follow the same technical procedure as
described in section 3.3.1 of this appendix.
3.3.3 Semi-Annual Flow Rate Audit for
PM10. Audit the flow rate of the particulate
monitor twice a year. The two audits should
ideally be spaced between 5 and 7 months
apart. The EPA strongly encourages more
frequent auditing. The audit should
(preferably) be conducted by a trained
experienced technician other than the
routine site operator. The audit is made by
measuring the monitor’s normal operating
flow rate using a flow rate transfer standard
certified in accordance with section 2.6 of
this appendix. The flow rate standard used
for auditing must not be the same flow rate
standard used for verifications or to calibrate
the monitor. However, both the calibration
standard and the audit standard may be
referenced to the same primary flow rate or
volume standard. Care must be taken in
auditing the flow rate to be certain that the
flow measurement device does not alter the
normal operating flow rate of the monitor.
Report the audit flow rate of the transfer
standard and the corresponding flow rate
measured by the monitor to AQS. The
percent differences between these flow rates
are used to evaluate monitor performance.
3.3.4 Collocated Quality Control
Sampling Procedures for Manual PM10.
Collocated sampling for PM10 is only
required for manual samplers. For each pair
of collocated monitors, designate one
sampler as the primary monitor whose
concentrations will be used to report air
quality for the site and designate the other as
the quality control monitor.
3.3.4.1 For manual PM10 samplers, a
PQAO must:
(a) Have 15 percent of the primary
monitors collocated (values of 0.5 and greater
round up); and
(b) Have at least one collocated quality
control monitor (if the total number of
monitors is less than three).
3.3.4.2 The collocated quality control
monitors should be deployed according to
the following protocol:
(a) Fifty percent of the collocated quality
control monitors should be deployed at sites
with daily concentrations estimated to be
within plus or minus 20 percent of the
applicable NAAQS and the remainder at the
PQAO's discretion;
(b) If an organization has no sites with
daily concentrations within plus or minus 20
percent of the NAAQS, 50 percent of the
collocated quality control monitors should be
deployed at those sites with the daily mean
concentrations among the highest for all sites
in the network and the remainder at the
PQAO's discretion.
(c) The two collocated monitors must be
within 4 meters (inlet to inlet) of each other
and at least 2 meters apart for flow rates
greater than 200 liters/min or at least 1 meter
apart for samplers having flow rates less than
200 liters/min to preclude airflow
interference. A waiver allowing up to 10
meters horizontal distance and up to 3 meters
vertical distance (inlet to inlet) between a
primary and collocated sampler may be
approved by the Regional Administrator for
sites at a neighborhood or larger scale of
representation. This waiver may be approved
during the annual network plan approval
process. Sampling and analytical
methodologies must be consistently implemented for both collocated samplers
and for all other samplers in the network.
(d) Sample the collocated quality control
monitor on a 1-in-12 day schedule. Report
the measurements from both primary and
collocated quality control monitors at each
collocated sampling site to AQS. The
calculations for evaluating precision between
the two collocated monitors are described in
section 4.2.1 of this appendix.
(e) In determining the number of collocated
quality control sites required for PM10,
monitoring networks for lead (Pb–PM10)
should be treated independently from
networks for particulate matter (PM), even
though the separate networks may share one
or more common samplers. However, a single
quality control monitor that meets the
collocation requirements for Pb-PM10 and
PM10 may serve as a collocated quality
control monitor for both networks. Extreme
care must be taken when using the filter from
a quality control monitor for both PM10 and
Pb analysis. A PM10 filter weighing should
occur prior to any Pb analysis.
3.4 Pb.
3.4.1 Flow Rate Verification for Pb–PM10
Low Volume Samplers (less than 200 liter/
minute). A one-point flow rate verification
check must be performed at least once every
month (each verification minimally separated
by 14 days) on each monitor used to measure
Pb. The verification is made by checking the
operational flow rate of the monitor. If the
verification is made in conjunction with a
flow rate adjustment, it must be made prior
to such flow rate adjustment. For the
standard procedure, use a flow rate transfer
standard certified in accordance with section
2.6 of this appendix to check the monitor’s
normal flow rate. Care should be taken in
selecting and using the flow rate
measurement device such that it does not
alter the normal operating flow rate of the
monitor. The percent differences between the
audit and measured flow rates are reported
to AQS and used to assess the bias of the
monitoring data as described in section 4.2.2
of this appendix (using flow rates in lieu of
concentrations).
3.4.2 Flow Rate Verification for Pb High
Volume Samplers (greater than 200 liters/
minute). For high volume samplers, the
verification frequency is one verification
every 90 days (quarter) with four in a year.
Other than verification frequency, follow the
same technical procedure as described in
section 3.4.1 of this appendix.
3.4.3 Semi-Annual Flow Rate Audit for
Pb. Audit the flow rate of the particulate
monitor twice a year. The two audits should
ideally be spaced between 5 and 7 months
apart. The EPA strongly encourages more
frequent auditing. The audit should
(preferably) be conducted by a trained
experienced technician other than the
routine site operator. The audit is made by
measuring the monitor’s normal operating
flow rate using a flow rate transfer standard
certified in accordance with section 2.6 of
this appendix. The flow rate standard used
for auditing must not be the same flow rate
standard used for verifications or to calibrate
the monitor. However, both the calibration
standard and the audit standard may be
referenced to the same primary flow rate or
volume standard. Care must be taken in
auditing the flow rate to be certain that the
flow measurement device does not alter the
normal operating flow rate of the monitor.
Report the audit flow rate of the transfer
standard and the corresponding flow rate
measured by the monitor to AQS. The
percent differences between these flow rates
are used to evaluate monitor performance.
3.4.4 Collocated Quality Control
Sampling for TSP Pb for monitoring sites
other than non-source oriented NCore. For
each pair of collocated monitors for manual
TSP Pb samplers, designate one sampler as
the primary monitor whose concentrations
will be used to report air quality for the site,
and designate the other as the quality control
monitor.
3.4.4.1 A PQAO must:
(a) Have 15 percent of the primary
monitors (not counting non-source oriented
NCore sites in PQAO) collocated. Values of
0.5 and greater round up; and
(b) Have at least one collocated quality
control monitor (if the total number of
monitors is less than three).
3.4.4.2 The collocated quality control
monitors should be deployed according to
the following protocol:
(a) The first collocated Pb site selected
must be the site measuring the highest Pb
concentrations in the network. If the site is
impractical, alternative sites, approved by the
EPA Regional Administrator, may be
selected. If additional collocated sites are
necessary, collocated sites may be chosen
that reflect average ambient air Pb
concentrations in the network.
(b) The two collocated monitors must be
within 4 meters (inlet to inlet) of each other
and at least 2 meters apart for flow rates
greater than 200 liters/min or at least 1 meter
apart for samplers having flow rates less than
200 liters/min to preclude airflow
interference.
(c) Sample the collocated quality control
monitor on a 1-in-12 day schedule. Report
the measurements from both primary and
collocated quality control monitors at each
collocated sampling site to AQS. The
calculations for evaluating precision between
the two collocated monitors are described in
section 4.2.1 of this appendix.
3.4.5 Collocated Quality Control
Sampling for Pb–PM10 at monitoring sites
other than non-source oriented NCore. If a
PQAO is monitoring for Pb–PM10 at sites
other than at a non-source oriented NCore
site then the PQAO must:
3.4.5.1 Have 15 percent of the primary
monitors (not counting non-source oriented
NCore sites in PQAO) collocated. Values of
0.5 and greater round up; and
3.4.5.2 Have at least one collocated
quality control monitor (if the total number
of monitors is less than three).
3.4.5.3 The collocated monitors should be
deployed according to the following protocol:
(a) Fifty percent of the collocated quality
control monitors should be deployed at sites
with the highest 3-month average
concentrations and the remainder at the
PQAO's discretion.
(b) The two collocated monitors must be
within 4 meters (inlet to inlet) of each other
and at least 2 meters apart for flow rates
greater than 200 liters/min or at least 1 meter
apart for samplers having flow rates less than
200 liters/min to preclude airflow
interference. A waiver allowing up to 10
meters horizontal distance and up to 3 meters
vertical distance (inlet to inlet) between a
primary and collocated sampler may be
approved by the Regional Administrator for
sites at a neighborhood or larger scale of
representation. This waiver may be approved
during the annual network plan approval
process. Sampling and analytical
methodologies must be consistently implemented for both collocated samplers
and for all other samplers in the network.
(c) Sample the collocated quality control
monitor on a 1-in-12 day schedule. Report
the measurements from both primary and
collocated quality control monitors at each
collocated sampling site to AQS. The
calculations for evaluating precision between
the two collocated monitors are described in
section 4.2.1 of this appendix.
(d) In determining the number of
collocated quality control sites required for
Pb–PM10, monitoring networks for PM10
should be treated independently from
networks for Pb–PM10, even though the
separate networks may share one or more
common samplers. However, a single quality
control monitor that meets the collocation
requirements for Pb–PM10 and PM10 may
serve as a collocated quality control monitor
for both networks. Extreme care must be
taken when using a using the filter from a
quality control monitor for both PM10 and Pb
analysis. A PM10 filter weighing should occur
prior to any Pb analysis.
3.4.6 Pb Analysis Audits. Each calendar
quarter, audit the Pb reference or equivalent
method analytical procedure using filters
containing a known quantity of Pb. These
audit filters are prepared by depositing a Pb
standard on unexposed filters and allowing
them to dry thoroughly. The audit samples
must be prepared using batches of reagents
different from those used to calibrate the Pb
analytical equipment being audited. Prepare
audit samples in the following concentration
ranges:
Range | Equivalent ambient Pb concentration, µg/m3
1 | 30–100% of Pb NAAQS.
2 | 200–300% of Pb NAAQS.
(a) Extract the audit samples using the
same extraction procedure used for exposed
filters.
(b) Analyze three audit samples in each of
the two ranges each quarter samples are
analyzed. The audit sample analyses shall be
distributed as much as possible over the
entire calendar quarter.
(c) Report the audit concentrations (in µg Pb/filter or strip) and the corresponding measured concentrations (in µg Pb/filter or strip) to AQS using AQS unit code 077. The
percent differences between the
concentrations are used to calculate
analytical accuracy as described in section
4.2.6 of this appendix.
3.4.7 Pb PEP Procedures for monitoring
sites other than non-source oriented NCore.
The PEP is an independent assessment used
to estimate total measurement system bias.
These evaluations will be performed under
the NPEP described in section 2.4 of this
appendix or a comparable program. Each
year, one performance evaluation audit must
be performed at one Pb site in each primary
quality assurance organization that has less
than or equal to five sites and two audits at
PQAOs with greater than five sites. Non-source oriented NCore sites are not counted.
Siting of the PEP monitor must be consistent
with section 3.4.5.3(b). However, any
horizontal distance greater than 4 meters and
any vertical distance greater than 1 meter
must be reported to the EPA regional PEP
coordinator. In addition, each year, four
collocated samples from PQAOs with less
than or equal to five sites and six collocated
samples at PQAOs with greater than five sites
must be sent to an independent laboratory,
the same laboratory as the performance
evaluation audit, for analysis. The
calculations for evaluating bias between the
primary monitor and the performance
evaluation monitor for Pb are described in
section 4.2.4 of this appendix.
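As with the PM2.5 program, the Pb PEP counts in section 3.4.7 reduce to a small lookup; a sketch with illustrative names:

```python
def pb_pep_requirements(n_sites: int) -> dict:
    """Section 3.4.7: one PEP audit and four collocated samples per year for
    PQAOs with five or fewer Pb sites; two audits and six collocated samples
    for PQAOs with more than five sites (non-source oriented NCore sites are
    not counted)."""
    if n_sites <= 5:
        return {"audits": 1, "collocated_samples": 4}
    return {"audits": 2, "collocated_samples": 6}
```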
4. Calculations for Data Quality Assessments
(a) Calculations of measurement
uncertainty are carried out by the EPA
according to the following procedures. The
PQAOs must report the data to AQS for all
measurement quality checks as specified in
this appendix even though they may elect to
perform some or all of the calculations in this
section on their own.
(b) The EPA will provide annual
assessments of data quality aggregated by site
and PQAO for SO2, NO2, O3 and CO and by
PQAO for PM10, PM2.5, and Pb.
(c) At low concentrations, agreement between the measurements of collocated quality control samplers, expressed as relative percent difference or percent difference, may be relatively poor. For this reason, collocated measurement pairs are selected for use in the precision and bias calculations only when both measurements are equal to or above the following limits:
(1) Pb: 0.002 µg/m3 (Methods approved after 3/04/2010, with exception of manual equivalent method EQLA–0813–803).
(2) Pb: 0.02 µg/m3 (Methods approved before 3/04/2010, and manual equivalent method EQLA–0813–803).
(3) PM10 (Hi-Vol): 15 µg/m3.
(4) PM10 (Lo-Vol): 3 µg/m3.
(5) PM2.5: 3 µg/m3.
4.1 Statistics for the Assessment of QC Checks for SO2, NO2, O3 and CO.
4.1.1 Percent Difference. Many of the measurement quality checks start with a comparison of an audit concentration or value (flow rate) to the concentration/value measured by the monitor and use percent difference as the comparison statistic as described in equation 1 of this section. For each single point check, calculate the percent difference, di, as follows:

    di = [(meas − audit) / audit] × 100    (Equation 1)

where meas is the concentration indicated by the PQAO's instrument and audit is the audit concentration of the standard used in the QC check being measured.
4.1.2 Precision Estimate. The precision estimate is used to assess the one-point QC checks for SO2, NO2, O3, or CO described in section 3.1.1 of this appendix. The precision estimator is the coefficient of variation upper bound and is calculated using equation 2 of this section:

    CV = sqrt[ (n·Σdi² − (Σdi)²) / (n(n−1)) ] × sqrt[ (n−1) / χ²(0.1, n−1) ]    (Equation 2)

where n is the number of single point checks being aggregated; χ²(0.1, n−1) is the 10th percentile of a chi-squared distribution with n−1 degrees of freedom.
4.1.3 Bias Estimate. The bias estimate is calculated using the one-point QC checks for SO2, NO2, O3, or CO described in section 3.1.1 of this appendix. The bias estimator is an upper bound on the mean absolute value of the percent differences as described in equation 3 of this section:

    |bias| = AB + t(0.95, n−1) × AS / sqrt(n)    (Equation 3)

where n is the number of single point checks being aggregated; t(0.95, n−1) is the 95th quantile of a t-distribution with n−1 degrees of freedom; the quantity AB is the mean of the absolute values of the di's and is calculated using equation 4 of this section:

    AB = (1/n) × Σ|di|    (Equation 4)

and the quantity AS is the standard deviation of the absolute values of the di's and is calculated using equation 5 of this section:

    AS = sqrt[ (n·Σ|di|² − (Σ|di|)²) / (n(n−1)) ]    (Equation 5)

4.1.3.1 Assigning a sign (positive/negative) to the bias estimate. Since the bias statistic as calculated in equation 3 of this appendix uses absolute values, it does not have a tendency (negative or positive bias) associated with it. A sign will be designated by rank ordering the percent differences of the QC check samples from a given site for a particular assessment interval.
4.1.3.2 Calculate the 25th and 75th percentiles of the percent differences for each site. The absolute bias upper bound should be flagged as positive if both percentiles are positive and negative if both percentiles are negative. The absolute bias upper bound would not be flagged if the 25th and 75th percentiles are of different signs.
4.2 Statistics for the Assessment of PM10, PM2.5, and Pb.
4.2.1 Collocated Quality Control Sampler Precision Estimate for PM10, PM2.5 and Pb. Precision is estimated via duplicate measurements from collocated samplers. It is recommended that the precision be aggregated at the PQAO level quarterly, annually, and at the 3-year level. The data pair would only be considered valid if both concentrations are greater than or equal to the minimum values specified in section 4(c) of this appendix. For each collocated data pair, calculate the relative percent difference, di, using equation 6 of this appendix:

    di = [(Xi − Yi) / ((Xi + Yi) / 2)] × 100    (Equation 6)

where Xi is the concentration from the primary sampler and Yi is the concentration value from the audit sampler. The coefficient of variation upper bound is calculated using equation 7 of this appendix:

    CV = sqrt[ (n·Σdi² − (Σdi)²) / (2n(n−1)) ] × sqrt[ (n−1) / χ²(0.1, n−1) ]    (Equation 7)

where n is the number of valid data pairs being aggregated, and χ²(0.1, n−1) is the 10th percentile of a chi-squared distribution with n−1 degrees of freedom. The factor of 2 in the denominator adjusts for the fact that each di is calculated from two values with error.
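For readers implementing these assessments, the sketch below computes equations 1 through 7 with NumPy and SciPy. The function names and the library choices are ours; the rule defines only the statistics themselves.

```python
import math
import numpy as np
from scipy import stats

def percent_difference(meas, audit):
    """Equation 1: percent difference for a single point check."""
    return (meas - audit) / audit * 100.0

def relative_percent_difference(x_primary, y_qc):
    """Equation 6: relative percent difference for a collocated pair."""
    return (x_primary - y_qc) / ((x_primary + y_qc) / 2.0) * 100.0

def cv_upper_bound(d, collocated=False):
    """Equation 2 (equation 7 when collocated=True, which adds the
    factor of 2 for the error in both members of each pair)."""
    n = len(d)
    ss = n * sum(x * x for x in d) - sum(d) ** 2
    denom = n * (n - 1) * (2.0 if collocated else 1.0)
    chi2_10 = stats.chi2.ppf(0.10, n - 1)          # 10th percentile, n-1 df
    return math.sqrt(ss / denom) * math.sqrt((n - 1) / chi2_10)

def bias_upper_bound(d):
    """Equations 3-5, signed per sections 4.1.3.1-4.1.3.2."""
    n = len(d)
    ab = sum(abs(x) for x in d) / n                              # equation 4
    as_ = math.sqrt((n * sum(x * x for x in d)
                     - sum(abs(x) for x in d) ** 2) / (n * (n - 1)))  # equation 5
    bound = ab + stats.t.ppf(0.95, n - 1) * as_ / math.sqrt(n)   # equation 3
    q25, q75 = np.percentile(d, [25, 75])
    sign = "+" if (q25 > 0 and q75 > 0) else "-" if (q25 < 0 and q75 < 0) else ""
    return sign, bound
```

The same `bias_upper_bound` routine serves sections 4.2.2 through 4.2.4, which reuse equations 3 through 5 with flow rates or PEP pairs as the inputs.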
4.2.2 One-Point Flow Rate Verification
Bias Estimate for PM10, PM2.5 and Pb. For
each one-point flow rate verification,
calculate the percent difference in volume
using equation 1 of this appendix where
meas is the value indicated by the sampler’s
volume measurement and audit is the actual
volume indicated by the auditing flow meter.
The absolute volume bias upper bound is
then calculated using equation 3, where n is
the number of flow rate audits being
aggregated; t0.95,n–1 is the 95th quantile of a
t-distribution with n-1 degrees of freedom,
the quantity AB is the mean of the absolute
values of the di′s and is calculated using
equation 4 of this appendix, and the quantity
AS in equation 3 of this appendix is the
standard deviation of the absolute values of
the di′s and is calculated using equation 5 of
this appendix.
4.2.3 Semi-Annual Flow Rate Audit Bias
Estimate for PM10, PM2.5 and Pb. Use the
same procedure described in section 4.2.2 for
the evaluation of flow rate audits.
4.2.4 Performance Evaluation Programs
Bias Estimate for Pb. The Pb bias estimate is
calculated using the paired routine and the
PEP monitor as described in section 3.4.7.
Use the same procedures as described in
section 4.1.3 of this appendix.
4.2.5 Performance Evaluation Programs
Bias Estimate for PM2.5. The bias estimate is
calculated using the PEP audits described in
section 3.2.4 of this appendix. The bias
estimator is based on the mean percent
differences (Equation 1). The mean percent
difference, D, is calculated by Equation 8
below.
    D = (d1 + d2 + ... + dnj) / nj    (Equation 8)

where nj is the number of pairs and d1, d2, ..., dnj are the biases for each pair to be averaged.
4.2.6 Pb Analysis Audit Bias Estimate. The bias estimate is calculated using the analysis audit data described in section 3.4.6. Use the same bias estimate procedure as described in section 4.1.3 of this appendix.
5. Reporting Requirements
5.1 Reporting Requirements. For each pollutant, prepare a list of all monitoring sites and their AQS site identification codes in each PQAO and submit the list to the appropriate EPA Regional Office, with a copy to AQS. Whenever there is a change in this list of monitoring sites in a PQAO, report this change to the EPA Regional Office and to AQS.
5.1.1 Quarterly Reports. For each quarter, each PQAO shall report to AQS directly (or via the appropriate EPA Regional Office for organizations not direct users of AQS) the results of all valid measurement quality checks it has carried out during the quarter. The quarterly reports must be submitted consistent with the data reporting requirements specified for air quality data as set forth in 40 CFR 58.16. The EPA strongly encourages early submission of the quality assurance data in order to assist the PQAO's ability to control and evaluate the quality of the ambient air data.
5.1.2 Annual Reports.
5.1.2.1 When the PQAO has certified relevant data for the calendar year, the EPA will calculate and report the measurement uncertainty for the entire calendar year.
6. References
(1) American National Standard—Specifications and Guidelines for Quality Systems for Environmental Data Collection and Environmental Technology Programs. ANSI/ASQC E4–2014. February 2014. Available from American Society for Quality Control, 611 East Wisconsin Avenue, Milwaukee, WI 53202.
(2) EPA Requirements for Quality Management Plans. EPA QA/R–2. EPA/240/B–01/002. March 2001, Reissue May 2006. Office of Environmental Information, Washington, DC 20460. https://www.epa.gov/quality/agency-wide-quality-system-documents.
(3) EPA Requirements for Quality Assurance Project Plans for Environmental Data Operations. EPA QA/R–5. EPA/240/B–01/003. March 2001, Reissue May 2006. Office of Environmental Information, Washington, DC 20460. https://www.epa.gov/quality/agency-wide-quality-system-documents.
(4) EPA Traceability Protocol for Assay and Certification of Gaseous Calibration Standards. EPA–600/R–12/531. May 2012. Available from U.S. Environmental Protection Agency, National Risk Management Research Laboratory, Research Triangle Park, NC 27711. https://cfpub.epa.gov/si/si_public_record_report.cfm?dirEntryId=245292.
(5) Guidance for the Data Quality Objectives Process. EPA QA/G–4. EPA/240/B–06/001. February 2006. Office of Environmental Information, Washington, DC 20460. https://www.epa.gov/quality/agency-wide-quality-system-documents.
(6) List of Designated Reference and Equivalent Methods. Available from U.S. Environmental Protection Agency, National Exposure Research Laboratory, Human Exposure and Atmospheric Sciences Division, MD–D205–03, Research Triangle Park, NC 27711. https://www3.epa.gov/ttn/amtic/criteria.html.
(7) Transfer Standards for the Calibration of Ambient Air Monitoring Analyzers for Ozone. EPA–454/B–13–004. U.S. Environmental Protection Agency, Research Triangle Park, NC 27711. October 2013. https://www3.epa.gov/ttn/amtic/qapollutant.html.
(8) Paur, R.J. and F.F. McElroy. Technical Assistance Document for the Calibration of Ambient Ozone Monitors. EPA–600/4–79–057. U.S. Environmental Protection Agency, Research Triangle Park, NC 27711. September 1979. https://www.epa.gov/ttn/amtic/cpreldoc.html.
(9) Quality Assurance Handbook for Air Pollution Measurement Systems, Volume 1—A Field Guide to Environmental Quality Assurance. EPA–600/R–94/038a. April 1994. Available from U.S. Environmental Protection Agency, ORD Publications Office, Center for Environmental Research Information (CERI), 26 W. Martin Luther King Drive, Cincinnati, OH 45268. https://www3.epa.gov/ttn/amtic/qalist.html.
(10) Quality Assurance Handbook for Air Pollution Measurement Systems, Volume II: Ambient Air Quality Monitoring Program Quality System Development. EPA–454/B–13–003. https://www3.epa.gov/ttn/amtic/qalist.html.
(11) National Performance Evaluation Program Standard Operating Procedures. https://www3.epa.gov/ttn/amtic/npapsop.html.
TABLE A–1 OF APPENDIX A TO PART 58—MINIMUM DATA ASSESSMENT REQUIREMENTS FOR NAAQS RELATED CRITERIA POLLUTANT MONITORS

Method | Assessment method | Coverage | Minimum frequency | Parameters reported | AQS assessment type

Gaseous Methods (CO, NO2, SO2, O3)

One-Point QC for SO2, NO2, O3, CO | Response check at concentration 0.005–0.08 ppm SO2, NO2, O3, and 0.5 and 5 ppm CO | Each analyzer | Once per 2 weeks | Audit concentration 1 and measured concentration 2 | One-Point QC.
Annual performance evaluation for SO2, NO2, O3, CO | See section 3.1.2 of this appendix | Each analyzer | Once per year | Audit concentration 1 and measured concentration 2 for each level | Annual PE.
NPAP for SO2, NO2, O3, CO | Independent Audit | 20% of sites each year | Once per year | Audit concentration 1 and measured concentration 2 for each level | NPAP.

Particulate Methods

Continuous 4 method—collocated quality control sampling PM2.5 | Collocated samplers | 15% | 1-in-12 days | Primary sampler concentration and duplicate sampler concentration 3 | No Transaction reported as raw data.
Manual method—collocated quality control sampling PM10, PM2.5, Pb–TSP, Pb–PM10 | Collocated samplers | 15% | 1-in-12 days | Primary sampler concentration and duplicate sampler concentration 3 | No Transaction reported as raw data.
Flow rate verification PM10 (low Vol), PM2.5, Pb–PM10 | Check of sampler flow rate | Each sampler | Once every month | Audit flow rate and measured flow rate indicated by the sampler | Flow Rate Verification.
Flow rate verification PM10 (High-Vol), Pb–TSP | Check of sampler flow rate | Each sampler | Once every quarter | Audit flow rate and measured flow rate indicated by the sampler | Flow Rate Verification.
Semi-annual flow rate audit PM10, TSP, PM10–2.5, PM2.5, Pb–TSP, Pb–PM10 | Check of sampler flow rate using independent standard | Each sampler | Once every 6 months | Audit flow rate and measured flow rate indicated by the sampler | Semi Annual Flow Rate Audit.
Pb analysis audits Pb–TSP, Pb–PM10 | Check of analytical system with Pb audit strips/filters | Analytical | Once each quarter | Measured value and audit value (µg Pb/filter) using AQS unit code 077 | Pb Analysis Audits.
Performance Evaluation Program PM2.5 | Collocated samplers | (1) 5 valid audits for primary QA orgs with ≤ 5 sites. (2) 8 valid audits for primary QA orgs with > 5 sites. (3) All samplers in 6 years | Distributed over all 4 quarters | Primary sampler concentration and performance evaluation sampler concentration | PEP.
Performance Evaluation Program Pb–TSP, Pb–PM10 | Collocated samplers | (1) 1 valid audit and 4 collocated samples for primary QA orgs with ≤ 5 sites. (2) 2 valid audits and 6 collocated samples for primary QA orgs with > 5 sites | Distributed over all 4 quarters | Primary sampler concentration and performance evaluation sampler concentration; primary sampler concentration and duplicate sampler concentration | PEP.

1 Effective concentration for open path analyzers.
2 Corrected concentration, if applicable for open path analyzers.
3 Both primary and collocated sampler values are reported as raw data.
4 PM2.5 is the only particulate criteria pollutant requiring collocation of continuous and manual primary monitors.
TABLE A–2 OF APPENDIX A TO PART 58—SUMMARY OF PM2.5 NUMBER AND TYPE OF COLLOCATION (15% COLLOCATION REQUIREMENT) REQUIRED USING AN EXAMPLE OF A PQAO THAT HAS 54 PRIMARY MONITORS (54 SITES) WITH ONE FEDERAL REFERENCE METHOD TYPE AND THREE TYPES OF APPROVED FEDERAL EQUIVALENT METHODS

Primary sampler method designation | Total No. of monitors | Total No. of collocated | No. of collocated with FRM | No. of collocated with same method designation as primary
FRM     | 20 | 3 | 3 | 3
FEM (A) | 20 | 3 | 2 | 1
FEM (B) |  2 | 1 | 1 | 0
FEM (C) | 12 | 2 | 1 | 1
■ 10. Add Appendix B to part 58 to read as follows:
Appendix B to Part 58—Quality
Assurance Requirements for Prevention
of Significant Deterioration (PSD) Air
Monitoring
1. General Information
2. Quality System Requirements
3. Measurement Quality Check Requirements
4. Calculations for Data Quality Assessments
5. Reporting Requirements
6. References
1. General Information
1.1 Applicability.
(a) This appendix specifies the minimum
quality assurance requirements for the
control and assessment of the quality of the
ambient air monitoring data submitted to a
PSD reviewing authority or the EPA by an
organization operating an air monitoring
station, or network of stations, operated in
order to comply with Part 51 New Source
Review—Prevention of Significant
Deterioration (PSD). Such organizations are
encouraged to develop and maintain quality
assurance programs more extensive than the
required minimum. Additional guidance for
the requirements reflected in this appendix
can be found in the ‘‘Quality Assurance
Handbook for Air Pollution Measurement
Systems,’’ Volume II (Ambient Air) and
‘‘Quality Assurance Handbook for Air
Pollution Measurement Systems,’’ Volume IV
(Meteorological Measurements) and at a
national level in references 1, 2, and 3 of this
appendix.
(b) It is not assumed that data generated for
PSD under this appendix will be used in
making NAAQS decisions. However, if all
the requirements in this appendix are
followed (including the NPEP programs) and
reported to AQS, with review and
concurrence from the EPA region, data may
be used for NAAQS decisions. With the
exception of the NPEP programs (NPAP,
PM2.5 PEP, Pb–PEP), for which
implementation is at the discretion of the
PSD reviewing authority, all other quality
assurance and quality control requirements
found in the appendix must be met.
1.2 PSD Primary Quality Assurance
Organization (PQAO). A PSD PQAO is
defined as a monitoring organization or a
coordinated aggregation of such
organizations that is responsible for a set of
stations within one PSD reviewing authority
that monitors the same pollutant and for
which data quality assessments will be
pooled. Each criteria pollutant sampler/
monitor must be associated with only one
PSD PQAO.
1.2.1 Each PSD PQAO shall be defined
such that measurement uncertainty among all
stations in the organization can be expected
to be reasonably homogeneous, as a result of
common factors. A PSD PQAO must be
associated with only one PSD reviewing
authority. Common factors that should be
considered in defining PSD PQAOs include:
(a) Operation by a common team of field
operators according to a common set of
procedures;
(b) Use of a common QAPP and/or
standard operating procedures;
(c) Common calibration facilities and
standards;
(d) Oversight by a common quality
assurance organization; and
(e) Support by a common management
organization or laboratory.
1.2.2 A PSD monitoring organization having difficulty describing its PQAO or assigning specific monitors to a PSD PQAO
should consult with the PSD reviewing
authority. Any consolidation of PSD PQAOs
shall be subject to final approval by the PSD
reviewing authority.
1.2.3 Each PSD PQAO is required to
implement a quality system that provides
sufficient information to assess the quality of
the monitoring data. The quality system
must, at a minimum, include the specific
requirements described in this appendix.
Failure to conduct or pass a required check
or procedure, or a series of required checks
or procedures, does not by itself invalidate
data for regulatory decision making. Rather,
PSD PQAOs and the PSD reviewing authority
shall use the checks and procedures required
in this appendix in combination with other
data quality information, reports, and similar
documentation that demonstrate overall
compliance with parts 51, 52 and 58 of this
chapter. Accordingly, the PSD reviewing
authority shall use a ‘‘weight of evidence’’
approach when determining the suitability of
data for regulatory decisions. The PSD
reviewing authority reserves the authority to
use or not use monitoring data submitted by
a PSD monitoring organization when making
regulatory decisions based on the PSD
reviewing authority’s assessment of the
quality of the data. Generally, consensus-built validation templates or validation
criteria already approved in quality
assurance project plans (QAPPs) should be
used as the basis for the weight of evidence
approach.
1.3 Definitions.
(a) Measurement Uncertainty. A term used
to describe deviations from a true
concentration or estimate that are related to
the measurement process and not to spatial
or temporal population attributes of the air
being measured.
(b) Precision. A measurement of mutual
agreement among individual measurements
of the same property usually under
prescribed similar conditions, expressed
generally in terms of the standard deviation.
(c) Bias. The systematic or persistent
distortion of a measurement process which
causes errors in one direction.
(d) Accuracy. The degree of agreement
between an observed value and an accepted
reference value. Accuracy includes a
combination of random error (imprecision)
and systematic error (bias) components
which are due to sampling and analytical
operations.
(e) Completeness. A measure of the amount
of valid data obtained from a measurement
system compared to the amount that was
expected to be obtained under correct,
normal conditions.
(f) Detectability. The low critical range
value of a characteristic that a method
specific procedure can reliably discern.
1.4 Measurement Quality Check
Reporting. The measurement quality checks
described in section 3 of this appendix are
required to be submitted to the PSD
reviewing authority within the same time
frame as routinely-collected ambient
concentration data as described in 40 CFR
58.16. The PSD reviewing authority may also require that the measurement quality check data be reported to AQS.
1.5 Assessments and Reports. Periodic
assessments and documentation of data
quality are required to be reported to the PSD
reviewing authority. To provide national
uniformity in this assessment and reporting
of data quality for all networks, specific
assessment and reporting procedures are
prescribed in detail in sections 3, 4, and 5 of
this appendix.
2. Quality System Requirements
A quality system (reference 1 of this
appendix) is the means by which an
organization manages the quality of the
monitoring information it produces in a
systematic, organized manner. It provides a
framework for planning, implementing,
assessing and reporting work performed by
an organization and for carrying out required
quality assurance and quality control
activities.
2.1 Quality Assurance Project Plans. All
PSD PQAOs must develop a quality system
that is described and approved in quality
assurance project plans (QAPP) to ensure that
the monitoring results:
(a) Meet a well-defined need, use, or
purpose (reference 5 of this appendix);
(b) Provide data of adequate quality for the
intended monitoring objectives;
(c) Satisfy stakeholder expectations;
(d) Comply with applicable standards
specifications;
(e) Comply with statutory (and other legal)
requirements; and
(f) Assure quality assurance and quality
control adequacy and independence.
2.1.1 The QAPP is a formal document
that describes these activities in sufficient
detail and is supported by standard operating
procedures. The QAPP must describe how
the organization intends to control
measurement uncertainty to an appropriate
level in order to achieve the objectives for
which the data are collected. The QAPP must
be documented in accordance with EPA
requirements (reference 3 of this appendix).
2.1.2 The PSD PQAO’s quality system
must have adequate resources both in
personnel and funding to plan, implement,
assess and report on the achievement of the
requirements of this appendix and its approved QAPP.
2.1.3 Incorporation of quality
management plan (QMP) elements into the
QAPP. The QMP describes the quality system
in terms of the organizational structure,
functional responsibilities of management
and staff, lines of authority, and required
interfaces for those planning, implementing,
assessing and reporting activities involving
environmental data operations (EDO). The
PSD PQAOs may combine pertinent elements
of the QMP into the QAPP rather than
requiring the submission of both QMP and
QAPP documents separately, with prior
approval of the PSD reviewing authority.
Additional guidance on QMPs can be found
in reference 2 of this appendix.
2.2 Independence of Quality Assurance
Management. The PSD PQAO must provide
for a quality assurance management function
for its PSD data collection operation, that
aspect of the overall management system of
the organization that determines and
implements the quality policy defined in a
PSD PQAO’s QAPP. Quality management
includes strategic planning, allocation of
resources and other systematic planning
activities (e.g., planning, implementation,
assessing and reporting) pertaining to the
quality system. The quality assurance
management function must have sufficient
technical expertise and management
authority to conduct independent oversight
and assure the implementation of the
organization’s quality system relative to the
ambient air quality monitoring program and
should be organizationally independent of
environmental data generation activities.
2.3 Data Quality Performance
Requirements.
2.3.1 Data Quality Objectives (DQOs).
The DQOs, or the results of other systematic
planning processes, are statements that
define the appropriate type of data to collect
and specify the tolerable levels of potential
decision errors that will be used as a basis
for establishing the quality and quantity of
data needed to support air monitoring
objectives (reference 5 of the appendix). The
DQOs have been developed by the EPA to
support attainment decisions for comparison
to national ambient air quality standards
(NAAQS). The PSD reviewing authority and
the PSD monitoring organization will be
jointly responsible for determining whether
adherence to the EPA developed NAAQS
DQOs specified in appendix A of this part are
appropriate or if DQOs from a project-specific systematic planning process are
necessary.
2.3.1.1 Measurement Uncertainty for
Automated and Manual PM2.5 Methods. The
goal for acceptable measurement uncertainty
for precision is defined as an upper 90
percent confidence limit for the coefficient of
variation (CV) of 10 percent and plus or
minus 10 percent for total bias.
2.3.1.2 Measurement Uncertainty for
Automated Ozone Methods. The goal for
acceptable measurement uncertainty is
defined for precision as an upper 90 percent
confidence limit for the CV of 7 percent and
for bias as an upper 95 percent confidence
limit for the absolute bias of 7 percent.
2.3.1.3 Measurement Uncertainty for Pb
Methods. The goal for acceptable
measurement uncertainty is defined for
precision as an upper 90 percent confidence
limit for the CV of 20 percent and for bias
as an upper 95 percent confidence limit for
the absolute bias of 15 percent.
2.3.1.4 Measurement Uncertainty for
NO2. The goal for acceptable measurement
uncertainty is defined for precision as an
upper 90 percent confidence limit for the CV
of 15 percent and for bias as an upper 95
percent confidence limit for the absolute bias
of 15 percent.
2.3.1.5 Measurement Uncertainty for SO2.
The goal for acceptable measurement
uncertainty for precision is defined as an
upper 90 percent confidence limit for the CV
of 10 percent and for bias as an upper 95
percent confidence limit for the absolute bias
of 10 percent.
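These pollutant-specific goals lend themselves to a small lookup table. A sketch; the layout is ours, and note that the PM2.5 bias goal is stated as plus or minus 10 percent total bias rather than as a confidence limit:

```python
# (90% upper confidence limit on CV, limit on absolute bias), in percent,
# from sections 2.3.1.1-2.3.1.5 above.
UNCERTAINTY_GOALS = {
    "PM2.5": (10.0, 10.0),  # bias goal stated as +/- 10 percent total bias
    "O3":    (7.0, 7.0),
    "Pb":    (20.0, 15.0),
    "NO2":   (15.0, 15.0),
    "SO2":   (10.0, 10.0),
}

def meets_goals(pollutant: str, cv_ucl: float, bias_ucl: float) -> bool:
    """Compare computed upper bounds (appendix A, equations 2 and 3 style)
    against the acceptance goals for the pollutant."""
    cv_goal, bias_goal = UNCERTAINTY_GOALS[pollutant]
    return cv_ucl <= cv_goal and abs(bias_ucl) <= bias_goal
```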
2.4 National Performance Evaluation
Program. Organizations operating PSD
monitoring networks are required to
implement the EPA’s national performance
evaluation program (NPEP) if the data will be
used for NAAQS decisions and at the
discretion of the PSD reviewing authority if
PSD data are not used for NAAQS decisions.
The NPEP includes the National Performance
Audit Program (NPAP), the PM2.5
Performance Evaluation Program (PM2.5-PEP)
and the Pb Performance Evaluation Program
(Pb-PEP). The PSD QAPP shall provide for
the implementation of NPEP including the
provision of adequate resources for such
NPEP if the data will be used for NAAQS
decisions or if required by the PSD reviewing
authority. Contact the PSD reviewing
authority to determine the best procedure for
implementing the audits which may include
an audit by the PSD reviewing authority, a
contractor certified for the activity, or
through self-implementation which is
described in sections below. A determination
of which entity will be performing this audit
program should be made as early as possible
and during the QAPP development process.
The PSD PQAOs, including contractors that plan to implement these programs on behalf of PSD PQAOs (self-implement) rather than use the federal programs, must meet the
adequacy requirements found in the
appropriate sections that follow, as well as
meet the definition of independent
assessment that follows.
2.4.1 Independent Assessment. An
assessment performed by a qualified
individual, group, or organization that is not
part of the organization directly performing
and accountable for the work being assessed.
This auditing organization must not be
involved with the generation of the routinely-collected ambient air monitoring data. An
organization can conduct the performance
evaluation (PE) if it can meet this definition
and has a management structure that, at a
minimum, will allow for the separation of its
routine sampling personnel from its auditing
personnel by two levels of management. In
addition, the sample analysis of audit filters
must be performed by a laboratory facility
and laboratory equipment separate from the
facilities used for routine sample analysis.
Field and laboratory personnel will be
required to meet the performance evaluation
field and laboratory training and certification
requirements. The PSD PQAO will be
required to participate in the centralized field
and laboratory standards certification and
comparison processes to establish
comparability to federally implemented
programs.
2.5 Technical Systems Audit Program.
The PSD reviewing authority or the EPA may
conduct system audits of the ambient air
monitoring programs or organizations
operating PSD networks. The PSD monitoring
organizations shall consult with the PSD
reviewing authority to verify the schedule of
any such technical systems audit. Systems
audit programs are described in reference 10
of this appendix.
2.6 Gaseous and Flow Rate Audit
Standards.
2.6.1 Gaseous pollutant concentration
standards (permeation devices or cylinders of
compressed gas) used to obtain test
concentrations for carbon monoxide (CO),
sulfur dioxide (SO2), nitrogen oxide (NO),
and nitrogen dioxide (NO2) must be traceable
to either a National Institute of Standards and
Technology (NIST) Traceable Reference
Material (NTRM) or a NIST-certified Gas
Manufacturer’s Internal Standard (GMIS),
certified in accordance with one of the
procedures given in reference 4 of this
appendix. Vendors advertising certification
with the procedures provided in reference 4
of this appendix and distributing gases as
‘‘EPA Protocol Gas’’ must participate in the
EPA Protocol Gas Verification Program or not
use ‘‘EPA’’ in any form of advertising. The
PSD PQAOs must provide information to the
PSD reviewing authority on the gas vendors
they use (or will use) for the duration of the
PSD monitoring project. This information can
be provided in the QAPP or monitoring plan,
but must be updated if there is a change in
the producer used.
2.6.2 Test concentrations for ozone (O3)
must be obtained in accordance with the
ultraviolet photometric calibration procedure
specified in appendix D to Part 50, and by
means of a certified NIST-traceable O3
transfer standard. Consult references 7 and 8
of this appendix for guidance on transfer
standards for O3.
2.6.3 Flow rate measurements must be
made by a flow measuring instrument that is
NIST-traceable to an authoritative volume or
other applicable standard. Guidance for
certifying some types of flow-meters is
provided in reference 10 of this appendix.
2.7 Primary Requirements and Guidance.
Requirements and guidance documents for
developing the quality system are contained
in references 1 through 11 of this appendix,
which also contain many suggested
procedures, checks, and control
specifications. Reference 10 describes
specific guidance for the development of a
quality system for data collected for
comparison to the NAAQS. Many specific
quality control checks and specifications for
methods are included in the respective
reference methods described in Part 50 or in
the respective equivalent method
descriptions available from the EPA
(reference 6 of this appendix). Similarly,
quality control procedures related to
specifically designated reference and
equivalent method monitors are contained in
the respective operation or instruction
manuals associated with those monitors. For
PSD monitoring, the use of reference and equivalent method monitors is required.
3. Measurement Quality Check Requirements
This section provides the requirements for
PSD PQAOs to perform the measurement
quality checks that can be used to assess data
quality. Data from these checks are required
to be submitted to the PSD reviewing
authority within the same time frame as
routinely-collected ambient concentration
data as described in 40 CFR 58.16. Table B–
1 of this appendix provides a summary of the
types and frequency of the measurement
quality checks that are described in this
section. Reporting these results to AQS may
be required by the PSD reviewing authority.
3.1 Gaseous monitors of SO2, NO2, O3, and
CO.
3.1.1 One-Point Quality Control (QC)
Check for SO2, NO2, O3, and CO. (a) A one-point QC check must be performed at least
once every 2 weeks on each automated
monitor used to measure SO2, NO2, O3 and
CO. With the advent of automated calibration
systems, more frequent checking is strongly
encouraged and may be required by the PSD
reviewing authority. See Reference 10 of this
appendix for guidance on the review
procedure. The QC check is made by
challenging the monitor with a QC check gas
of known concentration (effective
concentration for open path monitors)
between the prescribed range of 0.005 and
0.08 parts per million (ppm) for SO2, NO2,
and O3, and between the prescribed range of
0.5 and 5 ppm for CO monitors. The QC
check gas concentration selected within the
prescribed range should be related to
monitoring objectives for the monitor. If monitoring at trace levels, the QC
check concentration should be selected to
represent the mean or median concentrations
at the site. If the mean or median
concentrations at trace gas sites are below the
MDL of the instrument, the agency can select
the lowest concentration in the prescribed
range that can be practically achieved. If the
mean or median concentrations at trace gas
sites are above the prescribed range, the
agency can select the highest concentration
in the prescribed range. The PSD monitoring
organization will consult with the PSD
reviewing authority on the most appropriate
one-point QC concentration based on the
objectives of the monitoring activity. An
additional QC check point is encouraged for
those organizations that may have occasional
high values or would like to confirm the
monitors’ linearity at the higher end of the
operational range or around NAAQS
concentrations. If monitoring for NAAQS
decisions, the QC concentration can be
selected at a higher concentration within the
prescribed range but should also consider
precision points around mean or median
concentrations.
(b) Point analyzers must operate in their
normal sampling mode during the QC check
and the test atmosphere must pass through
all filters, scrubbers, conditioners and other
components used during normal ambient
sampling and as much of the ambient air
inlet system as is practicable. The QC check
must be conducted before any calibration or
adjustment to the monitor.
(c) Open-path monitors are tested by
inserting a test cell containing a QC check gas
concentration into the optical measurement
beam of the instrument. If possible, the
normally used transmitter, receiver, and as
appropriate, reflecting devices should be
used during the test and the normal
monitoring configuration of the instrument
should be altered as little as possible to
accommodate the test cell for the test.
However, if permitted by the associated
operation or instruction manual, an alternate
local light source or an alternate optical path
that does not include the normal atmospheric
monitoring path may be used. The actual
concentration of the QC check gas in the test
cell must be selected to produce an effective
concentration in the range specified earlier in
this section. Generally, the QC test
concentration measurement will be the sum
of the atmospheric pollutant concentration
and the QC test concentration. As such, the
result must be corrected to remove the
atmospheric concentration contribution. The
corrected concentration is obtained by
subtracting the average of the atmospheric
concentrations measured by the open path
instrument under test immediately before
and immediately after the QC test from the
QC check gas concentration measurement. If
the difference between these before and after
measurements is greater than 20 percent of
the effective concentration of the test gas,
discard the test result and repeat the test. If
possible, open path monitors should be
tested during periods when the atmospheric
pollutant concentrations are relatively low
and steady.
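The correction and the 20 percent stability screen described above can be stated compactly. The following sketch is illustrative only and is not part of the rule; variable names are hypothetical.

    # Illustrative sketch (not part of the rule): correcting an open-path
    # QC check result per section 3.1.1(c).

    def corrected_qc_concentration(qc_measured, ambient_before, ambient_after,
                                   effective_conc):
        """Subtract the mean of the before/after ambient readings from the
        QC measurement; reject the test if the ambient readings differ by
        more than 20 percent of the effective test-gas concentration."""
        if abs(ambient_before - ambient_after) > 0.20 * effective_conc:
            return None  # discard the result and repeat the test
        return qc_measured - (ambient_before + ambient_after) / 2.0

    # Example: 0.095 ppm measured during the check, ambient 0.030/0.032 ppm,
    # effective test concentration 0.060 ppm
    print(corrected_qc_concentration(0.095, 0.030, 0.032, 0.060))  # ~0.064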
(d) Report the audit concentration of the
QC gas and the corresponding measured
concentration indicated by the monitor. The
percent differences between these
concentrations are used to assess the
precision and bias of the monitoring data as
described in sections 4.1.2 (precision) and
4.1.3 (bias) of this appendix.
3.1.2 Quarterly performance evaluation
for SO2, NO2, O3, or CO. Evaluate each
primary monitor each monitoring quarter (or
90 day frequency) during which monitors are
operated or at least once (if operated for less
than one quarter). The quarterly performance
evaluation (quarterly PE) must be performed
by a qualified individual, group, or
organization that is not part of the
organization directly performing and
accountable for the work being assessed. The
person or entity performing the quarterly PE
must not be involved with the generation of
the routinely-collected ambient air
monitoring data. A PSD monitoring
organization can conduct the quarterly PE
itself if it can meet this definition and has a
management structure that, at a minimum,
will allow for the separation of its routine
sampling personnel from its auditing
personnel by two levels of management. The
quarterly PE also requires a set of equipment
and standards independent from those used
for routine calibrations or zero, span or
precision checks.
3.1.2.1 The evaluation is made by
challenging the monitor with audit gas
standards of known concentration from at
least three audit levels. One point must be
within two to three times the method
detection limit of the instruments within the
PQAO's network; the second point will be
less than or equal to the 99th percentile of
the data at the site or the network of sites in
the PQAO or the next highest audit
concentration level. The third point can be
around the primary NAAQS or the highest 3-year concentration at the site or the network
of sites in the PQAO. An additional 4th level
is encouraged for those PSD organizations
that would like to confirm the monitor’s
linearity at the higher end of the operational
range. In rare circumstances, there may be
sites measuring concentrations above audit
level 10. These sites should be identified to
the PSD reviewing authority.
------------------------------------------------------------------------------
                                  Concentration range, ppm
 Audit level  ----------------------------------------------------------------
                   O3              SO2              NO2              CO
------------------------------------------------------------------------------
 1 .........  0.004–0.0059   0.0003–0.0029   0.0003–0.0029    0.020–0.059
 2 .........  0.006–0.019    0.0030–0.0049   0.0030–0.0049    0.060–0.199
 3 .........  0.020–0.039    0.0050–0.0079   0.0050–0.0079    0.200–0.899
 4 .........  0.040–0.069    0.0080–0.0199   0.0080–0.0199    0.900–2.999
 5 .........  0.070–0.089    0.0200–0.0499   0.0200–0.0499    3.000–7.999
 6 .........  0.090–0.119    0.0500–0.0999   0.0500–0.0999    8.000–15.999
 7 .........  0.120–0.139    0.1000–0.1499   0.1000–0.2999   16.000–30.999
 8 .........  0.140–0.169    0.1500–0.2599   0.3000–0.4999   31.000–39.999
 9 .........  0.170–0.189    0.2600–0.7999   0.5000–0.7999   40.000–49.999
 10 ........  0.190–0.259    0.8000–1.000    0.8000–1.000    50.000–60.000
------------------------------------------------------------------------------
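Given the table above, identifying which audit level brackets a planned test concentration is a range lookup. The following sketch is illustrative only and is not part of the rule; only the O3 column is shown, transcribed from the table.

    # Illustrative sketch (not part of the rule): finding the audit level
    # from the table above that brackets a target O3 concentration.

    O3_AUDIT_LEVELS_PPM = {
        1: (0.004, 0.0059), 2: (0.006, 0.019), 3: (0.020, 0.039),
        4: (0.040, 0.069),  5: (0.070, 0.089), 6: (0.090, 0.119),
        7: (0.120, 0.139),  8: (0.140, 0.169), 9: (0.170, 0.189),
        10: (0.190, 0.259),
    }

    def o3_audit_level(conc_ppm):
        """Return the audit level whose range contains conc_ppm, or None."""
        for level, (lo, hi) in O3_AUDIT_LEVELS_PPM.items():
            if lo <= conc_ppm <= hi:
                return level
        return None

    print(o3_audit_level(0.065))  # -> 4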
3.1.2.2 The NO2 audit techniques may
vary depending on the ambient monitoring
method. For chemiluminescence-type NO2
analyzers, gas phase titration (GPT)
techniques should be based on the EPA
guidance documents and monitoring agency
experience. The NO2 gas standards may be
more appropriate than GPT for direct NO2
methods that do not employ converters. Care
should be taken to ensure the stability of
such gas standards prior to use.
3.1.2.3 The standards from which audit
gas test concentrations are obtained must
meet the specifications of section 2.6.1 of this
appendix.
3.1.2.4 For point analyzers, the
evaluation shall be carried out by allowing
the monitor to analyze the audit gas test
atmosphere in its normal sampling mode
such that the test atmosphere passes through
all filters, scrubbers, conditioners, and other
sample inlet components used during normal
ambient sampling and as much of the
ambient air inlet system as is practicable.
3.1.2.5 Open-path monitors are evaluated
by inserting a test cell containing the various
audit gas concentrations into the optical
measurement beam of the instrument. If
possible, the normally used transmitter,
receiver, and, as appropriate, reflecting
devices should be used during the
evaluation, and the normal monitoring
configuration of the instrument should be
modified as little as possible to accommodate
the test cell for the evaluation. However, if
permitted by the associated operation or
instruction manual, an alternate local light
source or an alternate optical path that does
not include the normal atmospheric
monitoring path may be used. The actual
concentrations of the audit gas in the test cell
must be selected to produce effective
concentrations in the evaluation level ranges
specified in this section of this appendix.
Generally, each evaluation concentration
measurement result will be the sum of the
atmospheric pollutant concentration and the
evaluation test concentration. As such, the
result must be corrected to remove the
atmospheric concentration contribution. The
corrected concentration is obtained by
subtracting the average of the atmospheric
concentrations measured by the open-path
instrument under test immediately before
and immediately after the evaluation test (or
preferably before and after each evaluation
concentration level) from the evaluation
concentration measurement. If the difference
between the before and after measurements is
greater than 20 percent of the effective
concentration of the test gas standard,
discard the test result for that concentration
VerDate Sep<11>2014
15:02 Mar 25, 2016
Jkt 238001
0.020–0.039
0.040–0.069
0.070–0.089
0.090–0.119
0.120–0.139
0.140–0.169
0.170–0.189
0.190–0.259
level and repeat the test for that level. If
possible, open-path monitors should be
evaluated during periods when the
atmospheric pollutant concentrations are
relatively low and steady. Also, if the open-path instrument is not installed in a
permanent manner, the monitoring path
length must be reverified to be within ±3
percent to validate the evaluation, since the
monitoring path length is critical to the
determination of the effective concentration.
3.1.2.6 Report both the evaluation
concentrations (effective concentrations for
open-path monitors) of the audit gases and
the corresponding measured concentration
(corrected concentrations, if applicable, for
open-path monitors) indicated or produced
by the monitor being tested. The percent
differences between these concentrations are
used to assess the quality of the monitoring
data as described in section 4.1.1 of this
appendix.
3.1.3 National Performance Audit
Program (NPAP). As stated in sections 1.1
and 2.4, PSD monitoring networks may be
subject to the NPEP, which includes the
NPAP. The NPAP is a performance
evaluation which is a type of audit where
quantitative data are collected independently
in order to evaluate the proficiency of an
analyst, monitoring instrument and
laboratory. Due to the implementation
approach used in this program, NPAP
provides for a national independent
assessment of performance with a consistent
level of data quality. The NPAP should not
be confused with the quarterly PE program
described in section 3.1.2. The PSD
organizations shall consult with the PSD
reviewing authority or the EPA regarding
whether the implementation of NPAP is
required and the implementation options
available. Details of the EPA NPAP can be
found in reference 11 of this appendix. The
program requirements include:
3.1.3.1 Performing audits on 100 percent
of monitors and sites each year including
monitors and sites that may be operated for
less than 1 year. The PSD reviewing authority
has the authority to require more frequent
audits at sites they consider to be high
priority.
3.1.3.2 Developing a delivery system that
will allow for the audit concentration gases
to be introduced at the probe inlet where
logistically feasible.
3.1.3.3 Using audit gases that are verified
against the National Institute of Standards
and Technology (NIST) standard reference
methods or special review procedures and
validated annually for CO, SO2 and NO2, and
at the beginning of each quarter of audits for
O3.
3.1.3.4 The PSD PQAO may elect to self-implement NPAP. In these cases, the PSD
reviewing authority will work with those
PSD PQAOs to establish training and other
technical requirements to establish
comparability to federally implemented
programs. In addition to meeting the
requirements in sections 3.1.3.1 through
3.1.3.3, the PSD PQAO must:
(a) Ensure that the PSD audit system is
equivalent to the EPA NPAP audit system
and is an entirely separate set of equipment
and standards from the equipment used for
quarterly performance evaluations. If this
system does not generate and analyze the audit concentrations as the EPA NPAP system does, it must be proven to be as accurate as the EPA NPAP system under a full range of appropriate and varying conditions as described in section 3.1.3.6.
(b) Perform a whole system check by
having the PSD audit system tested at an
independent and qualified EPA lab, or
equivalent.
(c) Evaluate the system with the EPA NPAP
program through collocated auditing at an
acceptable number of sites each year (at least
one for a PSD network of five or less sites;
at least two for a network with more than five
sites).
(d) Incorporate the NPAP into the PSD
PQAO’s QAPP.
(e) Be subject to review by independent,
EPA-trained personnel.
(f) Participate in initial and update
training/certification sessions.
3.2 PM2.5.
3.2.1 Flow Rate Verification for PM2.5. A
one-point flow rate verification check must
be performed at least once every month (each
verification minimally separated by 14 days)
on each monitor used to measure PM2.5. The
verification is made by checking the
operational flow rate of the monitor. If the
verification is made in conjunction with a
flow rate adjustment, it must be made prior
to such flow rate adjustment. For the
standard procedure, use a flow rate transfer
standard certified in accordance with section
2.6 of this appendix to check the monitor’s
normal flow rate. Care should be used in
selecting and using the flow rate
measurement device such that it does not
alter the normal operating flow rate of the
monitor. Flow rate verification results are to
be reported to the PSD reviewing authority
quarterly as described in section 5.1.
Reporting these results to AQS is encouraged.
The percent differences between the audit
and measured flow rates are used to assess
the bias of the monitoring data as described
in section 4.2.2 of this appendix (using flow
rates in lieu of concentrations).
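The flow-rate check statistic is equation 1 of section 4 applied to flows instead of concentrations. The following sketch is illustrative only and is not part of the rule; the example values are hypothetical.

    # Illustrative sketch (not part of the rule): the flow rate percent
    # difference used in sections 3.2.1 and 4.2.2, i.e., equation 1
    # applied with flow rates in lieu of concentrations.

    def flow_percent_difference(measured_lpm, audit_lpm):
        """Percent difference between the sampler-indicated flow and the
        flow indicated by the certified transfer standard."""
        return (measured_lpm - audit_lpm) / audit_lpm * 100.0

    # Example: sampler reports 16.9 L/min against an audit value of 16.67
    print(round(flow_percent_difference(16.9, 16.67), 2))  # -> 1.38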
3.2.2 Semi-Annual Flow Rate Audit for
PM2.5. Every 6 months, audit the flow rate of
the PM2.5 particulate monitors. For short-term monitoring operations (those less than
1 year), the flow rate audits must occur at
start up, at the midpoint, and near the
completion of the monitoring project. The
audit must be conducted by a trained
technician other than the routine site
operator. The audit is made by measuring the
monitor’s normal operating flow rate using a
flow rate transfer standard certified in
accordance with section 2.6 of this appendix.
The flow rate standard used for auditing
must not be the same flow rate standard used
for verifications or to calibrate the monitor.
However, both the calibration standard and
the audit standard may be referenced to the
same primary flow rate or volume standard.
Care must be taken in auditing the flow rate
to be certain that the flow measurement
device does not alter the normal operating
flow rate of the monitor. Report the audit
flow rate of the transfer standard and the
corresponding flow rate measured by the
monitor. The percent differences between
these flow rates are used to evaluate monitor
performance.
3.2.3 Collocated Sampling Procedures for
PM2.5. A PSD PQAO must have at least one
collocated monitor for each PSD monitoring
network.
3.2.3.1 For each pair of collocated
monitors, designate one sampler as the
primary monitor whose concentrations will
be used to report air quality for the site, and
designate the other as the QC monitor. There
can be only one primary monitor at a
monitoring site for a given time period.
(a) If the primary monitor is a FRM, then
the quality control monitor must be a FRM
of the same method designation.
(b) If the primary monitor is a FEM, then
the quality control monitor must be a FRM
unless the PSD PQAO submits a waiver for
this requirement, provides a specific reason
why a FRM cannot be implemented, and the
waiver is approved by the PSD reviewing
authority. If the waiver is approved, then the
quality control monitor must be the same
method designation as the primary FEM
monitor.
3.2.3.2 In addition, the collocated
monitors should be deployed according to
the following protocol:
(a) The collocated quality control
monitor(s) should be deployed at sites with
the highest predicted daily PM2.5
concentrations in the network. If the highest
PM2.5 concentration site is impractical for
collocation purposes, alternative sites
approved by the PSD reviewing authority
may be selected. If additional collocated sites
are necessary, the PSD PQAO and the PSD
reviewing authority should determine the
appropriate location(s) based on data needs.
(b) The two collocated monitors must be
within 4 meters of each other and at least 2
meters apart for flow rates greater than 200
liters/min or at least 1 meter apart for
samplers having flow rates less than 200
liters/min to preclude airflow interference. A
waiver allowing up to 10 meters horizontal
distance and up to 3 meters vertical distance
(inlet to inlet) between a primary and
collocated quality control monitor may be
approved by the PSD reviewing authority for
sites at a neighborhood or larger scale of
representation. This waiver may be approved
during the QAPP review and approval
process. Sampling and analytical
methodologies must be consistently
implemented for both collocated samplers
and for all other samplers in the network.
(c) Sample the collocated quality control
monitor on a 6-day schedule for sites not
requiring daily monitoring and on a 3-day
schedule for any site requiring daily
monitoring. Report the measurements from
both primary and collocated quality control
monitors at each collocated sampling site.
The calculations for evaluating precision
between the two collocated monitors are
described in section 4.2.1 of this appendix.
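The separation requirements in section 3.2.3.2(b) are easily checked mechanically. The following sketch is illustrative only and is not part of the rule; it does not model the 10-meter/3-meter waiver path, and the inputs are hypothetical.

    # Illustrative sketch (not part of the rule): checking the collocation
    # separation requirements in section 3.2.3.2(b).

    def collocation_spacing_ok(separation_m, flow_lpm):
        """Monitors must be within 4 m of each other, and at least 2 m
        apart for flows greater than 200 L/min or at least 1 m apart for
        lower flows, to preclude airflow interference."""
        min_sep = 2.0 if flow_lpm > 200 else 1.0
        return min_sep <= separation_m <= 4.0

    print(collocation_spacing_ok(2.5, 1000))   # high-volume pair -> True
    print(collocation_spacing_ok(0.8, 16.7))   # low-volume pair  -> False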
3.2.4 PM2.5 Performance Evaluation
Program (PEP) Procedures. As stated in
sections 1.1 and 2.4 of this appendix, PSD
monitoring networks may be subject to the
NPEP, which includes the PM2.5 PEP. The
PSD monitoring organizations shall consult
with the PSD reviewing authority or the EPA
regarding whether the implementation of
PM2.5 PEP is required and the
implementation options available for the
PM2.5 PEP. For PSD PQAOs with less than or
equal to five monitoring sites, five valid
performance evaluation audits must be
collected and reported each year. For PSD
PQAOs with greater than five monitoring
sites, eight valid performance evaluation
audits must be collected and reported each
year. Additionally, within the five or eight
required audits, each type of method
designation (FRM/FEM designation) used as
a primary monitor in the PSD network shall
be audited. For a PE to be valid, both the
primary monitor and PEP audit
measurements must meet quality control
requirements and be above 3 µg/m3 or a
predefined lower concentration level
determined by a systematic planning process
and approved by the PSD reviewing
authority. Due to the relatively short-term
nature of most PSD monitoring, the
likelihood of measuring low concentrations
in many areas attaining the PM2.5 standard
and the time required to weigh filters
collected in PEs, a PSD monitoring
organization’s QAPP may contain a provision
to waive the 3 µg/m3 threshold for validity
of PEs conducted in the last quarter of
monitoring, subject to approval by the PSD
reviewing authority.
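The PEP workload and validity screen in section 3.2.4 can be summarized as follows. This sketch is illustrative only and is not part of the rule; the supersedable 3 µg/m3 floor is exposed as an argument to reflect the QAPP waiver provision.

    # Illustrative sketch (not part of the rule): the PM2.5 PEP audit
    # count and pair-validity screen of section 3.2.4.

    def required_pep_audits(n_sites):
        """5 valid audits per year for PSD PQAOs with <= 5 sites, else 8."""
        return 5 if n_sites <= 5 else 8

    def pep_pair_valid(primary_ugm3, pep_ugm3, floor_ugm3=3.0):
        """Both the primary and PEP measurements must meet QC requirements
        and be at or above the concentration floor."""
        return primary_ugm3 >= floor_ugm3 and pep_ugm3 >= floor_ugm3

    print(required_pep_audits(4))        # -> 5
    print(pep_pair_valid(8.2, 7.9))      # -> True
    print(pep_pair_valid(2.1, 2.4))      # -> False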
3.3 PM10.
3.3.1 Flow Rate Verification for PM10. A
one-point flow rate verification check must
be performed at least once every month (each
verification minimally separated by 14 days)
on each monitor used to measure PM10. The
verification is made by checking the
operational flow rate of the monitor. If the
verification is made in conjunction with a
flow rate adjustment, it must be made prior
to such flow rate adjustment. For the
standard procedure, use a flow rate transfer
standard certified in accordance with section
2.6 of this appendix to check the monitor’s
normal flow rate. Care should be taken in
selecting and using the flow rate
measurement device such that it does not
alter the normal operating flow rate of the
monitor. The percent differences between the
audit and measured flow rates are used to
assess the bias of the monitoring data as
described in section 4.2.2 of this appendix
(using flow rates in lieu of concentrations).
3.3.2 Semi-Annual Flow Rate Audit for
PM10. Every 6 months, audit the flow rate of
the PM10 particulate monitors. For short-term
monitoring operations (those less than 1
year), the flow rate audits must occur at start
up, at the midpoint, and near the completion
of the monitoring project. Where possible,
the EPA strongly encourages more frequent
auditing. The audit must be conducted by a
trained technician other than the routine site
operator. The audit is made by measuring the
monitor’s normal operating flow rate using a
flow rate transfer standard certified in
accordance with section 2.6 of this appendix.
The flow rate standard used for auditing
must not be the same flow rate standard used
for verifications or to calibrate the monitor.
However, both the calibration standard and
the audit standard may be referenced to the
same primary flow rate or volume standard.
Care must be taken in auditing the flow rate
to be certain that the flow measurement
device does not alter the normal operating
flow rate of the monitor. Report the audit
flow rate of the transfer standard and the
corresponding flow rate measured by the
monitor. The percent differences between
these flow rates are used to evaluate monitor
performance.
3.3.3 Collocated Sampling Procedures for
Manual PM10. A PSD PQAO must have at
least one collocated monitor for each PSD
monitoring network.
3.3.3.1 For each pair of collocated
monitors, designate one sampler as the
primary monitor whose concentrations will
be used to report air quality for the site, and
designate the other as the quality control
monitor.
3.3.3.2 In addition, the collocated
monitors should be deployed according to
the following protocol:
(a) The collocated quality control
monitor(s) should be deployed at sites with
the highest predicted daily PM10
concentrations in the network. If the highest
PM10 concentration site is impractical for
collocation purposes, alternative sites
approved by the PSD reviewing authority
may be selected.
(b) The two collocated monitors must be
within 4 meters of each other and at least 2
meters apart for flow rates greater than 200
liters/min or at least 1 meter apart for
samplers having flow rates less than 200
liters/min to preclude airflow interference. A
waiver allowing up to 10 meters horizontal
distance and up to 3 meters vertical distance
(inlet to inlet) between a primary and
collocated sampler may be approved by the
PSD reviewing authority for sites at a
neighborhood or larger scale of
representation. This waiver may be approved
during the QAPP review and approval
process. Sampling and analytical
methodologies must be consistently
implemented for both collocated samplers
and for all other samplers in the network.
(c) Sample the collocated quality control monitor on a 6-day schedule for sites not requiring daily monitoring or on a 3-day schedule for any site requiring daily monitoring. Report the measurements from
both primary and collocated quality control
monitors at each collocated sampling site.
The calculations for evaluating precision
between the two collocated monitors are
described in section 4.2.1 of this appendix.
(d) In determining the number of
collocated sites required for PM10, PSD
monitoring networks for Pb-PM10 should be
treated independently from networks for
particulate matter (PM), even though the
separate networks may share one or more
common samplers. However, a single quality
control monitor that meets the collocation
requirements for Pb-PM10 and PM10 may
serve as a collocated quality control monitor
for both networks. Extreme care must be
taken if using the filter from a quality control
monitor for both PM10 and Pb analysis. PM10
filter weighing should occur prior to any Pb
analysis.
3.4 Pb.
3.4.1 Flow Rate Verification for Pb. A
one-point flow rate verification check must
be performed at least once every month (each
verification minimally separated by 14 days)
on each monitor used to measure Pb. The
verification is made by checking the
operational flow rate of the monitor. If the
verification is made in conjunction with a
flow rate adjustment, it must be made prior
to such flow rate adjustment. Use a flow rate
transfer standard certified in accordance with
section 2.6 of this appendix to check the
monitor’s normal flow rate. Care should be
taken in selecting and using the flow rate
measurement device such that it does not
alter the normal operating flow rate of the
monitor. The percent differences between the
audit and measured flow rates are used to
assess the bias of the monitoring data as
described in section 4.2.2 of this appendix
(using flow rates in lieu of concentrations).
3.4.2 Semi-Annual Flow Rate Audit for
Pb. Every 6 months, audit the flow rate of the
Pb particulate monitors. For short-term
monitoring operations (those less than 1
year), the flow rate audits must occur at start
up, at the midpoint, and near the completion
of the monitoring project. Where possible,
the EPA strongly encourages more frequent
auditing. The audit must be conducted by a
trained technician other than the routine site
operator. The audit is made by measuring the
monitor’s normal operating flow rate using a
flow rate transfer standard certified in
accordance with section 2.6 of this appendix.
The flow rate standard used for auditing
must not be the same flow rate standard used
in verifications or to calibrate the monitor.
However, both the calibration standard and
the audit standard may be referenced to the
same primary flow rate or volume standard.
Great care must be taken in auditing the flow
rate to be certain that the flow measurement
device does not alter the normal operating
flow rate of the monitor. Report the audit
flow rate of the transfer standard and the
corresponding flow rate measured by the
monitor. The percent differences between
these flow rates are used to evaluate monitor
performance.
3.4.3 Collocated Sampling for Pb. A PSD
PQAO must have at least one collocated
monitor for each PSD monitoring network.
3.4.3.1 For each pair of collocated
monitors, designate one sampler as the
primary monitor whose concentrations will
be used to report air quality for the site, and
designate the other as the quality control
monitor.
3.4.3.2 In addition, the collocated
monitors should be deployed according to
the following protocol:
(a) The collocated quality control
monitor(s) should be deployed at sites with
the highest predicted daily Pb concentrations
in the network. If the highest Pb
concentration site is impractical for
collocation purposes, alternative sites
approved by the PSD reviewing authority
may be selected.
(b) The two collocated monitors must be
within 4 meters of each other and at least 2
meters apart for flow rates greater than 200
liters/min or at least 1 meter apart for
samplers having flow rates less than 200
liters/min to preclude airflow interference. A
waiver allowing up to 10 meters horizontal
distance and up to 3 meters vertical distance
(inlet to inlet) between a primary and
collocated sampler may be approved by the
PSD reviewing authority for sites at a
neighborhood or larger scale of
representation. This waiver may be approved
during the QAPP review and approval
process. Sampling and analytical
methodologies must be consistently
implemented for both collocated samplers
and all other samplers in the network.
(c) Sample the collocated quality control
monitor on a 6-day schedule if daily
monitoring is not required or 3-day schedule
for any site requiring daily monitoring.
Report the measurements from both primary
and collocated quality control monitors at
each collocated sampling site. The
calculations for evaluating precision between
the two collocated monitors are described in
section 4.2.1 of this appendix.
(d) In determining the number of
collocated sites required for Pb-PM10, PSD
monitoring networks for PM10 should be
treated independently from networks for Pb-PM10, even though the separate networks
may share one or more common samplers.
However, a single quality control monitor
that meets the collocation requirements for
Pb-PM10 and PM10 may serve as a collocated
quality control monitor for both networks.
Extreme care must be taken if using
the filter from a quality control monitor for
both PM10 and Pb analysis. The PM10 filter
weighing should occur prior to any Pb
analysis.
3.4.4 Pb Analysis Audits. Each calendar
quarter, audit the Pb reference or equivalent
method analytical procedure using filters
containing a known quantity of Pb. These
audit filters are prepared by depositing a Pb
standard on unexposed filters and allowing
them to dry thoroughly. The audit samples
must be prepared using batches of reagents
different from those used to calibrate the Pb
analytical equipment being audited. Prepare
audit samples in the following concentration
ranges:
-------------------------------------------------------------------
 Range        Equivalent ambient Pb concentration, µg/m3
-------------------------------------------------------------------
 1 .........  30–100% of Pb NAAQS.
 2 .........  200–300% of Pb NAAQS.
-------------------------------------------------------------------
(a) Audit samples must be extracted using
the same extraction procedure used for
exposed filters.
(b) Analyze three audit samples in each of
the two ranges each quarter samples are
analyzed. The audit sample analyses shall be
distributed as much as possible over the
entire calendar quarter.
(c) Report the audit concentrations (in µg Pb/filter or strip) and the corresponding measured concentrations (in µg Pb/filter or strip) using AQS unit code 077 (if reporting
to AQS). The percent differences between the
concentrations are used to calculate
analytical accuracy as described in section
4.2.5 of this appendix.
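The two audit ranges above can be expressed as equivalent ambient concentrations. The following sketch is illustrative only and is not part of the rule; it assumes the 2008 Pb NAAQS level of 0.15 µg/m3.

    # Illustrative sketch (not part of the rule): equivalent ambient Pb
    # concentrations for the two audit ranges of section 3.4.4, assuming
    # a Pb NAAQS level of 0.15 ug/m3.

    PB_NAAQS_UGM3 = 0.15
    AUDIT_RANGES = {1: (0.30, 1.00), 2: (2.00, 3.00)}  # fractions of NAAQS

    for rng, (lo_frac, hi_frac) in AUDIT_RANGES.items():
        lo, hi = lo_frac * PB_NAAQS_UGM3, hi_frac * PB_NAAQS_UGM3
        print(f"Range {rng}: {lo:.3f}-{hi:.3f} ug/m3")
    # Range 1: 0.045-0.150 ug/m3
    # Range 2: 0.300-0.450 ug/m3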
3.4.5 Pb Performance Evaluation Program
(PEP) Procedures. As stated in sections 1.1
and 2.4, PSD monitoring networks may be
subject to the NPEP, which includes the Pb
PEP. The PSD monitoring organizations shall
consult with the PSD reviewing authority or
the EPA regarding whether the
implementation of Pb-PEP is required and
the implementation options available for the
Pb-PEP. The PEP is an independent
assessment used to estimate total
measurement system bias. Each year, one PE
audit must be performed at one Pb site in
each PSD PQAO network that has less than
or equal to five sites and two audits for PSD
PQAO networks with greater than five sites.
In addition, each year, four collocated
samples from PSD PQAO networks with less
than or equal to five sites and six collocated
samples from PSD PQAO networks with
greater than five sites must be sent to an
independent laboratory for analysis. The
calculations for evaluating bias between the
primary monitor and the PE monitor for Pb
are described in section 4.2.4 of this
appendix.
4. Calculations for Data Quality Assessments
(a) Calculations of measurement
uncertainty are carried out by the PSD PQAO
according to the following procedures. The
PSD PQAOs should report the data for all
appropriate measurement quality checks as
specified in this appendix even though they
may elect to perform some or all of the
calculations in this section on their own.
(b) At low concentrations, agreement
between the measurements of collocated
samplers, expressed as relative percent
difference or percent difference, may be
relatively poor. For this reason, collocated
measurement pairs will be selected for use in
the precision and bias calculations only
when both measurements are equal to or
above the following limits:
(1) Pb: 0.002 µg/m3 (Methods approved
after 3/04/2010, with exception of manual
equivalent method EQLA–0813–803).
(2) Pb: 0.02 µg/m3 (Methods approved
before 3/04/2010, and manual equivalent
method EQLA–0813–803).
(3) PM10 (Hi-Vol): 15 µg/m3.
(4) PM10 (Lo-Vol): 3 µg/m3.
(5) PM2.5: 3 µg/m3.
(c) The PM2.5 3 µg/m3 limit for the PM2.5-PEP may be superseded by mutual agreement between the PSD PQAO and the PSD reviewing authority as specified in section 3.2.4 of this appendix and detailed in the approved QAPP.

4.1 Statistics for the Assessment of QC Checks for SO2, NO2, O3 and CO.

4.1.1 Percent Difference. Many of the measurement quality checks start with a comparison of an audit concentration or value (flow rate) to the concentration/value measured by the monitor and use percent difference as the comparison statistic as described in equation 1 of this section. For each single point check, calculate the percent difference, d_i, as follows:

    d_i = \frac{meas - audit}{audit} \times 100    (Equation 1)

where meas is the concentration indicated by the PQAO's instrument and audit is the audit concentration of the standard used in the QC check being measured.

4.1.2 Precision Estimate. The precision estimate is used to assess the one-point QC checks for SO2, NO2, O3, or CO described in section 3.1.1 of this appendix. The precision estimator is the coefficient of variation upper bound and is calculated using equation 2 of this section:

    CV = \sqrt{\frac{n \sum_{i=1}^{n} d_i^2 - \left( \sum_{i=1}^{n} d_i \right)^2}{n(n-1)}} \cdot \sqrt{\frac{n-1}{\chi^2_{0.1,n-1}}}    (Equation 2)

where n is the number of single point checks being aggregated; \chi^2_{0.1,n-1} is the 10th percentile of a chi-squared distribution with n-1 degrees of freedom.

4.1.3 Bias Estimate. The bias estimate is calculated using the one-point QC checks for SO2, NO2, O3, or CO described in section 3.1.1 of this appendix. The bias estimator is an upper bound on the mean absolute value of the percent differences as described in equation 3 of this section:

    |bias| = AB + t_{0.95,n-1} \cdot \frac{AS}{\sqrt{n}}    (Equation 3)

where n is the number of single point checks being aggregated; t_{0.95,n-1} is the 95th quantile of a t-distribution with n-1 degrees of freedom; the quantity AB is the mean of the absolute values of the d_i's and is calculated using equation 4 of this section:

    AB = \frac{1}{n} \sum_{i=1}^{n} |d_i|    (Equation 4)

and the quantity AS is the standard deviation of the absolute values of the d_i's and is calculated using equation 5 of this section:

    AS = \sqrt{\frac{n \sum_{i=1}^{n} |d_i|^2 - \left( \sum_{i=1}^{n} |d_i| \right)^2}{n(n-1)}}    (Equation 5)

4.1.3.1 Assigning a sign (positive/negative) to the bias estimate. Since the bias statistic as calculated in equation 3 of this appendix uses absolute values, it does not have a tendency (negative or positive bias) associated with it. A sign will be designated by rank ordering the percent differences of the QC check samples from a given site for a particular assessment interval.

4.1.3.2 Calculate the 25th and 75th percentiles of the percent differences for each site. The absolute bias upper bound should be flagged as positive if both percentiles are positive and negative if both percentiles are negative. The absolute bias upper bound would not be flagged if the 25th and 75th percentiles are of different signs.

4.2 Statistics for the Assessment of PM10, PM2.5, and Pb.

4.2.1 Collocated Quality Control Sampler Precision Estimate for PM10, PM2.5 and Pb. Precision is estimated via duplicate measurements from collocated samplers. It is recommended that the precision be aggregated at the PQAO level quarterly, annually, and at the 3-year level. The data pair would only be considered valid if both concentrations are greater than or equal to the minimum values specified in section 4(b) of this appendix. For each collocated data pair, calculate the relative percent difference, d_i, using equation 6 of this appendix:

    d_i = \frac{X_i - Y_i}{(X_i + Y_i)/2} \times 100    (Equation 6)

where X_i is the concentration from the primary sampler and Y_i is the concentration value from the audit sampler. The coefficient of variation upper bound is calculated using equation 7 of this appendix:

    CV = \sqrt{\frac{n \sum_{i=1}^{n} d_i^2 - \left( \sum_{i=1}^{n} d_i \right)^2}{2n(n-1)}} \cdot \sqrt{\frac{n-1}{\chi^2_{0.1,n-1}}}    (Equation 7)

where n is the number of valid data pairs being aggregated, and \chi^2_{0.1,n-1} is the 10th percentile of a chi-squared distribution with n-1 degrees of freedom. The factor of 2 in the denominator adjusts for the fact that each d_i is calculated from two values with error.

4.2.2 One-Point Flow Rate Verification Bias Estimate for PM10, PM2.5 and Pb. For each one-point flow rate verification, calculate the percent difference in volume using equation 1 of this appendix, where meas is the value indicated by the sampler's volume measurement and audit is the actual volume indicated by the auditing flow meter. The absolute volume bias upper bound is then calculated using equation 3, where n is the number of flow rate audits being aggregated; t_{0.95,n-1} is the 95th quantile of a t-distribution with n-1 degrees of freedom; the quantity AB is the mean of the absolute values of the d_i's and is calculated using equation 4 of this appendix; and the quantity AS in equation 3 of this appendix is the standard deviation of the absolute values of the d_i's and is calculated using equation 5 of this appendix.

4.2.3 Semi-Annual Flow Rate Audit Bias Estimate for PM10, PM2.5 and Pb. Use the same procedure described in section 4.2.2 for the evaluation of flow rate audits.

4.2.4 Performance Evaluation Programs Bias Estimate for Pb. The Pb bias estimate is calculated using the paired routine and the PEP monitor as described in section 3.4.5. Use the same procedures as described in section 4.1.3 of this appendix.

4.2.5 Performance Evaluation Programs Bias Estimate for PM2.5. The bias estimate is calculated using the PEP audits described in section 3.2.4 of this appendix. The bias estimator is based on the mean percent differences (Equation 1). The mean percent difference, D, is calculated by Equation 8 below:

    D = \frac{1}{n_j} \sum_{i=1}^{n_j} d_i    (Equation 8)

where n_j is the number of pairs and d_1, d_2, ... d_{n_j} are the biases for each pair to be averaged.

4.2.6 Pb Analysis Audit Bias Estimate. The bias estimate is calculated using the analysis audit data described in section 3.4.4. Use the same bias estimate procedure as described in section 4.1.3 of this appendix.

5. Reporting Requirements

5.1 Quarterly Reports. For each quarter, each PSD PQAO shall report to the PSD reviewing authority (and AQS if required by the PSD reviewing authority) the results of all valid measurement quality checks it has carried out during the quarter. The quarterly reports must be submitted consistent with the data reporting requirements specified for air quality data as set forth in 40 CFR 58.16 as they pertain to PSD monitoring.

6. References

(1) American National Standard—Specifications and Guidelines for Quality Systems for Environmental Data Collection and Environmental Technology Programs. ANSI/ASQC E4–2014. February 2014. Available from American Society for Quality Control, 611 East Wisconsin Avenue, Milwaukee, WI 53202.

(2) EPA Requirements for Quality Management Plans. EPA QA/R–2. EPA/240/B–01/002. March 2001, Reissue May 2006. Office of Environmental Information, Washington, DC 20460. https://www.epa.gov/quality/agency-wide-quality-system-documents.

(3) EPA Requirements for Quality Assurance Project Plans for Environmental Data Operations. EPA QA/R–5. EPA/240/B–01/003. March 2001, Reissue May 2006. Office of Environmental Information, Washington, DC 20460. https://www.epa.gov/quality/agency-wide-quality-system-documents.

(4) EPA Traceability Protocol for Assay and Certification of Gaseous Calibration Standards. EPA–600/R–12/531. May 2012. Available from U.S. Environmental Protection Agency, National Risk Management Research Laboratory, Research Triangle Park, NC 27711. https://cfpub.epa.gov/si/si_public_record_report.cfm?dirEntryId=245292.

(5) Guidance for the Data Quality Objectives Process. EPA QA/G–4. EPA/240/B–06/001. February 2006. Office of Environmental Information, Washington, DC 20460. https://www.epa.gov/quality/agency-wide-quality-system-documents.

(6) List of Designated Reference and Equivalent Methods. Available from U.S. Environmental Protection Agency, National Exposure Research Laboratory, Human Exposure and Atmospheric Sciences Division, MD–D205–03, Research Triangle Park, NC 27711. https://www3.epa.gov/ttn/amtic/criteria.html.

(7) Transfer Standards for the Calibration of Ambient Air Monitoring Analyzers for Ozone. EPA–454/B–13–004. U.S. Environmental Protection Agency, Research Triangle Park, NC 27711, October 2013. https://www3.epa.gov/ttn/amtic/qapollutant.html.

(8) Paur, R.J. and F.F. McElroy. Technical Assistance Document for the Calibration of Ambient Ozone Monitors. EPA–600/4–79–057. U.S. Environmental Protection Agency, Research Triangle Park, NC 27711, September 1979. https://www.epa.gov/ttn/amtic/cpreldoc.html.

(9) Quality Assurance Handbook for Air Pollution Measurement Systems, Volume 1—A Field Guide to Environmental Quality Assurance. EPA–600/R–94/038a. April 1994. Available from U.S. Environmental Protection Agency, ORD Publications Office, Center for Environmental Research Information (CERI), 26 W. Martin Luther King Drive, Cincinnati, OH 45268. https://www3.epa.gov/ttn/amtic/qalist.html.

(10) Quality Assurance Handbook for Air Pollution Measurement Systems, Volume II: Ambient Air Quality Monitoring Program Quality System Development. EPA–454/B–13–003. https://www3.epa.gov/ttn/amtic/qalist.html.

(11) National Performance Evaluation Program Standard Operating Procedures. https://www3.epa.gov/ttn/amtic/npapsop.html.
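As an informal illustration of the section 4 calculations (equations 1 through 5, and equation 7 for collocated pairs), the following sketch shows one possible encoding. It is not part of the rule; it assumes scipy is available for the chi-squared and t quantiles, and the example data are hypothetical.

    # Illustrative sketch (not part of the rule): the section 4 statistics.

    import math
    from scipy.stats import chi2, t

    def percent_difference(meas, audit):
        """Equation 1: single-check percent difference."""
        return (meas - audit) / audit * 100.0

    def cv_upper_bound(d, paired=False):
        """Equations 2 and 7: coefficient-of-variation upper bound.
        Set paired=True for collocated data (adds the factor of 2)."""
        n = len(d)
        ss = n * sum(x * x for x in d) - sum(d) ** 2
        denom = (2 if paired else 1) * n * (n - 1)
        return math.sqrt(ss / denom) * math.sqrt((n - 1) / chi2.ppf(0.1, n - 1))

    def bias_upper_bound(d):
        """Equations 3-5: upper bound on the mean absolute percent diff."""
        n = len(d)
        ab = sum(abs(x) for x in d) / n                        # Equation 4
        as_ = math.sqrt((n * sum(x * x for x in d)
                         - sum(abs(x) for x in d) ** 2) / (n * (n - 1)))  # Eq. 5
        return ab + t.ppf(0.95, n - 1) * as_ / math.sqrt(n)    # Equation 3

    # Example: one quarter of biweekly one-point QC checks
    d = [percent_difference(m, a) for m, a in
         [(0.071, 0.070), (0.068, 0.070), (0.073, 0.070),
          (0.069, 0.070), (0.072, 0.070), (0.070, 0.070)]]
    print(round(cv_upper_bound(d), 2), round(bias_upper_bound(d), 2))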
TABLE B–1—MINIMUM DATA ASSESSMENT REQUIREMENTS FOR NAAQS RELATED CRITERIA POLLUTANT PSD MONITORS

Gaseous Methods (CO, NO2, SO2, O3)

One-Point QC for SO2, NO2, O3, CO. Assessment method: Response check at concentration 0.005–0.08 ppm SO2, NO2, O3, and 0.5 and 5 ppm CO. Coverage: Each analyzer. Minimum frequency: Once per 2 weeks. Parameters reported: Audit concentration (1) and measured concentration (2). AQS assessment type: One-Point QC.

Quarterly performance evaluation for SO2, NO2, O3, CO. Assessment method: See section 3.1.2 of this appendix. Coverage: Each analyzer. Minimum frequency: Once per quarter. Parameters reported: Audit concentration (1) and measured concentration (2) for each level. AQS assessment type: Annual PE.

NPAP for SO2, NO2, O3, CO (3). Assessment method: Independent audit. Coverage: Each primary monitor. Minimum frequency: Once per year. Parameters reported: Audit concentration (1) and measured concentration (2) for each level. AQS assessment type: NPAP.

Particulate Methods

Collocated sampling PM10, PM2.5, Pb. Assessment method: Collocated samplers. Coverage: 1 per PSD network per pollutant. Minimum frequency: Every 6 days, or every 3 days if daily monitoring is required. Parameters reported: Primary sampler concentration and duplicate sampler concentration (4). AQS assessment type: No transaction; reported as raw data.

Flow rate verification PM10, PM2.5, Pb. Assessment method: Check of sampler flow rate. Coverage: Each sampler. Minimum frequency: Once every month. Parameters reported: Audit flow rate and measured flow rate indicated by the sampler. AQS assessment type: Flow Rate Verification.

Semi-annual flow rate audit PM10, PM2.5, Pb. Assessment method: Check of sampler flow rate using independent standard. Coverage: Each sampler. Minimum frequency: Once every 6 months, or at the beginning, middle and end of monitoring. Parameters reported: Audit flow rate and measured flow rate indicated by the sampler. AQS assessment type: Semi Annual Flow Rate Audit.

Pb analysis audits Pb-TSP, Pb-PM10. Assessment method: Check of analytical system with Pb audit strips/filters. Coverage: Analytical. Minimum frequency: Each quarter. Parameters reported: Measured value and audit value (µg Pb/filter) using AQS unit code 077 for parameters 14129—Pb (TSP) LC FRM/FEM and 85129—Pb (TSP) LC Non-FRM/FEM. AQS assessment type: Pb Analysis Audits.

Performance Evaluation Program PM2.5 (3). Assessment method: Collocated samplers. Coverage: (1) 5 valid audits for PQAOs with <= 5 sites; (2) 8 valid audits for PQAOs with > 5 sites; (3) all samplers in 6 years. Minimum frequency: Over all 4 quarters. Parameters reported: Primary sampler concentration and performance evaluation sampler concentration. AQS assessment type: PEP.

Performance Evaluation Program Pb (3). Assessment method: Collocated samplers. Coverage: (1) 1 valid audit and 4 collocated samples for PQAOs with <= 5 sites; (2) 2 valid audits and 6 collocated samples for PQAOs with > 5 sites. Minimum frequency: Over all 4 quarters. Parameters reported: Primary sampler concentration and performance evaluation sampler concentration. AQS assessment type: PEP.

(1) Effective concentration for open path analyzers.
(2) Corrected concentration, if applicable, for open path analyzers.
(3) NPAP, PM2.5 PEP and Pb-PEP must be implemented if data are used for NAAQS decisions; otherwise, implementation is at the discretion of the PSD reviewing authority.
(4) Both primary and collocated sampler values are reported as raw data.
■ 11. In Appendix D to part 58, revise
paragraph 3(b), remove and reserve
paragraph 4.5(b), and revise paragraph
4.5(c) to read as follows:
Appendix D to Part 58—Network
Design Criteria for Ambient Air Quality
Monitoring
* * * * *
3. * * *
(b) The NCore sites must measure, at a
minimum, PM2.5 particle mass using
continuous and integrated/filter-based
samplers, speciated PM2.5, PM10–2.5 particle
mass, O3, SO2, CO, NO/NOY, wind speed,
wind direction, relative humidity, and
ambient temperature.
(1) Although the measurement of NOy is
required in support of a number of
monitoring objectives, available commercial
instruments may indicate little difference in
their measurement of NOy compared to the
conventional measurement of NOX,
particularly in areas with relatively fresh
sources of nitrogen emissions. Therefore, in
areas with negligible expected difference
between NOy and NOX measured
concentrations, the Administrator may allow
for waivers that permit NOX monitoring to be
substituted for the required NOy monitoring
at applicable NCore sites.
(2) The EPA recognizes that, in some cases,
the physical location of the NCore site may
not be suitable for representative
meteorological measurements due to the
site’s physical surroundings. It is also
possible that nearby meteorological
measurements may be able to fulfill this data
need. In these cases, the requirement for
meteorological monitoring can be waived by
the Administrator.
* * * * *
4.5 * * *
(b) [Reserved]
(c) The EPA Regional Administrator may
require additional monitoring beyond the
minimum monitoring requirements
contained in paragraph 4.5(a) of this
appendix where the likelihood of Pb air
quality violations is significant or where the
emissions density, topography, or population
locations are complex and varied. The EPA
Regional Administrators may require
additional monitoring at locations including,
but not limited to, those near existing
additional industrial sources of Pb, recently
closed industrial sources of Pb, airports
where piston-engine aircraft emit Pb, and
other sources of re-entrained Pb dust.
* * * * *
[FR Doc. 2016–06226 Filed 3–25–16; 8:45 am]
BILLING CODE 6560–50–P
Agencies
[Federal Register Volume 81, Number 59 (Monday, March 28, 2016)]
[Rules and Regulations]
[Pages 17247-17299]
From the Federal Register Online via the Government Publishing Office [www.gpo.gov]
[FR Doc No: 2016-06226]
[[Page 17247]]
Vol. 81
Monday,
No. 59
March 28, 2016
Part II
Environmental Protection Agency
-----------------------------------------------------------------------
40 CFR Part 58
Revisions to Ambient Monitoring Quality Assurance and Other
Requirements; Final Rule
Federal Register / Vol. 81 , No. 59 / Monday, March 28, 2016 / Rules
and Regulations
[[Page 17248]]
-----------------------------------------------------------------------
ENVIRONMENTAL PROTECTION AGENCY
40 CFR Part 58
[EPA-HQ-OAR-2013-0619; FRL-9942-91-OAR]
RIN 2060-AS00
Revisions to Ambient Monitoring Quality Assurance and Other
Requirements
AGENCY: Environmental Protection Agency (EPA).
ACTION: Final rule.
-----------------------------------------------------------------------
SUMMARY: This action promulgates revisions to ambient air monitoring
requirements for criteria pollutants. These revisions include adding
and harmonizing definitions; clarifying annual monitoring network plan
public notice requirements; revising network design requirements;
system modifications and operating schedules; clarifying data
certification, data submittal and archiving procedures; reorganizing
and clarifying quality assurance requirements; and revising certain
network design criteria for non-source oriented lead monitoring. These
revisions also address other issues in the Ambient Air Quality
Surveillance Requirements, to help reduce the compliance burden of
monitoring agencies operating ambient monitoring networks.
DATES: This final rule is effective on April 27, 2016.
ADDRESSES: The EPA has established a docket for this action under
Docket ID No. EPA-HQ-OAR-2013-0619. All documents in the docket are
listed on the https://www.regulations.gov Web site. Although listed in
the index, some information is not publicly available, e.g.,
Confidential Business Information (CBI) or other information whose
disclosure is restricted by statute. Certain other material, such as
copyrighted material, is not placed on the Internet and will be
publicly available only in hard copy form. Publicly available docket
materials are available either electronically through https://www.regulations.gov or in hard copy at Docket ID No. EPA-HQ-OAR-2013-
0619, EPA Docket Center, EPA WJC West Building, Room 3334, 1301
Constitution Ave. NW., Washington, DC. The Docket Facility is open from
8:30 a.m. to 4:30 p.m. Monday through Friday, excluding legal holidays.
The docket telephone number is (202) 566-1742. The Public Reading Room
is open from 8:30 a.m. to 4:30 p.m., Monday through Friday, excluding
legal holidays. The telephone number for the Public Reading Room is
(202) 566-1744.
FOR FURTHER INFORMATION CONTACT: Mr. Lewis Weinstock, Air Quality
Assessment Division, Office of Air Quality Planning and Standards, U.S.
Environmental Protection Agency, Mail code C304-06, Research Triangle
Park, NC 27711; telephone: (919) 541-3661; fax: (919) 541-1903; email:
weinstock.lewis@epa.gov.
SUPPLEMENTARY INFORMATION:
A. Does this action apply to me?
This action applies to state, territorial, and local air quality
management programs that are responsible for ambient air monitoring
under 40 CFR part 58. Categories and entities potentially regulated by
this action include:
------------------------------------------------------------------------
Category NAICS \a\ code
------------------------------------------------------------------------
State/territorial/local/tribal government............... 924110
------------------------------------------------------------------------
\a\ North American Industry Classification System.
B. Where can I get a copy of this document?
In addition to being available in the docket, an electronic copy of
this action will also be available on the Worldwide Web (WWW) through
the Technology Transfer Network (TTN). Following signature, a copy of
this action will be posted at the TTN's Ambient Monitoring Technology
Information Center at the following address: https://www3.epa.gov/ttnamti1/monregs.html. The TTN provides information and technology
exchange in various areas of air pollution control.
C. Judicial Review
This rule is nationally applicable and, furthermore, the
Administrator finds that it is of nationwide scope and effect. Under
section 307(b)(1) of the Clean Air Act (CAA), judicial review of this
final rule is available by filing a petition for review in the U.S.
Court of Appeals for the District of Columbia Circuit by May 27, 2016.
Moreover, under section 307(b)(2) of the CAA, the requirements
established by this action may not be challenged separately in any
civil or criminal proceedings brought by the EPA to enforce these
requirements.
Table of Contents
The following topics are discussed in this preamble:
I. Background
II. Amendments to the Ambient Monitoring Requirements
A. General Information
B. Definitions
C. Annual Monitoring Network Plan and Periodic Network
Assessment
D. Network Technical Requirements
E. Operating Schedules
F. System Modification
G. Annual Air Monitoring Data Certification
H. Data Submittal and Archiving Requirements
I. Network Design Criteria (Appendix D)
III. Amendments to Quality Assurance Requirements
A. Quality Assurance Requirements for Monitors Used in
Evaluations for National Ambient Air Quality Standards--Appendix A
1. General Information
2. Quality System Requirements
3. Measurement Quality Checks for Gases
4. Measurement Quality Checks for Particulate Monitors
5. Calculations for Data Quality Assessment
B. Quality Assurance Requirements for Monitors Used in
Evaluations of Prevention of Significant Deterioration Projects--
Appendix B
1. General Information
2. Quality System Requirements
3. Measurement Quality Checks for Gases
4. Measurement Quality Checks for Particulate Monitors
5. Calculations for Data Quality Assessment
IV. Statutory and Executive Order Reviews
A. Executive Order 12866: Regulatory Planning and Review and
Executive Order 13563: Improving Regulation and Regulatory Review
B. Paperwork Reduction Act (PRA)
C. Regulatory Flexibility Act (RFA)
D. Unfunded Mandates Reform Act
E. Executive Order 13132: Federalism
F. Executive Order 13175: Consultation and Coordination With
Indian Tribal Governments
G. Executive Order 13045: Protection of Children From
Environmental Health and Safety Risks
H. Executive Order 13211: Actions Concerning Regulations That
Significantly Affect Energy Supply, Distribution, or Use
I. National Technology Transfer and Advancement Act
J. Executive Order 12898: Federal Actions To Address
Environmental Justice in Minority Populations and Low-Income
Populations
K. Congressional Review Act
I. Background
On September 11, 2014, the EPA proposed revisions to its ambient
air monitoring requirements for criteria pollutants to provide
clarifications to existing requirements and to reduce the compliance
burden of monitoring agencies operating ambient monitoring networks (79
FR 54356). The proposal focused on ambient monitoring requirements that
are found in 40 CFR part 58 and the associated appendices (A, D, and
new Appendix B), including issues such as operating schedules, the
[[Page 17249]]
development of annual monitoring network plans, data reporting and
certification requirements, and the operation of the required quality
assurance (QA) program. These revisions were proposed to maintain the
robust nature of the ambient monitoring networks while identifying
efficiencies and flexibilities that would help ensure the successful
operation of the national monitoring system.
The EPA last completed a comprehensive revision of its ambient air
monitoring regulations in a final rule published on October 17, 2006
(71 FR 61236). Minor revisions were completed in a direct final rule
published on June 12, 2007 (72 FR 32193). Periodic pollutant-specific
monitoring updates have occurred in conjunction with revisions to the
National Ambient Air Quality Standards (NAAQS). In such cases, the
monitoring revisions were typically finalized as part of the NAAQS
final rules.\1\
---------------------------------------------------------------------------
\1\ Links to the NAAQS final rules are available at: https://www3.epa.gov/ttn/naaqs/criteria.html.
---------------------------------------------------------------------------
II. Amendments to the Ambient Monitoring Requirements
A. General Information
This section describes revisions to the EPA's ambient air
monitoring requirements found in 40 CFR part 58--Ambient Air Quality
Surveillance: Subpart A--General Provisions, Subpart B--Monitoring
Network, and Appendix D--Network Design Criteria for Ambient Air
Quality Monitoring.
The EPA received public comments on its September 2014 proposal
from 31 respondents including 15 state agencies, 12 local agencies, two
multijurisdictional organizations (MJO), one consulting firm, and one
environmental organization whose comments represented two
organizations. Due to the relatively large number of individual
revisions contained in the proposal, commenters typically focused their
attention on particular items of interest while occasionally providing
a more general, overarching statement of support for the remaining
provisions. In some cases, commenters remained silent on other
provisions of the proposal and the level of support for those
provisions cannot be ascertained. In the following sections, the
specific comments will be noted as they pertain to each particular
proposed revision. This preamble will summarize the affected
regulation, proposed changes, public comments that were received, the
EPA's analysis of those comments where applicable, and EPA's final
decision concerning the revisions. A detailed description of changes to
Quality Assurance Requirements is contained in section III of the
preamble.
B. Definitions
The presence of a definitions section in the regulation ensures a
consistent interpretation of technical terminology across the various
parts of the CFR that pertain to ambient air monitoring, as well as in
supporting guidance documents, databases, and outreach materials that
support the monitoring community.
The EPA proposed to add and revise several terms to ensure
consistent interpretation within the monitoring regulations and to
harmonize usage of terms with the definition of key metadata fields
that are important components of the Air Quality System (AQS).\2\
---------------------------------------------------------------------------
\2\ The AQS is the EPA's repository of ambient air quality data.
The AQS stores data from over 10,000 monitors, 5,000 of which are
currently active. State, local and tribal agencies collect the data
and submit it to the AQS on a periodic basis. See https://www.epa.gov/aqs/aqs-obtaining-aqs-data for additional information.
---------------------------------------------------------------------------
The EPA proposed to add the term ``Certifying Agency'' to the list
of definitions. The certifying agency field was added to the AQS in
2013 as part of the development of a revised process for states and the
EPA Regions to meet the data certification requirements described in 40
CFR 58.15. The new term specifically describes any monitoring agency
that is responsible for meeting data certification requirements for a
set of monitors. In practice, a certifying agency is typically a state,
local, or tribal agency depending on the particular data reporting
arrangements that have been approved by an EPA Regional Office for a
given state. A list of certifying agencies by individual monitor is
available on the AQS-TTN Web site.\3\
---------------------------------------------------------------------------
\3\ https://aqs.epa.gov/aqsweb/codes/data/CertifyingAgenciesByMonitor.html.
---------------------------------------------------------------------------
The term ``Chemical Speciation Network,'' or CSN, was proposed for
addition to the definition list. The CSN has been functionally defined
as being composed of the Speciation Trends Network (STN) sites and the
supplemental speciation sites that are collectively operated by
monitoring agencies to obtain particulate matter up to 2.5 micrometers
(PM2.5) chemical species data.
The term ``Implementation Plan'' was proposed for addition to
provide more specificity to current definitions that reference the word
``plan'' in their description. The EPA wishes to ensure that references
to State Implementation Plans (SIPs) are not confused with references
to Annual Monitoring Network Plans that are described in 40 CFR 58.10.
The EPA proposed to revise the term ``Local Agency'' to clarify
that such organizations are responsible for implementing portions of
Annual Monitoring Network Plans. The current definition refers to
carrying out a plan that is not specifically defined, leading to
possible confusion with SIPs.
The EPA proposed to revise the term ``Meteorological Measurements''
to clarify that such measurements refer to required parameters at the
National Core Monitoring Program (NCore) and photochemical assessment
monitoring stations (PAMS).
The terms ``Monitoring Agency'' and ``Monitoring Organization''
were proposed for clarification to include tribal monitoring agencies
and to simplify the definition of monitoring organization to reference
the definition of monitoring agency.
The term ``NCore'' was proposed for revision to remove nitrogen
dioxide (NO2) and lead in PM10 (Pb-
PM10) as required measurements and to expand the definition
of basic meteorology to specifically reference the required
measurements: Wind speed, wind direction, temperature, and relative
humidity. The EPA clarifies that NO2 was never a required
NCore measurement and that the current definition was erroneous on this
issue. Additionally, the requirement to measure Pb-PM10 at
NCore sites in areas with populations over 500,000 was proposed for
elimination due to the extremely low concentrations being measured at
these sites.
The term ``Near-road NO2 Monitor'' was proposed for
revision to ``Near-road Monitor.'' This revision is being made to
broaden the definition of near-road monitors to include all monitors
operating under the specific requirements described in 40 CFR part 58,
appendix D (sections 4.2.1, 4.3.2, 4.7.1(b)(2)) and appendix E (section
6.4(a), Table E-4) for near-road measurement of PM2.5 and
carbon monoxide (CO) in addition to NO2.
The term ``Network Plan'' was proposed for addition to clarify that
any such references in 40 CFR part 58 refer to the annual monitoring
network plan required in 40 CFR 58.10.
The term ``Plan'' was proposed for deletion as its usage has been
replaced with more specific references to either the annual monitoring
network plan required in 40 CFR 58.10 or the SIP approved or
promulgated pursuant to CAA section 110.
The term ``Population-oriented Monitoring (or sites)'' was proposed
for
deletion. This term, along with the related concept of population-
oriented monitoring, was deleted from 40 CFR part 58 in the 2013
PM2.5 NAAQS final rule (78 FR 3235-3236). This was to ensure
consistency with the longstanding definition of ambient air applied to
the other NAAQS pollutants.
The term ``Primary Monitor'' was proposed for addition to the
definition list. The use of this term has become important in AQS to
better define the processes used to calculate NAAQS design values when
more than one monitor is being operated by a monitoring agency for a
given pollutant at the same site. This term identifies the primary
monitor used as the default data source in AQS for creating a combined
site record for pollutants that allow site combinations per 40 CFR part
50.
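    To make the concept concrete, the following is a minimal sketch, an
illustration only and not the AQS implementation: a combined site record
is built by using the primary monitor's daily value where available and
filling gaps with a collocated monitor's value, consistent with the
substitution approach of appendix N to part 50. The function name and
inputs are hypothetical.

    # Minimal illustration (not the AQS implementation) of building a
    # combined site record: use the primary monitor's daily value when
    # available and fill gaps from a collocated monitor at the same site.
    def combined_site_record(primary, collocated):
        """primary/collocated: dicts mapping date -> 24-hour PM2.5 value."""
        combined = {}
        for day in sorted(set(primary) | set(collocated)):
            if day in primary:
                combined[day] = primary[day]      # default data source
            else:
                combined[day] = collocated[day]   # gap-filled value
        return combined

    # Example: the primary monitor is missing 2016-01-04, so the
    # collocated monitor's value is substituted for that day.
    record = combined_site_record(
        {"2016-01-01": 9.1, "2016-01-07": 11.3},
        {"2016-01-01": 9.4, "2016-01-04": 10.2},
    )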
The term ``Primary Quality Assurance Organization'' was proposed
for revision to include the use of the acronym, ``PQAO,'' and to note
that a PQAO could include a group of monitoring organizations.
The terms ``PSD Monitoring Organization'' and ``PSD Monitoring
Network'' were proposed for addition to support the proposed new
appendix B that will pertain specifically to QA requirements for
prevention of significant deterioration (PSD) networks.
The term ``PSD Reviewing Authority'' was proposed for addition to
support the addition of appendix B to the part 58 appendices and to
clarify the identification of the lead authority in determining the
applicability of QA requirements for PSD monitoring projects.
The term ``Reporting Organization'' was proposed for revision to
clarify that the term refers specifically to the reporting of data as
defined in AQS. The AQS does allow the distinct designation of agency
roles that include analyzing, certifying, collecting, reporting, and
PQAO.
The term ``SLAMS'' (state and local air monitoring stations) was
proposed for clarification to indicate that the designation of a
monitor as SLAMS generally identifies a monitor that is required under
appendix D of part 58 and is needed to meet monitoring objectives. The
SLAMS monitors make up networks that include NCore, PAMS, CSN, and
other state or local agency sites that have been so designated in
annual monitoring network plans.
The terms ``State Agency'' and ``STN'' were proposed for minor
wording changes for purposes of clarity only.
The term ``State Speciation Site'' was proposed for deletion given
the proposed addition of ``Supplemental Speciation Station'' to better
describe the distinct elements of the CSN, which includes the STN
stations that are required under section 4.7.4 of appendix D of part
58, and supplemental speciation stations that are operated for specific
monitoring agency needs and are not considered to be required monitors
under appendix D.
We received relatively few comments on the proposed revisions to
definitions. One commenter noted that the clarification of
Meteorological Measurements should specify that those parameters are
also required at SLAMS sites, which include both the NCore and PAMS
sites. They noted the use of the undefined phrase ``combined data
record'' in the Primary Monitor definition and recommended that a
definition be provided. They also recommended that the EPA include an
explanation of the term ``Special Purpose Monitor'' (SPM) in the
definitions section of the preamble and not rely solely on the amended
regulatory text. A commenter from a state air program noted that the
proposed definition for ``Monitoring Organization'' includes the phrase
``or other monitoring organization.'' They believe the phrase is
ambiguous and could extend the applicability of requirements such as
technical systems audits to universities, contractors, and other
government organizations. Accordingly, the commenter requested that the
phrase be either defined or removed from the final definition.
The EPA has made several revisions to definitions in response to
these comments. The Meteorological Measurements definition has been
amended to include a clarifying reference that SLAMS stations include
sites that comprise the NCore and PAMS networks. Additionally, the
words ``or other monitoring organization'' have been removed from the
definition for Monitoring Organization to remove any ambiguity that
monitoring regulations apply to entities other than state, local, or
tribal agencies.\4\ The EPA does not believe that the definition for
Primary Monitor needs to be amended as the term ``combined data
record'' is already defined as part of appendix N to part 50
(Interpretation of the National Ambient Air Quality Standards for
PM2.5). The EPA acknowledges that the preamble to the
proposal inadvertently failed to discuss a clarification to the Special
Purpose Monitor definition included in the proposal. The proposed
revision to this definition was the addition of two sentences that
merely restated existing requirements already established in 40 CFR
58.10 with regard to annual monitoring network plans and network
assessments. The EPA believes that the proposed definition is a useful
but minor revision that should be retained as proposed. No other
comments were received on the proposed revisions to definitions and
they will be finalized as proposed.
---------------------------------------------------------------------------
\4\ The EPA does note that other mechanisms can be used to
extend the applicability of monitoring requirements to sites
operated by other entities, e.g., industrial monitors. For example,
states can develop Memoranda of Understanding (MOUs) with the
operators of such sites to ensure that the monitors are operated
according to part 58 requirements and that the resulting data are of
known quality.
---------------------------------------------------------------------------
C. Annual Monitoring Network Plan and Periodic Network Assessment
The annual monitoring network plan process provides an important
communications and planning pathway between monitoring agencies, EPA
Regional Offices, and the general public. The network assessment
process, required every 5 years, provides an opportunity to conduct
more in-depth planning and analyses of current and future ambient
monitoring needs and objectives to help ensure that monitoring programs
respond to changing requirements, demographics, air quality trends, and
updated technology.
The EPA proposed several changes to the annual monitoring network
plan process and related requirements. We received significant comment
on these changes. Therefore, each individual proposed revision is
discussed below along with relevant comments.
Since the revision of the annual monitoring network plan process in
2006, the EPA has received feedback indicating confusion on several
points: the difference between making a plan available for public
inspection and obtaining public comment; the responsibility of
monitoring agencies to respond to public comments in their submitted
annual monitoring network plans; and the responsibility of the EPA
Regional Offices to obtain public comment, which depends on a
monitoring agency's prior action and on whether the annual monitoring
network plan was modified based on discussions with the monitoring
agency following plan submission. Accordingly, we
proposed that the public inspection aspect of the requirement contained
in 40 CFR 58.10(a)(1) be revised to clearly indicate that obtaining
public comment is a required part of the process, and that plans that
are submitted to the EPA
Regional Offices should address such comments that were received during
the public notice period. A related part of the annual monitoring
network plan process is described in 40 CFR 58.10(a)(2) with the
distinction that this section pertains specifically to plans that
propose SLAMS modifications and, thereby, also require specific
approval from the EPA Regional Administrator.
Consistent with the proposed change to the comment process
described above, the EPA proposed changes to the text in 40 CFR
58.10(a)(1) to reflect that monitoring agencies would be required to
obtain public comments prior to submission, and that the role of the
EPA Regional Office would be to review the submitted plan together
with the public comments and any modifications to the plan based on
those comments.
A number of state monitoring agencies and two MJOs commented that
the proposed requirement to solicit and address comments during the
public inspection period would impose additional burden, inflexibility,
and delays on the process by requiring that the comments be addressed
before the original plan is submitted to the EPA. Some of these
commenters estimated that it would take an additional two months
compared with the current process to handle comments in this manner,
and that they could only support the proposed change if the deadline
for submittal was revised as well. They requested that the EPA waive
this proposed requirement or make the procedure more flexible by
allowing comments to be submitted later, perhaps as an amendment before
the plan is approved, or even with the next year's plan. Four state
programs supported the proposed revision noting the importance of
soliciting public input on the content of the plan and the perspective
that states should take the lead in responding to comments versus the
EPA. One of these states noted that they attempt to schedule a public
comment period for every SLAMS modification. They also noted that
flexibility would be needed in emergency situations that demand
immediate changes to their network. Another of these states requested
that the term ``address'' be clarified and noted that the timeliest way
to handle comments and responses would be to include this information
in an appendix to the plan when submitted to the EPA. A different
perspective was offered by comments received from a joint environmental
group submission. They commented that the proposed changes did not go
far enough to ensure a meaningful public comment opportunity. They
noted that annual monitoring network plans are integral parts of SIPs
and that the CAA requires that SIP submittals and revisions be more
formally publicly noticed. They suggested that the EPA require states
to prominently advertise monitoring plans, allow at least 30 days for
public comment, then either hold a public hearing or provide such an
opportunity if requested. They also added that a separate notice and
comment opportunity must be required on the EPA's proposed action on a
submitted plan or a related amendment to an approved plan, and that all
of the suggested public comment requirements must also be applicable to
the 5-year network assessment.
The EPA recognizes the diversity of comments on this aspect of the
proposal. Nearly all commenters recognized that fostering public
involvement in the annual monitoring network plan is important and
desirable. Those commenters supporting the proposal noted that their
existing procedures already address the proposed requirements and that
they found it desirable to be able to respond directly to stakeholders.
Adverse comments concerned the implied additional burden of
obtaining comment versus the current requirement of posting for public
inspection, the potential loss of flexibility to modify the plan
following submission to the EPA, and the perceived impracticality of
adequately responding to public comments in a timely manner.
The EPA does not agree with the comments received from the joint
environmental group submission on this aspect of the proposal. First,
the final rule text requires annual monitoring network plans to be made
available for at least 30 days of public inspection and comment and
further requires monitoring agencies to address, as appropriate, any
significant issues raised in public comment. Requiring at least 30 days
of public participation and consideration of significant comments is
consistent with the CAA and the Administrative Procedure Act (APA) and,
at the same time, affords monitoring agencies with the flexibility and
discretion to provide for additional time and public participation
procedures.
Second, the EPA disagrees that state action on an annual monitoring
network plan triggers the same public participation requirements
applicable to SIP adoption and revision. Section 110(a)(2)(B) of the
CAA provides that each SIP shall ``provide for establishment and
operation of appropriate devices, methods, systems, and procedures
necessary to (i) monitor, compile, and analyze data on ambient air
quality, and (ii) upon request, make such data available to the
Administrator.'' To meet these requirements, our September 2013
Guidance on Infrastructure State Implementation Plan (SIP) Elements
under Clean Air Act Sections 110(a)(1) and 110(a)(2) states that ``the
best practice for an air agency submitting an infrastructure SIP would
be to submit, for inclusion into the SIP . . . , the statutory or
regulatory provisions that provide the air agency or official with the
authority and responsibility to perform'' certain actions required
under 40 CFR part 58. (See 2013 iSIP Guidance, p. 22.) In other words,
CAA section 110(a)(2)(B) simply requires that monitoring agencies have
the legal authority to implement 40 CFR part 58; it does not treat
annual monitoring network plans required under 40 CFR part 58 as
``integral parts'' of a SIP subject to public participation whenever
such network plans are established or modified.
Third, the EPA disagrees that EPA action on an annual monitoring
network plan requires a separate notice and comment opportunity. The
EPA reviews and acts on network plans through informal adjudications in
which the EPA determines whether such network plans satisfy the
requirements in 40 CFR 58.10. Such adjudications are not rulemakings
subject to the public participation requirements of the APA (see 5
U.S.C. 553), although they are final agency actions subject to judicial
review (see 5 U.S.C. 706). The EPA's decision to treat network plan
decisions as case-by-case adjudications rather than ``rules'' reflects
the fact that the EPA simply compares the information supplied in the
network plan with the requirements of 40 CFR part 58 and notifies the
relevant monitoring agencies that design and operate the corresponding
networks whether their particular networks satisfy Part 58 or need
further revision.
Finally, the EPA disagrees that public notice and comment is
required ``at both the state and federal levels on the 5-year
monitoring network assessments required at 40 CFR 58.10(d).'' To the
extent that the EPA takes ``substantive action'' on such assessments,
such actions are not rulemakings subject to public participation
requirements under the CAA or the APA.
Given the relatively broad support for the concept of soliciting
public comment as part of the annual monitoring network plan posting
process, as well as the concern for the
implied logistical challenge of both obtaining comment and developing
(and getting management approval for) adequate responses, while still
meeting the required submission deadline of July 1, the EPA believes
that some modification of the proposed language is appropriate. As
noted by several commenters, the implied burden to ``reference and
address any such received comments'' as described in the proposed
regulatory language may be too difficult to achieve. As suggested by
one commenter, it may be more practical for monitoring agencies to
review and consider the comments, and only to modify the plan when
``appropriate and feasible.'' By modifying the proposed language to
provide more flexibility and discretion in addressing comments based on
each agency's technical evaluation of received comments and the
associated management review chain, the EPA can finalize the generally
supported goal of increasing public involvement in the process while
lessening the burden on agencies that have not previously included the
solicitation of public comment in their process. Accordingly, the EPA
is revising the regulatory language in the last sentence of 40 CFR
58.10(a)(1) from ``The annual monitoring network plan must be made
available for public inspection and comment for at least 30 days prior
to submission to the EPA and the submitted plan shall reference and
address any received comments'' to ``The annual monitoring network plan
must be made available for public inspection and comment for at least
30 days prior to submission to the EPA and the submitted plan shall
include and address, as appropriate, any received comments.'' The EPA
believes that this revised language, including the clarification that
the plan ``address, as appropriate, any received comments,'' provides
sufficient flexibility to monitoring agencies and ensures adequate
public participation practices. Under this approach, all agencies will
review public comments and make changes to the plan as appropriate in
light of public comments, taking into account the requirement for
timely submission of network plans. The EPA encourages states to
provide responses to significant comments but understands that
developing formal responses may potentially delay submission of the
plan beyond the July 1 deadline, in light of internal timelines and
management review procedures. To avoid such delays, it would also be
acceptable for states to submit the proposed plan with comments and any
resulting changes, and where the EPA finds it necessary to discuss how
the state considered and addressed specific comments, the EPA will
follow up as part of its process for reviewing the plan for approval.
Another aspect of the annual monitoring network plan requirements
is the listing of required elements and site information in 40 CFR
58.10. The EPA proposed to add two requirements to this list as
described below. First, the EPA proposed to require that a PAMS network
description be specifically included in the 40 CFR 58.10(a)
requirements for any monitoring agencies affected by PAMS requirements.
The requirements for such a plan are already referenced in appendix D,
sections 5.2 and 5.4 of this part. Second, the EPA proposed that
``long-term'' SPMs, i.e., those SPMs operating for longer than 24
months whose data could be used to calculate design values for NAAQS
pollutants in cases where EPA-approved methods are being employed,
should be identified in the 40 CFR 58.10(b) requirements along with a
discussion of the rationale for keeping the monitor(s) as SPMs or
potentially reclassifying them to SLAMS. The EPA did not propose that
such monitors must become SLAMS, only that the ongoing operation of
such monitors and the rationale for retaining them as SPMs be
explicitly discussed to avoid confusion, particularly because the
monitoring data could be used to calculate design values regardless of
whether the monitors are designated SPMs or SLAMS. Thus, there is
potential for unintended complexities in the designations process if
design value SPMs were discontinued without adequate discussion.
Nine commenters addressed the above issues. Only one commenter
specifically addressed the addition of the PAMS network description and
that comment was ``Support this action.'' The remainder of comments
addressed the issue of requiring an annual monitoring network plan
discussion and rationale for whether longer-term SPMs should be
retained as SPMs or reclassified to SLAMS. Three of these commenters
were supportive of the proposed revision with several noting that they
expected that monitoring agencies would still be granted discretion on
the issues by the EPA Regional Offices. Two commenters suggested
revised language to limit the proposed SPM discussion to only criteria
pollutant monitors and also only those monitors utilizing federal
reference methods (FRM) or federal equivalent methods (FEM). One
commenter only supported the revision if the EPA could provide grant
funding. Three commenters did not support the proposed revision,
variously because they interpreted the provision as meaning that the
EPA was proposing that such longer-term SPMs be automatically converted
to SLAMS in the absence of a justification; because of the belief that
such a rationale would create a burden for monitoring agencies and that
such a discussion is misplaced in the annual monitoring network plan;
or because of the belief that ongoing discussions between the states
and EPA Regional Offices are already sufficient to handle such issues,
and that the additional requirement is an unnecessary limit on
monitoring network flexibility.
After consideration of these comments, the addition of the PAMS
network description to the list of requirements in 40 CFR 58.10(a) will
be finalized as proposed due to general support and lack of comment on
this revision.
The EPA will not finalize the proposed changes to 40 CFR 58.10(b).
The EPA believes that some misunderstanding still exists as to the
intent of the proposed addition of a required discussion and rationale
concerning longer-term SPM monitors. Although preamble language
explicitly stated that the EPA was not intending to propose an
automatic conversion process for such SPMs, several commenters
interpreted the proposal in that way. One commenter noted, ``Also the
mechanism is unclear for how SPMs not granted approval will convert to
a SLAMS monitor.'' It was not the EPA's intention to imply any
limitations on monitoring agency discretion to employ SPMs as part of
their network design strategy, only to raise awareness among all
stakeholders of such situations when they occur, particularly with
longer-term SPMs that may have design values approaching or exceeding
the NAAQS. Comments regarding the need to limit the proposed
requirement to FRMs or FEMs also indicate a misunderstanding of the
proposed language as this limitation was already included in the
regulatory language in the proposal. Given these apparent areas of
confusion and the concern about additional burden that the inclusion of
such a rationale would place on plan submitters, the EPA will not
finalize this proposed change to 58.10(b). Nevertheless, we continue to
believe that an open and robust discussion about such longer-term SPMs
is an important part of interactions between monitoring agencies and
EPA Regional Offices, particularly in the context of monitors utilizing
EPA-approved methods that are measuring concentrations near the level
of
applicable NAAQS. While continuing to support the use of SPMs to
provide flexible options for investigating air quality problems, we
encourage reference to these situations in annual monitoring network
plans and thoughtful consideration of the pros and cons of converting
such monitors to SLAMS, particularly to avoid potential disruption of
implementation actions due to the discontinuance of important SPMs.
The EPA proposed a minor edit to the annual monitoring network plan
requirements to revise terminology referring to PM2.5
speciation monitoring. No comments were received on this issue and the
change will be finalized as proposed.
The EPA received comments on a general rewording of regulatory
language that was included as part of the revisions to 40 CFR 58.10(a).
Specifically, we revised the sentence ``The plan shall include a
statement of purposes for each monitor and evidence that siting and
operation of each monitor meets the requirements of appendices A, C, D,
and E of this part, where applicable'' to ``The plan shall include a
purpose statement for each monitor along with a statement of whether
the operation of each monitor meets the requirements of appendices A,
B, C, D, and E of this part, where applicable.'' Additionally, the
proposed language added the following sentence: ``The Regional
Administrator may require the submission of additional information as
needed to evaluate compliance with applicable requirements of Part 58
and its appendices.''
One state monitoring agency noted that there was overlap between
the monitoring objective and the purpose of a monitor as referenced in
the regulatory language. They suggested that the terms be defined in
the definitions section of the rule. They also suggested removing the
purpose statement entirely as it appears duplicative with other annual
monitoring plan requirements that are already present. Two MJOs
referenced the statement concerning the Regional Administrator's
discretion to require the submission of additional information to
evaluate the compliance of the submitted plan with part 58 and
appendices. They commented that the proposed language was ``vague and
open-ended'' and that the presence of this requirement would lead to
significant differences among the EPA Regions concerning the level of
detail needed to evaluate plan submittals. It was suggested that the
EPA consider amending the language to more clearly define the
circumstances when additional information would be needed.
The EPA believes that some revision of the referenced language is
appropriate to achieve the goal of providing monitoring agencies with a
more explicit description of the documentation that is required in the
plans as well as providing the EPA Regional Offices with a clear basis
for review and approval. We agree with the comment that the requirement
for a ``purpose statement'' is vaguely worded and duplicative of
existing requirements (in 40 CFR 58.10(b)) that pertain to factors such
as monitoring objective and spatial scale. We also note the comments
concerning the open-ended nature of the statement that the Regional
Administrator has discretion to require the submission of additional
information to evaluate the compliance of the submitted plan with Part
58 and appendices. The EPA observes that this type of statement is not
unusual in the context of various monitoring requirements, particularly
in the Network Design Criteria described in appendix D. We do not
anticipate frequent requests for additional information in the context
of the Annual Monitoring Network Plan requirements, but we would
anticipate that additional information would be needed by Regional
Offices when the reasons supporting compliance with the applicable
requirements of part 58 and its appendices have changed from the
previous year's plan, or when a monitor has been added since the
previous year's plan was approved.
Accordingly, the EPA is revising the proposed language by deleting
the words ``a purpose statement for each monitor along with'' from the
second sentence of 40 CFR 58.10(a)(1) and also revising the sentence
``The Regional Administrator may require the submission of additional
information as needed to evaluate compliance with applicable
requirements of Part 58 and its appendices'' to ``The Regional
Administrator may require additional information in support of this
statement,'' which is a somewhat narrower framing of the need for
Regional Administrator discretion in the context of assuring whether
the operation of each monitor meets the requirements of appendices A,
B, C, D, and E of this part, as described in the submitted Annual
Monitoring Network Plan.
Finally, two public comments were received on preamble language in
the proposal pertaining to the EPA's discussion about the ability of
Regional Offices to handle partial approvals of annual monitoring
network plans in cases where one or more of the required elements is
problematic. A joint environmental organization comment noted that the
EPA's discussion did not indicate a timeframe for the correction of
deficiencies and, hence, the described partial approval process was
unlawful and arbitrary. They further suggested that an appropriate time
limit for the correction of deficiencies would be 90 days. An MJO
comment noted that a partial approval process is not an appropriate
strategy for the longer term, although the process as it exists now has
been found to be useful in some cases. This commenter supported
language in the preamble discussion relating to an approval process
while noting technical deficiencies, as long as such deficiencies were
related to required elements of the plan.
The EPA notes that the preamble discussion (79 FR 54360) was not
tied to any proposed revisions to requirements or regulatory language,
but was intended as an articulation of what we believe to be currently
available flexibility in the handling of annual monitoring network plan
submissions. The EPA agrees that deficiencies should be corrected and
intends to work with monitoring agencies to address deficiencies in a
timely manner. However, the EPA does not believe that the lack of a
regulatory schedule for correcting deficiencies is unlawful or that it
would be appropriate to establish one without having solicited comment
on the topic in the proposal. Accordingly, no additional action was
taken within the context of this rulemaking.
D. Network Technical Requirements
The Network Technical Requirements section provides a place for
cross-referencing and clarifying the applicability of the various
requirements that are described in the appendices to part 58.
The EPA proposed to revise the language in 40 CFR 58.11(a)(3) to
reference the proposed new appendix B, which contains the QA
requirements that would pertain to PSD monitoring sites. One
supportive comment was received on this issue and the revision will be
finalized as proposed.
E. Operating Schedules
The operating schedule requirements described in 40 CFR 58.12
pertain to the minimum required frequency of sampling for continuous
analyzers (for example, hourly averages) and manual methods for
particulate matter (PM) and Pb sampling (typically 24-hour averages for
manual methods).
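    As an illustration of these operating schedules, the short sketch
below generates the 24-hour sample dates for a 1-in-N day manual
schedule. This is a minimal sketch: the anchor date is an arbitrary
assumption for the example, and actual sample days follow the EPA
national sampling calendar published for monitoring agencies.

    # Illustrative sketch of a 1-in-N day manual sampling schedule.
    # The anchor date below is arbitrary; actual sample days follow the
    # EPA national sampling calendar.
    from datetime import date, timedelta

    def sample_days(start, end, every_n_days=3, anchor=date(2016, 1, 1)):
        """Yield scheduled 24-hour sample dates between start and end."""
        # Align the first sample day on or after `start` to the anchor cycle.
        offset = (start - anchor).days % every_n_days
        day = start if offset == 0 else start + timedelta(days=every_n_days - offset)
        while day <= end:
            yield day
            day += timedelta(days=every_n_days)

    # Example: 1-in-3 day schedule for January 2016 (11 sample days).
    january = list(sample_days(date(2016, 1, 1), date(2016, 1, 31)))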
The EPA proposed to revise these requirements by (1) adding
flexibility in the minimum required sampling for PM2.5 mass
sampling and for PM2.5 speciation sampling; (2) modifying
language pertaining to continuous mass monitoring to reflect revisions
in regulatory language that were finalized in the 2013 PM NAAQS final
rule; and (3) clarifying the applicability of certain criteria that can
lead to an increase in the required sampling frequency, for example, to
a daily schedule. Ten commenters responded to these proposed changes.
Most of the comments were generally supportive of these changes as they
provide additional flexibility and potential burden reductions for
monitoring agencies. Some comments noted concern with specific changes
to the period of time that a PM2.5 sampler would have to
utilize an increased sampling frequency if triggered by design values.
Additional details on these generally supportive comments are discussed
below in the relevant sections. A joint environmental organization
comment opposed all of the sampling frequency changes, noting concern
about the increased risk of not detecting daily variations in
PM2.5 when samplers follow reduced sampling schedules, the
lack of a cost analysis documenting the burden of monitoring, and the
fact that the EPA was not requiring additional monitoring to
compensate for the reduced sampling frequency.
With regard to the minimum required sampling frequency for manual
PM2.5 samplers, current requirements state that at least a
1-in-3 day frequency is mandated for required SLAMS monitors without a
collocated continuous monitor. The EPA believes that some regulatory
flexibility is appropriate in situations where a particular monitor is
highly unlikely to record a violation of the PM2.5 NAAQS,
such as in areas with very low PM2.5 concentrations relative
to the NAAQS and/or in urban areas with many more monitors than are
required by appendix D (when a subset of those monitors is reading
lower than other monitors in the area). The EPA specifically proposed
that the required sampling frequency could be reduced to 1-in-6 day
sampling or another alternate schedule through a case-by-case approval
by the EPA Regional Administrator. Such approvals could be based on
factors that are already described in 40 CFR 58.12(d)(1)(ii) such as
historical PM2.5 data assessments, the attainment status of
the area, the location of design value sites, and the presence of
continuous PM2.5 monitors at nearby locations. The EPA noted
that the request for such reductions in sampling frequency would occur
as part of the annual monitoring network plan process as operating
schedules are a required part of the plans as stated in 40 CFR
58.10(b)(4). For sites with a collocated continuous monitor, the EPA
also proposed that the current regulatory flexibility to reduce to 1-
in-6 day sampling or a seasonal sampling schedule is appropriate based
on factors described above and, in certain cases, may also be
applicable to lower-reading SLAMS sites without a collocated continuous
monitor, for example, to reduce frequency from 1-in-6 day sampling to a
seasonal schedule. Such flexibility was proposed through changes in the
regulatory language in 40 CFR 58.12(d)(1)(i) and (ii).
With the one exception noted earlier, supportive comments were
received on this specific proposed revision. One MJO commented that
flexibility is needed in specifying operating schedules, and that it is
preferable to retain lower reading sites with a reduced sampling
frequency rather than close them completely. Similar comments included
``Support this action'' and the observation that the proposed changes
should reduce monitoring burden. Concerning the joint environmental
organization comment noting the potential increased risk of not
characterizing the risk from PM2.5 levels that might be
missed when sampling frequency is reduced, the EPA notes that these
case-by-case situations would be reviewed by EPA Regional Offices for
approval, and that the pertinent approval criteria would include an
assessment of prevailing PM2.5 concentrations and the
availability of other manual or continuous monitors that would provide
characterization in the general area. As stated in the proposal, we
expect these sampling reduction requests to be made for lower reading
sites so the impact on area design values would be negligible. We also
note that the requests would be made through the annual monitoring
network plan process and, therefore, would be open for public
inspection and comment prior to potential approval by the EPA. On an
overall basis, the EPA believes that it is important to have
operational flexibilities with regard to sampling frequency to permit
monitoring agencies to shift resources (e.g., higher sampling frequency
samplers) to high priority areas; this flexibility supports the ability
of the monitoring network to react to changing air quality trends and
problems in a manner most protective of public health. Concerning the
observation that the EPA has not provided an analysis of relevant
costs, we note the public availability of such financial information in
information collection request documents that are regularly updated and
submitted for public comment according to Office of Management and
Budget regulation.\5\
---------------------------------------------------------------------------
\5\ See https://www.regulations.gov/#!documentDetail;D=EPA-HQ-OAR-2002-0091-0017.
---------------------------------------------------------------------------
In consideration of the comments above, the EPA is finalizing the
revisions to add flexibility to sampling frequency requirements for
PM2.5 mass samplers as proposed.
The EPA also proposed added flexibility in sampling frequency for
PM2.5 CSN sites, specifically the STN sites that are
currently operated at approximately 53 locations.\6\ The STN stations
are currently required to sample on at least a 1-in-3 day frequency
with no opportunity for flexibility. Justifications for the proposed
additional flexibility include the conservation of resources for
reinvestment in other needs within the CSN, rising analytical costs,
and the availability of new technologies that provide continuous
measurement of PM2.5 species. Accordingly, the EPA proposed
that a reduction in sampling frequency from 1-in-3 day be permissible
for manual PM2.5 samplers at STN stations, for example, to a
1-in-6 day frequency. The approval for such changes at STN stations, on
a case-by-case basis, would be made by the EPA Administrator as the
authority for changes to STN has been retained at the Administrator
level per appendix D of this part, section 4.7.4.\7\ Factors that would
be considered as part of the decision would include an area's design
value, the role of the particular site in national health studies, the
correlation of the site's species data with nearby sites, and presence
of other leveraged measurements.
---------------------------------------------------------------------------
\6\ https://www.epa.gov/ttn/amtic/specgen.html.
\7\ The approval process has been delegated, in practice, to the
Director of the Air Quality Assessment Division within the Office of
Air Quality Planning and Standards.
---------------------------------------------------------------------------
Few commenters specifically addressed this proposed change as the
aforementioned comments pertaining to changes in sampling frequency for
PM2.5 mass samplers were likely deemed pertinent to the CSN.
Where this proposed change was mentioned specifically, monitoring
agency comments noted support as a means of increasing flexibility and
potentially protecting sites by reducing sampling frequency versus
eliminating sites completely. The joint environmental organization
comment stated that a
reasoned justification for the change was not provided, and noted that
speciation data are critical in development of SIP control strategies,
health studies, modeling exercises, and investigation of air pollution
episodes.
The EPA notes the supportive comments from monitoring agencies and
agrees that increasing flexibility with respect to sampling frequency
as an alternative to site elimination was a motivation for the
revision. With respect to the environmental organization comment noting
concern about the additional flexibility and the potential for reduced
sampling frequency, the EPA agrees with the observation that
PM2.5 speciation data are critical to supporting many
different monitoring objectives. Because we believe that
PM2.5 speciation data are critical for the objectives noted
above, we recently completed an in-depth assessment of the CSN with the
goal of protecting, to the greatest extent possible, the long-term
operation of the network.\8\ In the face of rising analytical costs and
unchanging budgets, the EPA considered factors such as site reductions,
changes in sampling frequency, and alterations in operational
procedures to support long-term viability of the CSN. The results of
the assessment were implemented in late 2014 and early 2015, and the
EPA believes the revised CSN continues to provide strong support for
key monitoring objectives noted by the commenter and would do so even
if sampling frequency were selectively reduced at a small number of STN
sites based on substantive and suitable criteria. The EPA notes that a
proposal to reduce sampling frequency would need to be accompanied by a
technical rationale justifying the request and evaluating the impact on
data users and the ability of the site to meet the aforementioned key
objectives, for example, by employing new technology such as continuous
monitoring of PM2.5 species, in lieu of the reduced number
of filter samples.
---------------------------------------------------------------------------
\8\ https://www.sdas.battelle.org/CSNAssessment/html/Default.html.
---------------------------------------------------------------------------
In consideration of the comments and detailed network assessment
described above, the EPA is finalizing the revisions to add flexibility
to sampling frequency requirements for the PM2.5 STN sites
as proposed.
The EPA proposed editorial revisions to 40 CFR 58.12(d)(1)(ii) to
harmonize the language regarding the use of continuous FEM or approved
regional methods (ARM) monitors to support sampling frequency
flexibility for manual PM2.5 samplers with the current
language in 40 CFR 58.12(d)(1)(iii) that was revised as part of 2013 PM
NAAQS final rule. Specifically, the phrase ``unless it is identified in
the monitoring agency's annual monitoring network plan as not
appropriate for comparison to the NAAQS and the EPA Regional
Administrator has approved that the data from that monitor may be
excluded from comparison to the NAAQS'' was proposed for appending to
the current regulatory language to reflect the new process that was
finalized in the 2013 PM NAAQS final rule that allows monitoring
agencies to request that continuous PM2.5 FEM data be
excluded from NAAQS comparison based on technical criteria described in
40 CFR 58.11(e). We also proposed the addition of the phrase ``and the
EPA Regional Administrator has approved that the data from that monitor
may be excluded from comparison to the NAAQS'' to the revisions that
were made with the 2013 PM NAAQS. This revision was proposed to clearly
indicate that two distinct actions are necessary for the data from a
continuous PM2.5 FEM to be considered not comparable to the
NAAQS; first, the identification of the relevant monitor(s) in an
agency's annual monitoring network plan, and, second, the approval by
the EPA Regional Administrator of that request to exclude data. The
language used by the EPA in the relevant sections of 40 CFR 58.12
related to the initial request by monitoring agencies but did not
specifically address the needed approval by the EPA.
No comments specifically addressed these editorial changes in
regulatory language and they will be finalized as proposed.
Finally, the EPA proposed to clarify the applicability of
statements in 40 CFR 58.12(d)(1)(ii) and (iii) that reference the
relationship of sampling frequency to site design values. Specifically,
we proposed clarifications and revisions affecting the following
statements: (1) ``Required SLAMS stations whose measurements determine
the design value for their area and that are within ±10
percent of the NAAQS; and all required sites where one or more 24-hour
values have exceeded the NAAQS each year for a consecutive period of at
least 3 years are required to maintain at least a 1-in-3 day sampling
frequency,'' and (2) ``Required SLAMS stations whose measurements
determine the 24-hour design value for their area and whose data are
within ±5 percent of the level of the 24-hour
PM2.5 NAAQS must have a FRM or FEM operate on a daily
schedule.'' These revisions were proposed to avoid confusion among
monitoring agencies and Regional Offices concerning the applicability
of the sampling frequency adjustments since design values are
recalculated annually and, in some situations, such revised design
values can either fall below the comparative criteria or rise above the
criteria. To provide some clarity to this situation as well as to
provide a framework where changes in sampling frequency occur on a more
consistent and predictable basis, the EPA proposed that design value-
driven sampling frequency changes be maintained for a minimum 3-year
period once such a change is triggered. Additionally, such changes in
sampling frequency would be required to be implemented no later than
January 1 of the year that follows the recalculation and certification
of a triggering design value.
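    The interaction of these triggers can be summarized in a simplified
decision sketch. This is an illustration of the logic described above,
not regulatory text; the function names and inputs are hypothetical
simplifications, and the 2012 PM2.5 NAAQS levels (12.0 annual and 35
24-hour, in micrograms per cubic meter) are used for concreteness. The
controlling requirements are in 40 CFR 58.12(d)(1)(ii) and (iii).

    # Simplified decision sketch (illustration only) of the design-value
    # driven sampling frequency triggers described above; 40 CFR
    # 58.12(d)(1)(ii) and (iii) contain the controlling requirements.
    # NAAQS levels shown are the 2012 PM2.5 standards, in ug/m3.
    ANNUAL_NAAQS = 12.0
    DAILY_NAAQS = 35.0

    def within_pct(design_value, naaqs_level, pct):
        """True if a design value is within +/- pct percent of a NAAQS level."""
        return abs(design_value - naaqs_level) <= (pct / 100.0) * naaqs_level

    def minimum_frequency(sets_area_dv, annual_dv, dv_24hr,
                          exceeded_3_straight_years):
        """Return the minimum required manual PM2.5 sampling frequency.

        sets_area_dv: True if this site's measurements determine the
            design value for its area.
        exceeded_3_straight_years: True if one or more 24-hour values
            exceeded the NAAQS in each of at least 3 consecutive years.
        """
        if sets_area_dv and within_pct(dv_24hr, DAILY_NAAQS, 5):
            return "daily"       # +/-5 percent of the 24-hour NAAQS
        if exceeded_3_straight_years or (sets_area_dv and (
                within_pct(annual_dv, ANNUAL_NAAQS, 10)
                or within_pct(dv_24hr, DAILY_NAAQS, 10))):
            return "1-in-3 day"  # +/-10 percent of the NAAQS
        return "1-in-3 day (default minimum; reductions subject to approval)"

    # Example: a site that sets its area's design values, with a 24-hour
    # design value of 33.8 ug/m3 (within +/-5 percent of 35), samples daily.
    freq = minimum_frequency(True, 11.2, 33.8, False)

    Under the final rule, a frequency change triggered in this way would
take effect no later than January 1 following certification of the
triggering design value and would be maintained for a minimum 3-year
period.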
A number of supportive comments were received on this specific
issue from monitoring agencies. These comments ranged from unqualified
support to more conditional support based on concerns related to
funding levels and the overall burden of analyzing more
PM2.5 filters when sampling frequency is increased. One
agency commented that the proposed change ``makes sense where the
concentrations have reached a plateau or fluctuate back and forth from
year to year.'' However, concern was noted about waiting for 3 years to
decrease sampling frequency when design values are clearly trending
downward. Another state agency generally agreed with the proposed
approach but requested clarifying language that the same criteria that
would require an increase in sampling frequency for a 3-year period due
to an increase in design values would also allow a decrease in sampling
frequency for a 3-year period if the corresponding site design value
decreased below a threshold. Other commenters expressed concern about
the associated resource burdens noting that their gravimetric
laboratories are already operating at full capacity and that an
increase from 1-in-3 day sampling to daily sampling would triple the
number of filters to be weighed. Accordingly, these commenters
requested that the EPA allow the affected design value sampler to drop
back to a reduced sampling frequency as soon as a design value fell out
of the specific range and not be required to wait for the proposed 3-
year period. One commenter expressed concern that the provision could
trigger daily sampling even if the higher values were caused by a rare
or exceptional event, and requested that the proposed revision be
omitted. Finally, one state monitoring
agency expressed concern about the apparent deletion of PM10
monitoring requirements from 40 CFR 58.12, and also offered suggested
revisions to the current requirements in 40 CFR 58.12(e).
The EPA notes the range of responses on this issue and acknowledges
that in cases where the sampling frequency for a PM2.5
sampler is increased, for example from 1-in-3 day to daily sampling,
the associated burden, which includes field support and gravimetric lab
support, would increase for a minimum period of 3 years based on the
proposed change. After that 3-year period of increased sampling, the
sampling frequency would be eligible to be reduced if the triggering
design value was no longer in the specified range (e.g., ±5
percent of the 24-hour PM2.5 NAAQS). The EPA agrees that the
treatment of sampling frequency in situations where a sampler is no
longer in the specific triggering range after a 3-year period of
increased sampling should be analogous to the treatment of sampling
frequency in situations where a sampler first enters the specific
triggering range, for purposes of providing predictability to
monitoring agencies in anticipating operational burden. In
other words, where the sampling frequency is reduced at a sampler after
a 3-year period of increased sampling frequency (for example, where the
design value falls out of the ±5 percent range), that
sampler should not be subject to an increased sampling frequency
requirement for at least 3 years. With regard to the concern that an
exceptional event could trigger the increased burden of operating a
higher sampling frequency sampler, we believe that this is a plausible
situation that deserves additional consideration. Rather than trying to
account for this situation in this rule, however, we believe it is best
dealt with in the context of the ongoing process of developing guidance
and proposed revisions to the Exceptional Events rule.\9\ Once those
actions are finalized, the EPA will work with Regional Offices to
clarify how to address this situation. On the related concern of a
``rare'' event triggering increased sampling frequency, the EPA notes
that the form of the PM2.5 NAAQS is intended to address such
year-to-year variations such that design values should not be overly
affected by ``rare'' occurrences of PM2.5 concentrations in
any given year. With regard to the comment indicating an apparent
deletion of the PM10 sampling frequency requirements in 40
CFR 58.12(e), we note that such changes were not included as part of
the proposal and those requirements remain.
---------------------------------------------------------------------------
\9\ https://www2.epa.gov/air-quality-analysis/treatment-data-influenced-exceptional-events#Proposed%20EE%20Rule.
---------------------------------------------------------------------------
The EPA believes that this proposed revision to sampling frequency
procedures is a necessary clarification to the regulatory change that
was finalized in 2006, and will provide a more predictable and
statistically robust process for making design value driven changes in
sampling frequency. Based on the unqualified and qualified supportive
comments, we are finalizing the regulatory language as proposed. While
we are mindful of the potential for added burden in cases where
PM2.5 samplers must move to a more frequent sampling
frequency for a longer period of time based on this revision, we also
note that the likelihood of such occurrences affecting monitoring
agencies is relatively small. Based on an AQS retrieval conducted in
August 2014, fewer than ten PM2.5 monitors out of a pool of
980 FRM monitors were required to operate on a daily sampling frequency
based on the rule provisions of 40 CFR 58.12(d)(1)(iii).\10\ While this
analysis is not predictive in nature, we believe the overall risk of
increasing burden on monitoring programs is quite small and an
acceptable consequence of providing a more specific way of implementing
an important aspect of the sampling frequency requirements.
Alternatively, as noted in the regulatory text, monitoring agencies
have the option of installing a continuous PM2.5 FEM monitor
to satisfy this requirement and, thereby, avoid the consequence of
handling an increased number of filters.
---------------------------------------------------------------------------
\10\ Hanley, T. (2015). Assessment of PM2.5 data to
determine the number of sites that would be potentially required to
increase their sample frequency to daily. Memorandum to the Docket,
EPA-HQ-OAR-2013-0619.
---------------------------------------------------------------------------
F. System Modification
The System Modification section pertains to the specific
requirements that must be followed when monitoring agencies request
changes to the SLAMS portion of their networks.
In the 2006 monitoring amendments, the EPA finalized a requirement
in 40 CFR 58.14(a) for monitoring agencies to ``develop and implement a
plan and schedule to modify the ambient air quality network that
complies with the finding of the network assessments required every 5
years by 58.10(e).'' Since 2006, there has been confusion between the
EPA and monitoring agencies as to whether 40 CFR 58.14(a) required the
submission of a separate plan, distinct from the annual monitoring
network plan, devoted specifically to discussing the results of the
5-year network assessment. As explained
in the monitoring proposal, the EPA did not intend for the submission
of a distinct plan devoted specifically to the implementation of the 5-
year network assessment. Accordingly, the EPA proposed to revise the
regulatory language in 40 CFR 58.14(a) to clearly indicate that a
separate plan is not needed to account for the findings of the 5-year
network assessment, and that the information concerning the
implementation of the 5-year assessment, referred to in the proposed
regulatory language as a ``network modification plan,'' shall be
submitted as part of the annual monitoring network plan that is due no
later than the year after the network assessment is due.\11\ According
to the proposed schedule, the annual monitoring network plans that are
due in 2016, 2021, etc., would contain the information referencing the
network assessments.
---------------------------------------------------------------------------
\11\ Monitoring agencies, at their discretion, could submit the
network modification plan in the year that the assessment is due if
sufficient feedback had been received. On balance, the EPA believes
that the extra year following the completion of the network
assessment would be valuable to assure a productive outcome from the
assessment process.
---------------------------------------------------------------------------
A number of comments were received on this issue. Most of the
commenters provided the perspective that the clarification in the
regulatory text was useful but that additional clarification was needed
to address how the phrase ``implement the findings'' was used in the
language. Five of these commenters noted that states should only have
to address those changes in the network assessments that are
specifically required by regulation, and that the EPA should clarify
that monitoring agencies have the flexibility to discuss what findings
they intend to implement and which findings they do not intend to
implement. Two commenters noted that monitoring agencies should not
have to summarize the findings of their network assessment in a network
modification plan that is due one year after the assessment, but rather
should have the flexibility to address and implement those findings
that are appropriate based on available resources and changing
priorities over some period of time. Two commenters supported the
proposed language without additional elaboration.
The EPA agrees with the comments requesting additional
clarification. The intention of the proposed revision was to clarify
the process for how and when monitoring agencies should deal with
[[Page 17257]]
the results from these important network assessments, not to imply that
all the results should be implemented or were necessarily required. The
network assessment requirements detailed in 40 CFR 58.10(d) reference a
mix of required elements (e.g., meeting the monitoring objectives of
appendix D) as well as useful but non-required elements such as
evaluation of new technologies and the evaluation of the impact on data
users of site discontinuance. To the extent that the EPA used the
phrase ``implements the findings of the network assessment'' in the
proposed regulatory language of 40 CFR 58.14(a), the concern from
monitoring agencies about specifying which results from the network
assessment are required and not required is understandable. The EPA
always intended that the results of the network assessments should be
used as a flexible planning tool for informing the next 5 years of
monitoring network operations, and the specificity being implied by the
monitoring agency comments reflects a misreading of those
intentions.\12\ The EPA disagrees with the comments suggesting that a
network modification plan is unnecessary. Such a requirement has been a
part of the monitoring regulations since the inception of the network
assessment, and having the network modification plan submitted as part
of the annual monitoring network plan ensures public involvement in a
key process that occurs on a relatively infrequent basis.
---------------------------------------------------------------------------
\12\ See https://www.epa.gov/ttn/amtic/files/2014conference/monnaweinstock.pdf.
---------------------------------------------------------------------------
To address the concerns noted above, the proposed regulatory
language is being revised to replace ``implements'' with ``addresses,''
as follows: ``The state, or where appropriate local, agency shall
develop a network modification plan and schedule to modify the ambient
air quality monitoring network that addresses the findings of the
network assessment required every 5 years by Sec. 58.10(d).'' With
this revision, the EPA is indicating that the network modification plan
should reference or ``address'' the findings of the network assessment
without the unintended implication that some of the findings are
required network changes that must be implemented. The correct vehicle
for the discussion of required elements that must be implemented is the
annual monitoring network plan that is required to be submitted each
year, as discussed earlier in section II.C of this preamble.
The EPA also proposed to revise an incorrect cross-reference in the
current text of 40 CFR 58.14(a) in which the network assessment
requirement is noted as being contained in 40 CFR 58.10(e) when the
correct cross-reference is 40 CFR 58.10(d). One supportive comment
addressed this issue, and the revision will be finalized as proposed.
G. Annual Air Monitoring Data Certification
The data certification requirement is intended to provide ambient
air quality data users with an indication that all required validation
and reporting steps have been completed, and that the certified data
sets are now considered final and appropriate for all uses including
the calculation of design values and the determination of NAAQS
attainment status. Current requirements include the certification of
data collected at all monitors at SLAMS and monitors at SPMs using FRM,
FEM, or ARM methods. In practice, this requirement includes a very wide
range of measurements that are not limited to criteria pollutants but
also extend to non-criteria pollutant measurements at PAMS stations,
meteorological measurements at PAMS and NCore stations, and
PM2.5 chemical speciation parameters.
The EPA proposed several changes in the data certification
requirements to accomplish a streamlining of this important process.
First, to support the focus on certification of criteria pollutant
measurements, the EPA proposed to revise relevant sections of 40 CFR
58.15 to focus the requirement on FRM, FEM, and ARM monitors at SLAMS
and at SPM stations rather than at all SLAMS, which also include PAMS
and CSN measurements that may not utilize federally approved methods.
Second, the EPA proposed that the required AQS reports be submitted to
the Regional Administrator rather than through the Regional
Administrator to the Administrator as is currently required. Finally,
minor editorial changes were proposed in 40 CFR 58.15 to generalize the
title of the official responsible for data certification (senior
official versus senior air pollution control officer) and to remove an
outdated reference to the former due date for the data certification
letter (July 1 versus the current due date of May 1).
Seven commenters specifically addressed the proposed changes to
data certification. Three monitoring agencies, one MJO, and one
consulting firm were supportive of the changes. One of these commenters
also noted that the data certification and QA report hosted on the AQS
system, the AMP600 report, should be modified to provide more useful
data certification flag recommendations for regions and states. Another
of these supportive commenters also stated that the EPA should ensure
that QA practices and responsibilities remain in place to validate PAMS
and PM2.5 chemical speciation data. A joint environmental
group comment stated that the EPA had not provided a rational basis for
the proposed changes, and that an inconsistency exists between
proposing to retain the data certification process for criteria
pollutants while stating that existing QA plans and procedures would be
sufficient to validate non-criteria pollutant measurements. In this
commenter's view, the data certification process, as it exists today,
appears to delay the availability of data for use in computing criteria
pollutant design values, so perhaps the agency should consider
eliminating the process entirely if it is deemed unnecessary. Finally,
one commenter asked that the EPA consider moving the data certification
deadline from May 1 back to July 1, and also to consider not requiring
chemical speciation data to be certified.
With regard to the adverse comment, the EPA notes that the proposed
changes were made to protect the viability of the process in the face
of a rapidly increasing volume of data subject to certification
requirements versus the available resources at the monitoring agency
and EPA level needed to meet the requirements and deadline. We continue
to believe that the data certification process adds the greatest degree
of value when focused on criteria pollutants that support the
calculation of design values and the mandatory designations process.
The review of design values occurs on an annual basis and there is a
long-standing practice of waiting for criteria pollutant data to be
certified before such calculations are completed.\13\ This process
provides a basis for documenting that a state's review of their data is
complete and that the data are considered final for key purposes such
as comparison to the NAAQS. The same annual pattern of regular data
usage and oversight does not exist for non-criteria pollutants such as
PAMS, PM2.5 chemical species, and air toxics data, and these
data are not directly compared to the NAAQS. Therefore, the EPA
believes that the applicability and visibility of the data
certification process for these measurements is less critical. As
stated in the proposal, there are existing standardized procedures and
QA documents that provide a framework for assuring the quality of
[[Page 17258]]
non-criteria pollutants,\14\ and we believe that the resulting quality
of such data will not be compromised by their removal from the data
certification process. With regard to the comment requesting that the
data certification deadline be pushed back to July 1, the EPA notes
that this deadline was not proposed for revision and, therefore, is not
being considered in this final rulemaking. With regard to the comment
about excluding chemical speciation data from the certification
process, the EPA notes that this procedural change would occur as a
result of the proposed revisions as explained above.
---------------------------------------------------------------------------
\13\ See 40 CFR part 50, appendix N, section 3.0(a) as revised
on January 15, 2013 (78 FR 3278).
\14\ See https://www.epa.gov/ttn/amtic/specguid.html and https://www.epa.gov/ttn/amtic/airtoxqa.html.
---------------------------------------------------------------------------
After reviewing the comments, the EPA is finalizing the changes to
data certification requirements as proposed. The EPA agrees with
commenters that efforts to improve the validation procedures for non-
criteria data should continue and the agency has invested in revised
tools, such as the recently launched Data Analysis and Reporting Tool
(DART) web resource that can assist monitoring agencies with the
validation of data including PAMS and air toxics data.\15\ Improvements
are also being made to the AMP600 report to increase its utility for
generating recommended certification flags for
consideration by monitoring agencies and EPA Regional Offices during
the annual review process.
---------------------------------------------------------------------------
\15\ See https://www.epa.gov/ttn/amtic/files/2014conference/mondatdewinter.pdf or access DART at https://www.airnowtech.org/dart/dartwelcome.cfm (username and password required).
---------------------------------------------------------------------------
H. Data Submittal and Archiving Requirements
The requirements described in 40 CFR 58.16 address the specific
measurements that must be reported to AQS as well as the relevant
schedule for doing so. Required measurements include criteria
pollutants in support of NAAQS monitoring objectives and public
reporting; specific ozone (O3) and PM2.5
precursor measurements such as those obtained at PAMS, NCore, and CSN
stations; selected meteorological measurements at PAMS and NCore
stations; and associated QA data that support the assessment of
precision and bias. In 1997, an additional set of required supplemental
measurements was added to 40 CFR 58.16 in support of the newly
promulgated FRM for PM2.5, described in 40 CFR part 50,
appendix L. In the 2006 monitoring amendments, many of these
supplemental measurements were removed from the requirements based on
the EPA's confidence that the PM2.5 FRM was meeting data
quality objectives (see 71 FR 2748). At that time, reporting
requirements were retained for average daily ambient temperature and
average daily ambient pressure, as well as any applicable sampler
flags, in addition to PM2.5 mass and field blank mass.
The EPA believes that it is no longer necessary to require agencies
to report the average daily temperature and average daily pressure from
manual PM2.5 samplers, given the long-standing experience
with the FRM and the ubiquitous availability of meteorological data,
and these specific AQS reporting requirements were proposed for removal
in the monitoring proposal. The EPA also proposed to remove similar
language referenced elsewhere in 40 CFR 58.16 that pertains to
measurements at Pb sites as well as to other average temperature and
average pressure measurements recorded by samplers or from nearby
airports. For the reasons noted above, the EPA believes that
meteorological data are more than adequately available from a number of
sources, and that the removal of specific requirements for such data to
be reported to AQS represents an opportunity for burden reduction. The
EPA notes that the requirement to report specific meteorological data
for NCore and PAMS stations remains unchanged given the importance of
having on-site meteorological data to correlate with PM2.5
and O3 precursor measurements. The EPA also proposed a
change to the data reporting schedule described in 40 CFR 58.16(b) and
(d) to provide additional flexibility for reporting PM2.5
chemical speciation data measured at CSN stations. Specifically, we
proposed that such data be required to be reported to AQS within 6
months following the end of each quarterly reporting period, as is
presently required for certain PAMS measurements such as volatile
organic compounds. This change would provide an additional 90 days for
PM2.5 chemical speciation data to be reported compared with
the current requirement of reporting 90 days after the end of each
quarterly reporting period. This change was proposed to provide both
the EPA and monitoring agencies with potential data reporting
flexibility as technological and procedural revisions are considered
for the national analytical frameworks that support the CSN network.
Seven commenters specifically addressed the proposed changes to
data submittal and archiving requirements. One state monitoring agency,
one MJO, and one consulting firm were supportive of all of the proposed
changes in this rule section, with the consulting firm comment also
noting that average temperature and pressure information should still
be archived within monitoring programs for data validation purposes.
Two state monitoring agencies expressed concerns about the proposed
change in the reporting deadline for PM2.5 chemical
speciation data, noting impacts on their use of the data: one agency
noted that the longer period allowed for reporting would hinder
efforts to submit timely exceptional event demonstrations, and the
other noted that it would affect their use of the speciation data to
validate PM2.5 FRM and ion (e.g., sulfate, nitrate) data.
With specific regard to the impact on state submissions of
exceptional event data exclusion determinations, the EPA understands
the impact of the additional 90-day delay in gaining access to
PM2.5 chemical speciation data, but also notes that the
relatively long timelines that currently exist within the exceptional
events rule framework can typically accommodate an additional delay of
90 days without significant impact on the submitting agency.
Accordingly, we do not believe that the additional 90 days being
proposed for reporting PM2.5 chemical speciation data should
materially impact the ability of submitters to develop exceptional
event data exclusion determinations within allowable timeframes.\16\
Concerning the comment relating to the availability of PM2.5
chemical speciation data to QA practices for PM2.5 FRM data,
the EPA acknowledges the comparative value of such data but believes
that the existing availability of PM2.5 sampler diagnostic
records, collocated FRM data, as well as the potential availability of
continuous monitoring data from collocated monitors and/or nearby
sites, should be more than sufficient to validate PM2.5 FRM
data in the absence of more timely reported speciation data.
---------------------------------------------------------------------------
\16\ The EPA expects chemical speciation data to be reported
within 30 days of PM2.5 mass data based on the revised
analytical framework that took effect in late 2015.
---------------------------------------------------------------------------
In consideration of the comments noted above, the EPA is finalizing
the changes to data submittal and archiving requirements as proposed.
I. Network Design Criteria (Appendix D)
Appendix D to part 58 contains important information about ambient
monitoring objectives, site types, spatial scales, as well as other
general and specific minimum requirements
[[Page 17259]]
concerning network size and design criteria.
The EPA proposed two changes that affect the required suite of
measurements in the NCore network. This multi-pollutant network became
operational on January 1, 2011, and includes approximately 80 stations
that are located in both urban and rural areas.\17\
---------------------------------------------------------------------------
\17\ See https://www3.epa.gov/ttn/amtic/ncore.html for more
information.
---------------------------------------------------------------------------
The EPA proposed a minor change to section 3 of appendix D to part
58, the design criteria for NCore sites, specifically, the deletion of
the requirement to measure speciated PM10-2.5 from the list
of measurements in section 3(b). An identical revision was finalized in
the text of 40 CFR 58.16(a) in the 2013 PM NAAQS final rule (see 78
FR 3244). During this process, the EPA inadvertently failed to complete
a similar change that was required in the language of section 3 of
appendix D. Accordingly, we proposed this change to align the NCore
monitoring requirements between the two sections noted above.
The EPA also proposed to delete the requirement to measure Pb at
urban NCore sites, either as Pb in Total Suspended Particles (Pb-TSP)
or as Pb-PM10. This requirement was finalized as part of the
reconsideration of Pb monitoring requirements that occurred in 2010
(see 75 FR 81126). Since that time, non-source oriented Pb data have
been measured at 50 urban NCore sites, with the majority of sites
having already collected at least 2 years of data. In all cases, valid
ambient Pb readings have been low, with maximum 3-month rolling
averages typically reading around 0.01 micrograms per cubic meter as
compared to the NAAQS level of 0.15 micrograms per cubic meter.\18\
This is an expected result given the elimination of Pb from gasoline
and the refocusing of the ambient network to characterize emissions at
sites that have been placed in relatively close proximity to the
remaining industrial sources emitting above a given threshold. We expect the
vast majority of non-source sites to have the 3 years of data necessary
to calculate a design value following the completion of monitoring in
2015. Given the uniformly low readings being measured at these NCore
sites, we believe it is appropriate to consider eliminating this
requirement. As noted in the associated docket memo, non-source
oriented Pb data will continue to be measured (as Pb-PM10)
at the 27 National Air Toxics Trends Sites (NATTS) and at hundreds of
PM2.5 speciation stations that comprise the CSN and IMPROVE
networks.
---------------------------------------------------------------------------
\18\ See supporting information for reconsideration of existing
requirements to monitor for lead at urban NCore site, Kevin
Cavender, Docket number EPA-HQ-OAR-2013-0619, https://www.regulations.gov/#!documentDetail;D=EPA-HQ-OAR-2013-0619-0002.
---------------------------------------------------------------------------
Accordingly, the EPA proposed to delete the requirement to monitor
for non-source oriented Pb at NCore sites from appendix D of 40 CFR
part 58.\19\ Given the requirement to collect a minimum of 3 years of
Pb data in order to support the calculation of design values, the EPA
proposed that monitoring agencies would be able to request permission
to discontinue non-source oriented monitoring following the collection
of at least 3 years of data at each urban NCore site.\20\
---------------------------------------------------------------------------
\19\ Specific revisions are proposed in 40 CFR part 58, appendix
D, section 3(b) and sections 4.5(b) and 4.5(c).
\20\ The EPA will review requests for shutdown under the
provisions of 40 CFR 58.14. Although the EPA anticipates that these
non-source oriented monitors will have design values well below the
NAAQS and will be eligible to be discontinued after 3 years of data
have been collected, in the event that a monitor records levels
approaching the NAAQS, it may not qualify to be discontinued.
---------------------------------------------------------------------------
Eight commenters specifically addressed the proposed changes to
network design criteria. Five state or local monitoring agencies, one
MJO, and one consulting firm were supportive of all of the proposed
changes in this appendix, with several of the monitoring agencies
characterizing their measurements of Pb at urban NCore sites as either
``extremely low'' or between 3 percent and 5 to 7 percent of the Pb
NAAQS. One joint environmental group comment disagreed with the
proposed change to Pb monitoring, noting the perspective that there is
no safe level of Pb, that data even well below the level of the
NAAQS could assist communities with finding ways of reducing Pb
exposure, and that such data would also assist researchers
investigating the risks of Pb exposure for children. This commenter also noted that
the EPA might propose to lower the Pb NAAQS in an upcoming rulemaking
that was pending at the time when the comment was submitted.
With regard to the adverse comment, the EPA notes in the referenced
docket memo that well over 300 monitoring sites for Pb would remain in
operation following the proposed termination of monitoring at urban
NCore sites. These remaining sites would provide characterization of Pb
in TSP, PM10, and PM2.5 in a variety of urban and
rural locations including source oriented sites, neighborhood/community
locations, and background areas. We also note that the EPA retains the
authority to require additional Pb monitoring as determined by Regional
Administrators per the rule language in appendix D, section 4.5(c).
With regard to the reference to the EPA's upcoming decision on the Pb
NAAQS, we note that on December 19, 2014, based on a review of the full
body of evidence, the EPA proposed to retain, without revision, the
current NAAQS of 0.15 micrograms per cubic meter (as a 3-month average
in TSP) as requisite to protect public health and welfare.\21\
---------------------------------------------------------------------------
\21\ https://www.epa.gov/airquality/lead/actions.html#dec2014.
---------------------------------------------------------------------------
In consideration of the supportive comments noted above, the EPA is
finalizing the changes to network design criteria as proposed. With
specific regard to Pb monitoring at urban NCore sites, monitoring
agencies should request permission from the EPA Regional Administrator
to discontinue non-source oriented monitoring following the collection
of at least 3 years of complete data at each affected site. Monitoring
agencies should work closely with their respective EPA Regional Offices
to ensure review and coordination of these changes to the network and
inclusion of such changes in annual monitoring network plans.
III. Amendments to Quality Assurance Requirements
A. Quality Assurance Requirements for Monitors Used in Evaluations for
National Ambient Air Quality Standards--Appendix A
1. General Information
The following changes to monitoring requirements relate to appendix
A to part 58. Changes that affect the overall appendix are discussed in
this section of the preamble while changes specific to the various
sections of the appendix will be addressed in subsequent sections of
the preamble. The EPA notes that the entire regulatory text section for
appendix A will be reprinted since this section is being reorganized
for clarity as well as being selectively revised as described in detail
below. Additionally, although the EPA proposed a new appendix B to
apply to PSD monitors, much of the proposed content of appendix B was
taken directly from the existing requirements for these monitors set
forth in appendix A. It should be noted that a number of provisions
from appendix A were reprinted in the regulatory text for appendix B
solely for clarity, to assist the public in understanding the changes
being proposed. The EPA did not solicit comment on those provisions and
did not make any changes to those provisions in this rulemaking.
[[Page 17260]]
The QA requirements in appendix A have been developed for measuring
the criteria pollutants of O3, NO2, sulfur
dioxide (SO2), CO, Pb and PM (PM10 and
PM2.5), and are minimum requirements for monitoring these
ambient air pollutants for use in NAAQS attainment demonstrations. To
emphasize the objective of this appendix, the EPA proposed to change
the title of appendix A to ``Quality Assurance Requirements for
Monitors used in Evaluations of National Ambient Air Quality
Standards,'' and remove the terms SLAMS and SPMs from the title. We do,
however, in the applicability paragraph, indicate that any monitor
identified as SLAMS must meet the appendix A criteria in order to avoid
any confusion about SLAMS monitors measuring criteria pollutants.
Special purpose monitors may in fact be monitoring for a criteria
pollutant for other objectives besides making comparisons to the NAAQS.
Therefore, appendix A clarifies in the title and the applicability
section that the QA requirements specified in this appendix are for
criteria pollutant monitors that are designated, through the Part 58
ambient air regulations and monitoring organization annual monitoring
network plans, as eligible to be used for NAAQS evaluation purposes.
The applicability section also provides a reporting mechanism in AQS to
identify any criteria pollutant monitors that are not used for NAAQS
evaluations. The monitors identified for NAAQS exclusion will require
review and approval by the EPA Regional Offices, a step that will
increase transparency and efficiency in the NAAQS designation, data
quality evaluation, and data certification processes. There were no
adverse comments to the change in the title and, therefore, the title
will be changed as proposed.
The previous appendix A regulation had separate sections for
automated (continuous) and manual method types. The EPA proposed to
reformat the document by pollutant rather than by method type. The four
gaseous pollutants (CO, NO2, SO2 and
O3) will be contained in one section since the quality
control (QC) requirements are very similar, and separate sections will
be provided for PM10, PM2.5, and Pb.
The EPA received one supportive comment from a consulting firm on
the proposed reformatting and no adverse comments. Therefore,
appendix A and appendix B will be reformatted as proposed.
In the 2006 monitoring rule revisions, the PSD QA requirements,
which were previously in appendix B, were added to appendix A and
appendix B was reserved. The PSD requirements, in most cases, mimicked
appendix A in structure but because PSD monitoring is often operated
only for a period of 1 year, some of the frequencies of implementation
of the PSD requirements are higher than the appendix A requirements. In
addition, the agencies governing the implementation, assessment and
approval of the QA requirements are different for PSD and ambient air
monitoring for NAAQS decisions (i.e., the EPA Regions for appendix A
versus PSD reviewing authorities for PSD). The combined regulations
have caused confusion among monitoring organizations and those
implementing PSD requirements, so the EPA proposed that the PSD
requirements be moved back to a separate appendix B. This change would
also provide more flexibility for revision if changes in either
appendix are needed.
The EPA received one comment supporting this change and
received no adverse comments. Therefore, PSD QA requirements will be
placed into appendix B as proposed.
Finally, the EPA proposed that appendix A emphasize the use of PQAO
and moved the definition and explanation to the beginning of the
regulation in order to ensure that the application and use of PQAO in
appendix A is clearly understood. The definition for PQAO was not
proposed for change. Since the PQAO can be a consolidation of a number
of local monitoring organizations, the EPA proposed to add a sentence
clarifying that the agency identified as the PQAO (usually the state
agency) will be responsible for overseeing that the appendix A
requirements are being met by all local agencies within the PQAO.
The current appendix A regulation requires PQAOs to be approved by the EPA
Regions during network reviews or audits. The EPA believes this
approval can occur at any time and proposed to eliminate wording that
suggests that PQAO approvals can only occur during events like network
reviews or audits.
The EPA received one comment supporting the clarifying language,
suggesting that it will reduce unnecessary work on the part of the
monitoring agencies by combining and consolidating QA/QC activities and
also fostering a unified approach to air monitoring across an entire
state's PQAO. The EPA received no adverse comments. Therefore, the EPA
is finalizing the language as proposed.
2. Quality System Requirements
The EPA proposed to remove the QA requirements for
PM10-2.5 (see current sections 3.2.6, 3.2.8, 3.3.6, 3.3.8,
4.3). Appendix A has traditionally been used to describe the QA
requirements of the criteria pollutants used in making NAAQS attainment
decisions. While the part 58 Ambient Air Monitoring regulation requires
monitoring for the CSN, PAMS, and total oxides of nitrogen
(NOy) for NCore, the QA requirements for these networks are
found in technical assistance documents and not in appendix A. In 2006,
the EPA proposed a PM10-2.5 NAAQS along with requisite QA
requirements in appendix A. While the PM10-2.5 NAAQS was not
promulgated, PM10-2.5 monitoring was still required to be
performed at NCore sites, and the requisite QA requirements were
finalized in appendix A. Some of the PM requirements, like collocation for
precision and the performance evaluation programs for bias, are
accomplished on a percentage of monitoring sites within a PQAO. For
example, collocated sampling for PM2.5 and PM10
is required at approximately 15 percent of the monitoring sites within
a PQAO. Since virtually every NCore site is the responsibility of a
different PQAO, the appendix A requirements for PM10-2.5, if
implemented at the PQAO level, would have been required to be
implemented at almost every NCore site, which would have been expensive
and an unintended burden. Therefore, the EPA required the
implementation of the PM10-2.5 QC requirements at a national
level and worked with the EPA Regions and monitoring organizations to
identify the sites that would implement the requirements. The
implementation of the PM10-2.5 QC requirements at NCore
sites fundamentally changed how QC is implemented in appendix A and has
been a cause of confusion. Since PM10-2.5 is not a NAAQS
pollutant and the QC requirements cannot be cost-effectively
implemented at a PQAO level, the EPA proposed to eliminate the
PM10-2.5 requirements including flow rate verifications,
semi-annual flow rate audits, collocated sampling procedures, and the
PM10-2.5 Performance Evaluation Program (PEP). Similar to
the technical assistance documents associated with the CSN \22\ and PAMS
\23\ networks, the EPA will develop QA guidance for the
PM10-2.5 network which will afford more flexibility for
implementation and revision of QC activities for PM10-2.5.
---------------------------------------------------------------------------
\22\ See https://www.epa.gov/ttn/amtic/specguid.html for CSN
quality assurance project plan.
\23\ See https://www.epa.gov/ttn/amtic/pamsguidance.html for PAMS
technical assistance document.
---------------------------------------------------------------------------
The EPA received comments from a state and a consulting firm in
support of
[[Page 17261]]
the removal of these requirements and no adverse comments. Therefore,
the EPA will remove the PM10-2.5 QA requirements as
proposed.
The EPA proposed that the QA Pb requirements of collocated sampling
(see current section 3.3.4.3) and Pb performance evaluation procedures
(see current section 3.3.4.4) for non-source oriented NCore sites be
eliminated. The 2010 Pb rule in 40 CFR part 58, appendix D, section
4.5(b), added a requirement to conduct non-source oriented Pb
monitoring at each NCore site in a core based statistical area (CBSA)
with a population of 500,000 or more. This requirement had some
monitoring organizations implementing Pb monitoring at only their NCore
sites. Since the appendix A requirements are focused on PQAOs, the QC
requirements would increase at PQAOs that were required to implement Pb
monitoring at their NCore site. Similar to the PM10-2.5 QA
requirements, the requirement for Pb at NCore sites forced the EPA away
from a focus on PQAOs to working with the EPA Regions and monitoring
organizations for implementation of the Pb-PEP at NCore sites at
national levels. Therefore, the EPA proposed to eliminate the
collocation requirement and the Pb-PEP requirements at NCore sites
while retaining the requirements for flow rate verifications and flow
rate audits, which do not require additional monitors or independent
sampling and analysis. Similar to the CSN and PAMS programs, the EPA
will develop QA guidance for Pb monitoring in the NCore network, which
will afford more flexibility for change/revision to accommodate Pb
monitoring at non-source oriented NCore sites. Additionally, the EPA
proposed to delete the requirement to measure Pb at these specific
NCore sites, either as Pb-TSP or as Pb-PM10 (see section
II.I). Such a revision would eliminate the need for any associated QA
requirements including collocation, Pb-PEP or any QC requirements for
these monitors.
The EPA received two state comments and one MJO comment in support
of the removal of this requirement and no adverse comments. Therefore,
the EPA will remove the Pb QA requirements at non-source oriented NCore
sites as proposed. As noted earlier in section II.I, the EPA is also
finalizing the proposed deletion of Pb monitoring requirements at NCore
sites from appendix D.
The EPA proposed that quality management plan (QMP) (current
section 2.1.1) and quality assurance project plan (QAPP) (current
section 2.1.2) submission and approval dates be reported by monitoring
organizations and the EPA. This will allow for timely and accurate
reporting of this information. From 2007 to 2011, the EPA tracked the
submission and approval of QMPs and QAPPs by polling the EPA Regions
each year and updating a spreadsheet that was posted on the Ambient
Monitoring Technical Information Center (AMTIC) Web site. The
development of the annual spreadsheet was time-consuming on the part of
monitoring organizations and the EPA and, due to polling delays, took a
significant amount of time to assemble a final version for posting. It
is expected that simplified reporting by monitoring organizations and
EPA to AQS will reduce entry errors and the burden of incorporating
this information into annual spreadsheets, and increase transparency of
this important quality system documentation. In order to reduce the
initial burden of this data entry activity, the EPA populated AQS with
the last set of updated QMP and QAPP data from the 2011 listing.
Monitoring organizations will need to update AQS only when submitting
new or revised versions of QAPPs or QMPs (one or two fields) and the EPA
can then add approval dates.
The EPA received one state comment in support of this proposal, and
two states, a consulting firm, and one MJO expressed concern.
One state commenter mentioned that the preamble indicates that the
monitoring organizations would be responsible for submitting the dates
associated with QMP and QAPP submittals and approvals and, if this was
the intent of the proposed rule, AQS must be modified to allow
monitoring organizations the ability to enter these data. The commenter
also mentioned that the EPA's AQS web application only allows
monitoring organizations to view QAPP and QMP dates, but the
functionality to enter or revise those dates is unavailable. The
commenter mentioned other issues related to the current functionality
of the system but did not express disagreement with the proposed
requirement to report the data.
The MJO commenter mentioned that reporting to AQS was an
unnecessary burden on state air monitoring agencies because the EPA
Regional Offices receive these reports and the information is available
to the public on the EPA AMTIC Web site. The consulting firm did not
understand how shifting this burden to ``monitoring organizations''
would relieve the reporting burden on any organization other than the
EPA.
As mentioned in the proposal, the approach of reporting QAPP and
QMP information to AMTIC was not only time-consuming for monitoring
organizations but also for the EPA, which would work for 2 to 3 months to pull
together this annual report. By reporting the information directly to
AQS, the monitoring organization's requirements are also reduced since
they do not need to be polled every year to gather this information,
review it for accuracy and completeness, and transmit it to the EPA
Regional Office. The monitoring organizations will only need to report
updates to AQS when they occur and will not be burdened with this
request/review process every year.
In regard to the comment related to the current functionality of
AQS, which did not allow agency reporting of the QMP/QAPP information,
the EPA notes that AQS is now available for monitoring organizations
and EPA Regional Offices to report this information, which until now
has been reported and revised by the EPA. Therefore, rather than posting a
static table on AMTIC each year (which could change throughout the
time period between updates), AMTIC can host a link to the most up-to-
date information in AQS, which is a much more efficient method than the
cumbersome annual collection and reporting method described above.
Therefore, the EPA is finalizing the requirement as proposed.
The EPA proposed that if a PQAO or monitoring organization has been
delegated authority to review and approve their QAPP, an electronic
copy must be submitted to the EPA Regional Office at the time it is
submitted to the PQAO/monitoring organization's QAPP approving
authority. Submission of an electronic version to the EPA at the time
of completion is not considered an added burden on the monitoring
organization because such submission is already a standard practice as
part of the review process for technical systems audits (TSA).
The EPA did not receive any supporting or adverse comments to this
proposal, but did receive a state comment suggesting that a copy of all
approved QAPPs be submitted annually rather than at the time when a
QAPP is submitted or approved. The EPA notes that during recent systems
audits, EPA auditors have found language in approved QAPPs that does not
meet ambient air regulatory requirements. Non-conformance with a
regulatory requirement can lead to data invalidation. In an effort to
identify any non-conformance with regulatory requirements as early as
possible, especially with monitoring organizations that have been
delegated responsibility to approve their own
[[Page 17262]]
QAPPs, the EPA believes it is important to have the opportunity to
review these documents as early as possible to eliminate potential data
invalidation issues. Therefore, the EPA is finalizing this language as
proposed.
In the QAPP requirement language, the EPA proposed to clarify that
the QAPP include a list of sites and monitors associated with the QAPP.
The EPA received a state comment that considered it a burden to
update the QAPP every time a site or monitor is changed or added.
The commenter suggested adding that this information can be referenced
in other publicly available documents. Since this section allows
standard operating procedures to be referenced in the QAPP, the EPA
will also allow the referencing of monitors and sites.
The requirement to identify the sites/monitors in a QAPP is a
standard QAPP requirement, which is why it is included in the regulation.
However, the QAPP can refer to an official table that is updated
annually that may be on a Web site or other official documentation
(e.g., annual network plan). In addition, if the QAPP does contain this
information, an addendum to the QAPP modifying this information (with
reference to the QAPP) can be accomplished without having to physically
edit the document each time a monitoring site is added because the
addition of the site does not affect how the quality system is
implemented.
The EPA is finalizing the requirement as proposed, but is also
clarifying that sites and monitors may be allowed to be referenced from
other up-to-date sources.
The EPA proposed to add some clarifying language to the section
describing the National Performance Evaluation Program (NPEP) (current
section 2.4) explaining self-implementation of the performance
evaluation by the monitoring organization. The clarification also adds
the definition of ``independent assessment'' which is included in the
PM2.5-PEP, Pb-PEP and National Performance Audit Program
(NPAP) QAPPs, and is included in the self-implementation memo sent to
the monitoring organizations on an annual basis and posted on the AMTIC
Web site.\24\ The clarification codifies in regulation what was in
guidance, and provides a better reference for this information in
addition to the annual memo sent to the monitoring organizations.
---------------------------------------------------------------------------
\24\ See https://www.epa.gov/ttn/amtic/npepqa.html.
---------------------------------------------------------------------------
The EPA received one state comment in support of the addition of
the independent assessment definition and one state comment noting
concern.
The state comment of concern included a reference to the NPAP
revisions that are proposed below (section 3.1.3) and does not appear
to be related to the actual definition that was proposed in this
section. Further, we note that the commenting state qualifies as
eligible to conduct an ``independent assessment'' under the definition
as proposed, which is the same definition used in the annual
self-implementation decision memorandums that have been sent to
monitoring organizations since 2008. This definition has not
changed and was expected to be achieved by monitoring organizations in
order to self-implement the various performance evaluations defined in
this section. Therefore, the EPA is finalizing the requirement as
proposed.
The EPA proposed to add clarifying language to the TSA section
(current section 2.4). As described in more detail below, the current
TSA requirements are clearly intended to be performed at the monitoring
organization level.
The EPA proposed a TSA frequency of 3 years for each PQAO, but
included language that if a PQAO is made up of a number of monitoring
organizations, all monitoring organizations within the PQAO should be
audited within 6 years. This proposed language maintains the 3-year TSA
requirement as it applies to PQAOs but provided additional flexibility
for the EPA Regions to audit every monitoring organization within the
PQAO every 6 years. This revision was made to address logistical
concerns at the EPA Regions, particularly for those Regions with very
large PQAOs composed of many monitoring organizations. In the EPA's
view, the proposed revision did not materially affect the burden on
monitoring organizations.
The EPA received one state comment supporting the proposed revision
as written, one comment by a joint environmental organization
suggesting that we maintain the current requirement to audit each
monitoring organization on a 3-year basis, and two state comments that
suggested that the proposed revision was a burden to monitoring
organizations.
The comment from the joint environmental organization expressed
concern with the potential for reduced frequency of the TSAs for
monitoring organizations in consolidated PQAOs (proposed 6-year
frequency versus current 3-year frequency). The commenter believed such
a change could seriously jeopardize implementation of the Act and
threaten public health by delaying NAAQS decisions. The commenter cited
examples of recent invalidation of PM2.5 data that were
based on findings from TSAs. In their view, delaying audit frequencies
to once every 6 years (for a monitoring organization) raises the risk
of even greater delay and disruption of nonattainment designations in
areas that are violating NAAQS and have data quality issues at the
pertinent monitoring organizations.
Two commenters from state agencies felt that the proposed language
would treat these monitoring organizations (within a PQAO) as
individual entities, causing an increase in the number of TSAs and
difficulty in ensuring consistency among monitoring organizations
within the PQAO, and would disrupt monitoring organizations with the
scheduling of these audits. The PQAO staff would be required to oversee
the changes throughout the monitoring organizations, participate in
each of the TSAs, track all corrective actions, verify implementation,
and ensure consistency of implementation across all monitoring
organizations.
Commenters who were concerned with the proposed language to audit
individual monitoring organizations within a PQAO may have been
interpreting the current and earlier appendix A requirements somewhat
differently than the original intent of the EPA. Since 1996, the TSA
language in appendix A has been associated with auditing monitoring
agencies or monitoring organizations, not PQAOs (note--the PQAO term
was promulgated in 2006). For additional context, the following rule
excerpts provide a chronological history of the TSA language in
appendix A.
Prior to 1998: ``Agencies operating SLAMS network stations shall be
subject to annual EPA systems audits of their ambient air monitoring
program and are required to participate in EPA's National Performance
Audit Program.''
1998: ``Systems audits of the ambient air monitoring programs of
agencies operating SLAMS shall be conducted at least every 3 years by
the appropriate EPA Regional Office.''
2005: ``Systems audits of the ambient air monitoring programs of
agencies operating SLAMS shall be conducted at least every 3 years by
the appropriate Regional Office.''
2006-2014 (prior to this proposed change): ``Technical systems
audits of each ambient air monitoring organization shall be conducted
at least every 3 years by the appropriate EPA
[[Page 17263]]
Regional Office and reported to the AQS.''
The EPA notes that the current definition (40 CFR 58.1) of a
monitoring agency (prior to this proposal) was ``a state or
local agency responsible for meeting the requirements of this part.''
Monitoring organization was defined as a ``state, local, or other
monitoring organization responsible for operating a monitoring site for
which the quality assurance regulations apply.'' Neither definition
described any consolidation of agencies into a PQAO; therefore,
individual monitoring agencies or organizations were to receive a TSA
by the EPA Region annually prior to 1998 and every 3 years after 1998.
As indicated by one of the commenters who suggested that the
proposed language would treat monitoring organizations as individual
entities, the TSA language was, in fact, defined to treat the
monitoring agencies as individual entities. The value of this approach
has been reaffirmed by recent TSAs where Regional Office auditors have
found that monitoring organizations within consolidated PQAOs, in some
cases, did not operate consistent quality systems.
A commenter expressing concern about the proposed revision made the
point that all monitoring organizations covered under the umbrella of
the PQAO's quality system would have to make changes in their operation
each time a TSA at any of the monitoring organizations indicates an
issue with that monitoring organization's quality system. This comment
reflects a concern (and a tacit acknowledgement) that monitoring
organizations within a PQAO do not necessarily implement a consistent
quality system and need to be audited at some frequency. The commenter
is correct and the EPA agrees that an issue identified by a TSA at one
monitoring organization within the PQAO should be reviewed by the PQAO
to determine if corrective action should be instituted for all
monitoring organizations operating in the PQAO. That is the specific
concern that has driven the EPA's regulations to consistently require
TSAs at the monitoring organization level. The proposed TSA language
provides for this review of the PQAO every 3 years and of all
monitoring organizations within the PQAO within 6 years.
A state agency commenter was also concerned that TSAs could affect
the data certification process. The commenter was concerned that EPA
concurrence with a PQAO's data certification could be prohibited due to
the lack of a TSA within the appropriate time frame. The EPA notes that
TSA completeness requirements are reported on certification reports but
do not affect the concurrence process itself and, therefore, do not
penalize the PQAO if the TSA is not performed at the required
frequency.
In response to the comment from the joint environmental
organization and based on the recent findings in the TSAs,\25\ the EPA
Regions are applying more scrutiny to the PQAO requirements to ensure
that monitoring organizations consolidated in PQAOs develop and
document consistent quality practices. The EPA Headquarters and Regions
are working together to develop a more consistent TSA process based on
``lessons learned'' from the PM2.5 TSA findings identified
in the joint environmental organization comment. In addition, Regions
are scrutinizing PQAO quality systems to ensure a level of QA
consistency of monitoring organizations within a PQAO and, where there
are issues, either taking corrective actions or suggesting that
monitoring organizations within a PQAO disaggregate. The EPA has also
seen PQAOs developing better documents and training for monitoring
organizations within PQAOs to improve quality system consistency. Based
on the information presented above, the EPA believes that the proposal
to allow monitoring organizations within a PQAO to be audited within a
6-year period is reasonable and is finalizing the requirement as
proposed.
---------------------------------------------------------------------------
\25\ McCabe, Janet G. (2014). Particle Pollution Quality
Assurance. Memorandum to the Docket, EPA-HQ-OAR-2013-0619.
---------------------------------------------------------------------------
In summary, the revised regulation specifies that EPA Regional
Offices conduct TSAs of every PQAO at a 3-year frequency and that they
should also perform a TSA on all monitoring organizations within the
PQAO within 6 years. Where resources permit, the EPA encourages the
adoption of the practice of some PQAOs to perform their own agency-
specific TSAs and monitoring site visits on member monitoring agencies
in the intervening years between required EPA Regional Office TSAs.
Such visits can help to proactively identify potential QA deficiencies
before situations involving long-term data loss occur and can also
serve to assure uniformity in procedures across PQAOs through periods
of changing personnel, equipment, or EPA requirements.
The EPA proposed to require monitoring organizations to complete an
annual survey for the Ambient Air Protocol Gas Verification Program
(AA-PGVP) (current section 2.6.1). Since 2009, the EPA has had a
separate information collection request \26\ requiring monitoring
organizations to complete an annual survey of the producers that supply
their gas standards (for calibrations and QC) in order to be able to
select standards from these producers for verification. The survey
generally takes less than 10 minutes to complete. The EPA proposed to
add the requirement to complete the survey to appendix A.
---------------------------------------------------------------------------
\26\ See https://www.reginfo.gov/public/Forward?SearchTarget=PRA&textfield=ambient+air+protocol+gas.
---------------------------------------------------------------------------
The EPA received one consulting firm comment suggesting that entry
of data in the annual survey was a modest burden and another state
comment of support without further elaboration. There were no adverse
comments on completing the annual survey. Therefore, the EPA is
finalizing the language as proposed.
In addition, the EPA proposed to add language that monitoring
organizations participate, at the request of the EPA, in the AA-PGVP by
sending a gas standard to one of the verification laboratories no more
frequently than every 5 years. Since many monitoring organizations
already volunteer to send in cylinders, this proposed new requirement
is not expected to materially affect most agencies and will not affect
those agencies that do not run gaseous ambient air monitors and,
therefore, do not use gas standards.
The EPA received three state comments supporting and one MJO and
two state comments expressing concern about this aspect of the AA-PGVP
requirement. The supportive responses included one organization already
participating in the program and another that mentioned that the
independent verification of cylinder contents has value for monitoring
groups especially with respect to the lower target gas concentrations
now employed in QA procedures. A third response supported the action
with no additional comments. Comments expressing concern about the
proposal were related to the extra cost associated with shipping a
cylinder to the verification laboratory and the Department of
Transportation (DOT) training required for shipping the cylinder. One
commenter mentioned that the organizations are already required to use
traceable or certified gases and another suggested that the EPA could
also consider working with the standard gas vendors directly,
potentially through a federally funded gas certification and
verification program. A commenter suggested the
[[Page 17264]]
requirement is resource intensive because additional standard gases
will need to be maintained for use while the audited cylinder is not in
use.
By way of background relating to the genesis of the AA-PGVP, the
EPA notes that the Office of Research and Development (ORD) operated a
protocol gas audit program that was discontinued in 1997. In the mid-
2000s timeframe, the EPA received a number of comments from monitoring
organizations that the program was needed and the current program
(implemented in 2010) was created based on those comments. The
monitoring organizations were concerned that they were receiving
cylinders that were not meeting the protocol gas specifications even
though the producers, as one commenter mentioned, are required to use
traceable or certified gases. Information from a 2009 Office of
Inspector General report indicated some failures to meet protocol gas
requirements by some protocol gas producers.\27\ Gas producers were
also sharing concerns with the EPA that some producers were selling
cylinders that were not properly verified. Although the EPA initially
tried to develop a program that would be funded by the gas vendors,
many of whom agreed to fund it, one producer lodged a protest and the
EPA could not implement the program in this manner.
---------------------------------------------------------------------------
\27\ U.S. Environmental Protection Agency. ``EPA Needs an
Oversight Program for Protocol Gases,'' Office of Inspector General
Report No. 09-P-0235, 2009.
---------------------------------------------------------------------------
In addition, the AA-PGVP is intended to be a blind verification of
the producers, meaning it would be most advantageous for the producer
not to know a cylinder is being sent to a verification lab and,
therefore, the EPA tries not to request cylinders directly from gas
producers. Although one commenter suggested that the EPA receive
cylinders directly from the producer, this would defeat the purpose of
the blind verification and the producers would have the opportunity to
send a cylinder that may have had additional testing against its
certified value. The AA-PGVP has been implemented since 2010 and the
EPA is starting to see a drop in monitoring organization participation,
yet we also received positive comments that the program is valuable in
keeping the producers aware of the need to maintain the quality of
their gas standards.
In response to the comment expressing concern about the cost of
participating in the program and the logistical difficulty of properly
being certified to ship cylinders, the EPA clarifies that with the
current program, the EPA covers the cost of shipping the cylinders to
and from the regional AA-PGVP verification laboratory. Online DOT
training is offered to monitoring organizations and is valid for 3
years. So although the monitoring organization incurs an expense for
the time to train, there is limited burden related to the rest of
the program. The EPA is aware that additional standard gases will need
to be maintained for use while the new cylinder is being sent for
verification. Most monitoring organizations order new cylinders prior
to expiration of older cylinders or before they run out of gas supply.
There is normally a transition period where new cylinders are on hand
and checked against the current cylinder before retiring the older
cylinder. The AA-PGVP Implementation Plan \28\ explains that during this change-out process, if the new cylinder is ordered with enough lead time (the AA-PGVP estimates 30-45 days from shipping through verification and cylinder return), it can be sent to the AA-PGVP verification laboratory and verified before the monitoring organization needs to exchange it with the older cylinder.
---------------------------------------------------------------------------
\28\ https://www.epa.gov/ttnamti1/files/ambient/qaqc/aapgvpimpplan.pdf.
---------------------------------------------------------------------------
Based on the comments received and the EPA's clarifications of the
need for the current program, the EPA will codify the ICR requiring
monitoring organizations to report the gas standard producers they use
on an annual basis and also finalize the proposed language allowing the
agency to request cylinders from monitoring organizations no more
frequently than every 5 years.
3. Measurement Quality Checks for Gases
The EPA proposed to lower the audit concentrations (current section
3.2.1) of the one-point QC checks to between 0.005 and 0.08 parts per
million (ppm) for SO2, NO2, and O3
(currently 0.01 to 0.1 ppm), and to between 0.5 and 5 ppm for CO
monitors (currently 1 to 10 ppm). With the development of more
sensitive monitoring instruments with lower detection limits, technical
improvements in calibrators, and lower ambient air concentrations in
general, the EPA felt this revision would better reflect the precision
and bias of the ambient air data being measured at the site. Since the
QC check concentrations are selected using the mean or median
concentration of typical ambient air concentrations (guidance on this
is provided in the QA Handbook \29\), the EPA proposed to add some
clarification to the current language by requiring monitoring
organizations to select either the highest or lowest concentration in
the ranges identified if their mean or median concentrations are above
or below the prescribed range.
---------------------------------------------------------------------------
\29\ QA Handbook for Air Pollution Measurement Vol. II Ambient
Air Quality Monitoring Program at: https://www.epa.gov/ttn/amtic/qalist.html.
---------------------------------------------------------------------------
The majority of the comments received on appendix A (19 of 26 responding to the quality assurance proposal) related to this proposed change. One state and one consulting firm commenter expressed support for the change, but the majority of commenters expressed concern
(16 state commenters and one MJO). Most of the commenters expressed
similar technical concerns that:
• The SLAMS network is in place mainly for decisions related to the NAAQS, so QC checks should be at the levels approximating the NAAQS values.
• Some of the FRMs or FEMs that are still in use may operate acceptably at concentrations around the NAAQS, but the older versions of the approved monitors are not as sensitive at lower concentrations (i.e., mean or median concentrations), so QC checks at these lower levels are beyond the operational limits of the instrumentation.
• The instrumentation necessary to challenge the monitors at the lower concentrations (calibrators with additional mass flow controllers or gas cylinders of lower concentrations) would be required to comply and, therefore, represents an added expense and burden.
• The lower concentrations affect the percent difference statistic, so there is more chance that the QC check will fail the acceptance requirements and, therefore, invalidate data that the monitoring organization feels is of acceptable quality.
The EPA acknowledges these comments and has performed some
evaluations on 2013 hourly gaseous data that are summarized in a memo
placed in the docket.\30\ As summarized in the memo, the EPA generally
believes that challenging ambient air analyzers with a one-point QC
check at the level of the NAAQS provides an incomplete and potentially
inaccurate representation of the precision and bias of the data
actually reported to the AQS since, in most cases, the precision and
bias estimates are performed at levels that are above 99 percent of the
actual SLAMS data reported to AQS. The
EPA's analysis of QC check data shows that many monitoring agencies are
successfully meeting measurement quality objectives at lower
concentrations that are closer to the routine ambient data being
reported to AQS. We recognize that some of these QC checks may be
reported by monitoring organizations that have invested in the
technology (i.e., analyzers, calibration devices and standards at NCore
sites) necessary to adequately calibrate and estimate precision and
bias at routinely measured ambient concentrations. This analysis
demonstrates that the technology is available to measure and report
precision and bias at mean/median ambient air concentration levels.
---------------------------------------------------------------------------
\30\ Papp, M. (2015). Assessments of One-Point QC Data in
Response to Comments on Revisions to the Ambient Air Quality
Assurance Regulation contained in 40 CFR part 58, appendix A.
Memorandum to the Docket, EPA-HQ-OAR-2013-0619.
---------------------------------------------------------------------------
At the same time, the EPA is aware that there are monitoring
agencies that have not yet invested in some of these newer technologies
and/or may not believe that the operation of more sensitive
instrumentation and associated calibration equipment outside of the
NCore framework is necessary to meet their monitoring objectives. In
light of the comments received on this issue, the EPA will modify the
proposed changes to QC check requirements. Specifically, we are
finalizing the lower concentration ranges as proposed: 0.005 to 0.08 ppm for SO2, NO2, and O3, and 0.5 to 5 ppm for CO monitors. Additionally,
rather than requiring that the selected concentration be at the mean or median concentration at the site or across the agency's network of sites, the current flexibility to select the QC check gas concentration within the prescribed range will remain unchanged. Specifically, monitoring agencies should relate the concentration of the QC check to the monitoring objective of the site: SLAMS monitors primarily intended for NAAQS compliance should utilize concentrations at or near the level of the NAAQS (the higher end of the required range), while trace gas monitors operating at NCore, background or trends sites should use concentrations related to the mean or median of the ambient air concentrations normally measured at those sites in order to appropriately reflect the precision and bias at these routine concentration ranges. The EPA also clarifies that if the
mean or median concentrations at trace gas sites are below the method
detection limits (MDL) of the instrument, or if concentrations are
above the prescribed range, the agency can select the lowest or highest
concentration in the range that can be practically achieved. In
addition, the EPA will keep language suggesting that an additional QC
check point is encouraged for those organizations that may have
occasional high values or would like to confirm monitor linearity at
the higher end of the operational range. The EPA will also encourage monitoring organizations operating NAAQS compliance sites to include additional QC checks around the mean or median values.
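To illustrate the finalized flexibility, the following sketch (illustrative only and not regulatory text; the range bounds are those named above, while the clamping logic is one plausible reading of the clarification) shows how a monitoring agency might choose a one-point QC check concentration based on the site objective:

    # Illustrative sketch: choosing a one-point QC check concentration
    # under the finalized appendix A flexibility. The range bounds come
    # from the preamble above; the selection logic is an assumption.
    QC_RANGES_PPM = {"SO2": (0.005, 0.08), "NO2": (0.005, 0.08),
                     "O3": (0.005, 0.08), "CO": (0.5, 5.0)}

    def select_qc_concentration(pollutant, site_objective,
                                naaqs_level=None, mean_ambient=None):
        low, high = QC_RANGES_PPM[pollutant]
        if site_objective == "NAAQS":
            # SLAMS compliance sites: at or near the NAAQS level,
            # toward the higher end of the required range.
            target = naaqs_level if naaqs_level is not None else high
        else:
            # NCore/background/trends sites: the mean or median of
            # routinely measured ambient concentrations.
            target = mean_ambient
        # If the target falls outside the prescribed range, select the
        # nearest achievable bound, as the preamble clarifies.
        return min(max(target, low), high)

    # An O3 trends site averaging 0.002 ppm (below the range) would
    # use the 0.005 ppm lower bound.
    print(select_qc_concentration("O3", "trends", mean_ambient=0.002))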
The EPA believes that providing monitoring organizations some
flexibility in determining the QC check concentration range based on
site monitoring objective and the sensitivity of their monitors should
address the concerns that were noted in the comments on this aspect of
the proposed requirement. However, the EPA reiterates that our analysis
of reported data has shown that monitoring agencies can test and
achieve acceptable precision and bias results at these lower
concentration levels. Providing data users with estimates of precision
and bias where the majority of our ambient air data are measured is an
EPA programmatic goal and monitoring organizations should be working
with the EPA Regional Offices to develop the budgets necessary for
purchasing the updated equipment and revising related procedures. The
EPA will continue to endorse this approach to make the QC checks more
meaningful and we will consider future revisions to appendix A to
either require QC checks at two concentration levels (i.e., one around
the mean concentrations and one related to the NAAQS) or require the
span check \31\ to be reported to AQS. In addition, to alleviate
concerns about failing the acceptance criteria at lower QC
concentrations, the EPA will evaluate suggestions by monitoring
organizations to raise acceptance criteria or look at alternative
acceptance criteria (e.g., difference instead of percent difference).
Since acceptance criteria are included in guidance, the EPA will have
the opportunity to perform the evaluations without affecting the
regulation. In 2011, the EPA developed similar guidance for lower
concentration levels of the annual performance evaluation audits.\32\
---------------------------------------------------------------------------
\31\ A check similar to the QC check but implemented at a
concentration closer to the higher end of the calibration range of
the monitor.
\32\ https://www.epa.gov/ttnamti1/files/ambient/pm25/datamang/20110217lowlevelstatmemo.pdf.
---------------------------------------------------------------------------
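For readers weighing the percent-difference concern, the following arithmetic sketch (all values are hypothetical) shows why a fixed absolute measurement error yields a larger percent difference at lower QC check concentrations:

    # Hypothetical arithmetic: a fixed absolute error inflates the
    # percent difference statistic as the audit concentration drops,
    # which is the commenters' concern about acceptance criteria.
    def percent_difference(measured, audit):
        return (measured - audit) / audit * 100.0

    absolute_error = 0.002  # ppm; an assumed, concentration-independent error
    for audit_conc in (0.08, 0.01, 0.005):  # ppm, within the finalized range
        pd = percent_difference(audit_conc + absolute_error, audit_conc)
        print(f"audit {audit_conc} ppm -> {pd:.1f}% difference")
    # The same 0.002 ppm error is 2.5% at 0.08 ppm but 40% at 0.005 ppm,
    # which is why an absolute-difference criterion may be evaluated.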
The EPA proposed to remove reference to zero and span adjustments
(current section 3.2.1.1) and revise the one-point QC language to
simply require that the QC check be conducted before any calibration or
adjustment to the monitor. Recent revisions of the QA Handbook discourage frequent span adjustments, so the proposed language clarifies that no adjustment should be made prior to implementation of the one-point QC check.
There were no comments made on this proposed revision so the EPA is
finalizing this revision as proposed.
The EPA proposed to remove the requirement (current section 3.2.2)
to implement an annual performance evaluation for one monitor in each
calendar quarter when monitoring organizations have fewer than four
monitoring instruments. The minimum requirement for the annual
performance evaluation for the primary monitor at a site is one per
year. The current regulation requires evaluation of 25 percent of the
monitors per quarter so that the performance evaluations are performed
in all four quarters. There are cases where some monitoring
organizations have fewer than four primary monitors for a gaseous
pollutant, and the current language suggests that a monitor already
receiving a performance evaluation be re-audited to provide for
performance evaluations in all four quarters. This proposed removal of
the requirement for evaluation in every quarter reduces the burden for
monitoring agencies operating smaller networks and does not change the
requirement of an annual performance evaluation for each primary
monitor.
The EPA received one state comment in support of this revision and
no adverse comments. Therefore, the EPA is finalizing this revision as
proposed.
The current annual performance evaluation language (current section
3.2.2.1) requires that the audits be conducted by selecting three
consecutive audit levels (currently five audit levels are provided in
appendix A). Due to the implementation of the NCore network, the
inception of trace gas monitors, and generally lower ambient air
concentrations being measured, there is a need for audit levels at
lower concentrations to more accurately represent the uncertainties
present in much of the ambient data. The EPA proposed to expand the
audit levels from five to ten and remove the requirement to audit three
consecutive levels. The previous regulation suggested that the three
audit levels bracket 80 percent of the ambient air concentrations
measured by the analyzer, and monitoring organizations have requested
the use of an audit point to establish monitor accuracy around the
NAAQS levels. Therefore, the EPA proposed to revise the language so
that two of the audit levels selected
represent 10-80 percent of routinely-collected ambient concentrations
either measured by the monitor or in the PQAO's network of monitors. The
proposed revision allowed the third point to be selected at the NAAQS
level (e.g., 75 ppb for SO2) or above the highest 3-year
routine hourly concentration, whichever was greater.
One state commenter and a consulting firm supported this proposal
while six state commenters voiced concern. The comments expressing
concern were similar to comments made on the one-point QC check
proposal described earlier, including:
• The SLAMS network is in place mainly for decisions related to the NAAQS, so QC checks should be at the levels approximating the NAAQS values.
• Some of the FRMs or FEMs that are still in use may operate acceptably at concentrations around the NAAQS, but these older methods are not as sensitive at lower concentrations (i.e., mean or median concentrations), so QC checks at these lower levels are beyond the limits of the instrumentation.
• The instrumentation necessary to challenge the monitors at the lower concentrations (calibrators with additional mass flow controllers or gas cylinders of lower concentrations) would be required to comply and, therefore, represents an added expense and burden.
• The lower concentrations affect the percent difference statistic, so there is more chance that the QC check will fail the acceptance requirements and, therefore, invalidate data that the monitoring organization feels is of acceptable quality.
The EPA believes that there are some distinctions between the
annual performance evaluations and the one-point QC checks, and
although the comments on the proposed revisions are similar, a
different response to the comments is appropriate as explained below.
Where monitoring organizations typically utilize standards and equipment at each site to run one-point QC checks, the annual performance evaluations require less equipment since, in many cases, one set (or a few sets) of independent equipment is used to audit all sites in a network. Accordingly, the EPA believes that it is
practical for monitoring agencies to procure and utilize audit
equipment, including calibrators and gas standards that are capable of
generating the lower concentrations that are typically measured at most
sites in the U.S. Indeed, all monitoring agencies that operate NCore
multi-pollutant stations should already own and be proficient in the
operation of such equipment, as the objectives of the NCore stations and
the technology used (i.e., trace level gas monitors) are oriented to
characterizing typical ambient concentrations.
In order to make the requirements easier to comprehend and allow
for more flexibility in audit point selection, the EPA will revise the
proposed language to require three points to be selected: one point around two to three times the method detection limit of the instruments within the PQAO network; a second point less than or equal to the 99th percentile of the data at the site or the network of sites within a PQAO, or the next highest audit concentration level; and a third point around the primary NAAQS or the highest 3-year concentration at the site or the network of sites in the PQAO. This framework provides two
audit points that reflect 99 percent of the monitoring data and a third
point at the highest 3-year concentration or the level of the NAAQS,
whichever concentration the monitoring organization chooses. Since
performance evaluation audits are only performed once a year at each
site, the burden to perform these audits at suitable concentrations is
reduced relative to the QC checks. Therefore, the revised audit
approach should provide the flexibility requested by the commenters.
Also, in 2011, the EPA adopted a more flexible acceptance criterion for the two lower concentration audit levels (an option to use difference instead of percent difference) \33\ that is not influenced by concentration, which should alleviate commenters' concerns about acceptance criteria at the lower audit levels. Accordingly, the EPA is
finalizing the changes to performance audit requirements as described
above.
---------------------------------------------------------------------------
\33\ https://www.epa.gov/ttnamti1/files/ambient/pm25/datamang/20110217lowlevelstatmemo.pdf.
---------------------------------------------------------------------------
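A compact sketch of the finalized three-point selection follows (the selection rules come from the paragraph above; the 2.5x MDL multiplier, the synthetic data, and the choice of the higher third point are assumptions for illustration):

    # Sketch of the finalized audit-point selection; inputs are hypothetical.
    import numpy as np

    def select_audit_points(mdl, hourly_data, naaqs_level, three_year_max):
        point1 = 2.5 * mdl                        # two to three times the MDL
        point2 = np.percentile(hourly_data, 99)   # <= 99th percentile of data
        # Around the primary NAAQS or the highest 3-year concentration;
        # the higher of the two is used here, though the monitoring
        # organization may choose either.
        point3 = max(naaqs_level, three_year_max)
        return point1, point2, point3

    rng = np.random.default_rng(0)
    hourly = rng.gamma(shape=2.0, scale=1.5, size=8760)  # synthetic SO2, ppb
    print(select_audit_points(0.2, hourly, naaqs_level=75.0,
                              three_year_max=40.0))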
The EPA proposed to revise the language (current section
3.2.2.2(a)) addressing the limits on excess nitric oxide (NO) that must
be followed during gas phase titration (GPT) procedures involving
NO2 audits. The previous NO limit (maintaining at least 0.08 ppm NO) was restrictive and required auditors to make numerous mid-audit adjustments during a GPT, making the NO2 audit a time-consuming procedure. Accordingly, we
proposed a more general statement regarding GPT that acknowledges the
ongoing usage of monitoring agency procedures and guidance documents
that have successfully supported NO2 calibration activities.
The EPA received one state comment in support of the proposed
revision to the language on excess NO and no adverse comments.
Therefore, the EPA is finalizing this revision as proposed.
The EPA proposed to remove language (current section 3.2.2.2(b)) in
the annual performance evaluation section that required Regional
approval for audit gases for any monitors operating at ranges higher than 1.0 ppm for O3, SO2 and NO2 and greater than 50 ppm for CO. The EPA does not need to approve a monitoring organization's use of audit gases to audit above these
concentration levels. Since data reported to AQS above the highest
level may be flagged or rejected, the EPA proposed that PQAOs notify
the EPA Regional Office of sites being audited at concentrations above
level 10 so that reporting accommodations can be made.
The EPA did not receive any comments on this proposed change.
Therefore, the EPA is finalizing this revision as proposed.
The EPA proposed to provide additional explanatory language in
appendix A to describe the NPAP. The NPAP has been a long-standing
program for the ambient air monitoring community. Since 2007, the EPA
has distributed an annual decision memorandum to all monitoring
organizations in order to determine whether the monitoring organization
plans to self-implement the NPAP program or utilize the federally
implemented program. To inform this decision, the annual decision memorandum describes the NPAP adequacy and independence requirements. The EPA proposed to include these same requirements in
appendix A in a separate section for NPAP. In addition, the annual
decision memorandum stated that 20 percent of the sites would be
audited each year so that all sites would be audited in a 5-year
period. Since there is a possibility that monitoring organizations may
want certain higher priority sites audited more frequently, the EPA
proposed to revise the language to require all sites to be audited
within a 6-year period to provide more flexibility and discretion for
monitoring agencies. This revision does not change the number of sites
audited in any given year, but allows for increased frequency in
auditing sites deemed high priority.
The EPA received one state comment and one consulting firm comment
supporting this action and two state comments expressing concern. One
commenter supported it without any additional comment while another
made the point that the clarification simply added the definition of an
``independent assessment,'' which has been widely circulated and
understood
by state, local and tribal monitoring organizations for several years
and is neutral with respect to burden. One state commenter mentioned
that the proposed additions have changed the requirements for
demonstrating independence and adequacy that were originally outlined
in the memorandum, ``National Performance Audit Program/
PM2.5 Performance Evaluation Program Implementation Decision
Memorandum for Calendar Year 2008,'' by implementing training
requirements, requiring separate audit equipment, and adding a
requirement to perform a whole system check tested against an
independent and qualified lab. The commenter suggested that the
proposed changes impact the costs for the PQAO to implement the NPAP.
A state commenter suggested that the description for NPAP was
``inconsistent with what had been conveyed in the past and is more
pertinent for the performance audit.'' The commenter also suggested
that proposed sections 3.1.3.4(a)-(f) be removed and retained in
guidance (annual memorandum). However, both the 2008 version of the QA Handbook and the current 2013 version define a Performance Evaluation as a type of audit in which the quantitative data generated in a measurement system are obtained independently and compared with routinely obtained data to evaluate the proficiency of an analyst or a laboratory, and both versions include NPAP in this definition. Another state
commenter also raised questions as to the objective of the program and
suggested that the NPAP objective is already being accomplished with
the annual performance evaluation.
In response to the comment on changes in the NPAP requirements since the 2008 NPAP memo, the EPA notes that each year it requests that monitoring organizations decide whether to self-implement the NPAP program, based on the current year's decision memorandum, or allow for federal implementation of the program. The proposed regulatory language reflects requirements that have been included in the decision memoranda for the past several years and that the EPA expected monitoring organizations to follow in order to self-implement.
The EPA disagrees that the NPAP objectives have changed since the
inception of the program. Early versions of NPAP included cylinders of
unknown concentration being sent to monitoring organizations (mailed
audits) who would challenge the analyzers with these standards and send
the results back to the EPA for evaluation. This process was ``blind,''
meaning that the monitoring organization did not know the concentration
of the standard they were auditing. It was completely independent of
monitoring organization implementation and also established
independence of the concentration being audited. At the same time the
NPAP mailed audits were conducted, monitoring organizations continued
to implement their annual performance evaluations. So, both NPAP and
the annual performance programs have been implemented at the same time
and NPAP, having a different objective, allowed for a level of
independent auditing by the EPA. Due to complaints about the length of time required to get results back from the NPAP ``mailable'' program, the EPA instituted the current NPAP through-the-probe program while continuing its primary objective: providing independent,
quantitative evaluations of data quality. Since the majority of
monitoring organizations allow for federal implementation, which is
reliably independent of monitoring organization implementation (only
two monitoring organizations in the country self-implement NPAP), the
EPA identified the requirements necessary for self-implementing monitoring organizations to maintain a level of independence and data quality consistency as close as possible to that of federal implementation. Therefore, while one commenter suggested that the training requirements be revised to ensure that auditors have been trained in the procedures that PQAOs actually employ to satisfy this requirement, the EPA believes that the required training should reflect the federal program in order to establish consistency in data quality across the NPAP program.
The EPA provides the opportunity for monitoring organizations to make
the self-implementation decision each year based on the requirements in
the decision memorandum, which ensures the NPAP program is equitably
and consistently implemented across all monitoring organizations.
Therefore, the EPA is finalizing this revision as proposed, but is also
providing some flexibility as requested in a state comment by inserting
the following language into the relevant section of appendix A:
OAQPS, in consultation with the relevant EPA Regional Office,
may approve the PQAO's plan to self-implement NPAP if the OAQPS
determines that the PQAO's self-implementation plan is equivalent to
the federal programs and adequate to meet the objectives of national
consistency and data quality.
4. Measurement Quality Checks for Particulate Monitors
The EPA proposed to require that flow rate verifications (current
section 3.2.3) be reported to AQS. Particulate matter concentrations
(e.g., PM2.5, PM10, Pb) are reported in mass per
unit of volume ([mu]g/m\3\). Flow rate verifications are implemented at
required frequencies in order to ensure that the PM sampler is
providing an accurate and repeatable measure of volume that is critical
for the determination of concentration. If a given flow rate
verification does not meet acceptance criteria, the EPA guidance
suggests that data may be invalidated back to the most recent
acceptable verification, which is why these checks are performed at
higher frequencies. Implementation of the flow rate verification is
currently a requirement, but reporting to AQS has only been a
requirement for PM10 continuous instruments. This is the only QC requirement in appendix A whose reporting was not required for all PM pollutants, and this inconsistency has been a cause of confusion. When
performing TSAs, the EPA Regional Offices review the flow rate
verification information. There are cases where it is difficult to find
the flow rate verification information to ascertain completeness, data
quality, and whether corrective actions have been implemented in the
case of flow rate verification failures. In addition, the EPA Regions
have mentioned that some of the monitoring organizations have been
voluntarily reporting these data to AQS in an effort to increase
transparency and reliability in data quality. In a recent review of
2012 data, out of the 1,110 SLAMS PM2.5 samplers providing
flow rate audit data (which are required to be reported), flow rate
verification data were also reported for 543 samplers, or about 49 percent. With the
development of a new QA transaction in AQS, we believe that the
reporting of flow rate verification data would improve the evaluation
of data quality for data certification and at national levels, and provide a consistent interpretation of the regulation for all PM pollutants
without being overly burdensome (approximately 12 data points per
sampler per year).
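As a minimal sketch of the check being reported (the ±4 percent acceptance limit is an assumed, guidance-style value for illustration, not a requirement stated here):

    # A flow rate verification compares the sampler's indicated flow
    # against a certified transfer standard; roughly 12 results per
    # sampler per year would be reported to AQS under this revision.
    def flow_verification(indicated_lpm, standard_lpm, limit_pct=4.0):
        pct_diff = (indicated_lpm - standard_lpm) / standard_lpm * 100.0
        return pct_diff, abs(pct_diff) <= limit_pct  # assumed +/- limit

    # A PM2.5 sampler at its 16.67 L/min design flow:
    print(flow_verification(16.40, 16.67))  # (~-1.6 percent, passes)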
The EPA received one state comment in support of this revision and
no adverse comments. Therefore, the EPA is finalizing this revision as
proposed.
In addition, the flow rate verification requirements for all the
particulate monitors suggest randomization of the implementation of
flow rate verifications with respect to time of day, day of the week
and routine service and
adjustments. Since this is a suggestion, the EPA proposed to remove
this language from the regulation and instead include it in QA
guidance.
The EPA noted that one consulting firm voiced concern about
removing the suggestion for randomizing flow rate verifications. They
stated that the ``randomization of QC procedures is a critical aspect
of QA currently unacknowledged by the EPA, and that single point
(precision) checks of gaseous monitors and flow rate verification
checks on PM samplers are crucial to characterizing the precision, bias
and accuracy of the data arising from those instruments. Diurnal and
weekly rhythms exist in solar radiation, temperature, humidity,
electrical power and traffic patterns. As standards decrease and
monitoring instrumentation becomes more sensitive, the likelihood
increases that interferences will occur in those instruments. One means
of detecting such biases involves randomized QC checks since they occur
out-of-sync with daily/weekly rhythms.''
The EPA agrees with the technical rationale for randomization
provided by the commenter, but also received comments that the
regulation should provide requirements and that suggested practices
should be referenced in guidance documents. Therefore, the EPA is
finalizing this revision as proposed and will include the randomization
suggestion in the next revision of the QA Handbook and in the
PM2.5 method.
The EPA proposed to add clarifying language to the PM2.5
collocation requirements (current section 3.2.5) that a site can only count toward the collocation requirement for the method designation of the primary monitor at that site. Precision is estimated at the PQAO level and
required at 15 percent of the primary monitor sites for each method
designation. When developing the collocation requirements, the EPA
intended to have the collocated monitors distributed to as many sites as possible in order to capture the temporal and spatial variability in the PQAO, given that only 15 percent of the primary monitors within a method designation are collocated. Therefore, since
there can be only one primary monitor at a site for any given time
period, it was originally intended that the primary monitor and the QA
collocated monitor (for the primary) at a monitoring site count as one
collocation. This revision does not change the current regulation and
does not increase or decrease burden, but is intended to provide
clarity on how the PQAO identifies the number and types of monitors
needed to achieve the collocation requirements.
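For illustration, a sketch of how a PQAO might count required collocations under this clarification (rounding up and a minimum of one collocation per method designation are assumptions here; the rule text governs the actual computation):

    import math

    def required_collocations(primary_monitors_by_method):
        # 15 percent of primary monitors per method designation, with one
        # collocation counted per site (the clarification finalized here).
        return {method: max(1, math.ceil(0.15 * n))
                for method, n in primary_monitors_by_method.items()}

    # A hypothetical PQAO with two PM2.5 method designations:
    print(required_collocations({"method_A": 12, "method_B": 5}))
    # -> {'method_A': 2, 'method_B': 1}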
The EPA received one state and one consulting firm comment
supporting this clarification and two state comments expressing
concern.
One commenter expressing concern did not support specifically
forbidding collocation of multiple particulate monitors at a single
site and made the following points. As the NCore sites were designed to
provide a large suite of monitoring, the commenter felt it was an ideal
location to deploy a range of instruments. The commenter mentioned,
``where the array of PM10-2.5 monitors at a monitoring site
include a PM2.5 FRM as the primary monitor, the operation of
the continuous PM2.5 FEM is advantageous for collocation
across the network. For the EPA not to allow this collocation directly
contradicts the goal of the proposed rule by placing additional
compliance and operating burdens on monitoring organizations and
network operators.'' A second commenter mentioned that the proposed
``new requirement could result with the discontinuing a sampler at one
location and creating more upkeep and maintenance for the samplers at
different locations.''
The EPA notes that the proposed language does not represent a new
requirement, is not a revision to the current requirement, and merely
represents a needed clarification of the current language because some
monitoring organizations were misinterpreting the original language by
allowing one site to provide multiple collocations. Since the original
language identified that collocation for appendix A purposes requires
the QA collocated monitor to be compared against the primary monitor at
a site, and since there can only be one primary monitor at a site at
any particular time, the EPA believes that the original language and
intent were clear. Based on data assessments of collocated data in AQS,
most monitoring organizations follow this requirement. Since the
current requirement states that 15 percent of the primary monitors in
each method designation must be collocated, and there can only be one
primary monitor at a site, the current regulation (without the
clarifying language) allows only one collocation to count for a given
site. When the EPA became aware of potential confusion on this issue in
2010, we provided guidance to both the EPA Regions and monitoring
community through the QA EYE newsletter (Issue 9, page 3).\34\ The
article and the table, which was based on the number of sites in a
monitoring organization, were developed to articulate the intent of the
regulation.
---------------------------------------------------------------------------
\34\ https://www.epa.gov/ttnamti1/qanews.html.
---------------------------------------------------------------------------
The EPA supports the use of multiple monitors at sites like NCore,
as one commenter suggested, for testing and evaluation purposes but not
for conforming to the original appendix A requirements. However, as
articulated in the current appendix A regulation, a collocated monitor
can be used to achieve collocation requirements for more than one
pollutant. For example, collocated manual PM10-2.5 monitors
could be used to satisfy PM2.5 collocation, PM10
collocation, as well as PM10-Pb collocation. Therefore, the
EPA is adding the clarification as proposed to ensure that the current
requirement is not misinterpreted.
The EPA proposed to provide more flexibility to monitoring
organizations when selecting sites for collocation. Appendix A (current
section 3.2.5.3) had required that 80 percent of the collocated
monitors be deployed at sites within ±20 percent of the
NAAQS and if the monitoring organization did not have sites within that
range, then 60 percent of the sites were to be deployed among the
highest 25 percent of all sites within the network. Monitoring
organizations found this difficult to achieve. Some monitoring organizations did not have many sites and, at times, permission, access, and limited space issues made the requirement unachievable.
Realizing that the collocated monitors provide precision estimates
for the PQAO (since only 15 percent of the sites for each method
designation are collocated), while also acknowledging that sites that
measure concentrations close to the NAAQS are important, the EPA
proposed to require that 50 percent (down from 80 percent) of the
collocated monitors be deployed at sites within ±20 percent
of the NAAQS and, if the PQAO did not have sites within that range,
then 50 percent of the sites are to be deployed among the highest sites
within the network. Although this requirement does not change the
number of sites requiring collocation, it does provide the PQAO
additional flexibility in its choice of collocated sites.
The EPA received three state comments and one consulting firm
comment in general support of this proposal and no comments expressing
concern.
As with the previous requirement, the EPA has a cut-off value of 3
[mu]g/m\3\ for data used in evaluations of precision and bias, meaning
that only data equal to or greater than 3 [mu]g/m\3\ are used in
estimates of precision and bias. This did
not change in the proposed regulation. Our expectation is that
monitoring organizations will site collocated monitors in such a manner
that they will likely collect collocated samples from sites that have
values equal to or greater than 3 [mu]g/m\3\. One commenter was
concerned about ``clean'' days that are below the 3 [mu]g/m\3\
threshold since the employment of this threshold would affect data
completeness by excluding pairs on cleaner days. The EPA notes,
however, that completeness is not calculated solely on data pairs with
concentrations equal to or greater than 3 [mu]g/m\3\, but on all valid
collocated pairs (valid pairs below 3 [mu]g/m\3\ are expected to be
reported to AQS). Therefore, as long as the monitoring agency collects
and reports all collocated data at the required frequency, data
completeness is not an issue.
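The distinction can be sketched as follows (pair values are invented; the 3 [mu]g/m\3\ threshold is the cut-off discussed above):

    # Completeness counts all valid collocated pairs, while precision and
    # bias statistics use only pairs with both values >= 3 ug/m3.
    THRESHOLD = 3.0  # ug/m3

    def valid_pairs(pairs):
        return [p for p in pairs if None not in p]  # all valid pairs count

    def precision_pairs(pairs):
        return [(x, y) for x, y in valid_pairs(pairs)
                if x >= THRESHOLD and y >= THRESHOLD]

    pairs = [(2.1, 2.4), (8.9, 9.3), (3.0, 3.2), (1.0, None)]
    print(len(valid_pairs(pairs)))      # 3 pairs count toward completeness
    print(len(precision_pairs(pairs)))  # 2 pairs feed precision estimates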
Another state commenter, in support of the proposal, suggested that
the highest concentration site be selected for the first collocation
and, if a second site is needed, then the second highest site be
selected, and so on. While this is an alternative approach, the initial
rationale for the revision was to provide more flexibility in site
selection in cases where some sites (for example the highest
concentration site) had access problems or some other issue that did
not make it a good candidate for collocation. The wording in the
proposed regulation is meant to ensure that some of the sites selected
for collocation represent the locations with the highest concentrations
in the respective monitoring agency's network while providing the
flexibility to choose among those sites.
Since there was general support for the proposal with no adverse
comments, the EPA is finalizing this revision as proposed.
5. Calculations for Data Quality Assessment
In order to provide reasonable estimates of data quality, the EPA
uses data above an established threshold concentration usually related
to the detection limits of the measurement. Measurement pairs are
selected for use in the precision and bias calculations only when both
measurements are greater than or equal to a threshold concentration.
For many years, the threshold concentration for Pb precision and
bias data was 0.02 [mu]g/m\3\. The EPA promulgated a new Pb FRM (78 FR
40000) utilizing the Inductively Coupled Plasma Mass Spectrometry (ICP-
MS) analysis technique in 2013 as a revision to appendix G of 40 CFR
part 50.\35\ This new FRM demonstrated MDLs \36\ below 0.0002 [mu]g/
m\3\, which is well below the EPA requirement of 5 percent of the
current Pb NAAQS level of 0.15 [mu]g/m\3\, or 0.0075 [mu]g/m\3\. As a
result of the increased sensitivity inherent in this new FRM, the EPA
proposed to lower the acceptable Pb concentration (current section 4)
from the current value of 0.02 [mu]g/m\3\ to 0.002 [mu]g/m\3\ for
measurements obtained using the new Pb FRM and other more recently
approved equivalent methods that have the requisite increased
sensitivity.\37\ The current 0.02 [mu]g/m\3\ value will be retained for
the previous Pb FRM that has subsequently been re-designated as FEM
EQLA-0813-803, as well as older equivalent methods that were approved
prior to the more recent work on developing more sensitive methods.
Since ambient Pb concentrations are lower and methods more sensitive,
lowering the threshold concentration will allow more collocated data to
be evaluated, which will provide more representative estimates of
precision and bias at current ambient Pb levels.
---------------------------------------------------------------------------
\35\ See 78 FR 40000, July 3, 2013.
\36\ MDL is described as the minimum concentration of a
substance that can be measured and reported with 99 percent
confidence that the analyte concentration is greater than zero.
\37\ FEMS approved on or after March 4, 2010, have the required
sensitivity to utilize the 0.002 [mu]g/m\3\ reporting limit with the
exception of manual equivalent method EQLA-0813-803, the previous
FRM based on flame atomic absorption spectroscopy.
---------------------------------------------------------------------------
The EPA received one state comment and one consulting firm comment
in support of the proposal and one state comment expressing concern.
The comment expressing concern related to a perception that data
would be lost due to the increased possibility that data quality
objectives (DQO) would not be met with the decreased threshold
concentration. The commenter believed the change would increase the
likelihood that collocated data would not meet the 20 percent
coefficient of variation (CV) limit for precision as specified in
appendix A, section 2.3.1.3. This would in turn decrease data
completeness and, if data loss is great enough, could potentially
render the data from an entire monitoring location useless for NAAQS
compliance determinations.
The EPA notes that invalidation of routine data based solely on the
variability of collocated monitoring data is not required or
recommended. The data validation guidance in the QA Handbook, which
many monitoring organizations use to develop validation criteria,
allows for these data to be reviewed in the context of other QC samples
before decisions to invalidate data are made. Since the collocated data
are only collected at approximately 15 percent of the monitoring sites,
the data set is meant to reflect the precision of the PQAO monitoring
network and not to evaluate the validity of data from individual sites.
Site data can be used to troubleshoot causes of variability and to take corrective actions, but are not intended to invalidate routine monitoring data unless a significant systemic issue is discovered.
Based on the comment noted above, the EPA performed an evaluation
of collocated Pb data collected in calendar years 2011-2013 to evaluate
the amount of collocation information available when using the two
reporting thresholds. In that time period, 7,063 collocated
measurements were taken. Within this data set, there were 2,521 data
pairs where both values were equal to or greater than 0.02 [mu]g/m\3\
(i.e., only about 35 percent of the information collected could be used
to estimate precision). In the most pertinent examples, there were
cases where monitoring organizations collected valid ambient data and
no collocated data could be used due to the current higher threshold.
For example, one monitoring organization collected 173 collocated
measurements and no value was equal to or greater than 0.02 [mu]g/m\3\
and, therefore, there was no estimate of precision reported for this
monitoring organization for a 3-year period. There were eight
monitoring organizations that could not use any collocated results for
2011-2013 and 22 monitoring organizations (about 50 percent of the
monitoring organizations) that had less than 25 percent of their data
used. In contrast, if the same data set is used, but the threshold is
reduced to the proposed value of greater than or equal to 0.002 [mu]g/
m\3\, then 6,418 measurements are available, which increases precision
data availability from 35 percent to 91 percent. As an example, the
monitoring organization that had no collocated values (173
measurements) equal to or greater than 0.02 [mu]g/m\3\ had the number
of available pairs increased to 172 with the lower 0.002 [mu]g/m\3\
threshold and had a precision estimate CV of 16.43, which is within the
3-year DQO goal of 20 percent.
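A hedged sketch of the kind of evaluation described above follows (the pair data are invented; the aggregation shown is the familiar collocated-precision form based on paired relative percent differences, and the 90 percent upper confidence bound that appendix A applies is omitted for brevity):

    import math

    def precision_cv(pairs, threshold):
        # Keep only pairs with both values at or above the threshold.
        d = [(x - y) / ((x + y) / 2.0) * 100.0
             for x, y in pairs if x >= threshold and y >= threshold]
        n = len(d)
        if n < 2:
            return None  # no precision estimate possible
        sum_d = sum(d)
        sum_d2 = sum(di * di for di in d)
        # Core aggregation of paired relative differences; appendix A's
        # chi-squared confidence-bound factor is intentionally omitted.
        cv = math.sqrt((n * sum_d2 - sum_d ** 2) / (2.0 * n * (n - 1)))
        return cv, n

    pairs = [(0.004, 0.005), (0.010, 0.009), (0.003, 0.003), (0.015, 0.016)]
    print(precision_cv(pairs, threshold=0.02))   # None: every pair excluded
    print(precision_cv(pairs, threshold=0.002))  # CV computed from 4 pairs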
The EPA acknowledges that using a lower threshold concentration will increase the estimated CV (i.e., indicate greater imprecision) since the required CV statistic is derived from the percent difference. When EPA evaluated the Pb
data quality objectives to determine acceptable precision and bias for
the new standard, we evaluated all collocated data in AQS including the
lower concentration data.\38\ Since the collocated data are actual
samples, they include measurement uncertainty for all phases of the
measurement system including variability in EPA-provided filters,
sample handling, sampler flow differences, plumes from sources,
laboratory contamination, as well as other types of measurement
uncertainty mentioned by one commenter. In fact, the goal of the
collocation is to provide an estimate of overall measurement
imprecision between two sampling systems that are, in theory, sampling
the same air. So although the commenter identifies this as a concern,
providing a measure of the overall precision of the measurement system
is what the collocated data are intended to evaluate. The commenter
mentioned that changing the threshold based solely on the estimated FRM
detection limit may not translate to other FEMs that may have different
detection limits. At a minimum, all approved Pb methods are required to
meet the method detection limit to be approved as equivalent.
Therefore, the 0.002 [mu]g/m\3\ threshold should be applicable to the newer methods, which is the reason for retaining the dual thresholds.
---------------------------------------------------------------------------
\38\ https://www.epa.gov/ttnamti1/files/ambient/pb/QAQA.pdf.
---------------------------------------------------------------------------
Based on our review and evaluations, the EPA set the precision goal
of a 90 percent confidence limit for the CV of 20 percent as mentioned
by the commenter. This CV estimate is determined by aggregating 3 years
of collocated data. In the evaluation of the 2011-2013 data, the EPA
evaluated data down to the lower threshold with the new methods capable
of more sensitivity. The average 3-year precision estimate (2011-2013)
for all monitoring organizations using the approved FRM and FEM methods
and a threshold of 0.002 [mu]g/m\3\ was 16.31. The average 3-year CV
for a threshold of 0.02 [mu]g/m\3\ was 11.09. This is an average increase in imprecision of about 5 percentage points, but a significant increase in data availability from 35 percent to 90 percent.
The commenter also suggested that the current threshold should
remain in effect until a limit of quantitation (LOQ) test can be
performed. Although there are a number of definitions for LOQ, some
have defined it to be three times (3x) to ten times (10x) the MDL. The
new Pb FRM by ICP-MS promulgated in 2013 in 40 CFR part 50, appendix G,
showed that the MDLs were below 0.0002 [mu]g/m\3\. Therefore, the EPA
took the 10x definition of LOQ and calculated 0.002 [mu]g/m\3\ as the
level of the new threshold.
Two commenters made similar points that, because the CV is based on individual sample pair percent differences, the CV tends to increase at lower concentrations for a constant absolute difference. The EPA acknowledges this fact. On a related issue, when developing the 10 audit levels for annual performance evaluation checks, the EPA provided guidance on the two lower audit levels allowing for an absolute difference criterion as well as a percent difference criterion. Rather than eliminate close to 55 percent of the collocated data, which is what is occurring now with the higher threshold, the EPA is finalizing the two thresholds as proposed and will also evaluate the use of an absolute difference acceptance criterion at lower concentration levels.
The EPA proposed to remove the TSP threshold concentration for
precision and bias since TSP is no longer a NAAQS-required pollutant
and the EPA no longer has QC requirements for it.
The EPA received one comment in support of this proposal and no
adverse comments and is finalizing this revision as proposed.
The EPA proposed to remove the statistical check currently
described in section 4.1.5 of appendix A. The check was developed to
perform a comparison of the one-point QC checks and the annual
performance evaluation data performed by the same PQAO on gaseous
instruments. The section suggests that 95 percent of all the bias
estimates from the annual performance evaluation (reported as a percent
difference) should fall within the 95 percent probability interval
developed using the one-point QC checks. The problem with this specific
statistical check is that PQAOs with very good repeatability on the
one-point QC check data had a hard time meeting this requirement since
the probability interval became very tight, making it more difficult
for better performing PQAOs to meet the requirement when comparing the
one-point QC checks and performance evaluation data. Separate
statistics to evaluate the one-point QC checks and the performance
evaluations are already promulgated, so the removal of this check does
not affect data quality assessments.
The EPA received one comment in support of this proposal and no
adverse comments and is finalizing this revision as proposed.
Similar to the statistical comparison of performance evaluations
data, the EPA proposed to remove the statistical check (current section
4.2.4) to compare the flow rate audit data and flow rate verification
data for PM monitors. The existing language suggests that 95 percent of
all the flow rate audit data results (reported as percent difference)
should fall within the 95 percent probability interval developed from
the flow rate verification data for the PQAO. The problem, as with the
one-point QC check comparison requirement for gaseous monitors, was
that monitoring organizations with very good repeatability on the flow
rate verifications had a hard time meeting this requirement since the
probability interval became very tight, making it difficult for better
performing PQAOs to meet the requirement. Separate statistics to
evaluate the flow rate verifications and flow rate audits are already
promulgated, so the removal of this check does not affect data quality
assessments.
The EPA received one comment in support of this proposal and no
adverse comments and is finalizing this revision as proposed.
B. Quality Assurance Requirements for Monitors Used in Evaluations of
Prevention of Significant Deterioration Projects--Appendix B
The EPA proposed to create appendix B to specify the minimum
quality assurance requirements for the control and assessment of the
quality of the ambient air monitoring data submitted to a PSD reviewing
authority or the EPA by an organization operating an air monitoring
station, or network of stations, operated in order to comply with Part
51 New Source Review--Prevention of Significant Deterioration (PSD).
These proposed revisions to the quality assurance requirements
applicable to PSD are, in the majority of cases, identical to the
revisions proposed in appendix A. The majority of comments received for
this rule focused on the appendix A requirements and were discussed in
the previous section. Due to the similarity of the proposed changes for
appendix A and appendix B, the EPA assumes that comments submitted in
response to proposed appendix A revisions also reflect the sentiment of
commenters concerning the proposed language in appendix B. Therefore,
the preamble discussions that include responses to comments for
appendix A should, in most cases, also apply to appendix B.
Accordingly, the EPA will not duplicate those discussions in the
following sections pertaining to appendix B, and we refer the reader
back to the relevant appendix A discussions in section III.A. of the
preamble, above. In the few cases where comments were made specifically
for appendix B sections, those
comments are discussed in the appropriate sections below.
1. General Information
The following changes to monitoring requirements impact Part 58--
Ambient Air Quality Surveillance; Appendix B--Quality Assurance
Requirements for Prevention of Significant Deterioration (PSD) Air
Monitoring. Changes that affect the overall appendix are discussed in
this section of the preamble while changes specific to the various
sections of the appendix will be addressed in subsequent sections of
the preamble. Since the PSD QA requirements have been included in
appendix A since 2006, section headings refer to the current appendix A
sections.
The QA requirements in appendix B have been developed for measuring
the criteria pollutants of O3, NO2,
SO2, CO, PM2.5, PM10 and Pb and are
minimum QA requirements for the control and assessment of the quality
of the PSD ambient air monitoring data submitted to the PSD reviewing
authority \39\ or the EPA by an organization operating a network of PSD
stations.
---------------------------------------------------------------------------
\39\ Permitting authority and reviewing authority are often used
synonymously in PSD permitting. Since reviewing authority has been
defined in 40 CFR 51.166(b), it is used throughout appendix B.
---------------------------------------------------------------------------
In the 2006 monitoring rule revisions, the PSD QA requirements,
which were previously in appendix B, were consolidated with appendix A
and appendix B was reserved. The PSD requirements, in most cases,
parallel appendix A in structure and content but because PSD monitoring
is only required for a period of 1 year or less, some of the
frequencies of implementation of the QC requirements for PSD are higher
than the corresponding appendix A requirements. In addition, the
agencies governing the implementation, assessment and approval of the
QA requirements can be different: the PSD reviewing authorities for PSD
monitoring and the EPA Regions for ambient air monitoring for NAAQS
decisions. Since 2006, the combined regulations have caused confusion
or misinterpretations of the regulations among the public and
monitoring organizations implementing NAAQS or PSD requirements, and
have resulted in failure, in some cases, to perform the necessary QC
requirements. Accordingly, the EPA proposed that the PSD QA
requirements be removed from appendix A and returned to appendix B.
Separating the two sets of QA requirements would clearly distinguish
the PSD QA requirements and allow more flexibility for future revisions
to either monitoring program.
With this final rule, the EPA is not changing most of the QA
requirements for PSD. Therefore, the discussion that follows will cover
those sections of the PSD requirements that the EPA proposed to change
from the current appendix A requirements.
Commenters supported moving the PSD QA requirements to a distinct
section with no adverse comments received, so the EPA is finalizing as
proposed.
The applicability section of appendix B clarifies that the PSD QA
requirements are not assumed to be minimum requirements for data use in
NAAQS attainment decisions. One reason for this distinction is in the
flexibility allowed in PSD monitoring for the NPEP (current appendix A,
section 2.4). The proposed PSD requirements allow the PSD reviewing
authority to decide whether the NPEP will be implemented. The NPEP, which is described in appendix A, includes the
NPAP, the PM2.5 Performance Evaluation Program
(PM2.5-PEP), and the Pb-PEP. Accordingly, under the proposed
revision, if a PSD reviewing authority intended to use PSD data for any
official comparison to the NAAQS beyond the permitting application,
such as for attainment/nonattainment designations or clean data
determinations, then all requirements in appendix B including
implementation of the NPEP would apply. In this case, monitoring would
more closely conform to the appendix A requirements. The EPA proposed
this flexibility for PSD because the NPEP requires either federal
implementation or implementation by a qualified individual, group or
organization that is not part of the organization directly performing
and accountable for the work being assessed. The NPEP may require
specialized equipment, certified auditors and a number of activities
which are enumerated in the sections associated with these programs.
Arranging this type of support service may be more difficult for the
operator of a single or small number of PSD monitoring stations
operating for only a year or less.
The EPA cannot accept funding from private contractors or industry,
and federal implementation of the NPEP for PSD would face several
funding and logistical hurdles. This creates an inequity in the NPEP
implementation options available to the PSD monitoring organizations
compared to the state/local/tribal monitoring organizations for NAAQS
compliance. The EPA has had success in training and certifying private
contractors in various categories of performance evaluations conducted
under NPEP, but many have not made the necessary investments in capital
equipment to implement all categories of the performance evaluations.
Since the monitoring objectives for the collection of data for PSD are
not necessarily the same as the appendix A monitoring objectives, the
EPA proposed to allow the PSD reviewing authority to determine whether
a PSD monitoring project must implement the NPEP.
The EPA only received comments in support of this proposed change,
and is finalizing the change as proposed.
The EPA proposed to clarify the definition of PSD PQAO. The PQAO
was first defined in appendix A in 2006 (current appendix A, section
3.1.1), when the PSD requirements were combined with appendix A. The
definition is not substantially changed for PSD, but the EPA proposed
to clarify that a PSD PQAO can only be associated with one PSD
reviewing authority. Distinguishing among the PSD PQAOs that coordinate
with a PSD reviewing authority would be consistent with discrete
jurisdictions for PSD permitting, and it would simplify oversight of
the QA requirements for each PSD network.
Given that companies may apply for PSD permits throughout the U.S.,
it is expected that some PSD monitoring organizations will work with
multiple reviewing authorities. The PSD PQAO code that may appear in
the AQS database and other records identifies the PSD monitoring
organization, or a coordinated aggregation of such organizations, that
is responsible for a set of stations within one PSD reviewing authority
that monitors the same pollutant and for which data quality assessments
will be pooled. The PSD monitoring organizations that work with
multiple PSD reviewing authorities would have individual PSD PQAO codes
for each PSD reviewing authority. This approach will allow flexibility
to develop appropriate quality systems for each PSD reviewing
authority.
The EPA did not receive any comment on this process and is
finalizing the requirement as proposed.
The EPA proposed to add definitions of ``PSD monitoring
organization'' and ``PSD monitoring network'' to 40 CFR 58.1. The
definitions have been developed to improve understanding of the
appendix B regulations.
Because the EPA uses the term ``monitoring organization''
frequently in the NAAQS-associated ambient air regulations, the EPA
wanted to provide a better definition of the term in the PSD
QA requirements. Therefore, the EPA proposed the term ``PSD monitoring
organization'' to identify ``a source owner/operator, a government
agency, or a contractor of the source or agency that operates an
ambient air pollution monitoring network for PSD purposes.''
The EPA also proposed to define ``PSD monitoring network'' in order
to distinguish ``a set of stations that provide concentration
information for a specific PSD permit.'' The EPA will place both
definitions in 40 CFR 58.1. The EPA did not receive any comment on
these changes and is finalizing them as proposed.
2. Quality System Requirements
The EPA proposed to remove the PM10-2.5 requirements for
flow rate verifications, semi-annual flow rate audits, collocated
sampling procedures and PM10-2.5 PEP from appendix B
(current appendix A, sections 3.2.6, 3.2.8, 3.3.6, 3.3.8, 4.3). In
2006, the EPA proposed a PM10-2.5 NAAQS along with the requisite
QA requirements in appendix A. While the PM10-2.5 NAAQS was
not promulgated, PM10-2.5 monitoring was still required at NCore
sites, and the associated QA requirements were retained in appendix A.
Since PSD monitoring is distinct from monitoring at NCore sites and
PM10-2.5 is not a criteria pollutant, these requirements will be
removed from the PSD QA requirements. The EPA did not receive
any comment on this proposed revision and is finalizing the requirement
as proposed.
The EPA proposed that the Pb QA requirements of collocated sampling
(current appendix A, section 3.3.4.3) and Pb performance evaluation
procedures (current appendix A, section 3.3.4.4) for non-source
oriented NCore sites be eliminated for PSD. The 2010 Pb rule in 40 CFR
part 58, appendix D, section 4.5(b) added a requirement to conduct non-
source oriented Pb monitoring at each NCore site in a CBSA with a
population of 500,000 or more. Since PSD does not implement NCore
sites, the EPA proposed to eliminate the Pb QA language specific to
non-source oriented NCore sites from PSD while retaining the PSD QA
requirements for routine Pb monitoring.
The EPA received three supportive comments for the removal of this
requirement and no adverse comments. Therefore, the EPA is finalizing
the requirement as proposed.
The EPA proposed that elements of QMPs and QAPPs, which are separate
documents described in appendix A, sections 2.1.1 and 2.1.2, may be
combined into a single document for PSD monitoring networks. The QMP
provides a ``blueprint'' of a PSD monitoring organization's quality
system. It includes quality policies and describes how the organization
as a whole manages and implements its quality system regardless of what
monitoring is being performed. The QAPP includes details for
implementing a specific PSD monitoring activity. For PSD monitoring,
the EPA believes the project-specific QAPP takes priority, but there
are important aspects of the QMP that could be incorporated into the
QAPP. The current appendix A requirements allow smaller organizations,
or organizations that do infrequent work with the EPA, to combine the
QMP with the QAPP based on negotiations with the funding agency; the
EPA has provided guidance \40\ on a graded approach to developing these
documents. In the case of PSD QMPs and QAPPs, the EPA proposed that the
PSD reviewing authority, which has the approval authority for these
documents, also have the flexibility for allowing the PSD PQAO to
combine pertinent elements of the QMP into the QAPP rather than
requiring the submission of both QMP and QAPP documents separately. The
EPA did not receive any comment on this and is finalizing the
requirement as proposed.
---------------------------------------------------------------------------
\40\ Graded approach to Tribal QAPP and QMPs https://www.epa.gov/ttn/amtic/cpreldoc.html.
---------------------------------------------------------------------------
The EPA proposed to add language to the appendix B version of the
DQO section (current appendix A, section 2.3.1) which allows
flexibility for the PSD reviewing authority and the PSD monitoring
organization to determine whether adherence to the DQOs specified in
appendix A, which are the DQO goals for NAAQS decisions, is
appropriate or whether project-specific goals are necessary. Allowing
the PSD reviewing authority and the PSD monitoring organization
flexibility to change the DQOs does not change the implementation
requirements for the types and frequency of the QC checks in appendix
B, but does give some flexibility in the acceptance of data for use in
specific projects for which the PSD data are collected. As an example,
the goal for acceptable measurement uncertainty for the collection of
O3 data for NAAQS determinations is defined for precision as
an upper 90 percent confidence limit for CV of 7 percent and for bias
as an upper 95 percent confidence limit for the absolute bias of 7
percent. The precision and bias estimates are made with 3 years of one-
point QC check data. A single or a few one-point QC checks over 7
percent would not have a significant effect on meeting the DQO goal.
The PSD monitoring DQO, depending on the objectives of the PSD
monitoring network, may require a stricter DQO goal or one less
restrictive. Since PSD monitoring covers a period of 1 year or less,
one-point QC checks over 7 percent will increase the likelihood of
failing to meet the DQO goal since there would be fewer QC checks
available in the monitoring period to estimate precision and bias. With
fewer checks, any individual check will statistically have more
influence over the precision or bias estimate. Realizing that PSD
monitoring may have different monitoring objectives, the EPA proposed
to add language that would allow decisions on DQOs to be determined
through consultation between the appropriate PSD reviewing authority
and PSD monitoring organization. The EPA did not receive any comment on
this and is finalizing the requirement as proposed.
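For illustration only, the following sketch (in Python; not part of the
rule text) shows how precision and bias statistics of the kind
described above can be computed from one-point QC check percent
differences. The formulas follow the general form of the appendix A
statistics (an upper 90 percent confidence limit on the coefficient of
variation and an upper 95 percent confidence limit on absolute bias);
the variable names are ours, and the sign that appendix A assigns to
the final bias estimate is omitted.

import math
from scipy import stats

def percent_differences(measured, audit):
    # d_i = (measured - audit) / audit * 100 for each one-point QC check pair.
    return [(m - a) / a * 100.0 for m, a in zip(measured, audit)]

def cv_upper_bound(d, conf=0.90):
    # Upper confidence limit for the coefficient of variation (precision).
    n = len(d)
    s2 = (n * sum(x * x for x in d) - sum(d) ** 2) / (n * (n - 1))
    chi2_low = stats.chi2.ppf(1.0 - conf, n - 1)  # 10th percentile for conf=0.90
    return math.sqrt(s2 * (n - 1) / chi2_low)

def abs_bias_upper_bound(d, conf=0.95):
    # Upper confidence limit for the absolute bias (sign assignment omitted).
    n = len(d)
    ad = [abs(x) for x in d]
    mean_ad = sum(ad) / n
    s_ad = math.sqrt(sum((x - mean_ad) ** 2 for x in ad) / (n - 1))
    return mean_ad + stats.t.ppf(conf, n - 1) * s_ad / math.sqrt(n)

d = percent_differences([0.0052, 0.0047, 0.0051, 0.0049], [0.005] * 4)
print(cv_upper_bound(d), abs_bias_upper_bound(d))

With only a year or less of PSD data, n is small, so a single large
percent difference inflates both statistics noticeably, which is the
small-sample effect described above.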
The EPA proposed to add some clarifying language to the section
describing the NPEP (current appendix A, section 2.4) to explain self-
implementation of the performance evaluation by the PSD monitoring
organization. Self-implementation of NPEP has always been an option for
monitoring organizations but the requirements for self-implementation
were described in the technical implementation documents (i.e.,
implementation plans and QAPPs) for the program and in an annual self-
implementation decision memo that is distributed to monitoring
organizations.\41\ These major requirements for self-implementation are
proposed to be included in the appendix B sections pertaining to the
NPEP program (NPAP, PM2.5-PEP and Pb-PEP).
---------------------------------------------------------------------------
\41\ https://www.epa.gov/ttn/amtic/npepqa.html.
---------------------------------------------------------------------------
The NPEP clarification also adds a definition of ``independent
assessment.'' The proposed definition is derived from the NPEP (NPAP,
PM2.5-PEP, and Pb-PEP) QAPPs and guidance; it also appears
in the annual self-implementation memo described above. The
clarification is not a new requirement but consolidates this
information.
Refer to comments related to NPEP in appendix A in III.A. As there
were no comments specifically related to PSD, the EPA is finalizing the
requirement as proposed.
The EPA proposed to require PSD PQAOs to provide information to the
PSD reviewing authority on the vendors of gas standards that they use
(or will use) for the duration of the PSD monitoring project. A QAPP or
monitoring plan may incorporate this
information. However, that document must then be updated if there is a
change in the vendor used. The current regulation (current appendix A,
section 2.6.1) requires any gas vendor advertising and distributing
``EPA Protocol Gas'' to participate in the AA-PGVP. The EPA posts a
list of these vendors on the AMTIC Web site.\42\ This is not expected
to be a burden since information of this type is normally included in a
QAPP or standard operating procedure for a monitoring activity.
---------------------------------------------------------------------------
\42\ https://www.epa.gov/ttn/amtic/aapgvp.html.
---------------------------------------------------------------------------
There were no adverse comments in appendix A or appendix B related
to identifying vendors used to supply monitoring organizations with gas
standards. Therefore, the EPA is finalizing the requirement as
proposed.
3. Measurement Quality Checks for Gases
The EPA proposed to lower the audit concentrations (current
appendix A, section 3.2.1) of the one-point QC checks to a range of
0.005 to 0.08 ppm for SO2, NO2, and O3 (currently 0.01 to 0.1 ppm),
and to a range of 0.5 to 5 ppm for CO monitors (currently 1 to 10
ppm). With the development of more sensitive
monitoring instruments with lower detection limits, technical
improvements in calibrators, and lower ambient air concentrations in
general, the EPA believes this revision will better reflect the
precision and bias of the routinely-collected ambient air data. Because
the audit concentrations are selected using the mean or median
concentration of typical ambient air data (guidance on this is provided
in the QA Handbook \43\), the EPA proposed to add some clarification to
the current language by requiring PSD monitoring organizations to
select either the highest or lowest concentration in the ranges
identified if the mean or median values of the routinely-collected
concentrations are above or below the prescribed range.
---------------------------------------------------------------------------
\43\ QA Handbook for Air Pollution Measurement Vol. II Ambient
Air Quality Monitoring Program at: https://www.epa.gov/ttn/amtic/qalist.html.
---------------------------------------------------------------------------
The EPA received a number of comments on this proposed requirement.
Please refer to the appendix A comments in III.A. In light of the
comments received, the EPA will maintain the concentration ranges as
proposed: 0.005 to 0.08 ppm for SO2, NO2, and
O3, and between the prescribed range of 0.5 and 5 ppm for CO
monitors. However, rather than requiring that the selected
concentration reflect the mean or median concentration at the site or
in the agency's network of sites, the QC check gas concentration
selected within the prescribed range can be related to the monitoring
objective of the site. Monitors primarily intended for NAAQS
compliance should utilize concentrations at or near the level of the
NAAQS (the higher end of the required range), while trace gas monitors
operating at background or trends sites should use concentrations
related to the mean or median of the ambient air concentrations
normally measured at those sites, in order to appropriately reflect
precision and bias at these routine concentration ranges. If the mean
or median concentrations at trace gas
sites are below the MDL of the instrument or above the prescribed
range, the agency can select the lowest or highest concentration in the
range that can be practically achieved. In the case of PSD monitoring,
the EPA will add language requiring the PSD monitoring organization to
consult with the PSD reviewing authority on the most appropriate one-
point QC concentration based on the objectives of the monitoring
activity. In addition, the EPA will keep language suggesting that an
additional QC check point is encouraged for those organizations that
may have occasional high values or would like to confirm the monitors'
linearity at the higher end of the operational range.
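A minimal sketch of this selection logic (Python; purely illustrative,
with hypothetical names and example values, assuming the SO2/NO2/O3
range of 0.005 to 0.08 ppm discussed above):

def select_qc_concentration(objective, typical_conc, naaqs_level,
                            range_low=0.005, range_high=0.08):
    # Monitors intended primarily for NAAQS compliance audit at or near
    # the NAAQS level, toward the higher end of the prescribed range.
    if objective == "naaqs_compliance":
        return min(naaqs_level, range_high)
    # Trace gas monitors at background or trends sites audit at the mean
    # or median of routinely measured concentrations, clamped to the
    # lowest or highest practically achievable point in the range.
    return min(max(typical_conc, range_low), range_high)

# A trends site whose median ambient value is below the range floor
# audits at the lowest practical concentration, 0.005 ppm.
print(select_qc_concentration("trends", typical_conc=0.002, naaqs_level=0.075))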
In addition, to alleviate concerns about failing the acceptance
criteria at lower QC concentrations, the EPA will evaluate suggestions
by monitoring organizations to raise acceptance criteria or look at
alternative acceptance criteria (e.g., difference instead of percent
difference). Since acceptance criteria are included in guidance, the
EPA will have the opportunity to perform these evaluations without
affecting the regulation.
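A short worked example of why this matters (Python; values are
invented): a fixed absolute error that is negligible in ppm terms can
look large as a percent difference at low audit concentrations.

audit, measured = 0.005, 0.006             # ppm
print(measured - audit)                    # 0.001 ppm absolute difference
print((measured - audit) / audit * 100.0)  # 20 percent difference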
The EPA proposed to remove the existing reference to zero and span
adjustments (current appendix A, section 3.2.1.1) and to revise the
one-point QC language to simply require that the QC check be conducted
before making any calibration or adjustment to the monitor. Recent
revisions of the QA Handbook discourage the practice of making frequent
span adjustments, so the proposed language helps to clarify that no
adjustment be made prior to implementation of the one-point QC check.
There were no comments made on this proposed revision, so the EPA is
finalizing this revision as proposed.
The current annual performance evaluation language (current
appendix A, section 3.2.2.1) requires that the audits be conducted by
selecting three consecutive audit levels (currently, appendix A
recognizes five audit levels). Due to the implementation of the NCore
network, the introduction of trace gas monitors, and lower ambient air
concentrations being measured under typical circumstances, there is a
need for audit levels at lower concentrations to more accurately
represent the uncertainties present in the ambient air data. The EPA
proposed to expand the audit levels from five to ten and remove the
requirement to audit three consecutive levels. The current regulation
also requires that the three audit levels should bracket 80 percent of
the ambient air concentrations measured by the analyzer. This current
``bracketing language'' has caused some confusion, and monitoring
organizations have requested the use of an audit point to establish
monitor accuracy around the NAAQS levels. Therefore, the EPA proposed
to revise the language so that two of the audit levels selected
represent 10 to 80 percent of routinely-collected ambient
concentrations either measured by the monitor or in the PSD PQAOs
network of monitors. The proposed revision allows the third point to be
selected at a concentration that is consistent with PSD-specific DQOs
(e.g., the 75 ppb NAAQS level for SO2).
The EPA received a number of comments on this proposal. Please
refer to the appendix A comments in III.A.
In addition to comments related to appendix A, the EPA received
comments specific to PSD on this section. A commenter mentioned that
for PSD, the performance evaluation (PE) is performed quarterly since
PSD monitoring may occur for only 1 year. The current language required
the audit to occur each calendar quarter and since PSD monitoring does
not necessarily follow calendar quarters, it was suggested to revise
the term ``calendar quarter'' to ``quarterly.'' The EPA will revise the
PSD language to reflect implementation of the quarterly PE on a
quarterly (90-day) frequency. A commenter asserted that the
requirement that PE personnel meet PE training and certification
requirements was in error, because the requirement for certification
applies only to NPEP audits, not to quarterly performance evaluation
audits, and there is no further regulatory discussion to support such
a requirement. Because the EPA has provided more flexibility on
implementing NPEP at PSD sites, we believed there needed to be an
additional requirement that the personnel implementing these audits be
trained and certified. However, as the commenter mentioned, there is no
additional instruction on this, nor is there any mention of the
organization required to do this training and certification. It is
expected that any
entity performing this activity would be trained and capable of
performing these audits. Therefore, the EPA will remove the last
sentence requiring training and certification.
The EPA received a comment that suggested the PE language was not
consistent with an earlier section (2.7) that only required the use of
reference and equivalent method monitors as opposed to trace gas
analyzers regardless of the concentrations measured. The commenter's
contention was that, based upon the proposed language related to the
selection of PE concentrations, the PSD monitoring agency would be
required to acquire trace gas instruments because the ambient air
concentrations being measured were low enough to demand that
sensitivity. They used
examples of annual mean NO2 values around 1.9 ppb and
SO2 concentrations of 1.0 ppb. However, the proposed PE
language is consistent with the reference and equivalent language
described in section 2.7 since trace gas analyzers are in fact
reference and equivalent instruments and, therefore, are included in
that description. Regardless of the proposed PE concentration range, it
would seem that PSD monitoring organizations that are required to
monitor at the low concentration ranges would want to select FRM or FEM
instruments more capable of reliably measuring these concentrations.
Based on the comments received related to appendices A and B, the
EPA will revise the proposed language to require three points to be
selected: One point around two to three times the method detection
limit of the instruments within the PQAO network, a second point less
than the 99th percentile of the data at the site or the network of sites
within a PQAO or the next highest audit concentration level, and the
third point around the primary NAAQS or the highest 3-year
concentration at the site or the network of sites in the PQAO. This
provides two audit points that reflect 99 percent of the monitoring
data and a third point at the highest 3-year concentration or the
NAAQS, whichever concentration the PSD monitoring organization chooses.
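As an illustration, a sketch of this three-point selection (Python;
the function and values are hypothetical and simply restate the rule
described above):

def select_pe_points(mdl, p99, third_point, audit_levels):
    # mdl: method detection limit of instruments in the PQAO network.
    # p99: 99th percentile of data at the site or the PQAO's network.
    # third_point: the primary NAAQS level or the highest 3-year
    #              concentration, whichever the organization chooses.
    # audit_levels: the ten available audit concentration levels.
    levels = sorted(audit_levels)
    point1 = 2.5 * mdl  # around two to three times the MDL
    at_or_below = [lvl for lvl in levels if lvl <= p99]
    # Second point: below the 99th percentile, else the next highest level.
    point2 = at_or_below[-1] if at_or_below else min(
        lvl for lvl in levels if lvl > p99)
    return point1, point2, third_point

print(select_pe_points(mdl=0.0002, p99=0.004, third_point=0.075,
                       audit_levels=[0.0005, 0.001, 0.002, 0.005, 0.01,
                                     0.02, 0.05, 0.1, 0.2, 0.5]))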
The EPA proposed to revise the language (current appendix A,
section 3.2.2.2(a)) addressing the limits on excess NO that must be
followed during GPT procedures involving NO2 audits. The
current NO limit (maintaining at least 0.08 ppm) is very restrictive
and requires auditors to make numerous mid-audit adjustments during a
GPT that result in making the NO2 audit a very time-
consuming procedure. Monitoring agency staff have advised us that the
observance of such excess NO limits has no apparent effect on
NO2 calibrations being conducted with modern-day GPT-capable
calibration equipment and, therefore, the requirement in the context
of performing audits is unnecessary.\44\ We also note the increasing
availability of the EPA-approved direct NO2 methods that do
not utilize converters, which will likely make GPT techniques that
require the output of NO and NOX an increasingly rare
procedure in the future. Accordingly, we have
proposed a more general statement regarding GPT that acknowledges the
ongoing usage of monitoring agency procedures and guidance documents
that have successfully supported NO2 calibration activities.
The EPA believes that if such procedures have been successfully used
during calibrations when instrument adjustments are potentially being
made, then such procedures are appropriate for audit use when
instruments are not subject to adjustment.
---------------------------------------------------------------------------
\44\ See supporting information in Excess NO Issue paper, Mike
Papp and Lewis Weinstock, Docket number EPA-HQ-OAR-2013-0619.
---------------------------------------------------------------------------
The EPA received only supportive comments endorsing the proposed
revision to the language on excess NO. Therefore, the EPA is finalizing
this revision as proposed.
The EPA proposed to remove language (current appendix A, section
3.2.2.2(b)) in the annual performance evaluation section that requires
Regional approval for audit gases for any monitors operating at ranges
higher than 1.0 ppm for O3, SO2 and
NO2 and greater than 50 ppm for CO. The EPA does not need to
approve a monitoring organization's use of audit gases to audit above
proposed concentration levels since the EPA has identified the
requirements for all audit gases used in the program in current
appendix A, section 2.6.1. There should be very few cases where a PE
needs to be performed above level 10, but there may be some legitimate
instances (e.g., an SO2 audit in areas impacted by volcanic
emissions). Since data reported to AQS above the highest level may be
rejected (if PSD PE data are reported to AQS), the EPA proposed that
PQAOs notify the PSD reviewing authority of sites auditing at
concentrations above level 10 so that reporting accommodations can be
made. There were no comments made on this proposed revision, so the EPA
is finalizing this revision as proposed.
The EPA proposed to describe the NPAP (current appendix A, section
2.4) in more detail. The NPAP is a long-standing program for the
ambient air monitoring community. The NPAP is a performance evaluation,
which is a type of audit where quantitative data are collected
independently in order to evaluate the proficiency of an analyst,
monitoring instrument or laboratory. This program has been briefly
mentioned in section 2.4 of the current appendix A requirements. In
appendix A, the EPA proposed to add language consistent with an annual
decision memorandum \45\ distributed to all state and local monitoring
organizations in order to determine whether the monitoring organization
plans to self-implement the NPAP program or utilize the federally
implemented program. In order to make this decision, the NPAP adequacy
and independence requirements are described in the decision memorandum.
The EPA proposed to include these same requirements in appendix B in a
separate section for NPAP. As described in the applicability section,
the implementation of NPAP is at the discretion of the PSD reviewing
authority but must be implemented if data are used in any NAAQS
determinations. Since PSD monitoring is implemented at shorter
intervals (usually a year) and with fewer monitors, if NPAP is
performed, it is required to be performed annually on each monitor
operated in the PSD network.
---------------------------------------------------------------------------
\45\ https://www3.epa.gov/ttn/amtic/npepqa.html.
---------------------------------------------------------------------------
See appendix A for comments and discussions related to this
section. The EPA is finalizing this revision as proposed.
4. Measurement Quality Checks for Particulate Monitors
The EPA proposed to have one flow rate verification frequency
requirement for all PM PSD monitors. The current regulations (current
appendix A, table A-2) provide for monthly flow rate verifications for
most samplers used to monitor PM2.5, PM10 and Pb
and quarterly flow rate verifications for high-volume PM10
or TSP samplers (for Pb). With longer duration NAAQS monitoring, the
quarterly verification frequencies are adequate for these high-volume
PM10 or TSP samplers. However, with the short duration of
PSD monitoring, the EPA believes that monthly flow rate verifications
are more appropriate to ensure that any sampler flow rate problems are
identified more quickly and to reduce the potential for a significant
amount of data invalidation that could extend monitoring activities.
The EPA received one comment in support of this revision and no
adverse
comments. Therefore, the EPA is finalizing this revision as proposed.
The EPA proposed to grant more flexibility to PSD monitoring
organizations when selecting PM2.5 method designations for
sites that require collocation. Appendix A (current section 3.2.5.2(b))
requires that if a primary monitor is a FEM, then the first QC
collocated monitor must be a FRM monitor. Most FEM monitors are
continuous, while current FRM monitors are filter-based. Continuous
monitors (all of which are FEMs) may be advantageous for use at the
more remote PSD monitoring locations, since the site operator would
not need to visit a site as often to retrieve filters. The current
collocation requirement for FEMs, which calls for a filter-based FRM
as the collocated monitor, would mean a visit to retrieve the FRM
filters at least 1 week after the QC collocated monitor operated.
Therefore, the EPA proposed that the FRM be selected as the QC
collocated monitor unless the PSD PQAO submits a waiver request to the
PSD reviewing authority to allow for collocation with a FEM. If the
request for a waiver is approved, then the QC monitor must be the same
method designation as the primary FEM monitor. The EPA did not receive
any comments on this proposal and is finalizing this revision as
proposed.
The EPA proposed to allow the PSD reviewing authority to waive the
PM2.5 3 [mu]g/m\3\ concentration validity threshold for
implementation of the PM2.5-PEP in the last quarter of PSD
monitoring. The PM2.5-PEP (current appendix A, section
3.2.7) requires five valid PM2.5-PEP audits per year for
PM2.5 monitoring networks with five or fewer sites and eight
valid PM2.5-PEP audits per year for PM2.5
monitoring networks with more than five sites. Any PEP
samples collected with a concentration less than 3 [mu]g/m\3\ are not
considered valid, since they cannot be used for bias estimates, and re-
sampling is required at a later date. With NAAQS-related monitoring,
which aggregates the PM2.5-PEP data over a 3-year period,
re-sampling is easily accomplished. Due to the relatively short-term
nature of most PSD monitoring, the likelihood of measuring low
concentrations in many areas attaining the PM2.5 standard
and the time required to weigh filters collected in performance
evaluations, a PSD monitoring organization's QAPP may contain a
provision to waive the 3 [mu]g/m\3\ threshold for validity of
performance evaluations conducted in the last quarter of monitoring,
subject to approval by the PSD reviewing authority. The EPA did not
receive any comments on this proposed waiver and is finalizing this
revision as proposed.
5. Calculations for Data Quality Assessment
In order to allow reasonable estimates of data quality, the EPA
uses data above an established threshold concentration usually related
to the detection limits of the measurement method. Measurement pairs
are selected for use in the precision and bias calculations only when
both measurements are above a threshold concentration.
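A minimal sketch of this pair-selection rule (Python; illustrative
only, using the Pb thresholds discussed below):

def valid_pairs(primary, collocated, threshold):
    # Keep a collocated pair only when both measurements exceed the threshold.
    return [(p, c) for p, c in zip(primary, collocated)
            if p > threshold and c > threshold]

# 0.002 ug/m3 applies to the new ICP-MS FRM and sufficiently sensitive
# FEMs; 0.02 ug/m3 is retained for older methods.
print(valid_pairs([0.004, 0.0015, 0.030], [0.005, 0.0012, 0.028], 0.002))
# -> [(0.004, 0.005), (0.03, 0.028)]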
For many years, the threshold concentration for Pb precision and
bias data has been 0.02 [mu]g/m\3\. The EPA promulgated a new Pb FRM
utilizing the ICP-MS analysis technique in 2013 as a revision to
appendix G of 40 CFR part 50.\46\ This new FRM demonstrated MDLs \47\
below 0.0002 [mu]g/m\3\, which is well below the EPA requirement of
five percent of the current Pb NAAQS level of 0.15 [mu]g/m\3\, or
0.0075 [mu]g/m\3\. As a result of the increased sensitivity inherent in
this new FRM, the EPA proposed to lower the Pb threshold concentration
(current section 4) from the current value of 0.02 [mu]g/m\3\ to 0.002
[mu]g/m\3\ for measurements obtained using the new Pb FRM and other
more recently approved equivalent methods that have the requisite
increased sensitivity.\48\ The current 0.02 [mu]g/m\3\ value will be
retained for the previous Pb FRM that has subsequently been re-
designated as FEM EQLA-0813-803 as well as older equivalent methods
that were approved prior to the more recent work on developing more
sensitive methods. Since ambient Pb concentrations are lower and
methods more sensitive, lowering the threshold concentration will allow
much more collocated information to be evaluated, which will provide
more representative estimates of precision and bias.
---------------------------------------------------------------------------
\46\ See 78 FR 40000, July 3, 2013.
\47\ MDL is described as the minimum concentration of a
substance that can be measured and reported with 99 percent
confidence that the analyte concentration is greater than zero.
\48\ FEMs approved on or after March 4, 2010, have the required
sensitivity to utilize the 0.002 [mu]g/m\3\ reporting limit with the
exception of manual equivalent method EQLA-0813-803, the previous
FRM based on flame atomic absorption spectroscopy.
---------------------------------------------------------------------------
See comments related to this proposal in the appendix A section.
The EPA will establish two thresholds as proposed and will evaluate the
use of an absolute difference acceptance criterion at lower
concentration levels.
The EPA also proposed to remove the TSP threshold concentration
since TSP is no longer a NAAQS-required pollutant and the EPA no longer
has QC requirements for it. The EPA received one comment in support of
this proposed change and no adverse comments and is finalizing this
revision as proposed.
The EPA proposed to remove the statistical check currently
described in section 4.1.5 of appendix A. The check was developed to
perform a comparison of the one-point QC checks and the annual
performance evaluation data performed by the same PQAO. The section
suggests that 95 percent of all the bias estimates of the annual
performance evaluations (reported as a percent difference) should fall
within the 95 percent probability interval developed using the one-
point QC checks. The problem with this check is that the probability
interval became very tight for PQAOs with very good repeatability on
the one-point QC check data, paradoxically making the requirement
harder for better performing PQAOs to meet. Separate statistics to
evaluate the one-point QC checks
and the performance evaluations are already promulgated, so the removal
of this check does not affect data quality assessments. The EPA
received one comment in support of this proposal and no adverse
comments and is finalizing this revision as proposed.
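For illustration (Python; the numbers are invented), the removed check
can be sketched as follows; note how very repeatable QC data collapse
the interval, so that even a small PE difference fails:

import statistics

def probability_interval(qc_pct_diffs, z=1.96):
    # Approximate 95 percent probability interval from one-point QC
    # percent differences.
    m = statistics.mean(qc_pct_diffs)
    s = statistics.stdev(qc_pct_diffs)
    return (m - z * s, m + z * s)

low, high = probability_interval([0.2, -0.1, 0.3, 0.0, -0.2, 0.1])
pe_bias = -0.5  # percent difference from an annual PE: small, yet...
print(low <= pe_bias <= high)  # False: fails the (now removed) check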
Similar to the statistical comparison of performance evaluation
data, the EPA proposed to remove the statistical check (current
appendix A, section 4.2.4) to compare the flow rate audit data and flow
rate verification data. The existing language suggests that 95 percent
of all the flow rate audit data (reported as percent difference) should
fall within the 95 percent probability interval developed from the flow
rate verification data for the PQAO. The problem, as with the one-point
QC check, was that the probability interval became very tight for
monitoring organizations with very good repeatability on the flow rate
verifications, making the requirement harder for better performing
PQAOs to meet. Separate statistics to evaluate the flow rate
verifications and flow rate audits are already promulgated, so the
removal of this check does not affect data quality assessments. The EPA
received one comment in support of this proposal and no adverse
comments and is finalizing this revision as proposed.
The EPA proposed to remove the reporting requirements that are
currently in section 5 of appendix A because they do not pertain to PSD
monitoring (current sections 5.1, 5.1.1 and 5.1.2.1). Since PSD
organizations
are not required to certify their data to the EPA or report to AQS,
the EPA will remove language related to these requirements and language
that required the EPA to calculate and report the measurement
uncertainty for the entire calendar year. The EPA will retain the
quarterly PSD reporting requirements (current section 5.2 in appendix
A) and require that those requirements be consistent with 40 CFR 58.16
as it pertains to PSD ambient air quality data and QC data, as
described in appendix B. The EPA did not receive any comment on this
revision and is finalizing this revision as proposed.
IV. Statutory and Executive Order Reviews
A. Executive Order 12866: Regulatory Planning and Review and Executive
Order 13563: Improving Regulation and Regulatory Review
This action is not a significant regulatory action and was,
therefore, not submitted to the Office of Management and Budget (OMB)
for review.
B. Paperwork Reduction Act (PRA)
This action does not impose any new information collection burden
under the PRA. OMB has previously approved the information collection
activities contained in the existing regulations and has assigned OMB
control number 2060-0084. While the EPA believes that the net effect of
the requirement changes is a decrease in overall burden, the current
information collection request calculation tools examine key air
monitoring tasks at a somewhat macro level and are therefore not
sufficiently detailed to show a material change in burden compared with
the existing requirements.
C. Regulatory Flexibility Act (RFA)
I certify that this action will not have a significant economic
impact on a substantial number of small entities under the RFA. This
action will not impose any requirements on small entities. This action
finalizes minor changes and clarifications to existing monitoring
requirements and definitions.
D. Unfunded Mandates Reform Act
This action does not contain an unfunded federal mandate of $100
million or more as described in UMRA, 2 U.S.C. 1531-1538, and does not
significantly or uniquely affect small governments. The revisions to
the monitoring requirements impose no enforceable duty on any state,
local, or tribal governments or the private sector beyond those duties
already established in the CAA.
E. Executive Order 13132: Federalism
This action does not have federalism implications. It will not have
substantial direct effects on the states, on the relationship between
the national government and the states, or on the distribution of power
and responsibilities among the various levels of government.
F. Executive Order 13175: Consultation and Coordination With Indian
Tribal Governments
This action does not have tribal implications, as specified in
Executive Order 13175 (65 FR 67249, November 9, 2000). Tribes have the
opportunity to seek treatment in a manner similar to a state for the
purpose of installing and operating a monitoring network consisting of
one or more monitors and to then install and operate such a network,
but are not required to do so. With regard to any tribes that may
currently be operating a monitoring network, as well as any tribes that
may operate a monitoring network in the future, this action finalizes
minor changes and clarifications to existing monitoring requirements
and will not materially impact the time required to operate monitoring
networks. Thus, consultation under the Executive Order 13175 is not
required for this action. The EPA will work through tribal resources
such as the Tribal Air Monitoring Support Center to ensure a complete
understanding of these revisions.
G. Executive Order 13045: Protection of Children From Environmental
Health and Safety Risks
The EPA interprets Executive Order 13045 as applying only to those
regulatory actions that concern environmental health or safety risks
that the EPA has reason to believe may disproportionately affect
children, per the definition of ``covered regulatory action'' in
section 2-202 of the Executive Order. This action is not subject to
Executive Order 13045 because it does not concern an environmental
health risk or safety risk.
H. Executive Order 13211: Actions Concerning Regulations That
Significantly Affect Energy Supply, Distribution, or Use
This action is not subject to Executive Order 13211, because it is
not a significant regulatory action under Executive Order 12866.
I. National Technology Transfer and Advancement Act
This action does not involve technical standards.
J. Executive Order 12898: Federal Actions to Address Environmental
Justice in Minority Populations and Low-Income Populations
The EPA believes the human health or environmental risk addressed
by this action will not have potential disproportionately high and
adverse human health or environmental effects on minority, low-income
or indigenous populations. This action finalizes minor changes and
clarifications to existing monitoring requirements and definitions.
K. Congressional Review Act
This action is subject to the CRA, and the EPA will submit a rule
report to each House of the Congress and to the Comptroller General of
the United States. This action is not a ``major rule'' as defined by 5
U.S.C. 804(2).
List of Subjects in 40 CFR Part 58
Environmental protection, Administrative practice and procedure,
Air pollution control, Intergovernmental relations.
Dated: March 10, 2016.
Gina McCarthy,
Administrator.
Part 58, chapter I, title 40 of the Code of Federal Regulations is
amended as follows:
PART 58--AMBIENT AIR QUALITY SURVEILLANCE
■ 1. The authority citation for part 58 continues to read as follows:
Authority: 42 U.S.C. 7403, 7405, 7410, 7414, 7601, 7611, 7614,
and 7619.
■ 2. Revise Sec. 58.1 to read as follows:
Sec. 58.1 Definitions.
As used in this part, all terms not defined herein have the meaning
given them in the Clean Air Act.
AADT means the annual average daily traffic.
Act means the Clean Air Act as amended (42 U.S.C. 7401, et seq.).
Additive and multiplicative bias means the linear regression
intercept and slope of a linear plot fitted to corresponding candidate
and reference method mean measurement data pairs.
Administrator means the Administrator of the Environmental
Protection Agency (EPA) or his or her authorized representative.
Air quality system (AQS) means the EPA's computerized system for
storing and reporting of information relating to ambient air quality
data.
Approved regional method (ARM) means a continuous PM2.5
method that has been approved specifically within a
state or local air monitoring network for purposes of comparison to the
NAAQS and to meet other monitoring objectives.
AQCR means air quality control region.
Area-wide means all monitors sited at neighborhood, urban, and
regional scales, as well as those monitors sited at either micro- or
middle-scale that are representative of many such locations in the same
CBSA.
Certifying agency means a state, local, or tribal agency
responsible for meeting the data certification requirements in
accordance with Sec. 58.15 for a unique set of monitors.
Chemical Speciation Network (CSN) includes Speciation Trends
Network stations (STN) as specified in paragraph 4.7.4 of appendix D of
this part and supplemental speciation stations that provide chemical
species data of fine particulate.
CO means carbon monoxide.
Combined statistical area (CSA) is defined by the U.S. Office of
Management and Budget as a geographical area consisting of two or more
adjacent Core Based Statistical Areas (CBSA) with employment
interchange of at least 15 percent. Combination is automatic if the
employment interchange is at least 25 percent and is determined by
local opinion if
more than 15 but less than 25 percent.
Core-based statistical area (CBSA) is defined by the U.S. Office of
Management and Budget, as a statistical geographic entity consisting of
the county or counties associated with at least one urbanized area/
urban cluster of at least 10,000 population, plus adjacent counties
having a high degree of social and economic integration. Metropolitan
Statistical Areas (MSAs) and micropolitan statistical areas are the two
categories of CBSA (metropolitan areas have populations greater than
50,000; and micropolitan areas have populations between 10,000 and
50,000). In the case of very large cities where two or more CBSAs are
combined, these larger areas are referred to as combined statistical
areas (CSAs).
Corrected concentration pertains to the result of an accuracy or
precision assessment test of an open path analyzer in which a high-
concentration test or audit standard gas contained in a short test cell
is inserted into the optical measurement beam of the instrument. When
the pollutant concentration measured by the analyzer in such a test
includes both the pollutant concentration in the test cell and the
concentration in the atmosphere, the atmospheric pollutant
concentration must be subtracted from the test measurement to obtain
the corrected concentration test result. The corrected concentration is
equal to the measured concentration minus the average of the
atmospheric pollutant concentrations measured (without the test cell)
immediately before and immediately after the test.
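Restated as a formula (the notation is ours, not part of the
definition):
\[ C_{\mathrm{corrected}} = C_{\mathrm{measured}} - \frac{C_{\mathrm{before}} + C_{\mathrm{after}}}{2} \]
where \(C_{\mathrm{before}}\) and \(C_{\mathrm{after}}\) are the
atmospheric concentrations measured without the test cell immediately
before and after the test.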
Design value means the calculated concentration according to the
applicable appendix of part 50 of this chapter for the highest site in
an attainment or nonattainment area.
EDO means environmental data operations.
Effective concentration pertains to testing an open path analyzer
with a high-concentration calibration or audit standard gas contained
in a short test cell inserted into the optical measurement beam of the
instrument. Effective concentration is the equivalent ambient-level
concentration that would produce the same spectral absorbance over the
actual atmospheric monitoring path length as produced by the high-
concentration gas in the short test cell. Quantitatively, effective
concentration is equal to the actual concentration of the gas standard
in the test cell multiplied by the ratio of the path length of the test
cell to the actual atmospheric monitoring path length.
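Restated as a formula (the notation is ours):
\[ C_{\mathrm{effective}} = C_{\mathrm{cell}} \times \frac{L_{\mathrm{cell}}}{L_{\mathrm{path}}} \]
where \(C_{\mathrm{cell}}\) is the actual concentration of the gas
standard in the test cell, \(L_{\mathrm{cell}}\) is the path length of
the test cell, and \(L_{\mathrm{path}}\) is the actual atmospheric
monitoring path length.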
Federal equivalent method (FEM) means a method for measuring the
concentration of an air pollutant in the ambient air that has been
designated as an equivalent method in accordance with part 53 of this
chapter; it does not include a method for which an equivalent method
designation has been canceled in accordance with Sec. 53.11 or Sec.
53.16.
Federal reference method (FRM) means a method of sampling and
analyzing the ambient air for an air pollutant that is specified as a
reference method in an appendix to part 50 of this chapter, or a method
that has been designated as a reference method in accordance with this
part; it does not include a method for which a reference method
designation has been canceled in accordance with Sec. 53.11 or Sec.
53.16 of this chapter.
HNO3 means nitric acid.
Implementation plan means an implementation plan approved or
promulgated by the EPA pursuant to section 110 of the Act.
Local agency means any local government agency, other than the
state agency, which is charged by a state with the responsibility for
carrying out a portion of the annual monitoring network plan required
by Sec. 58.10.
Meteorological measurements means measurements of wind speed, wind
direction, barometric pressure, temperature, relative humidity, solar
radiation, ultraviolet radiation, and/or precipitation that occur at
SLAMS stations including the NCore and PAMS networks.
Metropolitan Statistical Area (MSA) means a CBSA associated with at
least one urbanized area of 50,000 population or greater. The central
county, plus adjacent counties with a high degree of integration,
comprise the area.
Monitor means an instrument, sampler, analyzer, or other device
that measures or assists in the measurement of atmospheric air
pollutants and which is acceptable for use in ambient air surveillance
under the applicable provisions of appendix C to this part.
Monitoring agency means a state, local or tribal agency responsible
for meeting the requirements of this part.
Monitoring organization means a monitoring agency responsible for
operating a monitoring site for which the quality assurance regulations
apply.
Monitoring path for an open path analyzer means the actual path in
space between two geographical locations over which the pollutant
concentration is measured and averaged.
Monitoring path length of an open path analyzer means the length of
the monitoring path in the atmosphere over which the average pollutant
concentration measurement (path-averaged concentration) is determined.
See also, optical measurement path length.
Monitoring planning area (MPA) means a contiguous geographic area
with established, well-defined boundaries, such as a CBSA, county or
state, having a common area that is used for planning monitoring
locations for PM2.5. A MPA may cross state boundaries, such
as the Philadelphia PA-NJ MSA, and be further subdivided into community
monitoring zones. The MPAs are generally oriented toward CBSAs or CSAs
with populations greater than 200,000, but for convenience, those
portions of a state that are not associated with CBSAs can be
considered as a single MPA.
NATTS means the national air toxics trends stations. This network
provides hazardous air pollution ambient data.
NCore means the National Core multipollutant monitoring stations.
Monitors at these sites are required to measure particles
(PM2.5, speciated PM2.5, PM10-2.5),
O3, SO2, CO, nitrogen oxides (NO/NOy),
and meteorology (wind speed, wind direction, temperature, relative
humidity).
Near-road monitor means any approved monitor meeting the applicable
specifications described in 40 CFR part 58, appendix D (sections 4.2.1,
4.3.2, 4.7.1(b)(2)) and appendix E
(section 6.4(a), Table E-4) for near-road measurement of
PM2.5, CO, or NO2.
Network means all stations of a given type or types.
Network Plan means the Annual Monitoring Network Plan described in
Sec. 58.10.
NH3 means ammonia.
NO2 means nitrogen dioxide.
NO means nitrogen oxide.
NOX means the sum of the concentrations of NO2 and NO.
NOy means the sum of all reactive nitrogen oxides, including
NO, NO2, and other nitrogen oxides referred to as
NOZ.
O3 means ozone.
Open path analyzer means an automated analytical method that
measures the average atmospheric pollutant concentration in situ along
one or more monitoring paths having a monitoring path length of 5
meters or more and that has been designated as a reference or
equivalent method under the provisions of part 53 of this chapter.
Optical measurement path length means the actual length of the
optical beam over which measurement of the pollutant is determined. The
path-integrated pollutant concentration measured by the analyzer is
divided by the optical measurement path length to determine the path-
averaged concentration. Generally, the optical measurement path length
is:
(1) Equal to the monitoring path length for a (bistatic) system
having a transmitter and a receiver at opposite ends of the monitoring
path;
(2) Equal to twice the monitoring path length for a (monostatic)
system having a transmitter and receiver at one end of the monitoring
path and a mirror or retroreflector at the other end; or
(3) Equal to some multiple of the monitoring path length for more
complex systems having multiple passes of the measurement beam through
the monitoring path.
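Restated as formulas (the notation is ours): the path-averaged
concentration is
\[ \bar{C} = \frac{C_{\mathrm{int}}}{L_{\mathrm{opt}}}, \qquad L_{\mathrm{opt}} = k \cdot L_{\mathrm{mon}}, \]
where \(C_{\mathrm{int}}\) is the path-integrated concentration
measured by the analyzer, \(L_{\mathrm{mon}}\) is the monitoring path
length, and \(k\) is 1 for a bistatic system, 2 for a monostatic
system, or a larger multiple for multi-pass systems.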
PAMS means photochemical assessment monitoring stations.
Pb means lead.
PM means particulate matter, including but not limited to
PM10, PM10C, PM2.5, and
PM10-2.5.
PM2.5 means particulate matter with an aerodynamic diameter less
than or equal to a nominal 2.5 micrometers as measured by a reference
method based on appendix L of part 50 and designated in accordance with
part 53 of this chapter, by an equivalent method designated in
accordance with part 53, or by an approved regional method designated
in accordance with appendix C to this part.
PM10 means particulate matter with an aerodynamic diameter less
than or equal to a nominal 10 micrometers as measured by a reference
method based on appendix J of part 50 of this chapter and designated in
accordance with part 53 of this chapter or by an equivalent method
designated in accordance with part 53.
PM10C means particulate matter with an aerodynamic diameter less
than or equal to a nominal 10 micrometers as measured by a reference
method based on appendix O of part 50 of this chapter and designated in
accordance with part 53 of this chapter or by an equivalent method
designated in accordance with part 53.
PM10-2.5 means particulate matter with an aerodynamic diameter less
than or equal to a nominal 10 micrometers and greater than a nominal
2.5 micrometers as measured by a reference method based on appendix O
to part 50 of this chapter and designated in accordance with part 53 of
this chapter or by an equivalent method designated in accordance with
part 53.
Point analyzer means an automated analytical method that measures
pollutant concentration in an ambient air sample extracted from the
atmosphere at a specific inlet probe point, and that has been
designated as a reference or equivalent method in accordance with part
53 of this chapter.
Primary monitor means the monitor identified by the monitoring
organization that provides concentration data used for comparison to
the NAAQS. For any specific site, only one monitor for each pollutant
can be designated in AQS as primary monitor for a given period of time.
The primary monitor identifies the default data source for creating a
combined site record for purposes of NAAQS comparisons.
Primary quality assurance organization (PQAO) means a monitoring
organization, a group of monitoring organizations or other organization
that is responsible for a set of stations that monitor the same
pollutant and for which data quality assessments can be pooled. Each
criteria pollutant sampler/monitor at a monitoring station must be
associated with only one PQAO.
Probe means the actual inlet where an air sample is extracted from
the atmosphere for delivery to a sampler or point analyzer for
pollutant analysis.
PSD monitoring network means a set of stations that provide
concentration information for a specific PSD permit.
PSD monitoring organization means a source owner/operator, a
government agency, or a contractor of the source or agency that
operates an ambient air pollution monitoring network for PSD purposes.
PSD reviewing authority means the state air pollution control
agency, local agency, other state agency, tribe, or other agency
authorized by the Administrator to carry out a permit program under
Sec. Sec. 51.165 and 51.166 of this chapter, or the Administrator in
the case of EPA-implemented permit programs under Sec. 52.21 of this
chapter.
PSD station means any station operated for the purpose of
establishing the effect on air quality of the emissions from a proposed
source for purposes of prevention of significant deterioration as
required by Sec. 51.24(n) of this chapter.
Regional Administrator means the Administrator of one of the ten
EPA Regional Offices or his or her authorized representative.
Reporting organization means an entity, such as a state, local, or
tribal monitoring agency, that reports air quality data to the EPA.
Site means a geographic location. One or more stations may be at
the same site.
SLAMS means state or local air monitoring stations. The SLAMS
include the ambient air quality monitoring sites and monitors that are
required by appendix D of this part and are needed for the monitoring
objectives of appendix D, including NAAQS comparisons, but may serve
other data purposes. The SLAMS includes NCore, PAMS, CSN, and all other
state or locally operated criteria pollutant monitors, operated in
accordance with this part, that have not been designated and approved by
the Regional Administrator as SPM stations in an annual monitoring
network plan.
SO2 means sulfur dioxide.
Special purpose monitor (SPM) station means a monitor included in
an agency's monitoring network that the agency has designated as a
special purpose monitor station in its annual monitoring network plan
and in the AQS, and which the agency does not count when showing
compliance with the minimum requirements of this subpart for the number
and siting of monitors of various types. Any SPM operated by an air
monitoring agency must be included in the periodic assessments and
annual monitoring network plan required by Sec. 58.10 and approved by
the Regional Administrator.
State agency means the air pollution control agency primarily
responsible for development and implementation of a State
Implementation Plan under the Act.
Station means a single monitor, or a group of monitors, located at
a particular site.
STN station means a PM2.5 chemical speciation station
designated to be part of the speciation trends network. This network
provides chemical species data of fine particulate.
Supplemental speciation station means a PM2.5 chemical
speciation station that is operated for monitoring agency needs and not
part of the STN.
Traceable means that a local standard has been compared and
certified, either directly or via not more than one intermediate
standard, to a National Institute of Standards and Technology (NIST)-
certified primary standard such as a NIST-traceable Reference Material
(NTRM) or a NIST-certified Gas Manufacturer's Internal Standard (GMIS).
TSP (total suspended particulates) means particulate matter as
measured by the method described in appendix B of part 50.
Urbanized area means an area with a residential population
of at least 50,000 people and which generally includes core census
block groups or blocks that have a population density of at least 1,000
people per square mile and surrounding census blocks that have an
overall density of at least 500 people per square mile. The Census
Bureau notes that under certain conditions, less densely settled
territory may be part of each Urbanized Area.
VOCs means volatile organic compounds.
■ 3. In Sec. 58.10:
■ a. Revise paragraphs (a)(1) and (a)(2).
■ b. Add paragraph (a)(12).
The revisions and addition read as follows:
Sec. 58.10 Annual monitoring network plan and periodic network
assessment.
(a)(1) Beginning July 1, 2007, the state, or where applicable
local, agency shall submit to the Regional Administrator an annual
monitoring network plan which shall provide for the documentation of
the establishment and maintenance of an air quality surveillance system
that consists of a network of SLAMS monitoring stations that can
include FRM, FEM, and ARM monitors that are part of SLAMS, NCore, CSN,
PAMS, and SPM stations. The plan shall include a statement of whether
the operation of each monitor meets the requirements of appendices A,
B, C, D, and E of this part, where applicable. The Regional
Administrator may require additional information in support of this
statement. The annual monitoring network plan must be made available
for public inspection and comment for at least 30 days prior to
submission to the EPA and the submitted plan shall include and address,
as appropriate, any received comments.
(2) Any annual monitoring network plan that proposes network
modifications (including new or discontinued monitoring sites, new
determinations that data are not of sufficient quality to be compared
to the NAAQS, and changes in identification of monitors as suitable or
not suitable for comparison against the annual PM2.5 NAAQS)
to SLAMS networks is subject to the approval of the EPA Regional
Administrator, who shall approve or disapprove the plan within 120 days
of submission of a complete plan to the EPA.
* * * * *
(12) A detailed description of the PAMS network being operated in
accordance with the requirements of appendix D to this part shall be
submitted as part of the annual monitoring network plan for review by
the EPA Administrator. The PAMS Network Description described in
section 5 of appendix D may be used to meet this requirement.
* * * * *
■ 4. In Sec. 58.11, revise paragraph (a)(3) to read as follows:
Sec. 58.11 Network technical requirements.
(a) * * *
(3) The owner or operator of an existing or a proposed source shall
follow the quality assurance criteria in appendix B to this part that
apply to PSD monitoring when operating a PSD site.
* * * * *
■ 5. In Sec. 58.12:
■ a. Revise paragraph (d)(1).
■ b. Revise paragraph (d)(3).
The revisions read as follows:
Sec. 58.12 Operating schedules.
* * * * *
(d) * * *
(1)(i) Manual PM2.5 samplers at required SLAMS stations
without a collocated continuously operating PM2.5 monitor
must operate on at least a 1-in-3 day schedule unless a waiver for an
alternative schedule has been approved per paragraph (d)(1)(ii) of this
section.
(ii) For SLAMS PM2.5 sites with both manual and
continuous PM2.5 monitors operating, the monitoring agency
may request approval for a reduction to 1-in-6 day PM2.5
sampling or for seasonal sampling from the EPA Regional Administrator.
Other requests for a reduction to 1-in-6 day PM2.5 sampling
or for seasonal sampling may be approved on a case-by-case basis. The
EPA Regional Administrator may grant sampling frequency reductions
after consideration of factors (including but not limited to the
historical PM2.5 data quality assessments, the location of
current PM2.5 design value sites, and their regulatory data
needs) if the Regional Administrator determines that the reduction in
sampling frequency will not compromise data needed for implementation
of the NAAQS. Required SLAMS stations whose measurements determine the
design value for their area and that are within plus or minus 10 percent
of the annual NAAQS, and all required sites where one or more 24-hour
values have exceeded the 24-hour NAAQS each year for a consecutive
period of at least 3 years are required to maintain at least a 1-in-3
day sampling frequency until the design value no longer meets these
criteria for 3 consecutive years. A continuously operating FEM or ARM
PM2.5 monitor satisfies this requirement unless it is
identified in the monitoring agency's annual monitoring network plan as
not appropriate for comparison to the NAAQS and the EPA Regional
Administrator has approved that the data from that monitor may be
excluded from comparison to the NAAQS.
(iii) Required SLAMS stations whose measurements determine the 24-
hour design value for their area and whose data are within plus or
minus 5 percent of the level of the 24-hour PM2.5 NAAQS must
have an FRM or FEM operate on a daily schedule if that area's design
value for the annual NAAQS is less than the level of the annual
PM2.5 standard. A continuously operating FEM or ARM
PM2.5 monitor satisfies this requirement unless it is
identified in the monitoring agency's annual monitoring network plan as
not appropriate for comparison to the NAAQS and the EPA Regional
Administrator has approved that the data from that monitor may be
excluded from comparison to the NAAQS. The daily schedule must be
maintained until the referenced design value no longer meets these
criteria for 3 consecutive years.
(iv) Changes in sampling frequency attributable to changes in
design values shall be implemented no later than January 1 of the
calendar year following the certification of such data as described in
Sec. 58.15.
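    For illustration only, the interplay of the design-value triggers in
paragraphs (d)(1)(i) through (d)(1)(iii) of this section can be read as
the sketch below. It is not regulatory text; the function and argument
names are hypothetical, and the inputs are assumed to be certified
design values expressed in the units of the respective NAAQS.

    # Illustrative sketch (not regulatory text) of the Sec. 58.12(d)(1)
    # sampling-frequency triggers. Inputs are assumed to be certified
    # design values in the units of the respective NAAQS; all names are
    # hypothetical.
    def required_pm25_frequency(determines_area_dv, annual_dv, annual_naaqs,
                                daily_dv, daily_naaqs,
                                exceeded_daily_naaqs_3_consecutive_years):
        # (d)(1)(iii): daily sampling when the site sets the 24-hour design
        # value, that value is within plus or minus 5 percent of the
        # 24-hour NAAQS, and the annual design value is below the annual
        # standard.
        if (determines_area_dv
                and abs(daily_dv - daily_naaqs) <= 0.05 * daily_naaqs
                and annual_dv < annual_naaqs):
            return "daily"
        # (d)(1)(ii): at least 1-in-3 day sampling when the site sets the
        # design value and it is within plus or minus 10 percent of the
        # annual NAAQS, or when the 24-hour NAAQS has been exceeded each
        # year for at least 3 consecutive years.
        if ((determines_area_dv
                and abs(annual_dv - annual_naaqs) <= 0.10 * annual_naaqs)
                or exceeded_daily_naaqs_3_consecutive_years):
            return "1-in-3 day (no reduction available)"
        # (d)(1)(i): baseline 1-in-3 day schedule, reducible to 1-in-6 day
        # or seasonal sampling only with an approved waiver.
        return "1-in-3 day (waiver-eligible)"

Under this reading, a change triggered by new design values would take
effect on the January 1 following certification of the data, consistent
with paragraph (d)(1)(iv).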
* * * * *
(3) Manual PM2.5 speciation samplers at STN stations
must operate on at least a 1-in-3 day sampling frequency unless a
reduction in sampling frequency has been approved by the EPA
Administrator based on factors such as the area's design value, the
role of the particular site in national health studies, the
correlation of the site's species data with nearby sites, and the
presence of other leveraged measurements.
* * * * *
■ 6. In Sec. 58.14, revise paragraph (a) to read as follows:
Sec. 58.14 System modification.
(a) The state, or where appropriate local, agency shall develop a
network modification plan and schedule to modify the ambient air
quality monitoring network that addresses the findings of the network
assessment required every 5 years by Sec. 58.10(d). The network
modification plan shall be submitted as part of the Annual Monitoring
Network Plan that is due no later than the year after submittal of the
network assessment.
* * * * *
■ 7. Revise Sec. 58.15 to read as follows:
Sec. 58.15 Annual air monitoring data certification.
(a) The state, or where appropriate local, agency shall submit to
the EPA Regional Administrator an annual air monitoring data
certification letter to certify data collected by FRM, FEM, and ARM
monitors at SLAMS and SPM sites that meet criteria in appendix A to
this part from January 1 to December 31 of the previous year. The head
official in each monitoring agency, or his or her designee, shall
certify that the previous year of ambient concentration and quality
assurance data are completely submitted to AQS and that the ambient
concentration data are accurate to the best of her or his knowledge,
taking into consideration the quality assurance findings. The annual
data certification letter is due by May 1 of each year.
(b) Along with each certification letter, the state shall submit to
the Regional Administrator an annual summary report of all the ambient
air quality data collected by FRM, FEM, and ARM monitors at SLAMS and
SPM sites. The annual report(s) shall be submitted for data collected
from January 1 to December 31 of the previous year. The annual summary
serves as the record of the specific data that are the object of the
certification letter.
(c) Along with each certification letter, the state shall submit to
the Regional Administrator a summary of the precision and accuracy data
for all ambient air quality data collected by FRM, FEM, and ARM
monitors at SLAMS and SPM sites. The summary of precision and accuracy
shall be submitted for data collected from January 1 to December 31 of
the previous year.
■ 8. In Sec. 58.16, revise paragraphs (a), (c), and (d) to read as
follows:
Sec. 58.16 Data submittal and archiving requirements.
(a) The state, or where appropriate, local agency, shall report to
the Administrator, via AQS, all ambient air quality data and associated
quality assurance data for SO2; CO; O3;
NO2; NO; NOy; NOX; Pb-TSP mass
concentration; Pb-PM10 mass concentration; PM10
mass concentration; PM2.5 mass concentration; for filter-
based PM2.5 FRM/FEM, the field blank mass; chemically
speciated PM2.5 mass concentration data; PM10-2.5
mass concentration; meteorological data from NCore and PAMS sites; and
metadata records and information specified by the AQS Data Coding
Manual (https://www.epa.gov/sites/production/files/2015-09/documents/aqs_data_coding_manual_0.pdf). Air quality data and information must be
submitted directly to the AQS via electronic transmission on the
specified schedule described in paragraphs (b) and (d) of this section.
* * * * *
(c) Air quality data submitted for each reporting period must be
edited, validated, and entered into the AQS (within the time limits
specified in paragraphs (b) and (d) of this section) pursuant to
appropriate AQS procedures. The procedures for editing and validating
data are described in the AQS Data Coding Manual and in each monitoring
agency's quality assurance project plan.
(d) The state shall report VOC and if collected, carbonyl,
NH3, and HNO3 data from PAMS sites, and
chemically speciated PM2.5 mass concentration data to AQS
within 6 months following the end of each quarterly reporting period
listed in paragraph (b) of this section.
* * * * *
■ 9. Revise Appendix A to part 58 to read as follows:
Appendix A to Part 58--Quality Assurance Requirements for Monitors Used
in Evaluations of National Ambient Air Quality Standards
1. General Information
2. Quality System Requirements
3. Measurement Quality Check Requirements
4. Calculations for Data Quality Assessments
5. Reporting Requirements
6. References
1. General Information
1.1 Applicability. (a) This appendix specifies the minimum
quality system requirements applicable to SLAMS and other monitor
types whose data are intended to be used to determine compliance
with the NAAQS (e.g., SPMs, tribal, CASTNET, NCore, industrial,
etc.), unless the EPA Regional Administrator has reviewed and
approved the monitor for exclusion from NAAQS use and these quality
assurance requirements.
(b) Primary quality assurance organizations are encouraged to
develop and maintain quality systems more extensive than the
required minimums. Additional guidance for the requirements
reflected in this appendix can be found in the ``Quality Assurance
Handbook for Air Pollution Measurement Systems,'' Volume II (see
reference 10 of this appendix) and at a national level in references
1, 2, and 3 of this appendix.
1.2 Primary Quality Assurance Organization (PQAO). A PQAO is
defined as a monitoring organization or a group of monitoring
organizations or other organization that is responsible for a set of
stations that monitors the same pollutant and for which data quality
assessments will be pooled. Each criteria pollutant sampler/monitor
must be associated with only one PQAO. In some cases, data quality
is assessed at the PQAO level.
1.2.1 Each PQAO shall be defined such that measurement
uncertainty among all stations in the organization can be expected
to be reasonably homogeneous as a result of common factors. Common
factors that should be considered in defining PQAOs include:
(a) Operation by a common team of field operators according to a
common set of procedures;
(b) Use of a common quality assurance project plan (QAPP) or
standard operating procedures;
(c) Common calibration facilities and standards;
(d) Oversight by a common quality assurance organization; and
(e) Support by a common management organization (i.e., state
agency) or laboratory.
Since data quality assessments are made and data certified at
the PQAO level, the monitoring organization identified as the PQAO
will be responsible for the oversight of the quality of data of all
monitoring organizations within the PQAO.
1.2.2 A monitoring organization having difficulty describing its
PQAO or assigning specific monitors to a primary quality assurance
organization should consult with the appropriate EPA Regional
Office. Any consolidation of monitoring organizations to PQAOs shall
be subject to final approval by the appropriate EPA Regional Office.
1.2.3 Each PQAO is required to implement a quality system that
provides sufficient information to assess the quality of the
monitoring data. The quality system must, at a minimum, include the
specific requirements described in this appendix. Failure to conduct
or pass a required check or procedure, or a series of required
checks or procedures, does not by itself invalidate data for
regulatory decision making. Rather, PQAOs and the EPA shall use the
checks and procedures required in this appendix in combination with
other data quality information, reports, and similar documentation
that demonstrate overall compliance with Part 58. Accordingly, the
EPA and PQAOs shall use a ``weight of evidence'' approach when
determining the suitability of data for regulatory decisions.
The EPA reserves the authority to use or not use monitoring data
submitted by a monitoring organization when making regulatory
decisions based on the EPA's assessment of the quality of the data.
Consensus-built validation templates or validation criteria already
approved in QAPPs should be used as the basis for the weight of
evidence approach.
1.3 Definitions.
(a) Measurement Uncertainty. A term used to describe deviations
from a true concentration or estimate that are related to the
measurement process and not to spatial or temporal population
attributes of the air being measured.
(b) Precision. A measurement of mutual agreement among
individual measurements of the same property usually under
prescribed similar conditions, expressed generally in terms of the
standard deviation.
(c) Bias. The systematic or persistent distortion of a
measurement process which causes errors in one direction.
(d) Accuracy. The degree of agreement between an observed value
and an accepted reference value. Accuracy includes a combination of
random error (imprecision) and systematic error (bias) components
which are due to sampling and analytical operations.
(e) Completeness. A measure of the amount of valid data obtained
from a measurement system compared to the amount that was expected
to be obtained under correct, normal conditions.
(f) Detection Limit. The lowest concentration or amount of
target analyte that can be determined to be different from zero by a
single measurement at a stated level of probability.
1.4 Measurement Quality Checks. The measurement quality checks
described in section 3 of this appendix shall be reported to AQS and
are included in the data required for certification.
1.5 Assessments and Reports. Periodic assessments and
documentation of data quality are required to be reported to the
EPA. To provide national uniformity in this assessment and reporting
of data quality for all networks, specific assessment and reporting
procedures are prescribed in detail in sections 3, 4, and 5 of this
appendix. On the other hand, the selection and extent of the quality
assurance and quality control activities used by a monitoring
organization depend on a number of local factors such as field and
laboratory conditions, the objectives for monitoring, the level of
data quality needed, the expertise of assigned personnel, the cost
of control procedures, pollutant concentration levels, etc.
Therefore, quality system requirements in section 2 of this appendix
are specified in general terms to allow each monitoring organization
to develop a quality system that is most efficient and effective for
its own circumstances while achieving the data quality objectives
described in this appendix.
2. Quality System Requirements
A quality system (reference 1 of this appendix) is the means by
which an organization manages the quality of the monitoring
information it produces in a systematic, organized manner. It
provides a framework for planning, implementing, assessing and
reporting work performed by an organization and for carrying out
required quality assurance and quality control activities.
2.1 Quality Management Plans and Quality Assurance Project
Plans. All PQAOs must develop a quality system that is described and
approved in quality management plans (QMP) and QAPPs to ensure that
the monitoring results:
(a) Meet a well-defined need, use, or purpose (reference 5 of
this appendix);
(b) Provide data of adequate quality for the intended monitoring
objectives;
(c) Satisfy stakeholder expectations;
(d) Comply with applicable standards specifications;
(e) Comply with statutory (and other legal) requirements; and
(f) Reflect consideration of cost and economics.
2.1.1 The QMP describes the quality system in terms of the
organizational structure, functional responsibilities of management
and staff, lines of authority, and required interfaces for those
planning, implementing, assessing and reporting activities involving
environmental data operations (EDO). The QMP must be suitably
documented in accordance with EPA requirements (reference 2 of this
appendix), and approved by the appropriate Regional Administrator,
or his or her representative. The quality system described in the
QMP will be reviewed during the systems audits described in section
2.5 of this appendix. Organizations that implement long-term
monitoring programs with EPA funds should have a separate QMP
document. Smaller organizations, organizations that do infrequent
work with the EPA or have monitoring programs of limited size or
scope may combine the QMP with the QAPP if approved by, and subject
to any conditions of the EPA. Additional guidance on this process
can be found in reference 10 of this appendix. Approval of the
recipient's QMP by the appropriate Regional Administrator or his or
her representative may allow delegation of authority to the PQAO's
independent quality assurance function to review and approve
environmental data collection activities adequately described and
covered under the scope of the QMP and documented in appropriate
planning documents (QAPP). Where a PQAO or monitoring organization
has been delegated authority to review and approve their QAPP, an
electronic copy must be submitted to the EPA region at the time it
is submitted to the PQAO/monitoring organization's QAPP approving
authority. The QAPP will be reviewed by the EPA during systems
audits or circumstances related to data quality. The QMP submission
and approval dates for PQAOs/monitoring organizations must be
reported to AQS either by the monitoring organization or the EPA
Region.
2.1.2 The QAPP is a formal document describing, in sufficient
detail, the quality system that must be implemented to ensure that
the results of work performed will satisfy the stated objectives.
PQAOs must develop QAPPs that describe how the organization intends
to control measurement uncertainty to an appropriate level in order
to achieve the data quality objectives for the EDO. The quality
assurance policy of the EPA requires every EDO to have a written and
approved QAPP prior to the start of the EDO. It is the
responsibility of the PQAO/monitoring organization to adhere to this
policy. The QAPP must be suitably documented in accordance with EPA
requirements (reference 3 of this appendix) and include standard
operating procedures for all EDOs either within the document or by
appropriate reference. The QAPP must identify each PQAO operating
monitors under the QAPP as well as generally identify the sites and
monitors to which it is applicable either within the document or by
appropriate reference. The QAPP submission and approval dates must
be reported to AQS either by the monitoring organization or the EPA
Region.
2.1.3 The PQAO/monitoring organization's quality system must
have adequate resources both in personnel and funding to plan,
implement, assess and report on the achievement of the requirements
of this appendix and its approved QAPP.
2.2 Independence of Quality Assurance. The PQAO must provide for
a quality assurance management function, that aspect of the overall
management system of the organization that determines and implements
the quality policy defined in a PQAO's QMP. Quality management
includes strategic planning, allocation of resources and other
systematic planning activities (e.g., planning, implementation,
assessing and reporting) pertaining to the quality system. The
quality assurance management function must have sufficient technical
expertise and management authority to conduct independent oversight
and assure the implementation of the organization's quality system
relative to the ambient air quality monitoring program and should be
organizationally independent of environmental data generation
activities.
2.3 Data Quality Performance Requirements.
2.3.1 Data Quality Objectives. The DQOs, or the results of other
systematic planning processes, are statements that define the
appropriate type of data to collect and specify the tolerable levels
of potential decision errors that will be used as a basis for
establishing the quality and quantity of data needed to support the
monitoring objectives (reference 5 of this appendix). The DQOs will
be developed by the EPA to support the primary regulatory objectives
for each criteria pollutant. As they are developed, they will be
added to the regulation. The quality of the conclusions derived from
data interpretation can be affected by population uncertainty
(spatial or temporal uncertainty) and measurement uncertainty
(uncertainty associated with collecting, analyzing, reducing and
reporting concentration data). This appendix focuses on assessing
and controlling measurement uncertainty.
2.3.1.1 Measurement Uncertainty for Automated and Manual
PM2.5 Methods. The goal for acceptable measurement
uncertainty is defined for precision as an upper 90 percent confidence
limit for the coefficient of variation (CV) of 10 percent and plus or
minus 10 percent for total bias.
2.3.1.2 Measurement Uncertainty for Automated O3
Methods. The goal for acceptable measurement uncertainty is defined
for precision as an upper 90 percent confidence limit for the CV of
7 percent and for bias as an upper 95 percent confidence limit for
the absolute bias of 7 percent.
2.3.1.3 Measurement Uncertainty for Pb Methods. The goal for
acceptable measurement uncertainty is defined for precision as an
upper 90 percent confidence limit for the CV of 20 percent and for
bias as an upper 95 percent confidence limit for the absolute bias
of 15 percent.
2.3.1.4 Measurement Uncertainty for NO2. The goal for
acceptable measurement uncertainty is defined for precision as an
upper 90 percent confidence limit for the CV of 15 percent and for
bias as an upper 95 percent confidence limit for the absolute bias
of 15 percent.
2.3.1.5 Measurement Uncertainty for SO2. The goal for
acceptable measurement uncertainty for precision is defined as an
upper 90 percent confidence limit for the CV of 10 percent and for
bias as an upper 95 percent confidence limit for the absolute bias
of 10 percent.
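    For readers checking these goals against their own data, the sketch
below shows one conventional way to compute the bounds named above. It
assumes the chi-squared (precision) and Student's t (bias) estimators
that section 4 of this appendix applies to percent differences from the
section 3 checks; since section 4 is not reproduced at this point, the
sketch is illustrative rather than normative, and all names are
hypothetical.

    # Illustrative sketch of the confidence bounds named above, assuming
    # the chi-squared (precision) and Student's t (bias) estimators used
    # in section 4 of this appendix. d is a list of percent differences
    # from measurement quality checks. Not regulatory text.
    from math import sqrt
    from scipy.stats import chi2, t

    def cv_upper_bound(d, conf=0.90):
        # Upper confidence limit on the coefficient of variation of the
        # percent differences.
        n = len(d)
        mean_d = sum(d) / n
        sd = sqrt(sum((x - mean_d) ** 2 for x in d) / (n - 1))
        return sd * sqrt((n - 1) / chi2.ppf(1 - conf, n - 1))

    def bias_upper_bound(d, conf=0.95):
        # Upper confidence limit on the mean absolute percent difference.
        n = len(d)
        a = [abs(x) for x in d]
        mean_a = sum(a) / n
        sd_a = sqrt(sum((x - mean_a) ** 2 for x in a) / (n - 1))
        return mean_a + t.ppf(conf, n - 1) * sd_a / sqrt(n)

Under this reading, a PM2.5 data set would meet the section 2.3.1.1
precision goal when cv_upper_bound(d) is at or below 10.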
2.4 National Performance Evaluation Programs. The PQAO shall
provide for the implementation of a program of independent and
adequate audits of all monitors providing data for NAAQS compliance
purposes including the provision of adequate resources for such
audit programs. A monitoring plan (or QAPP) which provides for PQAO
participation in the EPA's National Performance Audit Program
(NPAP), the PM2.5 Performance Evaluation Program
(PM2.5-PEP) program and the Pb Performance Evaluation
Program (Pb-PEP) and indicates the consent of the PQAO for the EPA
to apply an appropriate portion of the grant funds, which the EPA
would otherwise award to the PQAO for these QA activities, will be
deemed by the EPA to meet this requirement. For clarification and to
participate, PQAOs should contact either the appropriate EPA
regional quality assurance (QA) coordinator at the appropriate EPA
Regional Office location, or the NPAP coordinator at the EPA Air
Quality Assessment Division, Office of Air Quality Planning and
Standards, in Research Triangle Park, North Carolina. The PQAOs that
plan to implement these programs (self-implement) rather than use
the federal programs must meet the adequacy requirements found in
the appropriate sections that follow, as well as meet the definition
of independent assessment that follows.
2.4.1 Independent assessment. An assessment performed by a
qualified individual, group, or organization that is not part of the
organization directly performing and accountable for the work being
assessed. This auditing organization must not be involved with the
generation of the ambient air monitoring data. An organization can
conduct the performance evaluation (PE) if it can meet this
definition and has a management structure that, at a minimum, will
allow for the separation of its routine sampling personnel from its
auditing personnel by two levels of management. In addition, the
sample analysis of audit filters must be performed by a laboratory
facility and laboratory equipment separate from the facilities used
for routine sample analysis. Field and laboratory personnel will be
required to meet PE field and laboratory training and certification
requirements to establish comparability to federally implemented
programs.
2.5 Technical Systems Audit Program. Technical systems audits
(TSA) of each PQAO shall be conducted at least every 3 years by the
appropriate EPA Regional Office and reported to the AQS. If a PQAO
is made up of more than one monitoring organization, all monitoring
organizations in the PQAO should be audited within 6 years (two TSA
cycles of the PQAO). As an example, if a state has five local
monitoring organizations that are consolidated under one PQAO, all
five local monitoring organizations should receive a technical
systems audit within a 6-year period. Systems audit programs are
described in reference 10 of this appendix.
2.6 Gaseous and Flow Rate Audit Standards.
2.6.1 Gaseous pollutant concentration standards (permeation
devices or cylinders of compressed gas) used to obtain test
concentrations for CO, SO2, NO, and NO2 must
be traceable to either a National Institute of Standards and
Technology (NIST) Traceable Reference Material (NTRM) or a NIST-
certified Gas Manufacturer's Internal Standard (GMIS), certified in
accordance with one of the procedures given in reference 4 of this
appendix. Vendors advertising certification with the procedures
provided in reference 4 of this appendix and distributing gases as
``EPA Protocol Gas'' for ambient air monitoring purposes must
participate in the EPA Ambient Air Protocol Gas Verification Program
or not use ``EPA'' in any form of advertising. Monitoring
organizations must provide information to the EPA on the gas
producers they use on an annual basis and those PQAOs purchasing
standards will be obligated, at the request of the EPA, to
participate in the program at least once every 5 years by sending a
new unused standard to a designated verification laboratory.
2.6.2 Test concentrations for O3 must be obtained in
accordance with the ultraviolet photometric calibration procedure
specified in appendix D to Part 50 of this chapter and by means of a
certified NIST-traceable O3 transfer standard. Consult
references 7 and 8 of this appendix for guidance on transfer
standards for O3.
2.6.3 Flow rate measurements must be made by a flow measuring
instrument that is NIST-traceable to an authoritative volume or
other applicable standard. Guidance for certifying some types of
flowmeters is provided in reference 10 of this appendix.
2.7 Primary Requirements and Guidance. Requirements and guidance
documents for developing the quality system are contained in
references 1 through 11 of this appendix, which also contain many
suggested procedures, checks, and control specifications. Reference
10 describes specific guidance for the development of a quality
system for data collected for comparison to the NAAQS. Many specific
quality control checks and specifications for methods are included
in the respective reference methods described in Part 50 of this
chapter or in the respective equivalent method descriptions
available from the EPA (reference 6 of this appendix). Similarly,
quality control procedures related to specifically designated
reference and equivalent method monitors are contained in the
respective operation or instruction manuals associated with those
monitors.
3. Measurement Quality Check Requirements
This section provides the requirements for PQAOs to perform the
measurement quality checks that can be used to assess data quality.
Data from these checks are required to be submitted to the AQS
within the same time frame as routinely-collected ambient
concentration data as described in 40 CFR 58.16. Table A-1 of this
appendix provides a summary of the types and frequency of the
measurement quality checks that will be described in this section.
3.1 Gaseous Monitors of SO2, NO2,
O3, and CO.
3.1.1 One-Point Quality Control (QC) Check for SO2,
NO2, O3, and CO. (a) A one-point QC check must
be performed at least once every 2 weeks on each automated monitor
used to measure SO2, NO2, O3 and
CO. With the advent of automated calibration systems, more frequent
checking is strongly encouraged. See reference 10 of this appendix
for guidance on the review procedure. The QC check is made by
challenging the monitor with a QC check gas of known concentration
(effective concentration for open path monitors) between the
prescribed range of 0.005 and 0.08 parts per million (ppm) for
SO2, NO2, and O3, and between the
prescribed range of 0.5 and 5 ppm for CO monitors. The QC check gas
concentration selected within the prescribed range should be related
to the monitoring objectives for the monitor. If monitoring at an
NCore site or for trace level monitoring, the QC check concentration
should be selected to represent the mean or median concentrations at
the site. If the mean or median concentrations at trace gas sites
are below the MDL of the instrument the agency can select the lowest
concentration in the prescribed range that can be practically
achieved. If the mean or median concentrations at trace gas sites
are above the prescribed range the agency can select the highest
concentration in the prescribed range. An additional QC check point
is encouraged for those organizations that may have occasional high
values or would like to confirm the monitors' linearity at the
higher end of the operational range or around NAAQS concentrations.
If monitoring for NAAQS decisions, the QC concentration can be
selected at a higher concentration within the prescribed range but
should also consider precision points around mean or median monitor
concentrations.
(b) Point analyzers must operate in their normal sampling mode
during the QC check and the test atmosphere must pass through all
filters, scrubbers, conditioners and other components used during
normal ambient sampling and as much of the ambient air inlet system
as is practicable. The QC check
must be conducted before any calibration or adjustment to the
monitor.
(c) Open path monitors are tested by inserting a test cell
containing a QC check gas concentration into the optical measurement
beam of the instrument. If possible, the normally used transmitter,
receiver, and as appropriate, reflecting devices should be used
during the test, and the normal monitoring configuration of the
instrument should be altered as little as possible to accommodate
the test cell for the test. However, if permitted by the associated
operation or instruction manual, an alternate local light source or
an alternate optical path that does not include the normal
atmospheric monitoring path may be used. The actual concentration of
the QC check gas in the test cell must be selected to produce an
effective concentration in the range specified earlier in this
section. Generally, the QC test concentration measurement will be
the sum of the atmospheric pollutant concentration and the QC test
concentration. As such, the result must be corrected to remove the
atmospheric concentration contribution. The corrected concentration
is obtained by subtracting the average of the atmospheric
concentrations measured by the open path instrument under test
immediately before and immediately after the QC test from the QC
check gas concentration measurement. If the difference between these
before and after measurements is greater than 20 percent of the
effective concentration of the test gas, discard the test result and
repeat the test. If possible, open path monitors should be tested
during periods when the atmospheric pollutant concentrations are
relatively low and steady.
(d) Report the audit concentration of the QC gas and the
corresponding measured concentration indicated by the monitor to
AQS. The percent differences between these concentrations are used
to assess the precision and bias of the monitoring data as described
in sections 4.1.2 (precision) and 4.1.3 (bias) of this appendix.
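    The underlying statistic is the simple relative percent difference;
a minimal sketch follows, with illustrative names only. The same
statistic is applied to the flow rate checks in sections 3.2 through
3.4 of this appendix, with flow rates in lieu of concentrations.

    # Percent difference between the monitor's indicated value and the
    # audit (QC check) value, the quantity reported to AQS and fed into
    # the section 4 precision and bias assessments. Illustrative only.
    def percent_difference(measured, audit):
        return (measured - audit) / audit * 100.0

    # Example: a monitor reading 0.0465 ppm against a 0.0450 ppm O3
    # check gas gives percent_difference(0.0465, 0.0450), about +3.3.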
3.1.2 Annual Performance Evaluation for SO2, NO2, O3, or CO. A
performance evaluation must be conducted on each primary monitor
once a year. This can be accomplished by evaluating 25 percent of
the primary monitors each quarter. The evaluation should be
conducted by a trained experienced technician other than the routine
site operator.
3.1.2.1 The evaluation is made by challenging the monitor with
audit gas standards of known concentration from at least three audit
levels. One point must be within two to three times the method
detection limit of the instruments within the PQAO's network, the
second point will be less than or equal to the 99th percentile of
the data at the site or the network of sites in the PQAO or the next
highest audit concentration level. The third point can be around the
primary NAAQS or the highest 3-year concentration at the site or the
network of sites in the PQAO. An additional fourth level is encouraged
for those agencies that would like to confirm the monitors'
linearity at the higher end of the operational range. In rare
circumstances, there may be sites measuring concentrations above
audit level 10. Notify the appropriate EPA region and the AQS
program in order to make accommodations for auditing at levels above
level 10.
----------------------------------------------------------------------------------------------------------------
Concentration Range, ppm
Audit level ---------------------------------------------------------------
O3 SO2 NO2 CO
----------------------------------------------------------------------------------------------------------------
1............................................... 0.004-0.0059 0.0003-0.0029 0.0003-0.0029 0.020-0.059
2............................................... 0.006-0.019 0.0030-0.0049 0.0030-0.0049 0.060-0.199
3............................................... 0.020-0.039 0.0050-0.0079 0.0050-0.0079 0.200-0.899
4............................................... 0.040-0.069 0.0080-0.0199 0.0080-0.0199 0.900-2.999
5............................................... 0.070-0.089 0.0200-0.0499 0.0200-0.0499 3.000-7.999
6............................................... 0.090-0.119 0.0500-0.0999 0.0500-0.0999 8.000-15.999
7............................................... 0.120-0.139 0.1000-0.1499 0.1000-0.2999 16.000-30.999
8............................................... 0.140-0.169 0.1500-0.2599 0.3000-0.4999 31.000-39.999
9............................................... 0.170-0.189 0.2600-0.7999 0.5000-0.7999 40.000-49.999
10.............................................. 0.190-0.259 0.8000-1.000 0.8000-1.000 50.000-60.000
----------------------------------------------------------------------------------------------------------------
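    Because the acceptable ranges differ by pollutant, the table above
is easily encoded as a lookup. The sketch below transcribes the O3
column (the other columns follow the same pattern); it is illustrative
only, and the names are not part of the regulation.

    # Audit level lookup for O3, transcribed from the table above (ppm).
    # Other pollutants follow the same pattern. Illustrative only.
    O3_AUDIT_LEVELS = [
        (0.004, 0.0059), (0.006, 0.019), (0.020, 0.039), (0.040, 0.069),
        (0.070, 0.089), (0.090, 0.119), (0.120, 0.139), (0.140, 0.169),
        (0.170, 0.189), (0.190, 0.259),
    ]

    def o3_audit_level(conc_ppm):
        for level, (low, high) in enumerate(O3_AUDIT_LEVELS, start=1):
            if low <= conc_ppm <= high:
                return level
        # Outside levels 1-10: consult the EPA region per section 3.1.2.1.
        return None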
3.1.2.2 The NO2 audit techniques may vary depending
on the ambient monitoring method. For chemiluminescence-type
NO2 analyzers, gas phase titration (GPT) techniques
should be based on EPA guidance documents and monitoring agency
experience. NO2 gas standards may be more appropriate
than GPT for direct NO2 methods that do not employ
converters. Care should be taken to ensure the stability of such gas
standards prior to use.
3.1.2.3 The standards from which audit gas test concentrations
are obtained must meet the specifications of section 2.6.1 of this
appendix. The gas standards and equipment used for the performance
evaluation must not be the same as the standards and equipment used
for one-point QC, calibrations, span evaluations or NPAP.
3.1.2.4 For point analyzers, the evaluation shall be carried out
by allowing the monitor to analyze the audit gas test atmosphere in
its normal sampling mode such that the test atmosphere passes
through all filters, scrubbers, conditioners, and other sample inlet
components used during normal ambient sampling and as much of the
ambient air inlet system as is practicable.
3.1.2.5 Open-path monitors are evaluated by inserting a test
cell containing the various audit gas concentrations into the
optical measurement beam of the instrument. If possible, the
normally used transmitter, receiver, and, as appropriate, reflecting
devices should be used during the evaluation, and the normal
monitoring configuration of the instrument should be modified as
little as possible to accommodate the test cell for the evaluation.
However, if permitted by the associated operation or instruction
manual, an alternate local light source or an alternate optical path
that does not include the normal atmospheric monitoring path may be
used. The actual concentrations of the audit gas in the test cell
must be selected to produce effective concentrations in the
evaluation level ranges specified in this section of this appendix.
Generally, each evaluation concentration measurement result will be
the sum of the atmospheric pollutant concentration and the
evaluation test concentration. As such, the result must be corrected
to remove the atmospheric concentration contribution. The corrected
concentration is obtained by subtracting the average of the
atmospheric concentrations measured by the open path instrument
under test immediately before and immediately after the evaluation
test (or preferably before and after each evaluation concentration
level) from the evaluation concentration measurement. If the
difference between the before and after measurements is greater than
20 percent of the effective concentration of the test gas standard,
discard the test result for that concentration level and repeat the
test for that level. If possible, open path monitors should be
evaluated during periods when the atmospheric pollutant
concentrations are relatively low and steady. Also, if the open-path
instrument is not installed in a permanent manner, the monitoring
path length must be reverified to be within plus or minus 3 percent to
validate the evaluation since the monitoring path length is critical
to the determination of the effective concentration.
3.1.2.6 Report both the evaluation concentrations (effective
concentrations for open-path monitors) of the audit gases and the
corresponding measured concentration (corrected concentrations, if
applicable, for open path monitors) indicated or produced by the
monitor being tested to AQS. The percent differences between these
concentrations are used to assess the quality of the monitoring data
as described in section 4.1.1 of this appendix.
3.1.3 National Performance Audit Program (NPAP).
The NPAP is a performance evaluation which is a type of audit
where quantitative data are collected independently in order to
evaluate the proficiency of an analyst, monitoring instrument or
laboratory. Due to the implementation approach used in the
program, NPAP provides a national independent assessment of
performance while maintaining a consistent level of data quality.
Details of the program can be found in reference 11 of this
appendix. The program requirements include:
3.1.3.1 Performing audits of the primary monitors at 20 percent
of monitoring sites per year, and 100 percent of the sites every 6
years. High-priority sites may be audited more frequently. Since not
all gaseous criteria pollutants are monitored at every site within a
PQAO, it is not required that 20 percent of the primary monitors for
each pollutant receive an NPAP audit each year, only that 20 percent
of the PQAO's monitoring sites receive an NPAP audit. It is expected
that over the 6-year period all primary monitors for all gaseous
pollutants will receive an NPAP audit.
3.1.3.2 Developing a delivery system that will allow for the
audit concentration gases to be introduced to the probe inlet where
logistically feasible.
3.1.3.3 Using audit gases that are verified against the NIST
standard reference methods or special review procedures and
validated annually for CO, SO2 and NO2, and at
the beginning of each quarter of audits for O3.
3.1.3.4 As described in section 2.4 of this appendix, the PQAO
may elect, on an annual basis, to utilize the federally implemented
NPAP program. If the PQAO plans to self-implement NPAP, the EPA will
establish training and other technical requirements for PQAOs to
establish comparability to federally implemented programs. In
addition to meeting the requirements in sections 3.1.3.1 through
3.1.3.3 of this appendix, the PQAO must:
(a) Utilize an audit system that is equivalent to the federally
implemented NPAP audit system and separate from the equipment used in
annual performance evaluations.
(b) Perform a whole system check by having the NPAP system
tested against an independent and qualified EPA lab, or equivalent.
(c) Evaluate the system with the EPA NPAP program through
collocated auditing at an acceptable number of sites each year (at
least one for an agency network of five or fewer sites; at least two
for a network with more than five sites).
(d) Incorporate the NPAP in the PQAO's quality assurance project
plan.
(e) Be subject to review by independent, EPA-trained personnel.
(f) Participate in initial and update training/certification
sessions.
3.1.3.5 OAQPS, in consultation with the relevant EPA Regional
Office, may approve the PQAO's plan to self-implement NPAP if the
OAQPS determines that the PQAO's self-implementation plan is
equivalent to the federal programs and adequate to meet the
objectives of national consistency and data quality.
3.2 PM2.5.
3.2.1 Flow Rate Verification for PM2.5. A one-point flow rate
verification check must be performed at least once every month (each
verification minimally separated by 14 days) on each monitor used to
measure PM2.5. The verification is made by checking the
operational flow rate of the monitor. If the verification is made in
conjunction with a flow rate adjustment, it must be made prior to
such flow rate adjustment. For the standard procedure, use a flow
rate transfer standard certified in accordance with section 2.6 of
this appendix to check the monitor's normal flow rate. Care should
be used in selecting and using the flow rate measurement device such
that it does not alter the normal operating flow rate of the
monitor. Report the flow rate of the transfer standard and the
corresponding flow rate measured by the monitor to AQS. The percent
differences between the audit and measured flow rates are used to
assess the bias of the monitoring data as described in section 4.2.2
of this appendix (using flow rates in lieu of concentrations).
3.2.2 Semi-Annual Flow Rate Audit for PM2.5. Audit the flow rate
of the particulate monitor twice a year. The two audits should
ideally be spaced between 5 and 7 months apart. The EPA strongly
encourages more frequent auditing. The audit should (preferably) be
conducted by a trained experienced technician other than the routine
site operator. The audit is made by measuring the monitor's normal
operating flow rate(s) using a flow rate transfer standard certified
in accordance with section 2.6 of this appendix. The flow rate
standard used for auditing must not be the same flow rate standard
used for verifications or to calibrate the monitor. However, both
the calibration standard and the audit standard may be referenced to
the same primary flow rate or volume standard. Care must be taken in
auditing the flow rate to be certain that the flow measurement
device does not alter the normal operating flow rate of the monitor.
Report the audit flow rate of the transfer standard and the
corresponding flow rate measured by the monitor to AQS. The percent
differences between these flow rates are used to evaluate monitor
performance.
3.2.3 Collocated Quality Control Sampling Procedures for PM2.5.
For each pair of collocated monitors, designate one sampler as the
primary monitor whose concentrations will be used to report air
quality for the site, and designate the other as the quality control
monitor. There can be only one primary monitor at a monitoring site
for a given time period.
3.2.3.1 For each distinct monitoring method designation (FRM or
FEM) that a PQAO is using for a primary monitor, the PQAO must have
15 percent of the primary monitors of each method designation
collocated (values of 0.5 and greater round up); and have at least
one collocated quality control monitor (if the total number of
monitors is less than three). The first collocated monitor must be a
designated FRM monitor.
3.2.3.2 In addition, monitors selected for collocation must also
meet the following requirements:
(a) A primary monitor designated as an EPA FRM shall be
collocated with a quality control monitor having the same EPA FRM
method designation.
(b) For each primary monitor designated as an EPA FEM used by
the PQAO, 50 percent of the monitors designated for collocation, or
the first if only one collocation is necessary, shall be collocated
with a FRM quality control monitor and 50 percent of the monitors
shall be collocated with a monitor having the same method
designation as the FEM primary monitor. If an odd number of
collocated monitors is required, the additional monitor shall be a
FRM quality control monitor. An example of the distribution of
collocated monitors for each unique FEM is provided below. Table A-2
of this appendix demonstrates the collocation procedure with a PQAO
having one type of primary FRM and multiple primary FEMs.
----------------------------------------------------------------------------------------------------------------
                                                                                                    # Collocated
  # Primary FEMs of a unique method designation     # Collocated     # Collocated with an FRM    with same method
                                                                                                    designation
----------------------------------------------------------------------------------------------------------------
1-9............................................          1                      1                        0
10-16..........................................          2                      1                        1
17-23..........................................          3                      2                        1
24-29..........................................          4                      2                        2
30-36..........................................          5                      3                        2
37-43..........................................          6                      3                        3
----------------------------------------------------------------------------------------------------------------
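    One reading that reproduces the table: apply the 15 percent rule of
section 3.2.3.1 (values of 0.5 and greater round up, with a minimum of
one), then split the collocated monitors per paragraph (b) of section
3.2.3.2, the odd monitor going to the FRM pairing. The sketch below is
illustrative, not regulatory, and the names are hypothetical.

    import math

    # Illustrative reproduction of Table A-2 for one unique FEM method
    # designation: 15 percent of primary monitors collocated (0.5 and
    # greater rounds up, minimum of one), half paired with an FRM and
    # half with the same designation, the odd monitor going to the FRM.
    def collocation_counts(n_primary_fems):
        total = max(1, math.floor(0.15 * n_primary_fems + 0.5))
        with_frm = math.ceil(total / 2)
        with_same_designation = total - with_frm
        return total, with_frm, with_same_designation

    # collocation_counts(9)  -> (1, 1, 0), matching the 1-9 row
    # collocation_counts(29) -> (4, 2, 2), matching the 24-29 row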
3.2.3.3 Since the collocation requirements are used to assess
precision of the primary monitors and there can be only one primary
monitor at a monitoring site, a site can only count toward the
collocation of the method designation of the primary monitor at that
site.
3.2.3.4 The collocated monitors should be deployed according to
the following protocol:
(a) Fifty percent of the collocated quality control monitors
should be deployed at sites with annual average or daily
concentrations estimated to be within plus or minus 20 percent of
either the annual or 24-hour NAAQS and the remainder at the PQAO's
discretion;
(b) If an organization has no sites with annual average or daily
concentrations within plus or minus 20 percent of the annual NAAQS or
24-hour NAAQS, 50 percent of the collocated quality control monitors
should be deployed at those sites with the annual mean concentrations
or 24-hour concentrations among the highest for all sites in the
network and the remainder at the PQAO's discretion.
(c) The two collocated monitors must be within 4 meters (inlet
to inlet) of each other and at least 2 meters apart for flow rates
greater than 200 liters/min or at least 1 meter apart for samplers
having flow rates less than 200 liters/min to preclude airflow
interference. A waiver allowing up to 10 meters horizontal distance
and up to 3 meters vertical distance (inlet to inlet) between a
primary and collocated sampler may be approved by the Regional
Administrator for sites at a neighborhood or larger scale of
representation during the annual network plan approval process.
Sampling and analytical methodologies must be consistently
implemented for both primary and collocated quality control samplers
and for all other samplers in the network.
(d) Sample the collocated quality control monitor on a 1-in-12
day schedule. Report the measurements from both primary and
collocated quality control monitors at each collocated sampling site
to AQS. The calculations for evaluating precision between the two
collocated monitors are described in section 4.2.1 of this appendix.
3.2.4 PM2.5 Performance Evaluation Program (PEP) Procedures. The
PEP is an independent assessment used to estimate total measurement
system bias. These evaluations will be performed under the NPEP as
described in section 2.4 of this appendix or a comparable program.
Performance evaluations will be performed annually within each PQAO.
For PQAOs with less than or equal to five monitoring sites, five
valid performance evaluation audits must be collected and reported
each year. For PQAOs with greater than five monitoring sites, eight
valid performance evaluation audits must be collected and reported
each year. A valid performance evaluation audit means that both the
primary monitor and PEP audit concentrations are valid and above 3
µg/m3. Siting of the PEP monitor must be consistent with
section 3.2.3.4(c). However, any horizontal distance greater than 4
meters and any vertical distance greater than one meter must be
reported to the EPA regional PEP coordinator. Additionally, for every
monitor designated as a primary monitor, a primary quality assurance
organization must:
3.2.4.1 Have each method designation evaluated each year; and,
3.2.4.2 Have all FRM, FEM or ARM samplers subject to a PEP audit
at least once every 6 years, which equates to approximately 15
percent of the monitoring sites audited each year.
3.2.4.3 Additional information concerning the PEP is contained
in reference 10 of this appendix. The calculations for evaluating
bias between the primary monitor and the performance evaluation
monitor for PM2.5 are described in section 4.2.5 of this
appendix.
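    The audit-count arithmetic of this section can be summarized as
follows; the sketch restates the rule text with hypothetical names and
is illustrative only.

    # Illustrative restatement of the section 3.2.4 PEP audit counts.
    def pep_audits_required(n_sites_in_pqao):
        # Five valid audits per year for PQAOs with five or fewer
        # monitoring sites, eight for larger PQAOs.
        return 5 if n_sites_in_pqao <= 5 else 8

    def is_valid_pep_audit(primary_ug_m3, pep_ug_m3):
        # Both the primary monitor and PEP audit concentrations must be
        # valid and above 3 micrograms per cubic meter.
        return (primary_ug_m3 is not None and pep_ug_m3 is not None
                and primary_ug_m3 > 3.0 and pep_ug_m3 > 3.0)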
3.3 PM10.
3.3.1 Flow Rate Verification for PM10 Low Volume Samplers (less
than 200 liters/minute). A one-point flow rate verification check
must be performed at least once every month (each verification
minimally separated by 14 days) on each monitor used to measure
PM10. The verification is made by checking the
operational flow rate of the monitor. If the verification is made in
conjunction with a flow rate adjustment, it must be made prior to
such flow rate adjustment. For the standard procedure, use a flow
rate transfer standard certified in accordance with section 2.6 of
this appendix to check the monitor's normal flow rate. Care should
be taken in selecting and using the flow rate measurement device
such that it does not alter the normal operating flow rate of the
monitor. The percent differences between the audit and measured flow
rates are reported to AQS and used to assess the bias of the
monitoring data as described in section 4.2.2 of this appendix
(using flow rates in lieu of concentrations).
3.3.2 Flow Rate Verification for PM10 High Volume Samplers
(greater than 200 liters/minute). For PM10 high volume
samplers, the verification frequency is one verification every 90
days (quarterly), four per year. Other than verification frequency,
follow the same technical procedure as described in section 3.3.1 of
this appendix.
3.3.3 Semi-Annual Flow Rate Audit for PM10. Audit the flow rate
of the particulate monitor twice a year. The two audits should
ideally be spaced between 5 and 7 months apart. The EPA strongly
encourages more frequent auditing. The audit should (preferably) be
conducted by a trained experienced technician other than the routine
site operator. The audit is made by measuring the monitor's normal
operating flow rate using a flow rate transfer standard certified in
accordance with section 2.6 of this appendix. The flow rate standard
used for auditing must not be the same flow rate standard used for
verifications or to calibrate the monitor. However, both the
calibration standard and the audit standard may be referenced to the
same primary flow rate or volume standard. Care must be taken in
auditing the flow rate to be certain that the flow measurement
device does not alter the normal operating flow rate of the monitor.
Report the audit flow rate of the transfer standard and the
corresponding flow rate measured by the monitor to AQS. The percent
differences between these flow rates are used to evaluate monitor
performance.
3.3.4 Collocated Quality Control Sampling Procedures for Manual
PM10. Collocated sampling for PM10 is only required for
manual samplers. For each pair of collocated monitors, designate one
sampler as the primary monitor whose concentrations will be used to
report air quality for the site and designate the other as the
quality control monitor.
3.3.4.1 For manual PM10 samplers, a PQAO must:
(a) Have 15 percent of the primary monitors collocated (values
of 0.5 and greater round up); and
(b) Have at least one collocated quality control monitor (if the
total number of monitors is less than three).
3.3.4.2 The collocated quality control monitors should be
deployed according to the following protocol:
(a) Fifty percent of the collocated quality control monitors
should be deployed at sites with daily concentrations estimated to
be within plus or minus 20 percent of the applicable NAAQS and the
remainder at the PQAO's discretion;
(b) If an organization has no sites with daily concentrations
within plus or minus 20 percent of the NAAQS, 50 percent of the
collocated quality control monitors should be deployed at those
sites with the daily mean concentrations among the highest for all
sites in the network and the remainder at the PQAO's discretion.
(c) The two collocated monitors must be within 4 meters (inlet
to inlet) of each other and at least 2 meters apart for flow rates
greater than 200 liters/min or at least 1 meter apart for samplers
having flow rates less than 200 liters/min to preclude airflow
interference. A waiver allowing up to 10 meters horizontal distance
and up to 3 meters vertical distance (inlet to inlet) between a
primary and collocated sampler may be approved by the Regional
Administrator for sites at a neighborhood or larger scale of
representation. This waiver may be approved during the annual
network plan approval process. Sampling and analytical methodologies
must be consistently implemented for both collocated samplers
and for all other samplers in the network.
(d) Sample the collocated quality control monitor on a 1-in-12
day schedule. Report the measurements from both primary and
collocated quality control monitors at each collocated sampling site
to AQS. The calculations for evaluating precision between the two
collocated monitors are described in section 4.2.1 of this appendix.
(e) In determining the number of collocated quality control
sites required for PM10, monitoring networks for lead
(Pb-PM10) should be treated independently from networks
for particulate matter (PM), even though the separate networks may
share one or more common samplers. However, a single quality control
monitor that meets the collocation requirements for Pb-
PM10 and PM10 may serve as a collocated
quality control monitor for both networks. Extreme care must be
taken when using the filter from a quality control monitor for both
PM10 and Pb analysis. A PM10 filter weighing
should occur prior to any Pb analysis.
3.4 Pb.
3.4.1 Flow Rate Verification for Pb-PM10 Low Volume Samplers
(less than 200 liters/minute). A one-point flow rate verification
check must be performed at least once every month (each verification
minimally separated by 14 days) on each monitor used to measure Pb.
The verification is made by checking the operational flow rate of
the monitor. If the verification is made in conjunction with a flow
rate adjustment, it must be made prior to such flow rate adjustment.
For the standard procedure, use a flow rate transfer standard
certified in accordance with section 2.6 of this appendix to check
the monitor's normal flow rate. Care should be taken in selecting
and using the flow rate measurement device such that it does not
alter the normal operating flow rate of the monitor. The percent
differences between the audit and measured flow rates are reported
to AQS and used to assess the bias of the monitoring data as
described in section 4.2.2 of this appendix (using flow rates in
lieu of concentrations).
3.4.2 Flow Rate Verification for Pb High Volume Samplers
(greater than 200 liters/minute). For high volume samplers, the
verification frequency is one verification every 90 days
(quarterly), four per year. Other than verification frequency, follow the
same technical procedure as described in section 3.4.1 of this
appendix.
3.4.3 Semi-Annual Flow Rate Audit for Pb. Audit the flow rate of
the particulate monitor twice a year. The two audits should ideally
be spaced between 5 and 7 months apart. The EPA strongly encourages
more frequent auditing. The audit should (preferably) be conducted
by a trained experienced technician other than the routine site
operator. The audit is made by measuring the monitor's normal
operating flow rate using a flow rate transfer standard certified in
accordance with section 2.6 of this appendix. The flow rate standard
used for auditing must not be the same flow rate standard used for
verifications or to calibrate the monitor. However, both the
calibration standard and the audit standard may be referenced to the
same primary flow rate or volume standard. Care must be taken in
auditing the flow rate to be certain that the flow measurement
device does not alter the normal operating flow rate of the monitor.
Report the audit flow rate of the transfer standard and the
corresponding flow rate measured by the monitor to AQS. The percent
differences between these flow rates are used to evaluate monitor
performance.
3.4.4 Collocated Quality Control Sampling for TSP Pb for
monitoring sites other than non-source oriented NCore. For each pair
of collocated monitors for manual TSP Pb samplers, designate one
sampler as the primary monitor whose concentrations will be used to
report air quality for the site, and designate the other as the
quality control monitor.
3.4.4.1 A PQAO must:
(a) Have 15 percent of the primary monitors (not counting non-
source oriented NCore sites in the PQAO) collocated. Values of 0.5 and
greater round up; and
(b) Have at least one collocated quality control monitor (if the
total number of monitors is less than three).
3.4.4.2 The collocated quality control monitors should be
deployed according to the following protocol:
(a) The first collocated Pb site selected must be the site
measuring the highest Pb concentrations in the network. If the site
is impractical, alternative sites, approved by the EPA Regional
Administrator, may be selected. If additional collocated sites are
necessary, collocated sites may be chosen that reflect average
ambient air Pb concentrations in the network.
(b) The two collocated monitors must be within 4 meters (inlet
to inlet) of each other and at least 2 meters apart for flow rates
greater than 200 liters/min or at least 1 meter apart for samplers
having flow rates less than 200 liters/min to preclude airflow
interference.
(c) Sample the collocated quality control monitor on a 1-in-12
day schedule. Report the measurements from both primary and
collocated quality control monitors at each collocated sampling site
to AQS. The calculations for evaluating precision between the two
collocated monitors are described in section 4.2.1 of this appendix.
3.4.5 Collocated Quality Control Sampling for Pb-PM10
at monitoring sites other than non-source oriented NCore. If a PQAO
is monitoring for Pb-PM10 at sites other than at a non-
source oriented NCore site then the PQAO must:
3.4.5.1 Have 15 percent of the primary monitors (not counting
non-source oriented NCore sites in PQAO) collocated. Values of 0.5
and greater round up; and
3.4.5.2 Have at least one collocated quality control monitor (if
the total number of monitors is less than three).
3.4.5.3 The collocated monitors should be deployed according to
the following protocol:
(a) Fifty percent of the collocated quality control monitors
should be deployed at sites with the highest 3-month average
concentrations and the remainder at the PQAO's discretion.
(b) The two collocated monitors must be within 4 meters (inlet
to inlet) of each other and at least 2 meters apart for flow rates
greater than 200 liters/min or at least 1 meter apart for samplers
having flow rates less than 200 liters/min to preclude airflow
interference. A waiver allowing up to 10 meters horizontal distance
and up to 3 meters vertical distance (inlet to inlet) between a
primary and collocated sampler may be approved by the Regional
Administrator for sites at a neighborhood or larger scale of
representation. This waiver may be approved during the annual
network plan approval process. Sampling and analytical methodologies
must be consistently implemented for both collocated samplers
and for all other samplers in the network.
(c) Sample the collocated quality control monitor on a 1-in-12
day schedule. Report the measurements from both primary and
collocated quality control monitors at each collocated sampling site
to AQS. The calculations for evaluating precision between the two
collocated monitors are described in section 4.2.1 of this appendix.
(d) In determining the number of collocated quality control
sites required for Pb-PM10, monitoring networks for
PM10 should be treated independently from networks for
Pb-PM10, even though the separate networks may share one
or more common samplers. However, a single quality control monitor
that meets the collocation requirements for Pb-PM10 and
PM10 may serve as a collocated quality control monitor
for both networks. Extreme care must be taken when using the
filter from a quality control monitor for both PM10 and
Pb analysis. PM10 filter weighing should occur prior to
any Pb analysis.
3.4.6 Pb Analysis Audits. Each calendar quarter, audit the Pb
reference or equivalent method analytical procedure using filters
containing a known quantity of Pb. These audit filters are prepared
by depositing a Pb standard on unexposed filters and allowing them
to dry thoroughly. The audit samples must be prepared using batches
of reagents different from those used to calibrate the Pb analytical
equipment being audited. Prepare audit samples in the following
concentration ranges:
------------------------------------------------------------------------
            Range                 Equivalent ambient Pb concentration,
                                               [mu]g/m\3\
------------------------------------------------------------------------
1..............................  30-100% of Pb NAAQS.
2..............................  200-300% of Pb NAAQS.
------------------------------------------------------------------------
(a) Extract the audit samples using the same extraction
procedure used for exposed filters.
(b) Analyze three audit samples in each of the two ranges in each
quarter during which samples are analyzed. The audit sample analyses shall be
distributed as much as possible over the entire calendar quarter.
(c) Report the audit concentrations (in [mu]g Pb/filter or
strip) and the corresponding measured concentrations (in [mu]g Pb/
filter or strip) to AQS using AQS unit code 077. The percent
differences between the concentrations are used to calculate
analytical accuracy as described in section 4.2.6 of this appendix.
3.4.7 Pb PEP Procedures for monitoring sites other than non-
source oriented NCore. The PEP is an independent assessment used to
estimate total measurement system bias. These evaluations will be
performed under the NPEP described in section 2.4 of this appendix
or a comparable program. Each year, one performance evaluation audit
must be performed at one Pb site in each primary quality assurance
organization that has less than or equal to five sites and two
audits at PQAOs with greater than five sites. Non-source oriented
NCore sites are not counted. Siting of the PEP monitor must be
consistent with section 3.4.5.3(b). However, any horizontal distance
greater than 4 meters and any vertical distance greater than 1 meter
must be reported to the EPA regional PEP coordinator. In addition,
each year, four collocated samples from PQAOs with less than or
equal to five sites and six collocated samples at PQAOs with greater
than five sites must be sent to an independent laboratory, the same
laboratory as the performance evaluation audit, for analysis. The
calculations for evaluating bias between the primary monitor and the
performance evaluation monitor for Pb are described in section 4.2.4
of this appendix.
4. Calculations for Data Quality Assessments
(a) Calculations of measurement uncertainty are carried out by
the EPA according to the following procedures. The PQAOs must report
the data to AQS for all measurement quality checks as specified in
this appendix even though they may elect to perform some or all of
the calculations in this section on their own.
(b) The EPA will provide annual assessments of data quality
aggregated by site and PQAO for SO2, NO2,
O3 and CO and by PQAO for PM10,
PM2.5, and Pb.
(c) At low concentrations, agreement between the measurements of
collocated quality control samplers, expressed as
relative percent difference or percent difference, may be relatively
poor. For this reason, collocated measurement pairs are selected for
use in the precision and bias calculations only when both
measurements are equal to or above the following limits:
(1) Pb: 0.002 [mu]g/m\3\ (Methods approved after 3/04/2010,
with the exception of manual equivalent method EQLA-0813-803).
(2) Pb: 0.02 [mu]g/m\3\ (Methods approved before 3/04/2010,
and manual equivalent method EQLA-0813-803).
(3) PM10 (Hi-Vol): 15 [mu]g/m\3\.
(4) PM10 (Lo-Vol): 3 [mu]g/m\3\.
(5) PM2.5: 3 [mu]g/m\3\.
4.1 Statistics for the Assessment of QC Checks for
SO2, NO2, O3 and CO.
4.1.1 Percent Difference. Many of the measurement quality checks
start with a comparison of an audit concentration or value (flow
rate) to the concentration/value measured by the monitor and use
percent difference as the comparison statistic as described in
equation 1 of this section. For each single point check, calculate
the percent difference, d_i, as follows:

d_i = \frac{\mathrm{meas} - \mathrm{audit}}{\mathrm{audit}} \times 100    (Equation 1)
where meas is the concentration indicated by the PQAO's instrument
and audit is the audit concentration of the standard used in the QC
check being measured.
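For illustration only (not part of the rule text; the function name
is an assumption of this sketch), equation 1 translates directly into
a one-line function:

    def percent_difference(meas: float, audit: float) -> float:
        """Equation 1: d_i = (meas - audit) / audit * 100."""
        return (meas - audit) / audit * 100.0

    # A monitor reading 0.081 ppm against a 0.080 ppm audit standard:
    print(round(percent_difference(0.081, 0.080), 2))  # 1.25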
4.1.2 Precision Estimate. The precision estimate is used to
assess the one-point QC checks for SO2, NO2,
O3, or CO described in section 3.1.1 of this appendix.
The precision estimator is the coefficient of variation upper bound
and is calculated using equation 2 of this section:
CV = \sqrt{\frac{n \sum_{i=1}^{n} d_i^2 - \left(\sum_{i=1}^{n} d_i\right)^2}{n(n-1)}} \cdot \sqrt{\frac{n-1}{\chi^2_{0.1,n-1}}}    (Equation 2)

where n is the number of single point checks being aggregated and
\chi^2_{0.1,n-1} is the 10th percentile of a chi-squared
distribution with n-1 degrees of freedom.
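As an illustrative sketch only (not rule text; it assumes the numpy
and scipy packages are available, and the sample data are invented),
equation 2 can be computed as follows:

    import numpy as np
    from scipy.stats import chi2

    def cv_upper_bound(d: np.ndarray) -> float:
        """Equation 2: coefficient-of-variation upper bound for one-point
        QC percent differences d_i, using the 10th percentile of a
        chi-squared distribution with n-1 degrees of freedom."""
        n = len(d)
        s = np.sqrt((n * np.sum(d**2) - np.sum(d)**2) / (n * (n - 1)))
        return s * np.sqrt((n - 1) / chi2.ppf(0.10, n - 1))

    d = np.array([1.2, -0.8, 2.1, 0.5, -1.4, 0.9])  # example percent differences
    print(round(cv_upper_bound(d), 2))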
4.1.3 Bias Estimate. The bias estimate is calculated using the
one-point QC checks for SO2, NO2,
O3, or CO described in section 3.1.1 of this appendix.
The bias estimator is an upper bound on the mean absolute value of
the percent differences as described in equation 3 of this section:
|\mathrm{bias}| = AB + t_{0.95,n-1} \cdot \frac{AS}{\sqrt{n}}    (Equation 3)

where n is the number of single point checks being aggregated;
t_{0.95,n-1} is the 95th quantile of a t-distribution with
n-1 degrees of freedom; the quantity AB is the mean of the absolute
values of the d_i's and is calculated using equation 4
of this section:

AB = \frac{1}{n} \sum_{i=1}^{n} |d_i|    (Equation 4)

and the quantity AS is the standard deviation of the absolute values
of the d_i's and is calculated using equation 5 of this
section:

AS = \sqrt{\frac{n \sum_{i=1}^{n} |d_i|^2 - \left(\sum_{i=1}^{n} |d_i|\right)^2}{n(n-1)}}    (Equation 5)
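For illustration only (not rule text; numpy and scipy are assumed, and
the closed form of equation 5 is replaced by the equivalent sample
standard deviation), equations 3 through 5 combine as follows:

    import numpy as np
    from scipy.stats import t

    def absolute_bias_upper_bound(d: np.ndarray) -> float:
        """Equations 3-5: |bias| = AB + t_{0.95,n-1} * AS / sqrt(n), where
        AB and AS are the mean and sample standard deviation of |d_i|."""
        n = len(d)
        abs_d = np.abs(d)
        ab = abs_d.mean()          # Equation 4
        as_ = abs_d.std(ddof=1)    # Equation 5 (equivalent closed form)
        return ab + t.ppf(0.95, n - 1) * as_ / np.sqrt(n)  # Equation 3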
4.1.3.1 Assigning a sign (positive/negative) to the bias
estimate. Since the bias statistic as calculated in equation 3 of
this appendix uses absolute values, it does not have a tendency
(negative or positive bias) associated with it. A sign will be
designated by rank ordering the percent differences of the QC check
samples from a given site for a particular assessment interval.
4.1.3.2 Calculate the 25th and 75th percentiles of the percent
differences for each site. The absolute bias upper bound should be
flagged as positive if both percentiles are positive and negative if
both percentiles are negative. The absolute bias upper bound would
not be flagged if the 25th and 75th percentiles are of different
signs.
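The sign-assignment rule of sections 4.1.3.1 and 4.1.3.2 can be
sketched as follows (illustrative only, not rule text; numpy is
assumed, and the string format is an invention of this sketch):

    import numpy as np

    def signed_bias(d: np.ndarray, bias_upper_bound: float) -> str:
        """Flag the absolute bias upper bound as positive or negative only
        when the 25th and 75th percentiles of the percent differences
        agree in sign (sections 4.1.3.1-4.1.3.2)."""
        q25, q75 = np.percentile(d, [25, 75])
        if q25 > 0 and q75 > 0:
            return f"+{bias_upper_bound:.1f}"
        if q25 < 0 and q75 < 0:
            return f"-{bias_upper_bound:.1f}"
        return f"+/-{bias_upper_bound:.1f}"  # percentiles straddle zero: unflagged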
4.2 Statistics for the Assessment of PM10,
PM2.5, and Pb.
4.2.1 Collocated Quality Control Sampler Precision Estimate for
PM10, PM2.5 and Pb. Precision is estimated via duplicate
measurements from collocated samplers. It is recommended that the
precision be aggregated at the PQAO level quarterly, annually, and
at the 3-year level. The data pair would only be considered valid if
both concentrations are greater than or equal to the minimum values
specified in section 4(c) of this appendix. For each collocated data
pair, calculate the relative percent difference, d_i,
using equation 6 of this appendix:

d_i = \frac{X_i - Y_i}{(X_i + Y_i)/2} \times 100    (Equation 6)

where X_i is the concentration from the primary sampler
and Y_i is the concentration value from the audit sampler.
The coefficient of variation upper bound is calculated using
equation 7 of this appendix:

CV = \sqrt{\frac{n \sum_{i=1}^{n} d_i^2 - \left(\sum_{i=1}^{n} d_i\right)^2}{2n(n-1)}} \cdot \sqrt{\frac{n-1}{\chi^2_{0.1,n-1}}}    (Equation 7)

where n is the number of valid data pairs being aggregated, and
\chi^2_{0.1,n-1} is the 10th percentile of a chi-squared
distribution with n-1 degrees of freedom. The factor of 2 in the
denominator adjusts for the fact that each d_i is
calculated from two values with error.
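For illustration only (not rule text; numpy and scipy are assumed, and
the parameter name `floor` is an invention of this sketch standing in
for the section 4(c) minimum), equations 6 and 7 can be computed as:

    import numpy as np
    from scipy.stats import chi2

    def collocated_cv_upper_bound(x: np.ndarray, y: np.ndarray, floor: float) -> float:
        """Equations 6-7: precision from collocated pairs. x holds primary
        sampler concentrations, y the QC sampler concentrations; `floor`
        is the pollutant-specific minimum from section 4(c) of this
        appendix (e.g., 3 for PM2.5, in ug/m3)."""
        keep = (x >= floor) & (y >= floor)        # valid pairs only
        x, y = x[keep], y[keep]
        d = (x - y) / ((x + y) / 2.0) * 100.0     # Equation 6
        n = len(d)
        s = np.sqrt((n * np.sum(d**2) - np.sum(d)**2) / (2 * n * (n - 1)))
        return s * np.sqrt((n - 1) / chi2.ppf(0.10, n - 1))  # Equation 7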
4.2.2 One-Point Flow Rate Verification Bias Estimate for PM10,
PM2.5 and Pb. For each one-point flow rate verification, calculate
the percent difference in volume using equation 1 of this appendix,
where meas is the value indicated by the sampler's volume
measurement and audit is the actual volume indicated by the auditing
flow meter. The absolute volume bias upper bound is then calculated
using equation 3, where n is the number of flow rate audits being
aggregated; t_{0.95,n-1} is the 95th quantile of a t-
distribution with n-1 degrees of freedom; the quantity AB is the
mean of the absolute values of the d_i's and is calculated
using equation 4 of this appendix; and the quantity AS in equation 3
of this appendix is the standard deviation of the absolute values of
the d_i's and is calculated using equation 5 of this
appendix.
4.2.3 Semi-Annual Flow Rate Audit Bias Estimate for PM10, PM2.5
and Pb. Use the same procedure described in section 4.2.2 for the
evaluation of flow rate audits.
4.2.4 Performance Evaluation Programs Bias Estimate for Pb. The
Pb bias estimate is calculated using the paired routine and the PEP
monitor as described in section 3.4.7. Use the same procedures as
described in section 4.1.3 of this appendix.
4.2.5 Performance Evaluation Programs Bias Estimate for PM2.5.
The bias estimate is calculated using the PEP audits described in
section 3.2.4 of this appendix. The bias estimator is based on the
mean percent differences (equation 1). The mean percent difference,
D, is calculated by equation 8 below:

D = \frac{1}{n_j} \sum_{i=1}^{n_j} d_i    (Equation 8)

where n_j is the number of pairs and
d_1, d_2, ..., d_{n_j} are the biases for
each pair to be averaged.
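Equation 8 is a simple average; as a minimal sketch (not rule text,
with invented example values):

    def mean_percent_difference(d):
        """Equation 8: D = (1/n_j) * sum of the n_j paired biases d_i."""
        return sum(d) / len(d)

    print(round(mean_percent_difference([2.0, -1.0, 3.0]), 2))  # 1.33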
4.2.6 Pb Analysis Audit Bias Estimate. The bias estimate is
calculated using the analysis audit data described in section 3.4.6.
Use the same bias estimate procedure as described in section 4.1.3
of this appendix.
5. Reporting Requirements
5.1 Reporting Requirements. For each pollutant, prepare a list
of all monitoring sites and their AQS site identification codes in
each PQAO and submit the list to the appropriate EPA Regional
Office, with a copy to AQS. Whenever there is a change in this list
of monitoring sites in a PQAO, report this change to the EPA
Regional Office and to AQS.
5.1.1 Quarterly Reports. For each quarter, each PQAO shall
report to AQS directly (or via the appropriate EPA Regional Office
for organizations not direct users of AQS) the results of all valid
measurement quality checks it has carried out during the quarter.
The quarterly reports must be submitted consistent with the data
reporting requirements specified for air quality data as set forth
in 40 CFR 58.16. The EPA strongly encourages early submission of the
quality assurance data in order to assist the PQAO's ability to
control and evaluate the quality of the ambient air data.
5.1.2 Annual Reports.
5.1.2.1 When the PQAO has certified relevant data for the
calendar year, the EPA will calculate and report the measurement
uncertainty for the entire calendar year.
6. References
(1) American National Standard--Specifications and Guidelines
for Quality Systems for Environmental Data Collection and
Environmental Technology Programs. ANSI/ASQC E4-2014. February 2014.
Available from American Society for Quality Control, 611 East
Wisconsin Avenue, Milwaukee, WI 53202.
(2) EPA Requirements for Quality Management Plans. EPA QA/R-2.
EPA/240/B-01/002. March 2001, Reissue May 2006. Office of
Environmental Information, Washington DC 20460. https://www.epa.gov/quality/agency-wide-quality-system-documents.
(3) EPA Requirements for Quality Assurance Project Plans for
Environmental Data Operations. EPA QA/R-5. EPA/240/B-01/003. March
2001, Reissue May 2006. Office of Environmental Information,
Washington DC 20460. https://www.epa.gov/quality/agency-wide-quality-system-documents.
(4) EPA Traceability Protocol for Assay and Certification of
Gaseous Calibration Standards. EPA-600/R-12/531. May 2012.
Available from U.S. Environmental Protection Agency, National Risk
Management Research Laboratory, Research Triangle Park NC 27711.
https://cfpub.epa.gov/si/si_public_record_report.cfm?dirEntryId=245292.
(5) Guidance for the Data Quality Objectives Process. EPA QA/G-
4. EPA/240/B-06/001. February 2006. Office of Environmental
Information, Washington DC 20460. https://www.epa.gov/quality/agency-wide-quality-system-documents.
(6) List of Designated Reference and Equivalent Methods.
Available from U.S. Environmental Protection Agency, National
Exposure Research Laboratory, Human Exposure and Atmospheric
Sciences Division, MD-D205-03, Research Triangle Park, NC 27711.
https://www3.epa.gov/ttn/amtic/criteria.html.
(7) Transfer Standards for the Calibration of Ambient Air
Monitoring Analyzers for Ozone. EPA-454/B-13-004. U.S. Environmental
Protection Agency, Research Triangle Park, NC 27711, October 2013.
https://www3.epa.gov/ttn/amtic/qapollutant.html.
(8) Paur, R.J. and F.F. McElroy. Technical Assistance Document
for the Calibration of Ambient Ozone Monitors. EPA-600/4-79-057.
U.S. Environmental Protection Agency, Research Triangle Park, NC
27711, September 1979. https://www.epa.gov/ttn/amtic/cpreldoc.html.
(9) Quality Assurance Handbook for Air Pollution Measurement
Systems, Volume 1--A Field Guide to Environmental Quality Assurance.
EPA-600/R-94/038a. April 1994. Available from U.S. Environmental
Protection Agency, ORD Publications Office, Center for Environmental
Research Information (CERI), 26 W. Martin Luther King Drive,
Cincinnati, OH 45268. https://www3.epa.gov/ttn/amtic/qalist.html.
(10) Quality Assurance Handbook for Air Pollution Measurement
Systems, Volume II: Ambient Air Quality Monitoring Program Quality
System Development. EPA-454/B-13-003. https://www3.epa.gov/ttn/amtic/qalist.html.
(11) National Performance Evaluation Program Standard Operating
Procedures. https://www3.epa.gov/ttn/amtic/npapsop.html.
Table A-1 of Appendix A to Part 58--Minimum Data Assessment Requirements for NAAQS Related Criteria Pollutant Monitors
--------------------------------------------------------------------------------------------------------------------------------------------------------
Method Assessment method Coverage Minimum frequency Parameters reported AQS assessment type
--------------------------------------------------------------------------------------------------------------------------------------------------------
Gaseous Methods (CO, NO2, SO2, O3)
--------------------------------------------------------------------------------------------------------------------------------------------------------
One-Point QC for SO2, NO2, O3, CO.. Response check at Each analyzer......... Once per 2 weeks..... Audit concentration One-Point QC.
concentration 0.005- \1\ and measured
0.08 ppm SO2, NO2, concentration. \2\
O3, and 0.5-5 ppm CO.
Annual performance evaluation for See section 3.1.2 of Each analyzer......... Once per year........ Audit concentration Annual PE.
SO2, NO2, O3, CO. this appendix. \1\ and measured
concentration \2\
for each level.
NPAP for SO2, NO2, O3, CO.......... Independent Audit..... 20% of sites each year Once per year........ Audit concentration NPAP.
\1\ and measured
concentration \2\
for each level.
--------------------------------------------------------------------------------------------------------------------------------------------------------
Particulate Methods
--------------------------------------------------------------------------------------------------------------------------------------------------------
Continuous \4\ method--collocated Collocated samplers... 15%................... 1-in-12 days......... Primary sampler No Transaction
quality control sampling PM2.5. concentration and reported as raw
duplicate sampler data.
concentration. \3\
Manual method--collocated quality Collocated samplers... 15%................... 1-in-12 days......... Primary sampler No Transaction
control sampling PM10, PM2.5, Pb- concentration and reported as raw
TSP, Pb-PM10. duplicate sampler data.
concentration. \3\
Flow rate verification PM10 (low Check of sampler flow Each sampler.......... Once every month..... Audit flow rate and Flow Rate
Vol) PM2.5, Pb-PM10. rate. measured flow rate Verification.
indicated by the
sampler.
Flow rate verification PM10 (High- Check of sampler flow Each sampler.......... Once every quarter... Audit flow rate and Flow Rate
Vol), Pb-TSP. rate. measured flow rate Verification.
indicated by the
sampler.
Semi-annual flow rate audit PM10, Check of sampler flow Each sampler.......... Once every 6 months.. Audit flow rate and Semi Annual Flow Rate
TSP, PM10-2.5, PM2.5, Pb-TSP, Pb- rate using measured flow rate Audit.
PM10. independent standard. indicated by the
sampler.
Pb analysis audits Pb-TSP, Pb-PM10. Check of analytical Analytical............ Once each quarter.... Measured value and Pb Analysis Audits.
system with Pb audit audit value (ug Pb/
strips/filters. filter) using AQS
unit code 077.
Performance Evaluation Program Collocated samplers... (1) 5 valid audits for Distributed over all Primary sampler PEP.
PM2.5. primary QA orgs with 4 quarters. concentration and
<=5 sites. performance
(2) 8 valid audits for evaluation sampler
primary QA orgs, with concentration.
>5 sites..
(3) All samplers in 6
years.
Performance Evaluation Program Pb- Collocated samplers... (1) 1 valid audit and Distributed over all Primary sampler PEP.
TSP, Pb-PM10. 4 collocated samples 4 quarters. concentration and
for primary QA orgs performance
with <=5 sites. evaluation sampler
(2) 2 valid audits and concentration.
6 collocated samples Primary sampler
for primary QA orgs concentration and
with >5 sites. duplicate sampler
concentration.
--------------------------------------------------------------------------------------------------------------------------------------------------------
\1\ Effective concentration for open path analyzers.
\2\ Corrected concentration, if applicable for open path analyzers.
\3\ Both primary and collocated sampler values are reported as raw data.
\4\ PM2.5 is the only particulate criteria pollutant requiring collocation of continuous and manual primary monitors.
Table A-2 of Appendix A to Part 58--Summary of PM2.5 Number and Type of Collocation (15% Collocation
Requirement) Required Using an Example of a PQAO That Has 54 Primary Monitors (54 sites) With One Federal
Reference Method Type and Three Types of Approved Federal Equivalent Methods
----------------------------------------------------------------------------------------------------------------
                                                                                                    No. of
                                                    Total No. of   Total No. of      No. of     collocated with
      Primary sampler method designation              monitors      collocated     collocated     same method
                                                                                    with FRM     designation as
                                                                                                     primary
----------------------------------------------------------------------------------------------------------------
FRM............................................. 20 3 3 3
FEM (A)......................................... 20 3 2 1
FEM (B)......................................... 2 1 1 0
FEM (C)......................................... 12 2 1 1
----------------------------------------------------------------------------------------------------------------
10. Add Appendix B to part 58 to read as follows:
Appendix B to Part 58--Quality Assurance Requirements for Prevention of
Significant Deterioration (PSD) Air Monitoring
1. General Information
2. Quality System Requirements
3. Measurement Quality Check Requirements
4. Calculations for Data Quality Assessments
5. Reporting Requirements
6. References
1. General Information
1.1 Applicability.
(a) This appendix specifies the minimum quality assurance
requirements for the control and assessment of the quality of the
ambient air monitoring data submitted to a PSD reviewing authority
or the EPA by an organization operating an air monitoring station,
or network of stations, operated in order to comply with Part 51 New
Source Review--Prevention of Significant Deterioration (PSD). Such
organizations are encouraged to develop and maintain quality
assurance programs more extensive than the required minimum.
Additional guidance for the requirements reflected in this appendix
can be found in the ``Quality Assurance Handbook for Air Pollution
Measurement Systems,'' Volume II (Ambient Air) and ``Quality
Assurance Handbook for Air Pollution Measurement Systems,'' Volume
IV (Meteorological Measurements) and at a national level in
references 1, 2, and 3 of this appendix.
(b) It is not assumed that data generated for PSD under this
appendix will be used in making NAAQS decisions. However, if all the
requirements in this appendix are followed (including the NPEP
programs) and reported to AQS, with review and concurrence from the
EPA region, data may be used for NAAQS decisions. With the exception
of the NPEP programs (NPAP, PM2.5 PEP, Pb-PEP), for which
implementation is at the discretion of the PSD reviewing authority,
all other quality assurance and quality control requirements found
in the appendix must be met.
1.2 PSD Primary Quality Assurance Organization (PQAO). A PSD
PQAO is defined as a monitoring organization or a coordinated
aggregation of such organizations that is responsible for a set of
stations within one PSD reviewing authority that monitors the same
pollutant and for which data quality assessments will be pooled.
Each criteria pollutant sampler/monitor must be associated with only
one PSD PQAO.
1.2.1 Each PSD PQAO shall be defined such that measurement
uncertainty among all stations in the organization can be expected
to be reasonably homogeneous, as a result of common factors. A PSD
PQAO must be associated with only one PSD reviewing authority.
Common factors that should be considered in defining PSD PQAOs
include:
(a) Operation by a common team of field operators according to a
common set of procedures;
(b) Use of a common QAPP and/or standard operating procedures;
(c) Common calibration facilities and standards;
(d) Oversight by a common quality assurance organization; and
(e) Support by a common management organization or laboratory.
1.2.2 A PSD monitoring organization having difficulty describing
its PQAO or assigning specific monitors to a PSD PQAO should
consult with the PSD reviewing authority. Any consolidation of PSD
PQAOs shall be subject to final approval by the PSD reviewing
authority.
1.2.3 Each PSD PQAO is required to implement a quality system
that provides sufficient information to assess the quality of the
monitoring data. The quality system must, at a minimum, include the
specific requirements described in this appendix. Failure to conduct
or pass a required check or procedure, or a series of required
checks or procedures, does not by itself invalidate data for
regulatory decision making. Rather, PSD PQAOs and the PSD reviewing
authority shall use the checks and procedures required in this
appendix in combination with other data quality information,
reports, and similar documentation that demonstrate overall
compliance with parts 51, 52 and 58 of this chapter. Accordingly,
the PSD reviewing authority shall use a ``weight of evidence''
approach when determining the suitability of data for regulatory
decisions. The PSD reviewing authority reserves the authority to use
or not use monitoring data submitted by a PSD monitoring
organization when making regulatory decisions based on the PSD
reviewing authority's assessment of the quality of the data.
Generally, consensus-built validation templates or validation
criteria already approved in quality assurance project plans (QAPPs)
should be used as the basis for the weight of evidence approach.
1.3 Definitions.
(a) Measurement Uncertainty. A term used to describe deviations
from a true concentration or estimate that are related to the
measurement process and not to spatial or temporal population
attributes of the air being measured.
(b) Precision. A measurement of mutual agreement among
individual measurements of the same property usually under
prescribed similar conditions, expressed generally in terms of the
standard deviation.
(c) Bias. The systematic or persistent distortion of a
measurement process which causes errors in one direction.
(d) Accuracy. The degree of agreement between an observed value
and an accepted reference value. Accuracy includes a combination of
random error (imprecision) and systematic error (bias) components
which are due to sampling and analytical operations.
(e) Completeness. A measure of the amount of valid data obtained
from a measurement system compared to the amount that was expected
to be obtained under correct, normal conditions.
(f) Detectability. The low critical range value of a
characteristic that a method specific procedure can reliably
discern.
1.4 Measurement Quality Check Reporting. The measurement quality
checks described in section 3 of this appendix are required to be
submitted to the PSD reviewing authority within the same time frame
as routinely-collected ambient concentration data as described in 40
CFR 58.16. The PSD reviewing authority may also require that the
measurement quality check data be reported to AQS.
1.5 Assessments and Reports. Periodic assessments and
documentation of data quality are required to be reported to the PSD
reviewing authority. To provide national uniformity in this
assessment and reporting of data quality for all networks, specific
assessment and reporting procedures are prescribed in detail in
sections 3, 4, and 5 of this appendix.
2. Quality System Requirements
A quality system (reference 1 of this appendix) is the means by
which an organization manages the quality of the monitoring
information it produces in a
systematic, organized manner. It provides a framework for planning,
implementing, assessing and reporting work performed by an
organization and for carrying out required quality assurance and
quality control activities.
2.1 Quality Assurance Project Plans. All PSD PQAOs must develop
a quality system that is described and approved in quality assurance
project plans (QAPP) to ensure that the monitoring results:
(a) Meet a well-defined need, use, or purpose (reference 5 of
this appendix);
(b) Provide data of adequate quality for the intended monitoring
objectives;
(c) Satisfy stakeholder expectations;
(d) Comply with applicable standards specifications;
(e) Comply with statutory (and other legal) requirements; and
(f) Assure quality assurance and quality control adequacy and
independence.
2.1.1 The QAPP is a formal document that describes these
activities in sufficient detail and is supported by standard
operating procedures. The QAPP must describe how the organization
intends to control measurement uncertainty to an appropriate level
in order to achieve the objectives for which the data are collected.
The QAPP must be documented in accordance with EPA requirements
(reference 3 of this appendix).
2.1.2 The PSD PQAO's quality system must have adequate resources
both in personnel and funding to plan, implement, assess and report
on the achievement of the requirements of this appendix and its
approved QAPP.
2.1.3 Incorporation of quality management plan (QMP) elements
into the QAPP. The QMP describes the quality system in terms of the
organizational structure, functional responsibilities of management
and staff, lines of authority, and required interfaces for those
planning, implementing, assessing and reporting activities involving
environmental data operations (EDO). With prior approval of the PSD
reviewing authority, PSD PQAOs may combine pertinent elements of the
QMP into the QAPP rather than submitting the QMP and QAPP as
separate documents. Additional guidance on QMPs
can be found in reference 2 of this appendix.
2.2 Independence of Quality Assurance Management. The PSD PQAO
must provide for a quality assurance management function for its PSD
data collection operation, that aspect of the overall management
system of the organization that determines and implements the
quality policy defined in a PSD PQAO's QAPP. Quality management
includes strategic planning, allocation of resources and other
systematic planning activities (e.g., planning, implementation,
assessing and reporting) pertaining to the quality system. The
quality assurance management function must have sufficient technical
expertise and management authority to conduct independent oversight
and assure the implementation of the organization's quality system
relative to the ambient air quality monitoring program and should be
organizationally independent of environmental data generation
activities.
2.3 Data Quality Performance Requirements.
2.3.1 Data Quality Objectives (DQOs). The DQOs, or the results
of other systematic planning processes, are statements that define
the appropriate type of data to collect and specify the tolerable
levels of potential decision errors that will be used as a basis for
establishing the quality and quantity of data needed to support air
monitoring objectives (reference 5 of this appendix). The DQOs have
been developed by the EPA to support attainment decisions for
comparison to national ambient air quality standards (NAAQS). The
PSD reviewing authority and the PSD monitoring organization will be
jointly responsible for determining whether adherence to the EPA
developed NAAQS DQOs specified in appendix A of this part are
appropriate or if DQOs from a project-specific systematic planning
process are necessary.
2.3.1.1 Measurement Uncertainty for Automated and Manual PM2.5
Methods. The goal for acceptable measurement uncertainty for
precision is defined as an upper 90 percent confidence limit for the
coefficient of variation (CV) of 10 percent and plus or minus 10
percent for total bias.
2.3.1.2 Measurement Uncertainty for Automated Ozone Methods. The
goal for acceptable measurement uncertainty is defined for precision
as an upper 90 percent confidence limit for the CV of 7 percent and
for bias as an upper 95 percent confidence limit for the absolute
bias of 7 percent.
2.3.1.3 Measurement Uncertainty for Pb Methods. The goal for
acceptable measurement uncertainty is defined for precision as an
upper 90 percent confidence limit for the CV of 20 percent and for
bias as an upper 95 percent confidence limit for the absolute bias
of 15 percent.
2.3.1.4 Measurement Uncertainty for NO2. The goal for acceptable
measurement uncertainty is defined for precision as an upper 90
percent confidence limit for the CV of 15 percent and for bias as an
upper 95 percent confidence limit for the absolute bias of 15
percent.
2.3.1.5 Measurement Uncertainty for SO2. The goal for acceptable
measurement uncertainty for precision is defined as an upper 90
percent confidence limit for the CV of 10 percent and for bias as an
upper 95 percent confidence limit for the absolute bias of 10
percent.
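The goals in sections 2.3.1.1 through 2.3.1.5 reduce to a small
lookup table. The following sketch (not rule text; the dictionary and
function names are inventions of this illustration) collects them:

    # Precision goal: upper 90 percent confidence limit for the CV (percent).
    # Bias goal: upper 95 percent confidence limit for absolute bias (percent),
    # except PM2.5, whose goal is expressed as plus or minus 10 percent total bias.
    DQO_GOALS = {
        "PM2.5": {"cv_limit": 10.0, "bias_limit": 10.0},
        "O3":    {"cv_limit": 7.0,  "bias_limit": 7.0},
        "Pb":    {"cv_limit": 20.0, "bias_limit": 15.0},
        "NO2":   {"cv_limit": 15.0, "bias_limit": 15.0},
        "SO2":   {"cv_limit": 10.0, "bias_limit": 10.0},
    }

    def meets_dqo(pollutant: str, cv: float, abs_bias: float) -> bool:
        goal = DQO_GOALS[pollutant]
        return cv <= goal["cv_limit"] and abs_bias <= goal["bias_limit"]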
2.4 National Performance Evaluation Program. Organizations
operating PSD monitoring networks are required to implement the
EPA's national performance evaluation program (NPEP) if the data
will be used for NAAQS decisions and at the discretion of the PSD
reviewing authority if PSD data are not used for NAAQS decisions.
The NPEP includes the National Performance Audit Program (NPAP), the
PM2.5 Performance Evaluation Program (PM2.5-
PEP) and the Pb Performance Evaluation Program (Pb-PEP). The PSD
QAPP shall provide for the implementation of NPEP including the
provision of adequate resources for such NPEP if the data will be
used for NAAQS decisions or if required by the PSD reviewing
authority. Contact the PSD reviewing authority to determine the best
procedure for implementing the audits, which may include an audit by
the PSD reviewing authority, by a contractor certified for the
activity, or through self-implementation, as described in the
sections below. A determination of which entity will be performing
this audit program should be made as early as possible and during
the QAPP development process. The PSD PQAOs (including contractors
that plan to implement these programs on behalf of PSD PQAOs) that
plan to self-implement these programs rather than use the federal
programs must meet the adequacy requirements found in the
appropriate sections that follow, as well as the definition
of independent assessment that follows.
2.4.1 Independent Assessment. An assessment performed by a
qualified individual, group, or organization that is not part of the
organization directly performing and accountable for the work being
assessed. This auditing organization must not be involved with the
generation of the routinely-collected ambient air monitoring data.
An organization can conduct the performance evaluation (PE) if it
can meet this definition and has a management structure that, at a
minimum, will allow for the separation of its routine sampling
personnel from its auditing personnel by two levels of management.
In addition, the sample analysis of audit filters must be performed
by a laboratory facility and laboratory equipment separate from the
facilities used for routine sample analysis. Field and laboratory
personnel will be required to meet the performance evaluation field
and laboratory training and certification requirements. The PSD PQAO
will be required to participate in the centralized field and
laboratory standards certification and comparison processes to
establish comparability to federally implemented programs.
2.5 Technical Systems Audit Program. The PSD reviewing authority
or the EPA may conduct system audits of the ambient air monitoring
programs or organizations operating PSD networks. The PSD monitoring
organizations shall consult with the PSD reviewing authority to
verify the schedule of any such technical systems audit. Systems
audit programs are described in reference 10 of this appendix.
2.6 Gaseous and Flow Rate Audit Standards.
2.6.1 Gaseous pollutant concentration standards (permeation
devices or cylinders of compressed gas) used to obtain test
concentrations for carbon monoxide (CO), sulfur dioxide
(SO2), nitrogen oxide (NO), and nitrogen dioxide
(NO2) must be traceable to either a National Institute of
Standards and Technology (NIST) Traceable Reference Material (NTRM)
or a NIST-certified Gas Manufacturer's Internal Standard (GMIS),
certified in accordance with one of the procedures given in
reference 4 of this appendix. Vendors advertising certification with
the procedures provided in reference 4 of this appendix and
distributing gases as ``EPA Protocol Gas'' must participate in the
EPA Protocol Gas Verification Program or not use ``EPA'' in any form
of advertising. The PSD PQAOs must provide information to the PSD
reviewing authority on the gas vendors they use (or will use) for
the duration of the PSD monitoring project. This information can
be provided in the QAPP or monitoring plan, but must be updated if
there is a change in the producer used.
2.6.2 Test concentrations for ozone (O3) must be
obtained in accordance with the ultraviolet photometric calibration
procedure specified in appendix D to Part 50, and by means of a
certified NIST-traceable O3 transfer standard. Consult
references 7 and 8 of this appendix for guidance on transfer
standards for O3.
2.6.3 Flow rate measurements must be made by a flow measuring
instrument that is NIST-traceable to an authoritative volume or
other applicable standard. Guidance for certifying some types of
flow-meters is provided in reference 10 of this appendix.
2.7 Primary Requirements and Guidance. Requirements and guidance
documents for developing the quality system are contained in
references 1 through 11 of this appendix, which also contain many
suggested procedures, checks, and control specifications. Reference
10 describes specific guidance for the development of a quality
system for data collected for comparison to the NAAQS. Many specific
quality control checks and specifications for methods are included
in the respective reference methods described in Part 50 or in the
respective equivalent method descriptions available from the EPA
(reference 6 of this appendix). Similarly, quality control
procedures related to specifically designated reference and
equivalent method monitors are contained in the respective operation
or instruction manuals associated with those monitors. For PSD
monitoring, the use of reference and equivalent method monitors is
required.
3. Measurement Quality Check Requirements
This section provides the requirements for PSD PQAOs to perform
the measurement quality checks that can be used to assess data
quality. Data from these checks are required to be submitted to the
PSD reviewing authority within the same time frame as routinely-
collected ambient concentration data as described in 40 CFR 58.16.
Table B-1 of this appendix provides a summary of the types and
frequency of the measurement quality checks that are described in
this section. Reporting these results to AQS may be required by the
PSD reviewing authority.
3.1 Gaseous monitors of SO2, NO2, O3, and CO.
3.1.1 One-Point Quality Control (QC) Check for SO2, NO2, O3, and
CO. (a) A one-point QC check must be performed at least once every 2
weeks on each automated monitor used to measure SO2,
NO2, O3 and CO. With the advent of automated
calibration systems, more frequent checking is strongly encouraged
and may be required by the PSD reviewing authority. See Reference 10
of this appendix for guidance on the review procedure. The QC check
is made by challenging the monitor with a QC check gas of known
concentration (effective concentration for open path monitors)
between the prescribed range of 0.005 and 0.08 parts per million
(ppm) for SO2, NO2, and O3, and
between the prescribed range of 0.5 and 5 ppm for CO monitors. The
QC check gas concentration selected within the prescribed range
should be related to monitoring objectives for the monitor. If
monitoring at trace levels, the QC check concentration
should be selected to represent the mean or median concentrations at
the site. If the mean or median concentrations at trace gas sites
are below the MDL of the instrument, the agency can select the lowest
concentration in the prescribed range that can be practically
achieved. If the mean or median concentrations at trace gas sites
are above the prescribed range, the agency can select the highest
concentration in the prescribed range. The PSD monitoring
organization will consult with the PSD reviewing authority on the
most appropriate one-point QC concentration based on the objectives
of the monitoring activity. An additional QC check point is
encouraged for those organizations that may have occasional high
values or would like to confirm the monitors' linearity at the
higher end of the operational range or around NAAQS concentrations.
If monitoring for NAAQS decisions, the QC concentration can be
selected at a higher concentration within the prescribed range but
should also consider precision points around mean or median
concentrations.
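The concentration-selection logic of paragraph (a) can be sketched as
follows (illustrative only, not rule text; the function name and
example values are assumptions of this sketch):

    def select_qc_concentration(site_typical: float, mdl: float,
                                lo: float, hi: float) -> float:
        """Choose a one-point QC concentration within the prescribed range
        [lo, hi] (0.005-0.08 ppm for SO2/NO2/O3; 0.5-5 ppm for CO), given
        the site's mean or median concentration and the instrument MDL."""
        if site_typical < mdl:
            return lo  # trace site below MDL: lowest practical point in range
        return min(max(site_typical, lo), hi)  # otherwise clamp to the range

    print(select_qc_concentration(0.003, 0.005, 0.005, 0.08))  # 0.005
    print(select_qc_concentration(0.120, 0.001, 0.005, 0.08))  # 0.08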
(b) Point analyzers must operate in their normal sampling mode
during the QC check and the test atmosphere must pass through all
filters, scrubbers, conditioners and other components used during
normal ambient sampling and as much of the ambient air inlet system
as is practicable. The QC check must be conducted before any
calibration or adjustment to the monitor.
(c) Open-path monitors are tested by inserting a test cell
containing a QC check gas concentration into the optical measurement
beam of the instrument. If possible, the normally used transmitter,
receiver, and as appropriate, reflecting devices should be used
during the test and the normal monitoring configuration of the
instrument should be altered as little as possible to accommodate
the test cell for the test. However, if permitted by the associated
operation or instruction manual, an alternate local light source or
an alternate optical path that does not include the normal
atmospheric monitoring path may be used. The actual concentration of
the QC check gas in the test cell must be selected to produce an
effective concentration in the range specified earlier in this
section. Generally, the QC test concentration measurement will be
the sum of the atmospheric pollutant concentration and the QC test
concentration. As such, the result must be corrected to remove the
atmospheric concentration contribution. The corrected concentration
is obtained by subtracting the average of the atmospheric
concentrations measured by the open path instrument under test
immediately before and immediately after the QC test from the QC
check gas concentration measurement. If the difference between these
before and after measurements is greater than 20 percent of the
effective concentration of the test gas, discard the test result and
repeat the test. If possible, open path monitors should be tested
during periods when the atmospheric pollutant concentrations are
relatively low and steady.
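The correction and the 20 percent stability screen in paragraph (c)
can be sketched as follows (illustrative only, not rule text; the
function and parameter names are inventions of this sketch):

    def corrected_open_path_qc(qc_measured: float, ambient_before: float,
                               ambient_after: float, effective_conc: float):
        """Correct an open-path QC result for the ambient contribution per
        section 3.1.1(c). Returns None when the before/after ambient
        difference exceeds 20 percent of the test gas's effective
        concentration, in which case the test must be repeated."""
        if abs(ambient_before - ambient_after) > 0.20 * effective_conc:
            return None  # ambient too unsteady: discard and repeat
        return qc_measured - (ambient_before + ambient_after) / 2.0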
(d) Report the audit concentration of the QC gas and the
corresponding measured concentration indicated by the monitor. The
percent differences between these concentrations are used to assess
the precision and bias of the monitoring data as described in
sections 4.1.2 (precision) and 4.1.3 (bias) of this appendix.
3.1.2 Quarterly performance evaluation for SO2, NO2, O3, or CO.
Evaluate each primary monitor each monitoring quarter (or at a 90-day
frequency) during which monitors are operated, or at least once if
operated for less than one quarter. The quarterly performance
evaluation (quarterly PE) must be performed by a qualified
individual, group, or organization that is not part of the
organization directly performing and accountable for the work being
assessed. The person or entity performing the quarterly PE must not
be involved with the generation of the routinely-collected ambient
air monitoring data. A PSD monitoring organization can conduct the
quarterly PE itself if it can meet this definition and has a
management structure that, at a minimum, will allow for the
separation of its routine sampling personnel from its auditing
personnel by two levels of management. The quarterly PE also
requires a set of equipment and standards independent from those
used for routine calibrations or zero, span or precision checks.
3.1.2.1 The evaluation is made by challenging the monitor with
audit gas standards of known concentration from at least three audit
levels. One point must be within two to three times the method
detection limit of the instruments within the PQAO's network; the
second point will be less than or equal to the 99th percentile of
the data at the site or the network of sites in the PQAO, or the next
highest audit concentration level. The third point can be around the
primary NAAQS or the highest 3-year concentration at the site or the
network of sites in the PQAO. An additional 4th level is encouraged
for those PSD organizations that would like to confirm the monitor's
linearity at the higher end of the operational range. In rare
circumstances, there may be sites measuring concentrations above
audit level 10. These sites should be identified to the PSD
reviewing authority.
----------------------------------------------------------------------------------------------------------------
Concentration range, ppm
Audit level ---------------------------------------------------------------
O3 SO2 NO2 CO
----------------------------------------------------------------------------------------------------------------
1............................................... 0.004-0.0059 0.0003-0.0029 0.0003-0.0029 0.020-0.059
2............................................... 0.006-0.019 0.0030-0.0049 0.0030-0.0049 0.060-0.199
3............................................... 0.020-0.039 0.0050-0.0079 0.0050-0.0079 0.200-0.899
4............................................... 0.040-0.069 0.0080-0.0199 0.0080-0.0199 0.900-2.999
5............................................... 0.070-0.089 0.0200-0.0499 0.0200-0.0499 3.000-7.999
6............................................... 0.090-0.119 0.0500-0.0999 0.0500-0.0999 8.000-15.999
7............................................... 0.120-0.139 0.1000-0.1499 0.1000-0.2999 16.000-30.999
8............................................... 0.140-0.169 0.1500-0.2599 0.3000-0.4999 31.000-39.999
9............................................... 0.170-0.189 0.2600-0.7999 0.5000-0.7999 40.000-49.999
10.............................................. 0.190-0.259 0.8000-1.000 0.8000-1.000 50.000-60.000
----------------------------------------------------------------------------------------------------------------
3.1.2.2 The NO2 audit techniques may vary depending
on the ambient monitoring method. For chemiluminescence-type
NO2 analyzers, gas phase titration (GPT) techniques
should be based on the EPA guidance documents and monitoring agency
experience. The NO2 gas standards may be more appropriate
than GPT for direct NO2 methods that do not employ
converters. Care should be taken to ensure the stability of such gas
standards prior to use.
3.1.2.3 The standards from which audit gas test concentrations
are obtained must meet the specifications of section 2.6.1 of this
appendix.
3.1.2.4 For point analyzers, the evaluation shall be carried out
by allowing the monitor to analyze the audit gas test atmosphere in
its normal sampling mode such that the test atmosphere passes
through all filters, scrubbers, conditioners, and other sample inlet
components used during normal ambient sampling and as much of the
ambient air inlet system as is practicable.
3.1.2.5 Open-path monitors are evaluated by inserting a test
cell containing the various audit gas concentrations into the
optical measurement beam of the instrument. If possible, the
normally used transmitter, receiver, and, as appropriate, reflecting
devices should be used during the evaluation, and the normal
monitoring configuration of the instrument should be modified as
little as possible to accommodate the test cell for the evaluation.
However, if permitted by the associated operation or instruction
manual, an alternate local light source or an alternate optical path
that does not include the normal atmospheric monitoring path may be
used. The actual concentrations of the audit gas in the test cell
must be selected to produce effective concentrations in the
evaluation level ranges specified in this section of this appendix.
Generally, each evaluation concentration measurement result will be
the sum of the atmospheric pollutant concentration and the
evaluation test concentration. As such, the result must be corrected
to remove the atmospheric concentration contribution. The corrected
concentration is obtained by subtracting the average of the
atmospheric concentrations measured by the open-path instrument
under test immediately before and immediately after the evaluation
test (or preferably before and after each evaluation concentration
level) from the evaluation concentration measurement. If the
difference between the before and after measurements is greater than
20 percent of the effective concentration of the test gas standard,
discard the test result for that concentration level and repeat the
test for that level. If possible, open-path monitors should be
evaluated during periods when the atmospheric pollutant
concentrations are relatively low and steady. Also, if the open-path
instrument is not installed in a permanent manner, the monitoring
path length must be reverified to be within ±3 percent to
validate the evaluation, since the monitoring path length is
critical to the determination of the effective concentration.
3.1.2.6 Report both the evaluation concentrations (effective
concentrations for open-path monitors) of the audit gases and the
corresponding measured concentration (corrected concentrations, if
applicable, for open-path monitors) indicated or produced by the
monitor being tested. The percent differences between these
concentrations are used to assess the quality of the monitoring data
as described in section 4.1.1 of this appendix.
3.1.3 National Performance Audit Program (NPAP). As stated in
sections 1.1 and 2.4, PSD monitoring networks may be subject to the
NPEP, which includes the NPAP. The NPAP is a performance evaluation
which is a type of audit where quantitative data are collected
independently in order to evaluate the proficiency of an analyst,
monitoring instrument and laboratory. Due to the implementation
approach used in this program, NPAP provides for a national
independent assessment of performance with a consistent level of
data quality. The NPAP should not be confused with the quarterly PE
program described in section 3.1.2. The PSD organizations shall
consult with the PSD reviewing authority or the EPA regarding
whether the implementation of NPAP is required and the
implementation options available. Details of the EPA NPAP can be
found in reference 11 of this appendix. The program requirements
include:
3.1.3.1 Performing audits on 100 percent of monitors and sites
each year, including monitors and sites that may be operated for less
than 1 year. The PSD reviewing authority has the authority to
require more frequent audits at sites they consider to be high
priority.
3.1.3.2 Developing a delivery system that will allow for the
audit concentration gases to be introduced at the probe inlet where
logistically feasible.
3.1.3.3 Using audit gases that are verified against the National
Institute of Standards and Technology (NIST) standard reference
methods or special review procedures and validated annually for CO,
SO2 and NO2, and at the beginning of each
quarter of audits for O3.
3.1.3.4 The PSD PQAO may elect to self-implement NPAP. In these
cases, the PSD reviewing authority will work with those PSD PQAOs to
establish training and other technical requirements to establish
comparability to federally implemented programs. In addition to
meeting the requirements in sections 3.1.3.1 through 3.1.3.3, the
PSD PQAO must:
(a) Ensure that the PSD audit system is equivalent to the EPA
NPAP audit system and is an entirely separate set of equipment and
standards from the equipment used for quarterly performance
evaluations. If this system does not generate and analyze the audit
concentrations, as the EPA NPAP system does, its equivalence to the
EPA NPAP system must be proven to be as accurate under a full range
of appropriate and varying conditions as described in section
3.1.3.6.
(b) Perform a whole system check by having the PSD audit system
tested at an independent and qualified EPA lab, or equivalent.
(c) Evaluate the system with the EPA NPAP program through
collocated auditing at an acceptable number of sites each year (at
least one for a PSD network of five or fewer sites; at least two for
a network with more than five sites).
(d) Incorporate the NPAP into the PSD PQAO's QAPP.
(e) Be subject to review by independent, EPA-trained personnel.
(f) Participate in initial and update training/certification
sessions.
3.2 PM2.5.
3.2.1 Flow Rate Verification for PM2.5. A one-point flow rate
verification check must be performed at least once every month (each
verification minimally separated by 14 days) on each monitor used to
measure PM2.5. The verification is made by checking the
operational flow rate of the monitor. If the verification is made in
conjunction with a flow rate adjustment, it must be made prior to
such flow rate adjustment. For the standard procedure, use a flow
rate transfer standard certified in accordance with section 2.6 of
this appendix to check the monitor's normal flow rate. Care should
be taken in selecting and using the flow rate measurement device such
that it does not alter the normal operating flow rate of the
monitor. Flow rate verification results are to be reported to the
PSD reviewing authority quarterly as described in section 5.1.
Reporting these results to AQS is encouraged. The percent
differences between the audit
and measured flow rates are used to assess the bias of the
monitoring data as described in section 4.2.2 of this appendix
(using flow rates in lieu of concentrations).
3.2.2 Semi-Annual Flow Rate Audit for PM2.5. Every 6 months,
audit the flow rate of the PM2.5 particulate monitors.
For short-term monitoring operations (those less than 1 year), the
flow rate audits must occur at start up, at the midpoint, and near
the completion of the monitoring project. The audit must be
conducted by a trained technician other than the routine site
operator. The audit is made by measuring the monitor's normal
operating flow rate using a flow rate transfer standard certified in
accordance with section 2.6 of this appendix. The flow rate standard
used for auditing must not be the same flow rate standard used for
verifications or to calibrate the monitor. However, both the
calibration standard and the audit standard may be referenced to the
same primary flow rate or volume standard. Care must be taken in
auditing the flow rate to be certain that the flow measurement
device does not alter the normal operating flow rate of the monitor.
Report the audit flow rate of the transfer standard and the
corresponding flow rate measured by the monitor. The percent
differences between these flow rates are used to evaluate monitor
performance.
3.2.3 Collocated Sampling Procedures for PM2.5. A PSD PQAO must
have at least one collocated monitor for each PSD monitoring
network.
3.2.3.1 For each pair of collocated monitors, designate one
sampler as the primary monitor whose concentrations will be used to
report air quality for the site, and designate the other as the QC
monitor. There can be only one primary monitor at a monitoring site
for a given time period.
(a) If the primary monitor is a FRM, then the quality control
monitor must be a FRM of the same method designation.
(b) If the primary monitor is a FEM, then the quality control
monitor must be a FRM unless the PSD PQAO submits a waiver for this
requirement, provides a specific reason why a FRM cannot be
implemented, and the waiver is approved by the PSD reviewing
authority. If the waiver is approved, then the quality control
monitor must be the same method designation as the primary FEM
monitor.
3.2.3.2 In addition, the collocated monitors should be deployed
according to the following protocol (an illustrative spacing check
follows paragraph (c)):
(a) The collocated quality control monitor(s) should be deployed
at sites with the highest predicted daily PM2.5
concentrations in the network. If the highest PM2.5
concentration site is impractical for collocation purposes,
alternative sites approved by the PSD reviewing authority may be
selected. If additional collocated sites are necessary, the PSD PQAO
and the PSD reviewing authority should determine the appropriate
location(s) based on data needs.
(b) The two collocated monitors must be within 4 meters of each
other and at least 2 meters apart for flow rates greater than 200
liters/min or at least 1 meter apart for samplers having flow rates
less than 200 liters/min to preclude airflow interference. A waiver
allowing up to 10 meters horizontal distance and up to 3 meters
vertical distance (inlet to inlet) between a primary and collocated
quality control monitor may be approved by the PSD reviewing
authority for sites at a neighborhood or larger scale of
representation. This waiver may be approved during the QAPP review
and approval process. Sampling and analytical methodologies must be
consistently implemented for both collocated samplers and for all
other samplers in the network.
(c) Sample the collocated quality control monitor on a 6-day
schedule for sites not requiring daily monitoring and on a 3-day
schedule for any site requiring daily monitoring. Report the
measurements from both primary and collocated quality control
monitors at each collocated sampling site. The calculations for
evaluating precision between the two collocated monitors are
described in section 4.2.1 of this appendix.
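For illustration only, a minimal Python sketch of the separation test
in paragraph (b) above; the parameter names and the behavior at
exactly 200 liters/min are illustrative assumptions.

    def collocation_spacing_ok(separation_m: float, flow_lpm: float,
                               waiver_approved: bool = False) -> bool:
        """Inlets must be within 4 meters of each other (up to 10 meters
        horizontal with an approved waiver) and at least 2 meters apart
        for flow rates greater than 200 L/min, or at least 1 meter apart
        for lower flow rates (illustrative check only)."""
        max_sep = 10.0 if waiver_approved else 4.0
        min_sep = 2.0 if flow_lpm > 200.0 else 1.0
        return min_sep <= separation_m <= max_sep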
3.2.4 PM2.5 Performance Evaluation Program (PEP) Procedures. As
stated in sections 1.1 and 2.4 of this appendix, PSD monitoring
networks may be subject to the NPEP, which includes the
PM2.5 PEP. The PSD monitoring organizations shall consult
with the PSD reviewing authority or the EPA regarding whether the
implementation of PM2.5 PEP is required and the
implementation options available for the PM2.5 PEP. For
PSD PQAOs with less than or equal to five monitoring sites, five
valid performance evaluation audits must be collected and reported
each year. For PSD PQAOs with greater than five monitoring sites,
eight valid performance evaluation audits must be collected and
reported each year. Additionally, within the five or eight required
audits, each type of method designation (FRM/FEM designation) used
as a primary monitor in the PSD network shall be audited. For a PE
to be valid, both the primary monitor and PEP audit measurements
must meet quality control requirements and be above 3 [micro]g/m\3\ or
a predefined lower concentration level determined by a systematic
planning process and approved by the PSD reviewing authority. Due to
the relatively short-term nature of most PSD monitoring, the
likelihood of measuring low concentrations in many areas attaining
the PM2.5 standard, and the time required to weigh filters
collected in PEs, a PSD monitoring organization's QAPP may contain a
provision to waive the 3 [micro]g/m\3\ threshold for validity of PEs
conducted in the last quarter of monitoring, subject to approval by
the PSD reviewing authority.
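For illustration only, a minimal Python sketch of the audit-count and
PE-validity rules in this paragraph; the function names and the
default threshold handling are illustrative assumptions.

    def required_pep_audits(n_sites: int) -> int:
        """Valid PM2.5 PEP audits required per year: five for PSD PQAOs
        with five or fewer sites, eight for PQAOs with more than five."""
        return 5 if n_sites <= 5 else 8

    def pe_pair_is_valid(primary_ugm3: float, pep_ugm3: float,
                         threshold_ugm3: float = 3.0) -> bool:
        """A PE is valid only if both measurements are at or above the
        threshold (3 ug/m3 unless a lower level is approved by the PSD
        reviewing authority); QC acceptance is assumed checked
        separately."""
        return primary_ugm3 >= threshold_ugm3 and pep_ugm3 >= threshold_ugm3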
3.3 PM10.
3.3.1 Flow Rate Verification for PM10. A one-point flow rate
verification check must be performed at least once every month (each
verification minimally separated by 14 days) on each monitor used to
measure PM10. The verification is made by checking the
operational flow rate of the monitor. If the verification is made in
conjunction with a flow rate adjustment, it must be made prior to
such flow rate adjustment. For the standard procedure, use a flow
rate transfer standard certified in accordance with section 2.6 of
this appendix to check the monitor's normal flow rate. Care should
be taken in selecting and using the flow rate measurement device
such that it does not alter the normal operating flow rate of the
monitor. The percent differences between the audit and measured flow
rates are used to assess the bias of the monitoring data as
described in section 4.2.2 of this appendix (using flow rates in
lieu of concentrations).
3.3.2 Semi-Annual Flow Rate Audit for PM10. Every 6 months,
audit the flow rate of the PM10 particulate monitors. For
short-term monitoring operations (those less than 1 year), the flow
rate audits must occur at start up, at the midpoint, and near the
completion of the monitoring project. Where possible, the EPA
strongly encourages more frequent auditing. The audit must be
conducted by a trained technician other than the routine site
operator. The audit is made by measuring the monitor's normal
operating flow rate using a flow rate transfer standard certified in
accordance with section 2.6 of this appendix. The flow rate standard
used for auditing must not be the same flow rate standard used for
verifications or to calibrate the monitor. However, both the
calibration standard and the audit standard may be referenced to the
same primary flow rate or volume standard. Care must be taken in
auditing the flow rate to be certain that the flow measurement
device does not alter the normal operating flow rate of the monitor.
Report the audit flow rate of the transfer standard and the
corresponding flow rate measured by the monitor. The percent
differences between these flow rates are used to evaluate monitor
performance.
3.3.3 Collocated Sampling Procedures for Manual PM10. A PSD PQAO
must have at least one collocated monitor for each PSD monitoring
network.
3.3.3.1 For each pair of collocated monitors, designate one
sampler as the primary monitor whose concentrations will be used to
report air quality for the site, and designate the other as the
quality control monitor.
3.3.3.2 In addition, the collocated monitors should be deployed
according to the following protocol:
(a) The collocated quality control monitor(s) should be deployed
at sites with the highest predicted daily PM10
concentrations in the network. If the highest PM10
concentration site is impractical for collocation purposes,
alternative sites approved by the PSD reviewing authority may be
selected.
(b) The two collocated monitors must be within 4 meters of each
other and at least 2 meters apart for flow rates greater than 200
liters/min or at least 1 meter apart for samplers having flow rates
less than 200 liters/min to preclude airflow interference. A waiver
allowing up to 10 meters horizontal distance and up to 3 meters
vertical distance (inlet to inlet) between a primary and collocated
sampler may be approved by the PSD reviewing authority for sites at
a neighborhood or larger scale of representation. This waiver may be
approved during the QAPP review and approval process. Sampling and
analytical methodologies must be consistently implemented for both
collocated samplers and for all other samplers in the network.
[[Page 17295]]
(c) Sample the collocated quality control monitor on a 6-day
schedule, or on a 3-day schedule for any site requiring daily
monitoring.
Report the measurements from both primary and collocated quality
control monitors at each collocated sampling site. The calculations
for evaluating precision between the two collocated monitors are
described in section 4.2.1 of this appendix.
(d) In determining the number of collocated sites required for
PM10, PSD monitoring networks for Pb-PM10
should be treated independently from networks for particulate matter
(PM), even though the separate networks may share one or more common
samplers. However, a single quality control monitor that meets the
collocation requirements for Pb-PM10 and PM10
may serve as a collocated quality control monitor for both networks.
Extreme care must be taken if using the filter from a quality
control monitor for both PM10 and Pb analysis.
PM10 filter weighing should occur prior to any Pb
analysis.
3.4 Pb.
3.4.1 Flow Rate Verification for Pb. A one-point flow rate
verification check must be performed at least once every month (each
verification minimally separated by 14 days) on each monitor used to
measure Pb. The verification is made by checking the operational
flow rate of the monitor. If the verification is made in conjunction
with a flow rate adjustment, it must be made prior to such flow rate
adjustment. Use a flow rate transfer standard certified in
accordance with section 2.6 of this appendix to check the monitor's
normal flow rate. Care should be taken in selecting and using the
flow rate measurement device such that it does not alter the normal
operating flow rate of the monitor. The percent differences between
the audit and measured flow rates are used to assess the bias of the
monitoring data as described in section 4.2.2 of this appendix
(using flow rates in lieu of concentrations).
3.4.2 Semi-Annual Flow Rate Audit for Pb. Every 6 months, audit
the flow rate of the Pb particulate monitors. For short-term
monitoring operations (those less than 1 year), the flow rate audits
must occur at start up, at the midpoint, and near the completion of
the monitoring project. Where possible, the EPA strongly encourages
more frequent auditing. The audit must be conducted by a trained
technician other than the routine site operator. The audit is made
by measuring the monitor's normal operating flow rate using a flow
rate transfer standard certified in accordance with section 2.6 of
this appendix. The flow rate standard used for auditing must not be
the same flow rate standard used for verifications or to calibrate
the monitor. However, both the calibration standard and the audit
standard may be referenced to the same primary flow rate or volume
standard. Great care must be taken in auditing the flow rate to be
certain that the flow measurement device does not alter the normal
operating flow rate of the monitor. Report the audit flow rate of
the transfer standard and the corresponding flow rate measured by
the monitor. The percent differences between these flow rates are
used to evaluate monitor performance.
3.4.3 Collocated Sampling for Pb. A PSD PQAO must have at least
one collocated monitor for each PSD monitoring network.
3.4.3.1 For each pair of collocated monitors, designate one
sampler as the primary monitor whose concentrations will be used to
report air quality for the site, and designate the other as the
quality control monitor.
3.4.3.2 In addition, the collocated monitors should be deployed
according to the following protocol:
(a) The collocated quality control monitor(s) should be deployed
at sites with the highest predicted daily Pb concentrations in the
network. If the highest Pb concentration site is impractical for
collocation purposes, alternative sites approved by the PSD
reviewing authority may be selected.
(b) The two collocated monitors must be within 4 meters of each
other and at least 2 meters apart for flow rates greater than 200
liters/min or at least 1 meter apart for samplers having flow rates
less than 200 liters/min to preclude airflow interference. A waiver
allowing up to 10 meters horizontal distance and up to 3 meters
vertical distance (inlet to inlet) between a primary and collocated
sampler may be approved by the PSD reviewing authority for sites at
a neighborhood or larger scale of representation. This waiver may be
approved during the QAPP review and approval process. Sampling and
analytical methodologies must be consistently implemented for both
collocated samplers and all other samplers in the network.
(c) Sample the collocated quality control monitor on a 6-day
schedule if daily monitoring is not required, or on a 3-day schedule
for any site requiring daily monitoring. Report the measurements from
both primary and collocated quality control monitors at each
collocated sampling site. The calculations for evaluating precision
between the two collocated monitors are described in section 4.2.1
of this appendix.
(d) In determining the number of collocated sites required for
Pb-PM10, PSD monitoring networks for PM10
should be treated independently from networks for Pb-
PM10, even though the separate networks may share one or
more common samplers. However, a single quality control monitor that
meets the collocation requirements for Pb-PM10 and
PM10 may serve as a collocated quality control monitor
for both networks. Extreme care must be taken if using the
filter from a quality control monitor for both PM10 and
Pb analysis. The PM10 filter weighing should occur prior
to any Pb analysis.
3.4.4 Pb Analysis Audits. Each calendar quarter, audit the Pb
reference or equivalent method analytical procedure using filters
containing a known quantity of Pb. These audit filters are prepared
by depositing a Pb standard on unexposed filters and allowing them
to dry thoroughly. The audit samples must be prepared using batches
of reagents different from those used to calibrate the Pb analytical
equipment being audited. Prepare audit samples in the following
concentration ranges:
------------------------------------------------------------------------
Range     Equivalent ambient Pb concentration, [micro]g/m\3\
------------------------------------------------------------------------
1         30-100% of Pb NAAQS.
2         200-300% of Pb NAAQS.
------------------------------------------------------------------------
(a) Audit samples must be extracted using the same extraction
procedure used for exposed filters.
(b) Analyze three audit samples in each of the two ranges in each
quarter in which samples are analyzed. The audit sample analyses
shall be distributed as evenly as practicable over the entire
calendar quarter.
(c) Report the audit concentrations (in [micro]g Pb/filter or
strip) and the corresponding measured concentrations (in [micro]g
Pb/filter or strip) using AQS unit code 077 (if reporting to AQS).
The percent differences between the concentrations are used to
calculate analytical accuracy as described in section 4.2.6 of this
appendix.
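For illustration only, a minimal Python sketch classifying an audit
filter loading into the two ranges above. The Pb NAAQS level of 0.15
[micro]g/m\3\ is the current standard; the nominal sample volume in the
example is an illustrative assumption that depends on the sampler.

    PB_NAAQS_UGM3 = 0.15  # Pb NAAQS level, ug/m3

    def audit_range(pb_ug_per_filter: float, sample_volume_m3: float):
        """Return 1 or 2 when the equivalent ambient concentration falls
        in Range 1 (30-100% of the Pb NAAQS) or Range 2 (200-300%),
        else None (illustrative only)."""
        equiv_ugm3 = pb_ug_per_filter / sample_volume_m3
        frac = equiv_ugm3 / PB_NAAQS_UGM3
        if 0.30 <= frac <= 1.00:
            return 1
        if 2.00 <= frac <= 3.00:
            return 2
        return None

    # Example: for an assumed nominal 24-hour sample volume of 1,600 m3,
    # Range 1 corresponds to roughly 72-240 ug Pb per filter.
    assert audit_range(100.0, 1600.0) == 1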
3.4.5 Pb Performance Evaluation Program (PEP) Procedures. As
stated in sections 1.1 and 2.4, PSD monitoring networks may be
subject to the NPEP, which includes the Pb PEP. The PSD monitoring
organizations shall consult with the PSD reviewing authority or the
EPA regarding whether the implementation of Pb-PEP is required and
the implementation options available for the Pb-PEP. The PEP is an
independent assessment used to estimate total measurement system
bias. Each year, one PE audit must be performed at one Pb site in
each PSD PQAO network that has less than or equal to five sites and
two audits for PSD PQAO networks with greater than five sites. In
addition, each year, four collocated samples from PSD PQAO networks
with less than or equal to five sites and six collocated samples
from PSD PQAO networks with greater than five sites must be sent to
an independent laboratory for analysis. The calculations for
evaluating bias between the primary monitor and the PE monitor for
Pb are described in section 4.2.4 of this appendix.
4. Calculations for Data Quality Assessments
(a) Calculations of measurement uncertainty are carried out by the
PSD PQAO according to the following procedures. The PSD PQAOs should
report the data for all appropriate measurement quality checks as
specified in this appendix even if they elect to perform some or all
of the calculations in this section on their own.
(b) At low concentrations, agreement between the measurements of
collocated samplers, expressed as relative percent difference or
percent difference, may be relatively poor. For this reason,
collocated measurement pairs will be selected for use in the
precision and bias calculations only when both measurements are
equal to or above the following limits:
(1) Pb: 0.002 [micro]g/m\3\ (Methods approved after 3/04/2010,
with exception of manual equivalent method EQLA-0813-803).
(2) Pb: 0.02 [micro]g/m\3\ (Methods approved before 3/04/2010,
and manual equivalent method EQLA-0813-803).
(3) PM10 (Hi-Vol): 15 [micro]g/m\3\.
(4) PM10 (Lo-Vol): 3 [micro]g/m\3\.
(5) PM2.5: 3 [micro]g/m\3\.
[[Page 17296]]
(c) The PM2.5 3 [micro]g/m\3\ limit for the
PM2.5-PEP may be superseded by mutual agreement between
the PSD PQAO and the PSD reviewing authority as specified in section
3.2.4 of this appendix and detailed in the approved QAPP.
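For illustration only, a minimal Python sketch applying the
minimum-concentration screens in paragraphs (b) and (c) above; the
dictionary keys are illustrative labels, not regulatory designations.

    # Minimum concentrations for valid collocated pairs, section 4(c).
    VALIDITY_LIMITS_UGM3 = {
        "Pb_post_2010": 0.002,  # approved after 3/04/2010, except EQLA-0813-803
        "Pb_pre_2010": 0.02,    # approved before 3/04/2010, and EQLA-0813-803
        "PM10_hivol": 15.0,
        "PM10_lovol": 3.0,
        "PM2.5": 3.0,
    }

    def valid_pairs(pairs, method):
        """Keep only collocated (primary, QC) concentration pairs in
        which both values meet the minimum for the method."""
        limit = VALIDITY_LIMITS_UGM3[method]
        return [(x, y) for x, y in pairs if x >= limit and y >= limit]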
4.1 Statistics for the Assessment of QC Checks for SO2, NO2, O3
and CO.
4.1.1 Percent Difference. Many of the measurement quality checks
start with a comparison of an audit concentration or value (e.g., a
flow rate) to the concentration or value measured by the monitor, and
use percent difference as the comparison statistic, as described in
equation 1 of this section. For each single point check, calculate
the percent difference, $d_i$, as follows:

$$d_i = \frac{\mathrm{meas} - \mathrm{audit}}{\mathrm{audit}} \times 100 \qquad \text{(Equation 1)}$$

where meas is the concentration indicated by the PQAO's instrument
and audit is the audit concentration of the standard used in the QC
check being measured.
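For illustration only, equation 1 as a minimal Python function:

    def percent_difference(meas: float, audit: float) -> float:
        """Equation 1: d_i = (meas - audit) / audit * 100."""
        return (meas - audit) / audit * 100.0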
4.1.2 Precision Estimate. The precision estimate is used to
assess the one-point QC checks for SO2, NO2,
O3, or CO described in section 3.1.1 of this appendix.
The precision estimator is the coefficient of variation upper bound
and is calculated using equation 2 of this section:
$$CV = \sqrt{\frac{n\sum_{i=1}^{n} d_i^2 - \left(\sum_{i=1}^{n} d_i\right)^2}{n(n-1)}} \cdot \sqrt{\frac{n-1}{\chi^2_{0.1,\,n-1}}} \qquad \text{(Equation 2)}$$

where n is the number of single point checks being aggregated and
$\chi^2_{0.1,n-1}$ is the 10th percentile of a chi-squared
distribution with n-1 degrees of freedom.
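For illustration only, a minimal Python sketch of equation 2 using
NumPy and SciPy; the 10th percentile of the chi-squared distribution
is obtained from scipy.stats.chi2.ppf.

    import numpy as np
    from scipy.stats import chi2

    def cv_upper_bound(d: np.ndarray) -> float:
        """Equation 2: coefficient-of-variation upper bound for the
        single-point QC percent differences d_i."""
        n = d.size
        ss = (n * np.sum(d**2) - np.sum(d)**2) / (n * (n - 1))
        return np.sqrt(ss) * np.sqrt((n - 1) / chi2.ppf(0.10, n - 1))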
4.1.3 Bias Estimate. The bias estimate is calculated using the
one-point QC checks for SO2, NO2,
O3, or CO described in section 3.1.1 of this appendix.
The bias estimator is an upper bound on the mean absolute value of
the percent differences as described in equation 3 of this section:
$$|\mathrm{bias}| = AB + t_{0.95,\,n-1}\cdot\frac{AS}{\sqrt{n}} \qquad \text{(Equation 3)}$$

where n is the number of single point checks being aggregated;
$t_{0.95,n-1}$ is the 95th quantile of a t-distribution with n-1
degrees of freedom; the quantity AB is the mean of the absolute
values of the $d_i$'s and is calculated using equation 4 of this
section:

$$AB = \frac{1}{n}\sum_{i=1}^{n}\left|d_i\right| \qquad \text{(Equation 4)}$$

and the quantity AS is the standard deviation of the absolute values
of the $d_i$'s and is calculated using equation 5 of this section:

$$AS = \sqrt{\frac{n\sum_{i=1}^{n}\left|d_i\right|^2 - \left(\sum_{i=1}^{n}\left|d_i\right|\right)^2}{n(n-1)}} \qquad \text{(Equation 5)}$$
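For illustration only, a minimal Python sketch of equations 3 through
5; the 95th t-quantile comes from scipy.stats.t.ppf.

    import numpy as np
    from scipy.stats import t

    def bias_upper_bound(d: np.ndarray) -> float:
        """Equation 3: |bias| = AB + t_{0.95,n-1} * AS / sqrt(n)."""
        n = d.size
        abs_d = np.abs(d)
        ab = abs_d.mean()                               # Equation 4
        as_ = np.sqrt((n * np.sum(abs_d**2) - np.sum(abs_d)**2)
                      / (n * (n - 1)))                  # Equation 5
        return ab + t.ppf(0.95, n - 1) * as_ / np.sqrt(n)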
[[Page 17297]]
4.1.3.1 Assigning a sign (positive/negative) to the bias
estimate. Since the bias statistic as calculated in equation 3 of
this appendix uses absolute values, it does not have a tendency
(negative or positive bias) associated with it. A sign will be
designated by rank ordering the percent differences of the QC check
samples from a given site for a particular assessment interval.
4.1.3.2 Calculate the 25th and 75th percentiles of the percent
differences for each site. The absolute bias upper bound should be
flagged as positive if both percentiles are positive and negative if
both percentiles are negative. The absolute bias upper bound would
not be flagged if the 25th and 75th percentiles are of different
signs.
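For illustration only, a minimal Python sketch of the sign-flagging
rule in sections 4.1.3.1 and 4.1.3.2:

    import numpy as np

    def bias_sign_flag(d: np.ndarray) -> str:
        """Flag positive when both the 25th and 75th percentiles of the
        percent differences are positive, negative when both are
        negative, and leave unflagged otherwise."""
        p25, p75 = np.percentile(d, [25, 75])
        if p25 > 0 and p75 > 0:
            return "+"
        if p25 < 0 and p75 < 0:
            return "-"
        return ""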
4.2 Statistics for the Assessment of PM10, PM2.5, and
Pb.
4.2.1 Collocated Quality Control Sampler Precision Estimate for
PM10, PM2.5 and Pb. Precision is estimated via duplicate
measurements from collocated samplers. It is recommended that the
precision be aggregated at the PQAO level quarterly, annually, and
at the 3-year level. A data pair is considered valid only when both
concentrations are greater than or equal to the minimum values
specified in section 4(c) of this appendix. For each valid collocated
data pair, calculate the relative percent difference, $d_i$, using
equation 6 of this appendix:

$$d_i = \frac{X_i - Y_i}{(X_i + Y_i)/2} \times 100 \qquad \text{(Equation 6)}$$

where $X_i$ is the concentration from the primary sampler and $Y_i$
is the concentration from the audit (quality control) sampler.
The coefficient of variation upper bound is calculated using
equation 7 of this appendix:
$$CV = \sqrt{\frac{n\sum_{i=1}^{n} d_i^2 - \left(\sum_{i=1}^{n} d_i\right)^2}{2n(n-1)}} \cdot \sqrt{\frac{n-1}{\chi^2_{0.1,\,n-1}}} \qquad \text{(Equation 7)}$$

where n is the number of valid data pairs being aggregated, and
$\chi^2_{0.1,n-1}$ is the 10th percentile of a chi-squared
distribution with n-1 degrees of freedom. The factor of 2 in the
denominator adjusts for the fact that each $d_i$ is calculated from
two values with error.
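For illustration only, a minimal Python sketch of equations 6 and 7
for valid collocated pairs:

    import numpy as np
    from scipy.stats import chi2

    def collocated_cv_upper_bound(x: np.ndarray, y: np.ndarray) -> float:
        """x: primary sampler concentrations; y: collocated QC sampler
        concentrations (valid pairs only). The factor of 2 in the
        denominator reflects that each d_i combines two measurements
        with error."""
        d = (x - y) / ((x + y) / 2.0) * 100.0           # Equation 6
        n = d.size
        ss = (n * np.sum(d**2) - np.sum(d)**2) / (2.0 * n * (n - 1))
        return np.sqrt(ss) * np.sqrt((n - 1) / chi2.ppf(0.10, n - 1))  # Equation 7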
4.2.2 One-Point Flow Rate Verification Bias Estimate for PM10,
PM2.5 and Pb. For each one-point flow rate verification, calculate
the percent difference in volume using equation 1 of this appendix
where meas is the value indicated by the sampler's volume
measurement and audit is the actual volume indicated by the auditing
flow meter. The absolute volume bias upper bound is then calculated
using equation 3, where n is the number of flow rate audits being
aggregated; $t_{0.95,n-1}$ is the 95th quantile of a t-distribution
with n-1 degrees of freedom; the quantity AB is the mean of the
absolute values of the $d_i$'s and is calculated using equation 4 of
this appendix; and the quantity AS in equation 3 of this appendix is
the standard deviation of the absolute values of the $d_i$'s and is
calculated using equation 5 of this appendix.
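For illustration only, the same machinery applied to flow rates,
reusing the bias_upper_bound sketch shown after section 4.1.3 above
(the variable names and values are illustrative):

    import numpy as np

    # Sampler-indicated vs. audit flows, L/min (illustrative values).
    sampler = np.array([16.5, 16.8, 16.6, 16.9])
    audit = np.array([16.7, 16.7, 16.7, 16.7])
    d = (sampler - audit) / audit * 100.0  # Equation 1, flows in lieu
    flow_bias = bias_upper_bound(d)        # Equation 3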
4.2.3 Semi-Annual Flow Rate Audit Bias Estimate for PM10, PM2.5
and Pb. Use the same procedure described in section 4.2.2 for the
evaluation of flow rate audits.
4.2.4 Performance Evaluation Programs Bias Estimate for Pb. The
Pb bias estimate is calculated from the paired routine and PEP
monitor measurements described in section 3.4.5. Use the same procedures as
described in section 4.1.3 of this appendix.
4.2.5 Performance Evaluation Programs Bias Estimate for PM2.5.
The bias estimate is calculated using the PEP audits described in
section 3.2.4 of this appendix. The bias estimator is based on the
mean percent differences (Equation 1). The mean percent difference,
D, is calculated by Equation 8 below.
$$D = \frac{1}{n_j}\sum_{i=1}^{n_j} d_i \qquad \text{(Equation 8)}$$

where $n_j$ is the number of pairs and $d_1, d_2, \ldots, d_{n_j}$
are the biases (percent differences, equation 1) for each pair being
averaged.
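For illustration only, equation 8 as a minimal Python function:

    import numpy as np

    def mean_percent_difference(d: np.ndarray) -> float:
        """Equation 8: the mean of the n_j pairwise percent differences
        d_i (equation 1)."""
        return float(np.mean(d))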
4.2.6 Pb Analysis Audit Bias Estimate. The bias estimate is
calculated using the analysis audit data described in section 3.4.4.
Use the same bias estimate procedure as described in section 4.1.3
of this appendix.
5. Reporting Requirements
5.1 Quarterly Reports. For each quarter, each PSD PQAO shall
report to the PSD reviewing authority (and to AQS, if required by the
PSD reviewing authority) the results of all valid measurement
quality checks it has carried out during the quarter. The quarterly
reports must be submitted consistent with the data reporting
requirements specified for air quality data in 40 CFR 58.16 as they
pertain to PSD monitoring.
6. References
(1) American National Standard--Specifications and Guidelines
for Quality Systems for Environmental Data Collection and
Environmental Technology Programs. ANSI/ASQC E4-2014. February 2014.
Available from American Society for Quality Control, 611 East
Wisconsin Avenue, Milwaukee, WI 53202.
(2) EPA Requirements for Quality Management Plans. EPA QA/R-2.
EPA/240/B-01/002. March 2001, Reissue May 2006. Office of
Environmental Information, Washington, DC 20460. https://www.epa.gov/quality/agency-wide-quality-system-documents.
(3) EPA Requirements for Quality Assurance Project Plans for
Environmental Data Operations. EPA QA/R-5. EPA/240/B-01/003. March
2001, Reissue May 2006. Office of Environmental Information,
Washington, DC 20460. https://www.epa.gov/quality/agency-wide-quality-system-documents.
(4) EPA Traceability Protocol for Assay and Certification of
Gaseous Calibration Standards. EPA-600/R-12/531. May 2012.
Available from U.S. Environmental Protection Agency, National Risk
Management Research Laboratory, Research Triangle Park, NC 27711.
https://cfpub.epa.gov/si/si_public_record_report.cfm?dirEntryId=245292.
(5) Guidance for the Data Quality Objectives Process. EPA QA/G-
4. EPA/240/B-06/001. February 2006. Office of Environmental
Information, Washington, DC 20460. https://www.epa.gov/quality/agency-wide-quality-system-documents.
(6) List of Designated Reference and Equivalent Methods.
Available from U.S. Environmental Protection Agency, National
Exposure Research Laboratory, Human Exposure and Atmospheric
Sciences Division, MD-D205-03, Research Triangle
[[Page 17298]]
Park, NC 27711. https://www3.epa.gov/ttn/amtic/criteria.html.
(7) Transfer Standards for the Calibration of Ambient Air
Monitoring Analyzers for Ozone. EPA-454/B-13-004 U.S. Environmental
Protection Agency, Research Triangle Park, NC 27711, October 2013.
https://www3.epa.gov/ttn/amtic/qapollutant.html.
(8) Paur, R.J. and F.F. McElroy. Technical Assistance Document
for the Calibration of Ambient Ozone Monitors. EPA-600/4-79-057.
U.S. Environmental Protection Agency, Research Triangle Park, NC
27711, September 1979. https://www.epa.gov/ttn/amtic/cpreldoc.html.
(9) Quality Assurance Handbook for Air Pollution Measurement
Systems, Volume 1--A Field Guide to Environmental Quality Assurance.
EPA-600/R-94/038a. April 1994. Available from U.S. Environmental
Protection Agency, ORD Publications Office, Center for Environmental
Research Information (CERI), 26 W. Martin Luther King Drive,
Cincinnati, OH 45268. https://www3.epa.gov/ttn/amtic/qalist.html.
(10) Quality Assurance Handbook for Air Pollution Measurement
Systems, Volume II: Ambient Air Quality Monitoring Program Quality
System Development. EPA-454/B-13-003. https://www3.epa.gov/ttn/amtic/qalist.html.
(11) National Performance Evaluation Program Standard Operating
Procedures. https://www3.epa.gov/ttn/amtic/npapsop.html.
Table B-1--Minimum Data Assessment Requirements for NAAQS Related Criteria Pollutant PSD Monitors

Gaseous Methods (CO, NO2, SO2, O3)

One-Point QC for SO2, NO2, O3, CO.
    Assessment method: Response check at a concentration of 0.005-0.08
    ppm for SO2, NO2, and O3, and 0.5-5 ppm for CO.
    Coverage: Each analyzer.
    Minimum frequency: Once per 2 weeks.
    Parameters reported: Audit concentration \1\ and measured
    concentration \2\.
    AQS assessment type: One-Point QC.

Quarterly performance evaluation for SO2, NO2, O3, CO.
    Assessment method: See section 3.1.2 of this appendix.
    Coverage: Each analyzer.
    Minimum frequency: Once per quarter.
    Parameters reported: Audit concentration \1\ and measured
    concentration \2\ for each level.
    AQS assessment type: Annual PE.

NPAP for SO2, NO2, O3, CO.
    Assessment method: Independent audit.
    Coverage: Each primary monitor.
    Minimum frequency: Once per year.
    Parameters reported: Audit concentration \1\ and measured
    concentration \2\ for each level.
    AQS assessment type: NPAP.

Particulate Methods

Collocated sampling PM10, PM2.5, Pb.
    Assessment method: Collocated samplers.
    Coverage: 1 per PSD network per pollutant.
    Minimum frequency: Every 6 days, or every 3 days if daily
    monitoring is required.
    Parameters reported: Primary sampler concentration and duplicate
    sampler concentration \4\.
    AQS assessment type: No transaction; reported as raw data.

Flow rate verification PM10, PM2.5, Pb.
    Assessment method: Check of sampler flow rate.
    Coverage: Each sampler.
    Minimum frequency: Once every month.
    Parameters reported: Audit flow rate and measured flow rate
    indicated by the sampler.
    AQS assessment type: Flow Rate Verification.

Semi-annual flow rate audit PM10, PM2.5, Pb.
    Assessment method: Check of sampler flow rate using independent
    standard.
    Coverage: Each sampler.
    Minimum frequency: Once every 6 months, or at the beginning,
    middle, and end of monitoring.
    Parameters reported: Audit flow rate and measured flow rate
    indicated by the sampler.
    AQS assessment type: Semi Annual Flow Rate Audit.

Pb analysis audits Pb-TSP, Pb-PM10.
    Assessment method: Check of analytical system with Pb audit
    strips/filters.
    Coverage: Analytical.
    Minimum frequency: Each quarter.
    Parameters reported: Measured value and audit value ([micro]g
    Pb/filter) using AQS unit code 077 for parameters 14129--Pb (TSP)
    LC FRM/FEM and 85129--Pb (TSP) LC Non-FRM/FEM.
    AQS assessment type: Pb Analysis Audits.

Performance Evaluation Program PM2.5 \3\.
    Assessment method: Collocated samplers.
    Coverage: (1) 5 valid audits for PQAOs with <=5 sites; (2) 8 valid
    audits for PQAOs with >5 sites; (3) all samplers in 6 years.
    Minimum frequency: Over all 4 quarters.
    Parameters reported: Primary sampler concentration and performance
    evaluation sampler concentration.
    AQS assessment type: PEP.

Performance Evaluation Program Pb \3\.
    Assessment method: Collocated samplers.
    Coverage: (1) 1 valid audit and 4 collocated samples for PQAOs
    with <=5 sites; (2) 2 valid audits and 6 collocated samples for
    PQAOs with >5 sites.
    Minimum frequency: Over all 4 quarters.
    Parameters reported: Primary sampler concentration and performance
    evaluation sampler concentration; primary sampler concentration
    and duplicate sampler concentration.
    AQS assessment type: PEP.

\1\ Effective concentration for open path analyzers.
\2\ Corrected concentration, if applicable, for open path analyzers.
\3\ NPAP, PM2.5 PEP, and Pb-PEP must be implemented if data are used
for NAAQS decisions; otherwise, implementation is at the PSD reviewing
authority's discretion.
\4\ Both primary and collocated sampler values are reported as raw
data.
■ 11. In Appendix D to part 58, revise paragraph 3(b), remove and
reserve paragraph 4.5(b), and revise paragraph 4.5(c) to read as
follows:
Appendix D to Part 58--Network Design Criteria for Ambient Air Quality
Monitoring
* * * * *
3. * * *
(b) The NCore sites must measure, at a minimum, PM2.5
particle mass using continuous and integrated/filter-based samplers,
speciated PM2.5, PM10-2.5 particle mass,
O3, SO2, CO, NO/NOy, wind speed,
wind direction, relative humidity, and ambient temperature.
(1) Although the measurement of NOy is required in
support of a number of monitoring objectives, available commercial
instruments may indicate little difference in their measurement of
NOy compared to the conventional measurement of
NOX, particularly in areas with relatively fresh sources
of nitrogen emissions. Therefore, in areas with negligible expected
difference between NOy and NOX measured
concentrations, the Administrator may allow for waivers that permit
NOX monitoring to be substituted for the required
NOy monitoring at applicable NCore sites.
(2) The EPA recognizes that, in some cases, the physical
location of the NCore site may not be suitable for representative
meteorological measurements due to the site's physical surroundings.
It is also possible that nearby meteorological measurements may be
able to fulfill this data need. In these cases, the requirement for
[[Page 17299]]
meteorological monitoring can be waived by the Administrator.
* * * * *
4.5 * * *
(b) [Reserved]
(c) The EPA Regional Administrator may require additional
monitoring beyond the minimum monitoring requirements contained in
paragraph 4.5(a) of this appendix where the likelihood of Pb air
quality violations is significant or where the emissions density,
topography, or population locations are complex and varied. The EPA
Regional Administrator may require additional monitoring at
locations including, but not limited to, those near existing
industrial sources of Pb, recently closed industrial
sources of Pb, airports where piston-engine aircraft emit Pb, and
other sources of re-entrained Pb dust.
* * * * *
[FR Doc. 2016-06226 Filed 3-25-16; 8:45 am]
BILLING CODE 6560-50-P