Vol. 79, No. 176
Thursday, September 11, 2014
Part II
Environmental Protection Agency
40 CFR Part 58
Revisions to Ambient Monitoring Quality Assurance and Other Requirements; Proposed Rule
ENVIRONMENTAL PROTECTION AGENCY
40 CFR Part 58
[EPA–HQ–OAR–2013–0619; FRL–9915–16–OAR]
RIN 2060–AR59
Revisions to Ambient Monitoring Quality Assurance and Other Requirements
AGENCY: Environmental Protection Agency (EPA).
ACTION: Proposed rule.
SUMMARY: This action proposes
revisions to ambient air monitoring
requirements for criteria pollutants to
provide clarifications to existing
requirements to reduce the compliance
burden of monitoring agencies operating
ambient networks. This proposal
focuses on reorganizing and clarifying
quality assurance requirements,
simplifying and reducing data reporting
and certification requirements,
clarifying the annual monitoring
network plan public notice
requirements, revising certain network
design criteria for nonsource lead
monitoring, and addressing other issues
in part 58 Ambient Air Quality
Surveillance Requirements.
DATES: Comments must be received on
or before November 10, 2014.
ADDRESSES: Submit your comments,
identified by Docket ID No. EPA–HQ–
OAR–2013–0619, by one of the
following methods:
• Federal eRulemaking Portal: https://
www.regulations.gov. Follow the online
instructions for submitting comments.
• Email: A-and-R-Docket@epa.gov.
Include docket ID No. EPA–HQ–OAR–
2013–0619 in the subject line of the
message.
• Fax: (202) 566–9744.
• Mail: Environmental Protection
Agency, Mail code 28221T, Attention
Docket No. EPA–HQ–OAR–2013–0619,
1200 Pennsylvania Ave. NW.,
Washington, DC 20460. Please include a
total of two copies.
• Hand/Courier Delivery: EPA Docket
Center, Room 3334, EPA WJC West
Building, 1301 Constitution Ave. NW.,
Washington, DC. Such deliveries are
only accepted during the Docket’s
normal hours of operation, and special
arrangements should be made for
deliveries of boxed information.
Instructions: Direct your comments to
Docket ID No. EPA–HQ–OAR–2013–
0619. The EPA’s policy is that all
comments received will be included in
the public docket without change and
may be made available online at https://
www.regulations.gov, including any
personal information provided, unless
the comment includes information
claimed to be Confidential Business
Information (CBI) or other information
whose disclosure is restricted by statute.
Do not submit information that you
consider to be CBI or otherwise
protected through https://
www.regulations.gov or email. The
www.regulations.gov Web site is an
‘‘anonymous access’’ system, which
means the EPA will not know your
identity or contact information unless
you provide it in the body of your
comment. If you send an email
comment directly to the EPA without
going through https://
www.regulations.gov, your email
address will be automatically captured
and included as part of the comment
that is placed in the public docket and
made available on the Internet. If you
submit an electronic comment, the EPA
recommends that you include your
name and other contact information in
the body of your comment and with any
disk or CD ROM you submit. If the EPA
cannot read your comment due to
technical difficulties and cannot contact
you for clarification, the EPA may not
be able to consider your comment.
Electronic files should avoid the use of
special characters and any form of
encryption, and should be free of any
defects or viruses. For additional information
about the EPA’s public docket, visit the
EPA Docket Center homepage at https://
www.epa.gov/epahome/dockets.htm.
Docket: All documents in the docket
are listed in the https://
www.regulations.gov index. Although
listed in the index, some information is
not publicly available, e.g., CBI or other
information whose disclosure is
restricted by statute. Certain other
material, such as copyrighted material,
will be publicly available only in hard
copy. Publicly available docket
materials are available either
electronically in www.regulations.gov or
in hard copy at the Air and Radiation
Docket and Information Center, EPA/
DC, Room 3334, WJC West Building,
1301 Constitution Ave. NW.,
Washington, DC. The Public Reading
Room is open from 8:30 a.m. to 4:30
p.m., Monday through Friday, excluding
legal holidays. The telephone number
for the Public Reading Room is (202)
566–1744 and the telephone number for
the Air and Radiation Docket and
Information Center is (202) 566–1742.
FOR FURTHER INFORMATION CONTACT: Mr.
Lewis Weinstock, Air Quality
Assessment Division, Office of Air
Quality Planning and Standards, U.S.
Environmental Protection Agency, Mail
code C304–06, Research Triangle Park,
NC 27711; telephone: (919) 541–3661;
fax: (919) 541–1903; email:
Weinstock.lewis@epa.gov.
SUPPLEMENTARY INFORMATION:
A. Does this action apply to me?
This action applies to state, territorial,
and local air quality management
programs that are responsible for
ambient air monitoring under 40 CFR
part 58. Categories and entities
potentially regulated by this action
include:
Category: State/territorial/local/tribal government.
NAICS code a: 924110.
a North American Industry Classification System.
B. What should I consider as I prepare
my comments for the EPA?
1. Submitting CBI. Do not submit this
information to the EPA through https://
www.regulations.gov or email. Clearly
mark any of the information that you
claim to be CBI. For CBI information in
a disk or CD ROM that you mail to the
EPA, mark the outside of the disk or CD
ROM as CBI and then identify
electronically within the disk or CD
ROM the specific information that is
claimed as CBI. In addition to one
complete version of the comment that
includes information claimed as CBI, a
copy of the comment that does not
contain the information claimed as CBI
must be submitted for inclusion in the
public docket. Information so marked
will not be disclosed except in
accordance with procedures set forth in
40 CFR part 2.
2. Tips for preparing your comments.
When submitting comments, remember
to:
• Follow directions—The agency may
ask you to respond to specific questions
or organize comments by referencing a
Code of Federal Regulations (CFR) part
or section number.
• Explain why you agree or disagree,
suggest alternatives, and substitute
language for your requested changes.
• Describe any assumptions and
provide any technical information and/
or data that you used.
• If you estimate potential costs or
burdens, explain how you arrived at
your estimate in sufficient detail to
allow for it to be reproduced.
• Provide specific examples to
illustrate your concerns and suggest
alternatives.
• Explain your views as clearly as
possible, avoiding the use of profanity
or personal threats.
• Make sure to submit your
comments by the comment period
deadline identified.
C. Where can I get a copy of this
document?
In addition to being available in the
docket, an electronic copy of this
proposed rule will also be available on
the Worldwide Web (WWW) through
the Technology Transfer Network
(TTN). Following signature, a copy of
this proposed rule will be posted on the
TTN’s policy and guidance page for
newly proposed or promulgated rules at
the following address: https://
www.epa.gov/ttn/oarpg/. The TTN
provides information and technology
exchange in various areas of air
pollution control. A redline/strikeout
document comparing the proposed
revisions to the appropriate sections of
the current rules is located in the
docket.
Table of Contents
The following topics are discussed in
this preamble:
I. Background
II. Proposed Changes to the Ambient
Monitoring Requirements
A. General Information
B. Definitions
C. Annual Monitoring Network Plan and
Periodic Network Assessment
D. Network Technical Requirements
E. Operating Schedules
F. System Modification
G. Annual Air Monitoring Data
Certification
H. Data Submittal and Archiving
Requirements
I. Network Design Criteria (Appendix D)
III. Proposed Changes to Quality Assurance
Requirements
A. Quality Assurance Requirements for
Monitors Used in Evaluations for
National Ambient Air Quality
Standards—Appendix A
1. General Information
2. Quality System Requirements
3. Quality Control Checks for Gases
4. Quality Control Checks for Particulate
Monitors
5. Calculations for Data Quality
Assessment
B. Quality Assurance Requirements for
Monitors Used in Evaluations of
Prevention of Significant Deterioration
Projects—Appendix B
1. General Information
2. Quality System Requirements
3. Quality Control Checks for Gases
4. Quality Control Checks for Particulate
Monitors
5. Calculations for Data Quality
Assessment
IV. Statutory and Executive Order Reviews
A. Executive Order 12866: Regulatory
Planning and Review and Executive
Order 13563: Improving Regulations and
Regulatory Review
B. Paperwork Reduction Act
C. Regulatory Flexibility Act
D. Unfunded Mandates Reform Act
E. Executive Order 13132: Federalism
F. Executive Order 13175: Consultation
and Coordination With Indian Tribal
Governments
G. Executive Order 13045: Protection of
Children From Environmental Health
and Safety Risks
H. Executive Order 13211: Actions
Concerning Regulations That
Significantly Affect Energy Supply,
Distribution, or Use
I. National Technology Transfer and
Advancement Act
J. Executive Order 12898: Federal Actions
To Address Environmental Justice in
Minority Populations and Low-Income
Populations
I. Background
The EPA is proposing revisions to
ambient air monitoring requirements for criteria
pollutants to provide clarifications to
existing requirements to reduce the
compliance burden of monitoring
agencies operating ambient networks.
This proposal focuses on ambient
monitoring requirements that are found
in 40 CFR part 58 and the associated
appendices (A, D, and new Appendix
B), including issues such as operating
schedules, the development of annual
monitoring network plans, data
reporting and certification requirements,
and the operation of the required quality
assurance (QA) program.
The EPA last completed a
comprehensive revision of ambient air
monitoring regulations in a final rule
published on October 17, 2006 (see 71
FR 61236). Minor revisions were
completed in a direct final rule
published on June 12, 2007 (see 72 FR
32193). Periodic pollutant-specific
monitoring updates have occurred in
conjunction with revisions to the
National Ambient Air Quality Standards
(NAAQS). In such cases, the monitoring
revisions were typically finalized as part
of the NAAQS final rules.1
II. Proposed Changes to the Ambient
Monitoring Requirements
A. General Information
The following proposed changes to
monitoring requirements impact these
subparts of part 58—Ambient Air
Quality Surveillance: Subpart A—
General Provisions, and Subpart B—
Monitoring Network. Specific proposed
changes to these subparts are described
below.
B. Definitions
The EPA proposes to add and revise
several terms to ensure consistent
interpretation within the monitoring
regulations and to harmonize usage of
terms with the definition of key
metadata fields that are important
components of the Air Quality System
(AQS).2
1 Links to the NAAQS final rules are available at:
https://www.epa.gov/air/criteria.html.
The EPA proposes to add the term
‘‘Certifying Agency’’ to the list of
definitions. The certifying agency field
was added to AQS in 2013 as part of the
development of a revised process for
states and the EPA Regions to meet the
data certification requirements
described in 40 CFR 58.15. The new
term specifically describes any
monitoring agency that is responsible
for meeting data certification
requirements for a set of monitors. In
practice, certifying agencies are
typically a state, local, or tribal agency
depending on the particular data
reporting arrangements that have been
approved by an EPA regional office for
a given state. A list of certifying
agencies by individual monitor is
available on the AQS–TTN Web site.3
The term ‘‘Chemical Speciation
Network’’ or CSN is being proposed for
addition to the definition list. The CSN
network has been functionally defined
as being composed of the Speciation
Trends Network sites and the
supplemental speciation sites that are
collectively operated by monitoring
agencies to obtain PM2.5 chemical
species data.
The term ‘‘Implementation Plan’’ is
being proposed for addition to provide
more specificity to current definitions
that reference the word ‘‘plan’’ in their
description. The EPA wishes to ensure
that references to State Implementation
Plans (SIPs) are not confused with
references to Annual Monitoring
Network Plans that are described in 40
CFR 58.10.
The term ‘‘Local Agency’’ is being
proposed for revision to clarify that
such organizations are responsible for
implementing portions of annual
monitoring network plans. The current
definition refers to the carrying out of a
plan which is not specifically defined,
leading to possible confusion with SIPs.
The term ‘‘meteorological
measurements’’ is being proposed for
clarification that such measurements
refer to required parameters at NCore
and photochemical assessment
monitoring stations (PAMS).
2 The AQS is the EPA’s repository of ambient air
quality data. The AQS stores data from over 10,000
monitors, 5,000 of which are currently active. State,
local and tribal agencies collect the data and submit
it to the AQS on a periodic basis. See https://
www.epa.gov/ttn/airs/airsaqs/ for additional
information.
3 https://www.epa.gov/ttn/airs/airsaqs/memos/criteria_monitor_list_by_certifying_agency_and_PQAO.xls.
The terms ‘‘Monitoring Agency’’ and
‘‘Monitoring Organization’’ are being
proposed for clarification to include
tribal monitoring agencies and to
simplify the monitoring organization
definition to reference the
aforementioned monitoring agency
definition.
The term ‘‘NCore’’ is being proposed
for revision to remove nitrogen dioxide
(NO2) and lead in PM10 (Pb-PM10) as
required measurements and to expand
the definition of basic meteorology to
specifically reference the required
measurements: Wind speed, wind
direction, temperature, and relative
humidity. The EPA clarifies that NO2
was never a required NCore
measurement and that the current
definition was erroneous on this issue.
Additionally, the requirement to
measure Pb-PM10 at NCore sites in areas
over 500,000 population is being
proposed for elimination in the rule.
The term ‘‘Near-road NO2 Monitor’’ is
being proposed for revision to ‘‘Near-road Monitor.’’ This revision is being
made to broaden the definition of near-road monitors to include all such
monitors operating under the specific
requirements described in 40 CFR part
58, appendix D (sections 4.2.1, 4.3.2,
4.7.1(b)(2)) and appendix E (section
6.4(a), Table E–4) for near-road
measurement of PM2.5 and carbon
monoxide (CO) in addition to NO2.
The term ‘‘Network Plan’’ is being
proposed for addition to clarify that any
such references in 40 CFR part 58 refer
to the annual monitoring network plan
required in 40 CFR 58.10.
The term ‘‘Plan’’ is being proposed for
deletion as its usage has been replaced
with more specific references to either
the annual monitoring network plan
required in 40 CFR 58.10 or the SIP
approved or promulgated pursuant to
section 110 of the Clean Air Act.
The term ‘‘Population-oriented
Monitoring (or sites)’’ is being proposed
for deletion. This term along with the
related usage of the concept of
population-oriented monitoring was
deleted from 40 CFR part 58 in the 2013
PM2.5 NAAQS final rule (see 78 FR
3235–3236). As explained in that rule,
the action was taken to ensure
consistency with the longstanding
definition of ambient air applied to the
other NAAQS pollutants.
The term ‘‘Primary Monitor’’ is being
proposed for addition to the definition
list. The usage of this term has become
important in AQS to better define the
processes used to calculate design
values when more than one monitor is
being operated by a monitoring agency
for a given pollutant. This term
identifies the primary monitor used as
the default data source in AQS for
creating a combined site record.
The term ‘‘Primary Quality Assurance
Organization’’ is being proposed for
revision to include the usage of the
acronym, ‘‘PQAO.’’
The terms ‘‘PSD Monitoring
Organization’’ and ‘‘PSD Monitoring
Network’’ are being added to support
the proposed new appendix B that will
pertain specifically to QA requirements
for prevention of significant
deterioration (PSD) networks.
The term ‘‘PSD Reviewing Authority’’
is being added to support the addition
of appendix B to the part 58 appendices
and to clarify the identification of the
lead authority in determining the
applicability of QA requirements for
PSD monitoring projects.
The term ‘‘Reporting Organization’’ is
being proposed for revision to clarify
that the term refers specifically to the
reporting of data as defined in AQS. The
AQS does allow the distinct designation
of agency roles that include analyzing,
certifying, collecting, reporting, and
PQAO.
The term ‘‘SLAMS’’ (state and local
air monitoring stations) is being
proposed for clarification to clearly
indicate that the designation of a
monitor as SLAMS refers to a monitor
required under appendix D of part 58.
The SLAMS monitors make up
networks that include NCore, PAMS,
CSN, and other state or local agency
sites that have been so designated in
annual monitoring network plans.
The terms ‘‘State Agency’’ and ‘‘STN’’
are proposed for minor wording changes
for purposes of clarity only.
The term ‘‘State Speciation Site’’ is
being proposed for deletion in lieu of
the proposed addition of ‘‘Supplemental
Speciation Station’’ to better describe
the distinct elements of the CSN
network, which includes the Speciation
Trends Network stations that are
required under section 4.7.4 of
appendix D of part 58, and
supplemental speciation stations, which
are operated for specific monitoring
agency needs and are not considered to
be required monitors under appendix D.
C. Annual Monitoring Network Plan and
Periodic Network Assessment
The EPA finalized the current Annual
Monitoring Network Plan requirement
as part of the 2006 amendments to the
ambient monitoring requirements (see
71 FR 61247–61249). The revised
requirements were intended to
consolidate separate network plan
requirements that existed for SLAMS
and national air monitoring stations
(NAMS) networks, clarify processes for
providing public input in the network
plans and obtaining formal EPA
Regional Office review, and revise the
required plan elements to address other
changes that had occurred in part 58.
Since 2006, further revisions to the
annual monitoring network plan
requirements have occurred to address
new requirements for monitoring
networks including the NCore multipollutant network, source-oriented lead
(Pb), near-road monitoring for NO2, CO
and PM2.5, other required NAAQS
monitoring, and data quality
requirements for continuous PM2.5
Federal Equivalent Methods (FEMs).
The current Annual Monitoring
Network Plan requirements state that
plans must be made available for public
inspection for at least 30 days prior to
submission to the EPA. Additionally,
any plans that propose SLAMS network
modifications are subject to EPA
Regional Administrator approval, and
either the monitoring agency or the EPA
Regional Office must provide an
opportunity for public comment. This
process to improve transparency
pertaining to the planning of ambient
monitoring networks has been
successful and the EPA believes that
state and local agencies are increasingly
receiving public comments on these
plans.4 To aid in the visibility of these
plans, the EPA hosts an annual
monitoring network plan summary page
on its Ambient Monitoring Technical
Information Center (AMTIC) Web site.5
Since the revision of the annual
monitoring network plan process in
2006, the EPA has received feedback
from its regional offices as well as some
states that the regulatory language
pertaining to public involvement has
been unclear. Areas of confusion
include determining the difference
between the process of obtaining public
inspection versus comment, the
responsibility of monitoring agencies to
respond to public comment in their
submitted plans, and the responsibility
of the EPA regional offices to obtain
public comment depending on a
monitoring agency’s prior action as well
as whether the annual monitoring
network plan was modified based on
discussions with the monitoring agency
following plan submission.
The EPA believes that the intent of
the 2006 revision to these requirements
was to support wider public
involvement in the planning and
implementation of air monitoring
4 The EPA notes that there is no specified process
for obtaining public input into draft annual
monitoring network plans although the typical
process is to post the plans on state or local Web
sites along with an on-line process to obtain public
comments.
5 See https://www.epa.gov/ttn/amtic/plans.html.
networks, and, to that extent, the
solicitation of public comments prior to
the submission of the annual monitoring
network plan to the EPA regional office
is a desirable part of the process.
Indeed, the EPA stated in the preamble
to the 2006 amendments that ‘‘Although
the public inspection requirement does
not specifically require states to obtain
and respond to received comments,
such a process is encouraged with the
subsequent transmission of comments to
the appropriate EPA regional office for
review’’ (see 71 FR 61248).
Given the heightened interest and
visibility of the annual monitoring
network plan process since 2006, the
EPA believes that it is appropriate to
propose that the public inspection
aspect of this requirement contained in
40 CFR 58.10(a)(1) be revised to clearly
indicate that obtaining public comment
is a required part of the process, and
that plans that are submitted to the EPA
regional offices should address such
comments that were received during the
public notice period. The EPA
understands that this proposed change
in process could increase burden for
those monitoring agencies that have not
routinely incorporated public comments
into their annual monitoring network
plan process. However, we believe that
these efforts will increase the
transparency of the current process and
potentially reduce questions and
adverse comment from stakeholders
who have not been included in annual
monitoring network plan discussions
prior to submission to the EPA. For
those monitoring agencies that already
have been posting plans for public
comment, this proposed change should
have no net effect on workload.
A related part of the annual
monitoring network plan process is
described in 40 CFR 58.10(a)(2) with the
distinction that this section pertains
specifically to plans that propose
SLAMS modifications and thereby also
require specific approval from the EPA
Regional Administrator. Similar to the
public comment issue described above,
the process of obtaining such comment
for plans that contain network
modifications was not clearly described,
with the regulatory text initially placing
the responsibility on the EPA regional
offices to obtain public comment, but
then providing monitoring agencies
with the option of obtaining public
comment, which consequently would
relieve the EPA regional office from
having to do so. Consistent with the
proposed change to the comment
process described above, the EPA is
proposing changes to the text in 40 CFR
58.10(a)(2) to reflect the fact that public
comments will have been required to be
obtained by monitoring agencies prior to
submission and that the role of the EPA
regional office will be to review the
submitted plan together with public
comments and any modifications to the
plan based on these comments. On an
overall basis, the EPA believes that this
proposed change to clearly place the
responsibility for obtaining public
comment on monitoring agencies makes
sense since these organizations are, in
effect, closer to their stakeholders and in
a better position to notify the public
about the availability and key issues
contained in annual monitoring network
plans, compared with similar efforts by
the EPA regions that oversee many such
agencies.
On a related note, the EPA
emphasizes the value of the partnership
between monitoring agencies and their
respective EPA regional offices, and
encourages an active dialogue between
these parties during the development
and review of annual monitoring
network plans. Although the monitoring
regulations only require that the EPA
Regional Administrators approve annual
monitoring network plans that propose
changes to SLAMS stations, the EPA
encourages monitoring agencies to seek
formal approval of submitted plans
regardless of whether SLAMS changes
are proposed or not. Such a process
would ensure not only that plans with
proposed modifications are formally
approved, but also that plans where
potential network changes are indeed
appropriate but not proposed would be
subject to discussion.
is not proposing that annual monitoring
network plans that do not propose
changes to SLAMS should also be
subject to the EPA Regional
Administrator’s approval, we support
close working relationships between
monitoring agencies and the EPA
regions and see value in having a formal
review of all such plans, regardless of
whether network modifications are
proposed.
Another aspect of the annual
monitoring network plan requirements
is the listing of required information for
each proposed and existing site as
described in 40 CFR 58.10(b). The EPA
is proposing to add two elements to this
list as described below.
First, the EPA is proposing to require
that a PAMS network description be
specifically included as a part of the
annual monitoring network plan for any
monitoring agencies affected by PAMS
requirements. The requirements for
such a plan are already referenced in
appendix D, sections 5.2 and 5.4 of this
part. In fact, the requirement for an
‘‘approved PAMS network description
provided by the state’’ is already
specified in section 5.4. Accordingly,
the EPA is proposing that a PAMS
network description be a required
element in annual monitoring network
plans for affected monitoring agencies,
and that any such plans already
developed for PAMS networks in
accordance with section 5 of appendix
D could be used to meet this proposed
requirement. The EPA believes that the
burden impact of this proposed change
should be minimal, as a review of
archived 2012 annual monitoring
network plans posted on the EPA’s
AMTIC Web page shows that many such
plans already include references to
PAMS stations. For purposes of
consistency and clarity, however, the
EPA believes there is merit for
proposing this revision to the annual
monitoring network plan requirements
so that stakeholders interested in the
operation of PAMS stations can find the
relevant information in one place.
Second, the EPA is proposing
language that affects ‘‘long-term’’
Special Purpose Monitors (SPMs), i.e.,
those SPMs operating for longer than 24
months whose data could be used to
calculate design values for NAAQS
pollutants in cases where the EPA
approved methods are being employed.
As long as such monitors are classified
as SPMs, their operation can be
discontinued without EPA approval per
40 CFR 58.20(f). While such operational
flexibility is a key component of special
purpose monitoring, the issue can
become more complex when longer-term SPMs measure elevated levels of
criteria pollutants and potentially
become design value monitors for a
region. In such cases, the EPA is faced
with scenarios where key monitors that
can impact the attainment status of a
region can potentially be discontinued
without prior notification or approval.
Given the important regulatory
implications of such monitoring
network decisions, the EPA believes
that it is important that the ongoing
operation and treatment of such SPMs
be specifically called out and discussed
in annual monitoring network plans.
Therefore, the EPA is proposing that a
new required element be added to the
annual monitoring network plan
requirements. Specifically, the EPA is
proposing that such long-term SPMs be
identified in the plans along with a
discussion of the rationale for keeping
the monitor(s) as SPMs or potentially
reclassifying to SLAMS. The EPA is not
proposing that such monitors must
become SLAMS, only that the ongoing
operation of such monitors and the
rationale for retaining them as SPMs be
explicitly discussed to avoid confusion
and the potential for unintended
complexities in the designations process
if any design value SPMs would be
discontinued without adequate
discussion.
The EPA is proposing minor edits to
the annual monitoring network plan
requirements to revise terminology
referring to PM2.5 speciation monitoring,
to note the proposed addition of
appendix B to the QA requirements (see
section III.B of this preamble), and to
clarify that annual monitoring network
plans should include statements
addressing whether the operation of
each monitor meets the requirements of
the associated appendices in part 58.
Finally, the issue has arisen
concerning the flexibility that the EPA
Regional Administrators have with
reference to the approvals that are
required within 120 days of annual
monitoring network plan submission, for
example, in the situation where the
majority of the submitted plan is
acceptable but one or more of the
required elements is problematic. In
these situations, which we believe to be
infrequent, the existing regulatory
language provides sufficient flexibility
for such situations to be handled on a
case-by-case basis, for example, through
the use of a partial approval process
where the Regional Administrator’s
approval decision letter specifies what
elements of the submitted plan are
approved and what elements are not.
Alternatively, if the plan satisfies the
requirements for network adequacy
under appendix D and the monitors are
suitable for regulatory decisions
(consistent with the requirements of
appendix A), the Regional
Administrator has the discretion to
approve the plan, while noting technical
deficiencies to be corrected. We would
expect that the resolution of the specific
items under discussion would be
documented through follow-up
communications with the submitting
monitoring agency to ensure that a
complete record exists for the basis of
the annual monitoring network plan
approval.
The EPA solicits comments on all of
the proposed changes to annual
monitoring network plan requirements
contained in 40 CFR 58.10.
D. Network Technical Requirements
The EPA is proposing to revise the
language in 40 CFR 58.11(a)(3) to note
the proposed revisions to appendix B to
the QA requirements (see section III.B of
this preamble) that would pertain to
PSD monitoring sites.
E. Operating Schedules
The operating schedule requirements
described in 40 CFR 58.12 pertain to the
minimum required frequency of
sampling for continuous analyzers (for
example, hourly averages) and manual
methods for particulate matter (PM) and
Pb sampling (typically 24-hour averages
for manual methods). The EPA is
proposing to revise these requirements
in three ways: By proposing added
flexibility in the minimum required
sampling for PM2.5 mass sampling and
for PM2.5 speciation sampling; by
modifying language pertaining to
continuous mass monitoring to reflect
revisions in regulatory language that
were finalized in the 2013 PM NAAQS
final rule; and by clarifying the
applicability of certain criteria that can
lead to an increase in the required
sampling frequency, for example, to a
daily schedule.
With regard to the minimum required
sampling frequency for manual PM2.5
samplers, current requirements state
that at least a 1-in-3 day frequency is
mandated for required SLAMS monitors
without a collocated continuous
monitor. For the majority of such
manual PM2.5 samplers, the EPA
continues to believe that a 1-in-3 day
sampling frequency is appropriate to
meet the data quality objectives that
support the PM2.5 NAAQS.6 For a subset
of these monitors, however, the EPA
believes that some regulatory flexibility
may be appropriate in situations where
a particular monitor is highly unlikely
to record a violation of the PM2.5
NAAQS. Such situations might occur in
areas with very low PM2.5
concentrations relative to the NAAQS
and/or in urban areas with many more
monitors than are required by appendix
D, where a subset of those monitors are
reading lower than other monitors in the
area. In these situations, the EPA
believes it is appropriate to propose that
the required sampling frequency could
be reduced to 1-in-6 day sampling or
another alternate schedule through a
case-by-case approval by the EPA
Regional Administrator. Such approvals
could be based on factors that are
already described in 40 CFR
58.12(d)(1)(ii) such as historical PM2.5
data assessments, the attainment status
of the area, the location of design value
sites, and the presence of continuous
PM2.5 monitors at nearby locations. The
EPA envisions that the request for such
reductions in sampling frequency would
6 According to a retrieval from AQS dated 12–23–
2013, approximately 65% of primary PM2.5
samplers (those monitors with a parameter
occurrence code of ‘‘1’’) operated on a 1-in-3 day
sampling frequency.
occur during the annual monitoring
network plan process as operating
schedules are a required part of the
plans as stated in 40 CFR 58.10(b)(4).
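To illustrate the arithmetic behind these operating schedules, the following sketch generates the sample days implied by a 1-in-N day schedule. The anchor date and function name are illustrative assumptions only; actual sample days follow the EPA's published national monitoring schedule.

```python
from datetime import date, timedelta

def sampling_dates(year, interval_days, anchor=date(2014, 1, 2)):
    """Yield the sample days in `year` for a 1-in-N day schedule.

    interval_days is 3 for 1-in-3 day sampling or 6 for 1-in-6 day
    sampling; `anchor` is an arbitrary reference date used here only
    for illustration.
    """
    start, end = date(year, 1, 1), date(year, 12, 31)
    # Step forward from January 1 to the first scheduled day of the year.
    offset = (start - anchor).days % interval_days
    day = start + timedelta(days=(interval_days - offset) % interval_days)
    while day <= end:
        yield day
        day += timedelta(days=interval_days)

# A 1-in-3 day schedule yields roughly 121 sample days per year;
# a 1-in-6 day schedule yields roughly 61.
print(len(list(sampling_dates(2015, 3))), len(list(sampling_dates(2015, 6))))
```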
For sites with a collocated continuous
monitor, the EPA also believes that the
current regulatory flexibility to reduce
to 1-in-6 day sampling or a seasonal
sampling schedule is appropriate based
on factors described above, and in
certain cases, may also be applicable to
lower reading SLAMS sites without a
collocated continuous monitor, for
example, to reduce frequency from 1-in-6 day sampling to a seasonal schedule.
Accordingly, we have proposed such
flexibility through changes in the
regulatory language in 40 CFR
58.12(d)(1)(i) and (ii).
The EPA also believes that some
flexibility for sampling frequency is
appropriate to propose for PM2.5
Chemical Speciation Stations,
specifically the Speciation Trends
Network (STN) sites that are at
approximately 53 locations.7 The STN
stations are currently required to sample
on at least a 1-in-3 day frequency with
no opportunity for flexibility. While the
EPA firmly believes in the long-term
importance of the STN stations to
support the development of SIPs,
modeling exercises, health studies, and
the investigation of air pollution
episodes and exceptional events, we do
not believe that the current inflexibility
with regard to sampling frequency is in
the best interests of monitoring
agencies, the EPA, or stakeholders. For
the past several years, the EPA has been
investigating alternative monitoring
technologies such as continuous PM2.5
speciation methods that can supplement
or potentially even replace manual
PM2.5 speciation methods.8 As these
methods become more refined, the EPA
may wish to selectively reduce sampling
frequency at manual samplers for one or
more channels to conserve resources for
reinvestment in other needs within the
CSN network. Additionally, the EPA is
currently conducting an assessment of
the entire CSN network to evaluate the
long-term viability of the program in the
context of changes in air quality, the
recently revised PM NAAQS, rising
analytical costs, and flat or declining
resources. Accordingly, for the reasons
mentioned above, the EPA is proposing
that a reduction in sampling frequency
from 1-in-3 day be permissible for
manual PM2.5 samplers at STN stations.
The approval for such changes at STN
stations, on a case-by-case basis, would
be made by the EPA Administrator as
the authority for changes to STN has
7 https://www.epa.gov/ttn/amtic/specgen.html.
8 https://www.epa.gov/ttnamti1/spesunset.html.
been retained at the Administrator level
per appendix D of this part, section
4.7.4. Factors that would be considered
as part of the decision would include an
area’s design value, the role of the
particular site in national health studies,
the correlation of the site’s species data
with nearby sites, and the presence of other
leveraged measurements. In practice, we
would expect a close working
relationship with the EPA regional
offices and monitoring agencies to
consider such changes to STN,
preferably as part of the annual
monitoring network plan process, taking
into account the findings of the CSN
assessment process that is expected to
be completed later in 2014, as well as
a parallel effort being undertaken for the
Interagency Monitoring of Protected
Visual Environments (IMPROVE)
network.9
The EPA is proposing editorial
revisions to 40 CFR 58.12(d)(1)(ii) to
harmonize the language regarding the
use of continuous FEM or approved
regional methods (ARM) monitors to
support sampling frequency flexibility
for manual PM2.5 samplers with the
current language in 40 CFR
58.12(d)(1)(iii) that was revised as part
of the 2013 PM NAAQS final rule.
Specifically, the phrase ‘‘unless it is
identified in the monitoring agency’s
annual monitoring network plan as not
appropriate for comparison to the
NAAQS and the EPA Regional
Administrator has approved that the
data from that monitor may be excluded
from comparison to the NAAQS’’ is
being proposed for appending to the
current regulatory language. This
change reflects the new process that was
finalized in the 2013 PM NAAQS final
rule that allows monitoring agencies to
request that continuous PM2.5 FEM data
be excluded from NAAQS comparison
based on technical criteria described in
40 CFR 58.11(e) (see 78 FR 3241–3244).
If such requests are made by monitoring
agencies and subsequently approved by
the EPA regional offices as part of the
annual monitoring plan process, it
follows that the data from these
continuous PM2.5 FEMs would also not
be of sufficient quality to support a
request for sampling reduction for a
collocated manual PM2.5 sampler. The
EPA revised the relevant language in
one section of 40 CFR 58.12 during the
2013 PM rulemaking but failed to revise
a similar phrase in another section of 40
CFR 58.12. Accordingly, the EPA is
proposing the change to ensure
consistent regulatory language
throughout 40 CFR 58.12. Within these
9 https://vista.cira.colostate.edu/improve/Default.htm.
editorial changes, we are also proposing
the addition of the phrase ‘‘and the EPA
Regional Administrator has approved
that the data from that monitor may be
excluded from comparison to the
NAAQS’’ to the revisions that were
made with the 2013 PM NAAQS. This
revision is being proposed to clearly
indicate that two distinct actions are
necessary for the data from a continuous
PM2.5 FEM to be considered not
comparable to the NAAQS; first, the
identification of the relevant monitor(s)
in an agency’s annual monitoring
network plan, and, second, the approval
by the EPA Regional Administrator of
that request to exclude data. The
language used by the EPA in the
relevant sections of 40 CFR 58.12
related to the initial request by
monitoring agencies but did not
specifically address the needed
approval by the EPA.
Finally, the EPA is clarifying the
applicability of statements in 40 CFR
58.12(d)(1)(ii) and (iii) that reference the
relationship of sampling frequency to
site design values. Specifically, we are
proposing clarifications and revisions
affecting the following statements: (1)
‘‘Required SLAMS stations whose
measurements determine the design
value for their area and that are within
±10 percent of the NAAQS; and all
required sites where one or more 24-hour values have exceeded the NAAQS
each year for a consecutive period of at
least 3 years are required to maintain at
least a 1-in-3 day sampling frequency,’’
and (2) ‘‘Required SLAMS stations
whose measurements determine the 24-hour design value for their area and
whose data are within ±5 percent of the
level of the 24-hour PM2.5 NAAQS must
have a Federal Reference Method (FRM)
or FEM operate on a daily schedule if
that area’s design value for the annual
NAAQS is less than the level of the
annual PM2.5 standard.’’ Since these
provisions were finalized in 2006, there
has been some confusion among
monitoring agencies and regional offices
concerning the applicability of the
sampling frequency adjustments since
design values are recalculated annually
and, in some situations, such revised
design values can either fall below the
comparative criteria or rise above the
criteria. For example, if according to 40
CFR 58.12(d)(1)(iii) a sampler must be
on a daily sampling schedule because
its design value is within ±5 percent of
the 24-hour NAAQS and it meets the
other listed criteria, how and when
should the sampling frequency be
revised if the referenced 24-hour design
value falls out of the ±5 percent criteria
the following year? In an extreme
example, what would happen if the 24-hour design value changed each year to
be alternately within the 5 percent
criteria and then not within the criteria?
It was not the EPA’s intention in the
2006 monitoring revisions to create
scenarios in which the required
sampling frequencies for individual
samplers would be ‘‘chasing’’ annual
changes in design values. Such a
framework would be difficult to
implement for both monitoring agencies
and regional offices for logistical
reasons including the scheduling of
operators and the availability of PM2.5
filters, and also because of the time lag
involved with reporting and certifying
data and the validation of revised design
values, which typically does not occur
until the summer following the
completion of each calendar year’s
sampling. To provide some clarity to
this situation as well as to provide a
framework where changes in sampling
frequency occur on a more consistent
and predictable basis, the EPA is
proposing that design value-driven
sampling frequency changes be
maintained for a minimum 3-year
period once such a change is triggered.
Additionally, such changes in sampling
frequency would be required to be
implemented no later than January 1 of
the year which followed the
recalculation and certification of a
triggering design value. For example, if
a triggering design value that required a
change to daily sampling frequency was
calculated in the summer of 2014 based
on 2011–2013 certified data, then the
affected sampler would be required to
have an increased sampling frequency
no later than January 1, 2015, and
would maintain that daily frequency
through at least 2017, regardless of
changes to the triggering design value in
the intervening years.
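The timing in this example can be expressed compactly. The sketch below is illustrative only; the function name is ours, not the rule's. It computes the first year of required daily sampling and the minimum final year of the maintenance period from the year in which the triggering design value is certified.

```python
def daily_sampling_window(certification_year):
    """Timing of a design-value-triggered sampling frequency change, as
    proposed: the increased frequency starts no later than January 1 of
    the year after the triggering design value is certified, and is
    maintained for a minimum of 3 calendar years regardless of
    intervening design values."""
    start_year = certification_year + 1   # e.g., certified summer 2014 -> 2015
    min_end_year = start_year + 2         # maintained through at least 2017
    return start_year, min_end_year

print(daily_sampling_window(2014))  # (2015, 2017)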
To accomplish these proposed
changes, the EPA is proposing changes
in the 40 CFR 58.12 regulatory text to
clarify that sampling frequency changes
that are triggered by design values must
be maintained until the triggering
design value site no longer meets the
criteria for at least 3 consecutive years.
Specifically, these changes include the
insertion of the phrase ‘‘until the design
value no longer meets these criteria for
3 consecutive years’’ into 40 CFR
58.12(d)(1)(ii) and the sentence ‘‘The
daily schedule must be maintained until
the referenced design values no longer
meet these criteria for 3 consecutive
years’’ into 40 CFR 58.12(d)(1)(iii). The
EPA notes that where a design value is
based on 3 years of data, 3 consecutive
years of design values would require 5
years of data (e.g., 2010–2012, 2011–
2013, 2012–2014). New regulatory
language has been proposed in 40 CFR
58.12(d)(1)(iv) to document the timing
of when design value-driven changes in
sampling frequency must be
implemented.
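A minimal sketch of the proposed release test follows; the NAAQS level, data values, and function names are illustrative assumptions. It shows how the ±5 percent criterion might be checked and why 3 consecutive annual design values span 5 years of data.

```python
def within_percent(design_value, naaqs_level, pct=5):
    """True if the design value is within +/- pct percent of the NAAQS level."""
    return abs(design_value - naaqs_level) <= naaqs_level * pct / 100.0

def may_relax_daily_schedule(dv_by_end_year, naaqs_level):
    """The daily schedule may be relaxed only when none of the last 3
    consecutive annual design values meets the criterion. Each design
    value covers 3 years of data, so 3 consecutive design values span
    5 years (e.g., 2010-2012, 2011-2013, 2012-2014)."""
    last_three = sorted(dv_by_end_year)[-3:]
    return all(not within_percent(dv_by_end_year[y], naaqs_level)
               for y in last_three)

# Illustrative 24-hour PM2.5 design values (ug/m3) against a 35 ug/m3 level:
dvs = {2012: 36.0, 2013: 31.0, 2014: 30.5}
print(may_relax_daily_schedule(dvs, 35.0))  # False: the 2012 DV is within 5%
```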
On balance, the EPA believes that the
overall impact of proposed changes to
the operating schedule requirements
will be a modest reduction in the
burden for monitoring agencies. We
believe that the number of PM2.5 FRM
and CSN samplers impacted by these
proposed changes will be relatively
small, but where they occur will
provide some logistical relief for sites
that are less critical in terms of NAAQS
implementation and other key
objectives. The EPA solicits comment
on all of these proposed changes to 40
CFR 58.12 requirements.
F. System Modification
In the 2006 monitoring amendments,
the EPA finalized a requirement in 40
CFR 58.14(a) for monitoring agencies to
‘‘develop and implement a plan and
schedule to modify the ambient air
quality network that complies with the
finding of the network assessments
required every 5 years by 58.10(e).’’ The
remainder of the associated regulatory
language reads very much like the
required procedure for making annual
monitoring network plans available for
public inspection, comment, and the
EPA Regional Administrator’s approval
as described in 40 CFR 58.10(a)(1) and
(2). Since 2006, there has been
confusion between the EPA and the
monitoring agencies as to whether a
separate plan was required to be
submitted by 40 CFR 58.14(a) relative to
the annual monitoring network plan,
with that separate plan devoted
specifically to discussing the results of
the 5-year network assessment.
A review of the 2006 monitoring
proposal and final rule reveals no
specific discussion concerning the
submission of a distinct plan devoted
specifically to the implementation of the
5-year network assessment. While the
EPA continues to support the
importance of the network assessment
requirement,10 there appears to be no
specific benefit to the requirement for a
distinct plan to discuss the 5-year
network assessments, and the inference
of the need for such a plan may be
attributable to some redundancy in the
aforementioned requirements when the
regulatory language was revised in 2006.
Monitoring agencies, for example, could
include a specific section or attachment
to the annual monitoring network plan
10 The next 5-year network assessment will be
due no later than July 1, 2015, according to the
schedule established by 40 CFR 58.10(d).
that fulfilled all the requirements
described in 40 CFR 58.14(a) including
how each agency would implement the
findings of the assessment and the
schedule for doing so. By including
such information in the annual
monitoring network plans, the implied
need to develop a separate plan with the
attendant burden of public posting,
obtaining public comment, and the EPA
Regional Administrator’s review and
approval can be avoided, reducing the
burden on all parties.
In terms of timing, these specific
sections or attachments referring to the
5-year network assessments could be
required either in the year when the
assessment is due (e.g., 2015) or in the
year following when the assessment is
due (e.g., 2016). The submission in the
year following the network assessment
would allow more time for monitoring
agencies to fully consider the results of
the 5-year assessment and would also
allow the public more time to review
and comment on the recommendations.
Accordingly, the EPA is proposing to
revise the regulatory language in 40 CFR
58.14(a) to clearly indicate that a
separate plan is not needed to account
for the findings of the 5-year network
assessment, and that the information
concerning the implementation of the 5-year assessment, referred to in the
proposed regulatory language as a
‘‘network modification plan,’’ shall be
submitted as part of the annual
monitoring network plan that is no later
than the year after the network
assessment is due.11 According to the
proposed schedule, the annual
monitoring network plans that are due
in 2016, 2021, etc., would contain the
information referencing the network
assessments.
The EPA is also proposing to revise an
incorrect cross-reference in the current
text of 40 CFR 58.14(a) in which the
network assessment requirement is
noted as being contained in 58.10(e)
when the correct cross-reference is
58.10(d).
G. Annual Air Monitoring Data
Certification
The data certification requirement is
intended to provide ambient air quality
data users with an indication that all
required validation and reporting steps
have been completed, and that the
certified data sets are now considered
final and appropriate for all uses
11 Monitoring agencies, at their discretion, could
submit the network modification plan in the year
that the assessment is due if sufficient feedback had
been received. On balance, EPA believes that the
extra year following the completion of the network
assessment would be valuable to assure a
productive outcome from the assessment process.
including the calculation of design
values and the determination of NAAQS
attainment status. The formal
certification process currently involves
the transmission of a data certification
letter to the EPA signed by a senior
monitoring agency official that
references the list of monitors being
certified. The letter is accompanied by
required AQS reports that summarize
the data being certified and the
accompanying QA data that support the
validation of the referenced list of
monitors. Once the letter and required
reports are submitted to the EPA, the
data certification requirement has been
fulfilled. In practice, the EPA has
provided an additional discretionary
review of the data certification
submissions by monitoring agencies to
make sure the submissions are complete
and that the EPA agrees that the
referenced data are of appropriate
quality. When these reviews have been
completed, the EPA’s review has been
documented by the presence of a
specific AQS flag for each monitor-year
of data that has been certified and
reviewed.
The actual breadth of data
certification requirements has not
materially changed since the original
requirements were finalized in 1979 as
part of the requirement for monitoring
agencies to submit an annual SLAMS
summary report (see 44 FR 27573). Data
certification requirements were last
revised in 2006 when the deadline for
certification was changed to May 1 from
July 1 for most measurements.
Current requirements include the
certification of data collected at all
SLAMS and SPMs using FRM, FEM, or
ARM methods. In practice, this
requirement includes a very wide range
of measurements that are not limited to
criteria pollutants but also extend to
non-criteria pollutant measurements at
PAMS stations, meteorological
measurements at PAMS and NCore
stations, and PM2.5 chemical speciation
parameters. For monitoring agencies
operating these complex stations, this
places an additional burden on the data
review and validation process in
addition to the routine procedures
already in place to validate and report
data as required by 40 CFR 58.16. For
example, current PAMS requirements
include the reporting of approximately
54 individual ‘‘target list’’ volatile
organic compounds per station while
many dozens of PM2.5 species are
reported at CSN stations.
None of these specialized monitoring
programs were in place when the data
certification requirements were
originally promulgated and the large
number of measurements being obtained
in typical modern-day monitoring
networks has resulted in a burden
overload that has threatened the
viability of the data certification
process. For example, monitoring
agencies have struggled with the
availability of specific QA checks that
can be used to meet the certification
requirements for PAMS and CSN data,
and the EPA’s discretionary review of
data certification submissions has
become increasingly incomplete or
delayed due to the enormous number of
monitors being submitted for
certification under the current
requirements.
The EPA believes that the data
certification requirements need to be
revised to streamline the associated
workload for monitoring agencies as
well as the EPA so that the process can
be focused on those measurements that
have the greatest impacts on state programs,
namely the criteria pollutants that
support the calculation of annual design
values and the mandatory designations
process. By focusing the data
certification process on the NAAQS, the
greatest value will be derived from the
certification process and both the
monitoring agencies and the EPA will
be able to devote scarce resources to the
most critical of ambient monitoring
objectives. The EPA is not implying that
the need for thorough data validation
processes is unimportant for non-criteria pollutants. However, we believe
that existing QA plans and standard
operating procedures, together with the
regulatory language in 40 CFR 58.16(c)
to edit and report validated data, are
sufficient to assure the quality of non-criteria pollutant measurements being
reported to AQS.
Accordingly, the EPA is proposing
several changes in the data certification
requirements to accomplish a
streamlining of this important process.
First, to support the focus on
certification of criteria pollutant
measurements, the EPA is proposing to
revise relevant sections of 40 CFR 58.15
to focus the requirement on FRM, FEM,
and ARM monitors at SLAMS and at
SPM stations rather than at all SLAMS
which also include PAMS and CSN
measurements that may not utilize
federally approved methods. This
proposed wording change limits the
data certification requirement to criteria
pollutants since EPA-approved
methods do not exist for non-criteria
measurements. Second, the EPA is also
proposing that the required AQS reports
be submitted to the Regional
Administrator rather than through the
Regional Administrator to the
Administrator as is currently required.
From a process standpoint, this
proposed change effectively places each
EPA regional office in charge of the
entire data certification process
(including the discretionary review)
versus the EPA headquarters where the
discretionary reviews have taken place
in the past. This delegation of
responsibility for the discretionary
review will allow this important part of
the certification process to be shared
among the ten EPA regional offices, and
result in a more timely review of
certification results and the posting of
appropriate certification status flags in
AQS. The EPA notes that significant
progress has already been made in
revising this part of the certification
process and that a new AQS report, the
AMP 600, has been developed to more
efficiently support the sharing of
relevant information between certifying
agencies and the EPA regional offices.12
Additionally, minor editorial changes
are being proposed in 40 CFR 58.15 to
generalize the title of the official
responsible for data certification (senior
official versus senior air pollution
control officer) and to remove an
outdated reference to the former due
date for the data certification letter (July
1 versus the current due date of May 1).
H. Data Submittal and Archiving
Requirements
The requirements described in 40 CFR
58.16 address the specific
measurements that must be reported to
AQS as well as the relevant schedule for
doing so. Required measurements
include criteria pollutants in support of
NAAQS monitoring objectives as well as
public reporting, specific ozone (O3) and
PM2.5 precursor measurements such as
those obtained at PAMS, NCore, and
CSN stations, selected meteorological
measurements at PAMS and NCore
stations, and associated QA data that
support the assessment of precision and
bias.
In 1997, an additional set of required
supplemental measurements was added
to 40 CFR 58.16 in support of the newly
promulgated FRM for PM2.5, described
in 40 CFR part 50, appendix L. These
measurements included maximum,
minimum, and average ambient
temperature; maximum, minimum, and
average ambient pressure; flow rate
coefficient of variation (CV); total
sample volume; and elapsed sample
time. In the 2006 monitoring
amendments, many of these
supplemental measurements were
removed from the requirements based
on the EPA’s confidence that the PM2.5
FRM was meeting data quality
objectives (see 71 FR 2748).
12 Note relevant training material available on the AQS TTN Web site: https://www.epa.gov/ttn/airs/airsaqs/training/2013_Q2_Webinar_Final.pdf.
At that
time, reporting requirements were
retained for average daily ambient
temperature and average daily ambient
pressure, as well as any applicable
sampler flags, in addition to PM2.5 mass
and field blank mass. Given the
additional years of data supporting the
performance of the PM2.5 FRM as well
as the near ubiquitous availability of
meteorological data available from
sources such as the National Weather
Service automated surface observing
system 13 in addition to air quality
networks, the EPA believes that it is no
longer necessary to require agencies to
report the average daily temperature and
average daily pressure from manual
PM2.5 samplers, thereby providing some
modest relief from the associated
reporting burden. Accordingly, the EPA
is proposing to remove AQS reporting
requirements for average daily
temperature and average daily pressure
as related to PM2.5 measurements with
the expectation that monitoring agencies
will retain such measurements as
needed to support data validation needs
as well as to fulfill requirements in
associated QA project plans and
standard operating procedures. The EPA
is also proposing to remove similar
language referenced elsewhere in 40
CFR 58.16 that pertains to
measurements at Pb sites as well as to
other average temperature and average
pressure measurements recorded by
samplers or from nearby airports. For
the reasons noted above, the EPA
believes that meteorological data are
more than adequately available from a
number of sources, and that the removal
of specific requirements for such data to
be reported to AQS represents an
opportunity for burden reduction. The
EPA notes that the requirement to report
specific meteorological data for NCore
and PAMS stations remains unchanged.
The EPA is also proposing a change
to the data reporting schedule described
in 40 CFR 58.16(b) and (d) to provide
additional flexibility for reporting PM2.5
chemical speciation data measured at
CSN stations. Specifically, we are
proposing that such data be required to
be reported to AQS within 6 months
following the end of each quarterly
reporting period, as is presently
required for certain PAMS
measurements such as volatile organic
compounds. This change would provide
an additional 90 days for PM2.5 chemical
speciation data to be reported compared
with the current requirement of
reporting 90 days after the end of each
quarterly reporting period.
13 See https://www.nws.noaa.gov/ost/asostech.html.
This change
is being proposed to provide both the
EPA and monitoring agencies with
potential data reporting flexibility as
technological and procedural revisions
are considered for the national
analytical frameworks that support the
CSN network. Given that the primary
objectives of the CSN (and IMPROVE)
programs are to support long-term needs
such as SIP development, modeling, and
health studies, the EPA believes that
such programs would not be negatively
impacted by the revised reporting
requirements and that potential
contractual efficiencies could be
realized by allowing more time for
analytical laboratories to complete their
QA reviews and report their results to
AQS.
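For illustration only, the deadline arithmetic described above can be sketched as follows (a minimal Python sketch; the function names and example dates are ours and are not part of the proposed rule):

    import calendar
    from datetime import date, timedelta

    def quarter_end(year: int, quarter: int) -> date:
        # Last day of a calendar quarter (quarter = 1..4).
        month = quarter * 3
        return date(year, month, calendar.monthrange(year, month)[1])

    def add_months(d: date, months: int) -> date:
        # Simple month arithmetic, clamped to the last day of the target month.
        m = d.month - 1 + months
        year, month = d.year + m // 12, m % 12 + 1
        return date(year, month, min(d.day, calendar.monthrange(year, month)[1]))

    q_end = quarter_end(2014, 1)              # 2014-03-31
    current_due = q_end + timedelta(days=90)  # current rule: 90 days after quarter end
    proposed_due = add_months(q_end, 6)       # proposal: 6 months after quarter end
    print(q_end, current_due, proposed_due)   # the proposal adds roughly 90 days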
I. Network Design Criteria (Appendix D)
The EPA is proposing two changes
that affect the required suite of
measurements in the NCore network.
This multi-pollutant network became
operational on January 1, 2011, and
includes approximately 80 stations that
are located in both urban and rural
areas.14
The EPA is proposing a minor change
to section 3 of appendix D to part 58,
the design criteria for NCore sites.
Specifically, we are proposing to delete
the requirement to measure speciated
PM10-2.5 from the list of measurements
in section 3(b). An identical revision
was finalized in the text of 40 CFR
58.16(a) in the 2013 PM NAAQS final
rule (see 78 FR 3244). At that time, we
noted the lack of consensus on
appropriate sampling and analytical
techniques for speciated PM10-2.5, and
the pending analysis of data from a pilot
project that examined these issues.
Based on the supportive comments
received from monitoring agencies and
multi-state organizations, the EPA
deleted the requirement for speciated
PM10-2.5 from 40 CFR 58.16(a). During
this process, the EPA inadvertently
failed to complete a similar change that
was required in the language of section
3 of appendix D. Accordingly we are
proposing this change to align the
NCore monitoring requirements
between the two sections noted above.
The EPA is also proposing to delete
the requirement to measure Pb at urban
NCore sites, either as Pb in Total
Suspended Particles (Pb–TSP) or as Pb–
PM10. This requirement was finalized as
part of the reconsideration of Pb
monitoring requirements that occurred
in 2010 (see 75 FR 81126). At that time,
we noted that monitoring of Pb at such
nonsource locations at NCore sites
would support the characterization of
typical neighborhood-scale Pb
concentrations in urban areas to assist
with the understanding of the risk posed
by Pb to the general population.
14 See https://www.epa.gov/ttn/amtic/ncore/index.html for more information.
We also
noted that such information could assist
with the determination of
nonattainment boundaries and support
the development of long-term trends.
Since this requirement was finalized
in late 2010, nonsource lead data has
been measured at 50 urban NCore sites,
with the majority of sites having already
collected at least 2 years of data. In all
cases, valid ambient Pb readings have
been low, with maximum 3-month
rolling averages typically reading
around 0.01 micrograms per cubic meter
as compared to the NAAQS level of 0.15
micrograms per cubic meter.15 We
expect the majority of sites to have the
3 years necessary to calculate a design
value following the completion of
monitoring in 2014. Given the
uniformly low readings being measured
at these NCore sites, we believe it is
appropriate to consider eliminating this
requirement. As noted in the associated
docket memo, nonsource Pb data will
continue to be measured (as Pb–PM10) at
the 27 National Air Toxics Trends Sites
(NATTS) and at hundreds of PM2.5
speciation stations that comprise the
CSN and IMPROVE networks. The EPA
believes that these ongoing networks
adequately support the nonsource
monitoring objectives articulated in the
2010 Pb monitoring reconsideration.
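For illustration, the design value arithmetic referenced here can be sketched as follows (Python; the monthly means are hypothetical, and the simplified rolling average ignores data completeness requirements):

    # Hypothetical monthly mean Pb concentrations (µg/m3) at a nonsource NCore site.
    monthly_means = [0.008, 0.010, 0.012, 0.009, 0.011, 0.010,
                     0.009, 0.008, 0.010, 0.011, 0.009, 0.010]

    NAAQS_LEVEL = 0.15  # µg/m3, evaluated as a rolling 3-month average

    # Rolling 3-month averages; the Pb design value is the maximum such average
    # over the 3-year monitoring period.
    rolling = [sum(monthly_means[i:i + 3]) / 3
               for i in range(len(monthly_means) - 2)]
    print(f"max 3-month average: {max(rolling):.3f} vs NAAQS level {NAAQS_LEVEL}")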
Accordingly, the EPA is proposing to
delete the requirement to monitor for
nonsource Pb at NCore sites from
appendix D of 40 CFR part 58.16 Given
the requirement to collect a minimum of
3 years of Pb data in order to support
the calculation of design values, the
EPA proposes that monitoring agencies
would be able to request permission to
discontinue nonsource monitoring
following the collection of at least 3
years of data at each urban NCore site.17
Affected monitoring agencies should
work closely with their respective EPA
regional offices to ensure coordination
of these changes to the network.
15 See supporting information for reconsideration of existing requirements to monitor for lead at urban NCore site, Kevin Cavender, Docket number EPA–HQ–OAR–2013–0619.
16 Specific revisions are proposed in 40 CFR part 58, appendix D, section 3(b) and sections 4.5(b) and 4.5(c).
17 The EPA will review requests for shutdown under the provisions of 40 CFR 58.14. Although EPA anticipates that these nonsource monitors will have design values well below the NAAQS and will be eligible to be discontinued after three years of data have been collected, in the event that a monitor records levels approaching the NAAQS it may not qualify to be discontinued.
The EPA solicits comments on these
proposed changes to Pb monitoring
requirements.
III. Proposed Changes to Quality
Assurance Requirements
A. Quality Assurance Requirements for
Monitors Used in Evaluations for
National Ambient Air Quality
Standards—Appendix A
1. General Information
The following proposed changes to
monitoring requirements impact these
subparts of part 58—Ambient Air
Quality Surveillance; appendix A—
Quality Assurance Requirements for
SLAMS, SPMs and PSD Air Monitoring.
Changes that affect the overall appendix
follow, while those specific to the
various sections of the appendix will be
addressed under a specific section
heading. The EPA notes that the entire
regulatory text section for appendix A is
being reprinted with this proposal since
this section is being reorganized for
clarity as well as being selectively
revised as described in detail below.
Likewise, although the EPA is proposing
a new appendix B to apply to PSD
monitors, much of the content of
appendix B is taken directly from the
existing requirements for these monitors
set forth in appendix A. The EPA is
soliciting comment on the specific
provisions of appendices A and B that
are being proposed for revision.
However, there are a number of
provisions that are being reprinted in
the regulatory text solely for clarity to
assist the public in understanding the
changes being proposed; the EPA is not
soliciting comment on those provisions
and considers changes to those
provisions to be beyond the scope of
this rulemaking.
The QA requirements in appendix A
have been developed for measuring the
criteria pollutants of O3, NO2, sulfur
dioxide (SO2), CO, Pb and PM (PM10
and PM2.5) and are minimum
requirements for monitoring these
ambient air pollutants for use in
NAAQS attainment demonstrations. To
emphasize the objective of this
appendix, the EPA proposes to change
the title of appendix A to ‘‘Quality
Assurance Requirements for Monitors
used in Evaluations of National
Ambient Air Quality Standards,’’ and
remove the terms SLAMS and SPMs
from the title. We do, however, in the
applicability paragraph, indicate that
any monitor identified as SLAMS must
meet the appendix A criteria in order to
avoid any confusion about SLAMS
monitors measuring criteria pollutants.
Special purpose monitors may in fact be
monitoring for a criteria pollutant for
objectives other than NAAQS
determinations. Therefore, appendix A
attempts to clarify in the title and the
applicability section that the QA
requirements specified in this appendix
are for criteria pollutant monitors that
are designated, through the part 58
ambient air regulations and monitoring
organization annual monitoring network
plans, as eligible to be used for NAAQS
evaluation purposes. The applicability
section also provides a reporting
mechanism in AQS to identify any
criteria pollutant monitors that are not
used for NAAQS evaluations. The
criteria pollutant monitors identified for NAAQS
exclusion will require review and
approval by the EPA regional offices
and will increase transparency and
efficiencies in the NAAQS designation,
data quality evaluation and data
certification processes.
The current appendix A regulation
has separate sections for automated
(continuous) and manual method types.
Since there are continuous and manual
methods for measuring PM which have
different quality control (QC)
requirements, monitoring organizations
have found it difficult to navigate the
current appendix A requirements. The
EPA proposes to reformat the document
by pollutant rather than by method type.
The four gaseous pollutants (CO, NO2,
SO2 and O3) will be contained in one
section since the QC requirements are
very similar, and separate sections will
be provided for PM10, PM2.5, and Pb.
In the 2006 monitoring rule revisions,
the PSD QA requirements, which were
previously in appendix B, were added
to appendix A and appendix B was
reserved. The PSD requirements, in
most cases, mimicked appendix A in
structure but because PSD monitoring is
often only for a period of one year, some
of the frequencies of implementation of
the PSD requirements are higher than
the appendix A requirements. In
addition, the agencies governing the
implementation, assessment and
approval of the QA requirements are
different for PSD and ambient air
monitoring for NAAQS decisions (i.e.,
the EPA regions for appendix A versus
reviewing authorities for PSD). The
combined regulations have caused
confusion among monitoring
organizations and those implementing
PSD requirements, and the EPA
proposes that the PSD requirements be
moved back to a separate appendix B.
This change would also provide more
flexibility for revision if changes in
either appendix are needed. Details of
this proposed change will follow in
Section III.B.
Finally, the EPA proposes that the
appendix A regulation emphasize the
use of PQAO and moved the definition
and explanation to the beginning of the
regulation in order to ensure that the
application and use of PQAO in
appendix A is clearly understood. The
definition for PQAO is not being
proposed for change. Since the PQAO
can be a consolidation of a number of
local monitoring organizations, the EPA
proposes to add a sentence clarifying
that the agency identified as the PQAO
(usually the state agency) will be
responsible for overseeing that the
appendix A requirements are being met
by all consolidated local agencies
within the PQAO. The current appendix A
regulation requires PQAOs to be
approved by the EPA regions during
network reviews or audits. The EPA
believes this approval can occur at any
time and proposes to eliminate the
wording that suggests that PQAO
approvals can only occur during events
like network reviews or audits.
2. Quality System Requirements
The EPA proposes to remove the QA
requirements for PM10-2.5 (see current
sections 3.2.6, 3.2.8, 3.3.6, 3.3.8, 4.3).
Appendix A has traditionally been used
to describe the QA requirements of the
criteria pollutants used in making
NAAQS attainment decisions. While the
40 CFR part 58 Ambient Air Monitoring
regulation requires monitoring for the
CSN, PAMS, and total oxides of
nitrogen (NOy) for NCore, the QA
requirements for these networks are
found in technical assistance documents
and not in appendix A. In 2006, the EPA
proposed a PM10-2.5 NAAQS along with
requisite QA requirements in appendix
A. While the PM10-2.5 NAAQS was not
promulgated, PM10-2.5 monitoring was
required to be performed at NCore sites
and the EPA proposed requisite QA
requirements in appendix A. Some of
the PM QC requirements, like
collocation for precision and the
performance evaluation programs for
bias, are accomplished on a percentage
of monitoring sites within a PQAO. For
example, collocated sampling for PM2.5
and PM10 is required at approximately
15 percent of the monitoring sites
within a PQAO. Since virtually every
NCore site is the responsibility of a
different PQAO, the appendix A
requirements for PM10-2.5, if
implemented at the PQAO level, would
have been required to be implemented
at almost every NCore site, which would
have been expensive and an unintended
burden. Therefore, the EPA required the
implementation of the PM10-2.5 QC
requirements at a national level and
worked with the EPA regions and
monitoring organizations to identify the
sites that would implement the
requirements. The implementation of
the PM10-2.5 QC requirements at NCore
sites fundamentally changed how QC is
implemented in appendix A and has
been a cause of confusion with these
parties. Since PM10-2.5 is not a NAAQS
pollutant and the QC requirements
cannot be cost-effectively implemented
at a PQAO level, the EPA is proposing
to eliminate the PM10-2.5 requirements
including flow rate verifications, semi-annual flow rate audits, collocated
sampling procedures, and the PM10-2.5
Performance Evaluation Program (PEP).
Similar to the technical assistance
documents associated for the CSN 18 and
PAMS 19 networks, the EPA will
develop QA guidance for the PM10-2.5
network which will afford more
flexibility for implementation and
revision of QC activities for PM10-2.5.
The EPA proposes that the QA Pb
requirements of collocated sampling
(see current section 3.3.4.3) and Pb
performance evaluation procedures (see
current section 3.3.4.4) for non-source
NCore sites be eliminated. The 2010 Pb
rule in 40 CFR part 58, appendix D,
section 4.5(b), added a requirement to
conduct non-source oriented Pb
monitoring at each NCore site in a core
based statistical area (CBSA) with a
population of 500,000 or more. This
requirement had some monitoring
organizations implementing Pb
monitoring at only one site, the NCore
site. Since the appendix A requirements
are focused on PQAOs, the QC
requirements would increase at PQAOs
that were required to implement Pb
monitoring at NCore. Similar to the
PM10-2.5 QA requirements, the
requirement for Pb at NCore sites forced
the EPA away from a focus on PQAOs
to working with the EPA regions and
monitoring organizations for
implementation of the Pb Performance
Evaluation Program (Pb-PEP) at national
levels. Therefore, the EPA is proposing
to eliminate the collocation requirement
and the Pb-PEP requirements while
retaining the requirements for flow rate
verifications and flow rate audits which
do not require additional monitors or
independent sampling and analysis.
Similar to the CSN and PAMS programs,
the EPA will develop QA guidance for
the Pb NCore network which will afford
more flexibility for change/revision to
accommodate Pb monitoring at
non-source NCore sites.
18 See https://www.epa.gov/ttn/amtic/specguid.html for CSN quality assurance project plan.
19 See https://www.epa.gov/ttn/amtic/pamsguidance.html for PAMS technical assistance document.
Additionally, the
EPA is proposing to delete the
requirement to measure Pb at these
specific NCore sites, either as Pb-TSP or
as Pb-PM10 (see section II.I of this rule).
If that proposed change is finalized, it
will eliminate the need for any
associated QA requirements including
collocation, Pb-PEP or any QC
requirements for these monitors. If the
proposed change to NCore Pb
requirements is not finalized, then the
EPA will consider the proposed revision
to QA requirements as described above
on its own merits.
The EPA proposes that quality
management plan (QMP) (current
section 2.1.1) and quality assurance
project plan (QAPP) (current section
2.1.2) submission and approval dates be
reported by monitoring organizations
and the EPA. This will allow for timely
and accurate reporting of this
information. Since 2007, the EPA has
been tracking the submission and
approval of QMPs and QAPPs by
polling the EPA regions each year and
updating a spreadsheet posted on the AMTIC
Web site. The development of the
annual spreadsheet is time consuming
on the part of monitoring organizations
and the EPA. It is expected that
simplified reporting at the monitoring
organization and the EPA regional office
level to AQS will reduce entry errors
and the burden of incorporating this
information into annual spreadsheets,
and increase transparency of this
important quality system
documentation. In order to reduce the
initial burden of this data entry activity,
the EPA has populated AQS with the
last set of updated QMP and QAPP data
from the annual spreadsheet review
cycle. If this portion of the proposal is
finalized, monitoring organizations will
only need to update AQS as necessary.
In addition, some monitoring
organizations have received delegation
of authority to approve their QAPP
through the monitoring organization’s
own QA organization. The EPA
proposes that if a PQAO or monitoring
organization has been delegated
authority to review and approve their
QAPP, an electronic copy must be
submitted to the EPA regional office at
the time it is submitted to the PQAO/
monitoring organization’s QAPP
approving authority. Submission of an
electronic version to the EPA at the time
of completion is not considered an
added burden on the monitoring
organization because such submission is
already a standard practice as part of the
review process for technical systems
audits.
The EPA proposes to add some
clarifying language to the section
describing the National Performance
Evaluation Program (NPEP) (current
section 2.4) explaining self-implementation of the performance
evaluation by the monitoring
organization. The clarification also adds
the definition of independent
assessment which is included in the
PEP (PM2.5-PEP, Pb-PEP and National
Performance Audit Program (NPAP))
QAPPs and guidance and is included in
the self-implementation memo sent to
the monitoring organizations on an
annual basis and posted on the AMTIC
Web site 20. The clarification is not a
new requirement but provides a better
reference for this information in
addition to the annual memo sent to the
monitoring organizations.
The EPA proposes to add some
clarifying language to the technical
systems audits (TSA) section (current
section 2.4). The current TSA
requirements are performed at the
monitoring organization level. Since the
EPA is revising the language in
appendix A to focus on PQAOs instead
of monitoring organizations, this may
have an effect on those EPA Regions
that want to perform TSA on monitoring
organizations within a PQAO (a PQAO
can be a single monitoring organization
or a consolidation of a number of local
monitoring organizations). The EPA
proposes a TSA frequency of 3 years for
each PQAO, but includes language that
if a PQAO is made up of a number of
monitoring organizations, all monitoring
organizations within the PQAO be
audited within 6 years. This proposed
language maintains the every 3 years
TSA requirement as it applies to PQAOs
but provides additional flexibility for
the EPA regions to audit every
monitoring organization within the
PQAO every 6 years. This change does
not materially affect the burden on
monitoring organizations.
The EPA proposes to require
monitoring organizations to complete an
annual survey for the Ambient Air
Protocol Gas Verification Program (AA–
PGVP) (current section 2.6.1). Since
2009, the EPA has had a separate
information collection request (ICR)
requiring monitoring organizations to
complete an annual survey of the
producers that supply their gas
standards (for calibrations and QC) in
order to be able to select standards from
these producers for verification. The
survey generally takes less than 10
minutes to complete. The EPA proposes
to add the requirement to appendix A.
In addition, the EPA proposes to add
language that monitoring organizations
participate, at the request of the EPA, in
the AA–PGVP by sending a gas standard
to one of the verification laboratories
every 5 years.
20 See https://www.epa.gov/ttn/amtic/npepqa.html.
Since many monitoring
organizations already volunteer to send
in cylinders, this proposed new
requirement may not materially affect
most agencies and will not affect those
agencies not using gas standards.
3. Quality Control Checks for Gases
The EPA proposes to lower the audit
concentrations (current section 3.2.1) of
the one-point QC checks to between
0.005 and 0.08 parts per million (ppm)
for SO2, NO2, and O3 (currently 0.01 to
0.1 ppm), and to between 0.5 and 5 ppm
for CO monitors (currently 1 to 10 ppm).
With the development of more sensitive
monitoring instruments with lower
detection limits, technical
improvements in calibrators, and lower
ambient air concentrations in general,
the EPA believes this revision will better
reflect the precision and bias of the data.
Since the audit concentrations are
selected using the mean or median
concentration of typical ambient air
concentrations (guidance on this is
provided in the QA Handbook 21), the
EPA is proposing to add some
clarification to the current language by
requiring monitoring organizations to
select either the highest or lowest
concentration in the ranges identified if
their mean or median concentrations are
above or below the prescribed range.
There is no additional burden to this
requirement since the frequency is the
same and the audit concentrations are
not so low as to make them
unachievable to generate or measure.
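One way to picture the proposed selection logic is the following minimal sketch (Python; the function and variable names are ours): the one-point QC concentration is the mean or median of typical ambient concentrations, moved to the low or high end of the prescribed range when it falls outside.

    # Proposed one-point QC check ranges (ppm), from the preamble above.
    PROPOSED_RANGES = {
        "SO2": (0.005, 0.08),
        "NO2": (0.005, 0.08),
        "O3":  (0.005, 0.08),
        "CO":  (0.5, 5.0),
    }

    def one_point_qc_concentration(pollutant: str, typical_ppm: float) -> float:
        # Clamp the mean/median ambient concentration into the prescribed range.
        low, high = PROPOSED_RANGES[pollutant]
        return min(max(typical_ppm, low), high)

    print(one_point_qc_concentration("SO2", 0.002))  # below range -> 0.005
    print(one_point_qc_concentration("CO", 7.5))     # above range -> 5.0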
The EPA proposes to remove
reference to zero and span adjustments
(current section 3.2.1.1) and revise the
one-point QC language to simply require
that the QC check be conducted before
any calibration or adjustment to the
monitor. Recent revisions of the QA
Handbook discourage the
implementation of frequent span
adjustments so the proposed language
helps to clarify that no adjustment be
made prior to implementation of the
one-point QC check.
The EPA proposes to remove the
requirement (current section 3.2.2) to
implement an annual performance
evaluation for one monitor in each
calendar quarter when monitoring
organizations have less than four
monitoring instruments. The minimum
requirement for the annual performance
evaluation for the primary monitor at a
site is one per year. The current
regulation requires evaluation of the
monitors at 25 percent per quarter so
that the performance evaluations are
performed in all four quarters.
21 QA Handbook for Air Pollution Measurement Vol. II, Ambient Air Quality Monitoring Program at: https://www.epa.gov/ttn/amtic/qalist.html.
There are
cases where some monitoring
organizations have less than four
primary monitors for a gaseous
pollutant, and the current language
suggests that a monitor already
receiving a performance evaluation be
re-audited to provide for performance
evaluations in all four quarters. This is
a burden reduction for monitoring
agencies operating smaller networks and
does not change the requirement of an
annual performance evaluation for each
primary monitor.
The current annual performance
evaluation language (current section
3.2.2.1) requires that the audits be
conducted by selecting three
consecutive audit levels (currently five
audit levels are provided in appendix
A). Due to the implementation of the
NCore network, the inception of trace
gas monitors, and lower ambient air
concentrations being measured under
typical circumstances, there is a need
for audit levels at lower concentrations
to more accurately represent the
uncertainties present in much of the
ambient data. The EPA proposes to
expand the audit levels from five to ten
and remove the requirement to audit
three consecutive levels. The current
regulation also requires that the three
audit levels should bracket 80 percent of
the ambient air concentrations
measured by the analyzer. This current
language has caused some confusion
and monitoring organizations have
requested the use of an audit point to
establish monitor accuracy around the
NAAQS levels. Therefore, the EPA is
proposing to revise the language so that
two of the audit levels selected
represent 10–80 percent of
routinely-collected ambient concentrations
either measured by the monitor or in the
PQAO's network of monitors. The
proposed revision allows the third point
to be selected at the NAAQS level (e.g.,
75 ppb for SO2) or above the highest
3-year routine hourly concentration,
whichever is greater.
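As a rough sketch of the proposed selection rule (Python; we read "10-80 percent" as fractions of the highest routinely-collected concentration, which is one plausible interpretation only, and the inputs are hypothetical; the proposed regulatory text governs):

    def select_audit_points(routine_hourly_ppm, naaqs_level_ppm):
        # Two audit points represent 10-80 percent of routinely-collected
        # concentrations; the third is the NAAQS level or the highest 3-year
        # routine hourly concentration, whichever is greater.
        peak = max(routine_hourly_ppm)
        low_point = 0.10 * peak
        high_point = 0.80 * peak
        third_point = max(naaqs_level_ppm, peak)
        return low_point, high_point, third_point

    # Hypothetical SO2 example; the SO2 NAAQS level is 0.075 ppm (75 ppb).
    print(select_audit_points([0.004, 0.012, 0.060], 0.075))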
The EPA proposes to revise the
language (current section 3.2.2.2(a))
addressing the limits on excess nitric
oxide (NO) that must be followed during
gas phase titration (GPT) procedures
involving NO2 audits. The current NO
limit (maintaining at least 0.08 ppm) is
very restrictive and requires auditors to
make numerous mid-audit adjustments
during a GPT that result in making the
NO2 audit a very time consuming
procedure. Monitoring agency staff have
advised us that the observance of such
excess NO limits has no apparent effect
on NO2 calibrations being conducted
with modern-day GPT capable
calibration equipment and, therefore,
that the requirement in the context of
performing audits is unnecessary.22 We
also note the increasing availability of
EPA-approved direct NO2 methods
that do not utilize converters, which
may render GPT techniques that require
the output of NO and NOX a
diminishingly used procedure in the
future. Accordingly,
we have proposed a more general
statement regarding GPT that
acknowledges the ongoing usage of
monitoring agency procedures and
guidance documents that have
successfully supported NO2 calibration
activities. The EPA believes that if such
procedures have been successfully used
during calibrations when instrument
adjustments are potentially being made,
then such procedures are appropriate
for audit use when instruments are not
subject to adjustment. The EPA solicits
comment on this proposed
generalization of the GPT requirements,
including whether a more specific set of
requirements similar to the current
excess NO levels can be developed
based on operational experience and/or
peer reviewed literature.
The EPA proposes to remove language
(current section 3.2.2.2(b)) in the annual
performance evaluation section that
requires regional approval for audit
gases for any monitors operating at
ranges higher than 1.0 ppm for O3, SO2
and NO2 and greater than 50 ppm for
CO. The EPA does not need to approve
a monitoring organization’s use of audit
gases to audit above proposed
concentration levels. There should be
very few cases where a performance
evaluation needs to be performed above
level 10, but there may be some
legitimate instances (e.g., SO2 audits in
areas impacted by volcanic emissions).
Since data reported to AQS above the
highest level may be flagged or rejected,
the EPA proposes that PQAOs notify the
EPA regions of sites auditing at
concentrations above level 10 so that
reporting accommodations can be made.
The EPA proposes to provide
additional explanatory language in
appendix A to describe the NPAP
(current section 2.4). The NPAP has
been a long standing program for the
ambient air monitoring community. The
NPAP is a performance evaluation
which is a type of audit where
quantitative data are collected
independently in order to evaluate the
proficiency of an analyst, monitoring
instrument or laboratory. It has been
briefly mentioned in section 2.4 of the
current appendix A requirements.
22 See supporting information in Excess NO Issue paper, Mike Papp and Lewis Weinstock, Docket number EPA–HQ–OAR–2013–0619.
Since
2007, the EPA has distributed a memo
to all monitoring organizations in order
to determine whether the monitoring
organization plans to self-implement the
NPAP program or utilize the federally
implemented program. In order to make
this decision, the NPAP adequacy and
independence requirements are
described in the memo. The EPA
proposes to include these same
requirements in appendix A in a
separate section for NPAP. In addition,
the memo currently states that 20
percent of the sites would be audited
each year and, therefore, all sites would
be audited in a 5-year period. Since
there is a possibility that monitoring
organizations may want some higher
priority sites audited more frequently,
the EPA is proposing to revise the
language to require all sites to be
audited within a 6-year period to
provide more flexibility and discretion
for monitoring agencies. This revision
does not change the number of sites
audited in any given year, but allows for
increased frequency of sites deemed as
high priority.
4. Quality Control Checks for Particulate
Monitors
The EPA proposes to require that flow
rate verifications (current section 3.2.3)
be reported to AQS. Particulate matter
concentrations (e.g., PM2.5, PM10, Pb) are
reported in mass per unit of volume
(e.g., µg/m3). Flow rate verifications are
implemented at required frequencies in
order to ensure that the PM sampler is
providing an accurate and repeatable
measure of volume which is critical for
the determination of concentration. If a
given flow rate verification does not
meet acceptance criteria, the EPA
guidance suggests that data may be
invalidated back to the most recent
acceptable verification which is why
these checks are performed at higher
frequencies. Implementation of the flow
rate verification is currently a
requirement, but the reporting to AQS
has only been a requirement for PM10
continuous instruments. This is the only
QC requirement in appendix A that was
not fully required for reporting for all
pollutants and has been a cause of
confusion. When performing TSAs, the
EPA regions review the flow rate
verification information. There are cases
where it is difficult to find the flow rate
verification information to ascertain
completeness, data quality and whether
corrective actions have been
implemented in the case of flow rate
verification failures. In addition, the
EPA regions have mentioned that some
of the monitoring organizations have
been reporting this data to AQS in an
effort to increase transparency and
reliability in data quality. In a recent
review of 2012 data, out of the 1,110
SLAMS PM2.5 samplers providing flow
rate audit data (which are required to be
reported), flow rate verification data was
also reported for 543 samplers, or about
49 percent of the samplers with flow
rate audit data. With the development of
a new QA transaction in AQS, we
believe that the reporting of flow rate
verification data would improve the
evaluation of data quality for data
certification and at national levels, and
provide consistent interpretation of the
regulation for all PM pollutants, without
being overly burdensome
(approximately 12 verifications per
sampler per year).
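A minimal sketch of the verification arithmetic (Python; the ±4 percent acceptance criterion and the flow values are illustrative assumptions, not regulatory text):

    def flow_rate_percent_difference(sampler_flow_lpm: float,
                                     standard_flow_lpm: float) -> float:
        # Percent difference of sampler-indicated flow vs. a certified
        # transfer standard; this is the value that would be reported to AQS.
        return (sampler_flow_lpm - standard_flow_lpm) / standard_flow_lpm * 100.0

    # Hypothetical check of a PM2.5 sampler against its 16.67 L/min design flow.
    ACCEPTANCE_PCT = 4.0  # illustrative acceptance criterion (assumption)
    d = flow_rate_percent_difference(16.90, 16.67)
    print(f"{d:+.1f}% -> {'pass' if abs(d) <= ACCEPTANCE_PCT else 'investigate'}")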
In addition, the flow rate verification
requirements for all the particulate
monitors suggest randomization of the
implementation of flow rate
verifications with respect to time of day,
day of the week and routine service and
adjustments. Since this is a suggestion,
the EPA proposes to remove this
language from the regulation and
instead include it in QA guidance.
The EPA proposes to add clarifying
language to the PM2.5 collocation
requirements (current section 3.2.5) that
a site can only count toward the
collocation requirement for the method
designation of the primary monitor at
that site. Precision is
estimated at the PQAO level and at 15
percent of the sites for each method
designation that is designated as a
primary monitor. When developing the
collocation requirements, the EPA
intended to have the collocated
monitors distributed to as many sites as
possible in order to capture as much of
the temporal and spatial variability in
the PQAO. Therefore, since there can be
only one primary monitor at a site for
any given time period, it was originally
intended that the primary monitor and
the QA collocated monitor (for the
primary) at a monitoring site count as
one collocation. There have been some
cases where multiple monitoring
methods have been placed at a single
site to fulfill multiple collocation
requirements, which is not the intent of
the current requirement. For example, a
site (Site A) may have a primary
monitor that is designated as a FRM
(FRM A). This site may also have a FEM
(FEM B) at the site that is not the
primary monitor. If this site was
selected for collocation, then the QA
collocated monitor must be the same
method designation as the primary, so
the site would be collocated with
another FRM A monitor. For primary
monitors that are FEMs, the current
requirement calls for the first QA
collocated monitor of a FEM primary
monitor be a FRM monitor. Some
monitoring organizations have been
using the collocated FRM monitors at
Site A to satisfy the collocation
requirements for other sites (e.g., Sites
B, C, D) that have a FEM (FEM B or
other FEM) as the primary monitor
rather than placing a QA collocated
FRM monitor at Site B (C or D). This
was not the intent of the original
regulation and the EPA provided
additional guidance to monitoring
organizations in 2010 23 on the correct
(intended) interpretation. This revision
does not change the current regulation
and does not increase or decrease
burden, but is intended to provide
clarity on how the PQAO identifies the
number and types of monitors needed to
achieve the collocation requirements.
The EPA proposes to provide more
flexibility to monitoring organizations
when selecting sites for collocation.
Appendix A currently (current section
3.2.5.3) requires that 80 percent of the
collocated monitors be deployed at sites
within ±20 percent of the NAAQS and
if the monitoring organization does not
have sites within that range, then 60
percent of the sites are to be deployed
among the highest 25 percent of all sites
within the network. Monitoring
organizations have found this difficult
to achieve. Some monitoring
organizations do not have many sites
and, at times, due to permission, access
and limited space issues, the
requirement was not always achievable.
Realizing that the collocated monitors
provide precision estimates for the
PQAO (since only 15 percent of the sites
are collocated), while also
acknowledging that sites that measure
concentrations close to the NAAQS are
important, the EPA proposes to require
that 50 percent (reduction from 80
percent) of the collocated monitors be
deployed at sites within ±20 percent of
the NAAQS, and if the monitoring
organization does not have sites within
that range, then 50 percent of the sites
are to be deployed among the highest
sites within the network. Although this
requirement does not change the
number of sites requiring collocation, it
does provide the monitoring
organizations additional flexibility in
their choice of collocated sites.
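The deployment arithmetic can be sketched as follows (Python; rounding up to whole monitors is our assumption, and the inputs are hypothetical):

    import math

    def collocation_counts(n_sites_for_method: int):
        # Counts implied by the proposed PM2.5 collocation requirements:
        # 15 percent of sites per primary method designation are collocated,
        # and 50 percent of those go to sites within +/-20 percent of the NAAQS.
        n_collocated = math.ceil(0.15 * n_sites_for_method)
        n_near_naaqs = math.ceil(0.50 * n_collocated)
        return n_collocated, n_near_naaqs

    def near_naaqs(design_value: float, naaqs: float) -> bool:
        # Site concentration within +/-20 percent of the NAAQS level.
        return 0.8 * naaqs <= design_value <= 1.2 * naaqs

    print(collocation_counts(20))   # 20 sites -> 3 collocated, 2 near the NAAQS
    print(near_naaqs(13.1, 12.0))   # hypothetical annual PM2.5 design value -> True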
5. Calculations for Data Quality
Assessment
In order to provide reasonable
estimates of data quality, the EPA uses
data above an established threshold
concentration usually related to the
detection limits of the measurement.
23 QA EYE Issue 9, Page 3, at: https://www.epa.gov/ttn/amtic/qanews.html.
Measurement pairs are selected for use
in the precision and bias calculations
only when both measurements are
above a threshold concentration.
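To make the pair-selection rule concrete, here is a minimal sketch (Python; the relative-percent-difference statistic shown is a common collocated-precision form, and the data are hypothetical):

    def valid_pairs(primary, collocated, threshold):
        # Keep only pairs where BOTH measurements exceed the threshold.
        return [(x, y) for x, y in zip(primary, collocated)
                if x >= threshold and y >= threshold]

    def relative_percent_difference(x: float, y: float) -> float:
        # d = (x - y) / ((x + y) / 2) * 100
        return (x - y) / ((x + y) / 2.0) * 100.0

    # Hypothetical collocated Pb data (µg/m3) with the proposed 0.002 threshold;
    # with the old 0.02 threshold, none of these pairs would have been usable.
    primary    = [0.0015, 0.004, 0.010, 0.0018, 0.006]
    collocated = [0.0016, 0.005, 0.009, 0.0021, 0.006]
    pairs = valid_pairs(primary, collocated, threshold=0.002)
    print([round(relative_percent_difference(x, y), 1) for x, y in pairs])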
For many years, the threshold
concentration for Pb precision and bias
data was 0.02 µg/m3. The EPA
promulgated a new Pb FRM (see 78 FR
40000) utilizing the Inductively
Coupled Plasma Mass Spectrometry
(ICP–MS) analysis technique in 2013 as
a revision to appendix G of 40 CFR part
50.24 This new FRM demonstrated
method detection limits (MDLs) 25
below 0.0002 µg/m3, which is well
below the EPA requirement of five
percent of the current Pb NAAQS level
of 0.15 µg/m3, or 0.0075 µg/m3. As a
result of the increased sensitivity
inherent in this new FRM, the EPA
proposes to lower the acceptable Pb
threshold concentration (current section 4)
from the current value of 0.02 µg/m3 to
0.002 µg/m3 for measurements obtained
using the new Pb FRM and other more
recently approved equivalent methods
that have the requisite increased
sensitivity.26 The current 0.02 µg/m3
value will be retained for the previous
Pb FRM that has subsequently been
re-designated as Federal Equivalent
Method EQLA–0813–803, as well as
older equivalent methods that were
approved prior to the more recent work
on developing more sensitive methods.
Since ambient Pb concentrations are
lower and methods are more sensitive,
lowering the threshold concentration
will allow much more collocated
information to be evaluated, which will
provide more representative estimates of
precision and bias.
The EPA also proposes to remove the
total suspended particulate (TSP)
threshold concentration for precision
and bias since TSP is no longer a
NAAQS required pollutant and the EPA
no longer has QC requirements for it.
The EPA proposes to remove the
statistical check currently described in
section 4.1.5 of appendix A. The check
was developed to perform a comparison
of the one-point QC checks and the
annual performance evaluation data
performed by the same PQAO. The
section suggests that 95 percent of all
the bias estimates from the annual
performance evaluation (reported as a
percent difference) should fall within
the 95 percent probability interval
developed using the one-point QC
checks.
24 See 78 FR 40000, July 3, 2013.
25 MDL is described as the minimum concentration of a substance that can be measured and reported with 99-percent confidence that the analyte concentration is greater than zero.
26 FEMs approved on or after March 4, 2010, have the required sensitivity to utilize the 0.002 µg/m3 reporting limit with the exception of manual equivalent method EQLA–0813–803, the previous FRM based on flame atomic absorption spectroscopy.
The problem with this check is
that PQAOs with very good repeatability
on the one-point QC check data had a
hard time meeting this requirement
since the probability interval became
very tight, making it more difficult for
better performing PQAOs to meet the
requirement. Separate statistics to
evaluate the one-point QC checks and
the performance evaluations are already
promulgated, so the removal of this
check does not affect data quality
assessments.
Similar to the statistical comparison
of performance evaluations data, the
EPA proposes to remove the statistical
check (current section 4.2.4) to compare
the flow rate audit data and flow rate
verification data. The existing language
suggests that 95 percent of all the flow
rate audit data results (reported as
percent difference) should fall within
the 95 percent probability interval
developed from the flow rate
verification data for the PQAO. The
problem, as with the one-point QC
check, was that monitoring
organizations with very good
repeatability on the flow rate
verifications had a hard time meeting
this requirement since the probability
interval became very tight, making it
difficult for better performing PQAOs to
meet the requirement. Separate statistics
to evaluate the flow rate verifications
and flow rate audits are already
promulgated, so the removal of this
check does not affect data quality
assessments.
B. Quality Assurance Requirements for
Monitors Used in Evaluations of
Prevention of Significant Deterioration
Projects-Appendix B
1. General Information
The following proposed changes to
monitoring requirements impact these
subparts of part 58—Ambient Air
Quality Surveillance; appendix B—
Quality Assurance Requirements for
Prevention of Significant Deterioration
(PSD) Air Monitoring. Changes that
affect the overall appendix follow, while
those specific to the various sections of
the appendix will be addressed under
specific section headings. Since the PSD
QA requirements have been included in appendix A
since 2006, section headings refer to the
current appendix A sections.
The quality assurance requirements in
appendix B have been developed for
measuring the criteria pollutants of O3,
NO2, SO2, CO, PM2.5, PM10 and Pb and
are minimum QA requirements for the
control and assessment of the quality of
the PSD ambient air monitoring data
submitted to the PSD reviewing
authority 27 or the EPA by an
organization operating a network of PSD
stations.
In the 2006 monitoring rule revisions,
the PSD QA requirements, which were
previously in appendix B, were
consolidated with appendix A and
appendix B was held in reserve. The
PSD requirements, in most cases,
parallel appendix A in structure and
content but because PSD monitoring is
only required for a period of one year
or less, some of the frequencies of
implementation of the QC requirements
for PSD are higher than the
corresponding appendix A
requirements. In addition, the agencies
governing the implementation,
assessment and approval of the QA
requirements are different: the
reviewing authorities for PSD
monitoring and the EPA regions for
ambient air monitoring for NAAQS
decisions. The combined regulations
have caused confusion or
misinterpretations of the regulations
among the public and monitoring
organizations implementing NAAQS or
PSD requirements, and have resulted in
failure, in some cases, to perform the
necessary QC requirements.
Accordingly, the EPA proposes that the
PSD QA requirements be removed from
appendix A and returned to appendix B
which is currently reserved. Separating
the two sets of QA requirements would
clearly distinguish the PSD QA
requirements and allow more flexibility
for future revisions to either monitoring
program.
With this proposed rule, the EPA
would not change most of the QC
requirements for PSD. Therefore, the
discussion that follows will cover those
sections of the PSD requirements that
the EPA proposes to change from the
current appendix A requirements.
The applicability section of appendix
B clarifies that the PSD QA
requirements are not assumed to be
minimum requirements for data used in
NAAQS decisions. One reason for this
distinction is in the flexibility allowed
in PSD monitoring for the NPEP (current
appendix A section 2.4). The proposed
PSD requirements allow the PSD
reviewing authority to decide whether
implementation of the NPEP will be
performed. The NPEP, which is
described in appendix A, includes the
NPAP, PM2.5 Performance Evaluation
Program (PM2.5-PEP), and the Pb-PEP.
27 Permitting authority and reviewing authority
are often used synonymously in PSD permitting.
Since reviewing authority has been defined in 40
CFR 51.166(b), it is used throughout appendix B.
Accordingly, under the proposed rule, if
a PSD reviewing authority were to have
the intent of using PSD data for any
official comparison to the NAAQS
beyond the permitting application, such
as for attainment/nonattainment
designations or clean data
determinations, then all requirements in
appendix B including implementation
of the NPEP would apply. In this case,
monitoring would more closely conform
to the appendix A requirements. The
EPA proposes this flexibility for PSD
because the NPEP requires either federal
implementation or implementation by a
qualified individual, group or
organization that is not part of the
organization directly performing and
accountable for the work being assessed.
The NPEP may require specialized
equipment, certified auditors and a
number of activities which are
enumerated in the sections associated
with these programs. Arranging this
type of support service may be more
difficult for the operator of a single or
small number of PSD monitoring
stations operating for only a year or less.
The EPA cannot accept funding from
private contractors or industry, and
federal implementation of the NPEP for
PSD would face several funding and
logistical hurdles. This creates an
inequity in the NPEP implementation
options available to the PSD monitoring
organizations compared to the state/
local/tribal monitoring organization
monitoring for NAAQS compliance. The
EPA has had success in training and
certifying private contractors in various
categories of performance evaluations
conducted under NPEP, but many have
not made the necessary investments in
capital equipment to implement all
categories of the performance
evaluations. Since the monitoring
objectives for the collection of data for
PSD are not necessarily the same as
those for NAAQS evaluations, the EPA
proposes to allow the PSD reviewing
authority to determine whether a PSD
monitoring project must implement the
NPEP.
The EPA proposes to clarify the
definition of PSD PQAO. The PQAO
was first defined in appendix A in 2006
(current appendix A section 3.1.1) when
the PSD requirements were combined
with appendix A. The definition is not
substantially changed for PSD, but the
EPA proposes to clarify that a PSD
PQAO can only be associated with one
PSD reviewing authority. Distinguishing
among the PSD PQAOs that coordinate
with a PSD reviewing authority would
be consistent with discrete jurisdictions
for PSD permitting, and it would
simplify oversight of the QA
requirements for each PSD network.
Given that companies may apply for
PSD permits throughout the United
States, it is expected that some PSD
monitoring organizations will work with
multiple reviewing authorities. The PSD
PQAO code, which may appear in the
AQS database and other records, defines
the PSD monitoring organization or a
coordinated aggregation of such
organizations that is responsible for a
set of stations within one PSD reviewing
authority that monitors the same
pollutant and for which data quality
assessments will be pooled. The PSD
monitoring organizations that work with
multiple PSD reviewing authorities
would have individual PSD PQAO
codes for each PSD reviewing authority.
This approach will allow for the
flexibility to develop appropriate
quality systems for each PSD reviewing
authority.
The EPA proposes to add definitions
of ‘‘PSD monitoring organization’’ and
‘‘PSD monitoring network’’ to 40 CFR
58.1. The definitions have been
developed to improve understanding of
the appendix B regulations.
Since the EPA uses the term
‘‘monitoring organization’’ quite
frequently in the NAAQS associated
ambient air regulations, the EPA wants
to provide a better definition of the term
in the PSD QA requirements. Therefore,
the EPA proposes the term ‘‘PSD
monitoring organization’’ to identify ‘‘a
source owner/operator, a government
agency, or its contractor that operates an
ambient air pollution monitoring
network for PSD purposes.’’
The EPA also proposes to define ‘‘PSD
monitoring network’’ in order to
distinguish ‘‘a set of monitors that
provide concentration information for a
specific PSD permit.’’ The EPA will
place both definitions in 40 CFR 58.1.
2. Quality System Requirements
The EPA proposes to remove the
PM10-2.5 requirements for flow rate
verifications, semi-annual flow rate
audits, collocated sampling procedures
and PM10-2.5 Performance Evaluation
Program from appendix B (current
appendix A sections 3.2.6, 3.2.8, 3.3.6,
3.3.8, 4.3). In 2006, the EPA proposed a
PM10-2.5 NAAQS along with requisite
QA requirements in appendix A. While
the PM10-2.5 NAAQS was not
promulgated, PM10-2.5 monitoring was
required to be performed at NCore sites
and the EPA proposed requisite QA
requirements in appendix A. Since PSD
monitoring is distinct from monitoring
at NCore sites and PM10-2.5 is not a
criteria pollutant, it will be removed
from the PSD QA requirements.
The EPA proposes that the Pb QA
requirements of collocated sampling
(current appendix A section 3.3.4.3) and
Pb performance evaluation procedures
(current appendix A section 3.3.4.4) for
non-source oriented NCore sites be
eliminated for PSD. The 2010 Pb rule in
40 CFR part 58, appendix D, section
4.5(b), added a requirement to conduct
non-source oriented Pb monitoring at
each NCore site in a CBSA with a
population of 500,000 or more. Since
PSD does not implement NCore sites,
the EPA proposes to eliminate the Pb
QA language specific to non-source
NCore sites from PSD while retaining
the PSD QA requirements for routine Pb
monitoring.
The EPA proposes that elements of
QMPs and QAPPs, which are separate
documents described in
appendix A, sections 2.1.1 and 2.1.2,
can be combined into a single document
for PSD monitoring networks. The QMP
provides a ‘‘blueprint’’ of a PSD
monitoring organization’s quality
system. It includes quality policies and
describes how the organization as a
whole manages and implements its
quality system regardless of what
monitoring is being performed. The
QAPP includes details for implementing
a specific PSD monitoring activity. For
PSD monitoring, the EPA believes the
project-specific QAPP takes priority but
there are important aspects of the QMP
that could be incorporated into the
QAPP. The current appendix A
requirements allow smaller
organizations or organizations that do
infrequent work with EPA to combine
the QMP with the QAPP based on
negotiations with the funding agency,
and the EPA has provided guidance 28 on
a graded approach to developing these
documents. In the case of PSD QMPs
and QAPPs, the EPA proposes that the
PSD reviewing authority, which has the
approval authority for these documents,
also have the flexibility for allowing the
PSD PQAO to combine pertinent
elements of the QMP into the QAPP
rather than requiring the submission of
both QMP and QAPP documents
separately.
The EPA proposes to add language to
the appendix B version of the data
quality objectives (DQO) section
(current appendix A section 2.3.1)
which allows flexibility for the PSD
reviewing authority and the PSD
monitoring organization to determine if
adherence to the DQOs specified in
appendix A, which are the DQO goals
for NAAQS decisions, are appropriate or
whether project-specific goals are
necessary. Allowing the PSD reviewing
authority and the PSD monitoring
organization flexibility to change the
DQOs does not change the
implementation requirements for the
types and frequency of the QC checks in
appendix B, but does give some
flexibility in the acceptance of data for
use in specific projects for which the
PSD data are collected.
28 Graded approach to Tribal QAPP and QMPs: https://www.epa.gov/ttn/amtic/cpreldoc.html.
As an example,
the goal for acceptable measurement
uncertainty for the collection of O3 data
for NAAQS determinations is defined
for precision as an upper 90 percent
confidence limit for CV of seven percent
and for bias as an upper 95 percent
confidence limit for the absolute bias of
seven percent. The precision and bias
estimates are made with 3 years of onepoint QC check data. A single or a few
one-point QC checks over seven percent
would not have a significant effect on
meeting the DQO goal. The PSD
monitoring DQO, depending on the
objectives of the PSD monitoring
network, may require a stricter DQO
goal or one less restrictive. Since PSD
monitoring covers a period of 1 year or
less, one-point QC checks over seven
percent will increase the likelihood of
failing to meet the DQO goal since there
would be fewer QC checks available in
the monitoring period to estimate
precision and bias. With fewer checks,
any individual check will statistically
have more influence over the precision
or bias estimate. Realizing that PSD
monitoring may have different
monitoring objectives, the EPA proposes
to add language that would allow
decisions on data quality objectives to
be determined through consultation
between the appropriate PSD reviewing
authority and PSD monitoring
organization.
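For orientation, the precision and bias statistics referenced in this example can be sketched as follows (Python with scipy; the estimator forms reflect our reading of the current appendix A, section 4.1, the data are hypothetical, and the regulatory equations govern):

    import math
    from statistics import stdev
    from scipy import stats

    def percent_differences(measured, audit):
        # d_i = (measured - audit) / audit * 100 for each one-point QC check.
        return [(m - a) / a * 100.0 for m, a in zip(measured, audit)]

    def cv_upper_90(d):
        # Sample CV of the percent differences, inflated to an upper 90 percent
        # confidence limit with a chi-square factor (our reading; see appendix A).
        n = len(d)
        return stdev(d) * math.sqrt((n - 1) / stats.chi2.ppf(0.10, n - 1))

    def bias_upper_95(d):
        # Mean absolute percent difference plus a one-sided 95 percent t-bound
        # (again, our reading of the appendix A bias estimator).
        n = len(d)
        abs_d = [abs(x) for x in d]
        return sum(abs_d) / n + stats.t.ppf(0.95, n - 1) * stdev(abs_d) / math.sqrt(n)

    # Hypothetical one-point QC checks for an O3 monitor (ppm).
    d = percent_differences([0.0405, 0.0390, 0.0402, 0.0410, 0.0398], [0.040] * 5)
    print(f"CV upper bound:   {cv_upper_90(d):.2f}% (O3 precision DQO goal: 7%)")
    print(f"bias upper bound: {bias_upper_95(d):.2f}% (O3 bias DQO goal: 7%)")

With only a year or less of PSD data, far fewer checks enter these statistics, which is why a handful of checks over seven percent weighs more heavily on the estimates.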
The EPA proposes to add some
clarifying language to the section
describing the NPEP (current appendix
A section 2.4) to explain
self-implementation of the performance
evaluation by the PSD monitoring
organization. Self-implementation of
NPEP has always been an option for
monitoring organizations but the
requirements for self-implementation
were described in the technical
implementation documents (i.e.,
implementation plans and QAPPs) for
the program and in an annual selfimplementation decision memo that is
distributed to monitoring
organizations.29 These major
requirements for self-implementation
are proposed to be included in the
appendix B sections pertaining to the
NPEP program (NPAP, PM2.5-PEP and
Pb-PEP).
The NPEP clarification also adds a
definition of ‘‘independent assessment.’’
29 https://www.epa.gov/ttn/amtic/npepqa.html.
The proposed definition is derived from
the NPEP (NPAP, PM2.5-PEP, and Pb-PEP) QAPPs and guidance; it also appears in the annual self-implementation memo described above.
The clarification is not a new
requirement but consolidates this
information.
The EPA proposes to require PSD
PQAOs to provide information to the
PSD reviewing authority on the vendors
of gas standards that they use (or will
use) for the duration of the PSD
monitoring project. A QAPP or
monitoring plan may incorporate this
information; however, that document
must then be updated if there is a
change in the vendor used. The current
regulation (current appendix A section
2.6.1) requires any gas vendor
advertising and distributing ‘‘EPA
Protocol Gas’’ to participate in the AA–
PGVP. The EPA posts a list of these
vendors on the AMTIC Web site.30 This
is not expected to be a burden since
information of this type is normally
included in a QAPP or standard
operating procedure for a monitoring
activity.
3. Quality Control Checks for Gases
The EPA proposes to lower the audit
concentrations (current appendix A
section 3.2.1) of the one-point QC
checks to between 0.005 and 0.08 ppm for SO2,
NO2, and O3 (currently 0.01 to 0.1 ppm),
and to between 0.5 and 5 ppm for CO
monitors (currently between 1 and 10 ppm).
With the development of more sensitive
monitoring instruments with lower
detection limits, technical
improvements in calibrators, and lower
ambient air concentrations in general,
the EPA believes this revision will
better reflect the precision and bias of
the routinely-collected ambient air data.
Since the audit concentrations are
selected using the mean or median
concentration of typical ambient air data
(guidance on this is provided in the QA
Handbook 31), the EPA is proposing to
add some clarification to the current
language by requiring PSD monitoring
organizations to select either the highest
or lowest concentration in the ranges
identified if the mean or median values
of the routinely-collected concentrations
are above or below the prescribed range.
There is no additional burden added by
this requirement since the frequency is
the same and the audit concentrations
are not so low as to make them
unachievable to generate or measure.
30 https://www.epa.gov/ttn/amtic/aapgvp.html.
31 QA Handbook for Air Pollution Measurement
Vol. II Ambient Air Quality Monitoring Program at:
https://www.epa.gov/ttn/amtic/qalist.html.
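The selection rule described above amounts to clamping a typical concentration to the prescribed range. A minimal sketch follows, using the proposed SO2/NO2/O3 endpoints as defaults; the function name is ours and is illustrative only.

def one_point_qc_concentration(typical_ppm, low_ppm=0.005, high_ppm=0.08):
    # Use the mean or median of the routinely-collected data when it falls
    # inside the prescribed audit range; otherwise select the nearest
    # endpoint, as the proposed clarification would require.
    return min(max(typical_ppm, low_ppm), high_ppm)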
The EPA proposes to remove the
existing reference to zero and span
adjustments (current appendix A,
section 3.2.1.1) and to revise the one-point QC language to simply require
that the QC check be conducted before
making any calibration or adjustment to
the monitor. Recent revisions of the QA
Handbook discourage the practice of
making frequent span adjustments so
the proposed language helps to clarify
that no adjustment be made prior to
implementation of the one-point QC
check.
The current annual performance
evaluation language (current appendix
A, section 3.2.2.1) requires that the
audits be conducted by selecting three
consecutive audit levels (currently
appendix A recognizes five audit
levels). Due to the implementation of
the NCore network, the inception of
trace gas monitors, and lower ambient
air concentrations being measured
under typical circumstances, there is a
need for audit levels at lower
concentrations to more accurately
represent the uncertainties present in
the ambient air data. The EPA proposes
to expand the audit levels from five to
ten and remove the requirement to audit
three consecutive levels. The current
regulation also requires that the three
audit levels should bracket 80 percent of
the ambient air concentrations
measured by the analyzer. This current
‘‘bracketing language’’ has caused some
confusion and monitoring organizations
have requested the use of an audit point
to establish monitor accuracy around
the NAAQS levels. Therefore, the EPA
is proposing to revise the language so
that two of the audit levels selected
represent 10 to 80 percent of routinely-collected ambient concentrations either
measured by the monitor or in the PSD
PQAOs network of monitors. The
proposed revision allows the third point
to be selected at a concentration that is
consistent with PSD-specific DQOs (e.g.,
the 75 ppb NAAQS level for SO2).
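One plausible reading of that selection logic is sketched below. The audit-level concentrations and names are hypothetical placeholders, since the proposed ten-level table is not reproduced here, and the 10-to-80-percent interpretation is one reading of the proposed text, not a quotation of it.

def select_audit_levels(candidate_levels_ppm, routine_max_ppm, naaqs_ppm):
    # Two levels within 10 to 80 percent of the routinely-collected
    # concentrations, plus a third chosen for a project-specific DQO such
    # as an audit point near the NAAQS level.
    low, high = 0.10 * routine_max_ppm, 0.80 * routine_max_ppm
    bracketing = [c for c in sorted(candidate_levels_ppm) if low <= c <= high][:2]
    third = min(candidate_levels_ppm, key=lambda c: abs(c - naaqs_ppm))
    return bracketing + [third]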
The EPA proposes to revise the
language (current appendix A, section
3.2.2.2(a)) addressing the limits on
excess NO that must be followed during
GPT procedures involving NO2 audits.
The current NO limit (maintaining at
least 0.08 ppm) is very restrictive and
requires auditors to make numerous
mid-audit adjustments during a GPT
that result in making the NO2 audit a
very time consuming procedure.
Monitoring agency staff have advised us
that the observance of such excess NO
limits has no apparent effect on NO2
calibrations being conducted with
modern-day GPT-capable calibration
equipment and, therefore, that the
requirements in the context of performing audits are unnecessary.32 We
also note the increasing availability of
the EPA-approved direct NO2 methods
that do not utilize converters, rendering
the use of GPT techniques that require
the output of NO and NOX to be a
potentially diminishingly used
procedure in the future. Accordingly,
we have proposed a more general
statement regarding GPT that
acknowledges the ongoing usage of
monitoring agency procedures and
guidance documents that have
successfully supported NO2 calibration
activities. The EPA believes that if such
procedures have been successfully used
during calibrations when instrument
adjustments are potentially being made,
then such procedures are appropriate
for audit use when instruments are not
subject to adjustment. The EPA solicits
comment on this proposed
generalization of the GPT requirements,
including whether a more specific set of
requirements similar to the current
excess NO levels can be developed
based on operational experience and/or
peer reviewed literature.
The EPA proposes to remove language
(current appendix A section 3.2.2.2(b))
in the annual performance evaluation
section that requires regional approval
for audit gases for any monitors
operating at ranges higher than 1.0 ppm
for O3, SO2 and NO2 and greater than 50
ppm for CO. The EPA does not need to
approve a monitoring organization’s use
of audit gases to audit above proposed
concentration levels since the EPA has
identified the requirements for all audit
gases used in the program in current
appendix A, section 2.6.1. There should
be very few cases where a performance
evaluation needs to be performed above
level 10 but there may be some
legitimate instances (e.g., an SO2 audit
in areas impacted by volcanic
emissions). Since data reported to AQS
above the highest level may be rejected
(if PSD PE data are reported to AQS),
the EPA proposes that PQAOs notify the
PSD reviewing authority of sites
auditing at concentrations above level
10 so that reporting accommodations
can be made.
The EPA proposes to describe the
NPAP (current appendix A, section 2.4)
in more detail. The NPAP is a long-standing program for the ambient air
monitoring community. The NPAP is a
performance evaluation which is a type
of audit where quantitative data are
collected independently in order to
evaluate the proficiency of an analyst,
monitoring instrument or laboratory.
32 See supporting information in Excess NO Issue
paper, Mike Papp and Lewis Weinstock, Docket
number EPA–HQ–OAR–2013–0619.
This program has been briefly
mentioned in section 2.4 of the current
appendix A requirements. In appendix
A, the EPA is proposing to add language
consistent with an annual decision
memorandum 33 distributed to all state
and local monitoring organizations in
order to determine whether the
monitoring organization plans to self-implement the NPAP program or utilize
the federally implemented program. In
order to make this decision, the NPAP
adequacy and independence
requirements are described in the
decision memorandum. The EPA
proposes to include these same
requirements in appendix B in a
separate section for NPAP. As described
in the applicability section, the
implementation of NPAP is at the
discretion of the PSD reviewing
authority but must be implemented if
data are used in any NAAQS
determinations. Since PSD monitoring
is implemented at shorter intervals
(usually a year) and with fewer
monitors, if NPAP is performed, it is
required to be performed annually on
each monitor operated in the PSD
network.
4. Quality Control Checks for Particulate
Monitors
The EPA proposes to have one flow
rate verification frequency requirement
for all PM PSD monitors. The current
regulations (current appendix A, table A–2) provide for monthly flow rate
verifications for most samplers used to
monitor PM2.5, PM10 and Pb and
quarterly flow rate verifications for
high-volume PM10 or TSP samplers (for
Pb). With longer duration NAAQS
monitoring, the quarterly verification
frequencies are adequate for these high-volume PM10 or TSP samplers.
However, with the short duration of
PSD monitoring, the EPA believes that
monthly flow rate verifications are more
appropriate to ensure that any sampler
flow rate problems are identified more
quickly and to reduce the potential for
a significant amount of data invalidation
that could extend monitoring activities.
The EPA proposes to grant more
flexibility to PSD monitoring
organizations when selecting PM2.5
method designations for sites that
require collocation. Appendix A
currently (current appendix A, section
3.2.5.2(b)) requires that if a primary
monitor is a FEM, then the first QC
collocated monitor must be a FRM
monitor. Most of the FEM monitors are
continuous monitors while the FRM
monitors are filter-based.
33 https://www.epa.gov/ttn/amtic/files/ambient/pm25/qa/npappep2014.pdf.
Continuous monitors (which are all FEMs) may be
advantageous for use at the more remote
PSD monitoring locations, since the site
operator would not need to visit a site
as often to retrieve filters (current FRMs
are filter-based). The current collocation
requirements for FEMs require a filter-based FRM for collocation, which would
mean a visit to retrieve the FRM filters
at least one week after the QC collocated
monitor operated. Therefore, the EPA
proposes that the FRM be selected as the
QC collocated monitor unless the PSD
PQAO submits a waiver request to allow
for collocation with a FEM to the PSD
reviewing authority. If the request for a
waiver is approved, then the QC
monitor must be the same method
designation as the primary FEM
monitor.
The EPA proposes to allow the PSD
reviewing authority to waive the PM2.5
3 µg/m3 concentration validity
threshold for implementation of the
PM2.5-PEP in the last quarter of PSD
monitoring. The PM2.5-PEP (current
appendix A section 3.2.7) requires five
valid PM2.5-PEP audits per year for
PM2.5 monitoring networks with less
than or equal to five sites and eight
valid PM2.5-PEP audits per year for PM2.5 monitoring networks with greater than
five sites. Any PEP sample collected with a concentration less than 3 µg/m3 is not considered valid, since it cannot be used for bias estimates, and re-sampling is required at a later date.
With NAAQS related monitoring, which
aggregates the PM2.5-PEP data over a 3-year period, re-sampling is easily
accomplished. Due to the relatively
short-term nature of most PSD
monitoring, the likelihood of measuring
low concentrations in many areas
attaining the PM2.5 standard, and the
time required to weigh filters collected
in performance evaluations, a PSD
monitoring organization’s QAPP may
contain a provision to waive the 3 µg/m3 threshold for validity of performance
evaluations conducted in the last
quarter of monitoring, subject to
approval by the PSD reviewing
authority.
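The audit-count and validity rules described above reduce to a few lines. The sketch below is illustrative only (the names are ours), with the proposed last-quarter waiver included:

def required_pep_audits_per_year(num_pm25_sites):
    # Five valid audits for networks of five or fewer sites, eight for
    # larger networks (current appendix A section 3.2.7).
    return 5 if num_pm25_sites <= 5 else 8

def pep_sample_is_valid(conc_ugm3, last_quarter=False, waiver_approved=False):
    # Samples below 3 ug/m3 cannot be used for bias estimates and are
    # normally invalid; the proposal would let the PSD reviewing authority
    # waive the threshold in the final quarter of PSD monitoring.
    return conc_ugm3 >= 3.0 or (last_quarter and waiver_approved)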
5. Calculations for Data Quality
Assessment
In order to allow reasonable estimates
of data quality, the EPA uses data above
an established threshold concentration,
usually related to the detection limits of
the measurement method. Measurement
pairs are selected for use in the
precision and bias calculations only
when both measurements are above a
threshold concentration.
For many years, the threshold
concentration for Pb precision and bias
data has been 0.02 µg/m3. The EPA promulgated a new Pb FRM utilizing the ICP–MS analysis technique in 2013 as a revision to appendix G of 40 CFR part 50.34 This new FRM demonstrated MDLs 35 below 0.0002 µg/m3, which is well below the EPA requirement of five percent of the current Pb NAAQS level of 0.15 µg/m3, or 0.0075 µg/m3. As a
result of the increased sensitivity
inherent in this new FRM, the EPA
proposes to lower the acceptable Pb
concentration (current section 4) from
the current value of 0.02 µg/m3 to 0.002 µg/m3 for measurements obtained using
the new Pb FRM and other more
recently approved equivalent methods
that have the requisite increased
sensitivity.36 The current 0.02 µg/m3
value will be retained for the previous
Pb FRM that has subsequently been
redesignated as Federal Equivalent
Method EQLA–0813–803 as well as
older equivalent methods that were
approved prior to the more recent work
on developing more sensitive methods.
Since ambient Pb concentrations are
lower and methods more sensitive,
lowering the threshold concentration
will allow much more collocated
information to be evaluated, which will
provide more representative estimates of
precision and bias.
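Put as a sketch (illustrative names only), the pair-selection rule with the proposed method-dependent Pb thresholds looks like this:

def pb_threshold_ugm3(sensitive_method):
    # 0.002 ug/m3 for the ICP-MS FRM and other sufficiently sensitive
    # recently approved equivalent methods; 0.02 ug/m3 is retained for
    # the older methods.
    return 0.002 if sensitive_method else 0.02

def collocated_pairs_for_stats(pairs_ugm3, threshold_ugm3):
    # A measurement pair enters the precision and bias calculations only
    # when both values are above the threshold concentration.
    return [(a, b) for (a, b) in pairs_ugm3
            if a > threshold_ugm3 and b > threshold_ugm3]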
The EPA also proposes to remove the
TSP threshold concentration since TSP is no longer an ambient indicator for a PM NAAQS pollutant and the EPA no longer applies QC requirements to it.
The EPA proposes to remove the
statistical check currently described in
section 4.1.5 of appendix A. The check
was developed to perform a comparison
of the one-point QC checks and the
annual performance evaluation data
performed by the same PQAO. The
section suggests that 95 percent of all
the bias estimates of the annual
performance evaluations (reported as a
percent difference) should fall within
the 95 percent probability interval
developed using the one-point QC
checks. The problem with this check is that PQAOs with very good repeatability on the one-point QC check data generated a very tight probability interval, which paradoxically made the requirement harder for these better performing PQAOs to meet. Separate statistics to
34 See 78 FR 40000, July 3, 2013.
35 MDL is described as the minimum concentration of a substance that can be measured and reported with 99 percent confidence that the analyte concentration is greater than zero.
36 FEMs approved on or after March 4, 2010, have the required sensitivity to utilize the 0.002 µg/m3 reporting limit, with the exception of manual equivalent method EQLA–0813–803, the previous FRM based on flame atomic absorption spectroscopy.
evaluate the one-point QC checks and
the performance evaluations are already
promulgated, so the removal of this
check does not affect data quality
assessments.
Similar to the statistical comparison
of performance evaluation data, the EPA
proposes to remove the statistical check
(current appendix A, section 4.2.4) to
compare the flow rate audit data and
flow rate verification data. The existing
language suggests that 95 percent of all
the flow rate audit data (reported as
percent difference) should fall within
the 95 percent probability interval
developed from the flow rate
verification data for the PQAO. The
problem, as with the one-point QC check, was that monitoring organizations with very good repeatability on the flow rate verifications generated a very tight probability interval, again making the requirement harder for better performing organizations to meet. Separate statistics
to evaluate the flow rate verifications
and flow rate audits are already
promulgated so the removal of this
check does not affect data quality
assessments.
The EPA proposes to remove the
reporting requirements that are
currently in section 5 of appendix A
because they do not pertain to PSD
monitoring (current sections 5.1, 5.1.1
and 5.1.2.1). Since PSD organizations
are not required to certify their data to
the EPA or report to AQS, the EPA will
remove language related to these
requirements and language that required
the EPA to calculate and report the
measurement uncertainty for the entire
calendar year. The EPA will retain the
quarterly PSD reporting requirements
(current section 5.2 in appendix A) and
require that those requirements be
consistent with § 58.16 as it pertains
to PSD ambient air quality data and QC
data, as described in appendix B.
IV. Statutory and Executive Order
Reviews
A. Executive Order 12866: Regulatory
Planning and Review and Executive
Order 13563: Improving Regulation and
Regulatory Review
This action is not a ‘‘significant
regulatory action’’ under the terms of
Executive Order 12866 (58 FR 51735,
October 4, 1993) and is therefore not
subject to review under Executive
Orders 12866 and 13563 (76 FR 3821,
January 21, 2011).
B. Paperwork Reduction Act
This action does not impose an
information collection burden under the
provisions of the Paperwork Reduction
Act, 44 U.S.C. 3501 et seq. Burden is
defined at 5 CFR 1320.3(b). While the
EPA believes that the net effect of the
proposed changes to requirements is a
net decrease in burden, the current
information collection request
calculation tools are not sufficiently
detailed to show a material change in
burden compared with the existing
requirements.
C. Regulatory Flexibility Act
The Regulatory Flexibility Act (RFA)
generally requires an agency to prepare
a regulatory flexibility analysis of any
rule subject to notice and comment
rulemaking requirements under the
Administrative Procedure Act or any
other statute unless the agency certifies
that the rule will not have a significant
economic impact on a substantial
number of small entities. Small entities
include small businesses, small
organizations, and small governmental
jurisdictions.
For purposes of assessing the impacts
of this rule on small entities, small
entity is defined as (1) a small business
as defined by the Small Business
Administration’s (SBA) regulations at 13
CFR 121.201; (2) a small governmental
jurisdiction that is a government of a
city, county, town, school district or
special district with a population of less
than 50,000; and (3) a small
organization that is any not-for-profit
enterprise which is independently
owned and operated and is not
dominant in its field.
After considering the economic
impacts of this rule on small entities, I
certify that this action will not have a
significant economic impact on a
substantial number of small entities.
This proposed rule will neither impose
emission measurement requirements
beyond those specified in the current
regulations, nor will it change any
emission standard. As such, it will not
present a significant economic impact
on small entities.
D. Unfunded Mandates Reform Act
This action contains no federal
mandates under the provisions of Title
II of the Unfunded Mandates Reform
Act of 1995 (UMRA), 2 U.S.C. 1531–
1538 for state, local, or tribal
governments or the private sector. This
action imposes no enforceable duty on
any state, local or tribal governments or
the private sector. Therefore, this action
is not subject to the requirements of
sections 202 or 205 of the UMRA. This
action is also not subject to the
requirements of section 203 of UMRA
because it contains no regulatory
requirements that might significantly or
uniquely affect small governments.
E. Executive Order 13132: Federalism
This action does not have federalism
implications. It will not have substantial
direct effects on the states, on the
relationship between the national
government and the states, or on the
distribution of power and
responsibilities among the various
levels of government, as specified in
Executive Order 13132. This action
proposes minor changes to existing
monitoring requirements and will not
materially impact the time required to
operate monitoring networks. Thus,
Executive Order 13132 does not apply
to this action. In the spirit of Executive
Order 13132, and consistent with the
EPA policy to promote communications
between the EPA and state and local
governments, the EPA specifically
solicits comment on this proposed rule
from state and local officials.
F. Executive Order 13175: Consultation
and Coordination With Indian Tribal
Governments
This action does not have tribal
implications, as specified in Executive
Order 13175 (65 FR 67249, November 9,
2000). This proposed rule imposes no
requirements on tribal governments.
This action proposes minor changes to
existing monitoring requirements and
will not materially impact the time
required to operate monitoring
networks. Thus, Executive Order 13175
does not apply to this action. In the
spirit of Executive order 13175, the EPA
specifically solicits additional comment
on this proposed action from tribal
officials.
G. Executive Order 13045: Protection of
Children From Environmental Health
and Safety Risks
The EPA interprets E.O. 13045 (62 FR
19885, April 23, 1997) as applying only
to those regulatory actions that concern
health or safety risks, such that the
analysis required under section 5–501 of
the E.O. has the potential to influence
the regulation. This action is not subject
to E.O. 13045 because it does not
establish an environmental standard
intended to mitigate health or safety
risks.
H. Executive Order 13211: Actions
Concerning Regulations That
Significantly Affect Energy Supply,
Distribution, or Use
This action is not a ‘‘significant
energy action’’ as defined in Executive
Order 13211 (66 FR 28355 (May 22,
2001)), because it is not likely to have
a significant adverse effect on the
supply, distribution, or use of energy.
This action proposes minor changes to
existing monitoring requirements.
I. National Technology Transfer and
Advancement Act
Section 12(d) of the National
Technology Transfer and Advancement
Act of 1995 (‘‘NTTAA’’), Public Law No.
104–113 (15 U.S.C. 272 note) directs the
EPA to use voluntary consensus
standards in its regulatory activities
unless to do so would be inconsistent
with applicable law or otherwise
impractical. Voluntary consensus
standards are technical standards (e.g.,
materials specifications, test methods,
sampling procedures, and business
practices) that are developed or adopted
by voluntary consensus standards
bodies. The NTTAA directs the EPA to
provide Congress, through OMB,
explanations when the agency decides
not to use available and applicable
voluntary consensus standards. This
proposed rulemaking does not involve
technical standards. Therefore this
action is not subject to the NTTAA.
J. Executive Order 12898: Federal
Actions To Address Environmental
Justice in Minority Populations and
Low-Income Populations
Executive Order (E.O.) 12898 (59 FR
7629 (Feb. 16, 1994)) establishes federal
executive policy on environmental
justice. Its main provision directs
federal agencies, to the greatest extent
practicable and permitted by law, to
make environmental justice part of their
mission by identifying and addressing,
as appropriate, disproportionately high
and adverse human health or
environmental effects of their programs,
policies, and activities on minority
populations and low-income
populations in the United States.
The EPA has determined that this
proposed rule will not have
disproportionately high and adverse
human health or environmental effects
on minority or low-income populations
because it does not affect the level of
protection provided to human health or
the environment.
List of Subjects in 40 CFR Part 58
Environmental protection,
Administrative practice and procedure,
Air pollution control, Intergovernmental
relations.
Dated: August 13, 2014.
Gina McCarthy,
Administrator.
For the reasons stated in the
preamble, the Environmental Protection
Agency proposes to amend title 40,
chapter 1 of the Code of Federal
Regulations as follows:
PART 58—AMBIENT AIR QUALITY
SURVEILLANCE
■ 1. The authority citation for part 58 continues to read as follows:
Authority: 42 U.S.C. 7403, 7405, 7410, 7414, 7601, 7611, 7614, and 7619.
■ 2. Revise § 58.1 to read as follows:
§ 58.1
Definitions.
As used in this part, all terms not
defined herein have the meaning given
them in the Clean Air Act.
AADT means the annual average daily
traffic.
Act means the Clean Air Act as
amended (42 U.S.C. 7401, et seq.)
Additive and multiplicative bias
means the linear regression intercept
and slope of a linear plot fitted to
corresponding candidate and reference
method mean measurement data pairs.
Administrator means the
Administrator of the Environmental
Protection Agency (EPA) or his or her
authorized representative.
Air Quality System (AQS) means the
EPA’s computerized system for storing
and reporting of information relating to
ambient air quality data.
Approved regional method (ARM)
means a continuous PM2.5 method that
has been approved specifically within a
state or local air monitoring network for
purposes of comparison to the NAAQS
and to meet other monitoring objectives.
AQCR means air quality control
region.
Area-wide means all monitors sited at
neighborhood, urban, and regional
scales, as well as those monitors sited at
either micro- or middle-scale that are
representative of many such locations in
the same CBSA.
Certifying agency means a state, local,
or tribal agency responsible for meeting
the data certification requirements in
accordance with § 58.15 of this part for
a unique set of monitors.
Chemical Speciation Network (CSN)
includes Speciation Trends Network
stations (STN) as specified in paragraph
4.7.4 of appendix D of this part and
supplemental speciation stations that
provide chemical species data of fine
particulate.
CO means carbon monoxide.
Combined statistical area (CSA) is
defined by the U.S. Office of
Management and Budget as a
geographical area consisting of two or
more adjacent Core Based Statistical
Areas (CBSA) with employment
interchange of at least 15 percent.
Combination is automatic if the
employment interchange is at least 25 percent and determined by local opinion if more
than 15 but less than 25 percent.
Core-based statistical area (CBSA) is
defined by the U.S. Office of
Management and Budget, as a statistical
geographic entity consisting of the
county or counties associated with at
least one urbanized area/urban cluster
of at least 10,000 population, plus
adjacent counties having a high degree
of social and economic integration.
Metropolitan Statistical Areas (MSAs)
and micropolitan statistical areas are the
two categories of CBSA (metropolitan
areas have populations greater than
50,000; and micropolitan areas have
populations between 10,000 and
50,000). In the case of very large cities
where two or more CBSAs are
combined, these larger areas are referred
to as combined statistical areas (CSAs).
Corrected concentration pertains to
the result of an accuracy or precision
assessment test of an open path analyzer
in which a high-concentration test or
audit standard gas contained in a short
test cell is inserted into the optical
measurement beam of the instrument.
When the pollutant concentration
measured by the analyzer in such a test
includes both the pollutant
concentration in the test cell and the
concentration in the atmosphere, the
atmospheric pollutant concentration
must be subtracted from the test
measurement to obtain the corrected
concentration test result. The corrected
concentration is equal to the measured
concentration minus the average of the
atmospheric pollutant concentrations
measured (without the test cell)
immediately before and immediately
after the test.
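Stated as a formula (a restatement of this definition, with C denoting concentration):
C_corrected = C_measured − (C_before + C_after) / 2
where C_before and C_after are the atmospheric concentrations measured without the test cell immediately before and immediately after the test.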
Design value means the calculated
concentration according to the
applicable appendix of part 50 of this
chapter for the highest site in an
attainment or nonattainment area.
EDO means environmental data
operations.
Effective concentration pertains to
testing an open path analyzer with a
high-concentration calibration or audit
standard gas contained in a short test
cell inserted into the optical
measurement beam of the instrument.
Effective concentration is the equivalent
ambient-level concentration that would
produce the same spectral absorbance
over the actual atmospheric monitoring
path length as produced by the high-concentration gas in the short test cell.
Quantitatively, effective concentration
is equal to the actual concentration of
the gas standard in the test cell
multiplied by the ratio of the path
length of the test cell to the actual
atmospheric monitoring path length.
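Restated as a formula, EC = C_cell × (L_cell / L_path). As a worked example with hypothetical values, an 800 ppm standard in a 0.5 meter test cell on a 100 meter monitoring path gives an effective concentration of 800 × (0.5/100) = 4 ppm.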
Federal equivalent method (FEM)
means a method for measuring the
concentration of an air pollutant in the
ambient air that has been designated as
an equivalent method in accordance
with part 53; it does not include a
method for which an equivalent method
designation has been canceled in
accordance with § 53.11 or § 53.16.
Federal reference method (FRM)
means a method of sampling and
analyzing the ambient air for an air
pollutant that is specified as a reference
method in an appendix to part 50 of this
chapter, or a method that has been
designated as a reference method in
accordance with this part; it does not
include a method for which a reference
method designation has been canceled
in accordance with § 53.11 or § 53.16.
HNO3 means nitric acid.
Implementation Plan means an
implementation plan approved or
promulgated by the EPA pursuant to
section 110 of the Act.
Local agency means any local
government agency, other than the state
agency, which is charged by a state with
the responsibility for carrying out a
portion of the annual monitoring
network plan required by § 58.10.
Meteorological measurements means
measurements of wind speed, wind
direction, barometric pressure,
temperature, relative humidity, solar
radiation, ultraviolet radiation, and/or
precipitation that occur at stations
including NCore and PAMS.
Metropolitan Statistical Area (MSA)
means a CBSA associated with at least
one urbanized area of 50,000 population
or greater. The central county, plus
adjacent counties with a high degree of
integration, comprise the area.
Monitor means an instrument,
sampler, analyzer, or other device that
measures or assists in the measurement
of atmospheric air pollutants and which
is acceptable for use in ambient air
surveillance under the applicable
provisions of appendix C to this part.
Monitoring agency means a state,
local or Tribal agency responsible for
meeting the requirements of this part.
Monitoring organization means a
monitoring agency or other monitoring
organization responsible for operating a
monitoring site for which the quality
assurance regulations apply.
Monitoring path for an open path
analyzer means the actual path in space
between two geographical locations over
which the pollutant concentration is
measured and averaged.
Monitoring path length of an open
path analyzer means the length of the
monitoring path in the atmosphere over
which the average pollutant
concentration measurement (path-averaged concentration) is determined.
See also, optical measurement path
length.
Monitoring planning area (MPA)
means a contiguous geographic area
with established, well-defined
boundaries, such as a CBSA, county or
state, having a common area that is used
for planning monitoring locations for
PM2.5. A MPA may cross state
boundaries, such as the Philadelphia
PA–NJ MSA, and be further subdivided
into community monitoring zones. The
MPAs are generally oriented toward
CBSAs or CSAs with populations
greater than 200,000, but for
convenience, those portions of a state
that are not associated with CBSAs can
be considered as a single MPA.
NATTS means the national air toxics
trends stations. This network provides
hazardous air pollution ambient data.
NCore means the National Core
multipollutant monitoring stations.
Monitors at these sites are required to
measure particles (PM2.5, speciated
PM2.5, PM10-2.5), O3, SO2, CO, nitrogen
oxides (NO/NOy), and meteorology
(wind speed, wind direction,
temperature, relative humidity).
Near-road monitor means any
approved monitor meeting the
applicable specifications described in
40 CFR part 58, appendix D (sections
4.2.1, 4.3.2, 4.7.1(b)(2)) and appendix E
(section 6.4(a), Table E–4) for near-road
measurement of PM2.5, CO, or NO2.
Network means all stations of a given
type or types.
Network Plan means the Annual
Monitoring Network Plan described in
§ 58.10 of this part.
NH3 means ammonia.
NO2 means nitrogen dioxide.
NO means nitrogen oxide.
NOX means the sum of the
concentrations of NO2 and NO.
NOy means the sum of all total
reactive nitrogen oxides, including NO,
NO2, and other nitrogen oxides referred
to as NOZ.
O3 means ozone.
Open path analyzer means an
automated analytical method that
measures the average atmospheric
pollutant concentration in situ along
one or more monitoring paths having a
monitoring path length of 5 meters or
more and that has been designated as a
reference or equivalent method under
the provisions of part 53 of this chapter.
Optical measurement path length
means the actual length of the optical
beam over which measurement of the
pollutant is determined. The path-integrated pollutant concentration
measured by the analyzer is divided by
the optical measurement path length to
determine the path-averaged
concentration. Generally, the optical
measurement path length is:
(1) Equal to the monitoring path
length for a (bistatic) system having a
transmitter and a receiver at opposite
ends of the monitoring path;
(2) Equal to twice the monitoring path
length for a (monostatic) system having
a transmitter and receiver at one end of
the monitoring path and a mirror or
retroreflector at the other end; or
(3) Equal to some multiple of the
monitoring path length for more
complex systems having multiple passes
of the measurement beam through the
monitoring path.
PAMS means photochemical
assessment monitoring stations.
Pb means lead.
PM means particulate matter,
including but not limited to PM10,
PM10C, PM2.5, and PM10-2.5.
PM2.5 means particulate matter with
an aerodynamic diameter less than or
equal to a nominal 2.5 micrometers as
measured by a reference method based
on appendix L of part 50 and designated
in accordance with part 53, by an
equivalent method designated in
accordance with part 53, or by an
approved regional method designated in
accordance with appendix C to this part.
PM10 means particulate matter with
an aerodynamic diameter less than or
equal to a nominal 10 micrometers as
measured by a reference method based
on appendix J of part 50 and designated
in accordance with part 53 or by an
equivalent method designated in
accordance with part 53.
PM10C means particulate matter with
an aerodynamic diameter less than or
equal to a nominal 10 micrometers as
measured by a reference method based
on appendix O of part 50 and
designated in accordance with part 53
or by an equivalent method designated
in accordance with part 53.
PM10-2.5 means particulate matter with
an aerodynamic diameter less than or
equal to a nominal 10 micrometers and
greater than a nominal 2.5 micrometers
as measured by a reference method
based on appendix O to part 50 and
designated in accordance with part 53
or by an equivalent method designated
in accordance with part 53.
Point analyzer means an automated
analytical method that measures
pollutant concentration in an ambient
air sample extracted from the
atmosphere at a specific inlet probe
point, and that has been designated as
a reference or equivalent method in
accordance with part 53 of this chapter.
Primary Monitor means the monitor
identified by the monitoring
organization that provides concentration
data used for comparison to the
NAAQS. For any specific site, only one
monitor for each pollutant can be
designated in AQS as primary monitor
for a given period of time. The primary
monitor identifies the default data
source for creating a combined site
record for purposes of NAAQS
comparisons.
Primary quality assurance
organization (PQAO) means a
monitoring organization, a group of
monitoring organizations or other
organization that is responsible for a set
of stations that monitor the same
pollutant and for which data quality
assessments can be pooled. Each criteria
pollutant sampler/monitor at a
monitoring station in the SLAMS and
SPM networks must be associated with
only one PQAO.
Probe means the actual inlet where an
air sample is extracted from the
atmosphere for delivery to a sampler or
point analyzer for pollutant analysis.
PSD monitoring network means a set
of stations that provide concentration
information for a specific PSD permit.
PSD monitoring organization means a
source owner/operator, a government
agency, or a contractor of the source or
agency that operates an ambient air
pollution monitoring network for PSD
purposes.
PSD reviewing authority means the
state air pollution control agency, local
agency, other state agency, tribe, or
other agency authorized by the
Administrator to carry out a permit
program under § 51.165 and § 51.166, or
the Administrator in the case of EPA-implemented permit programs under
§ 52.21.
PSD station means any station
operated for the purpose of establishing
the effect on air quality of the emissions
from a proposed source for purposes of
prevention of significant deterioration
as required by § 51.24(n).
Regional Administrator means the
Administrator of one of the ten EPA
regional offices or his or her authorized
representative.
Reporting organization means an
entity, such as a state, local, or tribal
monitoring agency, that reports air
quality data to the EPA.
Site means a geographic location. One
or more stations may be at the same site.
SLAMS means state or local air
monitoring stations. The SLAMS
include the ambient air quality
monitoring sites and monitors that are
required by appendix D of this part and
are needed for the monitoring objectives
of appendix D, including NAAQS
comparisons, but may serve other data
purposes. The SLAMS includes NCore,
PAMS, CSN, and all other state or
locally operated criteria pollutant
VerDate Mar<15>2010
18:52 Sep 10, 2014
Jkt 232001
monitors operated in accordance with this part that have not been designated and
approved by the Regional Administrator
as SPM stations in an annual monitoring
network plan.
SO2 means sulfur dioxide.
Special purpose monitor (SPM)
station means a monitor included in an
agency’s monitoring network that the
agency has designated as a special
purpose monitor station in its annual
monitoring network plan and in the
AQS, and which the agency does not
count when showing compliance with
the minimum requirements of this
subpart for the number and siting of
monitors of various types. Any SPM
operated by an air monitoring agency
must be included in the periodic
assessments and annual monitoring
network plan required by § 58.10 and
approved by the Regional
Administrator.
State agency means the air pollution
control agency primarily responsible for
development and implementation of a
State Implementation Plan under the
Act.
Station means a single monitor, or a
group of monitors, located at a
particular site.
STN station means a PM2.5 chemical
speciation station designated to be part
of the speciation trends network. This
network provides chemical species data
of fine particulate.
Supplemental speciation station
means a PM2.5 chemical speciation
station that is operated for monitoring
agency needs and not part of the STN.
Traceable means that a local standard
has been compared and certified, either
directly or via not more than one
intermediate standard, to a National
Institute of Standards and Technology
(NIST)-certified primary standard such
as a NIST-traceable Reference Material
(NTRM) or a NIST-certified Gas
Manufacturer’s Internal Standard
(GMIS).
TSP (total suspended particulates)
means particulate matter as measured
by the method described in appendix B
of part 50.
Urbanized area means an area with a
minimum residential population of at
least 50,000 people and which generally
includes core census block groups or
blocks that have a population density of
at least 1,000 people per square mile
and surrounding census blocks that
have an overall density of at least 500
people per square mile. The Census
Bureau notes that under certain
conditions, less densely settled territory
may be part of each Urbanized Area.
VOCs means volatile organic
compounds.
■ 3. In § 58.10:
■ a. Revise paragraphs (a)(1) and (a)(2).
■ b. Add paragraph (a)(9).
■ c. Add paragraph (b)(14).
The revisions and additions read as follows:
§ 58.10 Annual monitoring network plan
and periodic network assessment.
(a)(1) Beginning July 1, 2007, the
state, or where applicable local, agency
shall submit to the Regional
Administrator an annual monitoring
network plan which shall provide for
the documentation of the establishment
and maintenance of an air quality
surveillance system that consists of a
network of SLAMS monitoring stations
that can include FRM, FEM, and ARM
monitors that are part of SLAMS, NCore,
CSN, PAMS, and SPM stations. The
plan shall include a purpose statement
for each monitor along with a statement
of whether the operation of each
monitor meets the requirements of
appendices A, B, C, D, and E of this
part, where applicable. The Regional
Administrator may require the
submission of additional information as
needed to evaluate compliance with
applicable requirements of part 58 and
its appendices. The annual monitoring
network plan must be made available
for public inspection and comment for
at least 30 days prior to submission to
the EPA and the submitted plan shall
reference and address any such received
comments.
(2) Any annual monitoring network
plan that proposes SLAMS network
modifications (including new or
discontinued monitoring sites, new
determinations that data are not of
sufficient quality to be compared to the
NAAQS, and changes in identification
of monitors as suitable or not suitable
for comparison against the annual PM2.5
NAAQS) is subject to the approval of
the EPA Regional Administrator, who
shall approve or disapprove the plan
within 120 days of submission of a
complete plan to the EPA.
* * * * *
(9) A detailed description of the
PAMS network being operated in
accordance with the requirements of
appendix D to this part shall be
submitted as part of the annual
monitoring network plan for review by
the EPA Administrator. The PAMS
Network Description described in
section 5 of appendix D may be used to
meet this requirement.
(b) * * *
(14) The identification of any SPMs
operating for a longer period than 24
months that utilize FRM, FEM, and/or
ARM monitors accompanied by a
discussion of the rationale for retention
E:\FR\FM\11SEP2.SGM
11SEP2
Federal Register / Vol. 79, No. 176 / Thursday, September 11, 2014 / Proposed Rules
as an SPM rather than a reclassification
to SLAMS.
* * * * *
■ 4. In § 58.11, revise paragraph (a)(3) to
read as follows:
§ 58.11
Network technical requirements.
(a) * * *
(3) The owner or operator of an
existing or a proposed source shall
follow the quality assurance criteria in
appendix B to this part that apply to
PSD monitoring when operating a PSD
site.
* * * * *
■ 5. In § 58.12:
■ a. Revise paragraph (d)(1).
■ b. Revise paragraph (d)(3).
The revisions read as follows:
§ 58.12
Operating schedules.
* * * * *
(d) * * *
(1)(i) Manual PM2.5 samplers at
required SLAMS stations without a
collocated continuously operating PM2.5
monitor must operate on at least a 1-in-3 day schedule unless a waiver for an
alternative schedule has been approved
per paragraph (d)(1)(ii) of this section.
(ii) For SLAMS PM2.5 sites with both
manual and continuous PM2.5 monitors
operating, the monitoring agency may
request approval for a reduction to 1-in-6 day PM2.5 sampling or for seasonal
sampling from the EPA Regional
Administrator. Other requests for a
reduction to 1-in-6 day PM2.5 sampling
or for seasonal sampling may be
approved on a case-by-case basis. The
EPA Regional Administrator may grant
sampling frequency reductions after
consideration of factors (including but
not limited to the historical PM2.5 data
quality assessments, the location of
current PM2.5 design value sites, and
their regulatory data needs) if the
Regional Administrator determines that
the reduction in sampling frequency
will not compromise data needed for
implementation of the NAAQS.
Required SLAMS stations whose
measurements determine the design
value for their area and that are within
plus or minus 10 percent of the annual
NAAQS, and all required sites where
one or more 24-hour values have
exceeded the 24-hour NAAQS each year
for a consecutive period of at least 3
years are required to maintain at least a
1-in-3 day sampling frequency until the
design value no longer meets these
criteria for 3 consecutive years. A
continuously operating FEM or ARM
PM2.5 monitor satisfies this requirement
unless it is identified in the monitoring
agency’s annual monitoring network
plan as not appropriate for comparison
to the NAAQS and the EPA Regional
VerDate Mar<15>2010
18:52 Sep 10, 2014
Jkt 232001
Administrator has approved that the
data from that monitor may be excluded
from comparison to the NAAQS.
(iii) Required SLAMS stations whose
measurements determine the 24-hour
design value for their area and whose
data are within plus or minus 5 percent
of the level of the 24-hour PM2.5 NAAQS
must have an FRM or FEM operate on
a daily schedule if that area’s design
value for the annual NAAQS is less than
the level of the annual PM2.5 standard.
A continuously operating FEM or ARM
PM2.5 monitor satisfies this requirement
unless it is identified in the monitoring
agency’s annual monitoring network
plan as not appropriate for comparison
to the NAAQS and the EPA Regional
Administrator has approved that the
data from that monitor may be excluded
from comparison to the NAAQS. The
daily schedule must be maintained until
the referenced design values no longer
meet these criteria for 3 consecutive
years.
(iv) Changes in sampling frequency
attributable to changes in design values
shall be implemented no later than
January 1 of the calendar year following
the certification of such data as
described in § 58.15.
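As a reading aid for paragraphs (d)(1)(i) through (iii) above, the following sketch encodes one plausible reading of the sampling-frequency rules; the names are illustrative and this is not rule text.

def minimum_pm25_sampling_frequency(determines_design_value,
                                    annual_dv_within_10_percent,
                                    exceeded_24hr_naaqs_3_years,
                                    near_24hr_dv_annual_dv_below):
    # Paragraph (d)(1)(iii): daily sampling when the site sets the 24-hour
    # design value within 5 percent of that NAAQS and the area's annual
    # design value is below the annual standard.
    if determines_design_value and near_24hr_dv_annual_dv_below:
        return "daily"
    # Paragraph (d)(1)(ii): no frequency reduction for design-value sites
    # near the annual NAAQS or with repeated 24-hour exceedances.
    if determines_design_value and (annual_dv_within_10_percent
                                    or exceeded_24hr_naaqs_3_years):
        return "1-in-3, no reduction"
    # Paragraph (d)(1)(i): the default, subject to an approved waiver; a
    # continuously operating FEM or ARM can satisfy the requirement.
    return "1-in-3 unless a waiver is approved"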
* * * * *
(3) Manual PM2.5 speciation samplers
at STN stations must operate on at least
a 1-in-3 day sampling frequency unless
a reduction in sampling frequency has
been approved by the EPA
Administrator based on factors such as
area’s design value, the role of the
particular site in national health studies,
the correlation of the site’s species data
with nearby sites, and presence of other
leveraged measurements.
* * * * *
■ 6. In § 58.14, revise paragraph (a) to
read as follows:
§ 58.14
System modification.
(a) The state, or where appropriate
local, agency shall develop and
implement a network modification plan
and schedule to modify the ambient air
quality monitoring network that
implements the findings of the network
assessment required every 5 years by
§ 58.10(d). The network modification
plan shall be submitted as part of the
Annual Monitoring Network Plan that is
due no later than the year after
submittal of the network assessment.
* * * * *
■ 7. Revise § 58.15 to read as follows:
§ 58.15 Annual air monitoring data
certification.
(a) The state, or where appropriate
local, agency shall submit to the EPA
Regional Administrator an annual air
monitoring data certification letter to
certify data collected by FRM, FEM, and
ARM monitors at SLAMS and SPM sites
that meet criteria in appendix A to this
part from January 1 to December 31 of
the previous year. The head official in
each monitoring agency, or his or her
designee, shall certify that the previous
year of ambient concentration and
quality assurance data are completely
submitted to AQS and that the ambient
concentration data are accurate to the
best of her or his knowledge, taking into
consideration the quality assurance
findings. The annual data certification
letter is due by May 1 of each year.
(b) Along with each certification
letter, the state shall submit to the
Regional Administrator an annual
summary report of all the ambient air
quality data collected by FRM, FEM,
and ARM monitors at SLAMS and SPM
sites. The annual report(s) shall be
submitted for data collected from
January 1 to December 31 of the
previous year. The annual summary
serves as the record of the specific data
that is the object of the certification
letter.
(c) Along with each certification
letter, the state shall submit to the
Regional Administrator a summary of
the precision and accuracy data for all
ambient air quality data collected by
FRM, FEM, and ARM monitors at
SLAMS and SPM sites. The summary of
precision and accuracy shall be
submitted for data collected from
January 1 to December 31 of the
previous year.
■ 8. In § 58.16, revise paragraphs (a), (c),
and (d) to read as follows:
§ 58.16 Data submittal and archiving
requirements.
(a) The state, or where appropriate,
local agency, shall report to the
Administrator, via AQS all ambient air
quality data and associated quality
assurance data for SO2; CO; O3; NO2;
NO; NOy; NOX; Pb-TSP mass
concentration; Pb-PM10 mass
concentration; PM10 mass concentration;
PM2.5 mass concentration; for filter-based PM2.5 FRM/FEM, the field blank
mass; chemically speciated PM2.5 mass
concentration data; PM10–2.5 mass
concentration; meteorological data from
NCore and PAMS sites; and metadata
records and information specified by the
AQS Data Coding Manual (https://
www.epa.gov/ttn/airs/airsaqs/manuals/
manuals.htm). Air quality data and
information must be submitted directly
to the AQS via electronic transmission
on the specified schedule described in
paragraphs (b) and (d) of this section.
* * * * *
(c) Air quality data submitted for each
reporting period must be edited,
validated, and entered into the AQS
(within the time limits specified in
paragraphs (b) and (d) of this section)
pursuant to appropriate AQS
procedures. The procedures for editing
and validating data are described in the
AQS Data Coding Manual and in each
monitoring agency’s quality assurance
project plan.
(d) The state shall report VOC and if
collected, carbonyl, NH3, and HNO3
data from PAMS sites, and chemically
speciated PM2.5 mass concentration data
to AQS within 6 months following the
end of each quarterly reporting period
listed in paragraph (b) of this section.
* * * * *
■ 9. Revise Appendix A to part 58 to
read as follows:
Appendix A to Part 58—Quality
Assurance Requirements for Monitors
Used in Evaluations of National
Ambient Air Quality Standards
1. General Information
2. Quality System Requirements
3. Measurement Quality Check Requirements
4. Calculations for Data Quality Assessments
5. Reporting Requirements
6. References
1. General Information
1.1 Applicability. (a) This appendix
specifies the minimum quality system
requirements applicable to SLAMS and other
monitor types whose data are intended to be
used to determine compliance with the
NAAQS (e.g., SPMs, tribal, CASTNET,
industrial, etc), unless the EPA Regional
Administrator has reviewed and approved
the monitor for exclusion from NAAQS use
and these quality assurance requirements.
(b) Primary quality assurance organizations
are encouraged to develop and maintain
quality systems more extensive than the
required minimums. Additional guidance for
the requirements reflected in this appendix
can be found in the ‘‘Quality Assurance
Handbook for Air Pollution Measurement
Systems,’’ Volume II (see reference 10 of this
appendix) and at a national level in
references 1, 2, and 3 of this appendix.
1.2 Primary Quality Assurance
Organization (PQAO). A PQAO is defined as
a monitoring organization or a coordinated
aggregation of such organizations that is
responsible for a set of stations that monitors
the same pollutant and for which data quality
assessments will be pooled. Each criteria
pollutant/monitor must be associated with
only one PQAO. In some cases, data quality
is assessed at the PQAO level.
1.2.1 Each PQAO shall be defined such
that measurement uncertainty among all
stations in the organization can be expected
to be reasonably homogeneous as a result of
common factors. Common factors that should
be considered in defining PQAOs include:
(a) Operation by a common team of field
operators according to a common set of
procedures;
(b) Use of a common quality assurance
project plan (QAPP) or standard operating
procedures;
(c) Common calibration facilities and
standards;
(d) Oversight by a common quality
assurance organization; and
(e) Support by a common management
organization (i.e., state agency) or laboratory.
Since data quality assessments are made
and data certified at the PQAO level, the
monitoring organization identified as the
PQAO will be responsible for the oversight
of the quality of data of all monitoring
organizations within the PQAO.
1.2.2 Monitoring organizations having
difficulty describing their PQAO or assigning
specific monitors to primary quality
assurance organizations should consult with
the appropriate EPA regional office. Any
consolidation of monitoring organizations to
PQAOs shall be subject to final approval by
the appropriate EPA regional office.
1.2.3 Each PQAO is required to
implement a quality system that provides
sufficient information to assess the quality of
the monitoring data. The quality system
must, at a minimum, include the specific
requirements described in this appendix.
Failure to conduct or pass a required check
or procedure, or a series of required checks
or procedures, does not by itself invalidate
data for regulatory decision making. Rather,
PQAOs and the EPA shall use the checks and
procedures required in this appendix in
combination with other data quality
information, reports, and similar
documentation that demonstrate overall
compliance with part 58. Accordingly, the
EPA and PQAOs shall use a ‘‘weight of
evidence’’ approach when determining the
suitability of data for regulatory decisions.
The EPA reserves the authority to use or not
use monitoring data submitted by a
monitoring organization when making
regulatory decisions based on the EPA’s
assessment of the quality of the data.
Consensus-built validation templates or
validation criteria already approved in
Quality Assurance Project Plans (QAPPs)
should be used as the basis for the weight of
evidence approach.
1.3 Definitions.
(a) Measurement Uncertainty. A term used
to describe deviations from a true
concentration or estimate that are related to
the measurement process and not to spatial
or temporal population attributes of the air
being measured.
(b) Precision. A measurement of mutual
agreement among individual measurements
of the same property usually under
prescribed similar conditions, expressed
generally in terms of the standard deviation.
(c) Bias. The systematic or persistent
distortion of a measurement process which
causes errors in one direction.
(d) Accuracy. The degree of agreement
between an observed value and an accepted
reference value. Accuracy includes a
combination of random error (imprecision)
and systematic error (bias) components
which are due to sampling and analytical
operations.
(e) Completeness. A measure of the amount
of valid data obtained from a measurement
system compared to the amount that was
expected to be obtained under correct,
normal conditions.
(f) Detection Limit. The lowest
concentration or amount of target analyte that
can be determined to be different from zero
by a single measurement at a stated level of
probability.
1.4 Measurement Quality Checks. The
measurement quality checks described in section 3 of this appendix shall be reported
to AQS and are included in the data required
for certification.
1.5 Assessments and Reports. Periodic
assessments and documentation of data
quality are required to be reported to the
EPA. To provide national uniformity in this
assessment and reporting of data quality for
all networks, specific assessment and
reporting procedures are prescribed in detail
in sections 3, 4, and 5 of this appendix. On
the other hand, the selection and extent of
the quality assurance and quality control
activities used by a monitoring organization
depend on a number of local factors such as
field and laboratory conditions, the
objectives for monitoring, the level of data
quality needed, the expertise of assigned
personnel, the cost of control procedures,
pollutant concentration levels, etc. Therefore,
quality system requirements in section 2 of
this appendix are specified in general terms
to allow each monitoring organization to
develop a quality system that is most
efficient and effective for its own
circumstances while achieving the data
quality objectives described in this appendix.
2. Quality System Requirements
A quality system (reference 1 of this
appendix) is the means by which an
organization manages the quality of the
monitoring information it produces in a
systematic, organized manner. It provides a
framework for planning, implementing,
assessing and reporting work performed by
an organization and for carrying out required
quality assurance and quality control
activities.
2.1 Quality Management Plans and
Quality Assurance Project Plans. All PQAOs
must develop a quality system that is
described and approved in quality
management plans (QMP) and QAPPs to
ensure that the monitoring results:
(a) Meet a well-defined need, use, or
purpose (reference 5 of this appendix);
(b) Provide data of adequate quality for the
intended monitoring objectives;
(c) Satisfy stakeholder expectations;
(d) Comply with applicable standards
specifications;
(e) Comply with statutory (and other legal)
requirements; and
(f) Reflect consideration of cost and
economics.
2.1.1 The QMP describes the quality
system in terms of the organizational
structure, functional responsibilities of
management and staff, lines of authority, and
required interfaces for those planning,
implementing, assessing and reporting
activities involving environmental data
operations (EDO). The QMP must be suitably
documented in accordance with EPA
requirements (reference 2 of this appendix),
and approved by the appropriate Regional
Administrator, or his or her representative.
The quality system described in the QMP
will be reviewed during the systems audits
described in section 2.5 of this appendix.
Organizations that implement long-term
monitoring programs with EPA funds should
have a separate QMP document. Smaller
organizations, organizations that do
infrequent work with the EPA or have
monitoring programs of limited size or scope
may combine the QMP with the QAPP if approved by, and subject to any conditions of, the EPA. Additional guidance on this
process can be found in reference 10 of this
appendix. Approval of the recipient’s QMP
by the appropriate Regional Administrator or
his or her representative may allow
delegation of authority to review and approve
environmental data collection activities
adequately described and covered under the
scope of the QMP and documented in
appropriate planning documents (QAPP) to
the PQAO's independent quality assurance
function. Where a PQAO or monitoring
organization has been delegated authority to
review and approve their QAPP, an
electronic copy must be submitted to the EPA
region at the time it is submitted to the
PQAO/monitoring organization's QAPP
approving authority. The QAPP will be
reviewed by the EPA during systems audits
or circumstances related to data quality. The
QMP submission and approval dates for
PQAOs/monitoring organizations must be
reported to AQS.
2.1.2 The QAPP is a formal document
describing, in sufficient detail, the quality
system that must be implemented to ensure
that the results of work performed will satisfy
the stated objectives. PQAOs must develop
QAPPs that describe how the organization
intends to control measurement uncertainty
to an appropriate level in order to achieve the
data quality objectives for the EDO. The
quality assurance policy of the EPA requires
every EDO to have a written and approved
QAPP prior to the start of the EDO. It is the
responsibility of the PQAO/monitoring
organization to adhere to this policy. The
QAPP must be suitably documented in
accordance with EPA requirements (reference
3 of this appendix) which include standard
operating procedures for all EDOs either
within the document or by appropriate
reference. The QAPP must identify each
PQAO operating monitors under the QAPP as
well as generally identify the sites and
monitors to which it is applicable. The QAPP
submission and approval dates must be
reported to AQS.
2.1.3 The PQAO/monitoring
organization’s quality system must have
adequate resources both in personnel and
funding to plan, implement, assess and
report on the achievement of the
requirements of this appendix and its
approved QAPP.
2.2 Independence of Quality Assurance.
The PQAO must provide for a quality
assurance management function; that aspect
of the overall management system of the
organization that determines and implements
the quality policy defined in a PQAO’s QMP.
Quality management includes strategic
planning, allocation of resources and other
systematic planning activities (e.g., planning,
implementation, assessing and reporting)
pertaining to the quality system. The quality
assurance management function must have
sufficient technical expertise and
management authority to conduct
independent oversight and assure the
implementation of the organization’s quality
system relative to the ambient air quality
monitoring program and should be
organizationally independent of
environmental data generation activities.
2.3 Data Quality Performance
Requirements.
2.3.1 Data Quality Objectives. The DQOs,
or the results of other systematic planning
processes, are statements that define the
appropriate type of data to collect and
specify the tolerable levels of potential
decision errors that will be used as a basis
for establishing the quality and quantity of
data needed to support the monitoring
objectives (reference 5 of this appendix). The
DQOs will be developed by the EPA to
support the primary regulatory objectives for
each criteria pollutant. As they are
developed, they will be added to the
regulation. The quality of the conclusions
derived from data interpretation can be
affected by population uncertainty (spatial or
temporal uncertainty) and measurement
uncertainty (uncertainty associated with
collecting, analyzing, reducing and reporting
concentration data). This appendix focuses
on assessing and controlling measurement
uncertainty.
2.3.1.1 Measurement Uncertainty for
Automated and Manual PM2.5 Methods. The
goal for acceptable measurement uncertainty
is defined for precision as an upper 90
percent confidence limit for the coefficient of
variation (CV) of 10 percent and plus or
minus 10 percent for total bias.
2.3.1.2 Measurement Uncertainty for
Automated O3 Methods. The goal for
acceptable measurement uncertainty is
defined for precision as an upper 90 percent
confidence limit for the CV of 7 percent and
for bias as an upper 95 percent confidence
limit for the absolute bias of 7 percent.
2.3.1.3 Measurement Uncertainty for Pb
Methods. The goal for acceptable
measurement uncertainty is defined for
precision as an upper 90 percent confidence
limit for the CV of 20 percent and for bias
as an upper 95 percent confidence limit for
the absolute bias of 15 percent.
2.3.1.4 Measurement Uncertainty for
NO2. The goal for acceptable measurement
uncertainty is defined for precision as an
upper 90 percent confidence limit for the CV
of 15 percent and for bias as an upper 95
percent confidence limit for the absolute bias
of 15 percent.
2.3.1.5 Measurement Uncertainty for SO2.
The goal for acceptable measurement
uncertainty for precision is defined as an
upper 90 percent confidence limit for the CV
of 10 percent and for bias as an upper 95
percent confidence limit for the absolute bias
of 10 percent.
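For illustration only, and not part of the proposed regulatory text, the goals in sections 2.3.1.1 through 2.3.1.5 can be expressed as a simple lookup table against which the section 4 statistics are compared. The sketch below is in Python; the names (DQO_GOALS, meets_goals) are hypothetical.

```python
# Illustrative sketch only, not part of the proposed regulatory text.
# Goals taken from sections 2.3.1.1 through 2.3.1.5 above, expressed as
# (precision CV upper bound goal %, absolute bias upper bound goal %).
DQO_GOALS = {
    "PM2.5": (10, 10),
    "O3":    (7, 7),
    "Pb":    (20, 15),
    "NO2":   (15, 15),
    "SO2":   (10, 10),
}

def meets_goals(pollutant: str, cv_upper: float, bias_upper: float) -> bool:
    """Compare computed section 4 statistics against the stated goals."""
    cv_goal, bias_goal = DQO_GOALS[pollutant]
    return cv_upper <= cv_goal and abs(bias_upper) <= bias_goal
```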
2.4 National Performance Evaluation
Programs. The PQAO shall provide for the
implementation of a program of independent
and adequate audits of all monitors providing
data for NAAQS compliance purposes
including the provision of adequate resources
for such audit programs. A monitoring plan
(or QAPP) which provides for PQAO
participation in the EPA’s National
Performance Audit Program (NPAP), the
PM2.5 Performance Evaluation Program
(PM2.5-PEP) program and the Pb Performance
Evaluation Program (Pb-PEP) and indicates
the consent of the PQAO for the EPA to apply
an appropriate portion of the grant funds,
which the EPA would otherwise award to the
PQAO for these QA activities, will be
deemed by the EPA to meet this requirement.
For clarification and to participate, PQAOs
should contact either the appropriate EPA
regional quality assurance (QA) coordinator
at the appropriate EPA regional office
location, or the NPAP coordinator at the EPA
Air Quality Assessment Division, Office of
Air Quality Planning and Standards, in
Research Triangle Park, North Carolina. The
PQAOs that plan to implement these
programs (self-implement) rather than use
the federal programs must meet the adequacy
requirements found in the appropriate
sections that follow, as well as meet the
definition of independent assessment that
follows.
2.4.1 Independent assessment. An
assessment performed by a qualified
individual, group, or organization that is not
part of the organization directly performing
and accountable for the work being assessed.
This auditing organization must not be
involved with the generation of the ambient
air monitoring data. An organization can
conduct the performance evaluation (PE) if it
can meet this definition and has a
management structure that, at a minimum,
will allow for the separation of its routine
sampling personnel from its auditing
personnel by two levels of management. In
addition, the sample analysis of audit filters
must be performed by a laboratory facility
and laboratory equipment separate from the
facilities used for routine sample analysis.
Field and laboratory personnel will be
required to meet PE field and laboratory
training and certification requirements to
establish comparability to federally
implemented programs.
2.5 Technical Systems Audit Program.
Technical systems audits of each PQAO shall
be conducted at least every 3 years by the
appropriate EPA regional office and reported
to the AQS. If a PQAO is made up of more
than one monitoring organization, all
monitoring organizations in the PQAO
should be audited within 6 years (two TSA
cycles of the PQAO). As an example, if a state
has five local monitoring organizations that
are consolidated under one PQAO, all five
local monitoring organizations will receive a
technical systems audit within a 6-year
period. Systems audit programs are described
in reference 10 of this appendix. For further
instructions, PQAOs should contact the
appropriate EPA regional QA coordinator.
2.6 Gaseous and Flow Rate Audit
Standards.
2.6.1 Gaseous pollutant concentration
standards (permeation devices or cylinders of
compressed gas) used to obtain test
concentrations for carbon monoxide (CO),
sulfur dioxide (SO2), nitrogen oxide (NO),
and nitrogen dioxide (NO2) must be traceable
to either a National Institute of Standards and
Technology (NIST) Traceable Reference
Material (NTRM) or a NIST-certified Gas
Manufacturer’s Internal Standard (GMIS),
certified in accordance with one of the
procedures given in reference 4 of this
appendix. Vendors advertising certification
with the procedures provided in reference 4
of this appendix and distributing gases as
‘‘EPA Protocol Gas’’ for ambient air
monitoring purposes must participate in the
EPA Ambient Air Protocol Gas Verification
Program or not use ‘‘EPA’’ in any form of
advertising. Monitoring organizations must
provide information to the EPA on the gas
producers they use on an annual basis and
those PQAOs purchasing standards will be
obligated, at the request of the EPA, to
participate in the program at least once every
5 years by sending a new unused standard to
a designated verification laboratory.
2.6.2 Test concentrations for ozone (O3)
must be obtained in accordance with the
ultraviolet photometric calibration procedure
specified in appendix D to part 50 of this
chapter and by means of a certified NIST-traceable O3 transfer standard. Consult
references 7 and 8 of this appendix for
guidance on transfer standards for O3.
2.6.3 Flow rate measurements must be
made by a flow measuring instrument that is
NIST-traceable to an authoritative volume or
other applicable standard. Guidance for
certifying some types of flowmeters is
provided in reference 10 of this appendix.
2.7 Primary Requirements and Guidance.
Requirements and guidance documents for
developing the quality system are contained
in references 1 through 11 of this appendix,
which also contain many suggested
procedures, checks, and control
specifications. Reference 10 describes
specific guidance for the development of a
quality system for data collected for
comparison to the NAAQS. Many specific
quality control checks and specifications for
methods are included in the respective
reference methods described in part 50 of
this chapter or in the respective equivalent
method descriptions available from the EPA
(reference 6 of this appendix). Similarly,
quality control procedures related to
specifically designated reference and
equivalent method monitors are contained in
the respective operation or instruction
manuals associated with those monitors.
3. Measurement Quality Check
Requirements
This section provides the requirements for
PQAOs to perform the measurement quality
checks that can be used to assess data
quality. Data from these checks are required
to be submitted to the AQS within the same
time frame as routinely-collected ambient
concentration data as described in 40 CFR
58.16. Table A–1 of this appendix provides
a summary of the types and frequency of the
measurement quality checks that will be
described in this section.
3.1 Gaseous Monitors of SO2, NO2, O3,
and CO.
3.1.1 One-Point Quality Control (QC)
Check for SO2, NO2, O3, and CO. (a) A one-point QC check must be performed at least
once every 2 weeks on each automated
monitor used to measure SO2, NO2, O3 and
CO. With the advent of automated calibration
systems, more frequent checking is strongly
encouraged. See Reference 10 of this
appendix for guidance on the review
procedure. The QC check is made by
challenging the monitor with a QC check gas
of known concentration (effective
concentration for open path monitors)
between the prescribed range of 0.005 and
0.08 parts per million (ppm) for SO2, NO2,
and O3, and between the prescribed range of
0.5 and 5 ppm for CO monitors. The QC
check gas concentration selected within the
prescribed range must be related to the mean
or median of the ambient air concentrations
normally measured at sites within the
monitoring network in order to appropriately
reflect the precision and bias at these
ambient air concentration ranges. If the mean or median concentrations at the sites are below or above the prescribed range for the relevant pollutant, select the lowest or highest concentration in the range, respectively (see the selection sketch following paragraph (d) of this section). An
additional QC check point is encouraged for
those organizations that may have occasional
high values or would like to confirm the
monitors’ linearity at the higher end of the
operational range or around NAAQS
concentrations.
(b) Point analyzers must operate in their
normal sampling mode during the QC check
and the test atmosphere must pass through
all filters, scrubbers, conditioners and other
components used during normal ambient
sampling and as much of the ambient air
inlet system as is practicable. The QC check
must be conducted before any calibration or
adjustment to the monitor.
(c) Open path monitors are tested by
inserting a test cell containing a QC check gas
concentration into the optical measurement
beam of the instrument. If possible, the
normally used transmitter, receiver, and as
appropriate, reflecting devices should be
used during the test, and the normal
monitoring configuration of the instrument
should be altered as little as possible to
accommodate the test cell for the test.
However, if permitted by the associated
operation or instruction manual, an alternate
local light source or an alternate optical path
that does not include the normal atmospheric
monitoring path may be used. The actual
concentration of the QC check gas in the test
cell must be selected to produce an effective
concentration in the range specified earlier in
this section. Generally, the QC test
concentration measurement will be the sum
of the atmospheric pollutant concentration
and the QC test concentration. As such, the
result must be corrected to remove the
atmospheric concentration contribution. The
corrected concentration is obtained by
subtracting the average of the atmospheric
concentrations measured by the open path
instrument under test immediately before
and immediately after the QC test from the
QC check gas concentration measurement. If
the difference between these before and after
measurements is greater than 20 percent of
the effective concentration of the test gas,
discard the test result and repeat the test. If
possible, open path monitors should be
tested during periods when the atmospheric
pollutant concentrations are relatively low
and steady.
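As an illustrative aside, and not regulatory text, the correction and the 20 percent discard rule described in this paragraph reduce to the following Python sketch; the function name and arguments are hypothetical.

```python
# Illustrative sketch only, not regulatory text; names are hypothetical.
# Implements the open path correction of paragraph (c): subtract the
# mean of the ambient readings taken immediately before and after the
# test, and discard the result when those readings differ by more than
# 20 percent of the effective test-gas concentration.
def corrected_open_path_reading(qc_reading, before, after, effective_conc):
    if abs(before - after) > 0.20 * effective_conc:
        return None  # discard the test result and repeat the test
    return qc_reading - (before + after) / 2.0
```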
(d) Report the audit concentration of the
QC gas and the corresponding measured
concentration indicated by the monitor to
AQS. The percent differences between these
concentrations are used to assess the
precision and bias of the monitoring data as
described in sections 4.1.2 (precision) and
4.1.3 (bias) of this appendix.
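As an illustrative aside, and not regulatory text, the concentration-selection logic of paragraph (a) amounts to clamping the network's typical ambient level to the prescribed range. A minimal Python sketch follows, with hypothetical names.

```python
# Illustrative sketch only, not regulatory text; names are hypothetical.
# Prescribed one-point QC ranges from paragraph (a), in ppm.
QC_RANGE_PPM = {
    "SO2": (0.005, 0.08),
    "NO2": (0.005, 0.08),
    "O3":  (0.005, 0.08),
    "CO":  (0.5, 5.0),
}

def select_qc_concentration(pollutant, typical_ambient_ppm):
    """Tie the QC check gas to the network mean/median ambient level,
    clamped to the prescribed range as the rule directs."""
    lo, hi = QC_RANGE_PPM[pollutant]
    return min(max(typical_ambient_ppm, lo), hi)
```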
3.1.2 Annual performance evaluation for
SO2, NO2, O3, or CO. A performance
evaluation must be conducted on each
primary monitor once a year. This can be
accomplished by evaluating 25 percent of the
primary monitors each quarter. The
evaluation should be conducted by a trained
experienced technician other than the
routine site operator.
3.1.2.1 The evaluation is made by
challenging the monitor with audit gas
standards of known concentration from at
least three audit levels. Two of the audit
levels selected will represent a range of 10–
80 percent of the typical ambient air
concentrations either measured by the
monitor or in the PQAOs network of
monitors. The third point should be at the
NAAQS level or above the highest 3-year
ambient air hourly concentration, whichever
is greater. An additional 4th level is
encouraged for those agencies that would like
to confirm the monitors’ linearity at the
higher end of the operational range. In rare
circumstances, there may be sites measuring
concentrations above audit level 10. Notify
the appropriate EPA region and the AQS
program in order to make accommodations
for auditing at levels above level 10.
                          Concentration range, ppm

Audit level      O3              SO2              NO2              CO
1                0.004–0.0059    0.0003–0.0029    0.0003–0.0029    0.020–0.059
2                0.006–0.019     0.0030–0.0049    0.0030–0.0049    0.060–0.199
3                0.020–0.039     0.0050–0.0079    0.0050–0.0079    0.200–0.899
4                0.040–0.069     0.0080–0.0199    0.0080–0.0199    0.900–2.999
5                0.070–0.089     0.0200–0.0499    0.0200–0.0499    3.000–7.999
6                0.090–0.119     0.0500–0.0999    0.0500–0.0999    8.000–15.999
7                0.120–0.139     0.1000–0.1499    0.1000–0.2999    16.000–30.999
8                0.140–0.169     0.1500–0.2599    0.3000–0.4999    31.000–39.999
9                0.170–0.189     0.2600–0.7999    0.5000–0.7999    40.000–49.999
10               0.190–0.259     0.8000–1.000     0.8000–1.000     50.000–60.000
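As an illustrative aside, and not regulatory text, the table lends itself to a simple lookup. The sketch below encodes only the O3 column; the other pollutants would use their own columns. Names are hypothetical.

```python
# Illustrative sketch only, not regulatory text; names are hypothetical.
# The O3 column of the audit level table above, as (level, low, high)
# tuples in ppm.
O3_AUDIT_LEVELS_PPM = [
    (1, 0.004, 0.0059), (2, 0.006, 0.019), (3, 0.020, 0.039),
    (4, 0.040, 0.069),  (5, 0.070, 0.089), (6, 0.090, 0.119),
    (7, 0.120, 0.139),  (8, 0.140, 0.169), (9, 0.170, 0.189),
    (10, 0.190, 0.259),
]

def o3_audit_level(conc_ppm):
    """Return the audit level whose range contains the concentration."""
    for level, lo, hi in O3_AUDIT_LEVELS_PPM:
        if lo <= conc_ppm <= hi:
            return level
    raise ValueError("concentration outside audit levels 1 through 10")
```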
3.1.2.2 The NO2 audit techniques may
vary depending on the ambient monitoring
method. For chemiluminescence-type NO2
analyzers, gas phase titration (GPT)
techniques should be based on EPA guidance
documents and monitoring agency
experience. The NO2 gas standards may be
more appropriate than GPT for direct NO2
methods that do not employ converters. Care
should be taken to ensure the stability of
such gas standards prior to use.
3.1.2.3 The standards from which audit
gas test concentrations are obtained must
meet the specifications of section 2.6.1 of this
appendix. The gas standards and equipment
used for the performance evaluation must not
be the same as the standards and equipment
used for one-point QC, calibrations, span
evaluations or NPAP.
3.1.2.4 For point analyzers, the
evaluation shall be carried out by allowing
the monitor to analyze the audit gas test
atmosphere in its normal sampling mode
such that the test atmosphere passes through
all filters, scrubbers, conditioners, and other
sample inlet components used during normal
ambient sampling and as much of the
ambient air inlet system as is practicable.
3.1.2.5 Open path monitors are evaluated
by inserting a test cell containing the various
audit gas concentrations into the optical
measurement beam of the instrument. If
possible, the normally used transmitter,
receiver, and, as appropriate, reflecting
devices should be used during the
evaluation, and the normal monitoring
configuration of the instrument should be
modified as little as possible to accommodate
the test cell for the evaluation. However, if
permitted by the associated operation or
instruction manual, an alternate local light
source or an alternate optical path that does
not include the normal atmospheric
monitoring path may be used. The actual
concentrations of the audit gas in the test cell
must be selected to produce effective
concentrations in the evaluation level ranges
specified in this section of this appendix.
Generally, each evaluation concentration
measurement result will be the sum of the
atmospheric pollutant concentration and the
evaluation test concentration. As such, the
result must be corrected to remove the
atmospheric concentration contribution. The
corrected concentration is obtained by
subtracting the average of the atmospheric
concentrations measured by the open path
instrument under test immediately before
and immediately after the evaluation test (or
preferably before and after each evaluation
concentration level) from the evaluation
concentration measurement. If the difference
between the before and after measurements is
greater than 20 percent of the effective
concentration of the test gas standard,
discard the test result for that concentration
level and repeat the test for that level. If
possible, open path monitors should be
evaluated during periods when the
atmospheric pollutant concentrations are
relatively low and steady. Also, if the open
path instrument is not installed in a
permanent manner, the monitoring path
length must be reverified to be within plus
or minus 3 percent to validate the evaluation
since the monitoring path length is critical to
the determination of the effective
concentration.
3.1.2.6 Report both the evaluation
concentrations (effective concentrations for
open path monitors) of the audit gases and
the corresponding measured concentration
(corrected concentrations, if applicable, for
open path monitors) indicated or produced
by the monitor being tested to AQS. The
percent differences between these
concentrations are used to assess the quality
of the monitoring data as described in section
4.1.1 of this appendix.
3.1.3 National Performance Audit
Program (NPAP).
The NPAP is a performance evaluation
which is a type of audit where quantitative
data are collected independently in order to
evaluate the proficiency of an analyst,
monitoring instrument or laboratory. Details
of the program can be found in reference 11
of this appendix. The program requirements
include:
3.1.3.1 Performing audits of the primary
monitors at 20 percent of monitoring sites per
year, and 100 percent of the sites in 6 years.
High-priority sites may be visited more often.
Since not all gaseous criteria pollutants are
monitored at every site within a PQAO, it is
not required that 20 percent of the primary
monitors for each pollutant receive an NPAP
audit each year only that 20 percent of the
PQAOs monitoring sites receive an NPAP
audit. It is expected that over the 6-year
period all primary monitors for all gaseous
pollutants will receive an NPAP audit.
3.1.3.2 Developing a delivery system that
will allow for the audit concentration gases
to be introduced to the probe inlet where
logistically feasible.
3.1.3.3 Using audit gases that are verified
against the NIST standard reference methods
or special review procedures and validated
annually for CO, SO2 and NO2, and at the
beginning of each quarter of audits for O3.
3.1.3.4 As described in section 2.4 of this
appendix, the PQAO may elect, on an annual
basis, to utilize the federally implemented
NPAP program. If the PQAO plans to self-implement NPAP, the EPA will establish
training and other technical requirements for
PQAOs to establish comparability to
federally implemented programs. In addition
to meeting the requirements in sections
3.1.3.1 through 3.1.3.3 of this appendix, the
PQAO must:
(a) Utilize an audit system equivalent to the federally implemented NPAP audit system that is separate from the equipment used in annual performance evaluations.
(b) Perform a whole system check by
having the NPAP system tested against an
independent and qualified EPA lab, or
equivalent.
(c) Evaluate the system with the EPA NPAP
program through collocated auditing at an
acceptable number of sites each year (at least
one for an agency network of five or fewer
sites; at least two for a network with more
than five sites).
(d) Incorporate the NPAP in the PQAO’s
quality assurance project plan.
(e) Be subject to review by independent,
EPA-trained personnel.
(f) Participate in initial and update
training/certification sessions.
3.2 PM2.5.
3.2.1 Flow Rate Verification for PM2.5. A
one-point flow rate verification check must
be performed at least once every month (each
verification minimally separated by 14 days)
on each monitor used to measure PM2.5. The
verification is made by checking the
operational flow rate of the monitor. If the
verification is made in conjunction with a
flow rate adjustment, it must be made prior
to such flow rate adjustment. For the
standard procedure, use a flow rate transfer
standard certified in accordance with section
2.6 of this appendix to check the monitor’s
normal flow rate. Care should be used in
selecting and using the flow rate
measurement device such that it does not
alter the normal operating flow rate of the
monitor. Report the flow rate of the transfer
standard and the corresponding flow rate
measured by the monitor to AQS. The
percent differences between the audit and
measured flow rates are used to assess the
bias of the monitoring data as described in
section 4.2.2 of this appendix (using flow
rates in lieu of concentrations).
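As an illustrative aside, and not regulatory text, the reported statistic is the same equation 1 percent difference defined in section 4.1.1, applied to flows. A one-line Python sketch with hypothetical names:

```python
# Illustrative sketch only, not regulatory text; names are hypothetical.
# Equation 1 of section 4.1.1 applied to flow rates: meas is the
# monitor-indicated flow, audit is the certified transfer standard flow.
def flow_percent_difference(monitor_flow, standard_flow):
    return (monitor_flow - standard_flow) / standard_flow * 100.0
```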
3.2.2 Semi-Annual Flow Rate Audit for
PM2.5. Audit the flow rate of the particulate
monitor twice a year. The two audits should
ideally be spaced between 5 and 7 months
apart. The EPA strongly encourages more
frequent auditing. The audit should
(preferably) be conducted by a trained
experienced technician other than the
routine site operator. The audit is made by
measuring the monitor’s normal operating
flow rate(s) using a flow rate transfer
standard certified in accordance with section
2.6 of this appendix. The flow rate standard
used for auditing must not be the same flow
rate standard used for verifications or to
calibrate the monitor. However, both the
calibration standard and the audit standard
may be referenced to the same primary flow
rate or volume standard. Care must be taken
in auditing the flow rate to be certain that the
flow measurement device does not alter the
normal operating flow rate of the monitor.
Report the audit flow rate of the transfer
standard and the corresponding flow rate
measured by the monitor to AQS. The
E:\FR\FM\11SEP2.SGM
11SEP2
54382
Federal Register / Vol. 79, No. 176 / Thursday, September 11, 2014 / Proposed Rules
percent differences between these flow rates
are used to evaluate monitor performance.
3.2.3 Collocated Quality Control
Sampling Procedures for PM2.5. For each pair
of collocated monitors, designate one
sampler as the primary monitor whose
concentrations will be used to report air
quality for the site, and designate the other
as the quality control monitor. There can be
only one primary monitor at a monitoring
site for a given time period.
3.2.3.1 For each distinct monitoring
method designation (FRM or FEM) that a
PQAO is using for a primary monitor, the
PQAO must:
(a) Have 15 percent of the primary
monitors of each method designation
collocated (values of 0.5 and greater round
up); and
(b) Have at least one collocated quality
control monitor (if the total number of
monitors is less than three). The first
collocated monitor must be a designated
FRM monitor.
3.2.3.2 In addition, monitors selected for
collocation must also meet the following
requirements:
(a) A primary monitor designated as an
EPA FRM shall be collocated with a quality
control monitor having the same EPA FRM
method designation.
(b) For each primary monitor designated as
an EPA FEM used by the PQAO, 50 percent
of the monitors designated for collocation, or
the first if only one collocation is necessary,
shall be collocated with a FRM quality
control monitor and 50 percent of the
monitors shall be collocated with a monitor
having the same method designation as the
FEM primary monitor. If an odd number of
collocated monitors is required, the
additional monitor shall be a FRM quality
control monitor. An example of the
distribution of collocated monitors for each
unique FEM is provided below. Table A–2 of
this appendix demonstrates the procedure
with a PQAO having an FRM and multiple
FEMs.
#Primary FEMs of a unique    #Collocated    #Collocated      #Collocated with same
method designation                          with an FRM      method designation
1–9                          1              1                0
10–16                        2              1                1
17–23                        3              2                1
24–29                        4              2                2
30–36                        5              3                2
37–43                        6              3                3
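As an illustrative aside, and not regulatory text, the table above follows from the 15 percent round-up rule of section 3.2.3.1 and the 50/50 split of paragraph (b), with any odd monitor assigned to the FRM side. A Python sketch with hypothetical names reproduces the rows:

```python
# Illustrative sketch only, not regulatory text; names are hypothetical.
# 15 percent of primary FEMs collocated (0.5 and greater rounds up, at
# least one per section 3.2.3.1(b)); half with an FRM and half with the
# same method designation, odd extra assigned to the FRM side.
import math

def fem_collocation(n_primary_fems):
    total = max(1, math.floor(0.15 * n_primary_fems + 0.5))
    with_frm = math.ceil(total / 2)          # odd extra goes to the FRM
    same_designation = total - with_frm
    return total, with_frm, same_designation

# Example: fem_collocation(30) returns (5, 3, 2), matching the 30-36 row.
```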
3.2.3.3 Since the collocation
requirements are used to assess precision of
the primary monitors and there can only be
one primary monitor at a monitoring site, a
site can only count for the collocation of the
method designation of the primary monitor at
that site.
3.2.3.4 The collocated monitors should be
deployed according to the following protocol:
(a) Fifty percent of the collocated quality
control monitors should be deployed at sites
with annual average or daily concentrations
estimated to be within ±20 percent of either
the annual or 24-hour NAAQS and the
remainder at the PQAO's discretion;
(b) If an organization has no sites with
annual average or daily concentrations
within ±20 percent of the annual NAAQS or
24-hour NAAQS, 50 percent of the collocated
quality control monitors should be deployed
at those sites with the annual mean
concentrations or 24-hour concentrations
among the highest for all sites in the network
and the remainder at the PQAO's discretion.
(c) The two collocated monitors must be
within 4 meters (inlet to inlet) of each other
and at least 2 meters apart for flow rates
greater than 200 liters/min or at least 1 meter
apart for samplers having flow rates less than
200 liters/min to preclude airflow
interference. A waiver allowing up to 10
meters horizontal distance and up to 3 meters
vertical distance (inlet to inlet) between a
primary and collocated sampler may be
approved by the Regional Administrator for
sites at a neighborhood or larger scale of
representation during the annual network
plan approval process. Calibration, sampling,
and analysis must be the same for both
primary and collocated quality control
samplers and the same as for all other
samplers in the network.
(d) Sample the collocated quality control
monitor on a 1-in-12 day schedule. Report
the measurements from both primary and
collocated quality control monitors at each
collocated sampling site to AQS. The
calculations for evaluating precision between
the two collocated monitors are described in
section 4.2.1 of this appendix.
3.2.4 PM2.5 Performance Evaluation
Program (PEP) Procedures. The PEP is an
independent assessment used to estimate
total measurement system bias. These
evaluations will be performed under the
NPEP as described in section 2.4 of this
appendix or a comparable program.
Performance evaluations will be performed
annually within each PQAO. For PQAOs
with less than or equal to five monitoring
sites, five valid performance evaluation
audits must be collected and reported each
year. For PQAOs with greater than five
monitoring sites, eight valid performance
evaluation audits must be collected and
reported each year. A valid performance
evaluation audit means that both the primary
monitor and PEP audit concentrations are
valid and above 3 μg/m3. Siting of the PEP
monitor should be consistent with section
3.2.3.7. However, any horizontal distance
greater than 4 meters and any vertical
distance greater than one meter must be
reported to the EPA regional PEP
coordinator. Additionally, for every monitor
designated as a primary monitor, a primary
quality assurance organization must:
3.2.4.1 Have each method designation
evaluated each year; and,
3.2.4.2 Have all FRM, FEM or ARM
samplers subject to a PEP audit at least once
every six years, which equates to
approximately 15 percent of the monitoring
sites audited each year.
3.2.4.3 Additional information
concerning the PEP is contained in reference
10 of this appendix. The calculations for
evaluating bias between the primary monitor
and the performance evaluation monitor for
PO 00000
Frm 00028
Fmt 4701
#Collocated
with an FRM
#Collocated
Sfmt 4702
1
2
3
4
5
6
1
1
2
2
3
3
#Collocated
with same
method
designation
0
1
1
2
2
3
PM2.5 are described in section 4.2.5 of this
appendix.
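As an illustrative aside, and not regulatory text, the PM2.5 PEP workload of section 3.2.4 reduces to a threshold on PQAO size; a minimal Python sketch with hypothetical names:

```python
# Illustrative sketch only, not regulatory text; names are hypothetical.
# Section 3.2.4: five valid PEP audits per year for PQAOs with five or
# fewer monitoring sites, eight for PQAOs with more than five sites.
def required_pm25_pep_audits(n_sites):
    return 5 if n_sites <= 5 else 8
```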
3.3 PM10.
3.3.1 Flow Rate Verification for PM10 Low
Volume Samplers (less than 200 liters/
minute). A one-point flow rate verification
check must be performed at least once every
month (each verification minimally separated
by 14 days) on each monitor used to measure
PM10. The verification is made by checking
the operational flow rate of the monitor. If
the verification is made in conjunction with
a flow rate adjustment, it must be made prior
to such flow rate adjustment. For the
standard procedure, use a flow rate transfer
standard certified in accordance with section
2.6 of this appendix to check the monitor’s
normal flow rate. Care should be taken in
selecting and using the flow rate
measurement device such that it does not
alter the normal operating flow rate of the
monitor. The percent differences between the
audit and measured flow rates are reported
to AQS and used to assess the bias of the
monitoring data as described in section 4.2.2
of this appendix (using flow rates in lieu of
concentrations).
3.3.2 Flow Rate Verification for PM10
High Volume Samplers (greater than 200
liters/minute). For PM10 high volume
samplers, the verification frequency is one
verification every 90 days (quarter) with four in a year. Other than verification frequency,
follow the same technical procedure as
described in section 3.3.1 of this appendix.
3.3.3 Semi-Annual Flow Rate Audit for
PM10. Audit the flow rate of the particulate
monitor twice a year. The two audits should
ideally be spaced between 5 and 7 months
apart. The EPA strongly encourages more
frequent auditing. The audit should
(preferably) be conducted by a trained
experienced technician other than the
routine site operator. The audit is made by
measuring the monitor’s normal operating
E:\FR\FM\11SEP2.SGM
11SEP2
tkelley on DSK3SPTVN1PROD with PROPOSALS2
Federal Register / Vol. 79, No. 176 / Thursday, September 11, 2014 / Proposed Rules
flow rate using a flow rate transfer standard
certified in accordance with section 2.6 of
this appendix. The flow rate standard used
for auditing must not be the same flow rate
standard used for verifications or to calibrate
the monitor. However, both the calibration
standard and the audit standard may be
referenced to the same primary flow rate or
volume standard. Care must be taken in
auditing the flow rate to be certain that the
flow measurement device does not alter the
normal operating flow rate of the monitor.
Report the audit flow rate of the transfer
standard and the corresponding flow rate
measured by the monitor to AQS. The
percent differences between these flow rates
are used to evaluate monitor performance.
3.3.4 Collocated Quality Control
Sampling Procedures for Manual PM10.
Collocated sampling for PM10 is only
required for manual samplers. For each pair
of collocated monitors, designate one
sampler as the primary monitor whose
concentrations will be used to report air
quality for the site and designate the other as
the quality control monitor.
3.3.4.1 For manual PM10 samplers, a
PQAO must:
(a) Have 15 percent of the primary
monitors collocated (values of 0.5 and greater
round up); and
(b) Have at least one collocated quality
control monitor (if the total number of
monitors is less than three).
3.3.4.2 The collocated quality control
monitors should be deployed according to
the following protocol:
(a) Fifty percent of the collocated quality
control monitors should be deployed at sites
with daily concentrations estimated to be
within ±20 percent of the applicable NAAQS
and the remainder at the PQAO's discretion;
(b) If an organization has no sites with
daily concentrations within ±20 percent of
the NAAQS, 50 percent of the collocated
quality control monitors should be deployed
at those sites with the daily mean
concentrations among the highest for all sites
in the network and the remainder at the
PQAO's discretion.
(c) The two collocated monitors must be
within 4 meters (inlet to inlet) of each other
and at least 2 meters apart for flow rates
greater than 200 liters/min or at least 1 meter
apart for samplers having flow rates less than
200 liters/min to preclude airflow
interference. A waiver allowing up to 10
meters horizontal distance and up to 3 meters
vertical distance (inlet to inlet) between a
primary and collocated sampler may be
approved by the Regional Administrator for
sites at a neighborhood or larger scale of
representation. This waiver may be approved
during the annual network plan approval
process. Calibration, sampling, and analysis
must be the same for both collocated
samplers and the same as for all other
samplers in the network.
(d) Sample the collocated quality control
monitor on a 1-in-12 day schedule. Report
the measurements from both primary and
collocated quality control monitors at each
collocated sampling site to AQS. The
calculations for evaluating precision between
the two collocated monitors are described in
section 4.2.1 of this appendix.
(e) In determining the number of collocated
quality control sites required for PM10,
monitoring networks for lead (Pb-PM10)
should be treated independently from
networks for particulate matter (PM), even
though the separate networks may share one
or more common samplers. However, a single
quality control monitor that meets the
collocation requirements for Pb-PM10 and
PM10 may serve as a collocated quality
control monitor for both networks. Extreme
care must be taken when using the filter from
a quality control monitor for both PM10 and
Pb analysis. A PM10 filter weighing should
occur prior to any Pb analysis.
3.4 Pb.
3.4.1 Flow Rate Verification for Pb–PM10
Low Volume Samplers (less than 200 liters/
minute). A one-point flow rate verification
check must be performed at least once every
month (each verification minimally separated
by 14 days) on each monitor used to measure
Pb. The verification is made by checking the
operational flow rate of the monitor. If the
verification is made in conjunction with a
flow rate adjustment, it must be made prior
to such flow rate adjustment. For the
standard procedure, use a flow rate transfer
standard certified in accordance with section
2.6 of this appendix to check the monitor’s
normal flow rate. Care should be taken in
selecting and using the flow rate
measurement device such that it does not
alter the normal operating flow rate of the
monitor. The percent differences between the
audit and measured flow rates are reported
to AQS and used to assess the bias of the
monitoring data as described in section 4.2.2
of this appendix (using flow rates in lieu of
concentrations).
3.4.2 Flow Rate Verification for Pb High
Volume Samplers (greater than 200 liters/
minute). For high volume samplers, the
verification frequency is one verification
every 90 days (quarter) with four in a year.
Other than verification frequency, follow the
same technical procedure as described in
section 3.4.1 of this appendix.
3.4.3 Semi-Annual Flow Rate Audit for
Pb. Audit the flow rate of the particulate
monitor twice a year. The two audits should
ideally be spaced between 5 and 7 months
apart. The EPA strongly encourages more
frequent auditing. The audit should
(preferably) be conducted by a trained
experienced technician other than the
routine site operator. The audit is made by
measuring the monitor’s normal operating
flow rate using a flow rate transfer standard
certified in accordance with section 2.6 of
this appendix. The flow rate standard used
for auditing must not be the same flow rate
standard used for verifications or to calibrate
the monitor. However, both the calibration
standard and the audit standard may be
referenced to the same primary flow rate or
volume standard. Care must be taken in
auditing the flow rate to be certain that the
flow measurement device does not alter the
normal operating flow rate of the monitor.
Report the audit flow rate of the transfer
standard and the corresponding flow rate
measured by the monitor to AQS. The
percent differences between these flow rates
are used to evaluate monitor performance.
3.4.4 Collocated Quality Control
Sampling for TSP Pb for monitoring sites
other than non-source NCore. For each pair
of collocated monitors for manual TSP Pb
samplers, designate one sampler as the
primary monitor whose concentrations will
be used to report air quality for the site, and
designate the other as the quality control
monitor.
3.4.4.1 A PQAO must:
(a) Have 15 percent of the primary
monitors (not counting non-source NCore
sites in the PQAO) collocated. Values of 0.5 and
greater round up; and
(b) Have at least one collocated quality
control monitor (if the total number of
monitors is less than three).
3.4.4.2 The collocated quality control
monitors should be deployed according to
the following protocol:
(a) The first collocated Pb site selected
must be the site measuring the highest Pb
concentrations in the network. If the site is
impractical, alternative sites, approved by the
EPA Regional Administrator, may be
selected. If additional collocated sites are
necessary, collocated sites may be chosen
that reflect average ambient air Pb
concentrations in the network.
(b) The two collocated monitors must be
within 4 meters (inlet to inlet) of each other
and at least 2 meters apart for flow rates
greater than 200 liters/min or at least 1 meter
apart for samplers having flow rates less than
200 liters/min to preclude airflow
interference.
(c) Sample the collocated quality control
monitor on a 1-in-12 day schedule. Report
the measurements from both primary and
collocated quality control monitors at each
collocated sampling site to AQS. The
calculations for evaluating precision between
the two collocated monitors are described in
section 4.2.1 of this appendix.
3.4.5 Collocated Quality Control
Sampling for Pb-PM10 at monitoring sites
other than non-source NCore. If a PQAO is
monitoring for Pb-PM10 at sites other than at
a non-source oriented NCore site then the
PQAO must:
3.4.5.1 Have 15 percent of the primary
monitors (not counting non-source NCore
sites in the PQAO) collocated. Values of 0.5 and
greater round up; and
3.4.5.2 Have at least one collocated
quality control monitor (if the total number
of monitors is less than three).
3.4.5.3 The collocated monitors should be
deployed according to the following protocol:
(a) Fifty percent of the collocated quality
control monitors should be deployed at sites
with the highest 3-month average
concentrations and the remainder at the
PQAO's discretion.
(b) The two collocated monitors must be
within 4 meters (inlet to inlet) of each other
and at least 2 meters apart for flow rates
greater than 200 liters/min or at least 1 meter
apart for samplers having flow rates less than
200 liters/min to preclude airflow
interference. A waiver allowing up to 10
meters horizontal distance and up to 3 meters
vertical distance (inlet to inlet) between a
primary and collocated sampler may be
approved by the Regional Administrator for
sites at a neighborhood or larger scale of
representation. This waiver may be approved
during the annual network plan approval
process. Calibration, sampling, and analysis must be the same for both collocated samplers and the same as for all other samplers in the network.

(c) Sample the collocated quality control monitor on a 1-in-12 day schedule. Report the measurements from both primary and collocated quality control monitors at each collocated sampling site to AQS. The calculations for evaluating precision between the two collocated monitors are described in section 4.2.1 of this appendix.

(d) In determining the number of collocated quality control sites required for Pb-PM10, monitoring networks for PM10 should be treated independently from networks for Pb-PM10, even though the separate networks may share one or more common samplers. However, a single quality control monitor that meets the collocation requirements for Pb-PM10 and PM10 may serve as a collocated quality control monitor for both networks. Extreme care must be taken when using the filter from a quality control monitor for both PM10 and Pb analysis. A PM10 filter weighing should occur prior to any Pb analysis.

3.4.6 Pb Analysis Audits. Each calendar quarter, audit the Pb reference or equivalent method analytical procedure using filters containing a known quantity of Pb. These audit filters are prepared by depositing a Pb standard on unexposed filters and allowing them to dry thoroughly. The audit samples must be prepared using batches of reagents different from those used to calibrate the Pb analytical equipment being audited. Prepare audit samples in the following concentration ranges:

Range    Equivalent ambient Pb concentration, μg/m3
1        30–100% of Pb NAAQS
2        200–300% of Pb NAAQS

(a) Extract the audit samples using the same extraction procedure used for exposed filters.
(b) Analyze three audit samples in each of the two ranges each quarter samples are analyzed. The audit sample analyses shall be distributed as much as possible over the entire calendar quarter.
(c) Report the audit concentrations (in μg Pb/filter or strip) and the corresponding measured concentrations (in μg Pb/filter or strip) to AQS using AQS unit code 077. The percent differences between the concentrations are used to calculate analytical accuracy as described in section 4.2.6 of this appendix.

3.4.7 Pb PEP Procedures for monitoring sites other than non-source NCore. The PEP is an independent assessment used to estimate total measurement system bias. These evaluations will be performed under the NPEP described in section 2.4 of this appendix or a comparable program. Each year, one performance evaluation audit must be performed at one Pb site in each primary quality assurance organization that has less than or equal to five sites and two audits at PQAOs with greater than five sites. Non-source oriented NCore sites are not counted. In addition, each year, four collocated samples from PQAOs with less than or equal to five sites and six collocated samples at PQAOs with greater than five sites must be sent to an independent laboratory, the same laboratory as the performance evaluation audit, for analysis. Siting of this PEP monitor should be consistent with section 3.4.5.4. However, any horizontal distance greater than 4 meters and any vertical distance greater than 1 meter must be reported to the EPA regional PEP coordinator. The calculations for evaluating bias between the primary monitor and the performance evaluation monitor for Pb are described in section 4.2.4 of this appendix.

4. Calculations for Data Quality Assessment

(a) Calculations of measurement uncertainty are carried out by the EPA according to the following procedures. The PQAOs must report the data to AQS for all measurement quality checks as specified in this appendix even though they may elect to perform some or all of the calculations in this section on their own.
(b) The EPA will provide annual assessments of data quality aggregated by site and PQAO for SO2, NO2, O3 and CO and by PQAO for PM10, PM2.5, and Pb.
(c) At low concentrations, agreement between the measurements of collocated quality control samplers, expressed as relative percent difference or percent difference, may be relatively poor. For this reason, collocated measurement pairs are selected for use in the precision and bias calculations only when both measurements are equal to or above the following limits:
(1) Pb: 0.002 μg/m3 (methods approved after 3/04/2010, with the exception of manual equivalent method EQLA–0813–803).
(2) Pb: 0.02 μg/m3 (methods approved before 3/04/2010, and manual equivalent method EQLA–0813–803).
(3) PM10 (Hi-Vol): 15 μg/m3.
(4) PM10 (Lo-Vol): 3 μg/m3.
(5) PM2.5: 3 μg/m3.

4.1 Statistics for the Assessment of QC Checks for SO2, NO2, O3 and CO.

4.1.1 Percent Difference. Many of the measurement quality checks start with a comparison of an audit concentration or value (flow rate) to the concentration/value measured by the monitor and use percent difference as the comparison statistic as described in equation 1 of this section. For each single point check, calculate the percent difference, di, as follows:

$d_i = \frac{\mathrm{meas} - \mathrm{audit}}{\mathrm{audit}} \times 100$  (Equation 1)

where, meas is the concentration indicated by the PQAO's instrument and audit is the audit concentration of the standard used in the QC check being measured.

4.1.2 Precision Estimate. The precision estimate is used to assess the one-point QC checks for SO2, NO2, O3, or CO described in section 3.1.1 of this appendix. The precision estimator is the coefficient of variation upper bound and is calculated using equation 2 of this section:

$CV = \sqrt{\frac{n\sum_{i=1}^{n} d_i^{2} - \left(\sum_{i=1}^{n} d_i\right)^{2}}{n(n-1)}} \cdot \sqrt{\frac{n-1}{\chi^{2}_{0.1,\,n-1}}}$  (Equation 2)

where, n is the number of single point checks being aggregated; $\chi^{2}_{0.1,\,n-1}$ is the 10th percentile of a chi-squared distribution with n−1 degrees of freedom.

4.1.3 Bias Estimate. The bias estimate is calculated using the one-point QC checks for SO2, NO2, O3, or CO described in section 3.1.1 of this appendix. The bias estimator is an upper bound on the mean absolute value of the percent differences as described in equation 3 of this section:

$|\mathrm{bias}| = AB + t_{0.95,\,n-1} \cdot \frac{AS}{\sqrt{n}}$  (Equation 3)

where, n is the number of single point checks being aggregated; $t_{0.95,\,n-1}$ is the 95th quantile of a t-distribution with n−1 degrees of freedom; the quantity AB is the mean of the absolute values of the di's and is calculated using equation 4 of this section:

$AB = \frac{1}{n}\sum_{i=1}^{n} |d_i|$  (Equation 4)

and the quantity AS is the standard deviation of the absolute values of the di's and is calculated using equation 5 of this section:

$AS = \sqrt{\frac{n\sum_{i=1}^{n} |d_i|^{2} - \left(\sum_{i=1}^{n} |d_i|\right)^{2}}{n(n-1)}}$  (Equation 5)

4.1.3.2 Calculate the 25th and 75th percentiles of the percent differences for each site. The absolute bias upper bound should be flagged as positive if both percentiles are positive and negative if both percentiles are negative.

4.2 Statistics for the Assessment of PM10, PM2.5 and Pb.

4.2.1 Collocated Quality Control Sampler Precision Estimate for PM10, PM2.5 and Pb. Precision is estimated via duplicate measurements from collocated samplers. A collocated data pair is considered valid only when both measurements are equal to or above the minimum values specified in section 4(c) of this appendix. For each collocated data pair, calculate the relative percent difference, di, using equation 6 of this appendix:

$d_i = \frac{X_i - Y_i}{(X_i + Y_i)/2} \times 100$  (Equation 6)

where, Xi is the concentration from the primary sampler and Yi is the concentration value from the audit sampler. The coefficient of variation upper bound is calculated using equation 7 of this appendix:

$CV = \sqrt{\frac{n\sum_{i=1}^{n} d_i^{2} - \left(\sum_{i=1}^{n} d_i\right)^{2}}{2n(n-1)}} \cdot \sqrt{\frac{n-1}{\chi^{2}_{0.1,\,n-1}}}$  (Equation 7)

where, n is the number of valid data pairs being aggregated, and $\chi^{2}_{0.1,\,n-1}$ is the 10th percentile of a chi-squared distribution with n−1 degrees of freedom. The factor of 2 in the denominator adjusts for the fact that each di is calculated from two values with error.

4.2.2 One-Point Flow Rate Verification Bias Estimate for PM10, PM2.5 and Pb. For each one-point flow rate verification, calculate the percent difference in volume using equation 1 of this appendix, where meas is the value indicated by the sampler's volume measurement and audit is the actual volume indicated by the auditing flow meter. The absolute volume bias upper bound is then calculated using equation 3, where n is the number of flow rate audits being aggregated; $t_{0.95,\,n-1}$ is the 95th quantile of a t-distribution with n−1 degrees of freedom; the quantity AB is the mean of the absolute values of the di's and is calculated using equation 4 of this appendix; and the quantity AS in equation 3 of this appendix is the standard deviation of the absolute values of the di's and is calculated using equation 5 of this appendix.

4.2.3 Semi-Annual Flow Rate Audit Bias Estimate for PM10, PM2.5 and Pb. Use the same procedure described in section 4.2.2 for the evaluation of flow rate audits.

4.2.4 Performance Evaluation Programs Bias Estimate for Pb. The Pb bias estimate is calculated using the paired routine and the PEP monitor as described in section 3.4.7. Use the same procedures as described in section 4.1.3 of this appendix.

4.2.5 Performance Evaluation Programs Bias Estimate for PM2.5. The bias estimate is calculated using the PEP audits described in section 3.2.4 of this appendix. The bias estimator is based on the mean percent differences (equation 1). The mean percent difference, D, is calculated using equation 8 below:

$D = \frac{1}{n_j}\sum_{i=1}^{n_j} d_i$  (Equation 8)

where, nj is the number of pairs and d1, d2, ... dnj are the biases for each pair to be averaged.

4.2.6 Pb Analysis Audit Bias Estimate. The bias estimate is calculated using the analysis audit data described in section 3.4.6. Use the same bias estimate procedure as described in section 4.1.3 of this appendix.
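For illustration only, and not part of the proposed regulatory text, the statistics above map directly onto standard scientific Python. The sketch below implements equations 1 through 8 as written above, using scipy.stats for the chi-squared and t quantiles; all function names are hypothetical.

```python
# Illustrative sketch only, not part of the proposed regulatory text.
# Implements equations 1 through 8 above; names are hypothetical.
# Each aggregation assumes at least two checks or pairs (n >= 2).
import numpy as np
from scipy import stats

def percent_difference(meas, audit):
    """Equation 1: single-check percent difference d_i."""
    return (meas - audit) / audit * 100.0

def cv_upper_bound(d):
    """Equation 2: upper bound on the coefficient of variation of the
    one-point QC percent differences (section 4.1.2)."""
    d = np.asarray(d, dtype=float)
    n = d.size
    cv = np.sqrt((n * np.sum(d**2) - np.sum(d)**2) / (n * (n - 1)))
    # chi2.ppf(0.1, n-1): 10th percentile of chi-squared with n-1 df
    return cv * np.sqrt((n - 1) / stats.chi2.ppf(0.1, n - 1))

def bias_upper_bound(d):
    """Equations 3-5: upper bound on the mean absolute percent
    difference (section 4.1.3)."""
    a = np.abs(np.asarray(d, dtype=float))
    n = a.size
    ab = a.mean()                                                 # Eq. 4
    as_ = np.sqrt((n * np.sum(a**2) - np.sum(a)**2) / (n * (n - 1)))  # Eq. 5
    return ab + stats.t.ppf(0.95, n - 1) * as_ / np.sqrt(n)       # Eq. 3

def collocated_cv_upper_bound(x, y, floor):
    """Equations 6-7: collocated precision for PM10, PM2.5, and Pb.
    Pairs below the section 4(c) floor are excluded first."""
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    keep = (x >= floor) & (y >= floor)
    x, y = x[keep], y[keep]
    d = (x - y) / ((x + y) / 2.0) * 100.0                         # Eq. 6
    n = d.size
    cv = np.sqrt((n * np.sum(d**2) - np.sum(d)**2) / (2 * n * (n - 1)))
    return cv * np.sqrt((n - 1) / stats.chi2.ppf(0.1, n - 1))     # Eq. 7

def mean_percent_difference(d):
    """Equation 8: mean of the pairwise biases (section 4.2.5)."""
    return float(np.mean(np.asarray(d, dtype=float)))
```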
5. Reporting Requirements
5.1 Reporting Requirements. For each
pollutant, prepare a list of all monitoring
sites and their AQS site identification codes
in each PQAO and submit the list to the
appropriate EPA regional office, with a copy
to AQS. Whenever there is a change in this
list of monitoring sites in a PQAO, report this
change to the EPA regional office and to
AQS.
5.1.1 Quarterly Reports. For each quarter,
each PQAO shall report to AQS directly (or
via the appropriate EPA regional office for
organizations not direct users of AQS) the
results of all valid measurement quality
checks it has carried out during the quarter.
The quarterly reports must be submitted
consistent with the data reporting
requirements specified for air quality data as
set forth in 40 CFR 58.16. The EPA strongly
encourages early submission of the quality
assurance data in order to assist the PQAO's
ability to control and evaluate the quality of
the ambient air data.
5.1.2 Annual Reports.
5.1.2.1 When the PQAO has certified
relevant data for the calendar year, the EPA
will calculate and report the measurement
uncertainty for the entire calendar year.
6.0 References
(1) American National Standard—
Specifications and Guidelines for
Quality Systems for Environmental Data
Collection and Environmental
Technology Programs. ANSI/ASQC E4–
2004. February 2004. Available from
American Society for Quality Control,
611 East Wisconsin Avenue, Milwaukee,
WI 53202.
(2) EPA Requirements for Quality
Management Plans. EPA QA/R–2. EPA/
240/B–01/002. March 2001, Reissue May
2006. Office of Environmental
Information, Washington DC 20460.
https://www.epa.gov/quality/qs-docs/r2final.pdf.
(3) EPA Requirements for Quality Assurance
Project Plans for Environmental Data
Operations. EPA QA/R–5. EPA/240/B–
01/003. March 2001, Reissue May 2006.
Office of Environmental Information,
Washington DC 20460. https://
www.epa.gov/quality/qs-docs/r5final.pdf.
(4) EPA Traceability Protocol for Assay and
Certification of Gaseous Calibration
Standards. EPA–600/R–12/531. May,
2012. Available from U.S. Environmental
Protection Agency, National Risk
Management Research Laboratory,
Research Triangle Park NC 27711. https://
www.epa.gov/nrmrl/appcd/mmd/dbtraceability-protocol.html.
(5) Guidance for the Data Quality Objectives
Process. EPA QA/G–4. EPA/240/B–06/
001. February, 2006. Office of
Environmental Information, Washington
DC 20460. https://www.epa.gov/quality/
qs-docs/g4-final.pdf.
(6) List of Designated Reference and
Equivalent Methods. Available from U.S.
Environmental Protection Agency,
National Exposure Research Laboratory,
Human Exposure and Atmospheric
Sciences Division, MD–D205–03,
Research Triangle Park, NC 27711.
https://www.epa.gov/ttn/amtic/
criteria.html.
(7) Transfer Standards for the Calibration of
Ambient Air Monitoring Analyzers for
Ozone. EPA–454/B–13–004 U.S.
Environmental Protection Agency,
Research Triangle Park, NC 27711,
October, 2013. https://www.epa.gov/ttn/
amtic/qapollutant.html.
(8) Paur, R.J. and F.F. McElroy. Technical
Assistance Document for the Calibration
of Ambient Ozone Monitors. EPA–600/
4–79–057. U.S. Environmental
Protection Agency, Research Triangle
Park, NC 27711, September, 1979. https://
www.epa.gov/ttn/amtic/cpreldoc.html.
(9) Quality Assurance Handbook for Air
Pollution Measurement Systems, Volume
1–A Field Guide to Environmental
Quality Assurance. EPA–600/R–94/038a.
April 1994. Available from U.S.
Environmental Protection Agency, ORD
Publications Office, Center for
Environmental Research Information
(CERI), 26 W. Martin Luther King Drive,
Cincinnati, OH 45268. https://
www.epa.gov/ttn/amtic/qabook.html.
(10) Quality Assurance Handbook for Air
Pollution Measurement Systems, Volume
II: Ambient Air Quality Monitoring
Program Quality System Development.
EPA–454/B–13–003. https://
www.epa.gov/ttn/amtic/qabook.html.
(11) National Performance Evaluation
Program Standard Operating Procedures.
https://www.epa.gov/ttn/amtic/
npapsop.html.
TABLE A–1 OF APPENDIX A TO PART 58—MINIMUM DATA ASSESSMENT REQUIREMENTS FOR NAAQS RELATED CRITERIA
POLLUTANT MONITORS
Method | Assessment method | Coverage | Minimum frequency | Parameters reported | AQS assessment type

Gaseous Methods (CO, NO2, SO2, O3)

1-Point QC for SO2, NO2, O3, CO | Response check at concentration 0.005–0.08 ppm SO2, NO2, O3, and 0.5 and 5 ppm CO | Each analyzer | Once per 2 weeks | Audit concentration 1 and measured concentration 2 | 1-Point QC.
Annual performance evaluation for SO2, NO2, O3, CO | See section 3.1.2 of this appendix | Each analyzer | Once per year | Audit concentration 1 and measured concentration 2 for each level | Annual PE.
NPAP for SO2, NO2, O3, CO | Independent Audit | 20% of sites each year | Once per year | Audit concentration 1 and measured concentration 2 for each level | NPAP.

Particulate Methods

Manual method-collocated quality control sampling PM10, PM2.5, Pb-TSP, Pb-PM10 | Collocated samplers | 15% | 1-in-12 days | Primary sampler concentration and duplicate sampler concentration 3 | No Transaction, reported as raw data.
Continuous 4 method-collocated quality control sampling PM2.5 | Collocated samplers | 15% | 1-in-12 days | Primary sampler concentration and duplicate sampler concentration 3 | No Transaction, reported as raw data.
Flow rate verification PM10 (low Vol), PM2.5, Pb-PM10 | Check of sampler flow rate | Each sampler | Once every month | Audit flow rate and measured flow rate indicated by the sampler | Flow Rate Verification.
Flow rate verification PM10 (High-Vol), Pb-TSP | Check of sampler flow rate | Each sampler | Once every quarter | Audit flow rate and measured flow rate indicated by the sampler | Flow Rate Verification.
Semi-annual flow rate audit PM10, TSP, PM10-2.5, PM2.5, Pb-TSP, Pb-PM10 | Check of sampler flow rate using independent standard | Each sampler | Once every 6 months | Audit flow rate and measured flow rate indicated by the sampler | Semi Annual Flow Rate Audit.
Pb analysis audits Pb-TSP, Pb-PM10 | Check of analytical system with Pb audit strips/filters | Analytical | Once each quarter | Measured value and audit value (μg Pb/filter) using AQS unit code 077 | Pb Analysis Audits.
Performance Evaluation Program PM2.5 | Collocated samplers | (1) 5 valid audits for primary QA orgs, with <=5 sites. (2) 8 valid audits for primary QA orgs, with >5 sites. (3) All samplers in 6 years | Distributed over all 4 quarters | Primary sampler concentration and performance evaluation sampler concentration | PEP.
Performance Evaluation Program Pb-TSP, Pb-PM10 | Collocated samplers | (1) 1 valid audit and 4 collocated samples for primary QA orgs, with <=5 sites. (2) 2 valid audits and 6 collocated samples for primary QA orgs with >5 sites | Distributed over all 4 quarters | Primary sampler concentration and performance evaluation sampler concentration | PEP.

1 Effective concentration for open path analyzers.
2 Corrected concentration, if applicable, for open path analyzers.
3 Both primary and collocated sampler values are reported as raw data.
4 PM2.5 is the only particulate criteria pollutant requiring collocation of continuous and manual primary monitors.
TABLE A–2 OF APPENDIX A TO PART 58—SUMMARY OF PM2.5 NUMBER AND TYPE OF COLLOCATION (15% COLLOCATION
REQUIREMENT) REQUIRED USING AN EXAMPLE OF A PQAO THAT HAS 54 PRIMARY MONITORS (54 SITES) WITH ONE
FEDERAL REFERENCE METHOD TYPE AND THREE TYPES OF APPROVED FEDERAL EQUIVALENT METHODS
Primary sampler method designation | Total number of monitors | Total number of collocated | Number of collocated with same method designation as primary | Number of collocated with FRM
FRM | 20 | 3 | 3 | 3
FEM (A) | 20 | 3 | 2 | 1
FEM (B) | 2 | 1 | 1 | 0
FEM (C) | 12 | 2 | 1 | 1
■ 10. Add Appendix B to part 58 to read as follows:
Appendix B to Part 58—Quality
Assurance Requirements for Prevention
of Significant Deterioration (PSD) Air
Monitoring
1. General Information
2. Quality System Requirements
3. Measurement Quality Check
Requirements
4. Calculations for Data Quality
Assessments
5. Reporting Requirements
6. References
1. General Information
1.1 Applicability.
(a) This appendix specifies the minimum
quality assurance requirements for the
control and assessment of the quality of the
ambient air monitoring data submitted to a
PSD reviewing authority or the EPA by an
organization operating an air monitoring
station, or network of stations, operated in
order to comply with Part 51 New Source
Review—Prevention of Significant
Deterioration (PSD). Such organizations are
encouraged to develop and maintain quality
assurance programs more extensive than the
required minimum. Additional guidance for
the requirements reflected in this appendix
can be found in the ‘‘Quality Assurance
Handbook for Air Pollution Measurement
Systems,’’ Volume II (Ambient Air) and
‘‘Quality Assurance Handbook for Air
Pollution Measurement Systems,’’ Volume IV
(Meteorological Measurements) and at a
national level in references 1, 2, and 3 of this
appendix.
(b) It is not assumed that data generated for
PSD under this appendix will be used in
making NAAQS decisions. However, if all
the requirements in this appendix are
followed (including the NPEP programs) and
reported to AQS, with review and
concurrence from the EPA region, data may
be used for NAAQS decisions. With the
exception of the NPEP programs (NPAP,
PM2.5 PEP, Pb–PEP) for which
implementation is at the discretion of the
PSD reviewing authority, all other quality
assurance and quality control requirements
found in the appendix must be met.
1.2 PSD Primary Quality Assurance
Organization (PQAO). A PSD PQAO is
defined as a monitoring organization or a
coordinated aggregation of such
organizations that is responsible for a set of
stations within one reviewing authority that
monitors the same pollutant and for which
data quality assessments will be pooled. Each
criteria pollutant/monitor must be associated
with only one PSD PQAO.
1.2.1 Each PSD PQAO shall be defined
such that measurement uncertainty among all
stations in the organization can be expected
to be reasonably homogeneous, as a result of
common factors. A PSD PQAO must be
associated with only one PSD reviewing
authority. Common factors that should be
considered in defining PSD PQAOs include:
(a) Operation by a common team of field
operators according to a common set of
procedures;
(b) Use of a common QAPP and/or
standard operating procedures;
(c) Common calibration facilities and
standards;
(d) Oversight by a common quality
assurance organization; and
(e) Support by a common management
organization or laboratory.
1.2.2 PSD monitoring organizations
having difficulty describing their PQAO or in
assigning specific monitors to a PSD PQAO
should consult with the reviewing authority.
Any consolidation of PSD PQAOs shall be
subject to final approval by the PSD
reviewing authority.
1.2.3 Each PSD PQAO is required to
implement a quality system that provides
sufficient information to assess the quality of
the monitoring data. The quality system
must, at a minimum, include the specific
requirements described in this appendix.
Failure to conduct or pass a required check
or procedure, or a series of required checks
or procedures, does not by itself invalidate
data for regulatory decision making. Rather,
PSD PQAOs and the PSD reviewing authority
shall use the checks and procedures required
in this appendix in combination with other
data quality information, reports, and similar
documentation that demonstrate overall
compliance with parts 51, 52 and 58 of this
chapter. Accordingly, the PSD reviewing
authority shall use a ‘‘weight of evidence’’
approach when determining the suitability of
data for regulatory decisions. The PSD
reviewing authority reserves the authority to
use or not use monitoring data submitted by
a PSD monitoring organization when making
regulatory decisions based on the PSD
reviewing authority’s assessment of the
quality of the data. Generally, consensus
built validation templates or validation
criteria already approved in quality
assurance project plans (QAPPs) should be
used as the basis for the weight of evidence
approach.
1.3 Definitions.
(a) Measurement Uncertainty. A term used
to describe deviations from a true
concentration or estimate that are related to
the measurement process and not to spatial
or temporal population attributes of the air
being measured.
(b) Precision. A measurement of mutual
agreement among individual measurements
of the same property usually under
prescribed similar conditions, expressed
generally in terms of the standard deviation.
(c) Bias. The systematic or persistent
distortion of a measurement process which
causes errors in one direction.
(d) Accuracy. The degree of agreement
between an observed value and an accepted
reference value. Accuracy includes a
combination of random error (imprecision)
and systematic error (bias) components
which are due to sampling and analytical
operations.
(e) Completeness. A measure of the amount
of valid data obtained from a measurement
system compared to the amount that was
expected to be obtained under correct,
normal conditions.
(f) Detectability. The low critical range
value of a characteristic that a method
specific procedure can reliably discern.
1.4 Measurement Quality Check
Reporting. The measurement quality checks
described in section 3 of this appendix, are
required to be submitted to the PSD
reviewing authority within the same time
frame as routinely-collected ambient
concentration data as described in 40 CFR
58.16. The PSD reviewing authority may also
require that the measurement quality check
data be reported to AQS.
1.5 Assessments and Reports. Periodic
assessments and documentation of data
quality are required to be reported to the PSD
reviewing authority. To provide national
uniformity in this assessment and reporting
of data quality for all networks, specific
assessment and reporting procedures are
prescribed in detail in sections 3, 4, and 5 of
this appendix.
2. Quality System Requirements
A quality system (reference 1 of this
appendix) is the means by which an
organization manages the quality of the
monitoring information it produces in a
systematic, organized manner. It provides a
framework for planning, implementing,
assessing and reporting work performed by
an organization and for carrying out required
quality assurance and quality control
activities.
2.1 Quality Assurance Project Plans. All
PSD PQAOs must develop a quality system
that is described and approved in quality
assurance project plans (QAPP) to ensure that
the monitoring results:
(a) Meet a well-defined need, use, or
purpose (reference 5 of this appendix);
(b) Provide data of adequate quality for the
intended monitoring objectives;
(c) Satisfy stakeholder expectations;
(d) Comply with applicable standards
specifications;
(e) Comply with statutory (and other legal)
requirements; and
(f) Assure quality assurance and quality
control adequacy and independence.
2.1.1 The QAPP is a formal document
that describes these activities in sufficient
detail and is supported by standard operating
procedures. The QAPP must describe how
the organization intends to control
measurement uncertainty to an appropriate
level in order to achieve the objectives for
which the data are collected. The QAPP must
be documented in accordance with EPA
requirements (reference 3 of this appendix).
2.1.2 The PSD PQAO’s quality system
must have adequate resources both in
personnel and funding to plan, implement,
assess and report on the achievement of the
requirements of this appendix and its
approved QAPP.
2.1.3 Incorporation of quality
management plan (QMP) elements into the
QAPP. The QMP describes the quality system
in terms of the organizational structure,
functional responsibilities of management
and staff, lines of authority, and required
interfaces for those planning, implementing,
assessing and reporting activities involving
environmental data operations (EDO). The
PSD PQAOs may combine pertinent elements
of the QMP into the QAPP rather than
requiring the submission of both QMP and
QAPP documents separately, with prior
approval of the PSD reviewing authority.
Additional guidance on QMPs can be found
in reference 2 of this appendix.
2.2 Independence of Quality Assurance
Management. The PSD PQAO must provide
for a quality assurance management function
for its PSD data collection operation, that
aspect of the overall management system of
the organization that determines and
implements the quality policy defined in a
PSD PQAO’s QAPP. Quality management
includes strategic planning, allocation of
resources and other systematic planning
activities (e.g., planning, implementation,
assessing and reporting) pertaining to the
quality system. The quality assurance
management function must have sufficient
technical expertise and management
authority to conduct independent oversight
and assure the implementation of the
organization’s quality system relative to the
ambient air quality monitoring program and
should be organizationally independent of
environmental data generation activities.
2.3 Data Quality Performance
Requirements.
2.3.1 Data Quality Objectives (DQOs).
The DQOs, or the results of other systematic
planning processes, are statements that
define the appropriate type of data to collect
and specify the tolerable levels of potential
decision errors that will be used as a basis
for establishing the quality and quantity of
data needed to support air monitoring
objectives (reference 5 of this appendix). The
DQOs have been developed by the EPA to
support attainment decisions for comparison
to national ambient air quality standards
(NAAQS). The reviewing authority and the
PSD monitoring organization will be jointly
responsible for determining whether
adherence to the EPA developed NAAQS
DQOs specified in appendix A of this part are
appropriate or if DQOs from a project-specific systematic planning process are
necessary.
2.3.1.1 Measurement Uncertainty for
Automated and Manual PM2.5 Methods. The
goal for acceptable measurement uncertainty
for precision is defined as an upper 90
percent confidence limit for the coefficient of
variation (CV) of 10 percent and plus or
minus 10 percent for total bias.
2.3.1.2 Measurement Uncertainty for
Automated Ozone Methods. The goal for
acceptable measurement uncertainty is
defined for precision as an upper 90 percent
confidence limit for the CV of 7 percent and
for bias as an upper 95 percent confidence
limit for the absolute bias of 7 percent.
2.3.1.3 Measurement Uncertainty for Pb
Methods. The goal for acceptable
measurement uncertainty is defined for
precision as an upper 90 percent confidence
limit for the CV of 20 percent and for bias
as an upper 95 percent confidence limit for
the absolute bias of 15 percent.
2.3.1.4 Measurement Uncertainty for
NO2. The goal for acceptable measurement
uncertainty is defined for precision as an
upper 90 percent confidence limit for the CV
of 15 percent and for bias as an upper 95
percent confidence limit for the absolute bias
of 15 percent.
2.3.1.5 Measurement Uncertainty for SO2.
The goal for acceptable measurement
uncertainty for precision is defined as an
upper 90 percent confidence limit for the CV
of 10 percent and for bias as an upper 95
percent confidence limit for the absolute bias
of 10 percent.
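For convenience, the goals in sections 2.3.1.1 through 2.3.1.5 can be summarized as follows (this simply restates the paragraphs above; the CV and bias statistics are the upper confidence limits defined there):

Pollutant | Precision goal (CV, upper 90% confidence limit) | Bias goal
PM2.5 | 10 percent | plus or minus 10 percent (total bias)
O3 | 7 percent | 7 percent (absolute bias, upper 95% confidence limit)
Pb | 20 percent | 15 percent (absolute bias, upper 95% confidence limit)
NO2 | 15 percent | 15 percent (absolute bias, upper 95% confidence limit)
SO2 | 10 percent | 10 percent (absolute bias, upper 95% confidence limit)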
2.4 National Performance Evaluation
Program. Organizations operating PSD
monitoring networks are required to
implement the EPA’s national performance
evaluation program (NPEP) if the data will be
used for NAAQS decisions and at the
discretion of the PSD reviewing authority if
PSD data is not used for NAAQS decisions.
The NPEP includes the National Performance
Audit Program (NPAP), the PM2.5
Performance Evaluation Program (PM2.5-PEP)
and the Pb Performance Evaluation Program
(Pb-PEP). The PSD QAPP shall provide for
the implementation of NPEP including the
provision of adequate resources for such
audit programs. Contact the PSD reviewing
authority to determine the best procedure for
implementing the audits which may include
an audit by the PSD reviewing authority, a
contractor certified for the activity, or
through self-implementation which is
described in sections below. A determination
of which entity will be performing this audit
program should be made as early as possible
and during the QAPP development process.
PSD PQAOs that plan to implement these
programs themselves (self-implement) rather
than use the federal programs, including
contractors implementing the programs on
behalf of PSD PQAOs, must meet the
adequacy requirements found in the
appropriate sections that follow, as well as
meet the definition of independent
assessment that follows.
2.4.1 Independent Assessment. An
assessment performed by a qualified
individual, group, or organization that is not
part of the organization directly performing
and accountable for the work being assessed.
This auditing organization must not be
involved with the generation of the routinely-collected ambient air monitoring data. An
organization can conduct the performance
evaluation (PE) if it can meet this definition
and has a management structure that, at a
minimum, will allow for the separation of its
routine sampling personnel from its auditing
personnel by two levels of management. In
addition, the sample analysis of audit filters
must be performed by a laboratory facility
and laboratory equipment separate from the
facilities used for routine sample analysis.
Field and laboratory personnel will be
required to meet the performance evaluation
field and laboratory training and certification
requirements. The PSD PQAO will be
required to participate in the centralized field
and laboratory standards certification and
comparison processes to establish
comparability to federally implemented
programs.
2.5 Technical Systems Audit Program.
The PSD reviewing authority or the EPA
may conduct system audits of the ambient air
monitoring programs or organizations
operating PSD networks. The PSD monitoring
organizations shall consult with the PSD
reviewing authority to verify the schedule of
any such technical systems audit. Systems
audit programs are described in reference 10
of this appendix.
2.6 Gaseous and Flow Rate Audit
Standards.
2.6.1 Gaseous pollutant concentration
standards (permeation devices or cylinders of
compressed gas) used to obtain test
concentrations for carbon monoxide (CO),
sulfur dioxide (SO2), nitrogen oxide (NO),
and nitrogen dioxide (NO2) must be traceable
to either a National Institute of Standards and
Technology (NIST) Traceable Reference
Material (NTRM) or a NIST-certified Gas
Manufacturer’s Internal Standard (GMIS),
certified in accordance with one of the
procedures given in reference 4 of this
appendix. Vendors advertising certification
with the procedures provided in reference 4
of this appendix and distributing gases as
‘‘EPA Protocol Gas’’ must participate in the
EPA Protocol Gas Verification Program or not
use ‘‘EPA’’ in any form of advertising. The
PSD PQAOs must provide information to the
PSD reviewing authority on the gas vendors
they use (or will use) for the duration of the
PSD monitoring project. This information can
be provided in the QAPP or monitoring plan,
but must be updated if there is a change in
the producer used.
2.6.2 Test concentrations for ozone (O3)
must be obtained in accordance with the
ultraviolet photometric calibration procedure
specified in appendix D to part 50, and by
means of a certified NIST-traceable O3
transfer standard. Consult references 7 and 8
of this appendix for guidance on transfer
standards for O3.
2.6.3 Flow rate measurements must be
made by a flow measuring instrument that is
NIST-traceable to an authoritative volume or
other applicable standard. Guidance for
certifying some types of flow-meters is
provided in reference 10 of this appendix.
2.7 Primary Requirements and Guidance.
Requirements and guidance documents for
developing the quality system are contained
in references 1 through 11 of this appendix,
which also contain many suggested
procedures, checks, and control
specifications. Reference 10 describes
specific guidance for the development of a
quality system for data collected for
comparison to the NAAQS. Many specific
quality control checks and specifications for
methods are included in the respective
reference methods described in part 50 or in
the respective equivalent method
descriptions available from the EPA
(reference 6 of this appendix). Similarly,
quality control procedures related to
specifically designated reference and
equivalent method monitors are contained in
the respective operation or instruction
manuals associated with those monitors. For
PSD monitoring, the use of reference and
equivalent method monitors is required.
3. Measurement Quality Check
Requirements
This section provides the requirements for
PSD PQAOs to perform the measurement
quality checks that can be used to assess data
quality. Data from these checks are required
to be submitted to the PSD reviewing
authority within the same time frame as
routinely-collected ambient concentration
data as described in 40 CFR 58.16. Table B–
1 of this appendix provides a summary of the
types and frequency of the measurement
quality checks that are described in this
section. Reporting these results to AQS may
be required by the PSD reviewing authority.
3.1 Gaseous monitors of SO2, NO2, O3, and
CO.
3.1.1 One-Point Quality Control (QC)
Check for SO2, NO2, O3, and CO. (a) A one-point QC check must be performed at least
once every 2 weeks on each automated
monitor used to measure SO2, NO2, O3 and
CO. With the advent of automated calibration
systems, more frequent checking is strongly
encouraged and may be required by the PSD
reviewing authority. See Reference 10 of this
appendix for guidance on the review
procedure. The QC check is made by
challenging the monitor with a QC check gas
of known concentration (effective
concentration for open path monitors)
between the prescribed range of 0.005 and
0.08 parts per million (ppm) for SO2, NO2,
and O3, and between the prescribed range of
0.5 and 5 ppm for CO monitors. The QC
check gas concentration selected within the
prescribed range must be related to the mean
or median of the ambient air concentrations
normally measured at sites within the PSD
monitoring network in order to appropriately
reflect the precision and bias at these routine
concentration ranges. If the mean or median
concentrations at the sites are below or above
the prescribed range, select the lowest or
highest concentration in the range. An
additional QC check point is encouraged for
those organizations that may have occasional
high values or would like to confirm the
monitors’ linearity at the higher end of the
operational range.
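The selection logic just described amounts to clamping the typical ambient level to the prescribed range; the following Python sketch encodes it (illustrative only; the function name and structure are assumptions, not part of the rule text):

    def select_qc_check_concentration(site_median_ppm, range_low_ppm, range_high_ppm):
        """Pick a one-point QC check gas concentration per section 3.1.1(a):
        use the mean or median ambient level, clamped to the prescribed
        range (0.005-0.08 ppm for SO2, NO2, O3; 0.5-5 ppm for CO)."""
        return min(max(site_median_ppm, range_low_ppm), range_high_ppm)

    # Example: a site with a median O3 of 0.002 ppm is below the prescribed
    # range, so the lowest concentration in the range (0.005 ppm) is used.
    assert select_qc_check_concentration(0.002, 0.005, 0.08) == 0.005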
(b) Point analyzers must operate in their
normal sampling mode during the QC check
and the test atmosphere must pass through
all filters, scrubbers, conditioners and other
components used during normal ambient
sampling and as much of the ambient air
inlet system as is practicable. The QC check
must be conducted before any calibration or
adjustment to the monitor.
(c) Open-path monitors are tested by
inserting a test cell containing a QC check gas
concentration into the optical measurement
beam of the instrument. If possible, the
normally used transmitter, receiver, and as
appropriate, reflecting devices should be
used during the test and the normal
monitoring configuration of the instrument
should be altered as little as possible to
accommodate the test cell for the test.
However, if permitted by the associated
operation or instruction manual, an alternate
local light source or an alternate optical path
that does not include the normal atmospheric
monitoring path may be used. The actual
concentration of the QC check gas in the test
cell must be selected to produce an effective
concentration in the range specified earlier in
this section. Generally, the QC test
concentration measurement will be the sum
of the atmospheric pollutant concentration
and the QC test concentration. As such, the
result must be corrected to remove the
atmospheric concentration contribution. The
corrected concentration is obtained by
subtracting the average of the atmospheric
concentrations measured by the open path
instrument under test immediately before
and immediately after the QC test from the
QC check gas concentration measurement. If
the difference between these before and after
measurements is greater than 20 percent of
the effective concentration of the test gas,
discard the test result and repeat the test. If
possible, open path monitors should be
tested during periods when the atmospheric
pollutant concentrations are relatively low
and steady.
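Stated symbolically (a restatement of the correction just described, with Cbefore and Cafter denoting the ambient readings taken immediately before and after the test, Ctest the reading with the test cell inserted, and Ceff the effective concentration of the test gas):

\[ C_{\mathrm{corrected}} = C_{\mathrm{test}} - \frac{C_{\mathrm{before}} + C_{\mathrm{after}}}{2}, \qquad \text{valid only if } \lvert C_{\mathrm{before}} - C_{\mathrm{after}} \rvert \le 0.20\, C_{\mathrm{eff}} \]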
(d) Report the audit concentration of the
QC gas and the corresponding measured
concentration indicated by the monitor. The
percent differences between these
concentrations are used to assess the
precision and bias of the monitoring data as
described in sections 4.1.2 (precision) and
4.1.3 (bias) of this appendix.
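Although this appendix defers the calculations to section 4, the percent difference statistic referenced here presumably takes the same form as equation 1 of appendix A to this part (a reconstruction from the definitions in that appendix, not the official equation text):

\[ d_i = \frac{\mathrm{meas} - \mathrm{audit}}{\mathrm{audit}} \times 100 \]

where meas is the concentration indicated by the monitor and audit is the known concentration of the QC check gas.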
3.1.2 Quarterly performance evaluation
for SO2, NO2, O3, or CO. Evaluate each
primary monitor each calendar quarter
during which monitors are operated or at least
once (if operated for less than one quarter).
The quarterly performance evaluation
(quarterly PE) must be performed by a
qualified individual, group, or organization
that is not part of the organization directly
performing and accountable for the work
being assessed. The person or entity
performing the quarterly PE must not be
involved with the generation of the routinely-collected ambient air monitoring data. A PSD
monitoring organization can conduct the
quarterly PE itself if it can meet this
definition and has a management structure
that, at a minimum, will allow for the
separation of its routine sampling personnel
from its auditing personnel by two levels of
management. The quarterly PE also requires
a set of equipment and standards
independent from those used for routine
calibrations or zero, span or precision checks.
The PE personnel will be required to meet PE
training and certification requirements.
3.1.2.1 The evaluation is made by
challenging the monitor with audit gas
standards of known concentration from at
least three audit levels. Two of the audit
levels selected will represent a range of
10–80 percent of the typical ambient air
concentrations either measured by the
monitor or in the PQAOs network of
monitors. The third point should be at the
NAAQS level or above the highest
anticipated routine hourly concentration,
whichever is greater. An additional 4th level
is encouraged for those PSD organizations
that would like to confirm the monitor’s
linearity at the higher end of the operational
range. In rare circumstances, there may be
sites measuring concentrations above audit
level 10. These sites should be identified to
the PSD reviewing authority.
Concentration range, ppm

Audit level | O3 | SO2 | NO2 | CO
1 | 0.004–0.0059 | 0.0003–0.0029 | 0.0003–0.0029 | 0.020–0.059
2 | 0.006–0.019 | 0.0030–0.0049 | 0.0030–0.0049 | 0.060–0.199
3 | 0.020–0.039 | 0.0050–0.0079 | 0.0050–0.0079 | 0.200–0.899
4 | 0.040–0.069 | 0.0080–0.0199 | 0.0080–0.0199 | 0.900–2.999
5 | 0.070–0.089 | 0.0200–0.0499 | 0.0200–0.0499 | 3.000–7.999
6 | 0.090–0.119 | 0.0500–0.0999 | 0.0500–0.0999 | 8.000–15.999
7 | 0.120–0.139 | 0.1000–0.1499 | 0.1000–0.2999 | 16.000–30.999
8 | 0.140–0.169 | 0.1500–0.2599 | 0.3000–0.4999 | 31.000–39.999
9 | 0.170–0.189 | 0.2600–0.7999 | 0.5000–0.7999 | 40.000–49.999
10 | 0.190–0.259 | 0.8000–1.000 | 0.8000–1.000 | 50.000–60.000
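For example (hypothetical values), for an O3 monitor that typically reads about 0.040 ppm, 10–80 percent of the typical concentration spans 0.004–0.032 ppm, so the two lower audit points could be drawn from audit levels 1 through 3 of the table above, with the third point at or above the level of the O3 NAAQS.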
3.1.2.2 The NO2 audit techniques may
vary depending on the ambient monitoring
method. For chemiluminescence-type NO2
analyzers, gas phase titration (GPT)
techniques should be based on the EPA
guidance documents and monitoring agency
experience. The NO2 gas standards may be
more appropriate than GPT for direct NO2
methods that do not employ converters. Care
should be taken to ensure the stability of
such gas standards prior to use.
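For context, gas phase titration produces NO2 through the rapid reaction of NO with O3 (standard atmospheric chemistry, noted here for orientation; the EPA guidance documents govern the actual procedure):

\[ \mathrm{NO} + \mathrm{O_3} \rightarrow \mathrm{NO_2} + \mathrm{O_2} \]

so the quantity of NO2 generated corresponds stoichiometrically to the O3 consumed.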
3.1.2.3 The standards from which audit
gas test concentrations are obtained must
meet the specifications of section 2.6.1 of this
appendix.
3.1.2.4 For point analyzers, the evaluation
shall be carried out by allowing the monitor
to analyze the audit gas test atmosphere in
its normal sampling mode such that the test
atmosphere passes through all filters,
scrubbers, conditioners, and other sample
inlet components used during normal
ambient sampling and as much of the
ambient air inlet system as is practicable.
3.1.2.5 Open-path monitors are evaluated
by inserting a test cell containing the various
audit gas concentrations into the optical
measurement beam of the instrument. If
possible, the normally used transmitter,
receiver, and, as appropriate, reflecting
devices should be used during the
evaluation, and the normal monitoring
configuration of the instrument should be
modified as little as possible to accommodate
the test cell for the evaluation. However, if
permitted by the associated operation or
instruction manual, an alternate local light
source or an alternate optical path that does
not include the normal atmospheric
monitoring path may be used. The actual
concentrations of the audit gas in the test cell
must be selected to produce effective
concentrations in the evaluation level ranges
specified in this section of this appendix.
Generally, each evaluation concentration
measurement result will be the sum of the
atmospheric pollutant concentration and the
evaluation test concentration. As such, the
result must be corrected to remove the
atmospheric concentration contribution. The
corrected concentration is obtained by
subtracting the average of the atmospheric
concentrations measured by the open-path
instrument under test immediately before
and immediately after the evaluation test (or
preferably before and after each evaluation
concentration level) from the evaluation
concentration measurement. If the difference
between the before and after measurements is
greater than 20 percent of the effective
concentration of the test gas standard,
discard the test result for that concentration
level and repeat the test for that level. If
possible, open path monitors should be
evaluated during periods when the
atmospheric pollutant concentrations are
relatively low and steady. Also, if the open-path instrument is not installed in a
permanent manner, the monitoring path
length must be reverified to be within plus
or minus 3 percent to validate the evaluation,
since the monitoring path length is critical to
the determination of the effective
concentration.
3.1.2.6 Report both the evaluation
concentrations (effective concentrations for
open-path monitors) of the audit gases and
the corresponding measured concentration
(corrected concentrations, if applicable, for
open-path monitors) indicated or produced
by the monitor being tested. The percent
differences between these concentrations are
used to assess the quality of the monitoring
data as described in section 4.1.1 of this
appendix.
3.1.3 National Performance Evaluation
Program (NPAP).
As stated in sections 1.1 and 2.4, PSD
monitoring networks may be subject to the
NPEP, which includes the NPAP. The NPAP
is a performance evaluation which is a type
of audit where quantitative data are collected
independently in order to evaluate the
proficiency of an analyst, monitoring
instrument and laboratory. The NPAP should
not be confused with the quarterly PE
program described in section 3.1.2. The PSD
organizations shall consult with the PSD
reviewing authority or the EPA regarding
whether the implementation of NPAP is
required and the implementation options
available. Details of the EPA NPAP can be
found in reference 11 of this appendix. The
program requirements include:
3.1.3.1 Performing audits on 100 percent
of monitors and sites each year including
monitors and sites that may be operated for
less than 1 year. The reviewing authority has
the authority to require more frequent audits
at sites it considers to be high priority.
3.1.3.2 Developing a delivery system that
will allow for the audit concentration gasses
to be introduced at the probe inlet where
logistically feasible.
3.1.3.3 Using audit gases that are verified
against the National Institute for Standards
and Technology (NIST) standard reference
methods or special review procedures and
validated annually for CO, SO2 and NO2, and
at the beginning of each quarter of audits for
O3.
3.1.3.4 The PSD PQAO may elect to self-implement NPAP. In these cases, the PSD
reviewing authority will work with those
PSD PQAOs to establish training and other
technical requirements to establish
comparability to federally implemented
programs. In addition to meeting the
requirements in sections 3.1.3.1 through
3.1.3.3, the PSD PQAO must:
(a) Ensure that the PSD audit system is
equivalent to the EPA NPAP audit system
and is an entirely separate set of equipment
and standards from the equipment used for
quarterly performance evaluations. If this
system does not generate and analyze the
audit concentrations, as the EPA NPAP
system does, its equivalence to the EPA
NPAP system must be proven to be as
accurate under a full range of appropriate
and varying conditions as described in
section 3.1.3.6.
(b) Perform a whole system check by
having the PSD audit system tested at an
independent and qualified EPA lab, or
equivalent.
(c) Evaluate the system with the EPA NPAP
program through collocated auditing at an
acceptable number of sites each year (at least
one for a PSD network of five or less sites;
at least two for a network with more than five
sites).
(d) Incorporate the NPAP into the PSD
PQAO’s QAPP.
(e) Be subject to review by independent,
EPA-trained personnel.
(f) Participate in initial and update
training/certification sessions.
3.2 PM2.5.
3.2.1 Flow Rate Verification for PM2.5. A
one-point flow rate verification check must
be performed at least once every month (each
verification minimally separated by 14 days)
on each monitor used to measure PM2.5. The
verification is made by checking the
operational flow rate of the monitor. If the
verification is made in conjunction with a
flow rate adjustment, it must be made prior
to such flow rate adjustment. For the
standard procedure, use a flow rate transfer
standard certified in accordance with section
2.6 of this appendix to check the monitor’s
normal flow rate. Care should be used in
selecting and using the flow rate
measurement device such that it does not
alter the normal operating flow rate of the
monitor. Flow rate verification results are to
be reported to the PSD reviewing authority
quarterly as described in section 5.1.
Reporting these results to AQS is encouraged.
The percent differences between the audit
and measured flow rates are used to assess
the bias of the monitoring data as described
in section 4.2.2 of this appendix (using flow
rates in lieu of concentrations).
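For example (hypothetical values), if the auditing flow meter indicates 16.67 liters/min and the sampler indicates 16.90 liters/min, the percent difference is 100 × (16.90 − 16.67)/16.67, or approximately +1.4 percent.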
3.2.2 Semi-Annual Flow Rate Audit for
PM2.5. Every 6 months, audit the flow rate of
the PM2.5 particulate monitors. For short-term monitoring operations (those less than
1 year), the flow rate audits must occur at
start up, at the midpoint, and near the
completion of the monitoring project. The
audit must be conducted by a trained
technician other than the routine site
operator. The audit is made by measuring the
monitor’s normal operating flow rate using a
flow rate transfer standard certified in
accordance with section 2.6 of this appendix.
The flow rate standard used for auditing
must not be the same flow rate standard used
for verifications or to calibrate the monitor.
However, both the calibration standard and
the audit standard may be referenced to the
same primary flow rate or volume standard.
Care must be taken in auditing the flow rate
to be certain that the flow measurement
device does not alter the normal operating
flow rate of the monitor. Report the audit
flow rate of the transfer standard and the
corresponding flow rate measured by the
monitor. The percent differences between
these flow rates are used to evaluate monitor
performance.
3.2.3 Collocated Sampling Procedures for
PM2.5. A PSD PQAO must have at least one
collocated monitor for each PSD monitoring
network.
3.2.3.1 For each pair of collocated
monitors, designate one sampler as the
primary monitor whose concentrations will
be used to report air quality for the site, and
designate the other as the QC monitor. There
can be only one primary monitor at a
monitoring site for a given time period.
(a) If the primary monitor is a FRM, then
the quality control monitor must be a FRM
of the same method designation.
(b) If the primary monitor is a FEM, then
the quality control monitor must be a FRM
unless the PSD PQAO submits a waiver for
this requirement, provides a specific reason
why a FRM cannot be implemented, and the
waiver is approved by the PSD reviewing
authority. If the waiver is approved, then the
quality control monitor must be the same
method designation as the primary FEM
monitor.
3.2.3.2 In addition, the collocated
monitors should be deployed according to
the following protocol:
(a) The collocated quality control
monitor(s) should be deployed at sites with
the highest predicted daily PM2.5
concentrations in the network. If the highest
PM2.5 concentration site is impractical for
collocation purposes, alternative sites
approved by the PSD reviewing authority
may be selected. If additional collocated sites
are necessary, the PSD PQAO and the
reviewing authority should determine the
appropriate location(s) based on data needs.
(b) The two collocated monitors must be
within 4 meters of each other and at least 2
meters apart for flow rates greater than 200
liters/min or at least 1 meter apart for
samplers having flow rates less than 200
liters/min to preclude airflow interference. A
waiver allowing up to 10 meters horizontal
distance and up to 3 meters vertical distance
(inlet to inlet) between a primary and
collocated quality control monitor may be
approved by the PSD reviewing authority for
sites at a neighborhood or larger scale of
representation. This waiver may be approved
during the QAPP review and approval
process. Calibration, sampling, and analysis
must be the same for both collocated
samplers and the same as for all other
samplers in the network.
(c) Sample the collocated quality control
monitor on a 6-day schedule for sites not
requiring daily monitoring and on a 3-day
schedule for any site requiring daily
monitoring. Report the measurements from
both primary and collocated quality control
monitors at each collocated sampling site.
The calculations for evaluating precision
between the two collocated monitors are
described in section 4.2.1 of this appendix.
3.2.4 PM2.5 Performance Evaluation
Program (PEP) Procedures. As stated in
sections 1.1 and 2.4 of this appendix, PSD
monitoring networks may be subject to the
NPEP, which includes the PM2.5 PEP. The
PSD monitoring organizations shall consult
with the PSD reviewing authority or the EPA
regarding whether the implementation of
PM2.5 PEP is required and the
implementation options available for the
PM2.5 PEP. For PSD PQAOs with less than or
equal to five monitoring sites, five valid
performance evaluation audits must be
collected and reported each year. For PSD
PQAOs with greater than five monitoring
sites, eight valid performance evaluation
audits must be collected and reported each
year. Additionally, within the five or eight
required audits, each type of method
designation (FRM/FEM designation) used as
a primary monitor in the PSD network shall
be audited. For a PE to be valid, both the
primary monitor and PEP audit
measurements must meet quality control
requirements and be above 3 μg/m3 or a
predefined lower concentration level
determined by a systematic planning process
and approved by the PSD reviewing
authority. Due to the relatively short-term
nature of most PSD monitoring, the
likelihood of measuring low concentrations
in many areas attaining the PM2.5 standard
and the time required to weigh filters
collected in PEs, a PSD monitoring
organization’s QAPP may contain a provision
to waive the 3 μg/m3 threshold for validity
of PEs conducted in the last quarter of
monitoring, subject to approval by the PSD
reviewing authority.
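The audit-count rule above reduces to a simple threshold; the following Python sketch encodes it (illustrative only; the function name and structure are assumptions, not part of the rule text):

    def min_valid_pm25_pep_audits(num_monitoring_sites: int) -> int:
        """Minimum valid PM2.5 PEP audits per year for a PSD PQAO
        (sketch of the rule in section 3.2.4 of this appendix)."""
        # Five or fewer monitoring sites: 5 valid audits; more than five: 8.
        return 5 if num_monitoring_sites <= 5 else 8

    # Example: a 4-site network needs 5 valid PEP audits per year.
    assert min_valid_pm25_pep_audits(4) == 5
    assert min_valid_pm25_pep_audits(12) == 8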
3.3 PM10.
3.3.1 Flow Rate Verification for PM10. A
one-point flow rate verification check must
be performed at least once every month (each
verification minimally separated by 14 days)
on each monitor used to measure PM10. The
verification is made by checking the
operational flow rate of the monitor. If the
verification is made in conjunction with a
flow rate adjustment, it must be made prior
to such flow rate adjustment. For the
standard procedure, use a flow rate transfer
standard certified in accordance with section
2.6 of this appendix to check the monitor’s
normal flow rate. Care should be taken in
selecting and using the flow rate
measurement device such that it does not
alter the normal operating flow rate of the
monitor. The percent differences between the
audit and measured flow rates are used to
assess the bias of the monitoring data as
described in section 4.2.2 of this appendix
(using flow rates in lieu of concentrations).
3.3.2 Semi-Annual Flow Rate Audit for
PM10. Every 6 months, audit the flow rate of
the PM10 particulate monitors. For short-term
monitoring operations (those less than 1
year), the flow rate audits must occur at start
up, at the midpoint, and near the completion
of the monitoring project. Where possible,
the EPA strongly encourages more frequent
auditing. The audit must be conducted by a
trained technician other than the routine site
operator. The audit is made by measuring the
monitor’s normal operating flow rate using a
flow rate transfer standard certified in
accordance with section 2.6 of this appendix.
The flow rate standard used for auditing
must not be the same flow rate standard used
for verifications or to calibrate the monitor.
However, both the calibration standard and
the audit standard may be referenced to the
same primary flow rate or volume standard.
Care must be taken in auditing the flow rate
to be certain that the flow measurement
device does not alter the normal operating
flow rate of the monitor. Report the audit
flow rate of the transfer standard and the
corresponding flow rate measured by the
monitor. The percent differences between
these flow rates are used to evaluate monitor
performance.
3.3.3 Collocated Sampling Procedures for
Manual PM10. A PSD PQAO must have at
least one collocated monitor for each PSD
monitoring network.
3.3.3.1 For each pair of collocated
monitors, designate one sampler as the
primary monitor whose concentrations will
be used to report air quality for the site, and
designate the other as the quality control
monitor.
3.3.3.2 In addition, the collocated
monitors should be deployed according to
the following protocol:
(a) The collocated quality control
monitor(s) should be deployed at sites with
the highest predicted daily PM10
concentrations in the network. If the highest
PM10 concentration site is impractical for
collocation purposes, alternative sites
approved by the PSD reviewing authority
may be selected.
(b) The two collocated monitors must be
within 4 meters of each other and at least 2
meters apart for flow rates greater than 200
liters/min or at least 1 meter apart for
samplers having flow rates less than 200
liters/min to preclude airflow interference. A
waiver allowing up to 10 meters horizontal
distance and up to 3 meters vertical distance
(inlet to inlet) between a primary and
collocated sampler may be approved by the
PSD reviewing authority for sites at a
neighborhood or larger scale of
representation. This waiver may be approved
during the QAPP review and approval
process. Calibration, sampling, and analysis
must be the same for both collocated
samplers and the same as for all other
samplers in the network.
(c) Sample the collocated quality control
monitor on a 6-day schedule or 3-day
schedule for any site requiring daily
monitoring. Report the measurements from
both primary and collocated quality control
monitors at each collocated sampling site.
The calculations for evaluating precision
between the two collocated monitors are
described in section 4.2.1 of this appendix.
(d) In determining the number of
collocated sites required for PM10, PSD
monitoring networks for Pb-PM10 should be
treated independently from networks for
particulate matter (PM), even though the
separate networks may share one or more
common samplers. However, a single quality
control monitor that meets the collocation
requirements for Pb-PM10 and PM10 may
serve as a collocated quality control monitor
for both networks. Extreme care must be
taken if using the filter from a quality control
monitor for both PM10 and Pb analysis. PM10
filter weighing should occur prior to any Pb
analysis.
3.4 Pb.
3.4.1 Flow Rate Verification for Pb. A
one-point flow rate verification check must
be performed at least once every month (each
verification minimally separated by 14 days)
on each monitor used to measure Pb. The
verification is made by checking the
operational flow rate of the monitor. If the
verification is made in conjunction with a
flow rate adjustment, it must be made prior
to such flow rate adjustment. Use a flow rate
transfer standard certified in accordance with
section 2.6 of this appendix to check the
monitor’s normal flow rate. Care should be
taken in selecting and using the flow rate
measurement device such that it does not
alter the normal operating flow rate of the
monitor. The percent differences between the
audit and measured flow rates are used to
assess the bias of the monitoring data as
described in section 4.2.2 of this appendix
(using flow rates in lieu of concentrations).
3.4.2 Semi-Annual Flow Rate Audit for
Pb. Every 6 months, audit the flow rate of the
Pb particulate monitors. For short-term
monitoring operations (those less than 1
year), the flow rate audits must occur at start
up, at the midpoint, and near the completion
of the monitoring project. Where possible,
the EPA strongly encourages more frequent
auditing. The audit must be conducted by a
trained technician other than the routine site
operator. The audit is made by measuring the
monitor’s normal operating flow rate using a
flow rate transfer standard certified in
accordance with section 2.6 of this appendix.
The flow rate standard used for auditing
must not be the same flow rate standard used
for verifications or to calibrate the monitor.
However, both the calibration standard and
the audit standard may be referenced to the
same primary flow rate or volume standard.
Great care must be taken in auditing the flow
rate to be certain that the flow measurement
device does not alter the normal operating
flow rate of the monitor. Report the audit
flow rate of the transfer standard and the
corresponding flow rate measured by the
monitor. The percent differences between
these flow rates are used to evaluate monitor
performance.
3.4.3 Collocated Sampling for Pb. A PSD
PQAO must have at least one collocated
monitor for each PSD monitoring network.
3.4.3.1 For each pair of collocated
monitors, designate one sampler as the
primary monitor whose concentrations will
be used to report air quality for the site, and
designate the other as the quality control
monitor.
3.4.3.2 In addition, the collocated
monitors should be deployed according to
the following protocol:
(a) The collocated quality control
monitor(s) should be deployed at sites with
the highest predicted daily Pb concentrations
in the network. If the highest Pb
concentration site is impractical for
collocation purposes, alternative sites
approved by the PSD reviewing authority
may be selected.
(b) The two collocated monitors must be
within 4 meters of each other and at least 2
meters apart for flow rates greater than 200
liters/min or at least 1 meter apart for
samplers having flow rates less than 200
liters/min to preclude airflow interference. A
waiver allowing up to 10 meters horizontal
distance and up to 3 meters vertical distance
(inlet to inlet) between a primary and
collocated sampler may be approved by the
reviewing authority for sites at a
neighborhood or larger scale of
representation. This waiver may be approved
during the QAPP review and approval
process. Calibration, sampling, and analysis
must be the same for both collocated
samplers and the same as for all other
samplers in the network.
(c) Sample the collocated quality control
monitor on a 6-day schedule if daily
monitoring is not required or on a 3-day
schedule for any site requiring daily monitoring.
Report the measurements from both primary
and collocated quality control monitors at
each collocated sampling site. The
calculations for evaluating precision between
the two collocated monitors are described in
section 4.2.1 of this appendix.
(d) In determining the number of
collocated sites required for Pb-PM10, PSD
monitoring networks for PM10 should be
treated independently from networks for
Pb-PM10, even though the separate networks
may share one or more common samplers.
However, a single quality control monitor
that meets the collocation requirements for
Pb-PM10 and PM10 may serve as a collocated
quality control monitor for both networks.
Extreme care must be taken if using the
filter from a quality control monitor for
both PM10 and Pb analysis. The PM10 filter
weighing should occur prior to any Pb
analysis.
3.4.4 Pb Analysis Audits. Each calendar
quarter, audit the Pb reference or equivalent
method analytical procedure using filters
containing a known quantity of Pb. These
audit filters are prepared by depositing a Pb
standard on unexposed filters and allowing
them to dry thoroughly. The audit samples
must be prepared using batches of reagents
different from those used to calibrate the Pb
analytical equipment being audited. Prepare
audit samples in the following concentration
ranges:
------------------------------------------------------------------------
      Range           Equivalent ambient Pb concentration, μg/m3
------------------------------------------------------------------------
1.................  30-100% of Pb NAAQS.
2.................  200-300% of Pb NAAQS.
------------------------------------------------------------------------
(a) Audit samples must be extracted using
the same extraction procedure used for
exposed filters.
(b) Analyze three audit samples in each of
the two ranges each quarter samples are
analyzed. The audit sample analyses shall be
distributed as much as possible over the
entire calendar quarter.
(c) Report the audit concentrations (in μg
Pb/filter or strip) and the corresponding
measured concentrations (in μg Pb/filter or
strip) using AQS unit code 077 (if reporting
to AQS). The percent differences between the
concentrations are used to calculate
analytical accuracy as described in section
4.2.6 of this appendix.
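For illustration, the two audit ranges can be expressed as equivalent ambient concentration windows. The short sketch below is a hypothetical helper, not regulatory text, and it assumes the 2008 Pb NAAQS level of 0.15 μg/m3, which the appendix itself does not restate:

```python
# Assumed: 2008 Pb NAAQS level of 0.15 ug/m3 (not restated in the appendix).
PB_NAAQS_UG_M3 = 0.15

AUDIT_RANGES = {
    1: (0.30 * PB_NAAQS_UG_M3, 1.00 * PB_NAAQS_UG_M3),  # 30-100% of NAAQS
    2: (2.00 * PB_NAAQS_UG_M3, 3.00 * PB_NAAQS_UG_M3),  # 200-300% of NAAQS
}

def in_audit_range(range_id: int, equivalent_conc: float) -> bool:
    """True if a prepared audit filter falls inside the target window."""
    lo, hi = AUDIT_RANGES[range_id]
    return lo <= equivalent_conc <= hi
```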
3.4.5 Pb Performance Evaluation Program
(PEP) Procedures. As stated in sections 1.1
and 2.4, PSD monitoring networks may be
subject to the NPEP, which includes the Pb
Performance Evaluation Program. PSD
monitoring organizations shall consult with
the PSD reviewing authority or the EPA
regarding whether the implementation of Pb-PEP is required and the implementation
options available for the Pb-PEP. The PEP is
an independent assessment used to estimate
total measurement system bias. Each year,
one PE audit must be performed at one Pb
site in each PSD PQAO network that has less
than or equal to five sites and two audits for
PSD PQAO networks with greater than five
sites. In addition, each year, four collocated
samples from PSD PQAO networks with less
than or equal to five sites and six collocated
samples from PSD PQAO networks with
greater than five sites must be sent to an
independent laboratory for analysis. The
calculations for evaluating bias between the
primary monitor and the PE monitor for Pb
are described in section 4.2.4 of this
appendix.
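The annual Pb-PEP workload scales with network size. A minimal sketch of that rule (the function name is ours, not regulatory text):

```python
def pb_pep_requirements(num_sites: int) -> dict[str, int]:
    """Annual PE audits and independently analyzed collocated samples
    required for a PSD PQAO network with num_sites Pb sites (sec. 3.4.5)."""
    if num_sites <= 5:
        return {"pe_audits": 1, "collocated_samples": 4}
    return {"pe_audits": 2, "collocated_samples": 6}
```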
4. Calculations for Data Quality Assessment
(a) Calculations of measurement
uncertainty are carried out by PSD PQAO
according to the following procedures. The
PSD PQAOs should report the data for all
appropriate measurement quality checks as
specified in this appendix even though they
may elect to perform some or all of the
calculations in this section on their own.
(b) At low concentrations, agreement
between the measurements of collocated
samplers, expressed as relative percent
difference or percent difference, may be
relatively poor. For this reason, collocated
measurement pairs will be selected for use in
the precision and bias calculations only
when both measurements are equal to or
above the following limits:
(1) Pb: 0.002 μg/m3 (Methods approved
after 3/04/2010, with exception of manual
equivalent method EQLA-0813-803).
(2) Pb: 0.02 μg/m3 (Methods approved
before 3/04/2010, and manual equivalent
method EQLA-0813-803).
(3) PM10 (Hi-Vol): 15 μg/m3.
(4) PM10 (Lo-Vol): 3 μg/m3.
(5) PM2.5: 3 μg/m3.
The PM2.5 3 μg/m3 limit for the PM2.5-PEP
may be superseded by mutual agreement
between the PSD PQAO and the PSD
reviewing authority as specified in section
3.2.4 of the appendix and detailed in the
approved QAPP.
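As one way to read the screening rule above, the sketch below (illustrative; the names are ours) drops any collocated pair in which either measurement falls below the applicable limit before the section 4.2.1 precision statistics are computed:

```python
# Minimum concentration limits from section 4(b), in ug/m3.
LIMITS = {
    "Pb_post_2010": 0.002,  # methods approved after 3/04/2010
    "Pb_pre_2010": 0.02,    # methods approved before 3/04/2010, EQLA-0813-803
    "PM10_hivol": 15.0,
    "PM10_lovol": 3.0,
    "PM2.5": 3.0,
}

def valid_pairs(pairs, pollutant):
    """Keep only pairs (x, y) in which both values meet the limit."""
    limit = LIMITS[pollutant]
    return [(x, y) for x, y in pairs if x >= limit and y >= limit]
```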
4.1 Statistics for the Assessment of QC
Checks for SO2, NO2, O3 and CO.
4.1.1 Percent Difference. Many of the
measurement quality checks start with a
comparison of an audit concentration or
value (flow-rate) to the concentration/value
measured by the monitor and use percent
difference as the comparison statistic as
described in equation 1 of this section. For
each single point check, calculate the percent
difference, di, as follows:
\[ d_i = \frac{meas - audit}{audit} \cdot 100 \]  (Equation 1)

where, meas is the concentration indicated by the PQAO's instrument and audit is the audit concentration of the standard used in the QC check being measured.
4.1.2 Precision Estimate. The precision estimate is used to assess the one-point QC checks for SO2, NO2, O3, or CO described in section 3.1.1 of this appendix. The precision estimator is the coefficient of variation upper bound and is calculated using equation 2 of this section:

\[ CV = \sqrt{\frac{n\sum_{i=1}^{n} d_i^2 - \left(\sum_{i=1}^{n} d_i\right)^2}{n(n-1)}} \cdot \sqrt{\frac{n-1}{\chi^2_{0.1,n-1}}} \]  (Equation 2)

where, n is the number of single point checks being aggregated; \(\chi^2_{0.1,n-1}\) is the 10th percentile of a chi-squared distribution with n-1 degrees of freedom.
4.1.3 Bias Estimate. The bias estimate is calculated using the one-point QC checks for SO2, NO2, O3, or CO described in section 3.1.1 of this appendix. The bias estimator is an upper bound on the mean absolute value of the percent differences as described in equation 3 of this section:

\[ |bias| = AB + t_{0.95,n-1} \cdot \frac{AS}{\sqrt{n}} \]  (Equation 3)

where, n is the number of single point checks being aggregated; \(t_{0.95,n-1}\) is the 95th quantile of a t-distribution with n-1 degrees of freedom; the quantity AB is the mean of the absolute values of the di's and is calculated using equation 4 of this section:

\[ AB = \frac{1}{n}\sum_{i=1}^{n} |d_i| \]  (Equation 4)

and the quantity AS is the standard deviation of the absolute value of the di's and is calculated using equation 5 of this section:

\[ AS = \sqrt{\frac{n\sum_{i=1}^{n} |d_i|^2 - \left(\sum_{i=1}^{n} |d_i|\right)^2}{n(n-1)}} \]  (Equation 5)

4.1.3.1 Assigning a sign (positive/negative) to the bias estimate. Since the bias statistic as calculated in equation 3 of this appendix uses absolute values, it does not have a tendency (negative or positive bias) associated with it. A sign will be designated by rank ordering the percent differences of the QC check samples from a given site for a particular assessment interval.
4.1.3.2 Calculate the 25th and 75th percentiles of the percent differences for each site. The absolute bias upper bound should be flagged as positive if both percentiles are positive and negative if both percentiles are negative. The absolute bias upper bound would not be flagged if the 25th and 75th percentiles are of different signs.
4.2 Statistics for the Assessment of PM10, PM2.5, and Pb.
4.2.1 Collocated Quality Control Sampler Precision Estimate for PM10, PM2.5 and Pb. Precision is estimated via duplicate measurements from collocated samplers. It is recommended that the precision be aggregated at the PQAO level quarterly, annually, and at the 3-year level. The data pair would only be considered valid if both concentrations are greater than or equal to the minimum values specified in section 4(b) of this appendix. For each collocated data pair, calculate the relative percent difference, di, using equation 6 of this appendix:

\[ d_i = \frac{X_i - Y_i}{(X_i + Y_i)/2} \cdot 100 \]  (Equation 6)

where, Xi is the concentration from the primary sampler and Yi is the concentration value from the audit sampler. The coefficient of variation upper bound is calculated using equation 7 of this appendix:

\[ CV = \sqrt{\frac{n\sum_{i=1}^{n} d_i^2 - \left(\sum_{i=1}^{n} d_i\right)^2}{2n(n-1)}} \cdot \sqrt{\frac{n-1}{\chi^2_{0.1,n-1}}} \]  (Equation 7)

where, n is the number of valid data pairs being aggregated, and \(\chi^2_{0.1,n-1}\) is the 10th percentile of a chi-squared distribution with n-1 degrees of freedom. The factor of 2 in the denominator adjusts for the fact that each di is calculated from two values with error.
4.2.2 One-Point Flow Rate Verification Bias Estimate for PM10, PM2.5 and Pb. For each one-point flow rate verification, calculate the percent difference in volume using equation 1 of this appendix, where meas is the value indicated by the sampler's volume measurement and audit is the actual volume indicated by the auditing flow meter. The absolute volume bias upper bound is then calculated using equation 3, where n is the number of flow rate audits being aggregated; \(t_{0.95,n-1}\) is the 95th quantile of a t-distribution with n-1 degrees of freedom; the quantity AB is the mean of the absolute values of the di's and is calculated using equation 4 of this appendix; and the quantity AS in equation 3 of this appendix is the standard deviation of the absolute values of the di's and is calculated using equation 5 of this appendix.
4.2.3 Semi-Annual Flow Rate Audit Bias Estimate for PM10, PM2.5 and Pb. Use the same procedure described in section 4.2.2 for the evaluation of flow rate audits.
4.2.4 Performance Evaluation Programs Bias Estimate for Pb. The Pb bias estimate is calculated using the paired routine and the PE monitor as described in section 3.4.5. Use the same procedures as described in section 4.1.3 of this appendix.
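Taken together, equations 1 through 7 reduce to a few lines of code. The following Python sketch is a non-authoritative illustration of the estimators as reconstructed above, assuming SciPy for the chi-squared and t quantiles; it is not a substitute for the appendix text:

```python
import math
from scipy.stats import chi2, t

def percent_diff(meas, audit):
    """Equation 1: percent difference of a measured value from an audit value."""
    return (meas - audit) / audit * 100.0

def rel_percent_diff(x, y):
    """Equation 6: relative percent difference for a collocated pair."""
    return (x - y) / ((x + y) / 2.0) * 100.0

def cv_upper_bound(d, collocated=False):
    """Equations 2 and 7: coefficient-of-variation upper bound. The factor
    of 2 (equation 7) applies to collocated pairs, where each d_i comes
    from two values with error."""
    n = len(d)
    ss = n * sum(di * di for di in d) - sum(d) ** 2
    denom = n * (n - 1) * (2 if collocated else 1)
    return math.sqrt(ss / denom) * math.sqrt((n - 1) / chi2.ppf(0.1, n - 1))

def bias_upper_bound(d):
    """Equations 3-5: |bias| = AB + t(0.95, n-1) * AS / sqrt(n)."""
    n = len(d)
    a = [abs(di) for di in d]
    ab = sum(a) / n                                                 # equation 4
    as_ = math.sqrt((n * sum(x * x for x in a) - sum(a) ** 2)
                    / (n * (n - 1)))                                # equation 5
    return ab + t.ppf(0.95, n - 1) * as_ / math.sqrt(n)

def bias_sign(d):
    """Sections 4.1.3.1-4.1.3.2: flag the bias positive or negative only
    when the 25th and 75th percentiles of the d_i agree in sign (simple
    nearest-rank percentiles used here)."""
    s = sorted(d)
    q1, q3 = s[int(0.25 * (len(s) - 1))], s[int(0.75 * (len(s) - 1))]
    if q1 > 0 and q3 > 0:
        return "+"
    if q1 < 0 and q3 < 0:
        return "-"
    return ""
```

In this sketch a single cv_upper_bound routine serves both equations 2 and 7; the collocated flag supplies the factor of 2 noted in section 4.2.1.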
4.2.5 Performance Evaluation Programs
Bias Estimate for PM2.5. The bias estimate is
calculated using the PEP audits described in
section 3.2.4 of this appendix. The bias
estimator is based on the mean percent
differences (Equation 1). The mean percent
difference, D, is calculated by Equation 8
below:

\[ D = \frac{1}{n_j}\sum_{i=1}^{n_j} d_i \]  (Equation 8)

where, nj is the number of pairs and
d1, d2, . . . dnj are the biases for each pair to
be averaged.
4.2.6 Pb Analysis Audit Bias Estimate.
The bias estimate is calculated using the
analysis audit data described in section 3.4.4.
Use the same bias estimate procedure as
described in section 4.1.3 of this appendix.
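The equation 8 aggregation in section 4.2.5, and the reuse of the section 4.1.3 procedure in section 4.2.6, amount to simple averaging of the per-pair percent differences. A one-line sketch (illustrative only):

```python
def mean_percent_diff(d: list[float]) -> float:
    """Equation 8: D = (1/n_j) * sum(d_i) over the n_j pair biases."""
    return sum(d) / len(d)
```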
5. Reporting Requirements
5.1 Quarterly Reports. For each quarter,
each PSD PQAO shall report to the PSD
reviewing authority (and AQS if required by
the PSD reviewing authority) the results of all
valid measurement quality checks it has
carried out during the quarter. The quarterly
reports must be submitted consistent with
the data reporting requirements specified for
air quality data as set forth in 40 CFR 58.16
as they pertain to PSD monitoring.
6.0 References
(1) American National Standard—
Specifications and Guidelines for
Quality Systems for Environmental Data
Collection and Environmental
Technology Programs. ANSI/ASQC E4–
2004. February 2004. Available from
American Society for Quality Control,
611 East Wisconsin Avenue, Milwaukee,
WI 53202.
(2) EPA Requirements for Quality
Management Plans. EPA QA/R–2. EPA/
240/B–01/002. March 2001, Reissue May
2006. Office of Environmental
Information, Washington, DC 20460.
https://www.epa.gov/quality/qs-docs/r2final.pdf.
(3) EPA Requirements for Quality Assurance
Project Plans for Environmental Data
Operations. EPA QA/R–5. EPA/240/B–
01/003. March 2001, Reissue May 2006.
Office of Environmental Information,
Washington, DC 20460. https://
www.epa.gov/quality/qs-docs/r5final.pdf.
(4) EPA Traceability Protocol for Assay and
Certification of Gaseous Calibration
Standards. EPA–600/R–12/531. May,
2012. Available from U.S. Environmental
Protection Agency, National Risk
Management Research Laboratory,
Research Triangle Park, NC 27711.
https://www.epa.gov/nrmrl/appcd/mmd/
db-traceability-protocol.html.
(5) Guidance for the Data Quality Objectives
Process. EPA QA/G–4. EPA/240/B–06/
001. February, 2006. Office of
Environmental Information, Washington,
DC 20460. https://www.epa.gov/quality/
qs-docs/g4-final.pdf.
(6) List of Designated Reference and
Equivalent Methods. Available from U.S.
Environmental Protection Agency,
National Exposure Research Laboratory,
Human Exposure and Atmospheric
Sciences Division, MD–D205–03,
Research Triangle Park, NC 27711.
https://www.epa.gov/ttn/amtic/
criteria.html.
(7) Transfer Standards for the Calibration of
Ambient Air Monitoring Analyzers for
Ozone. EPA–454/B–13–004 U.S.
Environmental Protection Agency,
Research Triangle Park, NC 27711,
October, 2013. https://www.epa.gov/ttn/
amtic/qapollutant.html.
(8) Paur, R.J. and F.F. McElroy. Technical
Assistance Document for the Calibration
of Ambient Ozone Monitors. EPA–600/
4–79–057. U.S. Environmental
Protection Agency, Research Triangle
Park, NC 27711, September, 1979. https://
www.epa.gov/ttn/amtic/cpreldoc.html.
(9) Quality Assurance Handbook for Air
Pollution Measurement Systems, Volume
1—A Field Guide to Environmental
Quality Assurance. EPA–600/R–94/038a.
April 1994. Available from U.S.
Environmental Protection Agency, ORD
Publications Office, Center for
Environmental Research Information
(CERI), 26 W. Martin Luther King Drive,
Cincinnati, OH 45268. https://
www.epa.gov/ttn/amtic/qabook.html.
(10) Quality Assurance Handbook for Air
Pollution Measurement Systems, Volume
II: Ambient Air Quality Monitoring
Program Quality System Development.
EPA–454/B–13–003. https://
www.epa.gov/ttn/amtic/qabook.html.
(11) National Performance Evaluation
Program Standard Operating Procedures.
https://www.epa.gov/ttn/amtic/
npapsop.html.
   TABLE B-1--MINIMUM DATA ASSESSMENT REQUIREMENTS FOR NAAQS RELATED CRITERIA POLLUTANT PSD MONITORS
----------------------------------------------------------------------------------------------------------------
     Method           Assessment method       Coverage         Minimum          Parameters        AQS assessment
                                                               frequency        reported          type
----------------------------------------------------------------------------------------------------------------
                                     Gaseous Methods (CO, NO2, SO2, O3)
----------------------------------------------------------------------------------------------------------------
1-Point QC for      Response check at      Each analyzer    Once per 2       Audit concentration  1-Point QC.
 SO2, NO2, O3,       concentration                           weeks.           \1\ and measured
 CO.                 0.005-0.08 ppm SO2,                                      concentration \2\.
                     NO2, O3, & 0.5 and
                     5 ppm CO.
Quarterly           See section 3.1.2 of   Each analyzer    Once per         Audit concentration  Annual PE.
 performance         this appendix.                          quarter.         \1\ and measured
 evaluation for                                                               concentration \2\
 SO2, NO2, O3,                                                                for each level.
 CO.
NPAP for SO2,       Independent Audit      Each primary     Once per year..  Audit concentration  NPAP.
 NO2, O3, CO \3\.                           monitor.                          \1\ and measured
                                                                              concentration \2\
                                                                              for each level.
----------------------------------------------------------------------------------------------------------------
                                             Particulate Methods
----------------------------------------------------------------------------------------------------------------
Collocated          Collocated samplers    1 per PSD        Every 6 days or  Primary sampler      No Transaction
 sampling PM10,                             Network per      every 3 days     concentration and    reported as
 PM2.5, Pb.                                 pollutant.       if daily         duplicate sampler    raw data.
                                                             monitoring       concentration \4\.
                                                             required.
Flow rate           Check of sampler       Each sampler     Once every       Audit flow rate and  Flow Rate
 verification        flow rate.                              month.           measured flow rate   Verification.
 PM10, PM2.5, Pb.                                                             indicated by the
                                                                              sampler.
Semi-annual flow    Check of sampler       Each sampler     Once every 6     Audit flow rate and  Semi Annual
 rate audit PM10,    flow rate using                         months or        measured flow rate   Flow Rate
 PM2.5, Pb.          independent                             beginning,       indicated by the     Audit.
                     standard.                               middle and end   sampler.
                                                             of monitoring.
Pb analysis         Check of analytical    Analytical       Each quarter...  Measured value and   Pb Analysis
 audits Pb-TSP,      system with Pb audit                                     audit value (ug      Audits.
 Pb-PM10.            strips/filters.                                          Pb/filter) using
                                                                              AQS unit code 077
                                                                              for parameters:
                                                                              14129--Pb (TSP) LC
                                                                              FRM/FEM.
                                                                              85129--Pb (TSP) LC
                                                                              Non-FRM/FEM.
Performance         Collocated samplers    (1) 5 valid      Over all 4       Primary sampler      PEP.
 Evaluation                                 audits for       quarters.        concentration and
 Program PM2.5                              PQAOs with <=                     performance
 \3\.                                       5 sites.                          evaluation sampler
                                            (2) 8 valid                       concentration.
                                            audits for
                                            PQAOs with > 5
                                            sites.
                                            (3) All
                                            samplers in 6
                                            years.
Performance         Collocated samplers    (1) 1 valid      Over all 4       Primary sampler      PEP.
 Evaluation                                 audit and 4      quarters.        concentration and
 Program Pb \3\.                            collocated                        performance
                                            samples for                       evaluation sampler
                                            PQAOs with <=5                    concentration.
                                            sites.
                                            (2) 2 valid
                                            audits and 6
                                            collocated
                                            samples for
                                            PQAOs with > 5
                                            sites.
----------------------------------------------------------------------------------------------------------------
\1\ Effective concentration for open path analyzers.
\2\ Corrected concentration, if applicable for open path analyzers.
\3\ NPAP, PM2.5 PEP and Pb-PEP must be implemented if data is used for NAAQS decisions; otherwise implementation
  is at PSD reviewing authority discretion.
\4\ Both primary and collocated sampler values are reported as raw data.
■ 11. In Appendix D to part 58, revise
paragraph 3(b), remove and reserve
paragraph 4.5(b), and revise paragraph
4.5(c) to read as follows:
Appendix D to Part 58—Network
Design Criteria for Ambient Air Quality
Monitoring
*    *    *    *    *
3. * * *
(b) The NCore sites must measure, at a
minimum, PM2.5 particle mass using
continuous and integrated/filter-based
samplers, speciated PM2.5, PM10–2.5 particle
mass, O3, SO2, CO, NO/NOy, wind speed,
wind direction, relative humidity, and
ambient temperature.
(1) Although the measurement of NOy is
required in support of a number of
monitoring objectives, available commercial
instruments may indicate little difference in
their measurement of NOy compared to the
conventional measurement of NOX,
particularly in areas with relatively fresh
sources of nitrogen emissions. Therefore, in
areas with negligible expected difference
between NOy and NOX measured
concentrations, the Administrator may allow
for waivers that permit NOX monitoring to be
substituted for the required NOy monitoring
at applicable NCore sites.
(2) The EPA recognizes that, in some cases,
the physical location of the NCore site may
not be suitable for representative
meteorological measurements due to the
site’s physical surroundings. It is also
possible that nearby meteorological
measurements may be able to fulfill this data
need. In these cases, the requirement for
meteorological monitoring can be waived by
the Administrator.
4.5 * * *
(b) [Reserved]
(c) The EPA Regional Administrator may
require additional monitoring beyond the
minimum monitoring requirements
contained in paragraph 4.5(a) of this
appendix where the likelihood of Pb air
quality violations is significant or where the
emissions density, topography, or population
locations are complex and varied. EPA
Regional Administrators may require
additional monitoring at locations including,
but not limited to, those near existing
additional industrial sources of Pb, recently
closed industrial sources of Pb, airports
where piston-engine aircraft emit Pb, and
other sources of re-entrained Pb dust.
*    *    *    *    *
*    *    *    *    *
[FR Doc. 2014-19758 Filed 9-10-14; 8:45 am]
BILLING CODE 6560-50-P
Agencies
[Federal Register Volume 79, Number 176 (Thursday, September 11, 2014)]
[Proposed Rules]
[Pages 54355-54395]
From the Federal Register Online via the Government Printing Office [www.gpo.gov]
[FR Doc No: 2014-19758]
[[Page 54355]]
Vol. 79
Thursday,
No. 176
September 11, 2014
Part II
Environmental Protection Agency
-----------------------------------------------------------------------
40 CFR Part 58
Revisions to Ambient Monitoring Quality Assurance and Other
Requirements; Proposed Rule
Federal Register / Vol. 79 , No. 176 / Thursday, September 11, 2014 /
Proposed Rules
[[Page 54356]]
-----------------------------------------------------------------------
ENVIRONMENTAL PROTECTION AGENCY
40 CFR Part 58
[EPA-HQ-OAR-2013-0619; FRL-9915-16-OAR]
RIN 2060-AR59
Revisions to Ambient Monitoring Quality Assurance and Other
Requirements
AGENCY: Environmental Protection Agency (EPA).
ACTION: Proposed rule.
-----------------------------------------------------------------------
SUMMARY: This action proposes revisions to ambient air monitoring
requirements for criteria pollutants to provide clarifications to
existing requirements to reduce the compliance burden of monitoring
agencies operating ambient networks. This proposal focuses on
reorganizing and clarifying quality assurance requirements, simplifying
and reducing data reporting and certification requirements, clarifying
the annual monitoring network plan public notice requirements, revising
certain network design criteria for nonsource lead monitoring, and
addressing other issues in part 58 Ambient Air Quality Surveillance
Requirements.
DATES: Comments must be received on or before November 10, 2014.
ADDRESSES: Submit your comments, identified by Docket ID No. EPA-HQ-
OAR-2013-0619, by one of the following methods:
Federal eRulemaking Portal: https://www.regulations.gov.
Follow the online instructions for submitting comments.
Email: A-and-R-Docket@epa.gov. Include docket ID No. EPA-
HQ-OAR-2013-0619 in the subject line of the message.
Fax: (202) 566-9744
Mail: Environmental Protection Agency, Mail code 28221T,
Attention Docket No. EPA-HQ-OAR-2013-0619, 1200 Pennsylvania Ave. NW.,
Washington, DC 20460. Please include a total of two copies.
Hand/Courier Delivery: EPA Docket Center, Room 3334, EPA
WJC West Building, 1301 Constitution Ave. NW., Washington, DC. Such
deliveries are only accepted during the Docket's normal hours of
operation, and special arrangements should be made for deliveries of
boxed information.
Instructions: Direct your comments to Docket ID No. EPA-HQ-OAR-
2013-0619. The EPA's policy is that all comments received will be
included in the public docket without change and may be made available
online at https://www.regulations.gov, including any personal
information provided, unless the comment includes information claimed
to be Confidential Business Information (CBI) or other information
whose disclosure is restricted by statute. Do not submit information
that you consider to be CBI or otherwise protected through https://www.regulations.gov or email. The www.regulations.gov Web site is an
``anonymous access'' system, which means the EPA will not know your
identity or contact information unless you provide it in the body of
your comment. If you send an email comment directly to the EPA without
going through https://www.regulations.gov, your email address will be
automatically captured and included as part of the comment that is
placed in the public docket and made available on the Internet. If you
submit an electronic comment, the EPA recommends that you include your
name and other contact information in the body of your comment and with
any disk or CD ROM you submit. If the EPA cannot read your comment due
to technical difficulties and cannot contact you for clarification, the
EPA may not be able to consider your comment. Electronic files should
avoid the use of special characters, any form of encryption, and be
free of any defects or viruses. For additional information about the
EPA's public docket, visit the EPA Docket Center homepage at https://www.epa.gov/epahome/dockets.htm.
Docket: All documents in the docket are listed in the https://www.regulations.gov index. Although listed in the index, some
information is not publicly available, e.g., CBI or other information
whose disclosure is restricted by statute. Certain other material, such
as copyrighted material, will be publicly available only in hard copy.
Publicly available docket materials are available either electronically
in www.regulations.gov or in hard copy at the Air and Radiation Docket
and Information Center, EPA/DC, Room 3334, WJC West Building, 1301
Constitution Ave. NW., Washington, DC. The Public Reading Room is open
from 8:30 a.m. to 4:30 p.m., Monday through Friday, excluding legal
holidays. The telephone number for the Public Reading Room is (202)
566-1744 and the telephone number for the Air and Radiation Docket and
Information Center is (202) 566-1742.
FOR FURTHER INFORMATION CONTACT: Mr. Lewis Weinstock, Air Quality
Assessment Division, Office of Air Quality Planning and Standards, U.S.
Environmental Protection Agency, Mail code C304-06, Research Triangle
Park, NC 27711; telephone: (919) 541-3661; fax: (919) 541-1903; email:
Weinstock.lewis@epa.gov.
SUPPLEMENTARY INFORMATION:
A. Does this action apply to me?
This action applies to state, territorial, and local air quality
management programs that are responsible for ambient air monitoring
under 40 CFR part 58. Categories and entities potentially regulated by
this action include:
------------------------------------------------------------------------
Category NAICS \a\ code
------------------------------------------------------------------------
State/territorial/local/tribal government. 924110
------------------------------------------------------------------------
\a\ North American Industry Classification System.
B. What should I consider as I prepare my comments for the EPA?
1. Submitting CBI. Do not submit this information to the EPA
through https://www.regulations.gov or email. Clearly mark any of the
information that you claim to be CBI. For CBI information in a disk or
CD ROM that you mail to the EPA, mark the outside of the disk or CD ROM
as CBI and then identify electronically within the disk or CD ROM the
specific information that is claimed as CBI. In addition to one
complete version of the comment that includes information claimed as
CBI, a copy of the comment that does not contain the information
claimed as CBI must be submitted for inclusion in the public docket.
Information so marked will not be disclosed except in accordance with
procedures set forth in 40 CFR part 2.
2. Tips for preparing your comments. When submitting comments,
remember to:
Follow directions--The agency may ask you to respond to
specific questions or organize comments by referencing a Code of
Federal Regulations (CFR) part or section number.
Explain why you agree or disagree, suggest alternatives,
and substitute language for your requested changes.
Describe any assumptions and provide any technical
information and/or data that you used.
If you estimate potential costs or burdens, explain how
you arrived at your estimate in sufficient detail to allow for it to be
reproduced.
Provide specific examples to illustrate your concerns and
suggest alternatives.
Explain your views as clearly as possible, avoiding the
use of profanity or personal threats.
[[Page 54357]]
Make sure to submit your comments by the comment period
deadline identified.
C. Where can I get a copy of this document?
In addition to being available in the docket, an electronic copy of
this proposed rule will also be available on the Worldwide Web (WWW)
through the Technology Transfer Network (TTN). Following signature, a
copy of this proposed rule will be posted on the TTN's policy and
guidance page for newly proposed or promulgated rules at the following
address: https://www.epa.gov/ttn/oarpg/. The TTN provides information
and technology exchange in various areas of air pollution control. A
redline/strikeout document comparing the proposed revisions to the
appropriate sections of the current rules is located in the docket.
Table of Contents
The following topics are discussed in this preamble:
I. Background
II. Proposed Changes to the Ambient Monitoring Requirements
A. General Information
B. Definitions
C. Annual Monitoring Network Plan and Periodic Network
Assessment
D. Network Technical Requirements
E. Operating Schedules
F. System Modification
G. Annual Air Monitoring Data Certification
H. Data Submittal and Archiving Requirements
I. Network Design Criteria (Appendix D)
III. Proposed Changes to Quality Assurance Requirements
A. Quality Assurance Requirements for Monitors Used in
Evaluations for National Ambient Air Quality Standards--Appendix A
1. General Information
2. Quality System Requirements
3. Quality Control Checks for Gases
4. Quality Control Checks for Particulate Monitors
5. Calculations for Data Quality Assessment
B. Quality Assurance Requirements for Monitors Used in
Evaluations of Prevention of Significant Deterioration Projects--
Appendix B
1. General Information
2. Quality System Requirements
3. Quality Control Checks for Gases
4. Quality Control Checks for Particulate Monitors
5. Calculations for Data Quality Assessment
IV. Statutory and Executive Order Reviews
A. Executive Order 12866: Regulatory Planning and Review and
Executive Order 13563: Improving Regulations and Regulatory Review
B. Paperwork Reduction Act
C. Regulatory Flexibility Act
D. Unfunded Mandates Reform Act
E. Executive Order 13132: Federalism
F. Executive Order 13175: Consultation and Coordination With
Indian Tribal Governments
G. Executive Order 13045: Protection of Children From
Environmental Health and Safety Risks
H. Executive Order 13211: Actions Concerning Regulations That
Significantly Affect Energy Supply, Distribution, or Use
I. National Technology Transfer and Advancement Act
J. Executive Order 12898: Federal Actions To Address
Environmental Justice in Minority Populations and Low-Income
Populations
I. Background
The EPA is proposing revisions to ambient air requirements for
criteria pollutants to provide clarifications to existing requirements
to reduce the compliance burden of monitoring agencies operating
ambient networks. This proposal focuses on ambient monitoring
requirements that are found in 40 CFR part 58 and the associated
appendices (A, D, and new Appendix B), including issues such as
operating schedules, the development of annual monitoring network
plans, data reporting and certification requirements, and the operation
of the required quality assurance (QA) program.
The EPA last completed a comprehensive revision of ambient air
monitoring regulations in a final rule published on October 17, 2006
(see 71 FR 61236). Minor revisions were completed in a direct final
rule published on June 12, 2007 (see 72 FR 32193). Periodic pollutant-
specific monitoring updates have occurred in conjunction with revisions
to the National Ambient Air Quality Standards (NAAQS). In such cases,
the monitoring revisions were typically finalized as part of the NAAQS
final rules.\1\
---------------------------------------------------------------------------
\1\ Links to the NAAQS final rules are available at: https://www.epa.gov/air/criteria.html.
---------------------------------------------------------------------------
II. Proposed Changes to the Ambient Monitoring Requirements
A. General Information
The following proposed changes to monitoring requirements impact
these subparts of part 58--Ambient Air Quality Surveillance: Subpart
A--General Provisions, and Subpart B--Monitoring Network. Specific
proposed changes to these subparts are described below.
B. Definitions
The EPA proposes to add and revise several terms to ensure
consistent interpretation within the monitoring regulations and to
harmonize usage of terms with the definition of key metadata fields
that are important components of the Air Quality System (AQS).\2\
---------------------------------------------------------------------------
\2\ The AQS is the EPA's repository of ambient air quality data.
The AQS stores data from over 10,000 monitors, 5,000 of which are
currently active. State, local and tribal agencies collect the data
and submit it to the AQS on a periodic basis. See https://www.epa.gov/ttn/airs/airsaqs/ for additional information.
---------------------------------------------------------------------------
The EPA proposes to add the term ``Certifying Agency'' to the list
of definitions. The certifying agency field was added to AQS in 2013 as
part of the development of a revised process for states and the EPA
Regions to meet the data certification requirements described in 40 CFR
58.15. The new term specifically describes any monitoring agency that
is responsible for meeting data certification requirements for a set of
monitors. In practice, certifying agencies are typically a state,
local, or tribal agency depending on the particular data reporting
arrangements that have been approved by an EPA regional office for a
given state. A list of certifying agencies by individual monitor is
available on the AQS-TTN Web site.\3\
---------------------------------------------------------------------------
\3\ https://www.epa.gov/ttn/airs/airsaqs/memos/
criteriamonitorlistbycertifying
agencyandPQAO.xls.
---------------------------------------------------------------------------
The term ``Chemical Speciation Network'' or CSN is being proposed
for addition to the definition list. The CSN network has been
functionally defined as being comprised of the Speciation Trends
Network sites and the supplemental speciation sites that are
collectively operated by monitoring agencies to obtain PM2.5
chemical species data.
The term ``Implementation Plan'' is being proposed for addition to
provide more specificity to current definitions that reference the word
``plan'' in their description. The EPA wishes to ensure that references
to State Implementation Plans (SIPs) are not confused with references
to Annual Monitoring Network Plans that are described in 40 CFR 58.10.
The term ``Local Agency'' is being proposed for revision to clarify
that such organizations are responsible for implementing portions of
annual monitoring network plans. The current definition refers to the
carrying out of a plan which is not specifically defined, leading to
possible confusion with SIPs.
The term ``meteorological measurements'' is being proposed for
clarification that such measurements refer to required parameters at
NCore and photochemical assessment monitoring stations (PAMS).
[[Page 54358]]
The terms ``Monitoring Agency'' and ``Monitoring Organization'' are
being proposed for clarification to include tribal monitoring agencies
and to simplify the monitoring organization definition to reference the
aforementioned monitoring agency definition.
The term ``NCore'' is being proposed for revision to remove
nitrogen dioxide (NO2) and lead in PM10 (Pb-
PM10) as a required measurement and to expand the definition
of basic meteorology to specifically reference the required
measurements: Wind speed, wind direction, temperature, and relative
humidity. The EPA clarifies that NO2 was never a required
NCore measurement and that the current definition was erroneous on this
issue. Additionally, the requirement to measure Pb-PM10 at
NCore sites in areas over 500,000 population is being proposed for
elimination in the rule.
The term ``Near-road NO2 Monitor'' is being proposed for
revision to ``Near-road Monitor.'' This revision is being made to
broaden the definition of near-road monitors to include all such
monitors operating under the specific requirements described in 40 CFR
part 58, appendix D (sections 4.2.1, 4.3.2, 4.7.1(b)(2)) and appendix E
(section 6.4(a), Table E-4) for near-road measurement of
PM2.5 and carbon monoxide (CO) in addition to
NO2.
The term ``Network Plan'' is being proposed for addition to clarify
that any such references in 40 CFR part 58 refer to the annual
monitoring network plan required in 40 CFR 58.10.
The term ``Plan'' is being proposed for deletion as its usage has
been replaced with more specific references to either the annual
monitoring network plan required in 40 CFR 58.10 or the SIP approved or
promulgated pursuant to section 110 of the Clean Air Act.
The term ``Population-oriented Monitoring (or sites)'' is being
proposed for deletion. This term along with the related usage of the
concept of population-oriented monitoring was deleted from 40 CFR part
58 in the 2013 PM2.5 NAAQS final rule (see 78 FR 3235-3236).
As explained in that rule, the action was taken to ensure consistency
with the longstanding definition of ambient air applied to the other
NAAQS pollutants.
The term ``Primary Monitor'' is being proposed for addition to the
definition list. The usage of this term has become important in AQS to
better define the processes used to calculate design values when more
than one monitor is being operated by a monitoring agency for a given
pollutant. This term identifies the primary monitor used as the default
data source in AQS for creating a combined site record.
The term ``Primary Quality Assurance Organization'' is being
proposed for revision to include the usage of the acronym, ``PQAO.''
The terms ``PSD Monitoring Organization'' and ``PSD Monitoring
Network'' are being added to support the proposed new appendix B that
will pertain specifically to QA requirements for prevention of
significant deterioration (PSD) networks.
The term ``PSD Reviewing Authority'' is being added to support the
addition of appendix B to the part 58 appendices and to clarify the
identification of the lead authority in determining the applicability
of QA requirements for PSD monitoring projects.
The term ``Reporting Organization'' is being proposed for revision
to clarify that the term refers specifically to the reporting of data
as defined in AQS. The AQS does allow the distinct designation of
agency roles that include analyzing, certifying, collecting, reporting,
and PQAO.
The term ``SLAMS'' (state and local air monitoring stations) is
being proposed for clarification to clearly indicate that the
designation of a monitor as SLAMS refers to a monitor required under
appendix D of part 58. The SLAMS monitors make up networks that include
NCore, PAMS, CSN, and other state or local agency sites that have been
so designated in annual monitoring network plans.
The terms ``State Agency'' and ``STN'' are proposed for minor
wording changes for purposes of clarity only.
The term ``State Speciation Site'' is being proposed for deletion
in lieu of the proposed addition of ``Supplemental Speciation Station''
to better describe the distinct elements of the CSN network which
includes the Speciation Trends Network Stations that are required under
section 4.7.4 of appendix D of part 58 and supplemental speciation
stations which are operated for specific monitoring agency needs and
are not considered to be required monitors under appendix D.
C. Annual Monitoring Network Plan and Periodic Network Assessment
The EPA finalized the current Annual Monitoring Network Plan
requirement as part of the 2006 amendments to the ambient monitoring
requirements (see 71 FR 61247-61249). The revised requirements were
intended to consolidate separate network plan requirements that existed
for SLAMS and national air monitoring stations (NAMS) networks, clarify
processes for providing public input in the network plans and obtaining
formal EPA Regional Office review, and revise the required plan
elements to address other changes that had occurred in part 58. Since
2006, further revisions to the annual monitoring network plan
requirements have occurred to address new requirements for monitoring
networks including the NCore multi-pollutant network, source-oriented
lead (Pb), near-road monitoring for NO2, CO and
PM2.5, other required NAAQS monitoring, and data quality
requirements for continuous PM2.5 Federal Equivalent Methods
(FEMs).
The current Annual Monitoring Network Plan requirements state that
plans must be made available for public inspection for at least 30 days
prior to submission to the EPA. Additionally, any plans that propose
SLAMS network modifications are subject to EPA Regional Administrator
approval, and either the monitoring agency or the EPA Regional Office
must provide an opportunity for public comment. This process to improve
transparency pertaining to the planning of ambient monitoring networks
has been successful and the EPA believes that state and local agencies
are increasingly receiving public comments on these plans.\4\ To aid in
the visibility of these plans, the EPA hosts an annual monitoring
network plan summary page on its Ambient Monitoring Technical
Information Center (AMTIC) Web site.\5\
---------------------------------------------------------------------------
\4\ The EPA notes that there is no specified process for
obtaining public input into draft annual monitoring network plans
although the typical process is to post the plans on state or local
Web sites along with an on-line process to obtain public comments.
\5\ See https://www.epa.gov/ttn/amtic/plans.html.
---------------------------------------------------------------------------
Since the revision of the annual monitoring network plan process in
2006, the EPA has received feedback from its regional offices as well
as some states that the regulatory language pertaining to public
involvement has been unclear. Areas of confusion include determining
the difference between the process of obtaining public inspection
versus comment, the responsibility of monitoring agencies to respond to
public comment in their submitted plans, and the responsibility of the
EPA regional offices to obtain public comment depending on a monitoring
agency's prior action as well as whether the annual monitoring network
plan was modified based on discussions with the monitoring agency
following plan submission.
The EPA believes that the intent of the 2006 revision to these
requirements was to support wider public involvement in the planning
and implementation of air monitoring
[[Page 54359]]
networks, and, to that extent, the solicitation of public comments
prior to the submission of the annual monitoring network plan to the
EPA regional office is a desirable part of the process. Indeed, the EPA
stated in the preamble to the 2006 amendments that ``Although the
public inspection requirement does not specifically require states to
obtain and respond to received comments, such a process is encouraged
with the subsequent transmission of comments to the appropriate EPA
regional office for review'' (see 71 FR 61248).
Given the heightened interest and visibility of the annual
monitoring network plan process since 2006, the EPA believes that it is
appropriate to propose that the public inspection aspect of this
requirement contained in 40 CFR 58.10(a)(1) be revised to clearly
indicate that obtaining public comment is a required part of the
process, and that plans that are submitted to the EPA regional offices
should address such comments that were received during the public
notice period. The EPA understands that this proposed change in process
could increase burden for those monitoring agencies that have not
routinely incorporated public comments into their annual monitoring
network plan process. However, we believe that these efforts will
increase the transparency of the current process and potentially reduce
questions and adverse comment from stakeholders who have not been
included in annual monitoring network plan discussions prior to
submission to the EPA. For those monitoring agencies that already have
been posting plans for public comment, this proposed change should have
no net effect on workload.
A related part of the annual monitoring network plan process is
described in 40 CFR 58.10(a)(2) with the distinction that this section
pertains specifically to plans that propose SLAMS modifications and
thereby also require specific approval from the EPA Regional
Administrator. Similar to the public comment issue described above, the
process of obtaining such comment for plans that contain network
modifications was not clearly described, with the regulatory text
initially placing the responsibility on the EPA regional offices to
obtain public comment, but then providing monitoring agencies with the
option of obtaining public comment, which consequently would relieve
the EPA regional office from having to do so. Consistent with the
proposed change to the comment process described above, the EPA is
proposing changes to the text in 40 CFR 58.10(a)(2) to reflect the fact
that public comments will have been required to be obtained by
monitoring agencies prior to submission and that the role of the EPA
regional office will be to review the submitted plan together with
public comments and any modifications to the plan based on these
comments. On an overall basis, the EPA believes that this proposed
change to clearly place the responsibility for obtaining public comment
on monitoring agencies makes sense since these organizations are, in
effect, closer to their stakeholders and in a better position to notify
the public about the availability and key issues contained in annual
monitoring network plans, compared with similar efforts by the EPA
regions that oversee many such agencies.
On a related note, the EPA emphasizes the value of the partnership
between monitoring agencies and their respective EPA regional offices,
and encourages an active dialogue between these parties during the
development and review of annual monitoring network plans. Although the
monitoring regulations only require that the EPA Regional
Administrators approve annual monitoring network plans that propose
changes to SLAMS stations, the EPA encourages monitoring agencies to
seek formal approval of submitted plans regardless of whether SLAMS
changes are proposed or not. Such a process would ensure that not only
plans with proposed modifications are formally approved, but also that
plans where potential network changes are indeed appropriate but not
proposed, would be subject to discussion. Although the EPA is not
proposing that annual monitoring network plans that do not propose
changes to SLAMS should also be subject to the EPA Regional
Administrator's approval, we support close working relationships
between monitoring agencies and the EPA regions and see value in having
a formal review of all such plans, regardless of whether network
modifications are proposed.
Another aspect of the annual monitoring network plan requirements
is the listing of required information for each proposed and existing
site as described in 40 CFR 58.10(b). The EPA is proposing to add two
elements to this list as described below.
First, the EPA is proposing to require that a PAMS network
description be specifically included as a part of the annual monitoring
network plan for any monitoring agencies affected by PAMS requirements.
The requirements for such a plan are already referenced in appendix D,
sections 5.2 and 5.4 of this part. In fact, the requirement for an
``approved PAMS network description provided by the state'' is already
specified in section 5.4. Accordingly, the EPA is proposing that a PAMS
network description be a required element in annual monitoring network
plans for affected monitoring agencies, and that any such plans already
developed for PAMS networks in accordance with section 5 of appendix D
could be used to meet this proposed requirement. The EPA believes that
the burden impact of this proposed change should be minimal, as a
review of archived 2012 annual monitoring network plans posted on the
EPA's AMTIC Web page shows that many such plans already include
references to PAMS stations. For purposes of consistency and clarity,
however, the EPA believes there is merit for proposing this revision to
the annual monitoring network plan requirements so that stakeholders
interested in the operation of PAMS stations can find the relevant
information in one place.
Second, the EPA is proposing language that affects ``long-term''
Special Purpose Monitors (SPMs), i.e., those SPMs operating for longer
than 24 months whose data could be used to calculate design values for
NAAQS pollutants in cases where the EPA approved methods are being
employed. As long as such monitors are classified as SPMs, their
operation can be discontinued without EPA approval per 40 CFR 58.20(f).
While such operational flexibility is a key component of special
purpose monitoring, the issue can become more complex when longer-term
SPMs measure elevated levels of criteria pollutants and potentially
become design value monitors for a region. In such cases, the EPA is
faced with scenarios where key monitors that can impact the attainment
status of a region can potentially be discontinued without prior
notification or approval. Given the important regulatory implications
of such monitoring network decisions, the EPA believes that it is
important that the ongoing operation and treatment of such SPMs be
specifically called out and discussed in annual monitoring network
plans. Therefore, the EPA is proposing that a new required element be
added to the annual monitoring network plan requirements. Specifically,
the EPA is proposing that such long-term SPMs be identified in the
plans along with a discussion of the rationale for keeping the
monitor(s) as SPMs or potentially reclassifying to SLAMS. The EPA is
not proposing that such monitors must become SLAMS, only that the
ongoing operation of such monitors and the rationale for retaining them
as SPMs be explicitly discussed to avoid confusion
[[Page 54360]]
and the potential for unintended complexities in the designations
process if any design value SPMs would be discontinued without adequate
discussion.
The EPA is proposing minor edits to the annual monitoring network
plan requirements to revise terminology referring to PM2.5
speciation monitoring, to note the proposed addition of appendix B to
the QA requirements (see section III.B of this preamble), and to
clarify that annual monitoring network plans should include statements
addressing whether the operation of each monitor meets the requirements
of the associated appendices in part 58.
Finally, the issue has arisen concerning the flexibility that the
EPA Regional Administrators have with reference to the approvals that
are required within 120 days of annual monitoring network plan
approval, for example, in the situation where the majority of the
submitted plan is acceptable but one or more of the required elements
is problematic. In these situations, which we believe to be infrequent,
the existing regulatory language provides sufficient flexibility for
such situations to be handled on a case-by-case basis, for example,
through the use of a partial approval process where the Regional
Administrator's approval decision letter specifies what elements of the
submitted plan are approved and what elements are not. Alternatively,
if the plan satisfies the requirements for network adequacy under
appendix D and the monitors are suitable for regulatory decisions
(consistent with the requirements of appendix A), the Regional
Administrator has the discretion to approve the plan, while noting
technical deficiencies to be corrected. We would expect that the
resolution of the specific items under discussion would be documented
through follow-up communications with the submitting monitoring agency
to ensure that a complete record exists for the basis of the annual
monitoring network plan approval.
The EPA solicits comments on all of the proposed changes to annual
monitoring network plans requirements contained in 40 CFR 58.10.
D. Network Technical Requirements
The EPA is proposing to revise the language in 40 CFR 58.11(a)(3)
to note the proposed revisions to appendix B to the QA requirements
(see section III.B of this preamble) that would pertain to PSD
monitoring sites.
E. Operating Schedules
The operating schedule requirements described in 40 CFR 58.12
pertain to the minimum required frequency of sampling for continuous
analyzers (for example, hourly averages) and manual methods for
particulate matter (PM) and Pb sampling (typically 24-hour averages for
manual methods). The EPA is proposing to revise these requirements in
three ways: By proposing added flexibility in the minimum required
sampling for PM2.5 mass sampling and for PM2.5
speciation sampling; by modifying language pertaining to continuous
mass monitoring to reflect revisions in regulatory language that were
finalized in the 2013 p.m. NAAQS final rule; and by clarifying the
applicability of certain criteria that can lead to an increase in the
required sampling frequency, for example, to a daily schedule.
With regard to the minimum required sampling frequency for manual
PM2.5 samplers, current requirements state that at least a
1-in-3 day frequency is mandated for required SLAMS monitors without a
collocated continuous monitor. For the majority of such manual
PM2.5 samplers, the EPA continues to believe that a 1-in-3
day sampling frequency is appropriate to meet the data quality
objectives that support the PM2.5 NAAQS.\6\ For a subset of
these monitors, however, the EPA believes that some regulatory
flexibility may be appropriate in situations where a particular monitor
is highly unlikely to record a violation of the PM2.5 NAAQS.
Such situations might occur in areas with very low PM2.5
concentrations relative to the NAAQS and/or in urban areas with many
more monitors than are required by appendix D and a subset of those
monitors are reading lower than other monitors in the area. In these
situations, the EPA believes it is appropriate to propose that the
required sampling frequency could be reduced to 1-in-6 day sampling or
another alternate schedule through a case-by-case approval by the EPA
Regional Administrator. Such approvals could be based on factors that
are already described in 40 CFR 58.12(d)(1)(ii) such as historical
PM2.5 data assessments, the attainment status of the area,
the location of design value sites, and the presence of continuous
PM2.5 monitors at nearby locations. The EPA envisions that
the request for such reductions in sampling frequency would occur
during the annual monitoring network plan process as operating
schedules are a required part of the plans as stated in 40 CFR
58.10(b)(4).
---------------------------------------------------------------------------
\6\ According to a retrieval from AQS dated 12-23-2013,
approximately 65% of primary PM2.5 samplers (those
monitors with a parameter occurrence code of ``1'') operated on a 1-
in-3 day sampling frequency.
---------------------------------------------------------------------------
For sites with a collocated continuous monitor, the EPA also
believes that the current regulatory flexibility to reduce to 1-in-6
day sampling or a seasonal sampling schedule is appropriate based on
factors described above, and in certain cases, may also be applicable
to lower reading SLAMS sites without a collocated continuous monitor,
for example, to reduce frequency from 1-in-6 day sampling to a seasonal
schedule. Accordingly, we have proposed such flexibility through
changes in the regulatory language in 40 CFR 58.12(d)(1)(i) and (ii).
The EPA also believes that some flexibility for sampling frequency
is appropriate to propose for PM2.5 Chemical Speciation
Stations, specifically the Speciation Trends Network (STN) sites that
are at approximately 53 locations.\7\ The STN stations are currently
required to sample on at least a 1-in-3 day frequency with no
opportunity for flexibility. While the EPA firmly believes in the long-
term importance of the STN stations to support the development of SIPs,
modeling exercises, health studies, and the investigation of air
pollution episodes and exceptional events, we do not believe that the
current inflexibility with regard to sampling frequency is in the best
interests of monitoring agencies, the EPA, or stakeholders. For the
past several years, the EPA has been investigating alternative
monitoring technologies such as continuous PM2.5 speciation
methods that can supplement or potentially even replace manual
PM2.5 speciation methods.\8\ As these methods become more
refined, the EPA may wish to selectively reduce sampling frequency at
manual samplers for one or more channels to conserve resources for
reinvestment in other needs within the CSN network. Additionally, the
EPA is currently conducting an assessment of the entire CSN network to
evaluate the long-term viability of the program in the context of
changes in air quality, the recently revised PM NAAQS, rising
analytical costs, and flat or declining resources. Accordingly, for the
reasons mentioned above, the EPA is proposing that a reduction in
sampling frequency from 1-in-3 day be permissible for manual
PM2.5 samplers at STN stations. The approval for such
changes at STN stations, on a case-by-case basis, would be made by the
EPA Administrator as the authority for changes to STN has
been retained at the Administrator level per appendix D of this part,
section 4.7.4. Factors that would be considered as part of the decision
would include an area's design value, the role of the particular site
in national health studies, the correlation of the site's species data
with nearby sites, and presence of other leveraged measurements. In
practice, we would expect a close working relationship with the EPA
regional offices and monitoring agencies to consider such changes to
STN, preferably as part of the annual monitoring network plan process,
taking into account the findings of the CSN assessment process that is
expected to be completed later in 2014, as well as a parallel effort
being undertaken for the Interagency Monitoring of Protected Visual
Environments (IMPROVE) network.\9\
---------------------------------------------------------------------------
\7\ https://www.epa.gov/ttn/amtic/specgen.html.
\8\ https://www.epa.gov/ttnamti1/spesunset.html.
\9\ https://vista.cira.colostate.edu/improve/Default.htm.
---------------------------------------------------------------------------
The EPA is proposing editorial revisions to 40 CFR 58.12(d)(1)(ii)
to harmonize the language regarding the use of continuous FEM or
approved regional methods (ARM) monitors to support sampling frequency
flexibility for manual PM2.5 samplers with the current
language in 40 CFR 58.12(d)(1)(iii) that was revised as part of 2013 PM
NAAQS final rule. Specifically, the phrase ``unless it is identified in
the monitoring agency's annual monitoring network plan as not
appropriate for comparison to the NAAQS and the EPA Regional
Administrator has approved that the data from that monitor may be
excluded from comparison to the NAAQS'' is being proposed for appending
to the current regulatory language. This change reflects the new
process that was finalized in the 2013 PM NAAQS final rule that allows
monitoring agencies to request that continuous PM2.5 FEM
data be excluded from NAAQS comparison based on technical criteria
described in 40 CFR 58.11(e) (see 78 FR 3241-3244). If such requests
are made by monitoring agencies and subsequently approved by the EPA
regional offices as part of the annual monitoring plan process, it
follows that the data from these continuous PM2.5 FEMs would
also not be of sufficient quality to support a request for sampling
reduction for a collocated manual PM2.5 sampler. The EPA
revised the relevant language in one section of 40 CFR 58.12 during the
2013 PM rulemaking but failed to revise a similar phrase in another
section of 40 CFR 58.12. Accordingly, the EPA is proposing the change
to ensure consistent regulatory language throughout 40 CFR 58.12.
Within these editorial changes, we are also proposing the addition of
the phrase ``and the EPA Regional Administrator has approved that the
data from that monitor may be excluded from comparison to the NAAQS''
to the revisions that were made with the 2013 PM NAAQS. This revision
is being proposed to clearly indicate that two distinct actions are
necessary for the data from a continuous PM2.5 FEM to be
considered not comparable to the NAAQS: first, the identification of
the relevant monitor(s) in an agency's annual monitoring network plan,
and, second, the approval by the EPA Regional Administrator of that
request to exclude data. The language used by the EPA in the relevant
sections of 40 CFR 58.12 related to the initial request by monitoring
agencies but did not specifically address the needed approval by the
EPA.
Finally, the EPA is clarifying the applicability of statements in
40 CFR 58.12(d)(1)(ii) and (iii) that reference the relationship of
sampling frequency to site design values. Specifically, we are
proposing clarifications and revisions affecting the following
statements: (1) ``Required SLAMS stations whose measurements determine
the design value for their area and that are within ±10
percent of the NAAQS; and all required sites where one or more 24-hour
values have exceeded the NAAQS each year for a consecutive period of at
least 3 years are required to maintain at least a 1-in-3 day sampling
frequency,'' and (2) ``Required SLAMS stations whose measurements
determine the 24-hour design value for their area and whose data are
within ±5 percent of the level of the 24-hour
PM2.5 NAAQS must have a Federal Reference Method (FRM) or
FEM operate on a daily schedule if that area's design value for the
annual NAAQS is less than the level of the annual PM2.5
standard.'' Since these provisions were finalized in 2006, there has
been some confusion among monitoring agencies and regional offices
concerning the applicability of the sampling frequency adjustments
because design values are recalculated annually and, in some situations,
such revised design values can either fall below the comparative
criteria or rise above the criteria. For example, if according to 40
CFR 58.12(d)(1)(iii) a sampler must be on a daily sampling schedule
because its design value is within ±5 percent of the 24-hour
NAAQS and it meets the other listed criteria, how and when should the
sampling frequency be revised if the referenced 24-hour design value
falls outside the ±5 percent criteria the following year? In
an extreme example, what would happen if the 24-hour design value
changed each year to be alternately within the ±5 percent criteria and
then not within the criteria?
It was not the EPA's intention in the 2006 monitoring revisions to
create scenarios in which the required sampling frequencies for
individual samplers would be ``chasing'' annual changes in design
values. Such a framework would be difficult to implement for both
monitoring agencies and regional offices for logistical reasons
including the scheduling of operators and the availability of
PM2.5 filters, and also because of the time lag involved
with reporting and certifying data and the validation of revised design
values, which typically does not occur until the summer following the
completion of each calendar year's sampling. To provide some clarity to
this situation as well as to provide a framework where changes in
sampling frequency occur on a more consistent and predictable basis,
the EPA is proposing that design value-driven sampling frequency
changes be maintained for a minimum 3-year period once such a change is
triggered. Additionally, such changes in sampling frequency would be
required to be implemented no later than January 1 of the year
following the recalculation and certification of a triggering design
value. For example, if a triggering design value that required a change
to daily sampling frequency was calculated in the summer of 2014 based
on 2011-2013 certified data, then the affected sampler would be
required to have an increased sampling frequency no later than January
1, 2015, and would maintain that daily frequency through at least 2017,
regardless of changes to the triggering design value in the intervening
years.
To accomplish these proposed changes, the EPA is proposing changes
in the 40 CFR 58.12 regulatory text to clarify that sampling frequency
changes that are triggered by design values must be maintained until
the triggering design value site no longer meets the criteria for at
least 3 consecutive years. Specifically, these changes include the
insertion of the phrase ``until the design value no longer meets these
criteria for 3 consecutive years'' into 40 CFR 58.12(d)(1)(ii) and the
sentence ``The daily schedule must be maintained until the referenced
design values no longer meet these criteria for 3 consecutive years''
into 40 CFR 58.12(d)(1)(iii). The EPA notes that where a design value
is based on 3 years of data, 3 consecutive years of design values would
require 5 years of data (e.g., 2010-2012, 2011-2013, 2012-2014). New
regulatory
language has been proposed in 40 CFR 58.12(d)(1)(iv) to document the
timing of when design value-driven changes in sampling frequency must
be implemented.
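To illustrate how the proposed timing provisions would operate in
practice, the following minimal sketch encodes the two rules described
above; the helper names and date arithmetic are illustrative only and
are not part of the proposed regulatory text.

```python
from dataclasses import dataclass

# Illustrative sketch of the proposed timing rules: a triggering design
# value (DV) calculated and certified in year Y (from data for years Y-3
# through Y-1) must be implemented by January 1 of year Y+1, and the new
# sampling frequency is maintained for at least 3 years (5 years of
# underlying data for 3 consecutive 3-year DVs).

@dataclass
class TriggeringDesignValue:
    certified_year: int  # year the triggering DV was calculated/certified

def implementation_deadline(dv: TriggeringDesignValue) -> str:
    """Change must be in place no later than January 1 of the next year."""
    return f"January 1, {dv.certified_year + 1}"

def maintained_through(dv: TriggeringDesignValue) -> int:
    """Earliest year through which the changed frequency must be held."""
    return dv.certified_year + 3

dv = TriggeringDesignValue(certified_year=2014)  # based on 2011-2013 data
print(implementation_deadline(dv))  # January 1, 2015
print(maintained_through(dv))       # 2017
```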
On balance, the EPA believes that the overall impact of proposed
changes to the operating schedule requirements will be a modest
reduction in the burden for monitoring agencies. We believe that the
number of PM2.5 FRM and CSN samplers impacted by these
proposed changes will be relatively small, but where they occur, they
will provide some logistical relief for sites that are less critical in
terms of NAAQS implementation and other key objectives. The EPA
solicits comment on all of these proposed changes to 40 CFR 58.12
requirements.
F. System Modification
In the 2006 monitoring amendments, the EPA finalized a requirement
in 40 CFR 58.14(a) for monitoring agencies to ``develop and implement a
plan and schedule to modify the ambient air quality network that
complies with the finding of the network assessments required every 5
years by 58.10(e).'' The remainder of the associated regulatory
language reads very much like the required procedure for making annual
monitoring network plans available for public inspection, comment, and
the EPA Regional Administrator's approval as described in 40 CFR
58.10(a)(1) and (2). Since 2006, there has been confusion between the
EPA and the monitoring agencies as to whether a separate plan was
required to be submitted by 40 CFR 58.14(a) relative to the annual
monitoring network plan, with that separate plan devoted specifically
to discussing the results of the 5-year network assessment.
A review of the 2006 monitoring proposal and final rule reveals no
specific discussion concerning the submission of a distinct plan
devoted specifically to the implementation of the 5-year network
assessment. While the EPA continues to support the importance of the
network assessment requirement,\10\ there appears to be no specific
benefit to the requirement for a distinct plan to discuss the 5-year
network assessments, and the inference of the need for such a plan may
be attributable to some redundancy in the aforementioned requirements
when the regulatory language was revised in 2006. Monitoring agencies,
for example, could include a specific section or attachment to the
annual monitoring network plan that fulfilled all the requirements
described in 40 CFR 58.14(a) including how each agency would implement
the findings of the assessment and the schedule for doing so. By
including such information in the annual monitoring network plans, the
implied need to develop a separate plan, with the attendant steps of
public posting, obtaining public comment, and the EPA Regional
Administrator's review and approval, can be avoided, reducing the burden
on all parties.
---------------------------------------------------------------------------
\10\ The next 5-year network assessment will be due no later
than July 1, 2015, according to the schedule established by 40 CFR
58.10(d).
---------------------------------------------------------------------------
In terms of timing, these specific sections or attachments
referring to the 5-year network assessments could be required either in
the year when the assessment is due (e.g., 2015) or in the year
following when the assessment is due (e.g., 2016). The submission in
the year following the network assessment would allow more time for
monitoring agencies to fully consider the results of the 5-year
assessment and would also allow the public more time to review and
comment on the recommendations.
Accordingly, the EPA is proposing to revise the regulatory language
in 40 CFR 58.14(a) to clearly indicate that a separate plan is not
needed to account for the findings of the 5-year network assessment,
and that the information concerning the implementation of the 5-year
assessment, referred to in the proposed regulatory language as a
``network modification plan,'' shall be submitted as part of the annual
monitoring network plan that is due no later than the year after the
network assessment is due.\11\ According to the proposed schedule, the
annual monitoring network plans that are due in 2016, 2021, etc., would
contain the information referencing the network assessments.
---------------------------------------------------------------------------
\11\ Monitoring agencies, at their discretion, could submit the
network modification plan in the year that the assessment is due if
sufficient feedback had been received. On balance, EPA believes that
the extra year following the completion of the network assessment
would be valuable to assure a productive outcome from the assessment
process.
---------------------------------------------------------------------------
The EPA is also proposing to revise an incorrect cross-reference in
the current text of 40 CFR 58.14(a) in which the network assessment
requirement is noted as being contained in 58.10(e) when the correct
cross-reference is 58.10(d).
G. Annual Air Monitoring Data Certification
The data certification requirement is intended to provide ambient
air quality data users with an indication that all required validation
and reporting steps have been completed, and that the certified data
sets are now considered final and appropriate for all uses including
the calculation of design values and the determination of NAAQS
attainment status. The formal certification process currently involves
the transmission of a data certification letter to the EPA signed by a
senior monitoring agency official that references the list of monitors
being certified. The letter is accompanied by required AQS reports that
summarize the data being certified and the accompanying QA data that
support the validation of the referenced list of monitors. Once the
letter and required reports are submitted to the EPA, the data
certification requirement has been fulfilled. In practice, the EPA has
provided an additional discretionary review of the data certification
submissions by monitoring agencies to make sure the submissions are
complete and that the EPA agrees that the referenced data are of
appropriate quality. When these reviews have been completed, the EPA's
review has been documented by the presence of a specific AQS flag for
each monitor-year of data that has been certified and reviewed.
The actual breadth of data certification requirements has not
materially changed since the original requirements were finalized in
1979 as part of the requirement for monitoring agencies to submit an
annual SLAMS summary report (see 44 FR 27573). Data certification
requirements were last revised in 2006 when the deadline for
certification was changed to May 1 from July 1 for most measurements.
Current requirements include the certification of data collected at
all SLAMS and SPMs using FRM, FEM, or ARM methods. In practice, this
requirement includes a very wide range of measurements that are not
limited to criteria pollutants but also extend to non-criteria
pollutant measurements at PAMS stations, meteorological measurements at
PAMS and NCore stations, and PM2.5 chemical speciation
parameters. For monitoring agencies operating these complex stations,
this places an additional burden on the data review and validation
process in addition to the routine procedures already in place to
validate and report data as required by 40 CFR 58.16. For example,
current PAMS requirements include the reporting of approximately 54
individual ``target list'' volatile organic compounds per station while
many dozens of PM2.5 species are reported at CSN stations.
None of these specialized monitoring programs were in place when
the data certification requirements were originally promulgated and the
large number of measurements being obtained
in typical modern-day monitoring networks has resulted in a burden
overload that has threatened the viability of the data certification
process. For example, monitoring agencies have struggled with the
availability of specific QA checks that can be used to meet the
certification requirements for PAMS and CSN data, and the EPA's
discretionary reviews of data certification submissions have become
increasingly incomplete or delayed due to the enormous number of
monitors being submitted for certification under the current
requirements.
The EPA believes that the data certification requirements need to
be revised to streamline the associated workload for monitoring
agencies as well as the EPA so that the process can be focused on those
measurements that have greatest impacts on state programs, namely the
criteria pollutants that support the calculation of annual design
values and the mandatory designations process. By focusing the data
certification process on the NAAQS, the greatest value will be derived
from the certification process and both the monitoring agencies and the
EPA will be able to devote scarce resources to the most critical of
ambient monitoring objectives. The EPA is not implying that the need
for thorough data validation processes is unimportant for non-criteria
pollutants. However, we believe that existing QA plans and standard
operating procedures, together with the regulatory language in 40 CFR
58.16(c) to edit and report validated data, are sufficient to assure the
quality of non-criteria pollutant measurements being reported to AQS.
Accordingly, the EPA is proposing several changes in the data
certification requirements to accomplish a streamlining of this
important process. First, to support the focus on certification of
criteria pollutant measurements, the EPA is proposing to revise
relevant sections of 40 CFR 58.15 to focus the requirement on FRM, FEM,
and ARM monitors at SLAMS and at SPM stations rather than at all SLAMS
which also include PAMS and CSN measurements that may not utilize
federally approved methods. This proposed wording change limits the
data certification requirement to criteria pollutants since EPA-
approved methods do not exist for non-criteria measurements. Second,
the EPA is also proposing that the required AQS reports be submitted to
the Regional Administrator rather than through the Regional
Administrator to the Administrator as is currently required. From a
process standpoint, this proposed change effectively places each EPA
regional office in charge of the entire data certification process
(including the discretionary review) versus the EPA headquarters where
the discretionary reviews have taken place in the past. This delegation
of responsibility for the discretionary review will allow this
important part of the certification process to be shared among the ten
EPA regional offices, and result in a more timely review of
certification results and the posting of appropriate certification
status flags in AQS. The EPA notes that significant progress has
already been made in revising this part of the certification process
and that a new AQS report, the AMP 600, has been developed to more
efficiently support the sharing of relevant information between
certifying agencies and the EPA regional offices.\12\
---------------------------------------------------------------------------
\12\ Note relevant training material available on the AQS TTN
Web site: https://www.epa.gov/ttn/airs/airsaqs/training/
2013Q2WebinarFinal.pdf.
---------------------------------------------------------------------------
Additionally, minor editorial changes are being proposed in 40 CFR
58.15 to generalize the title of the official responsible for data
certification (senior official versus senior air pollution control
officer) and to remove an outdated reference to the former due date for
the data certification letter (July 1 versus the current due date of
May 1).
H. Data Submittal and Archiving Requirements
The requirements described in 40 CFR 58.16 address the specific
measurements that must be reported to AQS as well as the relevant
schedule for doing so. Required measurements include criteria
pollutants in support of NAAQS monitoring objectives as well as public
reporting, specific ozone (O3) and PM2.5
precursor measurements such as those obtained at PAMS, NCore, and CSN
stations, selected meteorological measurements at PAMS and NCore
stations, and associated QA data that support the assessment of
precision and bias.
In 1997, an additional set of required supplemental measurements
was added to 40 CFR 58.16 in support of the newly promulgated FRM for
PM2.5, described in 40 CFR part 50, appendix L. These
measurements included maximum, minimum, and average ambient
temperature; maximum, minimum, and average ambient pressure; flow rate
coefficient of variation (CV); total sample volume; and elapsed sample
time. In the 2006 monitoring amendments, many of these supplemental
measurements were removed from the requirements based on the EPA's
confidence that the PM2.5 FRM was meeting data quality
objectives (see 71 FR 2748). At that time, reporting requirements were
retained for average daily ambient temperature and average daily
ambient pressure, as well as any applicable sampler flags, in addition
to PM2.5 mass and field blank mass. Given the additional
years of data supporting the performance of the PM2.5 FRM as
well as the near ubiquitous availability of meteorological data
available from sources such as the National Weather Service automated
surface observing system \13\ in addition to air quality networks, the
EPA believes that it is no longer necessary to require agencies to
report the average daily temperature and average daily pressure from
manual PM2.5 samplers, thereby providing some modest relief
from the associated reporting burden. Accordingly, the EPA is proposing
to remove AQS reporting requirements for average daily temperature and
average daily pressure as related to PM2.5 measurements with
the expectation that monitoring agencies will retain such measurements
as needed to support data validation needs as well as to fulfill
requirements in associated QA project plans and standard operating
procedures. The EPA is also proposing to remove similar language
referenced elsewhere in 40 CFR 58.16 that pertains to measurements at
Pb sites as well as to other average temperature and average pressure
measurements recorded by samplers or from nearby airports. For the
reasons noted above, the EPA believes that meteorological data are more
than adequately available from a number of sources, and that the
removal of specific requirements for such data to be reported to AQS
represents an opportunity for burden reduction. The EPA notes that the
requirement to report specific meteorological data for NCore and PAMS
stations remains unchanged.
---------------------------------------------------------------------------
\13\ See https://www.nws.noaa.gov/ost/asostech.html.
---------------------------------------------------------------------------
The EPA is also proposing a change to the data reporting schedule
described in 40 CFR 58.16(b) and (d) to provide additional flexibility
for reporting PM2.5 chemical speciation data measured at CSN
stations. Specifically, we are proposing that such data be required to
be reported to AQS within 6 months following the end of each quarterly
reporting period, as is presently required for certain PAMS
measurements such as volatile organic compounds. This change would
provide an additional 90 days for PM2.5 chemical speciation
data to be reported compared with the current requirement of reporting
90 days after the end of each
quarterly reporting period. This change is being proposed to provide
both the EPA and monitoring agencies with potential data reporting
flexibility as technological and procedural revisions are considered
for the national analytical frameworks that support the CSN network.
Given that the primary objectives of the CSN (and IMPROVE) programs are
to support long-term needs such as SIP development, modeling, and
health studies, the EPA believes that such programs would not be
negatively impacted by the revised reporting requirements and that
potential contractual efficiencies could be realized by allowing more
time for analytical laboratories to complete their QA reviews and
report their results to AQS.
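As a concrete illustration of the proposed schedule change, the sketch
below computes both the current 90-day deadline and the proposed
6-month deadline for a quarterly reporting period; the function names
are hypothetical and are not part of the regulatory text.

```python
from datetime import date, timedelta

def quarter_end(year: int, quarter: int) -> date:
    """Last day of the given calendar quarter."""
    m = quarter * 3 + 1  # first month of the next quarter
    return date(year + (m > 12), m - 12 if m > 12 else m, 1) - timedelta(days=1)

def current_deadline(year: int, quarter: int) -> date:
    """Current rule: 90 days after the end of the quarter."""
    return quarter_end(year, quarter) + timedelta(days=90)

def proposed_deadline(year: int, quarter: int) -> date:
    """Proposed rule: 6 months after the end of the quarter."""
    m = quarter * 3 + 7  # 6 months past the quarter's final month
    y, m = year + (m - 1) // 12, (m - 1) % 12 + 1
    return date(y, m, 1) - timedelta(days=1)

print(current_deadline(2014, 1))   # 2014-06-29
print(proposed_deadline(2014, 1))  # 2014-09-30
```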
I. Network Design Criteria (Appendix D)
The EPA is proposing two changes that affect the required suite of
measurements in the NCore network. This multi-pollutant network became
operational on January 1, 2011, and includes approximately 80 stations
that are located in both urban and rural areas.\14\
---------------------------------------------------------------------------
\14\ See https://www.epa.gov/ttn/amtic/ncore/ for more
information.
---------------------------------------------------------------------------
The EPA is proposing a minor change to section 3 of appendix D to
part 58, the design criteria for NCore sites. Specifically, we are
proposing to delete the requirement to measure speciated
PM10-2.5 from the list of measurements in section 3(b). An
identical revision was finalized in the text of 40 CFR 58.16(a) in the
2013 PM NAAQS final rule (see 78 FR 3244). At that time, we noted the
lack of consensus on appropriate sampling and analytical techniques for
speciated PM10-2.5, and the pending analysis of data from a
pilot project that examined these issues. Based on the supportive
comments received from monitoring agencies and multi-state
organizations, the EPA deleted the requirement for speciated
PM10-2.5 from 40 CFR 58.16(a). During this process, the EPA
inadvertently failed to complete a similar change that was required in
the language of section 3 of appendix D. Accordingly, we are proposing
this change to align the NCore monitoring requirements between the two
sections noted above.
The EPA is also proposing to delete the requirement to measure Pb
at urban NCore sites, either as Pb in Total Suspended Particles (Pb-
TSP) or as Pb-PM10. This requirement was finalized as part
of the reconsideration of Pb monitoring requirements that occurred in
2010 (see 75 FR 81126). At that time, we noted that monitoring of Pb at
such nonsource locations at NCore sites would support the
characterization of typical neighborhood-scale Pb concentrations in
urban areas to assist with the understanding of the risk posed by Pb to
the general population. We also noted that such information could
assist with the determination of nonattainment boundaries and support
the development of long-term trends.
Since this requirement was finalized in late 2010, nonsource lead
data have been measured at 50 urban NCore sites, with the majority of
sites having already collected at least 2 years of data. In all cases,
valid ambient Pb readings have been low, with maximum 3-month rolling
averages typically reading around 0.01 micrograms per cubic meter as
compared to the NAAQS level of 0.15 micrograms per cubic meter.\15\ We
expect the majority of sites to have the 3 years necessary to calculate
a design value following the completion of monitoring in 2014. Given
the uniformly low readings being measured at these NCore sites, we
believe it is appropriate to consider eliminating this requirement. As
noted in the associated docket memo, nonsource Pb data will continue to
be measured (as Pb-PM10) at the 27 National Air Toxics
Trends Sites (NATTS) and at hundreds of PM2.5 speciation
stations that comprise the CSN and IMPROVE networks. The EPA believes
that these ongoing networks adequately support the nonsource monitoring
objectives articulated in the 2010 Pb monitoring reconsideration.
---------------------------------------------------------------------------
\15\ See supporting information for reconsideration of existing
requirements to monitor for lead at urban NCore site, Kevin
Cavender, Docket number EPA-HQ-OAR-2013-0619.
---------------------------------------------------------------------------
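The Pb figures cited above reflect the form of the Pb NAAQS, which is
evaluated as a maximum rolling 3-month average. A minimal sketch of
that comparison follows; the monthly values are hypothetical and chosen
to resemble the low NCore readings described in the text.

```python
# Maximum rolling 3-month average of monthly mean Pb concentrations
# (ug/m3), compared against the 0.15 ug/m3 NAAQS level cited above.

NAAQS_PB = 0.15  # ug/m3

def max_rolling_3_month_average(monthly_means: list[float]) -> float:
    """Highest average over any 3 consecutive months."""
    return max(
        sum(monthly_means[i:i + 3]) / 3
        for i in range(len(monthly_means) - 2)
    )

# Hypothetical monthly means near the ~0.01 ug/m3 levels observed at NCore
monthly = [0.008, 0.010, 0.012, 0.009, 0.011, 0.010,
           0.013, 0.012, 0.009, 0.008, 0.010, 0.011]
peak = max_rolling_3_month_average(monthly)
print(f"max 3-month average: {peak:.3f} ug/m3 vs NAAQS {NAAQS_PB}")
```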
Accordingly, the EPA is proposing to delete the requirement to
monitor for nonsource Pb at NCore sites from appendix D of 40 CFR part
58.\16\ Given the requirement to collect a minimum of 3 years of Pb
data in order to support the calculation of design values, the EPA
proposes that monitoring agencies would be able to request permission
to discontinue nonsource monitoring following the collection of at
least 3 years of data at each urban NCore site.\17\ Affected monitoring
agencies should work closely with their respective EPA regional offices
to ensure coordination of these changes to the network.
---------------------------------------------------------------------------
\16\ Specific revisions are proposed in 40 CFR part 58, appendix
D, section 3(b) and sections 4.5(b) and 4.5(c).
\17\ The EPA will review requests for shutdown under the
provisions of 40 CFR 58.14. Although EPA anticipates that these
nonsource monitors will have design values well below the NAAQS and
will be eligible to be discontinued after three years of data have
been collected, in the event that a monitor records levels
approaching the NAAQS it may not qualify to be discontinued.
---------------------------------------------------------------------------
The EPA solicits comments on these proposed changes to Pb
monitoring requirements.
III. Proposed Changes to Quality Assurance Requirements
A. Quality Assurance Requirements for Monitors Used in Evaluations for
National Ambient Air Quality Standards--Appendix A
1. General Information
The following proposed changes to monitoring requirements impact
these subparts of part 58--Ambient Air Quality Surveillance; appendix
A--Quality Assurance Requirements for SLAMS, SPMs and PSD Air
Monitoring. Changes that affect the overall appendix follow while those
specific to the various sections of the appendix will be addressed
under a specific section heading. The EPA notes that the entire
regulatory text section for appendix A is being reprinted with this
proposal since this section is being reorganized for clarity as well as
being selectively revised as described in detail below. Likewise,
although the EPA is proposing a new appendix B to apply to PSD
monitors, much of the content of appendix B is taken directly from the
existing requirements for these monitors set forth in appendix A. The
EPA is soliciting comment on the specific provisions of appendices A
and B that are being proposed for revision. However, there are a number
of provisions that are being reprinted in the regulatory text solely
for clarity to assist the public in understanding the changes being
proposed; the EPA is not soliciting comment on those provisions and
considers changes to those provisions to be beyond the scope of this
rulemaking.
The QA requirements in appendix A have been developed for measuring
the criteria pollutants of O3, NO2, sulfur
dioxide (SO2), CO, Pb and PM (PM10 and
PM2.5) and are minimum requirements for monitoring these
ambient air pollutants for use in NAAQS attainment demonstrations. To
emphasize the objective of this appendix, the EPA proposes to change
the title of appendix A to ``Quality Assurance Requirements for
Monitors used in Evaluations of National Ambient Air Quality
Standards,'' and remove the terms SLAMS and SPMs from the title. We do,
however, in the applicability paragraph, indicate that any monitor
identified as SLAMS must meet the appendix A criteria in order to avoid
any confusion about SLAMS monitors measuring criteria pollutants.
Special purpose monitors may in fact be monitoring for a criteria
pollutant for other objectives than NAAQS determinations. Therefore,
appendix A attempts to clarify in the title and the applicability
section that the QA requirements specified in this appendix are for
criteria pollutant monitors that are designated, through the part 58
ambient air regulations and monitoring organization annual monitoring
network plans, as eligible to be used for NAAQS evaluation purposes.
The applicability section also provides a reporting mechanism in AQS to
identify any criteria pollutant monitors that are not used for NAAQS
evaluations. The criteria pollutant monitors identified for NAAQS
exclusion will require review and approval by the EPA regional offices,
and this identification will increase transparency and efficiency in
the NAAQS designation, data quality evaluation, and data certification
processes.
The current appendix A regulation has separate sections for
automated (continuous) and manual method types. Since there are
continuous and manual methods for measuring PM which have different
quality control (QC) requirements, monitoring organizations have found
it difficult to navigate the current appendix A requirements. The EPA
proposes to reformat the document by pollutant rather than by method
type. The four gaseous pollutants (CO, NO2, SO2
and O3) will be contained in one section since the QC
requirements are very similar, and separate sections will be provided
for PM10, PM2.5, and Pb.
In the 2006 monitoring rule revisions, the PSD QA requirements,
which were previously in appendix B, were added to appendix A and
appendix B was reserved. The PSD requirements, in most cases, mimicked
appendix A in structure but because PSD monitoring is often only for a
period of one year, some of the frequencies of implementation of the
PSD requirements are higher than the appendix A requirements. In
addition, the agencies governing the implementation, assessment and
approval of the QA requirements are different for PSD and ambient air
monitoring for NAAQS decisions (i.e., the EPA regions for appendix A
versus reviewing authorities for PSD). The combined regulations have
caused confusion among monitoring organizations and those implementing
PSD requirements, and the EPA proposes that the PSD requirements be
moved back to a separate appendix B. This change would also provide
more flexibility for revision if changes in either appendix are needed.
Details of this proposed change will follow in Section III.B.
Finally, the EPA proposes that the appendix A regulation emphasize
the use of the PQAO and that the definition and explanation be moved to
the beginning of the regulation in order to ensure that the application
and use of PQAO in appendix A is clearly understood. The definition for
PQAO is not being proposed for change. Since the PQAO can be a
consolidation of a number of local monitoring organizations, the EPA
proposes to add a sentence clarifying that the agency identified as the
PQAO (usually the state agency) will be responsible for overseeing that
the appendix A requirements are being met by all consolidated local
agencies within the PQAO. The current appendix A regulation requires PQAOs
to be approved by the EPA regions during network reviews or audits. The
EPA believes this approval can occur at any time and proposes to
eliminate the wording that suggests that PQAO approvals can only occur
during events like network reviews or audits.
2. Quality System Requirements
The EPA proposes to remove the QA requirements for
PM10-2.5 (see current sections 3.2.6, 3.2.8, 3.3.6, 3.3.8,
4.3). Appendix A has traditionally been used to describe the QA
requirements of the criteria pollutants used in making NAAQS attainment
decisions. While the 40 CFR part 58 Ambient Air Monitoring regulation
requires monitoring for the CSN, PAMS, and total oxides of Nitrogen
(NOy) for NCore, the QA requirements for these networks are
found in technical assistance documents and not in appendix A. In 2006,
the EPA proposed a PM10-2.5 NAAQS along with requisite QA
requirements in appendix A. While the PM10-2.5 NAAQS was not
promulgated, PM10-2.5 monitoring was required to be
performed at NCore sites and the EPA proposed requisite QA requirements
in appendix A. Some of the PM QC requirements, like collocation for
precision and the performance evaluation programs for bias, are
accomplished on a percentage of monitoring sites within a PQAO. For
example, collocated sampling for PM2.5 and PM10
is required at approximately 15 percent of the monitoring sites within
a PQAO. Since virtually every NCore site is the responsibility of a
different PQAO, the appendix A requirements for PM10-2.5, if
implemented at the PQAO level, would have been required to be
implemented at almost every NCore site, which would have been expensive
and an unintended burden. Therefore, the EPA required the
implementation of the PM10-2.5 QC requirements at a national
level and worked with the EPA regions and monitoring organizations to
identify the sites that would implement the requirements. The
implementation of the PM10-2.5 QC requirements at NCore
sites fundamentally changed how QC is implemented in appendix A and has
been a cause of confusion among these parties. Since PM10-2.5
is not a NAAQS pollutant and the QC requirements cannot be cost-
effectively implemented at a PQAO level, the EPA is proposing to
eliminate the PM10-2.5 requirements including flow rate
verifications, semi-annual flow rate audits, collocated sampling
procedures, and the PM10-2.5 Performance Evaluation Program
(PEP). Similar to the technical assistance documents associated with the
CSN \18\ and PAMS \19\ networks, the EPA will develop QA guidance for
the PM10-2.5 network which will afford more flexibility for
implementation and revision of QC activities for PM10-2.5.
---------------------------------------------------------------------------
\18\ See https://www.epa.gov/ttn/amtic/specguid.html for CSN
quality assurance project plan.
\19\ See https://www.epa.gov/ttn/amtic/pamsguidance.html for PAMS
technical assistance document.
---------------------------------------------------------------------------
The EPA proposes that the QA Pb requirements of collocated sampling
(see current section 3.3.4.3) and Pb performance evaluation procedures
(see current section 3.3.4.4) for non-source NCore sites be eliminated.
The 2010 Pb rule in 40 CFR part 58, appendix D, section 4.5(b), added a
requirement to conduct non-source oriented Pb monitoring at each NCore
site in a core based statistical area (CBSA) with a population of
500,000 or more. This requirement had some monitoring organizations
implementing Pb monitoring at only one site, the NCore site. Since the
appendix A requirements are focused on PQAOs, the QC requirements would
increase at PQAOs that were required to implement Pb monitoring at
NCore. Similar to the PM10-2.5 QA requirements,
the requirement for Pb at NCore sites forced the EPA away from a focus
on PQAOs to working with the EPA regions and monitoring organizations
for implementation of the Pb Performance Evaluation Program (Pb-PEP) at
national levels. Therefore, the EPA is proposing to eliminate the
collocation requirement and the Pb-PEP requirements while retaining the
requirements for flow rate verifications and flow rate audits which do
not require additional monitors or independent sampling and analysis.
Similar to the CSN and PAMS programs, the EPA will develop QA guidance
for the Pb NCore network, which will afford more flexibility for change
or revision to accommodate Pb monitoring at non-source NCore sites.
Additionally, the
EPA is proposing to delete the requirement to measure Pb at these
specific NCore sites, either as Pb-TSP or as Pb-PM10 (see
section II.I of this rule). If that proposed change is finalized, it
will eliminate the need for any associated QA requirements including
collocation, Pb-PEP or any QC requirements for these monitors. If the
proposed change to NCore Pb requirements is not finalized, then the EPA
will consider the proposed revision to QA requirements as described
above on its own merits.
The EPA proposes that quality management plan (QMP) (current
section 2.1.1) and quality assurance project plan (QAPP) (current
section 2.1.2) submission and approval dates be reported by monitoring
organizations and the EPA. This will allow for timely and accurate
reporting of this information. Since 2007, the EPA has been tracking
the submission and approval of QMPs and QAPPs by polling the EPA
regions each year and posting an updated spreadsheet on the AMTIC Web site. The
development of the annual spreadsheet is time consuming on the part of
monitoring organizations and the EPA. It is expected that simplified
reporting at the monitoring organization and the EPA regional office
level to AQS will reduce entry errors and the burden of incorporating
this information into annual spreadsheets, and increase transparency of
this important quality system documentation. In order to reduce the
initial burden of this data entry activity, the EPA has populated AQS
with the last set of updated QMP and QAPP data from the annual
spreadsheet review cycle. If this portion of the proposal is finalized,
monitoring organizations will only need to update AQS as necessary.
In addition, some monitoring organizations have received delegation
of authority to approve their QAPP through the monitoring
organization's own QA organization. The EPA proposes that if a PQAO or
monitoring organization has been delegated authority to review and
approve their QAPP, an electronic copy must be submitted to the EPA
regional office at the time it is submitted to the PQAO/monitoring
organization's QAPP approving authority. Submission of an electronic
version to the EPA at the time of completion is not considered an added
burden on the monitoring organization because such submission is
already a standard practice as part of the review process for technical
systems audits.
The EPA proposes to add some clarifying language to the section
describing the National Performance Evaluation Program (NPEP) (current
section 2.4) explaining self-implementation of the performance
evaluation by the monitoring organization. The clarification also adds
the definition of independent assessment which is included in the PEP
(PM2.5-PEP, Pb-PEP and National Performance Audit Program
(NPAP)) QAPPs and guidance and is included in the self-implementation
memo sent to the monitoring organizations on an annual basis and posted
on the AMTIC Web site.\20\ The clarification is not a new requirement
but provides a better reference for this information in addition to the
annual memo sent to the monitoring organizations.
---------------------------------------------------------------------------
\20\ See https://www.epa.gov/ttn/amtic/npepqa.html.
---------------------------------------------------------------------------
The EPA proposes to add some clarifying language to the technical
systems audits (TSA) section (current section 2.4). TSAs are currently
performed at the monitoring organization level. Since
the EPA is revising the language in appendix A to focus on PQAOs
instead of monitoring organizations, this may have an effect on those
EPA Regions that want to perform TSA on monitoring organizations within
a PQAO (a PQAO can be a single monitoring organization or a
consolidation of a number of local monitoring organizations). The EPA
proposes a TSA frequency of 3 years for each PQAO, but includes
language that if a PQAO is made up of a number of monitoring
organizations, all monitoring organizations within the PQAO be audited
within 6 years. This proposed language maintains the 3-year TSA
requirement as it applies to PQAOs but provides additional flexibility
for the EPA regions to audit every monitoring organization within the
PQAO every 6 years. This change does not materially affect the burden
on monitoring organizations.
The EPA proposes to require monitoring organizations to complete an
annual survey for the Ambient Air Protocol Gas Verification Program
(AA-PGVP) (current section 2.6.1). Since 2009, the EPA has had a
separate information collection request (ICR) requiring monitoring
organizations to complete an annual survey of the producers that supply
their gas standards (for calibrations and QC) in order to be able to
select standards from these producers for verification. The survey
generally takes less than 10 minutes to complete. The EPA proposes to
add the requirement to appendix A. In addition, the EPA proposes to add
language that monitoring organizations participate, at the request of
the EPA, in the AA-PGVP by sending a gas standard to one of the
verification laboratories every 5 years. Since many monitoring
organizations already volunteer to send in cylinders, this proposed new
requirement may not materially affect most agencies and will not affect
those agencies not using gas standards.
3. Quality Control Checks for Gases
The EPA proposes to lower the audit concentrations (current section
3.2.1) of the one-point QC checks to between 0.005 and 0.08 parts per
million (ppm) for SO2, NO2, and O3 (currently
0.01 to 0.1 ppm), and to between 0.5 and 5 ppm for CO monitors
(currently between 1 and 10 ppm). With the development of more sensitive
monitoring instruments with lower detection limits, technical
improvements in calibrators, and lower ambient air concentrations in
general, the EPA feels this revision will better reflect the precision
and bias of the data. Since the audit concentrations are selected using
the mean or median concentration of typical ambient air concentrations
(guidance on this is provided in the QA Handbook \21\), the EPA is
proposing to add some clarification to the current language by
requiring monitoring organizations to select either the highest or
lowest concentration in the ranges identified if their mean or median
concentrations are above or below the prescribed range. There is no
additional burden to this requirement since the frequency is the same
and the audit concentrations are not so low as to make them
unachievable to generate or measure.
---------------------------------------------------------------------------
\21\ QA Handbook for Air Pollution Measurement Vol. II, Ambient
Air Quality Monitoring Program at: https://www.epa.gov/ttn/amtic/qalist.html.
---------------------------------------------------------------------------
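The proposed selection rule amounts to clamping a site's typical
concentration to the prescribed range. A minimal sketch follows,
assuming the mean or median is computed elsewhere per the QA Handbook
guidance; the function and dictionary names are illustrative.

```python
# One-point QC check concentration: the site's mean (or median) ambient
# concentration, limited to the proposed range for each pollutant. The
# simple clamp reflects the proposal's instruction to select the highest
# or lowest concentration in the range when the statistic falls outside it.

QC_RANGES_PPM = {
    "SO2": (0.005, 0.08),
    "NO2": (0.005, 0.08),
    "O3":  (0.005, 0.08),
    "CO":  (0.5, 5.0),
}

def one_point_qc_concentration(pollutant: str, typical_ppm: float) -> float:
    low, high = QC_RANGES_PPM[pollutant]
    return min(max(typical_ppm, low), high)

print(one_point_qc_concentration("O3", 0.002))  # below range -> 0.005
print(one_point_qc_concentration("CO", 2.3))    # within range -> 2.3
```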
The EPA proposes to remove reference to zero and span adjustments
(current section 3.2.1.1) and revise the one-point QC language to
simply require that the QC check be conducted before any calibration or
adjustment to the monitor. Recent revisions of the QA Handbook
discourage the implementation of frequent span adjustments so the
proposed language helps to clarify that no adjustment be made prior to
implementation of the one-point QC check.
The EPA proposes to remove the requirement (current section 3.2.2)
to implement an annual performance evaluation for one monitor in each
calendar quarter when monitoring organizations have fewer than four
monitoring instruments. The minimum requirement for the annual
performance evaluation for the primary monitor at a site is one per
year. The current regulation requires evaluation of the
monitors at 25 percent per quarter so that the performance evaluations
are performed in all four quarters. There are cases where some
monitoring organizations have fewer than four primary monitors for a
gaseous pollutant, and the current language suggests that a monitor
already receiving a performance evaluation be re-audited to provide for
performance evaluations in all four quarters. This is a burden
reduction for monitoring agencies operating smaller networks and does
not change the requirement of an annual performance evaluation for each
primary monitor.
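A simple way to picture the current 25-percent-per-quarter expectation,
and why it over-audits small networks, is the round-robin assignment
sketched below; the helper name is hypothetical and not part of the
regulatory text.

```python
# Round-robin assignment of annual performance evaluations to quarters.
# With four or more monitors, every quarter gets roughly 25 percent; with
# fewer than four, some quarters are simply empty rather than forcing a
# re-audit of a monitor already evaluated that year.

def schedule_performance_evaluations(monitor_ids: list[str]) -> dict[int, list[str]]:
    quarters: dict[int, list[str]] = {1: [], 2: [], 3: [], 4: []}
    for i, monitor in enumerate(monitor_ids):
        quarters[i % 4 + 1].append(monitor)
    return quarters

print(schedule_performance_evaluations(["M1", "M2", "M3", "M4", "M5"]))
print(schedule_performance_evaluations(["M1", "M2"]))  # Q3 and Q4 empty
```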
The current annual performance evaluation language (current section
3.2.2.1) requires that the audits be conducted by selecting three
consecutive audit levels (currently five audit levels are provided in
appendix A). Due to the implementation of the NCore network, the
inception of trace gas monitors, and lower ambient air concentrations
being measured under typical circumstances, there is a need for audit
levels at lower concentrations to more accurately represent the
uncertainties present in much of the ambient data. The EPA proposes to
expand the audit levels from five to ten and remove the requirement to
audit three consecutive levels. The current regulation also requires
that the three audit levels bracket 80 percent of the ambient
air concentrations measured by the analyzer. This current language has
caused some confusion and monitoring organizations have requested the
use of an audit point to establish monitor accuracy around the NAAQS
levels. Therefore, the EPA is proposing to revise the language so that
two of the audit levels selected represent 10-80 percent of routinely
collected ambient concentrations either measured by the monitor or in
the PQAOs network of monitors. The proposed revision allows the third
point to be selected at the NAAQS level (e.g., 75 ppb for
SO2) or above the highest 3-year routine hourly
concentration, whichever is greater.
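A sketch of the proposed three-point selection follows; reading ``10-80
percent of routinely collected concentrations'' as a percentile band is
our illustrative assumption, and the data are hypothetical.

```python
# Select three annual performance evaluation audit points: two within the
# 10th-80th percentile band of routine concentrations, and a third at the
# NAAQS level or above the highest routine hourly value, whichever is
# greater.

def select_audit_points(hourly_ppb: list[float], naaqs_ppb: float):
    ordered = sorted(hourly_ppb)
    p10 = ordered[int(0.10 * (len(ordered) - 1))]
    p80 = ordered[int(0.80 * (len(ordered) - 1))]
    third = max(naaqs_ppb, ordered[-1])
    return p10, p80, third

# Hypothetical SO2 hourly data (ppb) against the 75 ppb NAAQS level
hours = [1, 2, 2, 3, 4, 5, 6, 8, 10, 15, 22, 40]
print(select_audit_points(hours, naaqs_ppb=75))  # (2, 10, 75)
```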
The EPA proposes to revise the language (current section
3.2.2.2(a)) addressing the limits on excess nitric oxide (NO) that must
be followed during gas phase titration (GPT) procedures involving
NO2 audits. The current NO limit (maintaining at least 0.08
ppm) is very restrictive and requires auditors to make numerous mid-
audit adjustments during a GPT that result in making the NO2
audit a very time consuming procedure. Monitoring agency staff have
advised us that the observance of such excess NO limits has no apparent
effect on NO2 calibrations being conducted with modern-day
GPT-capable calibration equipment and, therefore, that the requirement
in the context of performing audits is unnecessary.\22\ We also note
the increasing availability of EPA-approved direct NO2
methods that do not utilize converters, which may make GPT
techniques that require the output of NO and NOX a
diminishingly used procedure in the future. Accordingly, we
have proposed a more general statement regarding GPT that acknowledges
the ongoing usage of monitoring agency procedures and guidance
documents that have successfully supported NO2 calibration
activities. The EPA believes that if such procedures have been
successfully used during calibrations when instrument adjustments are
potentially being made, then such procedures are appropriate for audit
use when instruments are not subject to adjustment. The EPA solicits
comment on this proposed generalization of the GPT requirements,
including whether a more specific set of requirements similar to the
current excess NO levels can be developed based on operational
experience and/or peer reviewed literature.
---------------------------------------------------------------------------
\22\ See supporting information in Excess NO Issue paper, Mike
Papp and Lewis Weinstock, Docket number EPA-HQ-OAR-2013-0619.
---------------------------------------------------------------------------
The EPA proposes to remove language (current section 3.2.2.2(b)) in
the annual performance evaluation section that requires regional
approval for audit gases for any monitors operating at ranges higher
than 1.0 ppm for O3, SO2 and NO2 and
greater than 50 ppm for CO. The EPA does not need to approve a
monitoring organization's use of audit gases to audit above the proposed
concentration levels. There should be very few cases where a
performance evaluation needs to be performed above level 10, but there
may be some legitimate instances (e.g., SO2 audits in areas
impacted by volcanic emissions). Since data reported to AQS above the
highest level may be flagged or rejected, the EPA proposes that PQAOs
notify the EPA regions of sites auditing at concentrations above level
10 so that reporting accommodations can be made.
The EPA proposes to provide additional explanatory language in
appendix A to describe the NPAP (current section 2.4). The NPAP has
been a long standing program for the ambient air monitoring community.
The NPAP is a performance evaluation which is a type of audit where
quantitative data are collected independently in order to evaluate the
proficiency of an analyst, monitoring instrument or laboratory. It has
been briefly mentioned in section 2.4 of the current appendix A
requirements. Since 2007, the EPA has distributed a memo to all
monitoring organizations in order to determine whether the monitoring
organization plans to self-implement the NPAP program or utilize the
federally implemented program. To inform this decision, the memo
describes the NPAP adequacy and independence requirements. The
EPA proposes to include these same requirements in appendix A in a
separate section for NPAP. In addition, the memo currently states that
20 percent of the sites would be audited each year and, therefore, all
sites would be audited in a 5-year period. Since there is a possibility
that monitoring organizations may want some higher priority sites
audited more frequently, the EPA is proposing to revise the language to
require all sites to be audited within a 6-year period to provide more
flexibility and discretion for monitoring agencies. This revision does
not change the number of sites audited in any given year, but allows
for increased frequency of sites deemed as high priority.
4. Quality Control Checks for Particulate Monitors
The EPA proposes to require that flow rate verifications (current
section 3.2.3) be reported to AQS. Particulate matter concentrations
(e.g., PM2.5, PM10, Pb) are reported in mass per
unit of volume (e.g., µg/m³). Flow rate verifications are
implemented at required frequencies in order to ensure that the PM
sampler is providing an accurate and repeatable measure of volume which
is critical for the determination of concentration. If a given flow
rate verification does not meet acceptance criteria, the EPA guidance
suggests that data may be invalidated back to the most recent
acceptable verification which is why these checks are performed at
higher frequencies. Implementation of the flow rate verification is
currently a requirement, but the reporting to AQS has only been a
requirement for PM10 continuous instruments. This is the
only QC requirement in appendix A that was not fully required for
reporting for all pollutants and has been a cause of confusion. When
performing TSAs, the EPA regions review the flow rate verification
information. There are cases where it is difficult to find the flow
rate verification information to ascertain completeness, data quality
and whether corrective actions have been implemented in the case of
flow rate verification failures. In addition, the EPA regions have
mentioned that some of the monitoring organizations have
been reporting this data to AQS in an effort to increase transparency
and reliability in data quality. In a recent review of 2012 data, of
the 1,110 SLAMS PM2.5 samplers providing flow rate audit
data (which are required to be reported), flow rate verification data
were also reported for 543 samplers, or about 49 percent. With the
development of a new QA transaction in AQS, we believe that the
reporting of flow rate verification data would improve the evaluation
of data quality for data certification and at national levels, and
would provide consistent interpretation of the regulation for all PM
pollutants without being overly burdensome (approximately 12
verifications per sampler per year).
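Because a PM concentration is simply collected mass divided by sampled
volume, a small flow error propagates directly into the reported
concentration. The sketch below shows that relationship and a
verification check against a hypothetical ±4 percent acceptance
criterion; the actual criterion is set in EPA guidance, not in this
excerpt.

```python
# PM concentration = mass / (flow rate x elapsed time); an inaccurate
# flow biases the concentration one-for-one, which is why flow rate
# verifications are run roughly 12 times per sampler per year.

def pm_concentration_ugm3(mass_ug: float, flow_lpm: float, minutes: float) -> float:
    volume_m3 = flow_lpm * minutes / 1000.0  # liters -> cubic meters
    return mass_ug / volume_m3

def flow_verification_passes(measured_lpm: float, design_lpm: float,
                             tolerance: float = 0.04) -> bool:
    """Hypothetical +/-4 percent acceptance criterion."""
    return abs(measured_lpm - design_lpm) / design_lpm <= tolerance

# 24-hour PM2.5 sample at the 16.67 L/min FRM design flow
print(pm_concentration_ugm3(mass_ug=240.0, flow_lpm=16.67, minutes=1440))  # ~10.0
print(flow_verification_passes(measured_lpm=16.0, design_lpm=16.67))       # False
```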
In addition, the flow rate verification requirements for all the
particulate monitors suggest randomization of the implementation of
flow rate verifications with respect to time of day, day of the week
and routine service and adjustments. Since this is a suggestion, the
EPA proposes to remove this language from the regulation and instead
include it in QA guidance.
The EPA proposes to add clarifying language to the PM2.5
collocation requirements (current section 3.2.5) stating that a site
can count only toward the collocation requirement for the method
designation of the primary monitor at that site. Precision is
estimated at the PQAO level and at
15 percent of the sites for each method designation that is designated
as a primary monitor. When developing the collocation requirements, the
EPA intended to have the collocated monitors distributed to as many
sites as possible in order to capture as much of the temporal and
spatial variability in the PQAO. Therefore, since there can be only one
primary monitor at a site for any given time period, it was originally
intended that the primary monitor and the QA collocated monitor (for
the primary) at a monitoring site count as one collocation. There have
been some cases where multiple monitoring methods have been placed at a
single site to fulfill multiple collocation requirements, which is not
the intent of the current requirement. For example, a site (Site A) may
have a primary monitor that is designated as a FRM (FRM A). This site
may also have a FEM (FEM B) at the site that is not the primary
monitor. If this site was selected for collocation, then the QA
collocated monitor must be the same method designation as the primary,
so the site would be collocated with another FRM A monitor. For primary
monitors that are FEMs, the current requirement calls for the first QA
collocated monitor of a FEM primary monitor to be a FRM monitor. Some
monitoring organizations have been using the collocated FRM monitors at
Site A to satisfy the collocation requirements for other sites (e.g.,
Sites B, C, D) that have a FEM (FEM B or other FEM) as the primary
monitor rather than placing a QA collocated FRM monitor at Site B (C or
D). This was not the intent of the original regulation and the EPA
provided additional guidance to monitoring organizations in 2010 \23\
on the correct (intended) interpretation. This revision does not change
the current regulation and does not increase or decrease burden, but is
intended to provide clarity on how the PQAO identifies the number and
types of monitors needed to achieve the collocation requirements.
---------------------------------------------------------------------------
\23\ QA EYE Issue 9 Page 3 at: https://www.epa.gov/ttn/amtic/qanews.html.
---------------------------------------------------------------------------
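For illustration, the intended counting rule can be expressed as a
minimal sketch (hypothetical monitor records in Python; this simplified
version requires matching method designations and ignores the separate
rule, discussed below, that the first collocated monitor for a FEM
primary be a FRM):

```python
from dataclasses import dataclass

@dataclass
class Monitor:
    site: str
    designation: str  # e.g., "FRM A", "FEM B"
    is_primary: bool

def collocation_counts(primary: Monitor, qa: Monitor) -> bool:
    # A QA collocated monitor counts only at its own site, toward the
    # method designation of that site's primary monitor.
    return (qa.site == primary.site
            and primary.is_primary
            and not qa.is_primary
            and qa.designation == primary.designation)

site_a_primary = Monitor("A", "FRM A", True)
site_a_qa = Monitor("A", "FRM A", False)
site_b_primary = Monitor("B", "FEM B", True)

print(collocation_counts(site_a_primary, site_a_qa))  # True: counts at Site A
print(collocation_counts(site_b_primary, site_a_qa))  # False: cannot cover Site B
```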
The EPA proposes to provide more flexibility to monitoring
organizations when selecting sites for collocation. Appendix A
currently (current section 3.2.5.3) requires that 80 percent of the
collocated monitors be deployed at sites within ±20 percent
of the NAAQS and, if the monitoring organization does not have sites
within that range, then 60 percent of the sites are to be deployed
among the highest 25 percent of all sites within the network.
Monitoring organizations have found this difficult to achieve. Some
monitoring organizations do not have many sites and, at times, due to
permission, access and limited space issues, the requirement was not
always achievable. Realizing that the collocated monitors provide
precision estimates for the PQAO (since only 15 percent of the sites
are collocated), while also acknowledging that sites that measure
concentrations close to the NAAQS are important, the EPA proposes to
require that 50 percent (reduction from 80 percent) of the collocated
monitors be deployed at sites within ±20 percent of the
NAAQS, and if the monitoring organization does not have sites within
that range, then 50 percent of the sites are to be deployed among the
highest sites within the network. Although this requirement does not
change the number of sites requiring collocation, it does provide the
monitoring organizations additional flexibility in their choice of
collocated sites.
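A minimal sketch of the proposed deployment rule (hypothetical design
values; the 12.0 [mu]g/m\3\ NAAQS figure is illustrative only):

```python
import math

def pick_collocation_sites(design_values, naaqs, n_collocated):
    # Sites whose design values fall within +/-20 percent of the NAAQS
    near = [s for s, dv in design_values.items()
            if 0.8 * naaqs <= dv <= 1.2 * naaqs]
    n_target = math.ceil(0.5 * n_collocated)  # at least 50 percent
    if near:
        return sorted(near, key=design_values.get, reverse=True)[:n_target]
    # Fallback: no sites near the NAAQS, so use the highest sites
    ranked = sorted(design_values, key=design_values.get, reverse=True)
    return ranked[:n_target]

sites = {"S1": 13.0, "S2": 9.5, "S3": 6.1, "S4": 11.2}  # ug/m3, hypothetical
print(pick_collocation_sites(sites, naaqs=12.0, n_collocated=2))  # ['S1']
```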
5. Calculations for Data Quality Assessment
In order to provide reasonable estimates of data quality, the EPA
uses data above an established threshold concentration usually related
to the detection limits of the measurement. Measurement pairs are
selected for use in the precision and bias calculations only when both
measurements are above a threshold concentration.
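A minimal sketch of this pair-selection rule (hypothetical Pb values in
[mu]g/m\3\):

```python
# A measurement pair enters the precision and bias calculations only
# when both members are at or above the threshold concentration.
def select_pairs(pairs, threshold):
    return [(x, y) for x, y in pairs if x >= threshold and y >= threshold]

pb_pairs = [(0.0031, 0.0028), (0.0009, 0.0012), (0.0150, 0.0142)]
print(select_pairs(pb_pairs, threshold=0.002))  # (0.0009, 0.0012) is excluded
```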
For many years, the threshold concentration for Pb precision and
bias data was 0.02 [mu]g/m\3\. The EPA promulgated a new Pb FRM (see 78 FR
40000) utilizing the Inductively Coupled Plasma Mass Spectrometry (ICP-
MS) analysis technique in 2013 as a revision to appendix G of 40 CFR
part 50 \24\. This new FRM demonstrated method detection limits (MDLs)
\25\ below 0.0002 [mu]g/m\3\, which is well below the EPA requirement
of five percent of the current Pb NAAQS level of 0.15 [mu]g/m\3\ or
0.0075 [mu]g/m\3\. As a result of the increased sensitivity inherent in
this new FRM, the EPA proposes to lower the acceptable Pb concentration
(current section 4) from the current value of 0.02 [mu]g/m\3\ to 0.002
[mu]g/m\3\ for measurements obtained using the new Pb FRM and other
more recently approved equivalent methods that have the requisite
increased sensitivity.\26\ The current 0.02 [mu]g/m\3\ value will be
retained for the previous Pb FRM that has subsequently been re-
designated as Federal Equivalent Method EQLA-0813-803, as well as older
equivalent methods that were approved prior to the more recent work on
developing more sensitive methods. Since ambient Pb concentrations are
lower and methods more sensitive, lowering the threshold concentration
will allow much more collocated information to be evaluated which will
provide more representative estimates of precision and bias.
---------------------------------------------------------------------------
\24\ See 78 FR 40000, July 3, 2013.
\25\ MDL is described as the minimum concentration of a
substance that can be measured and reported with 99-percent
confidence that the analyte concentration is greater than zero.
\26\ FEMs approved on or after March 4, 2010, have the required
sensitivity to utilize the 0.002 [mu]g/m\3\ reporting limit with the
exception of manual equivalent method EQLA-0813-803, the previous
FRM based on flame atomic absorption spectroscopy.
---------------------------------------------------------------------------
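The sensitivity comparison above is simple arithmetic and can be
checked directly (values taken from the text):

```python
naaqs_pb = 0.15                     # ug/m3, current Pb NAAQS level
mdl_requirement = 0.05 * naaqs_pb   # five percent of the NAAQS
print(mdl_requirement)              # 0.0075 ug/m3
print(0.0002 <= mdl_requirement)    # True: the ICP-MS FRM MDL is well below
```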
The EPA also proposes to remove the total suspended particulate
(TSP) threshold concentration for precision and bias since TSP is no
longer a NAAQS required pollutant and the EPA no longer has QC
requirements for it.
The EPA proposes to remove the statistical check currently
described in section 4.1.5 of appendix A. The check was developed to
perform a comparison of the one-point QC checks and the annual
performance evaluation data performed by the same PQAO. The section
suggests that 95 percent of all the bias estimates from the annual
performance evaluation (reported as a
[[Page 54369]]
percent difference) should fall within the 95 percent probability
interval developed using the one-point QC checks. The problem with this
check is that PQAOs with very good repeatability on the one-point QC
check data produce a very tight probability interval, which,
paradoxically, makes it more difficult for these better performing
PQAOs to meet the requirement. Separate statistics to
evaluate the one-point QC checks and the performance evaluations are
already promulgated, so the removal of this check does not affect data
quality assessments.
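The effect can be illustrated with a simplified sketch (synthetic
numbers; the approximate normal-theory interval shown here stands in
for the exact appendix A equation, and the same issue arises for the
flow rate check discussed next):

```python
import statistics

def probability_interval(qc_percent_diffs, z=1.96):
    # Approximate 95 percent probability interval built from one-point
    # QC percent differences (simplified illustration only).
    m = statistics.mean(qc_percent_diffs)
    s = statistics.stdev(qc_percent_diffs)
    return (m - z * s, m + z * s)

tight_qc = [0.1, -0.2, 0.0, 0.15, -0.1]  # a PQAO with excellent repeatability
lo, hi = probability_interval(tight_qc)
pe_diffs = [1.5, -2.0, 0.8]              # reasonable annual PE differences
outside = [d for d in pe_diffs if not (lo <= d <= hi)]
print((round(lo, 2), round(hi, 2)), outside)  # tight interval: all PE results fail
```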
Similar to the statistical comparison of performance evaluations
data, the EPA proposes to remove the statistical check (current section
4.2.4) to compare the flow rate audit data and flow rate verification
data. The existing language suggests that 95 percent of all the flow
rate audit data results (reported as percent difference) should fall
within the 95 percent probability interval developed from the flow rate
verification data for the PQAO. The problem, as with the one-point QC
check, was that monitoring organizations with very good repeatability
on the flow rate verifications produce a very tight probability
interval, which makes it difficult for these better performing PQAOs to
meet the requirement. Separate
statistics to evaluate the flow rate verifications and flow rate audits
are already promulgated, so the removal of this check does not affect
data quality assessments.
B. Quality Assurance Requirements for Monitors Used in Evaluations of
Prevention of Significant Deterioration Projects-Appendix B
1. General Information
The following proposed changes to monitoring requirements impact
these subparts of part 58--Ambient Air Quality Surveillance; appendix
B--Quality Assurance Requirements for Prevention of Significant
Deterioration (PSD) Air Monitoring. Changes that affect the overall
appendix follow while those specific to the various sections of the
appendix will be addressed under specific section headings. Since the
PSD QA requirements have been included in appendix A since 2006, section headings
refer to the current appendix A sections.
The quality assurance requirements in appendix B have been
developed for measuring the criteria pollutants of O3,
NO2, SO2, CO, PM2.5, PM10
and Pb and are minimum QA requirements for the control and assessment
of the quality of the PSD ambient air monitoring data submitted to the
PSD reviewing authority \27\ or the EPA by an organization operating a
network of PSD stations.
---------------------------------------------------------------------------
\27\ Permitting authority and reviewing authority are often used
synonymously in PSD permitting. Since reviewing authority has been
defined in 40 CFR 51.166(b), it is used throughout appendix B.
---------------------------------------------------------------------------
In the 2006 monitoring rule revisions, the PSD QA requirements,
which were previously in appendix B, were consolidated with appendix A
and appendix B was held in reserve. The PSD requirements, in most
cases, parallel appendix A in structure and content but because PSD
monitoring is only required for a period of one year or less, some of
the frequencies of implementation of the QC requirements for PSD are
higher than the corresponding appendix A requirements. In addition, the
agencies governing the implementation, assessment and approval of the
QA requirements are different: the reviewing authorities for PSD
monitoring, and the EPA regions for ambient air monitoring used in
NAAQS decisions. The combined regulations have caused confusion or
misinterpretations of the regulations among the public and monitoring
organizations implementing NAAQS or PSD requirements, and have resulted
in failure, in some cases, to perform the necessary QC requirements.
Accordingly, the EPA proposes that the PSD QA requirements be removed
from appendix A and returned to appendix B which is currently reserved.
Separating the two sets of QA requirements would clearly distinguish
the PSD QA requirements and allow more flexibility for future revisions
to either monitoring program.
With this proposed rule, the EPA would not change most of the QC
requirements for PSD. Therefore, the discussion that follows will cover
those sections of the PSD requirements that the EPA proposes to change
from the current appendix A requirements.
The applicability section of appendix B clarifies that the PSD QA
requirements are not assumed to be minimum requirements for data used
in NAAQS decisions. One reason for this distinction is the
flexibility allowed in PSD monitoring for the NPEP (current appendix A
section 2.4). The proposed PSD requirements allow the PSD reviewing
authority to decide whether implementation of the NPEP will be
performed. The NPEP, which is described in appendix A, includes the
NPAP, PM2.5 Performance Evaluation Program
(PM2.5-PEP), and the Pb-PEP. Accordingly, under the proposed
rule, if a PSD reviewing authority were to have the intent of using PSD
data for any official comparison to the NAAQS beyond the permitting
application, such as for attainment/nonattainment designations or clean
data determinations, then all requirements in appendix B including
implementation of the NPEP would apply. In this case, monitoring would
more closely conform to the appendix A requirements. The EPA proposes
this flexibility for PSD because the NPEP requires either federal
implementation or implementation by a qualified individual, group or
organization that is not part of the organization directly performing
and accountable for the work being assessed. The NPEP may require
specialized equipment, certified auditors and a number of activities
which are enumerated in the sections associated with these programs.
Arranging this type of support service may be more difficult for the
operator of a single or small number of PSD monitoring stations
operating for only a year or less.
The EPA cannot accept funding from private contractors or industry,
and federal implementation of the NPEP for PSD would face several
funding and logistical hurdles. This creates an inequity in the NPEP
implementation options available to the PSD monitoring organizations
compared to the state/local/tribal monitoring organization monitoring
for NAAQS compliance. The EPA has had success in training and
certifying private contractors in various categories of performance
evaluations conducted under NPEP, but many have not made the necessary
investments in capital equipment to implement all categories of the
performance evaluations. Since the monitoring objectives for the
collection of data for PSD are not necessarily the same as those for
NAAQS evaluations, the EPA proposes to allow the PSD reviewing
authority to determine whether a PSD monitoring project must implement
the NPEP.
The EPA proposes to clarify the definition of PSD PQAO. The PQAO
was first defined in appendix A in 2006 (current appendix A section
3.1.1) when the PSD requirements were combined with appendix A. The
definition is not substantially changed for PSD, but the EPA proposes
to clarify that a PSD PQAO can only be associated with one PSD
reviewing authority. Distinguishing among the PSD PQAOs that coordinate
with a PSD reviewing authority would be consistent with discrete
jurisdictions for PSD permitting, and it would simplify oversight of
the QA requirements for each PSD network.
[[Page 54370]]
Given that companies may apply for PSD permits throughout the
United States, it is expected that some PSD monitoring organizations
will work with multiple reviewing authorities. The PSD PQAO code, which
may appear in the AQS database and other records, defines the PSD
monitoring organization or a coordinated aggregation of such
organizations that is responsible for a set of stations within one PSD
reviewing authority that monitors the same pollutant and for which data
quality assessments will be pooled. The PSD monitoring organizations
that work with multiple PSD reviewing authorities would have individual
PSD PQAO codes for each PSD reviewing authority. This approach will
allow for the flexibility to develop appropriate quality systems for
each PSD reviewing authority.
The EPA proposes to add definitions of ``PSD monitoring
organization'' and ``PSD monitoring network'' to 40 CFR 58.1. The
definitions have been developed to improve understanding of the
appendix B regulations.
Since the EPA uses the term ``monitoring organization'' quite
frequently in the NAAQS-associated ambient air regulations, the EPA
wants to provide a better definition of the term in the PSD QA
requirements. Therefore, the EPA proposes the term ``PSD monitoring
organization'' to identify ``a source owner/operator, a government
agency, or its contractor that operates an ambient air pollution
monitoring network for PSD purposes.''
The EPA also proposes to define ``PSD monitoring network'' in order
to distinguish ``a set of monitors that provide concentration
information for a specific PSD permit.'' The EPA will place both
definitions in 40 CFR 58.1.
2. Quality System Requirements
The EPA proposes to remove the PM10-2.5 requirements for
flow rate verifications, semi-annual flow rate audits, collocated
sampling procedures and PM10-2.5 Performance Evaluation
Program from appendix B (current appendix A sections 3.2.6, 3.2.8,
3.3.6, 3.3.8, 4.3). In 2006, the EPA proposed a PM10-2.5
NAAQS along with requisite QA requirements in appendix A. While the
PM10-2.5 NAAQS was not promulgated, PM10-2.5
monitoring was required to be performed at NCore sites and the
associated QA requirements were included in appendix A. Since PSD monitoring
is distinct from monitoring at NCore sites and PM10-2.5 is
not a criteria pollutant, the PM10-2.5 provisions will be removed from
the PSD QA requirements.
The EPA proposes that the Pb QA requirements of collocated sampling
(current appendix A section 3.3.4.3) and Pb performance evaluation
procedures (current appendix A section 3.3.4.4) for non-source oriented
NCore sites be eliminated for PSD. The 2010 Pb rule in 40 CFR part 58,
appendix D, section 4.5(b), added a requirement to conduct non-source
oriented Pb monitoring at each NCore site in a CBSA with a population
of 500,000 or more. Since PSD does not implement NCore sites, the EPA
proposes to eliminate the Pb QA language specific to non-source NCore
sites from PSD while retaining the PSD QA requirements for routine Pb
monitoring.
The EPA proposes that elements of QMPs and QAPPs, which are separate
documents described in appendix A, sections 2.1.1 and 2.1.2, may be
combined into a single document for PSD monitoring networks. The
QMP provides a ``blueprint'' of a PSD monitoring organization's quality
system. It includes quality policies and describes how the organization
as a whole manages and implements its quality system regardless of what
monitoring is being performed. The QAPP includes details for
implementing a specific PSD monitoring activity. For PSD monitoring,
the EPA believes the project-specific QAPP takes priority but there are
important aspects of the QMP that could be incorporated into the QAPP.
The current appendix A requirements allow smaller organizations, or
organizations that do infrequent work with the EPA, to combine the QMP
with the QAPP based on negotiations with the funding agency, and the
EPA has provided guidance \28\ on a graded approach to developing these
documents. In
the case of PSD QMPs and QAPPs, the EPA proposes that the PSD reviewing
authority, which has the approval authority for these documents, also
have the flexibility for allowing the PSD PQAO to combine pertinent
elements of the QMP into the QAPP rather than requiring the submission
of both QMP and QAPP documents separately.
---------------------------------------------------------------------------
\28\ Graded approach to Tribal QAPP and QMPs https://www.epa.gov/ttn/amtic/cpreldoc.html.
---------------------------------------------------------------------------
The EPA proposes to add language to the appendix B version of the
data quality objectives (DQO) section (current appendix A section
2.3.1) which allows flexibility for the PSD reviewing authority and the
PSD monitoring organization to determine if adherence to the DQOs
specified in appendix A, which are the DQO goals for NAAQS decisions,
is appropriate or whether project-specific goals are necessary.
Allowing the PSD reviewing authority and the PSD monitoring
organization flexibility to change the DQOs does not change the
implementation requirements for the types and frequency of the QC
checks in appendix B, but does give some flexibility in the acceptance
of data for use in specific projects for which the PSD data are
collected. As an example, the goal for acceptable measurement
uncertainty for the collection of O3 data for NAAQS
determinations is defined for precision as an upper 90 percent
confidence limit for CV of seven percent and for bias as an upper 95
percent confidence limit for the absolute bias of seven percent. The
precision and bias estimates are made with 3 years of one-point QC
check data. A single or a few one-point QC checks over seven percent
would not have a significant effect on meeting the DQO goal. The PSD
monitoring DQO, depending on the objectives of the PSD monitoring
network, may require a stricter DQO goal or one less restrictive. Since
PSD monitoring covers a period of 1 year or less, one-point QC checks
over seven percent will increase the likelihood of failing to meet the
DQO goal since there would be fewer QC checks available in the
monitoring period to estimate precision and bias. With fewer checks,
any individual check will statistically have more influence over the
precision or bias estimate. Realizing that PSD monitoring may have
different monitoring objectives, the EPA proposes to add language that
would allow decisions on data quality objectives to be determined
through consultation between the appropriate PSD reviewing authority
and PSD monitoring organization.
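For illustration, the precision and bias statistics referenced above
can be sketched as follows. This follows the general form of the
appendix A estimators (an upper confidence bound on the coefficient of
variation and on the absolute bias); the values are synthetic and the
regulation's own equations remain authoritative:

```python
import numpy as np
from scipy import stats

def percent_diffs(measured, audit):
    measured, audit = np.asarray(measured), np.asarray(audit)
    return (measured - audit) / audit * 100.0

def cv_upper_bound(d, conf=0.90):
    # Upper confidence limit on the coefficient of variation of the
    # one-point QC percent differences.
    n = len(d)
    var = (n * np.sum(d**2) - np.sum(d)**2) / (n * (n - 1))
    chi2 = stats.chi2.ppf(1 - conf, df=n - 1)  # lower-tail chi-square point
    return np.sqrt(var) * np.sqrt((n - 1) / chi2)

def bias_upper_bound(d, conf=0.95):
    # Upper confidence limit on the absolute bias, computed from the
    # absolute values of the percent differences.
    ad = np.abs(d)
    n = len(ad)
    t = stats.t.ppf(conf, df=n - 1)
    return ad.mean() + t * ad.std(ddof=1) / np.sqrt(n)

d = percent_diffs(measured=[0.081, 0.078, 0.082, 0.079],
                  audit=[0.080, 0.080, 0.080, 0.080])        # O3, ppm
print(cv_upper_bound(d) <= 7.0, bias_upper_bound(d) <= 7.0)  # True True
```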
The EPA proposes to add some clarifying language to the section
describing the NPEP (current appendix A section 2.4) to explain self-
implementation of the performance evaluation by the PSD monitoring
organization. Self-implementation of NPEP has always been an option for
monitoring organizations but the requirements for self-implementation
were described in the technical implementation documents (i.e.,
implementation plans and QAPPs) for the program and in an annual self-
implementation decision memo that is distributed to monitoring
organizations.\29\ These major requirements for self-implementation are
proposed to be included in the appendix B sections pertaining to the
NPEP program (NPAP, PM2.5-PEP and Pb-PEP).
---------------------------------------------------------------------------
\29\ https://www.epa.gov/ttn/amtic/npepqa.html.
---------------------------------------------------------------------------
The NPEP clarification also adds a definition of ``independent
assessment.''
[[Page 54371]]
The proposed definition is derived from the NPEP (NPAP,
PM2.5-PEP, and Pb-PEP) QAPPs and guidance; it also appears
in the annual self-implementation memo described above. The
clarification is not a new requirement but consolidates this
information.
The EPA proposes to require PSD PQAOs to provide information to the
PSD reviewing authority on the vendors of gas standards that they use
(or will use) for the duration of the PSD monitoring project. A QAPP or
monitoring plan may incorporate this information; however, that
document must then be updated if there is a change in the vendor used.
The current regulation (current appendix A section 2.6.1) requires any
gas vendor advertising and distributing ``EPA Protocol Gas'' to
participate in the AA-PGVP. The EPA posts a list of these vendors on
the AMTIC Web site.\30\ This is not expected to be a burden since
information of this type is normally included in a QAPP or standard
operating procedure for a monitoring activity.
---------------------------------------------------------------------------
\30\ https://www.epa.gov/ttn/amtic/aapgvp.html.
---------------------------------------------------------------------------
3. Quality Control Checks for Gases
The EPA proposes to lower the audit concentrations (current
appendix A section 3.2.1) of the one-point QC checks to between 0.005
and 0.08 ppm for SO2, NO2, and O3 (currently
0.01 to 0.1 ppm), and to between 0.5 and 5 ppm for CO monitors
(currently 1 to 10 ppm). With the development of more sensitive
monitoring instruments with lower detection limits, technical
improvements in calibrators, and lower ambient air concentrations in
general, the EPA believes this revision will better reflect the
precision and bias of the routinely-collected ambient air data. Since
the audit concentrations are selected using the mean or median
concentration of typical ambient air data (guidance on this is provided
in the QA Handbook \31\), the EPA is proposing to add some
clarification to the current language by requiring PSD monitoring
organizations to select either the highest or lowest concentration in
the ranges identified if the mean or median values of the routinely-
collected concentrations are above or below the prescribed range. There
is no additional burden added by this requirement since the frequency
is the same and the audit concentrations are not so low as to make them
unachievable to generate or measure.
---------------------------------------------------------------------------
\31\ QA Handbook for Air Pollution Measurement Vol. II Ambient
Air Quality Monitoring Program at: https://www.epa.gov/ttn/amtic/qalist.html.
---------------------------------------------------------------------------
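A minimal sketch of the proposed selection rule (the target is the mean
or median of routinely collected data, clamped to the prescribed range;
SO2-style numbers shown for illustration):

```python
def qc_audit_concentration(typical_conc, low, high):
    # If the typical concentration falls outside the prescribed range,
    # select the nearest range endpoint; otherwise use it directly.
    return min(max(typical_conc, low), high)

print(qc_audit_concentration(0.002, 0.005, 0.08))  # 0.005 ppm (clamped up)
print(qc_audit_concentration(0.032, 0.005, 0.08))  # 0.032 ppm (in range)
```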
The EPA proposes to remove the existing reference to zero and span
adjustments (current appendix A, section 3.2.1.1) and to revise the
one-point QC language to simply require that the QC check be conducted
before making any calibration or adjustment to the monitor. Recent
revisions of the QA Handbook discourage the practice of making frequent
span adjustments so the proposed language helps to clarify that no
adjustment be made prior to implementation of the one-point QC check.
The current annual performance evaluation language (current
appendix A, section 3.2.2.1) requires that the audits be conducted by
selecting three consecutive audit levels (currently appendix A
recognizes five audit levels). Due to the implementation of the NCore
network, the inception of trace gas monitors, and lower ambient air
concentrations being measured under typical circumstances, there is a
need for audit levels at lower concentrations to more accurately
represent the uncertainties present in the ambient air data. The EPA
proposes to expand the audit levels from five to ten and remove the
requirement to audit three consecutive levels. The current regulation
also requires that the three audit levels should bracket 80 percent of
the ambient air concentrations measured by the analyzer. This current
``bracketing language'' has caused some confusion and monitoring
organizations have requested the use of an audit point to establish
monitor accuracy around the NAAQS levels. Therefore, the EPA is
proposing to revise the language so that two of the audit levels
selected represent 10 to 80 percent of routinely-collected ambient
concentrations either measured by the monitor or in the PSD PQAO's
network of monitors. The proposed revision allows the third point to be
selected at a concentration that is consistent with PSD-specific DQOs
(e.g., the 75 ppb NAAQS level for SO2).
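A sketch of the proposed selection (hypothetical routine data and an
assumed audit-level table; the actual ten levels are set by the
regulation):

```python
import statistics

def pick_audit_levels(routine_ppb, levels_ppb, project_level_ppb):
    qs = statistics.quantiles(routine_ppb, n=10)
    lo, hi = qs[0], qs[7]  # approximate 10th and 80th percentiles
    # Two levels drawn from the band of routinely collected data...
    in_band = [a for a in levels_ppb if lo <= a <= hi][:2]
    # ...and a third chosen for a project-specific concentration of interest
    third = min(levels_ppb, key=lambda a: abs(a - project_level_ppb))
    return in_band + [third]

routine = [4, 6, 7, 9, 12, 15, 18, 22, 30, 41]        # hypothetical SO2, ppb
levels = [3, 8, 20, 50, 80, 120, 200, 300, 400, 500]  # assumed level table
print(pick_audit_levels(routine, levels, project_level_ppb=75))  # [8, 20, 80]
```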
The EPA proposes to revise the language (current appendix A,
section 3.2.2.2(a)) addressing the limits on excess NO that must be
followed during GPT procedures involving NO2 audits. The
current NO limit (maintaining at least 0.08 ppm) is very restrictive
and requires auditors to make numerous mid-audit adjustments during a
GPT that make the NO2 audit a very time-consuming
procedure. Monitoring agency staff have advised us that the
observance of such excess NO limits has no apparent effect on
NO2 calibrations being conducted with modern-day GPT-capable
calibration equipment and, therefore, that the requirements in the
context of performing audits are unnecessary.\32\ We also note the
increasing availability of the EPA-approved direct NO2
methods that do not utilize converters, which suggests that GPT
techniques requiring the output of NO and NOX may see
diminishing use in the future. Accordingly, we
have proposed a more general statement regarding GPT that acknowledges
the ongoing usage of monitoring agency procedures and guidance
documents that have successfully supported NO2 calibration
activities. The EPA believes that if such procedures have been
successfully used during calibrations when instrument adjustments are
potentially being made, then such procedures are appropriate for audit
use when instruments are not subject to adjustment. The EPA solicits
comment on this proposed generalization of the GPT requirements,
including whether a more specific set of requirements similar to the
current excess NO levels can be developed based on operational
experience and/or peer-reviewed literature.
---------------------------------------------------------------------------
\32\ See supporting information in Excess NO Issue paper, Mike
Papp and Lewis Weinstock, Docket number EPA-HQ-OAR-2013-0619.
---------------------------------------------------------------------------
The EPA proposes to remove language (current appendix A section
3.2.2.2(b)) in the annual performance evaluation section that requires
regional approval for audit gases for any monitors operating at ranges
higher than 1.0 ppm for O3, SO2 and
NO2 and greater than 50 ppm for CO. The EPA does not need to
approve a monitoring organization's use of audit gases to audit above
the proposed concentration levels since the EPA has identified the
requirements for all audit gases used in the program in current
appendix A, section 2.6.1. There should be very few cases where a
performance evaluation needs to be performed above level 10 but there
may be some legitimate instances (e.g., an SO2 audit in
areas impacted by volcanic emissions). Since data reported to AQS above
the highest level may be rejected (if PSD PE data are reported to AQS),
the EPA proposes that PQAOs notify the PSD reviewing authority of sites
auditing at concentrations above level 10 so that reporting
accommodations can be made.
The EPA proposes to describe the NPAP (current appendix A, section
2.4) in more detail. The NPAP is a long-standing program for the
ambient air monitoring community. The NPAP is a performance evaluation
which is a type of audit where quantitative data are collected
independently in order to evaluate the proficiency of an analyst,
monitoring instrument or laboratory.
[[Page 54372]]
This program has been briefly mentioned in section 2.4 of the current
appendix A requirements. In appendix A, the EPA is proposing to add
language consistent with an annual decision memorandum \33\ distributed
to all state and local monitoring organizations in order to determine
whether the monitoring organization plans to self-implement the NPAP
program or utilize the federally implemented program. In order to make
this decision, the NPAP adequacy and independence requirements are
described in the decision memorandum. The EPA proposes to include these
same requirements in appendix B in a separate section for NPAP. As
described in the applicability section, the implementation of NPAP is
at the discretion of the PSD reviewing authority but is required if
data are used in any NAAQS determinations. Since PSD
monitoring is implemented at shorter intervals (usually a year) and
with fewer monitors, if NPAP is performed, it is required to be
performed annually on each monitor operated in the PSD network.
---------------------------------------------------------------------------
\33\ https://www.epa.gov/ttn/amtic/files/ambient/pm25/qa/npappep2014.pdf.
---------------------------------------------------------------------------
4. Quality Control Checks for Particulate Monitors
The EPA proposes to have one flow rate verification frequency
requirement for all PM PSD monitors. The current regulations (current
appendix A, table A-2) provide for monthly flow rate verifications for
most samplers used to monitor PM2.5, PM10 and Pb
and quarterly flow rate verifications for high-volume PM10
or TSP samplers (for Pb). With longer duration NAAQS monitoring, the
quarterly verification frequencies are adequate for these high-volume
PM10 or TSP samplers. However, with the short duration of
PSD monitoring, the EPA believes that monthly flow rate verifications
are more appropriate to ensure that any sampler flow rate problems are
identified more quickly and to reduce the potential for a significant
amount of data invalidation that could extend monitoring activities.
The EPA proposes to grant more flexibility to PSD monitoring
organizations when selecting PM2.5 method designations for
sites that require collocation. Appendix A currently (current appendix
A, section 3.2.5.2(b)) requires that if a primary monitor is a FEM,
then the first QC collocated monitor must be a FRM monitor. Most of the
FEM monitors are continuous monitors, while the FRM monitors are
filter-based. Continuous monitors (all of which are FEMs) may be
advantageous for use at the more remote PSD monitoring locations,
since the site operator would not need to visit a site as often to
retrieve filters. The current collocation requirements
for FEMs require a filter-based FRM for collocation, which would mean a
visit to retrieve the FRM filters at least one week after the QC
collocated monitor operated. Therefore, the EPA proposes that the FRM
be selected as the QC collocated monitor unless the PSD PQAO submits a
waiver request to the PSD reviewing authority to allow for collocation
with a FEM. If the request for a waiver is approved, then the QC monitor
must be the same method designation as the primary FEM monitor.
The EPA proposes to allow the PSD reviewing authority to waive the
PM2.5 3 [mu]g/m\3\ concentration validity threshold for
implementation of the PM2.5-PEP in the last quarter of PSD
monitoring. The PM2.5-PEP (current appendix A section 3.2.7)
requires five valid PM2.5-PEP audits per year for
PM2.5 monitoring networks with less than or equal to five
sites and eight valid PM2.5-PEP audits per year for
PM2.5 monitoring networks with more than five sites. Any PEP
sample collected with a concentration less than 3 [mu]g/m\3\ is not
considered valid, since it cannot be used for bias estimates, and re-
sampling is required at a later date. With NAAQS related monitoring,
which aggregates the PM2.5-PEP data over a 3-year period,
re-sampling is easily accomplished. Due to the relatively short-term
nature of most PSD monitoring, the likelihood of measuring low
concentrations in many areas attaining the PM2.5 standard
and the time required to weigh filters collected in performance
evaluations, a PSD monitoring organization's QAPP may contain a
provision to waive the 3 [mu]g/m\3\ threshold for validity of
performance evaluations conducted in the last quarter of monitoring,
subject to approval by the PSD reviewing authority.
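The counting and validity rules described above reduce to a short
sketch:

```python
def required_pep_audits(n_sites: int) -> int:
    # Five valid audits per year for networks of five or fewer sites,
    # eight for larger networks.
    return 5 if n_sites <= 5 else 8

def pep_sample_valid(conc_ugm3: float, final_quarter_waiver: bool = False) -> bool:
    # Samples below 3 ug/m3 are invalid for bias estimates unless the
    # proposed last-quarter waiver is in effect.
    return conc_ugm3 >= 3.0 or final_quarter_waiver

print(required_pep_audits(4), required_pep_audits(9))      # 5 8
print(pep_sample_valid(2.4), pep_sample_valid(2.4, True))  # False True
```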
5. Calculations for Data Quality Assessment
In order to allow reasonable estimates of data quality, the EPA
uses data above an established threshold concentration usually related
to the detection limits of the measurement method. Measurement pairs
are selected for use in the precision and bias calculations only when
both measurements are above a threshold concentration.
For many years, the threshold concentration for Pb precision and
bias data has been 0.02 [mu]g/m\3\. The EPA promulgated a new Pb FRM
utilizing the ICP-MS analysis technique in 2013 as a revision to
appendix G of 40 CFR part 50.\34\ This new FRM demonstrated MDLs \35\
below 0.0002 [mu]g/m\3\ which is well below the EPA requirement of five
percent of the current Pb NAAQS level of 0.15 [mu]g/m\3\ or 0.0075
[mu]g/m\3\. As a result of the increased sensitivity inherent in this
new FRM, the EPA proposes to lower the acceptable Pb concentration
(current section 4) from the current value of 0.02 [mu]g/m\3\ to 0.002
[mu]g/m\3\ for measurements obtained using the new Pb FRM and other
more recently approved equivalent methods that have the requisite
increased sensitivity.\36\ The current 0.02 [mu]g/m\3\ value will be
retained for the previous Pb FRM that has subsequently been
redesignated as Federal Equivalent Method EQLA-0813-803 as well as
older equivalent methods that were approved prior to the more recent
work on developing more sensitive methods. Since ambient Pb
concentrations are lower and methods more sensitive, lowering the
threshold concentration will allow much more collocated information to
be evaluated, which will provide more representative estimates of
precision and bias.
---------------------------------------------------------------------------
\34\ See 78 FR 40000, July 3, 2013.
\35\ MDL is described as the minimum concentration of a
substance that can be measured and reported with 99 percent
confidence that the analyte concentration is greater than zero.
\36\ FEMs approved on or after March 4, 2010, have the required
sensitivity to utilize the 0.002 [mu]g/m\3\ reporting limit with the
exception of manual equivalent method EQLA-0813-803, the previous
FRM based on flame atomic absorption spectroscopy.
---------------------------------------------------------------------------
The EPA also proposes to remove the TSP threshold concentration
since TSP is no longer an ambient indicator for a PM NAAQS and the EPA
no longer applies QC requirements for it.
The EPA proposes to remove the statistical check currently
described in section 4.1.5 of appendix A. The check was developed to
perform a comparison of the one-point QC checks and the annual
performance evaluation data performed by the same PQAO. The section
suggests that 95 percent of all the bias estimates of the annual
performance evaluations (reported as a percent difference) should fall
within the 95 percent probability interval developed using the one-
point QC checks. The problem with this check is that PQAOs with very
good repeatability on the one-point QC check data produce a very tight
probability interval, which, paradoxically, makes it more difficult for
these better performing PQAOs to meet the requirement. Separate
statistics to
[[Page 54373]]
evaluate the one-point QC checks and the performance evaluations are
already promulgated, so the removal of this check does not affect data
quality assessments.
Similar to the statistical comparison of performance evaluation
data, the EPA proposes to remove the statistical check (current
appendix A, section 4.2.4) to compare the flow rate audit data and flow
rate verification data. The existing language suggests that 95 percent
of all the flow rate audit data (reported as percent difference) should
fall within the 95 percent probability interval developed from the flow
rate verification data for the PQAO. The problem, as with the one-point
QC check, was that monitoring organizations with very good
repeatability on the flow rate verifications produce a very tight
probability interval, which makes it difficult for these better
performing PQAOs to meet the requirement. Separate statistics to
evaluate the flow rate
verifications and flow rate audits are already promulgated so the
removal of this check does not affect data quality assessments.
The EPA proposes to remove the reporting requirements that are
currently in section 5 of appendix A because they do not pertain to PSD
monitoring (current sections 5.1, 5.1.1 and 5.1.2.1). Since PSD
organizations are not required to certify their data to the EPA or to
report to AQS, the EPA will remove language related to these
requirements and language that required the EPA to calculate and report
the measurement uncertainty for the entire calendar year. The EPA will
retain the quarterly PSD reporting requirements (current section 5.2 in
appendix A) and require that those requirements be consistent with Sec.
58.16 as it pertains to PSD ambient air quality data and QC data, as
described in appendix B.
IV. Statutory and Executive Order Reviews
A. Executive Order 12866: Regulatory Planning and Review and Executive
Order 13563: Improving Regulation and Regulatory Review
This action is not a ``significant regulatory action'' under the
terms of Executive Order 12866 (58 FR 51735, October 4, 1993) and is
therefore not subject to review under Executive Orders 12866 and 13563
(76 FR 3821, January 21, 2011).
B. Paperwork Reduction Act
This action does not impose an information collection burden under
the provisions of the Paperwork Reduction Act, 44 U.S.C. 3501 et seq.
Burden is defined at 5 CFR 1320.3(b). While the EPA believes that the
net effect of the proposed changes to requirements is a decrease in
burden, the current information collection request calculation tools
are not sufficiently detailed to show a material change in burden
compared with the existing requirements.
C. Regulatory Flexibility Act
The Regulatory Flexibility Act (RFA) generally requires an agency
to prepare a regulatory flexibility analysis of any rule subject to
notice and comment rulemaking requirements under the Administrative
Procedure Act or any other statute unless the agency certifies that the
rule will not have a significant economic impact on a substantial
number of small entities. Small entities include small businesses,
small organizations, and small governmental jurisdictions.
For purposes of assessing the impacts of this rule on small
entities, small entity is defined as (1) a small business as defined by
the Small Business Administration's (SBA) regulations at 13 CFR
121.201; (2) a small governmental jurisdiction that is a government of
a city, county, town, school district or special district with a
population of less than 50,000; and (3) a small organization that is
any not-for-profit enterprise which is independently owned and operated
and is not dominant in its field.
After considering the economic impacts of this rule on small
entities, I certify that this action will not have a significant
economic impact on a substantial number of small entities. This
proposed rule will neither impose emission measurement requirements
beyond those specified in the current regulations, nor will it change
any emission standard. As such, it will not present a significant
economic impact on small entities.
D. Unfunded Mandates Reform Act
This action contains no federal mandates under the provisions of
Title II of the Unfunded Mandates Reform Act of 1995 (UMRA), 2 U.S.C.
1531-1538 for state, local, or tribal governments or the private
sector. This action imposes no enforceable duty on any state, local or
tribal governments or the private sector. Therefore, this action is not
subject to the requirements of sections 202 or 205 of the UMRA. This
action is also not subject to the requirements of section 203 of UMRA
because it contains no regulatory requirements that might significantly
or uniquely affect small governments.
E. Executive Order 13132: Federalism
This action does not have federalism implications. It will not have
substantial direct effects on the states, on the relationship between
the national government and the states, or on the distribution of power
and responsibilities among the various levels of government, as
specified in Executive Order 13132. This action proposes minor changes
to existing monitoring requirements and will not materially impact the
time required to operate monitoring networks. Thus, Executive Order
13132 does not apply to this action. In the spirit of Executive Order
13132, and consistent with the EPA policy to promote communications
between the EPA and state and local governments, the EPA specifically
solicits comment on this proposed rule from state and local officials.
F. Executive Order 13175: Consultation and Coordination With Indian
Tribal Governments
This action does not have tribal implications, as specified in
Executive Order 13175 (65 FR 67249, November 9, 2000). This proposed
rule imposes no requirements on tribal governments. This action
proposes minor changes to existing monitoring requirements and will not
materially impact the time required to operate monitoring networks.
Thus, Executive Order 13175 does not apply to this action. In the
spirit of Executive Order 13175, the EPA specifically solicits
additional comment on this proposed action from tribal officials.
G. Executive Order 13045: Protection of Children From Environmental
Health and Safety Risks
The EPA interprets E.O. 13045 (62 FR 19885, April 23, 1997) as
applying only to those regulatory actions that concern health or safety
risks, such that the analysis required under section 5-501 of the E.O.
has the potential to influence the regulation. This action is not
subject to E.O. 13045 because it does not establish an environmental
standard intended to mitigate health or safety risks.
H. Executive Order 13211: Actions Concerning Regulations That
Significantly Affect Energy Supply, Distribution, or Use
This action is not a ``significant energy action'' as defined in
Executive Order 13211 (66 FR 28355 (May 22, 2001)), because it is not
likely to have a significant adverse effect on the
[[Page 54374]]
supply, distribution, or use of energy. This action proposes minor
changes to existing monitoring requirements.
I. National Technology Transfer and Advancement Act
Section 12(d) of the National Technology Transfer and Advancement
Act of 1995 (``NTTAA''), Public Law No. 104-113 (15 U.S.C. 272 note)
directs the EPA to use voluntary consensus standards in its regulatory
activities unless to do so would be inconsistent with applicable law or
otherwise impractical. Voluntary consensus standards are technical
standards (e.g., materials specifications, test methods, sampling
procedures, and business practices) that are developed or adopted by
voluntary consensus standards bodies. The NTTAA directs the EPA to
provide Congress, through OMB, explanations when the agency decides not
to use available and applicable voluntary consensus standards. This
proposed rulemaking does not involve technical standards. Therefore
this action is not subject to the NTTAA.
J. Executive Order 12898: Federal Actions To Address Environmental
Justice in Minority Populations and Low-Income Populations
Executive Order (E.O.) 12898 (59 FR 7629 (Feb. 16, 1994))
establishes federal executive policy on environmental justice. Its main
provision directs federal agencies, to the greatest extent practicable
and permitted by law, to make environmental justice part of their
mission by identifying and addressing, as appropriate,
disproportionately high and adverse human health or environmental
effects of their programs, policies, and activities on minority
populations and low-income populations in the United States.
The EPA has determined that this proposed rule will not have
disproportionately high and adverse human health or environmental
effects on minority or low-income populations because it does not
affect the level of protection provided to human health or the
environment.
List of Subjects in 40 CFR Part 58
Environmental protection, Administrative practice and procedure,
Air pollution control, Intergovernmental relations.
Dated: August 13, 2014.
Gina McCarthy,
Administrator.
For the reasons stated in the preamble, the Environmental
Protection Agency proposes to amend title 40, chapter 1 of the Code of
Federal Regulations as follows:
PART 58--AMBIENT AIR QUALITY SURVEILLANCE
■
1. The authority citation for part 58 continues to read as follows:
Authority: 42 U.S.C. 7403, 7405, 7410, 7414, 7601, 7611, 7614,
and 7619.
■
2. Revise Sec. 58.1 to read as follows:
Sec. 58.1 Definitions.
As used in this part, all terms not defined herein have the meaning
given them in the Clean Air Act.
AADT means the annual average daily traffic.
Act means the Clean Air Act as amended (42 U.S.C. 7401, et seq.).
Additive and multiplicative bias means the linear regression
intercept and slope of a linear plot fitted to corresponding candidate
and reference method mean measurement data pairs.
Administrator means the Administrator of the Environmental
Protection Agency (EPA) or his or her authorized representative.
Air Quality System (AQS) means the EPA's computerized system for
storing and reporting of information relating to ambient air quality
data.
Approved regional method (ARM) means a continuous PM2.5
method that has been approved specifically within a state or local air
monitoring network for purposes of comparison to the NAAQS and to meet
other monitoring objectives.
AQCR means air quality control region.
Area-wide means all monitors sited at neighborhood, urban, and
regional scales, as well as those monitors sited at either micro- or
middle-scale that are representative of many such locations in the same
CBSA.
Certifying agency means a state, local, or tribal agency
responsible for meeting the data certification requirements in
accordance with Sec. 58.15 of this part for a unique set of monitors.
Chemical Speciation Network (CSN) includes Speciation Trends
Network stations (STN) as specified in paragraph 4.7.4 of appendix D of
this part and supplemental speciation stations that provide chemical
species data of fine particulate.
CO means carbon monoxide.
Combined statistical area (CSA) is defined by the U.S. Office of
Management and Budget as a geographical area consisting of two or more
adjacent Core Based Statistical Areas (CBSA) with employment
interchange of at least 15 percent. Combination is automatic if the
employment interchange is at least 25 percent and is determined by
local opinion if it is more than 15 but less than 25 percent.
Core-based statistical area (CBSA) is defined by the U.S. Office of
Management and Budget as a statistical geographic entity consisting of
the county or counties associated with at least one urbanized area/
urban cluster of at least 10,000 population, plus adjacent counties
having a high degree of social and economic integration. Metropolitan
Statistical Areas (MSAs) and micropolitan statistical areas are the two
categories of CBSA (metropolitan areas have populations greater than
50,000; and micropolitan areas have populations between 10,000 and
50,000). In the case of very large cities where two or more CBSAs are
combined, these larger areas are referred to as combined statistical
areas (CSAs).
Corrected concentration pertains to the result of an accuracy or
precision assessment test of an open path analyzer in which a high-
concentration test or audit standard gas contained in a short test cell
is inserted into the optical measurement beam of the instrument. When
the pollutant concentration measured by the analyzer in such a test
includes both the pollutant concentration in the test cell and the
concentration in the atmosphere, the atmospheric pollutant
concentration must be subtracted from the test measurement to obtain
the corrected concentration test result. The corrected concentration is
equal to the measured concentration minus the average of the
atmospheric pollutant concentrations measured (without the test cell)
immediately before and immediately after the test.
Design value means the calculated concentration according to the
applicable appendix of part 50 of this chapter for the highest site in
an attainment or nonattainment area.
EDO means environmental data operations.
Effective concentration pertains to testing an open path analyzer
with a high-concentration calibration or audit standard gas contained
in a short test cell inserted into the optical measurement beam of the
instrument. Effective concentration is the equivalent ambient-level
concentration that would produce the same spectral absorbance over the
actual atmospheric monitoring path length as produced by the high-
concentration gas in the short test cell. Quantitatively, effective
concentration is equal to the actual concentration of the gas standard
in the test cell multiplied by the ratio of the path length of the test
cell to the actual atmospheric monitoring path length.
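Both open path quantities defined above reduce to simple arithmetic; a
minimal sketch with hypothetical values:

```python
def corrected_concentration(measured, ambient_before, ambient_after):
    # Test-cell result minus the average of the ambient concentrations
    # measured immediately before and after the test.
    return measured - (ambient_before + ambient_after) / 2.0

def effective_concentration(cell_conc, cell_path_m, monitoring_path_m):
    # Standard gas concentration scaled by the ratio of the test cell
    # path length to the atmospheric monitoring path length.
    return cell_conc * (cell_path_m / monitoring_path_m)

print(corrected_concentration(0.240, 0.040, 0.044))  # 0.198 (e.g., ppm)
print(effective_concentration(100.0, 0.5, 250.0))    # 0.2 ppm equivalent
```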
[[Page 54375]]
Federal equivalent method (FEM) means a method for measuring the
concentration of an air pollutant in the ambient air that has been
designated as an equivalent method in accordance with part 53; it does
not include a method for which an equivalent method designation has
been canceled in accordance with Sec. 53.11 or Sec. 53.16.
Federal reference method (FRM) means a method of sampling and
analyzing the ambient air for an air pollutant that is specified as a
reference method in an appendix to part 50 of this chapter, or a method
that has been designated as a reference method in accordance with this
part; it does not include a method for which a reference method
designation has been canceled in accordance with Sec. 53.11 or Sec.
53.16.
HNO3 means nitric acid.
Implementation Plan means an implementation plan approved or
promulgated by the EPA pursuant to section 110 of the Act.
Local agency means any local government agency, other than the
state agency, which is charged by a state with the responsibility for
carrying out a portion of the annual monitoring network plan required
by Sec. 58.10.
Meteorological measurements means measurements of wind speed, wind
direction, barometric pressure, temperature, relative humidity, solar
radiation, ultraviolet radiation, and/or precipitation that occur at
stations including NCore and PAMS.
Metropolitan Statistical Area (MSA) means a CBSA associated with at
least one urbanized area of 50,000 population or greater. The central
county, plus adjacent counties with a high degree of integration,
comprise the area.
Monitor means an instrument, sampler, analyzer, or other device
that measures or assists in the measurement of atmospheric air
pollutants and which is acceptable for use in ambient air surveillance
under the applicable provisions of appendix C to this part.
Monitoring agency means a state, local or Tribal agency responsible
for meeting the requirements of this part.
Monitoring organization means a monitoring agency or other
monitoring organization responsible for operating a monitoring site for
which the quality assurance regulations apply.
Monitoring path for an open path analyzer means the actual path in
space between two geographical locations over which the pollutant
concentration is measured and averaged.
Monitoring path length of an open path analyzer means the length of
the monitoring path in the atmosphere over which the average pollutant
concentration measurement (path-averaged concentration) is determined.
See also, optical measurement path length.
Monitoring planning area (MPA) means a contiguous geographic area
with established, well-defined boundaries, such as a CBSA, county or
state, having a common area that is used for planning monitoring
locations for PM2.5. An MPA may cross state boundaries, such
as the Philadelphia PA-NJ MSA, and be further subdivided into community
monitoring zones. The MPAs are generally oriented toward CBSAs or CSAs
with populations greater than 200,000, but for convenience, those
portions of a state that are not associated with CBSAs can be
considered as a single MPA.
NATTS means the national air toxics trends stations. This network
provides hazardous air pollution ambient data.
NCore means the National Core multipollutant monitoring stations.
Monitors at these sites are required to measure particles
(PM2.5, speciated PM2.5, PM10-2.5),
O3, SO2, CO, nitrogen oxides (NO/NOy),
and meteorology (wind speed, wind direction, temperature, relative
humidity).
Near-road monitor means any approved monitor meeting the applicable
specifications described in 40 CFR part 58, appendix D (sections 4.2.1,
4.3.2, 4.7.1(b)(2)) and appendix E (section 6.4(a), Table E-4) for
near-road measurement of PM2.5, CO, or NO2.
Network means all stations of a given type or types.
Network Plan means the Annual Monitoring Network Plan described in
Sec. 58.10 of this part.
NH3 means ammonia.
NO2 means nitrogen dioxide.
NO means nitrogen oxide.
NOX means the sum of the concentrations of NO2 and NO.
NOy means the sum of all total reactive nitrogen oxides, including
NO, NO2, and other nitrogen oxides referred to as
NOZ.
O3 means ozone.
Open path analyzer means an automated analytical method that
measures the average atmospheric pollutant concentration in situ along
one or more monitoring paths having a monitoring path length of 5
meters or more and that has been designated as a reference or
equivalent method under the provisions of part 53 of this chapter.
Optical measurement path length means the actual length of the
optical beam over which measurement of the pollutant is determined. The
path-integrated pollutant concentration measured by the analyzer is
divided by the optical measurement path length to determine the path-
averaged concentration. Generally, the optical measurement path length
is:
(1) Equal to the monitoring path length for a (bistatic) system
having a transmitter and a receiver at opposite ends of the monitoring
path;
(2) Equal to twice the monitoring path length for a (monostatic)
system having a transmitter and receiver at one end of the monitoring
path and a mirror or retroreflector at the other end; or
(3) Equal to some multiple of the monitoring path length for more
complex systems having multiple passes of the measurement beam through
the monitoring path.
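The three cases enumerated above, and the division that yields the
path-averaged concentration, can be sketched as (hypothetical values):

```python
def optical_path_length(monitoring_path_m, geometry="bistatic", passes=1):
    if geometry == "bistatic":    # transmitter and receiver at opposite ends
        return monitoring_path_m
    if geometry == "monostatic":  # mirror or retroreflector doubles the path
        return 2 * monitoring_path_m
    return passes * monitoring_path_m  # multi-pass systems

def path_averaged_concentration(path_integrated, optical_length_m):
    # Path-integrated concentration divided by the optical path length
    return path_integrated / optical_length_m

L = 250.0  # monitoring path length, meters
print(path_averaged_concentration(12.5, optical_path_length(L, "monostatic")))  # 0.025
```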
PAMS means photochemical assessment monitoring stations.
Pb means lead.
PM means particulate matter, including but not limited to
PM10, PM10C, PM2.5, and
PM10-2.5.
PM2.5 means particulate matter with an aerodynamic diameter less
than or equal to a nominal 2.5 micrometers as measured by a reference
method based on appendix L of part 50 and designated in accordance with
part 53, by an equivalent method designated in accordance with part 53,
or by an approved regional method designated in accordance with
appendix C to this part.
PM10 means particulate matter with an aerodynamic diameter less
than or equal to a nominal 10 micrometers as measured by a reference
method based on appendix J of part 50 and designated in accordance with
part 53 or by an equivalent method designated in accordance with part
53.
PM10C means particulate matter with an aerodynamic diameter less
than or equal to a nominal 10 micrometers as measured by a reference
method based on appendix O of part 50 and designated in accordance with
part 53 or by an equivalent method designated in accordance with part
53.
PM10-2.5 means particulate matter with an aerodynamic diameter less
than or equal to a nominal 10 micrometers and greater than a nominal
2.5 micrometers as measured by a reference method based on appendix O
to part 50 and designated in accordance with part 53 or by an
equivalent method designated in accordance with part 53.
Point analyzer means an automated analytical method that measures
pollutant concentration in an ambient air sample extracted from the
atmosphere at a specific inlet probe point, and that has been
designated as a reference or equivalent method in accordance with part
53 of this chapter.
Primary Monitor means the monitor identified by the monitoring
organization that provides concentration data used for comparison to
the NAAQS. For any specific site, only one monitor for each pollutant can
be designated in AQS as primary monitor for a given period of time. The
primary monitor identifies the default data source for creating a
combined site record for purposes of NAAQS comparisons.
Primary quality assurance organization (PQAO) means a monitoring
organization, a group of monitoring organizations or other organization
that is responsible for a set of stations that monitor the same
pollutant and for which data quality assessments can be pooled. Each
criteria pollutant sampler/monitor at a monitoring station in the SLAMS
and SPM networks must be associated with only one PQAO.
Probe means the actual inlet where an air sample is extracted from
the atmosphere for delivery to a sampler or point analyzer for
pollutant analysis.
PSD monitoring network means a set of stations that provide
concentration information for a specific PSD permit.
PSD monitoring organization means a source owner/operator, a
government agency, or a contractor of the source or agency that
operates an ambient air pollution monitoring network for PSD purposes.
PSD reviewing authority means the state air pollution control
agency, local agency, other state agency, tribe, or other agency
authorized by the Administrator to carry out a permit program under
Sec. 51.165 and Sec. 51.166, or the Administrator in the case of EPA-
implemented permit programs under Sec. 52.21.
PSD station means any station operated for the purpose of
establishing the effect on air quality of the emissions from a proposed
source for purposes of prevention of significant deterioration as
required by Sec. 51.24(n).
Regional Administrator means the Administrator of one of the ten
EPA regional offices or his or her authorized representative.
Reporting organization means an entity, such as a state, local, or
tribal monitoring agency, that reports air quality data to the EPA.
Site means a geographic location. One or more stations may be at
the same site.
SLAMS means state or local air monitoring stations. The SLAMS
include the ambient air quality monitoring sites and monitors that are
required by appendix D of this part and are needed for the monitoring
objectives of appendix D, including NAAQS comparisons, but may serve
other data purposes. The SLAMS includes NCore, PAMS, CSN, and all other
state or locally operated criteria pollutant monitors operated in
accordance with this part that have not been designated and approved by
the Regional Administrator as SPM stations in an annual monitoring
network plan.
SO2 means sulfur dioxide.
Special purpose monitor (SPM) station means a monitor included in
an agency's monitoring network that the agency has designated as a
special purpose monitor station in its annual monitoring network plan
and in the AQS, and which the agency does not count when showing
compliance with the minimum requirements of this subpart for the number
and siting of monitors of various types. Any SPM operated by an air
monitoring agency must be included in the periodic assessments and
annual monitoring network plan required by Sec. 58.10 and approved by
the Regional Administrator.
State agency means the air pollution control agency primarily
responsible for development and implementation of a State
Implementation Plan under the Act.
Station means a single monitor, or a group of monitors, located at
a particular site.
STN station means a PM2.5 chemical speciation station
designated to be part of the speciation trends network. This network
provides chemical species data of fine particulate.
Supplemental speciation station means a PM2.5 chemical
speciation station that is operated for monitoring agency needs and not
part of the STN.
Traceable means that a local standard has been compared and
certified, either directly or via not more than one intermediate
standard, to a National Institute of Standards and Technology (NIST)-
certified primary standard such as a NIST-traceable Reference Material
(NTRM) or a NIST-certified Gas Manufacturer's Internal Standard (GMIS).
TSP (total suspended particulates) means particulate matter as
measured by the method described in appendix B of part 50.
Urbanized area means an area with a residential population
of at least 50,000 people and which generally includes core census
block groups or blocks that have a population density of at least 1,000
people per square mile and surrounding census blocks that have an
overall density of at least 500 people per square mile. The Census
Bureau notes that under certain conditions, less densely settled
territory may be part of each Urbanized Area.
VOCs means volatile organic compounds.
■ 3. In Sec. 58.10:
■ a. Revise paragraphs (a)(1) and (a)(2).
■ b. Add paragraph (a)(9).
■ c. Add paragraph (b)(14).
The revisions and additions read as follows:
Sec. 58.10 Annual monitoring network plan and periodic network
assessment.
(a)(1) Beginning July 1, 2007, the state, or where applicable
local, agency shall submit to the Regional Administrator an annual
monitoring network plan which shall provide for the documentation of
the establishment and maintenance of an air quality surveillance system
that consists of a network of SLAMS monitoring stations that can
include FRM, FEM, and ARM monitors that are part of SLAMS, NCore, CSN,
PAMS, and SPM stations. The plan shall include a purpose statement for
each monitor along with a statement of whether the operation of each
monitor meets the requirements of appendices A, B, C, D, and E of this
part, where applicable. The Regional Administrator may require the
submission of additional information as needed to evaluate compliance
with applicable requirements of part 58 and its appendices. The annual
monitoring network plan must be made available for public inspection
and comment for at least 30 days prior to submission to the EPA, and the
submitted plan shall reference and address any comments received.
(2) Any annual monitoring network plan that proposes SLAMS network
modifications (including new or discontinued monitoring sites, new
determinations that data are not of sufficient quality to be compared
to the NAAQS, and changes in identification of monitors as suitable or
not suitable for comparison against the annual PM2.5 NAAQS)
is subject to the approval of the EPA Regional Administrator, who shall
approve or disapprove the plan within 120 days of submission of a
complete plan to the EPA.
* * * * *
(9) A detailed description of the PAMS network being operated in
accordance with the requirements of appendix D to this part shall be
submitted as part of the annual monitoring network plan for review by
the EPA Administrator. The PAMS Network Description described in
section 5 of appendix D may be used to meet this requirement.
(b) * * *
(14) The identification of any SPMs operating for a period longer
than 24 months that utilize FRM, FEM, and/or ARM monitors, accompanied
by a discussion of the rationale for retention as an SPM rather than
reclassification to SLAMS.
* * * * *
■ 4. In Sec. 58.11, revise paragraph (a)(3) to read as follows:
Sec. 58.11 Network technical requirements.
(a) * * *
(3) The owner or operator of an existing or a proposed source shall
follow the quality assurance criteria in appendix B to this part that
apply to PSD monitoring when operating a PSD site.
* * * * *
■ 5. In Sec. 58.12:
■ a. Revise paragraph (d)(1).
■ b. Revise paragraph (d)(3).
The revisions read as follows:
Sec. 58.12 Operating schedules.
* * * * *
(d) * * *
(1)(i) Manual PM2.5 samplers at required SLAMS stations
without a collocated continuously operating PM2.5 monitor
must operate on at least a 1-in-3 day schedule unless a waiver for an
alternative schedule has been approved per paragraph (d)(1)(ii) of this
section.
(ii) For SLAMS PM2.5 sites with both manual and
continuous PM2.5 monitors operating, the monitoring agency
may request approval for a reduction to 1-in-6 day PM2.5
sampling or for seasonal sampling from the EPA Regional Administrator.
Other requests for a reduction to 1-in-6 day PM2.5 sampling
or for seasonal sampling may be approved on a case-by-case basis. The
EPA Regional Administrator may grant sampling frequency reductions
after consideration of factors (including but not limited to the
historical PM2.5 data quality assessments, the location of
current PM2.5 design value sites, and their regulatory data
needs) if the Regional Administrator determines that the reduction in
sampling frequency will not compromise data needed for implementation
of the NAAQS. Required SLAMS stations whose measurements determine the
design value for their area and that are within plus or minus 10
percent of the annual NAAQS, and all required sites where one or more
24-hour values have exceeded the 24-hour NAAQS each year for a
consecutive period of at least 3 years are required to maintain at
least a 1-in-3 day sampling frequency until the design value no longer
meets these criteria for 3 consecutive years. A continuously operating
FEM or ARM PM2.5 monitor satisfies this requirement unless
it is identified in the monitoring agency's annual monitoring network
plan as not appropriate for comparison to the NAAQS and the EPA
Regional Administrator has approved that the data from that monitor may
be excluded from comparison to the NAAQS.
(iii) Required SLAMS stations whose measurements determine the 24-
hour design value for their area and whose data are within plus or
minus 5 percent of the level of the 24-hour PM2.5 NAAQS must
have an FRM or FEM operate on a daily schedule if that area's design
value for the annual NAAQS is less than the level of the annual
PM2.5 standard. A continuously operating FEM or ARM
PM2.5 monitor satisfies this requirement unless it is
identified in the monitoring agency's annual monitoring network plan as
not appropriate for comparison to the NAAQS and the EPA Regional
Administrator has approved that the data from that monitor may be
excluded from comparison to the NAAQS. The daily schedule must be
maintained until the referenced design values no longer meet these
criteria for 3 consecutive years.
(iv) Changes in sampling frequency attributable to changes in
design values shall be implemented no later than January 1 of the
calendar year following the certification of such data as described in
Sec. 58.15.
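    For readers implementing these schedule rules, a minimal sketch of
the decision logic in paragraphs (d)(1)(i) through (d)(1)(iii) follows.
It is illustrative only and not part of the rule text; the function and
variable names are hypothetical, and the NAAQS levels are assumed
values supplied by the caller.

    # Illustrative sketch only; all names are hypothetical and the NAAQS
    # levels are caller-supplied assumptions, not rule text.
    def minimum_pm25_sampling_frequency(is_design_value_site, annual_dv,
                                        daily_dv, exceeded_24hr_3_years,
                                        annual_naaqs=12.0, daily_naaqs=35.0):
        """Minimum manual PM2.5 sampling frequency for a required SLAMS site."""
        # Paragraph (d)(1)(iii): daily sampling when the site sets the 24-hour
        # design value within +/-5 percent of the 24-hour NAAQS and the area's
        # annual design value is below the annual standard.
        if (is_design_value_site
                and abs(daily_dv - daily_naaqs) <= 0.05 * daily_naaqs
                and annual_dv < annual_naaqs):
            return "daily"
        # Paragraph (d)(1)(ii): at least 1-in-3 day sampling when the site sets
        # the design value within +/-10 percent of the annual NAAQS, or has
        # exceeded the 24-hour NAAQS each year for 3 consecutive years.
        if ((is_design_value_site
                and abs(annual_dv - annual_naaqs) <= 0.10 * annual_naaqs)
                or exceeded_24hr_3_years):
            return "1-in-3 day (no reduction available)"
        # Paragraph (d)(1)(i): 1-in-3 day by default, unless a waiver applies.
        return "1-in-3 day unless a reduced schedule is approved"

    print(minimum_pm25_sampling_frequency(True, 11.0, 34.5, False))  # daily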
* * * * *
(3) Manual PM2.5 speciation samplers at STN stations
must operate on at least a 1-in-3 day sampling frequency unless a
reduction in sampling frequency has been approved by the EPA
Administrator based on factors such as the area's design value, the
role of the particular site in national health studies, the correlation
of the site's species data with those of nearby sites, and the presence
of other leveraged measurements.
* * * * *
■ 6. In Sec. 58.14, revise paragraph (a) to read as follows:
Sec. 58.14 System modification.
(a) The state, or where appropriate local, agency shall develop and
implement a network modification plan and schedule to modify the
ambient air quality monitoring network that implements the findings of
the network assessment required every 5 years by Sec. 58.10(d). The
network modification plan shall be submitted as part of the Annual
Monitoring Network Plan that is due no later than the year after
submittal of the network assessment.
* * * * *
■ 7. Revise Sec. 58.15 to read as follows:
Sec. 58.15 Annual air monitoring data certification.
(a) The state, or where appropriate local, agency shall submit to
the EPA Regional Administrator an annual air monitoring data
certification letter to certify data collected by FRM, FEM, and ARM
monitors at SLAMS and SPM sites that meet criteria in appendix A to
this part from January 1 to December 31 of the previous year. The head
official in each monitoring agency, or his or her designee, shall
certify that the previous year of ambient concentration and quality
assurance data are completely submitted to AQS and that the ambient
concentration data are accurate to the best of her or his knowledge,
taking into consideration the quality assurance findings. The annual
data certification letter is due by May 1 of each year.
(b) Along with each certification letter, the state shall submit to
the Regional Administrator an annual summary report of all the ambient
air quality data collected by FRM, FEM, and ARM monitors at SLAMS and
SPM sites. The annual report(s) shall be submitted for data collected
from January 1 to December 31 of the previous year. The annual summary
serves as the record of the specific data that is the object of the
certification letter.
(c) Along with each certification letter, the state shall submit to
the Regional Administrator a summary of the precision and accuracy data
for all ambient air quality data collected by FRM, FEM, and ARM
monitors at SLAMS and SPM sites. The summary of precision and accuracy
shall be submitted for data collected from January 1 to December 31 of
the previous year.
■ 8. In Sec. 58.16, revise paragraphs (a), (c), and (d) to read as
follows:
Sec. 58.16 Data submittal and archiving requirements.
(a) The state, or where appropriate local, agency shall report to
the Administrator, via AQS, all ambient air quality data and associated
quality assurance data for SO2; CO; O3;
NO2; NO; NOy; NOX; Pb-TSP mass
concentration; Pb-PM10 mass concentration; PM10
mass concentration; PM2.5 mass concentration; for filter-
based PM2.5 FRM/FEM, the field blank mass; chemically
speciated PM2.5 mass concentration data; PM10-2.5
mass concentration; meteorological data from NCore and PAMS sites; and
metadata records and information specified by the AQS Data Coding
Manual (https://www.epa.gov/ttn/airs/airsaqs/manuals/manuals.htm). Air
quality data and information must be submitted directly to the AQS via
electronic transmission on the specified schedule described in
paragraphs (b) and (d) of this section.
* * * * *
(c) Air quality data submitted for each reporting period must be
edited,
validated, and entered into the AQS (within the time limits specified
in paragraphs (b) and (d) of this section) pursuant to appropriate AQS
procedures. The procedures for editing and validating data are
described in the AQS Data Coding Manual and in each monitoring agency's
quality assurance project plan.
(d) The state shall report VOC and, if collected, carbonyl,
NH3, and HNO3 data from PAMS sites, and
chemically speciated PM2.5 mass concentration data to AQS
within 6 months following the end of each quarterly reporting period
listed in paragraph (b) of this section.
* * * * *
■ 9. Revise Appendix A to part 58 to read as follows:
Appendix A to Part 58--Quality Assurance Requirements for Monitors Used
in Evaluations of National Ambient Air Quality Standards
1. General Information
2. Quality System Requirements
3. Measurement Quality Check Requirements
4. Calculations for Data Quality Assessments
5. Reporting Requirements
6. References
1. General Information
1.1 Applicability. (a) This appendix specifies the minimum
quality system requirements applicable to SLAMS and other monitor
types whose data are intended to be used to determine compliance
with the NAAQS (e.g., SPMs, tribal, CASTNET, industrial, etc.),
unless the EPA Regional Administrator has reviewed and approved the
monitor for exclusion from NAAQS use and these quality assurance
requirements.
(b) Primary quality assurance organizations are encouraged to
develop and maintain quality systems more extensive than the
required minimums. Additional guidance for the requirements
reflected in this appendix can be found in the ``Quality Assurance
Handbook for Air Pollution Measurement Systems,'' Volume II (see
reference 10 of this appendix) and at a national level in references
1, 2, and 3 of this appendix.
1.2 Primary Quality Assurance Organization (PQAO). A PQAO is
defined as a monitoring organization or a coordinated aggregation of
such organizations that is responsible for a set of stations that
monitor the same pollutant and for which data quality assessments
will be pooled. Each criteria pollutant/monitor must be associated
with only one PQAO. In some cases, data quality is assessed at the
PQAO level.
1.2.1 Each PQAO shall be defined such that measurement
uncertainty among all stations in the organization can be expected
to be reasonably homogeneous as a result of common factors. Common
factors that should be considered in defining PQAOs include:
(a) Operation by a common team of field operators according to a
common set of procedures;
(b) Use of a common quality assurance project plan (QAPP) or
standard operating procedures;
(c) Common calibration facilities and standards;
(d) Oversight by a common quality assurance organization; and
(e) Support by a common management organization (i.e., state
agency) or laboratory.
Since data quality assessments are made and data certified at
the PQAO level, the monitoring organization identified as the PQAO
will be responsible for the oversight of the quality of data of all
monitoring organizations within the PQAO.
1.2.2 Monitoring organizations having difficulty describing their
PQAO or assigning specific monitors to primary quality assurance
organizations should consult with the appropriate EPA regional
office. Any consolidation of monitoring organizations to PQAOs shall
be subject to final approval by the appropriate EPA regional office.
1.2.3 Each PQAO is required to implement a quality system that
provides sufficient information to assess the quality of the
monitoring data. The quality system must, at a minimum, include the
specific requirements described in this appendix. Failure to conduct
or pass a required check or procedure, or a series of required
checks or procedures, does not by itself invalidate data for
regulatory decision making. Rather, PQAOs and the EPA shall use the
checks and procedures required in this appendix in combination with
other data quality information, reports, and similar documentation
that demonstrate overall compliance with part 58. Accordingly, the
EPA and PQAOs shall use a ``weight of evidence'' approach when
determining the suitability of data for regulatory decisions. The
EPA reserves the authority to use or not use monitoring data
submitted by a monitoring organization when making regulatory
decisions based on the EPA's assessment of the quality of the data.
Consensus-built validation templates or validation criteria already
approved in Quality Assurance Project Plans (QAPPs) should be used
as the basis for the weight of evidence approach.
1.3 Definitions.
(a) Measurement Uncertainty. A term used to describe deviations
from a true concentration or estimate that are related to the
measurement process and not to spatial or temporal population
attributes of the air being measured.
(b) Precision. A measurement of mutual agreement among
individual measurements of the same property usually under
prescribed similar conditions, expressed generally in terms of the
standard deviation.
(c) Bias. The systematic or persistent distortion of a
measurement process which causes errors in one direction.
(d) Accuracy. The degree of agreement between an observed value
and an accepted reference value. Accuracy includes a combination of
random error (imprecision) and systematic error (bias) components
which are due to sampling and analytical operations.
(e) Completeness. A measure of the amount of valid data obtained
from a measurement system compared to the amount that was expected
to be obtained under correct, normal conditions.
(f) Detection Limit. The lowest concentration or amount of
target analyte that can be determined to be different from zero by a
single measurement at a stated level of probability.
1.4 Measurement Quality Checks. The measurement quality checks
described in section 3 of this appendix shall be reported to AQS
and are included in the data required for certification.
1.5 Assessments and Reports. Periodic assessments and
documentation of data quality are required to be reported to the
EPA. To provide national uniformity in this assessment and reporting
of data quality for all networks, specific assessment and reporting
procedures are prescribed in detail in sections 3, 4, and 5 of this
appendix. On the other hand, the selection and extent of the quality
assurance and quality control activities used by a monitoring
organization depend on a number of local factors such as field and
laboratory conditions, the objectives for monitoring, the level of
data quality needed, the expertise of assigned personnel, the cost
of control procedures, pollutant concentration levels, etc.
Therefore, quality system requirements in section 2 of this appendix
are specified in general terms to allow each monitoring organization
to develop a quality system that is most efficient and effective for
its own circumstances while achieving the data quality objectives
described in this appendix.
2. Quality System Requirements
A quality system (reference 1 of this appendix) is the means by
which an organization manages the quality of the monitoring
information it produces in a systematic, organized manner. It
provides a framework for planning, implementing, assessing and
reporting work performed by an organization and for carrying out
required quality assurance and quality control activities.
2.1 Quality Management Plans and Quality Assurance Project
Plans. All PQAOs must develop a quality system that is described and
approved in quality management plans (QMP) and QAPPs to ensure that
the monitoring results:
(a) Meet a well-defined need, use, or purpose (reference 5 of
this appendix);
(b) Provide data of adequate quality for the intended monitoring
objectives;
(c) Satisfy stakeholder expectations;
(d) Comply with applicable standards specifications;
(e) Comply with statutory (and other legal) requirements; and
(f) Reflect consideration of cost and economics.
2.1.1 The QMP describes the quality system in terms of the
organizational structure, functional responsibilities of management
and staff, lines of authority, and required interfaces for those
planning, implementing, assessing and reporting activities involving
environmental data operations (EDO). The QMP must be suitably
documented in accordance with EPA requirements (reference 2 of this
appendix), and approved by the appropriate Regional Administrator,
or his or her representative. The quality system described in the
QMP
will be reviewed during the systems audits described in section 2.5
of this appendix. Organizations that implement long-term monitoring
programs with EPA funds should have a separate QMP document. Smaller
organizations, organizations that do infrequent work with the EPA, or
organizations with monitoring programs of limited size or scope may
combine the QMP with the QAPP if approved by, and subject to any
conditions of, the EPA. Additional guidance on this process can be found in
reference 10 of this appendix. Approval of the recipient's QMP by
the appropriate Regional Administrator or his or her representative
may allow delegation of authority to review and approve
environmental data collection activities adequately described and
covered under the scope of the QMP and documented in appropriate
planning documents (QAPP) to the PQAO's independent quality assurance
function. Where a PQAO or monitoring organization has been delegated
authority to review and approve its QAPP, an electronic copy must
be submitted to the EPA region at the time it is submitted to the
PQAO's/monitoring organization's QAPP approving authority. The QAPP
will be reviewed by the EPA during systems audits or circumstances
related to data quality. The QMP submission and approval dates for
PQAOs/monitoring organizations must be reported to AQS.
2.1.2 The QAPP is a formal document describing, in sufficient
detail, the quality system that must be implemented to ensure that
the results of work performed will satisfy the stated objectives.
PQAOs must develop QAPPs that describe how the organization intends
to control measurement uncertainty to an appropriate level in order
to achieve the data quality objectives for the EDO. The quality
assurance policy of the EPA requires every EDO to have a written and
approved QAPP prior to the start of the EDO. It is the
responsibility of the PQAO/monitoring organization to adhere to this
policy. The QAPP must be suitably documented in accordance with EPA
requirements (reference 3 of this appendix) which include standard
operating procedures for all EDOs either within the document or by
appropriate reference. The QAPP must identify each PQAO operating
monitors under the QAPP as well as generally identify the sites and
monitors to which it is applicable. The QAPP submission and approval
dates must be reported to AQS.
2.1.3 The PQAO/monitoring organization's quality system must
have adequate resources both in personnel and funding to plan,
implement, assess and report on the achievement of the requirements
of this appendix and its approved QAPP.
2.2 Independence of Quality Assurance. The PQAO must provide for
a quality assurance management function, i.e., that aspect of the overall
management system of the organization that determines and implements
the quality policy defined in a PQAO's QMP. Quality management
includes strategic planning, allocation of resources and other
systematic planning activities (e.g., planning, implementation,
assessing and reporting) pertaining to the quality system. The
quality assurance management function must have sufficient technical
expertise and management authority to conduct independent oversight
and assure the implementation of the organization's quality system
relative to the ambient air quality monitoring program and should be
organizationally independent of environmental data generation
activities.
2.3 Data Quality Performance Requirements.
2.3.1 Data Quality Objectives. The DQOs, or the results of other
systematic planning processes, are statements that define the
appropriate type of data to collect and specify the tolerable levels
of potential decision errors that will be used as a basis for
establishing the quality and quantity of data needed to support the
monitoring objectives (reference 5 of this appendix). The DQOs will
be developed by the EPA to support the primary regulatory objectives
for each criteria pollutant. As they are developed, they will be
added to the regulation. The quality of the conclusions derived from
data interpretation can be affected by population uncertainty
(spatial or temporal uncertainty) and measurement uncertainty
(uncertainty associated with collecting, analyzing, reducing and
reporting concentration data). This appendix focuses on assessing
and controlling measurement uncertainty.
2.3.1.1 Measurement Uncertainty for Automated and Manual PM2.5
Methods. The goal for acceptable measurement uncertainty is defined
for precision as an upper 90 percent confidence limit for the
coefficient of variation (CV) of 10 percent and plus or minus 10
percent for total bias.
2.3.1.2 Measurement Uncertainty for Automated O3 Methods. The
goal for acceptable measurement uncertainty is defined for precision
as an upper 90 percent confidence limit for the CV of 7 percent and
for bias as an upper 95 percent confidence limit for the absolute
bias of 7 percent.
2.3.1.3 Measurement Uncertainty for Pb Methods. The goal for
acceptable measurement uncertainty is defined for precision as an
upper 90 percent confidence limit for the CV of 20 percent and for
bias as an upper 95 percent confidence limit for the absolute bias
of 15 percent.
2.3.1.4 Measurement Uncertainty for NO2. The goal for acceptable
measurement uncertainty is defined for precision as an upper 90
percent confidence limit for the CV of 15 percent and for bias as an
upper 95 percent confidence limit for the absolute bias of 15
percent.
2.3.1.5 Measurement Uncertainty for SO2. The goal for acceptable
measurement uncertainty for precision is defined as an upper 90
percent confidence limit for the CV of 10 percent and for bias as an
upper 95 percent confidence limit for the absolute bias of 10
percent.
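    The precision goals in sections 2.3.1.1 through 2.3.1.5 are
expressed as upper confidence limits on a coefficient of variation
(CV). The governing estimators are prescribed in section 4 of this
appendix; the sketch below is only an assumption-laden illustration of
checking such a goal, using a plain standard-deviation CV and the
common chi-squared scaling for a one-sided upper bound. All names are
hypothetical and it is not part of the rule text.

    # Illustrative sketch; the sample CV and the chi-squared scaling
    # CV_ub = CV * sqrt((n - 1) / chi2_{0.1, n-1}) are simplifying
    # assumptions; the authoritative formulas are those in section 4.
    import math
    from scipy.stats import chi2

    def cv_upper_90(percent_diffs):
        """Upper 90 percent confidence limit for the CV of QC percent differences."""
        n = len(percent_diffs)
        mean = sum(percent_diffs) / n
        variance = sum((d - mean) ** 2 for d in percent_diffs) / (n - 1)
        cv = math.sqrt(variance)
        # chi2.ppf(0.1, n-1) is the 10th percentile of chi-squared with n-1 df.
        return cv * math.sqrt((n - 1) / chi2.ppf(0.1, n - 1))

    checks = [2.1, -1.4, 0.8, 3.0, -2.2, 1.1, 0.4, -0.9]  # hypothetical data
    print(cv_upper_90(checks) <= 7.0)  # within the O3 precision goal? True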
2.4 National Performance Evaluation Programs. The PQAO shall
provide for the implementation of a program of independent and
adequate audits of all monitors providing data for NAAQS compliance
purposes including the provision of adequate resources for such
audit programs. A monitoring plan (or QAPP) which provides for PQAO
participation in the EPA's National Performance Audit Program
(NPAP), the PM2.5 Performance Evaluation Program
(PM2.5-PEP) program and the Pb Performance Evaluation
Program (Pb-PEP) and indicates the consent of the PQAO for the EPA
to apply an appropriate portion of the grant funds, which the EPA
would otherwise award to the PQAO for these QA activities, will be
deemed by the EPA to meet this requirement. For clarification and to
participate, PQAOs should contact either the appropriate EPA
regional quality assurance (QA) coordinator at the appropriate EPA
regional office location, or the NPAP coordinator at the EPA Air
Quality Assessment Division, Office of Air Quality Planning and
Standards, in Research Triangle Park, North Carolina. The PQAOs that
plan to implement these programs (self-implement) rather than use
the federal programs must meet the adequacy requirements found in
the appropriate sections that follow, as well as meet the definition
of independent assessment that follows.
2.4.1 Independent assessment. An assessment performed by a
qualified individual, group, or organization that is not part of the
organization directly performing and accountable for the work being
assessed. This auditing organization must not be involved with the
generation of the ambient air monitoring data. An organization can
conduct the performance evaluation (PE) if it can meet this
definition and has a management structure that, at a minimum, will
allow for the separation of its routine sampling personnel from its
auditing personnel by two levels of management. In addition, the
sample analysis of audit filters must be performed by a laboratory
facility and laboratory equipment separate from the facilities used
for routine sample analysis. Field and laboratory personnel will be
required to meet PE field and laboratory training and certification
requirements to establish comparability to federally implemented
programs.
2.5 Technical Systems Audit Program. Technical systems audits of
each PQAO shall be conducted at least every 3 years by the
appropriate EPA regional office and reported to the AQS. If a PQAO
is made up of more than one monitoring organization, all monitoring
organizations in the PQAO should be audited within 6 years (two TSA
cycles of the PQAO). As an example, if a state has five local
monitoring organizations that are consolidated under one PQAO, all
five local monitoring organizations will receive a technical systems
audit within a 6-year period. Systems audit programs are described
in reference 10 of this appendix. For further instructions, PQAOs
should contact the appropriate EPA regional QA coordinator.
2.6 Gaseous and Flow Rate Audit Standards.
2.6.1 Gaseous pollutant concentration standards (permeation
devices or cylinders of compressed gas) used to obtain test
concentrations for carbon monoxide (CO), sulfur dioxide
(SO2), nitrogen oxide (NO), and nitrogen dioxide
(NO2) must be traceable to either a National Institute of
Standards and Technology (NIST) Traceable Reference Material (NTRM)
or a NIST-certified Gas
Manufacturer's Internal Standard (GMIS), certified in accordance
with one of the procedures given in reference 4 of this appendix.
Vendors advertising certification with the procedures provided in
reference 4 of this appendix and distributing gases as ``EPA
Protocol Gas'' for ambient air monitoring purposes must participate
in the EPA Ambient Air Protocol Gas Verification Program or not use
``EPA'' in any form of advertising. Monitoring organizations must
provide information to the EPA on the gas producers they use on an
annual basis and those PQAOs purchasing standards will be obligated,
at the request of the EPA, to participate in the program at least
once every 5 years by sending a new unused standard to a designated
verification laboratory.
2.6.2 Test concentrations for ozone (O3) must be
obtained in accordance with the ultraviolet photometric calibration
procedure specified in appendix D to part 50 of this chapter and by
means of a certified NIST-traceable O3 transfer standard.
Consult references 7 and 8 of this appendix for guidance on transfer
standards for O3.
2.6.3 Flow rate measurements must be made by a flow measuring
instrument that is NIST-traceable to an authoritative volume or
other applicable standard. Guidance for certifying some types of
flowmeters is provided in reference 10 of this appendix.
2.7 Primary Requirements and Guidance. Requirements and guidance
documents for developing the quality system are contained in
references 1 through 11 of this appendix, which also contain many
suggested procedures, checks, and control specifications. Reference
10 describes specific guidance for the development of a quality
system for data collected for comparison to the NAAQS. Many specific
quality control checks and specifications for methods are included
in the respective reference methods described in part 50 of this
chapter or in the respective equivalent method descriptions
available from the EPA (reference 6 of this appendix). Similarly,
quality control procedures related to specifically designated
reference and equivalent method monitors are contained in the
respective operation or instruction manuals associated with those
monitors.
3. Measurement Quality Check Requirements
This section provides the requirements for PQAOs to perform the
measurement quality checks that can be used to assess data quality.
Data from these checks are required to be submitted to the AQS
within the same time frame as routinely collected ambient
concentration data as described in 40 CFR 58.16. Table A-1 of this
appendix provides a summary of the types and frequency of the
measurement quality checks that will be described in this section.
3.1 Gaseous Monitors of SO2, NO2,
O3, and CO.
3.1.1 One-Point Quality Control (QC) Check for SO2,
NO2, O3, and CO. (a) A one-point QC check must
be performed at least once every 2 weeks on each automated monitor
used to measure SO2, NO2, O3 and
CO. With the advent of automated calibration systems, more frequent
checking is strongly encouraged. See Reference 10 of this appendix
for guidance on the review procedure. The QC check is made by
challenging the monitor with a QC check gas of known concentration
(effective concentration for open path monitors) within the
prescribed range of 0.005 to 0.08 parts per million (ppm) for
SO2, NO2, and O3, and within the
prescribed range of 0.5 to 5 ppm for CO monitors. The QC check gas
concentration selected within the prescribed range must be related
to the mean or median of the ambient air concentrations normally
measured at sites within the monitoring network in order to
appropriately reflect the precision and bias at these ambient air
concentration ranges. If the mean or median concentrations at the
sites are below or above the prescribed range for the relevant
pollutant, select the lowest or highest concentration in the range.
An additional QC check point is encouraged for those organizations
that may have occasional high values or would like to confirm the
monitors' linearity at the higher end of the operational range or
around NAAQS concentrations.
(b) Point analyzers must operate in their normal sampling mode
during the QC check and the test atmosphere must pass through all
filters, scrubbers, conditioners and other components used during
normal ambient sampling and as much of the ambient air inlet system
as is practicable. The QC check must be conducted before any
calibration or adjustment to the monitor.
(c) Open path monitors are tested by inserting a test cell
containing a QC check gas concentration into the optical measurement
beam of the instrument. If possible, the normally used transmitter,
receiver, and as appropriate, reflecting devices should be used
during the test, and the normal monitoring configuration of the
instrument should be altered as little as possible to accommodate
the test cell for the test. However, if permitted by the associated
operation or instruction manual, an alternate local light source or
an alternate optical path that does not include the normal
atmospheric monitoring path may be used. The actual concentration of
the QC check gas in the test cell must be selected to produce an
effective concentration in the range specified earlier in this
section. Generally, the QC test concentration measurement will be
the sum of the atmospheric pollutant concentration and the QC test
concentration. As such, the result must be corrected to remove the
atmospheric concentration contribution. The corrected concentration
is obtained by subtracting the average of the atmospheric
concentrations measured by the open path instrument under test
immediately before and immediately after the QC test from the QC
check gas concentration measurement. If the difference between these
before and after measurements is greater than 20 percent of the
effective concentration of the test gas, discard the test result and
repeat the test. If possible, open path monitors should be tested
during periods when the atmospheric pollutant concentrations are
relatively low and steady.
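    A minimal sketch of the open path correction described in this
paragraph is given below. It is illustrative only and not part of the
rule text; all names and values are hypothetical.

    # Illustrative sketch only; names are hypothetical. Applies the ambient
    # correction and the 20 percent before/after discard rule described above.
    def corrected_qc_concentration(qc_reading, ambient_before, ambient_after,
                                   effective_conc):
        """Corrected open path QC concentration, or None if the test must be
        repeated because the ambient drift between the before and after
        readings exceeds 20 percent of the effective test gas concentration."""
        if abs(ambient_before - ambient_after) > 0.20 * effective_conc:
            return None  # discard the test result and repeat the test
        return qc_reading - (ambient_before + ambient_after) / 2.0

    # Example with steady ambient O3 near 0.030 ppm (hypothetical values).
    print(corrected_qc_concentration(0.095, 0.031, 0.029, 0.060))  # 0.065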
(d) Report the audit concentration of the QC gas and the
corresponding measured concentration indicated by the monitor to
AQS. The percent differences between these concentrations are used
to assess the precision and bias of the monitoring data as described
in sections 4.1.2 (precision) and 4.1.3 (bias) of this appendix.
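    The percent difference statistic referenced here is straightforward;
the sketch below (illustrative only, hypothetical names and values)
shows the computation reported for each check.

    # Illustrative sketch; the percent difference d is the quantity used in
    # the precision and bias assessments of sections 4.1.2 and 4.1.3.
    def percent_difference(measured, audit):
        """Percent difference of the monitor reading relative to the QC gas."""
        return 100.0 * (measured - audit) / audit

    # Example one-point QC check on an O3 monitor (hypothetical values).
    print(round(percent_difference(0.0612, 0.0600), 2))  # 2.0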
3.1.2 Annual performance evaluation for SO2, NO2, O3, or CO. A
performance evaluation must be conducted on each primary monitor
once a year. This can be accomplished by evaluating 25 percent of
the primary monitors each quarter. The evaluation should be
conducted by a trained, experienced technician other than the routine
site operator.
3.1.2.1 The evaluation is made by challenging the monitor with
audit gas standards of known concentration from at least three audit
levels. Two of the audit levels selected will represent a range of
10-80 percent of the typical ambient air concentrations either
measured by the monitor or in the PQAOs network of monitors. The
third point should be at the NAAQS level or above the highest 3-year
ambient air hourly concentration, whichever is greater. An
additional 4th level is encouraged for those agencies that would
like to confirm the monitors' linearity at the higher end of the
operational range. In rare circumstances, there may be sites
measuring concentrations above audit level 10. Notify the
appropriate EPA region and the AQS program in order to make
accommodations for auditing at levels above level 10.
----------------------------------------------------------------------------------------------------------------
                                                     Concentration range, ppm
      Audit level           ------------------------------------------------------------------------------------
                                  O3               SO2               NO2                CO
----------------------------------------------------------------------------------------------------------------
1........................   0.004-0.0059     0.0003-0.0029     0.0003-0.0029      0.020-0.059
2........................   0.006-0.019      0.0030-0.0049     0.0030-0.0049      0.060-0.199
3........................   0.020-0.039      0.0050-0.0079     0.0050-0.0079      0.200-0.899
4........................   0.040-0.069      0.0080-0.0199     0.0080-0.0199      0.900-2.999
5........................   0.070-0.089      0.0200-0.0499     0.0200-0.0499      3.000-7.999
6........................   0.090-0.119      0.0500-0.0999     0.0500-0.0999      8.000-15.999
7........................   0.120-0.139      0.1000-0.1499     0.1000-0.2999      16.000-30.999
8........................   0.140-0.169      0.1500-0.2599     0.3000-0.4999      31.000-39.999
9........................   0.170-0.189      0.2600-0.7999     0.5000-0.7999      40.000-49.999
10.......................   0.190-0.259      0.8000-1.000      0.8000-1.000       50.000-60.000
----------------------------------------------------------------------------------------------------------------
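    Selecting audit levels from the table is a simple range lookup. The
sketch below is illustrative only and not part of the rule text; the
ranges are transcribed from the O3 column above, and all names are
hypothetical.

    # Illustrative sketch; ranges transcribed from the O3 column of the
    # audit level table above, in ppm. Names are hypothetical.
    O3_AUDIT_LEVELS = [
        (1, 0.004, 0.0059), (2, 0.006, 0.019), (3, 0.020, 0.039),
        (4, 0.040, 0.069),  (5, 0.070, 0.089), (6, 0.090, 0.119),
        (7, 0.120, 0.139),  (8, 0.140, 0.169), (9, 0.170, 0.189),
        (10, 0.190, 0.259),
    ]

    def o3_audit_level(concentration_ppm):
        """Audit level whose range contains the concentration, else None
        (e.g., above level 10, where the EPA region and the AQS program
        must be notified)."""
        for level, low, high in O3_AUDIT_LEVELS:
            if low <= concentration_ppm <= high:
                return level
        return None

    print(o3_audit_level(0.075))  # 5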
3.1.2.2 The NO2 audit techniques may vary depending
on the ambient monitoring method. For chemiluminescence-type
NO2 analyzers, gas phase titration (GPT) techniques
should be based on EPA guidance documents and monitoring agency
experience. The NO2 gas standards may be more appropriate
than GPT for direct NO2 methods that do not employ
converters. Care should be taken to ensure the stability of such gas
standards prior to use.
3.1.2.3 The standards from which audit gas test concentrations
are obtained must meet the specifications of section 2.6.1 of this
appendix. The gas standards and equipment used for the performance
evaluation must not be the same as the standards and equipment used
for one-point QC, calibrations, span evaluations or NPAP.
3.1.2.4 For point analyzers, the evaluation shall be carried out
by allowing the monitor to analyze the audit gas test atmosphere in
its normal sampling mode such that the test atmosphere passes
through all filters, scrubbers, conditioners, and other sample inlet
components used during normal ambient sampling and as much of the
ambient air inlet system as is practicable.
3.1.2.5 Open path monitors are evaluated by inserting a test
cell containing the various audit gas concentrations into the
optical measurement beam of the instrument. If possible, the
normally used transmitter, receiver, and, as appropriate, reflecting
devices should be used during the evaluation, and the normal
monitoring configuration of the instrument should be modified as
little as possible to accommodate the test cell for the evaluation.
However, if permitted by the associated operation or instruction
manual, an alternate local light source or an alternate optical path
that does not include the normal atmospheric monitoring path may be
used. The actual concentrations of the audit gas in the test cell
must be selected to produce effective concentrations in the
evaluation level ranges specified in this section of this appendix.
Generally, each evaluation concentration measurement result will be
the sum of the atmospheric pollutant concentration and the
evaluation test concentration. As such, the result must be corrected
to remove the atmospheric concentration contribution. The corrected
concentration is obtained by subtracting the average of the
atmospheric concentrations measured by the open path instrument
under test immediately before and immediately after the evaluation
test (or preferably before and after each evaluation concentration
level) from the evaluation concentration measurement. If the
difference between the before and after measurements is greater than
20 percent of the effective concentration of the test gas standard,
discard the test result for that concentration level and repeat the
test for that level. If possible, open path monitors should be
evaluated during periods when the atmospheric pollutant
concentrations are relatively low and steady. Also, if the open path
instrument is not installed in a permanent manner, the monitoring
path length must be reverified to be within plus or minus 3 percent
to validate the evaluation since the monitoring path length is
critical to the determination of the effective concentration.
3.1.2.6 Report both the evaluation concentrations (effective
concentrations for open path monitors) of the audit gases and the
corresponding measured concentration (corrected concentrations, if
applicable, for open path monitors) indicated or produced by the
monitor being tested to AQS. The percent differences between these
concentrations are used to assess the quality of the monitoring data
as described in section 4.1.1 of this appendix.
3.1.3 National Performance Audit Program (NPAP).
The NPAP is a performance evaluation, a type of audit in which
quantitative data are collected independently in order to
evaluate the proficiency of an analyst, monitoring instrument or
laboratory. Details of the program can be found in reference 11 of
this appendix. The program requirements include:
3.1.3.1 Performing audits of the primary monitors at 20 percent
of monitoring sites per year, and 100 percent of the sites in 6
years. High-priority sites may be visited more often. Since not all
gaseous criteria pollutants are monitored at every site within a
PQAO, it is not required that 20 percent of the primary monitors for
each pollutant receive an NPAP audit each year, only that 20 percent
of the PQAO's monitoring sites receive an NPAP audit. It is expected
that over the 6-year period all primary monitors for all gaseous
pollutants will receive an NPAP audit.
3.1.3.2 Developing a delivery system that will allow for the
audit concentration gases to be introduced to the probe inlet where
logistically feasible.
3.1.3.3 Using audit gases that are verified against the NIST
standard reference methods or special review procedures and
validated annually for CO, SO2 and NO2, and at
the beginning of each quarter of audits for O3.
3.1.3.4 As described in section 2.4 of this appendix, the PQAO
may elect, on an annual basis, to utilize the federally implemented
NPAP program. If the PQAO plans to self-implement NPAP, the EPA will
establish training and other technical requirements for PQAOs to
establish comparability to federally implemented programs. In
addition to meeting the requirements in sections 3.1.3.1 through
3.1.3.3 of this appendix, the PQAO must:
(a) Utilize an audit system equivalent to the federally
implemented NPAP audit system that is separate from the equipment
used in annual performance evaluations.
(b) Perform a whole system check by having the NPAP system
tested against an independent and qualified EPA lab, or equivalent.
(c) Evaluate the system with the EPA NPAP program through
collocated auditing at an acceptable number of sites each year (at
least one for an agency network of five or less sites; at least two
for a network with more than five sites).
(d) Incorporate the NPAP in the PQAO's quality assurance project
plan.
(e) Be subject to review by independent, EPA-trained personnel.
(f) Participate in initial and update training/certification
sessions.
3.2 PM2.5.
3.2.1 Flow Rate Verification for PM2.5. A one-point flow rate
verification check must be performed at least once every month (each
verification minimally separated by 14 days) on each monitor used to
measure PM2.5. The verification is made by checking the
operational flow rate of the monitor. If the verification is made in
conjunction with a flow rate adjustment, it must be made prior to
such flow rate adjustment. For the standard procedure, use a flow
rate transfer standard certified in accordance with section 2.6 of
this appendix to check the monitor's normal flow rate. Care should
be used in selecting and using the flow rate measurement device such
that it does not alter the normal operating flow rate of the
monitor. Report the flow rate of the transfer standard and the
corresponding flow rate measured by the monitor to AQS. The percent
differences between the audit and measured flow rates are used to
assess the bias of the monitoring data as described in section 4.2.2
of this appendix (using flow rates in lieu of concentrations).
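    The flow rate verification statistic parallels the gaseous percent
difference, with flow rates in lieu of concentrations. A minimal sketch
(illustrative only and not part of the rule text; hypothetical names
and values) follows.

    # Illustrative sketch; the same percent difference statistic as section
    # 4.2.2, applied to flow rates in lieu of concentrations.
    def flow_percent_difference(monitor_flow_lpm, standard_flow_lpm):
        """Percent difference of the monitor flow against the transfer standard."""
        return 100.0 * (monitor_flow_lpm - standard_flow_lpm) / standard_flow_lpm

    # Example monthly check of a nominal 16.67 L/min PM2.5 sampler.
    print(round(flow_percent_difference(16.85, 16.67), 2))  # 1.08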
3.2.2 Semi-Annual Flow Rate Audit for PM2.5. Audit the flow rate
of the particulate monitor twice a year. The two audits should
ideally be spaced between 5 and 7 months apart. The EPA strongly
encourages more frequent auditing. The audit should (preferably) be
conducted by a trained, experienced technician other than the routine
site operator. The audit is made by measuring the monitor's normal
operating flow rate(s) using a flow rate transfer standard certified
in accordance with section 2.6 of this appendix. The flow rate
standard used for auditing must not be the same flow rate standard
used for verifications or to calibrate the monitor. However, both
the calibration standard and the audit standard may be referenced to
the same primary flow rate or volume standard. Care must be taken in
auditing the flow rate to be certain that the flow measurement
device does not alter the normal operating flow rate of the monitor.
Report the audit flow rate of the transfer standard and the
corresponding flow rate measured by the monitor to AQS. The
percent differences between these flow rates are used to evaluate
monitor performance.
3.2.3 Collocated Quality Control Sampling Procedures for PM2.5.
For each pair of collocated monitors, designate one sampler as the
primary monitor whose concentrations will be used to report air
quality for the site, and designate the other as the quality control
monitor. There can be only one primary monitor at a monitoring site
for a given time period.
3.2.3.1 For each distinct monitoring method designation (FRM or
FEM) that a PQAO is using for a primary monitor, the PQAO must:
(a) Have 15 percent of the primary monitors of each method
designation collocated (values of 0.5 and greater round up); and
(b) Have at least one collocated quality control monitor (if the
total number of monitors is less than three). The first collocated
monitor must be a designated FRM monitor.
3.2.3.2 In addition, monitors selected for collocation must also
meet the following requirements:
(a) A primary monitor designated as an EPA FRM shall be
collocated with a quality control monitor having the same EPA FRM
method designation.
(b) For each primary monitor designated as an EPA FEM used by
the PQAO, 50 percent of the monitors designated for collocation, or
the first if only one collocation is necessary, shall be collocated
with a FRM quality control monitor and 50 percent of the monitors
shall be collocated with a monitor having the same method
designation as the FEM primary monitor. If an odd number of
collocated monitors is required, the additional monitor shall be a
FRM quality control monitor. An example of the distribution of
collocated monitors for each unique FEM is provided below. Table A-2
of this appendix demonstrates the procedure with a PQAO having an
FRM and multiple FEMs.
----------------------------------------------------------------------------------------------------------------
 Primary FEMs of a unique method designation      Collocated     Collocated with an FRM    Collocated with same
                                                                                            method designation
----------------------------------------------------------------------------------------------------------------
``1-9''......................................         1                    1                         0
``10-16''....................................         2                    1                         1
``17-23''....................................         3                    2                         1
``24-29''....................................         4                    2                         2
``30-36''....................................         5                    3                         2
``37-43''....................................         6                    3                         3
----------------------------------------------------------------------------------------------------------------
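    The counts in the table above follow from the 15 percent rule (with
0.5 and greater rounding up), the minimum of one collocated monitor,
and the FRM/same-designation split in which any odd monitor goes to an
FRM. A sketch reproducing the table is below; it is illustrative only
and not part of the rule text, and all names are hypothetical.

    # Illustrative sketch; implements the 15 percent rule (0.5 and greater
    # rounds up), the minimum of one, and the FEM collocation split in which
    # an odd requirement places the extra monitor with an FRM.
    import math

    def required_collocations(n_primary):
        """Collocated QC monitors required for one method designation."""
        n = math.floor(0.15 * n_primary + 0.5)  # round half up
        return max(n, 1)                        # always at least one

    def fem_collocation_split(n_primary_fem):
        """(with_frm, with_same_designation) for a unique FEM designation."""
        total = required_collocations(n_primary_fem)
        with_same = total // 2
        with_frm = total - with_same            # odd total -> extra FRM
        return with_frm, with_same

    print(required_collocations(10))   # 2
    print(fem_collocation_split(17))   # (2, 1), matching table row ``17-23''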
3.2.3.3 Since the collocation requirements are used to assess
precision of the primary monitors and there can be only one primary
monitor at a monitoring site, a site can count toward the collocation
requirement only for the method designation of the primary monitor at
that site.
3.2.3.4 The collocated monitors should be deployed according to
the following protocol:
(a) Fifty percent of the collocated quality control monitors
should be deployed at sites with annual average or daily
concentrations estimated to be within plus or minus 20 percent of
either the annual or 24-hour NAAQS and the remainder at the PQAO's
discretion;
(b) If an organization has no sites with annual average or daily
concentrations within plus or minus 20 percent of the annual NAAQS or
24-hour NAAQS, 50 percent of the collocated quality control monitors
should be deployed at those sites with the annual mean
concentrations or 24-hour concentrations among the highest for all
sites in the network and the remainder at the PQAO's discretion.
(c) The two collocated monitors must be within 4 meters (inlet
to inlet) of each other and at least 2 meters apart for flow rates
greater than 200 liters/min or at least 1 meter apart for samplers
having flow rates less than 200 liters/min to preclude airflow
interference. A waiver allowing up to 10 meters horizontal distance
and up to 3 meters vertical distance (inlet to inlet) between a
primary and collocated sampler may be approved by the Regional
Administrator for sites at a neighborhood or larger scale of
representation during the annual network plan approval process.
Calibration, sampling, and analysis must be the same for both
primary and collocated quality control samplers and the same as for
all other samplers in the network.
(d) Sample the collocated quality control monitor on a 1-in-12
day schedule. Report the measurements from both primary and
collocated quality control monitors at each collocated sampling site
to AQS. The calculations for evaluating precision between the two
collocated monitors are described in section 4.2.1 of this appendix.
3.2.4 PM2.5 Performance Evaluation Program (PEP) Procedures. The
PEP is an independent assessment used to estimate total measurement
system bias. These evaluations will be performed under the NPEP as
described in section 2.4 of this appendix or a comparable program.
Performance evaluations will be performed annually within each PQAO.
For PQAOs with less than or equal to five monitoring sites, five
valid performance evaluation audits must be collected and reported
each year. For PQAOs with greater than five monitoring sites, eight
valid performance evaluation audits must be collected and reported
each year. A valid performance evaluation audit means that both the
primary monitor and PEP audit concentrations are valid and above 3
µg/m³. Siting of the PEP monitor should be consistent with
section 3.2.3.7. However, any horizontal distance greater than 4
meters and any vertical distance greater than one meter must be
reported to the EPA regional PEP coordinator. Additionally, for every
monitor designated as a primary monitor, a primary quality assurance
organization must:
3.2.4.1 Have each method designation evaluated each year; and,
3.2.4.2 Have all FRM, FEM or ARM samplers subject to a PEP audit
at least once every six years, which equates to approximately 15
percent of the monitoring sites audited each year.
3.2.4.3 Additional information concerning the PEP is contained
in reference 10 of this appendix. The calculations for evaluating
bias between the primary monitor and the performance evaluation
monitor for PM2.5 are described in section 4.2.5 of this
appendix.
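    A minimal sketch of the PEP audit counting rules in this section
follows. It is illustrative only and not part of the rule text; the
names and data are hypothetical.

    # Illustrative sketch only; names and data are hypothetical. A valid PEP
    # audit requires both concentrations to be valid and above 3 ug/m3; PQAOs
    # need 5 valid audits per year, or 8 when they have more than five sites.
    def required_pep_audits(n_sites):
        return 5 if n_sites <= 5 else 8

    def is_valid_pep_audit(primary_ugm3, pep_ugm3):
        return (primary_ugm3 is not None and pep_ugm3 is not None
                and primary_ugm3 > 3.0 and pep_ugm3 > 3.0)

    audits = [(8.2, 7.9), (2.8, 3.1), (12.4, 11.8),
              (6.0, 5.7), (9.1, 8.8), (4.4, 4.1)]
    valid = sum(1 for p, q in audits if is_valid_pep_audit(p, q))
    print(valid >= required_pep_audits(n_sites=4))  # True (5 valid, 5 needed)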
3.3 PM10.
3.3.1 Flow Rate Verification for PM10 Low Volume Samplers (less
than 200 liters/minute). A one-point flow rate verification check
must be performed at least once every month (each verification
minimally separated by 14 days) on each monitor used to measure
PM10. The verification is made by checking the
operational flow rate of the monitor. If the verification is made in
conjunction with a flow rate adjustment, it must be made prior to
such flow rate adjustment. For the standard procedure, use a flow
rate transfer standard certified in accordance with section 2.6 of
this appendix to check the monitor's normal flow rate. Care should
be taken in selecting and using the flow rate measurement device
such that it does not alter the normal operating flow rate of the
monitor. The percent differences between the audit and measured flow
rates are reported to AQS and used to assess the bias of the
monitoring data as described in section 4.2.2 of this appendix
(using flow rates in lieu of concentrations).
3.3.2 Flow Rate Verification for PM10 High Volume Samplers
(greater than 200 liters/minute). For PM10 high volume samplers, the
verification frequency is one verification every 90 days (one per
quarter, four per year). Other than verification frequency, follow the same
technical procedure as described in section 3.3.1 of this appendix.
3.3.3 Semi-Annual Flow Rate Audit for PM10. Audit the flow rate
of the particulate monitor twice a year. The two audits should
ideally be spaced between 5 and 7 months apart. The EPA strongly
encourages more frequent auditing. The audit should (preferably) be
conducted by a trained, experienced technician other than the routine
site operator. The audit is made by measuring the monitor's normal
operating
flow rate using a flow rate transfer standard certified in
accordance with section 2.6 of this appendix. The flow rate standard
used for auditing must not be the same flow rate standard used for
verifications or to calibrate the monitor. However, both the
calibration standard and the audit standard may be referenced to the
same primary flow rate or volume standard. Care must be taken in
auditing the flow rate to be certain that the flow measurement
device does not alter the normal operating flow rate of the monitor.
Report the audit flow rate of the transfer standard and the
corresponding flow rate measured by the monitor to AQS. The percent
differences between these flow rates are used to evaluate monitor
performance.
3.3.4 Collocated Quality Control Sampling Procedures for Manual
PM10. Collocated sampling for PM10 is only required for
manual samplers. For each pair of collocated monitors, designate one
sampler as the primary monitor whose concentrations will be used to
report air quality for the site and designate the other as the
quality control monitor.
3.3.4.1 For manual PM10 samplers, a PQAO must:
(a) Have 15 percent of the primary monitors collocated (values
of 0.5 and greater round up); and
(b) Have at least one collocated quality control monitor (if the
total number of monitors is less than three).
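As a worked illustration of the 15 percent collocation requirement and
its rounding rule above (the same rule appears in sections 3.4.4.1 and
3.4.5.1), a minimal Python sketch; the function name is illustrative
and not part of the rule:

    import math

    def required_collocated(primary_monitors: int) -> int:
        """15 percent of primary monitors, with fractions of 0.5 and
        greater rounding up; networks with fewer than three monitors
        still need at least one collocated QC monitor."""
        count = math.floor(primary_monitors * 0.15 + 0.5)
        if primary_monitors < 3:
            count = max(count, 1)
        return count

    # Example: 12 primary monitors -> 0.15 x 12 = 1.8 -> 2 collocated.
    print(required_collocated(12))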
3.3.4.2 The collocated quality control monitors should be
deployed according to the following protocol:
(a) Fifty percent of the collocated quality control monitors
should be deployed at sites with daily concentrations estimated to
be within 20 percent of the applicable NAAQS and the
remainder at the PQAO's discretion;
(b) If an organization has no sites with daily concentrations
within 20 percent of the NAAQS, 50 percent of the
collocated quality control monitors should be deployed at those
sites with the daily mean concentrations among the highest for all
sites in the network and the remainder at the PQAO's discretion.
(c) The two collocated monitors must be within 4 meters (inlet
to inlet) of each other and at least 2 meters apart for flow rates
greater than 200 liters/min or at least 1 meter apart for samplers
having flow rates less than 200 liters/min to preclude airflow
interference. A waiver allowing up to 10 meters horizontal distance
and up to 3 meters vertical distance (inlet to inlet) between a
primary and collocated sampler may be approved by the Regional
Administrator for sites at a neighborhood or larger scale of
representation. This waiver may be approved during the annual
network plan approval process. Calibration, sampling, and analysis
must be the same for both collocated samplers and the same as for
all other samplers in the network.
(d) Sample the collocated quality control monitor on a 1-in-12
day schedule. Report the measurements from both primary and
collocated quality control monitors at each collocated sampling site
to AQS. The calculations for evaluating precision between the two
collocated monitors are described in section 4.2.1 of this appendix.
(e) In determining the number of collocated quality control
sites required for PM10, monitoring networks for lead
(Pb-PM10) should be treated independently from networks
for particulate matter (PM), even though the separate networks may
share one or more common samplers. However, a single quality control
monitor that meets the collocation requirements for Pb-
PM10 and PM10 may serve as a collocated
quality control monitor for both networks. Extreme care must be
taken when using the filter from a quality control monitor for both
PM10 and Pb analysis. A PM10 filter weighing
should occur prior to any Pb analysis.
3.4 Pb.
3.4.1 Flow Rate Verification for Pb-PM10 Low Volume Samplers
(less than 200 liter/minute). A one-point flow rate verification
check must be performed at least once every month (each verification
minimally separated by 14 days) on each monitor used to measure Pb.
The verification is made by checking the operational flow rate of
the monitor. If the verification is made in conjunction with a flow
rate adjustment, it must be made prior to such flow rate adjustment.
For the standard procedure, use a flow rate transfer standard
certified in accordance with section 2.6 of this appendix to check
the monitor's normal flow rate. Care should be taken in selecting
and using the flow rate measurement device such that it does not
alter the normal operating flow rate of the monitor. The percent
differences between the audit and measured flow rates are reported
to AQS and used to assess the bias of the monitoring data as
described in section 4.2.2 of this appendix (using flow rates in
lieu of concentrations).
3.4.2 Flow Rate Verification for Pb High Volume Samplers
(greater than 200 liters/minute). For high volume samplers, the
verification frequency is once per quarter (every 90 days), for a
total of four verifications per year. Other than verification
frequency, follow the
same technical procedure as described in section 3.4.1 of this
appendix.
3.4.3 Semi-Annual Flow Rate Audit for Pb. Audit the flow rate of
the particulate monitor twice a year. The two audits should ideally
be spaced between 5 and 7 months apart. The EPA strongly encourages
more frequent auditing. The audit should (preferably) be conducted
by a trained, experienced technician other than the routine site
operator. The audit is made by measuring the monitor's normal
operating flow rate using a flow rate transfer standard certified in
accordance with section 2.6 of this appendix. The flow rate standard
used for auditing must not be the same flow rate standard used for
verifications or to calibrate the monitor. However, both the
calibration standard and the audit standard may be referenced to the
same primary flow rate or volume standard. Care must be taken in
auditing the flow rate to be certain that the flow measurement
device does not alter the normal operating flow rate of the monitor.
Report the audit flow rate of the transfer standard and the
corresponding flow rate measured by the monitor to AQS. The percent
differences between these flow rates are used to evaluate monitor
performance.
3.4.4 Collocated Quality Control Sampling for TSP Pb for
monitoring sites other than non-source NCore. For each pair of
collocated monitors for manual TSP Pb samplers, designate one
sampler as the primary monitor whose concentrations will be used to
report air quality for the site, and designate the other as the
quality control monitor.
3.4.4.1 A PQAO must:
(a) Have 15 percent of the primary monitors (not counting non-
source NCore sites in PQAO) collocated. Values of 0.5 and greater
round up; and
(b) Have at least one collocated quality control monitor (if the
total number of monitors is less than three).
3.4.4.2 The collocated quality control monitors should be
deployed according to the following protocol:
(a) The first collocated Pb site selected must be the site
measuring the highest Pb concentrations in the network. If the site
is impractical, alternative sites, approved by the EPA Regional
Administrator, may be selected. If additional collocated sites are
necessary, collocated sites may be chosen that reflect average
ambient air Pb concentrations in the network.
(b) The two collocated monitors must be within 4 meters (inlet
to inlet) of each other and at least 2 meters apart for flow rates
greater than 200 liters/min or at least 1 meter apart for samplers
having flow rates less than 200 liters/min to preclude airflow
interference.
(c) Sample the collocated quality control monitor on a 1-in-12
day schedule. Report the measurements from both primary and
collocated quality control monitors at each collocated sampling site
to AQS. The calculations for evaluating precision between the two
collocated monitors are described in section 4.2.1 of this appendix.
3.4.5 Collocated Quality Control Sampling for Pb-PM10 at
monitoring sites other than non-source NCore. If a PQAO is
monitoring for Pb-PM10 at sites other than at a non-
source oriented NCore site, then the PQAO must:
3.4.5.1 Have 15 percent of the primary monitors (not counting
non-source NCore sites in PQAO) collocated. Values of 0.5 and
greater round up; and
3.4.5.2 Have at least one collocated quality control monitor (if
the total number of monitors is less than three).
3.4.5.3 The collocated monitors should be deployed according to
the following protocol:
(a) Fifty percent of the collocated quality control monitors
should be deployed at sites with the highest 3-month average
concentrations and the remainder at the PQAO's discretion.
(b) The two collocated monitors must be within 4 meters (inlet
to inlet) of each other and at least 2 meters apart for flow rates
greater than 200 liters/min or at least 1 meter apart for samplers
having flow rates less than 200 liters/min to preclude airflow
interference. A waiver allowing up to 10 meters horizontal distance
and up to 3 meters vertical distance (inlet to inlet) between a
primary and collocated sampler may be approved by the Regional
Administrator for sites at a neighborhood or larger scale of
representation. This waiver may be approved during the annual
network plan approval
process. Calibration, sampling, and analysis must be the same for
both collocated samplers and the same as for all other samplers in
the network.
(c) Sample the collocated quality control monitor on a 1-in-12
day schedule. Report the measurements from both primary and
collocated quality control monitors at each collocated sampling site
to AQS. The calculations for evaluating precision between the two
collocated monitors are described in section 4.2.1 of this appendix.
(d) In determining the number of collocated quality control
sites required for Pb-PM10, monitoring networks for
PM10 should be treated independently from networks for
Pb-PM10, even though the separate networks may share one
or more common samplers. However, a single quality control monitor
that meets the collocation requirements for Pb-PM10 and
PM10 may serve as a collocated quality control monitor
for both networks. Extreme care must be taken when using the
filter from a quality control monitor for both PM10 and
Pb analysis. A PM10 filter weighing should occur prior to
any Pb analysis.
3.4.6 Pb Analysis Audits. Each calendar quarter, audit the Pb
reference or equivalent method analytical procedure using filters
containing a known quantity of Pb. These audit filters are prepared
by depositing a Pb standard on unexposed filters and allowing them
to dry thoroughly. The audit samples must be prepared using batches
of reagents different from those used to calibrate the Pb analytical
equipment being audited. Prepare audit samples in the following
concentration ranges:
Range    Equivalent ambient Pb concentration, [mu]g/m\3\
1        30-100% of Pb NAAQS
2        200-300% of Pb NAAQS
(a) Extract the audit samples using the same extraction
procedure used for exposed filters.
(b) Analyze three audit samples in each of the two ranges during
each quarter in which samples are analyzed. The audit sample analyses
shall be
distributed as much as possible over the entire calendar quarter.
(c) Report the audit concentrations (in [mu]g Pb/filter or
strip) and the corresponding measured concentrations (in [mu]g Pb/
filter or strip) to AQS using AQS unit code 077. The percent
differences between the concentrations are used to calculate
analytical accuracy as described in section 4.2.6 of this appendix.
3.4.7 Pb PEP Procedures for monitoring sites other than non-
source NCore. The PEP is an independent assessment used to estimate
total measurement system bias. These evaluations will be performed
under the NPEP described in section 2.4 of this appendix or a
comparable program. Each year, one performance evaluation audit must
be performed at one Pb site in each primary quality assurance
organization that has less than or equal to five sites and two
audits at PQAOs with greater than five sites. Non-source oriented
NCore sites are not counted. In addition, each year, four collocated
samples from PQAOs with less than or equal to five sites and six
collocated samples from PQAOs with greater than five sites must be
sent for analysis to an independent laboratory, the same laboratory
used for the performance evaluation audit. Siting of this PEP
monitor should be consistent with section 3.4.5.4. However, any
horizontal distance greater than 4 meters and any vertical distance
greater than 1 meter must be reported to the EPA regional PEP
coordinator. The calculations for evaluating bias between the
primary monitor and the performance evaluation monitor for Pb are
described in section 4.2.4 of this appendix.
4. Calculations for Data Quality Assessment
(a) Calculations of measurement uncertainty are carried out by
the EPA according to the following procedures. The PQAOs must report
the data to AQS for all measurement quality checks as specified in
this appendix even though they may elect to perform some or all of
the calculations in this section on their own.
(b) The EPA will provide annual assessments of data quality
aggregated by site and PQAO for SO2, NO2,
O3 and CO and by PQAO for PM10,
PM2.5, and Pb.
(c) At low concentrations, agreement between the measurements of
collocated quality control samplers, expressed as relative percent
difference or percent difference, may be relatively poor. For this
reason, collocated measurement pairs are selected for use in the
precision and bias calculations only when both measurements are
equal to or above the following limits:
(1) Pb: 0.002 [mu]g/m\3\ (Methods approved after 3/04/2010, with
exception of manual equivalent method EQLA-0813-803).
(2) Pb: 0.02 [mu]g/m\3\ (Methods approved before 3/04/2010, and
manual equivalent method EQLA-0813-803).
(3) PM10(Hi-Vol): 15 [mu]g/m\3\.
(4) PM10(Lo-Vol): 3 [mu]g/m\3\.
(5) PM2.5: 3 [mu]g/m\3\.
4.1 Statistics for the Assessment of QC Checks for SO2, NO2, O3
and CO.
4.1.1 Percent Difference. Many of the measurement quality checks
start with a comparison of an audit concentration or value (flow
rate) to the concentration/value measured by the monitor and use
percent difference as the comparison statistic as described in
equation 1 of this section. For each single point check, calculate
the percent difference, di, as follows:
d_i = \frac{meas - audit}{audit} \times 100 \qquad \text{(Equation 1)}

where meas is the concentration indicated by the PQAO's instrument
and audit is the audit concentration of the standard used in the QC
check being measured.
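The percent difference statistic is simple enough to state directly in
code; a minimal Python sketch (the function name is illustrative):

    def percent_difference(meas: float, audit: float) -> float:
        """Equation 1: percent difference between the monitor's
        reading (meas) and the audit concentration or value (audit)."""
        return (meas - audit) / audit * 100.0

    # Example: a monitor reading 0.081 ppm against an 0.080 ppm audit
    # gas yields a percent difference of 1.25 percent.
    print(percent_difference(0.081, 0.080))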
4.1.2 Precision Estimate. The precision estimate is used to
assess the one-point QC checks for SO2, NO2,
O3, or CO described in section 3.1.1 of this appendix.
The precision estimator is the coefficient of variation upper bound
and is calculated using equation 2 of this section:
CV = \sqrt{\frac{n\sum_{i=1}^{n} d_i^{2} - \left(\sum_{i=1}^{n} d_i\right)^{2}}{n(n-1)}} \cdot \sqrt{\frac{n-1}{\chi^{2}_{0.1,n-1}}} \qquad \text{(Equation 2)}

where n is the number of single point checks being aggregated and
\chi^{2}_{0.1,n-1} is the 10th percentile of a chi-squared
distribution with n-1 degrees of freedom.
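Assuming the reconstruction of equation 2 above, the precision
estimator can be sketched in Python (scipy supplies the chi-squared
quantile; the function name is illustrative):

    import numpy as np
    from scipy.stats import chi2

    def cv_upper_bound(d):
        """Equation 2: coefficient-of-variation upper bound for the
        single-point QC percent differences d (one value per check)."""
        d = np.asarray(d, dtype=float)
        n = len(d)
        # Sample standard deviation of the percent differences.
        s = np.sqrt((n * np.sum(d**2) - np.sum(d)**2) / (n * (n - 1)))
        # Inflate by the chi-squared factor for the 90 percent bound.
        return s * np.sqrt((n - 1) / chi2.ppf(0.1, n - 1))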
4.1.3 Bias Estimate. The bias estimate is calculated using the
one-point QC checks for SO2, NO2,
O3, or CO described in section 3.1.1 of this appendix.
The bias estimator is an upper bound on the mean absolute value of
the percent differences as described in equation 3 of this section:
|bias| = AB + t_{0.95,n-1} \cdot \frac{AS}{\sqrt{n}} \qquad \text{(Equation 3)}

where n is the number of single point checks being aggregated;
t_{0.95,n-1} is the 95th quantile of a t-distribution with n-1
degrees of freedom; the quantity AB is the mean of the absolute
values of the d_i's and is calculated using equation 4 of
this section:

AB = \frac{1}{n}\sum_{i=1}^{n}|d_i| \qquad \text{(Equation 4)}

and the quantity AS is the standard deviation of the absolute values
of the d_i's and is calculated using equation 5 of this
section:

AS = \sqrt{\frac{n\sum_{i=1}^{n}|d_i|^{2} - \left(\sum_{i=1}^{n}|d_i|\right)^{2}}{n(n-1)}} \qquad \text{(Equation 5)}
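A companion Python sketch for the bias estimator of equations 3
through 5, under the same assumptions (names illustrative):

    import numpy as np
    from scipy.stats import t

    def abs_bias_upper_bound(d):
        """Equations 3-5: 95 percent upper bound on the mean absolute
        percent difference; AB is the mean and AS the sample standard
        deviation of the absolute values of d."""
        d = np.abs(np.asarray(d, dtype=float))
        n = len(d)
        ab = d.mean()                        # equation 4
        as_ = d.std(ddof=1)                  # equation 5
        return ab + t.ppf(0.95, n - 1) * as_ / np.sqrt(n)   # equation 3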
4.1.3.1 Assigning a sign (positive/negative) to the bias
estimate. Since the bias statistic as calculated in equation 3 of
this appendix uses absolute values, it does not have a tendency
(negative or positive bias) associated with it. A sign will be
designated by rank ordering the percent differences of the QC check
samples from a given site for a particular assessment interval.
4.1.3.2 Calculate the 25th and 75th percentiles of the percent
differences for each site. The absolute bias upper bound should be
flagged as positive if both percentiles are positive and negative if
both percentiles are negative. The absolute bias upper bound would
not be flagged if the 25th and 75th percentiles are of different
signs.
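The percentile-based sign assignment can likewise be sketched in
Python (illustrative only):

    import numpy as np

    def bias_sign(d):
        """Flag the absolute bias upper bound using the 25th and 75th
        percentiles of a site's percent differences; leave it
        unflagged when the percentiles disagree in sign."""
        p25, p75 = np.percentile(np.asarray(d, dtype=float), [25, 75])
        if p25 > 0 and p75 > 0:
            return "positive"
        if p25 < 0 and p75 < 0:
            return "negative"
        return "unflagged"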
4.2 Statistics for the Assessment of PM10, PM2.5, and Pb.
4.2.1 Collocated Quality Control Sampler Precision Estimate for
PM10, PM2.5 and Pb. Precision is estimated via duplicate
measurements from collocated samplers. It is recommended that the
precision be aggregated at the PQAO level quarterly, annually, and
at the 3-year level. The data pair would only be considered valid if
both concentrations are greater than or equal to the minimum values
specified in section 4(c) of this appendix. For each collocated data
pair, calculate the relative percent difference, di,
using equation 6 of this appendix:
d_i = \frac{X_i - Y_i}{(X_i + Y_i)/2} \times 100 \qquad \text{(Equation 6)}

where X_i is the concentration from the primary sampler
and Y_i is the concentration value from the audit sampler.
The coefficient of variation upper bound is calculated using
equation 7 of this appendix:

CV = \sqrt{\frac{n\sum_{i=1}^{n} d_i^{2} - \left(\sum_{i=1}^{n} d_i\right)^{2}}{2n(n-1)}} \cdot \sqrt{\frac{n-1}{\chi^{2}_{0.1,n-1}}} \qquad \text{(Equation 7)}

where n is the number of valid data pairs being aggregated and
\chi^{2}_{0.1,n-1} is the 10th percentile of a chi-squared
distribution with n-1 degrees of freedom. The factor of 2 in the
denominator adjusts for the fact that each d_i is
calculated from two values with error.
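A minimal Python sketch combining equations 6 and 7 for collocated
pairs (x from the primary sampler, y from the quality control
sampler; names illustrative):

    import numpy as np
    from scipy.stats import chi2

    def collocated_cv_upper_bound(x, y):
        """Equations 6-7: relative percent difference per collocated
        pair, then the chi-squared-bounded CV; the factor of 2 in the
        denominator reflects that both members of each pair carry
        measurement error."""
        x, y = np.asarray(x, float), np.asarray(y, float)
        d = (x - y) / ((x + y) / 2.0) * 100.0          # equation 6
        n = len(d)
        var = (n * np.sum(d**2) - np.sum(d)**2) / (2.0 * n * (n - 1))
        return np.sqrt(var) * np.sqrt((n - 1) / chi2.ppf(0.1, n - 1))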
4.2.2 One-Point Flow Rate Verification Bias Estimate for PM10,
PM2.5 and Pb. For each one-point flow rate verification, calculate
the percent difference in volume using equation 1 of this appendix
where meas is the value indicated by the sampler's volume
measurement and audit is the actual volume indicated by the auditing
flow meter. The absolute volume bias upper bound is then calculated
using equation 3, where n is the number of flow rate audits being
aggregated; t_{0.95,n-1} is the 95th quantile of a t-distribution
with n-1 degrees of freedom; the quantity AB is the mean of the
absolute values of the d_i's and is calculated using equation 4 of
this appendix; and the quantity AS in equation 3 of this appendix is
the standard deviation of the absolute values of the d_i's and is
calculated using equation 5 of this appendix.
4.2.3 Semi-Annual Flow Rate Audit Bias Estimate for PM10, PM2.5
and Pb. Use the same procedure described in section 4.2.2 for the
evaluation of flow rate audits.
4.2.4 Performance Evaluation Programs Bias Estimate for Pb. The
Pb bias estimate is calculated using the paired routine and the PEP
monitor as described in section 3.4.7. Use the same procedures as
described in section 4.1.3 of this appendix.
4.2.5 Performance Evaluation Programs Bias Estimate for PM2.5.
The bias estimate is calculated using the PEP audits described in
section 3.2.4 of this appendix. The bias estimator is based on the
mean percent differences (Equation 1). The mean percent difference,
D, is calculated by Equation 8 below.
D = \frac{1}{n_j}\sum_{i=1}^{n_j} d_i \qquad \text{(Equation 8)}

where n_j is the number of pairs and d_1, d_2, ..., d_{n_j} are the
biases for each pair to be averaged.
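Equation 8 is a plain average of the paired biases; for completeness,
a one-line Python sketch (illustrative):

    import numpy as np

    def mean_percent_difference(d):
        """Equation 8: mean of the paired percent differences d1..dnj."""
        return float(np.mean(np.asarray(d, dtype=float)))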
4.2.6 Pb Analysis Audit Bias Estimate. The bias estimate is
calculated using the analysis audit data described in section 3.4.6.
Use the same bias estimate procedure as described in section 4.1.3
of this appendix.
5. Reporting Requirements
5.1 Reporting Requirements. For each pollutant, prepare a list
of all monitoring sites and their AQS site identification codes in
each PQAO and submit the list to the appropriate EPA regional
office, with a copy to AQS. Whenever there is a change in this list
of monitoring sites in a PQAO, report this change to the EPA
regional office and to AQS.
5.1.1 Quarterly Reports. For each quarter, each PQAO shall
report to AQS directly (or via the appropriate EPA regional office
for organizations not direct users of AQS) the results of all valid
measurement quality checks it has carried out during the quarter.
The quarterly reports must be submitted consistent with the data
reporting requirements specified for air quality data as set forth
in 40 CFR 58.16. The EPA strongly encourages early submission of the
quality assurance data in order to assist the PQAO's ability to
control and evaluate the quality of the ambient air data.
5.1.2 Annual Reports.
5.1.2.1 When the PQAO has certified relevant data for the
calendar year, the EPA will calculate and report the measurement
uncertainty for the entire calendar year.
6.0 References
(1) American National Standard--Specifications and Guidelines for
Quality Systems for Environmental Data Collection and Environmental
Technology Programs. ANSI/ASQC E4-2004. February 2004. Available
from American Society for Quality Control, 611 East Wisconsin
Avenue, Milwaukee, WI 53202.
(2) EPA Requirements for Quality Management Plans. EPA QA/R-2. EPA/
240/B-01/002. March 2001, Reissue May 2006. Office of Environmental
Information, Washington DC 20460. https://www.epa.gov/quality/qs-docs/r2-final.pdf.
(3) EPA Requirements for Quality Assurance Project Plans for
Environmental Data Operations. EPA QA/R-5. EPA/240/B-01/003. March
2001, Reissue May 2006. Office of Environmental Information,
Washington DC 20460. https://www.epa.gov/quality/qs-docs/r5-final.pdf.
(4) EPA Traceability Protocol for Assay and Certification of Gaseous
Calibration Standards. EPA-600/R-12/531. May 2012. Available from
U.S. Environmental Protection Agency, National Risk Management
Research Laboratory, Research Triangle Park NC 27711. https://www.epa.gov/nrmrl/appcd/mmd/db-traceability-protocol.html.
(5) Guidance for the Data Quality Objectives Process. EPA QA/G-4.
EPA/240/B-06/001. February 2006. Office of Environmental
Information, Washington DC 20460. https://www.epa.gov/quality/qs-docs/g4-final.pdf.
(6) List of Designated Reference and Equivalent Methods. Available
from U.S. Environmental Protection Agency, National Exposure
Research Laboratory,
Human Exposure and Atmospheric Sciences Division, MD-D205-03,
Research Triangle Park, NC 27711. https://www.epa.gov/ttn/amtic/criteria.html.
(7) Transfer Standards for the Calibration of Ambient Air Monitoring
Analyzers for Ozone. EPA-454/B-13-004. U.S. Environmental Protection
Agency, Research Triangle Park, NC 27711, October 2013. https://www.epa.gov/ttn/amtic/qapollutant.html.
(8) Paur, R.J. and F.F. McElroy. Technical Assistance Document for
the Calibration of Ambient Ozone Monitors. EPA-600/4-79-057. U.S.
Environmental Protection Agency, Research Triangle Park, NC 27711,
September 1979. https://www.epa.gov/ttn/amtic/cpreldoc.html.
(9) Quality Assurance Handbook for Air Pollution Measurement
Systems, Volume 1-A Field Guide to Environmental Quality Assurance.
EPA-600/R-94/038a. April 1994. Available from U.S. Environmental
Protection Agency, ORD Publications Office, Center for Environmental
Research Information (CERI), 26 W. Martin Luther King Drive,
Cincinnati, OH 45268. https://www.epa.gov/ttn/amtic/qabook.html.
(10) Quality Assurance Handbook for Air Pollution Measurement
Systems, Volume II: Ambient Air Quality Monitoring Program Quality
System Development. EPA-454/B-13-003. https://www.epa.gov/ttn/amtic/qabook.html.
(11) National Performance Evaluation Program Standard Operating
Procedures. https://www.epa.gov/ttn/amtic/npapsop.html.
Table A-1 of Appendix A to Part 58--Minimum Data Assessment Requirements for NAAQS Related Criteria Pollutant Monitors

Gaseous Methods (CO, NO2, SO2, O3):

1-Point QC for SO2, NO2, O3, CO. Assessment method: response check
at concentration 0.005-0.08 ppm (SO2, NO2, O3) or 0.5-5 ppm (CO).
Coverage: each analyzer. Minimum frequency: once per 2 weeks.
Parameters reported: audit concentration \1\ and measured
concentration \2\. AQS assessment type: 1-Point QC.

Annual performance evaluation for SO2, NO2, O3, CO. Assessment
method: see section 3.1.2 of this appendix. Coverage: each analyzer.
Minimum frequency: once per year. Parameters reported: audit
concentration \1\ and measured concentration \2\ for each level. AQS
assessment type: Annual PE.

NPAP for SO2, NO2, O3, CO. Assessment method: independent audit.
Coverage: 20% of sites each year. Minimum frequency: once per year.
Parameters reported: audit concentration \1\ and measured
concentration \2\ for each level. AQS assessment type: NPAP.

Particulate Methods:

Continuous \4\ method-collocated quality control sampling PM2.5.
Assessment method: collocated samplers. Coverage: 15%. Minimum
frequency: 1-in-12 days. Parameters reported: primary sampler
concentration and duplicate sampler concentration.\3\ AQS assessment
type: no transaction, reported as raw data.

Manual method-collocated quality control sampling PM10, PM2.5,
Pb-TSP, Pb-PM10. Assessment method: collocated samplers. Coverage:
15%. Minimum frequency: 1-in-12 days. Parameters reported: primary
sampler concentration and duplicate sampler concentration.\3\ AQS
assessment type: no transaction, reported as raw data.

Flow rate verification PM10 (low-vol), PM2.5, Pb-PM10. Assessment
method: check of sampler flow rate. Coverage: each sampler. Minimum
frequency: once every month. Parameters reported: audit flow rate
and measured flow rate indicated by the sampler. AQS assessment
type: Flow Rate Verification.

Flow rate verification PM10 (high-vol), Pb-TSP. Assessment method:
check of sampler flow rate. Coverage: each sampler. Minimum
frequency: once every quarter. Parameters reported: audit flow rate
and measured flow rate indicated by the sampler. AQS assessment
type: Flow Rate Verification.

Semi-annual flow rate audit PM10, TSP, PM10-2.5, PM2.5, Pb-TSP,
Pb-PM10. Assessment method: check of sampler flow rate using
independent standard. Coverage: each sampler. Minimum frequency:
once every 6 months. Parameters reported: audit flow rate and
measured flow rate indicated by the sampler. AQS assessment type:
Semi Annual Flow Rate Audit.

Pb analysis audits Pb-TSP, Pb-PM10. Assessment method: check of
analytical system with Pb audit strips/filters. Coverage:
analytical. Minimum frequency: once each quarter. Parameters
reported: measured value and audit value ([mu]g Pb/filter) using AQS
unit code 077. AQS assessment type: Pb Analysis Audits.

Performance Evaluation Program PM2.5. Assessment method: collocated
samplers. Coverage: (1) 5 valid audits for primary QA orgs with <=5
sites; (2) 8 valid audits for primary QA orgs with >5 sites; (3) all
samplers in 6 years. Minimum frequency: distributed over all 4
quarters. Parameters reported: primary sampler concentration and
performance evaluation sampler concentration. AQS assessment type:
PEP.

Performance Evaluation Program Pb-TSP, Pb-PM10. Assessment method:
collocated samplers. Coverage: (1) 1 valid audit and 4 collocated
samples for primary QA orgs with <=5 sites; (2) 2 valid audits and 6
collocated samples for primary QA orgs with >5 sites. Minimum
frequency: distributed over all 4 quarters. Parameters reported:
primary sampler concentration and performance evaluation sampler
concentration; primary sampler concentration and duplicate sampler
concentration. AQS assessment type: PEP.

\1\ Effective concentration for open path analyzers.
\2\ Corrected concentration, if applicable for open path analyzers.
\3\ Both primary and collocated sampler values are reported as raw data.
\4\ PM2.5 is the only particulate criteria pollutant requiring collocation of continuous and manual primary monitors.
Table A-2 of Appendix A to Part 58--Summary of PM2.5 Number and Type of Collocation (15% Collocation
Requirement) Required Using an Example of a PQAO That Has 54 Primary Monitors (54 Sites) With One Federal
Reference Method Type and Three Types of Approved Federal Equivalent Methods

Primary sampler      Total number   Total number   Number collocated   Number collocated with same
method designation   of monitors    collocated     with FRM            method designation as primary
FRM                  20             3              3                   3
FEM (A)              20             3              2                   1
FEM (B)              2              1              1                   0
FEM (C)              12             2              1                   1
10. Add Appendix B to part 58 to read as follows:
Appendix B to Part 58--Quality Assurance Requirements for Prevention of
Significant Deterioration (PSD) Air Monitoring
1. General Information
2. Quality System Requirements
3. Measurement Quality Check Requirements
4. Calculations for Data Quality Assessments
5. Reporting Requirements
6. References
1. General Information
1.1 Applicability.
(a) This appendix specifies the minimum quality assurance
requirements for the control and assessment of the quality of the
ambient air monitoring data submitted to a PSD reviewing authority
or the EPA by an organization operating an air monitoring station,
or network of stations, operated in order to comply with Part 51 New
Source Review--Prevention of Significant Deterioration (PSD). Such
organizations are encouraged to develop and maintain quality
assurance programs more extensive than the required minimum.
Additional guidance for the requirements reflected in this appendix
can be found in the ``Quality Assurance Handbook for Air Pollution
Measurement Systems,'' Volume II (Ambient Air) and ``Quality
Assurance Handbook for Air Pollution Measurement Systems,'' Volume
IV (Meteorological Measurements) and at a national level in
references 1, 2, and 3 of this appendix.
(b) It is not assumed that data generated for PSD under this
appendix will be used in making NAAQS decisions. However, if all the
requirements in this appendix are followed (including the NPEP
programs) and reported to AQS, with review and concurrence from the
EPA region, data may be used for NAAQS decisions. With the exception
of the NPEP programs (NPAP, PM2.5 PEP, Pb-PEP) for which
implementation is at the discretion of the PSD reviewing authority,
all other quality assurance and quality control requirements found
in this appendix must be met.
1.2 PSD Primary Quality Assurance Organization (PQAO). A PSD
PQAO is defined as a monitoring organization or a coordinated
aggregation of such organizations that is responsible for a set of
stations within one reviewing authority that monitors the same
pollutant and for which data quality assessments will be pooled.
Each criteria pollutant/monitor must be associated with only one PSD
PQAO.
1.2.1 Each PSD PQAO shall be defined such that measurement
uncertainty among all stations in the organization can be expected
to be reasonably homogeneous, as a result of common factors. A PSD
PQAO must be associated with only one PSD reviewing authority.
Common factors that should be considered in defining PSD PQAOs
include:
(a) Operation by a common team of field operators according to a
common set of procedures;
(b) Use of a common QAPP and/or standard operating procedures;
(c) Common calibration facilities and standards;
(d) Oversight by a common quality assurance organization; and
(e) Support by a common management organization or laboratory.
1.2.2 PSD monitoring organizations having difficulty describing
their PQAO or assigning specific monitors to a PSD PQAO should
consult with the reviewing authority. Any consolidation of PSD PQAOs
shall be subject to final approval by the PSD reviewing authority.
1.2.3 Each PSD PQAO is required to implement a quality system
that provides sufficient information to assess the quality of the
monitoring data. The quality system must, at a minimum, include the
specific requirements described in this appendix. Failure to conduct
or pass a required check or procedure, or a series of required
checks or procedures, does not by itself invalidate data for
regulatory decision making. Rather, PSD PQAOs and the PSD reviewing
authority shall use the checks and procedures required in this
appendix in combination with other data quality information,
reports, and similar documentation that demonstrate overall
compliance with parts 51, 52 and 58 of this chapter. Accordingly,
the PSD reviewing authority shall use a ``weight of evidence''
approach when determining the suitability of data for regulatory
decisions. The PSD reviewing authority reserves the authority to use
or not use monitoring data submitted by a PSD monitoring
organization when making regulatory decisions based on the PSD
reviewing authority's assessment of the quality of the data.
Generally, consensus built validation templates or validation
criteria already approved in quality assurance project plans (QAPPs)
should be used as the basis for the weight of evidence approach.
1.3 Definitions.
(a) Measurement Uncertainty. A term used to describe deviations
from a true concentration or estimate that are related to the
measurement process and not to spatial or temporal population
attributes of the air being measured.
(b) Precision. A measurement of mutual agreement among
individual measurements of the same property usually under
prescribed similar conditions, expressed generally in terms of the
standard deviation.
(c) Bias. The systematic or persistent distortion of a
measurement process which causes errors in one direction.
(d) Accuracy. The degree of agreement between an observed value
and an accepted reference value. Accuracy includes a combination of
random error (imprecision) and systematic error (bias) components
which are due to sampling and analytical operations.
(e) Completeness. A measure of the amount of valid data obtained
from a measurement system compared to the amount that was expected
to be obtained under correct, normal conditions.
(f) Detectability. The low critical range value of a
characteristic that a method specific procedure can reliably
discern.
1.4 Measurement Quality Check Reporting. The measurement quality
checks described in section 3 of this appendix are required to be
submitted to the PSD reviewing authority within the same time frame
as routinely-collected ambient concentration data as described in 40
CFR 58.16. The PSD reviewing authority may also require that the
measurement quality check data be reported to AQS.
1.5 Assessments and Reports. Periodic assessments and
documentation of data quality are required to be reported to the PSD
reviewing authority. To provide national uniformity in this
assessment and reporting of data quality for all networks, specific
assessment and reporting procedures are prescribed in detail in
sections 3, 4, and 5 of this appendix.
2. Quality System Requirements
A quality system (reference 1 of this appendix) is the means by
which an organization manages the quality of the monitoring
information it produces in a
systematic, organized manner. It provides a framework for planning,
implementing, assessing and reporting work performed by an
organization and for carrying out required quality assurance and
quality control activities.
2.1 Quality Assurance Project Plans. All PSD PQAOs must develop
a quality system that is described and approved in quality assurance
project plans (QAPP) to ensure that the monitoring results:
(a) Meet a well-defined need, use, or purpose (reference 5 of
this appendix);
(b) Provide data of adequate quality for the intended monitoring
objectives;
(c) Satisfy stakeholder expectations;
(d) Comply with applicable standards specifications;
(e) Comply with statutory (and other legal) requirements; and
(f) Assure quality assurance and quality control adequacy and
independence.
2.1.1 The QAPP is a formal document that describes these
activities in sufficient detail and is supported by standard
operating procedures. The QAPP must describe how the organization
intends to control measurement uncertainty to an appropriate level
in order to achieve the objectives for which the data are collected.
The QAPP must be documented in accordance with EPA requirements
(reference 3 of this appendix).
2.1.2 The PSD PQAO's quality system must have adequate resources
both in personnel and funding to plan, implement, assess and report
on the achievement of the requirements of this appendix and its
approved QAPP.
2.1.3 Incorporation of quality management plan (QMP) elements
into the QAPP. The QMP describes the quality system in terms of the
organizational structure, functional responsibilities of management
and staff, lines of authority, and required interfaces for those
planning, implementing, assessing and reporting activities involving
environmental data operations (EDO). With prior approval of the PSD
reviewing authority, PSD PQAOs may combine pertinent elements of the
QMP into the QAPP rather than submitting the QMP and QAPP as
separate documents. Additional guidance on QMPs can be found in
reference 2 of this appendix.
2.2 Independence of Quality Assurance Management. The PSD PQAO
must provide for a quality assurance management function for its PSD
data collection operation, that aspect of the overall management
system of the organization that determines and implements the
quality policy defined in a PSD PQAO's QAPP. Quality management
includes strategic planning, allocation of resources and other
systematic planning activities (e.g., planning, implementation,
assessing and reporting) pertaining to the quality system. The
quality assurance management function must have sufficient technical
expertise and management authority to conduct independent oversight
and assure the implementation of the organization's quality system
relative to the ambient air quality monitoring program and should be
organizationally independent of environmental data generation
activities.
2.3 Data Quality Performance Requirements.
2.3.1 Data Quality Objectives (DQOs). The DQOs, or the results
of other systematic planning processes, are statements that define
the appropriate type of data to collect and specify the tolerable
levels of potential decision errors that will be used as a basis for
establishing the quality and quantity of data needed to support air
monitoring objectives (reference 5 of this appendix). The DQOs have
been developed by the EPA to support attainment decisions for
comparison to national ambient air quality standards (NAAQS). The
reviewing authority and the PSD monitoring organization will be
jointly responsible for determining whether adherence to the EPA
developed NAAQS DQOs specified in appendix A of this part are
appropriate or if DQOs from a project-specific systematic planning
process are necessary.
2.3.1.1 Measurement Uncertainty for Automated and Manual PM2.5
Methods. The goal for acceptable measurement uncertainty for
precision is defined as an upper 90 percent confidence limit for the
coefficient of variation (CV) of 10 percent and plus or minus 10
percent for total bias.
2.3.1.2 Measurement Uncertainty for Automated Ozone Methods. The
goal for acceptable measurement uncertainty is defined for precision
as an upper 90 percent confidence limit for the CV of 7 percent and
for bias as an upper 95 percent confidence limit for the absolute
bias of 7 percent.
2.3.1.3 Measurement Uncertainty for Pb Methods. The goal for
acceptable measurement uncertainty is defined for precision as an
upper 90 percent confidence limit for the CV of 20 percent and for
bias as an upper 95 percent confidence limit for the absolute bias
of 15 percent.
2.3.1.4 Measurement Uncertainty for NO2. The goal for acceptable
measurement uncertainty is defined for precision as an upper 90
percent confidence limit for the CV of 15 percent and for bias as an
upper 95 percent confidence limit for the absolute bias of 15
percent.
2.3.1.5 Measurement Uncertainty for SO2. The goal for acceptable
measurement uncertainty for precision is defined as an upper 90
percent confidence limit for the CV of 10 percent and for bias as an
upper 95 percent confidence limit for the absolute bias of 10
percent.
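Collecting the precision and bias goals of sections 2.3.1.1 through
2.3.1.5 into a lookup table makes the comparison mechanical; a
minimal Python sketch (the table and function names are illustrative,
not part of the rule):

    # Goals as (90 percent confidence limit on CV, confidence limit on
    # absolute bias), both in percent; the PM2.5 bias goal is stated in
    # the rule as plus or minus 10 percent total bias.
    DQO_GOALS = {
        "PM2.5": (10.0, 10.0),
        "O3":    (7.0, 7.0),
        "Pb":    (20.0, 15.0),
        "NO2":   (15.0, 15.0),
        "SO2":   (10.0, 10.0),
    }

    def meets_dqo(pollutant, cv_ub, abs_bias_ub):
        cv_goal, bias_goal = DQO_GOALS[pollutant]
        return cv_ub <= cv_goal and abs_bias_ub <= bias_goal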
2.4 National Performance Evaluation Program. Organizations
operating PSD monitoring networks are required to implement the
EPA's national performance evaluation program (NPEP) if the data
will be used for NAAQS decisions and at the discretion of the PSD
reviewing authority if PSD data is not used for NAAQS decisions. The
NPEP includes the National Performance Audit Program (NPAP), the
PM2.5 Performance Evaluation Program (PM2.5-
PEP) and the Pb Performance Evaluation Program (Pb-PEP). The PSD
QAPP shall provide for the implementation of NPEP including the
provision of adequate resources for such audit programs. Contact the
PSD reviewing authority to determine the best procedure for
implementing the audits which may include an audit by the PSD
reviewing authority, a contractor certified for the activity, or
through self-implementation which is described in sections below. A
determination of which entity will be performing this audit program
should be made as early as possible and during the QAPP development
process. PSD PQAOs that plan to implement these programs themselves
(self-implement) rather than use the federal programs, including
contractors that plan to implement the programs on behalf of PSD
PQAOs, must meet the adequacy requirements found in the appropriate
sections that follow, as well as meet the definition of independent
assessment that follows.
2.4.1 Independent Assessment. An assessment performed by a
qualified individual, group, or organization that is not part of the
organization directly performing and accountable for the work being
assessed. This auditing organization must not be involved with the
generation of the routinely-collected ambient air monitoring data.
An organization can conduct the performance evaluation (PE) if it
can meet this definition and has a management structure that, at a
minimum, will allow for the separation of its routine sampling
personnel from its auditing personnel by two levels of management.
In addition, the sample analysis of audit filters must be performed
by a laboratory facility and laboratory equipment separate from the
facilities used for routine sample analysis. Field and laboratory
personnel will be required to meet the performance evaluation field
and laboratory training and certification requirements. The PSD PQAO
will be required to participate in the centralized field and
laboratory standards certification and comparison processes to
establish comparability to federally implemented programs.
2.5 Technical Systems Audit Program. The PSD reviewing authority
or the EPA may conduct system audits of the ambient air monitoring
programs or organizations operating PSD networks. The PSD monitoring
organizations shall consult with the PSD reviewing authority to
verify the schedule of any such technical systems audit. Systems
audit programs are described in reference 10 of this appendix.
2.6 Gaseous and Flow Rate Audit Standards.
2.6.1 Gaseous pollutant concentration standards (permeation
devices or cylinders of compressed gas) used to obtain test
concentrations for carbon monoxide (CO), sulfur dioxide
(SO2), nitric oxide (NO), and nitrogen dioxide
(NO2) must be traceable to either a National Institute of
Standards and Technology (NIST) Traceable Reference Material (NTRM)
or a NIST-certified Gas Manufacturer's Internal Standard (GMIS),
certified in accordance with one of the procedures given in
reference 4 of this appendix. Vendors advertising certification with
the procedures provided in reference 4 of this appendix and
distributing gases as ``EPA Protocol Gas'' must participate in the
EPA Protocol Gas Verification Program or not use ``EPA'' in any form
of advertising. The PSD PQAOs must provide information to the PSD
reviewing authority on the gas vendors they use (or will use) for
the duration of the PSD monitoring project. This information can be
provided in the QAPP or monitoring plan, but must be updated if
there is a change in the producer used.
2.6.2 Test concentrations for ozone (O3) must be
obtained in accordance with the ultraviolet photometric calibration
procedure specified in appendix D to part 50, and by means of a
certified NIST-traceable O3 transfer standard. Consult
references 7 and 8 of this appendix for guidance on transfer
standards for O3.
2.6.3 Flow rate measurements must be made by a flow measuring
instrument that is NIST-traceable to an authoritative volume or
other applicable standard. Guidance for certifying some types of
flow-meters is provided in reference 10 of this appendix.
2.7 Primary Requirements and Guidance. Requirements and guidance
documents for developing the quality system are contained in
references 1 through 11 of this appendix, which also contain many
suggested procedures, checks, and control specifications. Reference
10 describes specific guidance for the development of a quality
system for data collected for comparison to the NAAQS. Many specific
quality control checks and specifications for methods are included
in the respective reference methods described in part 50 or in the
respective equivalent method descriptions available from the EPA
(reference 6 of this appendix). Similarly, quality control
procedures related to specifically designated reference and
equivalent method monitors are contained in the respective operation
or instruction manuals associated with those monitors. For PSD
monitoring, the use of reference and equivalent method monitors is
required.
3. Measurement Quality Check Requirements
This section provides the requirements for PSD PQAOs to perform
the measurement quality checks that can be used to assess data
quality. Data from these checks are required to be submitted to the
PSD reviewing authority within the same time frame as routinely-
collected ambient concentration data as described in 40 CFR 58.16.
Table B-1 of this appendix provides a summary of the types and
frequency of the measurement quality checks that are described in
this section. Reporting these results to AQS may be required by the
PSD reviewing authority.
3.1 Gaseous monitors of SO2, NO2, O3, and CO.
3.1.1 One-Point Quality Control (QC) Check for SO2, NO2, O3, and
CO. (a) A one-point QC check must be performed at least once every 2
weeks on each automated monitor used to measure SO2,
NO2, O3 and CO. With the advent of automated
calibration systems, more frequent checking is strongly encouraged
and may be required by the PSD reviewing authority. See Reference 10
of this appendix for guidance on the review procedure. The QC check
is made by challenging the monitor with a QC check gas of known
concentration (effective concentration for open path monitors)
between the prescribed range of 0.005 and 0.08 parts per million
(ppm) for SO2, NO2, and O3, and
between the prescribed range of 0.5 and 5 ppm for CO monitors. The
QC check gas concentration selected within the prescribed range must
be related to the mean or median of the ambient air concentrations
normally measured at sites within the PSD monitoring network in
order to appropriately reflect the precision and bias at these
routine concentration ranges. If the mean or median concentrations
at the sites are below or above the prescribed range, select the
lowest or highest concentration in the range. An additional QC check
point is encouraged for those organizations that may have occasional
high values or would like to confirm the monitors' linearity at the
higher end of the operational range.
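The concentration-selection rule in paragraph (a) amounts to clamping
the network's typical ambient level to the prescribed range; a minimal
Python sketch (names illustrative):

    def qc_check_concentration(site_typical: float, lo: float, hi: float) -> float:
        """Pick the one-point QC check gas concentration: the mean or
        median ambient level normally measured in the network, clamped
        to the prescribed range (0.005-0.08 ppm for SO2/NO2/O3;
        0.5-5 ppm for CO)."""
        return min(max(site_typical, lo), hi)

    # Example: a network with median O3 of 0.042 ppm checks at 0.042 ppm;
    # one with median 0.002 ppm checks at the range floor, 0.005 ppm.
    print(qc_check_concentration(0.042, 0.005, 0.08))
    print(qc_check_concentration(0.002, 0.005, 0.08))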
(b) Point analyzers must operate in their normal sampling mode
during the QC check and the test atmosphere must pass through all
filters, scrubbers, conditioners and other components used during
normal ambient sampling and as much of the ambient air inlet system
as is practicable. The QC check must be conducted before any
calibration or adjustment to the monitor.
(c) Open-path monitors are tested by inserting a test cell
containing a QC check gas concentration into the optical measurement
beam of the instrument. If possible, the normally used transmitter,
receiver, and as appropriate, reflecting devices should be used
during the test and the normal monitoring configuration of the
instrument should be altered as little as possible to accommodate
the test cell for the test. However, if permitted by the associated
operation or instruction manual, an alternate local light source or
an alternate optical path that does not include the normal
atmospheric monitoring path may be used. The actual concentration of
the QC check gas in the test cell must be selected to produce an
effective concentration in the range specified earlier in this
section. Generally, the QC test concentration measurement will be
the sum of the atmospheric pollutant concentration and the QC test
concentration. As such, the result must be corrected to remove the
atmospheric concentration contribution. The corrected concentration
is obtained by subtracting the average of the atmospheric
concentrations measured by the open path instrument under test
immediately before and immediately after the QC test from the QC
check gas concentration measurement. If the difference between these
before and after measurements is greater than 20 percent of the
effective concentration of the test gas, discard the test result and
repeat the test. If possible, open path monitors should be tested
during periods when the atmospheric pollutant concentrations are
relatively low and steady.
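The open-path correction in paragraph (c) can be sketched as follows
(a minimal illustration of the stated rule; the function name is
hypothetical):

    def corrected_open_path_qc(qc_measured: float, atm_before: float,
                               atm_after: float, effective_conc: float):
        """Subtract the average of the ambient concentrations measured
        immediately before and after the QC test; discard the result
        when the ambient level shifted by more than 20 percent of the
        effective test concentration."""
        if abs(atm_after - atm_before) > 0.20 * effective_conc:
            return None   # unstable ambient background: repeat the test
        return qc_measured - (atm_before + atm_after) / 2.0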
(d) Report the audit concentration of the QC gas and the
corresponding measured concentration indicated by the monitor. The
percent differences between these concentrations are used to assess
the precision and bias of the monitoring data as described in
sections 4.1.2 (precision) and 4.1.3 (bias) of this appendix.
3.1.2 Quarterly performance evaluation for SO2, NO2, O3, or CO.
Evaluate each primary monitor each calendar quarter during which
monitors are operated or at least once (if operated for less than
one quarter). The quarterly performance evaluation (quarterly PE) must
be performed by a qualified individual, group, or organization that
is not part of the organization directly performing and accountable
for the work being assessed. The person or entity performing the
quarterly PE must not be involved with the generation of the
routinely-collected ambient air monitoring data. A PSD monitoring
organization can conduct the quarterly PE itself if it can meet this
definition and has a management structure that, at a minimum, will
allow for the separation of its routine sampling personnel from its
auditing personnel by two levels of management. The quarterly PE
also requires a set of equipment and standards independent from
those used for routine calibrations or zero, span or precision
checks. The PE personnel will be required to meet PE training and
certification requirements.
3.1.2.1 The evaluation is made by challenging the monitor with
audit gas standards of known concentration from at least three audit
levels. Two of the audit levels selected will represent a range of
10-80 percent of the typical ambient air concentrations either
measured by the monitor or in the PQAO's network of monitors. The
third point should be at the NAAQS level or above the highest
anticipated routine hourly concentration, whichever is greater. An
additional 4th level is encouraged for those PSD organizations that
would like to confirm the monitor's linearity at the higher end of
the operational range. In rare circumstances, there may be sites
measuring concentrations above audit level 10. These sites should be
identified to the PSD reviewing authority.
Audit level    Concentration range, ppm
               O3              SO2             NO2             CO
1              0.004-0.0059    0.0003-0.0029   0.0003-0.0029   0.020-0.059
2              0.006-0.019     0.0030-0.0049   0.0030-0.0049   0.060-0.199
3              0.020-0.039     0.0050-0.0079   0.0050-0.0079   0.200-0.899
4              0.040-0.069     0.0080-0.0199   0.0080-0.0199   0.900-2.999
5              0.070-0.089     0.0200-0.0499   0.0200-0.0499   3.000-7.999
6              0.090-0.119     0.0500-0.0999   0.0500-0.0999   8.000-15.999
7              0.120-0.139     0.1000-0.1499   0.1000-0.2999   16.000-30.999
8              0.140-0.169     0.1500-0.2599   0.3000-0.4999   31.000-39.999
9              0.170-0.189     0.2600-0.7999   0.5000-0.7999   40.000-49.999
10             0.190-0.259     0.8000-1.000    0.8000-1.000    50.000-60.000
3.1.2.2 The NO2 audit techniques may vary depending
on the ambient monitoring method. For chemiluminescence-type
NO2 analyzers, gas phase titration (GPT) techniques
should be based on the EPA guidance documents and monitoring agency
experience. The NO2 gas standards may be more appropriate
than GPT for direct NO2 methods that do not employ
converters. Care should be taken to ensure the stability of such gas
standards prior to use.
3.1.2.3 The standards from which audit gas test concentrations
are obtained must meet the specifications of section 2.6.1 of this
appendix.
3.1.2.4 For point analyzers, the evaluation shall be carried out
by allowing the monitor to analyze the audit gas test atmosphere in
its normal sampling mode such that the test atmosphere passes
through all filters, scrubbers, conditioners, and other sample inlet
components used during normal ambient sampling and as much of the
ambient air inlet system as is practicable.
3.1.2.5 Open-path monitors are evaluated by inserting a test
cell containing the various audit gas concentrations into the
optical measurement beam of the instrument. If possible, the
normally used transmitter, receiver, and, as appropriate, reflecting
devices should be used during the evaluation, and the normal
monitoring configuration of the instrument should be modified as
little as possible to accommodate the test cell for the evaluation.
However, if permitted by the associated operation or instruction
manual, an alternate local light source or an alternate optical path
that does not include the normal atmospheric monitoring path may be
used. The actual concentrations of the audit gas in the test cell
must be selected to produce effective concentrations in the
evaluation level ranges specified in this section of this appendix.
Generally, each evaluation concentration measurement result will be
the sum of the atmospheric pollutant concentration and the
evaluation test concentration. As such, the result must be corrected
to remove the atmospheric concentration contribution. The corrected
concentration is obtained by subtracting the average of the
atmospheric concentrations measured by the open-path instrument
under test immediately before and immediately after the evaluation
test (or preferably before and after each evaluation concentration
level) from the evaluation concentration measurement. If the
difference between the before and after measurements is greater than
20 percent of the effective concentration of the test gas standard,
discard the test result for that concentration level and repeat the
test for that level. If possible, open path monitors should be
evaluated during periods when the atmospheric pollutant
concentrations are relatively low and steady. Also, if the open-path
instrument is not installed in a permanent manner, the monitoring
path length must be reverified to be within plus or minus 3 percent
to validate the evaluation, since the monitoring path length is
critical to the determination of the effective concentration.
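For illustration only (this sketch is not part of the regulatory text), the background correction and 20-percent drift check described above can be expressed in Python; the function name and values are hypothetical:

    # Hypothetical sketch of the open-path background correction and
    # drift check in section 3.1.2.5.
    def corrected_concentration(measured, ambient_before, ambient_after,
                                effective_audit_conc):
        """Return the background-corrected result, or None if the
        before/after ambient drift invalidates this test level."""
        drift = abs(ambient_after - ambient_before)
        # Discard and repeat the level if the before/after difference
        # exceeds 20 percent of the effective test gas concentration.
        if drift > 0.20 * effective_audit_conc:
            return None
        background = (ambient_before + ambient_after) / 2.0
        return measured - background

For example, corrected_concentration(0.095, 0.012, 0.014, 0.080) subtracts the 0.013 ppm average background from the 0.095 ppm reading.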
3.1.2.6 Report both the evaluation concentrations (effective
concentrations for open-path monitors) of the audit gases and the
corresponding measured concentration (corrected concentrations, if
applicable, for open-path monitors) indicated or produced by the
monitor being tested. The percent differences between these
concentrations are used to assess the quality of the monitoring data
as described in section 4.1.1 of this appendix.
3.1.3 National Performance Audit Program (NPAP).
As stated in sections 1.1 and 2.4, PSD monitoring networks may
be subject to the NPEP, which includes the NPAP. The NPAP is a
performance evaluation, a type of audit in which quantitative data
are collected independently to evaluate the proficiency of an
analyst, a monitoring instrument, and a laboratory. The
NPAP should not be confused with the quarterly PE program described
in section 3.1.2. The PSD organizations shall consult with the PSD
reviewing authority or the EPA regarding whether the implementation
of NPAP is required and the implementation options available.
Details of the EPA NPAP can be found in reference 11 of this
appendix. The program requirements include:
3.1.3.1 Performing audits on 100 percent of monitors and sites
each year including monitors and sites that may be operated for less
than 1 year. The reviewing authority may require more frequent
audits at sites it considers to be high priority.
3.1.3.2 Developing a delivery system that will allow for the
audit concentration gases to be introduced at the probe inlet where
logistically feasible.
3.1.3.3 Using audit gases that are verified against the National
Institute of Standards and Technology (NIST) standard reference
methods or special review procedures and validated annually for CO,
SO2 and NO2, and at the beginning of each
quarter of audits for O3.
3.1.3.4 The PSD PQAO may elect to self-implement NPAP. In these
cases, the PSD reviewing authority will work with those PSD PQAOs to
establish training and other technical requirements to establish
comparability to federally implemented programs. In addition to
meeting the requirements in sections 3.1.3.1 through 3.1.3.3, the
PSD PQAO must:
(a) Ensure that the PSD audit system is equivalent to the EPA
NPAP audit system and is an entirely separate set of equipment and
standards from the equipment used for quarterly performance
evaluations. If this system does not generate and analyze the audit
concentrations as the EPA NPAP system does, it must be proven to be
as accurate as the EPA NPAP system under a full range of appropriate
and varying conditions, as described in section 3.1.3.6.
(b) Perform a whole system check by having the PSD audit system
tested at an independent and qualified EPA lab, or equivalent.
(c) Evaluate the system with the EPA NPAP program through
collocated auditing at an acceptable number of sites each year (at
least one for a PSD network of five or fewer sites; at least two for
a network with more than five sites).
(d) Incorporate the NPAP into the PSD PQAO's QAPP.
(e) Be subject to review by independent, EPA-trained personnel.
(f) Participate in initial and update training/certification
sessions.
3.2 PM2.5.
3.2.1 Flow Rate Verification for PM2.5. A one-point
flow rate verification check must be performed at least once every
month (each verification minimally separated by 14 days) on each
monitor used to measure PM2.5. The verification is made
by checking the operational flow rate of the monitor. If the
verification is made in conjunction with a flow rate adjustment, it
must be made prior to such flow rate adjustment. For the standard
procedure, use a flow rate transfer standard certified in accordance
with section 2.6 of this appendix to check the monitor's normal flow
rate. Care should be taken in selecting and using the flow rate
measurement device such that it does not alter the normal operating
flow rate of the monitor. Flow rate verification results are to be
reported to the PSD reviewing authority quarterly as described in
section 5.1. Reporting these results to AQS is encouraged. The
percent differences between the audit and measured flow rates are
used to assess the bias of the monitoring data as described in
section 4.2.2 of this appendix (using flow rates in lieu of
concentrations).
3.2.2 Semi-Annual Flow Rate Audit for PM2.5. Every 6
months, audit the flow rate of the PM2.5 particulate
monitors. For short-term monitoring operations (those less than 1
year), the flow rate audits must occur at start up, at the midpoint,
and near the completion of the monitoring project. The audit must be
conducted by a trained technician other than the routine site
operator. The audit is made by measuring the monitor's normal
operating flow rate using a flow rate transfer standard certified in
accordance with section 2.6 of this appendix. The flow rate standard
used for auditing must not be the same flow rate standard used for
verifications or to calibrate the monitor. However, both the
calibration standard and the audit standard may be referenced to the
same primary flow rate or volume standard.
Care must be taken in auditing the flow rate to be certain that the
flow measurement device does not alter the normal operating flow
rate of the monitor. Report the audit flow rate of the transfer
standard and the corresponding flow rate measured by the monitor.
The percent differences between these flow rates are used to
evaluate monitor performance.
3.2.3 Collocated Sampling Procedures for PM2.5. A PSD
PQAO must have at least one collocated monitor for each PSD
monitoring network.
3.2.3.1 For each pair of collocated monitors, designate one
sampler as the primary monitor whose concentrations will be used to
report air quality for the site, and designate the other as the QC
monitor. There can be only one primary monitor at a monitoring site
for a given time period.
(a) If the primary monitor is a FRM, then the quality control
monitor must be a FRM of the same method designation.
(b) If the primary monitor is a FEM, then the quality control
monitor must be a FRM unless the PSD PQAO submits a waiver for this
requirement, provides a specific reason why a FRM cannot be
implemented, and the waiver is approved by the PSD reviewing
authority. If the waiver is approved, then the quality control
monitor must be the same method designation as the primary FEM
monitor.
3.2.3.2 In addition, the collocated monitors should be deployed
according to the following protocol:
(a) The collocated quality control monitor(s) should be deployed
at sites with the highest predicted daily PM2.5
concentrations in the network. If the highest PM2.5
concentration site is impractical for collocation purposes,
alternative sites approved by the PSD reviewing authority may be
selected. If additional collocated sites are necessary, the PSD PQAO
and the reviewing authority should determine the appropriate
location(s) based on data needs.
(b) The two collocated monitors must be within 4 meters of each
other and at least 2 meters apart for flow rates greater than 200
liters/min or at least 1 meter apart for samplers having flow rates
less than 200 liters/min to preclude airflow interference. A waiver
allowing up to 10 meters horizontal distance and up to 3 meters
vertical distance (inlet to inlet) between a primary and collocated
quality control monitor may be approved by the PSD reviewing
authority for sites at a neighborhood or larger scale of
representation. This waiver may be approved during the QAPP review
and approval process. Calibration, sampling, and analysis must be
the same for both collocated samplers and the same as for all other
samplers in the network.
(c) Sample the collocated quality control monitor on a 6-day
schedule for sites not requiring daily monitoring and on a 3-day
schedule for any site requiring daily monitoring. Report the
measurements from both primary and collocated quality control
monitors at each collocated sampling site. The calculations for
evaluating precision between the two collocated monitors are
described in section 4.2.1 of this appendix.
3.2.4 PM2.5 Performance Evaluation Program (PEP)
Procedures. As stated in sections 1.1 and 2.4 of this appendix, PSD
monitoring networks may be subject to the NPEP, which includes the
PM2.5 PEP. The PSD monitoring organizations shall consult
with the PSD reviewing authority or the EPA regarding whether the
implementation of PM2.5 PEP is required and the
implementation options available for the PM2.5 PEP. For
PSD PQAOs with less than or equal to five monitoring sites, five
valid performance evaluation audits must be collected and reported
each year. For PSD PQAOs with greater than five monitoring sites,
eight valid performance evaluation audits must be collected and
reported each year. Additionally, within the five or eight required
audits, each type of method designation (FRM/FEM designation) used
as a primary monitor in the PSD network shall be audited. For a PE
to be valid, both the primary monitor and PEP audit measurements
must meet quality control requirements and be above 3 µg/m³
or a predefined lower concentration level determined by a systematic
planning process and approved by the PSD reviewing authority. Due to
the relatively short-term nature of most PSD monitoring, the
likelihood of measuring low concentrations in many areas attaining
the PM2.5 standard and the time required to weigh filters
collected in PEs, a PSD monitoring organization's QAPP may contain a
provision to waive the 3 µg/m³ threshold for validity of PEs
conducted in the last quarter of monitoring, subject to approval by
the PSD reviewing authority.
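As a minimal sketch (illustrative only, not regulatory text), the audit-count and validity rules of section 3.2.4 might be expressed as follows; the function names are hypothetical:

    # Hypothetical helpers mirroring section 3.2.4.
    def required_pm25_pep_audits(num_sites):
        # 5 valid PEP audits per year for PSD PQAOs with <= 5 sites,
        # 8 for PQAOs with more than 5 sites.
        return 5 if num_sites <= 5 else 8

    def pe_is_valid(primary_conc, pep_conc, threshold=3.0):
        # Both measurements must be above the 3 ug/m3 threshold (or an
        # approved lower level from a systematic planning process).
        return primary_conc > threshold and pep_conc > threshold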
3.3 PM10.
3.3.1 Flow Rate Verification for PM10. A one-point
flow rate verification check must be performed at least once every
month (each verification minimally separated by 14 days) on each
monitor used to measure PM10. The verification is made by
checking the operational flow rate of the monitor. If the
verification is made in conjunction with a flow rate adjustment, it
must be made prior to such flow rate adjustment. For the standard
procedure, use a flow rate transfer standard certified in accordance
with section 2.6 of this appendix to check the monitor's normal flow
rate. Care should be taken in selecting and using the flow rate
measurement device such that it does not alter the normal operating
flow rate of the monitor. The percent differences between the audit
and measured flow rates are used to assess the bias of the
monitoring data as described in section 4.2.2 of this appendix
(using flow rates in lieu of concentrations).
3.3.2 Semi-Annual Flow Rate Audit for PM10. Every 6
months, audit the flow rate of the PM10 particulate
monitors. For short-term monitoring operations (those less than 1
year), the flow rate audits must occur at start up, at the midpoint,
and near the completion of the monitoring project. Where possible,
the EPA strongly encourages more frequent auditing. The audit must
be conducted by a trained technician other than the routine site
operator. The audit is made by measuring the monitor's normal
operating flow rate using a flow rate transfer standard certified in
accordance with section 2.6 of this appendix. The flow rate standard
used for auditing must not be the same flow rate standard used for
verifications or to calibrate the monitor. However, both the
calibration standard and the audit standard may be referenced to the
same primary flow rate or volume standard. Care must be taken in
auditing the flow rate to be certain that the flow measurement
device does not alter the normal operating flow rate of the monitor.
Report the audit flow rate of the transfer standard and the
corresponding flow rate measured by the monitor. The percent
differences between these flow rates are used to evaluate monitor
performance.
3.3.3 Collocated Sampling Procedures for Manual PM10.
A PSD PQAO must have at least one collocated monitor for each PSD
monitoring network.
3.3.3.1 For each pair of collocated monitors, designate one
sampler as the primary monitor whose concentrations will be used to
report air quality for the site, and designate the other as the
quality control monitor.
3.3.3.2 In addition, the collocated monitors should be deployed
according to the following protocol:
(a) The collocated quality control monitor(s) should be deployed
at sites with the highest predicted daily PM10
concentrations in the network. If the highest PM10
concentration site is impractical for collocation purposes,
alternative sites approved by the PSD reviewing authority may be
selected.
(b) The two collocated monitors must be within 4 meters of each
other and at least 2 meters apart for flow rates greater than 200
liters/min or at least 1 meter apart for samplers having flow rates
less than 200 liters/min to preclude airflow interference. A waiver
allowing up to 10 meters horizontal distance and up to 3 meters
vertical distance (inlet to inlet) between a primary and collocated
sampler may be approved by the PSD reviewing authority for sites at
a neighborhood or larger scale of representation. This waiver may be
approved during the QAPP review and approval process. Calibration,
sampling, and analysis must be the same for both collocated samplers
and the same as for all other samplers in the network.
(c) Sample the collocated quality control monitor on a 6-day
schedule or 3-day schedule for any site requiring daily monitoring.
Report the measurements from both primary and collocated quality
control monitors at each collocated sampling site. The calculations
for evaluating precision between the two collocated monitors are
described in section 4.2.1 of this appendix.
(d) In determining the number of collocated sites required for
PM10, PSD monitoring networks for Pb-PM10
should be treated independently from networks for particulate matter
(PM), even though the separate networks may share one or more common
samplers. However, a single quality control monitor that meets the
collocation requirements for Pb-PM10 and PM10
may serve as a collocated quality control monitor for both networks.
Extreme care must be taken if using the filter from a quality
control monitor for both PM10 and Pb analysis.
PM10 filter weighing should occur prior to any Pb
analysis.
3.4 Pb.
3.4.1 Flow Rate Verification for Pb. A one-point flow rate
verification check must be performed at least once every month (each
verification minimally separated by 14 days) on each monitor used to
measure Pb. The verification is made by checking the operational
flow rate of the monitor. If the verification is made in conjunction
with a flow rate adjustment, it must be made prior to such flow rate
adjustment. Use a flow rate transfer standard certified in
accordance with section 2.6 of this appendix to check the monitor's
normal flow rate. Care should be taken in selecting and using the
flow rate measurement device such that it does not alter the normal
operating flow rate of the monitor. The percent differences between
the audit and measured flow rates are used to assess the bias of the
monitoring data as described in section 4.2.2 of this appendix
(using flow rates in lieu of concentrations).
3.4.2 Semi-Annual Flow Rate Audit for Pb. Every 6 months, audit
the flow rate of the Pb particulate monitors. For short-term
monitoring operations (those less than 1 year), the flow rate audits
must occur at start up, at the midpoint, and near the completion of
the monitoring project. Where possible, the EPA strongly encourages
more frequent auditing. The audit must be conducted by a trained
technician other than the routine site operator. The audit is made
by measuring the monitor's normal operating flow rate using a flow
rate transfer standard certified in accordance with section 2.6 of
this appendix. The flow rate standard used for auditing must not be
the same flow rate standard used in verifications or to calibrate
the monitor. However, both the calibration standard and the audit
standard may be referenced to the same primary flow rate or volume
standard. Great care must be taken in auditing the flow rate to be
certain that the flow measurement device does not alter the normal
operating flow rate of the monitor. Report the audit flow rate of
the transfer standard and the corresponding flow rate measured by
the monitor. The percent differences between these flow rates are
used to evaluate monitor performance.
3.4.3 Collocated Sampling for Pb. A PSD PQAO must have at least
one collocated monitor for each PSD monitoring network.
3.4.3.1 For each pair of collocated monitors, designate one
sampler as the primary monitor whose concentrations will be used to
report air quality for the site, and designate the other as the
quality control monitor.
3.4.3.2 In addition, the collocated monitors should be deployed
according to the following protocol:
(a) The collocated quality control monitor(s) should be deployed
at sites with the highest predicted daily Pb concentrations in the
network. If the highest Pb concentration site is impractical for
collocation purposes, alternative sites approved by the PSD
reviewing authority may be selected.
(b) The two collocated monitors must be within 4 meters of each
other and at least 2 meters apart for flow rates greater than 200
liters/min or at least 1 meter apart for samplers having flow rates
less than 200 liters/min to preclude airflow interference. A waiver
allowing up to 10 meters horizontal distance and up to 3 meters
vertical distance (inlet to inlet) between a primary and collocated
sampler may be approved by the reviewing authority for sites at a
neighborhood or larger scale of representation. This waiver may be
approved during the QAPP review and approval process. Calibration,
sampling, and analysis must be the same for both collocated samplers
and the same as for all other samplers in the network.
(c) Sample the collocated quality control monitor on a 6-day
schedule if daily monitoring is not required or 3-day schedule for
any site requiring daily monitoring. Report the measurements from
both primary and collocated quality control monitors at each
collocated sampling site. The calculations for evaluating precision
between the two collocated monitors are described in section 4.2.1
of this appendix.
(d) In determining the number of collocated sites required for
Pb-PM10, PSD monitoring networks for PM10
should be treated independently from networks for Pb-
PM10, even though the separate networks may share one or
more common samplers. However, a single quality control monitor that
meets the collocation requirements for Pb-PM10 and
PM10 may serve as a collocated quality control monitor
for both networks. Extreme care must be taken if using the
filter from a quality control monitor for both PM10 and
Pb analysis. The PM10 filter weighing should occur prior
to any Pb analysis.
3.4.4 Pb Analysis Audits. Each calendar quarter, audit the Pb
reference or equivalent method analytical procedure using filters
containing a known quantity of Pb. These audit filters are prepared
by depositing a Pb standard on unexposed filters and allowing them
to dry thoroughly. The audit samples must be prepared using batches
of reagents different from those used to calibrate the Pb analytical
equipment being audited. Prepare audit samples in the following
concentration ranges:
------------------------------------------------------------------------
 Range                 Equivalent ambient Pb concentration, µg/m³
------------------------------------------------------------------------
1................................... 30-100% of Pb NAAQS.
2................................... 200-300% of Pb NAAQS.
------------------------------------------------------------------------
(a) Audit samples must be extracted using the same extraction
procedure used for exposed filters.
(b) Analyze three audit samples in each of the two ranges each
quarter samples are analyzed. The audit sample analyses shall be
distributed as much as possible over the entire calendar quarter.
(c) Report the audit concentrations (in µg Pb/filter or strip)
and the corresponding measured concentrations (in µg Pb/filter or
strip) using AQS unit code 077 (if reporting to AQS). The percent
differences between the concentrations are used to calculate
analytical accuracy as described in section 4.2.6 of this appendix.
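For illustration, assuming the 2008 Pb NAAQS level of 0.15 µg/m³ (a value not restated in this appendix), the two audit ranges above correspond to the following hypothetical sketch:

    # Illustrative audit ranges from the table in section 3.4.4,
    # assuming a Pb NAAQS level of 0.15 ug/m3.
    PB_NAAQS = 0.15  # ug/m3 (assumption, not stated in this appendix)
    AUDIT_RANGES = {
        1: (0.30 * PB_NAAQS, 1.00 * PB_NAAQS),  # 0.045-0.15 ug/m3
        2: (2.00 * PB_NAAQS, 3.00 * PB_NAAQS),  # 0.30-0.45 ug/m3
    }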
3.4.5 Pb Performance Evaluation Program (PEP) Procedures. As
stated in sections 1.1 and 2.4, PSD monitoring networks may be
subject to the NPEP, which includes the Pb Performance Evaluation
Program. PSD monitoring organizations shall consult with the PSD
reviewing authority or the EPA regarding whether the implementation
of Pb-PEP is required and the implementation options available for
the Pb-PEP. The PEP is an independent assessment used to estimate
total measurement system bias. Each year, one PE audit must be
performed at one Pb site in each PSD PQAO network that has less than
or equal to five sites and two audits for PSD PQAO networks with
greater than five sites. In addition, each year, four collocated
samples from PSD PQAO networks with less than or equal to five sites
and six collocated samples from PSD PQAO networks with greater than
five sites must be sent to an independent laboratory for analysis.
The calculations for evaluating bias between the primary monitor and
the PE monitor for Pb are described in section 4.2.4 of this
appendix.
4. Calculations for Data Quality Assessment
(a) Calculations of measurement uncertainty are carried out by
PSD PQAO according to the following procedures. The PSD PQAOs should
report the data for all appropriate measurement quality checks as
specified in this appendix even though they may elect to perform
some or all of the calculations in this section on their own.
(b) At low concentrations, agreement between the measurements of
collocated samplers, expressed as relative percent difference or
percent difference, may be relatively poor. For this reason,
collocated measurement pairs will be selected for use in the
precision and bias calculations only when both measurements are
equal to or above the following limits:
(1) Pb: 0.002 µg/m³ (Methods approved after 3/04/2010, with the
exception of manual equivalent method EQLA-0813-803).
(2) Pb: 0.02 µg/m³ (Methods approved before 3/04/2010, and manual
equivalent method EQLA-0813-803).
(3) PM10 (Hi-Vol): 15 µg/m³.
(4) PM10 (Lo-Vol): 3 µg/m³.
(5) PM2.5: 3 µg/m³.
The PM2.5 3 µg/m³ limit for the PM2.5-PEP may be superseded by
mutual agreement between the PSD PQAO and the PSD reviewing
authority as specified in section 3.2.4 of this appendix and
detailed in the approved QAPP.
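A minimal sketch of this pair-selection rule (hypothetical names, illustrative only):

    # Minimum concentrations from section 4(b); pairs below these
    # limits are excluded from precision and bias calculations.
    MINIMUM_LIMITS = {
        "Pb (approved after 3/04/2010)": 0.002,  # ug/m3
        "Pb (approved before 3/04/2010)": 0.02,  # ug/m3
        "PM10 (Hi-Vol)": 15.0,
        "PM10 (Lo-Vol)": 3.0,
        "PM2.5": 3.0,
    }

    def pair_is_usable(x, y, method):
        # Both measurements must be equal to or above the limit.
        limit = MINIMUM_LIMITS[method]
        return x >= limit and y >= limit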
4.1 Statistics for the Assessment of QC Checks for SO2, NO2, O3
and CO.
4.1.1 Percent Difference. Many of the measurement quality checks
start with a comparison of an audit concentration or value (flow-
rate) to the concentration/value measured by the monitor and use
percent difference as the comparison statistic as described in
equation 1 of this section. For each single point check, calculate
the percent difference, di, as follows:
d_i = \frac{meas - audit}{audit} \times 100    (Equation 1)

where meas is the concentration indicated by the PQAO's instrument
and audit is the audit concentration of the standard used in the QC
check being measured.
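For illustration only, equation 1 can be computed as follows in Python:

    def percent_difference(meas, audit):
        # Equation 1: single-point percent difference, in percent.
        return (meas - audit) / audit * 100.0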
4.1.2 Precision Estimate. The precision estimate is used to
assess the one-point QC checks for SO2, NO2,
O3, or CO described in section 3.1.1 of this appendix.
The precision estimator is the coefficient of variation upper bound
and is calculated using equation 2 of this section:
CV = \sqrt{\frac{n \sum_{i=1}^{n} d_i^2 - \left(\sum_{i=1}^{n} d_i\right)^2}{n(n-1)}} \cdot \sqrt{\frac{n-1}{\chi^2_{0.1,n-1}}}    (Equation 2)

where n is the number of single point checks being aggregated and
\chi^2_{0.1,n-1} is the 10th percentile of a chi-squared
distribution with n-1 degrees of freedom.
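An illustrative implementation of equation 2, assuming SciPy is available for the chi-squared percentile (not part of the regulatory text):

    from math import sqrt
    from scipy.stats import chi2

    def cv_upper_bound(d):
        # Equation 2: coefficient-of-variation upper bound for a list d
        # of single-point percent differences (requires n >= 2).
        n = len(d)
        s2 = (n * sum(x * x for x in d) - sum(d) ** 2) / (n * (n - 1))
        return sqrt(s2) * sqrt((n - 1) / chi2.ppf(0.10, n - 1))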
4.1.3 Bias Estimate. The bias estimate is calculated using the
one-point QC checks for SO2, NO2,
O3, or CO described in section 3.1.1 of this appendix.
The bias estimator is an upper bound on the mean absolute value of
the percent differences as described in equation 3 of this section:
|bias| = AB + t_{0.95,n-1} \cdot \frac{AS}{\sqrt{n}}    (Equation 3)

where n is the number of single point checks being aggregated and
t_{0.95,n-1} is the 95th quantile of a t-distribution with n-1
degrees of freedom. The quantity AB is the mean of the absolute
values of the d_i's and is calculated using equation 4 of this
section:

AB = \frac{1}{n} \sum_{i=1}^{n} |d_i|    (Equation 4)

and the quantity AS is the standard deviation of the absolute values
of the d_i's and is calculated using equation 5 of this section:

AS = \sqrt{\frac{n \sum_{i=1}^{n} |d_i|^2 - \left(\sum_{i=1}^{n} |d_i|\right)^2}{n(n-1)}}    (Equation 5)
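For illustration, equations 3 through 5 can be combined in one sketch, again assuming SciPy for the t-distribution quantile:

    from math import sqrt
    from scipy.stats import t

    def bias_upper_bound(d):
        # Equations 3-5: upper bound on the mean absolute percent
        # difference (requires n >= 2).
        n = len(d)
        a = [abs(x) for x in d]
        ab = sum(a) / n                                  # Equation 4
        as_ = sqrt((n * sum(x * x for x in a) - sum(a) ** 2)
                   / (n * (n - 1)))                      # Equation 5
        return ab + t.ppf(0.95, n - 1) * as_ / sqrt(n)   # Equation 3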
4.1.3.1 Assigning a sign (positive/negative) to the bias
estimate. Since the bias statistic as calculated in equation 3 of
this appendix uses absolute values, it does not have a tendency
(negative or positive bias) associated with it. A sign will be
designated by rank ordering the percent differences of the QC check
samples from a given site for a particular assessment interval.
4.1.3.2 Calculate the 25th and 75th percentiles of the percent
differences for each site. The absolute bias upper bound should be
flagged as positive if both percentiles are positive and negative if
both percentiles are negative. The absolute bias upper bound would
not be flagged if the 25th and 75th percentiles are of different
signs.
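A minimal sketch of this sign-assignment rule, assuming NumPy is available (illustrative only):

    import numpy as np

    def bias_sign(d):
        # Sections 4.1.3.1-4.1.3.2: flag the bias upper bound positive
        # if the 25th and 75th percentiles of the percent differences
        # are both positive, negative if both are negative, and leave
        # it unflagged otherwise.
        q25, q75 = np.percentile(d, [25, 75])
        if q25 > 0 and q75 > 0:
            return "+"
        if q25 < 0 and q75 < 0:
            return "-"
        return ""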
4.2 Statistics for the Assessment of PM10, PM2.5, and Pb.
4.2.1 Collocated Quality Control Sampler Precision Estimate for
PM10, PM2.5 and Pb. Precision is estimated via duplicate
measurements from collocated samplers. It is recommended that the
precision be aggregated at the PQAO level quarterly, annually, and
at the 3-year level. The data pair would only be considered valid if
both concentrations are greater than or equal to the minimum values
specified in section 4(b) of this appendix. For each collocated data
pair, calculate the relative percent difference, di,
using equation 6 of this appendix:
d_i = \frac{X_i - Y_i}{(X_i + Y_i)/2} \times 100    (Equation 6)

where X_i is the concentration from the primary sampler and Y_i is
the concentration from the audit sampler. The coefficient of
variation upper bound is calculated using equation 7 of this
appendix:

CV = \sqrt{\frac{n \sum_{i=1}^{n} d_i^2 - \left(\sum_{i=1}^{n} d_i\right)^2}{2n(n-1)}} \cdot \sqrt{\frac{n-1}{\chi^2_{0.1,n-1}}}    (Equation 7)

where n is the number of valid data pairs being aggregated and
\chi^2_{0.1,n-1} is the 10th percentile of a chi-squared
distribution with n-1 degrees of freedom. The factor of 2 in the
denominator adjusts for the fact that each d_i is calculated from
two values with error.
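Equations 6 and 7 can be sketched as follows (illustrative only, assuming SciPy):

    from math import sqrt
    from scipy.stats import chi2

    def relative_percent_difference(x, y):
        # Equation 6: x from the primary sampler, y from the QC sampler.
        return (x - y) / ((x + y) / 2.0) * 100.0

    def collocated_cv_upper_bound(d):
        # Equation 7: the extra factor of 2 in the denominator accounts
        # for each d_i being computed from two error-bearing values.
        n = len(d)
        s2 = (n * sum(v * v for v in d) - sum(d) ** 2) / (2 * n * (n - 1))
        return sqrt(s2) * sqrt((n - 1) / chi2.ppf(0.10, n - 1))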
4.2.2 One-Point Flow Rate Verification Bias Estimate for PM10, PM2.5
and Pb. For each one-point flow rate verification, calculate the
percent difference in volume using equation 1 of this appendix,
where meas is the value indicated by the sampler's volume
measurement and audit is the actual volume indicated by the auditing
flow meter. The absolute volume bias upper bound is then calculated
using equation 3, where n is the number of flow rate audits being
aggregated; t_{0.95,n-1} is the 95th quantile of a t-distribution
with n-1 degrees of freedom; the quantity AB is the mean of the
absolute values of the d_i's and is calculated using equation 4 of
this appendix; and the quantity AS in equation 3 of this appendix is
the standard deviation of the absolute values of the d_i's and is
calculated using equation 5 of this appendix.
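Continuing the illustrative sketches above, the flow rate bias calculation reuses equations 1 and 3 through 5 with flow rates in place of concentrations; the 16.67 L/min audit value below is only an example:

    # Each pair is (sampler-indicated flow, audit flow) in L/min.
    pairs = [(16.80, 16.67), (16.55, 16.67), (16.90, 16.67)]
    d = [percent_difference(meas, audit) for meas, audit in pairs]
    print(bias_upper_bound(d))  # absolute volume bias upper bound, %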
4.2.3 Semi-Annual Flow Rate Audit Bias Estimate for PM10, PM2.5
and Pb. Use the same procedure described in section 4.2.2 for the
evaluation of flow rate audits.
4.2.4 Performance Evaluation Programs Bias Estimate for Pb. The
Pb bias estimate is calculated using the paired routine and the
PEP monitor as described in section 3.4.5. Use the same procedures
as described in section 4.1.3 of this appendix.
4.2.5 Performance Evaluation Programs Bias Estimate for PM2.5.
The bias estimate is calculated using the PEP audits described in
section 3.2.4 of this appendix. The bias estimator is based on the
mean of the percent differences (equation 1). The mean percent
difference, D, is calculated using equation 8 below.
D = \frac{1}{n_j} \sum_{i=1}^{n_j} d_i    (Equation 8)

where n_j is the number of pairs and d_1, d_2, . . ., d_{n_j} are
the percent differences for each pair to be averaged.
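For completeness of the illustrative sketches, equation 8 is a simple mean:

    def mean_percent_difference(d):
        # Equation 8: mean of the n_j paired percent differences.
        return sum(d) / len(d)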
4.2.6 Pb Analysis Audit Bias Estimate. The bias estimate is
calculated using the analysis audit data described in section 3.4.4.
Use the same bias estimate procedure as described in section 4.1.3
of this appendix.
5. Reporting Requirements
5.1 Quarterly Reports. For each quarter, each PSD PQAO shall
report to the PSD reviewing authority (and AQS if required by the
PSD reviewing authority) the results of all valid measurement
quality checks it has carried out during the quarter. The quarterly
reports must be submitted consistent with the data reporting
requirements specified for air quality data as set forth in 40 CFR
58.16, as those requirements pertain to PSD monitoring.
6.0 References
(1) American National Standard--Specifications and Guidelines for
Quality Systems for Environmental Data Collection and Environmental
Technology Programs. ANSI/ASQC E4-2004. February 2004. Available
from American Society for Quality Control, 611 East Wisconsin
Avenue, Milwaukee, WI 53202.
(2) EPA Requirements for Quality Management Plans. EPA QA/R-2. EPA/
240/B-01/002. March 2001, Reissue May 2006. Office of Environmental
Information, Washington, DC 20460. https://www.epa.gov/quality/qs-docs/r2-final.pdf.
(3) EPA Requirements for Quality Assurance Project Plans for
Environmental Data Operations. EPA QA/R-5. EPA/240/B-01/003. March
2001, Reissue May 2006. Office of Environmental Information,
Washington, DC 20460. https://www.epa.gov/quality/qs-docs/r5-final.pdf.
(4) EPA Traceability Protocol for Assay and Certification of Gaseous
Calibration Standards. EPA-600/R-12/531. May 2012. Available from
U.S. Environmental Protection Agency, National Risk Management
Research Laboratory, Research Triangle Park, NC 27711. https://www.epa.gov/nrmrl/appcd/mmd/db-traceability-protocol.html.
(5) Guidance for the Data Quality Objectives Process. EPA QA/G-4.
EPA/240/B-06/001. February 2006. Office of Environmental
Information, Washington, DC 20460. https://www.epa.gov/quality/qs-docs/g4-final.pdf.
(6) List of Designated Reference and Equivalent Methods. Available
from U.S. Environmental Protection Agency, National Exposure
Research Laboratory, Human Exposure and Atmospheric Sciences
Division, MD-D205-03, Research Triangle Park, NC 27711. https://www.epa.gov/ttn/amtic/criteria.html.
(7) Transfer Standards for the Calibration of Ambient Air Monitoring
Analyzers for Ozone. EPA-454/B-13-004. U.S. Environmental Protection
Agency, Research Triangle Park, NC 27711, October 2013. https://www.epa.gov/ttn/amtic/qapollutant.html.
(8) Paur, R.J. and F.F. McElroy. Technical Assistance Document for
the Calibration of Ambient Ozone Monitors. EPA-600/4-79-057. U.S.
Environmental Protection Agency, Research Triangle Park, NC 27711,
September 1979. https://www.epa.gov/ttn/amtic/cpreldoc.html.
(9) Quality Assurance Handbook for Air Pollution Measurement
Systems, Volume 1--A Field Guide to Environmental Quality Assurance.
EPA-600/R-94/038a. April 1994. Available from U.S. Environmental
Protection Agency, ORD Publications Office, Center for Environmental
Research Information (CERI), 26 W. Martin Luther King Drive,
Cincinnati, OH 45268. https://www.epa.gov/ttn/amtic/qabook.html.
(10) Quality Assurance Handbook for Air Pollution Measurement
Systems, Volume II: Ambient Air Quality Monitoring Program Quality
System Development. EPA-454/B-13-003. https://www.epa.gov/ttn/amtic/qabook.html.
(11) National Performance Evaluation Program Standard Operating
Procedures. https://www.epa.gov/ttn/amtic/npapsop.html.
Table B-1--Minimum Data Assessment Requirements for NAAQS Related Criteria Pollutant PSD Monitors
--------------------------------------------------------------------------------------------------------------------------------------------------------
Method Assessment method Coverage Minimum frequency Parameters reported AQS assessment type
--------------------------------------------------------------------------------------------------------------------------------------------------------
Gaseous Methods (CO, NO2, SO2, O3)
--------------------------------------------------------------------------------------------------------------------------------------------------------
1-Point QC for SO2, NO2, O3, CO.... Response check at Each analyzer......... Once per 2 weeks..... Audit concentration 1-Point QC.
concentration 0.005- \1\ and measured
0.08 ppm SO2, NO2, concentration \2\.
O3, & 0.5 and 5 ppm
CO.
Quarterly performance evaluation See section 3.1.2 of Each analyzer......... Once per quarter..... Audit concentration Annual PE.
for SO2, NO2, O3, CO. this appendix. \1\ and measured
concentration \2\
for each level.
NPAP for SO2, NO2, O3, CO \3\...... Independent Audit..... Each primary monitor.. Once per year........ Audit concentration NPAP.
\1\ and measured
concentration \2\
for each level.
--------------------------------------------------------------------------------------------------------------------------------------------------------
Particulate Methods
--------------------------------------------------------------------------------------------------------------------------------------------------------
Collocated sampling PM10, PM2.5, Pb Collocated samplers... 1 per PSD Network per Every 6 days or every Primary sampler No Transaction
pollutant. 3 days if daily concentration and reported as raw
monitoring required. duplicate sampler data.
concentration \4\.
Flow rate verification............. Check of sampler flow Each sampler.......... Once every month..... Audit flow rate and Flow Rate
PM10, PM2.5, Pb.................... rate. measured flow rate Verification.
indicated by the
sampler.
Semi-annual flow rate audit........ Check of sampler flow Each sampler.......... Once every 6 months Audit flow rate and Semi Annual Flow Rate
PM10, PM2.5, Pb.................... rate using or beginning, middle measured flow rate Audit.
independent standard. and end of indicated by the
monitoring. sampler.
Pb analysis audits................. Check of analytical Analytical............ Each quarter......... Measured value and Pb Analysis Audits.
Pb-TSP, Pb-PM10.................... system with Pb audit audit value (ug Pb/
strips/filters. filter) using AQS
unit code 077 for
parameters:
14129--Pb (TSP) LC
FRM/FEM.
85129--Pb (TSP) LC
Non-FRM/FEM.
Performance Evaluation Program Collocated samplers... (1) 5 valid audits for Over all 4 quarters.. Primary sampler PEP.
PM2.5 \3\. PQAOs with <= 5 sites. concentration and
(2) 8 valid audits for performance
PQAOs with > 5 sites. evaluation sampler
(3) All samplers in 6 concentration.
years.
Performance Evaluation Program..... Collocated samplers... (1) 1 valid audit and Over all 4 quarters.. Primary sampler PEP.
Pb \3\............................. 4 collocated samples concentration and
for PQAOs, with <=5 performance
sites. evaluation sampler
(2) 2 valid audits and concentration.
6 collocated samples Primary sampler
for PQAOs with > 5 concentration and
sites. duplicate sampler
concentration.
--------------------------------------------------------------------------------------------------------------------------------------------------------
\1\ Effective concentration for open path analyzers.
\2\ Corrected concentration, if applicable for open path analyzers.
\3\ NPAP, PM2.5 PEP, and Pb-PEP must be implemented if data are used for NAAQS decisions; otherwise, implementation is at the PSD reviewing
  authority's discretion.
\4\ Both primary and collocated sampler values are reported as raw data.
■ 11. In Appendix D to part 58, revise paragraph 3(b), remove and reserve
paragraph 4.5(b), and revise paragraph 4.5(c) to read as follows:
Appendix D to Part 58--Network Design Criteria for Ambient Air Quality
Monitoring
* * * * *
3. * * *
(b) The NCore sites must measure, at a minimum, PM2.5
particle mass using continuous and integrated/filter-based samplers,
speciated PM2.5, PM10-2.5 particle mass,
O3, SO2, CO, NO/NOY, wind speed,
wind direction, relative humidity, and ambient temperature.
(1) Although the measurement of NOy is required in
support of a number of monitoring objectives, available commercial
instruments may indicate little difference in their measurement of
NOy compared to the conventional measurement of
NOX, particularly in areas with relatively fresh sources
of nitrogen emissions. Therefore, in areas with negligible expected
difference between NOy and NOX measured
concentrations, the Administrator may allow for waivers that permit
NOX monitoring to be substituted for the required
NOy monitoring at applicable NCore sites.
(2) The EPA recognizes that, in some cases, the physical
location of the NCore site may not be suitable for representative
meteorological measurements due to the site's physical surroundings.
It is also possible that nearby meteorological measurements may be
able to fulfill this data need. In these cases, the requirement for
meteorological monitoring can be waived by the Administrator.
* * * * *
4.5 * * *
(b) [Reserved]
(c) The EPA Regional Administrator may require additional
monitoring beyond the minimum monitoring requirements contained in
paragraph 4.5(a) of this appendix where the likelihood of Pb air
quality violations is significant or where the emissions density,
topography, or population locations are complex and varied. EPA
Regional Administrators may require additional monitoring at
locations including, but not limited to, those near existing
additional industrial sources of Pb, recently closed industrial
sources of Pb, airports where piston-engine aircraft emit Pb, and
other sources of re-entrained Pb dust.
* * * * *
[FR Doc. 2014-19758 Filed 9-10-14; 8:45 am]
BILLING CODE 6560-50-P