Title 40

PART 58 APPENDIX A



Appendix A to Part 58 - Quality Assurance Requirements for Monitors used in Evaluations of National Ambient Air Quality Standards


Contents

1. General Information
2. Quality System Requirements
3. Measurement Quality Check Requirements
4. Calculations for Data Quality Assessments
5. Reporting Requirements
6. References

1. General Information

1.1 Applicability. (a) This appendix specifies the minimum quality system requirements applicable to SLAMS and other monitor types whose data are intended to be used to determine compliance with the NAAQS (e.g., SPMs, tribal, CASTNET, NCore, industrial, etc.), unless the EPA Regional Administrator has reviewed and approved the monitor for exclusion from NAAQS use and these quality assurance requirements.

(b) Primary quality assurance organizations are encouraged to develop and maintain quality systems more extensive than the required minimums. Additional guidance for the requirements reflected in this appendix can be found in the “Quality Assurance Handbook for Air Pollution Measurement Systems,” Volume II (see reference 10 of this appendix) and at a national level in references 1, 2, and 3 of this appendix.

1.2 Primary Quality Assurance Organization (PQAO). A PQAO is defined as a monitoring organization, a group of monitoring organizations, or another organization that is responsible for a set of stations that monitor the same pollutant and for which data quality assessments will be pooled. Each criteria pollutant sampler/monitor must be associated with only one PQAO. In some cases, data quality is assessed at the PQAO level.

1.2.1 Each PQAO shall be defined such that measurement uncertainty among all stations in the organization can be expected to be reasonably homogeneous as a result of common factors. Common factors that should be considered in defining PQAOs include:

(a) Operation by a common team of field operators according to a common set of procedures;

(b) Use of a common quality assurance project plan (QAPP) or standard operating procedures;

(c) Common calibration facilities and standards;

(d) Oversight by a common quality assurance organization; and

(e) Support by a common management organization (i.e., state agency) or laboratory.

Since data quality assessments are made and data certified at the PQAO level, the monitoring organization identified as the PQAO will be responsible for the oversight of the quality of data of all monitoring organizations within the PQAO.

1.2.2 Monitoring organizations having difficulty describing their PQAO or assigning specific monitors to primary quality assurance organizations should consult with the appropriate EPA Regional Office. Any consolidation of monitoring organizations to PQAOs shall be subject to final approval by the appropriate EPA Regional Office.

1.2.3 Each PQAO is required to implement a quality system that provides sufficient information to assess the quality of the monitoring data. The quality system must, at a minimum, include the specific requirements described in this appendix. Failure to conduct or pass a required check or procedure, or a series of required checks or procedures, does not by itself invalidate data for regulatory decision making. Rather, PQAOs and the EPA shall use the checks and procedures required in this appendix in combination with other data quality information, reports, and similar documentation that demonstrate overall compliance with Part 58. Accordingly, the EPA and PQAOs shall use a “weight of evidence” approach when determining the suitability of data for regulatory decisions. The EPA reserves the authority to use or not use monitoring data submitted by a monitoring organization when making regulatory decisions based on the EPA's assessment of the quality of the data. Consensus-built validation templates or validation criteria already approved in QAPPs should be used as the basis for the weight of evidence approach.

1.3 Definitions.

(a) Measurement Uncertainty. A term used to describe deviations from a true concentration or estimate that are related to the measurement process and not to spatial or temporal population attributes of the air being measured.

(b) Precision. A measure of mutual agreement among individual measurements of the same property, usually under prescribed similar conditions, expressed generally in terms of the standard deviation.

(c) Bias. The systematic or persistent distortion of a measurement process which causes errors in one direction.

(d) Accuracy. The degree of agreement between an observed value and an accepted reference value. Accuracy includes a combination of random error (imprecision) and systematic error (bias) components which are due to sampling and analytical operations.

(e) Completeness. A measure of the amount of valid data obtained from a measurement system compared to the amount that was expected to be obtained under correct, normal conditions.

(f) Detection Limit. The lowest concentration or amount of target analyte that can be determined to be different from zero by a single measurement at a stated level of probability.

1.4 Measurement Quality Checks. The measurement quality checks described in section 3 of this appendix shall be reported to AQS and are included in the data required for certification.

1.5 Assessments and Reports. Periodic assessments and documentation of data quality are required to be reported to the EPA. To provide national uniformity in this assessment and reporting of data quality for all networks, specific assessment and reporting procedures are prescribed in detail in sections 3, 4, and 5 of this appendix. On the other hand, the selection and extent of the quality assurance and quality control activities used by a monitoring organization depend on a number of local factors such as field and laboratory conditions, the objectives for monitoring, the level of data quality needed, the expertise of assigned personnel, the cost of control procedures, pollutant concentration levels, etc. Therefore, quality system requirements in section 2 of this appendix are specified in general terms to allow each monitoring organization to develop a quality system that is most efficient and effective for its own circumstances while achieving the data quality objectives described in this appendix.

2. Quality System Requirements

A quality system (reference 1 of this appendix) is the means by which an organization manages the quality of the monitoring information it produces in a systematic, organized manner. It provides a framework for planning, implementing, assessing and reporting work performed by an organization and for carrying out required quality assurance and quality control activities.

2.1 Quality Management Plans and Quality Assurance Project Plans. All PQAOs must develop a quality system that is described and approved in quality management plans (QMP) and QAPPs to ensure that the monitoring results:

(a) Meet a well-defined need, use, or purpose (reference 5 of this appendix);

(b) Provide data of adequate quality for the intended monitoring objectives;

(c) Satisfy stakeholder expectations;

(d) Comply with applicable standards specifications;

(e) Comply with statutory (and other legal) requirements; and

(f) Reflect consideration of cost and economics.

2.1.1 The QMP describes the quality system in terms of the organizational structure, functional responsibilities of management and staff, lines of authority, and required interfaces for those planning, implementing, assessing and reporting activities involving environmental data operations (EDO). The QMP must be suitably documented in accordance with EPA requirements (reference 2 of this appendix), and approved by the appropriate Regional Administrator, or his or her representative. The quality system described in the QMP will be reviewed during the systems audits described in section 2.5 of this appendix. Organizations that implement long-term monitoring programs with EPA funds should have a separate QMP document. Smaller organizations, organizations that do infrequent work with the EPA, or organizations with monitoring programs of limited size or scope may combine the QMP with the QAPP if approved by, and subject to any conditions of, the EPA. Additional guidance on this process can be found in reference 10 of this appendix. Approval of the recipient's QMP by the appropriate Regional Administrator or his or her representative may allow delegation of authority to the PQAO's independent quality assurance function to review and approve environmental data collection activities adequately described and covered under the scope of the QMP and documented in appropriate planning documents (QAPP). Where a PQAO or monitoring organization has been delegated authority to review and approve its QAPP, an electronic copy must be submitted to the EPA region at the time it is submitted to the PQAO/monitoring organization's QAPP approving authority. The QAPP will be reviewed by the EPA during systems audits or circumstances related to data quality. The QMP submission and approval dates for PQAOs/monitoring organizations must be reported to AQS either by the monitoring organization or the EPA Region.

2.1.2 The QAPP is a formal document describing, in sufficient detail, the quality system that must be implemented to ensure that the results of work performed will satisfy the stated objectives. PQAOs must develop QAPPs that describe how the organization intends to control measurement uncertainty to an appropriate level in order to achieve the data quality objectives for the EDO. The quality assurance policy of the EPA requires every EDO to have a written and approved QAPP prior to the start of the EDO. It is the responsibility of the PQAO/monitoring organization to adhere to this policy. The QAPP must be suitably documented in accordance with EPA requirements (reference 3 of this appendix) and include standard operating procedures for all EDOs either within the document or by appropriate reference. The QAPP must identify each PQAO operating monitors under the QAPP as well as generally identify the sites and monitors to which it is applicable either within the document or by appropriate reference. The QAPP submission and approval dates must be reported to AQS either by the monitoring organization or the EPA Region.

2.1.3 The PQAO/monitoring organization's quality system must have adequate resources, both in personnel and funding, to plan, implement, assess and report on the achievement of the requirements of this appendix and its approved QAPP.

2.2 Independence of Quality Assurance. The PQAO must provide for a quality assurance management function, that aspect of the overall management system of the organization that determines and implements the quality policy defined in a PQAO's QMP. Quality management includes strategic planning, allocation of resources and other systematic planning activities (e.g., planning, implementation, assessing and reporting) pertaining to the quality system. The quality assurance management function must have sufficient technical expertise and management authority to conduct independent oversight and assure the implementation of the organization's quality system relative to the ambient air quality monitoring program and should be organizationally independent of environmental data generation activities.

2.3. Data Quality Performance Requirements.

2.3.1 Data Quality Objectives. The DQOs, or the results of other systematic planning processes, are statements that define the appropriate type of data to collect and specify the tolerable levels of potential decision errors that will be used as a basis for establishing the quality and quantity of data needed to support the monitoring objectives (reference 5 of this appendix). The DQOs will be developed by the EPA to support the primary regulatory objectives for each criteria pollutant. As they are developed, they will be added to the regulation. The quality of the conclusions derived from data interpretation can be affected by population uncertainty (spatial or temporal uncertainty) and measurement uncertainty (uncertainty associated with collecting, analyzing, reducing and reporting concentration data). This appendix focuses on assessing and controlling measurement uncertainty.

2.3.1.1 Measurement Uncertainty for Automated and Manual PM2.5 Methods. The goal for acceptable measurement uncertainty is defined for precision as an upper 90 percent confidence limit for the coefficient of variation (CV) of 10 percent and ±10 percent for total bias.

2.3.1.2 Measurement Uncertainty for Automated O3 Methods. The goal for acceptable measurement uncertainty is defined for precision as an upper 90 percent confidence limit for the CV of 7 percent and for bias as an upper 95 percent confidence limit for the absolute bias of 7 percent.

2.3.1.3 Measurement Uncertainty for Pb Methods. The goal for acceptable measurement uncertainty is defined for precision as an upper 90 percent confidence limit for the CV of 20 percent and for bias as an upper 95 percent confidence limit for the absolute bias of 15 percent.

2.3.1.4 Measurement Uncertainty for NO2. The goal for acceptable measurement uncertainty is defined for precision as an upper 90 percent confidence limit for the CV of 15 percent and for bias as an upper 95 percent confidence limit for the absolute bias of 15 percent.

2.3.1.5 Measurement Uncertainty for SO2. The goal for acceptable measurement uncertainty for precision is defined as an upper 90 percent confidence limit for the CV of 10 percent and for bias as an upper 95 percent confidence limit for the absolute bias of 10 percent.
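The per-pollutant goals in sections 2.3.1.1 through 2.3.1.5 can be collected into a simple acceptance check. The sketch below transcribes the stated goals; the function name and data layout are illustrative, and the precision and bias statistics themselves (the upper confidence limits) are computed as described in section 4 of this appendix.

```python
# Measurement uncertainty goals from sections 2.3.1.1-2.3.1.5:
# (precision goal: upper 90% confidence limit for CV, in percent;
#  bias goal: upper 95% confidence limit for absolute bias, in percent,
#  except PM2.5, for which the goal is +/-10 percent total bias).
UNCERTAINTY_GOALS = {
    "PM2.5": (10.0, 10.0),
    "O3":    (7.0, 7.0),
    "Pb":    (20.0, 15.0),
    "NO2":   (15.0, 15.0),
    "SO2":   (10.0, 10.0),
}

def meets_goals(pollutant: str, cv_upper_limit: float,
                bias_upper_limit: float) -> bool:
    """Return True if both statistics are within the goal for the pollutant."""
    cv_goal, bias_goal = UNCERTAINTY_GOALS[pollutant]
    return cv_upper_limit <= cv_goal and abs(bias_upper_limit) <= bias_goal
```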

2.4 National Performance Evaluation Programs. The PQAO shall provide for the implementation of a program of independent and adequate audits of all monitors providing data for NAAQS compliance purposes including the provision of adequate resources for such audit programs. A monitoring plan (or QAPP) which provides for PQAO participation in the EPA's National Performance Audit Program (NPAP), the PM2.5 Performance Evaluation Program (PM2.5-PEP) program and the Pb Performance Evaluation Program (Pb-PEP) and indicates the consent of the PQAO for the EPA to apply an appropriate portion of the grant funds, which the EPA would otherwise award to the PQAO for these QA activities, will be deemed by the EPA to meet this requirement. For clarification and to participate, PQAOs should contact either the appropriate EPA regional quality assurance (QA) coordinator at the appropriate EPA Regional Office location, or the NPAP coordinator at the EPA Air Quality Assessment Division, Office of Air Quality Planning and Standards, in Research Triangle Park, North Carolina. The PQAOs that plan to implement these programs (self-implement) rather than use the federal programs must meet the adequacy requirements found in the appropriate sections that follow, as well as meet the definition of independent assessment that follows.

2.4.1 Independent assessment. An assessment performed by a qualified individual, group, or organization that is not part of the organization directly performing and accountable for the work being assessed. This auditing organization must not be involved with the generation of the ambient air monitoring data. An organization can conduct the performance evaluation (PE) if it can meet this definition and has a management structure that, at a minimum, will allow for the separation of its routine sampling personnel from its auditing personnel by two levels of management. In addition, the sample analysis of audit filters must be performed by a laboratory facility and laboratory equipment separate from the facilities used for routine sample analysis. Field and laboratory personnel will be required to meet PE field and laboratory training and certification requirements to establish comparability to federally implemented programs.

2.5 Technical Systems Audit Program. Technical systems audits of each PQAO shall be conducted at least every 3 years by the appropriate EPA Regional Office and reported to the AQS. If a PQAO is made up of more than one monitoring organization, all monitoring organizations in the PQAO should be audited within 6 years (two TSA cycles of the PQAO). As an example, if a state has five local monitoring organizations that are consolidated under one PQAO, all five local monitoring organizations should receive a technical systems audit within a 6-year period. Systems audit programs are described in reference 10 of this appendix.

2.6 Gaseous and Flow Rate Audit Standards.

2.6.1 Gaseous pollutant concentration standards (permeation devices or cylinders of compressed gas) used to obtain test concentrations for CO, SO2, NO, and NO2 must be traceable to either a National Institute of Standards and Technology (NIST) Traceable Reference Material (NTRM) or a NIST-certified Gas Manufacturer's Internal Standard (GMIS), certified in accordance with one of the procedures given in reference 4 of this appendix. Vendors advertising certification with the procedures provided in reference 4 of this appendix and distributing gases as “EPA Protocol Gas” for ambient air monitoring purposes must participate in the EPA Ambient Air Protocol Gas Verification Program or not use “EPA” in any form of advertising. Monitoring organizations must provide information to the EPA on the gas producers they use on an annual basis and those PQAOs purchasing standards will be obligated, at the request of the EPA, to participate in the program at least once every 5 years by sending a new unused standard to a designated verification laboratory.

2.6.2 Test concentrations for O3 must be obtained in accordance with the ultraviolet photometric calibration procedure specified in appendix D to Part 50 of this chapter and by means of a certified NIST-traceable O3 transfer standard. Consult references 7 and 8 of this appendix for guidance on transfer standards for O3.

2.6.3 Flow rate measurements must be made by a flow measuring instrument that is NIST-traceable to an authoritative volume or other applicable standard. Guidance for certifying some types of flowmeters is provided in reference 10 of this appendix.

2.7 Primary Requirements and Guidance. Requirements and guidance documents for developing the quality system are contained in references 1 through 11 of this appendix, which also contain many suggested procedures, checks, and control specifications. Reference 10 describes specific guidance for the development of a quality system for data collected for comparison to the NAAQS. Many specific quality control checks and specifications for methods are included in the respective reference methods described in Part 50 of this chapter or in the respective equivalent method descriptions available from the EPA (reference 6 of this appendix). Similarly, quality control procedures related to specifically designated reference and equivalent method monitors are contained in the respective operation or instruction manuals associated with those monitors.

3. Measurement Quality Check Requirements

This section provides the requirements for PQAOs to perform the measurement quality checks that can be used to assess data quality. Data from these checks are required to be submitted to the AQS within the same time frame as routinely collected ambient concentration data as described in 40 CFR 58.16. Table A-1 of this appendix provides a summary of the types and frequency of the measurement quality checks that will be described in this section.

3.1. Gaseous Monitors of SO2, NO2, O3, and CO.

3.1.1 One-Point Quality Control (QC) Check for SO2, NO2, O3, and CO. (a) A one-point QC check must be performed at least once every 2 weeks on each automated monitor used to measure SO2, NO2, O3 and CO. With the advent of automated calibration systems, more frequent checking is strongly encouraged. See Reference 10 of this appendix for guidance on the review procedure. The QC check is made by challenging the monitor with a QC check gas of known concentration (effective concentration for open path monitors) within the prescribed range of 0.005 to 0.08 parts per million (ppm) for SO2, NO2, and O3, and within the prescribed range of 0.5 to 5 ppm for CO monitors. The QC check gas concentration selected within the prescribed range should be related to the monitoring objectives for the monitor. If monitoring at an NCore site or for trace level monitoring, the QC check concentration should be selected to represent the mean or median concentrations at the site. If the mean or median concentrations at trace gas sites are below the method detection limit (MDL) of the instrument, the agency can select the lowest concentration in the prescribed range that can be practically achieved. If the mean or median concentrations at trace gas sites are above the prescribed range, the agency can select the highest concentration in the prescribed range. An additional QC check point is encouraged for those organizations that may have occasional high values or would like to confirm the monitors' linearity at the higher end of the operational range or around NAAQS concentrations. If monitoring for NAAQS decisions, the QC concentration can be selected at a higher concentration within the prescribed range but should also consider precision points around mean or median monitor concentrations.
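The concentration-selection logic in paragraph (a) for trace gas sites amounts to clamping the site's typical concentration to the prescribed range, falling back to the range endpoints when the typical level is below the instrument MDL or above the range. A minimal sketch, with illustrative function and variable names:

```python
# Prescribed one-point QC check ranges from paragraph (a), in ppm.
PRESCRIBED_RANGE_PPM = {
    "SO2": (0.005, 0.08),
    "NO2": (0.005, 0.08),
    "O3":  (0.005, 0.08),
    "CO":  (0.5, 5.0),
}

def select_qc_concentration(pollutant: str, site_typical: float,
                            mdl: float) -> float:
    """Pick a QC check gas concentration for a trace gas site.

    site_typical is the site mean or median concentration (ppm);
    mdl is the instrument's method detection limit (ppm).
    """
    lo, hi = PRESCRIBED_RANGE_PPM[pollutant]
    if site_typical < mdl:
        # Typical levels are unmeasurable: use the lowest concentration
        # in the prescribed range that can be practically achieved.
        return lo
    if site_typical > hi:
        # Typical levels exceed the range: use its highest concentration.
        return hi
    # Otherwise target the typical site concentration, but never go
    # below the bottom of the prescribed range.
    return max(site_typical, lo)
```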

(b) Point analyzers must operate in their normal sampling mode during the QC check and the test atmosphere must pass through all filters, scrubbers, conditioners and other components used during normal ambient sampling and as much of the ambient air inlet system as is practicable. The QC check must be conducted before any calibration or adjustment to the monitor.

(c) Open path monitors are tested by inserting a test cell containing a QC check gas concentration into the optical measurement beam of the instrument. If possible, the normally used transmitter, receiver, and as appropriate, reflecting devices should be used during the test, and the normal monitoring configuration of the instrument should be altered as little as possible to accommodate the test cell for the test. However, if permitted by the associated operation or instruction manual, an alternate local light source or an alternate optical path that does not include the normal atmospheric monitoring path may be used. The actual concentration of the QC check gas in the test cell must be selected to produce an effective concentration in the range specified earlier in this section. Generally, the QC test concentration measurement will be the sum of the atmospheric pollutant concentration and the QC test concentration. As such, the result must be corrected to remove the atmospheric concentration contribution. The corrected concentration is obtained by subtracting the average of the atmospheric concentrations measured by the open path instrument under test immediately before and immediately after the QC test from the QC check gas concentration measurement. If the difference between these before and after measurements is greater than 20 percent of the effective concentration of the test gas, discard the test result and repeat the test. If possible, open path monitors should be tested during periods when the atmospheric pollutant concentrations are relatively low and steady.
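The correction described in paragraph (c) can be sketched as follows: subtract the mean of the ambient readings taken immediately before and after the check from the measured QC concentration, and flag the result for a repeat test when the ambient signal drifted by more than 20 percent of the effective test-gas concentration. The function name and return convention are illustrative.

```python
def correct_open_path_qc(measured_qc: float, ambient_before: float,
                         ambient_after: float, effective_conc: float):
    """Return (corrected_concentration, is_valid) for an open path QC test.

    All concentrations share the same units (e.g., ppm). The test is
    invalid when the before/after ambient readings differ by more than
    20 percent of the effective test-gas concentration.
    """
    drift = abs(ambient_before - ambient_after)
    if drift > 0.20 * effective_conc:
        return None, False  # discard the result and repeat the test
    # Remove the atmospheric contribution from the measured QC value.
    corrected = measured_qc - (ambient_before + ambient_after) / 2.0
    return corrected, True
```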

(d) Report the audit concentration of the QC gas and the corresponding measured concentration indicated by the monitor to AQS. The percent differences between these concentrations are used to assess the precision and bias of the monitoring data as described in sections 4.1.2 (precision) and 4.1.3 (bias) of this appendix.
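The percent difference reported under paragraph (d) follows the usual (measured − audit)/audit convention; the exact precision and bias statistics built from these values are defined in sections 4.1.2 and 4.1.3 of this appendix.

```python
def percent_difference(measured: float, audit: float) -> float:
    """Percent difference of a one-point QC check relative to the
    audit (known) concentration, as used in the section 4 statistics."""
    return (measured - audit) / audit * 100.0
```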

3.1.2 Annual performance evaluation for SO2, NO2, O3, or CO. A performance evaluation must be conducted on each primary monitor once a year. This can be accomplished by evaluating 25 percent of the primary monitors each quarter. The evaluation should be conducted by a trained, experienced technician other than the routine site operator.

3.1.2.1 The evaluation is made by challenging the monitor with audit gas standards of known concentration from at least three audit levels. One point must be within two to three times the method detection limit of the instruments within the PQAO's network; the second point will be less than or equal to the 99th percentile of the data at the site or the network of sites in the PQAO, or the next highest audit concentration level. The third point can be around the primary NAAQS or the highest 3-year concentration at the site or the network of sites in the PQAO. An additional fourth level is encouraged for those agencies that would like to confirm the monitors' linearity at the higher end of the operational range. In rare circumstances, there may be sites measuring concentrations above audit level 10. Notify the appropriate EPA region and the AQS program in order to make accommodations for auditing at levels above level 10.

                             Concentration range, ppm
Audit level      O3              SO2             NO2              CO
     1       0.004-0.0059   0.0003-0.0029   0.0003-0.0029    0.020-0.059
     2       0.006-0.019    0.0030-0.0049   0.0030-0.0049    0.060-0.199
     3       0.020-0.039    0.0050-0.0079   0.0050-0.0079    0.200-0.899
     4       0.040-0.069    0.0080-0.0199   0.0080-0.0199    0.900-2.999
     5       0.070-0.089    0.0200-0.0499   0.0200-0.0499    3.000-7.999
     6       0.090-0.119    0.0500-0.0999   0.0500-0.0999    8.000-15.999
     7       0.120-0.139    0.1000-0.1499   0.1000-0.2999   16.000-30.999
     8       0.140-0.169    0.1500-0.2599   0.3000-0.4999   31.000-39.999
     9       0.170-0.189    0.2600-0.7999   0.5000-0.7999   40.000-49.999
    10       0.190-0.259    0.8000-1.000    0.8000-1.000    50.000-60.000
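For selecting audit points under section 3.1.2.1, it can help to know which audit level a given concentration falls into. The sketch below transcribes the table above into a lookup; the dictionary layout and function name are illustrative.

```python
# Audit level concentration ranges (ppm), transcribed from the table
# above; index 0 corresponds to audit level 1.
AUDIT_LEVELS_PPM = {
    "O3":  [(0.004, 0.0059), (0.006, 0.019), (0.020, 0.039),
            (0.040, 0.069), (0.070, 0.089), (0.090, 0.119),
            (0.120, 0.139), (0.140, 0.169), (0.170, 0.189),
            (0.190, 0.259)],
    "SO2": [(0.0003, 0.0029), (0.0030, 0.0049), (0.0050, 0.0079),
            (0.0080, 0.0199), (0.0200, 0.0499), (0.0500, 0.0999),
            (0.1000, 0.1499), (0.1500, 0.2599), (0.2600, 0.7999),
            (0.8000, 1.000)],
    "NO2": [(0.0003, 0.0029), (0.0030, 0.0049), (0.0050, 0.0079),
            (0.0080, 0.0199), (0.0200, 0.0499), (0.0500, 0.0999),
            (0.1000, 0.2999), (0.3000, 0.4999), (0.5000, 0.7999),
            (0.8000, 1.000)],
    "CO":  [(0.020, 0.059), (0.060, 0.199), (0.200, 0.899),
            (0.900, 2.999), (3.000, 7.999), (8.000, 15.999),
            (16.000, 30.999), (31.000, 39.999), (40.000, 49.999),
            (50.000, 60.000)],
}

def audit_level(pollutant: str, conc_ppm: float):
    """Return the 1-based audit level containing conc_ppm, or None if
    the concentration falls outside all tabulated ranges."""
    for level, (lo, hi) in enumerate(AUDIT_LEVELS_PPM[pollutant], start=1):
        if lo <= conc_ppm <= hi:
            return level
    return None
```

Concentrations above level 10 return None, matching the note in section 3.1.2.1 that such cases require coordination with the EPA region and the AQS program.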