I. Compilation of quality system documents
1.1 Necessity of quality system documents
On December 1, 2001, the State Bureau of Quality and Technical Supervision issued the "Evaluation Criteria for Metrological Certification/Examination and Approval (Acceptance) of Product Quality Inspection Institutions" (for trial implementation), which completely replaced the original assessment specifications. For laboratories, this means the old documented quality system is no longer valid: the original documents were designed around the old specifications and cannot cover the requirements of the new criteria (such as the newly added "guarantee of impartiality" and "proficiency testing"), nor do they meet the new criteria's systematic requirement for a "documented system". The first step in establishing a quality system under the new criteria is therefore to translate the criteria's requirements into documents that can be implemented within the laboratory. Only newly prepared quality system documents can demonstrate that the laboratory is able to meet the criteria, and these documents are also the core basis for the subsequent metrological certification evaluation.
1.2 Functions and Characteristics of Quality System Documents
1.2.1 Core role: The internal constitution of the laboratory
The implementation of the quality system depends on documented carriers. Without documents, the quality system is merely an abstract management concept. With documents, it becomes an executable and verifiable rule system. Specifically, the functions of documents are reflected in three aspects:
Standardize behaviors: Clearly define the work boundaries of all personnel (for example, testing personnel must operate according to standard methods, and administrators must calibrate equipment regularly).
Proof of ability: Documents are the "evidence" of the existence of the quality system (for example, the "Equipment Management Procedure" can prove that the laboratory has the ability to manage and control equipment).
Goal orientation: Transform "quality goals" (such as "100% testing accuracy") into specific process requirements (such as "check the equipment status before testing").
In short, preparing quality system documents is like "enacting laws" for the laboratory, and all work must be "based on laws".
1.2.2 Four key features
The vitality of the quality system documents stems from the balance between their rigidity and flexibility, which is specifically manifested in four characteristics:
Regulatory nature: Once approved by top management, a document has the effect of an "internal regulation": all personnel must implement it strictly, without exception. Revisions must follow the standard process of "application → review → approval → release → retrieval of the old version"; arbitrary changes are prohibited. The document also serves as the basis for internal and external audits (assessors check whether the laboratory complies with the criteria through the document, and the laboratory checks whether it deviates from the requirements through self-inspection against the document).
Uniqueness: A laboratory may have only one effective set of documents. One activity corresponds to one procedure (for example, there cannot be two operating procedures for "sample reception" at the same time), and one rule has one clear interpretation (for example, "effective version" means the most recently approved document). Version control must ensure that all documents in use are the latest versions; old versions must be promptly retrieved and destroyed to avoid misuse.
Applicability: Documents cannot simply be copied from templates; they must match the nature, tasks, and characteristics of the laboratory. For example, the documents of an environmental monitoring station should highlight full-process control of "sampling → transportation → analysis"; a food laboratory's documents should emphasize the details of "sample pretreatment"; and a small laboratory's documents should be simplified (avoiding the redundancy of "comprehensive but impractical"). Only "well-fitting" documents avoid becoming ornaments.
Witnessability: As a provider of impartial data, the laboratory needs its data to be legally traceable (for example, a test record should show who performed the test, with what equipment, by what method, and when). Documents are also tools for the self-improvement of the quality system: through records (such as original test records and audit reports), problems can be located quickly (for example, a missing calibration record for a piece of equipment) and rectified in time, realizing the PDCA cycle (Plan → Do → Check → Act).
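The "uniqueness" feature above (one effective version per document, with old versions withdrawn when a new one is released) can be sketched as a small registry. This is a hypothetical illustration; the class and field names are invented and not part of the criteria:

```python
from dataclasses import dataclass

@dataclass
class Document:
    number: str       # e.g. "SHJ-CX-001"
    version: str      # e.g. "2024"
    effective: bool = True

class DocumentRegistry:
    """Toy registry enforcing one effective version per document number."""
    def __init__(self):
        self._docs = {}   # number -> list of Document versions

    def release(self, number, version):
        history = self._docs.setdefault(number, [])
        for old in history:        # retrieve (withdraw) every old version
            old.effective = False
        doc = Document(number, version)
        history.append(doc)
        return doc

    def effective_version(self, number):
        for doc in self._docs.get(number, []):
            if doc.effective:
                return doc.version
        return None

registry = DocumentRegistry()
registry.release("SHJ-CX-001", "2023")
registry.release("SHJ-CX-001", "2024")  # the 2023 edition is withdrawn automatically
```

Releasing a new version is the only way to add a document, so the registry can never hold two effective versions of the same number, mirroring the "one activity, one procedure" rule.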
1.3 Composition and Hierarchy of Quality System Documents
Quality system documents form a hierarchical logical system whose core purpose is to implement the general guidelines step by step down to the operational level. The common three-tier structure is as follows:
Level 1: Quality manual (policy document): States the laboratory's quality policy, the overall system framework, and the allocation of elements (e.g., "personnel training" is the responsibility of the Human Resources Department). It serves as the "general guideline" of the system.
Level 2: Procedure documents: Rules for specific management or technical processes (e.g., "Sample Management Procedure", "Internal Audit Procedure"), which answer the question "how to do it" (e.g., "What items must be checked when receiving samples?").
Level 3: Operational documents (detailed documents): Work instructions (specific operation steps for a given testing project), technical standards (such as GB 5749-2022, Hygienic Standards for Drinking Water), quality records (such as original testing records and equipment calibration records), etc. These documents directly guide operations.
Some laboratories add a fourth tier (such as the "quality plan", a temporary document for specific projects), but the core logic remains unchanged: from "why to do it" down to "how to do it", with more detail at each level.
1.4 Principles for Compiling Quality System Documents
The core of writing a document is to "make the document useful", and three major principles need to be followed:
System coordination: The elements of the quality system (personnel, equipment, samples, methods) are interrelated (for example, "personnel training" needs to match the requirements of "equipment operation"). When compiling, it is necessary to start from the overall situation of the laboratory to ensure that the interfaces of various documents are tight (for example, the "Sample Management Procedure" should be connected with the "Testing Method Procedure" to avoid the loophole of "no testing process after sample reception"), thus forming a "closed-loop system".
Scientific and reasonable: "Scientific" means meeting the requirements of the criteria (for example, if the criteria require that equipment be calibrated regularly, the document should specify the calibration cycle, institution, and record requirements); "reasonable" means conforming to the actual workflow (for example, the testing process follows the sequence "sample reception → pretreatment → analysis → report" and cannot be reversed). Avoid preparing documents merely to satisfy the criteria; instead, use the criteria to guide document preparation.
Operable implementation: Documents should be written clearly enough to be implemented fully. For example, instead of "samples should be well preserved", write "samples must be stored in a refrigerator at 4°C for no more than 7 days; the sample administrator inspects them daily and keeps records." Documents should also be easy to inspect (e.g., stipulate that test records include personnel signatures, equipment numbers, and method standard numbers) and easy to trace (e.g., record numbers should link reports → original records → samples → equipment).
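The report → original record → sample → equipment chain above can be illustrated with a minimal lookup structure. This is a hypothetical sketch; the identifiers and field names are invented for illustration:

```python
# Each layer points "down" to the layer it was derived from,
# so a report number can be traced back to the equipment used.
report_index = {"R-2024-015": "REC-2024-087"}             # report -> original record
record_index = {"REC-2024-087": ("S-2024-133", "EQ-21")}  # record -> (sample, equipment)

def trace(report_no):
    """Return the full traceability chain for a report number."""
    record_no = report_index[report_no]
    sample_no, equipment_no = record_index[record_no]
    return {"report": report_no, "record": record_no,
            "sample": sample_no, "equipment": equipment_no}

chain = trace("R-2024-015")
```

However the indices are actually stored (paper ledgers or a database), the design point is the same: every record carries the number of the record it came from, so the chain can be walked in either direction.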
1.5 Writing process of quality system documents
Writing a document is not a matter of "writing on a whim" but a systematic task based on the current situation and corresponding standards, which needs to be completed in four steps:
1.5.1 Step 1: Training and learning – Unifying cognition
Organize all personnel to repeatedly study the new guidelines and relevant regulations (such as the "Metrology Law" and the "Administrative Measures for the Accreditation of Laboratory Qualifications"), and focus on solving three problems:
Understand the requirements of the criteria: For example, clause "4.1 Organization" requires that "the laboratory shall have a clear organizational structure"; staff should understand why this is required (to avoid unclear responsibilities).
Master the changes in the provisions: Compare against the old specifications and identify the newly added content in the new criteria (such as "declaration of impartiality" and "proficiency testing").
Clarify individual responsibilities: For example, the technical leader writes the technical documents (such as work instructions), and the quality leader writes the management documents (such as procedure documents).
Only when all employees understand can the documents be effectively implemented.
1.5.2 Step 2: Survey and planning – Identifying the gaps
Against the new criteria, comprehensively review the current situation of the laboratory and identify the problems that need rectification:
Organizational structure: Are there key positions such as "Quality Manager" and "Technical Manager"?
Duties and authorities: Are the duties of each department clearly defined (e.g., is it the testing laboratory or the administrative department that is responsible for "sample management"?)?
Existing system: Does the old document comply with the new criteria? (For example, the old "Sample Management Measures" did not stipulate "sample identification", and it needs to be supplemented.)
Resource allocation: Do the equipment and reference materials meet the requirements of the new criteria? (For example, if the standard for a certain testing item is updated, the new version of the standard needs to be purchased.)
The output of this stage is the "gap list" and "rectification measures" (such as "add a quality supervisor position" and "revise the 'Sample Management Measures'").
1.5.3 Step 3: Writing and Reviewing – Division of Labor and Collaboration
Compile in the order of "outline first, details later". The core work includes:
Formulate specifications: Unify document formats (font, page numbers, numbering rules), outlines (such as the chapter framework of the quality manual), and terms (such as the definition of "effective version");
Division of labor: The quality supervisor writes the Quality Manual (guiding principles), the technical supervisor writes the procedure documents (processes), and the testing team leaders write the work instructions (operations);
Discussion and coordination: Organize the departments to review the consistency of the documents (for example, whether the "Sample Management Procedure" aligns with the "Testing Method Procedure");
Approval and release: The quality manual is approved by top management, procedure documents by the quality manager, and work instructions by the technical manager. Old-version documents must be retrieved before release to avoid confusion.
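The approval-and-release step above (draft → review → approval by the right role → release) can be sketched as a small state machine. The role names mirror the text; the class and method names are invented for illustration:

```python
# Approver required for each document type, as described above.
APPROVER = {
    "quality_manual": "top_management",
    "procedure": "quality_manager",
    "work_instruction": "technical_manager",
}

class ControlledDocument:
    """draft -> reviewed -> approved -> released, enforcing the right approver."""
    def __init__(self, doc_type, title):
        self.doc_type, self.title, self.state = doc_type, title, "draft"

    def review(self):
        assert self.state == "draft"
        self.state = "reviewed"

    def approve(self, role):
        assert self.state == "reviewed"
        if role != APPROVER[self.doc_type]:
            raise PermissionError(f"{role} cannot approve a {self.doc_type}")
        self.state = "approved"

    def release(self):
        assert self.state == "approved"
        self.state = "released"   # old versions would be retrieved at this point

doc = ControlledDocument("procedure", "Sample Management Procedure")
doc.review()
doc.approve("quality_manager")
doc.release()
```

Encoding the approver table separately from the state machine keeps the "who approves what" rule in one place, which is exactly what the written procedure is meant to do.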
1.5.4 Step 4: Numbering management – Unique identification
The document number is the core tool of version control and must be unique and easily recognizable. For example, the numbering rule of a water environment monitoring center (code "SHJ"):
SHJ-TypeCode-SerialNumber-YearCode
- SHJ: center code (fixed);
- Type code: quality manual (MS), procedure document (CX), work instruction (ZY);
- Serial number: the sequence of the document (e.g., 001, 002);
- Year code: the year of release (e.g., 2024).
Example: `SHJ-CX-001-2024` (the first procedure document released in 2024).
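A numbering rule like this is easy to validate mechanically. The sketch below is a hypothetical helper, not part of the criteria; it parses a number of the form `SHJ-CX-001-2024` and rejects malformed ones:

```python
import re

# Center code, type code, 3-digit serial, 4-digit year, as described above.
PATTERN = re.compile(r"^SHJ-(MS|CX|ZY)-(\d{3})-(\d{4})$")

def parse_number(number):
    """Split a document number into its fields, or raise ValueError."""
    m = PATTERN.match(number)
    if not m:
        raise ValueError(f"malformed document number: {number}")
    type_code, serial, year = m.groups()
    return {"type": type_code, "serial": int(serial), "year": int(year)}

info = parse_number("SHJ-CX-001-2024")
```

A check like this at the point where numbers are assigned prevents two documents from ever sharing an identifier or drifting from the agreed format.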
II. Compilation of the Quality Manual
2.1 Core role of the quality manual
The quality manual is the highest programmatic document of the quality system, equivalent to the "general principles of the constitution" of the laboratory. Its functions are reflected in three aspects:
Indicate the direction: State the laboratory's quality policy (e.g., "scientific, impartial, accurate, and efficient") and quality objectives (e.g., "100% accuracy of test reports") to provide an action guide for all staff;
Describe the system: Present the overall framework of the quality system (such as the organizational structure and the allocation of elements) so that everyone knows what the system looks like;
Serve as the review basis: The manual is the first document examined in the metrological certification review; reviewers judge through it whether the laboratory's system covers all elements of the criteria.
2.2 Compilation Methods and Requirements for the Quality Manual
2.2.1 Three compilation paths
A laboratory can choose among them based on its own foundation:
1. Compile the manual and the procedure documents simultaneously: This approach is suitable for laboratories familiar with the system, which can ensure the consistency between the two. However, it is necessary to coordinate the progress of each compilation group.
2. Write the manual first and then the procedure documents: This approach is suitable for newly established laboratories. First, define the "guidelines" and then refine the "processes". However, it may result in the procedure documents being divorced from the manual.
3. Write the procedure documents first, then the manual: This approach suits laboratories with an existing foundation. First organize the existing processes, then distill the guiding principles from them. However, the systematic coherence of the manual must then be ensured.
2.2.2 Core requirements for writing
Follow the criteria closely: The manual must cover all elements of the new criteria (such as "4.1 Organization", "4.2 Quality Manual", "4.3 Document Control") without omission.
Principled statement: The manual is a guide, not an operation manual: it states what to do, not how to do it (for example, the manual states that "sample management shall comply with the requirements of the criteria", while the specific operations are described in the "Sample Management Procedure").
Fit the laboratory: Highlight its characteristics (for example, the manual of an environmental monitoring station should cover full-process control of water environment monitoring, while that of a food laboratory should cover requirements for sample pretreatment).
Measurability: Quality objectives should be specific and measurable (e.g., "Customer satisfaction ≥ 95%", rather than "Improve customer satisfaction").
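Objectives phrased measurably, as required above, can be checked mechanically. The sketch below uses hypothetical names and figures, for illustration only; it evaluates a set of objectives against measured values:

```python
import operator

# Each objective: (measured value, target, comparison),
# e.g. "customer satisfaction >= 95%" or "complaint rate <= 1%".
objectives = {
    "report accuracy (%)":       (100.0, 100.0, operator.ge),
    "customer satisfaction (%)": (96.2, 95.0, operator.ge),
    "complaint rate (%)":        (0.8, 1.0, operator.le),
}

results = {name: cmp(measured, target)
           for name, (measured, target, cmp) in objectives.items()}
all_met = all(results.values())
```

A vague objective like "improve customer satisfaction" cannot be expressed this way at all, which is precisely why the criteria favor measurable formulations.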
2.3 Structure and Content of the Quality Manual
The structure of the quality manual should be clear and logical; it is usually divided into three parts:
2.3.1 Part I: General Information
Cover: The laboratory name, the manual name (e.g., "Quality Manual of XX Water Environment Monitoring Center"), the version number (e.g., V2.0), the release date, and the implementation date;
Approval page: The signature of top management (certifying the manual's effectiveness);
Revision pages: The manual's modification history (e.g., from V1.0 to V2.0: a new chapter, "Proficiency Testing", was added);
Table of contents: The contents listed by chapter (for quick reference);
Preface: The laboratory overview (establishment date, testing capabilities, certification status) and the purpose of the manual (e.g., "to establish a quality system that complies with the new criteria");
Quality policy and objectives: Clear and concise (e.g., "Quality policy: impartiality, accuracy, and continuous improvement; quality objectives: 100% accuracy of test reports, customer complaint rate ≤ 1%");
Quality element allocation table: The responsible department for each criterion element (for example, "4.4 Testing methods" is the responsibility of the technical manager);
Organization chart: The relationships among departments (e.g., top management → quality manager → technical manager → Testing Room I / Testing Room II / Sampling Room);
Supervision mechanism: The responsibilities and powers of supervisors (for example, supervisors may check the integrity of test records and may require rectification);
Proficiency testing measures: For example, "participate in the national proficiency testing of water environment monitoring once a year, with qualified results".
2.3.2 Part II: Description of Quality System Elements
Write in chapters according to the element order of the new standard (e.g., "4.1 Organization", "4.2 Quality Manual", "4.3 Document Control"). Each chapter shall include:
Purpose: Why the activity is performed (e.g., "the purpose of document control is to ensure that valid versions of documents are used");
Scope: Which processes it applies to (e.g., "applies to all quality system documents of the laboratory");
Responsibilities: Who is responsible for what (e.g., "the quality manager is responsible for approving documents");
Procedure summary: How it is done (e.g., "preparation → review → approval → release → retrieval");
Resource guarantee: What support is needed (e.g., "assign one document administrator responsible for document storage");
Related documents: The corresponding procedure documents or work instructions (e.g., the "Document Control Procedure").
2.3.3 Part III: Appendices and Support Documents
Appendix: Supplementary information (such as the laboratory floor plan, showing the locations of the sampling room, testing rooms, and equipment room; and the statement of impartiality, the laboratory's commitment to independence and impartiality).
Supporting document list: All associated documents (the names and numbers of procedure documents, work instructions, and technical standards).
2.4 Example of Quality Manual Compilation
Chapter 1 Preface
1.1 Overview: XX Water Environment Monitoring Center was established in 2000. It is a third-party institution specializing in water environment monitoring and holds CMA certification (Certificate No.: CMA2024001). Its testing capabilities cover 100 water quality indicators (such as COD, ammonia nitrogen, and heavy metals).
1.2 Subject content and scope of application: This manual describes the quality system framework of this center and is applicable to all testing activities and management work.
1.3 Terms and abbreviations: CMA (China Metrology Accreditation), COD (Chemical Oxygen Demand), proficiency testing (inter-laboratory comparison testing).
1.4 Contact information: Address (No. 123, XX Road, XX City), telephone (010-12345678), email (shj@).
Chapter 2 Management of Quality Manual
2.1 Objectives and basis: To standardize the compilation, revision, and use of the manual, in accordance with the clause "4.2 Quality manual" of the new "Evaluation Criteria".
2.2 Scope of application: It applies to the entire life cycle of this manual (preparation → release → revision → abolition).
2.3 Management responsibilities: The person in charge of quality is responsible for the daily management of the manual, and the top management is responsible for approval.
2.4 Compilation and revision: For compilation, a "Manual Compilation Team" (with the quality supervisor as the team leader) shall be established. For revision, relevant departments shall submit an application. After review and approval, a new version shall be released.
2.5 Dissemination and training: After the manual is released, organize training for all employees (employees can start work only after passing the exam), and conduct at least one retraining session every year.
2.6 Numbering rule: Use `SHJ-MS-SerialNumber-YearCode` (e.g., SHJ-MS-001-2024).
2.7 Related documents: "Document Control Procedure" (SHJ-CX-001-2024).
Chapter 3 Quality Policy and Quality Objectives
3.1 Quality Policy: Scientific and Impartial, Accurate and Efficient, Continuous Improvement – Provide accurate test data with scientific methods and an impartial attitude, and continuously improve the quality system.
3.2 Quality objectives:
- The accuracy rate of test reports is 100%;
- Customer satisfaction ≥ 95%;
- Two new testing capabilities are added every year;
- The pass rate in proficiency testing is 100%.
3.3 Quality commitment: Be responsible for the authenticity and accuracy of the test data. If data errors are caused by our center, we are willing to assume legal liability.
Chapter 4 Organization and Management
4.1 Organizational structure: Top management → Quality manager → Technical manager → Testing Room 1 (responsible for the testing of routine indicators), Testing Room 2 (responsible for the testing of heavy metals), Sampling Room (responsible for on - site sampling), Equipment Room (responsible for equipment management), Personnel Section (responsible for staff training).
4.2 Responsibilities and authorities:
- Top management: Approves the quality policy, quality objectives, and quality manual;
- Quality manager: Responsible for internal audits, management reviews, and document control;
- Technical manager: Responsible for the validation of testing methods, proficiency testing, and the compilation of work instructions;
- Testing room heads: Responsible for the testing activities and personnel management of their own rooms.
Chapter 4 Organizational Structure and Basic Management Specifications
This chapter focuses on the organizational structure and core management rules of the laboratory. The core contents include:
Organizational structure setup: Present the department hierarchy, reporting relationships, and key positions (such as the technical manager and quality manager) visually in a block diagram, and clarify the boundaries of power and responsibility across the decision-making, management, and execution layers.
Duties and authorities: Refine the specific responsibilities (e.g., testers operate according to standards; equipment administrators maintain equipment status) and authorities (e.g., the quality manager may suspend non-compliant testing activities) of each department (such as the monitoring business room and the quality assurance room) and each position (such as testers and equipment administrators).
Delegation of authority: Define the scope of authorization (e.g., temporarily entrusting the technical manager to approve reports), the approval process (requiring the signature of top management), and the accountability mechanism (the authorizer is responsible for the delegated act).
Confidentiality and ownership protection: For clients' confidential information (such as product formulas and test data) and intellectual property (such as technical solutions), stipulate full-process control requirements covering storage (encrypted servers), transmission (dedicated lines or encrypted email), and destruction (shredders or data erasure).
Management of mandatory tasks: Standardize the acceptance (confirming requirements and deadlines), distribution (assigning according to departmental responsibilities), and execution (prioritizing resources and promptly reporting progress) of tasks assigned by the government or superior departments.
Relevant documents and appendices: List the procedural documents (such as the "Management Procedure for Mandatory Tasks") that support the above rules and appendices such as the organizational structure diagram.
Chapter 5 Construction of Quality System and Guarantee of Its Effectiveness
5.1 Overview
The quality of test reports is influenced by multi-dimensional factors such as personnel capabilities, equipment status, method validity, and environmental conditions. The core value of the quality system lies in bringing these factors under control through systematic design: sorting out the relationships among them (e.g., personnel training should match equipment operation requirements) with a view to overall optimization, coordinating cooperation among departments (e.g., the handover between sampling and testing), and regularly identifying and correcting deviations through audits (checking the system's implementation) and reviews (evaluating the system's suitability) to keep the system operating effectively.
The division of responsibilities in the quality system of this center is clearly defined:
Top management (e.g., the center director): Takes overall responsibility for establishing, implementing, and maintaining the system (approving the quality policy and providing resource guarantees);
Monitoring Business Room and Quality Assurance Room: Take the lead in organizing the system's implementation (e.g., formulating procedure documents and supervising execution);
Each sub-center and analysis/testing laboratory: Implements the system requirements within its own scope (such as testing according to the quality loop) and performs daily maintenance (such as recording equipment usage).
5.2 Logic for establishing the quality system
The center takes the process approach as its core. Based on the GB/T 19000 series of quality management standards (emphasizing the PDCA cycle) and GB/T 15481, General Requirements for the Competence of Testing and Calibration Laboratories (focusing on technical capability and management compliance), and considering the business characteristics of water environment monitoring (such as the uncertainty of on-site sampling and the time-sensitivity of samples), a documented quality system has been established: all requirements are solidified in written documents to avoid the ambiguity of oral agreements.
The system operation needs to meet four key principles:
1. Full participation: The system is not a "document for management" but a work guide for all employees. From inspectors to administrators, everyone needs to understand their own responsibilities and the system requirements to ensure consistent implementation.
2. Demand-oriented: The test data shall meet both the requirements of the entrusting party (such as the test accuracy required by the client) and the regulatory requirements (such as the monitoring indicators of the superior water administrative department).
3. Service feasibility: The system design should take into account the actual scenarios of "external services", such as the process for customers to inquire about the testing progress and the reception requirements for sample submission, so as to avoid the contradiction of "compliant processes but inconvenience for customers".
4. Prevention first: Focus on pre-control of quality risks (such as avoiding operational errors through personnel training and data deviations through equipment calibration) rather than after-the-fact correction (such as rework after a report error). At the same time, establish quantifiable assessment indicators (such as the pass rate of test results) to verify the preventive effect.
5.2.1 Design of quality loop and quality elements
The quality loop is the full-process map of how the quality of a test report is formed. Based on the characteristics of water environment monitoring, the center's quality loop covers 10 key links:
On-site investigation → Design of sampling points → Formulation of monitoring plan → Sample collection → Sample management → Analytical testing → Data evaluation → Report preparation → Feedback to customers → Continuous improvement
Each link is closely connected to the next (for example, the on-site investigation informs the design of sampling points, and the sampling points affect sample representativeness); this interconnection is the core logic of the system design.
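The ordering of the 10 links can be enforced mechanically. The sketch below is a hypothetical illustration (the link names are abbreviated from the loop above) that checks whether a sequence of executed links respects the loop's order:

```python
# The 10 links of the quality loop, in the order given above.
QUALITY_LOOP = [
    "on-site investigation", "sampling point design", "monitoring plan",
    "sample collection", "sample management", "analytical testing",
    "data evaluation", "report preparation", "customer feedback",
    "continuous improvement",
]

def follows_loop(executed_links):
    """Check that the executed links appear in quality-loop order."""
    positions = [QUALITY_LOOP.index(link) for link in executed_links]
    return positions == sorted(positions)

ok = follows_loop(["on-site investigation", "sample collection", "analytical testing"])
bad = follows_loop(["analytical testing", "sample collection"])
```

A check like this catches sequencing errors (such as testing a sample before it has been formally collected and registered), which is exactly the kind of deviation the quality loop is meant to prevent.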
Quality elements are the basic units of the system, divided into three groups under a core/auxiliary/foundation framework:
Basic process elements (7): Core links that directly determine testing quality: on-site investigation (clarifying monitoring objectives), sampling point design (ensuring sample representativeness), monitoring plan (planning resources and processes), sample collection and management (avoiding sample contamination), analysis and testing (operating according to standards), evaluation and verification (judging data validity), and report feedback (delivering test results);
Auxiliary process elements (2): Support the effective execution of the basic processes: quality control (e.g., parallel-sample testing and recovery-rate tests to verify data accuracy) and opinion handling (e.g., responding to customer complaints and improving service);
Foundational elements (3): Underlying support for all processes: quality documents and records (the basis and traces of system operation, such as work instructions and inspection records), personnel (capability assurance, e.g., testers working with certificates), and statistical techniques (scientific methods for data processing, such as standard deviation calculation).
Chapter 2 Positioning and Lifecycle Management of the Quality Manual
2.1 Purpose and Basis
Purpose
There are three core objectives for establishing a quality manual:
1. Implement the quality policy: Transform the abstract policy of "scientific, impartial, accurate, and efficient" into executable operational requirements (for example, "accurate" corresponds to "test results must be double-checked by two people").
2. Standardize quality activities: Cover the entire process from "sample reception" to "report issuance" to avoid human errors (e.g., inspectors randomly changing testing methods).
3. Ensure data reliability: Through systematic management, ensure the fairness (free from external interference), accuracy (meeting standard requirements), and traceability (every step is recorded) of the test data, and ultimately enhance the laboratory's technical capabilities and market credibility.
Basis
The underlying logic of the system establishment stems from four key standards/criteria:
GB/T 15481—2000 "General Requirements for the Competence of Testing and Calibration Laboratories": The "entry criterion" for laboratory competence accreditation, which stipulates the core requirements for technology (such as equipment calibration) and management (such as document control).
"Evaluation Criteria for Measurement Certification/Examination and Approval (Acceptance) of Product Inspection Institutions": The basis for the assessment of measurement certification, focusing on "traceability of measurement values" (e.g., equipment needs to be verified by a legal institution) and "reliability of results".
ISO/IEC Guide 2, General Terms and Definitions for Standardization and Related Activities: Provides common terms in the field of standardization (such as "standard method" and "non-standard method") to ensure consistent expression in the system documents.
JJF 1001—1998 "General Metrological Terms and Their Definitions": It serves as the terminological basis in the metrological field (such as the differences between "calibration" and "verification") and supports the accurate understanding of the metrological traceability process.
2.2 Scope of application
The management rules of this manual cover three types of quality system documents:
Quality Manual: The "top-level program" of the system, which elaborates on the quality policy, objectives, and overall framework.
Procedure documents: Operation guides for specific processes (e.g., Sample Management Procedure, Document Control Procedure).
Quality documents: Supportive documents (such as work instructions, record forms, equipment archives).
Applicable scenarios include:
Internal management: Used for the implementation of the system (e.g., inspectors operate according to the manual), inspection (e.g., the quality assurance department conducts random checks on records), and audit (e.g., internal auditors verify the implementation of the system).
External interaction: Provide a basis for customers to evaluate the testing capabilities (e.g., customers refer to the manual to understand quality control requirements), and provide a basis for review institutions to assess the management system (e.g., the CNAS review team checks the compliance of the manual).
2.3 Daily management of the quality manual
Compilation, approval and release
Compilation: Led by the chief technical person in charge of the center, a compilation team is formed by technical backbones from the Quality Assurance Office, the Monitoring Business Office, and each sub-center. Based on the laboratory's business scope (e.g., surface water and groundwater detection) and management needs (e.g., new detection items), the content drafting is completed in line with the standard requirements.
Approval: After the compilation is completed, the manual must be approved by the center's top management (such as the director's office meeting) to confirm that the content complies with laws and regulations (such as the "Metrology Law"), standard requirements, and the actual situation of the laboratory.
Release: After approval, specify the effective date (e.g., "Effective from January 1, 2024"), convey it to all departments through channels such as the internal office system and paper documents, and record the issuance ledger.
Centralized management and controlled status
Responsible department: The Monitoring Business Office is the responsible department for the quality manual and is responsible for: maintaining the effectiveness of the manual version (promptly updating expired content, such as clause adjustments after a standard revision), answering questions from various departments (for example, when inspectors ask how to fill out records), and coordinating the revision and re-publication of the manual.
Controlled management: All manuals used for the operation of the system shall bear a "controlled" label (such as a red seal or electronic watermark), and the distribution scope is limited to the center's employees and approved external institutions (such as the review team); manuals without the "controlled" label shall be regarded as non-valid versions and shall not be used to guide work (for example, a tester who operates according to an old-version manual will be subject to assessment).
External distribution and requirements for holders
External distribution: It requires approval from the person in charge of the Quality Assurance Room. The recipient (e.g., customers, review agencies) and the purpose (e.g., customers' review of quality control requirements) should be clearly specified, and information such as "distribution date, recipient, purpose" should be recorded.
Obligations of the manual holder: The manual holder must meet the "Three Musts" - must learn and understand (grasp the core requirements of the manual and their own responsibilities. For example, inspectors need to remember that "equipment must be calibrated before inspection"); must strictly implement (carry out work in accordance with the provisions of the manual and shall not simplify the process at will); must properly keep (shall not make private copies or disclose it to outsiders. If it is lost, it must be reported within 24 hours). If the manual is revised, the holder shall actively replace the revised pages (for example, after page 5 is revised, the old page shall be replaced with the new one) to ensure the use of the latest version.
2.4 Revision and Reprint Rules
Revision trigger conditions: A revision shall be initiated when any of the following situations occurs:
1. Standard update (e.g., the edition change of GB/T 15481);
2. Institutional adjustments (such as department mergers and the establishment of new sub-centers);
3. Expansion of business scope (e.g., adding a new "soil testing" project);
4. System defects are found during internal audits/management reviews (e.g., "non - standard record filling" requires the revision of the "Record Management Requirements").
Revision requirements: Adopt the principle of "full-page replacement" — that is, the revised pages need to replace the old pages as a whole, and mark the "revision date (e.g., 2024-03-15)" and "revision version (e.g., V1.1)".
Reprint conditions: When the revised content exceeds 30% of the manual's total pages (for example, 3 new chapters are added), or when the architecture changes significantly (such as adjusting the chapter order), a reprint needs to be initiated. For a reprint, the "compilation - approval - release" process needs to be carried out again, and all old-version manuals need to be replaced (for example, the old manuals should be collected and destroyed uniformly).
2.5 Requirements for publicity, implementation and enforcement
Promotion and implementation plan: After the manual is approved and released, the Quality Assurance Office shall formulate a promotion and implementation plan, the content of which includes:
- Training participants: All employees (trained in batches by department. For example, testers will be trained on the "testing process" first, and administrators will be trained on "document control" first).
- Training content: Core requirements of the manual (e.g., quality policy, testing process), clauses related to employees' positions (e.g., equipment administrators need to master the "equipment calibration requirements").
- Training time: Complete within 1 month after the release (ensure that all employees understand the requirements before implementation).
New employee training: Before newly transferred employees start work, they must complete the study and assessment of the quality manual (such as a closed-book exam or on-site operation assessment). Those who fail to pass the assessment are not allowed to start work (for example, inspectors need to answer correctly about "methods of quality control" before they can start working).
Implementation supervision: The Quality Assurance Office needs to verify the implementation effect through "daily inspections (e.g., randomly checking whether the test records meet the requirements of the manual) and quarterly assessments (e.g., counting the 'number of times of violating the manual clauses')". If employees find any unreasonable parts in the manual during implementation (e.g., "the process is too cumbersome and affects efficiency"), they need to promptly report them to the Quality Assurance Office for subsequent revision.
2.6 Document numbering rules
The quality system documents adopt the numbering rule of "institution abbreviation - document type - chapter number - year". Examples:
- ××SHJ-CX-02-2002: procedure document (CX) of ×× Center (××SHJ), Chapter 2 (02), issued in 2002 (e.g., the "Document and Data Control and Maintenance Procedure").
- ××SHJ-ZY-02-2002: work instruction (ZY) of ×× Center, Chapter 2 (02), issued in 2002 (e.g., the "Rules for Compiling Procedure Documents").
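The segment rule above lends itself to a small parsing helper. This is only an illustrative sketch: the `parse_doc_number` function, the `XXSHJ` placeholder (standing in for the ×× abbreviation), and the restriction to the CX/ZY type codes are assumptions for the example, not part of the manual.

```python
import re

# Hypothetical pattern for the "institution abbreviation - document type -
# chapter number - year" rule, e.g. "XXSHJ-CX-02-2002".
# CX = procedure document, ZY = work instruction (the two types shown above).
DOC_NO = re.compile(r"^(?P<org>[A-Z]+)-(?P<type>CX|ZY)-(?P<chapter>\d{2})-(?P<year>\d{4})$")

def parse_doc_number(number: str) -> dict:
    """Split a document number into its four segments; raise on bad input."""
    m = DOC_NO.match(number)
    if m is None:
        raise ValueError(f"not a valid document number: {number!r}")
    return m.groupdict()

print(parse_doc_number("XXSHJ-CX-02-2002"))
# {'org': 'XXSHJ', 'type': 'CX', 'chapter': '02', 'year': '2002'}
```

A helper like this would let a document-control register reject malformed numbers at entry time instead of discovering them during an audit.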
2.7 Related Documents
- "Document and Data Control and Maintenance Procedure" (××SHJ-CX-02-2002): Standardize the compilation, approval, distribution, revision, and destruction of documents.
- "Rules for Compiling Procedure Documents" (××SHJ-ZY-02-2002): Specify the format of procedure documents (e.g., "Purpose - Scope - Responsibilities - Process") and content requirements (e.g., a "Referenced Standards" section shall be included).
Additional notes
- Compiler: ××× (Responsible for drafting and coordinating the content of the manual to ensure it conforms to the actual situation of the laboratory);
- Reviewer: ××× (responsible for reviewing the compliance of the manual, such as whether it meets the requirements of GB/T 15481).
Launch of quality activities
Quality Function Deployment (QFD) is a key tool for transforming abstract quality elements into specific execution actions. For example, starting from the core element of "accuracy of water environment monitoring data", it can be disassembled layer by layer into secondary activities such as "control of sample collection representativeness", "standardization of pre-treatment steps", "compliance of instrument calibration frequency", and "consistency of detection personnel operations". These can then be further refined into implementable tertiary tasks such as "the sampling point spacing does not exceed 500 meters", "the sample refrigeration temperature is controlled at 4℃±1℃", and "the gas chromatograph is calibrated once every 15 days". Through the QFD relationship matrix tables (such as Tables 5-2 and 5-3), the responsible department for each activity can be clearly mapped. For instance, "sample collection" is the responsibility of the sampling room, "instrument calibration" is coordinated by the equipment management room, and "operation consistency" is supervised by the analysis room. Thus, the boundaries of cross-departmental collaboration are clarified, and execution loopholes caused by ambiguous responsibilities are avoided.
Establishment of quality system
According to systems theory, the core structure of the quality system consists of four major elements: organizational structure, procedure documents, business processes, and resource support, and these four elements support each other.
Organizational structure: Build a hierarchical division of labor based on the principle of "equality of power and responsibility" - the director of the center is responsible for the overall system, the chief quality officer coordinates the operation of the system, the quality assurance office conducts specific supervision, and each business office (sampling, analysis, reporting) assumes the quality responsibility within its link (see Chapter 4 of the manual for details).
Procedure documents: They are the operation guides for standardizing processes, clearly stating what to do, who will do it, and how to do it (for example, the Testing Process Control Procedure stipulates the requirements for the entire steps of sampling → transportation → testing → review → reporting). Refer to Chapter 5 of the manual for the detailed compilation rules.
Business process: It is the logical main line running through the system, covering the entire chain from "customer entrustment" to "report issuance". It integrates organization, procedures, and resources into an implementable business process (for example, "recording the water temperature and pH value synchronously during sampling" is a specific action in the "sample representativeness" process).
Resource guarantee: It is the foundation for the operation of the system, including certified testing personnel, calibrated instruments and equipment, a laboratory environment meeting the requirements of GB/T 27025 (e.g., the cleanliness of the microbiology laboratory reaches Class 100,000), and currently effective testing methods (e.g., GB 3838-2002 "Environmental Quality Standards for Surface Water"). For specific requirements, please refer to Chapters 6-8 of the manual.
Quality system documents
Quality system documents are the collection of regulations for internal quality management. They are both the basis for implementation and the means of proving the effectiveness of the system to customers. They are divided into a four-layer structure:
1. Quality Manual
As a programmatic document, the quality manual clarifies the center's quality policy (e.g., "Accurate data, efficient service, and continuous improvement") and comprehensively covers five core control domains:
- Sample quality control (selection of sampling points, storage conditions, transportation specifications);
- Detection process control (method validation, parallel sample testing, data review);
- Instrument and equipment control (procurement acceptance, calibration and maintenance, scrapping process);
- Personnel quality control (pre-job training, annual assessment, qualification management);
- Facility environment control (laboratory temperature and humidity, ventilation system, safety protection).
It is the "fundamental rule" that the center has long adhered to, and also the core basis for third-party certification (such as CNAS laboratory accreditation).
2. Procedure documents
They are the "detailed operation version" of the quality manual, focusing on "how to execute". For example, the "Document and Data Control and Maintenance Procedure" stipulates the processes of document preparation, approval, distribution, revision, and cancellation to ensure there is "only one effective version"; the "Internal Quality System Audit Procedure" clarifies the requirements for planning, implementing, and following up internal audits. The list of procedure documents is detailed in Appendix A. They are the "action guide" for front-line personnel.
3. Other quality documents
These are supportive operation documents that provide step-by-step guidance for specific positions or equipment. For example, the "GC-MS Gas Chromatograph-Mass Spectrometer Operation Manual" details the steps of starting the instrument, sample injection, and data processing; the "Operation Instruction for Ammonia-Nitrogen Detection" specifies reagent preparation, digestion time, and colorimetric conditions. This type of document is the key to realizing the "last-mile" implementation of quality requirements.
4. Quality records
It is the "objective evidence" of quality activities, including sampling records, instrument calibration logs, test report review forms, customer complaint handling forms, etc. Records need to meet three requirements:
Authenticity and completeness: Key information must not be tampered with or omitted (for example, the sampling time should be accurate to the minute).
Easy to retrieve: Classify and file by project and time (for example, store the "COD detection records for the 1st quarter of 2024" in the corresponding folder).
Compliance with preservation requirements: Store in a fire- and moisture-proof archive room, and ensure no damage during the preservation period (for example, preserve original records for 5 years and reports for 10 years).
Operation of the quality system
1. Operation control
To ensure that testing activities comply with the requirements of the system, the center has established a comprehensive quality assurance system focusing on five key dimensions:
Personnel control: Testing personnel must be certified to work (e.g., holding the "Certification of Technical Assessment for Environmental Monitoring Personnel"). They should participate in at least 40 hours of professional training each year. Those who fail the assessment will be suspended from work.
Equipment control: The instruments shall be calibrated according to the procedures (e.g., the pH meter shall be calibrated once a month) and maintained (e.g., the liquid chromatography column shall be replaced every 3 months). The uncalibrated or faulty equipment shall be labeled with "Out of service".
Environmental control: The temperature, humidity, and cleanliness of the laboratory need to be monitored in real time (for example, the temperature of the atomic absorption room should be controlled at 20℃±2℃). When limits are exceeded, emergency measures should be initiated (such as turning on the air-conditioning and purification systems).
Method control: Current valid standards (e.g., GB 11914-89 "Water quality - Determination of chemical oxygen demand - Dichromate method") shall be adopted for the detection methods. If a standard is updated, method validation shall be completed and the procedure documents revised within 30 days.
Accident and complaint handling: Establish the "Quality Accident Handling Procedure" and the "Customer Complaint Response Procedure" — Upon receiving a customer complaint (e.g., "The data does not match the actual situation"), a response shall be provided within 24 hours, and a solution shall be given within 5 days.
The system operation process can be visually presented through the "Operation Control Block Diagram" in Appendix B, which facilitates the quick identification of bottlenecks (for example, "Sample transportation delay" needs to be traced back to the logistics link).
2. Document control
The compilation, approval, distribution, revision, and invalidation of all system documents shall follow the "Document and Data Control and Maintenance Procedure". For example, after a certain procedure document is revised, the old version needs to be retrieved and destroyed, and the new version needs to be stamped with the "Valid" seal and distributed to all relevant personnel to prevent the misuse of expired documents.
Internal quality system audit
Internal audit is the core means to verify the "compliance" (meeting the document requirements) and "effectiveness" (achieving the quality objectives) of the system operation. The process is as follows:
1. Audit planning
Main body: Led by the chief quality officer, formulate an annual audit plan (specifying the audit scope, frequency, and time), and implement it after obtaining approval from the director of the center.
Personnel: The internal audit team shall be composed of "certified internal auditors" - they must have passed the national or provincial training and assessment, and have no direct conflict of interest with the audited department (for example, they shall not audit the analysis room where they work).
2. On-site implementation
Adopt the method of combining "checking, observing and asking":
Check records: Verify whether the sampling records and instrument calibration logs comply with the program requirements.
Observe operations: Observe whether the testing personnel conduct parallel sample tests in accordance with the work instructions.
Inquire about the situation: Interview the person in charge of the sampling room to find out whether the sample preservation measures have been implemented effectively.
After the audit is completed, prepare the "Internal Audit Report" to clarify the conformity items (e.g., "the equipment calibration rate is 100%") and the non-conformity items (e.g., "a tester failed to fill in the parallel sample records").
3. Closed-loop of non-conforming items
For the "non-conformities", the responsible department needs to complete three things within 3 days:
1. Cause analysis: For example, "Did not participate in the latest record - filling training";
2. Measure formulation: For example, "Complete the training and pass the assessment within 2 days, and only those who pass the assessment can start working."
3. Approval and implementation: The measures shall be implemented after being signed by the review team leader and approved by the chief quality officer.
The Quality Assurance Office is responsible for follow-up verification. If the measures are effective (e.g., "this person's subsequent records are complete"), the loop is closed. If they are ineffective (e.g., "there are still omissions"), the causes need to be re-analyzed and the measures adjusted.
4. Handling of special situations
If "suspicious test results" are found during the review (for example, the COD value of a certain water sample is 200 mg/L, far exceeding the standard limit without a reasonable explanation), three emergency measures shall be initiated immediately:
1. Stop the relevant testing activities to prevent the errors from expanding.
2. Notify the affected customers (e.g., an enterprise) and relevant departments (e.g., the Department of Ecology and Environment) in writing and explain the situation.
3. Retest the samples, calibrate the instruments, confirm the root cause of the problem (e.g., "reagent expired"), and take corrective actions (e.g., "replace the reagent and retest").
5. Data management and result application
Data archiving: The Quality Assurance Office collects audit plans, reports, and corrective action records, and archives them after approval by the chief quality officer. The retention period is 5 years.
System improvement: For "effective corrective actions" (such as "increase the frequency of parallel sample tests to 2 times"), they need to be incorporated into the system documents (such as revising the "Inspection Process Control Procedure").
Input for management review: The internal audit results will serve as an important basis for the "management review" and provide directions for system optimization (for example, "The internal audit finds that personnel training is insufficient, so the management review needs to increase the training budget").
Management review of quality system
Management review is the "strategic review" of the quality system by the top management of the center. The purpose is to evaluate whether the system is "adaptable to changes" (such as regulatory updates and upgrades in customer requirements) and "operating effectively" (such as the achievement rate of quality objectives).
1. Review planning
Frequency: At least once a year. If a major quality accident occurs (e.g., incorrect test data leading to decision-making errors) or the external environment changes (e.g., the revision of the Environmental Quality Standards for Surface Water), the review frequency needs to be increased.
Organization: The review is chaired by the director of the center. The members of the review team include the deputy director, the chief technical officer, the chief quality officer, and the directors of each department, ensuring coverage of the key nodes of the system.
2. Review process
1. Data collection: The Quality Assurance Office shall collect the following data 10 days in advance:
- Results of internal audits (e.g., the number of non-conformities, the completion rate of corrective actions);
- Customer feedback (such as complaint rate, satisfaction survey results);
- Statistical analysis of test data (e.g., passing rate of COD tests, deviation rate of parallel samples);
- Resource changes (such as the addition of ICP-MS instruments and personnel changes).
2. Meeting discussion: The review team discusses topics such as "whether the quality policy is appropriate", "whether the system operation is effective", and "whether the resources are sufficient", and forms improvement resolutions (e.g., "add temperature and humidity monitoring points in the laboratory", "adjust the training plan").
3. Measure implementation: Each department shall formulate specific plans according to the resolutions (e.g., "Install 5 temperature and humidity recorders within 1 month"), and implement them after being approved by the director of the center.
4. Tracking verification: The Quality Assurance Office verifies the effectiveness (e.g., "checking whether the recorder data meets the requirements") within 15 days after the measures are implemented to ensure the improvement is put into practice.
3. Data management
After being approved by the director of the center, the review materials (meeting minutes, records of improvement measures, and reports on resource changes) shall be filed and kept by the Quality Assurance Office for a period of 10 years.
Exceptional permission for deviation of testing work from procedures and standards
1. Basic principles
Testing activities must strictly follow the system documents and standard specifications and shall not deviate randomly. This is the bottom line for ensuring the accuracy of data. If a deviation is really necessary, it must meet three conditions: "necessary, reasonable, and controllable".
2. Permissible situations
Exceptions can only be applied for in the following four situations:
1. Incomplete standards: The current standard methods have defects (for example, the detection method for a certain pollutant does not cover complex matrices), and a more advanced replacement method is available (for example, using liquid chromatography-mass spectrometry to replace traditional spectrophotometry).
2. Unreasonable method: The test steps specified in the standard obviously do not conform to theory (for example, the end-point determination of a certain titration method relies on subjective observation, which easily leads to errors), and the adjustment has theoretical support (for example, using an automatic titrator instead).
3. Force majeure factors: Occurrence of unforeseeable and insurmountable events (such as a power outage in the laboratory, making it impossible to complete the digestion steps as per the procedure).
4. Special requirements of the customer: The customer puts forward reasonable special requirements (e.g., "the report needs to be issued within 24 hours") and gives written consent to deviate from the original procedure (e.g., "simplify the pre-treatment steps but increase the frequency of parallel sample testing").
3. Deviation handling process
If a deviation is indeed necessary, the following steps should be followed:
1. Application and approval: The testing personnel shall fill out the "Application Form for Deviation Permission", stating the reasons for deviation and alternative solutions. After being reviewed by the chief technical officer and approved by the director of the center, the deviation shall be implemented (a written agreement shall be attached if there are special requirements from the customer).
2. Immediate control: If a deviation has occurred (e.g., the instrument was not calibrated according to the procedure), the testing shall be stopped immediately. Report to the person in charge of the analysis laboratory and label the affected samples "Pending re-inspection" to prevent the outflow of incorrect data.
3. Investigation and correction: Analyze the reasons for the deviation (e.g., "Omission of instrument calibration records"), take corrective measures (e.g., "Complete the calibration and supplement the records within 3 days"), and ensure that the problems have been resolved before resuming the testing.
4. Notification and recording: If the deviation affects the test results (e.g., the data deviation exceeds 5%), the customer and relevant departments (e.g., the Environmental Protection Department) shall be notified in writing within 24 hours; all deviation situations (reasons, measures, results) shall be recorded in detail in the "Deviation Situation Registration Book" and included in the agenda of the next internal audit to prevent recurrence.
4. Handling of major deviations
In case of a "major quality accident" (such as incorrect test data leading to wrong decisions by the customer), in addition to implementing the above measures, the "Accidental Situation Control Procedure" shall be initiated:
1. Establish an accident investigation team (led by the director of the center, with the participation of the Quality Assurance Office and the Analysis Office).
2. Complete the investigation into the cause of the accident within 3 days and form the "Accident Investigation Report".
3. Propose a rectification plan (such as "replacing testing personnel and revising procedural documents") within 5 days, and implement it after being approved by the director of the center.
4. Submit the "Rectification Report" to customers and relevant departments to eliminate negative impacts.
Verification and comparison experiments
To continuously prove the "reliability" and "stability" of the testing ability, the center evaluates the effectiveness of the method and the consistency of personnel operations through verification and comparison experiments. The specific methods are as follows:
1. External proficiency testing
Participate in proficiency testing activities organized by national or provincial authoritative institutions (such as the "Proficiency Testing for Heavy Metal Detection in Water Environment" organized by the China National Environmental Monitoring Centre). Evaluate the testing level of our center through data comparison with over 100 laboratories across the country. If the result is "satisfactory", it indicates that the capabilities meet the requirements; if it is "unsatisfactory", the reasons (such as "errors in standard solution preparation") need to be analyzed, improvement measures (such as "training personnel and replacing reference materials") should be taken, and the proficiency testing should be retaken.
2. Internal quality control
Conduct internal comparisons in accordance with the "Laboratory Comparison and Proficiency Testing Procedure":
Inter-laboratory comparison: The organization's sub-centers (such as a monitoring station in a certain city) test the same sample (such as a water sample numbered "20240501"), and the data consistency is compared. If the deviation of a sub-center's COD test value exceeds 10%, targeted training should be provided for its personnel.
Retained-sample retesting: Re-test retained samples (for example, tested water samples stored for 3 months according to the procedure) to verify the repeatability of the data. If the deviation between the two test results exceeds 5%, it is necessary to check the instrument status (such as whether it is calibrated) and personnel operations (such as whether the sample is injected according to the procedure).
3. Application of mathematical statistics
Use statistical techniques such as standard deviation, relative deviation, and Z-score to analyze the comparison results, identify abnormal data (for example, a tester's parallel sample deviation rate is 15%, far exceeding the 5% standard), and take improvement measures (such as "increase the frequency of operational assessments for this person").
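As a sketch of how the Z-score screening mentioned above might be computed (the data, the median-based assigned value, and the |Z| ≤ 2 / ≥ 3 cut-offs follow common proficiency-testing conventions and are illustrative, not taken from this manual):

```python
import statistics

def z_score(value: float, assigned: float, sigma: float) -> float:
    """Z = (x - X) / sigma: standardized deviation from the assigned value."""
    return (value - assigned) / sigma

# Invented comparison results for one COD sample (mg/L); lab 6 is an outlier.
results = [48.2, 50.1, 49.5, 51.0, 47.8, 62.3]
assigned = statistics.median(results)  # median as a robust assigned value
sigma = statistics.stdev(results)

for lab, x in enumerate(results, start=1):
    z = z_score(x, assigned, sigma)
    verdict = "satisfactory" if abs(z) <= 2 else ("questionable" if abs(z) < 3 else "unsatisfactory")
    print(f"lab {lab}: x = {x:5.1f}  Z = {z:+.2f}  ({verdict})")
```

Note that the outlier inflates the ordinary standard deviation, which is why real proficiency schemes often use robust scale estimators instead; the sketch keeps `stdev` for simplicity.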
Through the above methods, the center can continuously verify that "the test data is accurate and reliable", enhance the trust of customers (such as environmental protection departments and enterprises) in the results, and provide data support for system improvement at the same time.
Implementation requirements for verification and comparison
Operating specifications for repeated inspection
Repeated testing is the core means of reducing random errors through multiple determinations and verifying the reliability of results. It is divided into two categories: repeated testing using the same method and cross-validation using different methods.
Repetition of the same method: Use the same detection method, the same equipment, and the same operator to detect the same sample 2-3 times consecutively under the same environmental conditions (e.g., temperature 25±2℃, humidity 50±10%). For example, when detecting COD in water quality, conduct 3 repeated measurements using the potassium dichromate method; the relative standard deviation (RSD) of the results should be ≤5%. This step verifies the consistency of the operation process and identifies random errors caused by the operator's technique (e.g., deviation in the volume measured by a pipette) and equipment fluctuations (e.g., absorbance drift of a spectrophotometer).
Cross-validation with different methods: Use detection methods with different principles to test the same sample (for example, measure ammonia-nitrogen simultaneously by the Nessler's reagent method and the salicylic acid method). If the difference between the results of the two methods exceeds the allowable range (for example, ±10%), the cause must be investigated immediately: Is it sample matrix interference (such as the influence of organic matter in sewage on the Nessler's reagent method), equipment calibration deviation (such as an uncalibrated spectrophotometer wavelength), or operator error (such as an incorrect reagent addition amount)? Cross-validation can effectively identify the systematic errors of a single method and ensure the accuracy of the results.
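The two acceptance checks just described (RSD ≤ 5% for same-method repeats, at most ±10% relative difference between two methods) might be coded as follows; the function names, the use of the two results' mean as the reference for the relative difference, and the sample values are illustrative assumptions:

```python
import statistics

def rsd_percent(values: list[float]) -> float:
    """Relative standard deviation (%) of a series of repeated measurements."""
    return statistics.stdev(values) / statistics.mean(values) * 100

def methods_agree(a: float, b: float, limit_pct: float = 10.0) -> bool:
    """True if two methods' results differ by at most limit_pct of their mean."""
    return abs(a - b) / ((a + b) / 2) * 100 <= limit_pct

# Three repeated COD determinations by the dichromate method (mg/L, invented)
repeats = [49.6, 50.4, 50.0]
print(f"RSD = {rsd_percent(repeats):.2f}%  (acceptance limit: 5%)")

# Ammonia-nitrogen by Nessler's reagent vs. salicylic acid method (mg/L, invented)
print("methods agree:", methods_agree(1.52, 1.60))
```

Encoding the limits as defaults keeps the acceptance criteria in one place, so a change to the procedure document changes one constant rather than scattered comparisons.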
Periodic verification of reference materials
Reference materials (standard samples) have accurately known characteristic values that are traceable to national or international standards. Their core function is to calibrate measurement systems and verify the accuracy of test results. Regular verification shall follow these rules:
Frequency setting: Determine by the stability of the detection item and the status of the equipment. High-frequency items (such as COD and ammonia nitrogen) should be verified once a month, and low-frequency items (such as heavy metals and volatile organic compounds) once a quarter. After new equipment is put into use or equipment is repaired, a standard-sample verification is required immediately (for example, after replacing a spectrophotometer's light source, use a standard sample to verify its absorbance accuracy).
Operation requirements: Organized by the head of the testing department, performed by testing personnel according to the standard method, and supervised by the Quality Assurance Office. During verification, record the standard sample's serial number, validity period, certified value, measured value, and relative deviation (for example, if the certified COD value is 50 mg/L and the measured value is 48 mg/L, the relative deviation is -4%). If the deviation exceeds the allowable range (such as ±5%), suspend testing for that item and investigate the cause (such as reagent failure or spectrophotometer wavelength deviation). After rectification, re-verify until the deviation meets the requirement.
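The pass/fail judgment described above — relative deviation of the measured value against the certified value, checked against a limit — can be sketched as follows. The ±5% limit and the 50/48 mg/L figures are the examples from the text, not universal criteria:

```python
def verify_reference_sample(certified, measured, limit_percent=5.0):
    """Relative deviation (%) from the certified value, and whether it passes."""
    deviation = (measured - certified) / certified * 100
    return round(deviation, 2), abs(deviation) <= limit_percent

# The text's example: certified COD value 50 mg/L, measured value 48 mg/L
dev, ok = verify_reference_sample(50.0, 48.0)
print(dev, ok)  # -4.0 True → within ±5%, testing may continue
```

A failing check (e.g., a measured value of 47 mg/L, deviation -6%) would trigger suspension of the item and investigation, as the procedure requires.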
Traceability requirements: Standard samples shall be purchased from qualified institutions (such as the National Institute of Metrology of China and the National Environmental Monitoring Center). Purchase records and certificates shall be retained (the certificates shall include the characteristic values, uncertainties, and traceability paths of the standard samples). Standard samples without certificates or with expired certificates shall not be used to avoid deviations in test results caused by inaccurate standard samples themselves.
Correlation analysis of sample characteristics
The characteristics of samples (such as matrix, concentration, and stability) directly affect the reliability of test results. It is necessary to determine the correlation between the test method and the results based on the characteristics:
Matrix correlation: Samples with different matrices (such as sewage, drinking water, and seawater) interfere with the detection method differently. For example, sewage contains large amounts of organic matter that react with mercuric iodide in Nessler's reagent method, biasing the ammonia nitrogen result high, so distillation pretreatment is required first; drinking water, by contrast, has a relatively clean matrix and can be tested directly. The correlation between pretreatment steps and result accuracy must be established experimentally (for example, the deviation between the ammonia nitrogen result after distillation and the certified value should be ≤5%) to ensure the method suits the sample matrix.
Concentration correlation: The sample concentration must fall within the linear range of the detection method. For example, if the linear range of the COD method is 10-1000 mg/L and a sample's COD is 1500 mg/L, it must be diluted 5-fold (to 300 mg/L) before testing; otherwise the result will deviate from linearity (for example, the absorbance exceeds the instrument's linear range, biasing the result low). The correlation between the dilution factor and result accuracy must be verified (for example, the deviation between the result after 5-fold dilution and the certified value should be ≤5%) to ensure the diluted result still reflects the sample's true concentration.
Stability correlation: The stability of samples (e.g., being volatile or easily degradable) affects the detection time. For example, BOD samples need to be tested within 24 hours after collection; otherwise, the decomposition of organic matter by microorganisms will lead to lower results. On the other hand, heavy metal samples (such as lead and cadmium) have better stability and can be stored in a refrigerator at 4°C for 7 days. It is necessary to determine the correlation between "the time elapsed after sampling" and "the detection results" (e.g., for BOD samples, the result deviation is ≤10% after being placed for 24 hours), and specify the storage conditions and detection time limits for the samples.
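The concentration and stability checks above lend themselves to simple guard functions. The linear range and holding times below are the example figures from the text, used here as assumptions:

```python
COD_LINEAR_RANGE = (10.0, 1000.0)  # mg/L, example linear range from the text
HOLDING_TIME_HOURS = {"BOD": 24, "Pb": 7 * 24, "Cd": 7 * 24}  # example limits

def diluted_in_range(concentration, factor, rng=COD_LINEAR_RANGE):
    """Concentration after dilution, and whether it falls inside the linear range."""
    diluted = concentration / factor
    return diluted, rng[0] <= diluted <= rng[1]

def within_holding_time(parameter, hours_since_sampling):
    """True if the sample is still within its holding time for this parameter."""
    return hours_since_sampling <= HOLDING_TIME_HOURS[parameter]

# A 1500 mg/L COD sample diluted 5-fold lands at 300 mg/L, inside the range
print(diluted_in_range(1500.0, 5))   # (300.0, True)

# A BOD sample held for 30 h has exceeded its 24 h limit
print(within_holding_time("BOD", 30))  # False
```

Real laboratory software would read these limits from the applicable standard methods rather than embed them as constants.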
Content requirements for verification and comparison test plans
The verification and comparison test plan is the "roadmap" to ensure the orderly implementation of activities. The following core contents need to be clarified:
Selection of test items
Priority should be given to key detection items (such as COD, ammonia nitrogen, and total phosphorus required by environmental protection departments), items prone to fluctuations (such as volatile organic compounds VOCs and total nitrogen), or items of concern to customers (such as residual chlorine and turbidity in drinking water). The selection criteria include:
- Detection frequency of the item: For high-frequency items (e.g., COD, tested 100 times per month), operator fatigue and drift are likely, so frequent verification is required.
- Complexity of the method: Instrumental analysis methods such as gas chromatography and liquid chromatography are easily affected by the equipment status (e.g., the column efficiency of the chromatographic column), and key comparisons are required.
- Regulatory requirements: If a project belongs to the "National Proficiency Testing Program" (such as the COD proficiency testing organized by the National Environmental Monitoring Center), it shall be included in the annual comparison plan to ensure that the results meet national requirements.
Determination of participating laboratories
External laboratories: Choose laboratories with CMA qualification (metrological certification) and testing capabilities matching our center's (such as the municipal water quality monitoring station or third-party testing institutions in the same region). For example, if our center determines COD by the potassium dichromate method, the external laboratory should use the same method, so that method differences do not invalidate the comparison results.
Internal laboratory: Specify the departments in this center participating in the comparison (for example, the First Testing Room is responsible for COD, and the Second Testing Room is responsible for ammonia nitrogen) to ensure the comparability of results among internal departments. For example, when the First Testing Room and the Second Testing Room simultaneously test the COD of the same sample, the result deviation should be ≤5% to verify the consistency of internal operations (such as whether digestion and titration are carried out according to standard methods).
Planning of time arrangement
Timing must match the testing cycle and regulatory requirements:
- Quarterly comparison: For high-frequency items (such as COD and ammonia nitrogen), conduct it once at the end of each quarter (e.g., March 25th, June 25th, September 25th, December 25th) to ensure timely detection of operational deviations within the quarter (e.g., whether the results fluctuate after the replacement of testing personnel in the second quarter).
- Annual verification: For low-frequency items (e.g., heavy metals), it shall be conducted once a year (e.g., in November) in combination with the annual internal audit to verify the stability of the whole-year test results.
- Special time: After a new device is put into use, a new employee starts work, or the testing standards are updated, a comparison should be arranged immediately. For example, the newly purchased atomic absorption spectrophotometer needs to be compared with the old device (by testing the lead concentration of the same sample) to verify its accuracy. After the new employee Zhang San starts work, he needs to be compared with the senior tester Li Si (by testing the ammonia nitrogen of the same sample) to verify his operation ability.
Regulations for participants
It is necessary to clarify the operators and supervisors.
- Operator: Key position personnel (such as the testing team leader, senior testers with more than 5 years of work experience) or new employees (to verify their operating ability) shall be preferentially selected; the same comparison project shall be operated by fixed personnel (for example, the COD comparison shall be completed by Zhang San from Testing Room 1) to avoid result fluctuations caused by personnel changes (for example, Li Si's titration speed is faster than Zhang San's, resulting in a higher COD result).
- Supervisor: Staffed by Quality Assurance Office personnel, responsible for supervising the standardization of the comparison process (e.g., whether operations follow standard methods, reagents are within their validity period, and records are complete). For example, the supervisor checks that testing personnel strictly control the temperature (170-180°C) and time (2 hours) when digesting COD samples.
Explanation of the functions of relevant documents
Verification, comparison, and the preparation of procedure documents need to be associated with the following core documents. The functions of each document are as follows:
××SHJ-CX-02 Document and Data Control and Maintenance Procedure: Standardizes the whole process of compiling, distributing, revising, and abolishing procedure documents. For example, after a procedure document is revised (say, when testing standard GB 11914-89 is updated to GB 11914-2023), this procedure governs recalling the old version and distributing the new one, ensuring all testing personnel use the latest "COD Testing Procedure Document".
××SHJ-CX-03 Quality System Audit Procedure: Specifies the internal audit process (audit plans, auditors, rectification of non-conformities). Verification and comparison results shall be included in internal audits (for example, an out-of-range standard-sample verification deviation is treated as a non-conformity) so as to identify loopholes in the system (such as whether a result deviation stems from poor management of standard samples).
××SHJ-CX-04 Quality System Management Review Procedure: Periodically reviews the suitability of the quality system. For example, if comparison results show that the pass rate of COD standard-sample verification has dropped from 95% to 80%, the management review may adjust the verification frequency (from once to twice a month) or improve the operating process (such as strengthening the training of testing personnel).
××SHJ-CX-06 Laboratory Comparison and Proficiency Testing Procedure: Specifies the operating steps of comparison tests (sample preparation, testing, result statistics). It is the "operating guide" for verification and comparison (for example, it stipulates that comparison samples be blind-numbered so that testers cannot adjust results because they know the sample concentration).
××SHJ-CX-07 Corrective and Preventive Measures Procedure: For problems found in verification and comparison (such as an out-of-range standard-sample deviation), formulate corrective measures (such as replacing failed potassium dichromate reagent and calibrating the spectrophotometer) and perform follow-up verification to confirm the problem is fully resolved (such as a standard-sample deviation ≤5% after rectification).
××SHJ-CX-11 Emergency Situation Control Procedure: Handles unexpected situations during comparison (such as sample damage or equipment failure). For example, if a comparison sample leaks in transit, resample and record the event (e.g., "On March 25, 2024, the COD comparison sample leaked; a new sample numbered S240325 was collected.") to keep the comparison results valid.
Writing specifications for procedure documents
The core role of procedure documents
Procedure documents are the "operation manuals" of the quality system: they turn the abstract quality policies in the quality manual (such as "accurate, fair, and timely") into concrete executable steps. For example, where the quality manual stipulates "ensure the accuracy of test results", the "Laboratory Comparison and Proficiency Testing Procedure" specifies that results be verified quarterly with reference materials, with the verification organized by the head of the testing department, carried out by the testing personnel, and supervised by the Quality Assurance Office, and the results recorded in the "Reference Sample Verification Record Form". "Accuracy" is then no longer a slogan: it has a concrete operating process and division of responsibilities, so every employee knows how to act to achieve the quality goals.
Requirements for the content and format of procedure documents
Procedure documents shall follow the principles of structure and standardization; the content includes the following parts:
1. Cover
The cover is the "identity marker" of a document and should include:
- Laboratory name and logo (e.g., "Water Environment Monitoring Center of ×× City");
- Document name (e.g., "Quality System Management Review Procedure");
- Document number (e.g., ××SHJ-CX-04-2024);
- Compilers, approvers and dates (e.g., "Compiler: Li Si, 2024-03-01; Approver: Zhang San, 2024-03-10");
- Effective date and version number (e.g., "Effective date: 2024-04-01; Version: V1.0");
- Controlled status (e.g., Controlled/Uncontrolled), security level (e.g., For internal use), distribution registration number (e.g., F-001).
The core function of the cover information is to quickly identify the identity of the document and avoid confusion between different documents (e.g., the "Management Review Procedure" in the old version V0.9 and the new version V1.0).
2. Table of Contents
The table of contents should list the document's main chapters and page numbers (e.g., "1 Purpose 3; 2 Scope of Application 4; 3 Responsibilities 5") for quick lookup. For long procedure documents in particular (e.g., the "Laboratory Comparison and Proficiency Testing Procedure" runs 15 pages), a table of contents improves reading efficiency.
3. Page header
The page header should appear at the top of every page of the document and include:
- Laboratory name and logo;
- Document number and name (e.g., ××SHJ-CX-04 Quality System Management Review Procedure);
- Effective date and version number;
- Page number (e.g., Page 3 of 15).
The page header safeguards document integrity: if pages become separated, the header identifies the document they belong to (for example, "××SHJ-CX-04" is the management review procedure), preventing loss or confusion.
4. Main body content
The main body is the core of the procedural document and shall contain the following elements:
(1) Purpose
Clearly state the core goal of the procedure in concise language. For example, the purpose of the "Quality System Management Review Procedure" could read: "To regularly evaluate the continuing suitability and effectiveness of the quality system and drive its improvement."
(2) Scope of application
Define the boundaries of the procedure and clarify "who uses it and where it is used". For example:
It applies to the implementation of all quality management system review activities in this center, including annual reviews and reviews under special circumstances (such as a sharp increase in customer complaints and organizational structure adjustments).
(3) Responsibilities
Clarify "who will do it", "what to do", and "what responsibilities to assume" to avoid shifting blame. For example, the responsibilities in the "Quality System Management Review Procedure":
- Director of the center: Preside over the review meeting, approve the "Management Review Report", and be responsible for the review conclusion (e.g., decide whether to revise the quality system).
- Quality manager: Develop review plans, coordinate with various departments to provide materials (such as the "Statistics of COD test results" from the testing department and the "Customer complaint records" from the business department), and track the implementation of improvement measures.
- Quality Assurance Room: Organize the implementation of reviews (e.g., notify participants, summarize materials), record the meeting content, and track and verify improvement measures (e.g., check whether the testing department has revised the "COD Testing Operation Instruction").
- All departments: Submit the review materials of your own department and implement improvement measures (for example, the testing department revises the "Operation Instruction for Ammonia Nitrogen Detection").
(4) Work procedures
Describe the operation details in logical order, clearly specifying "input, output, process, requirements":
- Input: Materials required for carrying out activities (e.g., for management review, internal audit reports, customer complaint records, and standard sample verification results need to be input).
- Output: Results generated by the activity (e.g., management review report, improvement measure form);
- Process: Describe in the order of "Plan → Prepare → Implement → Summarize → Improve" —— For example, the process of management review:
1. Plan formulation: The center director sets the review date (e.g., December 20, 2024). The quality manager draws up the "Management Review Plan", covering the review content (e.g., suitability of the quality policy, testing quality), participants (the center director, department heads, Quality Assurance Office staff), and schedule (9:00 a.m. to 12:00 noon).
2. Data preparation: Each department shall submit data as planned (for example, the First Testing Room shall submit "Statistics of COD Testing Results in 2024", and the Business Section shall submit "Records of Customer Complaints in 2024"). The Quality Assurance Room shall compile them into the "List of Management Review Data".
3. Meeting implementation: The director of the center presides over the meeting. Each department reports the situation of its own department (for example, the first testing room reports that "the accuracy rate of COD test results is 98%, which fails to reach the target of 99%"). The participants discuss the issues (for example, "the reason why the COD accuracy rate fails to reach the target: the frequency of standard sample verification is insufficient"). The director of the center puts forward the conclusion (for example, "increase the frequency of COD standard sample verification to twice a month").
4. Report compilation: The technical leader shall compile the "Management Review Report" within 5 working days after the meeting. It covers the review overview (date, participants), problems found (e.g., "insufficient verification frequency of COD standard samples"), and improvement suggestions (e.g., "revise the 'Laboratory Comparison and Proficiency Testing Procedure' to increase the COD standard-sample verification frequency").
5. Improvement implementation: Each department shall formulate the "Management Review Improvement Measures Form" as required by the report, clearly specifying the improvement content (e.g., the testing section revises the "COD Testing Operation Instruction"), the responsible person (Wang Wu, the director of Testing Room 1), and the completion time (January 10, 2025). The Quality Assurance Office shall conduct verification within 10 working days after the improvement is completed (e.g., check whether the revised operation instruction complies with the standard GB 11914 - 2023).
- Requirements: Specific regulations regarding personnel, equipment, environment, and records (e.g., "The review meeting must have complete records, which should include the participants, discussion content, and conclusions." "The improvement measures must be completed within the specified time. If not completed, the reasons must be stated.").
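Deadlines in the procedure above are expressed in working days ("within 5 working days after the meeting", "within 10 working days after the improvement is completed"). A minimal sketch of such a deadline calculation, ignoring public holidays, might look like this:

```python
from datetime import date, timedelta

def add_working_days(start, n):
    """Date n working days (Mon-Fri) after start; public holidays are ignored."""
    current = start
    while n > 0:
        current += timedelta(days=1)
        if current.weekday() < 5:  # 0-4 = Monday-Friday
            n -= 1
    return current

# Review meeting on 2024-12-20 (a Friday): the report is due 5 working days later
print(add_working_days(date(2024, 12, 20), 5))  # 2024-12-27
```

A production system would also consult a holiday calendar before committing a deadline.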
(5) Relevant documents
Cite other documents related to this procedure to avoid duplicating content. For example, the "Specification for Writing Procedure Documents" cites:
The numbering of program documents shall be carried out in accordance with Article 3.1 of the "Management Method for Quality System Documents" (××SHJ - CX - 02).
(6) Quality records
List the record forms used in the procedure to ensure the traceability of the results. For example, the records of the management review procedure:
- "Management Review Plan"
- "List of Management Review Materials"
- "Minutes of Management Review Meeting"
- "Management Review Report"
- "Management Review Improvement Measures Form" (including improvement content, responsible person, completion time, verification results).
(7) Appendix
It includes blank form templates or supplementary instructions for easy operation. For example, the appendix of "Specification for Writing Procedure Documents":
- Appendix A: Cover template for procedure documents (name, number, version, etc.);
- Appendix B: Inside-cover template for procedure documents (compilation and approval information).
5. End matter
Supplement the compilation/revision instructions of the document when necessary—for example, "This document was revised in March 2024. The revision contents are 'Add the comparison requirements for new employees' and 'Adjust the verification frequency of reference samples'", which facilitates the subsequent tracing of the document change history.
Example illustration of procedure document writing
Example I: Specification for Procedure Document Writing
II. Management of Changes to Quality System Documents
When quality system documents need to be adjusted, the Quality Assurance Office, as the department in charge of system management, takes the lead in confirming that the change is justified. Whether the trigger is a standard update, process optimization, or a document defect found in operation, the source of the requirement is verified first (for example, testing standard GB/T 18204.2-2014 is revised to the 2023 version, or the laboratory reports that a vaguely worded operating instruction has caused operational deviations). The Quality Assurance Office then brings in the original compiling department, which is most familiar with the background, logic, and applicable scenarios of the document and can keep the changed content consistent with the original. The whole process strictly follows the "Document and Data Control and Maintenance Procedure":
- Change application: state the reason, scope, and expected effect of the change;
- Review: the Quality Assurance Office, the original compiling department, and the affected using departments jointly evaluate the change's impact on the system;
- Approval: the quality manager signs off;
- Release and replacement: update the distribution list for the new document (covering all using departments such as the laboratory, customer service department, and administrative department), and recall and invalidate old versions to prevent mixed use of old and new documents.
III. Verification and reporting of improvement measures for management review
The improvement measures from the management review output need to be implemented effectively, and the Quality Assurance Office undertakes the core responsibility of "effect verification". The verification methods should be in line with the types of measures.
- For process optimization measures (such as shortening sample transfer time), compare the elapsed times before and after the improvement recorded in the "Sample Transfer Record".
- For quality improvement measures (such as reducing the inspection non-compliance rate), calculate the proportion of non-compliant batches within 3 months after the improvement.
- For personnel capacity-related measures (such as new training), it is necessary to check the training sign-in sheet, assessment results, and the improvement of skills in actual operations.
After verification, the Quality Assurance Office fills out the "Management Review Improvement Measures Form", specifying: the implementation details of the measures (e.g., "the laboratory added 2 automatic sample pretreatment devices"), the responsible department (the laboratory), the implementation period (March 1-15, 2024), the verification results (e.g., "sample transfer time shortened from 48 hours to 24 hours, meeting the target"), and, where applicable, the reasons a target was not met (e.g., "the measures were not fully implemented because equipment procurement was delayed").
Finally, the Quality Assurance Office needs to report the verification results to the director of the center through the technical leader. As the technical person in charge of the quality system, the technical leader needs to first confirm the technical rationality of the improvement measures (e.g., "Does the newly added equipment meet the requirements of the testing standards?"), and then present to the director the "implementation status of the measures", "effect data", and "suggestions for the next step" (adjust the measures if the standards are not met), so as to ensure that the director fully understands the actual effectiveness of the system improvement.
V. Explanation of relevant documents
The relevant documents are the "underlying rules" for the operation of the quality system, and their association with management activities needs to be clarified.
"Internal Quality System Audit Procedure" (××SHJ-CX-03-2002): the "input source" for management review. Non-conformities and improvement suggestions found in internal audits are brought into the management review discussion to drive system optimization.
"Document and Data Control and Maintenance Procedure" (××SHJ-CX-02-2002): the "operating guide" for document changes. Its requirements govern the entire process from the change request to document release, ensuring the effectiveness and consistency of the documents.
VI. Quality Record Management
1. The core role of quality records
Quality records are the "written traces" of quality activities, and their value is reflected in two aspects:
Objective tracing: It can reproduce the "whole process" of quality activities. For example, a certain "Inspection Record" needs to include: sample number, inspection time, inspectors, instruments used (model + calibration status), inspection methods (standard number), environmental conditions (temperature/humidity), original data, calculation process and result judgment. If the customer has objections to the results later, the operation scenario at that time can be traced back through the records to verify the accuracy of the results.
Prevention and improvement: It serves as the basis for "root cause analysis". For example, if the test results of three consecutive batches of samples are unqualified, it can be found from the records that "the same uncalibrated spectrophotometer was used in all cases". Therefore, preventive measures such as "regularly calibrating the equipment" can be taken to avoid the recurrence of similar problems.
2. General requirements for quality records
Quality records shall satisfy three requirements: easy to manage, easy to operate, and complete in information.
Easy to manage: Records need to be classified and filed according to "department + process" (e.g., "testing records" in the laboratory and "training records" in the administrative department), and a unified numbering rule should be established (e.g., "JL-Laboratory-2024-001", where "JL" represents records, "2024" is the year, and "001" is the serial number) to facilitate quick retrieval.
Easy to operate: The form design should be "concise and clear" - mark the required fields with asterisks and avoid redundant fields (for example, there is no need to fill in the "customer address" in the inspection record unless it is related to the inspection) to ensure that the fillers can complete it quickly and reduce errors.
Complete information: Technical records should be able to "replicate technical activities" (for example, the "soil pH testing record" should include the air - drying time of the sample, the mesh number of sieving, and the reagent concentration; otherwise, the experiment cannot be repeated). Management records should be able to "track management behaviors" (for example, the "contract review record" should include the reviewers, review opinions, and approval results; otherwise, it is impossible to trace whether the "contract terms meet the system requirements").
3. Compilation specifications for inspection record sheets
The inspection record form is a record "directly related to product quality", and three points need to be focused on:
Column design adaptability: It is necessary to cover "all requirements of the inspection standard". For example, the food microbiological inspection form should include "sample name, batch, sampling location, detection items (total number of colonies, coliform group), detection method (GB 4789.2 - 2016), culture temperature (36℃±1℃), culture time (48 hours±2 hours), result judgment (qualified/unqualified)" to ensure that the inspection process can be fully reflected.
Format standardization: The font, field order, and signature position must be unified. For example, all inspection forms should use A4 paper and size 5 Song (SimSun) typeface, with the inspector's signature at the bottom right corner, to prevent information omissions caused by departments using different versions.
Unique identifier: A "unique number" (e.g., "JY-2024-03-005", where "JY" stands for inspection, "2024-03" represents the month, and "005" is the serial number for that month) needs to be assigned to each inspection form and associated with the "sample number" (e.g., "Sample number: YP-2024-03-005") to ensure full-chain traceability of "sample - record - report".
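The numbering scheme described above (a type prefix, a period, and a zero-padded serial number, with the record number mirrored in the sample number) can be generated programmatically. The prefixes below are the examples from the text:

```python
def record_number(prefix, year_month, seq):
    """Build a number like JY-2024-03-005: prefix, period, zero-padded serial."""
    return f"{prefix}-{year_month}-{seq:03d}"

inspection_no = record_number("JY", "2024-03", 5)  # inspection record
sample_no = record_number("YP", "2024-03", 5)      # matching sample number
print(inspection_no, sample_no)  # JY-2024-03-005 YP-2024-03-005
```

Keeping the serial portion identical across the inspection record, sample, and report is what makes the full-chain "sample - record - report" traceability possible.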
VII. Quality plan management
1. Definition and Value of Quality Plan
A quality plan is a "personalized quality solution for specific scenarios". When the general quality system documents (such as quality manuals and procedure documents) cannot meet the requirements of special products, projects, or contracts (for example, a customer requires urgent testing or testing by a non-standard method, or a new business is launched for the first time), a quality plan is prepared to combine the general rules with the specific requirements and ensure the quality objectives are met.
For example, if a customer requests "to complete the pesticide residue testing of 100 drinking water samples within 3 days", the "sample transfer time of 48 hours" in the general procedure cannot meet the requirement. At this time, the quality plan needs to be adjusted: "Send the samples to the laboratory within 2 hours after receiving them, and the laboratory will open the 'express lane' to prioritize the testing of this batch of samples." At the same time, clarify the "personnel allocation for express testing" (add 2 testers) and "instrument support" (reserve a dedicated gas chromatograph) to ensure on-time delivery.
2. Core content of the quality plan
The quality plan shall cover the key stages of the entire project lifecycle. The main contents include:
Project overview: Clearly define the project name, customer requirements, delivery time, and detection parameters (e.g., "Heavy metal detection project for wastewater of an enterprise", "Detection parameters: Cd, Pb, Cr", "Delivery time: April 10, 2024").
Quality objective: It needs to be specific and measurable — for example, the accuracy rate of test results is 100%, the error rate of reports is 0, the customer satisfaction rate is ≥ 95%.
Division of responsibilities: Clearly define the "responsible department/person for each stage", for example: "Sample collection: Zhang San, Sampling Department", "Testing: Li Si, Laboratory", "Report review: Wang Wu, technical person in charge".
Special requirements: Establish rules for the "uniqueness" of the project, for example: "If a non-standard method provided by the customer is used, method validation (recovery rate ≥ 85%) must be completed first" and "The instrument must be calibrated 24 hours before testing."
Inspection guidance: Provide "targeted operation guidelines", for example: "Wastewater samples must be filtered through a 0.45 μm membrane and then acidified with nitric acid to pH < 2", in more detail than the general work instructions.
Stage control: Clearly define the "audit requirements for key nodes" - for example, "An internal audit should be conducted in the middle of the project (April 5th) to check whether sample processing and testing operations comply with the plan."
Revision rules: Specify the "process for plan changes", for example: "If the customer adds test parameters, the business department submits a change application, which is reviewed by the quality assurance office and approved by the technical person in charge before the plan is revised and all relevant departments are notified."
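As a rough illustration of "specific and measurable" quality objectives (point 2 above), a plan's targets can be expressed as data with a pass/fail check. The `Objective` class and the numeric values below are hypothetical examples, not a prescribed format:

```python
from dataclasses import dataclass

@dataclass
class Objective:
    """One measurable quality objective with a numeric threshold."""
    name: str
    target: float
    higher_is_better: bool = True  # False for error/defect rates

    def met(self, measured: float) -> bool:
        if self.higher_is_better:
            return measured >= self.target
        return measured <= self.target

# Objectives taken from the text: accuracy 100%, report errors 0,
# customer satisfaction >= 95% (measured values are made up).
objectives = [
    Objective("result accuracy rate (%)", 100.0),
    Objective("report error rate (%)", 0.0, higher_is_better=False),
    Objective("customer satisfaction (%)", 95.0),
]
measured = {
    "result accuracy rate (%)": 100.0,
    "report error rate (%)": 0.0,
    "customer satisfaction (%)": 96.5,
}
for o in objectives:
    print(o.name, "met" if o.met(measured[o.name]) else "NOT met")
```

The point of the sketch is that an objective written this way is verifiable at the stage-control audit: either the measured value clears the threshold or it does not.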
3. Requirements for the preparation of the quality plan
The quality plan should not be a mere formality; it should satisfy the three principles of "coordination, clarity, and operability".
Coordination with the existing system: The plan must be consistent with the quality policy and quality manual. For example, if the quality policy is "scientific, accurate, and efficient", the "efficient" goal in the quality plan should correspond to "delivery time ≤ 3 days" and the "accurate" goal to "test result error ≤ 1%"; there must be no contradiction such as sacrificing accuracy to expedite delivery.
Target focus on particularity: Goals should be set according to the "unique requirements" of the project. For example, for a first project detecting microplastics in soil, the goals can be "pass the client's on-site review on the first attempt" and "make the test results traceable to international reference materials".
Implementable measures: "Specific, executable" measures need to be formulated around each goal. For example, to achieve "100% accuracy of test results", the measures can be: "Run 2 parallel samples for each batch, with a deviation between parallel results of ≤ 5%; run 1 standard-addition recovery experiment for every 10 samples, with a recovery rate of ≥ 80%."
Format standardization: It is necessary to specify the "fixed structure of the quality plan", such as "Project overview → Quality objectives → Responsibility division → Special requirements → Inspection guidance → Stage control → Revision rules", to ensure the "readability" and "comparability" of all quality plans and facilitate the review by the management department.
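The quality-control arithmetic in the "implementable measures" example (parallel-sample deviation ≤ 5%, standard-addition recovery ≥ 80%) is straightforward to compute. A minimal sketch, using the common definitions of relative deviation and spike recovery; the concentrations are made-up illustration values:

```python
def parallel_deviation_pct(a: float, b: float) -> float:
    """Relative deviation of duplicate results, as % of their mean."""
    return abs(a - b) / ((a + b) / 2) * 100

def spike_recovery_pct(spiked: float, unspiked: float, added: float) -> float:
    """Standard-addition recovery: % of the added analyte found."""
    return (spiked - unspiked) / added * 100

# Hypothetical duplicate results in mg/L and a 0.100 mg/L spike.
dev = parallel_deviation_pct(0.105, 0.101)
rec = spike_recovery_pct(0.195, 0.100, 0.100)
print(f"duplicate deviation {dev:.1f}% (limit <= 5%)")   # 3.9%, passes
print(f"spike recovery {rec:.0f}% (limit >= 80%)")       # 95%, passes
```

Encoding the acceptance limits this way makes the batch-level QC check mechanical, which suits the "operability" principle above.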