Statistical Process Control (SPC) and Shewhart Control Charts
I. The core logic of SPC: Achieve "process prevention" through statistical techniques
The essence of Statistical Process Control (SPC) is to continuously monitor each stage of the process through statistical tools, identify and eliminate "abnormal fluctuations", and ultimately achieve quality stability and improvement. Its core is "prevention" - instead of reworking after defective products appear, it intervenes when problems are in their infancy.
The value of SPC for different roles in an enterprise is highly specific:
Frontline operators: Use SPC tools (such as control charts) to observe the fluctuations in work (e.g., changes in the cutting depth of machine tools), quickly locate the unstable points in operations (e.g., dimensional deviations caused by tool wear), and adjust methods targeting specific issues to reduce repeated errors.
Managers: SPC provides objective process data, resolving the standoff in which the production department insists that "quality problems are due to operational errors" while the quality department insists that "they are due to process design defects"; the data speaks for itself. For example, a control chart may show that a rise in the defect rate comes from variation across raw material batches rather than from workers' operations.
Enterprise leaders: SPC reduces rework by stabilizing the process (for example, cutting the defect rate from 3% to 0.8%), reduces raw material waste (for example, raising steel utilization from 85% to 92%), and ultimately improves productivity (for example, raising line speed from 100 pieces per hour to 120 pieces per hour), which translates directly into profit growth (one machinery enterprise saw its annual profits and taxes rise by 18% after adopting SPC).
The three core characteristics of SPC determine its universality:
1. Full-process coverage: From the entry of raw materials into the factory to the shipment of finished products, every link is involved, requiring the full cooperation of R&D, production, quality control, and logistics personnel. This is consistent with the "total quality control by all employees" concept of Total Quality Management (TQM).
2. Scientifically distinguish fluctuations: The core tool is the Shewhart control chart. It separates chance fluctuations (inevitable random errors, such as minor machine vibrations) from abnormal fluctuations (eliminable systematic errors, such as mold wear) through statistical methods; a minimal sketch of this distinction follows this list. Only by identifying abnormalities can problems be solved precisely.
3. Applicable across scenarios: It can be used not only in manufacturing (such as monitoring the strength of automobile weld spots), but also in services (controlling the waiting time of bank customers) and management (optimizing the approval efficiency of administrative documents). SPC can find room for improvement in any scenario with "processes" and "fluctuations".
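To make the distinction in point 2 concrete, here is a minimal sketch of how a Shewhart-style chart flags abnormal fluctuation: points inside the 3-sigma control limits are treated as chance variation, points outside them as signals worth investigating. The measurements and the dimension being tracked are illustrative assumptions, not data from the text.

```python
import statistics

# Hypothetical hourly measurements of a machined dimension (mm). The first 20
# values come from a period believed to be stable (chance fluctuation only);
# the last new point simulates tool wear.
reference = [10.02, 9.98, 10.01, 9.99, 10.03, 10.00, 9.97, 10.02, 10.01, 9.99,
             10.00, 10.02, 9.98, 10.01, 10.00, 9.99, 10.03, 10.01, 9.98, 10.00]
new_points = [10.01, 9.99, 10.12]  # 10.12 mm looks like a suspicious jump

mean = statistics.mean(reference)
sigma = statistics.stdev(reference)
ucl, lcl = mean + 3 * sigma, mean - 3 * sigma  # Shewhart 3-sigma limits

for x in new_points:
    if lcl <= x <= ucl:
        print(f"{x:.2f} mm: within limits, treat as chance fluctuation")
    else:
        print(f"{x:.2f} mm: outside [{lcl:.2f}, {ucl:.2f}], abnormal fluctuation, investigate")
```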
II. A Brief History of the Development of SPC: From Invention in the United States to Global Revival
The theoretical origin of SPC is the research of American statistician W. A. Shewhart in the 1920s. He first proposed the concept of "process control", which means monitoring fluctuations through statistical methods to prevent defects. However, this theory was not popularized at that time. It was not until World War II that the U.S. military industrial department began to promote Shewhart's method in order to ensure the quality of weapons (such as the consistency of shell sizes).
After the war, the United States became the global industrial hegemon, lacking external competition. Enterprises believed that "traditional methods were sufficient," and SPC was gradually forgotten. In contrast, Japan's economy collapsed after the war and it urgently needed to improve quality to restore exports. In 1950, American quality management expert W. Edwards Deming introduced SPC to Japan, emphasizing that "process control is the foundation of quality." Japanese enterprises (such as Toyota and Sony) quickly embraced this theory and spent 30 years integrating SPC into their production processes:
- Toyota used SPC to monitor more than 100 key variables (such as bolt torque and body panel gaps) on its assembly lines; the defect rate dropped from 5% in 1950 to 0.1% in 1980.
- Sony used SPC to control the brightness of color TV picture tubes, raising the product qualification rate from 60% to 95%.
By 1980, Japan had leapt to the leading position in global quality and productivity. Under the competitive pressure of Japanese products, North American enterprises had to take SPC seriously again:
- Stelco in Canada listed SPC as one of its "seven high-tech directions" (alongside technologies such as continuous casting and vacuum degassing).
- The three major American automobile companies (Ford, General Motors, and Chrysler) developed the QS-9000 standard on the basis of ISO 9000 and required suppliers to use SPC to monitor key components.
- LTV Steel in the United States used SPC to monitor steelmaking temperature, cutting energy consumption by 10% and raising output by 15%.
Thus SPC, invented in the United States, promoted in Japan, and then revived globally, has become a core tool of modern quality management.
III. Evolution of SPC: Three Stages from "Alarm" to "Diagnosis + Adjustment"
SPC is not static but evolves continuously with the increasing complexity of industries. To date, it has gone through three key stages:
1. SPC (Basic Alarms) – Shewhart's Separation of Variations: the original Shewhart control chart separates chance fluctuations from abnormal fluctuations and raises an alarm when the process drifts out of statistical control, but it says nothing about why.
2. SPCD (Diagnostic Upgrade) – Zhang Gongxu's "Finding the Causes"
In 1982, Zhang Gongxu, a Chinese quality management expert, proposed the two-quality diagnosis theory, which broke through Shewhart's traditional framework. He divided quality into "total quality" (final product quality) and "partial quality" (intermediate quality of each process), and used statistical methods to find out the specific causes and locations of abnormalities (for example, whether it was due to unqualified raw material batches or the offset of the machine tool spindle). This theory upgraded SPC from "being able to give warnings" to "being able to diagnose", and it was named SPCD (Statistical Process Control and Diagnosis).
Zhang Gongxu's team subsequently extended the theory:
- In 1994, "multivariate step-by-step diagnosis" was proposed to handle simultaneous fluctuations of multiple variables (for example, the body gap on an automobile assembly line is affected by three variables, welding pressure, fixture position, and steel plate thickness, and the method identifies the main cause).
- In 1996, "two-quality multivariate diagnosis" was proposed to handle complex systems with multiple processes and multiple indicators (for example, in chip production with 10 processes and 50 key indicators, it can localize a problem such as an abnormal lithography temperature in the 3rd process).
Currently, SPCD has entered the practical application stage, and China still leads the world in this field. Several Chinese automotive and electronics enterprises have used SPCD to cut the time needed to handle an abnormality from 2 hours to 15 minutes.
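Zhang Gongxu's diagnosis algorithms are not reproduced here, but the flavor of multivariate monitoring can be hinted at with a generic Hotelling-style T² statistic, a standard textbook tool rather than the two-quality method itself. All variable names, data, and the control limit below are illustrative assumptions.

```python
import numpy as np

# Hypothetical reference data for three variables measured together on an
# assembly line: welding pressure (kN), fixture position offset (mm),
# steel plate thickness (mm).
rng = np.random.default_rng(0)
reference = rng.normal(loc=[5.0, 0.0, 1.2], scale=[0.1, 0.05, 0.02], size=(50, 3))

mu = reference.mean(axis=0)
cov_inv = np.linalg.inv(np.cov(reference, rowvar=False))

def t_squared(x):
    """Hotelling-style T^2 distance of observation x from the reference state."""
    d = x - mu
    return float(d @ cov_inv @ d)

# A new joint observation in which only the fixture position has drifted.
x_new = np.array([5.02, 0.25, 1.21])
limit = 14.2  # illustrative control limit, roughly a chi-square(3) 99.7% quantile

print(f"T^2 = {t_squared(x_new):.1f} (limit {limit})")
if t_squared(x_new) > limit:
    # Per-variable standardized deviations hint at which variable drifted.
    z = (x_new - mu) / reference.std(axis=0, ddof=1)
    print("out of control; standardized deviations:", np.round(z, 1))
```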
3. SPCDA (Closed-loop Optimization) – From Diagnosis to Solution
SPCD can identify the causes of abnormalities but cannot automatically adjust the process—just like a doctor still needs to prescribe medicine for treatment after making a diagnosis. Therefore, the next stage of SPC is SPCDA (Statistical Process Control, Diagnosis and Adjustment), which means "control + diagnosis + automatic adjustment":
- For example, a temperature sensor on the production line detects that the heating furnace temperature is below standard (control), SPCD diagnoses that the power of the heating tube has decreased (diagnosis), and the system automatically raises the heating tube's voltage (adjustment). The entire sequence requires no manual intervention, achieving "unmanned process optimization".
Currently, foreign countries refer to this area as ASPC (Algorithmic Statistical Process Control), but there are no mature results yet. The team led by Zhang Gongxu in China is researching the implementation of SPCDA in high-end manufacturing scenarios such as semiconductors and new energy, attempting to solve the problem of "automatic optimization of complex processes".
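Since SPCDA is still an open research problem, the loop above can only be illustrated with a toy sketch. The furnace model, thresholds, and adjustment rule below are invented for illustration and stand in for whatever controller a real system would use.

```python
# Toy control -> diagnose -> adjust loop for the heating furnace example above.
TARGET_TEMP = 850.0  # desired furnace temperature (hypothetical, deg C)
TOLERANCE = 10.0     # alarm band around the target

def read_temperature(voltage, tube_efficiency):
    """Stand-in for the real furnace: temperature depends on voltage and tube health."""
    return voltage * tube_efficiency

voltage = 400.0
tube_efficiency = 2.125  # healthy tube: 400 V * 2.125 = 850 deg C

for hour in range(1, 6):
    tube_efficiency *= 0.99                       # the heating tube slowly loses power
    temp = read_temperature(voltage, tube_efficiency)
    if abs(temp - TARGET_TEMP) > TOLERANCE:       # control: the alarm fires
        cause = "heating tube power has decreased"    # diagnosis (assumed known here)
        voltage = TARGET_TEMP / tube_efficiency       # adjustment: raise the voltage
        print(f"hour {hour}: {temp:.0f} deg C out of band -> {cause}; voltage set to {voltage:.1f} V")
    else:
        print(f"hour {hour}: {temp:.0f} deg C in control")
```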
IV. Implementation of SPC and SPCD: Key Steps from Theory to Practice
The value of SPC comes not from slogans on the wall but from actions on the ground. Implementation follows these steps:
1. Step 1: Training – Align Understanding and Skills
Implementing SPC requires all employees to understand its logic, so training is the foundation of everything else. Training content needs to be stratified by role:
- Frontline operators: Learn "how to collect data, how to read control charts, and how to report anomalies" (for example: Measure the dimensions of 10 parts every hour and record them on the control chart. If any points exceed the limit, immediately find the team leader).
- Managers: Learn "how to analyze causes using causal diagrams and how to find key variables using Pareto charts" (for example: analyze the causes of "the increase in defective product rate", and use a Pareto chart to find that "raw material batch problems" account for 70%).
- Technicians: Learn "Two-quality diagnosis theory and multivariate statistical methods" (for example, use SPCD to diagnose abnormalities in multiple processes and identify that "the deviation of machine tool parameters in the 5th process" is the main cause).
2. Step 2: Identify the key variables – grasp the core contradiction
There are many variables in the process (for example, there are more than 100 variables such as temperature, composition, and time in steelmaking), but only the "key variables" will affect the final quality (i.e., "the few variables that contribute the most to product quality"). The method for identifying key variables is "cause-and-effect diagram + Pareto chart":
Cause-and-effect diagram analysis: For each process, list all the factors that may affect the quality (for example, when analyzing "steel plate thickness", list "raw material thickness, rolling mill pressure, rolling temperature, and roll gap clearance").
Pareto chart ranking: Calculate the "proportion of quality issues" for each factor (for example, the issue of raw material thickness accounts for 60%, and the mill pressure accounts for 20%), and identify the "vital few" with the highest proportions.
Using this method, LTV Steel in the United States identified some 20,000 key variables. These variables are the core of SPC monitoring: they account for only 10% of all variables but affect 90% of product quality.
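The Pareto ranking described above is easy to automate. The defect categories and counts in this sketch are illustrative assumptions chosen to mirror the rolling example, not real data.

```python
# Hypothetical defect counts by cause for one process, used to pick the "vital few".
defects = {
    "raw material thickness": 120,
    "rolling mill pressure": 40,
    "rolling temperature": 25,
    "roll gap clearance": 15,
}

total = sum(defects.values())
cumulative = 0
vital_few = []
for cause, count in sorted(defects.items(), key=lambda kv: kv[1], reverse=True):
    cumulative += count
    print(f"{cause:<24} {count / total:5.0%}  cumulative {cumulative / total:5.0%}")
    if cumulative <= 0.80 * total or not vital_few:  # keep causes until ~80% of defects are covered
        vital_few.append(cause)

print("vital few to monitor with SPC:", vital_few)
```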
3. Step 3: Set standards – transform experience into rules
After the key variables are identified, executable control standards must be developed for each one. This is the most difficult and time-consuming step in implementing SPC (developing the standards for one workshop can take more than two person-years, i.e., one person working for over two years). Each standard should cover the following items, taking "rolling mill pressure" as an example:
- Control content: working pressure of the rolling mill.
- Process standard: maintain the pressure at 100 ± 5 bar (the statistically stable range).
- Control reasons: pressure too low → the steel plate comes out too thick; pressure too high → the steel plate cracks.
- Measurement regulations: measure once every 10 minutes with a pressure sensor and record the data in the MES system.
- Control chart type: X-R chart (mean-range chart), which monitors both the average level and the fluctuation range of the pressure.
- Corrective measures: if the pressure exceeds a limit, stop the rolling mill immediately and check the pressure valve of the hydraulic system.
- Approval and review: prepared by the workshop technician; approved by the production manager; reviewed once every quarter.
These standards need to form a "Process Control Standard Table" to ensure that there are "clear rules" for each link. Only through standardization can we avoid operations that "vary from person to person" and ensure the consistency of SPC.
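The X-R chart named in the standard above has textbook limit formulas: the mean chart limits sit at the grand mean plus or minus A2 times the average range, and the range chart limits at D3 and D4 times the average range. The sketch below computes them for a handful of hypothetical pressure subgroups; the readings are invented, while the constants are the usual published values for a subgroup size of 5.

```python
# X-bar / R chart limits for subgroups of 5 pressure readings (bar).
A2, D3, D4 = 0.577, 0.0, 2.114  # standard control chart constants for n = 5

# Hypothetical subgroups: 5 readings taken every 10 minutes.
subgroups = [
    [101, 99, 100, 102, 98],
    [100, 103, 99, 101, 100],
    [97, 100, 102, 99, 101],
    [102, 101, 100, 98, 100],
]

xbars = [sum(g) / len(g) for g in subgroups]
ranges = [max(g) - min(g) for g in subgroups]
xbarbar = sum(xbars) / len(xbars)   # grand mean of the subgroup means
rbar = sum(ranges) / len(ranges)    # average subgroup range

print(f"X-bar chart: CL={xbarbar:.2f}, UCL={xbarbar + A2 * rbar:.2f}, LCL={xbarbar - A2 * rbar:.2f}")
print(f"R chart:     CL={rbar:.2f}, UCL={D4 * rbar:.2f}, LCL={D3 * rbar:.2f}")
```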
Conclusion: The essence of SPC is "the certainty of the process"
The core of SPC is not "drawing control charts", but turning the "uncertain process" into a "certain process" through statistical techniques – making the fluctuations in production, services, and management "monitorable, diagnosable, and adjustable", and ultimately achieving "stable quality, lower costs, and higher efficiency".
From Shewhart's basic alarms to Zhang Gongxu's diagnostic upgrade, and then to the future SPCDA automatic adjustment, the evolution of SPC has always centered around one goal: to use scientific methods to address the "uncertainties" in the process. This is also the fundamental reason why it can become the cornerstone of modern quality management.
4. Step 4: Compile a "legislative-level" control standard manual so that quality requirements reach the operator's fingertips
The control standard manual is the "quality constitution" within an enterprise: it transforms process control rules of a legislative nature into job operation instructions that frontline employees can execute directly. It should be written to be understandable and implementable: the language should avoid obscure professional jargon (for example, instead of saying "parameters need to be adaptively adjusted", say "adjust the cutting speed to 3-5 m/min"); the content should be bound to specific scenarios (for example, "set the barrel temperature of injection molding machine Model X to 180 °C and 200 °C"); and the format should suit frontline workers (for example, pocket-book size so that workers can consult it at any time).
The implementation phase is even more crucial: the handbook is "brought to life" through layered penetration. Specifically, explain to stamping workers "the precision calibration steps after mold replacement"; train quality inspectors on "the criteria for judging finished-product dimensions"; and clarify for team leaders "the assessment indicators for handbook implementation" (for example, include the "rate of operations performed according to the handbook" in the monthly performance evaluation). The 600 handbooks of the American LTV company show this logic in action. They cover the entire flow from raw material inspection and production to finished-product packaging, and each key process has its own handbook. For instance, the "Engine Bolt Tightening Handbook" in the assembly workshop specifies that "bolts are tightened in three passes with a torque wrench, with a final torque of 80-90 N·m"; the "Paint Thickness Handbook" in the painting workshop stipulates that "three points are measured with a thickness gauge, and the thickness must not be less than 40 μm". These handbooks are not papers on the wall but operation guides in the hands of frontline workers.
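A handbook rule like the bolt-torque one above is ultimately a checkable specification. The sketch below is only an illustrative check against such a rule; the function, field names, and readings are assumptions, not LTV's system.

```python
# Illustrative check of tightening records against a handbook rule such as
# "tighten in three passes, final torque 80-90 N·m".
SPEC = {"passes": 3, "final_torque_min": 80.0, "final_torque_max": 90.0}

def check_bolt(torque_readings):
    """Return a list of handbook violations for one bolt's tightening record."""
    problems = []
    if len(torque_readings) != SPEC["passes"]:
        problems.append(f"expected {SPEC['passes']} passes, got {len(torque_readings)}")
    final = torque_readings[-1]
    if not SPEC["final_torque_min"] <= final <= SPEC["final_torque_max"]:
        problems.append(f"final torque {final} N·m outside 80-90 N·m")
    return problems

print(check_bolt([40.0, 65.0, 86.5]))  # [] -> complies with the handbook
print(check_bolt([45.0, 92.5]))        # wrong number of passes and final torque too high
```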
5. Step 5: Use the control chart as the "eyes of the process" to make the standards dynamically adapt to the actual situation
The core of statistical monitoring is the control chart – it is the radar for real-time detection of process fluctuations. Through continuous collection of process data (such as the dimensions and weights of 10 products sampled per hour) and plotting them on a control chart (such as the mean-range chart X-R), it is possible to quickly determine whether the process is statistically in control: If the data points exceed the control limits (such as the upper control limit UCL), or a trend of seven consecutive points rising appears, it indicates that there are abnormalities in the process (such as equipment aging and fluctuations in raw material batches).
However, the significance of monitoring lies not only in identifying problems but also in optimizing the standards. When an anomaly occurs, the first thing to ask is whether the control standard manual still matches reality. For example, the original manual stipulates a welding current of 100-130 A, but actual production shows that the defect rate is lower when the current is 120-150 A; in that case the current range in the manual needs to be revised, and the new standard then rolled out through training. This cycle of monitoring, revising, and re-monitoring keeps the manual fitted to the real state of the process and prevents the standards from becoming outdated dogma.
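Both signals mentioned above, a point beyond a control limit and a run of seven consecutive rising points, can be checked mechanically. The data and limits in this sketch are illustrative (the limits simply echo the revised 120-150 A range).

```python
def out_of_control_signals(points, lcl, ucl, run_length=7):
    """Flag points beyond the control limits and runs of `run_length` rising points."""
    signals = []
    for i, x in enumerate(points):
        if x > ucl or x < lcl:
            signals.append((i, "beyond control limit"))
    rising = 1
    for i in range(1, len(points)):
        rising = rising + 1 if points[i] > points[i - 1] else 1
        if rising == run_length:
            signals.append((i, f"{run_length} consecutive rising points"))
    return signals

# Hypothetical welding-current subgroup means (A) checked against the manual's limits.
current = [124, 126, 125, 127, 128, 129, 131, 133, 136, 152]
print(out_of_control_signals(current, lcl=120, ucl=150))
```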
6. Step 6: Solve problems with the "diagnosis + closed-loop" approach to enable "continuous cycling" of quality improvement
The key to diagnosing and solving problems is to establish the logic of "precise analysis + feedback iteration". The core lies in grasping three points:
1. Use traditional tools to find the root cause: The seven quality management tools (cause-and-effect diagram, Pareto chart, histogram, etc.) are the basic diagnostic package. For example, use the cause-and-effect diagram (fishbone diagram) to analyze the reasons for the high defective product rate from six dimensions of man, machine, material, method, environment, and measurement. Use the Pareto chart to find the 20% key factors accounting for 80% of the problems (for example, 85% of the defective products are unqualified in size, so prioritize solving the size problem).
2. Use the diagnostic theory to determine the direction: The "two-quality diagnosis theory" helps enterprises distinguish "inherent quality" (the stable quality of the process itself) from "additional quality" (fluctuations caused by abnormal factors). For example, if the problem lies in inherent quality (such as dimensional deviations caused by a defective mold design), the process design needs to be improved; if it lies in additional quality (such as inconsistent operating methods among workers), the operation standards should be optimized. This avoids blind, symptom-chasing rectification.
3. Use the feedback loop for iterative optimization: After a problem is solved, if a new key quality factor is found (for example, the impact of workshop temperature on welding quality was not originally considered), it must be fed back into the earlier steps: back to Step 2 (re-identify the key quality factors) to include temperature among the core indicators; back to Step 3 (adjust the control standards) to set a requirement of 20-28 °C; and back to Step 4 (revise the manual) to add the operation of recording the temperature once every half hour (see the sketch after this list). This closed loop from diagnosis back to the earlier steps lets quality improvement rise in a spiral.
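The feedback in point 3 amounts to bookkeeping against the process control standard table built in Steps 2-4. The dictionary below is only an illustrative data sketch of that bookkeeping; the field names and entries are assumptions, not a real system.

```python
# Illustrative sketch of feeding a newly discovered key factor back into the
# process control standard table from Steps 2-4.
standard_table = {
    "welding current": {"standard": "120-150 A", "measurement": "every 10 minutes"},
}

def add_key_factor(table, name, standard, measurement):
    """Step 2 feedback: register the factor; Steps 3-4 feedback: attach its rules."""
    table[name] = {"standard": standard, "measurement": measurement}

# New finding: workshop temperature affects welding quality.
add_key_factor(standard_table, "workshop temperature",
               standard="20-28 deg C", measurement="record once every half hour")

for factor, rules in standard_table.items():
    print(f"{factor}: keep within {rules['standard']}; {rules['measurement']}")
```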
The effect of SPC: from "passively putting out fires" to "actively controlling quality", turning data into efficiency
The value of SPC lies in shifting quality control from "post-event inspection" to "pre-event prevention". Taking LTV as an example, after it implemented SPC in 1985, the 20% increase in labor productivity was backed by specific improvements in quality and efficiency: the defect rate dropped from 5% to 1.5%, cutting rework and scrap costs; the process cycle shortened from 10 days to 8 days because abnormal downtime fell by 30%; workers' operation time shortened by 15% because the manual specified the standards and eliminated trial and error; and the customer complaint rate fell by 35% because product quality fluctuations were significantly reduced. The core of these results is that SPC lets enterprises make decisions driven by data: monitor the process with control charts, unify standards with manuals, and solve problems with diagnostic tools, ultimately achieving the win-win of stable quality and improved efficiency.