After more than 15 years, Six Sigma remains a hot topic in the business world, with many companies having introduced it or at least considered it. But whether a person is just embarking on the journey or is already a skilled practitioner, they may wonder how to define Six Sigma in simple terms and how to describe clearly what sets it apart from earlier approaches to quality.

Some say there is nothing new about Six Sigma. Others proclaim its novelty. It is variously described as a toolkit, a project methodology, a business strategy, or all three at once. Its contents reflect the teachings of W. Edwards Deming and Joseph M. Juran, yet it is not purely an American invention. Some elements were developed in Japan, but it is not a Japanese management concept.

The fact is, Six Sigma has become what it is because it is the “perfect storm” – the result of various forces coming together over decades of development, drawing on the genius of untold numbers of contributors in an ever-changing business environment. To really understand and define it, therefore, it is worthwhile to step back and view Six Sigma in its full historical context.

Historical Development of Process Improvement

For those who are not quality engineers, it is easy to overlook the cumulative learning process behind Six Sigma.

Imagine this learning process as a climb up a mountain. As one moves slowly upward, the starting point of the journey, the base of the mountain, always remains the same. However, as learning produces new insights, the view becomes ever more breathtaking – and the winds more powerful. This historical progression can be split into several periods.

Climbing the Mountain

The Foundation: 1920s-1940s

Despite earlier developments in management theory, the first true thought leaders of process-oriented quality management appeared in the decades after World War I. The three most important names in quality in the United States at the time were Walter Shewhart, Juran and Deming. Shewhart and Juran both worked within the Bell System (Shewhart at Bell Telephone Laboratories, Juran at Western Electric’s Hawthorne Works) to achieve quality and uniformity in manufacturing. It was in this environment that Shewhart developed the first control charts in the 1920s, marking the first application of statistics to managing process variability.
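To make the core idea concrete, here is a minimal sketch of Shewhart’s three-sigma rule in Python. The baseline measurements and new data points are hypothetical, invented purely for illustration; in practice, control limits are estimated from rational subgroups of data gathered while the process is known to be stable.

```python
import statistics

# Hypothetical measurements: a stable baseline run, then new points.
baseline = [10.2, 9.8, 10.1, 10.4, 9.9, 10.0, 10.3, 9.7, 10.1, 10.0]
new_points = [10.1, 9.9, 11.8, 10.2]

center = statistics.mean(baseline)   # center line of the chart
sigma = statistics.stdev(baseline)   # estimated common-cause variability

# Shewhart's convention: control limits three standard deviations out.
ucl = center + 3 * sigma             # upper control limit
lcl = center - 3 * sigma             # lower control limit

for i, x in enumerate(new_points, start=1):
    if lcl <= x <= ucl:
        print(f"point {i}: {x:.1f} in control")
    else:
        print(f"point {i}: {x:.1f} out of control - investigate special cause")
```

Points inside the limits are attributed to common-cause variation and left alone; a point outside them signals a special cause worth investigating – the essential logic behind SPC.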

In 1939, Shewhart published a book entitled Statistical Method from the Viewpoint of Quality Control, which firmly established the concept of statistical process control (SPC). Deming, his friend and associate and another recognized expert in sampling and applied statistics in quality, was the book’s editor. In the same year, Deming demonstrated at the U.S. Census Bureau that quality and productivity could be improved in non-manufacturing processes as well.

The principles of quality control as developed and practiced by these men were later applied diligently in military production during World War II.

The Embellishment: 1950s-1970s

After the war, Juran and Deming continued to promote quality awareness. Many other greats contributed significantly to the effort during the next 30 years: Armand Feigenbaum (the concept of “total quality”), Philip Crosby (cost of poor quality, management responsibility for zero defects) and Eliyahu Goldratt (throughput accounting, managing constraints), to name just a few. However, from the start of the post-war period, American industry seemed more concerned with quantity, leaving quality, at best, an isolated function rather than an integral part of strategy.

If American industry was not interested in applying the principles of quality control in a big way, Japanese industry (struggling to overcome the ravages of war and a poor reputation for quality) was more than willing to do so. In the 1950s, both Juran and Deming gave lectures and seminars in Japan about managing quality, and throughout the coming decades both men developed close relationships with industrial leaders there. From them, the Japanese learned two key things – the application of statistical methods to measure and control process variability, and the responsibility of management for quality.

Throughout their history, the Japanese have always been open to foreign influences, adopting a promising concept and then building upon it. The embellishment period of quality was no exception. Taking what they learned, people such as Taiichi Ohno (Toyota production system), Kaoru Ishikawa (fishbone diagram, statistical thinking on the shop-floor), Genichi Taguchi (loss function, robust designs in design of experiments, or DOE), as well as Yoji Akao and Shigeru Mizuno (quality function deployment, or QFD) and Masaaki Imai (Kaizen) went on to develop new tools and methodologies to support quality.

Japanese leadership emphasized continuous process improvement within project structures involving all process participants. They enforced a rigorous customer focus and essentially broke away from the notion of a separate “quality function.” For them, quality not only supported strategy, it was strategy.

A good example of all this is the Toyota production system. During this 30-year period, Toyota became skilled at running manufacturing operations with a minimum of waste and inventory and a maximum of right-the-first-time production quality. The ability of Toyota and other manufacturers to produce top quality cars at low cost became the envy of the automobile industry worldwide and was a major catalyst triggering the next historical phase of quality.

The Breakthrough: 1980s

Although many American and European companies had well-developed quality functions, the pressure of growing Japanese competitiveness in many key industries led Western (particularly American) industry to take a special interest in what the Japanese did differently and, apparently, better.

One event is considered a major turning point in the awareness of American industry regarding Japanese quality: a television program broadcast on NBC in 1980 entitled “If Japan Can... Why Can’t We?” Deming and his contribution to Japanese quality were prominently featured in the program. This broadcast finally brought Deming “back home” and created a surge in demand for his teachings, as well as his time. In particular, the Ford Motor Company, keen to catch up with the Japanese, developed a long and close relationship with the quality guru.

Now came the challenge of learning from the Japanese, instead of the other way around. As industry experts, engineers and scholars studied Japanese quality systems, there was a need to “translate” the Japanese experience into usable terms. The phrase “just-in-time” (JIT) described the low-inventory manufacturing environment observed at Toyota, and a group at MIT coined the term “Lean” to describe the drastically reduced waste of resources typical of Japanese production.

As the decade progressed, two more developments had a profound effect on quality and process awareness – the expansion of international financial markets (due to the growth in private pensions and the funding of new technology) and the expanding consulting industry. These combined forces rekindled interest in quality as a way of creating shareholder value and led to consulting products designed to help attain it.

Various methodologies in the 1980s helped Western companies address the success factors of quality and competitiveness, including Total Quality Management (TQM), Lean production and Masaaki Imai’s Kaizen, all based on varying combinations of Juran/Deming fundamentals and the Japanese experience of applying them. Around this time (1986), Six Sigma had its beginnings at Motorola as a tool for managing production variation through rigorous statistical analysis. These methodologies taught organizations the need to master customer focus, process definitions, disciplined measurement, root-cause validation and consistent process monitoring. A significant breakthrough in quality awareness was achieved, and the “quality movement” was born.
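The name itself reflects this statistical ambition. Under the conventional Six Sigma assumptions (a normally distributed characteristic whose mean may drift 1.5 standard deviations over the long term), a process with its nearest specification limit six standard deviations from the mean produces about 3.4 defects per million opportunities (DPMO). Here is a brief sketch of that arithmetic, assuming SciPy is available:

```python
from scipy.stats import norm

# Conventional Six Sigma arithmetic: defect rate at a given sigma
# level, with the standard assumed long-term drift of 1.5 sigma.
SHIFT = 1.5

for sigma_level in (3, 4, 5, 6):
    # Tail probability beyond the (shifted) specification limit,
    # scaled to defects per million opportunities.
    dpmo = norm.sf(sigma_level - SHIFT) * 1_000_000
    print(f"{sigma_level} sigma: {dpmo:,.1f} DPMO")

# Prints roughly 66,807 / 6,210 / 233 / 3.4 DPMO respectively.
```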

Contributing to the emerging storm of process awareness were other trends, such as the personalization of computing power, a growing interest in systems thinking, the use of network and project structures to reduce traditional hierarchies, and the resulting interest in professional project management. The first stirrings of major change were clearly visible.

However, if the learning during this period was great, so, unfortunately, were some of the disappointments. Despite all the enthusiasm for the rediscovery of quality by a new generation of leadership, the strong connection between process and business performance was not achieved broadly. Quality programs remained largely the isolated responsibility of “traditional” quality departments that too often suffered from insufficient management support.

The Perfect Storm: 1990s

Nevertheless, the stirrings of change in the 1980s set the scene for the coming storm of the 1990s. As new communications technologies took root, this period was marked by accelerating globalization and emphasis on process quality for gaining and maintaining a competitive edge.

Business process reengineering (BPR) made its appearance, with its focus on customers’ needs and management’s role in challenging the status quo of existing processes. More people began to listen and learn. Meanwhile, at Motorola and AlliedSignal, financial accountability was added to the in-depth statistical approach of Six Sigma, turning it into a strategic framework which made the connection between process results and the bottom line. By mid-decade, it caught the attention of Jack Welch, CEO at General Electric, who then proceeded to create a vision for his firm based on the full implications of Six Sigma.

Faced with global competition and shareholder demands, and taking advantage of new communications media and laptop technology, GE made Six Sigma the company’s global strategic platform. It was this very visible example that tipped the quality tempest into a major storm, catching the attention of financial markets and the imaginations of Welch’s counterparts around the world. At a time of booming stock markets, someone had demonstrated to Western business the power of rigorous process analysis and control, as well as the need to link these to the highest levels of the organization.

From then on, the storm continued to increase its velocity and sweep through industry after industry – supported by a technical environment that had not existed earlier, mass training concepts with a common language, and the use of statistics software on personal laptops (the symbol of the information worker of the 1990s). All this placed the power of data analysis directly into the business, transcending all functional borders. It was this content and timing that allowed Six Sigma, more than any other approach, to quickly become a truly global phenomenon, a framework understood and applied by the many instead of only the limited few.

Gathering Strength in the New Century

Any large storm, given the right energy sources, has the capacity to grow. Now that Western management has finally understood the link between superior process quality and the bottom line, the pace of learning is accelerating. Far from slowing down, Six Sigma appears to be gaining momentum.

As organizations practice Six Sigma and share their experiences in the wealth of literature and forums available on the subject, opportunities for improving the improvement methodology itself can be observed. The most notable current trend is renewed attention to Lean. Six Sigma was originally focused strongly on reducing process variation by going through the “data door” of statistical analysis, only nominally including the “process door” of more qualitative tools. Combining Six Sigma with Lean principles, however, provides a sophisticated approach to waste and cycle-time reduction that earlier implementations lacked.

Other potential sources of impetus could come from integration with widely established and understood tools and structures, such as balanced scorecards, ISO 9000, the criteria for the Malcolm Baldrige National Quality Award from the National Institute of Standards and Technology in the United States, as well as the model for excellence from the European Foundation for Quality Management.

Far from being competing principles and methodologies, these are all natural complements that can be easily blended with Six Sigma. After all, they too have contributed in one way or another to the winds of change that led to the “perfect storm” of strategic process management.
