
Reduction of variation – originated with Six Sigma?


    #29058

    johansson
    Participant

    Folks:
    Did the concept that reducing variation yields more predictable data, with which to make more consistent and reliable management decisions, originate with Six Sigma? I spoke to this at a recent presentation I gave in connection with Six Sigma, but I want to check whether or not the concept actually originated there.
    Max

    #73478

    Manee
    Participant

    Reduction in variation was introduced by Prof. Genichi Taguchi in Japan in 1970 and in America in the 1980s. It is known as the Taguchi Method.
    Manee

    #73479

    ?
    Participant

    Puh-leeze. Gimme a break. What are you, cok’n, jok’n and smok’n?
    Walter Shewhart wrote of reduction in variation in the 1920s, while working for Ma Bell.
    Whatever you’re smok’n, pass it along…

    #73483

    Eileen
    Participant

    Max,
    In Japan, and then in the US in the early 1980s, the concept of reducing variation was heavily promoted by Dr. Deming’s lectures and books. He was a disciple of Dr. Walter Shewhart. As Dr. Deming put it: “Shrink, Shrink, Shrink that variation.” This was way before Six Sigma.
    Eileen

    #73484

    Sambuddha
    Member

    Manee,
    Are you sure that “Reduction in variation was introduced by Prof. Genichi Taguchi in Japan in 1970 and in America in the 1980s” is a fact?
    I think Taguchi Jr., Chowdhury, and Skip Creveling might disagree with you. Taguchi’s S/N ratio and loss function were about making designs robust, to the point that they are somewhat insensitive to variation. He did not introduce the concept of reduction in variation to the world, by any stretch of history or imagination.
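    For reference, here is a rough sketch of the two quantities just mentioned (the exact forms vary by source; T is the target value, k a cost constant, and ybar and s the sample mean and standard deviation):

        Quadratic loss function:        L(y) = k (y - T)^2
        S/N ratio (nominal-the-best):   S/N = 10 log10( ybar^2 / s^2 )

    Both reward designs that sit on target with small spread – that is robustness, not the invention of variation reduction.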
    Taguchi received the Deming Prize in 1960, so I would turn the clock back to the times of Shewhart or Deming to find who should get the credit.

    #73487

    Andy Schlotter
    Participant

    No. Reduction of process variation is the cornerstone of statistically based quality systems dating back to the early 1900s with Shewhart. I’ll bet it’s been around longer than that – probably as long as statistics itself.

    #73488

    Manee
    Participant

    Sorry about the mix-up. Dr. Shewhart is the real guru.
    Manee

    #73489

    Ropp
    Participant

    To expand a little on what Eileen posted:
    Shewhart wrote, “The engineer desires to reduce the variability in quality to an economic minimum,” in “Statistical Method from the Viewpoint of Quality Control” (1939). However, his main work was on correctly identifying and quantifying variation through SPC. Perhaps he would have worked on the reduction problem more if he’d had more time?
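    (For anyone new to SPC, the quantifying step is the Shewhart control chart: plot a statistic over time against limits set roughly three standard deviations either side of its mean,

        UCL = mu + 3 sigma        CL = mu        LCL = mu - 3 sigma

    where mu and sigma stand for the mean and standard deviation of the plotted statistic, estimated from the data. Points outside the limits flag special-cause variation worth investigating.)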
    His colleague and protégé, Dr. Deming, further expanded this and strongly advocated reduction in variability.
    Taguchi suggested a means to quantify the impact on society with the Taguchi loss function, and he suggested methods of investigation and improvement, some better than others.
    The Six Sigma movement makes it a primary aim.
    All have contributed and should be given laurels. It is a continuous improvement thing. As many of us know, not all business management has fully embraced this philosophy.
    This concept most certainly was not a sudden blinding light originating from Six Sigma.

    #73496

    Robert Butler
    Participant

    Andy Schlotter is correct. The issue of variation reduction has been a central tenet of statistical analysis from the beginning. As a formal area of study, statistics is over 200 years old. If you really want to read something on the origins of variation reduction, and indeed on the origins of most of the major issues of statistical focus, I’d recommend Stigler’s “The History of Statistics: The Measurement of Uncertainty before 1900”.

    #73558

    Mike Carnell
    Participant

    Max,
    I agree with the people who are tracing this back to Shewhart as the innovator. There were some isolated earlier occurrences, such as the French firearms manufacturers (and even that was based on interchangeable parts).
    In the context of Six Sigma, the Second Industrial Revolution and interchangeable parts are what drove this. When everything was done by craftsmen, reducing variation would have made things easier, but it was not a really significant issue. The minute you have mass production, particularly in the sense that Ford needed it (since he was doing Lean before it was called Lean), variation reduction becomes much more critical as a success factor. After Shewhart, it is a matter of record.

    #73571

    Niraj Goyal
    Participant

    The concept of variation was probably already there as a theoretical concept in statistics, as others have mentioned here.
    It was developed into control charts by Shewhart, in a form that was applicable to business, and was probably used by a few.
    It was picked up in Japan from Dr. Deming in the 1950s and forms the basis of TQM there. The Japanese spread it through their companies between the 1950s and 1980s, and a lot of their phenomenal improvements were due to the application of these techniques.
    In the US, widespread application started after that, first as TQM; when that did not work in the cultural context there, it was changed, modified, and developed into Six Sigma, with new emphases, new knowledge, and a new way of implementation.
    Taguchi brought this work into design, to prevent not only “recurrence” but also occurrence.

    #73576

    Mike Carnell
    Participant

    Niraj,
    I agree with most of what you said.
    I really take issue with the comment that TQM did not work in the US. There were, and still are, a lot of companies making a lot of progress with TQM. If you look closely at a lot of the “Yellow Belt” programs out there, they are primarily TQM with some SS buzzwords woven into the material so it stays incognito. Try to pitch TQM to a transactional group and watch them tell you how it is old news, etc. Pitch the same thing called Yellow Belt and they like it. TQM is the right tool for most of the transactional stuff, because those groups have no concept of engineered processes or of what a defect is or costs.
    I had the Allied Automotive Turbocharger group in Torrance, California, where TQM was administered by a very dedicated lady. With the team structure and TQM, it was a very nice platform to build some good Six Sigma projects on.
    It did work. Just like anything else, deployments will be distributed across the spectrum from great to poor, typically as a function of management. If management is screwed up, they will have a tendency to screw up anything they touch.
    As far as Taguchi goes: take the normal distribution the way SS displays it, centered between two spec limits, and turn the distribution upside down. It looks a lot like the Taguchi loss function. Now superimpose them: Taguchi represents loss, and the normal curve represents departure from the mean. As the departure from the mean increases, measured on the normal distribution, loss increases, measured by the Taguchi loss function. Same basic concept.
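    (To put numbers behind that picture – a rough sketch, with mu and sigma the process mean and standard deviation, T the target, and k a cost constant. Near the mean, the flipped normal curve is approximately quadratic:

        f(y) = f(mu) exp( -(y - mu)^2 / (2 sigma^2) )
        f(mu) - f(y) ≈ f(mu) (y - mu)^2 / (2 sigma^2)    for y near mu

    which is the same shape as the Taguchi loss L(y) = k (y - T)^2 when the process is centered on target, T = mu. Same basic concept, as you say.)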

    #73585

    Niraj Goyal
    Participant

    Mike
    I agree with you too. By saying TQM did not work in the US, I did not mean it did not work at all. No doubt it worked to some extent, in some places more than in others. But unlike in Japan, it did not become a national rage.
    And today I have come across a lot of companies in the US that are allergic to TQM as a word, even though many of them may, in practice, actually be using largely the simpler techniques of Six Sigma, which are available in TQM too.
    Niraj  

    #73587

    Mike Carnell
    Participant

    Niraj,
    I think the disenchantment with TQM was something that happened after it had been around for a while. It was an evolutionary process in which survival of the least fit became the law of the jungle. People learned to do an Ishikawa diagram or a Pareto chart and didn’t know what to do with the output. Instead of understanding the logic, people were taught mechanical steps by tool junkies. They are the “Tool Zombies.” They get caught between the real world and the academic world because they are taught by people who do not understand the strategy.
    I would much rather have a BB who completely understands when to use a t test than someone who can pontificate on the derivation of the distribution, the difference between the two formulas, etc. If they understand when to use it, Minitab will take care of the calculations. It goes back to the basic difference between an educated person and a smart person. The educated person can be entertaining to watch trying to get results, but the smart person is the one you can leave alone, and they get the job done. The scary ones are the smart ones who are educated as well (a college degree does not equal educated).
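    (To illustrate with a made-up example – the data below are invented, and Python’s scipy is used purely as a stand-in for Minitab:

        # Hypothetical data: cycle times before and after a process change.
        from scipy import stats

        before = [8.1, 7.9, 8.4, 8.0, 8.2, 7.8]
        after = [7.6, 7.4, 7.9, 7.5, 7.7, 7.3]

        # Two-sample t test; the software does all the arithmetic.
        t_stat, p_value = stats.ttest_ind(before, after)
        print(f"t = {t_stat:.2f}, p = {p_value:.4f}")

    The judgment call is knowing that a two-sample t test is the right tool for comparing two means; the calculation itself is the easy part.)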
    SS is going down the same path. It is very slowly evolving into a plaything for the statisticians. They act indignant about MBBs, BBs, and GBs encroaching on their territory, but they opened the door. Now they want the territory back so they can drive it back to pure stats. If they succeed in convincing the world that SS is about stats, they get their territory back, but they castrate the process. SS was successful because it delivered results, not because it used stats. The stats were just a tool.
    If you are being taught SS and your instructor does not take you back to the strategy of Y = f(X), then you need to make them explain it. They may not be doing it because they don’t get it, and if they don’t get it, they are just tool junkies running a “Six Sigma Zombie Factory” – mindless followers of an obscure process.
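    (For anyone who has not seen it spelled out, Y = f(X) is shorthand for the idea that the output you care about is a function of the process inputs,

        Y = f(x1, x2, ..., xn)

    so you reduce variation in Y by finding and controlling the critical X’s, not by inspecting Y at the end.)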
    Thinking about this is no way to spend a weekend. Thinking about the responses is no way to spend the rest of the weekend.

