Six Sigma Vs Standard Deviation
This topic contains 6 replies, has 1 voice, and was last updated by DRAGOS BANDUR 13 years, 6 months ago.
In a Normal distribution, as data points converge toward the arithmetic mean, the standard deviation decreases. Why does "Six Sigma" increase under the same circumstances?
For a Standard Normal distribution, where the transformation is
Z = (x - mu) / sigma,
the transformed mean is 0 and the standard deviation is 1.
This holds no matter what your starting Normal distribution looks like.
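To see this numerically, here is a minimal sketch (the sample values are made up for illustration):

```python
import statistics

# Hypothetical sample from some Normal process (values are illustrative)
data = [9.8, 10.1, 10.3, 9.9, 10.0, 10.2, 9.7, 10.0]

mu = statistics.mean(data)
sigma = statistics.stdev(data)

# Z-transform each point: Z = (x - mu) / sigma
z_scores = [(x - mu) / sigma for x in data]

# The transformed data always has mean ~0 and standard deviation ~1,
# regardless of the original mu and sigma.
print(round(statistics.mean(z_scores), 6))   # ~0.0
print(round(statistics.stdev(z_scores), 6))  # ~1.0
```

Whatever Normal data you start from, standardizing it this way pins the mean at 0 and the standard deviation at 1.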
Because the process sigma metric ("sigma") is defined as the number of standard deviations that will fit between the mean and the nearest customer/process limit.
As the standard deviation gets smaller, more standard deviations will fit, so the "sigma" gets bigger.
Confusing, isn't it?
Historically, the value derived by dividing the distance between the process average and the closest specification limit by the estimated standard deviation has been referred to as Z min, the Z score, or some other derivative incorporating the letter "Z".
Marc Richardson
Sr. Q.A. Engineer
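The Z min calculation described above can be sketched as follows (the process mean, standard deviation, and spec limits are hypothetical values chosen for illustration):

```python
# Hypothetical process and spec values (illustrative only)
mean = 10.0
std_dev = 0.5
lsl, usl = 8.0, 12.0   # lower/upper specification limits

# Z_min: distance from the mean to the NEAREST spec limit,
# expressed in units of the standard deviation.
z_min = min(usl - mean, mean - lsl) / std_dev
print(z_min)  # 4.0 -> a "4 sigma" process

# Halving the variation doubles the sigma level of the same process:
z_min_improved = min(usl - mean, mean - lsl) / (std_dev / 2)
print(z_min_improved)  # 8.0
```

The spec limits and the mean never moved; only the standard deviation shrank, so more of them fit in the same distance.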
Don’t confuse the population with the sampling distribution of the mean. As the sample size increases, the distribution of the sample mean converges to a normal distribution around the true mean with a STANDARD ERROR which goes down as sigma/sqrt(n). But the population never changes.
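A quick simulation makes the distinction concrete (population parameters and sample sizes here are arbitrary choices for the demonstration):

```python
import random
import statistics

random.seed(42)

mu, sigma = 10.0, 2.0  # hypothetical population parameters

def sample_mean_sd(n, trials=2000):
    """Empirical standard deviation of the mean of n draws from the population."""
    means = [statistics.mean(random.gauss(mu, sigma) for _ in range(n))
             for _ in range(trials)]
    return statistics.stdev(means)

# The spread of the sample mean shrinks like sigma / sqrt(n);
# the population sigma itself stays 2.0 throughout.
for n in (4, 16, 64):
    print(n, round(sample_mean_sd(n), 3), "theory:", round(sigma / n ** 0.5, 3))
```

The empirical column tracks sigma/sqrt(n) closely: the sampling distribution of the mean tightens with n, while the population distribution is untouched.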
Most often the normal bell curve is taken to span plus or minus three standard deviations, covering 99.73% of the process. If this curve fits exactly between the customer's specs, then 0.135% of the process will be out of spec on each side of the curve. If the process moves off target, even more will be out of spec on one side or the other. This is unacceptable.

Therefore the customer wants you to reduce the variation so the whole process can move back and forth without the tails crossing the lower or upper spec. As variation is reduced, the standard deviation gets smaller. As the standard deviation gets smaller, the curve gets narrower and has more room to move back and forth. With more room to move (due to the smaller sigma), more standard deviations will fit between the specs.

The customer wants the standard deviation (variation) small enough that +/- 6 standard deviations will fit between the specs. Then, instead of the 0.27% (2700 PPM) out of spec in the two tails of a +/- 3 sigma process, a +/- 6 sigma process has only about 0.002 PPM out of spec while it stays on target (the commonly quoted 3.4 PPM figure allows for a 1.5 sigma shift of the mean). Imagine a little bell curve of +/- 6 sigma sitting between the specs with room to move back and forth, instead of a big bell curve of +/- 3 sigma touching both sides of the specs.
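The PPM figures quoted above can be checked directly from the Normal distribution using only the standard library (`math.erf` gives the Normal CDF):

```python
import math

def tail_ppm(z):
    """PPM beyond z standard deviations in ONE tail of a Normal distribution."""
    return (1 - 0.5 * (1 + math.erf(z / math.sqrt(2)))) * 1e6

# Centred +/- 3 sigma process: both tails out of spec.
print(round(2 * tail_ppm(3.0)))      # ~2700 PPM

# Centred +/- 6 sigma process: essentially nothing out of spec.
print(round(2 * tail_ppm(6.0), 3))   # ~0.002 PPM

# The famous 3.4 PPM assumes the mean has drifted 1.5 sigma off target,
# leaving only 4.5 sigma to the nearest spec limit (one dominant tail).
print(round(tail_ppm(4.5), 1))       # ~3.4 PPM
```

This reproduces all three headline numbers: 2700 PPM at +/- 3 sigma centred, roughly 0.002 PPM at +/- 6 sigma centred, and 3.4 PPM once the 1.5 sigma shift is allowed.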
Think of "Six Sigma" as a distance and of Sigma as a measurement unit (by definition, a quantity that fits a number of times into the measured distance); obviously, the smaller the measurement unit (i.e. Sigma), the larger that "number of times".
Now, let us consider any characteristic that remains centred on Target:
“Six Sigma” does not increase when Sigma decreases because it is always “Six”: if Sigma decreases, Six Sigma will follow closely.
Also, the distance between Target and Specification remains constant and specified.
The only thing that increases when Sigma decreases is the ratio between the Specification Interval (fixed by specification) and process variation (measured in Sigma and decreasing as Sigma decreases).
In conclusion, your question is really "Why does the Capability Index of a centred process increase when the standard deviation decreases?" This is an easy one!
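For a centred process, that capability index is just the ratio described above; a minimal sketch, with hypothetical spec limits and sigma values chosen so the arithmetic comes out round:

```python
def cp(usl, lsl, sigma):
    """Capability index of a centred process: spec width over 6 sigma."""
    return (usl - lsl) / (6 * sigma)

# Hypothetical specification limits (spec width = 4.0)
usl, lsl = 12.0, 8.0

print(round(cp(usl, lsl, 4 / 6), 2))   # 1.0 -> specs are exactly +/- 3 sigma wide
print(round(cp(usl, lsl, 4 / 12), 2))  # 2.0 -> specs are +/- 6 sigma wide
```

The numerator (the specification interval) is fixed; only the denominator shrinks as sigma shrinks, so the index grows.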