# Calculating CP without an LSL


- This topic has 11 replies, 9 voices, and was last updated 9 years, 1 month ago by MBBinWI.

May 21, 2010 at 5:15 pm #53457

Calculating CP without a LSL (Participant) @Calculating-CP-without-a-LSL

Hi, I've been tasked with finding the capability of a process against a customer specification of a given time. Let's say this time is 1 hour. I am confused about what to use for an LSL. Several options come to mind, such as our process's LCL, or even 0. Can anyone advise?

May 21, 2010 at 8:34 pm #190187

Roanoke Black Belt (Member) @Roanoke-Black-Belt

I don't believe you should calculate Cp. Instead, calculate Cpk. If the only customer requirement is an Upper Spec Limit, use (USL – X-bar)/(3*sigma). Assuming your data are normally distributed, this will predict the % of the population that falls above the USL.
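A quick sketch of that calculation in plain Python (the cycle-time numbers below are made up for illustration):

```python
from statistics import mean, stdev
from math import erf, sqrt

def cpu(data, usl):
    """One-sided capability index against an upper spec limit."""
    xbar, s = mean(data), stdev(data)
    return (usl - xbar) / (3 * s)

def pct_above_usl(data, usl):
    """Predicted % above the USL, assuming normally distributed data."""
    xbar, s = mean(data), stdev(data)
    z = (usl - xbar) / s
    return 100 * 0.5 * (1 - erf(z / sqrt(2)))   # normal upper-tail probability

times = [42, 48, 51, 55, 47, 50, 53, 49]   # hypothetical cycle times, minutes
print(round(cpu(times, 60), 2))             # one-sided Cpk (often written Cpu)
print(round(pct_above_usl(times, 60), 2))   # predicted % beyond the USL
```

The usual caveat applies: the percentage prediction is only as good as the normality assumption.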

May 22, 2010 at 10:24 am #190191

Hey Roanoke, have you ever seen time data that is expected to be normal?

Time skews away from the natural boundary at 0, which should not be used as a spec. If you put zero in as a spec and click on Boundary in Minitab, it will only return Cpk with respect to the USL.

Instead of the usual fixation on normality (it is not important, and neither are transformations), figure out what distribution should be expected and use it. Exponential or Weibull are good places to start. Knowing Cpk is not important; knowing a credible number for what is expected to exceed the customer's expectations is.
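As a sketch of that approach in Python/SciPy (the wait-time data, scale, and sample size below are made up; Minitab would do the equivalent through its non-normal capability analysis), fit the skewed candidates directly and report the expected fraction beyond the customer's limit:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
# Hypothetical wait times (minutes): skewed, bounded below by 0
times = rng.weibull(1.5, 200) * 30
usl = 60.0  # the 1-hour customer requirement

# Fit candidate skewed distributions; lock the location at the natural boundary 0
wb_shape, _, wb_scale = stats.weibull_min.fit(times, floc=0)
_, exp_scale = stats.expon.fit(times, floc=0)

# Survival function = expected fraction exceeding the customer's limit
p_weibull = stats.weibull_min.sf(usl, wb_shape, 0, wb_scale)
p_expon = stats.expon.sf(usl, 0, exp_scale)
print(f"P(time > USL), Weibull: {p_weibull:.4f}, exponential: {p_expon:.4f}")
```

No normality assumption and no transformation: the answer is stated directly as the fraction expected to exceed the customer's limit.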

May 27, 2010 at 3:14 am #190213

MBBinWI (Participant) @MBBinWI

Technically, you cannot calculate Cp because, as you identify, you don't have an LSL. For this situation you would be better off calculating a non-normal Cpm. Cpm measures how well the distribution mean meets the target as well as how much of the tail of the distribution exceeds the spec limit (either upper or lower). In Minitab you can select a Weibull distribution (probably the most flexible); if your process has a half-life decay type effect, exponential might be appropriate; or perhaps a Beta distribution if the data look normal but with a long skewed tail. In any case, your customers are likely expecting a specific level of performance (the target) but will tolerate something less (your USL). With these two, you can calculate a Cpm, which will show how well you meet your target without exceeding your customer's tolerance.
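For illustration, a one-sided Cpm analogue (sometimes written Cpmu) replaces the plain sigma with a term that also penalizes deviation of the mean from the target. The data and the exact one-sided form used below are assumptions for the sketch, not Minitab's non-normal machinery:

```python
from statistics import mean, stdev
from math import hypot

def cpm_upper(data, usl, target):
    """One-sided Cpm analogue against a USL: penalizes both spread
    and deviation of the mean from the target (Taguchi-style)."""
    xbar, s = mean(data), stdev(data)
    tau = hypot(s, xbar - target)   # sqrt(s^2 + (xbar - target)^2)
    return (usl - xbar) / (3 * tau)

times = [40, 45, 52, 48, 55, 43, 50, 47]   # hypothetical, minutes
print(round(cpm_upper(times, usl=60, target=45), 2))
```

A process centered off-target gets a lower Cpmu even if its spread is small, which is the point of using Cpm over Cpk here.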

May 27, 2010 at 3:56 am #190214

Don Wheeler calls this leptokurtophobia. The following is an excerpt from his article published in Quality Digest:

——–

The symptoms of leptokurtophobia are (1) routinely asking if your data are normally distributed and (2) transforming your data to make them appear to be less leptokurtic and more mound shaped.

——————-

Don’t worry too much about normality – nothing in life is normal. Transform only if it makes sense.

July 6, 2010 at 11:40 am #190405

Gilbert (Member) @Stu-Dapples

Hi there, I am a trainee Black Belt, and the fixation on the normal distribution is quite apparent from my training material!

I could be wrong, but don't you assume normally distributed data for a lot of the analysis tools run via Minitab? If the data is not normally distributed, what effect will that have on the results?

As you can probably see I am new to all this!

Any advice appreciated,

Stu

July 22, 2010 at 3:07 am #190479

You need to find a statistical tool for non-normal distributions. Perhaps the Weibull, as someone posted, will work. Some time ago I had a similar problem and heard of something called a Pearson analysis.

Find a statistician and ask them.

July 26, 2010 at 1:28 pm #190500

I think Gary and MBBinWI gave sound advice.

The bottom line: if you compute the capability of a process that is not normally distributed using an analytical tool that assumes normality, the answer will be suspect at best and should be treated with caution.

Minitab has two functions that might be useful in this case:

1. Individual Distribution Identification (Stat > Quality Tools > Individual Distribution Identification)

2. Nonnormal Capability Analysis (Stat > Quality Tools > Capability Analysis > Nonnormal)

You could use one of the distributions already mentioned (gamma, beta, and exponential) to compute capability using nonnormal analysis (#2 above). Or you could try to identify the distribution that best models the exhibited performance (#1 above) and then compute nonnormal capability using that distribution (#2 above).

Since Minitab makes it so fast to do this analysis I would compute it several different ways just to get an idea of the spread. That way you can better understand how much importance the actual distribution of the process data holds in the capability analysis.
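A rough Python/SciPy sketch of that workflow (the skewed data here are made up), fitting several candidate distributions and comparing the predicted exceedance under each, similar in spirit to running #1 and then #2 above:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
times = rng.gamma(4.0, 8.0, 250)   # hypothetical right-skewed process data
usl = 60.0

# Rough analogue of Minitab's Individual Distribution Identification:
# fit several candidate distributions and compare the results.
candidates = {
    "normal": stats.norm,
    "gamma": stats.gamma,
    "weibull": stats.weibull_min,
    "exponential": stats.expon,
}
for name, dist in candidates.items():
    # Lock the location at the natural boundary 0 for the skewed candidates
    params = dist.fit(times) if dist is stats.norm else dist.fit(times, floc=0)
    loglik = np.sum(dist.logpdf(times, *params))
    p_exceed = dist.sf(usl, *params)   # expected fraction above the USL
    print(f"{name:12s} log-likelihood={loglik:8.1f}  P(>USL)={p_exceed:.4f}")
```

Comparing the P(>USL) column across fits is a quick way to see how much the distribution choice moves the capability answer.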

August 11, 2010 at 10:55 am #190585

MBBinWI (Participant) @MBBinWI

Gary: Actually, yes, I have seen time data that is expected to approach a normal distribution. Time-to-failure data for robust systems/components often approaches normality (yes, Weibull is still the proper distribution, but at a shape factor of 3.5 to 4 it is close enough to normal as to be nearly indistinguishable for most purposes).

HOWEVER, with tools like Minitab and others being as capable as they are nowadays, it is foolish to try to simplify by using a normal assumption – just use the non-normal analyses available in the tools.

As another poster has said, most training concentrates on normal distributions, but that is not the norm (pun intended). I would encourage all the training bodies to de-emphasize the normal distribution and use more real-world example distribution types. Alas, this will likely fall on deaf ears.

August 11, 2010 at 1:37 pm #190587

Gilbert (Member) @Stu-Dapples

Very interesting and useful advice, gents, much appreciated. I am off to my week 2 training session on Monday; I may just print a copy of this discussion and leave it surreptitiously on my lecturer's desk! (and potentially take a spelling class?)

Once I am a competent driver with Minitab, I may start taking it off-road a little and toying with the different distribution analysis tools. For now, it serves well to ask the Master BB and see what he says!

Many thanks again guys, if anyone has any more advice (or general comment on this) I appreciate your input

Stu

February 17, 2011 at 8:17 am #191263

adams (Participant) @john-adams

Thanks for your comments. I appreciate it and always look for new ways to increase my Six Sigma knowledge.

February 22, 2011 at 4:56 am #191279

MBBinWI (Participant) @MBBinWI

**Gary wrote:**

> Hey Roanoke, have you ever seen time data that is expected to be normal?
>
> Time skews away from the natural boundary at 0, which should not be used as a spec. If you put zero in as a spec and click on Boundary in Minitab, it will only return Cpk with respect to the USL.
>
> Instead of the usual fixation on normality (it is not important, and neither are transformations), figure out what distribution should be expected and use it. Exponential or Weibull are good places to start. Knowing Cpk is not important; knowing a credible number for what is expected to exceed the customer's expectations is.

Actually, yes, I have seen time data that is normally distributed (or nearly so). Look at race times for sprints or distance runners at the Olympics, for example.

That said, if there is only a rational boundary on one side of the distribution, then Cpk is a more appropriate capability measure.

