iSixSigma

UCL / LCL

Viewing 17 posts - 1 through 17 (of 17 total)
  • #38682

    April
    Participant

    I’m trying to determine the best way to calculate UCL and LCL for a target mean. The value in question is a time interval, and the process is working best when it is faster (so a lower limit is not really meaningful). In our samples, the variation is quite large. If I apply the existing standard deviation to the target, it puts the LCL below zero, which doesn’t make sense. How have others handled a scenario like this when trying to represent a target time interval?
    Thank you in advance,
    April

    #116161

    FTSBB
    Participant

    April,
    Your control limits are set correctly; there is nothing wrong with the math.  Also, you will never have a problem with points falling below the LCL, as the subgroup averages can NEVER be less than 0.  Just mentally put the LCL at 0 and you can interpret the chart as usual.
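    FTSBB’s suggestion – compute the limits from the data itself and simply clamp the LCL at zero – can be sketched in a few lines of Python (the cycle times below are invented for illustration):

```python
import statistics

def control_limits(samples, k=3.0):
    """Mean +/- k*sigma control limits computed from the data itself.
    The LCL is clamped at zero, since a time interval cannot be negative."""
    m = statistics.mean(samples)
    s = statistics.stdev(samples)
    ucl = m + k * s
    lcl = max(0.0, m - k * s)  # clamp: time cannot go below zero
    return lcl, ucl

# Hypothetical cycle times in hours, with the large spread April describes:
times = [0.2, 0.5, 1.0, 3.0, 40.0, 0.3, 120.0, 0.8]
lcl, ucl = control_limits(times)
print(lcl, ucl)  # the raw LCL would be negative, so it comes back as 0.0
```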

    #116164

    Darth
    Participant

    April, I am not sure what you are trying to do.  Do I understand that you are trying to calculate a UCL/LCL around a spec/target value?  You don’t use the UCL/LCL to do that.  They are calculated from the actual data and represent the true process variation.  It sounds like you are trying to calculate confidence intervals.  Make sure you know what you are trying to accomplish here and that you are not using the tool incorrectly.

    #116169

    Mikel
    Member

    What my esteemed colleague Dr. Darth forgot to tell you is that time data probably is not normally distributed (that is why you get those nasty limits below 0).
    Do a histogram of your data and tell us what it looks like.
    Send your data to [email protected] if you want some help.
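    Stan’s histogram suggestion is easy to automate. A quick skewness check (pure standard-library Python; the time data below is invented) will usually confirm that raw time intervals are heavily right-skewed rather than normal:

```python
import statistics

def skewness(xs):
    """Adjusted sample skewness; values well above 0 indicate a long right
    tail, which is typical of raw time-interval data."""
    n = len(xs)
    m = statistics.mean(xs)
    s = statistics.stdev(xs)
    return n / ((n - 1) * (n - 2)) * sum(((x - m) / s) ** 3 for x in xs)

# Invented time intervals in minutes, from a couple of minutes to ten hours:
times = [2, 3, 4, 5, 6, 8, 12, 30, 90, 600]
print(round(skewness(times), 2))  # strongly positive => right-skewed
```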

    #116171

    Darth
    Participant

    Did our poster indicate whether she was plotting individual time data, which might have an effect (although some might say the control chart should be somewhat immune to normality issues), or subgroups of time data, which should have offset the non-normality?  My take was that she was trying to develop control limits around a target value, which would have no relevance to the actual data.  She also mentioned a s.d. which she was using for the calculations.  Reread the post closely and see if what she is doing even makes sense.

    #116176

    Mikel
    Member

    I want you to note the excellent behavior I am exhibiting this evening: I have offered to help, and I’ll tell you what’s there when I see the data.

    #116195

    Darth
    Participant

    Yes, Stan, I did notice that you and Jack are being so kind this evening in your postings.  Keep up the good work.

    #116205

    Mikel
    Member

    Hey, it was not Jack, it was Don.
    I am again in beautiful Mexico, land of tequila and really smart engineers.
    What was the tequila I agreed to pay you off with way back on New Year’s Day?

    #116220

    April
    Participant

    Thanks for your feedback, everyone. I read your posts and I am going to stick with step 1 for now – plotting the existing data and process variation. My mean is much smaller than my standard deviation (only about a third of it), and I am using mean ± 2s to define the UCL and LCL. The LCL brings me below 0. The values range from a few minutes to several hundred hours. Is it possible to define an LCL given these values, even after tequila?

    #116222

    Mikel
    Member

    Data definitely goes better with tequila (or Jack).
    Seriously though, I doubt that anyone understands what you are seeing in your data. Take me up on my offer to look at your data if you want help.

    #116226

    P3
    Participant

    April,
    I think you should consider (again) what Darth mentioned earlier, before the tequila entered the thread, about what you’re really trying to accomplish here. I have to agree it sounds like you’re after confidence intervals rather than control limits.
    I think so for two reasons. One, you’re using ±2 stdev, which roughly covers 95% (the usual choice for confidence intervals). I don’t say it’s wrong, but the most common pick for control limits is ±3. Two, you don’t set control limits around a target; limits are calculated from the actual spread and mean of the process, exactly as Darth said.
    Another curious reflection. You said your preferred time value would be small, and you also said there were values from minutes up to several hundred hours. Those “large” values – are they to be considered outliers and therefore not representative of a stable process? What I mean is: do you have mostly minute-range values, with the larger values rarer but still occurring? Because if you want time values as small as possible, and you actually have values in the range of a few minutes, why do several hundred hours appear? Well, as Stan said, it’s better to see the data.
    When calculating control limits you’re assuming normal data. Let it be approximately normal or whatever; I don’t mind. Time values are seldom normal. Exponential, Gamma, or, if there’s some kind of wear and tear involved, maybe Weibull is your true distribution. Since time never goes below zero, and normal data has no such limit, you will occasionally end up with LCLs that pop up on the “wrong” side of the zero axis.
    Compare it with ratios. When trying to put confidence intervals on percentages you can’t (shouldn’t) use the normal model. There the binomial approach is the correct choice. Why? Simply because you have physical limits at 0 and 100.
    P3
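    P3’s point about skewed, strictly positive time data suggests one common workaround (a sketch, not the only option): compute the limits on the log scale and transform them back, which keeps the LCL above zero by construction. If the data really is closer to Exponential, Gamma, or Weibull, percentiles of a properly fitted distribution would be the more rigorous route.

```python
import math
import statistics

def log_scale_limits(times, k=3.0):
    """Mean +/- k*sigma limits computed on log-transformed data, then
    transformed back.  Both limits are positive by construction."""
    logs = [math.log(t) for t in times]
    m = statistics.mean(logs)
    s = statistics.stdev(logs)
    return math.exp(m - k * s), math.exp(m + k * s)

# Invented right-skewed time data in minutes:
times = [2, 3, 4, 5, 6, 8, 12, 30, 90, 600]
lcl, ucl = log_scale_limits(times)
print(lcl, ucl)  # LCL is small but strictly greater than zero
```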

    #116259

    Darth
    Participant

    The really expensive one in the fancy bottle.  I was hoping it was good tequila and pretty women but then again, an old fart like you would probably be happier with smart engineers.

    #126400

    narayanan
    Member

    Before finding the UCL / LCL, bring your process to a normal distribution and reduce the variation – reduce the standard deviation considerably. There is no other way.

    #126407

    Giri
    Participant

    Dear Mr. April,
    1.  The lowest point of the LCL is ZERO and it can’t go below that.
    2.  If there is such a huge deviation, your process is not in control.  Ensure that your process is capable and reduce the deviation.  Then you can use various control charts.
    3.  However, to reduce the deviation, you can use a run chart, Pareto, histogram, etc. to identify the behaviour of the process, the cause and the root cause.
    Regards, Giritharan

    #126410

    Darth
    Participant

    Note that the original thread was way back in March.  I am pretty sure the issue has been resolved already.

    #126431

    John H.
    Participant

    Shankar,
    Before making a general statement, please research the topic. Gaussian distributions do not apply to all processes, and you can have a distribution with a nonexistent mean and variance.
    John H.

    #126439

    Mike Carnell
    Participant

    Darth,
    Good point. I guess there is a reason they have a date on the post.
    The thing that amazes me about this type of post (the two previous to yours – actually mostly the first one) is that it is such superficial, self-aggrandizing crap. They are simply regurgitating some dogma they got in SPC 101 and think it makes them sound enlightened. If in fact this were more timely, the original poster would have no more idea what to do than they did before. “…..reduce the variation. It is the only way.” WTF. I guess that means you close your eyes and use the force. One of those Sunday morning faith-healing methodologies for variation reduction. Where is Jim Baker when you need him?
    Regards

