Non-Normal, Unstable Data


Viewing 7 posts - 1 through 7 (of 7 total)


    During Black Belt training, my company focused on which tools to use and not use.  They were big on pointing out which tools only work for stable, normal data.  The problem is, they never told us what to do if the data is non-normal and unstable.  Here is my problem.
    I am trying to set up a measurement system to track inventory turns, measured as the number of turns on inventory dollars.  There is very little data (it is generated monthly), but I have historical data for the last 3 1/2 years.  What control chart can I use?  I have thought about using a Box-Cox transformation to make the data normal, but the transformed values will then make no sense to the team member who will be tracking them.



    You are correct to use the Box-Cox transformation. This should let you understand your data better. After you have done this, the transformed data set is what you will work with to understand the range you have been operating in; this is what you need to begin predicting new data points and understanding your range. As for the users not understanding the data, just back-transform the data set and the adjusted ranges for the chart/graph they will monitor once you are ready to hand off the project.
    When you are looking at inventory data, there are several issues to consider in a manufacturing environment:
    1) Yearly inventories will almost surely show sharp declines in the month you take the physical count, or the month preceding it. This should be treated as common cause variation.
    2) If you can tie your inventory levels to sales dollars (or shipment dollars) in the same month, you will begin to better understand the relationship and the data will become more useful.
    3) Look for special cause variation, such as one-time sales events, large purchases of start-up materials, or new product/model introductions. Treat them as special cause, unless they occur at regular intervals, in which case they might qualify as common cause.
    Inventory reduction is an old Lean/Six Sigma project; it needs to have actions associated with it that are permanent in terms of reducing inventory levels in any business condition. For example, moving from traditional MRP systems to pull systems, or reducing receiving/shipping lead times in the system. Just measuring and trying to reduce, without a solid measuring device to ensure different actions are taking place, will not produce long-lasting results.
    Good luck,



    A data transformation could work, but as you mentioned, this would be confusing to others.  Another approach is to modify the control limits.
    I would suggest using an IX & MR chart where each plot point is one monthly result – its inventory turn.  With about 3.5 years of data, you would have 40+ points to plot. 
    Unfortunately, the conventional IX chart assumes the process output has a normal distribution, as its control limits are symmetrical around the centerline:
    UCL = Xbar + 2.66 * MRbar
    LCL = Xbar - 2.66 * MRbar
    If these data have a non-normal distribution, you can replace the above limits with non-symmetrical control limits that are based on the actual output distribution.
    To do this, plot the 40 historical points on normal probability paper.  Fit a curve through the plotted points.  Find the 0.135 percentile (the value below which 0.135 percent of the cumulative distribution lies), the 50th percentile (the median), and the 99.865 percentile.  The formulas for this special version of the IX chart (sometimes referred to as a Johnson control chart) are then:
    UCL = 99.865 percentile
    Centerline = 50th percentile
    LCL = 0.135 percentile
    These control limits afford the same risk values as the conventional control limits for the IX chart when the process output has a normal distribution.  For example, when in control, 99.73 percent of the plot points should fall between the control limits.  Half of these points should be above the centerline, with the other half below it.
    I got the idea for this type of chart from the book Measuring Process Capability, by Davis R. Bothe.  It’s a good book to have if you’re trying to estimate and interpret capability indexes for a variety of processes.
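A quick sketch of the percentile-limit idea, substituting a fitted distribution for the hand-drawn curve on probability paper (the turn values are hypothetical, and the lognormal is just one plausible shape, not Bothe's prescription):

```python
import numpy as np
from scipy import stats

# Hypothetical 40 monthly inventory-turn readings (right-skewed).
turns = np.array([3.1, 2.8, 4.0, 3.5, 5.2, 3.3, 2.9, 4.1,
                  3.8, 4.4, 3.0, 4.7, 3.6, 4.1, 2.7, 5.5,
                  3.2, 3.9, 3.4, 4.9, 2.6, 3.7, 4.2, 3.1,
                  5.8, 3.0, 3.5, 4.6, 3.3, 2.8, 4.0, 3.6,
                  6.2, 3.4, 2.9, 4.3, 3.7, 3.2, 5.0, 3.8])

# Fit a curve (here a lognormal) to stand in for probability paper.
shape, loc, scale = stats.lognorm.fit(turns, floc=0)

# Johnson-style chart limits taken straight from the fitted percentiles.
lcl = stats.lognorm.ppf(0.00135, shape, loc, scale)
cl  = stats.lognorm.ppf(0.5,     shape, loc, scale)
ucl = stats.lognorm.ppf(0.99865, shape, loc, scale)
print(f"LCL={lcl:.2f}  CL={cl:.2f}  UCL={ucl:.2f}")
```

Because the fitted distribution is skewed, the limits come out non-symmetrical around the centerline, which is exactly the point of this chart.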
    Hope this helps.



    Wheeler did extensive work on the issue of normality and control charts, and wrote a dandy little book on the subject.  He clearly demonstrates that normality is simply not required for control charts.  Just chart the raw data, whether it is normal or not.  In practice, the difference you get by transformation is small, and the potential for confusion is large.
    If you are interested in stripping out long-term trends, or cycles, use double exponential smoothing or Winters’ method, save your residuals, and put them in an I-MR chart.  The resulting chart tells whether something other than the expected change is at work.
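That recipe can be sketched in a few lines of Python. The data and the smoothing constants below are illustrative assumptions, and `holt_residuals` is a hypothetical helper implementing Holt's double exponential smoothing by hand:

```python
import numpy as np

def holt_residuals(y, alpha=0.3, beta=0.1):
    """One-step-ahead residuals from double exponential (Holt) smoothing."""
    level, trend = y[0], y[1] - y[0]
    resid = []
    for t in range(1, len(y)):
        resid.append(y[t] - (level + trend))  # forecast error at step t
        new_level = alpha * y[t] + (1 - alpha) * (level + trend)
        trend = beta * (new_level - level) + (1 - beta) * trend
        level = new_level
    return np.array(resid)

# Hypothetical monthly turns with a gentle upward trend.
turns = np.array([3.0, 3.2, 3.1, 3.4, 3.3, 3.6, 3.5, 3.8,
                  3.7, 4.0, 3.9, 4.2, 4.1, 4.4, 4.3, 4.6])

r = holt_residuals(turns)

# I-MR limits on the residuals: a point outside them flags change
# beyond the trend the smoother already accounts for.
mr_bar = np.mean(np.abs(np.diff(r)))
ucl = r.mean() + 2.66 * mr_bar
lcl = r.mean() - 2.66 * mr_bar
```

Charting the residuals rather than the raw series keeps the trend from masquerading as out-of-control points.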


    Erik L

    Utilising the Box-Cox transformation could certainly be a way to normalize the data set.  Pay attention to the 95% CIs for the transformation’s lambda and do not simply take the optimal value.  Based on the description in the original post, I’d guess at this point that 0 is included within the CIs.  If it is, then you can use a log transform for the data and establish the control limits for the process.  For the on-going user you would need to re-transform (anti-log, in this case) the control limits so that the new data points coming off the process can be plotted in original units for on-going analysis; this way you would have apples-to-apples.
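A minimal sketch of that check, using hypothetical data: scipy's `boxcox` returns a confidence interval for lambda when `alpha` is passed, and if 0 falls inside it, limits can be built on the log scale and anti-logged for the users.

```python
import numpy as np
from scipy import stats

# Hypothetical right-skewed monthly turns.
turns = np.array([2.9, 3.4, 3.1, 4.8, 3.7, 3.2, 6.1, 3.9,
                  4.3, 3.0, 5.2, 3.5, 4.0, 3.3, 4.6, 3.8])

# Passing alpha makes boxcox also return a CI for lambda.
_, lmbda, (lo, hi) = stats.boxcox(turns, alpha=0.05)
log_ok = lo <= 0 <= hi  # 0 inside the CI -> a plain log transform is defensible

# Limits computed on the log scale, then anti-logged back to turns.
z = np.log(turns)
mr_bar = np.mean(np.abs(np.diff(z)))
ucl = np.exp(z.mean() + 2.66 * mr_bar)
lcl = np.exp(z.mean() - 2.66 * mr_bar)
print(f"lambda={lmbda:.2f}  CI=({lo:.2f}, {hi:.2f})  log OK: {log_ok}")
```

The anti-logged limits are non-symmetrical in original units, but the chart stays apples-to-apples for whoever tracks it.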



    Sounds like your BB trainers could have been better. It is a well-known fact to practitioners utilizing control charts that NORMALITY IS NOT A FACTOR WHEN USING CONTROL CHARTS!!!
    A well-known author, Donald Wheeler, in his benchmark work on control charts, “Understanding Statistical Process Control,” goes to great lengths to demonstrate that fact.
    So back to your question: based on the type of data you have, I’d use an I-mR chart.


    John J. Flaig

    There is a major problem with the advice given by previous responders: they have gotten the cart before the horse. The fundamental problem is that the process is “out-of-control.” Discussion about transforms to make the data more normal, or relative frequency approximations, is pointless while the PROCESS IS OUT-OF-CONTROL! You must resolve this problem first.
    Since I don’t have the data, it is impossible for me to give you specific advice, but some general things to check are:
    1. Is there a pattern to the data? (Check autocorrelation.) If there is autocorrelation, you might consider an EWMA chart, but this chart will be hard for people to read (the same problem as with transformed data).
    2. Since inventory activity is often manipulated by management, it is quite likely that the system would be unstable. Look for special causes related to management interventions.
    Once you have found and addressed the instability issue, THEN you can use an individuals control chart to monitor the process.
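Both checks above can be sketched briefly in Python. The series is hypothetical, and the weight `lam = 0.2` is a common but arbitrary choice, not a prescription:

```python
import numpy as np

# Hypothetical monthly turns with some month-to-month carryover.
turns = np.array([3.0, 3.1, 3.3, 3.2, 3.5, 3.6, 3.4, 3.7,
                  3.9, 3.8, 4.0, 4.2, 4.1, 4.3, 4.2, 4.4])

# 1. Lag-1 autocorrelation; |r1| beyond roughly 2/sqrt(n) hints at dependence.
x = turns - turns.mean()
r1 = np.sum(x[:-1] * x[1:]) / np.sum(x * x)
cutoff = 2 / np.sqrt(len(turns))

# 2. EWMA statistic and its time-varying 3-sigma limits.
lam, mu = 0.2, turns.mean()
sigma = np.std(turns, ddof=1)
ewma, z = [], mu
for xt in turns:
    z = lam * xt + (1 - lam) * z
    ewma.append(z)
t = np.arange(1, len(turns) + 1)
half_width = 3 * sigma * np.sqrt(lam / (2 - lam) * (1 - (1 - lam) ** (2 * t)))
ucl, lcl = mu + half_width, mu - half_width
print(f"r1={r1:.2f} (cutoff {cutoff:.2f})")
```

If `r1` clears the cutoff, the EWMA limits are the ones worth plotting; otherwise a plain individuals chart is easier for people to read.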
    John J. Flaig, Ph.D.

