
Using Parts per Trillion Data as Continuous?


  • #250955

    Derek Kozlowski
    Participant

    Hey folks, I am wrestling with an issue. We do a lot of measurements where our primary measure is ppb or ppt. I understand that this is actually a percentage, as the data is given by our test devices as a non-integer (8.312, 10.480, etc.). However, showing this as a percentage makes the data almost impossible to decipher, so it is generally accepted as a number, rounded up or down to count data.

    The problem comes with trying to analyze this data for normality, capability, etc.  Can any reasonable assumption be made to treat the data as continuous?  I am almost sure the answer is no, but am having a hard time making the right decisions in our work sphere.

    This is the semiconductor business, and samples must be made by deposition, then impurities measured. It is extremely difficult to obtain samples to test, as deposition is a very slow and expensive process. I am trying to find a way that we can use this data without gathering the number of data points required by most attribute tools. 20-30 data points is a considerable investment in both time and reactor use, and we generally can't get that many in our experiments.

    What can you offer me to help with this?

    Thanks

    0
    #250959

    Robert Butler
    Participant

    The short answer to your question is that your concern is really of no concern.

    1. From Agresti, Categorical Data Analysis, 2nd Edition, page 3:

    “Variables are classified as continuous or discrete, according to the number of values they can take. Actual measurement of all variables occurs in a discrete manner, due to precision limitations in measuring instruments. The continuous-discrete classification, in practice, distinguishes between variables that take lots of values and variables that take few values. For instance, statisticians often treat discrete interval variables having a large number of values (such as test scores) as continuous, using them in methods for continuous responses.”

    …so, go ahead and treat your measures as continuous.

    2. Standard calculations for capability do require the data to be approximately normally distributed. For cases where that does not hold, one needs to use the methods for calculating capability when the data are not normal. Chapter 8, “Measuring Capability for Non-Normal Variable Data,” in Bothe’s book Measuring Process Capability has the details.
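
    As a rough sketch of what that looks like in practice (Python here, with made-up readings in the style of the original question and hypothetical spec limits, not values from this thread):

        # Treat ppb impurity readings as continuous: check normality,
        # then compute Cp/Cpk. Data and spec limits are hypothetical.
        import numpy as np
        from scipy import stats

        readings_ppb = np.array([8.312, 10.480, 9.150, 9.870, 8.990,
                                 10.120, 9.560, 8.750, 9.330, 10.010])
        usl, lsl = 12.0, 6.0  # hypothetical spec limits, in ppb

        # Shapiro-Wilk: a small p-value suggests the data are not normal,
        # in which case Bothe's non-normal capability methods apply.
        w_stat, p_value = stats.shapiro(readings_ppb)

        mu, sigma = readings_ppb.mean(), readings_ppb.std(ddof=1)
        cp = (usl - lsl) / (6 * sigma)
        cpk = min(usl - mu, mu - lsl) / (3 * sigma)
        print(f"Shapiro-Wilk p = {p_value:.3f}, Cp = {cp:.2f}, Cpk = {cpk:.2f}")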

    1
    #250990

    Derek Kozlowski
    Participant

    Thank you. This was my general feeling as well, but I was unsure enough to seek greater wisdom! As always, it takes a village…

    0
    #250992

    Robert Butler
    Participant

    …one additional thought. With measurements in parts per billion/trillion you are sure to run up against round-off error in whatever analysis program you are using. This will be true even if you have (as most programs do today) double precision. I would recommend you express the measures in scientific notation, drop the power of ten, and run the analysis on what is left. At the end you can convert everything back to ppb or ppt.
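
    A minimal sketch of that rescaling (Python, with hypothetical ppt values stored as absolute fractions): divide out the common power of ten, analyze the plain numbers, and convert the summary statistics back at the end.

        # Pull out the common power of ten, analyze the mantissas,
        # and convert back afterwards. Raw values are hypothetical.
        import numpy as np

        raw = np.array([8.312e-12, 10.480e-12, 9.150e-12])  # absolute fractions
        scale = 1e-12                                        # ppt = 1e-12

        scaled = raw / scale          # plain numbers: 8.312, 10.480, 9.150
        mean_s = scaled.mean()
        sd_s = scaled.std(ddof=1)

        # Convert the summary statistics back to the original units.
        print(f"mean = {mean_s * scale:.3e} ({mean_s:.3f} ppt)")
        print(f"sd   = {sd_s * scale:.3e} ({sd_s:.3f} ppt)")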

    0
    #251032

    Chris Seider
    Participant

    Sure you could.

    Just look at your data: if it’s reported/recorded to the nearest 5 or 10 or 100, then you potentially won’t have the precision to treat it as continuous.
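
    As a rough sketch of that check (Python, hypothetical data), count the distinct values before and after the rounding your recording scheme imposes:

        # If the recorded values collapse to only a handful of distinct
        # levels, the measure is too coarse to treat as continuous.
        import numpy as np

        readings = np.array([8.312, 10.480, 9.150, 9.870, 8.990, 10.120])
        coarse = np.round(readings / 5) * 5  # as if recorded to the nearest 5

        print("distinct values as recorded:", len(np.unique(readings)))
        print("distinct values to the nearest 5:", len(np.unique(coarse)))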

    Don’t forget to do an MSA!

    0
    #251044

    MBBinWI
    Participant

    @derekkoz – as the learned @rbutler identifies, once you limit the number of decimal places you are going to use, you have created a discrete measure. The key is whether the number of decimals is sufficient to provide the resolution needed to answer the question being investigated. Now, don’t get me wrong, there are certainly absolute discrete measures, but even continuous measures are functionally discrete once you limit the number of decimals. The real question for you is what level of precision you need in order to answer the issues you are looking to understand.

    I don’t know if there is any proof of this, but generally you should have one decimal place more than the number of significant places you are looking to analyze. So, if you are trying to answer a question where the precision needed is to the 5th decimal place, and you are able to accurately and precisely measure to the 6th decimal, then you should be fine. If not, then you are going to have issues.
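
    A hypothetical numeric illustration of that rule of thumb:

        # Two true values differing at the 5th decimal place remain
        # distinguishable when measured to 6 decimals, but collapse
        # together when the instrument only resolves 4 decimals.
        a, b = 1.000010, 1.000020

        print(round(a, 6) != round(b, 6))  # True: 6 decimals separates them
        print(round(a, 4) != round(b, 4))  # False: 4 decimals cannot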

    This progression is one I have seen often in industry. At the outset of an improvement effort, a rather gross measurement scale is sufficient because the issues are rather large. As those are reduced, the measurements need to become more precise in order to discern the differences and make further improvements.

    Good Luck.

    0