iSixSigma

Jonathon Andell

Activity

  • I agree with you. It goes back to the sigma shift baloney, which nobody has been able to validate to my satisfaction. Also, in Minitab, Ppk presumes that within-subgroup variation is “short term,” while between-subgroup variation is “long term.” I am not one bit comfortable allowing Minitab to make that distinction regarding my process or its…[Read more]
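
    To make the within/overall distinction concrete, here is a minimal sketch in Python, using made-up subgrouped data and hypothetical spec limits, of how a "short-term" Cpk built on pooled within-subgroup variation can differ from a "long-term" Ppk built on overall variation. It illustrates the concept only; it is not a reproduction of Minitab's exact calculation (which applies additional unbiasing constants).

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    # Made-up process: 20 subgroups of 5, with some subgroup-to-subgroup drift
    subgroups = np.array([rng.normal(10 + drift, 0.5, size=5)
                          for drift in rng.normal(0, 0.3, size=20)])

    lsl, usl = 8.0, 12.0                      # hypothetical spec limits
    xbar = subgroups.mean()

    # "Short-term" sigma: pooled within-subgroup standard deviation
    sigma_within = np.sqrt(subgroups.var(axis=1, ddof=1).mean())
    # "Long-term" sigma: overall standard deviation of all observations
    sigma_overall = subgroups.ravel().std(ddof=1)

    cpk = min(usl - xbar, xbar - lsl) / (3 * sigma_within)
    ppk = min(usl - xbar, xbar - lsl) / (3 * sigma_overall)
    print(f"Cpk (within) = {cpk:.2f},  Ppk (overall) = {ppk:.2f}")
    ```

    When the subgroup means drift, the overall standard deviation exceeds the pooled within-subgroup value and Ppk falls below Cpk, which is precisely the distinction the software is making on the analyst's behalf.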

  • Think of residual analysis as a way to extract the maximum information possible from the data. If any of the readily available residual plots shows a non-random pattern, then something in the process is likely to be causing that pattern. Often the nature of the pattern gives strong “hints” as to what the cause is, and that leads to increased…[Read more]
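
    As an illustration of the residual checks described above, the following minimal sketch fits a simple straight line to made-up data and draws the two plots that most often expose a non-random pattern: residuals versus fitted values, and residuals in observation order.

    ```python
    import numpy as np
    import matplotlib.pyplot as plt

    rng = np.random.default_rng(7)
    x = np.linspace(0, 10, 60)
    y = 2.0 + 1.5 * x + rng.normal(0, 1.0, size=x.size)   # made-up data

    # Fit a straight line and compute residuals
    slope, intercept = np.polyfit(x, y, 1)
    fitted = intercept + slope * x
    residuals = y - fitted

    fig, (ax1, ax2) = plt.subplots(1, 2, figsize=(9, 3.5))
    ax1.scatter(fitted, residuals); ax1.axhline(0, color="gray")
    ax1.set(xlabel="Fitted value", ylabel="Residual", title="Residuals vs fits")
    ax2.plot(residuals, marker="o"); ax2.axhline(0, color="gray")
    ax2.set(xlabel="Observation order", ylabel="Residual", title="Residuals vs order")
    plt.tight_layout(); plt.show()
    ```

    A funnel shape in the first plot hints at non-constant variance; a drift or cycle in the second hints at a time-related cause.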

  • Amidst all the flotsam and jetsam of the responses, Darth had the most practical input. Stan said something similar with his sarcastic banking comment, but Darth’s input was straightforward. 10 years, 4 months ago

  • Thanks for the invite, Stanny. To which “this” are you referring? 10 years, 4 months ago

  • Stan’s inputs will be most useful if the fine folks at Minitab are afflicted with excessively positive self-images. 10 years, 4 months ago

  • I wish you were right, but I run into a lot of what I call the send-in-your-box-top kind of Black Belts still using the average & range method. If only the GM folks in the 1960’s had opted for abacus instead of pencil & paper, we might not be in this mess. Lots of good discussion here. A few additional comments, perhaps pointing out the obvious,…[Read more]

  • Darth, it seems like Stan and my 12-year-old son make use of a similar tactic. When he digs in on pushing people’s buttons, and I finally let him know he’s crossed the line, he suddenly tosses in a defensive, “I was just kidding.” He still thinks it’s a tactic superior to an apology and a backing off. Of course, he’s a few years away from…[Read more]

  • This certainly sounds like a process that could benefit from QbD. The question now becomes one of how far along the project has progressed, and whether there remains time and opportunity to use the tools. If you would like to chat a bit, give me your email address and we can set up a time. 10 years, 5 months ago

  • I hear about organizations claiming to pursue it, but I haven’t seen examples. Overall, QbD is a process that wraps itself around new product development. One potential application is to ensure that the ultimate manufacturing process is able to perform defect-free and efficiently. I have seen a few of what I call “anti-examples,” in which lack of…[Read more]

  • 8 what and 2 what? If those are process means, we still need to know the variation and the sample sizes. 10 years, 5 months ago
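
    To make that point concrete: whether means of 8 and 2 differ meaningfully depends entirely on the spread and the sample sizes. A minimal sketch with entirely made-up numbers (assuming roughly normal data) shows how the comparison would actually be run:

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(3)
    # Made-up samples: means near 8 and 2, but plenty of spread and few observations
    a = rng.normal(8, 6, size=5)
    b = rng.normal(2, 6, size=5)

    res = stats.ttest_ind(a, b, equal_var=False)   # Welch's two-sample t-test
    print(f"t = {res.statistic:.2f}, p = {res.pvalue:.3f}")
    ```

    Depending on the draw, a gap of six units can easily fail to reach significance at this spread and sample size, which is exactly why the means alone are not enough.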

  • Oddly enough I find myself in full agreement with you, Stan. It’s disconcerting not to have even a minor quibble. I have been way beyond merely unimpressed with the whole “sigma shift” baloney ever since it came out. I spoke with Bill Smith once upon a time, and he seemed way too smart to miss the fact that “six sigma” statistically means 2 parts per…[Read more]
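
    For context on the numbers behind that remark: a centered process with six standard deviations between the mean and the nearer specification limit implies roughly two defects per billion, while the familiar 3.4 defects per million only appears once the claimed 1.5-sigma shift is assumed. A quick check with the normal distribution:

    ```python
    from scipy.stats import norm

    # Centered process: both tails beyond +/- 6 sigma
    centered_ppb = 2 * norm.sf(6) * 1e9
    # With the claimed 1.5-sigma shift: the dominant tail sits 4.5 sigma from the near limit
    shifted_ppm = norm.sf(4.5) * 1e6

    print(f"Centered 6-sigma: {centered_ppb:.1f} parts per billion")
    print(f"Shifted by 1.5:   {shifted_ppm:.2f} parts per million")
    ```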

  • Seems likely indeed. However, predictable could take on a number of forms. My point to Marcy was that control charts may not reveal anything new, but there seems no harm in investing 10-15 minutes to check it out. I think we are much more in agreement than not. 10 years, 5 months ago

  • Your choice of words is of concern: “1.5 six sigma” makes no sense. However, rather than name calling, I’ll offer a brief explanation. If your top brass wants to increase “sigma” level by 1.5 every year, they clearly don’t understand the numbers. It takes roughly a 30-fold reduction of defect levels to go from 4-sigma to 5-sigma. That means you are…[Read more]
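
    A quick calculation using the conventional 1.5-sigma-shift DPMO convention shows why a gain of 1.5 sigma levels per year is such an aggressive target. The helper function below is just an illustration:

    ```python
    from scipy.stats import norm

    def dpmo(sigma_level, shift=1.5):
        """Defects per million opportunities under the conventional 1.5-sigma shift."""
        return norm.sf(sigma_level - shift) * 1e6

    for level in (3, 4, 5, 6):
        print(f"{level}-sigma: {dpmo(level):10.1f} DPMO")

    print(f"4-sigma to 5-sigma improvement factor: {dpmo(4) / dpmo(5):.0f}x")
    ```

    Each full sigma level above 4 costs roughly an order of magnitude or more in defect reduction, which is where the 30-fold figure quoted above comes from.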

  • I agree with you that displaying the data through time is appropriate, especially if it goes back a few years. And control limits may not provide anything actionable. That being said, and given the ease of using most software packages, I cannot see the harm in plotting the data in several formats: – Runs chart (non-parametric – ignores the amount…[Read more]
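
    For anyone wanting to invest the 10-15 minutes suggested above, here is a minimal sketch of one of those formats, an individuals chart with limits estimated from the average moving range, using made-up monthly data:

    ```python
    import numpy as np
    import matplotlib.pyplot as plt

    rng = np.random.default_rng(11)
    data = rng.normal(50, 4, size=36)            # made-up monthly values

    center = data.mean()
    mr_bar = np.abs(np.diff(data)).mean()        # average moving range
    sigma_est = mr_bar / 1.128                   # d2 constant for n = 2
    ucl, lcl = center + 3 * sigma_est, center - 3 * sigma_est

    plt.plot(data, marker="o")
    for y, style in ((center, "-"), (ucl, "--"), (lcl, "--")):
        plt.axhline(y, color="gray", linestyle=style)
    plt.title("Individuals chart (sketch)")
    plt.xlabel("Observation"); plt.ylabel("Value")
    plt.show()
    ```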

  • I’m not sure this contribution will serve for “research” purposes, but here it is. My perception is that Lean itself is unlikely to improve the writing of code, unless the code itself is going to contribute to some form of lean implementation. However, the methods of Lean can be used to manage the flow of projects. Many organizations use “push”…[Read more]

  • For now I will assume you are following changes in the mean (continuous data), or changes in the proportion (discrete data). For continuous data you also could establish sample sizes to track changes in standard deviation (actually, variance, which is the square of standard deviation). For continuous data you need to start with a reasonable…[Read more]
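
    As a sketch of the sample-size arithmetic the reply alludes to, the usual normal-approximation formula for detecting a shift in the mean is n ≈ ((z_(alpha/2) + z_beta) · sigma / delta)². The function name and the baseline values below (sigma, detectable shift, risks) are hypothetical placeholders:

    ```python
    from scipy.stats import norm

    def sample_size_for_mean_shift(sigma, delta, alpha=0.05, power=0.90):
        """Approximate n to detect a shift of `delta` in the mean of normal data."""
        z_alpha = norm.ppf(1 - alpha / 2)
        z_beta = norm.ppf(power)
        return ((z_alpha + z_beta) * sigma / delta) ** 2

    # Hypothetical: baseline sigma = 2.0, want to detect a shift of 1.0 unit
    print(round(sample_size_for_mean_shift(sigma=2.0, delta=1.0)))  # ~42
    ```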

  • I designed and taught a Black Belt and a Champion course for a home construction firm and its subcontractors, and coached the projects. I have colleagues who consult in both six sigma and lean for construction. Both methods are highly applicable. Lean will simplify the work flow, reduce delays, and trim costs. Six sigma will help understand why…[Read more]

  • I agree with both respondents thus far. For the relatively low PPM of your operation you will find that both methods result in nearly the same outcomes. Processes with higher PPM will show divergence between the two methods, at which point Eric’s method would be advised. As Eric pointed out, this works when the defect types occur more or less…[Read more]

  • Doesn’t mean I shouldn’t… 10 years, 5 months ago

  • Travel on my own nickel isn’t a very good option these days. I do like a lot of what Taguchi advocates prior to selecting a matrix, but I rarely use any of his matrices. Did you ever read Schmidt & Launsby’s book on DOE? Interesting outlook. Can I be a moron, too? 10 years, 5 months ago