Okay, okay. I know this has already been covered to death in other blogs and various discussion forums. But I am nonetheless compelled to offer my own take on the Wall Street Journal’s article concerning the departure of Home Depot CEO Robert Nardelli, and more specifically on the comments within that article suggesting that this is a substantial example of Six Sigma “not panning out as promised.”

The article prominently cites a study conducted by QualPro Inc. Before I get started, I’ll freely admit I haven’t read the study. I couldn’t find it on their website, or anywhere else online. (If anyone knows where to get it, please let me know.) So, I’ll have to rely on the words of the article’s author to describe it:

“Now QualPro Inc., a company that markets a competing process-management technique, has issued a study comparing the stock performance of companies that adopted Six Sigma with the performance of the Standard & Poor’s 500-stock index. QualPro has done work for Lowe’s Cos., Home Depot’s main competitor.

“Given that the study was issued by a Six Sigma competitor, it isn’t surprising that the comparisons aren’t flattering.”

The article goes on:

“A number of former GE executives — including W. James McNerney Jr., former CEO of 3M Co.; Dave Cote, CEO of Honeywell International Inc.; and Mr. Nardelli — helped spread the Six Sigma word but have seen their companies’ stock prices lag.

“Since announcing the adoption of Six Sigma on July 1, 2001, Home Depot shares are down 8.3% compared with a 16% rise in the S&P 500 over the same period. The stock rose more than 2% yesterday on the New York Stock Exchange, to $41.07, after Mr. Nardelli’s resignation.

“Honeywell shares are down 7.2% since its Six Sigma announcement in early January 2000, compared with a 3.6% fall in the S&P 500. Shares of 3M are off about 1% since late December 2003 versus the S&P 500’s 29% climb. GE shares rose sharply in the 1990s, but they’re down 16% since July 2000, when the company adopted Six Sigma, compared with the 2.6% fall in the S&P 500.”

First of all, I know I speak for most of us when I howl: correlation does not necessarily indicate causation. There’s enough material in that alone for several blog entries, but I’ll restrain myself, because there are more interesting things to quibble with. For example, what to make of this?

“Of the 58 companies reviewed in the QualPro report, 52 underperformed the S&P 500 index from the time they launched their Six Sigma programs through Dec. 5, 2006. Other underperformers include Lockheed Martin Corp., Ford Motor Co. and Xerox Corp.”

Yet the George Group claims on their website that “our client index has tripled in value while the S&P 500 has declined.” Further, on the back cover of this month’s iSixSigma magazine you can see their data indicating that “George Group clients outperform all major indices.” Presumably a large portion of those clients deployed Six Sigma. So what does it all mean? Is Six Sigma a good thing or a bad thing for stock price? George Group and QualPro both cite Xerox as an example – what are we to take from that? The point is, it’s impossible to tell. I’ve no doubt that QualPro and George Group both used accurate data, analyzed correctly, and yet they report diametrically opposite conclusions. Clearly what was measured, and how, must have been different, but none of that nuance is communicated. Again, there are several blogs’ worth of questions to ponder here.
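I can’t resolve that contradiction from the outside, but here’s a toy sketch, with numbers I made up (they’re nobody’s actual data), of how two perfectly accurate analyses of the same set of companies could still point in opposite directions: count how many individual clients trail the index, or measure the return of the client basket as a whole.

```python
# Hypothetical returns for five "clients" -- invented for illustration only.
client_returns = [-0.10, -0.05, -0.08, -0.12, 2.50]
index_return = 0.16  # the benchmark over the same period

# Measurement 1: how many clients underperformed the index?
losers = sum(r < index_return for r in client_returns)

# Measurement 2: how did an equal-weighted basket of the clients do?
basket_return = sum(client_returns) / len(client_returns)

print(f"{losers} of {len(client_returns)} clients underperformed the index,")
print(f"yet the client basket returned {basket_return:.0%} vs the index's {index_return:.0%}")
```

Both statements are true of the same data; one big winner carries the basket even though most of the individual names lag. I have no idea whether that’s what separates the two studies, but it’s exactly the kind of methodological detail the article never gets into.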

More importantly, neither study asks the more interesting question: what would the stock price of these companies have done over the same period if they had not deployed Six Sigma? That’s a question we can’t answer, because it requires an experiment that’s impossible to run. No one understands Y=f(x) for stock price (at least, no one worth less than 10 figures), so a single data point collected on a single x for a particular company doesn’t carry much weight. And given the massive differences from company to company, I’m not sure aggregating 58 different companies into a single study is any better. It just sounds better to the casual reader.
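To make that point concrete, here’s a minimal (and entirely made-up) simulation of the attribution problem. Assume every one of 58 adopters really did get a small boost from its program, and lump everything else about the business and the market into one big noise term. One observed excess return per company tells you almost nothing about that boost:

```python
# All numbers invented: a small positive "program effect" buried in noise.
import random

PROGRAM_EFFECT = 0.02  # assume the program truly adds 2% of excess return
NOISE_SD = 0.30        # everything else that moves a stock over several years

def observed_excess_return():
    # what we actually get to see: the effect plus everything else
    return PROGRAM_EFFECT + random.gauss(0.0, NOISE_SD)

companies = [observed_excess_return() for _ in range(58)]
trailing = sum(r < 0 for r in companies)
print(f"{trailing} of 58 adopters trail the index, even though every single "
      f"one of them benefited from the program by construction")
```

On any given run, something like half of the “adopters” look like losers despite a uniformly positive effect. That’s all I mean by single data points on a single x not carrying much weight.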

While I’m on the subject of QualPro, I found some of the statements on their website, um, interesting. For example, they say:

“Most of us were taught that the optimum process for gaining knowledge is to test one-thing-at-a-time and hold everything else constant. It is called the scientific method. MVT® (Multivariable Testing) proves that the scientific method doesn’t really work in the real world.”

I feel compelled to point out that the scientific method has been around, in one form or another, for well over two thousand years. Has QualPro suddenly discovered something that minor logicians like Aristotle, Descartes, and Gödel missed? Can QualPro seriously believe what they are saying? It’s ridiculous. In a sense they win by default, because I can’t even formulate a cogent argument to the contrary. It’s like trying to box with a swamp vapor – you can’t hit a bad smell.

And don’t even get me started on marketing multivariable testing as “better” than Six Sigma. What decent program doesn’t include multivariable testing where appropriate? As for equating one-factor-at-a-time (OFAT) experimentation with the scientific method, I’m personally insulted by the comparison. What is it about the induction/deduction cycle that precludes testing more than one variable at a time? Nothing. The scientific method says nothing about how you get from induction to deduction, and design of experiments (DOE), or “MVT” if you prefer, is a great way to do it. Fine, promote MVT, but why denigrate the scientific method to do it? MVT and the scientific method are perfectly compatible.
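For anyone who hasn’t seen it, here is about the smallest possible designed experiment, with response numbers I invented, showing that varying two factors at once doesn’t muddy anything. Four runs of a 2×2 factorial give clean estimates of both main effects and the interaction:

```python
# A 2x2 factorial in coded units (-1 = low, +1 = high); y values are invented.
runs = [
    # (A,  B,   y)
    (-1, -1, 10.0),
    (+1, -1, 14.0),
    (-1, +1, 11.0),
    (+1, +1, 19.0),
]

def effect(contrast):
    """Average response where the contrast is +1, minus where it is -1."""
    hi = [y for a, b, y in runs if contrast(a, b) > 0]
    lo = [y for a, b, y in runs if contrast(a, b) < 0]
    return sum(hi) / len(hi) - sum(lo) / len(lo)

print("main effect of A:  ", effect(lambda a, b: a))      # 6.0
print("main effect of B:  ", effect(lambda a, b: b))      # 3.0
print("A x B interaction: ", effect(lambda a, b: a * b))  # 2.0
```

Induce from the data, deduce a hypothesis, test it with the next design. The scientific method survives just fine.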

Finally, just to continue being picky: despite QualPro’s claims to the contrary, you can learn about interactions via OFAT experimentation. If you don’t believe me, believe Daniel (there’s a small sketch of the idea after the citation):

Cuthbert Daniel, “Factorial One-Factor-at-a-Time Experiments,” The American Statistician, Vol. 48, No. 2 (May 1994), pp. 132–135. doi:10.2307/2684266
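I won’t try to reproduce Daniel’s argument here, but the flavor of it, as I read the paper, is that an OFAT sequence that changes only one factor per run can still walk through all the corners of a factorial, and once it has, the interaction is estimable. Reusing the invented numbers from the sketch above:

```python
# An OFAT path through the same 2x2 design: each run changes exactly one
# factor relative to the previous run, yet all four corners get visited.
path = [
    # (A,  B,   y)  -- same invented responses as before
    (-1, -1, 10.0),  # baseline
    (+1, -1, 14.0),  # changed A only
    (+1, +1, 19.0),  # then changed B only
    (-1, +1, 11.0),  # then changed A back
]

# The effect of A is different at low B than at high B -- an interaction,
# learned from a strictly one-factor-at-a-time sequence of runs.
a_effect_low_b = 14.0 - 10.0    # A effect with B held low
a_effect_high_b = 19.0 - 11.0   # A effect with B held high
print("effect of A at low B:", a_effect_low_b, " at high B:", a_effect_high_b)
```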

In the end though, I guess none of this matters because:

“QualPro’s proprietary Multivariable Testing (MVT®) system uses more complex mathematics than is used in a Polaris Missile.”

Boy, I can’t think of a better reason to adopt a continuous improvement methodology than that.
