Bill Ford, Six Sigma, and Work Function
 This topic has 77 replies, 15 voices, and was last updated 17 years, 10 months ago by V. Laxmanan.


June 29, 2004 at 10:46 am #36006
V. Laxmanan, Member
Dear All: First of all, I would like to thank everyone for the encouraging comments and for challenging me to offer examples that test the idea of a work function and how it applies to Six Sigma. I have also submitted another post describing how the work function affects what we do when we develop control charts. In the interim, I was intrigued by what I read this evening in the most recent issue of Fortune magazine (June 28, 2004). With Ford's market share slipping, the cover story discusses Bill Ford's plans to make more money by selling fewer cars! I have tried to show here how this line of reasoning can be related to what Planck did to develop his quantum theory. This also provides some additional perspective needed to appreciate how to fix the constant h in the equation y = hx + c = hx − W, which I suggest is a generalization of Einstein's photoelectric law. Many different arguments can be used to fix h in problems outside physics. There seems to be no guiding principle that is correct in each and every situation. It is in this context that the distinction between what I have called an "apparent quantum" and the Planck constant h must be appreciated. The more important idea, however, is the nonzero work function W. Each (x, y) pair that we observe can be associated with a work function W. This is what we mean by y = hx + c = (hx − W). When we analyze the relation between x and y, using a different h only changes the numerical magnitude of W associated with different points on the x–y graph. This is similar to using different scales (Celsius, Fahrenheit, Kelvin) to measure the temperature of a body, to find the temperature gradients within it, or to find its cooling or heating rate. The work function affects many problems, including Bill Ford's vision. Again, thanks. Regards.
V. Laxmanan
Email: [email protected]
Attachments: Fordquanta1.pdf (Word document), Fordquanta2.pdf (PowerPoint presentation)
June 29, 2004 at 7:33 pm #102634
This is so obvious as to be ridiculous. I can't believe that you haven't been scorched by flaming posts all day long. There is nothing new here, and it is barely relevant to Six Sigma. You spend 4 pages babbling about nothing, and then make a jump in the last paragraph to explain the obvious: that Ford will either have to raise prices or reduce costs if they want to make more money while selling fewer cars. Give me a break!
June 29, 2004 at 8:16 pm #102639
Swaggerty, Participant
V.
Sean is obviously a flat-earther. Keep up the innovative analytics and, above all, keep posting them on the forum. Relevance is where you find it – or in Sean's case, don't find it. I agree: if Fords cost too much, don't buy them. (I drive a BMW so I can't really relate to it either, but I understood where you were coming from.) Solid work. Thanks. G.
June 29, 2004 at 9:17 pm #102643
V,
I do appreciate the effort you put into these posts. However, practically speaking, 'making more money (I assume this is profit) selling fewer cars' is not a new thing. Rolls-Royce, Ferrari, etc. are prime examples of 'niche' market selling and of 'recognition of quality work' put into these cars.
For Ford to make more money selling fewer cars, the following thinking applies:
1 – Fewer cars sold = fewer produced = more attention on production quality and improved productivity. (Improved quality will sell the car.)
2 – Fewer cars sold = lower purchasing costs, less waste, etc.
3 – Fewer cars sold = more production floor space, less inventory carrying, etc.
4 – Fewer cars sold = a smaller workforce to make them, less OT, etc.
Other such equalities can lead to more money ONLY if gains are realized on each car. The same per-car cost will not equal more money with fewer cars sold. Ford will need to add 'value' to the car and take away incentives (make the car sell itself). They can also stop making models that are poor sellers or money losers (a way of making fewer cars).
June 30, 2004 at 12:36 am #102648
V. Laxmanan, Member
Dear PB:
I agree with what you have pointed out.
We all know that Bill Ford is not going to make more money by just raising the price of his cars. He might not have a buyer. And, Bill Ford knows that as well. So, what are the options? This is the reason for this exercise.
I was just trying to put this into a mathematical form within the context of Planck's quantum theory. The equations I tried to derive and compare with Planck's equation give Bill Ford new degrees of freedom. These are contained in the temperature T (we need to understand what this means in economics, or in the financial and business world) and the Boltzmann constant k, which actually combines two constants – the gas constant and the Avogadro number. Is there an equivalent of a "mole" outside chemistry and physics, or an equivalent of a gas constant (which relates expansion and compression to heating and cooling and to pressure changes)?
We think about these things “figuratively” now – hot cars, market momentum, pressure etc. With adequate effort all the equations that you have written can be converted into the language of mathematics and eventually incorporated into k and T for the business world.
Again, I have just provided a thought process that somebody even more creative will hopefully someday find useful. Maybe Bill Ford will actually be able to "fine-tune" his company to meet his goal and vision. The math model provides a tool within which to develop this vision.
I don't expect everyone to agree with me, but I was encouraged by some of the positive reception to my ideas and felt emboldened to share this. I had thought about some of these things before, but the Fortune cover story prompted me to put these ideas together and submit them to the forum.
Yes, I have noted some of the negative remarks to this post and that's OK. You win some, you lose some. I want this to be a win-win for ALL. Thanks and regards.
Laxman

June 30, 2004 at 12:42 am #102649
V. Laxmanan, Member
Dear George:
Thanks a lot and please also see my comments with PB’s post. I was simply trying to show that there are options besides raising prices.
These are contained in the business-world equivalents of the Boltzmann constant k (itself a ratio of two interesting constants) and the temperature T.
Once Bill Ford can figure out (empirically, not theoretically) how to raise the temperature T and/or manipulate the effect of the business Boltzmann, he can definitely make more money by selling fewer cars.
Just look at the long list of cars in the Ford stable given in Figure 2 of my PowerPoint Presentation! A lot of cost here without realized profits (more money). That’s where Six Sigma teams at Ford can help. Ford should be able to achieve savings higher than GE or Motorola.
Regards.
Laxman

June 30, 2004 at 1:03 am #102650
V., I have been wondering where you were going with this, and an earlier comment about Platygæanism got me thinking along a different line. I'm not saying that you are, but suppose you were taking us through an anti-science laboratory experiment in which we (the forum) became your unwitting guinea pigs and your precise methodology was phenomenology – what would have been the expected incremental outcomes? And what would the study protocol look like? Let's see: anti-science disavows and repudiates the three centers of truth, i.e., pragmatism, coherence, and correspondence, and phenomenology further decrees that truth can be arrived at in the absence of applied theory or deductive reasoning (and yes, I did look some of this up, but forgot to record reference citations – sorry). Therefore, what would a phenomenologically based anti-science lab experiment look like? I believe it would look a great deal like your many postings on the work function – especially the one with attached calculations and conclusions. Therefore, I believe that I have found you out, you rascally academician you. Of course, I could be completely off base in this, and if that is the case, I apologize profusely – unless, of course, you are not ready for the experiment to conclude, in which case also never mind. T.
June 30, 2004 at 1:53 am #102652
V. Laxmanan, Member
Dear Theodore:
I assure you there is nothing anti-science here – actually a deep respect for science and Einstein's remarkable conception of a work function, and for the profound implication of a nonzero c if the governing law is
y = hx + c
We have, for purely mathematical reasons, tended to use the law y = hx, or y = mx, in many fields and called it "science", without any basis in empirical observations – a hallmark of science.
Now, strangely, it is I who is being accused of being anti-science, a very serious charge indeed, when all I have been doing here is showing repeatedly that empirical observations support the law y = hx + c.
Einstein’s work function, let me repeat, is the most beautiful explanation for the nonzero c. One has to really appreciate what James Watt must have gone through when Matthew Boulton asked him, “James, how much work can a horse do?”
We have to ask that same question over and over again.
So, Theodore, I hope I have made myself clear – not a bone, or even a cartilage, of anti-science here – all science – for the betterment of mankind, and of the U.S. economy in particular under the circumstances we live in today.
Please do let me know, if you still have any doubts. Regards.
Laxman

June 30, 2004 at 2:42 am #102653
V., I accept that you are indeed serious about your science. But this drives me back to wondering where you are coming from and going – unless, of course, (wink, nudge) the anti-science experiment is ongoing. T. J
June 30, 2004 at 2:54 am #102654
Reigle Stewart, Participant
Dear V. Laxmanan: I have reviewed your posts concerning the work function. I would agree there might be some interesting implications for the field of business management and quality management; however, it would seem that the readership of this forum has asked a very simple question of this profound thinking you offer. That question appears to be about the variable labels in the relation y = hx + c. For example, if one declares the variable y to be Process Yield, then what might the other variables be (i.e., h, x, and c)? Please offer us something we can get our teeth into. Surely, finding the right empirical data is not required to provide us with a set of hypothetical labels (and perhaps an application scenario). With such profession-specific labels, it will be much easier for many of us to evaluate your proposition and share the same profound insight you say is there. After all, if you cannot lead us back into our world of application, the insight remains yours alone. I humbly await your example labels. Respectfully, Reigle Stewart

June 30, 2004 at 3:12 am #102655
V. Laxmanan, Member
Dear Riegle Stweart:
I believe enough has been posted already, with specific examples of the type you wish for, to get a good feel for what can be accomplished. I would just be repeating myself over and over again.
May I suggest that you take a look at the data that has been analyzed in several recent posts. Regards.
Laxman

June 30, 2004 at 3:25 am #102656
V. Laxmanan, Member
Dear Reigle Stewart:
Oops… there were two typos, as you can see below:
Dear Riegle Stweart..
Sorry about that. Regards.
Laxman

June 30, 2004 at 4:33 am #102661
Reigle Stewart, Participant
V. Laxmanan: With the greatest respect, I have reviewed each of your posts. I also reviewed (in detail) your posted paper as well as the PowerPoint presentation. Perhaps it's my middle-aged eyesight, but I could not find anywhere within these references where you related the terms y, h, x, and c to specific variables within our profession's line of sight. I did read (in many portions of your work) where you related these terms to the physical variables from which the work function originated (i.e., physics), but not to the field of quality. Is it too much to simply fill in the blanks for us: y = ?, h = ?, x = ?, and c = ? Doing so would greatly help the others (and me) understand your reasoning and thereby support further inquiry. Thank you for helping us along. Reigle Stewart

June 30, 2004 at 5:53 am #102663
Theodore,
Great post …
Presumably it is also possible to associate many other scientific equations with other fields, such as the Ideal Gas Law:
V = volume of sales, P = competitive pressure, T = market demand, n = product availability, and k = Boltzmann's constant?
Andy

June 30, 2004 at 6:09 am #102664
V. Laxmanan, Member
Dear Reigle Stewart:
I take your query very seriously, and I promised myself that I would think about what it is that I have failed to convey to this forum. Then I saw this post. Let me try an example here.
Let x = Number of Opportunities
and y = Number of Defects.
The ratio y/x = defects per opportunity. If we multiply this ratio by a million, we get defects per million opportunities (DPMO).
Now imagine making carefully controlled observations on a process, similar to the well controlled scientific experiments that led to important discoveries – I know at least some in this forum are getting tired of this. But let me persist for a moment.
For example, and this is really true, Millikan published two papers on photoelectricity in the same 1916 volume of the Physical Review. In the first paper, he provides just two data points from his very careful measurements of K and f in the equation K = hf − W, and promises to report more detailed measurements later.
With just two good measurements, he shows that the Planck constant h he is able to deduce is consistent with the already-known value (deduced by Planck himself, in 1900). Millikan had developed a new approach to determining this universal constant of nature and, along the way, provided experimental proof for the existence of light quanta. Millikan himself is on record as calling Einstein's view a "reckless hypothesis".
Likewise, let’s imagine two very good measurements on a process where management’s goal is the achievement of 3.4 DPMO. The observations are:
x (in millions) 0.5 1
y 1 3
Notice that when x doubled the number of defects y tripled. But, like I said, these are very carefully controlled measurements and there can be no doubt of the veracity of these observations. What is going on?
The law relating x and y is y = hx + c = hx − W, where c = −W is nonzero. It is readily shown that h = 4, c = −1, and W = 1.
For the example given here y = 4x – 1 = 4(x – 0.25).
There is a finite cutoff, x = 0.25, below which defects go to zero. The higher the work function W, the more difficult it is to produce defects and the lower the numerical value of y.
The ratio y/x = 4 − (W/x) keeps changing as x increases because of the nonzero work function W.
However, if we fail to recognize the existence of a finite W and focus only on the ratio y/x, we miss the fact that defects are increasing at a high rate of h = 4 defects per million opportunities.
In this example, when x = 0.5 million, the ratio y/x tells us that we are operating at 2 DPMO, and when x = 1, the ratio y/x tells us that we are operating at 3 DPMO. Failure to recognize the work function leads to a false sense of what is going on. Furthermore, we are led to believe that Six Sigma is not working, since y/x keeps increasing as x increases!
Note that, theoretically speaking, the constant c can also be positive in some corporations. This is illustrated by the following example.
x (in millions) 0.5 1
y 3 5
Now the law is y = 4x + 1 = 4(x + 0.25), and the ratio y/x = 4 + (1/x) keeps decreasing as x increases. However, the defects are still increasing at the same fixed rate h = 4 DPMO as in the first example. Now we think DPMO is decreasing, but the absolute number of defects y just keeps going up.
To cite a Wall Street Journal article (I think it was in June 2000): they found that the unemployment rate in Ohio was going down, but the number of unemployed just kept going up. They wondered why!
Our focus on the ratio y/x thus makes us oblivious to the importance of the work function. (This can be illustrated nicely with a graph as well.)
Now, change x to revenues and y to profits, and the same example can be rephrased to show how the traditional profit margin can be a misleading indicator of the financial efficiency of a corporation. Again, there is a nonzero work function that we must account for. And there are many examples.
In the problems of interest to us, some "work" must be done to keep defects low. And some work must be done to keep profits high. The same mathematics works for both.
This is really very simple. Unfortunately, we keep using simple y/x ratios to make comparisons without thinking about the fact that the operating law may be y = hx + c instead of y = hx or y = mx, or whatever we wish to call the ratio y/x.
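[Editor's note: the two-point arithmetic in the examples above can be checked with a short script. This is only a sketch – the function name and layout are mine, and the second pair of points is taken as (0.5, 3) and (1, 5), consistent with the stated law y = 4x + 1.]

```python
def fit_line(p1, p2):
    """Fit y = h*x + c through two (x, y) points; the work function is W = -c."""
    (x1, y1), (x2, y2) = p1, p2
    h = (y2 - y1) / (x2 - x1)  # defects added per million opportunities
    c = y1 - h * x1            # intercept
    return h, c

# Example 1: x in millions of opportunities, y = defects
h, c = fit_line((0.5, 1), (1, 3))
print(h, c)            # 4.0 -1.0  ->  y = 4x - 1, so W = 1
print(1 / 0.5, 3 / 1)  # naive DPMO ratios: 2.0, then 3.0 (ratio drifts up)

# Example 2: positive intercept
h, c = fit_line((0.5, 3), (1, 5))
print(h, c)            # 4.0 1.0   ->  y = 4x + 1
print(3 / 0.5, 5 / 1)  # naive DPMO ratios: 6.0, then 5.0 (ratio drifts down)
```

In both cases the underlying rate h is the same 4 defects per million opportunities; only the intercept differs, which is exactly the point the post makes about the ratio y/x misleading us.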
I am repeating myself but only out of a great deal of respect for you as well. I sincerely hope this helps. Best regards.
Laxman
June 30, 2004 at 7:19 am #102666
Dear Laxman,
If you would like to try out a simple ‘physical’ model with which to test your thesis, just let me know.
I suspect that your work function is related to what is known as the 'functional entitlement' of a product. The functional entitlement is the yield one would obtain if all random (point) defects were zero. In other words, the functional entitlement is due to 'extended defects', i.e., those defects that extend to first, second, or even third and higher nearest neighbors.
I'm afraid I can't check this out for you, as I have other assignments, but good luck anyway.
Andy U

June 30, 2004 at 1:27 pm #102683
V. Laxmanan, Member
Dear Andy U:
Thanks a lot for your suggestion. I would certainly like to set up some experiment along these lines and would love to be involved.
But history also tells us that such a critical test of a new model must come from others who embrace it. (I am really giving broader meaning to Einstein’s W, there is nothing fundamentally “new”.)
So, I would like to be involved but I don’t think I should be doing the experiments myself.
Later I will post an interesting example of how y = hx + c works in the area of traffic related fatalities.
In January 2001, I remember reading a newspaper article that claimed that drivers were speeding (this was after the 55 mph speed limit had been raised) but the traffic fatality rate was going down. The article also provided some data points from the 1990s.
I was intrigued and was able to see that the law was y = hx + c with h < 0. For example, take the equation y = −4x + 10. The ratio y/x = −4 + (10/x) keeps decreasing with increasing x. Here y is the number of traffic-related deaths and x is the miles traveled. They call it VMT, vehicle miles traveled, in this field.
But this also told me that if the y/x ratio has been decreasing with increasing x while h < 0, there must have been an earlier period when h > 0.
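[Editor's note: the ratio behavior in that illustration can be checked in a couple of lines. The equation and numbers are the post's own illustrative ones, not real fatality data.]

```python
# Illustrative line from the post: y = -4x + 10 (y = deaths, x = VMT)
h, c = -4, 10
for x in (1.0, 2.0, 2.5):
    y = h * x + c
    print(x, y, y / x)  # ratio y/x = -4 + 10/x falls as x grows; y reaches 0 at x = 2.5
```

Note the cutoff at x = 2.5, beyond which the line would predict negative deaths – the same point Gabriel raises later in the thread.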
There was a hue and cry in the newspapers about an "epidemic" of highway-related fatalities. That is what led Congress to create what is now the NHTSA, the National Highway Traffic Safety Administration.
The NHTSA and the NSC (National Safety Council) have data on x and y going back to the early 1920s. When I plotted all of the data, it was clear that we were seeing all three types of linear behavior, as in the following:
y = 4x − 1, y = 4x + 1, and y = −4x + 1
This means (at least to me) that there is a smooth curve joining these three line segments. If we do not recognize the work function, we draw misleading conclusions. In this case, something good happened. But the maximum point on the fatalities–VMT graph occurred after the Arab oil embargo of 1973, when Congress also passed the 55 mph speed limit as an energy conservation measure. Then came seat belt laws, airbags, and anti-lock brakes. It was a real battle between the automotive industry and the safety advocates. Ralph Nader wrote his Unsafe at Any Speed, made the headlines, and captured national attention.
There are many complex factors, but a simple equation, the generalization of Planck's curve, seems to apply here. This elaboration is only to make things more interesting.
Such an experiment is needed to see how there is an interaction of many different types of "defects" that lead to what we call a "defective". But I need an established quality expert now who is willing to take some more steps. Anyway, thanks for the suggestion. You have made this more interesting.
Remember, Andy, Rome was not built in a day. And the theory of relativity that Einstein conceived was not finished in 1905. Indeed, as far as Einstein was concerned, it was never finished at all. Even towards the end of his life (in 1954) Einstein expressed the view that gravity was the least understood force of all. We have just started taking the "baby steps" needed. Regards.
Laxman

June 30, 2004 at 1:48 pm #102685
V. Laxmanan, Member
Dear Andy:
By the way, to continue the story, since we talked about gravity, Einstein’s suggestion of K = hf – W also led to a revolution, following Millikan’s confirmation.
Although Einstein himself did not like the "probabilistic" interpretations implied by quantum theory ("God does not play dice with the world"), he contributed to the revolution that led to the "dual" view of light and all types of matter.
After de Broglie suggested that electrons must have wave-like properties (here was a theorist who never did any experiments), Einstein got a copy of a paper by Bose, who gave a new proof for Planck's radiation law. Einstein immediately recognized the new insights that Bose was bringing to the problem. He had the paper translated (into German) and recommended its publication. He then suggested that if Bose was correct, there must be a new type of condensate, now called the Bose–Einstein Condensate (BEC), produced by the fusion of matter waves associated with slowly moving atoms.
This was in 1925. The BEC was finally synthesized in 1995, and the 2001 Nobel Prize was awarded for the synthesis of this new state of matter. MIT researchers, and others, are trying to build new types of lasers called "atom lasers" based on this idea.
We can learn from these developments. Underlying them is the willingness to pursue what mathematical logic sometimes tells us. The trail to the BEC was a very long one – but those who followed the trail were able to find it. When the BEC falls, I found out (based on the few data points that were available) that the acceleration is about 10 meters per second squared. This is higher than any known value of the gravitational acceleration here on earth.
I checked the data for possible errors in observation. I had only two points on the distance–time graph. I used Galileo's law to find the acceleration of the falling BEC drop – it expands like a balloon even as it falls, and emits a very bright light. I could not explain the higher-than-usual gravitational acceleration based on any errors in the distance–time values I had. I did communicate what I found to those in a position to check it out with further experimentation. But they are too busy following other interesting trails they were already pursuing.
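[Editor's note: the estimate described here – an acceleration recovered from a distance–time reading via Galileo's law for free fall from rest, d = ½at² – is a one-line calculation. The numbers below are hypothetical stand-ins, not the actual BEC data.]

```python
def accel_from_fall(d, t):
    """Solve d = 0.5 * a * t**2 (free fall from rest) for the acceleration a."""
    return 2 * d / t ** 2

# hypothetical reading: a drop falls 5 m in 1 s
print(accel_from_fall(5, 1))  # 10.0 m/s^2, vs. the usual g of about 9.8
```

With only two distance–time points, any error in either reading propagates directly into the estimate, which is the measurement-uncertainty concern Gabriel raises in a later post.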
This is still an open book and we might still learn something new about gravity with simple experiments being done here on earth. Again, thanks.
Laxman

June 30, 2004 at 2:08 pm #102687
Dear V. Laxmanan,
I have been following your posts from the beginning without any prejudice, just looking to see whether you can help me explain some of my working experiences.
So, if you have a proposal for a little experiment to do, please advise, and I will check whether I can do it (manufacturing area).
My email is: [email protected]
Rgs,
Peppe

June 30, 2004 at 3:09 pm #102692
Andy, there might well be universal formulaic and relational truths – look at the basic 80:20 rule. It's been either applied or misapplied (depending on your perspective) to just about every aspect of life. Much like the perceived capacity to solve the mysteries of life now being attributed to better understanding the ramifications and predictability of the coin toss. Impressively complex in its abstract simplicity – apparently, anyway.
June 30, 2004 at 3:10 pm #102693
Gabriel, Participant
V,
I said that I would comment on your examples if I had something to say. Until now, I didn't have anything to say, because I was unable to find a meaning in all this. Again, my fault.
But now there is something that really intrigues me:
Can something of the type y = −4x + 10 be the equation of the traffic-related deaths as a function of the miles travelled? Does it mean that I will still crash even if I don't travel at all (y = −4·0 + 10 = 10)? Does it mean that people who died in car crashes will resurrect if we travel more than 2.5 (x > 2.5 ==> y = −4x + 10 < 0)?
I think you are underestimating our usage of the independent term in linear equations. y = hx + c, and NOT y = hx, is the standard solution of any linear regression analysis.
I also think you are overestimating the power of the c.
Some things are just proportional. The number of heads after tossing a coin n times is a case in point. The ratio heads/tosses is not constant, because sometimes you get a head and sometimes you get a tail, and you don't know beforehand what you will get. But that ratio will be statistically constant as n increases, with the confidence interval becoming narrower and narrower around the true and never-knowable ratio. The ratio will not increase or decrease unless there is a special cause, like a non-random, "selective", cumulative deformation of the coin that keeps deforming it in such a way that it progressively favours one of the faces (such a thing happening in real life goes beyond my imaginative capability). You can have a sudden special cause that makes the ratio shift, such as a sudden one-step deformation of a zone of the coin in a way that favours one face. But what you will have there is a sudden change in the ratio too, not an increasing or decreasing ratio. Just one ratio before the break and a new ratio after it.
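[Editor's note: Gabriel's point that the heads/tosses ratio is "statistically constant" can be illustrated with a quick simulation. A sketch only – the seed and sample sizes are arbitrary.]

```python
import random

random.seed(1)  # fixed seed so the run is repeatable
heads, tosses = 0, 0
for n in (100, 10_000, 1_000_000):
    while tosses < n:
        heads += random.random() < 0.5  # True counts as 1 head
        tosses += 1
    # no drift and no intercept: the ratio settles ever closer to 0.5 as n grows
    print(n, heads / tosses)
```

The ratio wobbles at small n and converges at large n – proportionality with sampling noise, not a hidden work function.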
Also, when you compare your research with the physics principles discovered by Einstein et al., your comparisons are not consistent most of the time. You show, for example, what I think was the photoelectric experiment, where one can clearly see how the points of the chart for one element make a clear straight line, but for another element they clearly make a different straight line parallel to the first one. Note that, in that case, each line represents the behaviour of ONE element. In fact, they are two different charts which are superimposed. With your example of the golf players in the US Open you did the same: you made a line for each player. That's OK. But in the other examples you just seem to select any 2 points in the cloud of points, trace a straight line connecting them, and then trace lines parallel to that line touching other points, sometimes even just one point per line. And the two points used for the first line do not have the property of belonging to the same process, as was the case for the photoelectric effect and for the golf players. You would mix in the same line points that are from different check points, from different sample numbers, etc. So such a line is what? The number of defectives as a function of the sample size that applies to what?
And I also think that you are putting too much stress on the linear function, whether with c = 0 or any other real number. Some things are just not linear, so y = hx + c will not be adequate to explain their behaviour – with any h and with any c. You have power, exponential, log, trigonometric, and many other behaviours not covered by your work function. But you seem to try to solve this by making the constant c not so constant. Of course I can represent ANY function with y = hx + c if you let me choose the "c" function. For example, y = sin x = hx + c, where c = sin x − hx.
In short, y = kx, y = hx − W, and y = f(x) (any f) can work. You just have to use the right function for modeling the behaviour, and not force one of the functions.
But, again, all that is just what I think. And I can be wrong.

June 30, 2004 at 3:20 pm #102695
Gabriel,
But, again, all that is just what I think. And I can be wrong.
But you are not wrong. Just this morning I finally came to the same conclusion about what was bugging me about all of V's hocus-pocus. He is oversimplifying our understanding and use of linear regression, and making the bold assumption that we simply use y = hx rather than finding an intercept, or better yet using a nonlinear regression as required.
I still maintain that he has not come up with anything new, nor added any value to any problem-solving method. The linear equation is powerful, but only when used correctly, and I don't believe that he has used it correctly even once in any of his long and glorious examples.
SP

June 30, 2004 at 4:49 pm #102702
Gabriel, Participant
The acceleration of the gravity on the surface of the Earth ranges from aaprox 1.8 to 1.85 m/s2.
Your value of “abut 10 m/s2” is about 1.5 ro 2% higher. Could the difference be due a lack of accuracy in the estimation of the value 10m/s2?
I never measured the freefall of a “BoseEinstein Condensate” object, but maybe measuring the position of a body that “it expands like a balloon even as it falls and emits a very bright light” is not so precise, specially if the drop was of a very small height rather than from a few feets. From your phrase “I found out (based on a few data points that were available) that the acceleration is about 10 meters per second squared” it seems that there may be some variation in the data too. If the acceleration is constant and if there was no variation, you would get the same acceleration value with any comination of the first (zero speed) point with any other point and with the combination of any 3 points. And also the resolution could be a matter. If the resolution is not better than 1%, the anything else will be insufficient to assure an error smaller than 1.5% in the estimation of the acceleration.
If you could post the raw data, I could make a better analysis. From the scarce info I have now, the hypothesis that the difference is due to measurement error / uncertainty cannot be rejected.

June 30, 2004 at 5:11 pm #102704
I am in complete agreement with your thoughts about this thread; however, you may have made a typo concerning gravitational acceleration. I believe the value should be around 9.8 m/sec^2.
Kirk

June 30, 2004 at 5:12 pm #102705
And count how many TYPOs I just made in my previous message…
Gabriel, Participant
You are right, of course. It should read:
“The acceleration of gravity on the surface of the Earth ranges from approximately 9.8 to 9.85 m/s2.”
Thanks for pointing it out Kirk.0June 30, 2004 at 6:19 pm #102708
Reigle StewartParticipant@ReigleStewart Include @ReigleStewart in your post and this person will
be notified via email.V. Laxmanan. Thank you very much for your application
example (i.e., DPMO). The idea that y = defects and x =
opportunities greatly facilitated my understanding of your
value proposition. It seems to me that y=hx+c is
applicable, but does represent a significant overlap with a
statistical procedure called linear regression. As you
may know, simple linear regression assumes the form y =
b0 + b1x, where y is the dependent variable, x is the
independent variable, b0 is the intercept and b1 is the
slope. Essentially, it would appear that your work function
is of the same form as the linear regression model. If so,
we can say there is an association between the absolute
number of observed defects (y) and the absolute number
of defect opportunities (x). For any given situation, the
absolute number of observed defects will likely vary over
time even though the total number of available defect
opportunities remains constant (owing to influence of
random and/or nonrandom causes). But, the case y > x is
not physically possible given the conjugal relationship
between a defect and its corresponding opportunity. This
is to say that a single opportunity can only be realized in
one of two possible states; namely, 0 or 1: either the
opportunity proves to be defective or it does not (once
brought into existence). If x=0, it is not physically possible
to realize the case y > 0, again due to the conjugal
relationship. Nor can x be a negative number in the
physical world. Given this discussion, how would you
propose to establish the h and c terms of the work
function? If the work function is analogous to regression, I
need no guidance, but if the work function is not
analogous to regression, I must seek your advice on how
these terms should be properly established. What is your
thinking? Respectfully, Reigle Stewart0June 30, 2004 at 6:31 pm #102710
V. LaxmananMember@V.Laxmanan Include @V.Laxmanan in your post and this person will
be notified via email.Dear Kirk and also Gabriel:
It was not a typo!
The value of 10 meters per second squared was deduced from the data on distance and time that I found on the MIT video of a falling BEC. I also wrote a short article about this.
I sent it to the MIT group and we had an interesting email exchange – five emails over just one weekend!
They finally agreed with my calculations, and that the discrepancy from the commonly accepted value that you have both pointed out could not be explained easily and was not due to experimental errors.
I then checked if similar data was available from other groups who had been studying BECs and falling BECs. However, no data was available. I then went back to the MIT group and suggested that we review their videos carefully against markers that may be present in the background – or better yet, simply do some more experiments.
I also contacted the groups at NIST and Caltech.
However, no luck. They had their own trails to pursue.
The acceleration due to gravity does not depend on the mass of a body. This is what Galileo found. But, it does depend on the shape of a body. Hence, in a race between a cylinder and a sphere rolling down an inclined plane, the sphere will win.
But when an irregularly shaped boulder rolls down the mountainside, we can only “guess” the time it will take to reach the bottom. We must do the experiments all over again. With the BEC the shape is changing, since it expands like a balloon as it falls.
But no luck with my proposals to get any of the groups I contacted to do some more experiments to check this out carefully. Regards.
Laxman
P. S. There are a lot of typos in what I have been posting too, since I am trying to get things done in a hurry and respond to the messages. So, please don’t count!
Gabriel:
I am sending a file on traffic fatalities that should answer the question you posed in your other post today about y = 4x + 1 and traffic fatalities. You are absolutely right, and that is exactly what got me interested in this problem. I even sent a letter to the editor of The Detroit News saying that the analysis was based on erroneous use of the ratio. They didn’t bother to publish it. Neither did The Wall Street Journal when I tried to explain the curious Ohio unemployment rate. Thanks for following my posts. This exchange makes it all seem worthwhile. Regards.
Laxman0June 30, 2004 at 6:53 pm #102712
Ken FeldmanParticipant@Darth Include @Darth in your post and this person will
be notified via email.AND THE PRACTICAL SIGNIFICANCE TO SIX SIGMA OF ALL OF THIS IS????
0June 30, 2004 at 7:20 pm #102714Darth – don’t be a Luddite. Open yourself to innovation and new analytical approaches. Perhaps V. can take the time offline to bring you up to speed on the practical aspects of work function Six Sigma applications. He’s already demonstrated that his process gets him fairly close to the results gained via Six Sigma basic statistics. What more do you want?
0June 30, 2004 at 7:27 pm #102715
Reigle StewartParticipant@ReigleStewart Include @ReigleStewart in your post and this person will
be notified via email.Darth: The practical significance of all this is as follows.
Laxmanan has identified a simple equation that MIGHT be
useful to forecast the performance of something that
would normally be reported as a simple ratio. As you know,
many six sigma metrics are presented as a simple ratio,
but often fail to communicate what is really going on. For
example, final yield is often defined as the ratio of output
divided by input. However, the classic yield metric does
not account for rework (like rolled throughput yield does).
Therefore, the classic yield metric (output / input) can be
highly deceptive. It is possible to have a final yield of
100%, but in reality, the input exceeds the output.
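To make the contrast concrete, here is a minimal sketch with invented counts (three steps, every reworked unit eventually shipped):

```python
# Invented example: 100 units started through a 3-step process.
# first_pass[i] = units clearing step i with no rework needed.
units_started = 100
first_pass = [90, 95, 85]  # hypothetical first-pass counts per step

# Classic final yield: shipped / started. If rework eventually fixes
# every defective unit, this reads 100% and hides the "hidden factory."
final_yield = 100 / units_started

# Rolled throughput yield: chance a unit passes EVERY step defect-free.
rty = 1.0
for passed in first_pass:
    rty *= passed / units_started
print(final_yield, round(rty, 3))  # 1.0 vs 0.727
```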
Another example would be process capability, which we
recognize as a contrast of the design bandwidth to the
process bandwidth. Laxmanan contends that such ratios
are often misleading in terms of what is actually
happening and should be altered depending upon the
circumstantial conditions at hand (by way of changing the
constants). If his work function is merely a linear
regression model, then there is nothing “new.” On the
other hand, if the work function is not analogous to
regression AND there exists a practical way to establish
the constants, THEN it is POSSIBLE that the work function
should be further studied for inclusion in the six sigma
toolbox. If what Laxmanan has put forth holds water, it
represents a type of “universal performance metric” with
many applications within our field of endeavor.
Respectfully, Reigle Stewart0June 30, 2004 at 7:30 pm #102716
V. LaxmananMember@V.Laxmanan Include @V.Laxmanan in your post and this person will
be notified via email.Dear Darth:
This is a very GOOD question and deserves a response.
I am an R & D person. That is my professional background and what I have done all my life. When you finish a research project, if you are an academician, you write a paper and get it published. End of story.
If people find it useful and want to talk to you about it, BIG BONUS.
If you are doing industrial R & D, like I have, you see if you can file a patent. If you can, WONDERFUL. You get a plaque and a silver dollar from your company at the department annual or monthly meetings to honor those who have made contributions to the company.
But, I had a boss who was not happy with that. He wanted to do more. He actually went from one division to the other seeking a customer to implement the patent that he got, and showed that he could produce cost savings for the company he was working for.
He made great strides in his career and became department head and he encouraged everyone to do the same. Everyone in the department was trying to implement their patent (“work”) and produce cost savings. Indeed, I can say, our department was one of the “stars” of the R & D center where I used to work.
I tried to do the same and emulate his example. I did not rest with securing a patent. We went to the divisions and got funding to the tune of a few million dollars to set up an Advanced Development Center and start producing and testing prototypes. Then we transferred the technology to our suppliers. I also worked on implementing patents that others had filed, just as enthusiastically, when I was recruited.
But, all this was not so easy. They asked me the same question that you are asking. I told them, I should not be the one answering the question. The divisions that will sponsor this work should.
So, we put together a team involving representatives from many divisions who provided the needed inputs to do a cost-benefit analysis. It was only after the divisions agreed that there would be a great benefit from all this that the steps I described followed (Development Center, prototypes, suppliers, etc.).
These are the steps that I am aware of. This is the only way any successful research project is carried to its logical conclusion.
I believe in the merits of what I have developed. After all, I am standing on the shoulders of someone everyone worships.
It is Einstein’s work function. Now, I can only think about how it can be used and that is what I have done here.
The remaining steps will require a lot of hard work – by a lot of people.
Come aboard Darth!
Regards.
Laxman0June 30, 2004 at 7:30 pm #102717
Tired of it!Member@Tiredofit! Include @Tiredofit! in your post and this person will
be notified via email.OK OK, I now get it. So … Caltech does NOT return your call, Detroit News does NOT return your call, Wall Street does NOT return your call, why not then use this forum for your personal billboard? Who cares how many times you post per day and how long the notes are or how irrelevant the posts are. It’s FREE, isn’t it? Ah, the freedom of speech!!
Do you have a job?0June 30, 2004 at 7:32 pm #102719
V. LaxmananMember@V.Laxmanan Include @V.Laxmanan in your post and this person will
be notified via email.Dear Reigle Stewart:
Thank you so much for answering Darth’s question, which is very important. I did not see your post before I answered Darth’s question.
I am very encouraged by what has been going on since June 22, 2004. Iraq is now a sovereign country. Let’s rebuild ours as well. Regards.
Laxman0June 30, 2004 at 7:37 pm #102720Another Luddite peeks out from the burrow….
0June 30, 2004 at 7:42 pm #102722
Ken FeldmanParticipant@Darth Include @Darth in your post and this person will
be notified via email.Nice synopsis Reigle, thanks. But that is my point. First, you state the classic yield metric can be deceiving since it does not account for rework, correct… then you say “like rolled throughput yield does”. Problem solved. You summarize that Laxmanan “contends” things about other ratios. That’s great. I saw lots of formulas with ill-defined variables and no “practical” demonstration of validity. You then state that “If his work function is merely a linear regression model, then there is nothing ‘new’.” Agreed. So, my original question of where is the practical relevance to SS still stands. This has been a nice thread of theoretical proportions with much time spent on the basics of physics. Is it not appropriate at some point to ask Mr. Laxmanan to put forth “proof” of his contentions? Whether a “universal performance metric” is feasible given the diversity of applications in our field is questionable. But, like all theoretical propositions, at some point it is provable and useful, or it remains a merely amusing discussion. Would love to see a case study. The best part of the thread has been the civility of the posters.
0June 30, 2004 at 7:43 pm #102723
Reigle StewartParticipant@ReigleStewart Include @ReigleStewart in your post and this person will
be notified via email.To All. Innovation must rise above all else in our field of
endeavor. Rote understanding of existing knowledge is
great, but what marks the pioneer is the ability and desire
to innovate. V. Laxmanan is doing a marvelous job at trying to communicate a possible innovation. Until we study it and comprehend the possibilities, we will never know. To this end, I am reminded of the movie “Close Encounters of the Third Kind.” Imagine if, upon first contact, the human species, not understanding the “tones and pulses,” simply launched a missile at the source with the rationale “let’s shoot what we don’t understand.” We have witnessed this so many times on this discussion board (out-of-hand dismissals). Remember, many inventions had little practical significance at their inception but later proved to be milestones in the course of human history. V. Laxmanan is at least attempting to share a vision. He may be wrong or off course, but he is certainly innovative. His approach may be nothing new, but what a “packaging job.” Even if it is the same as regression, it is compelling – when is the last time so much discussion was generated about linear regression? Maybe this is a way to draw management into the use of regression (if that is what it turns out to be). Sometimes the mousetrap doesn’t work, but the box it comes in is quite useful. Please lay down your swords and open your minds to the possibilities, because such opportunities (innovations) are few and far between. Let’s communicate with ET, not shoot him down before the
message is delivered. My Humble Opinion, Reigle
Stewart0June 30, 2004 at 7:53 pm #102725
Ken FeldmanParticipant@Darth Include @Darth in your post and this person will
be notified via email.Agreed, we all need to be open to improving our profession. But the expansion and success of SS has largely been based on its practical value to business leaders. At some point, the business executive must see that there will be an impact on their business processes and eventually business profits, or whatever measure of success is relevant. All innovation must start with an idea, usually vague and unproven. I am merely prodding Laxmanan to move on to some demonstrated proof of his theory.
0June 30, 2004 at 8:04 pm #102727
Ken FeldmanParticipant@Darth Include @Darth in your post and this person will
be notified via email.Tyrone, Luddite must be your new favorite word for the day, since you want to apply it to anyone asking to move from theory to proof. A Luddite is against technology and technological innovation. Asking for a practical demonstration of value is a bit different than denying it has value at all. Whether we like it or not, our SS profession is one of practical application. That has been the core of its success and the reason so many of us are employed in this endeavor. It is doubtful a senior executive will embrace a theoretical proposition without some demonstration of its usefulness. I am not sure that the notion of the speed of a falling object can be clearly seen by most senior executives as a relevant way to improve their businesses. So, if you want to dwell in La La Land, that’s great. If this concept of Laxmanan’s has value and it is his intent to improve how we do things, then let’s get on with it and show it is a better way.
0June 30, 2004 at 8:06 pm #102728
Reigle StewartParticipant@ReigleStewart Include @ReigleStewart in your post and this person will
be notified via email.Darth: I fully agree with your position and support your
assertions. I do believe he is entitled to a little more time
to “boil the ocean,” although I too am getting somewhat
anxious about the proverbial “bottom line.” He did
promptly reply to my request for a pragmatic example
(reference his DPMO example), although it too was
somewhat clouded with mysterious terms (like h and c).
But if he can explain these terms in the context of the
DPMO example, we may have something. If not, I would
agree that the context is more “theoretical” than
“practical.” Thank you for your kind consideration of my
position. Regards, Reigle Stewart0June 30, 2004 at 8:12 pm #102729Reigle,
I agree that innovation must rise. However, the innovation must be understood, its practicality digested, and its applicability defined before its acceptance. I do want to make sure that the mousetrap I ordered functions, even though I may like the package.
What Laxmanan has done is shown us possibilities. I am trying to understand the stage at which I use his work function idea (if I accept it, and if I can). In the golf theory he laid out, one can predict a victory only based on the model generated for that game. Once the game is over, a new model needs to be generated for a different game, for which some data is required. My understanding is that previous models cannot predict future results. My understanding also is that the model (work function) can change with each new data point. Does this mean that the work function is applicable to finite data points only? What about a manufacturing process where there is continuous data? If Tiger Woods shoots 8 under par on the 1st day (and he is leading by 5 strokes), the model would predict him winning the whole thing. Let us say he continues the victorious march and now leads by 6 strokes on the 2nd day. Then on the 3rd day??? 4th day???
I know I will have to continue digesting various posts from Laxmanan in this forum. I agree completely with one previous post. Posters have been very civil in this post.
PB0June 30, 2004 at 8:14 pm #102730Laxman,
Instead of all these posts going back and forth – whose relevancy to SS, and whose content, I still question – please post your email so anyone who is interested in your work can contact you directly.
We had enough of this; let’s move to some practical SS discussions.0June 30, 2004 at 8:25 pm #102732
V. LaxmananMember@V.Laxmanan Include @V.Laxmanan in your post and this person will
be notified via email.Dear Reigle Stewart:Thank you very much..
“…It seems to me that y = hx + c is applicable, but does represent a significant overlap with a statistical procedure called linear regression…” etc.
Yes and no. You can skip to the last paragraph, if you wish, where I give the ‘nutshell’ version.
As you know, least squares is just an attempt to fit a straight line through many points that fall approximately, but not exactly, along a straight line on the xy graph. We do it by minimizing the square of the errors from the fitted line, or regression line.
But Millikan never used this argument to determine h. (I have plotted his data in one of my posts.) Instead he finds six values of the slope and takes an average. I wondered why he didn’t use least squares. Then I read Legendre’s original March 1805 paper (thanks to the Internet). He tells us that what he is proposing is just a simple and mathematically elegant way of doing what we would like to do. We know the relation is linear, but how do we quantify it?
Then it hit me that minimizing the square of the errors may be just like minimizing what we call entropy (or chaos, or disorder) in physics. The position of a body is measured relative to a reference point. Thus, displacement equals (x – a), where “a” is an offset that we must allow for, since we take a = 0 for convenience when measuring the distance of a body. Just like we set the mean to zero in the standard form of the normal distribution curve.
But now we generalize this idea to any “position” or “metric” of interest to us. The velocity is obtained by dividing by time, and velocity squared is proportional to the kinetic energy of the moving body. Hence, the square of the deviation from any mean is like the kinetic energy associated with the “system” of “numbers” that we use to quantify our observations in many walks of life.
Once you get this basic thrust, you have no choice but to go all the way back to what Planck did with entropy, energy, and temperature, and what Einstein did with entropy and energy, to then arrive at the work function – when light shines – no pun intended.
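A concrete sketch of the least-squares point (all numbers invented): if every (x, y) pair obeys y = hx – W with the same W, ordinary least squares recovers h and –W exactly; let the work function vary point to point and the fitted slope and intercept drift.

```python
# Invented data: every point obeys y = h*x - W with h = 2.0, W = 3.0.
xs = [1.0, 2.0, 3.0, 4.0, 5.0]
ys = [2.0 * x - 3.0 for x in xs]

def ols(xs, ys):
    """Ordinary least-squares slope and intercept."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sum(
        (x - mx) ** 2 for x in xs)
    return slope, my - slope * mx

slope, intercept = ols(xs, ys)  # recovers 2.0 and -3.0 exactly

# Now let the "work function" vary point to point: the points scatter,
# and the fitted slope/intercept drift away from h and -W.
Ws = [2.8, 3.1, 3.0, 3.3, 2.9]
ys2 = [2.0 * x - w for x, w in zip(xs, Ws)]
slope2, intercept2 = ols(xs, ys2)
print(slope, intercept, round(slope2, 3), round(intercept2, 3))
```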
I went through the whole exercise replacing energy, entropy, and temperature by abstract mathematical symbols. That’s why I posted what I did about what Bill Ford’s challenge is.
He can reduce the number of cars sold, but we all know (and so does Bill Ford) that he cannot make more money by simply raising the price of the car. Something else must be done. This requires a deeper appreciation of the various constants that appear in Planck’s equation, as they apply to problems beyond the black body radiation.
A company can be thought of as a body “radiating” energy – it is now called money. The higher the temperature of the company, the easier it is to sell product. The higher the work function, the more difficult it is to produce defects in its processes. Einstein’s law applies to visible light. But light is special. A blackbody is messier.
This is the analogy that I have been working with. When I first started seeing deviations from a linear law, then a power law, and then found that a power-exponential law describes some of the data from the automotive industry I was studying, I knew I had to understand Planck thoroughly.
Only recently, after studying Business Corporate Scorecards (profits and revenues data for 900 U. S. corporations) for the past four years, I realized that it is time to call the constant “c” a work function. Then, I had to study Einstein’s paper all over again.
He first talks about entropy density and energy density of light. Now, I was ready. It was just density of property x and density of property y. Then he tells us something about the temperature of light. I said, I know what he is talking about now – thank goodness, and how smart to get to this first. He was showing that we have to think differently from Planck. He is talking about radiation outside a body emitting light. Planck is talking about the body itself.
Then he comes to the work function. Along the way he talks about entropy again and how the entropy of particles compressed into a gas is related to the probability of finding the particle in a given volume.
In a separate post yesterday, Andy U was talking about (or joking about!) the broader application of the ideal gas law. Yes, it can be taken literally. We can give new meaning to pV = nRT. But now we need to do experiments again to verify the law.
Sorry about these analogies again. But once we get them right, we can build a process and make sure it works just like the Universe that God created has been functioning with a single value of the Planck constant h. Why can’t the business world function in the same way?
A lot of soul searching is still required – now by others – if you are willing to see what I have tried to present here. I think I am ready to answer many questions that will be posed. I will try to do that to the best of my ability – to a receptive audience. Again, thanks. I hope I have answered your question, if only indirectly.
In a nutshell: it looks like classical linear regression, but now we know why points show a scatter. Each has a different work function. If all points had the same work function, they would collapse onto a single perfect line y = hx + c, and the least-squares slope becomes the same as the slope h, which can be deduced in many ways. What is the best way? Ask Einstein. Study what he did with light. Regards.
Laxman
0June 30, 2004 at 8:29 pm #102733
V. LaxmananMember@V.Laxmanan Include @V.Laxmanan in your post and this person will
be notified via email.Dear PB:
I completely agree with the civility point noted in this last paragraph, and please accept my wholehearted thanks – I mean to ALL in this forum. Like Mike says, maybe today the job is done.
Everyone knows where to contact me. Regards.
Laxman0June 30, 2004 at 8:50 pm #102734
V. LaxmananMember@V.Laxmanan Include @V.Laxmanan in your post and this person will
be notified via email.Dear ALL:
I think I have a new name.
This is my closest encounter with the Six Sigma experts.
Maybe you will start hearing the messages from ET soon.
To, Reigle Stewart, my sincerest thanks.
Regards.
Laxman0June 30, 2004 at 8:50 pm #102735Laxmanan,
I believe you have handled the post response very well.
In your post to Reigle, you mention work functions for each x for Bill Ford. Some of the x’s he would have to consider would be – raising price (may generate a -ve slope), improving quality, reducing defects, reducing direct/indirect cost(s), etc. However, each cannot be studied as a separate work function, can it? (Unless you want to look at only 1 x that would be directly correlated to the y and not worry about the other x’s.)
Would not your work function model y = hx + c now become
y = h1x1 ± h2x2 ± h3x3 ± … ± c (linear regression)?
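That multi-x form can be fitted by ordinary least squares; a minimal sketch with invented numbers and illustrative coefficient names (nothing here is taken from Laxmanan’s files):

```python
# Invented data generated from y = 2*x1 + 0.5*x2 + 1 (no noise), so the
# fit should return the coefficients essentially exactly.
rows = [(1.0, 2.0), (2.0, 1.0), (3.0, 5.0), (4.0, 2.0), (5.0, 7.0)]
ys = [2 * x1 + 0.5 * x2 + 1 for x1, x2 in rows]

# Normal equations (X'X) b = X'y with columns [1, x1, x2].
X = [[1.0, x1, x2] for x1, x2 in rows]
A = [[sum(row[i] * row[j] for row in X) for j in range(3)] for i in range(3)]
b = [sum(row[i] * y for row, y in zip(X, ys)) for i in range(3)]

# Tiny Gauss-Jordan elimination; fine for a well-conditioned 3x3 system.
for i in range(3):
    p = A[i][i]
    A[i] = [v / p for v in A[i]]
    b[i] /= p
    for k in range(3):
        if k != i:
            f = A[k][i]
            A[k] = [vk - f * vi for vk, vi in zip(A[k], A[i])]
            b[k] -= f * b[i]

c, h1, h2 = b  # intercept, then the two slopes
print(round(c, 6), round(h1, 6), round(h2, 6))  # 1.0 2.0 0.5
```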
PB0June 30, 2004 at 9:11 pm #102737Darth,
You are getting soft. ;)0June 30, 2004 at 9:36 pm #102739
V. LaxmananMember@V.Laxmanan Include @V.Laxmanan in your post and this person will
be notified via email.Dear PB:
First thanks for the kind remarks.
Again, YES and NO to the question posed. This gets us back to what I think is called multivariate analysis (remember what Tim F did with the data on time, temperature, load, traffic… posted by Indresh yesterday).
The work function, if you are willing to agree with me and give it a try, encompasses ALL of the uncertainties associated with a multivariate problem. If it succeeds, we might be able to get around multivariate regression and the problems it poses. Simple linear regression (OLS) is difficult enough to use – I just saw a post on plastic molded parts where someone is trying to let the parts cool and predict the dimensions.
Anyway, getting back to your question, we need to identify the most important driver of all – what we truly believe is the relevant x for a process of interest. In the example of traffic fatality studies, many different “x’s” have been used, each to make one point or another, but there is no universal understanding yet of how to study this problem.
So, in a nut shell, find the correct x and measure the correct y. The work function takes care of all the things we don’t know.
Again, PB, this is the genius of a young Einstein who couldn’t find a job. He thought about a problem that puzzled the great minds in physics – each of whom received the Nobel Prize. That light may be a stream of particles had been proposed by Newton himself – who showed us how to separate light into the colors of the rainbow.
But Newton’s idea of particles of light was proved to be WRONG, convincingly, in the 1850s. Yet, Einstein revives the old discredited idea. He introduces the ideas of energy and work, which Newton had never thought about (the law of conservation of energy could not be formulated until the nature of heat was understood). Then he conceives the work function – to explain a very complex set of things going on when a photon makes a close encounter with an electron. It
– strikes the surface of a metal
– penetrates the surface of a metal
– strikes one of zillions of electrons
– what are the forces acting on the electron? zillions of positively charged nuclei, zillions of negatively charged electrons – an impossible job.
So Einstein says, let’s call this mess W. Thus, K = hf – W. Problem solved.
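As a worked number for K = hf – W (standard textbook values; sodium’s work function is roughly 2.28 eV):

```python
# Photoelectric effect: K = h*f - W (max kinetic energy of the electron).
h = 6.626e-34        # Planck constant, J*s
c = 2.998e8          # speed of light, m/s
eV = 1.602e-19       # joules per electron-volt

W = 2.28 * eV        # approximate work function of sodium
wavelength = 400e-9  # 400 nm violet light
f = c / wavelength   # frequency of the light

K = h * f - W        # positive, so electrons are ejected
print(round(K / eV, 2))  # about 0.82 eV
```

Below the threshold frequency f0 = W/h (roughly 545 nm for these numbers) K would go negative: no electrons, no matter how bright the light.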
Millikan thought Einstein was nuts! He wanted to prove he was wrong. But, good scientist that he was, he actually ended up proving Einstein was right. He admits to this in his Nobel lectures.
The work function, in a nutshell, is a powerful idea. It can help us get around multivariate analysis. Now, I know I am going to get whacked.
But, if you wish, you can do the same over and over again and plot every measurable y against every measurable x and keep looking for h and c and a work function for each variable. Maybe that is what we have to do for some time until we understand all of this fully. When we are done, we will find out that one good (x, y) pair will get the job done.
When was the last time anybody tried to measure the gravitational acceleration when they found something falling?
They did not do it even with the BEC, and neither did they, to the best of my knowledge, when many orbiting bodies fell. They just watched and assumed that g = 9.80, plus or minus whatever, and kept doing what they were doing with their equations.
Unfortunately, we cannot do this in the business world. We all know how to count money. But, we still don’t know how to count energy, never mind what the physicists and chemists tell you they can do!
That’s also why cold fusion did not take off. I am not saying it’s a great idea to explore – only that suddenly everyone was very serious about the measurement of energy, very precisely.
But every day – in automotive and aircraft labs engines are tested and we fail to account for many “losses”. Try preparing a graph of work versus heat input to an engine.
I tried to do that. I couldn’t find any data for any of the hundreds of automotive engines that we drive around in! But there is data all around us for hundreds and thousands of corporations. Yet nobody has analyzed it to see how the law Profits = (Revenues – Costs), the equivalent of Work = (Heat in – Heat out), applies in the real world.
How are profits related to revenues? Believe me, y = hx + c.
What is my point? Maybe we don’t need the multivariate regression where we consider profits, revenues, assets, market values, shareholder equity, and you name it. Just study profits and revenues to understand corporations, and understand the work function and how much it changes for a given corporation. For Microsoft, I can tell you, BINGO – a perfect straight line with no real change in the work function. Not so for others. That’s why Microsoft is doing what it is doing and others are not.
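A sketch of that reading (all numbers invented, not real corporate data): fix a slope h, compute each year’s work function W = h·x – y from the (revenue, profit) pair, and compare how much W moves:

```python
# Invented (revenue, profit) pairs in $B for two hypothetical companies.
# With y = h*x - W, a fixed h turns each pair into a work function
# W = h*x - y; a near-constant W means the points sit on one line.
h = 0.40  # assumed slope; changing h only rescales the W values

companies = {
    "steady":  [(20, 6.0), (25, 8.0), (30, 10.0), (35, 12.0)],
    "erratic": [(20, 6.0), (25, 5.0), (30, 11.0), (35, 8.0)],
}

spreads = {}
for name, data in companies.items():
    ws = [h * x - y for x, y in data]
    spreads[name] = max(ws) - min(ws)
print(spreads)  # "steady" spread ~0; "erratic" spread 5.0
```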
Sadly, no one knows this simple truth – still waiting to be discovered. So, thanks again for asking. Regards.
Laxman0June 30, 2004 at 9:52 pm #102740
GabrielParticipant@Gabriel Include @Gabriel in your post and this person will
be notified via email.V. Laxmanan, or ET :)
– The typo Kirk and I were talking about was mine, not yours. I mistakenly wrote that the acceleration of gravity was 1.8 m/s2, instead of 9.8.
– After your explanation of how the value 10 was obtained, now I am almost sure that the difference from the accepted value of the acceleration of gravity is due to measurement uncertainty. I just find it hard to imagine someone measuring the distance between two points with a tape over a video monitor (a distance that would be further affected by any distortion introduced by the camera, the video player or the monitor), measuring the time elapsed while a body (that changes its size and shape and that emits a very bright light) travels between those 2 points (a time that would be further affected by any discrepancy in the playback speed introduced by the camera or the video player), combining these measurements of time and distance (each with their errors) in the equation g = 2d/t^2 (the t squared will magnify the error), and getting g with an uncertainty of less than 2%. Note, for example, that if the time elapsed was 1 second, measuring it with a 1/100-second chronograph will add 1% of uncertainty just because of the resolution. And with an acceleration of 10 m/s2, 1 second equals 5 m travelled. Did the BEC body travel that much in free fall? Do you have an estimate of the uncertainty in your determination of g?
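The t² magnification can be put in numbers (illustrative values, not the actual experiment): to first order, Δg/g ≈ Δd/d + 2·Δt/t for g = 2d/t²:

```python
# First-order error propagation for g = 2*d/t^2 (illustrative numbers).
d, t = 1.0, 0.45     # assumed drop: about 1 m in 0.45 s
dd, dt = 0.01, 0.01  # 1 cm distance error, 1/100 s timing resolution

g = 2 * d / t ** 2
rel_err = dd / d + 2 * dt / t  # the factor of 2 comes from the t^2 term
print(round(g, 2), round(100 * rel_err, 1))  # ~9.88 m/s2, ~5.4% uncertainty
```

Timing alone contributes about 4.4% here – already bigger than the 2% discrepancy in question.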
– I hate to say this, but you are plain wrong in that the acceleration of gravity depends on the shape of the body. Your example of the race between a cylinder and a sphere is just wrong. The reason why the sphere will win is the friction force between the rolling body and the plane. Put them on a friction-free plane and both bodies will arrive together at the bottom (and neither of them will rotate). The acceleration of gravity is, by definition, the acceleration of a body in a gravitational field when the only force acting on that body is the weight (gravitational attraction). With your reasoning, I could say that the acceleration of gravity depends on the density of the body, because a ball of cotton falls slower than a ball of steel of the same weight.
(Note: In fact, the acceleration of gravity will depend on the shape of the body, or more exactly on the volumetric distribution of its mass, but for reasons that are far from what you said and that have no practical significance in the BEC falling experiment. The influence is due to the fact that the gravitational field around an object is the result of integrating the infinitesimal gravitational fields produced by each of the infinitesimal mass elements of the body. The nice thing is that if the body is spherical, the gravitational field at any point of space outside the surface of the sphere equals the field that would be produced at the same point by a particle located at the center of the sphere with the same mass. And the Earth is “nearly” spherical, and that’s why the acceleration of gravity is “nearly” equal at any point on its surface.)
– Apart from highway fatalities, in the same post I mentioned several other objections to how you use your model. If you have something to say about those, I’d like to hear it.0June 30, 2004 at 10:11 pm #102742
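The error propagation Gabriel describes for g = 2*d/t^2 can be sketched numerically. All the measurement values and uncertainties below are invented for illustration; they are not figures from the BEC experiment:

```python
import math

# Invented (illustrative) measurements -- NOT actual BEC data.
d, sigma_d = 5.0, 0.05   # distance fallen [m] and its uncertainty
t, sigma_t = 1.0, 0.01   # elapsed time [s] and its uncertainty

# Free fall from rest: g = 2*d/t^2
g = 2 * d / t**2

# First-order error propagation: relative errors add in quadrature,
# and the time term is doubled because t appears squared.
rel_err = math.sqrt((sigma_d / d) ** 2 + (2 * sigma_t / t) ** 2)
sigma_g = g * rel_err

print(f"g = {g:.2f} +/- {sigma_g:.2f} m/s^2")  # g = 10.00 +/- 0.22 m/s^2
```

Even with a generous 1/100-second timing resolution, the t-squared term alone contributes about 2% uncertainty, which is roughly the whole gap between 10 and 9.8.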
V. LaxmananMember@V.Laxmanan
Dear Gabriel:
Regarding shape affecting acceleration – I know I am right, or that is what physics books teach us. Please also take a look at the first chapter of The Birth of a New Physics, by Bernard Cohen. You will find an interesting discussion of the many different views about falling bodies held before Cohen starts telling us the story that changed our view from the Aristotelian one. Please also read what Newton says on the very first page of his Principia. There he talks about mass – he was the first to give a scientific definition of mass and of how to measure it. He also talks about timing pendulums – even hollow ones that were stuffed! It is an amazing opening paragraph for a monumental work.
Now, I must admit I have never seen the race between a cylinder and a sphere.
Regarding BEC, I communicated directly with the most important people involved with these experiments at MIT – I don’t want to name anyone here.
I could not believe what I deduced either, just as you cannot. In the end, after our email exchanges, I was unable to persuade them to revisit this important matter. I then tried other groups. Same outcome. No data and, amazingly, no interest in following the trail.
Then I also talked to someone about how much it costs to set up a BEC unit. It is just about $1 million. Now, it may be even less, since MIT may be willing to donate theirs. They must have built new generations. If someone can help me get one of these units, I would be glad to repeat these experiments. Anyway, it looks like it was my fate to be here!
I am glad we are talking and I would be happy to continue this. But, perhaps, the forum may be getting tired of this. I don’t want to overstay my welcome.
Glad to continue if it is OK. Regards.
Laxman0June 30, 2004 at 10:18 pm #102743
Ken FeldmanParticipant@Darth
Now this post and the reply from V are definitely relevant to SS…..NOT!!!! This has been fun, but let’s take this one off line as suggested earlier and get back to something that, while more mundane, is relevant to what we do in SS. Can’t wait to run and tell Ken Lewis about how falling objects and the acceleration of gravity are going to keep our profits going in light of rising interest rates. On second thought, I will probably find out first hand when he throws me out the window from the top floor of the corporate building.
0June 30, 2004 at 10:29 pm #102744
V. LaxmananMember@V.Laxmanan
Dear Gabriel:
Just one more thing…. You said: “And the Earth is ‘nearly’ spherical and that’s why the acceleration of gravity is ‘nearly’ equal at any point on its surface.”
Would it intrigue you if I called your attention to modern satellite observations that show that the shape of the earth has changed over the thirty-some years since observations began? This was reported a couple of years ago in Science magazine. I would be glad to give the exact reference, if you wish. Or you can do an Internet search.
But, in Newton’s days, pendulums were carried by those scientists (who were then called astronomers) who traveled to different parts of the globe. They reported various lengths for pendulums that vibrate in one second. Newton studied this data and used it to conclude that – if the inverse square law is correct – the earth must be slightly more bulged at the equator than at the poles. And we all agreed, since he showed us so many different things in that marvelous book.
Now satellite photographs tell us that the bulge has disappeared. This is due to large-scale redistribution of mass. The earth seems to be pulsating – the marvels of modern observational techniques.
Newton developed calculus (which then gave us least squares) to address this problem of mass distribution within the earth. He had to convince himself that it is OK to treat the earth as a “point mass”.
We can have a lot of fun if physicists, chemists, engineers, biologists, and statisticians all come together. But statistics – oh, that has always scared me! I just got by learning some simple statistics, unlike many of you here at the forum who no doubt know a lot about that discipline. Instead of trying to learn statistics, I started thinking about Planck’s law (also rooted in statistics, how could I be wrong!) when I got interested in studying data from many walks of life.
Only then did I come to prefer the simple idea that Einstein puts forward. Enough WF, I shall NOT SAY IT. Regards.
Laxman0June 30, 2004 at 10:35 pm #102745Darth,
No, Luddite is not my new favorite word, nor Luddites my new favorite people. I do appreciate your telling me what the word means; your level of contribution is sustained. It appears that Laxmanan has tried to slowly format our little internal hard drives to give us the opportunity and the wherewithal to take the time to think about and understand his concepts prior to plugging in the data. I understand your impatience and also am anxious to see his results, but I’m willing to give him some time to get there. Did senior executives embrace Six Sigma when it was in the hallway stage of conceptualization? Probably not, but you’d be in a better position than I to know the history of Six Sigma. I wonder if early Six Sigma theorists were thought by senior management to be out there in La La Land? Possibly. Now, Darth, demonstrate again your contribution to your field by running a search query to see how many postings Darth has versus Tyrone. It’s been a most impressive demonstration each time you have done it. Actually, brummagem is my new favorite word. I hope you don’t find it offensive. Tyrone0June 30, 2004 at 11:17 pm #102746
Reigle StewartParticipant@ReigleStewart
PB: Excellent points! Do recognize that the “model”
parameters of many control charts also change after each
new data point (or grouping of data) — much like the
EWMA chart. Also, the fundamental model of many SPRT
tests (Sequential Probability Ratio Tests, i.e., sequential
hypothesis testing) changes. So, a dynamic and
evolutionary model (like EVOP) does not trouble me as
much as it may concern you. What does trouble me is not
knowing how the “constants” should be established in his
work model: y = hx + c, especially for the case where c > 0.0June 30, 2004 at 11:27 pm #102747
Reigle StewartParticipant@ReigleStewart
V. Laxmanan: Thank you for your kind and timely
responses to my questions. Do understand that I deeply
appreciate your innovative thinking. There is currently
enough information and data for me to chew on for some
time. Now, I need to just sit back and think about it.
Again, your contribution and efforts are greatly
appreciated. Respectfully, Reigle Stewart.0June 30, 2004 at 11:51 pm #102748
V. LaxmananMember@V.Laxmanan
Dear PB and Reigle Stewart:
I have discussed the question of fixing h and c in detail and this is the “heart” of the matter.
Once I thought about the idea of a work function, it became clear to me that I have to defend my choice of h.
Anyway, we can discuss this offline, if you wish; just send me an email and I will be happy to share my two-page summary write-up from my book on Six Sigma and Einstein’s work function.
But, once you begin to appreciate the fundamental importance of a work function, you will soon come to the same conclusion that I came to. There are many values of h that one can use. Einstein tells us what to do. We can listen to this wise man, or keep doing what we are doing to find h using linear regression, etc.
But, the exact value of h when we have a scatter of points in (x, y) space is not important. What is important is the differences in the work function between the points. This difference is given by
Delta W = – Delta c = h (Delta x) – Delta y
This is what we must reduce to zero, ideally, to make all the process data collapse onto a single straight line with zero scatter.
Using different values of h in the above equation is like using different temperature scales (Fahrenheit, Celsius, etc.) to find the temperature gradients or the heating and cooling rates when dealing with energy transfer problems. This is also noted in my recent posts.
Thanks again for all your kind remarks today. Do feel free to contact me if you find it is worth your time. Regards.
And cheers all!
Laxman
0June 30, 2004 at 11:54 pm #102749
V. LaxmananMember@V.Laxmanan
Dear Reigle Stewart:
I do wish to thank you as well. I remember being up early last night when we started this. I am glad it is ending so well.
I did send my post to the forum on traffic fatalities. If you think of a fatality as a defect, you can maybe appreciate the analogy some more. Please feel free to contact me offline, if it is worth your time. Regards.
Laxman0July 1, 2004 at 12:13 am #102750
Ken FeldmanParticipant@Darth
counterfeit, gaudy but worthless, sham
How could I be offended by such a definition especially from such an eurudite, intellectual individual as yourself?0July 1, 2004 at 12:25 am #102752
Ken FeldmanParticipant@Darth
By the way Tyrone, I took your advice and searched for your long list of contributions and even went back and looked at all the postings in V’s thread. The only thing I found was your three insults. Wow, another Lurker whose only contributions are insults, or who posts under a variety of names to hide the fact that they would rather watch and insult than post an opinion. This was a valuable and even at times heated thread until you saw fit to start the insulting. You don’t have to agree with anything anyone posts, but why insult? I certainly did not insult or demean V. Keep it up and you might be able to join the exalted ranks of Messrs. SSMBB and Mike (if you are not one and the same). Keep up the heckling and personal insults, but if you are ever tempted to voice an opinion or comment on a topic, please feel free to do it.
0July 1, 2004 at 1:04 am #102755Darth,
tyrone has a good point. In the beginning people thought Dr. Harry was in La La Land. Wait a minute, they still do.0July 1, 2004 at 1:09 am #102756I prefer where h = 4.3 and c = 0.99.
0July 1, 2004 at 1:21 am #102758Reigle,
ET was not in Close Encounters.
My bet is Laxmanan is a blubbering idiot with nothing to contribute. He has been going on for days with nothing to say.
When he has something of value to say, I’ll let you know. Do you prefer for me to use your Chaos Nut or Statman Too or Guess Who email address?0July 1, 2004 at 1:25 am #102759Just to give you an idea of how I will use the “innovative” work function. I am thinking it is best used when drinking heavily, preferably Jack Daniels, when most everyone in the room is passed out or incoherent. That way they will not find you to be some irrelevant nut the next day.
Just a thought.0July 1, 2004 at 1:49 am #102760
GabrielParticipant@Gabriel
Even though Darth will say it is out of the scope of the forum (actually it is), nice information about the change in the shape of the Earth. So it is still “nearly” spherical, but not the same as 30 years ago.
Staying out of the scope, I would like a direct answer to this direct question (asked for the third time): Do you have an estimate of the uncertainty in your determination of the 10 m/s2 that rejects 9.8something as a possible true value?
Also out of the scope of the forum (readers beware: do not go on if you want “on scope” things), you are right that the acceleration of a sphere that goes down a slope will be different from that of a cylinder (unless the slope is friction-free, in which case both will have the same acceleration). As you said, this can be seen in any physics textbook.
But you are still plain wrong in linking that with “the acceleration of gravity depends on the shape of the body”. In the case of the slope, you have two forces in the direction of the movement: a component of the weight (parallel to the slope) and the friction. For a fixed slope angle, the component of the weight is proportional to the mass of the body. As the acceleration is inversely proportional to the mass of the body and directly proportional to the force, if the component of the weight were the only force in the direction of the movement (as on a friction-free slope), then the masses would cancel and the acceleration would be a function of the proportionality constant only, i.e. constant and independent of the size, shape, weight, mass, and anything else (for any body of any shape, even a cube, it will be g * sin alpha, where g is the acceleration of gravity and alpha is the slope angle). Incidentally, as both the weight and the normal force pass through the center of gravity, there is no torque to spin the body, so it will not rotate either. Now, the typical textbook example you mention is the case where the body rolls down the slope without slipping. This type of movement needs the friction to keep the “coordination” between the linear acceleration and the angular acceleration such that the point of contact has zero speed relative to the slope (the definition of “without slipping”). The torque needed to accelerate a body angularly is proportional to its rotational inertia. Put two bodies with the same mass but with the mass distributed differently relative to the rolling radius on the same slope and you will get different accelerations. And you don’t need a sphere and a cylinder: a solid cylinder and a hollow tube of the same mass and diameter will roll down with different accelerations.
But do not blame an imaginary “shape-dependent gravity” for that. Blame the friction. (Incidentally, it is also friction – air friction – that is to blame for the different accelerations of a ball of cotton and a ball of steel in free fall.) And yet any two uniform spheres, whatever their size and mass, roll down the slope with exactly the same acceleration, because what matters is how the mass is distributed, not how much of it there is.0July 1, 2004 at 2:20 am #102761Darth – did you mean erudite? Or were you just making up words trying to be tricky?
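The textbook rolling result Gabriel is quoting can be checked with the standard formula a = g*sin(alpha)/(1 + I/(m*r^2)); the slope angle below is an arbitrary choice for illustration:

```python
import math

g = 9.81                  # standard acceleration of gravity [m/s^2]
alpha = math.radians(30)  # assumed slope angle

def rolling_accel(inertia_ratio):
    """Acceleration of a body rolling down without slipping,
    where inertia_ratio = I / (m * r^2)."""
    return g * math.sin(alpha) / (1 + inertia_ratio)

sphere = rolling_accel(2 / 5)        # uniform sphere: I/(m r^2) = 2/5
cylinder = rolling_accel(1 / 2)      # uniform cylinder: I/(m r^2) = 1/2
frictionless = g * math.sin(alpha)   # sliding body of ANY shape

print(f"sphere:       {sphere:.3f} m/s^2")
print(f"cylinder:     {cylinder:.3f} m/s^2")
print(f"frictionless: {frictionless:.3f} m/s^2")
```

The sphere beats the cylinder only because of the rolling constraint enforced by friction; g itself is the same in every line.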
0July 1, 2004 at 2:28 am #102762
V. LaxmananMember@V.Laxmanan
Dear Gabriel:
I did not mean to ignore the question on uncertainty, etc., given below. Just so many things going on today, I didn’t get to it.
Staying out of the scope, I would like a direct answer to this direct question (asked for the third time): Do you have an estimate of the uncertainty in your determination of the 10 m/s2 that rejects 9.8something as a possible true value?
I would be happy to send the article that I wrote (unpublished), which I shared with the MIT group and the other groups noted, where I made the case that the gravitational acceleration for the BEC is higher than the usual value. If you can send me an email, I would be happy to respond with a return email.
Regarding shape effects on the acceleration of a falling body versus what you are thinking, again I urge you to read Bernard Cohen’s Birth of a New Physics and the alternative views of motion that existed before Galileo came on the scene. What I am talking about is established physics, as taught in textbooks. For that matter, tests were done with different isotopes of the same element, after Einstein’s general relativity, to check the acceleration, since the “heavier” isotope has more neutrons.
I hope I have responded to your questions. We can still have different opinions, but we have to learn to communicate and maybe someday come to an agreement. Regards.
Laxman0July 1, 2004 at 2:37 am #102764
Ken FeldmanParticipant@Darth
Thank you Tyrone, believe it or not I did post a typo. As a result, I have punished myself accordingly and have reread all of V’s posts. I have also rederived Einstein’s Theory of Relativity and hand-computed all the formulas for the speed of falling objects. Have a good night.
0July 1, 2004 at 2:47 am #102766
Ken FeldmanParticipant@Darth
Now, now Stan, it is not nice to poke fun at Schizophrenics…they are people too. And as a result of psychotropics, even ET can be in Close Encounters.
0July 1, 2004 at 3:34 am #102768Had you also addressed the nagging conundrum of the coin toss it would have been a good days work. You were far too hard on yourself. Im not sure which was the most punishing, rereading the posts or rederiving Einstein’s Theory of Relativity. The speed of falling objects question, I thought had been adequately addressed several centuries back, but I am pleased that its being so vigorously revisited.
Have a good evening also.0July 1, 2004 at 3:43 am #102769
Reigle StewartParticipant@ReigleStewart
Hey Stan: It’s good to see that you finally joined us in the
work function thread. I have really missed you … you
know, being on my case and all. My life has not been the
same over the last few days without you. I just wish you
knew what a kick we get from reading your posts. I must
admit that you really work at trying to induce an argument.
Well, old friend, our time in the limelight has come and
gone. If you have not noticed, I am off the shift discussion
since the debate was scheduled … so my job is done …
you committed publicly to several specific positions as
well as the debate (per all of your enlightening posts).
Look, everyone knows you won’t show up … you will have
some reason or other why you can’t debate. Personally, I
would love you forever if you did show up to the debate.
By the way, your cloak of Stan has since been revealed
by an associate of yours. I guess he got tired of your
arrogance. You certainly have an interesting history with
Motorola. By now, you should know that I use multiple
names when posting (as I have previously admitted),
simply because it is kinda fun, but your detection rate is
not much better than today’s typical inspection efficiency.
You have picked up on some of these aliases while
others you have not. On some occasions, you have been
in left field while others you hit a home run. I have
reviewed the policies of this website and find nowhere
that such a practice is improper or violates some rule.
If I have overlooked such a reference, please point it out
and I will only use my given name, promise. But until
then, it’s a real gas watching you light off. By the way, the
new Annual National Six Sigma Korean Quality Award
(under the direction of the Korean Management
Association) was named in Dr. Harry’s honor. Funny
thing, the engraving did not say Stan. Wow, was I
disappointed. I thought by now you would be world
famous (other than in your own shower… I mean mind,
not shower). Just kidding with you Stan. I love ya baby.
Keep the sarcasm coming; it makes you a colorful
character. Reigle Stewart0July 1, 2004 at 3:48 am #102770Reigle,
Just curious, the original debate also included Statman. I haven’t heard from him in a long time. To anyone – is Stan also Statman?? Darth??
Again, just out of curiosity. And this really is Matt – not Stan or Mike or Steve or anyone else. Just ole’ Matt.0July 1, 2004 at 4:06 am #102771
Reigle StewartParticipant@ReigleStewart
Matt: Obviously, the debate has narrowed down to just Stan … the debate format, plans, and time allocations have been made accordingly. Remember, the intent of this debate is to establish whether or not the shift factor has theoretical merit and practical application. This brings it down to a digital position. Owing to this, only two contestants are required. Dr. Harry’s book clearly presents his position. Stan’s paper (on the prerequisite topic) is due on July 15th to Dr. Montgomery. Stan has no wiggle room left … either he meets the debate specifications, or he does not. If he does not, he enters into a default position. Reigle
0July 1, 2004 at 4:24 am #102772
GabrielParticipant@Gabriel
Hi Reigle,
It’s been the best post from you ever. Congratulations.
And I am neither kidding nor being ironic. I really mean it.0July 1, 2004 at 5:29 am #102773
V. LaxmananMember@V.Laxmanan
Dear Tyrone:
….rederiving Einstein’s Theory of Relativity…
We did not do that yet… maybe later, when we talk about things like how money grows and the maximum rate of growth of money – like the speed of light being the maximum speed of a body.
What we did was rederive, or actually appreciate, Einstein’s theory of light quanta, also published in 1905. Although Einstein himself was in some ways more proud of his theory of relativity (no other theory gives us E = m*c*c as elegantly as Einstein’s does), he got the Nobel for his work on light quanta, which I hope will soon find even wider applications once everyone gets a good grip on the work function.
As for coin tosses, it can wait another day, or week, or even a month. Or, we can wait for ET to reappear. Regards.
Laxman0July 1, 2004 at 9:43 am #102779Dear V.L.
I’m following this discussion with interest, though I’m not as expert as all the others. Could you please clarify for me whether what I have noted is true or not (I must analyze it better, but maybe you can help faster)? I have noted many relations between your work function and the “first-degree differential equation” and its solutions. Is that correct?
Apologies for my imperfect English.
Rgs,
Peppe0July 1, 2004 at 10:17 am #102781V., if you track with the thread, I was responding to Darth’s typo induced selfflagellation. I know that you did not rederive Einstein’s Theory of Relativity; you have been too busy challenging Newtonian mechanics. No small feat in itself. Tyrone
0July 1, 2004 at 3:25 pm #102794
V. LaxmananMember@V.Laxmanan
Dear Peppe:
You said,
“I have noted many relations between your work function and the ‘first-degree differential equation’ and its solutions …”
I have to say yes and no to your question. Let me rephrase it. Consider the first-degree differential equation (I assume that is what you had in mind):
dy/dx = h …………………..(1)
Solution, y = hx + c ………………….(2)
But what are x and y in a real-world problem? If x is time and y is distance, equation 1 says that we are dealing with a body that is moving at a fixed velocity (or speed). The solution, equation 2, then gives how distance changes with time. But it also tells us that we know the distance only if we know the value of y (the distance) when x (time) goes to zero. When x = 0, y = c. For solving distance–time problems and studying the motion of a body, it is convenient for us to take c = 0, i.e. distance is measured from where the body is at x (time) = 0.
Now instead of a “body”, imagine a “process” moving in an xy space where x is opportunities and y is defects. Or imagine a “corporation” moving in an xy space where x is revenues and y is profits. The differential equation and its solution do not change. But the interpretation must be examined.
Let us do the profits-and-revenues problem. If profits are increasing at a fixed rate as revenues increase, dy/dx = h. But we know the profits y only if we know something about profits when revenues x = 0. Are profits equal to zero when revenues x = 0? Many corporations are observed to produce no profits even when they have large revenues. This means y = 0 even if x has a finite value. We can write equation 2 as follows:
y = hx + c = hx – W = h[x – (W/h)] …………..(3)
Or are defects increasing at a fixed rate as opportunities increase? The rate h is like the velocity. If this is what we see, to find the number of defects y when the opportunities are x, we still need to know something about defects when opportunities go to zero. When opportunities x = 0, there can be no defects, so y = 0. At least, that is my opinion today. So, we have to conclude c = 0 for the defects–opportunities problem.
At higher values of x, the data may not obey the linear law, but “locally” we may find a linear behavior. Then we think there is a nonzero c, or what I have called a work function.0July 1, 2004 at 4:03 pm #102796
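The interpretation in this post can be sketched with a few lines of arithmetic; the values of h and W below are arbitrary assumptions, not data for any real corporation or process:

```python
# y = h*x + c = h*x - W = h*(x - W/h), with invented constants.
h = 0.25   # assumed marginal rate (profit gained per unit of revenue)
W = 50.0   # assumed work function, so c = -W

def y(x):
    """Profit y at revenue x under the linear law."""
    return h * x - W

cutoff = W / h  # revenue level below which profit is negative
print(cutoff)                   # 200.0
print(y(0), y(cutoff), y(400))  # -50.0 0.0 50.0
```

Below the cutoff x = W/h the line predicts negative y, which is exactly the "large revenues, no profits" behavior the post describes.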
V. LaxmananMember@V.Laxmanan
Dear Peppe:
I pressed the wrong key when I was typing the previous response. So it might seem incomplete. It is.
As I tried to show there, when we solve the simple differential equation dy/dx = h = constant, we get y = hx + c, where the meaning of x and y must be understood for each problem.
For x = revenues and y = profits, it is clear that
y = hx + c = hx – W = h[ x – (W/h)] ……….(3)
We observe many corporations with a lot of revenues but no profits (or negative profits). This means that when revenues x = W/h, the profits y = 0. I call (W/h) the cutoff revenue, below which there is no profit (or negative profit). The smaller the value of W – you can call it the work function (or Motorola function, GE function, whatever you like) – the higher the profits y. The maximum profit, for a given revenue level x, is given by y = hx. From the observations on revenues and profits, we might find a fixed value of h.
I have done this and tested the law for many corporations. For example, Microsoft shows this perfectly linear behavior if you take their historical data. They also show linear behavior if you take their ten-quarter data, and so on. There are small deviations from the law. This means a variation of the work function W (or Microsoft function!).
Now, for the defects–opportunities problem of interest to the Six Sigma methodology, the question becomes what y is when x goes to zero. That is why I posted the highway traffic-fatalities data for the U.S. Here we see a behavior where y = 0 when x = 0. When the miles driven (in a car) are zero, we must have zero deaths (due to car accidents). But maybe this is not true. The person who got killed may not be the driver of the car (which is what is implied) but a pedestrian on the road who got run over by the car.
There is a funny (actually real) story I read when I was studying the historical data. In the early 1900s there were only two cars in the whole state of Ohio. One day these two cars came straight down the road from opposite ends and collided with each other! That was the first accident, in Ohio at least. The newspapers had a picture. I was trying to get data at the smallest values of x (miles driven).
In the same manner, I suggest that Six Sigma practitioners must study how the defects–opportunities curve evolves as opportunities x increase. Then we can understand the process better, and improve and control it (DMAIC). I am doing the Analyze part. But we also have to go back and do the Measure part well, so we can do the Improve and Control parts better.
So, everything I have posted is related to Six Sigma. I was just trying to show that we can learn how to do the Analyze step by seeing what physicists and chemists have done. Theirs is one of the most successful disciplines, where analysis led to good and testable predictions.
Finally, the law K = hf – W was derived by using conservation of energy. We can derive y = hx + c in the same way. For a corporation, Profits = (Revenues – Costs). This is always true. It is like the law of conservation of energy. The work function is telling us something about costs! When costs are reduced, profits increase. It is like saying that when speed increases, distance traveled increases.
For Six Sigma, Good ones = Total parts – Defects.
This is a law. It is always true. It can be converted, as before, into the idea of a work function. When the work function is small, the good ones increase and finally equal the total parts made.
But, as in physics, with good measurement and good understanding of the process, we can also make good predictions. Regards.
Laxman
P.S. Using the law of conservation of energy is a much better and more powerful way to solve problems in physics and chemistry. But often it is difficult to see how to use the conservation-of-energy law. People start using forces, and then it gets difficult. In the same way, setting up differential equations to solve a problem can also lead to some difficulties. Planck’s quantum theory (which led to Einstein’s work function) was developed because solutions to the classical differential equations (Maxwell’s electromagnetic equations) did not agree with good experimental observations.
So Planck thought about this deeply between October 14, 1900 and December 14, 1900 and decided to take a different approach. It is actually a simpler one and is based on the energy conservation law, defining temperature in terms of energy and entropy (when no work is done). Then he introduced the relation for entropy coming from elementary combinatorial analysis, as I discussed in the Bill Ford post which started this whole thread. But when Einstein studied Planck’s paper, he agreed only to a point. He showed that even light (or radiation) is made up of discrete energy units. At this stage he was using a simplified version of Planck’s law (with b = 0, see the traffic-fatality post) to illustrate the essential ideas. Then he introduced the work function. Planck says, let’s take the work done as zero. Einstein says, no, we have to allow for this. Then we see the cutoff.
We see cutoff values of x = W/h in many, many Six Sigma problems! I hope this helps.0
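One common way to estimate h and W from revenue/profit pairs is ordinary least squares, which the thread notes is only one of several ways to fix h, and not the one the author prefers. The figures below are invented for illustration; they are not actual data for Microsoft or any other corporation:

```python
# Hypothetical (revenue, profit) pairs, e.g. in $ billions.
xs = [10.0, 15.0, 20.0, 25.0, 30.0]  # revenues x
ys = [1.8, 4.1, 6.0, 8.2, 9.9]       # profits y

n = len(xs)
mean_x = sum(xs) / n
mean_y = sum(ys) / n

# Ordinary least-squares slope h and intercept c for y = h*x + c.
h = sum((x - mean_x) * (yv - mean_y) for x, yv in zip(xs, ys)) \
    / sum((x - mean_x) ** 2 for x in xs)
c = mean_y - h * mean_x
W = -c  # work function; the cutoff revenue sits at x = W/h

print(f"h = {h:.3f}, W = {W:.2f}, cutoff W/h = {W / h:.2f}")
```

Swapping in a different h (as the thread suggests Einstein's reasoning might) shifts W for every point, but leaves the point-to-point differences in W, and hence the scatter to be reduced, with the same role.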
AuthorPosts
The forum ‘General’ is closed to new topics and replies.