In my September column, I showed how the normal distribution is the distribution of maximum uncertainty. Now I will expand on that theme and answer the questions generated by that column.
Last month I demonstrated that the middle 91 percent of the normal distribution is spread out to the maximum extent possible for any unimodal probability model, and that the outer 9 percent of the normal distribution is as far from the mean as, or farther than, the outer 9 percent of any unimodal probability model. This property of the normal distribution is not a fluke; it is part of a systematic pattern in which the outer tails of the normal distribution dominate the outer tails of other unimodal distributions.
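A quick numerical illustration of the first claim: compared with a uniform distribution (another unimodal model) scaled to the same standard deviation, the central 91 percent of the standard normal occupies a wider interval. The sketch below is my own comparison, not taken from the column, and uses the uniform model only as one convenient unimodal benchmark.

```python
# Compare the half-width of the central 91% interval for a standard normal
# and for a uniform distribution scaled to unit standard deviation.
import math
from scipy.stats import norm, uniform

p = 0.91                     # central coverage of interest
hi = 0.5 + p / 2             # upper quantile, 0.955

# Standard normal: half-width of the central 91% interval, in sigma units
z_normal = norm.ppf(hi)      # about 1.70

# Uniform on [-sqrt(3), sqrt(3)] has standard deviation 1
a = math.sqrt(3.0)
z_uniform = uniform.ppf(hi, loc=-a, scale=2 * a)   # about 1.58

print(f"Normal  central 91% half-width: {z_normal:.3f} sigma")
print(f"Uniform central 91% half-width: {z_uniform:.3f} sigma")
```

The normal's central 91 percent spans roughly ±1.70 sigma, while the unit-variance uniform's spans roughly ±1.58 sigma, consistent with the column's point that the normal is the more spread-out model.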
…
Comments
Poor teaching
"It is a fact that most industrial statistics classes have been and are being taught by people who do not have graduate degrees in statistics. In a field that rests upon the complexity of probability theory, this lack of understanding of the foundations will inevitably result in superstitious nonsense being taught as gospel."
It is not just those lacking statistics degrees who have been responsible for such "superstitious nonsense." A well-known professor of Industrial Engineering and Statistics, who has ironically been awarded the Shewhart Medal, must carry as much blame for propagating the Six Sigma rubbish about normal distributions and defect tables as the psychologist who started the scam.
"A process with PCRk 2.0 is referred to as a sixsigma process ... The reason that such a large process capability is often required is that it is difficult to maintain a process mean at the center of the specifications for long periods of time. A common model that is used to justify the importance of a six-sigma process is illustrated by referring to Fig. 16-15. If the process mean shifts off-center by 1.5 standard deviations, the PCRk decreases to 4.53 1.5. Assuming a normally distributed process, the fallout of the shifted process is 3.4 parts per million. Consequently, the mean of a 6-sigma process can shift 1.5 standard deviations from the center of the specifications and still maintain a fallout of 3.4 parts per million."
Textbook: Applied Statistics and Probability for Engineers. Author: Professor of Industrial Engineering and Statistics
NOTE: If one checks the source of the "long periods of time" in the work of the said psychologist, one finds that it is about one day.
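For readers who want to check the arithmetic behind the quoted 3.4 parts per million: it is simply the normal tail area beyond specification limits set at ±6 sigma when the mean is shifted 1.5 sigma toward one limit. A minimal sketch in Python; the variable names are mine, and the calculation assumes exactly the normal model the textbook assumes.

```python
# Fallout for a "six-sigma" process whose mean has shifted 1.5 sigma,
# assuming a normal distribution and specification limits at +/- 6 sigma.
from scipy.stats import norm

shift = 1.5   # assumed mean shift, in sigma units
spec = 6.0    # distance from target to each specification limit, in sigma units

# Tail area beyond the near limit plus tail area beyond the far limit
fallout = norm.sf(spec - shift) + norm.sf(spec + shift)
print(f"Fallout: {fallout * 1e6:.2f} parts per million")   # about 3.40 ppm
```

Virtually all of the 3.4 ppm comes from the near tail at 4.5 sigma; the far tail at 7.5 sigma contributes a negligible amount.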
Heavy Tails
Hey ADB, I don't know you, but if I read you right, I sense some justified skepticism in your lines, which I share. I also share your classification of Six Sigma as rubbish: Six Sigma has only done good for consultants and for the perpetual trainees on the payrolls of the big multinationals. When the stats gurus start to think in terms of the more realistic parts per thousand or parts per hundred, maybe we'll come to terms with them. Thank you.
Defective thinking
Umberto,
You are correct that consultants have used the Six Sigma scam to steal billions of dollars from the gullible. However, you too have been conned by the nonsense about defects. This stems from the guru of Six Sigma, the naive Mr Bill Smith, and his laughable claim that another way to improve quality is to broaden the specification limits. At times I wonder if there is any hope for quality amid the morass of Six Sigma trash.
Thanks
Thank you for a great article (actually, all your articles provide a warehouse of knowledge, not just information). Your explanations are clear and to the point, such as "Only when this chart shows no evidence of a lack of homogeneity will it make sense to fit a probability model to those data." Far too often, people worry about transforming the data, and this just complicates the problem. And most SPC or Six Sigma software makes it easy to transform data without knowing what you are doing. Much of what I have learned through the years has come from you (starting with the Dr. Deming seminars) and your excellent books. Thanks again.
Bill McNeese
www.spcforexcel.com