Quality Digest

Letters

Is It Over Yet?

Regarding the article “When Will It Be Over?” by Denise Robitaille (Inside Standards, http://qualitydigest.com/IQedit/QDarticle_text.lasso?articleid=12514): I dispute the idea that there can never be a perfect audit with no findings. If your process for continual improvement is good, you should anticipate any potential shortcomings and correct them before the audit.

--Chris Ziegler

 

This article clearly brings the true concept of auditing into focus. I’ve been dissatisfied with third-party auditors who “go lightly” through the steps just to fulfill the basic requirements. I want more than just a “number” for my customers. So many times I pause in my work to reflect on how nice it will be when I finish all of the new “hot” projects and can just take care of “regular work,” only to laugh at myself as that little voice in my head reminds me that, as a manager, it is my job to manage change, not to watch life pass me by.

I’m ready for the next project!

--Gary F. Keck

 

Making More of ISO 9001

The article “Making The Most of ISO 9001” (Pam Parry, December 2007) was an interesting read and full of good concepts. However, I disagree somewhat with this statement: “Certification to the international standard not only keeps employees working toward the quality mission, but also demonstrates to customers that ENLASO meets global best practices when it comes to quality management.”

I’ve always believed that certification demonstrates only compliance to a minimum level, not adherence to best practices. I need only think of the many ISO 9001-certified suppliers I’ve visited to have this confirmed.

--Richard Allan

 

The Limits of Control Charts

Doug Fair’s article (“3 ‘Nevers’ of Control Limits, Part 1,” Quality Insider, http://qualitydigest.com/IQedit/QDarticle_text.lasso?articleid=12489) can mislead people and actually cause problems to be missed. The control chart development process consists of two phases. In phase one, the limits are computed from the process data, as Mr. Fair suggests. Assuming phase one establishes that the process is stationary (i.e., in control, or stable), we move to phase two. In phase two, the control limits computed in phase one are “typed in” to the program. That is, they are fixed and should remain fixed until the process has been determined to have changed, based on the application of a set of statistically based run-and-pattern test rules.

Some incorrect SPC programs continue to update the control limits as more data are entered. If the process is slowly trending, these programs may not detect that the process has changed. This is a serious problem because process drift is frequently observed in real-world data.

So Mr. Fair is correct that phase one control limits should not be typed in; in phase two, however, they must be.

--Dr. John Flaig
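
As an illustration of the two-phase approach Dr. Flaig describes, here is a minimal Python sketch assuming an X-bar/R chart with subgroups of size five; the simulated data, the function names, and the A2 = 0.577 chart constant for n = 5 are illustrative choices, not taken from either article.

```python
import numpy as np

# X-bar chart constant A2 for subgroups of size 5 (standard SPC table value).
A2 = 0.577

def phase_one_limits(subgroups):
    """Phase one: estimate the center line and 3-sigma limits from baseline subgroups."""
    grand_avg = subgroups.mean(axis=1).mean()                        # X-double-bar
    r_bar = (subgroups.max(axis=1) - subgroups.min(axis=1)).mean()   # average range
    return grand_avg - A2 * r_bar, grand_avg, grand_avg + A2 * r_bar

def outside_limits(subgroup, lcl, ucl):
    """Phase two: judge a new subgroup against the frozen ("typed-in") limits."""
    x_bar = float(np.mean(subgroup))
    return x_bar < lcl or x_bar > ucl

# 25 baseline subgroups of size 5 from a stable process (simulated).
rng = np.random.default_rng(1)
baseline = rng.normal(10.0, 0.2, size=(25, 5))
lcl, cl, ucl = phase_one_limits(baseline)
print(f"LCL={lcl:.3f}  CL={cl:.3f}  UCL={ucl:.3f}")

# The limits above stay fixed until run/pattern rules show the process has changed.
print(outside_limits(rng.normal(10.5, 0.2, size=5), lcl, ucl))  # a shifted subgroup
```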

 

I believe that “type-in” control limits are sometimes necessary. Control limits based on historical data can reveal the line’s normal running condition and help ensure that products coming off the line are within specification. Real-time control limits can only ensure that the line is running in statistical control; statistical control alone cannot ensure that the products are within specification. Customers care only about product quality. Therefore, for a manufacturer, typed-in control limits may be more important than real-time control limits. Of course, typed-in control limits should be updated periodically to keep up with changes in the line.

--Yixin Shi

 

Once you have gathered sufficient (statistically significant?) data and established that a process is stable, the control limits should be fixed based on those statistics. Allowing limits to float may mask increasing variation or subtle trends. Fixed limits also spare you from having to explain the lack of response to a point from three weeks back that was inside the limits at the time. If changes to the process necessitate recalculating the limits, your software should allow you to identify discrete time periods so that the integrity of the limits in the earlier periods is retained. Floating limits are anathema to the teachings of Shewhart, Deming, and Wheeler.

--Gilbert Knapp
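
The masking effect that Dr. Flaig and Mr. Knapp describe is easy to demonstrate. The Python sketch below, a rough illustration with simulated data, compares a frozen upper limit computed from a stable baseline against a “floating” limit recomputed from all data to date as a slow upward drift sets in. For simplicity it uses three times the overall sample standard deviation rather than proper moving-range-based individuals limits; the point is that the floating limit rises and widens as it absorbs the drift, so it signals much later, if at all.

```python
import numpy as np

rng = np.random.default_rng(7)

# Simulated individuals data: a stable baseline followed by a slow upward drift.
baseline = rng.normal(100.0, 1.0, 60)
drifting = rng.normal(100.0, 1.0, 60) + np.linspace(0.0, 4.0, 60)
data = np.concatenate([baseline, drifting])

# Frozen limit: computed once from the baseline period and then left alone.
frozen_ucl = baseline.mean() + 3 * baseline.std(ddof=1)

frozen_signals = floating_signals = 0
for i in range(len(baseline), len(data)):
    history = data[: i + 1]
    # Floating limit: recomputed from all data seen so far at every new point,
    # so the drift itself inflates the limit that is supposed to detect it.
    floating_ucl = history.mean() + 3 * history.std(ddof=1)
    frozen_signals += data[i] > frozen_ucl
    floating_signals += data[i] > floating_ucl

print("points above the frozen UCL:  ", int(frozen_signals))
print("points above the floating UCL:", int(floating_signals))
```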

 

Doug Fair responds: I believe some readers misunderstood and thought that my screen clip actually displayed floating limits. Not so. Those limits were not floating limits; the mean and standard deviation were not significantly different from one subgroup to the next. Instead, the limits differed solely because of a difference in subgroup size (n). The example was meant to emphasize that, like a P-chart with varying subgroup sizes, variables-data control limits can change for a fixed mean and standard deviation purely because of the mathematics involved (and a challenging manufacturing situation that necessitates different subgroup sizes). However, if fixed, typed-in control limits are used and subgroup sizes change, those fixed limits cannot update to display the correct control limits for the new n.

In a situation where subgroup sizes are identical from one subgroup to the next, yes, I believe that control limits should be fixed. And yes, they should remain unchanged until process data indicate otherwise. But, as the mathematics above shows, control limits must be based on the mean, the standard deviation, and the subgroup size.

In summary, allowing control limits to be physically typed in provides a flexibility with which no one should be comfortable. I have seen many instances where an individual simply typed in what they wanted from the process, resulting in just another set of specification limits (usually closer together than the “real” specification limits).
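
The dependence on subgroup size that Mr. Fair describes follows directly from the limits for a subgroup average, mean ± 3σ/√n: hold the mean and standard deviation fixed, and the limits still move as n changes. A short Python sketch with made-up values:

```python
import math

# Illustrative fixed process parameters (made-up values, not from the article).
mean, sigma = 50.0, 2.0

# With the mean and standard deviation held constant, the 3-sigma limits for a
# subgroup average still depend on the subgroup size n: mean +/- 3*sigma/sqrt(n).
for n in (2, 5, 10):
    half_width = 3 * sigma / math.sqrt(n)
    print(f"n={n:2d}  LCL={mean - half_width:.2f}  UCL={mean + half_width:.2f}")
```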