The 21st century has already been tagged "The Knowledge Age." This label presents many new problems for those of us in the quality profession. If knowledge is the organization's most valuable resource, how do we ensure the quality of the knowledge base?

Knowledge starts with data: miscellaneous observations that may or may not be valid. Even data drawn from a valid source often isn't applicable across the total population. For example, based upon a discussion with a single customer, you might conclude that the computer you're manufacturing is too slow. When you collect, analyze and group the data, you
find that only 8 percent of your customer complaints indicate that the computer is too slow. Does that mean your organization has a problem that should be addressed? No. If only 1 percent of your total customers complained at all, then only 0.08 percent of your customers (8 percent of that 1 percent) complained about the computer's speed. There is still one additional consideration: The customers who complained may be
using the computer with an application that it wasn't intended to serve. You have attained knowledge when you're able to say that the sample the information is based upon is
large enough to give a specific confidence level that a specific percentage of your customers think the computer is too slow when used correctly.

Too often we accept as fact
data that's simply an individual's impression of an observed condition. (It wasn't long ago that most people accepted as fact the notion that our world was flat.) When one survey predicts with a high degree of certainty that a Republican will win a given election, another survey conducted in the same area at the same time may predict, just as confidently, that a Democrat will win. I've seen survey
results that indicate that 20 percent of the organizations that tried TQM considered it a failure; other surveys suggest 60 percent of these organizations considered it a failure. Figures don't
lie, but you would be surprised what good statisticians can do with a set of figures to prove the points they want to make.

If you need further convincing, conduct a
customer satisfaction survey of 100 of your customers. Provide that survey data to two different statisticians. Ask one of them to analyze it to determine customer satisfaction levels, and ask
the other statistician to determine customer dissatisfaction levels. Don't let them confer with each other; then compare their results.

Another problem with information is that
it is often taken out of context. For example: after IBM started its benchmarking process, its stock dividends dropped by 75 percent to an all-time low. This statement is, in fact, correct and
could lead you to believe that there's a negative correlation between IBM's benchmarking activities and its shareholders' dividends, but this isn't true. The cause of the drastic drop wasn't
benchmarking; it was IBM management's approach to doing business. All too often, unrelated data is put together to draw a faulty conclusion that's then treated as knowledge.

The transformation of data into information, and of information into knowledge, is a difficult and error-prone process. As quality professionals, we need to get involved in these processes, using techniques such as regression analysis and causal models to establish the validity of the data and the confidence level of the information that our knowledge workers use and that becomes part of our organization's knowledge base.

Just how valuable is an organization's knowledge base, and what is the cost of managing it? In the average high-tech
company in the United States, it costs $5,000 per knowledge worker to manipulate data and provide him or her with information. In his article "Knowledge Matrix" (Knowledge Management magazine, Jan. 2000), Paul A. Strassmann points out that knowledge value per employee varies widely, even among companies in the same industry:

Warner-Lambert: $261,847 per worker
Johnson & Johnson: $582,568 per worker
Abbott Labs: $702,468 per worker
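To see how the earlier computer-speed example moves from data to knowledge, here is a minimal sketch of the arithmetic plus an attached confidence level. It is illustrative only: the sample size, the 95-percent z-score and the normal-approximation interval are my assumptions, not figures from this article.

```python
import math

# Hypothetical survey (illustrative numbers, not from the article):
# n customers sampled, k of them said the computer is too slow.
n = 400
k = 32
p_hat = k / n  # observed proportion: 0.08 (8 percent)

# The article's arithmetic: if only 1 percent of all customers complained
# at all, and 8 percent of those complaints cite speed, then the share of
# all customers complaining about speed is:
share_of_all = 0.01 * 0.08  # 0.0008, i.e. 0.08 percent

# Data becomes knowledge when a confidence level is attached. Using the
# normal approximation for a proportion at 95 percent confidence:
z = 1.96
margin = z * math.sqrt(p_hat * (1 - p_hat) / n)
low, high = p_hat - margin, p_hat + margin

print(f"Share of all customers complaining about speed: {share_of_all:.2%}")
print(f"Sample estimate: {p_hat:.1%}, 95% CI: {low:.1%} to {high:.1%}")
```

With a larger sample the interval narrows, which is the sense in which sample size buys confidence in the percentage you quote.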
It's obvious that many of our best organizations have invested heavily in establishing their knowledge bases and will rely extensively on them to direct present and future operations. The quality of these knowledge bases must be extremely high, and it's up to quality professionals to ensure that they're factual and quantified so they're not used incorrectly. This presents a new challenge and opportunity to all of us in the quality profession.

About the author

H. James Harrington is COO of Systemcorp, an Internet-software development company. He was formerly a
principal at Ernst & Young, where he served as an international quality adviser. He has more than 45 years' experience as a quality professional and is the author of 12 books.
Harrington is a past president and chairman of the board of both the American Society for Quality and the International Academy for Quality. E-mail him at jharrington@qualitydigest.com. Visit his Web site at www.hjharrington.com.