Quality Digest
November 15, 2024

by Kennedy Smith

Most organizations have more data than they know what to do with. However, being full of information doesn't mean an organization's business will run smoothly. Many experts contend that organizations will achieve high-quality processes only when their information is organized and shared throughout the organization.

Larry English, president of Information Impact International Inc., is a speaker, educator, author and consultant in knowledge management and information quality improvement. English has developed the Total Information Quality Management methodology, applying kaizen quality principles to information quality.

English has served as an adjunct associate professor in computer science. He’s a member of the American Society for Quality and a former advisor for the Data Management Association. An active member of various standards committees within the American National Standards Institute, he is an editorial advisor and monthly columnist for DM Review magazine. English is also the author of Improving Data Warehouse and Business Information Quality (John Wiley & Sons, 1999).

QD: What is information quality?

English: It’s consistently meeting end-customers’ and internal knowledge workers’ expectations. It’s the application of sound quality management principles to the information processes, which encompass all of an organization’s processes. It relates to application software as a component of business processes that captures information, updates information, moves information and presents it back to those who require it.

QD: What is the state of information quality today?

English: Information systems is the only area of business where redundancy is not just tolerated but encouraged. Building 25 separate customer files is like hiring one person 25 times. Yet we think nothing of building redundant files, even though it's technically feasible simply to share the data. Businesses must manage information as they manage their other assets. It's simply too expensive not to.
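
To make the redundancy concrete, here is a minimal sketch, not from the interview, of three hypothetical department "customer files" that each hold their own copy of the same customer, and a check that surfaces the disagreements between them. All names, fields and values are invented for illustration.

    # Three department copies of customer C100 that have drifted apart,
    # as redundant copies tend to do (all values are hypothetical).
    billing   = {"C100": {"name": "Acme Corp.", "address": "12 Elm St."}}
    shipping  = {"C100": {"name": "ACME Corp",  "address": "12 Elm Street"}}
    marketing = {"C100": {"name": "Acme Corp.", "address": "14 Oak Ave."}}

    def conflicts(files, key):
        """Report fields whose values disagree across the redundant copies."""
        records = [f[key] for f in files if key in f]
        fields = set().union(*(r.keys() for r in records))
        return {fld: sorted({r.get(fld) for r in records})
                for fld in fields
                if len({r.get(fld) for r in records}) > 1}

    # With one shared customer record, this check would have nothing to report.
    # Here it reports two variants of the name and three of the address.
    print(conflicts([billing, shipping, marketing], "C100"))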

QD: Can you explain Total Information Quality Management?

English: TIQM draws from total quality management to focus on the information processes, from manufacturing to marketing to business transactions, such as sales and claims processing. Companies adopt TIQM to improve their processes, and to prevent the recurrence of the data defects that cause business process failure and create scrap and rework.

QD: How does process management affect information quality?

English: The way we manage businesses today is vertical. It's a vestige of the industrial age and its specialization of labor, in which activities were divided into small sets that one individual could perform repetitively and rapidly.

QD: So more companies should take a horizontal approach to information?

English: Yes. It’s already taking place in leading-edge organizations, and it’s a requirement to be fully effective and efficient in both software development and database design. When organizations develop isolated applications with department-focused databases, they share information by extracting data from one database and propagating it into another. Extracting, transforming and propagating data represents wasted time.
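
As a hedged illustration of that waste (the store, field and function names below are assumptions, not from the interview), compare a department copy maintained by an extract-transform-propagate job with a view that reads the shared data directly:

    # One shared customer store, readable by every department.
    shared_customers = {"C100": {"name": "Acme Corp.", "region": "East"}}

    # Department-focused approach: marketing keeps a private copy, so this
    # interface program must run, and be maintained, for the life of the system.
    def propagate_to_marketing(source):
        # Transform step: marketing's schema calls "region" "territory".
        return {cid: {"name": rec["name"], "territory": rec["region"]}
                for cid, rec in source.items()}

    marketing_copy = propagate_to_marketing(shared_customers)

    # Shared approach: marketing reads the shared record directly. There is
    # no copy to drift out of date and no interface program to maintain.
    def marketing_view(cid):
        rec = shared_customers[cid]
        return {"name": rec["name"], "territory": rec["region"]}

    assert marketing_view("C100") == marketing_copy["C100"]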

QD: What contributes to the breakdown of a value chain?

English: Often there's a narrow, vertical look at each activity, such as order placement or shipping, and we design systems that capture data for only one function. Downstream areas, like marketing, are left out of the data requirements. In these cases, software is developed from the perspective of the immediate department alone, without recognizing and understanding the expectations of downstream knowledge workers in other departments. So one of the challenges is recognizing the full customer set for a given application. The information requirements of downstream knowledge workers need to be factored into the data requirements for an application.

A sound business process anticipates other stakeholders’ information requirements. I call it valuecentric information systems engineering. You still define the functionality based on your immediate set of activities, but the information must be captured to meet downstream beneficiaries’ needs. Often, organizations instead simply add requirements to the data that already exists.
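
A minimal sketch of that idea, with all field names invented for the example: the order-entry function defines the record it needs for its own work, but the record also captures attributes that downstream beneficiaries, here a hypothetical marketing team, have declared as requirements up front.

    from dataclasses import dataclass
    from datetime import date

    @dataclass
    class Order:
        # Fields order entry needs for its own, immediate activities.
        order_id: str
        customer_id: str
        amount: float
        # Fields captured at the source for downstream beneficiaries.
        channel: str      # marketing: which channel produced the sale
        order_date: date  # marketing: cohort and seasonality analysis

    # Order entry still performs only its own function, but the data it
    # captures already meets marketing's stated requirements.
    order = Order("O-1", "C100", 249.00, channel="web",
                  order_date=date(2024, 11, 15))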

QD: What should companies keep in mind when organizing their information?

English: They need to take a customer focus that includes not just the immediate beneficiaries of the software system but also downstream information consumers. The shared database needs to include all department requirements.

I’m an advocate of quality function deployment in software development, which means customers don’t end their obligation when they define their requirements. We need to listen to the customers and involve them in the design of the software.

We need to use prototyping techniques with minimal expense so we can test designs without waiting until the end of development to test the system. We should test a design by giving the application to the people who will use it and verifying that it's efficient and meets their needs. By involving actual businesspeople in development, we can identify defects earlier in the process.
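
One inexpensive way to do that, sketched below under invented requirements, is to write a knowledge worker's stated expectation as an executable check against a throwaway prototype, so a defect surfaces before full development begins rather than at the end.

    def prototype_capture_order(order_id, customer_id, channel=None):
        """Throwaway prototype of the record an order-capture screen produces."""
        return {"order_id": order_id,
                "customer_id": customer_id,
                "channel": channel}

    def test_marketing_expectation():
        # Marketing's (hypothetical) requirement: every order names its channel.
        record = prototype_capture_order("O-1", "C100", channel="web")
        assert record["channel"] is not None, "defect: channel missing"

    test_marketing_expectation()  # run the check against the prototype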

QD: What benefits might a company see by streamlining its databases?

English: Let me give you an example. A small coal company in Canada bought software packages because it was cheaper than developing software. However, they discovered that buying a lot of software that didn’t integrate was expensive. And, because no package met all of their needs, they had to modify the software they bought. With each new software release, they had to reincorporate their requirements.

Furthermore, because the data overlapped, they had to build interface programs to extract data from one database and translate it into another. That created complexity in keeping the data in sync.

They scrapped that philosophy and started developing their own software and databases. They moved to a subject-focused view of data design, so customer and product information lived in shared databases. Developing software became less expensive because they no longer had to keep redeveloping their custom code and maintaining data interfaces.

It’s very much like the assembly line. If a component comes in that doesn’t fit, you have to rework it. That’s the goal of information quality management--to eliminate scrap, rework and the waste of information.

This interview was conducted by Kennedy Smith, Quality Digest’s associate editor. Letters to the editor regarding this article can be sent to letters@qualitydigest.com.