Quality Digest
by Joe Lindsay

If your company is like most, you probably have separate software solutions to manage quality and compliance issues for each of your company's various operational functions: calibration management, document management, lab management and so forth. Generally, each of these is a point solution designed to address issues unique to its functional environment, with little or no capability to share critical data with other environments. Lacking integration, these information silos become a hindrance to overseeing quality and compliance issues for the entire enterprise.

This article discusses how software architecture can be leveraged to increase the effectiveness of your quality system by eliminating those silos. The result is a new breed of business process management software focused on compliance issues: compliance process management.

A case for change
Today's public and media have little tolerance for corporations that deal in mystery or surprises. The Sarbanes-Oxley Act is one of the most vivid manifestations of this fact. In the wake of the back-to-back-to-back shocks of Enron, WorldCom and Tyco, the federal government stepped in with regulations designed to provide previously unseen levels of corporate transparency. In the life sciences industry, the ramifications of Vioxx and Chiron have set off a wave of change that is still transforming and challenging both the FDA and the industry.

To meet the challenge, the pharmaceutical industry is undergoing significant change. According to an article in the Sept. 3, 2003, issue of The Wall Street Journal, "The pharmaceutical industry has a little secret: Even as it invents futuristic new drugs, its manufacturing techniques lag far behind those of potato-chip and laundry-soap makers." In other words, we have outstanding scientists developing incredible drugs that save millions of lives, but once those drugs are approved, they leave the confines of the laboratory and the scientific process and come under the management of sales and marketing. Figure 1 shows how the average pharmaceutical company compares to a first-class pharmaceutical or manufacturing company.

Today's companies must manage critical processes, products, technologies and analytics to support supply-chain investment strategy and risk-based decision making. Sustaining this operation requires integrated systems that allow for real-time risk management. Although many pharmaceutical and life sciences companies share this goal, the high cost and poor sustainability of such systems present significant obstacles.

But there is hope. Information technology has evolved; a new emphasis on standards and interoperability offers compliance process management solutions that promise to remove some of the obstacles that prevent our current systems from delivering the value that current requirements demand. In the past, software solutions forced proprietary paths for both technology and data storage, resulting in systems that didn't easily integrate with other vendors' products and that prevented the sharing and dissemination of information. This largely stemmed from each department within the organization demanding an IT solution for its little part of the world, with no one taking a holistic view of what a truly integrated IT solution would mean for the organization.

Quality management professionals should demand IT implementations and strategies that increase the reliability of compliance process execution as well as visibility into compliance performance. This puts the burden on the quality management team to generate requirement specifications that support a holistic approach to the entire quality management process. Requirements should call for compliance process management systems as the basis for quality process execution. Such a system should be required to capture and manage data that provide an unfettered view of the entire life cycle of the quality process. One deliverable of this new approach will be the data required for regulatory compliance.

Where to start
Your IT provider, whether internal or external, can only provide solutions based on your input. If your system requirement specifications don't identify these foundational requirements, the resultant systems may actually become a burden to, rather than an enabler of, quality. This requires that everyone involved understand the big picture before engaging in defining any solution.

Too often the demand communicated to IT is for a point solution--i.e., something unique to that department or operation. Requirement specifications are then delivered to IT that specify a solution rather than defining the actual problem. When the quality team specifies the solution, a solution is what IT will strive to deliver. By specifying a solution without clearly articulating the actual problems and needs, the quality team isolates IT from the true business requirements and from the ability to address them.

One unintended consequence of specifying point solutions is that it masks the need for integrated solutions. Quality management and regulatory compliance are collections of interconnected processes and workflows that require the supporting IT systems to be integrated. This requirement must appear in every IT quality system requirements specification. It calls for a strategy that spans all systems--an architectural strategy that encompasses all processes within the organization.

Focus on process, not product
Too often organizations focus on technology and products rather than on processes and outcomes. The goals are to execute in a manner that drives quality and compliance, and to reliably execute quality processes. The quality system typically has a broader scope than companion IT systems or any single-point solution. Lab data are managed by a LIMS (laboratory information management system), manufacturing systems manage batch records, and various other systems manage complaints, deviations, calibration, and environmental, skills and change management--but the quality process spans all of these.
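To make this concrete, here is a minimal, hypothetical sketch (all table and column names are invented) of the kind of question a quality process must answer across systems--one that no single point solution holds all the data for. It assumes lab results, batch records and deviations land in one shared, standard SQL database:

import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE batch_record (batch_id TEXT PRIMARY KEY, product TEXT, released INTEGER);
    CREATE TABLE lab_result   (batch_id TEXT, test TEXT, result REAL, spec_max REAL);
    CREATE TABLE deviation    (batch_id TEXT, description TEXT, status TEXT);
    INSERT INTO batch_record VALUES ('B-001', 'Drug A', 1), ('B-002', 'Drug A', 1);
    INSERT INTO lab_result   VALUES ('B-001', 'assay', 101.2, 100.0);
    INSERT INTO deviation    VALUES ('B-002', 'temperature excursion', 'open');
""")

# Which released batches have an out-of-spec result or an open deviation?
# Answering this requires lab, manufacturing and quality data together.
rows = con.execute("""
    SELECT DISTINCT b.batch_id, b.product
    FROM batch_record b
    LEFT JOIN lab_result l ON l.batch_id = b.batch_id AND l.result > l.spec_max
    LEFT JOIN deviation  d ON d.batch_id = b.batch_id AND d.status = 'open'
    WHERE b.released = 1 AND (l.batch_id IS NOT NULL OR d.batch_id IS NOT NULL)
""").fetchall()
print(rows)  # both batches are flagged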

Typically the cost and complexity of integrating such a set of complex systems are prohibitive. The most significant reason is that integration requirements are defined and specified after the individual systems have been implemented, generating costly rework and modifications to meet the new requirements. The cost of reversing this lack of foresight is orders of magnitude greater than the cost of addressing integration up front.

Counterexamples: what not to do
Once in this situation, there are two common approaches to alleviating the problem, neither of which is optimal. One is to take one of these systems (say, your manufacturing management system) and extend its scope ad infinitum to cover the entire quality life cycle, well beyond the system's base as a manufacturing or document system. This approach carries a built-in limitation derived from the base system. Because the system is now stretched beyond the vendor's original intent, the added task of managing two divergent purposes becomes a growing challenge. Not to be overlooked, the increasing complexity and modification costs of the resultant system can hinder future changes, quality improvements and innovation.

Another problematic approach is to integrate all of these systems for the sake of the quality or compliance process. This entails even greater complexity, along with prohibitive cost and resource requirements. Integrating multiple applications and systems is widely recognized as the most costly and complicated type of IT project.

Both approaches are variations on giving priority to a technology or product over the process itself.

The correct approach is to drive the IT team to design and build an architectural strategy based upon a holistic view of the entire quality process, and then to implement technology that enables it.

An idealized architecture
So how should an IT department put this strategy to work for quality systems? First, the quality management team and the IT department should never forget that the process is all about information, and that the data are to be protected and highly leveraged. The value of data increases with use.

Always avoid locking quality information into proprietary data stores or repositories. Company data should reside in standard database systems so that information can be easily accessed and leveraged. Create a clear separation between information storage and the processes that generate or act on that information. What does that really mean? Deploy systems based on standards-based information architectures. Store relational data, for instance, in standard ANSI SQL-compliant relational databases that are widely supported by reporting and analytics software vendors, such as those from IBM, Microsoft, Oracle or major open-source alternatives. This ensures not only that you have open access to your data, but also that you have many choices for reporting and analytics. Remember, the point is to increase the visibility of your quality process execution.
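As a minimal sketch of that separation--assuming a shared ANSI SQL database, and using Python's standard sqlite3 module as a stand-in for any DB-API driver; the table and function names are hypothetical:

import sqlite3

def record_calibration(con, instrument_id, result, actor):
    # Process layer: records an event through plain SQL. Nothing here is
    # vendor-specific, so the stored data stay open to any reporting tool.
    con.execute(
        "INSERT INTO calibration_event (instrument_id, result, actor) VALUES (?, ?, ?)",
        (instrument_id, result, actor),
    )
    con.commit()

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE calibration_event (instrument_id TEXT, result TEXT, actor TEXT)")
record_calibration(con, "SCALE-07", "pass", "j.smith")
# Any standard SQL client or BI tool could now run the same report:
print(con.execute("SELECT * FROM calibration_event").fetchall())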

Documents and nonstructured content should also be stored in a repository that is both standards-based and vendor-independent, and in formats that are widely used and that support open or de facto standards. In this space the options are far fewer, the most significant being the JSR 170 content repository standard.
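JSR 170 defines a Java API, but the same vendor independence can be had over open protocols such as WebDAV. Here is a hedged sketch in Python (using the third-party requests library; the URL, credentials and file name are placeholders, not a real system):

import requests

DAV_BASE = "https://repository.example.com/dav/quality"

def store_document(name, content):
    # WebDAV creates or updates a resource with a plain HTTP PUT, so a
    # different vendor's repository could serve the same call later.
    response = requests.put(DAV_BASE + "/" + name, data=content,
                            auth=("qa_user", "change-me"))
    response.raise_for_status()

with open("SOP-0042-rev3.pdf", "rb") as f:
    store_document("SOP-0042-rev3.pdf", f.read())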

These standardized approaches to storing the data required and generated by quality processes will support the widest set of options for accessing and using this important information later.

For quality systems, leveraging open standards and ensuring a clean separation of concerns results in these architecture components (one of which is sketched in code after the list):

An enterprise data model
Shared or common relational databases
Shared or common document repository
A set of synergistic process and data-capture applications
A common or shared workflow and change management system
Analytics and reporting systems
A common interface and experience for system users
Shared external gateway (e.g., electronic submissions)
Common process and facilities for audit and validation
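As one concrete illustration from this list, consider the common audit facility: every application writes to the same audit trail in the shared database rather than keeping its own. A minimal sketch, with an invented schema:

import sqlite3
from datetime import datetime, timezone

con = sqlite3.connect(":memory:")
con.execute("""
    CREATE TABLE audit_trail (
        occurred_at TEXT NOT NULL,
        application TEXT NOT NULL,  -- LIMS, document management, training, ...
        record_id   TEXT NOT NULL,
        action      TEXT NOT NULL,
        actor       TEXT NOT NULL
    )
""")

def audit(application, record_id, action, actor):
    # Shared facility: one schema, one place to look during an audit.
    con.execute("INSERT INTO audit_trail VALUES (?, ?, ?, ?, ?)",
                (datetime.now(timezone.utc).isoformat(), application,
                 record_id, action, actor))

audit("LIMS", "LR-1001", "approved", "j.smith")
audit("DMS", "SOP-0042", "revised", "a.jones")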


This is an idealized architecture; simply specifying it has little tactical value. However, it will guide tactical decisions and either prevent the perpetuation of information and procedural silos within your quality systems or at least mitigate their effect. It defines the boundaries for, not the details of, actual implementations and solutions.

Having an architectural strategy will allow a quality management team to address tactical problems in ways that deliver the vision and further the strategy. For instance, rather than trying to bolt a quality systems process for skills management onto your existing learning system, follow an approach that approximates this sample:

Document what the idealized best-practice process would be in the context of your business. Keep it straightforward.
Determine, document and locate the business information needed or generated by that process.
Acquire or build a solution focused on executing that specific function. Store generated or required data in a shared SQL or other standard database, replicating and loading any data extracted from foreign systems. Ensure that the data are accessible to multiple applications in an unobstructed manner. Unstructured information and documents should be stored in a standards-based repository that can be accessed by multiple applications.
Generate reports against these data to support the required business functions (steps 3 and 4 are sketched in code after this list).
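A minimal sketch of steps 3 and 4 for the skills-management example, assuming a shared, standard SQL database (all names are invented); the report is an ordinary query that any reporting tool could run:

import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE required_skill (role TEXT, skill TEXT);
    CREATE TABLE employee_skill (employee TEXT, role TEXT, skill TEXT);
    INSERT INTO required_skill VALUES ('analyst', 'HPLC'), ('analyst', 'GMP basics');
    INSERT INTO employee_skill VALUES ('j.smith', 'analyst', 'HPLC');
""")

# Step 4: report the training gaps--required skills an employee lacks.
gaps = con.execute("""
    SELECT e.employee, r.skill
    FROM (SELECT DISTINCT employee, role FROM employee_skill) e
    JOIN required_skill r ON r.role = e.role
    LEFT JOIN employee_skill s ON s.employee = e.employee AND s.skill = r.skill
    WHERE s.skill IS NULL
""").fetchall()
print(gaps)  # -> [('j.smith', 'GMP basics')]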


In summary, using an architectural strategy as the context for any and all IT systems engineered to support quality management will result in systems that truly enhance operational quality and quality process execution for the organization.

About the author
Joe Lindsay is the president and chief technology officer for Quality Systems Laboratories (QS Labs), a global compliance management and solutions provider for the life sciences and health care industries. With nearly 20 years of IT experience in a variety of computing environments, including mainframe, client-server and Internet technologies, Lindsay is a leader in both the development and operation of business-critical systems.