Simulation Modeling--Part I
During the 1980s and 1990s, flowcharting was all the rage, particularly with supporters of ISO 9001. To keep ISO 9001 auditors happy, I’ve flowcharted concepts that would have been more easily understood in paragraph form. I’ve noticed that many people who follow quality operating procedures don’t use flowcharts at all.
Why, then, am I about to champion a much more complicated form for visually defining a process? Because I believe simulation modeling can reduce the need for lengthy process narrative and effectively perform what-if analyses.
Process simulation is a technique that helps organizations predict, compare or optimize the performance of a process without the cost and risk of implementing new processes or disrupting existing operations. It generates representations of processes, resources, products and services in a dynamic computer model that mimics business operations. This is accomplished by stepping through the events in compressed time while displaying an animated picture of the flow. Because simulation software tracks statistics about model elements, performance metrics can be evaluated with model output data.
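The mechanics behind this compressed-time stepping can be sketched in a few lines of code. The fragment below is a minimal illustration only, not commercial simulation software: it steps a stream of items through a single-server process step and collects cycle-time statistics. The arrival and service rates are assumptions invented for the example.

    import random
    import statistics

    random.seed(1)

    def simulate(n_items, mean_arrival=5.0, mean_service=4.0):
        """Step n_items through one single-server process step."""
        clock = 0.0            # simulated (compressed) time
        server_free_at = 0.0   # when the resource next becomes available
        cycle_times = []
        for _ in range(n_items):
            clock += random.expovariate(1.0 / mean_arrival)  # next arrival
            start = max(clock, server_free_at)               # wait if the resource is busy
            server_free_at = start + random.expovariate(1.0 / mean_service)
            cycle_times.append(server_free_at - clock)       # wait plus service time
        return cycle_times

    times = simulate(10_000)
    print(f"mean cycle time: {statistics.mean(times):.1f}")
    print(f"worst 1 percent: {sorted(times)[int(0.99 * len(times))]:.1f}")

Even this toy model makes the key point: the mean cycle time can look acceptable while the slowest 1 percent of items take several times as long.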
In his bestseller, The Fifth Discipline, Peter Senge defines situations “where cause and effect are subtle, and where the effects over time of interventions are not obvious” as dynamic complexity. He adds that conventional forecasting, planning and analysis tools aren’t equipped to deal with dynamic complexity.
Business processes such as supply chains, customer service and new product development are much too complex and dynamic to be understood and analyzed by flowcharting and spreadsheet techniques. Over time, the interactions of resources with processes, products and services result in many scenarios and outcomes that are impossible to evaluate without the help of a computer simulation model. Although flowcharts and spreadsheets are adequate for answering “what” questions, they’re inadequate for answering “how,” “when” and “what-if.”
Business process improvement approaches such as process reengineering, process redesign and benchmarking often fail when present- and future-state solutions aren’t proven in simulation models. Often it’s impossible to understand the as-is process through simple flowcharting techniques because of the number of decision points and the variation in cost and time required to process individual items through even a single point on the flowchart.
During the 1980s, average cost, quality and cycle-time figures were sufficient for activity-based costing, process redesign and process reengineering projects. This is no longer true. Major errors are made when processes are changed based on averages. Organizations don’t usually lose customers due to the average performance of their processes. Instead, they lose them because customers are exposed to the negative end of the process output variation. This means that for each critical measurement of each activity, three points should be defined: the average, the +2.5 s point and the -2.5 s point. These are then used to evaluate both the average performance of the total process and its performance variation over time.
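Defining those three points for a critical measurement is simple once the observations are collected. Here’s a brief Python sketch, with invented cycle-time data standing in for real measurements:

    import statistics

    # Invented cycle-time observations (minutes) for one activity
    cycle_times = [12.1, 14.8, 11.2, 19.5, 13.0, 25.7, 12.4, 15.9, 11.8, 22.3]

    mean = statistics.mean(cycle_times)
    s = statistics.stdev(cycle_times)  # sample standard deviation

    print(f"average:      {mean:.1f} min")
    print(f"+2.5 s point: {mean + 2.5 * s:.1f} min")  # what the slowest items experience
    print(f"-2.5 s point: {mean - 2.5 * s:.1f} min")

It’s the +2.5 s figure, not the average, that tells you what your unhappiest customers are living with.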
At a minimum, when a proposed process is approved, its ±3 s limit points should be acceptable to 99.99 percent of your customers. World-class organizations are developing processes that fail to meet customer expectations just 0.0003 percent of the time, or three errors per million transactions. Being right 99.9 percent of the time is inadequate, as the quick calculation following the list shows. For example, if the following processes were correct only that often, then:
22,000 checks would be deducted from the wrong bank accounts every hour.
1,314 phone calls would be misplaced by telecommunications services every minute.
12 babies would be given to the wrong parents each day.
103,260 income tax returns would be processed incorrectly each year.
18,322 pieces of mail would be mishandled per hour.
880,000 credit cards in circulation would have incorrect cardholder information on their magnetic strips.
(Source: July/August 1998 issue of Quality Times magazine)
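The arithmetic behind such figures is straightforward: multiply a transaction volume by the 0.1 percent error rate that 99.9 percent accuracy implies. The volume below is an illustrative assumption worked backward from the first item in the list, not a figure from the cited source:

    # At 99.9 percent accuracy, 1 transaction in 1,000 goes wrong.
    error_rate = 1 - 0.999

    # Assumed hourly check volume (illustrative, implied by the 22,000 figure)
    checks_per_hour = 22_000_000

    print(f"checks hitting the wrong account each hour: "
          f"{checks_per_hour * error_rate:,.0f}")  # -> 22,000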
Remember that by the time a process is changed, let alone upgraded, your customer’s expectations will be much higher. Keeping this in mind, it’s easy to understand why flowcharting and spreadsheet approaches to improving a process are wholly inadequate.
Today, 5 percent of management decisions are considered bad, 85 percent are considered good and only 10 percent are considered “the best.” Process simulation is essential for businesses to make the best decisions. The ability to visualize how a process would behave, measure its performance and try “what-ifs” in a computer model makes process simulation a necessary tool for decision making and, in effect, a crystal ball for dealing with dynamic complexity.
H. James Harrington is CEO of the Harrington Institute Inc. and chairman of the board of Harrington Group. He has more than 45 years of experience as a quality professional and is the author of 22 books. Visit his Web site at www.harrington-institute.com.