Columnist: H. James Harrington


Simulation Modeling--Part 2

Computer-aided process analysis reveals much more than bottlenecks.

Computer simulation was first attempted by the defense industry during the 1950s. Early simulation models were built with programming languages such as FORTRAN. With the invention of general-purpose simulation languages such as SIMSCRIPT and GPSS in the early 1960s, simulation spread to other industries. The advent of personal computers and new simulation languages such as SLAM and SIMAN in the early 1980s made the technology available to staff personnel, engineers and managers.

In the late 1980s and early 1990s, the enhanced graphics capabilities of personal computers spurred software developers to create model development tools that included animation. At the same time, the Microsoft Windows operating environment, developed for the growing number of PC users, enabled simulation tools such as Witness, ProModel and ithink, which provided menu-driven user interfaces, visual interactive modeling and impressive animation capabilities. From this point on, simulation became a mainstream tool.

The early 1990s brought another exciting development to simulation--object-oriented modeling and analysis. Simulation languages such as ModSim and Simple++ took advantage of object technology and reusable object libraries. This development facilitated situation-specific simulation solutions and made simulation available to a greater number of end users.

Today, there are seven levels of simulation modeling, each of which helps organizations move forward in the art of process improvement. The seven levels are:

Block-flow diagrams

Flowcharts

Process-performance analysis

Process-knowledge dictionary

Process-variation analysis

Process-flow animation

Workflow monitoring

 

Let’s review each of these levels and the ways in which one leads to the next.

Block-flow diagramming is the simplest and most prevalent approach, providing a quick, uncomplicated view of a process. The block diagram serves as a starting point for other process-modeling approaches.

Flowcharts are among the oldest process visual aids. They graphically present activities that make up a process in much the same way that maps represent a particular area. Both flowcharts and maps illustrate how different elements fit together.

Process-performance analysis was developed to collect performance data related to each activity in a process and use these data to calculate the performance of the total process. Typical information collected for each activity includes (see the sketch following this list):

Cycle time

Processing time

Cost

Wait time

Yield

Quantity processed per time period
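
To make the arithmetic concrete, here is a minimal Python sketch of how per-activity figures might be rolled up into total-process performance. The activity names and numbers are hypothetical, not taken from the column or from any real study; cycle times, waits and costs add along the path, while activity yields multiply.

    # Hypothetical per-activity data: times in hours, cost in dollars.
    activities = [
        {"name": "Receive order", "cycle_time": 2.0, "processing_time": 0.5, "cost": 4.00, "yield": 0.99},
        {"name": "Enter order",   "cycle_time": 1.5, "processing_time": 1.0, "cost": 6.50, "yield": 0.97},
        {"name": "Ship order",    "cycle_time": 8.0, "processing_time": 1.5, "cost": 12.00, "yield": 0.995},
    ]

    total_cycle_time = sum(a["cycle_time"] for a in activities)
    total_processing_time = sum(a["processing_time"] for a in activities)
    total_cost = sum(a["cost"] for a in activities)
    total_wait_time = total_cycle_time - total_processing_time

    # Rolled-throughput yield is the product of the individual activity yields.
    rolled_yield = 1.0
    for a in activities:
        rolled_yield *= a["yield"]

    print(f"Cycle time: {total_cycle_time} h, wait time: {total_wait_time} h, "
          f"cost: ${total_cost:.2f}, yield: {rolled_yield:.1%}")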

 

An extension of process-performance analysis, the process-knowledge dictionary is a methodology that collects all other information related to each activity. Typical additional data include:

Operating procedures

Drawings and blueprints

Work instructions

Training documents

 

The process-knowledge dictionary is normally available online to management and employees performing the activity. It’s organized by and accessed through the activity block in the flow diagram.
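
One way to picture such a dictionary is as a simple lookup keyed by the activity block, as in the Python sketch below. The activity names, document titles and fields are invented for illustration only.

    # Hypothetical process-knowledge dictionary keyed by activity block.
    process_knowledge = {
        "Enter order": {
            "operating_procedures": ["OP-101 Order entry.pdf"],
            "drawings_blueprints": [],
            "work_instructions": ["WI-045 Keying an order.docx"],
            "training_documents": ["TR-012 New-hire order entry.pptx"],
        },
        "Ship order": {
            "operating_procedures": ["OP-130 Shipping.pdf"],
            "drawings_blueprints": ["DWG-077 Pallet layout.dwg"],
            "work_instructions": ["WI-060 Packing checklist.docx"],
            "training_documents": [],
        },
    }

    def documents_for(activity: str) -> list[str]:
        """Return every document attached to one activity block."""
        entry = process_knowledge.get(activity, {})
        return [doc for docs in entry.values() for doc in docs]

    print(documents_for("Enter order"))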

Process-variation analysis uses random-number generation tables to calculate a realistic distribution of total process performance for each key measurement. The technique combines the variation that occurs at each activity in a process to predict the variation of the process as a whole.
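
As a rough illustration, a few lines of Monte Carlo sampling can play the role of those random-number tables. The activities, means and standard deviations in this Python sketch are assumptions, not data from an actual process.

    import random

    # Hypothetical activities: (name, mean cycle time, standard deviation), in hours.
    activities = [
        ("Receive order", 2.0, 0.3),
        ("Enter order",   1.5, 0.5),
        ("Ship order",    8.0, 2.0),
    ]

    TRIALS = 10_000
    totals = []
    for _ in range(TRIALS):
        # Sample each activity's cycle time, then add them for one pass through the process.
        total = sum(max(0.0, random.gauss(mean, sd)) for _, mean, sd in activities)
        totals.append(total)

    totals.sort()
    mean_total = sum(totals) / TRIALS
    p95 = totals[int(0.95 * TRIALS)]
    print(f"Mean total cycle time: {mean_total:.1f} h, 95th percentile: {p95:.1f} h")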

Process-flow animation causes a flowchart on a computer screen to come to life. It can show the transactional flow through a process and demonstrate how bottlenecks affect process performance. For example, animation can show customers waiting while service agents are busy serving others. It can also show idle resources with unused capacity due to transportation delays.
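
The queueing behavior behind such an animation can also be sketched in a few lines. The single-server Python model below is hypothetical, with service deliberately slower than arrivals so that a bottleneck appears; because of that assumption, the printed waits grow as the line backs up, which is the effect an animated model makes visible.

    import random

    random.seed(1)
    ARRIVAL_MEAN = 4.0   # average minutes between arrivals (assumed)
    SERVICE_MEAN = 5.0   # average service time; slower than arrivals, so a queue builds

    clock = 0.0           # arrival time of the current customer
    server_free_at = 0.0  # when the single service agent becomes free
    waits = []
    for _ in range(200):                                  # 200 simulated customers
        clock += random.expovariate(1 / ARRIVAL_MEAN)     # next arrival
        start = max(clock, server_free_at)                # wait if the agent is busy
        waits.append(start - clock)
        server_free_at = start + random.expovariate(1 / SERVICE_MEAN)

    print(f"Average wait: {sum(waits) / len(waits):.1f} min, "
          f"longest wait: {max(waits):.1f} min")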

Workflow monitoring is an online model that tracks transactions through a process. Each time a transaction enters an activity, it’s logged in. When it leaves, it’s logged out. The information is tracked so the exact status of each transaction is known at all times. Typically, the maximum time a transaction should remain in each activity is set in the program so that exceptions are highlighted and priorities re-established.
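
A monitor of this kind might be sketched in Python as follows. The activity names, time limits and helper functions are hypothetical and only illustrate the log-in/log-out idea described above.

    from datetime import datetime, timedelta

    # Hypothetical per-activity time limits.
    MAX_TIME = {"Credit check": timedelta(hours=4), "Approval": timedelta(hours=24)}

    # (transaction, activity) -> time the transaction entered the activity
    open_items: dict[tuple[str, str], datetime] = {}

    def log_in(transaction: str, activity: str, when: datetime) -> None:
        open_items[(transaction, activity)] = when

    def log_out(transaction: str, activity: str, when: datetime) -> None:
        entered = open_items.pop((transaction, activity))
        if when - entered > MAX_TIME.get(activity, timedelta.max):
            print(f"EXCEPTION: {transaction} spent {when - entered} in {activity}")

    def overdue(now: datetime) -> list[tuple[str, str]]:
        """Transactions still inside an activity longer than its limit."""
        return [key for key, entered in open_items.items()
                if now - entered > MAX_TIME.get(key[1], timedelta.max)]

    log_in("PO-1001", "Credit check", datetime(2024, 12, 1, 8, 0))
    log_out("PO-1001", "Credit check", datetime(2024, 12, 1, 14, 30))  # 6.5 h > 4 h limit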

Depending upon the complexity of the simulation software package, the seven levels that open the door to best-value processes are partly or totally integrated. As an organization develops its process focus, it steps up through each of these levels. All organizations must evolve to level seven before they can excel.

About the author
H. James Harrington is CEO of the Harrington Institute Inc. and chairman of the board of Harrington Group. He has more than 55 years of experience as a quality professional and is the author of 22 books. Visit his Web site at www.harrington-institute.com.