In the intro to this series we noted that, too often, quality tools and the data we glean from them are used only to solve immediate, mostly shop-floor problems. These gold nuggets of opportunity aren’t used in an equally valuable way to address a company’s strategic goals.
Here we’ll consider how to master quality at the shop-floor, tactical level. More than just byproducts of the production process, quality data are the vital signs that determine if individual processes—and by extension, the entire production system—are healthy. That information can in turn help drive business strategy.
It starts on the shop floor
For most companies, learning to use data to their best advantage entails drifting between visionary potential and problematic reality: the strategic vs. tactical tension. As part of this inevitable tug-of-war, data are rounded up, variability is tamped down, and quality is anxiously measured.
Manufacturers often start their search for SPC software from this need to grapple with tactical aspects of quality data. That's understandable because these are their most immediate concerns: Were the data collected on time? Is my process in control? How many out-of-spec values are there? Did everyone enter the comments, causes, and actions as required? What can help me manage all this? Enterprisewide SPC software can, and most of the time it does.
Unfortunately, many organizations end their search at this shop-floor, tactical level. Critical problems get addressed; some processes are fine-tuned. Those immediate fixes have won the battle, but they haven't won the war.
Implementing an enterprisewide SPC solution to fix specific issues, but stopping short of using it to its full potential, is “doing” quality. To truly master shop-floor quality, three fundamental elements are critical:
• Collecting data
• Acting on data
• Analyzing data
Let’s consider each of these as well as examples of how production departments typically “do” quality. Later in this series, we’ll look at ways to use these tactics to build and constantly improve an organizational strategy.
Collecting data
Four broad tasks summarize data collection on the shop floor: gathering, calculating, maintaining, and documenting. Gathering amasses information however it appears—on forms, in charts, by word of mouth. Calculating wrangles numbers to produce insights about cycle times, specifications, and defects. Maintaining covers machine use and availability, calibrations, and upgrades. Documenting puts all these into a reviewable format.
When you “do” data collection:
Data are collected but not centralized
Whether it’s done on paper, with an Excel spreadsheet, or by means of another system, data are gathered but aren’t centralized or well-coordinated. The information is there, but it’s not available for big-picture analysis. Here are some examples.
A food manufacturer used the same ingredients in many of its products, but these were called different things depending on the department. This made sitewide (never mind companywide) understanding of ingredient performance impossible.
An automotive supplier had software capable of centralized data collection, but different engineers had configured it piecemeal over time. Because each collection protocol was unique to the engineer who built it, operators would get a different set of data depending on how they navigated to it.
Calculations are done but usually manually
When calculations are required, they often are done manually, which invites the inevitable errors of human inconsistency as well as errors from inconsistent methods.
For example, when quality managers from six sites of a beverage manufacturer met, their assumption was that all sites performed chemical analysis measurements identically. Discussion revealed three calculation methods were in play because sites used different versions of similar analytical equipment. They were close, but “close” is not “identical.”
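One way to close that gap is to centralize the math itself. The sketch below is purely illustrative (the acid-content formula and constant are stand-ins, not the beverage manufacturer's actual method): every site calls one shared routine, so the calculation can only be changed in one place and can never drift site by site.

```python
# Illustrative sketch: one shared calculation that every site calls,
# instead of each lab doing the math by hand with its own method.

MEQ_WEIGHT = 0.064  # example constant: milliequivalent weight of citric acid, g/meq


def titratable_acidity(titrant_ml: float, normality: float, sample_ml: float) -> float:
    """Return acid content as grams per 100 mL of sample (standard TA formula)."""
    if sample_ml <= 0:
        raise ValueError("sample volume must be positive")
    return (titrant_ml * normality * MEQ_WEIGHT * 100) / sample_ml


# The same inputs now give the same answer at every site.
print(round(titratable_acidity(titrant_ml=12.4, normality=0.1, sample_ml=10.0), 3))
```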
Maintenance is performed but is usually specialized
For lines, work cells, or plants that require a lot of maintenance, homegrown systems are built to meet the specific needs of that situation. On one level this addresses the problem, but it becomes unmanageable when scaled, or when the system owner is absent.
For example, a machined-components supplier used coordinate measuring machines (CMMs) in its production process, but inconsistent naming conventions (e.g., “OD” vs. “outside diameter” vs. “diameter—outside”) prevented consistent analysis of the automated data collection.
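A common tactic for this kind of problem, sketched below with hypothetical names, is to map every local alias to a single canonical characteristic name before results are stored, so the automated CMM output can be analyzed consistently later.

```python
# Illustrative sketch: normalize local aliases to one canonical characteristic
# name at collection time, so downstream analysis sees consistent names.

ALIASES = {
    "od": "outside_diameter",
    "outside diameter": "outside_diameter",
    "diameter - outside": "outside_diameter",
}


def canonical_name(raw: str) -> str:
    """Return the canonical name for a measured characteristic."""
    key = " ".join(raw.strip().lower().split())  # collapse spacing differences
    if key not in ALIASES:
        # Surface unknown names instead of silently creating another variant.
        raise KeyError(f"Unmapped characteristic name: {raw!r}")
    return ALIASES[key]


for raw in ["OD", "Outside Diameter", "diameter - outside"]:
    print(raw, "->", canonical_name(raw))
```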
Forms are completed but lack oversight and compliance
When operators manually complete forms, it doesn’t ensure compliance. It only ensures that an operator can fill in a form. A classic example is that on most data-collection forms, time either isn’t entered, or the operator just concurs with the time listed on the form. How realistic is it to believe that every check happened at exactly the time specified?
Or take the case of a packaging manufacturer where an operator was two minutes late entering the time a check was performed, and the software tallied it as not having been performed at all. The software isn’t at fault here; it’s just how it was configured. If 30-minute checks aren’t really necessary, then the manufacturer should change the standard operating procedure and reconfigure the software.
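The fix is usually a configuration decision rather than a software limitation. Here is a minimal sketch of the idea, with an assumed five-minute tolerance: a timed check counts as performed if it falls within a grace window around its scheduled time, and the window itself is set to match the standard operating procedure.

```python
# Illustrative sketch: count a timed check as compliant if it falls within a
# configurable grace window, rather than requiring the exact scheduled minute.
from datetime import datetime, timedelta

GRACE = timedelta(minutes=5)  # assumed tolerance; set to match the SOP


def check_is_compliant(scheduled: datetime, performed: datetime,
                       grace: timedelta = GRACE) -> bool:
    """Return True if the check was performed within +/- grace of its schedule."""
    return abs(performed - scheduled) <= grace


scheduled = datetime(2024, 1, 15, 10, 30)
performed = datetime(2024, 1, 15, 10, 32)  # two minutes late
print(check_is_compliant(scheduled, performed))  # True with a 5-minute window
```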
Acting on data
When it's time to take action on the data that have been gathered, most manufacturers respond in a perfunctory manner, whether due to time constraints, staff shortages, or unrelated but urgent tasks. Actions tend to be localized, incomplete, or ineffective. The same time and effort could produce broader and far more useful results.
When you “do” acting on data:
Action is prompt but localized
Suppose action is needed somewhere on the shop floor, most likely in response to a notification that appears only in one workstation's desktop software. The challenge is that someone must be physically present at that workstation to see it.
For example, a beverage manufacturer conducted visual inspections and kept a tally of the total rejects produced. However, the only way to know immediately if product was being rejected was to be on the floor. To check the hourly tally of rejects, an operator had to walk over and look at the display on the visual inspection tool. This was problematic when the operator had to oversee two filling lines in separate areas.
Responses are implemented but are minimal or incomplete
Responses tend to be driven by simple or incomplete criteria, focusing only on out-of-spec values and periodic reviews. When that's sufficient, it isn't bad per se, but this approach usually leaves opportunities on the table.
For example, at a medical device manufacturer, incoming inspections were performed manually on large lots of components. Most checks were go/no-go instead of measurements because it was faster. However, further inspection was needed if an issue was found. Supervisors allocated scarce time to check results periodically, but all results were verified in spreadsheets that didn’t feed any actionable system.
Control charts are used but not effectively
Control charts are the right tool in the right hands, but when teamed with inexperienced workers, they can be problematic. Keeping a workforce trained is challenging, as is preventing tampering. It’s human nature to want to “help” when things “don’t look quite right.”
For example, a complicated process for producing thin film by plasma deposition proved unwieldy and confusing when it was depicted with control charts. Process parameters involved multiple voltages, currents, and gas flows to keep the product consistent. For individual, controllable parameters, control charts worked, but for big-picture, "How's the process doing?" questions, they didn't. Box-and-whisker plots were a more appropriate tool.
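The distinction can be illustrated with a small sketch using simulated data (the parameter names and values are made up, not the manufacturer's): an individuals chart answers "Is this one parameter stable?", while side-by-side box plots give the big-picture view of how all the parameters are behaving.

```python
# Illustrative sketch with simulated data: an individuals chart for one
# parameter vs. box plots summarizing several parameters at a glance.
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(0)
voltage = rng.normal(350.0, 2.0, 100)  # one controllable parameter
params = {                             # several parameters, standardized for comparison
    "voltage": (voltage - voltage.mean()) / voltage.std(),
    "current": rng.normal(0.0, 1.0, 100),
    "gas flow": rng.normal(0.0, 1.0, 100),
}

# Individuals-chart limits: mean +/- 2.66 times the average moving range.
avg_moving_range = np.abs(np.diff(voltage)).mean()
center = voltage.mean()
ucl = center + 2.66 * avg_moving_range
lcl = center - 2.66 * avg_moving_range

fig, (ax1, ax2) = plt.subplots(1, 2, figsize=(10, 4))
ax1.plot(voltage, marker=".")
for level in (center, ucl, lcl):
    ax1.axhline(level, linestyle="--")
ax1.set_title("Individuals chart: one parameter")

ax2.boxplot(list(params.values()), labels=list(params.keys()))
ax2.set_title("Box plots: the whole process at a glance")
plt.tight_layout()
plt.show()
```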
Analyzing data
Analysis on the shop floor would be revolutionized if it were consistently done in real time, but it rarely is. Reliance on preconfigured report formats can also hamper effective investigation.
When you “do” data analysis:
Data are analyzed but not in real time
When doing quality, analysis tends to be post-process. This isn’t bad, but it does mean that people must wait for reports generated by others, which delays the process and discourages “what-if” types of analysis.
For example, visual inspections at a medical packaging supplier were a big issue. Being unable to see what kinds of defects showed up during production runs led to 100-percent inspection and a lot of product sorting. If visual defects could be spotted in real time, adjustments could be made during the run, saving time and reducing scrap.
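Here is a minimal sketch of that real-time idea, with assumed window and threshold values: tally results as each unit is inspected and alert the moment the defect rate over the most recent units crosses a limit, instead of discovering the problem in a post-run report.

```python
# Illustrative sketch: watch the defect rate over a rolling window of inspected
# units and alert as soon as it crosses a threshold, not after the run ends.
import random
from collections import deque

WINDOW = 200       # most recent units to consider (assumed)
THRESHOLD = 0.02   # alert above a 2% defect rate (assumed)


class DefectMonitor:
    def __init__(self, window: int = WINDOW, threshold: float = THRESHOLD):
        self.results = deque(maxlen=window)
        self.threshold = threshold

    def record(self, defective: bool) -> bool:
        """Record one inspection result; return True if an alert should fire."""
        self.results.append(defective)
        window_full = len(self.results) == self.results.maxlen
        rate = sum(self.results) / len(self.results)
        return window_full and rate > self.threshold


random.seed(0)
monitor = DefectMonitor()
for unit in range(1, 5001):
    defective = random.random() < 0.03  # simulated 3% defect rate
    if monitor.record(defective):
        print(f"Alert at unit {unit}: defect rate exceeds {THRESHOLD:.0%}")
        break
```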
Reports are made but usually in a fixed format
As mentioned above, "what-if" analysis is a struggle for many, and it's critical in the tactical world because it affects product being made right now. Fixed reporting formats can't support it, and they're outdated the moment they're sent.
For example, at an aerospace component supplier, all reports and analysis were based on the format required by the customer, but that format didn’t provide the needed flexibility for troubleshooting. End reports were negotiated up front as part of the work agreement, and engineers had to make do with the agreed-on format for analysis because the contract didn’t allow for modifications. Consequently, analysis of what actually occurred during production wasn’t as robust as it could be.
Doing vs. mastering quality
As the examples above show, "doing" quality comes with its own problems and frustrations, which fall into a handful of perennial aggravations. Mastering quality on the shop floor, on the other hand, begins with a shift in data collection: assembling not just adequate but ideal amounts of data and presenting them in the most effective formats. Consider these differences:
Doing: Data collection is usually driven by requirements that may be derived internally (based on “trust” in the process or previous quality issues) or externally (industry regulations or customer requirements).
Doing: Data are gathered and forgotten, or are used only for audits.
Mastering: Data collection is centralized and standardized.
Doing: Data are localized and available only on the shop floor.
Doing: What's collected is data rich but information poor, which can keep alarms from triggering and prevent anyone from acting on the data.
Mastering: Calculations are automated, and devices are integrated.
Doing: Tunnel vision prevents operators from spotting trends, which hampers improvement at both the site and enterprise level.
Mastering: Notifications are automated.
Data-driven decision making can bolster a cycle of perpetual improvement. Accurate data lay the foundation for safe and reliable product, optimal processes, brand reputation, and profit. Additionally, mastering shop-floor quality will improve operator efficiency and productivity. As notifications and subsequent workflows are triggered in real time, operators will require less training while building expertise.