This is the second half of our two-part interview with Doug Fair, who is the chief operating officer of InfinityQS. In part 1, we chatted with Fair about data gluttony, the process for uncovering actionable information, and the power of sampling. The conversation continues here as we dive into a pair of illustrative case studies, learn how to reimagine old paradigms and the quality function as a whole, and ponder the important data that manufacturers rarely even consider.
Quality Digest: Let’s dig a little deeper into those case studies, because they’re interesting to me, and I’m sure they’re interesting to our readers, too. Regarding the two clients you mention, the plant with 300+ personnel and the aerospace company: did they struggle at the beginning with acquiring too much data and suffering from data gluttony?
Doug Fair: Yes, the aerospace company had huge data gluttony problems! So much so that they weren’t even sure what to look at. However, the other company, where the plant was saved, had very few data collection methods in place at all. I encouraged them to devise a data sampling strategy, take a little bit of data, and then see what we could learn from them. They were blown away by what they learned even from very small samples of data. We were able to identify what maintenance tasks needed to be performed, what operational changes should be enacted, which production lines required specific changes, and which products had specific issues—all within a few weeks. It was nothing short of extraordinary. It’s all about leveraging the data and discovering the information buried within it. Then it’s just a matter of making the changes and improvements in the plant based on what that information reveals.
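Fair doesn’t spell out the mechanics here, but a classic way to pull this kind of signal from small samples is an X-bar/R check per production line. The short Python sketch below is one possible illustration, not InfinityQS’s method; the line names, measurements, nominal target, and target-centered limits are all invented for the example.

```python
# A minimal, hypothetical sketch of learning from small samples:
# compare each line's subgroup averages against target-centered
# X-bar limits derived from the average range. All values invented.
from statistics import mean

# A few 5-piece subgroups per production line (invented data).
samples = {
    "Line 1": [[10.2, 10.1, 10.3, 10.2, 10.1],
               [10.2, 10.3, 10.2, 10.4, 10.2]],
    "Line 2": [[10.6, 10.8, 10.5, 10.9, 10.7],
               [10.8, 11.0, 10.7, 10.9, 10.8]],
}

TARGET = 10.2  # nominal dimension (assumed)
A2 = 0.577     # X-bar chart constant for subgroups of n=5

for line, subgroups in samples.items():
    xbar = mean(mean(s) for s in subgroups)          # grand average
    rbar = mean(max(s) - min(s) for s in subgroups)  # average range
    lcl, ucl = TARGET - A2 * rbar, TARGET + A2 * rbar
    verdict = "ok" if lcl <= xbar <= ucl else "investigate"
    print(f"{line}: x-bar={xbar:.2f}, R-bar={rbar:.2f}, "
          f"limits=({lcl:.2f}, {ucl:.2f}) -> {verdict}")
```

Even two subgroups per line are enough to separate a centered line from one that has drifted, which is the essence of what Fair’s client discovered from very small samples.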
QD: These are two very different manufacturers: one didn’t have enough data and needed to understand what data could do, and the other was drowning in data and needed to learn how to pare it back and leverage it for information. I’m sure these are things that you see all the time.
DF: For me, one of the most exciting things we do is helping our clients understand what these data are telling them and leverage these techniques to benefit their organization. It’s extraordinarily interesting to me. Maybe I’m just crazy that way, but I love doing this stuff.
One of the greatest opportunities for improvement for manufacturers comes from rejecting outdated quality paradigms. We need to consider what’s technologically possible today. The two case studies I described earlier are extraordinary examples, but they were plant-based and plant-specific. What we are seeing today, with the advent of software as a service (SaaS) technologies and the ability to gather data and pass it over the web to a SaaS product, is that we can now consolidate quality data across multiple plants. Back in the day, quality efforts were usually focused on a single production line, a particular machine tool, or maybe a specific product code. Almost always, these efforts were confined to the plant level, and key learnings were rarely shared with other plants making the same products. But now, because of the SaaS capability we have across the world, we can aggregate data across multiple plants and regions, and that lets us acquire and analyze information across the enterprise. The implication is that far more information is available enterprise-wide, which means far greater opportunities for improving quality, reducing costs, and ensuring standardization and consistency of product manufacture.
A lot of people think, “We can’t view quality data simultaneously across multiple plants.” Well, yes, you can. You see, we can now put all quality data in the cloud, which opens the whole world to us. Now we can assess quality levels for a region and across multiple plants, or across the entire enterprise—even into the supply chain—all with tablet computers or smartphones.
We’ve got the technology to do it, and it’s inexpensive. Some organizations believe moving to these new technologies is very expensive, but just the opposite is the case. That old paradigm is simply erroneous. Organizations can keep their data collection devices and all their IT investments in hardware and infrastructure. Manufacturers don’t have to rip and replace data collection technologies they have invested in. Instead, they can simply redirect those data from a local area network product to a SaaS product. By doing that, they can have all the enterprise’s quality data in one central cloud-based repository, sitting there, ripe for quality professionals, engineers, and managers to review across the enterprise every week, month, or quarter. That provides the ability to frequently and regularly assess and interrogate quality data at a much higher level, and to generate much greater returns on quality investments. That is, if we were able to achieve extraordinary improvements by focusing on data at the plant level alone, imagine what we could achieve by analyzing data across lots of plants.
All this goes against another paradigm, which says that quality resources should be deployed at the plant level. Not necessarily. If the data are available across the enterprise in SaaS products, manufacturers can prioritize quality activities by asking, “Where, across all regions and plants, should we focus our Six Sigma resources?” or, “Where do we deploy our very limited and incredibly valuable quality improvement specialists?” We haven’t had that visibility before. Now that reality is here, and it’s an inexpensive approach that doesn’t require ripping and replacing technology investments.
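To make that prioritization question concrete, here is a deliberately simple Python sketch of ranking plants by estimated cost of poor quality once their data sit in one repository. The plant names, volumes, defect counts, and costs are placeholders invented for illustration; any real ranking would use the enterprise’s own metrics.

```python
# Hypothetical sketch: ranking plants by estimated cost of poor
# quality (COPQ) to decide where to deploy improvement resources.
# Plant names, volumes, defect counts, and costs are all invented.
plants = [
    # (plant, region, units produced, defective units, cost per defect $)
    ("Plant A", "EMEA", 1_200_000, 9_600, 14.0),
    ("Plant B", "NA",   2_000_000, 8_000, 22.0),
    ("Plant C", "APAC",   800_000, 9_200,  9.5),
]

# Estimated annual COPQ and defect rate per plant, costliest first.
ranked = sorted(
    ((name, region, defects / units, defects * cost)
     for name, region, units, defects, cost in plants),
    key=lambda row: -row[3],
)

for name, region, rate, dollars in ranked:
    print(f"{name} ({region}): {rate:.2%} defective, ~${dollars:,.0f}/year")
```

Note that in this invented example the plant with the highest defect rate is not the one costing the most, which is exactly why enterprise-wide visibility changes where the specialists get sent.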
QD: That connects to this idea of reimagining quality, because there’s still a legacy belief about what quality is and what it is not. Executives are not quality people, and they’re often not statistical people, either. So when they hear the term “quality improvement,” they see dollar signs, and not dollar signs that they’re going to be saving. Their initial reaction is, “Oh my gosh, what’s that going to cost us?” The idea of cost of quality is one that people still don’t really understand. Do you think we need to proselytize these ideas to management?
DF: You’re spot on there. But I think that proselytizing must reach the people who are responsible for the business in its entirety. That means vice presidents, directors, executive vice presidents, and the entire C-suite. These are the people who are saying, “We’ve got to make sure that our company, in its entirety, is competitive, and that its quality is as good as it can be.” Because we now have the ability to extract information at a much higher level across the enterprise, as I described, by using SaaS systems, those people should be even more engaged. So, as quality professionals, we have to proselytize to those people in particular, because we’re going to be talking specifically about improving their bottom line, increasing company market share, and improving their ability to compete on a global level.
I’ve spent time in hundreds of manufacturing plants around the globe, and I’ve seen that quality resources are regularly expected to determine—on their own—what needs to be improved from a quality perspective. But if we have this visibility, this information, at the enterprise level regarding quality across all plants and all regions, then the C-suite can begin directing the prioritization of quality activities across the enterprise. Business owners, directors, vice presidents, and C-suite individuals should realize that if they want to get the biggest bang for their buck, then they need information that will allow them to target, globally, where the greatest opportunities for improving costs and quality exist. And by doing so, they can more properly prioritize, direct, and guide those quality resources to take on the work that needs to be done to more quickly improve their bottom line and competitiveness. That’s something we haven’t had before. But we have it now.
QD: Let’s close with something that I know you’re particularly passionate about: the data that people rarely look at, and the value that can be found within them.
DF: In almost every organization I’ve worked with, there is extraordinary value in data that no one looks at. I mean they literally never look at it. This is where I have seen huge improvements in quality, productivity, efficiency, and cost.
Quality professionals are fantastic problem solvers. We’re proud of that. When something goes awry out on the shop floor, we descend on it, we fix it, we feel good about it, and it’s really exciting. But the exceptions-based view of quality improvement is only one part—I think it’s the smallest part—of the greatest opportunities we have before us to improve.
What people don’t realize is the extraordinary value of information hidden in data that are actually within specification limits. What I mean is that, because a value is in spec, most companies don’t look at it. It’s because of our exceptions-based paradigm of descending on problems and fixing them. Hey, got a problem here? Fix it. Great. Got a fire over there? Put it out. Great. But when people step back to look at their data, I tell them to include data that actually fall within specification limits. And they usually look at me like I’ve got three heads, but really, that’s where there’s a tremendous amount of value.
For example, a couple of years ago I worked closely with a beverage company in Europe. They proudly told me that everything was in spec, and they argued that there was little they could do to improve because they had no underfilled or overfilled bottles. But then we gathered some fill data and interrogated it. Once we did that, they could not believe how much they were overfilling bottles. Now, this process wasn’t out of spec, and none of the bottles were filled to the point that product spilled out, but they were overfilling above the label-stated volume. None of their customers were receiving bottles with less than was advertised, but all of them were receiving more than the label stated. The data we gathered revealed inconsistencies between fill heads, between bottle types, and in fills between products. As a result of interrogating their data, the client realized a $1.1 million annual savings in unnecessary overfills—on just one of their 20-plus production lines. They were shocked at how much they were able to save, especially since their savings came from analyzing data that were within specifications—data that no one was looking at.
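The interview doesn’t show the arithmetic behind that $1.1 million figure, and the Python sketch below doesn’t reproduce it; it only illustrates the general shape of such an analysis. The label volume, per-milliliter cost, production volume, and fill measurements are all invented placeholders, not the client’s numbers.

```python
# Hypothetical sketch of quantifying in-spec overfill by fill head.
# Every value here is invented; only the idea comes from the interview.
from statistics import mean

LABEL_ML = 500.0               # label-stated volume (assumed)
COST_PER_ML = 0.0004           # product cost per mL (assumed)
BOTTLES_PER_YEAR = 40_000_000  # one line's annual output (assumed)

# In-spec fill measurements grouped by fill head (invented data).
fills = {
    "head 1": [503.1, 502.8, 503.4, 503.0],
    "head 2": [505.9, 506.2, 505.7, 506.0],  # consistently higher
    "head 3": [501.2, 501.0, 501.5, 501.1],
}

# Average giveaway above the label volume, per fill head.
overfills = {head: mean(v) - LABEL_ML for head, v in fills.items()}
for head, ml in sorted(overfills.items(), key=lambda kv: -kv[1]):
    print(f"{head}: {ml:+.1f} mL average overfill")

# Rough annualized cost of the giveaway across the line.
avg_overfill = mean(overfills.values())
annual_cost = avg_overfill * COST_PER_ML * BOTTLES_PER_YEAR
print(f"Estimated annual giveaway: ${annual_cost:,.0f}")
```

Broken out by fill head, the same in-spec data that “looked fine” immediately show which heads are giving product away, which is the kind of finding Fair’s client turned into savings.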