The argument for moving toward enterprisewide model-based definition is simple: The way we describe products is increasingly digital, not paper-based. The way we optimize and validate products is almost entirely digital, apart from a few remaining destructive tests. And the way our production machines accept design instructions is increasingly digital as well.
Carrying our modeling data through simulation and related design iterations to a final format, and straight into machine production, is the logical goal of all software and hardware development. It’s where everything we do as manufacturers and software developers has been heading for decades. I believe that both the hardest and easiest parts of the all-digital effort lie just ahead.
The hard part will be developing more complete translation standards between designs and machines (STEP, JT, QIF, and more) that don’t fall short or become obsolete as CAD innovations race forward via the market’s independent vendors. Their mathematical kernels and geometric representations of objects differ slightly from one another, even when an identically dimensioned object looks exactly alike on every vendor’s screen. The formats CAD vendors output to transmit their designs to machines vary as well; here again, the geometric formulas for creating a cone or a through-hole are not the same. It’s difficult to imagine a major change in this entrepreneurial dynamic.
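To see why exact geometric agreement across systems is so elusive, consider a minimal, illustrative sketch (in Python, not tied to any CAD kernel) of a single nominal feature: a circular through-hole represented analytically in one system and as a faceted polygon by a downstream consumer. The deviation between the two can only be judged against a tolerance, which is exactly what exchange formats and validation tools must account for.

```python
import math

def max_chordal_deviation(radius: float, segments: int) -> float:
    """Largest gap (sagitta) between a circle and an inscribed regular polygon."""
    return radius * (1.0 - math.cos(math.pi / segments))

# Nominal through-hole radius of 5 mm, tessellated for a downstream consumer.
radius_mm = 5.0
tolerance_mm = 0.001  # illustrative receiving-system modeling tolerance
for segments in (24, 96, 360):
    deviation = max_chordal_deviation(radius_mm, segments)
    within = deviation <= tolerance_mm
    print(f"{segments:4d} facets -> deviation {deviation:.6f} mm, within tolerance: {within}")
```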
Standards will be a huge future issue in automating communication among sensors, Wi-Fi and networked devices, within the digital twin, and across the whole classes of software and hardware that make up the vision of IIoT and Industry 4.0.
Right now
But before we look too far ahead, there are real problems earlier, at the source that feeds production systems. CAD is where we must first reconcile and authenticate digital geometry and related product manufacturing instructions that are consumed downstream by machines. Design accuracy and the readability of these data are critical to product integrity and error-free automation—at its start and finish. Most immediately, unified, transferable model data are what allow OEMs and their supply chains to collaborate and support product development efficiently.
Currently, there is significant interruption and rework in the flow of CAD and semantic product manufacturing instructions data among these stakeholders, and this impacts automation. A break in a surface line or intersection, for instance, can impair precision machining. Tiny faces that are acceptable within a CAD system design, but not to downstream consumption processes, can also disrupt quality and automation. Fortunately, elements of a data validation and transfer solution are in place today that make conditions tolerable, but these solutions are generally implemented piecemeal, without a vision for what could improve across the whole playing field.
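As a hedged illustration of what this kind of screening involves, here is a small Python sketch that flags sliver faces and very short edges in a simplified boundary-representation summary. The Face structure and the thresholds are assumptions made for the example; commercial product data quality tools apply checks like these directly against the CAD kernel’s geometry.

```python
from dataclasses import dataclass

@dataclass
class Face:
    face_id: str
    area_mm2: float      # face area reported by the modeler
    min_edge_mm: float   # shortest bounding edge of the face

# Thresholds below which downstream machining or meshing commonly struggles.
# These are illustrative values, not an industry standard.
MIN_FACE_AREA_MM2 = 0.01
MIN_EDGE_LENGTH_MM = 0.05

def find_problem_faces(faces: list[Face]) -> list[str]:
    """Return IDs of faces likely to disrupt downstream consumption."""
    return [
        f.face_id
        for f in faces
        if f.area_mm2 < MIN_FACE_AREA_MM2 or f.min_edge_mm < MIN_EDGE_LENGTH_MM
    ]

model = [
    Face("F1", area_mm2=120.0, min_edge_mm=4.2),
    Face("F2", area_mm2=0.004, min_edge_mm=0.02),  # sliver left by a trimmed surface
]
print(find_problem_faces(model))  # -> ['F2']
```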
How OEMs and supply chains share data, plus a new vision
The challenges to digital automation start with adherence to proven, but legacy, practices around data transfer and the natural inclination of manufacturers to pick best-in-class, nonhomogeneous solutions. That’s how the tools of design and production evolved. That’s how use-cultures become firmly established and resistant to change. We see this pattern play out in OEM/supply-chain practices related to CAD data sharing. Here’s a typical process map:
First, an OEM provides the requirements for a design deliverable to its supplier, from typeface to embedded annotations. These include how the technical data will be delivered, which CAD format to use, which quality criteria to pass, and in some cases, the feature history within the CAD format. After this, it’s routinely up to tier 1s to organize their suppliers in any complex supply-chain scenario. Each tier 2 and tier 3 supplier will likely have two or more major and specialized CAD and simulation packages it uses, both for performance reasons and for faster compliance with partners.
Tier 1s that are in big markets such as automotive typically manage more than 30 CAD environments to accommodate all of the OEMs’ individual delivery expectations. This includes different incoming software versions and practices among their subsuppliers. Aside from their own internal formats and versions that they manage, OEMs have individualized rules about geometry quality, metadata, use of layers, and so forth. Neither primary manufacturers nor suppliers want engineering staff manually checking and resolving rule requirements and data from disparate information streams. Geometry repair and translation software are commonly employed today as point solutions in this process of mending and transferring data across platforms.
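One way to picture how this burden can be automated is to treat each OEM’s delivery expectations as a machine-readable profile and check every incoming deliverable against it before a human ever looks at it. The profile fields and sample values in the Python sketch below are hypothetical; real rule sets are far richer and typically live inside commercial product data quality and PLM tools.

```python
# Hypothetical OEM delivery profile: required format, version, and quality rules.
OEM_PROFILE = {
    "cad_format": "JT",
    "min_software_version": (10, 5),
    "max_gap_mm": 0.005,                   # allowed gap between adjacent surfaces
    "layers_allowed": {"GEOMETRY", "PMI"},
    "feature_history_required": False,
}

# Metadata extracted from a supplier's incoming deliverable (illustrative values).
deliverable = {
    "cad_format": "JT",
    "software_version": (10, 3),
    "max_gap_mm": 0.002,
    "layers_used": {"GEOMETRY", "PMI", "SCRATCH"},
    "has_feature_history": False,
}

def check_delivery(profile: dict, item: dict) -> list[str]:
    """Return human-readable violations of the OEM's delivery rules."""
    issues = []
    if item["cad_format"] != profile["cad_format"]:
        issues.append(f"format {item['cad_format']} differs from required {profile['cad_format']}")
    if item["software_version"] < profile["min_software_version"]:
        issues.append(f"software version {item['software_version']} is below the minimum")
    if item["max_gap_mm"] > profile["max_gap_mm"]:
        issues.append("surface gaps exceed the allowed tolerance")
    extra_layers = item["layers_used"] - profile["layers_allowed"]
    if extra_layers:
        issues.append(f"unexpected layers: {sorted(extra_layers)}")
    if profile["feature_history_required"] and not item["has_feature_history"]:
        issues.append("feature history is missing")
    return issues

for issue in check_delivery(OEM_PROFILE, deliverable):
    print("VIOLATION:", issue)
```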
The rewards are high for those manufacturers and suppliers that can break from an existing use-culture, manage their deadlines and risks, and integrate approaches that are new to them. Productivity, quality, and costs improve dramatically when opportunities are identified for fully automating model quality, requirements checking, and validation.
Making it easier: automated product data quality
Translation, in and of itself, is a decades-old practice. It has been evolving alongside the design tools it supports. With cooperation from CAD vendors that share their application programming interfaces, geometric healing and translation services are better and more accurate than ever before. Users have confidence in their data-exchange methods. Industry recognizes its own diversity and that of the software that supports product development. There is broad agreement, as a result, on the need for continued collaboration and supporting movements such as model-based definition.
This consensus is also enabling the development of sophisticated approaches aimed at creating not just point solutions but whole environments: environments for fixing and managing complex standards, mathematical discrepancies, and change configuration, and for meeting the requirements OEMs place on their supply chains to deliver a “clean,” authoritative model out of a forest of CAD and CAE software systems.
Today there are enterprisewide solutions for multi-CAD data exchange, optimization, and validation that can accurately, automatically, and seamlessly communicate comprehensive CAD and appropriately applied semantic product manufacturing instructions data between different manufacturers, divisions, teams, and supply groups. Where automation encounters issues of design intent and ambiguity, those areas are called out for review and often resolved in minutes rather than days. Product data quality (PDQ) checking approaches review more than 75 critical areas in the design-to-manufacture-to-archiving stream, all of them customizable. PDQ software of this kind can reside in PLM systems (or on a single desktop) to further visibility and automation.
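To make the validation idea concrete: a common technique after translation is to compare computable properties of the source and target models against tolerances (semantic product manufacturing instructions are compared separately). The Python sketch below shows only that property-comparison step, with hypothetical numbers and tolerances; actual tools compute these values directly from the geometry in each system.

```python
# Mass-property comparison between a source model and its translated copy.
# All values and tolerances here are illustrative.
source = {"volume_mm3": 125000.0, "area_mm2": 15000.0, "cog_mm": (10.0, 5.0, 2.5)}
target = {"volume_mm3": 124999.2, "area_mm2": 15000.4, "cog_mm": (10.0, 5.0, 2.5001)}

TOLERANCES = {"volume_mm3": 1.0, "area_mm2": 0.5, "cog_mm": 0.001}

def validate(src: dict, tgt: dict) -> list[str]:
    """Return a list of properties that drifted beyond tolerance during translation."""
    failures = []
    for key in ("volume_mm3", "area_mm2"):
        drift = abs(src[key] - tgt[key])
        if drift > TOLERANCES[key]:
            failures.append(f"{key} drifted by {drift:.4f}")
    cog_shift = max(abs(a - b) for a, b in zip(src["cog_mm"], tgt["cog_mm"]))
    if cog_shift > TOLERANCES["cog_mm"]:
        failures.append(f"center of gravity shifted by {cog_shift:.4f} mm")
    return failures

issues = validate(source, target)
print("VALIDATION PASSED" if not issues else issues)
```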
As we grapple with competition, speed-to-market opportunities, and frankly, fewer trained human engineering resources, automating the complexity of CAD in our OEM and supply chain interactions makes clear sense. And if we believe in the larger purpose of model-based definition, then implementation of automated product data quality is fundamental to its success.