Electronic temperature compensation has been successfully applied to shop-floor gauges for more than 25 years. It has matured into a proven technology and one of the most easily cost-justified ways to achieve gauge correlation and eliminate the most common cause of high-resolution gauge error: temperature. Yet it is still poorly understood and commonly undervalued.
Electronic temperature compensation can save many times its investment cost within months by maintaining gauge repeatability and reproducibility (R&R), and thus effectively controlling production processes while temperatures vary, all in real time.
The uneasy relationship of gauges and temperatures
Gauges are used to control quality during the production or rework cycle of metal-worked components so as to ensure compliance with part specifications. It’s vitally important that these gauges can be relied on to check dimensions accurately. Gauges are tested for their accuracy and repeatability. These tests are usually performed at stable temperatures, at which time they may show acceptable results. Thereafter gauges will cease to be repeatable or accurate as temperatures fluctuate if thermal changes are not taken into account (see figure 1). Run the same gauge repeatability and reproducibility (R&R) tests while changing the temperatures of part, master, or gauge fixture or head, and a very different and unacceptable result will be obtained.
Figure 1: [image not reproduced]
The international reference standard, ISO 1:1975, "Standard reference temperature for industrial length measurements," states: "The standard reference temperature for industrial length measurements is fixed at 20°C." So the correct dimension, by international convention, is that which is obtained when the part, setting master, and gauge (the "elements" of a measurement system) are at 20°C (68°F), unless otherwise specified. It can be impractical and expensive to keep all three elements of the measuring system at a stable 20°C (68°F), but a properly specified and configured electronic temperature compensation system can sense the temperature of each element and correct the gauge so that dimensions are displayed as if all three were at 20°C (68°F).
Manufacturers of metal components used in industries such as automotive, transport, aerospace, railroads, and mechanical machinery operate in highly competitive markets. Driven by goals such as improving mileage and environmental awareness, and reducing wear and tear, warranty expense, rework, or scrap costs, tolerances on critical dimensions of moving parts and their enclosures continue to get ever tighter. It's not uncommon to find tolerances expressed to 3 (metric) or 4 (Imperial) decimal places. At this point the laws of physics must be addressed, and inevitably it becomes more expensive to achieve the required levels of accuracy. When these tolerances are applied to dimensions in excess of 50 or 75 mm (2 or 3 in.), dimensional measurements can display considerable variation due to temperature fluctuations.
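To put the magnitude in perspective, here is a minimal sketch of the standard linear-expansion estimate, assuming a typical handbook coefficient of 11.5 × 10⁻⁶/°C for steel. The function name and the numbers are illustrative, not taken from the article.

```python
# Illustrative thermal-growth estimate for a steel dimension.
# ALPHA_STEEL is an assumed handbook value, not a measured coefficient.
ALPHA_STEEL = 11.5e-6  # linear expansion, 1/degC

def thermal_growth_um(length_mm: float, delta_t_c: float,
                      alpha_per_c: float = ALPHA_STEEL) -> float:
    """Return the change in length, in microns, for a temperature change."""
    return length_mm * alpha_per_c * delta_t_c * 1000.0  # mm -> microns

# A 10 degC swing on a 75 mm dimension:
growth = thermal_growth_um(75.0, 10.0)  # roughly 8.6 microns
```

Under those assumptions, a 10°C swing moves a 75 mm steel dimension by roughly 8.6 µm, which is several times larger than the tightest tolerances discussed above.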
Many CMMs are equipped with temperature-compensation capability, but it is usually restricted to a thermal range of about 15°F (8°C), since 3D compensation is particularly challenging. Single-axis compensation can be effective over a much greater range, such as 45°F to 130°F, a span of 85°F (about 47°C). There are other ways to reduce thermal effects in measurements, but they can be costly. Air conditioning, or at least air tempering or coolant control, is expensive and usually less successful. Waiting for thermal stabilization, perhaps in a controlled environment such as a gauge room, takes time. Ignoring the problem will ultimately take its toll in other ways, such as customer rejects, warranty issues, and end-user dissatisfaction.
Basics of temperature compensation systems
Temperature compensation systems are a cost-effective way to solve the problem. They can help squeeze out the much-needed last few microns, or tenths of a thousandth, of accuracy and repeatability in fluctuating environmental conditions. However, they are not necessarily simple to define and set up, and it is too easy to oversimplify the solution. It's not sufficient to tell a gauge supplier that "temperature compensation is required"; this leaves too much to interpretation and has led to instances where the technology has earned itself a bad reputation. A good system will measure the temperatures of workpiece, master, and gauge and, using customized correction coefficients, will correct for each of them if they are not at reference temperature. The sensors will respond quickly (or as quickly as physics allows), and a compensation algorithm will correct the gauge's measurements so that dimensions display as if all elements were at reference temperature.
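As a sketch of how such an algorithm might work for a comparative gauge, assuming a simple linear-expansion model; the function name and coefficients are hypothetical, since the article does not give any vendor's actual formula:

```python
REF_C = 20.0  # ISO 1 reference temperature, degC

def size_at_reference(raw_deviation_mm: float,
                      master_nominal_mm: float,
                      t_part_c: float, k_part: float,
                      t_master_c: float, k_master: float) -> float:
    """Infer the part size at 20 degC from a comparative gauge reading.

    raw_deviation_mm: gauge reading (part minus master) at current temps.
    k_part, k_master: empirically fitted correction coefficients, 1/degC.
    """
    # Size the master actually presents to the gauge at its current temperature.
    master_now = master_nominal_mm * (1.0 + k_master * (t_master_c - REF_C))
    # Size the part actually has at its current temperature.
    part_now = master_now + raw_deviation_mm
    # Shrink/grow the part size back to what it would be at 20 degC.
    return part_now / (1.0 + k_part * (t_part_c - REF_C))
```

For example, with the master at 20°C and a steel part (assumed coefficient 11.5 × 10⁻⁶/°C) at 30°C, a raw reading of +5.75 µm on a 50 mm part resolves to a true size of 50 mm at reference temperature; without compensation it would have been flagged as oversize.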
When considering the use of temperature compensation, it’s worth spending some time defining the job that is to be done—and specifically the expected outcomes. With simple part shapes such as a cylinder liner or short shaft, the solution may be relatively straightforward, and a single temperature sensor may suffice to pick up all relevant workpiece temperatures so that repeatable accuracy (i.e., displaying the true dimension at reference temperature) can be achieved over a specified temperature range.
When part geometries are more complex, it can become necessary to identify zones on the part that are prone to exhibiting different temperatures at the time of gauging. Zones of differing mass, possibly separated by some distance, may conduct heat at different rates and therefore react differently to the same thermal exposure. For example, in a parts washer, during a machining operation, or while sitting idle for some time afterward, different zones on the part may respond at different rates to their recent thermal history. In such cases it is expedient to perform more empirical testing to ascertain the best sensor locations and correction coefficients. This may be more expensive, but the benefits have proven to easily outweigh the cost.
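One common way to combine several zone sensors is a weighted average, with the weights chosen during the empirical testing described above. This is an illustrative sketch, not any vendor's algorithm, and the weights shown are invented for the example:

```python
def effective_part_temp(zone_temps_c, weights):
    """Blend multiple zone-sensor temperatures into one effective
    part temperature. Weights come from empirical testing."""
    assert len(zone_temps_c) == len(weights)
    total = sum(weights)
    return sum(t * w for t, w in zip(zone_temps_c, weights)) / total

# Three zones, with the heavy section weighted most strongly:
t_eff = effective_part_temp([22.0, 26.0, 24.0], [0.5, 0.3, 0.2])
```

The blended temperature then feeds the same correction formula used for a single-sensor part.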
Note that the term “correction coefficients” is used rather than “coefficients of expansion.” The reason is that typical handbook-derived coefficients are usually good only to within ±15 percent or so, and other factors, such as geometry, different materials, and inserts, can affect the rate of thermal changes. Consequently, it’s best to perform empirical tests to determine the best coefficient fit.
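Determining a correction coefficient empirically often reduces to a least-squares fit of observed measurement error against temperature offset, normalized by the nominal length. A minimal sketch under that assumption:

```python
def fit_correction_coefficient(delta_ts_c, errors_mm, nominal_mm):
    """Least-squares slope of measurement error vs. temperature offset,
    divided by nominal length, giving a correction coefficient in 1/degC."""
    n = len(delta_ts_c)
    mean_t = sum(delta_ts_c) / n
    mean_e = sum(errors_mm) / n
    num = sum((t - mean_t) * (e - mean_e)
              for t, e in zip(delta_ts_c, errors_mm))
    den = sum((t - mean_t) ** 2 for t in delta_ts_c)
    return (num / den) / nominal_mm
```

Fed with errors recorded at several known temperature offsets on a part of known nominal size, the fit recovers the coefficient that best matches that specific part and gauge, rather than a ±15 percent handbook value.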
The most recent generations of compensation systems are user-friendly and transparent. While password protection may be advisable so that programs can't be altered without authorization, authorized users should have access to the key variables used in the correction algorithm, such as correction coefficients and nominal dimensions. Once the system is installed and operational, it's important that good documentation is maintained. A manual should explain the purpose of the system and the maintenance the sensors require (e.g., keeping them clean and correctly positioned to make good contact). Too often, operators or managers unfamiliar with the system take over in the years that follow implementation, and the system ends up ignored, neglected, unplugged, or otherwise taken out of service. A simple reference manual can prevent this.
While recognizing that budget managers constantly strive to minimize capital expenditures, it's worth noting that users have studied the payback on an investment in temperature compensation. It's usually measured in weeks or months, after accounting for savings from reduced scrap and rework, improved gauge correlation, fewer rejections, and avoidance of the much larger investment in temperature controls such as environment tempering or coolant control.
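The payback arithmetic itself is simple. A sketch with purely illustrative dollar figures, not taken from any of the cases below:

```python
def payback_weeks(investment_usd: float,
                  weekly_scrap_savings: float,
                  weekly_rework_savings: float,
                  weekly_other_savings: float = 0.0) -> float:
    """Weeks until cumulative weekly savings equal the initial investment."""
    weekly = (weekly_scrap_savings + weekly_rework_savings
              + weekly_other_savings)
    return investment_usd / weekly

# Hypothetical: $40,000 system, $3,000/week scrap and $2,000/week rework savings.
weeks = payback_weeks(40000.0, 3000.0, 2000.0)  # 8.0 weeks
```

At $5,000 per week in combined savings, a $40,000 system would pay for itself in eight weeks, which is consistent with paybacks "measured in weeks or months."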
Supporting data
For example, in one case a major automaker installed a new piston line and used temperature compensation on its gauges at a cost of $40,000. The rejected alternative was to install a temperature-controlled accumulating facility around the entire gauging operation at a cost of $1 million. The line was tested and subsequently went into production. The tests, and subsequent monitoring over a period of seven years, showed that the compensation system consistently corrected for more than 97 percent of thermal errors. Pistons could be inspected to better than ±1 µm repeatability, regardless of ambient temperature changes and process variations.
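A rough plausibility check of the 97 percent figure, assuming an aluminum piston of about 90 mm diameter, a handbook coefficient of 23 × 10⁻⁶/°C, and a 10°C excursion (all assumed values, not data from the study):

```python
# Assumed handbook expansion coefficient for aluminum, 1/degC.
ALPHA_AL = 23e-6

def residual_error_um(length_mm: float, delta_t_c: float,
                      fraction_corrected: float,
                      alpha_per_c: float = ALPHA_AL) -> float:
    """Thermal error left over after compensation, in microns."""
    raw_um = length_mm * alpha_per_c * delta_t_c * 1000.0  # uncompensated error
    return raw_um * (1.0 - fraction_corrected)

residual = residual_error_um(90.0, 10.0, 0.97)  # about 0.6 microns
```

Under those assumptions the uncompensated error is about 20.7 µm, and the 3 percent residual is about 0.6 µm, consistent with the reported ±1 µm repeatability.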
Figure 2: [image not reproduced]
Following are further examples of benefits obtained by other users of electronic temperature compensation.
Temperature compensation of engine cylinder bores at a major U.S. auto engine plant:
• Automobile engine-block production rate increased by more than 50 percent due to increased accuracy and gauging speed.
• Cylinder bore repeatability improved to 1 µm, and accuracy was maintained.
• Temperature sensing was possible at multiple locations; three sensors in the gauge head accounted for variations within the bore (see figure 2).
Image 3: [image not reproduced]
Temperature compensation of in-process grinding gauge:
• Cp and Cpk improved by more than 200 percent.
• Production rate improved from reducing “spark out” and dressing times.
• Dimensional accuracy improved and held to ±2 µm while grinding.
Image 4: [image not reproduced]
Finally, here are data from a study performed after installing a gauge for an aluminum transmission housing at another major automaker. Note that eight different bores were measured repeatedly while the part temperature varied by up to nearly 25°F (14°C).
Image 5: [image not reproduced]
As these cases demonstrate, properly implemented temperature compensation systems can do an excellent job of minimizing the loss of accuracy and repeatability as temperatures change. It’s a cost-effective technology that helps to meet the increasing challenges posed by ever-tightening tolerance requirements.