They’re Either Simple and Obvious — or They’re Not
We’ve talked before about many of the geometric parameters that appear on drawing specifications, as well as some of the methods used to measure and inspect parts such as the precision metal components we make here at Metal Cutting Corporation. What we’ve only lightly touched on, however, are some of the challenges (dare we say, headaches?) involved in the calibration standards for various tools used in our business as well as by our customers.
Getting the Terminology Straight
Calibration standards can be simple and obvious — or they can be not simple and obvious at all. That’s because while, in theory, calibrating is absolute, there are many reasons why it is not exactly absolute. Even the shared assumptions about calibration standards are not without challenges. For instance:
- While there are accepted intervals between calibrations, technically a device could drift out of calibration minutes after it has been calibrated.
- Traceability for calibration is normally documented back to a standard established by the National Institute of Standards and Technology (NIST); however, in some circumstances there simply is no NIST standard to trace back to.
Measuring equipment is almost always calibrated, and there are also functional aspects of manufacturing equipment that need to be within specification and therefore need to be calibrated. However, even the term calibration is subject to some interpretation: When someone says they are doing a calibration, are they actually performing some action to bring a device back into specification? Or by "calibrating," do they simply mean they are checking devices to see if they are still functioning within the specified tolerance according to the calibration standards?
For example, here at Metal Cutting, when we calibrate our ovens, we are checking to make sure they are reading temperatures properly. If an oven is not working correctly, it is deemed out of calibration, and we will repair it and then perform the temperature calibration procedure again. While ovens cannot be sent out to be calibrated, other specialized tools we use are regularly sent out for calibration, with the purpose of bringing them back into specification. This might involve a measurement equipment vendor cleaning a device or programming it to measure in a certain way. For this category of equipment, calibration is more akin to rebuilding than adjusting.
Not All Calibration Standards Are Straightforward
For some things, it is simple to imagine the objects or concepts that serve as the basis for the established calibration standards. For example, it’s easy to get traceable NIST calibration standards for parts that are 1.0” (25.4 mm) long or 0.04” (1 mm) in diameter. However, it can be difficult to get a traceable standard for a part that is very long or a diameter that is very large or very small. For instance, a standard that is 2 meters long or 10 microns in diameter is simply too difficult to handle, for opposite reasons. And what about calibrating for electrical resistance, such as the ohm resistance of deionized water? There are techniques and methods for calibrating all of these; however, they are not as simple as calibrating for easy-to-handle lengths and diameters.
In the work we do with specialty metals here at Metal Cutting Corporation, we are often asked to ensure there are no cracks or voids in the metal we are either sent to process or we ourselves purchase and supply on behalf of our customers. Eddy current testing (ECT) is a familiar and interesting method we use that involves subtle techniques to inspect metal parts for surface flaws such as cracks. However, one truism about this method that can be hard for customers to appreciate is that there is no NIST-traceable calibration standard for ECT; there is no standard crack, no object or even concept, that can be used to set ECT parameters such as frequency, amplitude, and sensitivity.
Are there other parameters for which there are no obvious calibration standards? Please leave a comment below to tell us about your experience.
Other Dimensional Challenges
Manufacturers all want calibration to be an independent reference to an immutable standard. However, all calibration involves some dependency between the NIST standard that exists in a vault and everything that is measured thereafter down the supply chain. So, whether it is an A2LA-accredited lab relying on its reference back to the object in the vault, or a manufacturer relying on objects that are referenced back to the independent lab's object, there is always a series of contingencies in the execution of any calibration system, as well as tolerances in the system.
For example, the impact of stacked tolerances must be considered. If you send a piece of equipment out for calibration, you must account for the stack-up of three tolerances: that of the device being calibrated, that of the pin used to calibrate it, and that of the lab performing the work.
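To make the stack-up concrete, here is a minimal sketch of the arithmetic. The three tolerance values are hypothetical, not Metal Cutting's actual figures, and the choice between a worst-case sum and a root-sum-square estimate depends on your own uncertainty policy:

```python
# Hypothetical tolerances in a calibration chain, in inches.
device_tol = 0.0001    # tolerance of the device being calibrated
pin_tol    = 0.00002   # tolerance of the pin used to calibrate it
lab_tol    = 0.00001   # measurement uncertainty of the calibrating lab

# Worst-case (arithmetic) stack: assumes every contributor
# errs in the same direction at its limit.
worst_case = device_tol + pin_tol + lab_tol

# Root-sum-square stack: assumes the errors are independent,
# so they rarely all align; a common engineering estimate.
rss = (device_tol**2 + pin_tol**2 + lab_tol**2) ** 0.5

print(f"worst-case stack: {worst_case:.6f} in")
print(f"RSS stack:        {rss:.6f} in")
```

The point is simply that the uncertainty attached to any calibrated reading is never just the device's own tolerance; the pin and the lab contribute as well.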
An issue that frequently comes up with calibration standards is, to how many decimal places should the calibration be performed? Here at Metal Cutting we use a Class XXX pin gage with a tolerance of 0.000020″ (0.000508 mm) to calibrate something with a much looser tolerance, such as a hand micrometer that reads out to 0.00005″ (0.00127 mm). Another interesting dimensional issue is that if an A2LA lab measures an object with a nominal of 1.000000″ to be 1.000003″ using its NIST-traceable equipment, then 1.000003″ becomes the new nominal — that is, it becomes the new size for calibrating measuring equipment that reads out to six decimal places.
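One way to express the relationship between the pin gage and the micrometer above is as a test accuracy ratio (TAR): the tolerance of the device under test divided by the tolerance of the reference standard. A ratio of 4:1 is a widely cited rule of thumb, though requirements vary by industry and quality system. A minimal sketch using the figures from our example:

```python
# Test accuracy ratio (TAR) sketch using the values from the text.
pin_gage_tol   = 0.000020   # Class XXX pin gage tolerance, inches
micrometer_tol = 0.00005    # hand micrometer tolerance, inches

tar = micrometer_tol / pin_gage_tol
print(f"TAR = {tar:.1f}:1")   # prints "TAR = 2.5:1"
```

A larger ratio means the reference standard contributes proportionally less uncertainty to the calibration of the device under test.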
Additionally, two different people might measure a part, each using a calibrated device that is in perfect working order and within the confines of the specified tolerance, and yet there can still be a difference in their measurements. Perhaps one is using a device calibrated to the higher end of the tolerance range, while the other is using a device calibrated to the lower end. Especially with precision parts having very small measurements, this is another potential calibration issue, requiring corroboration not just that both are measuring with the same devices, but also that the devices are calibrated using the same method and to the same tolerance. (You can read more about whether calibrated measuring is consistent measuring in our blog 7 Steps to Ensuring Your Measured Results Are in Spec.)
Three Distinctions in Calibration Standards
In the end, you might say there are really three distinctions in calibration standards:
- The obvious standards, such as NIST traceable calibrated pins for pass-fail inspection of lengths or diameters
- The not-so-obvious, such as those for temperature and other characteristics for which there are no objects that define the standards
- Those things for which there simply are no calibration standards at all, such as ECT
Here at Metal Cutting Corporation, where every day we produce thousands of small metal parts, these considerations are vital to maintaining our Quality Management System (QMS) standards so that we can meet our goal of delivering high-quality precision parts that meet your specifications.
Picking the right manufacturing partner will not only help you achieve the quality you want, but also help to ensure that your production costs stay within budget. For tips on finding the best partner for your metal fabrication needs, download our free guide, 7 Secrets to Choosing a New Contract Partner.
This week’s blogger, Josh Jablons, is the President of Metal Cutting Corporation.