Specifying a Device’s Capabilities and How It Is Calibrated
Here at Metal Cutting Corporation, calibration tolerance is yet another important aspect of quality control and our Quality Management System (QMS).
Recall that in instrument calibration, a measuring device is compared against an accurate, accepted, known standard in order to correlate, adjust, and document the accuracy of the device being calibrated. Calibration tolerance is the maximum acceptable deviation between the known standard and the calibrated device.
At Metal Cutting, whenever possible the calibration of the devices we use for measuring parts is based on NIST standards. For example, as we talk about in “The Quandaries of Calibration Standards,” we use NIST-traceable pins to establish data points for calibrating our laser micrometers.
However, calibration tolerance is variable, depending not only on the device being calibrated, but also on what will be measured with that device.
The Key to Calibration Tolerance
What does that mean? To illustrate, let’s look at the laser micrometers we use at Metal Cutting.
Each has a range in which it will work, and each device has its own tolerance. However, since we typically measure in a narrower range, we calibrate to the tighter range we are working in rather than the broader range the micrometer is capable of measuring.
For instance, a device might be capable of measuring diameters from 0.010” to 0.250” (0.254 mm to 6.35 mm), but we are interested in a range of 0.020” to 0.100” (0.508–2.54 mm). Therefore, we would use pin gages in that tighter range as the reference library for setting the calibration of the device.
An interesting phenomenon of manufacturing is that the larger the diameter, the harder it is to hold a tight tolerance. So, for example, a Class XXX gage calibrated to three decimals for a 1.0” (25.4 mm) diameter would have a tolerance of ±0.000010” (0.000254 mm); but for a 10.0” (254 mm) diameter, the tolerance of the same Class XXX gage would be ±0.000050” (0.00127 mm).
Typically, Metal Cutting works with diameters of 0.004” to 0.067” (0.102–1.702 mm), even though our devices are capable of measuring much larger diameters. However, since calibration tolerance also depends on the instrument’s capability, if a device can only measure to ±0.000020” (0.000508 mm), obviously you must calibrate to within that range.
Therefore, the key to calibration tolerance is to make sure you specify both what the device is capable of doing and the range to which it is calibrated.
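The idea above can be sketched in a few lines of Python. This is a minimal illustration only, with hypothetical names, using the example ranges from the laser micrometer discussion: a device has a broad capable range, but measurements are trusted only within the narrower range it was actually calibrated to.

```python
# Hypothetical sketch: a device spec records both the range the device is
# capable of measuring and the tighter range it was calibrated to.

def in_range(value, lo, hi):
    return lo <= value <= hi

# Laser micrometer: capable of 0.010"-0.250", calibrated for 0.020"-0.100"
capable_range = (0.010, 0.250)
calibrated_range = (0.020, 0.100)

def can_measure(diameter_in):
    """A diameter is measurable with confidence only if it falls inside
    the calibrated range, not merely the broader capable range."""
    return in_range(diameter_in, *calibrated_range)

print(can_measure(0.067))  # inside the calibrated range -> True
print(can_measure(0.150))  # within capability, but outside calibration -> False
```

In practice the same check is a process discipline rather than software: a measurement request outside the calibrated range means recalibrating with pin gages in the appropriate range first.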
Calibration Tolerance vs. Process Tolerance
In addition to calibration tolerance, we also take into consideration the tolerance to which we are cutting a part during the production process.
The rule of thumb is that the calibration tolerance must always be tighter than the process tolerance; essentially, we apply the tried-and-true 10-to-1 rule.
So, for example, if the tolerance we need to hold for our process is ±0.001″ (0.0254 mm), then we would use a laser micrometer with a calibration tolerance of ±0.00010” (0.00254 mm). And the tolerance of the Class XXX pin used to calibrate the micrometer would need to have been even tighter!
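The 10-to-1 arithmetic can be sketched as follows. The helper function is hypothetical, not part of any real QMS software; the numbers are the ones from the example above.

```python
# Hedged sketch of the 10-to-1 rule: the calibration tolerance should be
# at most one-tenth of the process tolerance it is used to verify.

def required_calibration_tolerance(process_tol, ratio=10.0):
    """Tightest acceptable calibration tolerance for a given process
    tolerance under a 10-to-1 rule."""
    return process_tol / ratio

process_tol = 0.001  # +/-0.001" process tolerance for cutting the part
cal_tol = required_calibration_tolerance(process_tol)
# cal_tol is +/-0.0001", matching the laser micrometer example above;
# the pin gage used to calibrate that micrometer must be tighter still.
```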
Calibration Tolerance Certified by a Third Party
At Metal Cutting, we regularly send our NIST-traceable pin gages to an independent ISO-certified lab for calibration. This is done to ensure the pins remain within specification and fit for use.
For instance, we send out our Class XXX pins to be calibrated to the tolerance an XXX gage is normally held to, that is, a calibration tolerance of ±0.000020″ (0.000508 mm).
ISO-certified, A2LA-accredited labs have equipment that is an order of magnitude more accurate than what they are certifying. While Metal Cutting has very precise equipment in house, we do not have the capability to calibrate to a level of, say, 0.0000001” (0.00000254 mm).
Additionally, with that high level of precision, you also must account for the effects of the thermal coefficient of expansion, requiring measurements to be done under the same tightly controlled temperature, humidity, air pressure, and other strict environmental conditions.
Therefore, we rely on ISO certified third-party calibration houses for calibrating our pin gages. This in turn ensures our in-house devices are calibrated using tools that have the proper capabilities and tolerances.
While ISO 9001:2015 does not specify calibration tolerances, clause 7 of the latest ISO standard does make recommendations for choosing, maintaining, and calibrating traceable measuring equipment.
(A side note: Although Metal Cutting Corporation itself is not an ISO certified calibration house, we are certified in the ISO 9001:2015 standards for quality management and risk mitigation.)
Weighing in on Calibration Tolerance
While calibration tolerance often refers to dimensions, in our world it can also refer to weight. That’s because, although Metal Cutting’s customers do not ask for parts of a discrete weight, our parts are so small and produced in such large quantities that we often count parts by weight.
Rather than individually counting tens of thousands of incredibly tiny parts, we instead use the weight of 100 or 1,000 parts as a reference for an accurate counting scale. Therefore, we routinely send our weight standards out to a certified lab for calibration.
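The counting-by-weight technique can be sketched like this. The function name and the numbers are illustrative only: weigh a known reference sample to get an average per-part weight, then estimate the count of a bulk batch from its total weight.

```python
# Illustrative sketch of counting tiny parts by weight, using a reference
# sample of known size (e.g. 100 or 1,000 hand-counted parts).

def count_by_weight(bulk_weight_g, sample_weight_g, sample_count):
    """Estimate part count from the average weight of a reference sample."""
    per_part_g = sample_weight_g / sample_count
    return round(bulk_weight_g / per_part_g)

# e.g. if 100 parts weigh 0.250 g, a 62.5 g batch holds about 25,000 parts
print(count_by_weight(62.5, 0.250, 100))  # 25000
```

The accuracy of the estimate rests entirely on the scale, which is why the weight standards used to calibrate it are themselves sent out for third-party calibration.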
This allows us to then use the weights internally to calibrate our scales for counting parts. Our in-house equipment includes multiple scales of different capacities and different tolerances, for our world of tiny parts ranging from grams to fractions of a milligram.
Critical to Precision Manufacturing
Understanding equipment capabilities and tolerances is critical to successful precision manufacturing. Here at Metal Cutting Corporation, where we produce thousands of small metal parts every day, our quality control standards and our firm grasp of calibration tolerance help us ensure that we deliver high-quality parts that meet customer specifications.