Measuring instruments have certain common errors built right into them, and the instruments most commonly used in machine shops are no exception. Recognizing these errors is a basic requirement for proper inspection. It’s not necessary to comprehend the precise physics or geometry underlying the error, but it is important to understand the nature of the error and the extent to which it might affect or limit the instrument’s precision.

Take the micrometer. This is a very accurate and stable instrument, but even it has a certain capacity for error built in. Over-tightening the micrometer's spindle can cause the frame of the gage to flex, shifting the position of the anvil. Lower-quality micrometers, made of less rigid materials, are more prone to this error. While the amount of deflection may be only 0.0001 or 0.0002 inch, that can be 50 percent of some tolerance bands.
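To see why such a small deflection matters, the arithmetic can be sketched out. The figures below are illustrative, not from any particular gage: a 0.0001-inch deflection measured against a 0.0002-inch total tolerance band consumes half of it.

```python
# Illustrative arithmetic: how much of a tolerance band a small,
# fixed instrument error consumes. Figures are hypothetical.
def error_share(deflection_in, tolerance_in):
    """Return the instrument error as a fraction of the total tolerance."""
    return deflection_in / tolerance_in

# A 0.0001-inch frame deflection against a 0.0002-inch total tolerance:
share = error_share(0.0001, 0.0002)
print(f"{share:.0%} of the tolerance band")  # -> 50% of the tolerance band
```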

A caliper is prone to the error described by Abbé's principle, which states that error is introduced whenever the reference line of a measuring system does not lie along the same line as the dimension being measured. On a caliper, the scale or gears are not in line with the measuring faces or contacts. As a result, the sliding jaw shifts and wiggles (in microscopic increments), much the way a table or chair does when its legs become loose. The error can be minimized by measuring as close to the rail as possible.
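The size of the Abbé error can be approximated with simple trigonometry. In the hedged sketch below, the offset distance and tilt angle are hypothetical values chosen only to show the scale of the effect; the takeaway is that the error shrinks in direct proportion to the offset, which is why measuring close to the rail helps.

```python
import math

# Hedged sketch of the Abbe offset error: if the slide tilts by a small
# angle (degrees) and the point of measurement sits a distance "offset"
# (inches) from the reference scale, the reading shifts by roughly
# offset * tan(angle). Both inputs here are hypothetical.
def abbe_error(offset_in, tilt_deg):
    return offset_in * math.tan(math.radians(tilt_deg))

# Measuring 2 inches from the rail with 0.01 degree of play in the slide:
print(f"{abbe_error(2.0, 0.01):.6f} inch")   # roughly 0.00035 inch
# Measuring 0.5 inch from the rail with the same play:
print(f"{abbe_error(0.5, 0.01):.6f} inch")   # a quarter of that error
```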

Another error limits the instrument's effectiveness at measuring an internal diameter. The design of the standard caliper places the measuring contacts or jaws offset from one another. That means the jaws will never "find" the maximum diameter of the workpiece, so the reading is always slightly undersized.
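One way to picture the magnitude is to treat the offset contacts as measuring a chord of the bore rather than the true diameter. The sketch below is a simplified geometric model with hypothetical numbers, not a description of any specific caliper: a chord that misses the center reads shorter than the diameter.

```python
import math

# Simplified model: offset ID jaws contact the bore along a chord that
# misses the bore's center by "miss" inches, so the reading understates
# the true diameter. Values are hypothetical.
def chord_reading(diameter_in, miss_in):
    r = diameter_in / 2.0
    return 2.0 * math.sqrt(r * r - miss_in * miss_in)

# A 1.000-inch bore measured along a chord 0.050 inch off-center:
print(f"{chord_reading(1.0, 0.05):.4f} inch")  # -> 0.9950 inch
```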

One other matter to keep in mind with a caliper is the additive nature of the error amounts that are permitted by the instrument’s calibration. To pass calibration, a dial caliper with 0.001-inch resolution must be accurate within ±0.001 inch for length measurements and allow no more than 0.001 inch for parallelism error. But the measurement of a large part may be affected by both errors. In such a case, the possible error is 0.002 inch. According to the 10:1 rule, that means the instrument should inspect a characteristic only if the total tolerance is 0.020 inch or more.
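The worst-case error stack and the 10:1 rule described above reduce to simple addition and multiplication. The sketch below just restates that arithmetic for the dial-caliper example in the text.

```python
# Worst-case error stack plus the 10:1 rule from the text: sum the
# permitted calibration errors, then require a tolerance at least ten
# times that total.
def min_inspectable_tolerance(permitted_errors, ratio=10):
    """Smallest total tolerance the instrument should be used to inspect."""
    worst_case = sum(permitted_errors)
    return worst_case * ratio

# Length accuracy (0.001 inch) plus parallelism (0.001 inch):
print(round(min_inspectable_tolerance([0.001, 0.001]), 3))  # -> 0.02
```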
