Getting Specific with Specifications (and why a negative MPE may be very positive)
George Schuetz, Mahr Federal Inc.


  Specifications for metrology equipment are usually published in nice glossy brochures that have lots of great pictures highlighting all the features of the product. But specifications can be confusing and are often a hot topic of debate. It's not that the manufacturer is trying to mislead anyone, but you need to bear in mind that they are trying to put their best foot forward when it comes to their products.

  One of the good things about published specifications is in the fine print—the part that says the manufacturer has the right to change the specifications at any time. Why should it be good that manufacturers can change their specs? Because, as we have discussed many times, there is an ever-increasing demand on machining to produce parts with tighter tolerances. Tolerances that were once fairly easy to achieve now require special machining processes in controlled environments.

  So it's rare that a metrology manufacturer will loosen their specifications. In reality, gages are frequently redesigned during their lifetimes to offer improved performance with better mechanical components, better displacement sensing, or even software compensation to correct repeatable errors.

  With long-range measuring instruments, the accuracy specification is usually related to the range of measurement. The farther the gage moves away from its starting or reference point, the more error one is apt to see. So measuring error for a long-range linear measuring device might be specified something like:

Measuring Error ≤ (2 + (L/600)) µm, where L is the displaced length in mm and the full measuring range of the instrument is 600 mm.

The way to read this is as follows:

   The measuring error is equal to or less than 2 µm plus the displaced length (L, in mm) divided by 600, with the result expressed in micrometers. At full displacement, this part of the equation is 600 mm displaced/600 mm full range, which equals 1. Thus, at full range the equation for measuring error becomes:

Measuring error (at full range) ≤ (2 + (600/600)) = (2 + (1)) = 3 µm

If the displaced measuring location was at half range, then the equation would be:

Measuring error ≤ (2 + (300/600)) = (2 + (0.5)) = 2.5 µm

  So we can now expect that the smallest error is apt to be 2 µm, while at the extreme end of the scale it can be 3 µm.
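The formula and the two worked values above can be sketched in a few lines of code. The formula and numbers come straight from the article; the function name and parameters are illustrative only.

```python
# The published spec: Measuring Error <= (2 + (L/600)) um,
# where L is the displaced length in mm and 600 mm is the full range.

def max_permissible_error_um(displaced_mm, full_range_mm=600.0, base_um=2.0):
    """Maximum permissible error (in um) at a given displaced length."""
    return base_um + displaced_mm / full_range_mm

print(max_permissible_error_um(0))    # 2.0 um -> the smallest error bound
print(max_permissible_error_um(300))  # 2.5 um -> half range
print(max_permissible_error_um(600))  # 3.0 um -> full range
```

Note that the bound grows linearly with displacement, which is exactly why the certificate below lists a different permissible error at each measuring position.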

  But how does this get translated onto a certificate that spells out the performance of the length measuring system? The certification lists the points measured and then the maximum permissible error at that point. The certificate will look like this:

Measuring Position   Permissible Error   Recorded Error
100 mm               2.16 µm              1.5 µm
300 mm               2.5 µm              -1.7 µm
600 mm               3.0 µm               2.1 µm

Now let's get a little negative about this subject.

  Nowhere in this discussion have we seen or mentioned the negative sign or negative values. A negative value for error does not mean better than perfect. Rather, these values are part of the language of metrology, and if you don't know the language, there is a good chance you could get confused, just like going to a foreign country without knowing the language.

  There is a document on the language of metrology, called the International Vocabulary of Basic and General Terms in Metrology. It's great bedtime reading. One of the terms described in the Vocabulary is Maximum Permissible Error (MPE). In full, it is called the maximum permissible error of a measuring instrument and it can also be described as the limits of permissible error (of a measuring instrument). It is defined as "the extreme values of an error permitted by the specifications, regulations, etc., for a given measuring instrument."

  Therefore, the Permissible Error listed on the calibration certificate above really means Maximum Permissible Error, and since one can err in two directions with length measurement, it is reflected with both a plus and a minus value. While one might read the 600 mm position's specification as allowing a 0 to 3.0 µm error, it actually means that any value between -3.0 µm and +3.0 µm is acceptable.
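A quick sketch makes the two-sided reading concrete: each recorded error is acceptable as long as it falls in the window from -MPE to +MPE. The certificate values are taken from the table above; the helper name is illustrative.

```python
# "Permissible Error" on the certificate bounds the error in BOTH
# directions, so the acceptance window is [-MPE, +MPE].

def within_mpe(recorded_um, mpe_um):
    """True if the recorded error lies in the window [-MPE, +MPE]."""
    return -mpe_um <= recorded_um <= mpe_um

certificate = [  # (position in mm, MPE in um, recorded error in um)
    (100, 2.16,  1.5),
    (300, 2.5,  -1.7),
    (600, 3.0,   2.1),
]

for position_mm, mpe, recorded in certificate:
    print(position_mm, within_mpe(recorded, mpe))  # all three positions pass
```

The -1.7 µm reading at 300 mm passes for exactly the reason the article gives: a negative error is no worse than a positive one of the same size.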

  Here's another example of the language of metrology: on the bottom of the Certificate of Calibration there is a number that is associated with the uncertainty of measurement. It is typically stated something like this:

  Uncertainty = K = 0.5 µm (where 0.5 µm can be any uncertainty as determined by the lab)

  Here again, uncertainty is not a single-sided number. It is rather a number applied to both sides of the measured result, creating a window within which the true result is apt to fall. The uncertainty number does not get added to the permissible error.
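The same two-sided idea can be sketched for uncertainty: the stated value is applied symmetrically around the measured result. The 0.5 µm figure is the article's example; the recorded 2.1 µm value is borrowed from the certificate above, and the function name is illustrative.

```python
# Uncertainty is applied to BOTH sides of a measured result, creating a
# window where the true result is apt to fall. It is not added to the
# permissible error.

def uncertainty_window(measured_um, u_um=0.5):
    """Return the (low, high) window around a measured result."""
    return measured_um - u_um, measured_um + u_um

low, high = uncertainty_window(2.1)  # recorded error at the 600 mm position
print(low, high)  # the true error is apt to lie between these two values
```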

  Knowing and understanding this, the purchaser of the linear measuring equipment can fully understand the published specifications and the certification that comes with the product. And while it may be a negative, that's really a good thing.