What is Calibration?
Put in simple terms, calibration is the verification of a measuring tool against a “Standard” of known value and higher accuracy.
This comparison validates the “Unit Under Test’s” (UUT’s) specification, because the “Standard” will be at least 5 times (ideally 10 times) more accurate than the UUT.
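That accuracy-ratio rule is easy to express in code. Below is a minimal sketch, assuming the 5:1 minimum and 10:1 ideal from the text; the function names and example figures are illustrative only:

    # Check whether a standard is accurate enough to calibrate a given
    # unit under test (UUT), per the 5:1 minimum / 10:1 ideal rule above.

    def accuracy_ratio(uut_accuracy_pct: float, standard_accuracy_pct: float) -> float:
        """Ratio of the UUT's accuracy spec to the standard's (higher is better)."""
        return uut_accuracy_pct / standard_accuracy_pct

    def standard_is_suitable(uut_accuracy_pct: float, standard_accuracy_pct: float,
                             minimum: float = 5.0) -> bool:
        """True if the standard is at least `minimum` times more accurate than the UUT."""
        return accuracy_ratio(uut_accuracy_pct, standard_accuracy_pct) >= minimum

    # A 1 % meter checked against a 0.1 % standard gives a 10:1 ratio, the ideal case.
    print(standard_is_suitable(1.0, 0.1))   # True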
Everyone does a simple form of calibration at home, at least twice a year when the clocks change.
You hear the time tones on the radio or on the television and adjust your watch accordingly. You then use your watch to correct the time on all the clocks in your home. Your watch has now become a transfer standard as you pass on the “corrected” time to all other timepieces.
Periodically you check your watch against the radio or television and adjust it if required.
In doing this you are calibrating your watch against a higher accuracy standard.
If you never check your watch against these higher-accuracy standards, how sure are you that it is correct? The same is true of every measuring device: it needs to be checked against a known standard of higher accuracy to ensure it is correct and performing within its specification limits.
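The watch example can be modelled as a short script, purely as an illustration (all the times and offsets below are invented):

    # The radio time tone is the standard; the watch becomes a transfer
    # standard once corrected, and passes the time on to the house clocks.

    radio_time = 12 * 3600              # reference time, in seconds
    watch_time = 12 * 3600 + 90         # the watch is running 90 s fast

    watch_time += radio_time - watch_time   # calibrate the watch to the radio

    clocks = {"kitchen": 12 * 3600 - 40, "bedroom": 12 * 3600 + 15}
    for room in clocks:
        clocks[room] = watch_time       # set each clock from the transfer standard

    print(all(t == radio_time for t in clocks.values()))   # True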
Another simple example:
Two Digital Multimeters are each required to measure 100 Volts to within 1 %.
Meter A indicates 99.1 Volts, and Meter B indicates 100.9 Volts.
Both meters are within tolerance, but which one is right?
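The tolerance arithmetic behind that question is worth making explicit (a sketch using only the figures from the example):

    # 1 % of 100 V gives tolerance limits of 99.0 V to 101.0 V.

    nominal = 100.0          # Volts
    tolerance_pct = 1.0

    low  = nominal * (1 - tolerance_pct / 100)   #  99.0 V
    high = nominal * (1 + tolerance_pct / 100)   # 101.0 V

    for name, reading in [("Meter A", 99.1), ("Meter B", 100.9)]:
        print(name, "in tolerance:", low <= reading <= high)
    # Meter A in tolerance: True
    # Meter B in tolerance: True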
If Meter B is used as the “Standard” then Meter A will appear to be out of tolerance.
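A one-line calculation shows why: treating Meter B’s 100.9 Volt reading as the true value makes Meter A look nearly 1.8 % low, well outside the 1 % tolerance:

    # Apparent error of Meter A when Meter B's reading is taken as truth.
    apparent_error_pct = (99.1 - 100.9) / 100.9 * 100
    print(round(apparent_error_pct, 2))   # -1.78, outside +/- 1 %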
Meter B is sent for calibration and is adjusted to indicate 100.0 Volts. This meter is now accurate to 0.1 %, so its indication can deviate from the true value by at most 0.1 Volts; a true 100 Volt input will read somewhere between 99.9 and 100.1 Volts.
Now if Meter A is compared with Meter B, it is within tolerance (although right at the lower end of the tolerance range).
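The “right at the lower end” claim can be checked against the worst case of Meter B’s remaining 0.1 % uncertainty (again a sketch with the example’s figures):

    # Meter B's adjusted 100.0 V indication is trusted to 0.1 %, so the
    # true voltage lies between 99.9 V and 100.1 V. Meter A's worst-case
    # error is measured against the top of that band.

    meter_a = 99.1
    b_upper = 100.1                            # upper bound of the true value

    worst_case_error_pct = (meter_a - b_upper) / b_upper * 100
    print(round(worst_case_error_pct, 2))      # -1.0 -> only just inside +/- 1 %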
If Meter A is now adjusted to indicate the same as Meter B, it starts the next calibration interval reading close to the true value, which should keep it from giving a false reading as it experiences normal drift between calibrations.
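The point of that adjustment can also be put in numbers (illustrative figures only):

    # Re-centring Meter A restores its full drift margin.
    tolerance_v  = 1.0                   # +/- 1 V on a 100 V measurement
    error_before = 100.0 - 99.1          # reading 0.9 V low
    error_after  = 0.0                   # adjusted to match the standard

    print(round(tolerance_v - error_before, 2))   # 0.1 V of drift headroom before
    print(round(tolerance_v - error_after, 2))    # 1.0 V of drift headroom after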