The following applications apply to most E.D.C. instruments.
The easiest and most understandable way to picture the uses for our calibrators and standards is by illustration.
Using these instruments as STANDARDS and CALIBRATORS really needs no explanation; those uses are obvious. However, the terms "calibrator" and "standard" can be misleading to those who are not involved in metrology, e.g., people in design, Q.C., field service, and test.
Some of the other applications are not obvious, but they are very logical.
A calibrator is a very stable, low-noise, repeatable, and linear power supply. It sources voltage and current, so it is a power supply. If it has all of the foregoing good qualities, it requires only a little "tweaking" to become a calibrator/standard.
However, as a super-quiet, stable, and repeatable supply, it is ideal for use in designing and bench-testing prototype circuits. These instruments also have application as REFERENCES and SOURCES. Of course, they are not brute-force supplies.
These instruments are also used as SIMULATORS. As an example: a large manufacturer, located in the Midwest, designs and builds large jet aircraft engines.
The jet engines are thoroughly tested, dynamically. The various parameters are monitored with strain gages, thermocouples and various transducers. The low level voltages and currents are amplified and then passed through analog-to-digital converters to the computers.
The computer system, in this application, is a controller as well as a data logger. Whenever one of the sensors exceeds an alarm level, the computer shuts down the test.
Problems arose, as they have a habit of doing. Occasionally an amplifier would change gain, or an A/D converter would become inaccurate. Naturally, the computer, doing its job as a controller, would reach an alarm limit and shut down the test automatically. Engineers would then check the engine and find that the problem was not in the engine but in the information being fed to the computer. Very expensive! Very frustrating!
The solution was reasonably simple. The test engineers installed a programmable simulator (calibrator) in the system and multiplexed the output from the "simulator".
Whenever the computer "sees" one of the parameters approaching an alarm limit, it calls up the "simulator" to drive a specific dc voltage or current through the circuit in question, including that circuit's amplifier and A/D converter. This output from the calibrator simulates the current or voltage the computer has been recording, and the computer then makes a digital comparison.
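The self-check just described can be sketched in a few lines. This is a minimal illustration only, assuming a simulator that can be switched onto a measurement channel in place of the sensor; the function names (inject, read_channel) and the 0.5 mV tolerance are hypothetical, not part of any actual EDC or test-cell interface.

```python
# Sketch of the injected-signal self-check (illustrative names and values).

def channel_ok(inject, read_channel, test_mv, tolerance_mv=0.5):
    """Drive a known level through the amplifier/ADC chain and
    verify that the digitized value matches it."""
    inject(test_mv)  # simulator output replaces the sensor signal
    return abs(read_channel() - test_mv) <= tolerance_mv

# Stand-in channels for demonstration: one healthy, one whose
# amplifier gain has drifted 8% high (a hypothetical fault).
state = {"mv": 0.0}

def inject(mv):
    state["mv"] = mv

def healthy_channel():
    return state["mv"]

def drifted_channel():
    return state["mv"] * 1.08

print(channel_ok(inject, healthy_channel, 33.0))   # True
print(channel_ok(inject, drifted_channel, 33.0))   # False: gain error caught
```

The point of the design is that the check exercises the entire signal path after the sensor, so any drift in the amplifier or A/D converter shows up in the comparison.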
Example: The jet engine is running, and the computer is scanning the input lines from the various sensors. Let us say that the recorded temperature in a certain part of the engine reads 600 degrees C, which corresponds to about 33 mV dc from a type J thermocouple. After a while the reading climbs to 650 degrees C which, let us say, is approaching an alarm limit, indicating a possible problem.
The computer then calls up the "simulator" and asks it for 33 mV. The resultant digital input registers 650 degrees. Now the computer knows that 33 mV does not represent 650 degrees, and a simple arithmetic correction lets it recognize a "650 degrees C" indication on that circuit as actually and truly 600 degrees C.
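The arithmetic correction can be shown with the illustrative numbers from the example, assuming the error is a simple gain shift: a 33 mV injection that should read 600 degrees C instead reads 650, so a single scale factor maps the drifted indication back onto the true value.

```python
# Sketch of the arithmetic correction (illustrative numbers from the text).

def gain_correction(true_reading, indicated_reading):
    """Scale factor that maps the drifted channel's indication
    back onto the true value."""
    return true_reading / indicated_reading

k = gain_correction(600.0, 650.0)     # known-good value vs. what the ADC showed
corrected = 650.0 * k                 # apply to subsequent 650-degree indications
print(abs(corrected - 600.0) < 1e-6)  # True: now interpreted as 600 degrees C
```

A real thermocouple channel would also have an offset term and the nonlinear millivolt-to-temperature conversion, but the single-factor version captures the idea: the computer corrects its own readings and the test never stops.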
The engine test continues uninterrupted.