Instrument error refers to the combined accuracy and precision of a measuring instrument, or to the difference between the actual value and the value indicated by the instrument (the error). Measuring instruments are usually calibrated at regular intervals against a standard. The most rigorous standards are those maintained by a standards organization, such as NIST in the United States, or international standards such as those published by ISO. In physics, precision, accuracy, and error are computed from the instrument and the measurement data. Precision is taken to be half the granularity of the instrument's measurement capability, and is limited to the number of significant digits of the coarsest instrument or constant in a sequence of measurements and computations. Error is ± the granularity of the instrument's measurement capability, and error magnitudes add when multiple measurements are combined to calculate a quantity. When a calculation from a measurement is reported to a specific number of significant digits, any rounding must be done properly. Accuracy may be determined by making multiple measurements of the same quantity with the same instrument and applying a statistical function to the results, or it may be determined against a reference: for example, a five-pound weight could be measured on a scale, and the difference between five pounds and the measured weight taken as the accuracy. The second definition ties accuracy to calibration, while the first does not.
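The rules above can be sketched in code. The following is an illustrative example (the function names and the 1 mm ruler are assumptions, not from the article) showing how an instrument's granularity determines precision and error, and how error magnitudes add across multiple measurements:

```python
def precision(granularity):
    """Precision is taken as half the instrument's granularity."""
    return granularity / 2

def error_bound(granularity):
    """Error is +/- one full unit of the instrument's granularity."""
    return granularity

def combined_error(*granularities):
    """Error magnitudes add when several measurements feed one result."""
    return sum(error_bound(g) for g in granularities)

# A ruler graduated in 1 mm steps (hypothetical example):
ruler = 1.0  # mm
print(precision(ruler))              # 0.5 (mm)
print(error_bound(ruler))            # 1.0 (+/- mm)

# A length obtained from two separate ruler measurements:
print(combined_error(ruler, ruler))  # 2.0 (+/- mm)
```

So a length found by adding two readings from the same 1 mm ruler carries an error bound of ±2 mm, even though each individual reading is only uncertain to ±1 mm.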
Removing instrument error
Unlike random error, instrument error can often be removed, although how easily depends on the case. In engineering instruments such as voltmeters and ammeters, the instrument error is very difficult to remove: an ammeter has built-in resistance that cannot be eliminated, so the error can only be minimized. By contrast, removing the error of a thermometer is relatively simple: the instrument need only be recalibrated carefully. In other cases the user does not remove the error from the instrument at all, but compensates for it in calculation; for example, the zero error of a Vernier caliper is eliminated by correcting each reading.
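Compensating in calculation rather than adjusting the instrument can be sketched as follows. This is a hypothetical example of correcting a Vernier caliper's zero error; the function name and numeric values are illustrative, not from the article:

```python
def corrected_reading(observed, zero_error):
    """Subtract the zero error (the reading shown when the jaws are
    fully closed) from every observed reading."""
    return observed - zero_error

# Suppose the caliper reads +0.02 mm with its jaws closed
# (a positive zero error):
zero_error = 0.02  # mm

# Each measurement is then corrected in calculation:
print(round(corrected_reading(12.46, zero_error), 2))  # 12.44 (mm)
```

A negative zero error (the scale reading below zero with the jaws closed) is handled by the same formula, since subtracting a negative value adds the correction.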
Another way to deal with instrument error is to reduce the disturbance that the act of measurement causes to the system, using a form of weak measurement: taking far more, but far less invasive, readings, so that the system is not appreciably changed by being measured.
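The statistical intuition behind this approach can be illustrated with a simple classical analogy (weak measurement proper is a quantum-mechanical concept; this sketch and its names are assumptions, not from the article): each gentle reading is individually very noisy, but averaging many of them recovers the underlying value:

```python
import random

random.seed(0)          # fixed seed so the example is reproducible
true_value = 5.0        # the quantity being measured (hypothetical)

def weak_reading(value, noise=1.0):
    """One low-disturbance but high-noise reading."""
    return value + random.gauss(0, noise)

# Many weak readings, averaged:
readings = [weak_reading(true_value) for _ in range(10000)]
estimate = sum(readings) / len(readings)
print(round(estimate, 1))  # close to 5.0
```

The standard error of the mean shrinks as the square root of the number of readings, which is why many weak readings can match the accuracy of one strong, disruptive reading.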