Attempts to distinguish accuracy from precision have long created confusion and argument. Read this article to find out when and where exactly each of these two terms should be used.
Accuracy: Accuracy describes the agreement between a measured value and the true or standard value. In other words, the accuracy of an instrument is the degree of conformity of a measured or calculated quantity to its actual (true) value.
Precision: Precision is the agreement among several measurements or results. The precision of an instrument is its ability to produce repeatable, reliable readings of the same measurement each time. It is an indicator of the scatter in the data: more scatter means less precision.
Interestingly, an accurate instrument is not necessarily precise, and instruments are often precise but far from accurate. A perfect measurement, both accurate and precise, is therefore rarely attainable.
Suppose we weigh some material repeatedly and obtain 10.4 kg, 10.5 kg, 10.4 kg, 10.6 kg and 10.5 kg. These readings agree closely enough for us to believe that measuring again would give roughly 10.5 ± 0.1 kg. The measurements are therefore precise; but if the true weight differs from this value, they are not accurate. On the other hand, if the measured weight matches the expected (true) weight, e.g. exactly 10.5 kg, the measurement is said to be accurate.
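The weighing example above can be sketched in a few lines of code. The readings are the ones from the text; the true weight of 11.0 kg is a hypothetical assumption, chosen only to illustrate a precise-but-inaccurate result.

```python
readings = [10.4, 10.5, 10.4, 10.6, 10.5]  # repeated measurements, kg (from the text)
true_weight = 11.0                          # assumed true value, kg (hypothetical)

mean = sum(readings) / len(readings)
spread = max(readings) - min(readings)      # range: a simple indicator of precision

print(f"mean = {mean:.2f} kg, spread = {spread:.2f} kg")
# Small spread -> precise; large difference from the true value -> not accurate.
print(f"error vs true value = {mean - true_weight:+.2f} kg")
```

The small spread (0.2 kg) shows the readings are precise, while the large offset from the assumed true value shows they are not accurate.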
The accuracy and precision of a measurement process are usually established by repeatedly measuring a traceable reference standard. Such standards are defined in the International System of Units and maintained by national standards organizations such as the National Institute of Standards and Technology.
Standard deviation can be used to quantify the precision of a set of results. Basically, the standard deviation is a measure of the dispersion of random errors about the mean value. If a large number of measurements or observations of the same quantity are made, the standard deviation is the square root of the sum of the squared deviations from the mean value, divided by the number of observations less one. Typically, 68% of all measurements fall within one standard deviation of the average, and 95% fall within two standard deviations. There are other ways to express the precision of results; the simplest is the range (the difference between the highest and lowest results), often reported as a ± deviation from the average.
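The standard deviation described above (dividing by n − 1) can be sketched directly, using the same hypothetical readings as in the weighing example:

```python
import math

def sample_std(values):
    """Square root of the sum of squared deviations from the mean,
    divided by the number of observations less one (as in the text)."""
    n = len(values)
    mean = sum(values) / n
    return math.sqrt(sum((x - mean) ** 2 for x in values) / (n - 1))

readings = [10.4, 10.5, 10.4, 10.6, 10.5]      # kg (hypothetical data)
print(f"std   = {sample_std(readings):.4f} kg")            # dispersion about the mean
print(f"range = {max(readings) - min(readings):.1f} kg")   # simplest precision measure
```

For these readings the standard deviation is about 0.084 kg, considerably tighter than the raw range of 0.2 kg, which is why the standard deviation is the preferred measure of precision.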
Error: Error refers to the disagreement between a measurement and the true or standard value.
Approximation: If precise results cannot be obtained in an experiment, physicists resort to close approximation, which is nothing but a rough estimate. The accuracy of such estimates depends on various factors, such as the availability of reference materials, the time devoted, and experience with similar problems.
Consider the value 2.4355, to be rounded to three significant figures. The first dropped digit is a 5 followed by a nonzero digit, so one is added to the last retained digit and all digits further to the right are dropped. Thus, rounding 2.4355 to three significant figures gives 2.44.
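The rounding rule above can be sketched with a small helper function (`round_sig` is not from the article; it is a hypothetical illustration using Python's `decimal` module):

```python
from decimal import Decimal, ROUND_HALF_UP

def round_sig(value, sig):
    """Round `value` to `sig` significant figures, rounding halves up."""
    d = Decimal(str(value))
    exponent = d.adjusted()                        # position of the leading digit
    quantum = Decimal(1).scaleb(exponent - sig + 1)  # size of the last kept digit
    return float(d.quantize(quantum, rounding=ROUND_HALF_UP))

print(round_sig(2.4355, 3))  # -> 2.44, as in the text
```

Note the use of `Decimal(str(value))` rather than `Decimal(value)`: converting via the string avoids binary floating-point artifacts that would otherwise change which digit gets rounded.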
Data acquisition:
In a data acquisition system, accuracy and precision are affected by factors such as board resolution and environmental noise. In addition, every component in the analog signal path affects system accuracy, so the overall system accuracy can deviate drastically from what is required.
For data acquisition hardware, accuracy is often expressed as a percentage or as a fraction of the least significant bit (LSB). Under ideal circumstances, board accuracy is typically ±0.5 LSB; a 12-bit converter therefore has only 11 usable bits.
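To make the LSB figure concrete, the sketch below computes the voltage represented by one LSB for converters of different widths. The 0–10 V full-scale range is an assumption for illustration, not a value from the article:

```python
def lsb_volts(full_scale_volts, bits):
    """Voltage represented by one LSB of an ADC spanning `full_scale_volts`."""
    return full_scale_volts / (2 ** bits)

full_scale = 10.0  # assumed input range in volts (hypothetical)
for bits in (12, 16):
    lsb = lsb_volts(full_scale, bits)
    # An ideal accuracy of ±0.5 LSB corresponds to ±lsb/2 volts.
    print(f"{bits}-bit: 1 LSB = {lsb * 1000:.3f} mV, ±0.5 LSB = ±{lsb * 500:.3f} mV")
```

For a 12-bit converter over 10 V, one LSB is about 2.44 mV, so an ideal ±0.5 LSB accuracy means roughly ±1.22 mV of error.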
On most boards, a programmable gain amplifier is fitted before the converter input to preserve system accuracy. This arrangement can backfire, however, because the amplifier's settling behavior interacts with the sampling rate. To maintain full accuracy, the amplifier output must settle to within 0.5 LSB before the next conversion; this settling time is on the order of several tenths of a millisecond for most boards.
Settling time is a function of sampling rate and gain: high-rate, high-gain configurations require longer settling times, while low-rate, low-gain configurations require shorter ones.
The precision (resolution) of a device is determined by the number of bits used to represent the analog signal. A high-resolution device divides the input range into more divisions, so the smallest detectable voltage change is smaller and precision is higher. A low-resolution device divides the input range into fewer divisions, so the smallest detectable voltage change is larger and precision is lower.
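The relationship between bit count, number of divisions, and detectable voltage can be sketched as follows (again assuming a hypothetical 0–10 V input range):

```python
def detectable_voltage(full_scale_volts, bits):
    """Smallest voltage change one code step can resolve."""
    divisions = 2 ** bits              # number of divisions of the input range
    return full_scale_volts / divisions

full_scale = 10.0  # assumed input range in volts (hypothetical)
for bits in (8, 12, 16):
    step = detectable_voltage(full_scale, bits)
    print(f"{bits:2d} bits -> {2**bits:5d} divisions, "
          f"detectable change = {step * 1e6:8.1f} microvolts")
```

Each extra bit doubles the number of divisions and halves the detectable voltage step, which is exactly the trade-off between low- and high-resolution devices described above.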
Since the precision of a measurement is very important, it may be necessary to use more sophisticated (and costly) equipment or a more time-consuming methodology to obtain a high degree of precision. It is therefore essential for a surveyor to develop a methodology whose resultant precision is sufficient to achieve the required accuracy (closeness to the true value).
Posted: 2/15/2006