Do you know the accuracy (precision) of a multimeter?
The accuracy of a digital multimeter combines the systematic and random errors in its measurement results. It indicates the degree of agreement between the measured value and the true value, and thus reflects the size of the measurement error. Generally speaking, the higher the accuracy, the smaller the measurement error, and vice versa.
There are three common ways to express accuracy:
Accuracy = ±(a% RDG + b% FS) (2.2.1)
Accuracy = ±(a% RDG + n counts) (2.2.2)
Accuracy = ±(a% RDG + b% FS + n counts) (2.2.3)
In formula (2.2.1), RDG is the reading (that is, the displayed value) and FS is the full-scale value. The first term in the brackets represents the error of the A/D converter and the function converters (such as the voltage divider, shunt, and true-RMS converter); the second term represents the error introduced by digitization. In formula (2.2.2), n is the quantization error expressed as a change in the last displayed digit (counts). If the n-count error is converted into a percentage of full scale, formula (2.2.2) becomes formula (2.2.1). Formula (2.2.3) is less common; some manufacturers use this expression, with one of the last two terms representing errors introduced by the environment or by other functions.
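To make the formulas concrete, the worst-case error for a spec of the form (2.2.2) can be sketched in Python. The specific numbers here (±0.5% of reading + 2 counts on a 200 mV range) are illustrative assumptions, not the spec of any particular meter:

```python
def measurement_error(reading, a_pct, n_counts, count_value):
    """Worst-case error for an accuracy spec of the form
    +/-(a% RDG + n counts), as in formula (2.2.2).

    count_value is the value of one count (one step of the last
    displayed digit) on the selected range, e.g. 0.1 mV on a
    200 mV range.
    """
    return (a_pct / 100.0) * reading + n_counts * count_value

# Assumed spec: +/-(0.5% RDG + 2 counts) on a 200 mV range,
# where one count = 0.1 mV.  Reading: 100.0 mV.
err = measurement_error(100.0, 0.5, 2, 0.1)
print(f"worst-case error: +/-{err:.1f} mV")  # 0.5 mV (reading term) + 0.2 mV (count term) = 0.7 mV
```

Converting the 2-count term into a percentage of the 200 mV full scale (0.2 mV / 200 mV = 0.1% FS) shows how formula (2.2.2) maps onto formula (2.2.1).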
Digital multimeters are far more accurate than analog multimeters. Taking the accuracy of the basic DC-voltage range as an example, a 3½-digit meter can reach ±0.5% and a 4½-digit meter can reach ±0.03% (for example, the OI857 and OI859CF multimeters). Accuracy is a very important indicator: it reflects the quality and process capability of the multimeter. A multimeter with poor accuracy cannot represent the true value well, which can easily lead to misjudgment in measurement.
Resolution (resolution ratio)
The voltage corresponding to one count of the last digit on the lowest voltage range of a digital multimeter is called the resolution; it reflects the sensitivity of the meter. The resolution of a digital instrument improves as the number of display digits increases. The highest resolution attainable differs with the number of digits; for example, a 3½-digit multimeter can resolve 100 μV.
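As a quick sketch of where the 100 μV figure comes from (assuming the common convention that a 3½-digit meter displays up to 1999 and that its lowest voltage range is 200 mV):

```python
def resolution(range_fullscale, max_count):
    """Value of one count of the last digit: the full-scale range
    divided by the number of count steps (max display + 1)."""
    return range_fullscale / (max_count + 1)

# Assumed typical 3 1/2-digit meter: lowest range 200 mV, max display 1999.
lsb_mV = resolution(200.0, 1999)
print(f"resolution: {lsb_mV * 1000:.0f} uV")  # 200 mV / 2000 counts = 0.1 mV = 100 uV
```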
The resolution of a digital multimeter can also be expressed as a resolution ratio: the percentage that the smallest non-zero number the meter can display makes up of the largest number. For example, the smallest number a typical 3½-digit digital multimeter can display is 1 and the largest is 1999, so the resolution ratio is 1/1999 ≈ 0.05%.
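The resolution-ratio arithmetic can be checked directly; the 1 and 1999 figures are the ones given in the text:

```python
# Smallest non-zero display and largest display of a 3 1/2-digit meter.
smallest, largest = 1, 1999
ratio_pct = 100.0 * smallest / largest
print(f"resolution ratio: {ratio_pct:.2f}%")  # 1/1999 ~= 0.05%
```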
It should be pointed out that resolution and accuracy are two different concepts. The former characterizes the "sensitivity" of the instrument, that is, its ability to "recognize" tiny voltages; the latter reflects the "accuracy" of measurement, that is, the degree of agreement between the measurement result and the true value. There is no necessary connection between the two, so they must not be confused, and resolution (or the resolution ratio) must not be mistaken for accuracy. Accuracy depends on the combined error of the instrument's internal A/D converter and function converters, together with the quantization error. From the perspective of measurement, resolution is a "virtual" indicator (it has nothing to do with measurement error), while accuracy is a "real" indicator (it determines the size of the measurement error). Therefore, one cannot improve an instrument's accuracy merely by increasing the number of display digits to raise its resolution.
