What is the accuracy of a multimeter?
The accuracy of a digital multimeter is the combined effect of systematic and random errors on its measurement results. It expresses how closely the measured value agrees with the true value, and thus the magnitude of the measurement error: the higher the accuracy, the smaller the error, and vice versa.
There are three ways to express accuracy, as follows:
Accuracy = ±(a% RDG + b% FS)    (2.2.1)
Accuracy = ±(a% RDG + n digits)    (2.2.2)
Accuracy = ±(a% RDG + b% FS + n digits)    (2.2.3)
In equation (2.2.1), RDG is the reading (i.e. the displayed value) and FS is the full-scale value. The first term in parentheses represents the combined error of the A/D converter and the function converters (such as the voltage divider, current shunt, and true-RMS converter); the second term is the error contributed by the digital processing. In equation (2.2.2), n is the quantization error expressed as a number of counts of the last display digit. If the n-digit error is converted into a percentage of full scale, equation (2.2.2) becomes equation (2.2.1). Equation (2.2.3), which some manufacturers use, is less common: the additional term accounts for errors introduced by the environment or by other functions.
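The three expressions above can be evaluated with one small helper. This is an illustrative sketch, not a vendor API; the function name, the example range, and the ±(0.5% RDG + 2 digits) spec are assumptions chosen for the demonstration.

```python
def uncertainty(reading, a_pct, n_digits=0, resolution=0.0, b_pct=0.0, full_scale=0.0):
    """Return the +/- measurement uncertainty for a DMM accuracy spec.

    reading     -- displayed value (RDG)
    a_pct       -- reading-error coefficient a, in percent
    n_digits    -- number of counts n of the last display digit
    resolution  -- value of one count of the last digit on this range
    b_pct       -- full-scale-error coefficient b, in percent
    full_scale  -- full-scale value of the range (FS)
    """
    return (a_pct / 100.0) * abs(reading) \
         + (b_pct / 100.0) * full_scale \
         + n_digits * resolution

# Example of form (2.2.2): an assumed 2 V range (resolution 0.001 V)
# with a spec of +/-(0.5% RDG + 2 digits), reading 1.500 V:
err = uncertainty(1.500, a_pct=0.5, n_digits=2, resolution=0.001)
# 0.5% of 1.500 V is 0.0075 V, plus 2 counts (0.002 V) -> about 0.0095 V
```

Setting `b_pct` and `full_scale` instead of `n_digits` gives form (2.2.1), and setting all of them gives form (2.2.3).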
The accuracy of a digital multimeter is far better than that of an analog (pointer) multimeter. Taking the DC-voltage accuracy on the basic range as an example, a 3½-digit meter can reach ±0.5% and a 4½-digit meter ±0.03%, as in the OI857 and OI859CF multimeters. Accuracy is a very important specification that reflects the quality and process capability of the instrument; a multimeter with poor accuracy cannot reliably represent the true value and can easily lead to misjudgment in measurement.
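To make the 3½-digit versus 4½-digit comparison concrete, the snippet below applies the ±0.5% and ±0.03% reading-error figures quoted above to a full-scale reading on an assumed 2 V DC range (only the a% RDG term is included here):

```python
reading = 2.000  # volts; assumed full-scale reading on a 2 V DC range
specs = {"3.5-digit": 0.5, "4.5-digit": 0.03}  # a% RDG figures from the text
# Reading-error contribution a% * RDG, converted to millivolts
errors_mV = {name: (a / 100.0) * reading * 1000.0 for name, a in specs.items()}
# errors_mV is roughly {"3.5-digit": 10.0, "4.5-digit": 0.6} (mV)
```

So on the same 2 V reading, the 4½-digit meter's reading error is smaller by more than an order of magnitude.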
