Introduction to the Accuracy of a Digital Multimeter
Accuracy refers to the maximum allowable error under specific operating conditions. In other words, accuracy indicates how close the digital multimeter's measured value is to the actual value of the measured signal.
For DMMs, accuracy is usually expressed as a percentage of reading. For example, an accuracy of 1% of reading means that when the digital multimeter displays 100.0V, the actual voltage may be anywhere between 99.0V and 101.0V.
A specification may also add a number of counts to the basic accuracy. This value indicates how many counts (digits) the rightmost digit of the display may vary by. In the previous example, the accuracy might be stated as ±(1% + 2). Therefore, if the DMM reads 100.0V, the actual voltage will be between 98.8V and 101.2V.
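The arithmetic behind such a specification is easy to sketch. The short Python example below (the function name and the values used are illustrative, not taken from any particular meter's datasheet) computes the error band from the percentage-of-reading term, the number of counts, and the resolution of the least significant digit:

```python
# A minimal sketch showing how a "percent of reading + counts" spec
# translates into an error band. All values here are illustrative.

def dmm_error_band(reading, pct_of_reading, counts, resolution):
    """Return (low, high) bounds for a spec of +/-(pct_of_reading% + counts).

    reading        -- displayed value, e.g. 100.0 (volts)
    pct_of_reading -- the percentage term, e.g. 1.0 for 1%
    counts         -- number of least-significant-digit counts, e.g. 2
    resolution     -- value of one count on the current range, e.g. 0.1 V
    """
    error = reading * pct_of_reading / 100.0 + counts * resolution
    return reading - error, reading + error

# +/-(1% + 2) on a 100.0 V reading with 0.1 V resolution:
low, high = dmm_error_band(100.0, 1.0, 2, 0.1)
print(f"Actual voltage lies between {low:.1f} V and {high:.1f} V")
# -> Actual voltage lies between 98.8 V and 101.2 V
```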
The accuracy of an analog meter is calculated from the full-scale error, not the displayed reading. Typical accuracy for analog meters is ±2% or ±3% of full scale. Typical basic accuracy of a DMM is between ±(0.7% + 1) and ±(0.1% + 1) of reading, or even better.
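To see why a reading-based specification is generally tighter, the sketch below compares a hypothetical analog meter spec (±3% of an assumed 300 V full-scale range) with a hypothetical DMM spec of ±(0.5% + 1) for the same 100 V measurement. The range and percentage figures are assumptions chosen only for illustration:

```python
# Illustrative comparison of full-scale error vs. reading-based error
# for the same 100 V measurement. Specs below are hypothetical.

def analog_error(full_scale, pct_of_full_scale):
    """Absolute error of an analog meter: a fixed fraction of full scale."""
    return full_scale * pct_of_full_scale / 100.0

def dmm_error(reading, pct_of_reading, counts, resolution):
    """Absolute error of a DMM: a fraction of the reading plus LSD counts."""
    return reading * pct_of_reading / 100.0 + counts * resolution

# Analog meter: +/-3% of a 300 V full-scale range -> +/-9 V, regardless of reading.
print(f"Analog meter: +/-{analog_error(300, 3):.1f} V")

# DMM: +/-(0.5% + 1) on a 100.0 V reading with 0.1 V resolution -> +/-0.6 V.
print(f"DMM:          +/-{dmm_error(100.0, 0.5, 1, 0.1):.1f} V")
```

Because the analog meter's error is pinned to the full-scale value, its relative error grows as the pointer moves down the scale, whereas the DMM's error stays proportional to the reading plus a small fixed count term.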