How much do you know about the accuracy of digital multimeters?
Accuracy refers to the maximum allowable error under specified operating conditions. In other words, accuracy indicates how close the digital multimeter's measured value is to the actual value of the signal being measured.
For digital multimeters, accuracy is usually expressed as a percentage of the reading. For example, a reading accuracy of ±1% means that when the digital multimeter displays 100.0 V, the actual voltage may be anywhere between 99.0 V and 101.0 V.
The detailed specifications may add a number of counts to the basic accuracy. This figure is the number of counts of the rightmost display digit to be added to the percentage error. In the previous example, the accuracy might be stated as ±(1% + 2). Therefore, if the DMM reads 100.0 V, the actual voltage will be between 98.8 V and 101.2 V.
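As a minimal sketch (not from the article), the calculation behind a ±(percent + counts) specification can be written out as follows; the function name and the 0.1 V resolution of the last digit are assumptions chosen to match the 100.0 V example above.

```python
def dmm_error_band(reading, percent, counts, resolution):
    """Return (low, high) bounds for a reading under a ±(percent% + counts) spec.

    resolution is the value of one count in the rightmost display digit
    (0.1 V when the meter shows 100.0 V).
    """
    error = reading * percent / 100.0 + counts * resolution
    return reading - error, reading + error

# The ±(1% + 2) example from the text: a 100.0 V reading with 0.1 V resolution.
low, high = dmm_error_band(100.0, 1.0, 2, 0.1)
print(f"Actual voltage lies between {low:.1f} V and {high:.1f} V")  # 98.8 V and 101.2 V
```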
The accuracy of an analog meter is calculated from the error at full scale, not from the displayed reading. Typical analog meter accuracy is ±2% or ±3% of full scale. The typical basic accuracy of a digital multimeter is between ±(0.7% + 1) and ±(0.1% + 1) of the reading, or even better.
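The practical difference between the two conventions is easiest to see with numbers. The sketch below (an illustration, not part of the original text; the specific 100 V range, 50 V reading, and ±(0.5% + 1) spec are assumed values) compares the error of an analog meter specified against full scale with that of a DMM specified against the reading.

```python
def analog_error(full_scale, percent_fs):
    """Error of an analog meter: a fixed fraction of the full-scale range."""
    return full_scale * percent_fs / 100.0

def dmm_error(reading, percent_rdg, counts, resolution):
    """Error of a DMM: a fraction of the reading plus a few counts of the last digit."""
    return reading * percent_rdg / 100.0 + counts * resolution

reading = 50.0  # volts, measured on both instruments
print("Analog, ±3% of 100 V full scale:", analog_error(100.0, 3.0), "V")            # 3.0 V
print("DMM, ±(0.5% + 1) with 0.1 V digit:", dmm_error(reading, 0.5, 1, 0.1), "V")   # 0.35 V
```

Because the analog error is fixed by the range rather than the reading, its relative error grows as the pointer moves toward the low end of the scale, while the DMM's error scales with the reading itself.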
An introduction to the meter head of a multimeter
The meter head of a multimeter is a sensitive current meter. The dial of the meter head is printed with various symbols, scales, and numerical values. The symbol A-V-Ω indicates that the instrument can measure current, voltage, and resistance. Several scale lines are printed on the dial. The one marked "Ω" is the resistance scale; its right end is zero, its left end is ∞, and its divisions are unevenly spaced. The symbol "-" or "DC" denotes direct current, "~" or "AC" denotes alternating current, and a combined symbol marks scale lines shared by AC and DC. The rows of numbers below the scale lines are the scale values corresponding to the different positions of the selector switch.
