Error analysis of AC voltage measurements: digital multimeter versus analog (pointer) multimeter
If the measured voltage is mains power, that is, 50 Hz alternating current, and both meters are within specification, the discrepancy can only mean that the source impedance of the measured voltage is high. At the same frequency, the biggest factor separating the readings of a pointer multimeter and a digital multimeter is the difference in their input resistance, which differs by orders of magnitude between the two types. When the source impedance is low, the difference is barely noticeable; when the source impedance is high, the two readings diverge markedly.
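The loading effect described above can be sketched as a simple voltage divider formed by the source impedance and the meter's input resistance. The numbers below are illustrative assumptions: a 10 MΩ input for the digital meter, a 9 kΩ/V analog meter on a 250 V AC range, and a 100 kΩ source impedance.

```python
# Sketch of meter loading error. All component values are assumptions
# chosen only to illustrate the effect, not specs of any particular meter.
def indicated_voltage(v_source, r_source, r_meter):
    """Voltage actually seen by a meter that loads the source:
    the source resistance and the meter's input resistance form a divider."""
    return v_source * r_meter / (r_source + r_meter)

V_S = 220.0           # open-circuit source voltage, volts
R_S = 100e3           # assumed high source impedance, 100 kOhm

R_DIGITAL = 10e6      # typical DMM input resistance, 10 MOhm
R_ANALOG = 9e3 * 250  # assumed 9 kOhm/V sensitivity on a 250 V AC range

print(indicated_voltage(V_S, R_S, R_DIGITAL))  # ~217.8 V
print(indicated_voltage(V_S, R_S, R_ANALOG))   # ~210.6 V
```

With a low-impedance source (say R_S = 1 Ω) both meters would read essentially 220 V, which is why the difference only appears when the source impedance is high.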
In this case, the measured point may not be a true 220 V live conductor: it could be the voltage downstream of the live wire after it passes through an appliance, or a leakage voltage on an appliance's chassis.
If those possibilities are ruled out, the only remaining explanation is that one of the two meters is inaccurate and needs service and recalibration.
When a voltage measurement shows an error, first establish two things: what is the frequency of the AC voltage being measured, and is it a pure sine wave?
Multimeters on the market today specify both a frequency-response range and an assumed waveform for AC voltage measurement. For common digital multimeters the frequency response is typically 40-1000 Hz, and a sine wave is required (distortion ≤ 1%); outside that range, measurement accuracy is not guaranteed. The reason is that the AC/DC (alternating current to direct current) conversion circuit in most digital multimeters is built around the low-power dual operational amplifier TL062, whose GBW (gain-bandwidth product) is limited, so the meter cannot measure high-frequency AC voltage accurately (whether the meter's divider resistors are frequency-compensated also matters).
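The effect of a limited GBW can be estimated with a first-order model: a stage with closed-loop gain G rolls off around f_c = GBW / G, and its reading falls increasingly short of the true value as frequency approaches f_c. The sketch below assumes a GBW of about 1 MHz (a typical figure for the TL062) and an illustrative closed-loop gain of 10.

```python
import math

# First-order (single-pole) sketch of closed-loop gain roll-off.
# GBW ~1 MHz is an assumed typical figure for a TL062-class op amp.
GBW = 1e6  # gain-bandwidth product, Hz

def gain_error_pct(f, closed_loop_gain):
    """Percent by which a first-order stage reads low at frequency f."""
    f_c = GBW / closed_loop_gain                   # -3 dB corner of the stage
    mag = 1.0 / math.sqrt(1.0 + (f / f_c) ** 2)    # normalized gain magnitude
    return (1.0 - mag) * 100.0

for f in (50, 1_000, 10_000, 100_000):
    print(f"{f:>7} Hz: reads {gain_error_pct(f, 10):.3f} % low")
```

At 50 Hz the error is negligible, but by 100 kHz this stage reads roughly 29% low, which is why the meter's spec cuts off well below the op amp's corner frequency.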
As for the ordinary pointer multimeter (an American invention roughly a century old), its internal structure is quite simple: a high-sensitivity moving-coil movement, a diode rectifier, and a resistive voltage divider (a few models add an op-amp AC amplifier between the divider and the movement to raise sensitivity). The measurement accuracy of this old and inexpensive design cannot match that of a digital multimeter, and its voltage divider is generally not capacitance-compensated.
If the two meters differ by several tens of volts on the same AC voltage, first inspect their voltage-divider resistor networks: has one of the resistors drifted in value? If those check out, verify that the pointer meter's needle rests on zero, and on the digital meter, check whether the calibration potentiometer for the AC voltage range has come loose.
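A drifted divider resistor alone can account for an error of tens of volts. The sketch below uses an assumed 10 MΩ divider string with a nominal 1/1000 tap and shows what the meter indicates if the top resistor has drifted high by 5% while the meter still trusts the nominal ratio.

```python
# Sketch of the error caused by a drifted divider resistor.
# The 9.99 MOhm / 10 kOhm string (1/1000 tap) is an assumed example, not
# the schematic of any specific meter.
def divider_ratio(r_top, r_bottom):
    """Fraction of the input voltage appearing at the divider tap."""
    return r_bottom / (r_top + r_bottom)

R_TOP, R_BOTTOM = 9.99e6, 10e3                    # nominal 1/1000 ratio
nominal = divider_ratio(R_TOP, R_BOTTOM)
drifted = divider_ratio(R_TOP * 1.05, R_BOTTOM)   # top resistor drifted +5%

v_in = 220.0
indicated = v_in * drifted / nominal   # what the meter shows using the old ratio
print(indicated)                       # ~209.5 V instead of 220 V
```

A larger drift, or drift in more than one resistor, scales the error accordingly; this is why checking the divider network is the first diagnostic step.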
By the way, if you want to accurately measure AC voltage of arbitrary waveform, it is recommended to buy a true-RMS multimeter, which reads sine, triangle, rectangular, and other waveforms correctly regardless of distortion.
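The reason an ordinary (rectifier-average) meter misreads non-sine waveforms is that it measures the mean of the rectified voltage and scales it by the sine form factor π/(2√2) ≈ 1.111 to display RMS. For any other waveform that scale factor is wrong, as the sketch below shows for a square and a triangle wave of peak amplitude 1 V (assumed test signals).

```python
import math

# Why an average-responding meter, calibrated for sine waves, misreads
# other waveforms. Peak amplitude Vp = 1 V is an assumed test value.
SINE_FORM_FACTOR = math.pi / (2 * math.sqrt(2))  # rms / mean|v| for a sine, ~1.111

def avg_responding_reading(mean_abs):
    """Reading of a rectifier-average meter scaled to show RMS of a sine."""
    return SINE_FORM_FACTOR * mean_abs

# Square wave, Vp = 1: mean|v| = 1.0, true RMS = 1.0
print(avg_responding_reading(1.0))   # reads ~1.111 V, about 11% high
# Triangle wave, Vp = 1: mean|v| = 0.5, true RMS = 1/sqrt(3) ~ 0.577
print(avg_responding_reading(0.5))   # reads ~0.555 V, about 3.8% low
```

A true-RMS meter instead computes the root of the mean of the squared voltage directly, so its reading is independent of the waveform's shape.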