How is a digital multimeter's accuracy (uncertainty) determined?
Some manufacturers call a multimeter's accuracy its uncertainty, and the specification usually reads something like: "Within one year of leaving the factory, at an operating temperature of 18°C ~ 28°C (64°F ~ 82°F) and relative humidity below 80%: ±(0.8% of reading + 2 digits)." Many buyers and users are unsure what this means and often ask about it. Suppose a meter carries this specification on a given range, say DC 200V, and the display shows 100.0; what is the true value likely to be? For most everyday use you can ignore the accuracy calculation entirely and simply treat the reading as 100V DC. Calculated from the manufacturer's specification, a measurement of 100V (displayed as 100.0) has an error of ±(0.8% × 1000 + 2) = ±10 counts, i.e. ±1.0V. When substituting the reading into the formula, ignore the decimal point and use the raw count (1000); after the calculation, place the decimal point back at the same position as the original reading to get the error. In this example the true value is 100.0 ± 1.0, that is, somewhere between DC 99.0V and 101.0V.
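The same calculation can be written as a short script. This is a minimal sketch assuming a "±(% of reading + digits)" specification; the function name and parameters are illustrative, not from any meter's documentation:

```python
def meter_error(display_value, resolution, pct=0.008, digits=2):
    """Return the absolute error of a digital multimeter reading.

    display_value: value shown on the meter (e.g. 100.0 V)
    resolution:    value of one least-significant digit on this range
                   (e.g. 0.1 V when a 200 V range displays 100.0)
    pct:           percentage-of-reading term (0.8% -> 0.008)
    digits:        the "+2 digits" term, in counts of the last digit
    """
    counts = round(display_value / resolution)   # 100.0 V / 0.1 V = 1000 counts
    error_counts = pct * counts + digits         # 0.8% * 1000 + 2 = 10 counts
    return error_counts * resolution             # 10 counts * 0.1 V = 1.0 V

err = meter_error(100.0, 0.1)                    # DC 200 V range, display 100.0
print(f"Reading: 100.0 V, error: ±{err:.1f} V")  # true value between 99.0 V and 101.0 V
```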
I am a rookie and I want to learn electronic maintenance. What kind of multimeter should I buy?
In general, a beginner can buy an ordinary popular multimeter (also called a "three-use meter" because it measures voltage, current, and resistance). The most-used function is resistance, followed by DC voltage. The extra functions, diode test, continuity buzzer, transistor (hFE) test, and capacitance measurement, are also used fairly often, so it is worth choosing an instrument that has them. Of course, some experienced repair technicians have their own particular repair methods, but that is another matter.
