How is a digital multimeter's accuracy (uncertainty) determined?
Answer: What some manufacturers call accuracy, others call uncertainty. A typical specification reads: "±(0.8% of reading + 2 digits), valid within one year after leaving the factory, at an operating temperature of 18°C ~ 28°C (64°F ~ 82°F) and relative humidity below 80%." Many buyers and users are unclear about what this means and often ask. Here "2 digits" (sometimes written "counts") means two counts of the least-significant digit shown on the display.

Suppose a meter with this spec is set to a given range, say DC 200 V, and the display shows 100.0. What is the correct value? For general users, you can ignore the accuracy calculation entirely and simply take the reading as 100 V DC. To apply the manufacturer's spec, substitute the displayed value as a count, without the decimal point: 100.0 becomes 1000 counts, so the error is ±(0.8% × 1000 + 2) = ±10 counts. Then restore the decimal point to the result: on this range one count is 0.1 V, so 10 counts is ±1.0 V. In this example the correct value is therefore 100.0 ± 1.0 V, that is, somewhere between DC 99.0 V and 101.0 V.
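The same arithmetic can be done directly on the displayed value rather than in counts: 0.8% of 100.0 V is 0.8 V, plus 2 digits × 0.1 V per digit = 0.2 V, again ±1.0 V. Below is a minimal sketch of this calculation in Python; the function name dmm_uncertainty and its parameters are illustrative, and the 0.1 V per-count value assumes a 3½-digit meter on its 200 V range as in the example above.

    def dmm_uncertainty(reading, counts_per_digit, pct_of_reading, digits):
        """Return the +/- uncertainty for a DMM reading.

        reading          -- displayed value (e.g. 100.0)
        counts_per_digit -- value of one least-significant count on
                            this range (0.1 V on a 3.5-digit 200 V range)
        pct_of_reading   -- percentage term from the spec (0.8 means 0.8%)
        digits           -- fixed count term from the spec (2 digits)
        """
        return (pct_of_reading / 100.0) * reading + digits * counts_per_digit

    # Example from the answer: 100.0 shown on the DC 200 V range,
    # with a spec of +/-(0.8% of reading + 2 digits).
    u = dmm_uncertainty(reading=100.0, counts_per_digit=0.1,
                        pct_of_reading=0.8, digits=2)
    print(f"100.0 V +/- {u:.1f} V -> {100.0 - u:.1f} V to {100.0 + u:.1f} V")
    # prints: 100.0 V +/- 1.0 V -> 99.0 V to 101.0 V

Both routes give the same answer because multiplying the reading by 0.8% and multiplying its count value by 0.8% differ only by where the decimal point sits.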






