How is the accuracy (uncertainty) of a digital multimeter computed?
The accuracy of a multimeter (called "uncertainty" by some manufacturers) is generally stated in a form like: "within one year of leaving the factory, at an operating temperature of 18 °C to 28 °C (64 °F to 82 °F) and relative humidity below 80%, ±(0.8% of reading + 2 counts)." Many buyers and users are not clear about what this means and often ask. Suppose an instrument on a certain range, say the DC 200 V range, carries the specification above, and the displayed value is 100.0. What is the true value? For the typical user, the accuracy calculation can be ignored entirely and the reading simply taken as 100 V DC. By the manufacturer's formula, when measuring 100 V (display 100.0) the error is ±(0.8% × 1000 + 2) = ±10 counts; since one count is 0.1 V on this range, that is ±1.0 V. When substituting the reading into the formula, drop the decimal point and use the displayed digits as a count; then put the decimal point back into the result at the same position as in the original reading. So the true value is 100.0 ± 1.0, i.e. between 99.0 and 101.0 V DC.
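A minimal sketch of this calculation in Python (the function name and values are illustrative, not taken from any particular meter's manual):

    # Uncertainty from a spec of the form +/-(p% of reading + n counts).
    # display_counts: the reading with the decimal point removed (1000 for "100.0")
    # count_value:    the value of one count on this range (0.1 V on the 200 V range)
    def uncertainty(display_counts, count_value, pct_of_reading, extra_counts):
        error_counts = pct_of_reading / 100.0 * display_counts + extra_counts
        return error_counts * count_value

    # DC 200 V range, spec +/-(0.8% of reading + 2 counts), display reads 100.0
    err = uncertainty(display_counts=1000, count_value=0.1,
                      pct_of_reading=0.8, extra_counts=2)
    print(f"true value: {100.0 - err:.1f} V to {100.0 + err:.1f} V")  # 99.0 V to 101.0 V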
What is the basic working principle of a digital multimeter?
The core circuit of a digital multimeter is the meter head (the A/D converter), whose basic function is to quantize an input DC voltage (an analog quantity) and output the result as a digital reading; other functions generally require additional external circuitry. PS: chip integration in multimeters keeps increasing and peripheral circuitry keeps shrinking, which has both advantages and disadvantages. Advantages: high integration, simple external circuits, and fewer faults caused by component quality issues. Disadvantage: once the chip fails, replacing it is expensive and troublesome; sometimes the cost of a replacement chip would buy a whole new instrument, so a failed meter usually has to be scrapped.
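As a rough illustration of the principle, here is a minimal Python model of the core: a range divider scales the input into the converter's full-scale window, the converter quantizes it into counts, and the count is shown with a range-dependent decimal point. The 200 mV full scale and 2000-count core are assumptions (typical of common 3 1/2-digit chips), not a description of any specific part:

    ADC_FULL_SCALE_V = 0.2   # assumed 200 mV full-scale converter core
    ADC_FULL_COUNTS = 2000   # assumed 3 1/2-digit core: display 0..1999

    def measure(input_volts, range_full_scale):
        # The range divider brings the input into the ADC's window,
        # then the ADC quantizes it into counts.
        divider = range_full_scale / ADC_FULL_SCALE_V      # 1000:1 on the 200 V range
        count = round(input_volts / divider / ADC_FULL_SCALE_V * ADC_FULL_COUNTS)
        if count >= ADC_FULL_COUNTS:
            return "OL"                                    # over-range indication
        return count * range_full_scale / ADC_FULL_COUNTS  # displayed volts

    print(measure(100.0, 200.0))  # -> 100.0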
What is the difference between a three and a half digit and a four and a half digit digital multimeter?
Three and a half digits is also written 3 1/2 digits (read "three and a half digits"), and four and a half digits is written 4 1/2 digits. The "half" digit is the leading digit, which can only show 0 or 1, so a 3 1/2-digit meter displays at most 1999 and a 4 1/2-digit meter at most 19999. We know that how accurately a quantized analog quantity is represented as a number is related to the number of digits: the more digits, the closer the result is to the original value and the more accurate it is. That is the general rule, leaving aside special cases; for instance, if the quantized value happens to be exactly 1.00000 V, representing it with one digit gives the same result as representing it with N digits. So in general, more digits means more accuracy, i.e. four and a half digits is more accurate than three and a half.
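A short Python sketch of the resolution difference on the same range (the 200 V range here is just an example):

    # Resolution of one count on the same range: a 3 1/2-digit meter has
    # 2000 counts full scale (display up to 1999), a 4 1/2-digit meter
    # has 20000 counts (display up to 19999).
    range_full_scale = 200.0  # volts

    for digits, full_counts in [("3 1/2", 2000), ("4 1/2", 20000)]:
        print(f"{digits} digits: 1 count = {range_full_scale / full_counts} V")
    # 3 1/2 digits: 1 count = 0.1 V
    # 4 1/2 digits: 1 count = 0.01 V

On the same range, the 4 1/2-digit meter resolves one more decimal place, which is why it can be more accurate, provided its accuracy specification is correspondingly tighter.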