Introduction to the Resolution and Accuracy of Digital Multimeters

Mar 19, 2023



 

Resolution


Resolution refers to the smallest change in the input signal that a meter can detect and display. Knowing a meter's resolution tells you whether you can see small changes in the signal being measured. For example, if a DMM has a resolution of 1 mV on the 4 V range, then when measuring a 1 V signal you can see changes as small as 1 mV (one thousandth of a volt).
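The relationship between range, counts, and resolution can be sketched as a small helper function (a hypothetical illustration, not a manufacturer formula; the 4000-count figure is an assumption chosen so that the article's 1 mV-on-4 V example works out):

```python
def resolution(full_range_volts, counts):
    """Smallest voltage step a meter can display on a given range.

    Hypothetical helper: resolution = range / counts.
    """
    return full_range_volts / counts

# A 4 V range on a 4000-count meter resolves steps of 4 / 4000 = 0.001 V (1 mV).
print(resolution(4.0, 4000))
```

So a finer resolution comes either from a lower range or from a meter with more counts.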


If you're measuring something smaller than 1/4 inch (or 1 mm), you wouldn't use a ruler marked only in inches (or centimeters). Likewise, measuring a body temperature of 98.6°F is pointless with a thermometer graduated only in whole degrees; you need a thermometer with a resolution of 0.1°F.


The terms digits and counts are used to describe a meter's resolution. DMMs are classified by the number of digits and counts they can display.


A 3½-digit meter can display three full digits, each ranging from 0 to 9, plus a half digit that displays only a 1 or nothing. A 3½-digit meter therefore provides a resolution of 1999 counts, and a 4½-digit meter provides 19999 counts.


Describing a meter's resolution in counts is more informative than describing it in digits, because the resolution of some 3½-digit meters has been extended to 3200 or even 4000 counts.


A 3200-count meter provides better resolution for some measurements. For example, a 1999-count meter cannot display 0.1 V when measuring a voltage above 200 V, whereas a 3200-count meter can still display 0.1 V when measuring up to 320 V. To keep 0.1 V resolution above 320 V, a more expensive 20000-count meter is required.
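The arithmetic behind these count figures can be sketched as follows (a hypothetical helper; it simply multiplies the count rating by the size of one count):

```python
def max_volts_at_resolution(counts, step=0.1):
    """Highest voltage a meter can display while still resolving `step` volts.

    With a 0.1 V least significant digit, a meter showing `counts` counts
    tops out at counts * step volts before it must switch to a coarser range.
    """
    return counts * step

for counts in (1999, 3200, 20000):
    print(counts, "counts ->", max_volts_at_resolution(counts), "V max at 0.1 V resolution")
```

This reproduces the article's figures: about 199.9 V for a 1999-count meter, 320 V for a 3200-count meter, and 2000 V for a 20000-count meter.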


Accuracy


Accuracy refers to the maximum allowable error under specified operating conditions. In other words, accuracy indicates how close the digital multimeter's displayed value is to the actual value of the signal under test.


For DMMs, accuracy is usually expressed as a percentage of the reading. For example, an accuracy of ±1% of reading means that when the digital multimeter displays 100.0 V, the actual voltage may be anywhere between 99.0 V and 101.0 V.


Specification sheets may add a count value to the basic accuracy. It indicates how many counts of the display's rightmost digit should be added to the error. In the previous example, the accuracy might be stated as ±(1% + 2). So if the DMM reads 100.0 V with a 0.1 V least significant digit, the actual voltage will be between 98.8 V and 101.2 V.
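The worked example above can be checked with a short sketch (a hypothetical helper, where `lsd` is the value of one count, i.e. the least significant digit on the range in use):

```python
def accuracy_bounds(reading, pct, digits, lsd):
    """Worst-case bounds for a DMM spec of ±(pct% of reading + digits counts).

    Total error = (pct / 100) * reading + digits * lsd.
    """
    err = reading * pct / 100.0 + digits * lsd
    return reading - err, reading + err

# ±(1% + 2 counts) on a 100.0 V reading with a 0.1 V least significant digit:
# error = 1.0 V + 0.2 V = 1.2 V, so the true value lies in [98.8 V, 101.2 V].
low, high = accuracy_bounds(100.0, 1.0, 2, 0.1)
print(low, high)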


The accuracy of an analog meter is specified relative to the full-scale error, not the displayed reading. Typical accuracy for analog meters is ±2% or ±3% of full scale. Typical basic accuracy for a DMM is between ±(0.7% + 1) and ±(0.1% + 1) of reading, or even better.
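The practical difference between the two conventions can be illustrated with a small sketch (hypothetical numbers, assuming a ±3% full-scale analog meter on a 100 V range): a full-scale spec gives the same absolute error everywhere on the range, so it gets proportionally worse for small readings.

```python
def analog_error(full_scale, pct_fs):
    """Absolute error of an analog meter specified as a percent of full scale.

    The error is fixed by the range, not by the reading.
    """
    return full_scale * pct_fs / 100.0

err = analog_error(100.0, 3.0)          # ±3 V anywhere on the 100 V range
print(err / 100.0)                       # fraction of a 100 V reading (3%)
print(err / 10.0)                        # fraction of a 10 V reading (30%)
```

A reading-relative DMM spec avoids this: its percentage term scales down with the reading, leaving only the small fixed count term.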

 

