What are the technical parameters and measurement methods of the digital multimeter
1. Resolution, digits and counts
Resolution refers to the multimeter's ability to distinguish small changes in the signal being measured. Knowing the resolution of the multimeter lets you determine whether it can observe small changes in the measured signal. For example, if a DMM has a resolution of 1mV on the 4V range, it can display a change of 1mV (one-thousandth of a volt) while reading 1V.
If you had to measure lengths down to a minimum of 1/4 inch (or 1 mm), you wouldn't buy a ruler with a minimum scale of 1 inch (or 1 cm). A thermometer that only measures whole degrees isn't much use if the normal temperature is 98.6 degrees Fahrenheit. You need a thermometer with a resolution of 0.1 degrees.
The terms "digits" and "counts" are used to describe the resolution of a multimeter. Multimeters are grouped by the number of counts or digits they display.
A 3½-digit multimeter displays three full digits (0 to 9) and a "half digit" (which displays only a "1" or remains blank). A 3½-digit multimeter has a display resolution of up to 1,999 counts, and a 4½-digit multimeter up to 19,999 counts. Counts describe a multimeter's resolution more precisely than digits: today's 3½-digit multimeters may have resolutions as high as 3,200, 4,000, or 6,000 counts.
For some measurements, a 3,200-count multimeter provides better resolution. For example, when measuring more than 200V, a 1,999-count multimeter cannot display tenths of a volt, whereas a 3,200-count multimeter can display 0.1V resolution at voltages up to 320V. This is the same resolution as a more expensive 20,000-count multimeter until the voltage exceeds 320V.
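The relationship between counts and resolution above can be sketched as a simple calculation. This is an idealized model (an assumption for illustration, not a manufacturer's formula): the resolution on a range is the range span divided by the number of display counts.

```python
def resolution(range_span: float, counts: int) -> float:
    """Smallest step a DMM can display on a given range.

    Idealized model: resolution = range span / display counts.
    """
    return range_span / counts

# A 1,999-count (3 1/2 digit) meter: the 200 V range is 2,000 steps
print(resolution(200, 2000))   # 0.1 -> steps of 0.1 V
# A 3,200-count meter keeps 0.1 V steps all the way up to 320 V
print(resolution(320, 3200))   # 0.1
```

This shows why the higher-count meter holds 0.1V resolution on voltages above 200V, where the 1,999-count meter must switch to a coarser range.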
2. Accuracy
Accuracy is the maximum allowable error under specified operating conditions. In other words, accuracy indicates how close the measured value displayed by the digital multimeter is to the actual value of the signal being measured.
The accuracy of a digital multimeter is usually expressed as a percentage of reading. An accuracy of 1% of reading means that if the displayed reading is 100V, the actual value of the voltage could be anywhere between 99V and 101V.
Specifications may also include a number of counts added to the basic accuracy specification. This figure indicates how many counts the rightmost digit of the displayed value may vary by. The accuracy in the above example might thus be expressed as "±(1 % + 2)". If the resolution on the range in use is 0.1V, two counts add a further ±0.2V, so a display reading of 100V corresponds to an actual voltage between 98.8V and 101.2V.
The accuracy of an analog multimeter is specified as a percentage of full-scale error, not as a percentage of the displayed reading. Typical accuracy for analog multimeters is ±2% or ±3% of full scale; at 1/10 of full scale this amounts to 20% or 30% of the reading. Typical basic accuracy of a DMM is between ±(0.7%+1) and ±(0.1%+1) of reading, or better.
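The worst-case error band for a "±(percentage of reading + counts)" specification can be sketched as follows; the function name and parameters are illustrative, and the resolution argument is the value of one count on the range in use (e.g. 0.1V).

```python
def accuracy_band(reading: float, pct: float, counts: int,
                  resolution: float) -> tuple[float, float]:
    """Worst-case bounds for a DMM spec of +/-(pct% of reading + counts).

    `resolution` is the value of one least-significant count on the
    active range; `counts` extra counts contribute counts * resolution.
    """
    error = reading * pct / 100 + counts * resolution
    return reading - error, reading + error

# +/-(1 % + 2) on a range with 0.1 V resolution, display reads 100 V:
low, high = accuracy_band(100.0, 1.0, 2, 0.1)
print(low, high)   # 98.8 101.2
```

This reproduces the 98.8V to 101.2V window from the example above: 1% of 100V is 1V, and two counts of 0.1V add another 0.2V of allowable error.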
3. Ohm's law
The voltage, current, and resistance of any circuit can be calculated using Ohm's law, which states that voltage equals current times resistance, V = I × R (see Figure 1). Therefore, if any two values in this formula are known, the third can be found.
Digital multimeters use Ohm's law to directly measure and display resistance, current or voltage. The following describes how to use a digital multimeter to easily measure the required parameters.
4. Digital and analog display
The digital display has high accuracy and resolution, showing three or more digits for each measurement.
The analog needle display is less accurate and has a lower effective resolution because the value between two tick marks must be estimated.
A bar graph shows signal changes and trends like an analog needle, but is more durable and less prone to damage.
