What are the digital multimeter's specifications and methods of measurement?
1. Resolution, bits, and word count
Resolution is the multimeter's ability to detect small changes in the signal being measured. Knowing the resolution tells you whether the multimeter can register a slight change in the input. For instance, a DMM with 1 mV resolution on the 4 V range can detect a change of 1 mV (1/1000 of a volt) while reading 1 V.
You wouldn't buy a ruler whose smallest division is 1 inch (or 1 cm) if you needed to measure lengths down to 1/4 inch (or 1 mm). Likewise, if a body temperature is 98.6 degrees Fahrenheit, a thermometer that reads only in whole degrees isn't much help; a thermometer with 0.1 degree resolution is required.
The terms "digits" and "counts" are used to characterize a multimeter's resolution. Multimeters are grouped by the number of digits or counts the DMM can display.
A 3½-digit multimeter shows three full digits (0 to 9) plus a "half digit" (which either shows a "1" or is left blank). The display resolution of a 3½-digit multimeter can reach 1,999 counts, and that of a 4½-digit multimeter up to 19,999 counts. Using "counts" rather than "digits" conveys the multimeter's resolution more precisely: current 3½-digit multimeters may offer 3,200, 4,000, or 6,000 counts.
A 3,200-count multimeter offers superior resolution for many measurements. For instance, a 1,999-count multimeter cannot resolve 0.1 V when measuring 200 V or higher, while a 3,200-count multimeter can still display 0.1 V at voltages up to 320 V. Until the voltage exceeds 320 V, this resolution matches that of a more expensive 20,000-count multimeter.
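The relationship between count rating and resolution described above can be sketched in code. This is an illustrative model only (the function name and the assumption that ranges come in decade steps scaled to the count rating are mine, not from the text):

```python
# Illustrative sketch: how a DMM's count rating limits the resolution
# available at a given measured value. Assumes decade ranges scaled to
# the count rating (e.g. a 3,200-count meter has 3.2 V / 32 V / 320 V ranges).

def finest_step(counts, value):
    """Return the smallest display step (value of one count) a meter
    with the given count rating can offer while still showing `value`."""
    exp = -4  # start at an arbitrarily fine decade range
    while counts * 10**exp < value:  # range too small to show the value
        exp += 1                     # move up to the next decade range
    return 10**exp

# A 1,999-count meter measuring 200 V is forced onto a range with 1 V
# steps, while a 3,200-count meter can still resolve 0.1 V at 200 V.
print(finest_step(1999, 200))  # 1
print(finest_step(3200, 200))  # 0.1
```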
2. Accuracy
The maximum permitted error under specified operating conditions is called accuracy. In other words, accuracy describes how closely the value displayed by the digital multimeter matches the actual value of the signal being measured.
A digital multimeter's accuracy is usually stated as a percentage of the reading. With 1% reading accuracy, a displayed voltage of 100 V means the actual voltage may be anywhere between 99 V and 101 V.
A number of counts may be added to the basic accuracy specification. This figure represents how many counts the rightmost digit of the displayed value may vary. The accuracy in the example above might therefore be written as ±(1% + 2). With a display resolution of 0.1 V, a reading of 100.0 V then means the actual voltage is between 98.8 V and 101.2 V.
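The ±(percent + counts) arithmetic above can be made explicit with a short sketch (the function name is hypothetical; the figures match the 100.0 V example in the text):

```python
# Sketch of the ±(percent of reading + counts) accuracy calculation.
# `resolution` is the value of one count, i.e. the rightmost displayed digit.

def error_bounds(reading, percent, counts, resolution):
    """Return (low, high) bounds on the true value for a displayed
    reading and an accuracy spec of ±(percent of reading + counts)."""
    error = reading * percent / 100 + counts * resolution
    return reading - error, reading + error

# A 100.0 V reading with ±(1% + 2) accuracy and 0.1 V resolution:
low, high = error_bounds(100.0, 1, 2, 0.1)
print(low, high)  # 98.8 101.2
```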
An analog multimeter's accuracy is specified as a percentage of full scale rather than of the displayed value, typically between 2% and 3% of full scale. At 1/10 of full scale, this corresponds to an error of 20% to 30% of the reading. A DMM's basic accuracy is typically ±(0.1% + 1) to ±(0.7% + 1) of reading, or better.
3. Ohm's law
Ohm's law relates the voltage, current, and resistance in any circuit: voltage equals current times resistance. Consequently, if any two of these values are known, the third can be calculated. When a digital multimeter measures resistance, current, or voltage, it is directly measuring and displaying the quantities related by Ohm's law. The following sections explain the easiest ways to measure these parameters with a digital multimeter.
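The "know two, find the third" relationship can be sketched as a small helper (the function name is hypothetical, for illustration only):

```python
# Ohm's law: V = I * R. Given any two of voltage (V), current (A),
# and resistance (ohms), the third follows directly.

def ohms_law(voltage=None, current=None, resistance=None):
    """Solve for whichever of V, I, or R is left as None."""
    if voltage is None:
        return current * resistance   # V = I * R
    if current is None:
        return voltage / resistance   # I = V / R
    if resistance is None:
        return voltage / current      # R = V / I
    raise ValueError("leave exactly one of the three values as None")

print(ohms_law(current=0.5, resistance=24))  # 12.0 (volts)
print(ohms_law(voltage=12, resistance=24))   # 0.5 (amps)
print(ohms_law(voltage=12, current=0.5))     # 24.0 (ohms)
```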
