How To Calculate The Accuracy Of Multimeter? - It's That Simple!

Mar 01, 2023




The accuracy specification is expressed as follows:
±(% of reading + % of range)
In this formula, "% of reading" is an error proportional to the measured value, and "% of range" is a fixed offset determined by the selected range. These specifications are given separately for each measurement range.

If the accuracy is coarser than the measurement resolution, the extra digits of resolution do not improve the accuracy. However, you can still use that resolution to monitor small changes during the measurement.

For example:

Suppose you want to measure a 10 Vdc signal with a 34401A multimeter on the 10 V range, using its 1-year accuracy specification of 0.0035% of reading + 0.0005% of range. The error is: 10 x (0.0035 / 100) + 10 x (0.0005 / 100) = ±0.00040 V

Therefore:
Measured value: 10.00000
Accuracy: ±0.00040*
Resolution: 0.00001
The actual value lies between 9.99960 and 10.00040.
The last two digits of the reading are therefore within the error band.
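The calculation above can be sketched as a small helper. This is an illustrative snippet, not vendor code; the function name `accuracy_limits` and its parameters are my own, and the spec values are the 34401A figures quoted above.

```python
def accuracy_limits(reading, pct_of_reading, pct_of_range, range_value):
    """Return (error, low, high) for a spec of ±(% of reading + % of range)."""
    error = reading * pct_of_reading / 100 + range_value * pct_of_range / 100
    return error, reading - error, reading + error

# 34401A, 10 V range, 1-year spec: 0.0035% of reading + 0.0005% of range
error, low, high = accuracy_limits(10.0, 0.0035, 0.0005, 10.0)
print(f"±{error:.5f} V  ->  {low:.5f} ... {high:.5f} V")
```

Running this reproduces the ±0.00040 V error band between 9.99960 V and 10.00040 V.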

*Some multimeter models specify accuracy in "ppm" instead of "% of reading" and "% of range".

A value in ppm is converted by multiplying by 1/1,000,000 (= 10⁻⁶).

Example 1:

If a 1 V reading has an error of 10 ppm,
then the actual error is 1 x 10 x 10⁻⁶ = 0.00001 (V).
Example 2:

If a 10 V reading has an error of 5 ppm,
then the actual error is 10 x 5 x 10⁻⁶ = 0.00005 V (50 µV).
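The two ppm examples follow the same one-line conversion, sketched below. The function name `ppm_error` is an assumption for illustration; the numbers are the ones used in the examples above.

```python
def ppm_error(reading, ppm):
    """Error, in the same unit as the reading, for a spec given in ppm."""
    return reading * ppm * 1e-6

print(ppm_error(1, 10))   # Example 1: 1 V at 10 ppm, about 0.00001 V (10 µV)
print(ppm_error(10, 5))   # Example 2: 10 V at 5 ppm, about 0.00005 V (50 µV)
```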
