Basic Terms and Explanations of Multimeters

May 21, 2025



Accuracy: The maximum difference between the value a digital multimeter displays and the actual value of the measured quantity. It is expressed either as a percentage of the reading or as a percentage of the full range.


Analog meter: An instrument that displays measured values with a moving pointer. The user reads the value from the pointer's position along the scale.


Annunciator: A symbol or indicator on the display that shows the currently selected range or function.


Average responding: A meter that rectifies the AC signal, measures its average value, and scales the result to display an RMS reading. This is accurate for sine waves but introduces errors when measuring non-sinusoidal waveforms.
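The scaling error can be shown numerically. The sketch below (with assumed sample counts and amplitudes) applies the sine form factor pi / (2 * sqrt(2)) to a rectified average, which is exact for a sine wave but reads about 11% high on a square wave:

```python
import math

# Assumed model: an average-responding meter rectifies the signal,
# averages it, and multiplies by the sine form factor so that a pure
# sine wave displays its true RMS value.
FORM_FACTOR = math.pi / (2 * math.sqrt(2))  # ~1.1107

def avg_responding_reading(samples):
    rectified_avg = sum(abs(s) for s in samples) / len(samples)
    return FORM_FACTOR * rectified_avg

def true_rms(samples):
    return math.sqrt(sum(s * s for s in samples) / len(samples))

N = 10_000  # assumed sample count over one period
sine = [math.sin(2 * math.pi * k / N) for k in range(N)]
square = [1.0 if k < N // 2 else -1.0 for k in range(N)]

# Sine: both methods agree (~0.707).
# Square: true RMS is 1.0, but the average-responding
# reading is ~1.11, about 11% high.
print(avg_responding_reading(sine), true_rms(sine))
print(avg_responding_reading(square), true_rms(square))
```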


Count: The digit term in an accuracy specification, stated after the ± sign and used together with a percentage of the reading to express a multimeter's accuracy, for example ±(0.5% + 2 counts).
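As an illustration (the spec and range values below are assumed examples), a specification of ±(0.5% of reading + 2 counts) can be converted into an absolute error bound:

```python
# Hypothetical sketch: error bound for an assumed DMM spec of
# +/-(0.5% of reading + 2 counts).

def error_bound(reading, pct, counts, resolution):
    """Return the +/- uncertainty for a reading.

    pct        -- accuracy as a fraction of the reading (0.005 for 0.5%)
    counts     -- extra counts allowed by the spec
    resolution -- value of one count on the selected range
    """
    return reading * pct + counts * resolution

# 1.000 V measured on the 2 V range of a 2000-count meter (1 mV/count):
err = error_bound(1.000, 0.005, 2, 0.001)
print(f"+/-{err:.3f} V")  # 1.000*0.005 + 2*0.001 = +/-0.007 V
```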


Current shunt: A low-value resistor inside a digital multimeter used to measure current. The meter measures the voltage drop across the shunt and calculates the current using Ohm's law.
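A minimal sketch of the shunt principle, with an assumed 0.1 ohm shunt value: the measured voltage drop is converted to current via I = V / R.

```python
# Assumed example: a 100 milliohm shunt resistor.
SHUNT_OHMS = 0.1

def shunt_current(v_drop, r_shunt=SHUNT_OHMS):
    """Convert the voltage drop across the shunt to current (Ohm's law)."""
    return v_drop / r_shunt

# A 25 mV drop across the 0.1 ohm shunt implies 250 mA.
print(f"{shunt_current(0.025):.3f} A")  # 0.250 A
```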


Digital Multimeter (DMM): An instrument that displays the value of the measured signal in digital form. Digital meters generally offer higher accuracy, resolution, and reliability than analog meters.


Non-sinusoidal waveform: Any waveform other than a pure sine wave, such as pulse trains, square waves, triangular waves, sawtooth waves, and spikes.


Resolution: The smallest change in the measured quantity that the meter can detect and display.
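Resolution follows directly from the count rating and the selected range. The sketch below uses assumed example values (a 2000-count meter):

```python
# Sketch with assumed values: resolution is the value of one count
# on the selected range.

def resolution(full_scale, counts):
    return full_scale / counts

print(resolution(2.0, 2000))    # 0.001 V = 1 mV on the 2 V range
print(resolution(200.0, 2000))  # 0.1 V on the 200 V range
```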


RMS (root mean square): The value of an AC signal that delivers the same power to a load as a DC signal of equal value.
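The DC-equivalence idea can be checked numerically. In this sketch (peak voltage, resistance, and sample count are assumed values), a 10 V peak sine delivers the same average power into a 100 ohm resistor as a DC source equal to its RMS value, peak / sqrt(2) ≈ 7.07 V:

```python
import math

# Assumed values: 10 V peak sine into a 100 ohm resistor,
# sampled over one full period.
N, PEAK, R = 10_000, 10.0, 100.0
samples = [PEAK * math.sin(2 * math.pi * k / N) for k in range(N)]

avg_ac_power = sum(v * v / R for v in samples) / N  # ~0.5 W
dc_equiv = math.sqrt(avg_ac_power * R)              # ~7.071 V

print(dc_equiv, PEAK / math.sqrt(2))  # both ~7.071
```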


Standard sine wave: A signal that varies sinusoidally without distortion.


True RMS: A digital multimeter that measures the true effective (RMS) value of both sinusoidal and non-sinusoidal signals.

 
