Ohmmeter
An ohmmeter is an electrical instrument that measures the resistance of a conductor, circuit component, or complete circuit, typically displaying the result in ohms (Ω).[1][2] It applies a test signal, either a known voltage or a known current, to the unknown resistance and computes the value from Ohm's law (R = V/I), making it useful for diagnosing continuity, shorts, opens, and component integrity in electrical and electronic systems.[1][3]

The first practical portable ohmmeter, an insulation tester for high-resistance measurements, was invented in 1889 by the British engineer Sydney Evershed and marketed as the "Megger"; it transformed electrical installation work by allowing on-site verification of wiring integrity without complex laboratory setups.[4] Earlier resistance-measurement techniques, such as the Wheatstone bridge developed in 1833, laid the foundational principles but required manual balancing rather than giving a direct reading.[5] By the early 20th century, ohmmeter functions were being incorporated into multimeters, with analog versions using moving-coil mechanisms and modern digital variants using operational amplifiers for precision.[6]

Ohmmeters operate on several principles, including the simple ohmmeter (a battery-powered series circuit with a meter), ratio methods (comparing the unknown resistance to a standard via a voltage divider), and null-type bridges such as the Wheatstone bridge for high accuracy.[5][6] Common types include:

- Series-type ohmmeters: Basic analog devices with an internal battery and an adjustable shunt for full-scale calibration, suitable for general-purpose resistance checks up to megohms.[5]
- Micro-ohmmeters: Specialized for low resistances (down to micro-ohms) using four-wire Kelvin sensing to eliminate lead resistance errors.[1]
- Electrometer ohmmeters: For ultra-high resistances (up to 10¹⁶ Ω), employing constant-current sources and sensitive voltmeters in low-noise environments.[1]
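The measurement principle described above, along with the lead-resistance error that four-wire Kelvin sensing eliminates, can be illustrated with a short sketch. This is a simplified numerical model with illustrative values, not the firmware of any real instrument; the function names and the example lead resistance are assumptions for demonstration only.

```python
def resistance_from_vi(voltage_v: float, current_a: float) -> float:
    """Ohm's law: R = V / I, the core calculation of any ohmmeter."""
    if current_a == 0:
        raise ValueError("test current must be nonzero (open circuit?)")
    return voltage_v / current_a

def two_wire_reading(r_unknown: float, r_leads: float, i_test: float) -> float:
    """Two-wire measurement: the test current flows through the leads,
    so their resistance adds to the reported value."""
    v_measured = i_test * (r_unknown + r_leads)
    return resistance_from_vi(v_measured, i_test)

def four_wire_reading(r_unknown: float, i_test: float) -> float:
    """Four-wire (Kelvin) measurement: separate sense leads carry
    negligible current, so the voltmeter sees only the unknown."""
    v_sense = i_test * r_unknown
    return resistance_from_vi(v_sense, i_test)

# Illustrative low-resistance case: 10 mΩ unknown, 50 mΩ of lead
# resistance, 1 A test current (values assumed for the example).
print(two_wire_reading(0.010, 0.050, 1.0))   # lead error dominates
print(four_wire_reading(0.010, 1.0))         # lead error eliminated
```

With a milliohm-level unknown, the two-wire result is several times the true value while the four-wire result is exact, which is why micro-ohmmeters rely on Kelvin sensing.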