An ohmmeter actually reads current, not resistance, and does so by applying voltage
ID: 1535372 • Letter: A
Question
An ohmmeter actually reads current, not resistance, and does so by applying voltage across the leads (this is why you must be careful not to kill an ohmmeter by applying an external voltage to it, across its own power supply). Because there is a minimum amount of current the meter can detect, it increases the applied voltage for the higher resistance scales. If the meter is designed to read about 0.005 A of current, what voltage is required for the 20 kΩ scale? For the 200 MΩ scale? Which scale should you use when trying to figure out the polarity of a diode?
Explanation / Answer
1) With I = 0.005 A and R = 20 kΩ = 20,000 Ω, the required voltage is V = IR = 0.005 × 20,000 = 100 V.
2) With R = 200 MΩ = 200,000,000 Ω, the required voltage is V = IR = 0.005 × 200,000,000 = 1,000,000 V.
3) The minimum resistance across the diode is about 100,000 Ω and the maximum about 1,000,000 Ω.
So the voltage required will range from 100,000 × 0.005 = 500 V to 1,000,000 × 0.005 = 5000 V; you should therefore use a scale in the 100 kΩ to 1 MΩ range, whose test voltage is high enough to drive detectable current through the diode.
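The arithmetic above is just Ohm's law applied at the meter's minimum detectable current. A minimal sketch (the 0.005 A figure is taken from the problem statement; the helper name `required_voltage` is my own):

```python
# Required ohmmeter test voltage: V = I * R, where I is the
# minimum current the meter can detect (0.005 A per the problem).
I_MIN = 0.005  # amperes

def required_voltage(scale_ohms: float) -> float:
    """Voltage needed to drive I_MIN through a resistance of scale_ohms."""
    return I_MIN * scale_ohms

print(required_voltage(20e3))   # part 1: 20 kΩ scale  -> 100 V
print(required_voltage(200e6))  # part 2: 200 MΩ scale -> 1,000,000 V

# part 3: voltage range over the diode's 100 kΩ – 1 MΩ resistance range
print(required_voltage(100e3), required_voltage(1e6))  # 500 V, 5000 V
```

Changing `I_MIN` lets you repeat the calculation for a meter with a different sensitivity.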