A voltmeter is a device placed in parallel with a circuit element to measure the voltage drop across that element. As it turns out, it is difficult (impossible?) to measure a voltage drop directly, so a voltmeter typically works by measuring a current and then converting it to a potential difference. At least before the days of digital circuitry, a voltmeter was basically an ammeter (which measures current) in series with a very large resistance R0.
Suppose you use this kind of voltmeter to measure the voltage across a resistor R in a circuit. What are the limitations on R if you want to get a reading that is accurate to within 1%?
The resistor should satisfy R ≤ R0/99. Here is why: attaching the voltmeter across R changes that branch's resistance from R to the parallel combination R·R0/(R + R0). In the worst case, where the rest of the circuit holds the branch current essentially fixed, the meter therefore reads a fraction R0/(R + R0) of the true voltage. Demanding this be at least 0.99 gives R0 ≥ 0.99·(R + R0), i.e. 0.01·R0 ≥ 0.99·R, so R ≤ R0/99.
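As a quick numerical sanity check of that bound (a sketch of my own, not part of the original answer; the value chosen for R0 is just illustrative):

# Worst-case model: fixed branch current, so the fractional reading
# is R0/(R + R0) once the voltmeter (resistance R0) is placed across R.

R0 = 1.0e6                      # assumed voltmeter internal resistance, in ohms

R = R0 / 99                     # the boundary case from the derivation above
ratio = R0 / (R + R0)           # measured voltage / true voltage
print(f"R = R0/99  ->  reading is {ratio:.4%} of true value")   # exactly 99.0000%

R = R0 / 50                     # a larger R pushes the error past 1%
print(f"R = R0/50  ->  reading is {R0 / (R + R0):.4%} of true value")  # about 98.04%

Running this confirms that R = R0/99 sits exactly at the 1% error boundary, and that any larger R makes the reading worse.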