Answer:
200 V +/-5%.
Explanation:
This method of specifying accuracy seems odd to me. Since it is unclear, I will assume that you really mean +/-2% of full scale. I will also assume that you are speaking of an analog meter, because digital meters require both a percentage and a count to specify accuracy, which makes the real accuracy a bit more complicated.
You are using a 500 volt range. At +/-2% of full scale this means +/-10 volts. If your scale reading is 200 volts, you may assume that the "real" value is between 190 volts and 210 volts. If you want to express the error in terms of the reading, it would be 200 volts +/-5%.
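The arithmetic above can be sketched in a few lines of Python (variable names are mine, and the +/-2%-of-full-scale interpretation is the assumption stated above, not a given):

```python
# Sketch of the reasoning above, assuming the spec is +/-2% of full scale.
full_scale = 500.0    # volts: the range in use
accuracy = 0.02       # +/-2% of full scale (assumed interpretation)
reading = 200.0       # volts: the scale reading

absolute_error = accuracy * full_scale               # +/-10 V
low, high = reading - absolute_error, reading + absolute_error
percent_of_reading = absolute_error / reading * 100  # +/-5% of the reading

print(f"true value lies between {low} V and {high} V")
print(f"error as a percentage of the reading: +/-{percent_of_reading}%")
```

Note that the +/-10 V band is fixed by the range, so the percentage-of-reading error grows as the reading drops down the scale.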
Accuracy is expressed in different ways for convenience and marketing reasons. Here is a table to convert the various ways accuracy is expressed.
Answer:
- Absolute error [tex]= \pm 10\ V[/tex]
- [tex]\%[/tex] error of reading [tex]= \pm 5\%[/tex]
Explanation:
Voltmeter reading or measured value [tex]V_m = 200\ V[/tex]
Full scale of the voltmeter [tex]V_{fs} = 500\ V[/tex]
Therefore,
Absolute error = True value - measured value
where
True value [tex]V_t = V_m \pm \text{error}[/tex]
A) Absolute error [tex]= GAE \times V_{fs}[/tex]
where GAE is the guaranteed accuracy error:
[tex]GAE = \frac{100-\text{accuracy}}{100}\\\\GAE = \frac{100-98}{100}\\\\GAE = 0.02 \text{ or } 2\%[/tex]
Therefore,
[tex]\text{Absolute error} = 2\% \times V_{fs}\\\\\text{Absolute error} = \frac{2}{100} \times 500\\\\\text{Absolute error} = \pm 10\ V[/tex]
B) [tex]\%[/tex] error of reading in A = limiting error
[tex]= \frac{\text{Absolute error}}{V_m} \times 100\\\\= \frac{10}{200} \times 100\\\\= \pm 5\%[/tex]
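The two-step derivation above can be checked with a short Python sketch (the symbol names follow the derivation; everything else here is illustrative, not from the source):

```python
# Sketch of the derivation above:
#   GAE = (100 - accuracy%) / 100
#   absolute error = GAE * full-scale voltage
#   limiting error = absolute error as a % of the reading
accuracy_percent = 98.0   # stated meter accuracy
v_fs = 500.0              # full-scale voltage, V
v_m = 200.0               # measured value, V

gae = (100.0 - accuracy_percent) / 100.0       # 0.02, i.e. 2%
absolute_error = gae * v_fs                    # +/-10 V
limiting_error = absolute_error / v_m * 100.0  # +/-5% of the reading

print(f"GAE = {gae}")
print(f"absolute error = +/-{absolute_error} V")
print(f"limiting error = +/-{limiting_error}% of reading")
```

This reproduces both answers: +/-10 V in absolute terms, +/-5% expressed relative to the 200 V reading.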
For more information on this, visit
https://brainly.com/question/23379286