User is looking for details regarding meter accuracy as specified in technical data sheets, expressed as %rdg + %FS.
The meaning of %rdg + %FS is % Reading + % Full Scale.
Here is a quick example of applying this method to calculate the potential error range for a current reading. Suppose:
* The meter under consideration is an ION 7350, whose datasheet specifies the accuracy of the current reading as 0.25% of reading + 0.05% of full scale, as can be seen in the excerpt from an old meter datasheet immediately below.
* The meter is wired to the electrical system through CTs with a ratio of 200:5
* The meter is reading a primary current value of 103.85 A
How would one calculate the expected error bound of this meter reading?
To make the calculation, we need a value for the full scale current. The specification of the meter's current inputs, found elsewhere in the datasheet, shows the following data:
Thus, the full scale value of the meter reading is 10 A at the input multiplied by the 200:5 CT ratio (a factor of 40), giving 400 A.
So now the calculation of the error is as follows:
potential reading error = 0.25% of reading + 0.05% of full scale = 0.25% of 103.85 A + 0.05% of 400 A = 0.0025 x 103.85 + 0.0005 x 400 = 0.259625 + 0.2 ≈ 0.46 A
So when the meter reads 103.85 Amperes, the potential error of that reading is +/- 0.46 A.
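The calculation above can be sketched as a small helper function. The function name and default percentages here are illustrative, taken from the ION 7350 figures used in this example:

```python
def reading_error(reading_a, full_scale_a, pct_rdg=0.25, pct_fs=0.05):
    """Potential error band (+/- A) for a '%rdg + %FS' accuracy spec."""
    return (pct_rdg / 100.0) * reading_a + (pct_fs / 100.0) * full_scale_a

# Full scale: 10 A meter input x 200:5 CT ratio = 400 A
ct_ratio = 200 / 5
full_scale = 10 * ct_ratio  # 400 A

error = reading_error(103.85, full_scale)
print(f"+/- {error:.2f} A")  # +/- 0.46 A
```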
NOTE: For the sake of simplicity, the above calculation excludes the potential error introduced into the measurement system by the instrument transformer. Naturally, the instrument transformer has an accuracy rating of its own, and its error would be added to that of the meter. Thus, if the instrument transformer's error rating were +/- 5% of reading, that error would be added to the value already calculated. In this case, a reading of 103.85 A would carry a CT error of +/- 5.19 A. The combined error for current readings of the entire measurement system (i.e. the combination of the CTs and the ION meter) would therefore be 0.46 + 5.19 = +/- 5.65 A.
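The combined system error described in the note can be sketched in the same way. This is a simple sum of the meter error and the CT error, as in the worked example; the function name and the 5% CT figure are illustrative assumptions, not values from the ION 7350 datasheet:

```python
def combined_error(reading_a, full_scale_a,
                   meter_pct_rdg, meter_pct_fs, ct_pct_rdg):
    """Total +/- error (A): meter '%rdg + %FS' error plus CT '%rdg' error."""
    meter_err = (meter_pct_rdg / 100.0) * reading_a \
              + (meter_pct_fs / 100.0) * full_scale_a
    ct_err = (ct_pct_rdg / 100.0) * reading_a
    return meter_err + ct_err

# 103.85 A reading, 400 A full scale, 0.25%rdg + 0.05%FS meter, 5%rdg CT
total = combined_error(103.85, 400.0, 0.25, 0.05, 5.0)
print(f"+/- {total:.2f} A")  # +/- 5.65 A
```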