I suspect the CT may not have been chosen correctly, but I'm not sure which CT parameter is linked to its offset.
Can anyone share their experience?
This is very much device specific and a function of the dynamic range of the device. It depends on the front-end analogue design, the A/D converter used and, of course, the firmware/software.
It is common in firmware to treat sampled readings below a certain threshold as noise and set the reading to zero. While there may be engineering and test data with this information, it is often not published in the public specification.
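As a rough sketch of that firmware behavior (the 10 mA floor below is purely an assumed value; real devices document their own thresholds, if at all):

```python
# Hypothetical noise floor; the actual threshold is device specific
# and usually not published.
NOISE_FLOOR_A = 0.010  # assumed 10 mA

def filter_reading(i_sec_a):
    """Zero out secondary-side readings below the assumed noise floor."""
    return 0.0 if i_sec_a < NOISE_FLOOR_A else i_sec_a

print(filter_reading(0.005))  # below the floor -> reported as 0.0
print(filter_reading(0.500))  # above the floor -> passed through
```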
If a CT has been chosen with too large a primary rating, then at the lower end there is not enough current on the secondary side to get a reading. For example, if the CT were 10000:5 but the actual bus current was only 10 A, the current signal at the device would be 5 mA, which might be too low for the device to read.
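The arithmetic behind that example can be sketched for an ideal CT (no burden or magnetizing effects considered):

```python
def ct_secondary_current(i_primary_a, ratio_primary, ratio_secondary):
    """Ideal CT secondary current: I_sec = I_pri * (sec / pri)."""
    return i_primary_a * ratio_secondary / ratio_primary

# 10000:5 CT with only 10 A of actual bus current:
i_sec = ct_secondary_current(10.0, 10000, 5)
print(i_sec)  # 0.005 A, i.e. 5 mA at the device
```

With a better-matched CT, say 100:5, the same 10 A bus current would deliver 0.5 A to the device, comfortably within a typical 5 A input's range.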
Finally, I have seen installations where the CT shorting mechanism on the metering terminal block (required for safety and maintenance) has been left in place. This, of course, means little or no current reaches the device.