It's often quite difficult to tune DNP3 event deviation settings so that they provide suitable resolution during normal operation while also keeping network utilisation acceptably low during busy periods, when the data is changing more rapidly. What I believe would be incredibly useful in this regard is a Maximum Event Frequency (in events per minute) or a Minimum Time Between Events setting against each object, in the Basic settings for that object.

This could possibly also be combined with some form of 'swinging door' compression, as is commonly used in historian systems, so that we specify not the absolute change in value from the last sample, but the allowable deviation from a straight-line interpolation of the signal. The idea is that analog values are almost always linearly interpolated: a value of 0.0 at t=0 s and 11.0 at t=10 s results in a straight line from 0.0 to 11.0 over 10 seconds. If the outstation sends these data points to the master straight away, that's fine and we're done. However, if the RTU is buffering them for a little while and we generate another data point of 20.0 at t=20 s, then we may be able to throw out the t=10 s point, provided we're not interested in the fact that it differs by 1.0 from the straight-line interpolation between the t=0 s and t=20 s data points.

The swinging door concept would also help to stop stepped values turning into long ramps, which is often a problem with DNP3: at a step change, swinging door compression would typically retain both a sample immediately before the step and one immediately after it.
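To make the idea concrete, here is a minimal sketch of swinging-door compression over a buffer of (time, value) samples. This is an illustrative implementation of the general technique, not anything specific to a DNP3 product; the function name, the inclusive tolerance check, and the choice to always keep the first and last samples are my own assumptions. It keeps a point only when dropping it would let the straight-line interpolation between retained points drift more than `dev` away from some discarded sample.

```python
def swinging_door(points, dev):
    """Compress a list of (t, v) samples so that linear interpolation
    between the kept points stays within +/- dev of every dropped point.

    Maintains an upper and a lower 'door' slope from the last kept
    point; when the doors cross, the previous sample must be kept."""
    if len(points) <= 2:
        return list(points)
    kept = [points[0]]
    t0, v0 = points[0]            # last archived point
    up, low = float("inf"), float("-inf")
    prev = points[0]
    for t, v in points[1:]:
        # tighten the doors using the new point's tolerance band
        up = min(up, (v + dev - v0) / (t - t0))
        low = max(low, (v - dev - v0) / (t - t0))
        if low > up:
            # doors have closed: archive the previous point and restart
            kept.append(prev)
            t0, v0 = prev
            up = (v + dev - v0) / (t - t0)
            low = (v - dev - v0) / (t - t0)
        prev = (t, v)
    kept.append(points[-1])       # always keep the final sample
    return kept

# The buffered example from above: the t=10 s sample is only 1.0 off the
# 0->20 line, so with dev=1.0 it can be thrown out.
print(swinging_door([(0, 0.0), (10, 11.0), (20, 20.0)], dev=1.0))
# A step change: the samples just before and just after the step survive,
# so the step does not get smeared into a long ramp.
print(swinging_door([(0, 0.0), (10, 0.0), (11, 5.0), (20, 5.0)], dev=0.1))
```

Running this, the first call returns only the endpoints, while the second keeps all four points bracketing the step, which is exactly the behaviour that avoids the ramp artefact.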