Transposing the release notes from the Word format into Exchange is extremely time-consuming (can you believe that you can't paste images into this forum? Yikes.), so I'm going to change things up a bit moving forward. I've attached the notes as released to this post, and this post will copy the text content plus the added commentary in italics to give you the detail that we can't fit into the release notes themselves.
Manual Meter Readings Upload
When given the correct permissions, users will now see a new menu option under the Admin > Performance Analytics section called “Manual Meter Readings”. This will launch an Excel-based template that users can upload back into RA to insert manual meter readings into the system. These readings are stored as singular data points in PAM with all of the consumption grouped onto a 12:00:00 timestamp on the date of the meter reading.
This feature will pair well with the upcoming RA Mobile App and its Manual Meter Reading feature, which will work against the same Manually Read Meters that this feature focuses on.
Even though the meter readings assign all of the interval consumption to one timestamp, it's possible to use Calculated Measurements to cause the consumption to be disaggregated to each day between readings. If that's something you want to do for a client please log a ticket and we can walk you through the setup.
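To make the disaggregation idea concrete, here is a minimal sketch of what a Calculated Measurement could do: spread the consumption between two manual readings evenly across the days in between. The function name and data shapes are hypothetical, not RA's actual implementation.

```python
from datetime import date, timedelta

def disaggregate_between_readings(prev_date: date, prev_total: float,
                                  curr_date: date, curr_total: float):
    """Spread the consumption between two cumulative manual readings
    evenly across each day in the interval. Illustrative only; the
    actual Calculated Measurement setup is done by the PAM team."""
    days = (curr_date - prev_date).days
    if days <= 0:
        return []
    daily = (curr_total - prev_total) / days
    # One data point per day, ending on the date of the new reading.
    return [(prev_date + timedelta(days=i + 1), daily) for i in range(days)]
```

For example, readings of 100 on Jan 1 and 140 on Jan 5 would become four daily values of 10 each, rather than a single 40-unit point on the reading date.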
Toggle Between Raw and Corrected Data
There is now a button in the UX to toggle between the Raw and Corrected modes of the data in any PAM chart. This was previously available as a menu option but required many clicks to reach it. The toggle icon also shows the user the current state of the data: a smooth curve for corrected data and a more jagged curve for raw data.
The mouse-over text clears up any confusion that people may have about the current state and the purpose of this icon.
We're hoping you like this as much as we do - it's extremely helpful when you are trying to understand what the DQ engine is doing to the raw data. Toggling between raw and corrected is now so quick that you can easily visualize how the curves change.
Select Flags for DQ Overlay
The option to show/hide DQ flags in PAM has been expanded, so that users can now select which kind of flags they want to see. Sometimes a data set may have numerous Notes on it, but the user only wants to see where Estimates exist. This improved filtering makes the DQ process even more transparent to users. The filtering options are: Suspect, Estimate, Data Error, Note, Edited, Locked.
There are more and more Notes being logged against the data as we build up the DQ capabilities, and it was becoming overwhelming to see so many generic DQ flags on the data. Now that you can filter, you can choose to ONLY see Data Errors, for example, to help you diagnose issues more readily.
Gap Filling “Look Back” Increased to 8 Weeks
When a gap in the data occurs at the beginning of a PAM chart, the system will now look back as far as 8 weeks prior to the chart’s starting date to find data it can use to initiate the gap filling features. This will help resolve issues where a user happens to be looking at the tail end of a long gap. Previously the chart wouldn’t fill the gap if it was longer than 14 days.
This was causing some problems for clients who had a lot of long gaps but still wanted to use the average profile fill to estimate those gaps. They were frequently looking at charts that started DURING the gap, and the estimations weren't being triggered because the last good value was more than 14 days prior. We picked 8 weeks as a fairly arbitrary number here. Let us know if you have feedback about that.
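The look-back behavior can be sketched as a simple backwards search from the chart's start date. The function and data structure below are illustrative assumptions, not the real gap-filling code; the 8-week window is the only detail taken from the release notes.

```python
from datetime import date, timedelta

LOOK_BACK = timedelta(weeks=8)  # previously the limit was 14 days

def find_seed_value(chart_start: date, readings: dict):
    """Walk backwards from the chart's start date, up to 8 weeks,
    looking for the last good value to seed gap filling.
    `readings` maps dates to known-good values (hypothetical shape)."""
    earliest = chart_start - LOOK_BACK
    day = chart_start
    while day >= earliest:
        if day in readings:
            return day, readings[day]
        day -= timedelta(days=1)
    return None  # gap is longer than the look-back window
```

Under the old 14-day limit, a chart starting 50 days into a gap would have found nothing; with the 8-week window the last good value is located and estimation can proceed.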
Scheduled Reports Have Date/Time Added
When a scheduled report is generated in Excel format, the file name has the date and time appended to the end. This helps to differentiate the files when users save them to a common location.
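The naming scheme amounts to a simple stamp on the base file name. RA's exact timestamp format isn't documented in the notes, so the format string below is an assumption for illustration.

```python
from datetime import datetime

def report_filename(base, now=None):
    """Append a date/time stamp to a scheduled report's file name so
    repeated runs saved to one folder stay distinct. The stamp format
    here is illustrative, not necessarily what RA emits."""
    now = now or datetime.now()
    return f"{base}_{now:%Y-%m-%d_%H%M%S}.xlsx"
```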
Some clients were having trouble differentiating between reports that they were filing away.
Data Stream Viewer Available from Excel Add-In
The Data Lobby has proven to be very valuable, with the Excel Add-In quickly becoming the dominant method of interacting with it. One feature available in Service Desk that people missed was the Stream Viewer. There are now links to launch the Stream Viewer embedded in the Excel sheet in a column called “Visualization Link” so that users can take advantage of this without having to enter the Data Lobby Service Desk interface.
While adding this feature to the Excel file, we also added links to “View File” in a column titled “Initial File” and “View Document” in a field called “Stream Document”.
There will be more work done in the future to move further away from the Service Desk interface for the Data Lobby. Perhaps you're reading this and didn't even know that there WAS a Service Desk interface for the Data Lobby? If that's the case, then congrats - you are our target user and we're trying to put more tools into your hands. The original Data Lobby used Service Desk exclusively for manipulating the data and its metadata. When we introduced the Excel Add-In for bulk stream edits, we realized that it was just a better overall experience and so we're trying to give that interface as much power as possible.
Over time we will move more and more of the existing EEM Admin Tool features into the RA web app (with Admin privileges) and into the Data Lobby Excel Add-In.
Create Manually Read Meters Workflow Improved
When creating a new Manually Read Meter (using Cumulative style readings) using the Excel Add-In, there is now an option to seed the meter with an initial reading, so that subsequent manual meter reads will immediately begin to produce interval consumption. This is optional, so when creating manually read meters it can be omitted if the latest reading is not available. When that reading is available, the newly created meters are much more functional for end users, who might otherwise be confused when their first meter reading produces no results in PAM.
This is just to streamline the process. You could have also just created the empty meter stream and then used the Manual Meter Read spreadsheet (see above) to insert the first values, but often when we are commissioning a system we'll have a list of meters the customer wants created and that list will contain the last known readings, so this should offer some efficiency.
Manually Read Meters Will Reject Negative Interval Values
The system will now automatically prevent manually read meters from registering negative interval values. This prevents a meter swapout operation from generating large negative spikes. This is achieved by setting the “minimum delta per hour” option of the Admin Tool to zero when the meter is created using the Excel Add-In. This setting is only applied on Manually Read Meters, and it is a one-time operation. The setting can later be changed if necessary.
We wanted a way to prevent Manually Read Meters from registering large negative interval consumption spikes when meter swapouts and rollovers occur. These meter types don't trigger the standard DQ tools that Connected Interval meters use, so if you log a large cumulative value one day and a much smaller one the next day (off the new meter) then PAM would normally calculate a negative interval consumption value. Now we're simply using an existing EEM function to prevent negative interval values from being inserted. In the swapout case, the system simply won't insert any interval consumption at all between the "old" value and the "new" value, which is what you want.
Note that if you have a Manually Read Meter that NEEDS to record negative interval consumption, you can file a ticket and ask the PAM admins to turn off this feature on that meter.
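The swapout behavior described above boils down to a guard on the delta between consecutive cumulative readings. This is a hypothetical sketch of that guard, not EEM's actual code; in EEM it is configured via the “minimum delta per hour” setting rather than written by hand.

```python
def interval_consumption(prev_reading, new_reading, min_delta=0.0):
    """Compute interval consumption between two cumulative readings,
    rejecting deltas below the configured minimum (zero by default,
    mirroring the 'minimum delta per hour' guard). Returns None when
    no interval value should be inserted, as in a meter swapout."""
    delta = new_reading - prev_reading
    if delta < min_delta:
        return None  # swapout/rollover: insert nothing for this interval
    return delta
```

So a swapout from a reading of 1000 (old meter) to 5 (new meter) inserts nothing, instead of a -995 spike.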
Calc Engine Uses Reading Interval Effective Dates
When a data stream changes from 30m to 15m intervals (or vice versa), we can account for this by making admin-level changes, and with this release the Calculation Engine that operates on such data streams will now respect the effective dates of those changes as well.
Fully managing all of the impacts of reading interval changes is technically challenging but we are working through them in priority fashion.
Custom Threshold Exceptions Leave Notes
When a data point is above/below the custom threshold that can be set for a source-measurement pair in the Admin Tool, the system will now insert a note that’s viewable in PAM to explain why the timestamp is missing its value. This is expected to help in troubleshooting such cases. Previously it was very difficult to determine the root cause of a missing value in a stream when this was the culprit.
The custom threshold exceptions (for example: don't allow interval values less than 0) are very useful on certain data streams (see the Manually Read Meters Reject Negative Intervals feature above) but when this function kicks in, it can be hard for the end user to understand what is going on. Now, when this feature prevents an interval value from being written, it will also insert a note on that timestamp for the user to view an explanation.
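The new behavior can be summarized as: when a threshold check blocks a write, record a note instead of silently dropping the value. The sketch below is an illustration of that flow; the names and the note-storage shape are assumptions.

```python
def apply_threshold(value, timestamp, minimum, maximum, notes):
    """If a value violates the configured threshold, skip the insert
    and record an explanatory note against the timestamp so that PAM
    users can see why the value is missing. Hypothetical sketch."""
    if value < minimum or value > maximum:
        notes[timestamp] = (
            f"Value {value} rejected: outside threshold "
            f"[{minimum}, {maximum}]"
        )
        return None  # nothing is written for this timestamp
    return value
```

Previously only the first branch's skip happened; the note insertion is what this release adds, so the missing timestamp explains itself.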
Improved Error Message: “Disaggregation required but not allowed”
When a user request causes the calculation engine to attempt an illegal data disaggregation operation (an issue we intend to improve in the near future), the system will now include the name of the data stream causing the issue in the error, by referring to its hierarchy location and measurement group name.
We expect to improve the logic in the Calc Engine to further reduce the occurrence of this error message in the near future. We understand that it can be very frustrating to see this repeatedly.
Enhancements to DQ Logic
The Data Quality tools have proven to be a great step forward for IDM users, but the rules can always be improved in scope and precision. With this release, the logic that evaluates whether a spike is a “catch up spike” has been improved in cases where the preceding data includes 0s and 1s. In addition, small variations (less than 1 of any unit) will no longer trigger the “catch up spike” rules, even when they are relatively large compared to the preceding values.
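To illustrate the two improvements, here is a toy version of a catch-up-spike test. This is not the DQ engine's actual rule; the ratio threshold and the baseline filtering are illustrative assumptions that only mirror the two behaviors described above (ignoring 0/1 noise in the baseline, and never flagging sub-unit variations).

```python
def looks_like_catch_up_spike(preceding, candidate, ratio=10.0, min_abs=1.0):
    """Toy heuristic: flag a value that is much larger than the
    preceding interval values. Drops 0/1 noise from the baseline and
    never flags variations smaller than 1 unit. Thresholds are
    illustrative, not the DQ engine's real parameters."""
    if candidate < min_abs:
        return False  # sub-unit variations never qualify
    baseline = [v for v in preceding if v > 1]  # ignore 0's and 1's
    if not baseline:
        return True  # nothing meaningful to compare against
    avg = sum(baseline) / len(baseline)
    return candidate > ratio * avg
```

The old behavior would compare a candidate against an average dragged down by 0s and 1s, so a value of 0.9 after a run of 0.01s could be flagged; with the sub-unit floor and the filtered baseline, neither failure mode fires.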
We are going to be making it much easier to acquire utility-sourced interval data for use in the IDM, as we focus on bringing the analytical power of interval data to the Supply and Sustainability domains of ESS. Our goal will be to get the utility-sourced interval data for as many of our Supply clients as possible, so that we can add value to their existing RA experiences.
In addition to this work, we are exploring a number of intriguing performance enhancements that new database technologies offer us. We’ll share achievements in performance through the RA newsletters as we realize them.