
From Analytics to Answers, from Optimization to Operations

Analytics and optimization represent an evolution in the smart grid. Sensing, communication and proper numerical treatment of data are important to achieving results. Optimization and visualization bring actionable intelligence, while a strong underlying platform enables utility-wide benefits.

Veterans of the DistribuTech conferences have probably noticed the shift of emphasis from equipment to distribution automation, then to the smart grid and now to analytics and optimization.

Of course, power engineers have been doing analytics and optimization for decades. That is what they went to school for. However, let us review the main implementation issues, as they are manifesting themselves in radically new and rapidly changing technology environments:

  • obtaining real sensor data, in a timely manner, for the required calculations;
  • having the calculations completed fast enough for the process being analyzed, and available to everyone in the utility, with all data shared uniformly and easily among applications, so as to support, for example, an inquisitive person in the utility who has an idea worthy of exploration;
  • determining the impact of the various calculations on the essential mission of keeping the lights on;
  • deciding what is the best thing to do in the next five minutes, given what is known.

Let us look into each of these a bit more.

Obtaining the sensor data itself depends on having sensors installed that match the physical parameters one is interested in, and having adequate communications capability to get the information in time. That means fast enough for a calculation, decision and action that lets something be monitored or controlled correctly.

Information is valuable: not only does sensing and communicating it cost money, but its useful lifetime also extends well beyond the next five minutes of smooth operation of the grid. It takes on value in predictive analytics, and in creating test cases for validating simulations.

Therefore, always think of data as a time series, not as a disposable item in an instant of time. Storage of data has to be fast enough to keep up with the influx, it has to be efficient enough that storage is physically possible, and data must be stored in an easy-to-write, easy-to-read fashion. Data that is locked in a proprietary database is simply less useful than data stored with industry-standard access methods.
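As a concrete illustration, here is a minimal sketch, in Python with SQLite, of storing readings as timestamped rows rather than overwriting a single latest value; the table, column and sensor names are invented for illustration, not drawn from any particular utility data model.

```python
import sqlite3
from datetime import datetime, timezone

# A minimal sketch: store each reading as a timestamped row (a time series),
# not as a single overwritten "latest value". Names are illustrative only.
conn = sqlite3.connect("measurements.db")
conn.execute("""
    CREATE TABLE IF NOT EXISTS sensor_readings (
        sensor_id TEXT NOT NULL,
        ts_utc    TEXT NOT NULL,   -- ISO 8601 timestamp
        value     REAL,
        quality   TEXT,            -- e.g. 'good', 'suspect', 'missing'
        PRIMARY KEY (sensor_id, ts_utc)
    )
""")

def record_reading(sensor_id, value, quality="good"):
    """Append a reading; history is preserved for later analytics."""
    conn.execute(
        "INSERT OR REPLACE INTO sensor_readings VALUES (?, ?, ?, ?)",
        (sensor_id, datetime.now(timezone.utc).isoformat(), value, quality),
    )
    conn.commit()

record_reading("feeder_12/voltage_phase_a", 7214.5)
```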

Some analytics platforms have accelerators for connectivity, time-series and geospatial analytics and related queries. As a utility gets more sophisticated in its analytics, best-in-class algorithms are sometimes repurposed from unexpected industries, such as finance or electronic design automation, and such platforms may offer the most diverse analytics capabilities, such as machine learning, optimization, spatial, graph and time-series engines.

Even within the electric power industry, advanced analytics includes multiple industry-domain applications on the same platform instantiation. This is sometimes implemented as a tightly integrated stack with a single support organization.

Having calculations done in time is complicated because calculations may use a mathematical method that involves iterations, so the compute time varies, often depending on the current set of data and the prior set for the determination of derivatives. In the real world, something can go wrong in sensing or communications, so that not all of the sensor data are present at the same time. Since we are modeling physical systems here, time synchronization of the data set is important if the calculations (or analytics or math) are to reflect a slice in time so that the mathematical model can be applied.
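A small sketch of that time-synchronization step, using pandas to align two asynchronously reported measurements onto a common time base; the sensor names, timestamps and the two-second tolerance are assumptions made up for illustration.

```python
import pandas as pd

# Align asynchronously reported voltage and current samples onto one time base
# so a physical model sees a consistent "slice in time".
voltage = pd.DataFrame({
    "ts": pd.to_datetime(["2024-05-01 12:00:01", "2024-05-01 12:00:05",
                          "2024-05-01 12:00:09"]),
    "voltage_kv": [7.21, 7.19, 7.22],
})
current = pd.DataFrame({
    "ts": pd.to_datetime(["2024-05-01 12:00:02", "2024-05-01 12:00:06",
                          "2024-05-01 12:00:10"]),
    "current_a": [101.0, 99.5, 102.3],
})

# Match each voltage sample with the nearest current sample within 2 seconds;
# anything farther apart is left as NaN rather than silently reused.
aligned = pd.merge_asof(voltage.sort_values("ts"), current.sort_values("ts"),
                        on="ts", direction="nearest",
                        tolerance=pd.Timedelta("2s"))
print(aligned)
```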

Sometimes data is just missing, and knowing how the analytics react to this is important to getting good results. Does missing data become a zero, the last obtainable value, the average of the last few readable values or a virtual reading based on other measurable signals? The essential message here is that good analytics propagate data quality properly, and that an analytic that says “I don’t know right now” may be better than one that always has an output.
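The choices listed above can be made explicit in code. The sketch below, with invented names and policies, shows one way to return a quality flag alongside each estimated value so that downstream analytics can propagate data quality rather than hide the gap.

```python
import math

# Handle a missing reading: substitute zero, hold the last value, average
# recent values, or admit "I don't know". The quality flag travels with the value.
def estimate(reading, history, policy="hold_last"):
    """Return (value, quality) for a possibly missing reading."""
    if reading is not None:
        return reading, "measured"
    if policy == "zero":
        return 0.0, "substituted"
    if policy == "hold_last" and history:
        return history[-1], "held"
    if policy == "average" and history:
        recent = history[-3:]
        return sum(recent) / len(recent), "estimated"
    return math.nan, "unknown"          # better than pretending to know

print(estimate(None, [7.21, 7.19, 7.22], policy="average"))
print(estimate(None, [], policy="hold_last"))   # -> (nan, 'unknown')
```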

Sharing analytic output allows engineers to build on each other’s skills, by using one analytic output as input to another. It also helps discover if the same calculation is being done in several places, possibly with different algorithms and therefore inconsistent results. The more computations done in an analytics and optimization platform that spans the operational side of the utility, the easier it is for the whole utility to become more intelligent. (This is often colloquially phrased as, “If only the utility knew what the utility knows, things could be improved faster.”)

Sharing data is just as important, especially when major applications in a utility may call the same asset by different names. Data sharing therefore requires storing not only values, but the time the value was last updated, aliases in other systems, validity indication for the current period, and the party responsible for that data, should it be questionable. If the data follow some modern, standardized format, such as the IEC Common Information Model, or at least some well-established data model within the utility, then the ultimate usefulness and longevity of the data are enhanced.
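A minimal sketch of that bookkeeping, with hypothetical field and asset names; the structure is loosely inspired by, but not taken from, the IEC Common Information Model.

```python
from dataclasses import dataclass, field
from datetime import datetime

# A shared value carries its update time, aliases in other systems, a validity
# flag and a responsible party. Field and asset names are illustrative only.
@dataclass
class SharedMeasurement:
    canonical_name: str                      # the agreed, model-based name
    value: float
    updated_at: datetime
    valid: bool = True
    responsible_party: str = ""
    aliases: dict = field(default_factory=dict)  # system -> local name

xfmr_load = SharedMeasurement(
    canonical_name="Substation7/Transformer2/LoadMVA",
    value=14.2,
    updated_at=datetime(2024, 5, 1, 12, 0, 5),
    responsible_party="SCADA team",
    aliases={"OMS": "SUB7-T2", "GIS": "TX_000482"},
)
```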

Comprehensive data models have support for connectivity, geospatial, time series, equipment attributes and relational access. All of this is another characteristic of the new generation of analytics and optimization platforms offered to the utility industry today. Data sharing also prompts innovation, tinkering and what-if scenario evaluation, all aspects of a well-functioning intelligent utility. Documented, structured access to data makes it easier to pull it into an ad hoc Excel spreadsheet, MATLAB, LabVIEW or another tool to check out ideas or suspected correlations, and in general to gain more return on investment from the analytics platform. Such calculations can then be the proving ground for feedback to the next update of enterprise-scale analytics applications.

While the term engineer has been favored here, there is no reason why financial planners, capital investment strategists and others cannot bring in enterprise resource planning data and explore how it fits with operational data. Do failures in substations increase with decreased maintenance budgets? Do fault patterns suggest a better way to apply next year’s capital budget? Does call center volume decrease after distribution system improvements?
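As a toy illustration of the first of those questions, the snippet below joins invented budget and failure figures and checks a suspected correlation; real numbers would come from the shared ERP and operational data stores.

```python
import pandas as pd

# Do substation failures rise when maintenance budgets fall? Invented data,
# used only to show how easily joined data can be screened for a correlation.
data = pd.DataFrame({
    "year":                 [2019, 2020, 2021, 2022, 2023],
    "maintenance_budget_k": [950,  900,  820,  780,  760],
    "substation_failures":  [12,   13,   17,   19,   22],
})

print(data["maintenance_budget_k"].corr(data["substation_failures"]))
# A strongly negative coefficient is not causation, but it can justify
# a more careful engineering and financial study.
```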

One further consideration is that data today are not necessarily only direct numerical operational values. Technologies exist to understand written language from maintenance reports. Weather, for example, is an essential input to utility operations. Originally used primarily for load forecasting, weather data can now be fed into sophisticated models that make highly accurate, three-day-ahead predictions in 4D over surface areas down to 2 km². These can be used for damage prediction, outage preplanning and repair optimization, including foreign crew prepositioning. Not least, weather data are fed into sophisticated models for wind and solar generation output forecasting.

Whether the user is an engineer or a cost analyst, the increasing amount of data makes it harder for people to discover patterns that might lead to an improvement or to a new analytics application. This is where visualization becomes so important. Good visualization not only helps represent the output of analytics better; it can be the means of discovering that a new analytic might be needed, or at least serve as an experimental tool.
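As a simple sketch of visualization as a discovery tool, the following plots a synthetic feeder-load series against its rolling daily mean so that an unusual excursion stands out; in practice the series would be pulled from the shared analytics platform.

```python
import numpy as np
import matplotlib.pyplot as plt

# Synthetic two-week hourly feeder load with one injected anomaly; plotting the
# raw series against a 24-hour rolling mean makes the excursion easy to spot.
rng = np.random.default_rng(0)
hours = np.arange(14 * 24)
load = 40 + 10 * np.sin(2 * np.pi * hours / 24) + rng.normal(0, 2, hours.size)
load[200:206] += 25                              # an anomaly worth noticing

daily_avg = np.convolve(load, np.ones(24) / 24, mode="same")

plt.plot(hours, load, label="feeder load (MW)", alpha=0.6)
plt.plot(hours, daily_avg, label="24 h rolling mean", linewidth=2)
plt.xlabel("hour")
plt.ylabel("MW")
plt.legend()
plt.show()
```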

There are excellent software programs available for becoming a data explorer. In fact, the advent of what computer science people call Big Data has given more prominence to a job category called ‘data scientist.’ Just as we mentioned as a criterion on the sensor input side, timeliness is important. The electric power system is not only very interconnected, but the things being connected are those speedy electrons. Therefore, the need is speed, not only in the front-end sensing but also in the back-end visualization. The volume and velocity of data mean a modern utility should think of very high resolution, large physical format displays for in-person collaboration, with platform visualization components that can display the data fast enough. High-performance in-memory analytics and data paths are essential not only to visualization, but to real-time reaction and decision-making.

In terms of operational excellence, optimization is often seen as the fuzziest part of the consideration. It is not that excellent mathematical solutions do not exist; rather, two issues often arise. One is that optimization algorithms in the industry may be limited to methods taught to undergraduate electrical engineering majors. While a utility engineer may be master of his or her power domain, the latest mathematical techniques may not be readily at hand. Having an industrial mathematician or optimization expert work with your utility team can bring beneficial results not even considered in the project’s initial return-on-investment estimate.

The second issue is what to optimize, and that often comes down to who is paying for the optimization project. At a generating plant, optimization may mean meeting electrical demand at the lowest fuel cost and least plant component stress. For the transmission group, it could be full and safe allocation of power flow capacity. Within a distribution organization, the System Average Interruption Duration Index (SAIDI) and continuous, uninterrupted power with high power quality are typical optimization targets. It is interesting how different sponsoring groups in a utility could potentially contract for different optimization software developments, which, when all used together, may be at cross-purposes. This is another reason we see the rise of platforms that have good paradigms for interconnecting their analytics and optimization applications: they result in utility-wide awareness of what is being computed, as well as easier ways to manage the processing of the data for enterprise-wide benefit.
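To make the generating-plant example concrete, here is a toy economic-dispatch sketch using linear programming; the fuel costs, unit limits and demand figure are invented, and a real dispatch problem would carry far more constraints.

```python
from scipy.optimize import linprog

# Toy economic dispatch: meet a fixed demand at lowest fuel cost,
# subject to each unit's output limits. All numbers are illustrative.
fuel_cost = [32.0, 45.0, 60.0]        # $/MWh for units 1..3
p_min     = [50.0, 20.0,  0.0]        # MW lower limits
p_max     = [200.0, 150.0, 100.0]     # MW upper limits
demand    = 310.0                     # MW to be served

result = linprog(
    c=fuel_cost,                               # minimize total fuel cost
    A_eq=[[1.0, 1.0, 1.0]], b_eq=[demand],     # generation must equal demand
    bounds=list(zip(p_min, p_max)),
    method="highs",
)
print(result.x, result.fun)   # dispatch per unit, total hourly cost
```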

Contributor

  • Jeff Katz is the Chief Technology Officer of the Energy and Utilities industry at IBM. He has contributed to the industry’s framework, Solution Architecture For Energy (SAFE), the IBM Innovation Jam workshops and the IBM Intelligent Utility Network initiative, and he is the primary industry liaison with IBM Research.

