The Case for a Distributed Communications Architecture

By Donald Pollock

To date, for both technical and economic reasons, the distribution grid has given utilities the least visibility into their operations. Distributed intelligence allows utilities to operate more efficiently, in effect “delegating” decisions that can be made autonomously in the field rather than backhauling volumes of data to make a simple decision and issue a command accordingly.

Early smart grid adopters have identified challenges and gained significant insights into best practices that can benefit utilities looking to deploy smart grid systems today. They have learned that aligning business, regulatory and functional requirements across departments and operating units can produce consensus and inform a utility-wide roadmap for deploying common technologies to achieve grid modernization goals. To fully leverage new investments, that roadmap should be developed before embarking on deployment, so operating resources have the flexibility to incorporate future applications and end-point types as they emerge.

That approach goes against the traditional model for grid modernization, in which disparate systems are deployed and generally managed independently of one another. Data generated within the various systems are typically collected, transmitted to and analyzed at a centralized location, using an array of communications technologies, each adopted with little regard for the next.

Learning from best practices suggests a better approach to grid modernization, one that emphasizes a system-wide common architecture, capable of pushing data collection, analysis and application out to the edge of the utility network while leveraging multiple communications technologies. This approach can maximize value by reducing the cost of implementation, communications and operations; delivering network visibility and control; providing for new applications and technology through a flexible foundation; and incorporating and extending the value of legacy assets.

One of the major challenges in building a smarter grid is that the more intelligent system may contain thousands of communicating devices. Enabling those devices produces an exponential increase in the volume of new data. That data will help a utility understand its distribution grid operations at a granular level and enable more targeted operational decisions, ultimately leading to more efficient operation of the distribution grid; managing the data flow and its analysis, however, is no small job.

What is more, the practical obstacles of bandwidth limitations and cost have constrained the types and volume of data returned, inhibiting a granular understanding of distribution operations. Handling device implementation and operation in isolation, rather than in the context of other device operations, has further restricted the overall value of the data returned from any one device. More efficient use of each device, and a deeper understanding of the data, require a method that not only overcomes the bandwidth cost challenge but also leverages data from disparate end points collectively.

Distributed intelligence architecture—a method for gathering, aggregating, and analyzing data from these different end points at the edges of the network—can meet these needs. A distributed architecture requires the ability, at the edge of the network, to communicate with smart devices, extract data, analyze it and prompt action.

No single communications technology can connect every end-point device both operationally and economically, so several may be needed. Leveraging multiple communication options in parallel, and managing them at a single point such as a node, enables all grid device data to be collected and provides the depth of analysis required to reach the next level of smart grid value. For the node, local access to the many streams of available data is crucial to realizing the full potential of analysis at the distributed level.
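To make the idea concrete, the pattern described above can be sketched in a few lines of code. This is an illustrative sketch only, not a vendor API: the `Channel` and `Node` names, the device identifiers, and the readings are all invented for the example. The point is the shape of the design, in which one node polls devices over whichever communication option each uses and holds the combined data locally for edge analysis.

```python
# Illustrative sketch (hypothetical names, not a real product API): a node
# that manages several communication channels in parallel and keeps the
# collected readings locally, so analytics can run on the combined stream
# at the edge rather than after backhaul.

class Channel:
    """One communication option (e.g. power-line carrier, cellular, RF mesh)."""
    def __init__(self, name, devices):
        self.name = name
        self.devices = devices  # device_id -> latest reading (stubbed here)

    def poll(self):
        # In a real system this would transact with field hardware over
        # the channel's own protocol; here it just returns the stub data.
        return dict(self.devices)

class Node:
    """Single management point with local access to all channel data."""
    def __init__(self, channels):
        self.channels = channels
        self.local_store = {}  # device_id -> reading, held at the edge

    def collect(self):
        # Gather from every channel into one local view of the grid segment.
        for channel in self.channels:
            self.local_store.update(channel.poll())
        return self.local_store

# Two channels, three devices; all data ends up locally at one node.
plc = Channel("plc", {"xfmr-7": 241.2})
mesh = Channel("rf-mesh", {"meter-12": 119.9, "meter-13": 120.3})
node = Node([plc, mesh])
readings = node.collect()
```

Because the node sees every device's data regardless of which channel carried it, analysis can correlate readings across end points, which is the collective leverage the text describes.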

The architecture of most utility grid automation applications today focuses on collection and transportation of data from field devices back to a central data warehouse, where analysis may be performed by the application’s head-end management system. Recent industry discussions of analytics focus primarily on extracting data from the data warehouse for incremental analysis beyond what each individual application may be capable of. There is value in centralizing data storage and analytics. But there is also value in performing analysis of data at the edge, where the data originates, so as to:

  • Perform real-time monitoring and analysis on data collected;
  • Reduce large volumes of raw data to smaller amounts of manageable and usable data;
  • Select specific types or subsets of data to be backhauled to central systems;
  • Collect data only on exceptions determined by utility-configured thresholds;
  • Reduce communication costs associated with transporting data back to a central data warehouse;
  • Optimize data warehouse storage costs.
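Several of the benefits listed above, reducing raw data, selecting subsets, and collecting only on exceptions, amount to filtering against utility-configured thresholds before backhaul. A minimal sketch of that edge-side reduction follows; the function name, thresholds, and readings are assumptions made for illustration.

```python
# Hypothetical sketch of edge-side data reduction: raw readings are checked
# against utility-configured thresholds so that only out-of-band exceptions,
# plus a compact summary, are backhauled to the central data warehouse.
from statistics import mean

def reduce_at_edge(readings, low, high):
    """Return (summary, exceptions) for a batch of raw device readings.

    readings: list of (timestamp, value) pairs from a field device
    low/high: utility-configured thresholds bounding normal operation
    """
    exceptions = [(t, v) for t, v in readings if v < low or v > high]
    summary = {
        "count": len(readings),
        "mean": mean(v for _, v in readings),
        "min": min(v for _, v in readings),
        "max": max(v for _, v in readings),
    }
    return summary, exceptions

# Five raw voltage readings; thresholds chosen for the example.
raw = [(0, 119.8), (1, 120.1), (2, 126.4), (3, 120.0), (4, 113.2)]
summary, exceptions = reduce_at_edge(raw, low=114.0, high=126.0)
```

Here only the two out-of-band readings and a four-field summary would travel back to the central system instead of all five raw points; at the scale of thousands of devices, that ratio is where the communications and storage savings come from.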

A distributed intelligence smart grid architecture with local data access gives the utility the flexibility to monitor and analyze data locally, centrally, or both. Likewise, a distributed intelligence architecture allows for scalability, expandability and the addition of new applications as they emerge. This flexibility provides the greatest opportunity to extract value from both legacy and future systems. A useful component of this distributed architecture is a hardware and software platform that provides local data access, application management, data storage, analytics and device monitoring at the edge of the electric delivery grid.




Donald Pollock is Global Vice President of Sales and Marketing at Ambient Corporation, a smart grid communications firm in Newton, Massachusetts, and supplier of the Ambient Smart Grid Node. Previously, he was managing director of Customers Matter Ltd, a marketing and research consultancy. For over 20 years, he has been an executive with or consultant to various businesses: Clients have included IBM, Saint Gobain, Baldor, Indesit, and Universal Powerline Association. He has a Bachelor of Science in psychology and business studies from Edinburgh University in Scotland.