
IEEE: The expertise to make smart grid a reality

Interview with Carl Imhoff

In this interview, Carl Imhoff provides an update on the five-year Pacific Northwest Smart Grid Demonstration Project. He also describes a wide range of Smart Grid innovations, resulting from PNNL and other industry research, which utilities can consider now for their near-term and long-term strategic planning needs.

The Pacific Northwest Smart Grid Demonstration Project has been under way for three years now. Can you share some success stories or lessons learned with us?

The project is designed to run for about five-and-a-half years so we don’t have actual results yet, but it’s off to a very good start.

One of the most important success stories thus far is the system’s interoperability across the many Smart Grid installations. We have 11 utilities participating across five states and they’re testing a range of Smart Grid value propositions, including energy storage, demand response, renewables integration, and interacting with some consumers and large institutional loads.

Cybersecurity specifications were designed into the project, which was particularly valuable for some of the smaller participating utilities and cooperatives. I think this exercise has been very fruitful for the individual utilities, and the overall project has successfully met its required cybersecurity plan.

The third major accomplishment is that we crafted a regional incentive signal, called the “transactive incentive signal,” which was the first of its kind and introduces a new opportunity to improve regional grid efficiency and reliability in addition to its local utility benefits. The transactive control signal is designed to reflect bulk system needs for wind balancing, clean generation and transmission congestion at five-minute intervals and to test the ability to engage responsive Smart Grid assets to help meet these regional objectives.

In order to achieve this, the Bonneville Power Administration is integrating its regional forecasts of demand and generation with wind forecasts from 3TIER, one of the project’s vendor partners, and then translating the information, including preferred green requirements, into a region-wide incentive signal. The signal, calculated with a system model developed by Alstom Grid, is communicated to the 11 utility portals every five minutes, and local responsive assets, such as storage, demand response and renewable generation, can react by generating more or less power locally.

Transactive energy management is an important new approach to better optimize existing and new Smart Grid technologies at scale, and the topic is gaining a lot of attention worldwide. It will be very interesting to see results from this and other related projects during the next two to three years.
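The five-minute loop described above can be illustrated with a toy sketch. This is not the Alstom Grid system model or BPA's actual signal calculation; the formula, names and numbers below are purely illustrative assumptions showing the shape of the interaction: forecasts are blended into one incentive value, and a responsive asset compares that value with its own threshold to decide how to act.

```python
from dataclasses import dataclass

def incentive_signal(forecast_demand_mw: float,
                     forecast_wind_mw: float,
                     base_price: float = 30.0) -> float:
    """Toy incentive: rises with net load (demand minus forecast wind)."""
    net_load = max(forecast_demand_mw - forecast_wind_mw, 0.0)
    return base_price * (1.0 + net_load / 1000.0)

@dataclass
class StorageAsset:
    threshold: float  # incentive level above which the asset discharges

    def respond(self, signal: float) -> str:
        # Discharge (supply power) when the incentive is high,
        # charge (absorb surplus) when it is low.
        return "discharge" if signal > self.threshold else "charge"

asset = StorageAsset(threshold=35.0)
# High net load -> high incentive -> the asset supplies power.
print(asset.respond(incentive_signal(1200.0, 100.0)))  # discharge
# Abundant wind -> low incentive -> the asset absorbs surplus.
print(asset.respond(incentive_signal(800.0, 700.0)))   # charge
```

The point of the design is that each asset only needs the signal and its own local economics, not a view of the whole system.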

How do you view the industry’s Smart Grid accomplishments to date? What must the industry focus on now?

The industry’s infrastructure upgrades are remarkable. Within the year, we’ll have about 1,500 phasor measurement units working across North America. We’ll also probably surpass 50 percent market penetration of smart meters by the end of 2014. So we’re no longer at the beginning of the Smart Grid journey, we’re substantially well into this journey and the transformation of the infrastructure.

This progress gives us the chance, very soon, to observe the system like we’ve never observed it before: in real time, all the way from end use to high-voltage transmission. And with distributed intelligence provided by all the computers and microprocessors deployed throughout the system, we also have the potential to control the grid like never before.

We now have an exciting opportunity to ask if this new, extraordinary capability to observe and control the grid gives us a chance to create power system paradigms that would improve upon today’s grid paradigm. These new tools offer many possibilities to increase the reliability and economic productivity of our grid assets, improve energy efficiency, and reduce emissions from the nation’s energy system. So we can now pose the question: “What’s the nature of the grid we want to have in the future, given these new possibilities?”

From a strategic standpoint, we as a nation must frame a public-private dialogue to ask and consider this question. This needs to be a public-private discussion because the grid does not only deliver benefits to consumers and utilities; it is also a strategic aspect of our national energy and security agenda. In fact, this issue is one of the most important topics the industry needs to discuss during the next decade.

It will take a decade or more for this discussion to play out, but we have to decide how we are going to tackle this as a nation. I believe that the Department of Energy (DOE) has an important leadership opportunity to frame and facilitate this strategic public-private dialogue, and I expect DOE will find a way to make it happen.

Are there any particular near-term innovations that companies should start preparing to adopt?

I’d like to highlight three near-term innovations: the opportunity to create a new interface between buildings and the grid; transactive energy management; and a type of high-performance computing that I informally call “hPC.”

The interface between buildings and the grid is an important and exciting area to pursue. Buildings consume about 40 percent of U.S. energy and they have a lot of flexibility in how they operate and perform, but today they just take power from the system and use their control systems to optimize building services without any regard for the grid. They have potential to add a lot of value back to the grid but this is limited by the existing building-grid interface and local rate structures.

We now have a significant opportunity to create a two-way exchange of value between buildings and the grid. We can enable buildings to work in coordination with the grid and to respond to transactive energy signals from the grid requesting capacity or other services. In this manner, buildings can be used to improve efficiency and reliability, increase the use of renewables, provide ancillary services and energy storage, and even serve as resources for emergency response during superstorms or other major service disruptions.

To enable this, we need a national buildings-grid interaction framework that ensures buildings remain effective and efficient for owners and occupants and that their controls effectively communicate with the grid’s signals. We’re supporting DOE now in laying the groundwork to begin the process of designing the needed framework and standards for the associated financial transactions. This kind of work doesn’t happen overnight but I encourage IEEE members and building controls vendors to begin thinking about this new capability now and learning more about it.
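A core constraint of the framework described above is that a building offers flexibility to the grid only within bounds that keep it effective for owners and occupants. The sketch below is a hypothetical illustration of that idea, not any proposed DOE standard; the function name, the comfort floor and the kilowatt figures are all assumptions for illustration.

```python
def offer_capacity(current_load_kw: float,
                   min_load_kw: float,
                   grid_request_kw: float) -> float:
    """Return the load reduction a building can commit to the grid,
    capped by the floor (min_load_kw) that preserves building services."""
    flexible = max(current_load_kw - min_load_kw, 0.0)
    return min(flexible, grid_request_kw)

# A 500 kW building with a 400 kW service floor can shed at most 100 kW,
# even if the grid asks for 150 kW.
print(offer_capacity(500.0, 400.0, 150.0))  # 100.0
# A smaller 50 kW request can be met in full.
print(offer_capacity(500.0, 400.0, 50.0))   # 50.0
```

The framework and standards work would, in effect, define how such offers and the associated financial transactions are communicated between building controls and the grid.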

What is transactive energy management and how will utilities use it?

Transactive energy management is a way to engage responsive customer resources on a broad scale to improve system efficiency and reliability and to reduce operating costs. Today we have demand response capability, but it is specialized and rigidly structured and therefore touches only 5 or 10 percent of the load. The opportunity now is to take demand response to scale so that we’re applying it to almost all the load.

Transactive energy management is an exciting new framework to engage all types of supply and demand, leveraging Smart Grid tools in sensing, communication and control. It creates, at scale, the ability for the load to engage with supply on a level playing field that includes legacy systems and new Smart Grid assets such as smart loads, local generation, and energy storage. Its objective is for 80-100 percent of these assets to self-optimize in such a way that the system is more reliable, predictable, cleaner, greener, and more productive. Interest in this topic is really taking off in the U.S. and Europe, illustrating the industry’s growing desire to take advantage of Smart Grid concepts on a broad basis.

Please explain the concept and value of hPC and why it is important.

This type of affordable high performance computing, which I call hPC, will enable utilities to use commercial multi-core computers to take advantage of the emerging data generated by Smart Grid technologies. This is not the type of supercomputing typically found only at national laboratories, which is beyond the scope of normal utility computing facilities. I’m talking about small parallel computer systems with 64 to 512 cores that cost from $20,000 to $30,000. These are affordable machines that are showing substantial value to utilities in recent demonstrations. The benefits occur when a utility adjusts its software architecture, algorithms and grid tools to run in a parallel environment on these machines to accelerate time-critical performance in both operations and system planning.

Right now, very little utility software is architected to run in parallel environments. But even if a utility doesn’t use parallel processing today, it ought to be thinking about transitioning its codes and architectures for a near-term future when parallel processing will be very cheap and commonplace. It is a really exciting opportunity.

Here at PNNL, we are demonstrating the value of hPC and we are developing a hardware-agnostic, open architecture that vendors and utilities will be able to use for parallel computing. We’ve had some exciting results from our demonstrations. We have reduced state-estimation time for large regional transmission companies from 1-2 minutes to 2-5 seconds by using a 128-core processor. PNNL has also used hPC to perform massive contingency analyses 8,000 times faster than was possible with traditional techniques, dramatically increasing the number of failure scenarios that can be analyzed for real-time operation. We have also begun using parallel computing platforms for real-time curation of high-velocity, high-volume data streams and can handle billions of data records for these tasks.
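Contingency analysis parallelizes so well because each outage scenario is an independent calculation, so a multi-core machine can screen many scenarios at once. The minimal sketch below (not PNNL's actual codes) shows that structure; the stress formula is a stand-in for a real post-outage power-flow solve, and all names and thresholds are illustrative assumptions.

```python
import math
from multiprocessing import Pool

def screen_contingency(outage_id: int) -> tuple:
    """Pretend security check for one N-1 outage scenario.
    A real tool would solve the post-outage power flow here."""
    stress = sum(math.sin(outage_id * k) for k in range(1, 200))
    return outage_id, abs(stress) < 50.0  # True -> scenario looks secure

if __name__ == "__main__":
    outages = range(1000)  # 1,000 hypothetical single-element outages
    with Pool() as pool:   # one worker process per available core
        results = pool.map(screen_contingency, outages)
    flagged = [oid for oid, secure in results if not secure]
    print(f"screened {len(results)} scenarios, {len(flagged)} flagged")
```

Because the scenarios share no state, throughput scales roughly with core count, which is why a 128- or 512-core machine changes what is feasible within a real-time operating window.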

I encourage early-career IEEE members to become familiar with parallel processing concepts, whether they are involved in back-office data systems, monitoring networks or running control rooms. Parallel processing will be pervasive in all those areas and lead to innovations that we can’t imagine today.

What kinds of innovations should utilities begin incorporating into their longer term strategic planning?

Utilities are facing explosive increases in the volume and rate of data coming into their systems so they should consider new approaches for data management in their strategic planning. Furthermore, a lot of the new value streams from Smart Grid concepts will come from the boundary areas that typically separate customer, distribution and transmission data siloes. I encourage utilities to consider how the flow of information across these traditional organizational boundaries will create new value streams and opportunities in the future.

Utilities need to position themselves both defensively and offensively to address these challenges. For a defensive strategy, they should determine the types of networks they will need, the systems they will need to handle the increased volume of data and the techniques they will need to protect it.

For an offensive strategy, they should explore how to extract full benefits and value from these emerging Smart Grid data sets. New technologies are providing insights and tools that they would not have had before, and utilities need to be thinking about how they are going to take advantage of these capabilities and use them to approach their businesses in new ways. They need to be bold here. They should explore the interfaces between traditional business boundary areas to find new values and new opportunities to keep their utilities healthy and vibrant.

At PNNL, Carl Imhoff is responsible for research and development in advanced power transmission reliability, demand response, renewables integration, policy, and strategy for Smart Grid concepts, as well as grid analytics tools for visualization and high-performance computing. He has worked at PNNL for 30 years. He has been involved in the North American SynchroPhasor Initiative, the GridWise Alliance, the Consortium for Electric Reliability Technology Solutions, and the Western Electricity Coordinating Council.