Energy Department’s Investment Grant Program Advances Rapidly, As Scheduled
By Joseph Paladino
A $7.8-billion Smart Grid Investment Grant Program managed by the U.S. Department of Energy (DOE) is beginning to produce concrete results with wide relevance. Four analytic progress reports released in December evaluate impacts in four areas: demand reduction; operations and maintenance savings from advanced metering; reliability improvements from distribution automation; and improved controls for voltage and reactive power management.
The U.S. Department of Energy's Smart Grid Investment Grant Program (SGIG), designed to accelerate electric grid modernization in the United States, consists of 99 projects selected from more than 400 applications originating in nearly every state. They involve a range of smart technologies and systems intended to increase the grid’s flexibility, reliability, efficiency, affordability and resilience. Total funding for the 99 SGIG projects amounts to $7.8 billion, with $3.4 billion coming from the American Recovery and Reinvestment Act of 2009 (ARRA) and $4.4 billion from matching private-sector investments.
Work on the 99 projects, which were generally launched in 2010, has been rapid, as noted in a progress report released in July by DOE’s Office of Electricity Delivery and Energy Reliability. Equipment installation on all projects is expected to be completed this year and next, with data analysis and reporting to be finished in 2015. Nearly 80 percent of authorized funds have been spent, closely in keeping with plans.
In December, DOE issued four analysis reports that evaluate the realization of strategic objectives in large groups of selected projects. The reports address electricity demand reductions; O&M savings from advanced metering infrastructure; reliability improvements from distribution automation; and controls for voltages and reactive power management. Some of the key findings from each report follow.
Electricity demand reduction: Of the 99 projects, 62 are implementing advanced metering, customer systems and time-based rates to reduce electricity consumption during peak periods and/or reduce overall electricity consumption. Achieving such reductions could result in lower capital expenditures and improved capital asset utilization on the part of utilities, smaller electricity bills for consumers and less environmental impact.
Three of the utilities pursuing demand management projects—Oklahoma Gas and Electric, Marblehead Municipal Lighting Department (Massachusetts) and Sioux Valley Energy (South Dakota)—published quantitative assessments of their efforts in mid-2011. These assessments showed peak demand reductions of 5 to 37 percent among participating customers, as compared to control groups. The impact of programs varied widely, depending on the type of time-based rate (for example, critical peak price, variable peak price or time-of-use) and type of customer system (programmable communicating thermostat, in-home display or web portal).
Oklahoma Gas and Electric has decided to proceed with a system-wide roll-out of demand-management technologies to 120,000 customers, while the other two utilities are still considering their next steps.
O&M savings: Advanced metering infrastructure (AMI) is being installed in 63 projects to reduce meter operation costs, vehicle miles associated with meter reading, and the emissions and fuel consumption arising from support of customer service requests. Other benefits could include more efficient utility operations, lower electricity costs and prices, and improved customer services.
Initial findings come from 15 projects. Estimated reductions of meter operations costs range from 13 to 77 percent, and estimated cuts in vehicle miles, fuel consumption and emissions from 12 to 59 percent. The spread of results appears attributable to differences in legacy metering systems, meter operations practices, and the sizes and geographies of the service territories. Additional project analysis will produce more information on costs and impacts.
Reliability improvements: Roughly half the projects—48 to be exact—seek to improve electric distribution system reliability by reducing the frequency and duration of both momentary and sustained outages. Benefits could include higher productivity and better financial performance for businesses; greater convenience, less food spoilage and fewer medical and safety problems for consumers; lower operations and maintenance costs from more efficient outage response; and greater ability to locate faults and manage power flows on distribution circuits.
The initial results come from four projects reporting on 1,250 distribution feeders. For these projects and feeders, improvements were measured using the standard reliability indices:
- System Average Interruption Frequency Index (SAIFI)—the number of sustained outages was reduced by 11 to 49 percent;
- Momentary Average Interruption Frequency Index (MAIFI)—the number of momentary interruptions was reduced by 13 to 35 percent;
- System Average Interruption Duration Index (SAIDI)—outage minutes ranged from a reduction of 56 percent to an increase of 4 percent; and
- Customer Average Interruption Duration Index (CAIDI)—outage minutes ranged from a reduction of 15 percent to an increase of 29 percent.
CAIDI often increases when automated feeder switching results in fewer customers experiencing sustained outages: because CAIDI is the average outage duration per interrupted customer (SAIDI divided by SAIFI), it rises whenever outage frequency falls faster than total outage minutes.
Further analysis of additional projects and feeders will produce more information on costs and impacts.
Controls for voltages and reactive power management: Advanced voltage and volt-ampere reactive optimization (VVO) technologies are being implemented in 26 of the projects. VVO permits operators to lower voltage levels during peak periods, or for longer stretches, to reduce energy requirements. Flattening the voltage profile along a circuit gives utilities greater flexibility to reduce electricity consumption during periods of peak demand. Achieving these objectives enhances energy efficiency, permits deferral of capital expenditures on generation, transmission and distribution assets, and reduces electricity generation and its environmental impacts. It may also yield more efficient utility operations, greater flexibility to manage peak loads and more opportunities for affordable rates.
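The link between lowering voltage and lowering demand is commonly summarized by a CVR factor: the percent reduction in demand (or energy) per percent reduction in voltage. The sketch below uses a factor of 0.8, a common planning assumption rather than an SGIG result.

```python
# CVR factor: percent demand reduction per percent voltage reduction.
# The 0.8 default is a commonly cited planning assumption, not an SGIG finding.

def demand_reduction_pct(voltage_reduction_pct: float, cvr_factor: float = 0.8) -> float:
    """Estimated percent demand reduction for a given percent voltage reduction."""
    return cvr_factor * voltage_reduction_pct

# Lowering feeder voltage by 2.5 percent with a CVR factor of 0.8:
print(demand_reduction_pct(2.5))  # 2.0
```

Actual CVR factors vary by feeder and load mix, which is one reason the projects report a range of outcomes rather than a single figure.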
The initial results come from eight projects that reported hourly load data for 31 feeders. About half the feeders experienced line loss reductions from capacitor bank switching ranging from 0 to 5 percent, while five experienced line loss reductions greater than 5 percent (where a 5 percent line loss reduction equates to an energy savings of approximately 0.2 percent). In addition, two projects that applied conservation voltage reduction practices reported peak demand reductions ranging from 1 to 2.5 percent.
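The parenthetical conversion above follows from simple arithmetic: if line losses are a small fraction of energy delivered, a percentage cut in losses translates into that fraction of the cut in total energy. A baseline loss fraction of about 4 percent, assumed here for illustration, is consistent with the 0.2 percent figure in the text.

```python
# Hypothetical baseline: assume feeder line losses are roughly 4 percent of
# energy delivered. Then a 5 percent reduction in losses saves about
# 0.04 * 5 = 0.2 percent of total energy, matching the figure in the text.

def energy_savings_pct(loss_fraction: float, loss_reduction_pct: float) -> float:
    """Total energy saved, in percent, from reducing line losses."""
    return loss_fraction * loss_reduction_pct

print(round(energy_savings_pct(0.04, 5.0), 3))  # 0.2
```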
A fifth analysis report on synchrophasor applications for electric transmission systems is scheduled for later in 2013.
The DOE-OE website smartgrid.gov contains copies of all reports referred to in this article. It also contains information on the 99 SGIG projects, including updates on equipment installations and costs, project descriptions, case studies and a library of about 1,400 smart grid studies and reports.
Joseph Paladino is Senior Advisor in the Department of Energy’s Office of Electricity Delivery and Energy Reliability, where he oversees efforts to determine the impact of smart grid projects funded by the American Recovery and Reinvestment Act of 2009. He has worked at DOE for 20 years in programs involving nuclear waste management, energy efficient buildings and electricity grid modernization. His particular interest is in the advancement and commercialization of technology. Prior to joining DOE, he worked for over 10 years in the private sector. He earned his undergraduate degree in biology from Middlebury College and his graduate degree in civil engineering from the University of Pittsburgh.