Leveraging Machine Learning for Phase Identification

Inaccurate phase connectivity information can cause several operational inefficiencies – for example, unbalanced phases that lead to significant energy losses and sharply reduced asset lifetimes. Traditional approaches to phase identification require either manual intervention or costly signal injection, and because they are performed infrequently, phase records can quickly become out-of-date. Using robust machine learning techniques, Itron’s Strategic Analytics group has developed algorithms that accurately classify meters by phase using voltage information readily available from AMI meters.
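One common approach in this class of techniques – a simplified sketch, not necessarily Itron’s proprietary algorithm – is to correlate each meter’s AMI voltage time series against reference voltage profiles measured for each phase at the substation, then assign the meter to the phase with the strongest correlation. All names and data below are hypothetical:

```python
import numpy as np

def identify_phase(meter_voltage, phase_references):
    """Assign a meter to the phase whose substation reference voltage
    profile correlates most strongly with the meter's voltage readings.

    meter_voltage: 1-D array of interval voltage readings from one AMI meter.
    phase_references: dict mapping phase label -> 1-D reference array.
    """
    best_phase, best_corr = None, -np.inf
    for phase, ref in phase_references.items():
        # Pearson correlation between meter and reference profiles
        corr = np.corrcoef(meter_voltage, ref)[0, 1]
        if corr > best_corr:
            best_phase, best_corr = phase, corr
    return best_phase, best_corr

# Synthetic example: three reference profiles plus a noisy meter on phase "B"
rng = np.random.default_rng(0)
t = np.arange(96)  # one day of 15-minute intervals
refs = {p: 240 + rng.normal(0, 1.5, t.size).cumsum() * 0.1 for p in "ABC"}
meter = refs["B"] + rng.normal(0, 0.05, t.size)  # meter tracks phase B

phase, corr = identify_phase(meter, refs)
print(phase)  # expected: "B"
```

In practice, voltage correlation methods must also contend with meters far from the substation, voltage regulators along the feeder, and noisy or missing interval data, which is where more robust machine learning comes in.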

Itron’s phase identification is offered as a service, minimizing upfront cost. In addition, pilot programs are available for a limited number of feeders to allow the opportunity to evaluate the service’s accuracy and benefit to your utility.

Watch a recent webcast on our innovative phase identification technique here.

The Self-Generation Incentive Program: Behind-the-Meter Energy Storage Evaluation Results

The Self-Generation Incentive Program (SGIP) evaluation found that behind-the-meter (BTM) storage provides tangible benefits – load reduction during system peak hours, customer bill savings, and system-level and localized demand response options. However, the evaluation findings indicate that because BTM advanced energy storage (AES) systems are dispatched to optimize bill savings, they are increasing GHG emissions overall.

Under current retail rates, the incentives for customers to dispatch AES to minimize bills are not well aligned with the goals of minimizing utility (and ratepayer) costs or GHG emissions. We observed that energy storage systems installed at facilities on TOU rates with demand charges largely ignored the TOU price differential while prioritizing non-coincident demand charge reduction. Storage systems were discharged to reduce non-coincident peaks, but they did not wait for off-peak energy pricing to recharge.
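A rough back-of-the-envelope comparison illustrates why demand charge reduction tends to dominate TOU arbitrage in a monthly bill. All tariff and battery figures below are hypothetical round numbers chosen only to show the incentive structure:

```python
# Hypothetical tariff values for illustration only -- not actual rates.
demand_charge = 18.0          # $/kW of monthly non-coincident peak demand
on_peak_rate, off_peak_rate = 0.25, 0.12   # $/kWh TOU energy prices

# A 50 kW / 100 kWh battery that shaves 40 kW off the monthly peak
demand_savings = 40 * demand_charge          # $/month from demand charge reduction

# The same battery arbitraging one full cycle per weekday (~21 days/month),
# at an assumed 85% round-trip efficiency (must buy extra off-peak energy)
rte = 0.85
arbitrage_savings = 21 * (100 * on_peak_rate - (100 / rte) * off_peak_rate)

print(f"Demand charge savings:  ${demand_savings:,.0f}/month")
print(f"TOU arbitrage savings:  ${arbitrage_savings:,.0f}/month")
```

Under these assumed numbers, shaving the peak is worth roughly three times the TOU arbitrage value, which is consistent with the dispatch behavior observed in the evaluation.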

There is a strong relationship between utilization (measured as capacity factor, or CF) and roundtrip efficiency (RTE), which is the total energy discharged from the system divided by the total energy charged. We observe that the projects with the highest RTEs also tend to have the highest CFs. This might suggest that if projects increased their annual capacity factor, the annual RTE would also increase. While this may be true, we find that even if all parasitic loads were removed, leaving just the influence of single-cycle RTE, the net increase in GHG emissions would remain. In examining storage projects participating in DR programs, we found that a storage system can be utilized identically across days (i.e., with an equal capacity factor) yet lead to either increases or decreases in marginal emissions. The timing of charging and discharging relative to the grid’s marginal emissions matters more than simply utilizing the system more often.
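The two metrics above follow directly from their definitions in the text. A minimal sketch, with hypothetical input values:

```python
# Illustrative metrics from the evaluation: roundtrip efficiency (RTE) and
# annual capacity factor (CF). All input values below are hypothetical.

def roundtrip_efficiency(kwh_discharged, kwh_charged):
    """Total energy discharged divided by total energy charged."""
    return kwh_discharged / kwh_charged

def capacity_factor(kwh_discharged, rated_kw, hours=8760):
    """Annual energy discharged relative to continuous full-power output."""
    return kwh_discharged / (rated_kw * hours)

# A hypothetical 30 kW system that charged 25,000 kWh and
# discharged 17,500 kWh over a year
rte = roundtrip_efficiency(17_500, 25_000)   # 0.70
cf = capacity_factor(17_500, rated_kw=30)    # roughly 0.067

print(f"RTE = {rte:.2f}, CF = {cf:.3f}")
```

An annual RTE well below the single-cycle rating, as in this example, points to parasitic loads (controls, HVAC, standby losses) consuming charged energy that never gets discharged.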

SGIP AES projects represented a combination of standalone projects and projects either co-located or paired directly with solar PV systems. Our analysis indicated that AES projects paired with PV were not prioritizing charging from PV. Going forward, the program administrators have modified SGIP eligibility rules to encourage AES charging from PV. Projects that are shown to charge from PV will have priority in a potential lottery. Furthermore, eligibility for investment tax credits might promote increased pairing of SGIP AES projects with PV or other renewable generators.

To date, the SGIP has provided incentives to over 4,000 residential and non-residential customers representing over 200 MW of storage capacity. As the program continues to mature and ratepayer dollars are expended to help fund the program, impact evaluations, like the ones conducted by our team, are critical exercises in the feedback loop from policy to design to implementation to policy again. Our findings and conclusions have helped spur new policy interventions and program design improvements in the SGIP and will hopefully benefit others as BTM storage continues to generate societal interest and utility programs are being developed to best capture the benefits of storage as an electric resource.

If you have additional questions, please contact us at StrategicAnalytics@itron.com.

The Smart Home Study

The vision of a smart grid is based on distributed energy resources dynamically interacting to create smart homes and smart loads that automatically respond to price signals to effectively balance energy supply and demand. Read more on Itron’s work with the Smart Home Consortium and the California Energy Commission.

It’s in the Water! – Improving Energy Savings Calculations from California Water Conservation

When I hear the term “water-energy nexus,” I imagine something complex and fantastical, when in actuality it is an intimately familiar and essential component to most of our lives. Water and energy are closely connected in a multitude of ways, from cooling the power plants that produce our energy to pumping water to crops and our kitchen taps. The water-energy nexus is a field of study devoted to understanding and innovating around the interconnected nature of energy and water in the modern world.

In a recent study at Itron, we take a close look at the energy reductions associated with the provision of water to end users in California when statewide water use is reduced due to severe drought conditions. Producing and transporting potable water to end users in California alone accounts for 7 percent of the state’s total annual electricity consumption. Even more surprising, this large figure doesn’t include wastewater treatment or energy inputs by end users such as water heating, which accounts for roughly 20 percent of the average household’s energy usage. It makes sense, then, that consuming less water results in less water pumped, less water heated, and less energy consumed. So why do most of us view water and energy as unrelated? It’s not a unique phenomenon. When we purchase food at the grocery store, clothes online, or water from our taps, we tend to think we are paying for the resource or commodity itself. We don’t typically consider that a portion of our payment is for the embedded energy required for the extraction, processing, and delivery of those goods and services, including water. We all learn that water flows freely downhill with gravity, but few of us appreciate that our water delivery system is a reversal of this process, requiring significant energy inputs.

In 2015, Governor Jerry Brown mandated a 25 percent reduction in urban water use relative to 2013 levels in response to falling water table levels and dwindling snowpack after years of sustained drought and increased reliance by agriculture and urban areas on groundwater pumping. The 2015 sharp reduction in urban water use following the mandate provided a natural experiment for Itron’s study that focuses on estimating energy savings associated with urban water conservation.

The study relies on water agencies’ monthly electric bills associated with groundwater pumping, water treatment, and distribution between the baseline year of 2013 and the mandate year of 2015 to empirically derive an energy savings estimate coincident with the state’s water conservation mandate. Itron evaluated electric billing data across 32 water agencies that are broadly representative of the state’s diversity in water system sources. The study also serves as a cross-check on the pre-existing water-energy calculator developed by the California Public Utilities Commission (CPUC) for use by the state’s investor-owned utilities. The existing water-energy calculator relies on a set of static values at the granularity of the state’s 10 hydrologic regions as determined by the California State Water Board.

Results of the study show that the relationship between water and energy in California is highly variable and unique to each water agency. Across the 32 water agencies in the study, Itron finds the energy embedded in urban water production and distribution ranges from approximately 300 to 500 kWh per acre-foot, with a weighted average of 397 kWh per acre-foot in 2015. These values corroborate prior findings that the energy reduction potential from water conservation is substantial; however, preliminary findings show that the CPUC water-energy calculator overestimates energy savings by 20-30 percent. In addition, the static approach of the CPUC water-energy calculator masks significant year-to-year variability in the energy intensity of water production related to drought intensification and changes in water management policy, indicating that substantial further research is needed before it can serve as a reliable planning tool.
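Using the study’s weighted-average energy intensity, the embedded-energy savings from a given volume of conserved water reduce to a simple multiplication. Only the 397 kWh per acre-foot average comes from the study; the conservation volume below is hypothetical:

```python
# Weighted-average energy intensity from the study's 2015 results
ENERGY_INTENSITY_KWH_PER_AF = 397   # kWh per acre-foot

def embedded_energy_savings_mwh(acre_feet_conserved,
                                intensity=ENERGY_INTENSITY_KWH_PER_AF):
    """Estimated upstream electricity savings (MWh) from urban water conservation."""
    return acre_feet_conserved * intensity / 1000

# Hypothetical example: an agency conserves 10,000 acre-feet in a year
savings = embedded_energy_savings_mwh(10_000)
print(f"Estimated embedded energy savings: {savings:,.0f} MWh")  # 3,970 MWh
```

The study’s central finding is that a single statewide intensity like this masks large agency-to-agency and year-to-year variation, so agency-specific values should be used where available.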

When we learn something as foundational as the fact that water flows downhill, we often take its significance for granted, when in fact water’s downhill passage has the power to shape mountains and carve canyons. The core concept of the water-energy nexus is simple but similarly powerful. Ultimately, energy reductions from water conservation represent an underestimated energy saving resource that can be leveraged by any state to help meet GHG emission and energy saving goals. We now have the preliminary data to re-envision water conservation as a combined energy saving opportunity. Integrating water-energy savings into portfolio-level planning requires significant research to catch up to the rigor of energy program savings analysis, and it will require collaboration across all stakeholder groups to aggregate the applicable data. The water-energy nexus is ultimately part of a larger movement toward fully integrated systems that can be comprehensively managed and optimized by policymakers and implementers.

Grid Energy Storage: Will It Live Up to the Hype?

Every morning I wake up and take part in what I believe to be the greatest achievement in recent history. No, I’m not talking about my coffee maker – I’ll spare you my idolization of caffeine for another blog post. I’m talking about the electricity distribution system. As I turn on the various lights and appliances in my home each morning, I am increasing the amount of electricity that must be generated at a distant power plant, transmitted to a nearby distribution substation, and ultimately delivered to my coffee machine.

The electricity transmission and distribution system is constantly making sure that the supply of electricity matches demand at every moment. This delicate balance is maintained by load forecasts that predict electricity demand minutes and days ahead, and by market clearinghouses that ensure the appropriate price is offered for generators to supply the needed electricity. To supply power at a moment’s notice, we need flexible power plants that can quickly ramp their electricity generation up or down. These generators are often the least efficient and most polluting on the grid, but we need them to keep the system in balance. Right?

In recent years, one technology has upended the theory of electricity distribution. Touted as “The Missing Piece” by Tesla CEO Elon Musk, energy storage for the grid promises to eliminate the need to instantaneously match electricity generation with demand. If we could store the electricity during hours of excess generation (when the sun is shining and the wind is blowing) and then release it when we need it most (when we come home from work and plug in our electric vehicles), we could potentially do without the inefficient and polluting generators on the grid.

Grid energy storage is not a new technology. As early as 1930, the Connecticut Electric and Power Company implemented the first pumped-storage system in the United States. To store energy, water was pumped from the Housatonic River to a storage reservoir 230 feet above. When the energy was needed, water would flow through a hydroelectric turbine back down to the river and generate electricity. Other types of energy storage include compressed gas, thermal, and battery storage.
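The energy stored by a pumped-storage system follows directly from gravitational potential energy, E = mgh. As a rough illustration using the 230-foot head from the text (the 80 percent generation efficiency is an assumed round number, not a figure for this plant):

```python
# Gravitational potential energy stored per acre-foot of water raised 230 feet.
CUBIC_M_PER_ACRE_FOOT = 1233.48   # volume of one acre-foot in m^3
WATER_DENSITY = 1000.0            # kg/m^3
G = 9.81                          # m/s^2
head_m = 230 * 0.3048             # 230 feet converted to meters

mass_kg = CUBIC_M_PER_ACRE_FOOT * WATER_DENSITY
energy_joules = mass_kg * G * head_m          # E = m * g * h
energy_kwh = energy_joules / 3.6e6            # joules -> kWh

efficiency = 0.80                              # assumed turbine/generator efficiency
recoverable_kwh = energy_kwh * efficiency

print(f"Stored per acre-foot: {energy_kwh:.0f} kWh; "
      f"recoverable: {recoverable_kwh:.0f} kWh")
```

At roughly a couple hundred kWh per acre-foot, it takes enormous reservoirs to store grid-scale energy this way, which is part of why compact battery storage has attracted so much attention.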

Of all the energy storage technologies, battery storage (specifically Lithium Ion battery storage) has gained the most popularity in recent years. Thanks to significant cost reductions, Li-Ion storage has become the most prolific storage technology in the United States and abroad. Companies like Tesla, Stem, Green Charge, Sunverge, Outback Power, and others offer Li-Ion battery storage projects that customers can install to manage their energy consumption. These storage systems can reduce customer bills by reducing peak demand, shifting energy to lower cost hours, and maximizing self-consumption of solar generation.
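Peak demand reduction, the first of these bill-saving strategies, can be sketched as a simple threshold dispatch: discharge whenever load exceeds a target cap, and recharge when load falls below it. This is a deliberately simplified greedy heuristic, ignoring round-trip losses and forecast error:

```python
def peak_shave(load_kw, cap_kw, battery_kwh, power_kw, interval_h=1.0):
    """Greedy threshold dispatch: discharge above the cap, recharge below it.

    load_kw: sequence of interval average loads (kW).
    Returns the net load seen by the meter after storage dispatch.
    Simplified sketch -- ignores round-trip losses and forecasting.
    """
    soc = battery_kwh  # state of charge; start fully charged
    net = []
    for load in load_kw:
        if load > cap_kw:
            # discharge to hold load at the cap, limited by power and energy
            discharge = min(load - cap_kw, power_kw, soc / interval_h)
            soc -= discharge * interval_h
            net.append(load - discharge)
        else:
            # recharge using the headroom below the cap
            charge = min(cap_kw - load, power_kw, (battery_kwh - soc) / interval_h)
            soc += charge * interval_h
            net.append(load + charge)
    return net

# A toy daily profile (kW) with an evening peak
profile = [40, 38, 42, 55, 80, 95, 90, 70, 50, 45]
shaved = peak_shave(profile, cap_kw=75, battery_kwh=60, power_kw=30)
print(max(profile), max(shaved))  # peak drops from 95 kW to the 75 kW cap
```

Real products layer load forecasting, TOU-aware recharge scheduling, and solar self-consumption logic on top of this basic idea.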

The State of California, a longtime leader in energy policy and regulation, is moving full steam ahead on battery technology. California’s Self-Generation Incentive Program (SGIP) has already provided financial incentives for more than 700 customer-sited battery energy storage projects. Beginning in 2017, the state’s three investor-owned utilities will collect $83 million on an annual basis through 2019 for the SGIP. Eighty-five percent of the $249 million in funding over the next three years is directed to fund energy storage projects.

But how are these battery systems performing? What types of benefits are they realizing for customers? And how are they changing the operations of the grid? These are all topics that Itron is exploring in significant depth in 2017. First, Itron staff will discuss how utilities and system operators need to begin considering the increased proliferation of customer-sited storage technologies at the 15th Annual Energy Forecasting Conference. As penetration levels of battery technologies increase, utilities must account for battery dispatch algorithms when forecasting load over a given period. Later in the year, Itron will build on its 2014-2015 SGIP Impact Evaluation Report and release the 2016 SGIP Energy Storage Impact Evaluation Report, which will quantify the impact of SGIP-funded battery storage projects on customer demand and carbon emissions. Finally, in August 2017, Itron will present research methods related to storage analysis at the International Energy Program Evaluation Conference.

We’ve come a long way since 1882, when Thomas Edison built the first electricity distribution system in the United States. Edison’s electric grid brought numerous health and safety improvements by reducing indoor smoke and open flames. In a way, battery storage promises an equally transformative leap – but the viability of the technology remains unproven. The next couple of years will test whether the benefits of storage live up to the hype.
