Spring Cleaning: Organizing the Impacts within the Behind-the-Meter Energy Storage Closet

Electric utilities are increasingly promoting and adopting distributed energy technologies as an opportunity to modernize the electric grid and deliver a more efficient, reliable and flexible supply of electricity services. Behind-the-meter (BTM) advanced energy storage (AES) has become a key component of that modernization. Successes in program design, marketing, cost reductions and perceptions of societal benefits for both economic and environmental reasons have increased the penetration of AES technologies throughout California. These successes have been spurred by the unique and fundamental nature of AES as an electric supply source: AES technologies can obviate the need to instantaneously match electricity generation with demand. Customers can benefit by discharging to serve load during periods when retail energy or demand rates are high and charging when rates are lower. Utilities can benefit by easing peak demand if storage is dispatched during periods of increased demand and charged during periods of excess power generation.

Veiled behind a meter, it can be difficult to understand how AES systems are being operated by customers, quantify the potential benefits of storage performance or determine how these systems can change operations on the grid. Itron’s Strategic Analytics group has a unique perspective in this regard. For the past decade, our team has been evaluating the impacts of the Self-Generation Incentive Program (SGIP) in California.

The SGIP was established in 2001 to help address peak electricity problems in California. Since its inception, the SGIP has provided incentives to a variety of distributed energy technologies including fuel cells, wind turbines, combined heat and power (CHP) and solar photovoltaic (PV). More recently, standalone AES projects – in addition to those paired with SGIP eligible technologies or PV – were made eligible for incentives.

The SGIP has three overarching goals as it relates to storage: 1) to reduce greenhouse gases, 2) to provide grid and customer support, and 3) to support market transformation. In 2018, our team conducted an impact evaluation to assess the ability of SGIP storage projects to satisfy those goals and provide customer, utility and environmental benefits. The evaluation examined how SGIP residential and nonresidential participants utilize these systems and the types of benefits realized from the system performance, along with how these systems are changing the operations on the grid.

Our team is presenting the results of that evaluation at the International Energy Program Evaluation Conference (IEPEC) in Denver next week. The multi-faceted evaluation included quantification of observed program-level impacts and storage optimization modeling. The evaluation examined the relationships between customer retail rates, utility marginal costs and storage dispatch behavior to better understand how utilities can maximize the benefits of BTM energy storage.

We hope to see you next week and discuss our findings with you in person!

Where Did Durbin and Watson Go Wrong?

I’m sure you have all been there—the spot where you mention autocorrelation and the Durbin-Watson statistic. I saw this recently in some testimony where the witness goes on to explain the statistic. The testimony read something like this (which is taken from Investopedia):

  • The Durbin Watson (DW) statistic is a test for autocorrelation of the residuals from a statistical regression analysis. The DW statistic will always have a value between 0 and 4. A value of 2.0 means that there is no autocorrelation detected in the sample. Values from 0 to less than 2 indicate positive autocorrelation and values from 2 to 4 indicate negative autocorrelation.

If you dig in further, you will find that there is a table that tells you whether the evidence for autocorrelation is conclusive or inconclusive. And the critical values depend on how much data you have and how many variables are in the model, all of which definitely starts to sound like blah, blah, blah…because none of it really makes any sense. You can understand it if you dig into the formulas (which we will do), but the whole thing sounds strange.

A smaller number (further below 2) means stronger positive autocorrelation?

A larger number (further above 2) means stronger negative autocorrelation?

Someone who has a stats background will know the term “correlation” and they will know that the correlation coefficient will be a number between -1 and 1. They will know that a value of 0.0 means uncorrelated. They will know that a value above 0.0 means positive correlation. They will also know that a value below 0.0 means negative correlation.

So, how did we get to this point where we have a test statistic that is flipped and that is centered on 2.0? First, a little history. The DW statistic was developed by James Durbin (British) and Geoffrey Watson (Australian) in 1950. Second, this statistic is only relevant for a time-series model. In a time-series model, you have a set of sequential observations. Let’s label the first observation “1” and the last observation “T.” Then an individual observation is for time period t, where t is between 1 and T. So far so good.

In its most general form, a model with an additive error can be written as:
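In symbols, with e(t) denoting the additive error at time t:

```latex
y_t = F[X_t] + e_t, \qquad t = 1, \dots, T
```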

We don’t really care about F[ ] or X at this point. We really only care about the string of model residuals or errors (the e(t) values). This brings us to the formula developed by Durbin and Watson, which is:
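In its standard form, the statistic is:

```latex
DW = \frac{\sum_{t=2}^{T} \left(e_t - e_{t-1}\right)^2}{\sum_{t=1}^{T} e_t^2}
```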

On its face, this seems ok. The sum in the numerator is over squared changes in sequential residual values, while the denominator is the sum of squared residuals. But just looking at it, it is not at all clear why this is centered at 2.0.

Digging a bit deeper, we can expand the numerator and rewrite DW as follows:
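Expanding the squared difference in the numerator, (e(t) - e(t-1))² = e(t)² - 2 e(t) e(t-1) + e(t-1)², gives:

```latex
DW = \frac{\sum_{t=2}^{T} e_t^2}{\sum_{t=1}^{T} e_t^2}
\;-\; 2\,\frac{\sum_{t=2}^{T} e_t\, e_{t-1}}{\sum_{t=1}^{T} e_t^2}
\;+\; \frac{\sum_{t=2}^{T} e_{t-1}^2}{\sum_{t=1}^{T} e_t^2}
```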

The first and last ratios each have a value of about 1.0, the only difference being that the numerator sums run from 2 to T instead of from 1 to T. So, if the middle sum has a value close to zero, the DW value will be about 2.0. That’s why the DW is centered at 2.0.

This means that the middle ratio is where all the action is. Let’s look at this middle term:
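The middle term is:

```latex
-\,2\,\frac{\sum_{t=2}^{T} e_t\, e_{t-1}}{\sum_{t=1}^{T} e_t^2}
```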

In the numerator, we have the product of sequential residuals. If the sequential residuals are changing sign (+ to – or – to +), the products will be negative. Sum them up, flip the sign (multiply by -2) and add the result, and the DW goes up above 2.0. If the sequential residuals are not changing sign (+ to + or – to -), the product will be positive. Sum them up, flip the sign (multiply by -2) and add this in and the DW will go down below 2.0. Hmmm…

If you take off the -2 in the middle term, it turns out that what is left is the formula for the first order autocorrelation coefficient for a variable with a zero mean. This is usually represented by the symbol ρ, which is the 17th letter of the Greek alphabet and is pronounced rho.
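That coefficient, for residuals with a zero mean, is:

```latex
\hat{\rho} = \frac{\sum_{t=2}^{T} e_t\, e_{t-1}}{\sum_{t=1}^{T} e_t^2}
```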

This means the DW statistic is closely approximated by the formula:
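Since the first and last ratios are each about 1.0:

```latex
DW \approx 1 - 2\hat{\rho} + 1 = 2\left(1 - \hat{\rho}\right)
```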

Based on this approximation, I propose we define a new statistic (the S statistic after Watson’s middle name) defined as follows:
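Solving the approximation for the autocorrelation coefficient gives:

```latex
S = 1 - \frac{DW}{2} \approx \hat{\rho}
```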

All the critical value tables can be transformed using the same formula, and then everything would work as it does now with the DW and its critical value tables.

But now the Investopedia definition is straightforward and greatly simplified:

  • The S statistic is a test for first order autocorrelation of the residuals from a statistical regression analysis. The S statistic will always have a value between -1.0 and +1.0. A value of 0.0 means that there is no autocorrelation detected in the sample. Values from 0.0 to 1.0 indicate positive autocorrelation and values from 0.0 to -1.0 indicate negative autocorrelation.

Even though this is clearly a great idea, it will never happen. Once a statistic is coined and the textbooks are written, it is too late to change. It will live on forever. It’s kind of like the QWERTY keyboard that I typed this on – not the best design, but the one that made sense in the distant past, the one we are used to, and therefore, the one we are stuck with.

Surely there is another planet or a parallel universe, however, where the counterparts of Durbin and Watson got it right and the S statistic is well known and is easily understood as a simple correlation coefficient.

Introducing Itron’s New President and CEO

Today, Tom Deitrich steps into his new position as president and chief executive officer at Itron. Deitrich has served as Itron’s executive vice president and COO since 2015, and succeeds Philip Mezey, Itron’s president and CEO since 2013.

Since joining Itron in 2015, Deitrich has played a major role in shaping the company’s strategy to partner with cities and utilities to build secure, reliable, connected communities that can offer a multitude of new services. He architected the company’s innovation and product strategy, created the product business units and global commercial and customer enablement organization, and enhanced our overall operations.

Hear from our new president and CEO about his vision for Itron and how we will continue to make an impact on cities and utilities across the globe.

To read the full press release announcing Tom’s appointment, click here.

Is My Smart Home Really Smart?

I am a smart home geek. I have all the “smarts”: voice assistant/speaker, security cameras, lighting, fans/air purifiers and a thermostat. My home is cool when I need it to be with reasonable energy efficiency settings and I can turn fans and lights on/off when I am not even home. I have four smartphone apps to manage all of these devices with some ability for my voice assistant/smart speaker to integrate, which I have not yet had time or energy (pun intended) to do. I pay a little extra per month to support bringing renewable energy to my community and dream of owning an electric car someday. I have all of this…but am I really that “smart”? My smart solutions seem disjointed and disconnected.

Some of Itron’s utility customers have started connecting with their consumers in the smart home by using voice assistants/smart speakers like Amazon Alexa and Google Home to allow them to receive energy usage information, manage and pay bills, enroll in various utility programs, and communicate other rebates and incentives. Are these devices poised to be the smart customer service representatives of the future? It certainly gives consumers a new way to get relevant information from their utility instantly without needing to log in to a website or app. The smart home device market is saturated with products for lighting, security, appliances and more. Vendors with “hub” capabilities have emerged with bundles of products that can work together seamlessly. Using voice assistants/smart speakers, consumers can even order energy saving equipment directly or through marketplace services. With devices that can be controlled to conserve energy or water and save consumers money on their bills, it is easy to see the appeal for utilities of having a solid footing in this space.

Even more interesting to utilities are the large energy producing and consuming devices in the smart home. Itron’s Distributed Energy Management group supports utilities with solutions to help them maintain power quality with renewables like wind and solar, coupled with EV chargers and batteries, to balance supply and demand and ensure the right distribution capacity. Add in smart thermostats and load controllers for devices like pool pumps to support demand response and energy efficiency programs, and you start getting a sense of what a really “smart” home looks like. Orchestration of resources in the home will only become more critical. Now imagine all of the components in and around the home working together in concert—seamless interoperability, optimization and control is the name of the game. That’s why Itron communications and data management services are built on open standards, and we are constantly growing our ecosystem of technology and service provider partners.

Utilities with the right infrastructure and solutions can transform consumers’ lives in the digital age. Itron provides the backbone to the smart cities, communities and homes we live in. This is Itron’s intelligent vision, which seems way better than just being “smart.”

Itron Idea Labs is focused on bringing new, innovative businesses, products and services to Itron customers. For more information, click here.

Itron and Discovery Education Awarded for Global Sustainability Leadership

Today, the Business Intelligence Group named Itron and Discovery Education, the global leader in standards-aligned digital curriculum resources, engaging content, and professional learning for K-12 classrooms, winners of the Sustainability Service of the Year project for Conservation Station: Creating a More Resourceful World. The 2019 Sustainability Awards program honors those people, teams and organizations who have made sustainability an integral part of their business practice or overall mission.

“Itron is committed to inspiring future generations to help create a more resourceful world and we are delighted to be awarded ‘Sustainability Service of the Year’ recognition for our collaboration with Discovery Education,” said Marina Donovan, vice president of global marketing and public affairs at Itron. “We are thankful to the Business Intelligence Group for this esteemed recognition. By engaging young people in learning pathways that help them ideate solutions for the most pressing resource challenges of our time, we are driving a real shift for the future of sustainability.”

Conservation Station: Creating a More Resourceful World aims to encourage a national dialogue on the importance of students’ understanding of current and future energy needs, resource utilization and conservation. The program material dives into the relationship between energy and water, and how innovative technologies are working to conserve both resources through the core pillars of resourcefulness: effectiveness, efficiency and sustainability.

“Discovery Education is proud to partner with forward-thinking organizations like Itron,” said Lori McFarling, senior vice president and chief marketing officer at Discovery Education. “What makes earning this award special is that it underscores our shared vision to best prepare young people to build a better tomorrow, and conserving natural resources through sustainable energy practices is critical in furthering this mission. Thank you to the Business Intelligence Group for this recognition.”

Maria Jimenez, chief nominations officer of the Business Intelligence Group, continued, “We are proud to reward and recognize Itron and Discovery Education for their sustainability efforts. It was clear to our judges that their vision and strategy will continue to deliver results toward a cleaner, more sustainable world. Congratulations!”

To read the full press release, click here.

Holidays Are No Vacation for the Load Forecaster

Summer has arrived. While your friends are frolicking at the beach, you are struggling in a darkened cubicle to ensure that the load forecast performs well on July 4.

Indeed, holidays create a unique challenge for load forecasters. There are two broad categories of holidays:

  • Stationary holidays occur on the same weekday each year. For example, Martin Luther King, Jr. Day is celebrated in the U.S. on the third Monday in January. Memorial Day is celebrated on the last Monday in May.
  • Non-stationary holidays occur on different weekdays. In the U.S., Independence Day is July 4, regardless of the weekday. Similarly, in Canada, Canada Day is always on July 1.

Typically, short-term load models use five or fewer years of historical data. This creates no intrinsic problems regarding the stationary holidays. It can create issues when it comes to non-stationary holidays, however, because the model has not experienced those holidays on all seven days of the week.

First, we will look at the stationary holidays. For MLK Day, you can specify the variable as follows:
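The variable is a product of indicator terms; here is the same logic sketched in Python for illustration (MetrixND writes the indicators as Month=1, Weekday=2, Day>=15 and Day<=21):

```python
from datetime import date

def mlk_day(d: date) -> int:
    """Return 1 on the third Monday in January, 0 otherwise."""
    is_january = d.month == 1          # Month = 1
    is_monday = d.weekday() == 0       # Weekday = 2 in MetrixND terms
    in_third_week = 15 <= d.day <= 21  # Day >= 15 and Day <= 21
    return int(is_january and is_monday and in_third_week)
```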

Let’s unpack this:

  • Month = 1 returns a 1 during January and 0 otherwise
  • Weekday = 2 returns a 1 on Mondays and 0 otherwise

The third Monday will fall between (and including) the 15th and the 21st.

  • Day>=15 returns a 1 on all days after and including the 15th of each month
  • Day<=21 returns a 1 on all days before and including the 21st of each month

The product of each of these expressions is a 1 on the third Monday in January. We can use similar logic to define a Memorial Day variable.
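The same pattern handles Memorial Day; the last Monday in May always falls on the 25th through the 31st, so a single lower bound on the day suffices. A Python sketch:

```python
from datetime import date

def memorial_day(d: date) -> int:
    """Return 1 on the last Monday in May, 0 otherwise."""
    return int(d.month == 5 and d.weekday() == 0 and d.day >= 25)
```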

Now, let’s look at the non-stationary holidays. We can define Independence Day as follows:
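Sketched in Python, the variable needs no weekday logic at all:

```python
from datetime import date

def independence_day(d: date) -> int:
    """Return 1 on July 4 of any year, regardless of the weekday."""
    return int(d.month == 7 and d.day == 4)
```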

This variable will return a 1 on July 4 every year. From a modeling perspective, the problem arises because July 4 is sometimes on Tuesday and sometimes on Sunday. The effect of the holiday on a Tuesday will likely be different than the effect on a Sunday. To complicate the issue further, we may need to address the surrounding days as well. For instance, when July 4 is on a Saturday, the holiday is likely observed on Friday, July 3.

Let’s first look at how to specify the variable accounting for different weekdays. In the following example, Saturdays are weighted with a 0.4, Sundays are weighted with a 0.25 and all other days are weighted with a 1.0. In other words, Saturdays will get 40% of the impact of the weekdays, while Sundays will get 25% of the impact of the weekdays.
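A Python sketch of the weighted variable (the 0.4 and 0.25 weights are the example values above, not recommendations; tune them for your load):

```python
from datetime import date

# Python's weekday(): Monday = 0 ... Saturday = 5, Sunday = 6
WEIGHTS = {5: 0.4, 6: 0.25}  # Saturday and Sunday get reduced weights

def independence_day_weighted(d: date) -> float:
    """Return the weekday-dependent weight on July 4, 0.0 on other days."""
    if d.month == 7 and d.day == 4:
        return WEIGHTS.get(d.weekday(), 1.0)
    return 0.0
```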

You can do some analysis of residuals to determine the weights that make sense for your load.

You can expand this idea further by specifying weights for July 3 and July 5 as well.

By specifying the weights in transforms, you can build complicated logic to address the issue. Further, a single variable that accounts for July 3, July 4 and July 5 can be a powerful way to address the problem of the holiday spillover effect along with the shifting weekday.
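One way to sketch such a combined variable in Python; the July 3 and July 5 weights below are purely hypothetical placeholders, standing in for values you would derive from a residual analysis:

```python
from datetime import date

def july4_window(d: date) -> float:
    """A single variable covering July 3-5 with day-dependent weights."""
    if d.month != 7 or d.day not in (3, 4, 5):
        return 0.0
    if d.day == 4:
        return 1.0
    if d.day == 3 and d.weekday() == 4:  # Friday July 3: observed holiday
        return 0.8                       # when July 4 falls on a Saturday
    return 0.2                           # generic spillover weight
```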

Before you get too excited about this, you should consider the following. By the time you get the non-stationary holiday effect right, the holiday has already happened. And, when the holiday happens, you won’t get it right because it hasn’t happened on that same day in seven years. In other words, you’ll get it right, but it will be just in time to be too late.

Have you seen the new Itron website? Be sure to visit the new forecasting page at www.itron.com/forecasting.

Dates and Serial Numbers: Might As Well Leap

In Excel and MetrixND, dates are actually serial numbers. This is obvious in Excel when you format a cell containing a date as a number. In the following image from Excel, the first and second columns contain the same dates—the first column is formatted to display the value as a date and the second column is formatted to display the value as a number. To populate this series, I typed 1/1/1900 in the first cell and I added 1 to the prior cell in each subsequent row, dragging the series downward. For the sake of brevity, I have hidden the rows between 1/3/1900 and 2/28/1900.

There are a few observations to make about this image. First, the serial number is indexed to 1/1/1900. That is, Jan. 1, 1900 is indicated as 1. Second, the dates indicate that there was a Feb. 29 in 1900. There is one big problem with this, however: 1900 was NOT a leap year.

Let’s review the rule for determining if a year is a leap year in the Gregorian calendar (which is the calendar used most widely around the world). A year is a leap year if the year is divisible by 4, except if it is divisible by 100, but years divisible by 100 are leap years if they are also divisible by 400. Thus, the year 2000 was a leap year because it is divisible by 4, 100 and 400, but the year 1900 was not a leap year because it is divisible by 4 and 100, but not by 400.

As a side note, the following MetrixND transform creates a binary variable that indicates if a year is or is not a leap year, but I digress.
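The rule translates directly into a binary indicator; here is the logic in Python (MetrixND’s own transform syntax differs, but the divisibility tests are the same):

```python
def is_leap_year(year: int) -> int:
    """Return 1 for Gregorian leap years: divisible by 4,
    except centuries, which must also be divisible by 400."""
    return int(year % 4 == 0 and (year % 100 != 0 or year % 400 == 0))
```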

This means that Excel is incorrectly identifying 1900 as a leap year. When I found this out, I was hoping that I had made a great discovery. Much to my chagrin, however, Microsoft is aware of this and you can read a detailed explanation of it here. Essentially, the bug was intentionally implemented in Excel for backward compatibility with Lotus 1-2-3, the early spreadsheet application with which Excel competed and in which the bug originated.

Let’s see how MetrixND addresses this issue. The following MetrixND transform utilizes the keyword DATE, which returns the serial number of the date. The following image displays the end of February and the start of March in 1900.

There are two things to observe. First, March 1 and March 2 are 61 and 62 respectively, which align exactly with the results from Excel. Second, there is (correctly) no Feb. 29 in 1900. This means all the days after Feb. 28, 1900 have the same serial number in Excel and MetrixND.

But, where does that leave us prior to Feb. 28? The following shows the dates including Jan. 1, 1900. The dates prior to Feb. 28, 1900 are different by one between the two applications, but that is of little consequence since we rarely have access to daily data from 1900.

The takeaway is that MetrixND indexes the date’s serial number to Dec. 31, 1899 instead of Jan. 1, 1900, as Excel does. Because MetrixND correctly makes 1900 a non-leap year, the serial numbers are consistent starting on March 1, 1900 and thereafter.
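The two indexing schemes can be sketched in Python; the Excel function below reproduces the deliberate bug by adding one for dates on or after March 1, 1900:

```python
from datetime import date

def metrixnd_serial(d: date) -> int:
    """Serial number with Dec. 31, 1899 as day 1 and no phantom Feb. 29."""
    return (d - date(1899, 12, 30)).days

def excel_serial(d: date) -> int:
    """Excel's serial: Jan. 1, 1900 is day 1, plus a phantom Feb. 29, 1900."""
    days = (d - date(1899, 12, 31)).days
    return days + 1 if d >= date(1900, 3, 1) else days
```

The two functions agree for every date from March 1, 1900 onward and differ by one before that.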

I am hopeful this pedantic piece of knowledge will allow you to sleep better at night.

Have you seen the new Itron website? Be sure to visit the new forecasting page at www.itron.com/forecasting.

Seeing Climate Change First-Hand

I just returned from a stirring expedition to the frozen Arctic, the fastest-warming place on earth. During the trip, I witnessed first-hand what is happening in this unique ecosystem, and it is concerning. The Arctic is ground zero for the impacts of a warming climate as changes are happening faster there than anywhere else in the world. It was a great opportunity to visit this unique place, and I would like to pass along what I saw and learned.

The Journey Begins
We started our journey in Oslo, Norway, where we spent three days learning about our upcoming adventure, then we set off for Svalbard to start our expedition. Ranging from 74° to 81° north latitude, the Svalbard archipelago and Longyearbyen—the world’s northernmost town—are covered in ice for most of the year. Since 1971, temperatures in Svalbard have risen on average by 4°C, which is five times faster than the global average of 0.8°C.1 To prepare for sub-zero temperatures, I brought specialized clothing, but the weather was quite mild with temperatures in the mid-30s most days. This was not quite what I was expecting from a trip to the Arctic.

I could see the impacts of this warming trend at the Global Seed Vault in Longyearbyen, a frozen storage bank designed to keep the world’s plants safe from catastrophe. The permafrost is melting around the installation and has caused the entrance to leak; as a result, the site is currently being rebuilt. I could hear the warming in the crackling of ancient glacier ice as it melted into the fjords of Svalbard. On our hikes, we examined the bones of several Svalbard reindeer that starved over the winter months due to a greater amount of rainfall in a warming climate. Unlike snow, the rain freezes to the ground, making it harder for the reindeer to paw away and get to the food underneath.

A Warming Planet
While it’s difficult to draw any conclusions from a single experience, the scientific record is clear – the Arctic is warming and, in turn, much of the ice is melting. Robert Swan, the founder of the 2041 Foundation and the first person to walk to both the North and South Poles, was one of the leaders on our expedition. Robert told me that when he first walked to the North Pole in May 1989, the team struggled because the ice melted earlier that year than it ever had before – even so, they found a path, albeit a sometimes-circuitous route. He said that same unassisted expedition today is nearly impossible due to a reduction of multi-year sea ice and the speed at which the ice melts each year.

We know this general warming trend is caused by increasing levels of carbon dioxide in the atmosphere, the basic physics of which were worked out in the late 1800s, with more accurate models and calculations coming in the early and mid-1900s as industrialization took over and the science improved.2 Today, we have a robust worldwide measured data set and advanced predictive models that create a more refined understanding, but the basic science is over 200 years old.

When these discoveries were first made, the global CO2 level was around 286 ppm—today we are at 413 ppm.3 The effect is cumulative: the more CO2 that enters the atmosphere, the warmer our world will get, ice will melt, sea levels will rise and global weather patterns will change. The scientific consensus is that, to avoid the worst outcomes, we need to keep the average global temperature rise well below +2°C, with +1.5°C as the aspirational target.4

To hold temperatures to these levels, all countries must drastically reduce their carbon emissions over the coming years and make additional efforts to remove significant amounts of carbon from the atmosphere itself.

Committed to Creating a More Resourceful World
Despite these changes, life persists in the Arctic. On our journey, we saw reindeer, arctic foxes, polar bears, walruses, seals, humpback whales, beluga whales, blue whales, fin whales and the critically endangered bowhead whales. The bowhead is rare and not often seen around Svalbard after having been hunted nearly to extinction in the past, so we were lucky to see them.

Protecting this interesting and important ecosystem is why I’m glad to work at a company like Itron that has a vision of creating a more resourceful world. The essence of resourcefulness for me is about finding quick and clever solutions to the problems we face in society. There are many companies offering impressive solutions for renewable generation, storage technologies and carbon decoupling solutions like electric vehicles. The solutions exist today, but society has yet to deploy them at the scale required to stay within our global carbon target.

We need to determine strategies that bring them to scale, enabling society to smoothly transition from today’s carbon-based energy systems to the renewable options of the future. I believe this is a mission that utilities and smart cities must fully embrace.

Reference Guide:
1. According to a report commissioned by the Norwegian Environment Agency.
2. The Discovery of Global Warming.
3. https://www.co2levels.org/ - Historical CO2 record from the Law Dome DE08, DE08-2 and DSS ice cores. Present-day atmospheric CO2 concentrations (ppm) derived from in situ air measurements at Mauna Loa Observatory, Hawaii.
4. Paris Agreement

Itron at IEPEC in August

The International Energy Program Evaluation Conference (IEPEC) is the premier conference for energy program implementers; evaluators; local, state, national and international representatives; and academic researchers involved in distributed energy resource (DER) energy program evaluation. The conference advances the goals of conserving natural resources and reducing greenhouse gas emissions by helping to overcome one of the key barriers to implementing DER programs – the lack of confidence in the reported results.

Itron’s Strategic Analytics group will be attending this year’s IEPEC conference in Denver, Colorado, Aug. 20-22. Our team will be presenting seven different papers and posters covering topics in behind-the-meter storage, smart homes and energy efficiency programs.

Itron’s Strategic Analytics staff will present four distinct papers showing our thought leadership on several prominent DER topics:

  • Brian McAuley will present “Spring Cleaning: Organizing the Impacts within the Behind-the-Meter Energy Storage Closet.” Brian will examine the relationships between customer retail rates, utility marginal costs and storage dispatch behavior to better understand how utilities can maximize the benefits of BTM energy storage.
  • Sean Maher will present “The Rocky Road to Shifting Load - Challenges of Installing Advanced Energy Storage and EV Chargers.” In his session, Sean will present an overview of implementation concerns associated with smart homes and the interconnectivity and control of smart equipment in California.
  • William Marin will present “Does Behind-the-Meter Storage Increase Solar Adoption? An Attribution Analysis Tells the Story.” His presentation will summarize survey findings with various market actors, such as storage customers and project developers, and discuss the implications of attributing the impacts of energy storage systems paired with PV.
  • Morgan Smith will present “Piquing Interest in Shifting Peaks: Studying the Real-World Impacts of Load Shifting.” In her presentation, Morgan will describe the ability of smart controls and advanced technologies to shift customer usage while maintaining customer satisfaction.

In addition to the presentations listed above, our staff will be showcasing multimedia displays on the following topics:

  • Mike Heng will present an interactive display showing in detail the critical steps required when sampling, metering, validating and analyzing BTM storage performance data in “Realizing the Full Capacity of Energy Storage Data: Critical Steps in Evaluating Behind-the-Meter Battery Data.”
  • Morgan Smith will showcase Itron’s rate analysis dashboard, which allows users to visualize the expected impact of combining different rate structures and a variety of emerging technologies in “Seeing is Believing: Visualizing Customer and Utility Impacts of Emerging Technologies.”
  • Ethan Barquest will present evaluation results that illustrate the effectiveness of income eligible programs in reaching targeted low-income customers and helping them overcome the barriers to purchasing high efficiency lighting in “Is this thing on? Low Income Participation Behaviors in Upstream Lighting.”

For more information on the conference, please visit https://www.iepec.org/ or contact any of the Itron presenters attending IEPEC. We hope to see you there!

Itron Rings Closing Bell at Nasdaq

On June 26, Philip Mezey, Itron president and CEO, and members of Itron’s leadership team joined Nasdaq in Times Square for a ceremonial ringing of the closing bell in celebration of Itron’s 26-year Nasdaq listing anniversary.

During this event, the team counted down to the closing bell and Philip Mezey signed his name, officially closing the Nasdaq market for the day at 4 p.m. EDT. The ceremony was broadcast on the four major financial news networks. Joining Itron for the event was Chris Dearborn, managing director of the market intelligence desk for Nasdaq as well as Joe Brantuk, vice president of new listings and IPOs.

This event was in conjunction with Itron Investor Day—hosted at the Nasdaq MarketSite on June 27— where we provided a deep dive for investors, analysts and media on our strategy, market outlook, new business segments and technology roadmap.

The team was honored and excited to participate in this iconic ceremony on one of the world’s largest stock exchanges.

You can watch the full bell ringing ceremony here and watch the Nasdaq interview with Philip Mezey here.

The Front Door for Cleantech Innovation in Los Angeles

The Los Angeles Department of Water and Power’s La Kretz Innovation Campus is the front door for cleantech innovation in Los Angeles. The campus serves as the home for the Los Angeles Cleantech Incubator (LACI), which is building an inclusive green economy through unprecedented programs like the Transportation Electrification Partnership (TEP). The TEP is accelerating progress toward the transportation electrification and zero emissions goods movement in the greater LA region in advance of the 2028 Olympic and Paralympic Games. LACI’s priority areas include zero emissions transportation, 100% clean energy and smart, sustainable cities.

Itron Idea Labs is a proud sponsor of LACI and its mission. Recently, Paul Notti of Itron’s customer enablement team and Frank Monforte of Itron’s forecast team attended a meeting at LACI where the startup companies currently championed by LACI were introduced to the California Energy Commission staff responsible for funding clean technology research. The 16 startup companies span a range of cleantech technologies, from long-duration energy storage and spray-on solar PV window coatings to building decarbonization strategies and transportation electrification via a myriad of EV charging technologies. The information gathered directly impacts Itron’s go-to-market strategies for operational load forecasting and demand management. This is just another example of how Itron is helping lead the way to a more resourceful, greener economy.

Resiliency in the Wake of Hurricane Season – Using IIoT Applications to Prepare for and Mitigate Natural Disasters

The 2019 Atlantic hurricane season has officially begun. From June 1 to Nov. 30, NOAA predicts 9 to 15 named storms. With hurricane season upon us, disaster preparedness and mitigation are crucial to ensure the safety and well-being of citizens. Now, more than ever, we need smart, connected and resilient communities that can quickly recover after natural disasters, reduce the impacts and risks of hazards by forecasting issues before they occur, and restore power much faster or avoid outages altogether while keeping citizens informed.

The devastation caused by natural disasters has a serious impact on critical infrastructure. Recent advances in technology, such as Industrial IoT (IIoT) applications, are helping communities better prepare for and respond to natural disasters, while increasing reliability and safety during disasters. Itron has worked with utilities and municipalities to improve preparation through the implementation of smart grid technology: preventing thousands of service interruptions, sensing when the power is out at each endpoint, tracing back to where the outage has occurred, targeting crews for more effective dispatch and understanding when customers regain power once restoration work has begun. IIoT applications are making it possible to predict, prepare for, respond to and recover from natural disasters.

Predict and Prepare
Smart communities are leveraging IIoT applications to help mitigate risks before a natural disaster occurs. IIoT applications that enable grid awareness are helping utilities better understand the state of their electrical distribution systems—they can spot issues and fix them before they create unsafe conditions. For example, a pole tilt sensor can detect when a pole is down. Knowing which poles are down, and where, speeds restoration efforts and improves response times to those that need attention. Similarly, line sensors that monitor for hazardous situations on the distribution line can help spot issues early. Monitoring solutions, such as voltage analysis and distribution transformer monitoring, can evaluate the health of devices on the grid to catch failing equipment before it creates a safety issue.

Advanced sensors in the network enable utilities to anticipate where problems will occur by creating awareness of intermittent interference from vegetation, equipment malfunctions, loose connections or heat buildup in the system. When a utility or city can anticipate where fires might start or detect damage in the system, it can de-energize equipment and dispatch crews more effectively.

Utilities are also leveraging IIoT applications to respond more quickly and effectively to the impacts of a disaster, including outages. Real-time intelligence in outage detection technology helps improve response times. As a result, utilities are able to accurately understand the size, location and extent of an outage, which aids in restoration by validating and continuously updating outage extents.

In addition to using smart meters, whose built-in intelligence reports instantly when the power is out, utilities can also deploy gas shutoff devices. When coupled with sensors, these devices stop the flow of gas during a hurricane and other high-risk situations such as an earthquake or flood.

IIoT applications facilitate recovery from outages caused by natural disasters. For example, in Houston, when Hurricane Harvey made landfall in August 2017, more than 250,000 people in Texas lost power. Using smart grid technology, Itron customer CenterPoint Energy was able to recover and reconnect people to power quickly, avoiding an estimated 45 million outage minutes for its customers. Distribution automation devices, such as intelligent grid switches, allowed the utility to quickly isolate problems on their grid and restore service to customers.

When Hurricane Irma hit Florida just weeks after Hurricane Harvey devastated Texas, Florida Power & Light was able to restore service to 1 million customers before the storm even exited its system, and to 2 million customers after one day. Smart grid technology also enabled the utility to avoid more than half a million outages during the storm.

As challenges intensify, resiliency must be at the heart of the intelligent devices, applications and networks that are required today and that our future calls for. With every season, communities and utilities learn what to improve and what to avoid. As the 2019 hurricane season gets underway, it’s critical that the utility industry and communities embrace a range of IIoT applications that can help them better prepare for, mitigate and recover from natural disasters.
