In early December, the World Meteorological Organization (WMO, the United Nations body that coordinates the global meteorological community) issued a preliminary report on the Global Climate in 2019. The conclusions on global climate change are consistent and unambiguous:
- 2019 will be either the 2nd or 3rd warmest year on record
- The past 5 years have been the warmest on record
- This decade will be the warmest on record, and each successive decade since the 1980s has been warmer than the one before
- Ocean heat content, which is a measure of this warming, reached record levels again in 2019
- 2019 concludes a decade of exceptional global heat and high impact weather
The map below, from the WMO report, depicts temperature anomalies for the January through October period of this year versus the 30-year normal. The majority of land and ocean areas so far this year have been on the warm side of the distribution. However, not all areas have warmed consistently; rather, many have experienced extremes.
For example, the north-central portion of the US and southern Canada have been cooler than normal during the first 10 months of this year. This cooler zone in the middle of the North American continent was on account of record wetness, which increased cloud cover, saturated soils, and led to enhanced evaporative cooling.
This situation (an isolated pocket of cool anomalies surrounded by mostly warm conditions over large spatial areas) is consistent with a warm year globally and an overall warming trend; extreme cold was much less common than extreme heat, but both ends of the spectrum were represented this year, as in years past.
On a daily basis, the impacts of climate change play out through more volatile and abnormal weather at both the local and global scale. A key component of the changing climate is weather volatility and the increase in extremes in virtually all areas of the globe. The mid-autumn period in the Northern Hemisphere is one of the most heightened periods of weather volatility, and this year has followed that trend of extreme events in both hemispheres.
We will analyze some of the recent trends in the US and discuss two developments that have caused major impacts and economic aftershocks.
California, the state with the nation’s largest economy (GDP of roughly 3 trillion US dollars), has always had to deal with wildfires. Recently, though, the wildfire season has become longer, more extreme, and more costly. Five of the six most costly wildfires on record occurred in 2017 and 2018, with losses in the 25-billion-dollar range; these are the two costliest years ever for California wildfires.
Overall, this decade has seen half of the state’s 10 largest wildfires by area and 7 of the 10 most destructive fires. Is the recent heightened frequency and severity of the fires an anomaly or is this the new normal?
Since the 1970s, California’s annual burned area has increased more than fivefold, according to the WMO. Over the past five decades, summertime forest fires have increased in size by roughly 800 percent. The greater extent and frequency of fires, combined with a steady increase in sprawl, has dramatically increased the impacts economically, societally, and in terms of regulation.
This fall was unique in a different way. Late October and November saw a plethora of fires, with nearly 7,000 reported as of December 1, according to Cal Fire, the state’s fire agency. Total acreage burned stands at nearly 250,000. This is not nearly as high as the past two years, when millions of acres burned with devastating consequences. The reason for this will be outlined below.
PSPS: Lights Out
What made this year unique was the way in which managing fire risk impacted commerce and the public in California. In 2019, there was an unprecedented, widespread preemptive Public Safety Power Shutoff (PSPS) initiative. PG&E, along with other utilities, shut off power to millions of residents when they perceived a heightened risk of wildfires from high winds affecting high-voltage lines. The utilities were attempting to manage risk after being found liable for fire destruction and damage “sparked” by their power lines. More than 3 million people lost power preemptively, numerous times, this fall as the utilities, especially PG&E, cut power on a neighborhood-by-neighborhood basis; this included portions of the San Francisco, Los Angeles, and San Diego metro areas, with the biggest impacts in northern California.
The widespread outages this year created an entirely new risk for business, transportation, and the public. At any time, the utilities could turn off the power with little specific advance notice and maintain the blackouts until they saw fit to restore power. This is great for companies that sell generators but not for business and normal life in general. It is also a source of psychological stress due to power grid uncertainty.
The combination of the fires this fall and the business interruptions caused by blackouts made this season unique. As for fire, one obvious question is why this season was not as devastating as the past few. California wildfires are driven by wind (Diablo winds in northern California and Santa Ana winds in southern California), a fuel source (dry brush and forests), and low relative humidity. These conditions tend to occur primarily in the fall, from late September through early December.
There are long-term precursor conditions that dictate how severe a fire season will be. Wildfire risk is ultimately driven by temperatures and precipitation during the summer and fall, which determine how “primed” the landscape is before any “sparks” develop. Since this is the dry season in California, the key lies in temperature trends during this time of the year.
The graph below portrays average temperatures during the June through October period from 1950 to 2019. Warmth, or lack thereof, during this precursor period is critical in setting the stage for individual fire seasons: warmer periods dry out vegetation, creating abundant fuel for wildfires.
The trend over the past 70 years is dramatic: temperatures are clearly getting warmer with time. As is always the case, there is year-on-year variability within the long-term trend. This year was not as warm as the past two and was the coolest since 2011. Hence, going into this fire season, brush and forests were not as “primed” as in the past few years. This is not the only reason this year saw far less damage than past years, but it is an important component.
Climate modeling studies consistently point to increasing fire risk with California having the highest probability of these increases. In other words, the recent trend of long and large wildfire seasons occurring with great frequency shows all signs of continuing and accelerating.
The fall, or what we call the “shoulder season”, has increasingly been the most volatile and extreme period of the year. This fall was a classic example of volatility in the system: extreme warmth across the US in early October transitioned to record cold by late October and into November. The transition was abrupt, as temperatures across much of the nation swung from record warm to record cold in a few weeks, an extremely unusual situation.
Looking back to October, the temperature rankings by state are a study in extremes. The map below shows the temperature rankings on the 1-125 scale, where 1 is the coldest and 125 is the warmest in the historical record; official records in the US extend back to 1895. On the cold side of the spectrum, 14 states had rankings in the top 10 percent within the historical record. Idaho had the coldest October since 1895 – a ranking of 1.
On the warm side, 11 states had rankings in the top 10 percent of the record. Florida had a ranking of 124, meaning it was the second warmest October since 1895. In general, the western US was extremely cold while the eastern US was extremely warm; there was very little “normal”!
For reference, an even distribution would place an equal number of states in the above-normal, near-normal, and below-normal sectors; in other words, a third of the states each would be warm, cold, and near normal. The distribution this fall was instead a classic bimodal one, with both extremes well represented but very little in the middle.
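The 1-125 ranking scheme described above is, in essence, a percentile calculation: a state's ranking is one plus the number of years on record that were colder. A minimal sketch of the idea, using a synthetic 125-year series rather than NOAA's actual station data (the numbers below are illustrative only):

```python
# Rank one year's October mean temperature within a 125-year record.
# All data here are synthetic; the real 1895-2019 series comes from NOAA/NCEI.
import random

def temperature_ranking(series, year_index):
    """Return the 1-to-N ranking: 1 = coldest year on record,
    len(series) = warmest year on record."""
    target = series[year_index]
    # Ranking = 1 + number of years strictly colder than the target year
    return 1 + sum(1 for t in series if t < target)

def sector(rank, n=125):
    """Classify a ranking into the three sectors discussed above:
    coldest third, middle third, warmest third."""
    if rank <= n // 3:
        return "below normal"
    if rank > n - n // 3:
        return "above normal"
    return "near normal"

# Synthetic example: 125 Octobers (1895-2019) with a slight warming trend
random.seed(42)
temps = [10.0 + 0.01 * i + random.gauss(0, 1.0) for i in range(125)]

rank_2019 = temperature_ranking(temps, 124)
print(f"2019 ranking: {rank_2019} of {len(temps)} ({sector(rank_2019)})")
```

With this scheme, Florida's ranking of 124 means only one October in the record was warmer, i.e. the second warmest since 1895.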
November was another study in extremes. The map below shows the temperature rankings for last month. The warmth in the western US (California had a ranking of 116 on the 1-125 scale), along with low humidity, was one of the reasons for the fires in early November.
In the eastern US, nearly all states had below normal temperatures in November and 6 had rankings in the coldest 10 percent of the historical record.
More specifically, the cold in the central and eastern US was more reminiscent of January (the heart of winter) than November. During the first half of the month, nearly 1,000 cold-temperature records were broken. Readings dropped below zero Fahrenheit as far south as Kansas, Iowa, and Ohio, an extremely unusual situation for so early in the season. On November 13th, New Orleans, LA dropped to 34 degrees, a record, which is colder than the lowest temperature of all of last winter (36 degrees).
Temperature volatility has been a theme of increasing frequency this decade. Consistent with the warming trend, extreme cold was less common than extreme heat. The fall season had the most volatility on both ends of the distribution of any season.
The past few months have featured widespread record warmth quickly followed by widespread record cold. While fall volatility may not get the headlines compared to extremes in mid-winter and mid-summer, it causes major ramifications to individuals managing transportation equipment decisions (PFF: protect from freeze; PFH: protect from heat), supplying energy to consumers (heating demand and cooling demand), dealing with infrastructure issues, etc.
Temperature trends, especially in the fall, are not what they used to be. In this case, recent history (the decades of the 1980s, 1990s, and even last decade) cannot be used as a guide to make decisions – it is not business as usual.
For planning purposes, the old business model was to take the temperature average of the past 30 years or more and treat that data as representative of both the average and the extremes. It is no longer that simple. With extremes becoming more frequent and temperatures warming over time, the risk has to be modeled differently.
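One simple way to see why the flat 30-year average falls short is to compare it with a trend-adjusted baseline. The sketch below fits a least-squares line to a synthetic warming series and extrapolates it one year ahead; this is our illustrative method choice, not a description of any particular planning model, and the data are made up:

```python
# Compare a flat 30-year climatological average with a simple
# trend-adjusted baseline. Synthetic data for illustration only.

def flat_baseline(temps):
    """Classic approach: mean of the most recent 30 years."""
    return sum(temps[-30:]) / 30

def trend_baseline(temps, horizon=1):
    """Fit an ordinary least-squares line to the full series and
    extrapolate 'horizon' years past the last observation."""
    n = len(temps)
    x_mean = (n - 1) / 2
    y_mean = sum(temps) / n
    slope = sum((x - x_mean) * (y - y_mean) for x, y in enumerate(temps)) / \
            sum((x - x_mean) ** 2 for x in range(n))
    intercept = y_mean - slope * x_mean
    return intercept + slope * (n - 1 + horizon)

# 70 years of steady warming at 0.03 degrees/year (synthetic)
temps = [15.0 + 0.03 * i for i in range(70)]

print(f"flat 30-yr baseline:   {flat_baseline(temps):.2f}")   # lags the trend
print(f"trend-adjusted (next): {trend_baseline(temps):.2f}")  # tracks it
```

In a warming series, the flat average systematically underestimates the coming year, because it is centered 15 years in the past; a trend-aware baseline removes that lag, though real risk models would also have to account for the widening extremes discussed above.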
The fire-and-ice metaphor is not confined to the US; it extends around the globe. So much literature and so many headlines focus on extremes during the middle of seasons (mid-winter and mid-summer in both hemispheres). Yet the volatility during the shoulder seasons (fall, for example) is typically of greater magnitude, and thus the impacts on transportation, business, and planning are as significant, if not more so.
To be clear, the expectation remains that global weather patterns will remain in a state of heightened volatility compared to historical norms. Extreme weather is an increasing liability to the economy and the costs, both financially and societally, will continue to rise based on current climate modeling.
Planning Mitigation and Operational Execution
At Riskpulse, we have been working with some of the world’s largest and most innovative manufacturers to help with practical solutions to deal with increasing climate volatility.
Solutions range from optimizing refrigeration use in logistics, to improved planning for over the road fleets ahead of significant events, to supply chain network risk visibility. Riskpulse serves customers such as Honda, Unilever, NASA, Apple, Humana, Delhaize, 7-Eleven, Transplace and AB-InBev.
We are at the forefront of providing practical solutions, for both planning mitigation and operational execution, to the systemic global changes that all supply chains will need to adjust for as climate volatility continues to increase.