Rapid attribution: Is climate change involved in an extreme weather event?
Research into extreme weather and climate change is making progress. It is now possible to quickly quantify the degree to which the intensity or frequency of certain events is influenced by man-made climate change.
Why is it important to rapidly establish whether – and to what extent – an extreme regional weather event is now more likely as a result of climate change? Take as an example the extreme rainfall and flooding that France experienced in 2016 on the Seine and Loire. If climate change has already verifiably increased spring rainfall in these regions, more events of this kind can be expected in future as global warming continues. Such events could lead to damage in the billions – especially in the Greater Paris area.
Correct attribution critical for risk management
The requirements for public risk management here are different to those for an exceptional one-off event without any trend. Attribution to climate drivers can have direct practical consequences: the earlier climate change can be identified as being involved in a natural catastrophe, the stronger the incentive for authorities to implement suitable adaptation measures.
It is much more difficult to attribute a single extreme weather event to its drivers shortly after its occurrence than, for example, the increase in global mean temperature that has taken place over many decades. With a set of global climate models, it is possible to attribute the warming to human-influenced climate change by way of a virtual experiment. A link is probable if the warming can only be reproduced when, in addition to natural drivers (historical volcanic eruptions and solar variability), the observed changes in greenhouse gas and aerosol concentrations, as well as land-use changes, are applied to the models. When the natural forcing variables alone are applied, excluding anthropogenic factors, the models do not arrive at the observed increase.
Earthquake-resistant construction in Japan
Official building standards have been in force in exposed regions of Japan since 1924, and have been updated many times. There were major changes, for example, in 1981 (after the 1978 Miyagi earthquake), after which a building subjected to strong ground motion may suffer damage but must not collapse. Many smaller changes followed in subsequent years, relating, for example, to the stability of wooden buildings (2000) and to the requirement that all buildings under construction be inspected by an independent body and checked for compliance with the building standards (2006).
The series of earthquakes in April 2016 caused large losses in Kumamoto Prefecture and surrounding towns (e.g. Mashiki). There were 69 deaths, and many people were injured. Almost 300,000 had to be evacuated after the main shock. Some 8,000 buildings collapsed and more than 140,000 were damaged, 24,000 severely. A large proportion of the buildings that collapsed were wooden buildings with heavy roof structures built according to the pre-1981 building standards. Several cultural heritage sites (including Kumamoto Castle and the Aso Shrine) were damaged, as was infrastructure (roads, bridges and railway lines), either directly by the quake or by subsequent landslides.
Weather extremes as one-off events
What works for a variable such as global temperature, which is averaged over time and space, is of little use in the case of weather extremes, which are sporadic in time and place. Such weather extremes can be seen as unique events in terms of their individual meteorological causes and course, so that, strictly speaking, it is not possible to derive information on their frequencies or on any changes in those frequencies.
However, the use of abstraction can help in this instance: all the events that produce intense precipitation are first pooled in one category. If the statistical series for events in this category is sufficiently large, we can check whether the associated distribution of rainfall – for example the return periods for high values – has altered significantly over time.
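The return-period check described above can be sketched in a few lines of code. The rainfall figures below are invented purely for illustration; real attribution studies work with long observational series and extreme-value statistics rather than a simple empirical count.

```python
import numpy as np

def return_period(annual_maxima, threshold):
    """Empirical return period (in years) of reaching `threshold`,
    given one block-maximum rainfall value per year."""
    sample = np.asarray(annual_maxima, dtype=float)
    p_exceed = np.mean(sample >= threshold)  # empirical exceedance probability
    if p_exceed == 0.0:
        return float("inf")                  # never observed in the sample
    return 1.0 / p_exceed

# Hypothetical annual maxima of three-day rainfall (mm) for two periods
early = np.array([55, 62, 48, 70, 58, 65, 52, 60, 57, 63])
late  = np.array([60, 72, 55, 80, 66, 75, 58, 68, 64, 73])

print(return_period(early, 70))  # 10.0 -> a 1-in-10-year event early on
print(return_period(late, 70))   # 2.5  -> much more frequent later
```

A shift in the return period of a fixed threshold between the two sub-samples is exactly the kind of change such an analysis looks for, though in practice the significance of the shift must also be tested.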
Statistics alone, however, cannot determine whether the observed changes are the result of climate change or of natural climate variability. Such evidence can be provided in the form of a climate model experiment, as illustrated with the 2016 spring floods in France. In this case, two distributions of the three-day rainfall amounts are generated for the regions in question: one for a virtual, preindustrial climate not influenced by climate change, and the other for the climate we have today. To obtain a statistically sufficient database, the climate models generate the distributions over and over again, and the results are then pooled. This also makes it possible to average out the influence of natural climate variability present in the individual model runs. The procedure ensures that climate change is the only thing determining the difference between the two distributions. Furthermore, the procedure is repeated with many different climate models, instead of just one.
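The factual-versus-counterfactual comparison can be illustrated with a toy calculation. The two gamma distributions below are mere stand-ins for pooled model output; in a real study the samples would come from large ensembles of climate-model runs, and the event threshold from observations.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative pooled three-day rainfall maxima (mm) from many model runs:
# a counterfactual "preindustrial" climate and the factual present-day climate.
counterfactual = rng.gamma(shape=8.0, scale=6.0, size=5000)
factual        = rng.gamma(shape=8.0, scale=7.0, size=5000)  # shifted wetter

event = 90.0  # magnitude of the observed event (mm over three days)

p0 = np.mean(counterfactual >= event)  # probability without climate change
p1 = np.mean(factual >= event)         # probability with climate change

print(f"probability ratio: {p1 / p0:.2f}")
```

The probability ratio p1/p0 is the quantity reported in attribution studies: a value of 2, for example, means the event has become twice as likely in the present-day climate.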
More extreme precipitation events on the Seine and Loire in future
It is evident that the three-day rainfall amounts in the spring of 2016 in France were rare events in the present-day climate. They occur roughly every hundred years in the Loire region, and are even more unusual in the Seine basin. Nevertheless, the different climate models produce consistent and thus robust results. They show that, because of climate change, the probability of regional events of at least the same intensity as 2016 has increased by factors of 2 (Loire) and 2.3 (Seine) in comparison with a world without climate change. They also show that the probabilities of less extreme events have also increased as a result of climate change. The fact that these enhanced probabilities can be attributed to climate change means that such events will occur even more often in the future.
Such attribution studies of selected weather extremes such as heatwaves, droughts and intense precipitation have been conducted regularly since 2011, generally on the basis of models. At the end of the year following the event, these studies are published in special supplements to the Bulletin of the American Meteorological Society. Climate change was found to have influenced the frequency or intensity of 65% of the more than 100 events studied so far, while no influence could be demonstrated for 35%. This illustrates how climate change is already having a significant impact on extreme events.
Because of this time lag, however, these studies fail to meet the criterion of rapid attribution mentioned above. For this reason, for a few years now, articles containing an attribution analysis have been submitted to specialist journals within a few weeks of an event (rapid attribution) – the study on the floods in France outlined above, for example, was online just three weeks after the event. A further recent example is the torrential rain and flooding that occurred in August 2016 in Louisiana, particularly in the Baton Rouge area, where one place experienced just under 650 litres/m² of rain over a period of three days.
A little over three weeks after the event, a study (van der Wiel et al., HESSD, 2016) appeared online, stating that an extreme event of this kind now occurs roughly every 30 years in the central Gulf Coast region of the USA, and has become more frequent by a factor of at least 1.4 as a result of climate change. Similar studies have also been published for a number of heat and extreme precipitation events in recent years.
Normalised losses alone not indicative enough
Rapid attribution analysis as described above is useful for informing risk management about the type and scale of the change in hazard activity, and creates an incentive to improve adaptation efforts while the event is fresh in the minds of the relevant authorities. In the case of major events in a particular region, such analyses could help to identify a long-term driver of losses that is not clearly identifiable from the time series of normalised losses. This is because such a trend would only become apparent over a much longer observation period. Major catastrophes such as river floods remain rare events in many countries, affecting different regions, exposures and vulnerabilities over the decades.
A time series of this kind does not in itself disclose the causes of the different normalised loss amounts involved. The loss amounts might tell us something about the different regional exposures or about efforts to protect against flooding, but do not at first hint at changes due to climate change. This is illustrated by the normalised losses of the great flood disasters in the United Kingdom since 1990 (see chart). All the major losses in this time sequence, in other words the events in 2000, 2007, 2014 and 2015, represent affected exposures that may partially overlap but which are also distinct from one another. It is only when we look at the attribution studies that we see that climate change has already influenced the probability of all of these events. They are 1.4 to 2 times more frequent than they would be in a world without climate change.
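Loss normalisation itself is a simple adjustment. The sketch below scales hypothetical nominal losses to a common base year using an assumed combined annual growth rate for prices and insured exposure; both the figures and the rate are illustrative, not the actual UK flood losses.

```python
# Hypothetical nominal flood losses in billions (NOT the actual UK figures)
nominal_losses = {2000: 1.5, 2007: 4.0, 2014: 1.3, 2015: 2.0}

GROWTH = 0.04      # assumed combined annual growth of prices and exposure
BASE_YEAR = 2016   # year to which all losses are normalised

def normalise(losses, base_year, growth):
    """Scale each nominal loss to base-year values, compounding the
    assumed growth over the years between event and base year."""
    return {year: round(loss * (1 + growth) ** (base_year - year), 2)
            for year, loss in losses.items()}

print(normalise(nominal_losses, BASE_YEAR, GROWTH))
```

Even after normalisation, the adjusted series reflects exposure and protection effects as much as hazard trends, which is why attribution studies are needed on top of it.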
To foster rapid attribution studies, Munich Re participates as a stakeholder in the European research initiative EUCLEIA (European Climate and Weather Events: Interpretation and Attribution). EUCLEIA is developing an operational system for climate attribution with particular focus on Europe. The causes of changed event frequencies and/or intensities need to be identified as early as possible in order to take the appropriate steps. Besides early identification of trends in hazards and losses, the main implication for the insurance industry is the continued support for corresponding efforts to improve prevention.