
Virtual earthquakes in 3D

    Thanks to supercomputers, it is now possible to simulate earthquakes and their devastating effects. Although such analyses are a valuable risk management tool, their scope of application is still limited.

    On 17 January 1995, a major earthquake devastated the city of Kobe, killing almost 6,500 people and leaving tens of thousands homeless. After this event, the Japanese government approved the construction of the world's largest shaking table, nicknamed "E-Defense" (www.bosai.go.jp/hyogo). This impressive facility was designed to perform full-scale 3D earthquake testing on buildings. It opened up a new set of opportunities and challenges for scientists and engineers, and advanced their goal of large-scale physical testing of structures subjected to strong-motion shaking.

    Does the same apply to earthquakes themselves? Would it be possible to reproduce an earthquake in a large physical experiment? It would be extremely difficult: even for a small earthquake of Mw 5.0, the energy released is comparable to that of the 1945 Hiroshima atomic explosion. It could also be extremely dangerous. Fortunately, there is an alternative: high-performance computing (HPC) makes it possible to create a virtual laboratory in which rare and unpredictable, albeit realistic, natural events like earthquakes can be simulated and studied from a physical point of view.

    In fact, we sometimes tend to forget that an earthquake is a complex dynamic phenomenon in which the propagation of waves plays a crucial role. This is probably because, after a significant event, we usually look at a static map showing the maximum observed (or modelled) ground motion amplitudes in order to estimate the impact of the event. In most cases this is a sound procedure. Nonetheless, some caveats are worth bearing in mind:

    • The map has usually been computed using a ground motion prediction equation (GMPE): a simplistic, empirical model based on the statistical regression of ground motion recordings from previous events observed elsewhere. It predicts a selection of ground motion parameters (e.g. peak ground acceleration) as a function of a few key inputs, such as the distance from the fault, the magnitude of the earthquake, the focal mechanism and the soil effect (e.g. amplification or deamplification); a minimal sketch of this functional form follows this list.
    • It has been improved by observed data (recorded during the event under investigation), but only if those data were available.
    • It therefore might not be capable of taking into account certain effects related to the intrinsic nature of an earthquake.
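
    To make the GMPE idea concrete, here is a minimal sketch of how such an empirical model is typically evaluated. The functional form and all coefficients below are illustrative assumptions, not those of any published GMPE; real models contain many more terms and calibrated coefficients.

    ```python
    import numpy as np

    def gmpe_median_pga(magnitude, rupture_distance_km, vs30_mps):
        """Illustrative GMPE: median peak ground acceleration (PGA, in g).

        ln(PGA) = c0 + c1*M - c2*ln(R + c3) + c4*ln(vs30/760)

        All coefficients are invented for demonstration purposes only.
        """
        c0, c1, c2, c3, c4 = -4.0, 1.0, 1.3, 10.0, -0.5
        ln_pga = (c0
                  + c1 * magnitude
                  - c2 * np.log(rupture_distance_km + c3)   # geometric spreading
                  + c4 * np.log(vs30_mps / 760.0))          # site amplification term
        return np.exp(ln_pga)

    # Example: Mw 7.0 event, site 20 km from the fault on stiff soil (vs30 = 400 m/s)
    print(gmpe_median_pga(7.0, 20.0, 400.0))
    ```

    A real application would also sample the large aleatory variability around this median, which is exactly where the "average suitcase" problem discussed below arises.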

    An earthquake releases a large amount of energy in a short time, primarily by means of motion and secondarily by way of sound and heat. It therefore essentially produces permanent displacement and seismic waves propagating in the soil. Indeed, if we had a sufficient number of seismometers (instruments designed to record the ground motion as a function of time) deployed in the right places, we would be able to construct a film showing the propagation of this elasto-dynamic wave. Unfortunately, this is not really feasible, as only a few countries in the world deploy dense networks of such instruments, and the time intervals between seismic events are long.

    By using a GMPE, it is usually possible to determine the ground motion of an earthquake on the basis of its magnitude, the source-to-site distance and subsoil conditions. Usually, but not always. If the area in question is characterised by complex geology and is located close to the seismic source (i.e. the fault itself), more physical modelling might be required to properly take into account the complex ground shaking occurring under these circumstances.

    A simple analogy might help to explain this more clearly: you pick up your luggage from the baggage carousel at the airport. You try to open the suitcase with the combination lock and realise you have taken the wrong bag. You have selected the suitcase on the basis of certain characteristics (colour, size, weight, brand), which it unfortunately shares with many other bags. You have correctly identified the “average suitcase”. However, you are not interested in the average suitcase. You want your bag.

    San Francisco, Los Angeles and Tokyo are three examples of locations where risk management should not rely on predictions of the average ground motion. If the spatial correlation of ground motion is not taken into account, loss estimates can be subject to large errors.
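
    The effect of spatial correlation on portfolio losses can be illustrated with a small Monte Carlo sketch. The exponential correlation model, the simple damage function and all parameter values below are assumptions chosen for illustration; production catastrophe models use calibrated correlation and vulnerability models.

    ```python
    import numpy as np

    rng = np.random.default_rng(42)

    # Hypothetical portfolio: 50 sites along a 10 km line, identical median PGA.
    n_sites = 50
    coords_km = np.linspace(0.0, 10.0, n_sites)
    median_pga_g = 0.3            # median PGA at every site (illustrative)
    sigma_ln = 0.6                # lognormal standard deviation of ground motion

    # Exponential spatial correlation of the residuals (assumed correlation range).
    correlation_range_km = 5.0
    dist = np.abs(coords_km[:, None] - coords_km[None, :])
    corr = np.exp(-3.0 * dist / correlation_range_km)

    def simulate_losses(cov, n_sims=20000):
        """Mean and 99th percentile of portfolio loss under a toy damage model."""
        residuals = rng.multivariate_normal(np.zeros(n_sites), cov, size=n_sims)
        pga = median_pga_g * np.exp(sigma_ln * residuals)
        damage = np.clip(pga / 1.0, 0.0, 1.0)   # loss ratio: linear up to PGA = 1 g
        portfolio_loss = damage.mean(axis=1)    # equal-value sites
        return portfolio_loss.mean(), np.percentile(portfolio_loss, 99)

    mean_corr, p99_corr = simulate_losses(corr)
    mean_ind, p99_ind = simulate_losses(np.eye(n_sites))
    print(f"correlated:  mean={mean_corr:.3f}  99th pct={p99_corr:.3f}")
    print(f"independent: mean={mean_ind:.3f}  99th pct={p99_ind:.3f}")
    ```

    The mean loss is similar in both cases, but ignoring correlation understates the tail: with correlated residuals, whole clusters of sites experience high shaking together, which inflates the 99th-percentile portfolio loss.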

    The so-called "physics-based simulation" (PBS) approach takes these additional factors into account, providing a more realistic picture of the specific earthquake scenario. The PBS approach differs substantially from the one based on GMPEs: the latter vastly simplifies the modelling of the maximum ground motion by means of very few input parameters and relies mainly on observed data. The former rests on a distinctly more realistic description of the earthquake physics and is therefore suited to reproducing complex seismic wave propagation phenomena, such as "near-field" effects occurring in the proximity of the seismic source, resonance inside a soft "alluvial basin" or the complex constitutive behaviour of the earth's crust.
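
    At its core, a PBS solves the elasto-dynamic wave equation numerically over a 3D model of the crust (the SPEED code referenced below uses spectral elements). As a much-simplified illustration of the principle, the following sketch solves the 1D elastic wave equation with finite differences; the grid, material properties, source and "basin" geometry are all invented for demonstration.

    ```python
    import numpy as np

    # 1D elastic wave equation u_tt = c(x)^2 * u_xx, central differences in x and t.
    nx, nt = 800, 1800
    dx = 25.0                        # grid spacing (m)
    c = np.full(nx, 2000.0)          # wave speed (m/s): rock everywhere ...
    c[600:] = 500.0                  # ... except a soft "basin" on the right
    dt = 0.8 * dx / c.max()          # time step satisfying the CFL stability condition

    f0, t0 = 2.0, 0.5                # Ricker source: peak frequency (Hz) and delay (s)
    src = 200                        # source grid index (5 km from the left edge)
    u_prev = np.zeros(nx)
    u = np.zeros(nx)
    u_max = np.zeros(nx)             # running peak |displacement| at every node

    for n in range(nt):
        lap = np.zeros(nx)
        lap[1:-1] = (u[2:] - 2.0 * u[1:-1] + u[:-2]) / dx**2
        u_next = 2.0 * u - u_prev + (c * dt) ** 2 * lap
        a = (np.pi * f0 * (n * dt - t0)) ** 2
        u_next[src] += (1.0 - 2.0 * a) * np.exp(-a) * dt**2   # inject Ricker pulse
        u_prev, u = u, u_next
        u_max = np.maximum(u_max, np.abs(u))

    # The soft basin amplifies the peak motion relative to the rock sites: an
    # impedance/resonance effect that a purely empirical GMPE cannot resolve per site.
    print("peak |u| on rock  :", u_max[210:600].max())
    print("peak |u| in basin :", u_max[600:].max())
    ```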

    An example of the capabilities of this method is provided by the PBS of the Christchurch earthquake of 22 February 2011. The observed time histories (not only the peak values) were compared against the modelled seismograms. The comparison showed that this state-of-the-art methodology is now mature enough, within a certain frequency band, to provide further insights into the ground motion occurring close to a fault and in a very complex 3D geotechnical and geological environment.
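
    Validation of this kind typically means band-pass filtering both the recorded and the simulated seismograms to the frequency band the simulation can resolve, then comparing them with a quantitative misfit. The sketch below assumes two equally sampled NumPy arrays; the band limits and the simple goodness-of-fit score are illustrative choices, not the metric used in the Christchurch study.

    ```python
    import numpy as np
    from scipy.signal import butter, sosfiltfilt

    def bandpass(trace, fs_hz, f_lo=0.1, f_hi=1.0):
        """Zero-phase Butterworth band-pass, keeping the band a PBS can resolve."""
        sos = butter(4, [f_lo, f_hi], btype="bandpass", fs=fs_hz, output="sos")
        return sosfiltfilt(sos, trace)

    def goodness_of_fit(observed, simulated, fs_hz):
        """Simple waveform agreement score: 100 * (1 - normalised L2 misfit)."""
        obs = bandpass(observed, fs_hz)
        sim = bandpass(simulated, fs_hz)
        misfit = np.linalg.norm(obs - sim) / np.linalg.norm(obs)
        return 100.0 * max(0.0, 1.0 - misfit)

    # Toy usage with synthetic data (stand-ins for a recorded and a modelled trace)
    fs = 100.0                                  # samples per second
    t = np.arange(0.0, 60.0, 1.0 / fs)
    observed = np.sin(2 * np.pi * 0.5 * t) + 0.3 * np.random.randn(t.size)
    simulated = 0.9 * np.sin(2 * np.pi * 0.5 * t + 0.1)
    print(f"fit score: {goodness_of_fit(observed, simulated, fs):.1f} / 100")
    ```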

    Given that PBS has proved its reliability, seismologists and engineers are now recreating the seismic motion of past earthquakes and simulating the ground shaking induced by ruptures occurring along well-known faults. Besides earthquake-prone areas such as San Francisco, Los Angeles and Tokyo, PBS has also been conducted for Istanbul, Wellington and Santiago de Chile. At the moment, PBS remains confined to specific areas of the world owing, on the one hand, to the level of geotechnical/geological information required to model the target area and, on the other, to the high computational cost involved. Nonetheless, it is clearly one of the most promising approaches to better understanding the consequences of these infrequent but potentially destructive natural events. Munich Re works together with the Polytechnic University of Milan to exploit the benefits that PBS offers and to incorporate 3D scenarios into our probabilistic earthquake models (http://speed.mox.polimi.it).

    Seismic wave propagation

    Examples of a physics-based simulation: the first three images show the modelled horizontal ground velocity (in centimetres per second) orthogonal to the fault for a scenario with a magnitude of 7.0 in the area of Istanbul; the snapshots are taken 15, 25 and 35 seconds after the beginning of the rupture. The bottom picture shows the modelled horizontal peak ground velocity in the examined area.

    Munich Re Experts
    Marco Stupazzini
    Consultant on geophysical risks in Corporate Underwriting/Geo Risks
