Didier Sornette (born June 25, 1957 in Paris) is a French researcher studying subjects including complex systems and risk management. He is Professor on the Chair of Entrepreneurial Risks at the Swiss Federal Institute of Technology Zurich (ETH Zurich) and is also a professor of the Swiss Finance Institute. He was previously a Professor of Geophysics at UCLA, Los Angeles, California (1996–2006) and a Research Professor at the French National Centre for Scientific Research (CNRS) (1981–2006).
Theory of earthquakes and fault networks
With his long-time collaborator Dr. Guy Ouillon, Sornette has been leading a research group on the “Physics of earthquakes” for over 25 years. The group is active in the modelling of earthquakes, landslides, and other natural hazards, combining concepts and tools from statistical physics, statistics, tectonics, seismology and more. First located at the Laboratory of Condensed Matter Physics (University of Nice, France), then at the Earth and Space Department (UCLA, USA), the group has been based at ETH Zurich (Switzerland) since March 2006.
Earthquake prediction and forecasting
Earthquake prediction
The group has tackled the problem of earthquake and rupture prediction since the mid-1990s within the broader physical concept of critical phenomena.
Treating rupture as a second-order phase transition predicts that, approaching rupture, the spatial correlation length of stress and damage increases. This in turn leads to a power-law acceleration of moment and strain release, up to the macroscopic failure time of the sample (i.e., a large earthquake in nature). This prediction has been checked on various natural and industrial/laboratory data, over a wide spectrum of scales (laboratory samples, mines, the California earthquake catalog), and under different loading conditions of the system (constant stress rate, constant strain rate). The most puzzling observation is that the critical power-law acceleration is decorated by log-periodic oscillations, suggesting a universal scaling ratio close to 2.2. The existence of such oscillations stems from interactions between seismogenic structures (see below for the case of faults and fractures), but also offers a better constraint to identify areas within which a large event may occur. The concept of critical piezoelectricity in polycrystals has also been applied to the Earth's crust.
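The decorated power-law acceleration can be written down explicitly. The sketch below uses a generic time-to-failure form with invented parameter values; the frequency omega of the log-periodic oscillations sets the geometric scaling ratio lambda = exp(2*pi/omega), which for the value chosen here lands near the reported 2.2.

```python
import math

def cumulative_moment(t, tc=10.0, A=5.0, B=1.0, m=0.3, C=0.1, omega=7.97, phi=0.0):
    """Generic time-to-failure law: power-law acceleration of the released
    moment toward the failure time tc, decorated by oscillations that are
    periodic in log(tc - t). All parameter values are illustrative."""
    dt = tc - t
    return A - B * dt**m * (1.0 + C * math.cos(omega * math.log(dt) + phi))

# Successive oscillation extrema approach tc geometrically, with scaling
# ratio lambda = exp(2*pi/omega).
lam = math.exp(2 * math.pi / 7.97)
print(round(lam, 2))  # 2.2, the empirically reported universal ratio
```

Because the oscillations are periodic in log(tc - t) rather than in t, they accelerate as the failure time approaches, which is the signature used to constrain tc.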
Earthquake forecasting
Earthquake forecasting differs from prediction in that no alarm is issued; instead, a time-dependent probability of earthquake occurrence is estimated. Sornette's group has contributed significantly to the theoretical development and study of the properties of the now-standard Epidemic-Type Aftershock Sequence (ETAS) model. In a nutshell, this model states that each event triggers its own direct aftershocks, which themselves trigger their own aftershocks, and so on. The consequence is that events can no longer be labeled as foreshocks, mainshocks or aftershocks, as they can be all of those at the same time (with different probabilities). In this model, the probability for an event to trigger another one primarily depends on their separation in space and time, as well as on the magnitude of the triggering event, so that seismicity is governed by a set of seven parameters. Sornette's group is currently pushing the model to its limits by allowing space and time variations of its parameters. Although this new model achieves better forecasting scores than any competing model, it is not sufficient for systematic reliable predictions. The main reason is that the model predicts future seismicity rates quite accurately, but fails to constrain the magnitudes (which are assumed to be distributed according to the Gutenberg-Richter law, and to be independent of each other). Other seismic or non-seismic precursors are thus required to further improve those forecasts. According to the ETAS model, the rate of triggered activity around a given event behaves isotropically. This over-simplified assumption has recently been relaxed by coupling the statistics of ETAS to genuine mechanical information: the stress perturbation due to a given event is modelled on its surroundings and correlated with the space-time rate of subsequent activity as a function of the amplitude and sign of the transferred stress.
This suggests that triggering of aftershocks stems from a combination of dynamic (seismic waves) and elasto-static processes. Another unambiguous interesting result of this work is that the Earth crust in Southern California has quite a short memory of past stress fluctuations lasting only about 3 to 4 months. This may put more constraint on the time window within which one may look for both seismic and non-seismic precursors.
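The cascade structure of the ETAS model described above can be sketched with a toy simulation. All parameter values below are illustrative placeholders, not calibrated to any catalog, and the sampling recipes (exponential Gutenberg-Richter magnitudes, inverse-transform Omori delays) are standard textbook devices rather than the group's actual implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative, uncalibrated parameters of a toy ETAS cascade.
B, M0 = 1.0, 3.0       # Gutenberg-Richter b-value and magnitude cutoff
K, ALPHA = 0.05, 0.8   # productivity: mean offspring n(m) = K * 10**(ALPHA*(m - M0))
P, C = 1.1, 0.01       # Omori kernel ~ (t + C)**(-P)

def draw_magnitudes(size):
    # Gutenberg-Richter: magnitudes above M0 are exponentially distributed.
    return M0 + rng.exponential(1.0 / (B * np.log(10)), size)

def simulate(horizon=100.0, n_background=20):
    """Every event -- background or triggered -- spawns its own direct
    aftershocks, so foreshock/mainshock/aftershock labels are relative."""
    events = list(zip(rng.uniform(0.0, horizon, n_background),
                      draw_magnitudes(n_background)))
    queue = list(events)
    while queue:
        t, m = queue.pop()
        n_kids = rng.poisson(K * 10 ** (ALPHA * (m - M0)))
        u = rng.uniform(size=n_kids)
        delays = C * ((1.0 - u) ** (1.0 / (1.0 - P)) - 1.0)  # inverse-transform Omori
        for dt, mk in zip(delays, draw_magnitudes(n_kids)):
            if t + dt < horizon:
                events.append((t + dt, mk))
                queue.append((t + dt, mk))
    return sorted(events)

catalog = simulate()
print(len(catalog))  # background events plus the whole triggered cascade
```

With these numbers the branching ratio (mean offspring per event) is 0.25, so the cascade is subcritical and terminates; pushing it toward 1 would make triggered events dominate the catalog.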
Multifractal stress activated (MSA) model of rupture and earthquakes
Ouillon and Sornette have developed a purely statistical-physics model of earthquake interaction and triggering, aiming to give more substance to the purely empirical linear ETAS model. The basic assumption of this "multifractal stress-activated" (MSA) model is that, at any place and time, the local failure rate depends exponentially on the applied stress. The second key ingredient is to recognize that, in the Earth's crust, the local stress field is the sum of the large-scale, far-field stress due to plate motion and of all the stress fluctuations due to past earthquakes. As elastic stresses add up, the exponentiation makes this model nonlinear. Solving it analytically allowed them to predict that each event triggers aftershocks at a rate decaying in time according to the Omori law, i.e. as 1/t^p, but with a special twist that had not been recognized before. The unique prediction of the MSA model is that the exponent p is not constant (close to 1) but increases linearly with the magnitude of the mainshock. Statistical analyses of various catalogs (California, Japan, Taiwan, Harvard CMT) were carried out to test this prediction, and confirmed it using different statistical techniques (stacking to improve the signal-to-noise ratio, wavelets specifically devised for multiscale analysis, extreme magnitude distributions, etc.). This result shows that small events may trigger fewer aftershocks than large ones, but that their cumulative effect may be longer-lasting in the Earth's crust. A new technique, the barycentric fixed-mass method, has also recently been introduced to considerably improve the estimation of the multifractal structures of spatio-temporal seismicity expected from the MSA model.
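The MSA model's signature prediction, an Omori exponent p growing linearly with mainshock magnitude, can be illustrated numerically; the intercept, slope and amplitude below are made-up values for the sketch, not the fitted ones reported in the literature.

```python
import numpy as np

# MSA prediction: the Omori exponent p of the aftershock rate ~ k/t**p
# increases linearly with mainshock magnitude M. p0, slope and k are
# illustrative placeholders, not fitted values.
def omori_rate(t, mainshock_mag, p0=0.5, slope=0.1, k=100.0):
    p = p0 + slope * mainshock_mag
    return k / t**p

t = np.logspace(-2, 3, 200)  # time since mainshock (days)
rate_m3 = omori_rate(t, mainshock_mag=3.0)  # p = 0.8: slow decay
rate_m7 = omori_rate(t, mainshock_mag=7.0)  # p = 1.2: fast decay

# Consequence: a large mainshock starts with many more aftershocks, but the
# sequence of a small event decays more slowly and so lasts relatively longer.
print(rate_m7[0] > rate_m3[0], rate_m7[-1] < rate_m3[-1])  # True True
```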
Faulting, jointing and damage
A significant part of the activity of Sornette's group has also been devoted to the statistical physics modelling of fractures and faults at different scales, as well as to their statistical properties. These features are important as they may control various transport properties of the crust and represent the loci of earthquake nucleation.
Statistical physics models of fractures and faults
Sornette and Sornette (1989) suggested viewing earthquakes and global plate tectonics as self-organized critical phenomena. Fault networks are self-organized critical systems in the sense that earthquakes occur on faults and faults grow because of earthquakes (Sornette, A., Ph. Davy and D. Sornette, "Growth of fractal fault patterns", ''Phys. Rev. Lett.'' 65, 2266–2269 (1990)), resulting in hierarchical properties; the study of their statistics should therefore also bring information about the seismic process itself. Davy, Sornette and Sornette introduced a model of growth pattern formation of faulting and showed that the existence of un-faulted areas is the natural consequence of the fractal organization of faulting.
Cowie et al. (1993; 1995) developed the first theoretical model that encompasses both the long-range and long-time organization of complex fractal fault patterns and the short-time dynamics of earthquake sequences. One result is the generic existence in the model of fault competition, with intermittent activity of different faults. The geometrical and dynamical complexity of faults and earthquakes is shown to result from the interplay between spatio-temporal chaos and an initially featureless quenched heterogeneity. Miltenberger et al. and Sornette et al. (1994) showed that self-organized criticality in earthquakes and tectonic deformations is related to the synchronization of threshold relaxation oscillators. Lee et al. (1999) demonstrated the intrinsic intermittent nature of seismic activity on faults, which results from their competition to accommodate the tectonic deformation. Sornette and Pisarenko (2003) performed a rigorous statistical analysis of the distribution of plate sizes participating in plate tectonics and demonstrated the fractal nature of plate tectonics.
Statistical properties of fractures and faults
Using a collection of maps centered at the same location but at different scales in Saudi Arabia (from meters to hundreds of kilometers, i.e. slightly more than five decades), it was shown that joint and fault patterns display distinct spatial scaling properties within distinct ranges of scales. These transition scales (which quantify the horizontal distribution of brittle structures) correlate well with the vertical mechanical layering of the host medium (the Earth's crust). In particular, fracture patterns can be shown to be rather uniform at scales smaller than the thickness of the sedimentary basin, and to become heterogeneous and multifractal at larger scales. Those different regimes were discovered by designing new multifractal analysis techniques (able to take into account the small size of the datasets as well as irregular geometrical boundary conditions), and by introducing a new technique based on 2D anisotropic wavelet analysis. By mapping joints within the crystalline basement in the same area, it was found that their spatial organization (spacing distribution) displays discrete scale invariance over more than four decades. Using another dataset and a theoretical model, Huang et al. also showed that, owing to interactions between parallel structures, the length distribution of joints displays discrete scale invariance.
3D fault reconstruction and mapping
Motivated by earthquake prediction and forecasting, Sornette's group has also contributed to the problem of 3D fault mapping. Given an earthquake catalog with a large number of events, the main idea is to invert for the set of planar segments that best fits the dataset. More recently, Ouillon and Sornette developed techniques that model the spatial distribution of events using a mixture of anisotropic Gaussian kernels. These approaches make it possible to identify a large number of faults that are not mapped by more traditional geological techniques because they leave no signature at the surface. The reconstructed 3D fault networks correlate well with focal mechanisms, and also provide a significant gain when used as proxies for earthquake locations in forecasting experiments. As catalogs can be very large (up to half a million events for Southern California), a catalog condensation technique has been introduced that allows one to detect probable repeating events and remove this redundancy.
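A minimal sketch of the anisotropic-kernel idea, reduced to a single synthetic fault and a single Gaussian (the actual method fits a full mixture of many such kernels): the covariance eigenvector with the largest eigenvalue recovers the fault orientation. All numbers below are invented.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic epicentres scattered along one planar fault striking 30 degrees
# (anticlockwise from the x-axis): long axis = fault trace, short axis =
# location error / fault-zone width.
theta = np.deg2rad(30.0)
along = rng.normal(0.0, 10.0, 500)   # km along strike
across = rng.normal(0.0, 0.5, 500)   # km across strike
x = along * np.cos(theta) - across * np.sin(theta)
y = along * np.sin(theta) + across * np.cos(theta)
pts = np.column_stack([x, y])

# One anisotropic Gaussian kernel: the eigenvector of the sample covariance
# with the largest eigenvalue estimates the fault orientation.
cov = np.cov(pts.T)
eigvals, eigvecs = np.linalg.eigh(cov)      # eigenvalues in ascending order
strike = np.rad2deg(np.arctan2(eigvecs[1, -1], eigvecs[0, -1])) % 180
print(round(strike, 1))  # expected to be close to 30 degrees
```

The real problem is harder: the number of kernels (faults) is unknown and events must be assigned to them, which is why a full mixture model with an iterative fitting scheme is used rather than a single covariance estimate.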
The Global Earthquake Forecasting System
In 2016, in collaboration with Prof. Friedemann Freund (with John Scoville) at NASA Ames and GeoCosmo, Sornette (with Guy Ouillon) launched the Global Earthquake Forecasting System (GEFS) to advance the field of earthquake prediction. The project is rooted in the rigorous theoretical and experimental solid-state physics of Prof. Friedemann Freund, whose theory can explain the whole spectrum of electromagnetic-type phenomena that have been reported before large earthquakes for decades, if not centuries: when rocks are subjected to significant stresses, electrons and positive holes are activated; the latter flow to less-stressed domains of the material, generating large-scale electric currents. These in turn induce local geoelectric and geomagnetic anomalies, stimulated infrared emission, air ionization, and increased levels of ozone and carbon monoxide. All of these fluctuations are currently measured using ground stations or remote-sensing technologies.
There are innumerable reports of heterogeneous types of precursory phenomena, ranging from the emission of electromagnetic waves from ultralow frequency (ULF) to visible (VIS) and near-infrared (NIR) light, through electric and magnetic field anomalies of various kinds (see below), all the way to unusual animal behavior.
Space and ground anomalies preceding and/or contemporaneous with earthquakes include:
Satellite Component:
1. Thermal Infrared (TIR) anomalies
2. Total Electron Content (TEC) anomalies
3. Ionospheric tomography
4. Ionospheric electric field turbulences
5. Atmospheric Gravity Waves (AGW)
6. CO release from the ground
7. Ozone formation at ground level
8. VLF detection of air ionization
9. Mesospheric lightning
10. Lineaments in the VIS-NIR
Ground Station Component:
1. Magnetic field variations
2. ULF emission from within the Earth crust
3. Tree potentials and ground potentials
4. Soil conductivity changes
5. Groundwater chemistry changes
6. Trace gas release from the ground
7. Radon emanation from the ground
8. Air ionization at the ground surface
9. Sub-ionospheric VLF/ELF propagation
10. Nightglow
These precursory signals are intermittent and do not seem to occur systematically before every major earthquake. Researchers have so far been unable to explain and exploit them satisfactorily, in large part because they have been studied in isolation rather than together.
Unfortunately, there is no worldwide repository for such data, and these databases are most often under-exploited, analysed with overly simplistic methods or with the cross-correlations among them neglected (most often because the data are acquired and held by distinct, competing institutions). The GEFS stands as an ambitious initiative with the following goals: (i) initiate collaborations with many data centers across the world to unify competences; (ii) propose a collaborative platform (InnovWiki, developed at ETH Zürich) to develop a mega repository of data and analysis tools; (iii) develop and rigorously test real-time, high-dimensional multivariate algorithms to predict earthquakes (location, time and magnitude) using all available data.
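As a toy illustration of goal (iii), the sketch below estimates the delay between two synthetic precursor channels by maximizing their lagged cross-correlation. The signals and the 5-sample lag are invented for the example; they stand in for, say, a satellite TEC series versus a ground ULF series over the same region.

```python
import numpy as np

rng = np.random.default_rng(7)

# Channel b echoes channel a with a 5-sample delay, buried in noise --
# invented stand-ins for two precursor time series.
n, true_lag = 2000, 5
a = rng.normal(size=n)
b = np.concatenate([np.zeros(true_lag), a[:-true_lag]]) + 0.5 * rng.normal(size=n)

def best_lag(x, y, max_lag=50):
    """Lag of y relative to x maximising the Pearson correlation."""
    m = len(x)
    def corr(lag):
        if lag >= 0:
            xs, ys = x[:m - lag], y[lag:]
        else:
            xs, ys = x[-lag:], y[:m + lag]
        return np.corrcoef(xs, ys)[0, 1]
    return max(range(-max_lag, max_lag + 1), key=corr)

print(best_lag(a, b))  # recovers the 5-sample delay
```

Real multivariate algorithms would have to handle many channels at once, with gaps, differing sampling rates and nonstationary noise, which is exactly why a shared repository and common analysis tools matter.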
Endo-exo dynamics of social collective behaviours
In 2004, Sornette used Amazon.com sales data to create a mathematical model for predicting bestseller potential based on very early sales results. This was further developed to characterise the dynamics of success of YouTube videos. This provides a general framework to analyse precursory and aftershock properties of shocks and ruptures in finance, material rupture, earthquakes, and Amazon.com sales: his work has documented ubiquitous power laws similar to the Omori law in seismology that allow one to distinguish between external shocks and endogenous self-organization.
Logistic function, logistic equations and extensions
With collaborators, Sornette has extensively contributed to the application and generalisation of the
logistic function
(and equation). Applications include tests of chaos of the discrete logistic map, an endo-exo approach to the classification of diseases, the introduction of delayed feedback of population on the carrying capacity to capture punctuated evolution, symbiosis, deterministic dynamical models of regime switching between conventions and business cycles in economic systems, the modelling of periodically collapsing bubbles, interactions between several species via the mutual dependences on their carrying capacities.
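The "tests of chaos of the discrete logistic map" mentioned above can be illustrated by estimating the map's Lyapunov exponent, whose sign separates the chaotic from the periodic regime. This is a generic textbook computation, not the specific test used in the cited work.

```python
import math

def lyapunov(r, x0=0.4, n_transient=500, n_iter=5000):
    """Average log-derivative along an orbit of the logistic map
    x -> r*x*(1-x): positive means chaos, negative a stable orbit."""
    x = x0
    for _ in range(n_transient):          # discard the transient
        x = r * x * (1 - x)
    total = 0.0
    for _ in range(n_iter):
        x = r * x * (1 - x)
        total += math.log(abs(r * (1 - 2 * x)))
    return total / n_iter

print(lyapunov(2.5) < 0 < lyapunov(3.9))  # True: stable vs chaotic regime
```

At r = 2.5 the orbit converges to the fixed point x* = 0.6 (derivative magnitude 0.5, so the exponent is about log 0.5 < 0), while at r = 3.9 the map is chaotic and the exponent is positive.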
Another application is a methodology to determine the fundamental value of firms in the social-networking sector, such as Facebook, Groupon, LinkedIn Corp., Pandora Media Inc., Twitter and Zynga, and more recently the question of what justifies the skyrocketing values of unicorn companies. The key idea proposed by Cauwels and Sornette is that the revenues and profits of a social-networking firm are inherently linked to its user base through a direct channel that has no equivalent in other sectors; the growth of the number of users can be calibrated with standard logistic growth models and allows for reliable extrapolations of the size of the business at long time horizons. With their PhD student, they applied this methodology to the valuation of Zynga before its IPO and demonstrated its value by presenting ex-ante forecasts leading to a successful trading strategy. A recent application to the boom of so-called "unicorns", the name given to start-ups valued at over $1 billion, such as Spotify and Snapchat, is documented in a master's thesis.
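The Cauwels-Sornette valuation idea (calibrate logistic user growth, then extrapolate the carrying capacity) can be sketched as follows. The user counts and parameter values are invented, and the fitting recipe (SciPy's curve_fit) is just one convenient choice, not the authors' methodology.

```python
import numpy as np
from scipy.optimize import curve_fit

def logistic(t, K, r, t0):
    """S-curve user growth; K is the carrying capacity (long-run user base)."""
    return K / (1.0 + np.exp(-r * (t - t0)))

# Synthetic quarterly active-user counts (millions), observed only up to
# quarter 12, i.e. before the inflection point at t0 = 14. All numbers are
# invented; this is not Zynga's (or anyone's) actual data.
K_TRUE, R_TRUE, T0_TRUE = 250.0, 0.35, 14.0
t_obs = np.arange(13, dtype=float)
users = logistic(t_obs, K_TRUE, R_TRUE, T0_TRUE)

(K_fit, r_fit, t0_fit), _ = curve_fit(logistic, t_obs, users,
                                      p0=[150.0, 0.2, 10.0], maxfev=20000)
print(round(K_fit, 1))  # extrapolated carrying capacity
```

The extrapolated K, combined with an assumed revenue per user, is what turns the growth fit into a valuation; with noisy real data the uncertainty on K before the inflection point is the dominant source of valuation risk.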
Financial bubbles
He has contributed theoretical models, empirical tests of detection, and operational implementations of forecasts of financial bubbles.
The JLS and LPPLS models
By combining (i) the economic theory of rational expectation bubbles, (ii) behavioral finance on imitation and herding of investors and traders, and (iii) the mathematical and statistical physics of bifurcations and phase transitions, he has pioneered the log-periodic power law singularity (LPPLS) model of financial bubbles. The LPPLS model considers the faster-than-exponential (power law with finite-time singularity) increase in asset prices, decorated by accelerating oscillations, as the main diagnostic of bubbles. It embodies the effect of positive feedback loops of higher return anticipations competing with negative feedback spirals of crash expectations. The LPPLS model was first proposed in 1995 to predict the failure of critical pressure tanks carried on board the European Ariane rocket, and as a theoretical formulation of accelerating moment release to predict earthquakes. It was then proposed to also model financial bubbles and their burst by Sornette, Johansen and Bouchaud, and independently by Feigenbaum and Freund. The formal analogy between mechanical ruptures, earthquakes and financial crashes was further refined within the rational expectation bubble framework of Blanchard and Watson by Johansen, Ledoit and Sornette. This approach is now referred to in the literature as the JLS model. Recently, Sornette added the S to the LPPL acronym of "log-periodic power law" to make clear that the "power law" part should not be confused with power law distributions: indeed, the "power law" refers to the hyperbolic singularity of the form (t_c − t)^(−m), where t_c is the critical time marking the end of the bubble.
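A minimal sketch of an LPPLS-type trajectory, with made-up parameter values: the log-price accelerates faster than exponentially toward the critical time tc, decorated by log-periodic oscillations.

```python
import numpy as np

# Illustrative LPPLS-type trajectory with invented parameter values; B < 0
# gives a price rising toward the critical time tc, and C sets the relative
# amplitude of the log-periodic oscillations.
def lppls_log_price(t, tc=2.0, A=8.0, B=-0.5, C=0.02, m=0.5, omega=9.0, phi=0.0):
    dt = tc - t
    return A + B * dt**m + C * dt**m * np.cos(omega * np.log(dt) + phi)

t = np.linspace(0.0, 1.99, 500)
logp = lppls_log_price(t)

# Faster-than-exponential growth: the growth rate of the log-price itself
# increases as t approaches tc, diverging at the finite-time singularity.
growth = np.diff(logp) / np.diff(t)
```

Because 0 < m < 1, the price remains finite at tc while its slope diverges; in practice the fitted tc is read as the most probable time for the end of the bubble, not a guaranteed crash date.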