Earthquakes, together with their direct effects such as tsunamis, fires and landslides, are the second most devastating natural hazard worldwide after tropical storms. Predicting the time, magnitude and location of earthquakes is thus a critical need, but it is not yet possible and is considered the Holy Grail of seismologists around the world. Scientists have long strived to identify reliable diagnostic precursors of an approaching earthquake: a parameter, or combination of parameters, measured before an earthquake that would allow the origin time of a pending earthquake to be forecast with high probability.
In a new study, Dr. Sadegh Karimpouli and his colleagues from the Section Geomechanics and Scientific Drilling of the GFZ German Research Centre for Geosciences, together with researchers from the German Climate Computing Centre in Hamburg, Stanford University, USA, and the University of Memphis, USA, report on a successful approach to forecasting the ‘time-to-earthquake’, albeit in the lab. The researchers used acoustic monitoring during rock deformation experiments and novel Machine Learning techniques. The results have been published in the study entitled “Explainable machine learning for labquake prediction using catalog-driven features” in the journal Earth and Planetary Science Letters. The study was funded by the EU HORIZON DT-GEO project. “We think that our results are very encouraging,” says first author Dr. Karimpouli from GFZ.
Rock deformation in the lab indicates important precursor processes
Regardless of whether accumulated energy is released in smaller or larger earthquakes, activating a 1-cm or a 100-km section of a tectonic fault, seismic events are believed to be generally preceded by preparatory processes. However, these processes cannot be easily measured in nature. “This is why we bring rock specimens to the laboratory and perform experiments under full control. In these experiments we can observe the preparatory processes producing laboratory earthquakes, so-called Acoustic Emissions (AEs). Thanks to the high-resolution monitoring in the lab, these processes can be detected, interpreted and then used for earthquake forecasting,” says Dr. Grzegorz Kwiatek, Working Group Leader in the GFZ Section Geomechanics and Scientific Drilling, who conceptualized the study and supervised the project.
“Observational gap” between laboratory and natural scale
Earthquakes are the final outcome of a complex deformation process that accumulates energy in the Earth’s crust. However, the insufficient resolution of earthquake monitoring in nature, as well as the complexity of natural fault systems, makes it hard to investigate the significance of the various parameters reflecting preparatory processes in the field. This is referred to as an “observational gap” between the laboratory and natural scales.
To account for the deficiencies of field observations, the research group turned to data from laboratory “stick-slip experiments”, which can reproduce analogues of multiple earthquake cycles, including earthquake preparatory processes, in a fully controlled environment. “Stick-slip” means that the fault slips, is then reloaded, and slips again.
Many mini-quakes in a rock sample accelerate research
This is an analogue of the seismic cycle in nature, where, however, it takes decades or centuries for the same area to generate another earthquake. To speed up research, these processes are recreated in the laboratory in fast motion.
Dr. Thomas Goebel, Center for Earthquake Research and Information, University of Memphis, USA, one of the co-authors of this study, conducted such experiments in the geomechanical high-pressure laboratory at GFZ. A cylindrical granite sample, 5 cm in diameter and 10 cm in height and hosting a rough fault with a complex surface, was subjected to stress conditions typical of the Earth’s crust several kilometers below our feet and then loaded further to trigger fault slip.
The experiments resulted in repetitive slips of the complex fault producing thousands of Acoustic Emissions, mimicking the periodic occurrence of large and small earthquakes on major fault systems such as the North and East Anatolian Faults in Türkiye or the San Andreas Fault in California. The difference between the natural and laboratory scale is the meticulous control over stress and damage conditions. This goes along with the ability to closely monitor both seismic and aseismic processes at high resolution using Acoustic Emission sensors and other instrumentation. An aseismic process is slow movement of a fault that does not cause the fast slip (earthquakes) that would result in seismic waves transporting potentially destructive energy.
Two previously unknown precursor phenomena are of particular interest
The project team carefully selected and extracted 47 seismo-mechanical and statistical time-dependent parameters from the recorded seismic data, carrying information on the spatial and temporal evolution of stress and damage in the fault zone. This collection contains, for example, classical precursory parameters derived from foreshocks, such as the seismicity rate or the relative proportion of small and large seismic events.
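To illustrate what such catalog-driven features can look like in practice, the sketch below computes two classic ones, the seismicity rate and the Gutenberg-Richter b-value (the statistical relation of small to large events), for a single time window of a synthetic Acoustic Emission catalog. This is a minimal Python example with made-up data and a simplified completeness magnitude; it is not the authors' actual feature pipeline.

```python
import numpy as np

def catalog_features(event_times, magnitudes, t_start, t_end):
    """Illustrative catalog-driven features for one time window:
    seismicity rate and Gutenberg-Richter b-value (maximum-likelihood
    Aki-Utsu estimate). Inputs are hypothetical AE origin times (s)
    and magnitudes."""
    in_window = (event_times >= t_start) & (event_times < t_end)
    mags = magnitudes[in_window]
    if len(mags) < 2:
        return {"rate": 0.0, "b_value": np.nan}
    rate = len(mags) / (t_end - t_start)            # events per second
    m_c = mags.min()                                # simplified completeness magnitude
    b_value = np.log10(np.e) / (mags.mean() - m_c + 0.05)  # 0.05 = half of a 0.1 magnitude bin
    return {"rate": rate, "b_value": b_value}

# Example with a synthetic AE catalog
rng = np.random.default_rng(0)
times = np.sort(rng.uniform(0, 100, 500))           # seconds
mags = rng.exponential(scale=0.4, size=500) - 4.0   # lab AE magnitudes are small/negative
print(catalog_features(times, mags, 40.0, 60.0))
```

In a real workflow such a function would be evaluated in a sliding window over each experiment, producing one feature vector per window.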
However, the team also used two new precursory parameters that capture local damage evolution and stress-field heterogeneity on and around the fault plane. Both parameters can be derived by studying very small signals typically hidden in the seismic background noise and by determining the faulting type of small-scale AEs, the so-called focal mechanisms.
Dr. Sadegh Karimpouli, the lead author of the study, explained: “The rationale behind the selection of these parameters was, on the one hand, to maximize the amount of significant input data that could cover the complexity of the earthquake process. On the other hand, we wanted to develop parameters that could be easily understood in the context of the physical processes occurring in the fault zone. Lastly, we aimed to quantify and grade their predictive capabilities through Machine Learning (ML) models.”
The authors also note a difference in the accuracy of their results compared to studies by other groups performed on simple, smooth faults. This difference fits well with current models in which fault structure and complexity play an important role in earthquake initiation, effectively determining our capability for earthquake forecasting.
Dr. Karimpouli highlights: “Forecasting the time-to-earthquake for complex rough faults, as in our case, is much more challenging. The complexity arises from the hundreds-of-times smaller amount of input data, which makes it harder for ML to ‘learn’, as well as from complex processes in the rough fault zone, such as the evolution of roughness, that do not have to be taken into account for smooth faults.” He adds: “Nevertheless, our observations of the training process indicate that having more input data from more experiments would allow us to easily improve the model performance.”
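As a rough illustration of the forecasting step, the sketch below trains a small neural-network regressor to map windowed feature vectors (47 columns, matching the number of parameters in the study) to a time-to-failure target. The data are synthetic, and the scikit-learn MLPRegressor stands in for the study’s actual model architecture and training setup, which this article does not describe in detail.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.metrics import r2_score

# Hypothetical feature matrix: one row per time window, 47 columns of
# catalog-driven features; the target is the remaining time until the
# next labquake. Synthetic data stands in for the experimental catalog.
rng = np.random.default_rng(1)
X = rng.normal(size=(2000, 47))
y = np.abs(2.0 * X[:, 0] + X[:, 1] + rng.normal(scale=0.5, size=2000))  # time-to-failure proxy

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=1)
model = make_pipeline(
    StandardScaler(),
    MLPRegressor(hidden_layer_sizes=(64, 32), max_iter=2000, random_state=1),
)
model.fit(X_train, y_train)
print("R^2 on held-out windows:", round(r2_score(y_test, model.predict(X_test)), 3))
```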
Summary
What is particularly important in this study is that the team was able to quantify the importance of the earthquake precursory parameters. They found that the newly employed neural network takes advantage of every bit of information, even information that might be considered unimportant at first glance by a human observer. The authors conclude that while there is no universal earthquake forecasting system just yet, the new findings hold the potential to make substantial steps forward in forecasting natural earthquakes along tectonic faults. Currently the scientists are working on upscaling their methods from the laboratory to the field scale using adequate but rare field observations.
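One generic way to quantify which precursory parameters a trained model relies on is permutation importance, sketched below as a continuation of the regression example above (it reuses that example’s `model`, `X_test` and `y_test`). The study’s own explainability approach may differ; this only illustrates the idea of ranking feature contributions.

```python
import numpy as np
from sklearn.inspection import permutation_importance

# Rank features by how much shuffling each one degrades the model's
# held-out performance; 'model', 'X_test' and 'y_test' come from the
# regression sketch above.
result = permutation_importance(model, X_test, y_test, n_repeats=20, random_state=1)
ranking = np.argsort(result.importances_mean)[::-1]
for idx in ranking[:5]:
    print(f"feature_{idx:02d}: importance = {result.importances_mean[idx]:.3f}")
```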
The study in EPSL highlights that the application of ML algorithms utilizing physics-informed parameters holds promise for constraining the time-to-failure even in the case of complex, heterogeneous rough faults.
Further work
The Working Group “Faulting Mechanics”, headed by Dr. Grzegorz Kwiatek as part of the GFZ Section ‘Geomechanics and Scientific Drilling’, is actively investigating the potential for generalizing the developed ML models and for effectively using precursory seismo-mechanical parameters across all spatial scales of the earthquake process.
At the geological reservoir scale, the research group implements ML models during the stimulation of geothermal reservoirs to forecast larger earthquakes and, by controlling the injection process, prevent events that in the past have led to the closure of enhanced geothermal systems.
Original study: Sadegh Karimpouli et al.: “Explainable machine learning for labquake prediction using catalog-driven features” (in: Earth and Planetary Science Letters, DOI: https://doi.org/10.1016/j.epsl.2023.118383)
More information and related projects:
GEOREAL project
Geothermal project Helsinki
https://www.gfz-potsdam.de/en/press/news/details/improved-risk-management-for-geothermal-systems