
### A new proposal for stochastic modeling of earthquake occurrence through a compound model

##### *R. Rotondi; E. Varini*

##### 2018

#### Abstract

Earthquakes are natural phenomena that act on multiple time-space scales and of which we have only indirect measurements, strongly affected by uncertainty. These phenomena may be investigated on different, coherent time-space-magnitude scales, on which they show different critical aspects and serve different goals. In this work our objective is to consider the time evolution of disastrous earthquakes, that is, to work at medium-to-large scales, and to model their occurrence in a forecasting perspective by combining knowledge gained at different levels. In the past, phenomenological analyses of seismicity led mainly to two patterns: time-stationary point processes on regional and long time scales, and self-exciting models, which typically describe the increase of seismic activity on short space-time scales immediately after large earthquakes. Later it was noted that even catalogs stripped of secondary shocks show clusters. These observations, and the need to consider the previous patterns jointly so as to provide a better description of the phenomenon, have led to hybrid models, which generally require measurements of geodetic and geologic quantities. Among their components are smoothed-seismicity background models, renewal models, the EEPAS model, aftershock models (e.g. ETAS), and tectonic models based on a strain-rate map converted into earthquake rates. On the other hand, self-correcting models are the only ones that attempt, on large space-time scales, to incorporate physical conjecture into the probabilistic framework. They are inspired by Reid's elastic rebound theory, which was transposed into the framework of stochastic point processes by Vere-Jones in 1978.
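The opposing behaviors of the two model classes can be made concrete through their conditional intensity functions: a self-exciting (ETAS-like) intensity jumps up at each event and then decays, while a self-correcting (stress release) intensity grows between events and drops when one occurs. A minimal sketch, with purely hypothetical parameter values chosen only to illustrate the shapes:

```python
import math

def etas_intensity(t, history, mu=0.1, K=0.5, c=0.01, p=1.1):
    """Self-exciting (ETAS-like) intensity: each past event at ti
    contributes an Omori-type term that decays as t moves away from ti,
    so the hazard is piecewise decreasing between events."""
    return mu + sum(K / (t - ti + c) ** p for ti in history if ti < t)

def stress_release_intensity(t, history, a=-1.0, b=0.5, rho=1.0, jump=1.0):
    """Self-correcting (stress release) intensity: stress accumulates
    linearly at rate rho and drops by `jump` at each event, so the
    hazard is piecewise increasing between events."""
    stress = rho * t - jump * sum(1 for ti in history if ti < t)
    return math.exp(a + b * stress)
```

With a single past event at t = 1, `etas_intensity` decreases as t grows past 1, whereas `stress_release_intensity` increases: exactly the conflict between the two classes described above.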
Hence we can say that most of the probability models used in earthquake forecasting belong to the two classes of self-exciting and self-correcting models, which conflict in terms of the hazard function: piecewise decreasing for self-exciting models, piecewise increasing for self-correcting models. Attempts to reconcile this conflict were made i) by Schoenberg and Bolt (2000), who simply put together the conditional intensity functions of two point processes, one from each class; ii) by Varini (2008), who employed a state-space model to estimate the different phases of a seismic cycle; and iii) by Votsi (2014), who considered a discrete-time hidden semi-Markov model whose states are associated with different levels of the stress field. In all these proposals the results were not completely satisfactory. We present a new stochastic model for earthquake occurrences that takes into account both the need to consider jointly the opposite trends characterizing self-exciting and self-correcting models, and the idea of superimposing (not simply combining) behaviors characteristic of different time scales in a single hierarchical model. We consider all the earthquakes that are associated with a seismogenic source and exceed the magnitude threshold Mw 4.45, so that the time period in which the data set can be considered complete includes a sufficient number of strong earthquakes with Mw ≥ 5.3. These events are responsible for most of the seismic energy release; we place them on the first level of our model and assume that these leader events follow the stress release model. At the second level are the subordinate events, that is, those that occur between two consecutive leaders and tend to cluster close to them.
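The two-level hierarchy amounts to partitioning a catalog into leader events (Mw ≥ 5.3, as in the text) and the subordinate events falling in each inter-leader interval. A minimal sketch of that partition, with a hypothetical toy catalog of (time, magnitude) pairs:

```python
def split_catalog(events, leader_mag=5.3):
    """Partition a time-ordered catalog of (time, Mw) pairs into leader
    events and groups of subordinate events, one group per interval
    between consecutive leaders (threshold 5.3 as in the abstract)."""
    leaders = [(t, m) for t, m in events if m >= leader_mag]
    leader_times = [t for t, _ in leaders]
    # subordinates grouped by the inter-leader interval they fall in
    groups = []
    for t0, t1 in zip(leader_times, leader_times[1:]):
        groups.append([t for t, m in events if t0 < t < t1 and m < leader_mag])
    return leaders, groups
```

The leader times then drive the stress release model at the first level, while each group of subordinate times is modeled within its bounding interval at the second level.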
We consider the occurrence times of these events as ordered failure times in the time interval bounded by the two leaders, and we model them through a generalized Weibull distribution with a bathtub-shaped hazard function so as to match the clustering trend near the extremes of the interval. The model is examined following the Bayesian paradigm; in particular, to assign prior distributions to the model parameters we adopt an objective Bayesian perspective that combines the empirical Bayes method with vague but proper prior distributions, while estimation of the model parameters is performed through Markov chain Monte Carlo methods, which yield not only the parameter estimates (typically as posterior means) but also a measure of their uncertainty, expressed through the simulated posterior distribution of each parameter. The proposed model is applied to the sequence of earthquakes associated with one of the most active composite seismogenic sources of the Italian Database of Individual Seismogenic Sources (DISS, version 3.0.2), which is located in the Central Apennines and includes the L'Aquila earthquake, one of the most recent destructive earthquakes in Italy. We report the performance of the model in terms of marginal likelihood; moreover, we compare our model with the stress release and ETAS models on the basis of two validation criteria: the Bayes factor and the information criterion of Ando and Tsay. The Bayes factor compares models by looking for the model that best fits the data, whereas the Ando-Tsay criterion chooses the model that gives the best predictions of future observations generated by the same process as the original data.
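A bathtub-shaped hazard is high near both ends of the inter-leader interval and low in the middle, matching the clustering of subordinate events near the leaders. As an illustrative stand-in (the paper's specific generalized Weibull may differ), an additive-Weibull hazard with one shape parameter below 1 and one above 1 produces exactly this shape; the parameter values below are hypothetical:

```python
def bathtub_hazard(t, a=1.0, b=0.5, c=0.2, d=3.0):
    """Additive-Weibull hazard h(t) = a*b*t**(b-1) + c*d*t**(d-1).
    The first term decreases (b < 1), the second increases (d > 1),
    so their sum is bathtub-shaped: illustrative only, not the
    paper's exact generalized Weibull specification."""
    return a * b * t ** (b - 1) + c * d * t ** (d - 1)
```

Evaluated on (0, T) rescaled to the inter-leader interval, the hazard is large just after the opening leader, drops in the quiet middle of the cycle, and rises again as the next leader approaches.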