**Citation:** Taroni, M.; Akinci, A. A New Smoothed Seismicity Approach to Include Aftershocks and Foreshocks in Spatial Earthquake Forecasting: Application to the Global Mw ≥ 5.5 Seismicity. *Appl. Sci.* **2021**, *11*, 10899. https://doi.org/10.3390/app112210899

Academic Editor: Stefania Gentili

Received: 13 September 2021; Accepted: 8 November 2021; Published: 18 November 2021

**1. Introduction**

Building earthquake forecasting models is a fundamental step in any probabilistic seismic hazard analysis (PSHA). The spatial distribution of future seismicity is usually estimated from a seismicity catalog using one of two commonly adopted approaches: zonation [1,2] and smoothed seismicity [3,4]. In this work, we focus on the smoothed seismicity approach, which uses statistical techniques to build a spatially gridded model from the epicenters of seismic events. One of the first smoothed seismicity models was developed by [3] and used an isotropic Gaussian spatial kernel to smooth the seismicity around the epicenters. This model depends on a single parameter, the sigma of the Gaussian kernel: the larger the sigma, the stronger the smoothing, and vice versa. In the Frankel model, the sigma is fixed for every event, so the approach is called "fixed smoothed seismicity". Later, [4] developed a smoothed seismicity model that allows the sigma of the Gaussian kernel, and in general the size of any spatial kernel function, to vary according to the local density of earthquakes. The idea of this model is that where there are more events, a smaller sigma can be used to better delineate the seismic structures (i.e., the faults) that generate the seismicity; conversely, where there are fewer events, a larger sigma increases the coverage of the model in those less seismogenic zones.

In traditional PSHA, earthquakes are modeled as a Poisson process, in which the occurrence of a future earthquake is independent of previous earthquakes from the same source [5]. The Poisson hypothesis holds for declustered catalogs. To include aftershocks and foreshocks within traditional PSHA, Ref. [6] presented an approach based on the theorem of [7] and its subsequent generalization [8]. They demonstrated that the Poisson distribution can approximate the distribution of exceedances (also considering seismic sequences) under some specific conditions, e.g., for a probability of 10 percent or less of an exceedance in 50 years (a typical value used in PSHA). Ref. [9] later revised the initial procedure of [6]: rather than using their correction factor, Ref. [9] employed the b-value and the annual rate of the complete catalog as input for the PSHA computations.
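The fixed-kernel construction described above can be sketched in a few lines. The following is a minimal illustration, not the implementation of [3] or [4]: it assumes a flat 2-D Cartesian grid with unit cell size and hypothetical epicenter coordinates, and it shows how a single sigma controls the trade-off between sharply delineating structures (small sigma) and covering low-seismicity areas (large sigma). An adaptive variant would simply let `sigma` vary per event with the local event density.

```python
import numpy as np

def gaussian_smoothed_rate(grid_xy, epicenters, sigma, weights=None):
    """Fixed-kernel smoothed seismicity: sum an isotropic 2-D Gaussian of
    width `sigma` (same units as the coordinates) over all epicenters.
    `weights` allows per-event weighting (1.0 for every event by default)."""
    if weights is None:
        weights = np.ones(len(epicenters))
    # pairwise squared distances between grid cells and epicenters
    d2 = ((grid_xy[:, None, :] - epicenters[None, :, :]) ** 2).sum(axis=2)
    kernel = np.exp(-d2 / (2.0 * sigma**2)) / (2.0 * np.pi * sigma**2)
    return kernel @ weights  # expected rate density at each grid cell

# toy example: 3 hypothetical events on an 11 x 11 grid
rng = np.random.default_rng(0)
events = rng.uniform(0, 10, size=(3, 2))
grid = np.array([[x, y] for x in range(11) for y in range(11)], dtype=float)
rates_small = gaussian_smoothed_rate(grid, events, sigma=0.5)
rates_large = gaussian_smoothed_rate(grid, events, sigma=3.0)
# the small sigma concentrates density near the epicenters (higher peaks),
# while the large sigma spreads the same total weight over many more cells
```

In a real application, the coordinates would be geographic and the kernel normalized on the sphere, but the role of sigma is the same.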

Both [6] and [9] suggest using a declustered seismic catalog for the spatial estimation only, to avoid the spatial bias introduced by seismic sequences.

Therefore, a method that aims to include such sequences in the spatial estimation for PSHA needs a technique to down-weight the importance of aftershocks and foreshocks. Indeed, every seismic sequence should have the same importance in the spatial estimation of seismicity, independently of the number of events in the sequence (which can vary greatly between sequences). The declustering technique is the most dichotomous approach: it gives a weight equal to 1 to the mainshock and 0 to all other events in the sequence.
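Viewed as a weighting scheme, declustering can be written as an all-or-nothing assignment. The sketch below uses illustrative magnitudes and a hypothetical `cluster_id` array (the output of any declustering or clustering step); the largest event of each sequence is treated as the mainshock and keeps full weight, while every other event is discarded.

```python
import numpy as np

# Declustering as the most dichotomous weighting: within each sequence the
# mainshock keeps weight 1 and every foreshock/aftershock gets weight 0.
# `magnitudes` and `cluster_id` are illustrative placeholders.
magnitudes = np.array([5.6, 7.1, 6.0, 5.8, 6.3])
cluster_id = np.array([0, 0, 0, 0, 1])  # sequence 0 has 4 events, sequence 1 has 1

weights = np.zeros_like(magnitudes)
for c in np.unique(cluster_id):
    members = np.where(cluster_id == c)[0]
    mainshock = members[np.argmax(magnitudes[members])]
    weights[mainshock] = 1.0  # largest event of the sequence keeps full weight

# weights -> [0., 1., 0., 0., 1.]: each sequence contributes exactly one event
```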

In their pioneering work, Ref. [10] developed a model to determine the spatial distribution of seismicity that also includes the aftershocks and foreshocks in the seismic catalog. This approach uses a statistical model of seismicity triggering, the ETAS model [11], together with the stochastic declustering procedure [12], to assign each event a probability of being an independent event. In fact, in the ETAS model, events in the catalog are distinguished as independent and dependent rather than as mainshocks and aftershocks. The aftershocks of a seismic sequence, which depend on the sequence's mainshock, receive a very low weight in this framework. The model of Ref. [10] multiplies each spatial kernel by the probability that the associated earthquake is independent. Therefore, in this framework, the spatial density of a seismic sequence is concentrated mainly near the mainshock of the sequence (i.e., the independent event that generates all the dependent events of the sequence). With this method, the fault that caused the seismic sequence is only partially reconstructed.
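The effect of this weighting can be illustrated with a toy sequence. The independence probabilities `phi` below are purely illustrative values of the kind a fitted ETAS model with stochastic declustering would produce; they are not taken from [10]. Because the mainshock carries almost all the weight, the density along the aftershock trace (which outlines the fault) is strongly suppressed.

```python
import numpy as np

# Sketch of the weighting in the ETAS-based approach: each event's spatial
# kernel is scaled by its probability of being independent. All coordinates
# and probabilities below are illustrative placeholders.
epicenters = np.array([[0.0, 0.0],   # mainshock of a hypothetical sequence
                       [0.3, 0.1],   # aftershocks roughly along the rupture
                       [0.6, 0.2],
                       [0.9, 0.3]])
phi = np.array([0.95, 0.05, 0.03, 0.02])  # illustrative independence probabilities

def weighted_density(points, centers, weights, sigma=0.2):
    """Gaussian-kernel density with per-event weights."""
    d2 = ((points[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
    k = np.exp(-d2 / (2.0 * sigma**2)) / (2.0 * np.pi * sigma**2)
    return k @ weights

# evaluate at the mainshock and at a point along the aftershock trace
dens = weighted_density(np.array([[0.0, 0.0], [0.6, 0.2]]), epicenters, phi)
# the density at the mainshock far exceeds the density along the trace,
# so the fault outlined by the aftershocks is only partially recovered
```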

Our new, simple approach addresses this problem by assigning a uniform weight to all the events of the same seismic sequence (i.e., 1/M, where M is the number of events in the sequence). In this manner, it is possible to describe the fault, or the system of faults, in a more coherent way, avoiding giving excessive weight to the mainshock of the sequence. Here, we use the global seismic catalog (the CMT catalog [13]) to build four different spatial seismicity models, fixed and adaptive smoothed seismicity each with and without our correction, to take the seismic sequences into account. Finally, we use the last ten years of the catalog to compare the performances of the models, using their spatial likelihoods to measure their efficiency.
