Risks, Volume 5, Issue 3 (September 2017) – 19 articles

Cover Story: The new class of (p,q)-spherical distributions allows modeling data with heavy and light tails from various applied areas. The density level lines above the density hill of such a distribution may drastically change their main orientation when switching from the distribution's center to its tails. The adequate mathematical description of such distributions involves new types of generalized uniform distributions, radius functionals and stochastic representations, generalized circle numbers of non-Euclidean circles, and a corresponding new geometric disintegration method. Due to a suitable coordinate system, the evaluation of probabilities of arbitrary events becomes tractable, also for the Gauss-exponential distribution appearing in high-risk limit scenarios.
Article
An Integrated Approach to Pricing Catastrophe Reinsurance
by Carolyn W. Chang and Jack S. K. Chang
Risks 2017, 5(3), 51; https://doi.org/10.3390/risks5030051 - 19 Sep 2017
Cited by 6 | Viewed by 4421
Abstract
We propose an integrated approach straddling the actuarial science and the mathematical finance approaches to pricing a default-risky catastrophe reinsurance contract. We first apply an incomplete-market version of the no-arbitrage martingale pricing paradigm to price the reinsurance contract as a martingale by a measure change; then we apply risk loading to price in—as in traditional actuarial practice—market imperfections, the underwriting cycle, and other idiosyncratic factors identified in the practice and empirical literatures. This integrated approach is theoretically appealing for its merit of factoring risk premiums into the probability measure, and yet practical for being applicable to price a contract not traded on financial markets. We numerically study the catastrophe pricing effects and find that the reinsurance contract is more valuable when the catastrophe is more severe and the reinsurer's default risk is lower because of a stronger balance sheet. We also find that the price is more sensitive to the severity of catastrophes than to their arrival frequency, implying that (re)insurers should focus more on hedging severity than arrival frequency in their risk management programs.
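A schematic sketch of the two-step idea (the loss distribution, default mechanism and 15% loading below are illustrative assumptions, not the paper's model): a catastrophe layer is first priced as a discounted expectation under a risk-adjusted measure, and an actuarial loading is then applied on top.

```python
import numpy as np

rng = np.random.default_rng(0)
n, r, T = 200_000, 0.02, 1.0
lam_q = 0.8                                  # catastrophe arrival rate (under Q)
att, lim = 50.0, 150.0                       # layer attachment and limit ($m)

counts = rng.poisson(lam_q * T, n)
losses = np.fromiter((rng.lognormal(3.5, 1.0, k).sum() for k in counts),
                     float, n)               # aggregate catastrophe losses
payout = np.clip(losses - att, 0.0, lim)     # reinsurance layer payout
payout[rng.random(n) < 0.02] = 0.0           # reinsurer default: nothing paid

price = np.exp(-r * T) * payout.mean()       # martingale (measure-change) step
premium = price * 1.15                       # actuarial risk loading step
print(price, premium)
```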
Article
Interest Rates Term Structure under Ambiguity
by Silvia Romagnoli and Simona Santoro
Risks 2017, 5(3), 50; https://doi.org/10.3390/risks5030050 - 14 Sep 2017
Viewed by 3608
Abstract
After the financial crisis, the role of uncertainty in decision-making processes has largely been recognized as a new variable that contributes to shaping interest rates and bond prices. Our aim is to discuss the impact of ambiguity on bond interest rates (yields). Starting from the realistic assumption that investors ask for an ambiguity premium depending on the efficacy of government interventions (if any), we arrive at an exponential multi-factor affine model which includes ambiguity, as well as an ambiguous version of the Heath-Jarrow-Morton (HJM) model. As an example, we propose the realistic economic framework given by Ulrich (2008, 2011), and we recover the corresponding ambiguous HJM framework, thus offering a large set of interest rate models enriched with ambiguity. We also give a concrete view of how different simulated scenarios of ambiguity can influence the economic cycle (through rates and bond prices).
Article
Model Uncertainty in Operational Risk Modeling Due to Data Truncation: A Single Risk Case
by Daoping Yu and Vytaras Brazauskas
Risks 2017, 5(3), 49; https://doi.org/10.3390/risks5030049 - 13 Sep 2017
Cited by 4 | Viewed by 3647
Abstract
Over the last decade, researchers, practitioners, and regulators have had intense debates about how to treat the data collection threshold in operational risk modeling. Several approaches have been employed to fit the loss severity distribution: the empirical approach, the “naive” approach, the shifted approach, and the truncated approach. Since each approach is based on a different set of assumptions, different probability models emerge. Thus, model uncertainty arises. The main objective of this paper is to understand the impact of model uncertainty on value-at-risk (VaR) estimators. To accomplish that, we take the bank's perspective and study a single risk. Under this simplified scenario, we can solve the problem analytically (when the underlying distribution is exponential) and show that it uncovers patterns among VaR estimates similar to those based on the simulation approach (when data follow a Lomax distribution). We demonstrate that for a fixed probability distribution, the truncated approach yields the lowest VaR estimates, which may be viewed as beneficial to the bank, whilst the “naive” and shifted approaches lead to higher estimates of VaR. The advantages and disadvantages of each approach and the probability distributions under study are further investigated using a real data set for legal losses in a business unit (Cruz 2002).
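A minimal sketch of the parametric treatments for an exponential severity observed above a reporting threshold (the synthetic data and the 99.9% level are assumptions for illustration). By memorylessness the shifted and truncated scale estimates coincide, but the resulting quantiles differ, with the truncated approach lowest, as the paper finds:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
t = 10.0                                    # data collection threshold
full = rng.exponential(scale=50.0, size=100_000)
obs = full[full > t]                        # only losses above t are recorded
alpha = 0.999

theta_naive = obs.mean()                    # "naive": ignore the threshold
theta_excess = (obs - t).mean()             # MLE scale of the excesses over t

var_naive = stats.expon.ppf(alpha, scale=theta_naive)
var_shifted = t + stats.expon.ppf(alpha, scale=theta_excess)   # shifted model
var_truncated = stats.expon.ppf(alpha, scale=theta_excess)     # ground-up law
print(var_naive, var_shifted, var_truncated)                   # truncated lowest
```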
Article
A Cointegrated Regime-Switching Model Approach with Jumps Applied to Natural Gas Futures Prices
by Daniel Leonhardt, Antony Ware and Rudi Zagst
Risks 2017, 5(3), 48; https://doi.org/10.3390/risks5030048 - 12 Sep 2017
Cited by 4 | Viewed by 4524
Abstract
Energy commodities and their futures naturally show cointegrated price movements. However, there is empirical evidence that the prices of futures with different maturities might have, e.g., different jump behaviours in different market situations. Observing commodity futures over time, there is also evidence for different states of the underlying volatility of the futures. In this paper, we therefore allow for cointegration of the term structure within a multi-factor model, which includes seasonality, as well as joint and individual jumps in the price processes of futures with different maturities. The seasonality in this model is realized via a deterministic function, and the jumps are represented with thinned-out compound Poisson processes. The model also includes a regime-switching approach that is modelled through a Markov chain and extends the class of geometric models. We show how the model can be calibrated to empirical data and give some practical applications.
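A toy sketch of the main ingredients (transition matrix, volatilities and jump law are my placeholders, not the paper's calibration): a two-state Markov chain switches the volatility regime of a log-price, a Bernoulli-thinned jump term adds discontinuities, and seasonality enters as a deterministic function.

```python
import numpy as np

rng = np.random.default_rng(0)
P = np.array([[0.98, 0.02],                 # daily regime transition matrix
              [0.05, 0.95]])
sigma = np.array([0.01, 0.04])              # low/high volatility regimes
p_jump, mu_j, sig_j = 0.05, 0.0, 0.08       # jump probability and size law

n, s = 1000, 0
season = 0.10 * np.cos(2.0 * np.pi * np.arange(n) / 250)  # seasonality
X = np.zeros(n)
for k in range(1, n):
    s = rng.choice(2, p=P[s])                             # Markov switch
    jump = rng.normal(mu_j, sig_j) if rng.random() < p_jump else 0.0
    X[k] = X[k - 1] + sigma[s] * rng.normal() + jump
F = 3.0 * np.exp(season + X)                # simulated futures price path
```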
Article
Assessment of Policy Changes to Means-Tested Age Pension Using the Expected Utility Model: Implication for Decisions in Retirement
by Johan G. Andréasson and Pavel V. Shevchenko
Risks 2017, 5(3), 47; https://doi.org/10.3390/risks5030047 - 9 Sep 2017
Cited by 10 | Viewed by 4288
Abstract
Means-tested pension policies are typical for many countries, and the assessment of policy changes is critical for policy makers. In this paper, we consider the Australian means-tested Age Pension. In 2015, two important changes were made to the popular Allocated Pension accounts: the income means-test is now based on deemed income rather than account withdrawals, and the income-test deduction no longer applies. We examine the implications of the new changes with regard to optimal decisions for consumption, investment and housing. We account for the minimum withdrawal rules imposed by regulation on Allocated Pension accounts, as well as the 2017 asset-test rebalancing. The policy changes are considered under a utility-maximising life cycle model solved as an optimal stochastic control problem. We find that the new rules decrease the advantages of planning consumption in relation to the means-test, while risky asset allocation becomes more sensitive to the asset-test. The difference in optimal drawdown between the old and new policy is only noticeable early in retirement, until regulatory minimum withdrawal rates are enforced. However, the amount of extra Age Pension received by many households is now significantly higher due to the new deeming income rules, which benefit wealthier households who previously would not have received the Age Pension due to the income-test and minimum withdrawals.
(This article belongs to the Special Issue Ageing Population Risks)
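How a deeming-based dual means test binds can be sketched in a few lines (every threshold, taper and deeming rate below is a placeholder, not the legislated schedule): income is deemed from financial assets rather than taken from actual withdrawals, and the lower of the two tested entitlements applies.

```python
def age_pension(assets, deeming_rate=0.03, full_pension=22_000,
                asset_threshold=250_000, asset_taper=0.078,
                income_threshold=4_500, income_taper=0.5):
    """Annual pension under a stylized dual means test (all numbers fake)."""
    deemed_income = deeming_rate * assets          # new rules: income is deemed
    asset_tested = max(0.0, full_pension
                       - asset_taper * max(0.0, assets - asset_threshold))
    income_tested = max(0.0, full_pension
                        - income_taper * max(0.0, deemed_income - income_threshold))
    return min(asset_tested, income_tested)        # the binding test applies

for a in (200_000, 400_000, 800_000):
    print(a, round(age_pension(a)))
```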
Article
Optimal Insurance Policies in the Presence of Costs
by Knut K. Aase
Risks 2017, 5(3), 46; https://doi.org/10.3390/risks5030046 - 6 Sep 2017
Cited by 4 | Viewed by 2809
Abstract
We reconsider costs in insurance and suggest a new type of cost function, which we argue is a natural choice when there are relatively small, but frequent, claims. If a fixed cost is incurred each time a claim is made, we obtain a Pareto optimal deductible even if the cost function does not vary with the indemnity. The classical result says that deductibles appear if and only if costs are variable. This implies that when claims are relatively small, it is not optimal for the insured to be compensated, since the costs outweigh the benefits, and a deductible naturally occurs. When we constrain the contract to contain a cap, a non-trivial deductible is Pareto optimal regardless of the assumptions about the cost structure; this is known as an XL-contract.
Article
A Low Price Correction for Improved Volatility Estimation and Forecasting
by George-Jason Siouris and Alex Karagrigoriou
Risks 2017, 5(3), 45; https://doi.org/10.3390/risks5030045 - 28 Aug 2017
Cited by 4 | Viewed by 2714
Abstract
In this work, we focus on volatility estimation, which plays a crucial role in risk analysis and management. In order to improve value at risk (VaR) forecasts, we discuss the concept of the low price effect and introduce the low price correction, which does not require any additional parameters and takes into account the prices of the asset rather than its returns alone. Judgement on the forecasting quality of the proposed methodology is based on both the relative number of violations and VaR volatility. For illustrative purposes, a real example from the Athens Stock Exchange is fully explored.
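The paper's correction itself is not reproduced here, but its evaluation criterion, the relative number of VaR violations, is easy to sketch for any forecaster (the return series and the plain historical-simulation forecaster below are placeholders):

```python
import numpy as np

rng = np.random.default_rng(6)
returns = 0.01 * rng.standard_t(4, size=1500)          # placeholder returns
alpha, window = 0.01, 250

viol = 0
for t in range(window, len(returns)):
    var_t = np.quantile(returns[t - window:t], alpha)  # historical-simulation VaR
    viol += returns[t] < var_t                         # a violation day
print("violation rate:", viol / (len(returns) - window), "nominal:", alpha)
```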
Article
Existence and Uniqueness for the Multivariate Discrete Terminal Wealth Relative
by Andreas Hermes and Stanislaus Maier-Paape
Risks 2017, 5(3), 44; https://doi.org/10.3390/risks5030044 - 28 Aug 2017
Cited by 2 | Viewed by 2804
Abstract
In this paper, the multivariate fractional trading ansatz of money management from Vince (Vince 1990) is discussed. In particular, we prove existence and uniqueness of an “optimal f” for the respective optimization problem under reasonable assumptions on the trade return matrix. This generalizes a similar result for the univariate fractional trading ansatz. Furthermore, our result guarantees that the multivariate optimal f solutions can always be found numerically by steepest ascent methods.
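A minimal steepest-ascent sketch, assuming the objective is the log of the terminal wealth relative, log TWR(f) = Σ_i log(1 + r_i · f) (the trade return matrix and step size are illustrative, and Vince's normalization by the biggest loss is omitted):

```python
import numpy as np

rng = np.random.default_rng(7)
R = rng.normal(0.002, 0.05, size=(500, 3))   # trade return matrix (trades x systems)

def log_twr_grad(f):
    g = 1.0 + R @ f                           # per-trade growth factors
    return (R / g[:, None]).sum(axis=0)       # gradient of sum(log(g))

f = np.full(3, 0.01)                          # starting fractions
for _ in range(5000):                         # steepest ascent iterations
    f = np.clip(f + 1e-4 * log_twr_grad(f), 0.0, 0.5)  # keep 1 + r.f > 0
print("optimal f:", f)
```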
Article
On the First Crossing of Two Boundaries by an Order Statistics Risk Process
by Dimitrina S. Dimitrova, Zvetan G. Ignatov and Vladimir K. Kaishev
Risks 2017, 5(3), 43; https://doi.org/10.3390/risks5030043 - 18 Aug 2017
Cited by 5 | Viewed by 2671
Abstract
We derive a closed form expression for the probability that a non-decreasing, pure jump stochastic risk process with the order statistics (OS) property will not exit the strip between two non-decreasing, possibly discontinuous, time-dependent boundaries within a finite time interval. The result yields new expressions for the ruin probability in the insurance and the dual risk models with dependence between the claim severities or capital gains, respectively.
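A Monte Carlo cross-check of such a non-exit probability is straightforward to sketch (the boundaries, jump law and intensity below are assumptions; the paper's contribution is the closed form, not simulation). The OS property enters in that, given the number of jumps, the jump times are distributed as order statistics of i.i.d. uniforms:

```python
import numpy as np

rng = np.random.default_rng(3)
T, lam = 1.0, 5.0

def lower(t):                      # illustrative non-decreasing boundaries
    return 2.0 * t

def upper(t):
    return 3.0 + 4.0 * t

def stays_in_strip():
    n = rng.poisson(lam * T)
    jt = np.sort(rng.uniform(0.0, T, n))       # OS property: jump times are
                                               # order statistics of uniforms
    lvl = np.cumsum(rng.exponential(1.0, n))   # process level after each jump
    if np.any(lvl > upper(jt)):                # upper exit, right after a jump
        return False
    before = np.concatenate(([0.0], lvl))      # level just before each jump / at T
    t_chk = np.concatenate((jt, [T]))          # where the lower boundary binds
    return not np.any(before < lower(t_chk))

p = np.mean([stays_in_strip() for _ in range(20_000)])
print("non-exit probability ~", p)
```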
Article
Stochastic Period and Cohort Effect State-Space Mortality Models Incorporating Demographic Factors via Probabilistic Robust Principal Components
by Dorota Toczydlowska, Gareth W. Peters, Man Chung Fung and Pavel V. Shevchenko
Risks 2017, 5(3), 42; https://doi.org/10.3390/risks5030042 - 27 Jul 2017
Cited by 9 | Viewed by 4655
Abstract
In this study, we develop a multi-factor extension of the family of Lee-Carter stochastic mortality models. We build upon the time, period and cohort stochastic model structure and extend it to include exogenous observable demographic features that can be used as additional factors to improve model fit and forecasting accuracy. We develop a dimension reduction feature extraction framework which (a) employs projection-based techniques of dimensionality reduction; in doing this, we also develop (b) a robust feature extraction framework that is amenable to different structures of demographic data; (c) we analyse demographic data sets for patterns of missingness and the impact of such missingness on the feature extraction; (d) we introduce a class of multi-factor stochastic mortality models incorporating time, period, cohort and demographic features, which we develop within a Bayesian state-space estimation framework; and finally (e) we develop an efficient combined Markov chain and filtering framework for sampling the posterior and forecasting. We undertake a detailed case study on Human Mortality Database demographic data from European countries, and we use the extracted features to better explain the term structure of mortality in the UK over time for male and female populations, when compared to a pure Lee-Carter stochastic mortality model. This demonstrates that our feature extraction framework and the consequent multi-factor mortality model improve both in-sample fit and, importantly, out-of-sample mortality forecasts by a non-trivial gain in performance.
(This article belongs to the Special Issue Ageing Population Risks)
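For reference, the pure Lee-Carter baseline that the paper extends can be fitted by an SVD of the centred log death rates, log m(x,t) = a_x + b_x k_t (the rates below are synthetic; real work would use the Human Mortality Database):

```python
import numpy as np

rng = np.random.default_rng(11)
ages, years = 60, 40
log_m = (-8.0 + 0.09 * np.arange(ages)[:, None]           # synthetic log
         - 0.01 * np.arange(years)                        # death rates
         + 0.02 * rng.normal(size=(ages, years)))

a = log_m.mean(axis=1)                                    # a_x: age profile
U, S, Vt = np.linalg.svd(log_m - a[:, None], full_matrices=False)
b = U[:, 0] / U[:, 0].sum()                               # b_x, sums to one
k = S[0] * Vt[0] * U[:, 0].sum()                          # k_t period index
fitted = a[:, None] + np.outer(b, k)                      # Lee-Carter surface
```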
Article
Robust Estimation of Value-at-Risk through Distribution-Free and Parametric Approaches Using the Joint Severity and Frequency Model: Applications in Financial, Actuarial, and Natural Calamities Domains
by Sabyasachi Guharay, KC Chang and Jie Xu
Risks 2017, 5(3), 41; https://doi.org/10.3390/risks5030041 - 26 Jul 2017
Cited by 3 | Viewed by 6184
Abstract
Value-at-Risk (VaR) is a well-accepted risk metric in modern quantitative risk management (QRM). The classical Monte Carlo simulation (MCS) approach, denoted henceforth as the classical approach, assumes the independence of loss severity and loss frequency. In practice, this assumption does not always hold true. Through mathematical analyses, we show that the classical approach is prone to significant biases when the independence assumption is violated. This is also corroborated by studying both simulated and real-world datasets. To overcome the limitations and to more accurately estimate VaR, we develop and implement the following two approaches for VaR estimation: the data-driven partitioning of frequency and severity (DPFS) using clustering analysis, and copula-based parametric modeling of frequency and severity (CPFS). These two approaches are verified using simulation experiments on synthetic data and validated on five publicly available datasets from diverse domains; namely, the financial indices data of Standard & Poor's 500 and the Dow Jones industrial average, chemical loss spills as tracked by the US Coast Guard, Australian automobile accidents, and US hurricane losses. The classical approach estimates VaR inaccurately for 80% of the simulated datasets and for 60% of the real-world datasets studied in this work. Both the DPFS and the CPFS methodologies attain VaR estimates within 99% bootstrap confidence interval bounds for both simulated and real-world data. We provide a process flowchart for risk practitioners describing the steps for choosing between the DPFS and the CPFS methodology for VaR estimation in real-world loss datasets.
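The classical MCS approach referred to above is easy to state in code (parameter values are illustrative): frequency and severity are simulated independently, and VaR is read off as an empirical quantile of the aggregate loss. Conveniently, NumPy's pareto sampler draws from the Lomax family:

```python
import numpy as np

rng = np.random.default_rng(5)
n_sim, lam = 100_000, 25.0                 # Poisson loss frequency
shape, scale = 2.5, 10_000.0               # Lomax severity (numpy's pareto)

counts = rng.poisson(lam, n_sim)
agg = np.fromiter((scale * rng.pareto(shape, k).sum() for k in counts),
                  float, n_sim)            # annual aggregate losses
print("classical VaR(99.9%):", np.quantile(agg, 0.999))
```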
Article
The Class of (p,q)-spherical Distributions with an Extension of the Sector and Circle Number Functions
by Wolf-Dieter Richter
Risks 2017, 5(3), 40; https://doi.org/10.3390/risks5030040 - 21 Jul 2017
Cited by 5 | Viewed by 3157
Abstract
For evaluating the probabilities of arbitrary random events with respect to a given multivariate probability distribution, specific techniques are of great interest. An important two-dimensional high risk limit law is the Gauss-exponential distribution, whose probabilities can be dealt with based on the Gauss–Laplace law. The latter is considered here as an element of the newly-introduced family of (p,q)-spherical distributions. Based on a suitably-defined non-Euclidean arc-length measure on (p,q)-circles, we prove geometric and stochastic representations of these distributions and correspondingly distributed random vectors, respectively. These representations allow dealing with the new probability measures similarly to elliptically-contoured distributions and more general homogeneous star-shaped ones. This is demonstrated by the generalization of the Box–Muller simulation method. In passing, we prove an extension of the sector and circle number functions.
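For orientation, the classical Box–Muller construction that the paper generalizes is reproduced below; the (p,q)-spherical version replaces the Euclidean radius and uniform angle with the paper's non-Euclidean radius and arc-length laws, which are not reproduced here:

```python
import numpy as np

rng = np.random.default_rng(2)
u1, u2 = rng.uniform(size=(2, 10_000))
r = np.sqrt(-2.0 * np.log(u1))              # Euclidean radius
phi = 2.0 * np.pi * u2                      # uniform angle on the circle
z1, z2 = r * np.cos(phi), r * np.sin(phi)   # independent standard normals
```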
Article
Valuation of Non-Life Liabilities from Claims Triangles
by Mathias Lindholm, Filip Lindskog and Felix Wahl
Risks 2017, 5(3), 39; https://doi.org/10.3390/risks5030039 - 19 Jul 2017
Cited by 5 | Viewed by 3803
Abstract
This paper provides a complete program for the valuation of aggregate non-life insurance liability cash flows based on claims triangle data. The valuation is fully consistent with the principle of valuation by considering the costs associated with a transfer of the liability to a so-called reference undertaking subject to capital requirements throughout the runoff of the liability cash flow. The valuation program includes complete details on parameter estimation, bias correction and conservative estimation of the value of the liability under partial information. The latter is based on a new approach to the estimation of mean squared error of claims reserve prediction.
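The claims-triangle starting point can be illustrated with a plain chain-ladder sketch (a toy cumulative triangle; the paper's valuation program adds parameter estimation details, bias correction, capital requirements and the MSEP estimation on top of such predictors):

```python
import numpy as np

tri = np.array([                            # toy cumulative claims triangle
    [1000., 1800., 2100., 2200.],           # (accident year x development year)
    [1100., 1900., 2250., np.nan],
    [1200., 2100., np.nan, np.nan],
    [1300., np.nan, np.nan, np.nan]])
n = tri.shape[0]
latest = tri[np.arange(n), n - 1 - np.arange(n)]          # current diagonal

f = [tri[:n - 1 - j, j + 1].sum() / tri[:n - 1 - j, j].sum()
     for j in range(n - 1)]                 # chain-ladder development factors
for j in range(1, n):
    nan = np.isnan(tri[:, j])
    tri[nan, j] = tri[nan, j - 1] * f[j - 1]              # fill lower triangle
print("reserve:", tri[:, -1].sum() - latest.sum())
```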
Article
Stress Testing German Industry Sectors: Results from a Vine Copula Based Quantile Regression
by Matthias Fischer, Daniel Kraus, Marius Pfeuffer and Claudia Czado
Risks 2017, 5(3), 38; https://doi.org/10.3390/risks5030038 - 19 Jul 2017
Cited by 6 | Viewed by 4992
Abstract
Measuring interdependence between probabilities of default (PDs) in different industry sectors of an economy plays a crucial role in financial stress testing. To this end, regression approaches may be employed to model the impact of stressed industry sectors as covariates on other response sectors. We identify vine copula based quantile regression as an eligible tool for conducting such stress tests, as this method has good robustness properties, takes into account potential nonlinearities of conditional quantile functions and ensures that no quantile crossing effects occur. We illustrate its performance on a data set of sector-specific PDs for the German economy. Empirical results are provided for a rough and a fine-grained industry sector classification scheme. Amongst others, we confirm that a stressed automobile industry has a severe impact on the German economy as a whole at different quantile levels whereas, e.g., for a stressed financial sector the impact is rather moderate. Moreover, the vine copula based quantile regression approach is benchmarked against both classical linear quantile regression and expectile regression in order to illustrate its methodological effectiveness in the scenarios evaluated.
(This article belongs to the Special Issue Quantile Regression for Risk Assessment)
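The linear benchmark in that comparison is simple to sketch with statsmodels (the PD series below are synthetic; the vine copula variant needs dedicated software, e.g. the R package vinereg):

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(4)
pd_auto = rng.beta(2, 50, 500)                        # stressed sector PDs (fake)
pd_resp = 0.5 * pd_auto + 0.5 * rng.beta(2, 80, 500)  # response sector PDs

X = sm.add_constant(pd_auto)
for q in (0.5, 0.9, 0.99):                            # conditional quantile levels
    fit = sm.QuantReg(pd_resp, X).fit(q=q)
    print(q, fit.params)
```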
Article
Bubbles, Blind-Spots and Brexit
by John Fry and Andrew Brint
Risks 2017, 5(3), 37; https://doi.org/10.3390/risks5030037 - 18 Jul 2017
Cited by 11 | Viewed by 4208
Abstract
In this paper, we develop a well-established financial model to investigate whether bubbles were present in opinion polls and betting markets prior to the UK's vote on EU membership on 23 June 2016. The importance of our contribution is threefold. Firstly, our continuous-time model allows for irregularly spaced time series—a common feature of polling data. Secondly, we build on qualitative comparisons that are often made between market cycles and voting patterns. Thirdly, our approach is theoretically elegant. Thus, where bubbles are found, we suggest a suitable adjustment. We find evidence of bubbles in polling data, which suggests that polls systematically over-estimated the proportion voting for remain. In contrast, bookmakers' odds appear to show none of this bubble-like over-confidence. However, implied probabilities from bookmakers' odds appear remarkably unresponsive to polling data that nonetheless indicated a close-fought vote.
(This article belongs to the Special Issue The implications of Brexit)
Article
A Robust Approach to Hedging and Pricing in Imperfect Markets
by Hirbod Assa and Nikolay Gospodinov
Risks 2017, 5(3), 36; https://doi.org/10.3390/risks5030036 - 18 Jul 2017
Viewed by 3291
Abstract
This paper proposes a model-free approach to hedging and pricing in the presence of market imperfections such as market incompleteness and frictions. The generality of this framework allows us to conduct an in-depth theoretical analysis of hedging strategies with a wide family of risk measures and pricing rules, and study the conditions under which the hedging problem admits a solution and pricing is possible. The practical implications of our proposed theoretical approach are illustrated with an application on hedging economic risk.
(This article belongs to the Special Issue Quantile Regression for Risk Assessment)
Article
Implied Distributions from GBPUSD Risk-Reversals and Implication for Brexit Scenarios
by Iain J. Clark and Saeed Amen
Risks 2017, 5(3), 35; https://doi.org/10.3390/risks5030035 - 4 Jul 2017
Cited by 9 | Viewed by 7622
Abstract
Much of the debate around a potential British exit (Brexit) from the European Union has centred on the potential macroeconomic impact. In this paper, we instead focus on understanding market expectations for price action around the Brexit referendum date. Extracting implied distributions from the GBPUSD option volatility surface, we originally estimated, based on our visual observation of implied probability densities available up to 13 June 2016, that the market expected that a vote to leave could result in a move in the GBPUSD exchange rate from 1.4390 (spot reference on 10 June 2016) down to a range of 1.10 to 1.30, i.e., a 10–25% decline—very probably with highly volatile price action. To quantify this more objectively, we construct a mixture model corresponding to two scenarios for the GBPUSD exchange rate after the referendum vote, one scenario for “remain” and one for “leave”. Calibrating this model to four months of market data, from 24 February to 22 June 2016, we find that a “leave” vote was associated with a predicted devaluation of the British pound to approximately 1.37 USD per GBP, a 4.5% devaluation, and quite consistent with the observed post-referendum exchange rate move down from 1.4877 to 1.3622. We contrast the behaviour of the GBPUSD option market in the run-up to the Brexit vote with that during the 2014 Scottish Independence referendum, finding the potential impact of Brexit to be considerably higher.
(This article belongs to the Special Issue The implications of Brexit)
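The two-scenario construction can be sketched directly (the weights, volatilities and scenario levels below are illustrative placeholders, not the calibrated values): the post-referendum density is a probability-weighted mixture of a “remain” and a “leave” distribution.

```python
import numpy as np
from scipy import stats

p_leave = 0.35                               # scenario weight (placeholder)
remain = stats.lognorm(0.03, scale=1.48)     # "remain": modest rally
leave = stats.lognorm(0.08, scale=1.37)      # "leave": devaluation scenario

x = np.linspace(1.1, 1.7, 601)
mix_pdf = (1 - p_leave) * remain.pdf(x) + p_leave * leave.pdf(x)
p_below = (1 - p_leave) * remain.cdf(1.30) + p_leave * leave.cdf(1.30)
print("implied P(GBPUSD < 1.30):", p_below)
```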
Article
Backtesting the Lee–Carter and the Cairns–Blake–Dowd Stochastic Mortality Models on Italian Death Rates
by Carlo Maccheroni and Samuel Nocito
Risks 2017, 5(3), 34; https://doi.org/10.3390/risks5030034 - 4 Jul 2017
Cited by 10 | Viewed by 6241
Abstract
This work proposes a backtesting analysis comparing the Lee–Carter and the Cairns–Blake–Dowd mortality models on Italian data. The mortality data come from the Italian National Statistics Institute (ISTAT) database and span the period 1975–2014, over which we computed back-projections evaluating the performances of the models against real data. We propose three different backtest approaches, evaluating the goodness of short-run forecasts versus medium-length ones. We find that neither model was able to capture the improving shock on mortality observed for the male population over the analysed period. Moreover, the results suggest that CBD forecasts are reliable predominantly for ages above 75, and that LC forecasts are generally more accurate for these data.
Article
Analyzing the Gaver–Lewis Pareto Process under an Extremal Perspective
by Marta Ferreira and Helena Ferreira
Risks 2017, 5(3), 33; https://doi.org/10.3390/risks5030033 - 27 Jun 2017
Cited by 1 | Viewed by 2645
Abstract
Pareto processes are suitable for modeling stationary heavy-tailed data. Here, we consider the auto-regressive Gaver–Lewis Pareto process and study its tail behavior. We characterize its local and long-range dependence. We will see that consecutive observations are asymptotically tail independent, a feature that is often misevaluated by the most common extremal models and that is strongly relevant to tail inference. This also reveals clustering at “penultimate” levels. Linear correlation may not exist in a heavy-tailed context, so an alternative diagnostic tool is presented. The derived properties relate to the auto-regressive parameter of the process and provide estimators. A comparison of the proposals is conducted through simulation, and an application to a real dataset illustrates the procedure.
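Asymptotic tail independence of consecutive observations can be checked with a generic empirical diagnostic (the series below is an i.i.d. placeholder, and this is not one of the paper's estimators): chi(u) = P(X_{n+1} > u | X_n > u) should decay towards zero as u grows.

```python
import numpy as np

rng = np.random.default_rng(9)
x = rng.pareto(2.0, 100_000)          # placeholder heavy-tailed series
x0, x1 = x[:-1], x[1:]                # consecutive pairs (i.i.d. here, so
                                      # exactly tail independent)
for q in (0.90, 0.99, 0.999):
    u = np.quantile(x0, q)
    chi = np.mean((x0 > u) & (x1 > u)) / np.mean(x0 > u)
    print(q, chi)
```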