Selected Papers from the 9th Conference in Actuarial Science & Finance on Samos

A special issue of Risks (ISSN 2227-9091).

Deadline for manuscript submissions: closed (30 April 2017)

Special Issue Editor


Prof. Qihe Tang
Guest Editor
Department of Statistics and Actuarial Science, University of Iowa, 241 Schaeffer Hall, Iowa City, IA 52242-1409, USA
Interests: extreme value theory for insurance and finance; quantitative risk management; multivariate heavy-tailed distributions

Special Issue Information

Dear Colleagues,

The Samos Conference, jointly organized with Katholieke Universiteit Leuven, Université Catholique de Louvain, Københavns Universitet, and New York University, provides a forum for state-of-the-art results in insurance, finance, and risk management. The meeting is open to participants from universities, insurance companies, banks, consulting firms, and regulatory authorities.

We invite all participants to submit the manuscripts they presented at the conference to this Special Issue. All manuscripts will be refereed through the journal's standard peer-review process.

Prof. Qihe Tang
Guest Editor

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, go to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the Special Issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Risks is an international peer-reviewed open access monthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 1800 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • Non-Life Insurance
  • Risk Management
  • Health and Pension Insurance
  • Life Insurance
  • Risk and Stochastic Control
  • Statistical and Computational Methods
  • Financial Theory and Practice

Published Papers (5 papers)

Research

Article
Change Point Detection and Estimation of the Two-Sided Jumps of Asset Returns Using a Modified Kalman Filter
by Ourania Theodosiadou, Sotiris Skaperas and George Tsaklidis
Risks 2017, 5(1), 15; https://doi.org/10.3390/risks5010015 - 03 Mar 2017
Abstract
In the first part of the paper, the positive and negative jumps of NASDAQ daily (log-)returns and three of its stocks are estimated based on the methodology presented by Theodosiadou et al. (2016), where jumps are assumed to be hidden random variables. For that reason, stochastic state space models in discrete time are adopted. The daily return is expressed as the difference between the two-sided jumps under noise inclusion, and the recursive Kalman filter algorithm is used to estimate them. Since the estimated jumps have to be non-negative, the pdf truncation method associated with the non-negativity constraints is applied. To overcome the resulting underestimation of the empirical time series, a scaling procedure follows the truncation stage. In the second part of the paper, a nonparametric change point analysis of the (variance–)covariance is applied to the NASDAQ return time series, as well as to the estimated bivariate jump time series derived after the scaling procedure and to each jump component separately. A similar change point analysis is applied to the three other stocks of the NASDAQ index.
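
For readers who want to experiment with the filtering step, here is a minimal Python sketch of the idea: a linear Gaussian state space in which the return is the difference of two hidden non-negative jumps, filtered by the standard Kalman recursion. It is not the authors' implementation; in particular, simple clipping at zero stands in for the pdf truncation method, the scaling stage is omitted, and the random-walk jump dynamics and the noise variances q and r are placeholder assumptions.

    import numpy as np

    def kalman_jump_filter(y, q=0.01, r=0.04):
        """Filter returns y into non-negative estimates of up/down jumps."""
        H = np.array([[1.0, -1.0]])       # observation map: return = up - down + noise
        Q = q * np.eye(2)                 # variance of the hidden jump innovations
        x = np.zeros(2)                   # filtered state estimate (u_t, d_t)
        P = np.eye(2)                     # state covariance
        out = np.empty((len(y), 2))
        for t, yt in enumerate(y):
            P = P + Q                             # predict (random-walk state assumed)
            S = (H @ P @ H.T).item() + r          # innovation variance
            K = (P @ H.T) / S                     # Kalman gain, shape (2, 1)
            x = x + K[:, 0] * (yt - (H @ x).item())
            P = (np.eye(2) - K @ H) @ P
            out[t] = np.maximum(x, 0.0)           # crude stand-in for pdf truncation
        return out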

Article
Change Point Estimation in Panel Data without Boundary Issue
by Barbora Peštová and Michal Pešta
Risks 2017, 5(1), 7; https://doi.org/10.3390/risks5010007 - 22 Jan 2017
Abstract
The panel data of interest consist of a moderate number of panels, each containing a small number of observations. An estimator of common breaks in panel means that avoids the boundary issue in this setting is proposed. In particular, the novel estimator is able to detect a common break point even when the change happens immediately after the first time point or just before the last observation period. Another advantage of the elaborated change point estimator is that it returns the last observation in situations with no structural breaks. The consistency of the change point estimator in panel data is established. The results are illustrated through a simulation study. As a by-product of the developed estimation technique, a theoretical utilization for correlation structure estimation, hypothesis testing and bootstrapping in panel data is demonstrated. A practical application to non-life insurance is presented as well.
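
To fix ideas, the following Python sketch implements a generic CUSUM-type estimator of a common break in panel means. It omits the boundary correction that is the paper's contribution, as well as the convention of returning the last observation when no break is present; the weighting used here is one standard choice among several.

    import numpy as np

    def common_break(X):
        """Estimate a common break in the means of panels X (n_panels x T)."""
        n_panels, T = X.shape
        stats = np.empty(T - 1)
        for t in range(1, T):                     # candidate: first regime ends at t
            left = X[:, :t].mean(axis=1)          # panel means before the break
            right = X[:, t:].mean(axis=1)         # panel means after the break
            w = t * (T - t) / T                   # weight balancing segment lengths
            stats[t - 1] = w * np.sum((left - right) ** 2)
        return int(np.argmax(stats)) + 1          # estimated last time of first regime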

Article
Bayesian Option Pricing Framework with Stochastic Volatility for FX Data
by Ying Wang, Sai Tsang Boris Choy and Hoi Ying Wong
Risks 2016, 4(4), 51; https://doi.org/10.3390/risks4040051 - 16 Dec 2016
Abstract
The application of stochastic volatility (SV) models in the option pricing literature usually assumes that the market has sufficient option data to calibrate the model's risk-neutral parameters. When option data are insufficient or unavailable, market practitioners must estimate the model from the historical returns of the underlying asset and then transform the resulting model into its risk-neutral equivalent. However, the likelihood function of an SV model can only be expressed as a high-dimensional integral, which makes the estimation a highly challenging task. The Bayesian approach has been the classical way to estimate SV models under the data-generating (physical) probability measure, but the transformation from the estimated physical dynamics into their risk-neutral counterpart has not been addressed. Inspired by the generalized autoregressive conditional heteroskedasticity (GARCH) option pricing approach of Duan (1995), we propose an SV model that enables us to simultaneously and conveniently perform Bayesian inference and transformation into risk-neutral dynamics. Our model relaxes the normality assumption on the innovations of both the return and volatility processes, and our empirical study shows that the estimated option prices generate realistic implied volatility smile shapes. In addition, the volatility premium is almost flat across strike prices, so adding a few option data to the historical time series of the underlying asset can greatly improve the estimation of option prices.
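
As a rough illustration of the risk-neutral pricing step, the sketch below simulates a standard Gaussian log-variance SV model under an assumed risk-neutral measure and prices a European call by Monte Carlo. It is not the authors' model (which relaxes the normality assumption) and performs no Bayesian estimation; all parameter values are placeholders.

    import numpy as np

    rng = np.random.default_rng(0)

    def sv_call_price(s0, strike, r, n_days, mu_h, phi, sigma_h, n_paths=100_000):
        """Monte Carlo price of a European call under a Gaussian SV model."""
        dt = 1.0 / 252.0
        log_s = np.full(n_paths, np.log(s0))
        h = np.full(n_paths, mu_h)                        # log of daily variance
        for _ in range(n_days):
            vol = np.exp(0.5 * h)                         # daily volatility
            log_s += r * dt - 0.5 * vol**2 + vol * rng.standard_normal(n_paths)
            h = mu_h + phi * (h - mu_h) + sigma_h * rng.standard_normal(n_paths)
        payoff = np.maximum(np.exp(log_s) - strike, 0.0)
        return np.exp(-r * n_days * dt) * payoff.mean()

    # e.g. sv_call_price(100.0, 105.0, 0.02, 60, mu_h=np.log(0.01**2), phi=0.98, sigma_h=0.1)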

Article
Predicting Human Mortality: Quantitative Evaluation of Four Stochastic Models
by Anastasia Novokreshchenova
Risks 2016, 4(4), 45; https://doi.org/10.3390/risks4040045 - 02 Dec 2016
Abstract
In this paper, we quantitatively compare the forecasts from four different mortality models. We consider one discrete-time model, proposed by Lee and Carter (1992), and three continuous-time models: the Wills and Sherris (2011) model, the Feller process and the Ornstein-Uhlenbeck (OU) process. The first two models estimate the whole surface of mortality simultaneously, while in the latter two, each generation is modelled and calibrated separately. We calibrate the models to UK and Australian population data. We find that all the models show relatively similar absolute total error for a given dataset, except the Lee-Carter model, whose performance differs significantly. To evaluate the forecasting performance, we therefore look at two alternative measures: the relative error between the forecasted and the actual mortality rates, and the percentage of actual mortality rates which fall within a prediction interval. In terms of the prediction intervals, the results are more divergent, since each model implies a different structure for the variance of mortality rates. According to our experiments, the Wills and Sherris model produces superior results in terms of the prediction intervals. However, in terms of the mean absolute error, the OU and the Feller processes perform better. The forecasting performance of the Lee-Carter model depends mostly on the choice of the dataset.
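
Of the four models, the discrete-time Lee-Carter model admits a particularly compact illustration. The sketch below shows the classical SVD calibration of log m(x, t) = a(x) + b(x) k(t); it is a textbook version, not the paper's code, and the continuous-time models are not reproduced here.

    import numpy as np

    def lee_carter_fit(m):
        """Fit log m[x, t] = a[x] + b[x] * k[t] via a rank-1 SVD approximation.

        m: matrix of central death rates, ages in rows, calendar years in columns.
        """
        log_m = np.log(m)
        a = log_m.mean(axis=1)                            # age pattern: row averages
        U, s, Vt = np.linalg.svd(log_m - a[:, None], full_matrices=False)
        b = U[:, 0] / U[:, 0].sum()                       # normalised so b sums to one
        k = s[0] * Vt[0] * U[:, 0].sum()                  # period mortality index
        # forecasting proceeds by fitting a random walk with drift to k,
        # the usual second stage of the Lee-Carter method
        return a, b, k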

Article
Optimal Premium as a Function of the Deductible: Customer Analysis and Portfolio Characteristics
by Julie Thøgersen
Risks 2016, 4(4), 42; https://doi.org/10.3390/risks4040042 - 09 Nov 2016
Abstract
An insurance company offers an insurance contract (p, K), consisting of a premium p and a deductible K. In this paper, we consider the problem of choosing the premium optimally as a function of the deductible. The insurance company faces a market of N customers, each characterized by their personal claim frequency, α, and risk aversion, β. When a customer is offered an insurance contract, she/he will, based on these characteristics, choose whether or not to insure. The decision process of the customer is analyzed in detail. Since the customer characteristics are unknown to the company, it models them as i.i.d. random variables: A_1, …, A_N for the claim frequencies and B_1, …, B_N for the risk aversions. Depending on the distributions of A_i and B_i, expressions for the portfolio size n(p; K) ∈ [0, N] and the average claim frequency α(p; K) in the portfolio are obtained. Knowing these, the company can choose the premium optimally, mainly by minimizing the ruin probability.
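
A minimal numerical sketch of this setup follows, with two loudly flagged simplifications: a stylised willingness-to-pay rule replaces the paper's detailed customer decision analysis, and the ruin probability is approximated by the classical Cramér-Lundberg formula for exponential claim severities. The distributions of A_i and B_i, the deductible, and all parameter values are placeholders.

    import numpy as np

    rng = np.random.default_rng(1)
    N, claim_mean, K, u = 10_000, 10.0, 5.0, 500.0   # market size, E[X], deductible, reserve
    A = rng.gamma(2.0, 0.05, N)                      # placeholder claim frequencies A_1, ..., A_N
    B = rng.uniform(0.0, 1.0, N)                     # placeholder risk aversions B_1, ..., B_N

    def ruin_bound(p):
        """Approximate ruin probability at premium p for exponential claims."""
        paid_mean = claim_mean * np.exp(-K / claim_mean)     # E[(X - K)+]
        buys = p <= A * paid_mean * (1.0 + B)                # stylised decision rule
        n = int(buys.sum())                                  # portfolio size n(p; K)
        if n == 0:
            return 1.0                                       # no business written
        # by memorylessness, payments above K are again Exp(claim_mean),
        # with the claim rate thinned by exp(-K / claim_mean)
        lam = A[buys].sum() * np.exp(-K / claim_mean)        # rate of claims exceeding K
        theta = n * p / (lam * claim_mean) - 1.0             # safety loading
        if theta <= 0:
            return 1.0                                       # premiums below expected cost
        # Cramer-Lundberg: psi(u) = exp(-theta*u / ((1+theta)*mu)) / (1+theta)
        return np.exp(-theta * u / ((1 + theta) * claim_mean)) / (1 + theta)

    best_p = min(np.linspace(0.1, 2.0, 50), key=ruin_bound)  # premium grid search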