Article

Measurement and Analysis of High Frequency Asset Volatility Based on Functional Data Analysis

1 School of Economics, Xiamen University, Xiamen 361005, China
2 School of Medicine, Xiamen University, Xiamen 361005, China
3 National Institute for Data Science in Health and Medicine, Xiamen University, Xiamen 361005, China
4 Data Mining Research Center, Xiamen University, Xiamen 361005, China
5 School of Economics and Management, East China Jiaotong University, Nanchang 330013, China
6 School of Mathematical Sciences, Ocean University of China, Qingdao 266100, China
7 National Economic Engineering Laboratory, Dongbei University of Finance and Economics, Dalian 116025, China
8 School of Statistics, Huaqiao University, Xiamen 361005, China
9 School of Business Administration, Hunan University, Changsha 410082, China
* Authors to whom correspondence should be addressed.
† These authors contributed equally to this work.
Mathematics 2022, 10(7), 1140; https://doi.org/10.3390/math10071140
Submission received: 1 March 2022 / Revised: 30 March 2022 / Accepted: 31 March 2022 / Published: 1 April 2022
(This article belongs to the Topic Data Science and Knowledge Discovery)

Abstract

Information and communication technologies have enabled the collection of high-frequency financial asset time series data. However, the high temporal resolution of these data makes it challenging to compare patterns in financial asset characteristics and to identify risk. To address this challenge, we propose a method for calculating realized volatility based on functional data analysis (FDA). A time–price functional curve is constructed with the FDA method, and realized volatility is calculated as the curvature integral of this curve. The method effectively eliminates the interference of market microstructure noise: it allows the capital asset price to be decomposed into a continuous term and a noise term by asymptotic convergence, decoupling the noise from the discrete time series. It can also obtain the value of volatility at any given time, without concerns about correlations among repeated samples, mixed sampling frequencies, or unequally spaced sampling, and it relaxes the structural constraints and distributional assumptions of data acquisition. To demonstrate our method, we analyze a per-second financial asset dataset, conduct a sensitivity analysis on the selection of unequally spaced samples, and add noise to verify robustness. We also discuss the practical implications, particularly for finer-grained analysis of financial market volatility and for understanding rapid market changes.

1. Introduction

In recent years, with the rapid and convenient acquisition of high-frequency asset return data, scholars and investors have paid increasing attention to volatility modeling [1,2]. ARCH-class, SV-class, and realized volatility models are commonly used to calculate volatility [3,4,5]. However, these three classes of models characterize volatility indirectly through the rate of return and cannot depict dynamic changes at the intraday level. In particular, intraday volatility is usually modeled with continuous-time stochastic process methods [6,7,8], which assume that volatility is generated by a potentially unknown diffusion process. However, these methods cannot describe the long memory and periodicity of intraday fluctuations. What these methods have in common is a dependence on the discrete time points at which measurements are taken; they are often called point-based realized volatility calculation models [9]. When the sampling frequency is high enough and the sampled data are free of market microstructure interference, these methods of calculating realized volatility are, in theory, consistent estimators of true volatility.
With the development of data acquisition technology, financial data are sampled at ever higher frequencies. As the sampling frequency increases, the interference of market microstructure noise, such as bid–ask spread and infrequent trading, becomes increasingly pronounced [10,11,12]. If each sample is noisy, point-based realized volatility calculation models are challenged. First, the sum of squared returns depends heavily on pairs of consecutive price sampling points from t−1 to t, where t denotes a time variable; the sampling points themselves are biased, and different sampling points may lead to different calculations [13]. Second, high-frequency data have long memory, which indicates that past shocks persist into the future and strongly affect future expectations [14,15,16]. In this framework, systemic changes in trading should be considered. From a temporal evolution perspective, if the price is treated and analyzed as a discrete time series, the underlying stochastic process that generates these observations cannot be ascertained through statistical analysis. That is to say, the asset price shift from time t−1 to t is not independent of current characteristics but should be treated as part of a systemic change. Faced with these challenges, it is therefore necessary to revisit volatility modeling for high-frequency data.
We propose a curve-based realized volatility calculation model. A discrete time series can be viewed as a sample from a curve observed over time [17,18]. On the one hand, functional data analysis can extract additional information from a discrete time series, such as derivatives that measure the velocity and acceleration of the price curve [19]. On the other hand, functional data analysis can help identify repeated patterns of price changes in financial markets [20,21]. High-frequency data contain a rich source of information, which provides opportunities to analyze dynamic changes over short time intervals. However, existing studies do not consider decoupling noise from the discrete time series. The price is assumed to follow a Brownian semimartingale process, and the integrated volatility can be estimated by the realized volatility [22]. Therefore, this study first constructs a time–price functional curve from discrete observations using the functional data analysis method with the Bernstein basis function. Second, drawing on differential geometry, it takes the curvature integral of the time–price functional curve as the realized volatility of high-frequency data over a given period, which we call functional volatility (FV). Because the period can be set arbitrarily, the realized volatility can be calculated at any time scale; when the time scale is small enough, instantaneous realized volatility can be estimated.
Compared with point-based realized volatility calculation methods, the curve-based method has three advantages. First, it captures information between the sampling time points t−1 and t, where t denotes a time variable. Point-based methods ignore the underlying dynamics of the capital asset price as it shifts from one time point to the next. For example, if two adjacent time points have equal prices, the return is zero, yet the path between them carries much information about price changes. Second, the curve-based method effectively eliminates the interference of market microstructure noise: the functional data analysis method allows the capital asset price to be decomposed into a continuous term and a noise term by asymptotic convergence, decoupling the noise from the discrete time series. Third, the curve-based method treats the whole curve as a single entity, so there is no concern about correlations among repeated samples, mixed sampling frequencies, or unequally spaced sampling [23]. Point-based methods are limited by the correlation of repeated samples and by the sampling process for irregular data. The functional data analysis method relaxes the structural constraints and distributional assumptions of data acquisition, representing a change of framework in the handling of discrete time series.
The remainder of this study is organized as follows: Section 2 presents the methods, Section 3 describes the experimental analysis, and Section 4 presents conclusions and future work.

2. Methods

2.1. Determination of Basis Function

Functional data analysis (FDA) takes functional data as the research object, regarding the observed data as a whole [24]. Consider one-dimensional functional data $X_1(t), X_2(t), \dots, X_n(t)$, realizations of a stochastic process $X(t)$ on a closed interval $\mathcal{F}$. Along the time dimension $t$, functional data are a kind of infinite-dimensional data [25]. In reality, it is difficult to observe curves completely and without measurement error. Therefore, we first assume that
$$W_{ij} = X_i(T_{ij}) + U_{ij}, \qquad T_{ij} \in \mathcal{F}, \quad 1 \le i \le n, \quad 1 \le j \le M_i$$
where $U_{ij}$ is the independent and identically distributed observation error, independent of $X_i$, satisfying $E[U_{ij}] = 0$ and $E[U_{ij}^2] = \sigma_u^2$. The actual observation data are $(T_{ij}, W_{ij})$, $i = 1, 2, \dots, n$; $j = 1, 2, \dots, M_i$.
Due to the infinite dimensionality of functional data, dimension reduction is essential. The common method is to expand the functional data in a set of bases. In more detail, suppose that $\phi_1(t), \phi_2(t), \dots$ is a set of orthogonal bases defined on a closed interval $I$:
$$X(t) = \sum_{k=1}^{\infty} \xi_k \phi_k(t)$$
where $\xi_k = \int_I X(t)\phi_k(t)\,dt$ is the projection of $X(t)$ on $\phi_k(t)$. In practice, we truncate the expansion, that is, $X(t) \approx \sum_{k=1}^{K} \xi_k \phi_k(t)$. In this way, the infinite-dimensional functional data $X(t)$ are approximated by a finite sum, and the information they contain can be expressed by a finite-dimensional vector $(\xi_1, \xi_2, \dots, \xi_K)$, achieving dimension reduction.
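To make the truncation concrete, the following minimal Python sketch projects a toy curve onto a truncated orthonormal basis. The Fourier basis and the truncation level K are illustrative choices of ours; the text prescribes only the general expansion $X(t) \approx \sum_k \xi_k \phi_k(t)$.

```python
import numpy as np

def fourier_basis(t, k):
    """k-th orthonormal Fourier basis function on [0, 1]."""
    if k == 0:
        return np.ones_like(t)
    if k % 2 == 1:
        return np.sqrt(2) * np.sin(2 * np.pi * ((k + 1) // 2) * t)
    return np.sqrt(2) * np.cos(2 * np.pi * (k // 2) * t)

def truncated_expansion(x, t, K):
    """Project observations onto the first K basis functions and reconstruct."""
    # xi_k = integral of X(t) phi_k(t) dt, approximated by the trapezoid rule
    xi = np.array([np.trapz(x * fourier_basis(t, k), t) for k in range(K)])
    x_hat = sum(c * fourier_basis(t, k) for k, c in enumerate(xi))
    return xi, x_hat

t = np.linspace(0.0, 1.0, 1001)
x = np.sin(2 * np.pi * t) + 0.3 * np.cos(6 * np.pi * t)   # toy curve
xi, x_hat = truncated_expansion(x, t, K=9)
print(np.max(np.abs(x - x_hat)))   # small: a 9-number summary captures the curve
```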
Let the time series data be $Y_i$, $i = 0, 1, \dots, n$; the fitting model can be constructed as follows:
$$Y(t) = \sum_{j=0}^{m} \beta_j \varphi_j(t) + \varepsilon(t), \qquad 0 \le t \le 1, \quad m < n$$
where $\varphi_j(t)$, $j = 0, 1, \dots, m$, are a set of basis functions; $\beta_j$, $j = 0, 1, \dots, m$, denote the coefficients to be determined; and $\varepsilon(t)$ represents the noise.
The time series $Y(t)$ here is the result of parameterizing the original time series data $Y_i$ ($i = 0, 1, \dots, n$) over the partition $\Delta t: t_0 < t_1 < \cdots < t_n$. We are then faced with the choice of basis function. First, polynomial functions can meet the requirements of mining useful information from complex data, and their function values and derivatives of every order are easy to compute and visualize.
All polynomials of degree at most m constitute the polynomial space of degree m, and any group of m + 1 linearly independent polynomials in this space can serve as a basis [26]. To better reflect the regularity of complex data, the number of peaks and valleys in the data is captured by m. Through computer input and interactive modification of the fitting curve, a satisfactory description can be achieved.
The same curve can be represented by different polynomial basis functions with different properties. The power basis $t^j$, $j = 0, 1, \dots, m$, is the simplest polynomial basis [27]. Curves fitted with the power basis have the advantages of simple form and easy calculation. However, the geometric meaning of the coefficient vector in a power-basis polynomial curve equation is not obvious, and when the order is large, the coefficient matrix of the linear system that must be solved is ill-conditioned [28]. The Lagrange basis $L_j(t) = \prod_{i=0,\, i \ne j}^{m} \frac{t - t_i}{t_j - t_i}$, $j = 0, 1, \dots, m$, is normalized and has obvious regularity [29], but its derivatives are complicated, and all data points must be recalculated every time data are added, which does not suit the requirements of data mining. The Fourier basis $e^{i\omega t}$ can reveal the internal relationship between time and spectrum; however, the Fourier transform uses all the time-domain information of the signal and lacks time-domain localization [30]. Considering the requirements of human–computer interaction and data mining, we select the Bernstein basis function in this paper. Bernstein polynomials underlie classical Bézier curves and are the foundation for developing complex curves and surfaces; they have many excellent properties, such as normalization, symmetry, recurrence, segmentation, and the control hull [31]. The model fitted with the Bernstein basis function is
$$Y(t) = \sum_{j=0}^{m} \beta_j B_{j,m}(t) + \varepsilon(t), \qquad 0 \le t \le 1$$
Here, $\beta_j$, $j = 0, 1, \dots, m$, denotes the coefficient vector, whose elements are called the control points of the fitting curve. The basis function
$$B_{j,m}(t) = C_m^j\, t^j (1 - t)^{m-j}, \qquad 0 \le t \le 1, \quad j = 0, 1, \dots, m$$
is known as the Bernstein basis function.
Besides the good properties of normalization, symmetry, recurrence, and segmentation, the Bernstein basis also has the convex-hull property [32]. The convex hull of a point set is the set of all convex combinations of its elements. The convex-hull property of a curve fitted with the Bernstein basis means that the curve always lies in the convex hull of its control vertices (see Figure 1).
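To illustrate, the following small Python check (a sketch of ours, not code from the paper) evaluates the Bernstein basis and verifies the normalization (partition-of-unity) property that underlies the convex-hull behavior:

```python
import numpy as np
from math import comb

def bernstein(j, m, t):
    """Bernstein basis B_{j,m}(t) = C(m, j) t^j (1 - t)^(m - j) on [0, 1]."""
    t = np.asarray(t, dtype=float)
    return comb(m, j) * t**j * (1.0 - t)**(m - j)

# The m+1 basis functions sum to 1 everywhere on [0, 1], so any fitted curve
# is a convex combination of its control points at each t.
t = np.linspace(0.0, 1.0, 101)
m = 5
total = sum(bernstein(j, m, t) for j in range(m + 1))
assert np.allclose(total, 1.0)
```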

2.2. Bernstein Basis Function Modeling

Consider a time series $Y(t)$, $0 \le t \le 1$. Let the mth-degree Bernstein polynomials be the basis functions [31]
$$B_{j,m}(t) = C_m^j\, t^j (1 - t)^{m-j}, \qquad 0 \le t \le 1, \quad j = 0, 1, \dots, m$$
Construct the actual model
$$Y(t) = \sum_{j=0}^{m} \beta_j B_{j,m}(t) + \varepsilon(t)$$
Fitting the time series data points, the sample regression equation is
$$\hat{Y}(t) = \sum_{j=0}^{m} \hat{\beta}_j B_{j,m}(t), \qquad 0 \le t \le 1$$
Here, $\hat{\beta}_j$ denotes the estimate of the parameter $\beta_j$. Therefore, the model based on the Bernstein basis function is
$$Y(t) = \sum_{j=0}^{m} \hat{\beta}_j B_{j,m}(t) + e(t)$$
where $\hat{\beta}_j$, $j = 0, 1, \dots, m$, is the estimator of the control vertex, $B_{j,m}(t)$ denotes the Bernstein basis function, and $e(t) = Y(t) - \hat{Y}(t)$ is the error term. We can further use the properties of the constructed curve to analyze the development law of the studied phenomenon.
It is noteworthy that $\hat{Y}(t)$ is the fitted value of the data point $Y(t)$ on the curve (Equation (8)); the actual value obtained after adjusting for the disturbance is $Y(t)$ in Equation (9). Moreover, the stochastic variable $\varepsilon(t)$ represents the error, including data measurement error and random error. Suppose $\varepsilon(t) \sim N(0, \sigma^2)$ and $\mathrm{cov}(\varepsilon(t_1), \varepsilon(t_2)) = 0$ for $t_1 \ne t_2$.
In this paper, the least-squares method is used to estimate the control points $\beta_j$, $j = 0, 1, \dots, m$. The time series data $Y_i$, $i = 0, 1, \dots, n$, are first parameterized. Let $\tau_i$ be the index corresponding to $Y_i$, $i = 0, 1, \dots, n$, with $\tau_i \ge 0$.
Normalizing the above parameterization yields
$$t_i = \frac{\tau_i}{\max(\tau_i)}, \qquad i = 0, 1, \dots, n$$
In the measurement of high-frequency asset volatility, n is the number of samples per day. The fitted asset price curve can then be determined by the least-squares approach. Let the fitted curve be
$$\hat{Y}(t_i) = \sum_{j=0}^{m} \hat{\beta}_j B_{j,m}(t_i), \qquad i = 0, 1, \dots, n$$
The sample model is
$$Y(t_i) = \sum_{j=0}^{m} \hat{\beta}_j B_{j,m}(t_i) + e(t_i)$$
The control points are calculated to minimize the following expression:
$$E = \sum_{i=0}^{n} \left[ Y(t_i) - \hat{Y}(t_i) \right]^2$$
That is,
$$E\!\left(\hat{\beta}_0, \hat{\beta}_1, \dots, \hat{\beta}_m\right) = \sum_{i=0}^{n} \left[ Y(t_i) - \sum_{j=0}^{m} \hat{\beta}_j B_{j,m}(t_i) \right]^2$$
According to the least-squares method, the control vertices can be obtained as
$$\begin{pmatrix} \hat{\beta}_0 \\ \hat{\beta}_1 \\ \vdots \\ \hat{\beta}_m \end{pmatrix} = \left( \Phi^T \Phi \right)^{-1} \Phi^T \begin{pmatrix} Y(t_0) \\ Y(t_1) \\ \vdots \\ Y(t_n) \end{pmatrix}$$
where
$$\Phi = \begin{pmatrix} B_{0,m}(t_0) & B_{1,m}(t_0) & \cdots & B_{m,m}(t_0) \\ B_{0,m}(t_1) & B_{1,m}(t_1) & \cdots & B_{m,m}(t_1) \\ \vdots & \vdots & \ddots & \vdots \\ B_{0,m}(t_n) & B_{1,m}(t_n) & \cdots & B_{m,m}(t_n) \end{pmatrix}$$
and $\Phi^T$ is the transpose of $\Phi$.
In this way, the $m + 1$ control points $\hat{\beta}_0, \hat{\beta}_1, \dots, \hat{\beta}_m$ in Equation (8) are estimated. The model requires the fitting curve to coincide with the beginning and end of the original curve. Therefore, we modify the head and tail control points, setting $\hat{\beta}_0 = Y(t_0)$ and $\hat{\beta}_m = Y(t_n)$, so that the piecewise fitting curves can be spliced into an overall fitting curve representing the whole data sample.
Then, we can obtain the fitting curve
$$\hat{Y}(t) = \sum_{j=0}^{m} \hat{\beta}_j B_{j,m}(t), \qquad 0 \le t \le 1$$
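The estimation pipeline above (parameterize, build $\Phi$, solve least squares, pin the endpoints) can be sketched in Python as follows. This is a minimal sketch of ours: the function names and the index-based parameterization $\tau_i = i$ are illustrative choices, not specified by the paper.

```python
import numpy as np
from math import comb

def bernstein_design(t, m):
    """Design matrix Phi with entries B_{j,m}(t_i) = C(m,j) t^j (1-t)^(m-j)."""
    t = np.asarray(t, dtype=float)
    return np.column_stack([comb(m, j) * t**j * (1 - t)**(m - j)
                            for j in range(m + 1)])

def fit_bernstein(y, m):
    """Least-squares control points, with the head/tail endpoint correction."""
    y = np.asarray(y, dtype=float)
    tau = np.arange(len(y), dtype=float)   # illustrative parameterization tau_i = i
    t = tau / tau.max()                    # normalize to [0, 1]
    Phi = bernstein_design(t, m)
    beta, *_ = np.linalg.lstsq(Phi, y, rcond=None)  # (Phi^T Phi)^{-1} Phi^T y
    beta[0], beta[-1] = y[0], y[-1]        # force the curve through the endpoints
    return t, beta, Phi
```

The fitted curve at any new points `t_new` in [0, 1] is then `bernstein_design(t_new, m) @ beta`.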
For our model, the only parameter to be determined is m, which is chosen by minimizing the generalized cross-validation (GCV) criterion:
$$GCV(m) = \frac{\sum_{i=1}^{N} \left[ Y(t_i) - \sum_{j=0}^{m} \hat{\beta}_j B_{j,m}(t_i) \right]^2}{\left( 1 - M(m)/N \right)^2}$$
where $M(m)$ denotes the number of effective parameters in the model and $N$ is the number of actual observation samples.
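A sketch of the degree selection, under our assumptions: for an ordinary least-squares fit the effective number of parameters $M(m)$ equals the trace of the hat matrix, which is $m + 1$; the candidate grid for m is our choice, and `fit_bernstein` is the helper from the previous sketch.

```python
import numpy as np

def gcv(y, Phi, beta):
    """GCV(m) = RSS / (1 - M(m)/N)^2, with M(m) = m + 1 basis functions."""
    N = len(y)
    rss = np.sum((y - Phi @ beta) ** 2)
    return rss / (1.0 - Phi.shape[1] / N) ** 2

def select_degree(y, candidates=range(3, 31)):
    """Pick the degree m that minimizes GCV over an illustrative grid."""
    best_m, best_score = None, np.inf
    for m in candidates:
        _, beta, Phi = fit_bernstein(y, m)   # from the fitting sketch above
        score = gcv(y, Phi, beta)
        if score < best_score:
            best_m, best_score = m, score
    return best_m
```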

2.3. Volatility Measurement

Intuitively, after constructing a functional curve from the raw high-frequency data, it is more natural to quantify volatility through the characteristics of the function than through point-based realized volatility measures. As a result, we can measure asset volatility by measuring the degree of fluctuation of the asset curve.
The curvature of the function can be used to measure the degree of fluctuation of the curve. A curve's curvature is the rate of rotation of the tangent direction angle with respect to arc length at a point on the curve. It is defined via differentials and represents the curve's bending at that point: the greater the curvature, the greater the bending, and hence the greater the degree of fluctuation [33].
Let the constructed asset fluctuation curve be $y = f(t)$, assumed twice differentiable. The curvature of the curve at a point M is defined as
$$K = \frac{|y''|}{\left( 1 + y'^2 \right)^{3/2}}$$
According to Equation (11), the first and second derivatives of the asset change curve are
$$y' = \frac{\partial \hat{Y}(t)}{\partial t} = \sum_{j=0}^{m} \hat{\beta}_j \, \frac{mt - j}{t(t-1)} \, B_{j,m}(t),$$
$$y'' = \frac{\partial^2 \hat{Y}(t)}{\partial t^2} = \sum_{j=0}^{m} \hat{\beta}_j \left[ \frac{m^2 - m}{(t-1)^2} + \frac{j^2 - j}{t^2 (t-1)^2} + \frac{2j(1-m)}{t(t-1)^2} \right] B_{j,m}(t),$$
where $0 \le t \le 1$. Based on differential geometry theory, the total curvature of a curve equals the integral of its curvature:
$$FV_t = \int_{t_j}^{t_{j+1}} \frac{|y''|}{\left( 1 + y'^2 \right)^{3/2}} \, dt$$
The formula above characterizes the volatility of high-frequency asset fluctuations and can therefore be used as a volatility measure, which we call functional volatility (FV). Note that the time interval here can be any period.
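A sketch of the FV computation follows. Rather than coding the closed-form derivatives above directly, this version uses the standard control-point difference formulas for the derivatives of a Bézier/Bernstein fit, which are mathematically equivalent but numerically stable at t = 0 and t = 1; the integration grid size is our choice, and `bernstein_design` is the helper from the fitting sketch.

```python
import numpy as np

def bezier_derivatives(beta, t):
    """First and second derivatives of the Bernstein fit via the standard
    difference formulas for Bezier curves."""
    beta = np.asarray(beta, dtype=float)
    m = len(beta) - 1
    d1 = m * np.diff(beta)                 # degree m-1 control points of Y'
    d2 = m * (m - 1) * np.diff(beta, 2)    # degree m-2 control points of Y''
    y1 = bernstein_design(t, m - 1) @ d1
    y2 = bernstein_design(t, m - 2) @ d2
    return y1, y2

def functional_volatility(beta, t0=0.0, t1=1.0, grid=2001):
    """FV over [t0, t1]: integral of the curvature |y''| / (1 + y'^2)^(3/2)."""
    t = np.linspace(t0, t1, grid)
    y1, y2 = bezier_derivatives(beta, t)
    curvature = np.abs(y2) / (1.0 + y1**2) ** 1.5
    return np.trapz(curvature, t)          # trapezoid-rule approximation
```

Because `t0` and `t1` are free, the same routine yields daily volatility (the whole interval) or near-instantaneous volatility (a very short subinterval).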

3. Experimental Analysis

To verify the effectiveness of our model, we use real data for empirical analysis and test the effects of unequally spaced samples and noise on a high-frequency asset dataset. Due to the large amount of data, it is impossible to fit well with a low-degree polynomial. Therefore, following references [34,35], we use 200 data sample points for each fit.
The data were obtained from the Thomson Reuters Tick History (TRTH) database and record asset prices at one-second intervals; in particular, the original data are equidistant. We randomly select 1000 days and use the above model to calculate the daily functional volatility as the gold standard. Figure 2 and Figure 3 illustrate the asset prices from 2012 to 2016 used in this paper and the daily volatility measured using the complete dataset.
The relative error is chosen as the evaluation criterion in this paper, which can be expressed as
$$\text{Error} = \frac{1}{N} \sum_{i=1}^{N} \frac{\left| \hat{FV}_i - FV_i \right|}{FV_i}$$
where $\hat{FV}_i$, $i = 1, \dots, N$, denotes the daily volatility obtained from the original dataset, and $FV_i$ indicates the corresponding volatility under the simulated conditions.
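In code, the criterion is a one-liner (a minimal sketch of ours):

```python
import numpy as np

def relative_error(fv_baseline, fv_simulated):
    """Mean relative error between baseline FV and FV under simulated conditions."""
    fv_baseline = np.asarray(fv_baseline, dtype=float)
    fv_simulated = np.asarray(fv_simulated, dtype=float)
    return np.mean(np.abs(fv_baseline - fv_simulated) / fv_simulated)
```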
In detail, we design the following two simulation settings.
  • We randomly remove a certain proportion of data from the original daily data, producing non-equidistant high-frequency data. The proportion removed is controlled by DropRate.
  • We randomly add noise $r \times \text{sigma}$ to the original data, where $r$ is drawn uniformly from 0 to 1. The parameter sigma determines the degree of the added noise. A code sketch of both settings follows this list.
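A sketch of the two perturbations, under our own naming; `prices` holds one day of per-second observations, and the fixed seed is our choice for reproducibility.

```python
import numpy as np

rng = np.random.default_rng(0)

def drop_samples(prices, drop_rate):
    """Setting 1: randomly remove a proportion of points (non-equidistant data)."""
    n = len(prices)
    keep = np.sort(rng.choice(n, size=int(n * (1 - drop_rate)), replace=False))
    return keep, prices[keep]          # surviving indices and observations

def add_noise(prices, sigma):
    """Setting 2: add r * sigma to each observation, with r ~ Uniform(0, 1)."""
    r = rng.uniform(0.0, 1.0, size=len(prices))
    return prices + r * sigma
```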
Table 1 and Table 2 present the maximum (sub), average (avg), and standard deviation (std) of the relative error across 500 replications. The results demonstrate the efficiency of our method in addressing non-equidistant and noisy situations, with average relative errors below 10%. In terms of maximum relative errors, the errors exceed 10% when DropRate reaches 0.3 or sigma reaches 0.4.
It is worth noting that in Simulation 1 we randomly remove original data points to produce non-equidistant asset data. This also means that we estimate the model with missing data, which verifies that our model can handle missing values.
Stock prices are usually modeled by lognormal distributions [36]. Therefore, we generate 500 days of high-frequency asset price data from a lognormal distribution to further verify the effectiveness of our model. Following the above settings, we consider the deviation of the model under random sample loss and added noise. Table 3 and Table 4 report the relative error results under different drop rates and sigma values. The results show that our method is robust.
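A hedged sketch of the simulated-data generator: the paper states only that prices are generated from a lognormal distribution, so the drift, volatility, and tick-count values below are illustrative assumptions of ours (a geometric Brownian motion discretization, whose price levels are lognormal).

```python
import numpy as np

rng = np.random.default_rng(1)

def lognormal_day(n_ticks=14400, s0=100.0, mu=0.0, sigma=0.2):
    """One simulated day of per-second prices with lognormal levels.
    All parameter values here are illustrative, not taken from the paper."""
    dt = 1.0 / n_ticks
    steps = (mu - 0.5 * sigma**2) * dt \
        + sigma * np.sqrt(dt) * rng.standard_normal(n_ticks)
    return s0 * np.exp(np.cumsum(steps))

days = [lognormal_day() for _ in range(500)]   # 500 simulated trading days
```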

4. Conclusions and Future Work

In recent decades, financial data collection technologies have evolved to allow more intensive sampling of temporal, spatial, and other continuous measurements. At the same time, the available financial data are increasingly complex, including data recorded continuously over an interval and data recorded intermittently at several discrete time points. Faced with this situation, this paper proposes an FDA approach to measure volatility indicators.
The FDA approach represents a change in philosophy for financial time series data processing. Classical multivariate statistical techniques can only extract information at the sampling time points; they do not take advantage of the additional information implied between those points.
The FDA approach assumes that the financial time series data reflect the influence of certain smooth functions underlying the observations, and additional information can be extracted from the smoothness of these underlying functions [37]. For example, geometric information such as derivatives and curvature can be obtained by calculation on the curve [38,39]. Therefore, we developed a volatility calculation model based on the concept of curvature integration.
Measuring volatility with the FDA approach can provide a new analytical perspective on the financial market for scholars, investors, and policy makers. On the one hand, the FDA approach can measure volatility over any period, including instantaneous volatility. This helps us delve into micro time scales and explore financial markets; in particular, policy makers can monitor markets in real time. On the other hand, the FDA approach represents financial time series data as a whole curve, of which volatility measurement is one example. Time series data are a common form of data in financial markets. Therefore, in financial market research, it is worth using the FDA approach, or nesting it within existing methods such as dimension reduction, clustering, and classification; new, valuable, and interesting analyses and findings may emerge.
For future research, this work can be extended in three directions. First, Bernstein polynomials are the classical basis of Bézier curves and the foundation for developing complex curves and surfaces, so more complex Bézier polynomials as basis functions for volatility measurement deserve study. Second, how to produce a curvature graph for the realized volatility curve is an interesting topic, for which [40] provides a reference. Finally, for the measurement of functional volatility, future research can consider obtaining a better fitting curve through kernel methods and verifying its mathematical properties.

Author Contributions

Conceptualization, M.Z. and C.Y.; methodology, Z.L. and F.W.; software, F.W. and Y.M.; validation, Y.X., F.W., and C.Y.; formal analysis, Z.L.; investigation, F.W.; data curation, Z.L. and Y.M.; writing—original draft preparation, F.W. and C.Y.; writing—review and editing, F.W. and C.Y.; visualization, M.Z.; supervision, Y.X.; project administration, Y.X.; funding acquisition, M.Z. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by the major project of the National Social Science Foundation (20&ZD137).

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Weng, F.; Zhang, H.; Yang, C. Volatility forecasting of crude oil futures based on a genetic algorithm regularization online extreme learning machine with a forgetting factor: The role of news during the COVID-19 pandemic. Resour. Policy 2021, 73, 102148. [Google Scholar] [CrossRef] [PubMed]
  2. Duttilo, P.; Gattone, S.; Di Battista, T. Volatility modeling: An overview of equity markets in the euro area during COVID-19 Pandemic. Mathematics 2021, 9, 1212. [Google Scholar] [CrossRef]
  3. Andersen, T.; Bollerslev, T.; Diebold, F.; Labys, P. The distribution of realized exchange rate volatility. J. Am. Stat. Assoc. 2001, 96, 42–55. [Google Scholar] [CrossRef]
  4. Bollerslev, T. Generalized autoregressive conditional heteroskedasticity. J. Econom. 1986, 31, 307–327. [Google Scholar] [CrossRef] [Green Version]
  5. Hansen, P.; Huang, Z.; Shek, H. Realized GARCH: A joint model for returns and realized measures of volatility. J. Appl. Econom. 2012, 27, 877–906. [Google Scholar] [CrossRef]
  6. Poon, S.; Granger, C. Practical issues in forecasting volatility. Financ. Anal. J. 2005, 61, 45–56. [Google Scholar] [CrossRef] [Green Version]
  7. Andersen, T.; Bollerslev, T.; Diebold, F. Parametric and nonparametric volatility measurement. In Handbook of Financial Econometrics: Tools and Techniques; North-Holland: Chicago, IL, USA, 2010; pp. 67–137. [Google Scholar]
  8. Hurvich, C.; Moulines, E.; Soulier, P. Estimating long memory in volatility. Econometrica 2005, 73, 1283–1328. [Google Scholar] [CrossRef] [Green Version]
  9. Liu, C.; Maheu, J. Are there structural breaks in realized volatility? J. Financ. Econom. 2008, 6, 326–360. [Google Scholar] [CrossRef]
  10. Bandi, F.M.; Russell, J.R. Separating microstructure noise from volatility. J. Financ. Econ. 2006, 79, 655–692. [Google Scholar] [CrossRef]
  11. Hansen, P.; Lunde, A. Realized variance and market microstructure noise. J. Bus. Econ. Stat. 2006, 24, 127–161. [Google Scholar] [CrossRef]
  12. Zhang, L.; Mykland, P.A.; Aït-Sahalia, Y. A tale of two time scales: Determining integrated volatility with noisy high-frequency data. J. Am. Stat. Assoc. 2005, 100, 1394–1411. [Google Scholar] [CrossRef]
  13. Duong, D.; Swanson, N. Empirical evidence on the importance of aggregation, asymmetry, and jumps for volatility prediction. J. Econom. 2015, 187, 606–621. [Google Scholar] [CrossRef] [Green Version]
  14. Breidt, F.; Crato, N.; Delima, P. The detection and estimation of long memory in stochastic volatility. J. Econom. 1998, 83, 325–348. [Google Scholar] [CrossRef] [Green Version]
  15. Baillie, R.; Cecen, A.; Han, Y. High frequency Deutsche Mark-US Dollar returns: FIGARCH representations and non linearities. Multinatl. Financ. J. 2000, 4, 247–267. [Google Scholar] [CrossRef]
  16. Granger, C. Long memory relationships and the aggregation of dynamic models. J. Econom. 1980, 14, 227–238. [Google Scholar] [CrossRef]
  17. Alvarez, A.; Panloup, F.; Pontier, M.; Savy, N. Estimation of the instantaneous volatility. Stat. Inference Stoch. Process. 2012, 15, 27–59. [Google Scholar] [CrossRef] [Green Version]
  18. Müller, H.; Sen, R.; Stadtmüller, U. Functional data analysis for volatility. J. Econom. 2011, 165, 233–245. [Google Scholar] [CrossRef] [Green Version]
  19. Shang, H. Forecasting intraday S&P 500 index returns: A functional time series approach. J. Forecast. 2017, 36, 741–755. [Google Scholar]
  20. Kokoszka, P.; Miao, H.; Zhang, X. Functional dynamic factor model for intraday price curves. J. Financ. Econom. 2015, 13, 456–477. [Google Scholar] [CrossRef] [Green Version]
  21. Shang, H.; Yang, Y.; Kearney, F. Intraday forecasts of a volatility index: Functional time series methods with dynamic updating. Ann. Oper. Res. 2019, 282, 331–354. [Google Scholar] [CrossRef] [Green Version]
  22. Yu, C.; Fang, Y.; Li, Z.; Zhang, B.; Zhao, X. Non-Parametric Estimation of High-Frequency Spot Volatility for Brownian Semimartingale with Jumps. J. Time Ser. Anal. 2014, 35, 572–591. [Google Scholar] [CrossRef]
  23. Wang, J.; Chiou, J.; Müller, H. Functional data analysis. Annu. Rev. Stat. Its Appl. 2016, 3, 257–295. [Google Scholar] [CrossRef] [Green Version]
  24. Ramsay, J. When the data are functions. Psychometrika 1982, 47, 379–396. [Google Scholar] [CrossRef]
  25. Kokoszka, P.; Reimherr, M. Introduction to Functional Data Analysis; Chapman and Hall/CR: London, UK, 2017. [Google Scholar]
  26. Ler, K. A brief proof of a maximal rank theorem for generic double points in projective space. Trans. Am. Math. Soc. 2001, 353, 1907–1920. [Google Scholar]
  27. Beaton, A.; Tukey, J. The fitting of power series, meaning polynomials, illustrated on band-spectroscopic data. Technometrics 1974, 16, 147–185. [Google Scholar] [CrossRef]
  28. Hatefi, E.; Hatefi, A. Nonlinear Statistical Spline Smoothers for Critical Spherical Black Hole Solutions in 4-dimension. arXiv 2022, arXiv:2201.00949. [Google Scholar]
  29. Dahiya, V. Analysis of Lagrange Interpolation Formula. IJISET-Int. J. Innov. Sci. Eng. Technol. 2014, 1, 619–624. [Google Scholar]
  30. Wang, X.; Wang, J.; Wang, X.; Yu, C. A Pseudo-Spectral Fourier Collocation Method for Inhomogeneous Elliptical Inclusions with Partial Differential Equations. Mathematics 2022, 10, 296. [Google Scholar] [CrossRef]
  31. Farouki, R. The Bernstein polynomial basis: A centennial retrospective. Comput. Aided Geom. Des. 2012, 29, 379–419. [Google Scholar] [CrossRef]
  32. Farouki, R.; Goodman, T. On the optimal stability of the Bernstein basis. Math. Comput. 1996, 65, 1553–1566. [Google Scholar] [CrossRef] [Green Version]
  33. Kühnel, W. Differential Geometry; American Mathematical Society: Providence, RI, USA, 2015. [Google Scholar]
  34. Jianping, Z.; Zhiguo, L.; Caiyun, C. A New Predictive Model on Data Mining—Predicting Arithmetic of Bernstein Basic Function Fitting and Its Application for Stock Market. Syst. Eng. Theory Pract. 2003, 9, 35–41. [Google Scholar]
  35. Shaojun, Z.; Hong, L. An improved model based on fitting predictions to Bernstein Basic Function. Stat. Decis. 2015, 8, 20–23. [Google Scholar]
  36. Wang, S. A class of distortion operators for pricing financial and insurance risks. J. Risk Insur. 2000, 1, 15–36. [Google Scholar] [CrossRef] [Green Version]
  37. Levitin, D.; Nuzzo, R.; Vines, B.; Ramsay, J. Introduction to functional data analysis. Can. Psychol. Can. 2007, 48, 135. [Google Scholar] [CrossRef] [Green Version]
  38. Ferraty, F.; Mas, A.; Vieu, P. Nonparametric regression on functional data: Inference and practical aspects. Aust. N. Z. J. Stat. 2007, 49, 267–286. [Google Scholar] [CrossRef] [Green Version]
  39. Mas, A.; Pumo, B. Functional linear regression with derivatives. J. Nonparametr. Stat. 2009, 21, 19–40. [Google Scholar] [CrossRef] [Green Version]
  40. Farin, G. Class a Bézier curves. Comput. Aided Geom. Des. 2006, 7, 573–581. [Google Scholar] [CrossRef]
Figure 1. Convex hull diagram with four control points $P_0, P_1, P_2, P_3$.
Figure 2. Asset prices from 2012 to 2016.
Figure 3. Daily volatility measured using the complete dataset.
Table 1. Relative error results under different drop rates on real data.

DropRate    sub         avg         std
0.1         0.003993    0.000017    0.000192
0.2         0.035771    0.000225    0.001836
0.3         0.115362    0.000767    0.006560
Table 2. Relative error results under different sigma values on real data.

Sigma    sub         avg         std
0.1      0.032560    0.016152    0.009657
0.2      0.065114    0.031076    0.019011
0.3      0.097928    0.049194    0.029173
0.4      0.130392    0.064272    0.038869
Table 3. Relative error results under different drop rates on simulation data.

DropRate    sub         avg         std
0.1         0.001301    0.000209    0.000108
0.2         0.012924    0.000368    0.000861
0.3         0.239079    0.001688    0.012562
Table 4. Relative error results under different sigma values on simulation data.

Sigma    sub         avg         std
0.1      0.000072    0.000040    0.000571
0.2      0.000145    0.000114    0.002248
0.3      0.097928    0.000116    0.001764