Unsupervised Offline Changepoint Detection Ensembles
Abstract
1. Introduction
Changepoint detection is the study of methods for identifying changes in the probability distribution of an observed stochastic process.
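As a minimal illustration (our own sketch, not taken from the paper), consider a univariate signal whose mean shifts at a known index; a single changepoint can be located by minimizing the total within-segment squared deviation:

```python
import numpy as np

# Hypothetical example: a signal whose mean shifts from 0 to 3 at index 100.
rng = np.random.default_rng(0)
signal = np.concatenate([rng.normal(0.0, 1.0, 100),
                         rng.normal(3.0, 1.0, 100)])

def best_single_changepoint(x):
    """Return the split index minimizing the summed within-segment
    squared deviation (an l2-style cost with exactly one changepoint)."""
    n = len(x)
    best_t, best_cost = 1, np.inf
    for t in range(1, n):
        left, right = x[:t], x[t:]
        cost = ((left - left.mean()) ** 2).sum() + ((right - right.mean()) ** 2).sum()
        if cost < best_cost:
            best_t, best_cost = t, cost
    return best_t
```

With a shift this pronounced, the estimated index lands close to the true changepoint at 100.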
2. Materials and Methods
2.1. Ensembles of Offline Changepoint Detection Procedures
2.1.1. Time Series
2.1.2. Changepoint
2.1.3. Changepoint Detection Procedure
- A cost function, also called a model;
- A search method for finding , also called an optimization or detection algorithm;
- A constraint on the number of changepoints if the exact number of changepoints K is unknown.
2.1.4. CPD Ensemble Procedure
- Avoiding the loss of useful information that can occur when the outputs of whole CPD procedures are aggregated. Our approach works with scores representing confidence and many time-related characteristics, whereas in the second approach only the sets of changepoints are available for combining;
- Borrowing useful ideas from classification and outlier ensembles, where combining the scores of the base detectors is the most common approach;
- Opening more opportunities when working with the multivariate time series of scores, since classical techniques from time series analysis can be applied;
- Constructing a framework with four main parts (cost function, scaling function, aggregation function, search method). This simplifies both the experiments conducted here and further research, and allows the results to be improved by minor changes to the best-performing methods.
- Model-centered: the ensemble is built from different models; we do not pick subsets of data points or data features (data-centered).
- Independent: the ensemble components (cost function scores) are calculated before aggregation, independently of each other rather than sequentially.
- Scaled (normalized): the aggregation function should include a scaling procedure for each argument (cost function output), since different cost functions may produce differently scaled outputs. Scaling avoids favoring one or more cost functions.
- MinMax: normalizing all values to the range [0, 1], also known as MinMax scaling;
- Znorm: zero mean, unit variance scaling, also known as Z-normalization or standard scaling;
- MaxAbs (MinAbs): scaling by maximum (minimum) absolute value, also known as MaxAbs (MinAbs) scaling;
- Rank: using ranks of the criterion points from minimum to maximum value.
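The four scaling options above can be sketched as follows (a minimal sketch with our own function names, assuming the scores of one cost function arrive as a 1-D array):

```python
import numpy as np

def minmax(s):
    """MinMax: map scores linearly onto [0, 1]."""
    return (s - s.min()) / (s.max() - s.min())

def znorm(s):
    """Znorm: zero mean, unit variance (standard scaling)."""
    return (s - s.mean()) / s.std()

def maxabs(s):
    """MaxAbs: divide by the maximum absolute value."""
    return s / np.abs(s).max()

def rank(s):
    """Rank: rank of each score, from minimum (0) to maximum (len(s) - 1)."""
    return np.argsort(np.argsort(s)).astype(float)
```

Rank scaling discards the magnitudes entirely, which makes it the most robust to outlying scores but also the most lossy.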
- Average: averaging scores of all cost functions;
- WeightedAverage: weighting the cost functions and then averaging the weighted scores. The difference between static and dynamic weighting is presented in [29]. Commonly, the weights for the various models or cost functions are predetermined [16,29]. For unsupervised offline ensembles, the weights can express the degree of confidence of each separate detector.
- Max (Min): selecting maximum (or minimum) among scores of various cost functions;
- Sum: summing the scores of all cost functions;
- ThresholdSum: discarding (pruning) scores below a selected threshold and then summing the remaining scores. Pruning can be applied either to the scores of every single model by using the threshold or to the scores at each point by selecting only the top models [12];
- AverageOfMax: dividing cost functions into groups, taking maximum of scores of each group, and then averaging;
- MaxOfAverage: dividing cost functions into groups, averaging scores of each group, and then taking maximum of averages;
- FeatureBagging: applying the cost functions to feature subsets and averaging the obtained scores.
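A few of these aggregation rules can be sketched over an (n_models × T) score matrix S (our own naming; the scores are assumed to be already scaled):

```python
import numpy as np

def average(S):
    """Average: mean score of all cost functions at each time point."""
    return S.mean(axis=0)

def weighted_average(S, w):
    """WeightedAverage: convex combination with per-model weights w."""
    w = np.asarray(w, dtype=float)
    return (w[:, None] * S).sum(axis=0) / w.sum()

def agg_max(S):
    """Max: largest score among the cost functions at each time point."""
    return S.max(axis=0)

def threshold_sum(S, thr):
    """ThresholdSum: prune scores below thr, then sum the survivors."""
    return np.where(S >= thr, S, 0.0).sum(axis=0)
```

AverageOfMax and MaxOfAverage follow by composing `agg_max` and `average` over groups of rows of S.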
2.2. Numerical Experiment
2.2.1. Benchmarks
Tennessee Eastman Process (TEP) Benchmark
Skoltech Anomaly Benchmark (SKAB)
2.2.2. Search Methods
Extension
2.2.3. Cost Functions
Median-Shift through Least Absolute Deviation—l1
Mean-Shift through Least Squared Deviation—l2
Mahalanobis-Type Metric—Mahalanobis
Piecewise Linear Model—Linear
Piecewise Autoregressive Model—ar
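For instance, the l1 and l2 costs of a candidate segment reduce to deviations around the segment median and mean, respectively (a sketch under our own naming):

```python
import numpy as np

def cost_l1(seg):
    """Least absolute deviation: sum of |x - median| over the segment."""
    seg = np.asarray(seg, dtype=float)
    return np.abs(seg - np.median(seg)).sum()

def cost_l2(seg):
    """Least squared deviation: sum of (x - mean)^2 over the segment."""
    seg = np.asarray(seg, dtype=float)
    return ((seg - seg.mean()) ** 2).sum()
```

The Mahalanobis-type, linear, and autoregressive costs share the same interface: a function mapping a segment to a goodness-of-fit value, which is what makes them interchangeable inside one ensemble.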
2.2.4. Scaling Functions
2.2.5. Aggregation Functions
2.2.6. Performance Measures
3. Results
- + aggregation + scaling;
- + aggregation + scaling;
- + aggregation + scaling;
- + aggregation + scaling.
3.1. CPD and CPDE Procedures
3.2. Cost Functions (Models)
3.3. Aggregation Functions
3.4. Scaling Functions
3.5. CPD and CPDE Procedures vs. SOTA Changepoint Detection Algorithms
4. Discussion
- According to the experiment results, the ensemble approach is almost always the preferred option. Among non-ensemble procedures, the favorites are the and methods, but only with the cost function. Among ensemble search algorithms, we recommend using .
- The question of which combination function to select is raised in [13]. Averaging and maximum are the only aggregation functions the authors compare from a bias-variance trade-off perspective, and neither is declared a clear winner: their experiments indicated that maximization outperformed averaging on larger datasets, while the opposite held for smaller datasets. Our experiment revealed that selecting either or is a reliable strategy for steadily achieving high scores. The aggregation function may lead us to the best or the worst score, while it can still be used in combination with and or functions without major losses in score. Overall, we also cannot single out one best combination strategy.
- Regarding the scaling functions, we recommend avoiding the scaling function to maximize the results in most cases, even though it occasionally yields a high score. All of the other scaling functions achieve similar results.
5. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Data Availability Statement
Acknowledgments
Conflicts of Interest
Appendix A. Pseudocode of the Ensemble Algorithms
- (Algorithm A1) based on the algorithm;
- (Algorithm A2) based on the algorithm;
- (Algorithm A3) based on the algorithm.
Algorithm A1
Input: signal , cost functions , number of regimes .
for all , do
    for do
        Initialize
    end for
end for
.
for do
    for all do
    end for
end for
Initialize L, a list with K elements.
Initialize the last element: .
Initialize .
while do
    .
end while
Remove T from L.
Output: set L of estimated breakpoint indexes.
Algorithm A2
Input: signal , cost functions , half-window width w, peak search procedure .
Initialize NT-long arrays filled with 0: , . ▹ Score list.
for do
    .
    .
    .
    for do
        .
    end for
    .
end for
. ▹ Peak search procedure.
Output: set L of estimated breakpoint indexes.
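The window-sliding skeleton of Algorithm A2 might be sketched in Python as below (our own reconstruction, with hypothetical names and a trivial argmax in place of the peak search procedure). Each cost function scores the discrepancy between the two half-windows around every point, yielding one score series per model for later scaling and aggregation:

```python
import numpy as np

def cost_l2(seg):
    """Least-squares cost of a segment."""
    return ((seg - seg.mean()) ** 2).sum()

def window_scores(signal, costs, w):
    """Return an (n_costs, T) matrix: for each model, the gain
    c(merged halves) - c(left half) - c(right half) at every point t."""
    T = len(signal)
    S = np.zeros((len(costs), T))
    for i, c in enumerate(costs):
        for t in range(w, T - w):
            S[i, t] = (c(signal[t - w:t + w])
                       - c(signal[t - w:t])
                       - c(signal[t:t + w]))
    return S

# Toy signal with a mean shift at index 60.
rng = np.random.default_rng(0)
sig = np.concatenate([rng.normal(0, 1, 60), rng.normal(4, 1, 60)])
S = window_scores(sig, [cost_l2], w=15)
peak = int(S.mean(axis=0).argmax())  # trivial peak search: single argmax
```

The score matrix S is exactly the object the scaling and aggregation functions of Section 2.1.4 operate on.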
Algorithm A3
Input: signal , cost functions , stopping criterion.
Initialize . ▹ Estimated breakpoints.
repeat
    . ▹ Number of breakpoints.
    ▹ Dummy variables.
    if then
        Denote by the elements (in ascending order) of L, i.e., .
    end if
    Initialize G, a -long array. ▹ List of gains.
    for do
        for do
        end for
        .
    end for
    .
until stopping criterion is met.
Output: set L of estimated breakpoint indexes.
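The greedy splitting skeleton of Algorithm A3 can be sketched as follows (our own reconstruction, using a single aggregated cost instead of the full ensemble and a fixed number of breakpoints as a simplified stopping criterion):

```python
import numpy as np

def cost_l2(seg):
    """Least-squares cost of a segment."""
    return ((seg - seg.mean()) ** 2).sum()

def binseg(signal, cost, n_bkps, min_size=2):
    """Greedy binary segmentation: repeatedly add the split with the
    largest cost gain over the current segmentation."""
    bkps = [len(signal)]  # breakpoint list, always ending with T
    for _ in range(n_bkps):
        best_gain, best_t = -np.inf, None
        start = 0
        for end in sorted(bkps):
            if end - start >= 2 * min_size:
                base = cost(signal[start:end])
                for t in range(start + min_size, end - min_size + 1):
                    gain = base - cost(signal[start:t]) - cost(signal[t:end])
                    if gain > best_gain:
                        best_gain, best_t = gain, t
            start = end
        if best_t is None:
            break
        bkps = sorted(bkps + [best_t])
    return bkps
```

In the ensemble version, the gain array G is computed from the aggregated, scaled scores of all cost functions rather than from one cost.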
References
- Fearnhead, P.; Rigaill, G. Changepoint Detection in the Presence of Outliers. J. Am. Stat. Assoc. 2018, 114, 169–183. [Google Scholar] [CrossRef] [Green Version]
- Chandola, V.; Banerjee, A.; Kumar, V. Anomaly detection. ACM Comput. Surv. 2009, 41, 1–58. [Google Scholar] [CrossRef]
- Aggarwal, C.C. Outlier analysis. In Data Mining; Springer: Berlin/Heidelberg, Germany, 2015; pp. 237–263. [Google Scholar]
- Artemov, A.; Burnaev, E. Ensembles of detectors for online detection of transient changes. In Proceedings of the Eighth International Conference on Machine Vision (ICMV 2015), Barcelona, Spain, 19–21 November 2015; Verikas, A., Radeva, P., Nikolaev, D., Eds.; SPIE: Bellingham, WA, USA, 2015. [Google Scholar] [CrossRef]
- Tartakovsky, A.G.; Rozovskii, B.L.; Blazek, R.B.; Kim, H. A novel approach to detection of intrusions in computer networks via adaptive sequential and batch-sequential change-point detection methods. IEEE Trans. Signal Process. 2006, 54, 3372–3382. [Google Scholar] [CrossRef] [Green Version]
- Banerjee, T.; Chen, Y.C.; Dominguez-Garcia, A.D.; Veeravalli, V.V. Power system line outage detection and identification—A quickest change detection approach. In Proceedings of the 2014 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), Florence, Italy, 4–9 May 2014; pp. 3450–3454. [Google Scholar]
- Bai, J. Estimation of a change point in multiple regression models. Rev. Econ. Stat. 1997, 79, 551–563. [Google Scholar] [CrossRef]
- Reeves, J.; Chen, J.; Wang, X.L.; Lund, R.; Lu, Q.Q. A review and comparison of changepoint detection techniques for climate data. J. Appl. Meteorol. Climatol. 2007, 46, 900–915. [Google Scholar] [CrossRef]
- Rad, M.Z.; Ghuchani, S.R.; Bahaadinbeigy, K.; Khalilzadeh, M.M. Real time recognition of heart attack in a smart phone. Acta Inform. Med. 2015, 23, 151. [Google Scholar] [CrossRef] [Green Version]
- Shvetsov, N.; Buzun, N.; Dylov, D.V. Unsupervised non-parametric change point detection in electrocardiography. In Proceedings of the 32nd International Conference on Scientific and Statistical Database Management, Vienna, Austria, 7–9 July 2020; pp. 1–4. [Google Scholar]
- Zhao, K.; Wulder, M.A.; Hu, T.; Bright, R.; Wu, Q.; Qin, H.; Li, Y.; Toman, E.; Mallick, B.; Zhang, X.; et al. Detecting change-point, trend, and seasonality in satellite time series data to track abrupt changes and nonlinear dynamics: A Bayesian ensemble algorithm. Remote. Sens. Environ. 2019, 232, 111181. [Google Scholar] [CrossRef]
- Aggarwal, C.C. Outlier ensembles: Position paper. ACM SIGKDD Explor. Newsl. 2013, 14, 49–58. [Google Scholar] [CrossRef]
- Aggarwal, C.C.; Sathe, S. Theoretical foundations and algorithms for outlier ensembles. ACM Sigkdd Explor. Newsl. 2015, 17, 24–47. [Google Scholar] [CrossRef]
- Rayana, S.; Akoglu, L. Less is more: Building selective anomaly ensembles. ACM Trans. Knowl. Discov. Data (TKDD) 2016, 10, 1–33. [Google Scholar] [CrossRef]
- Chen, J.; Sathe, S.; Aggarwal, C.; Turaga, D. Outlier detection with autoencoder ensembles. In Proceedings of the 2017 SIAM International Conference on Data Mining, Houston, TX, USA, 27–29 April 2017; pp. 90–98. [Google Scholar]
- Smolyakov, D.; Sviridenko, N.; Ishimtsev, V.; Burikov, E.; Burnaev, E. Learning ensembles of anomaly detectors on synthetic data. In Proceedings of the International Symposium on Neural Networks, Moscow, Russia, 10–12 July 2019; pp. 292–306. [Google Scholar]
- Zhao, Y.; Nasrullah, Z.; Hryniewicki, M.K.; Li, Z. LSCP: Locally selective combination in parallel outlier ensembles. In Proceedings of the 2019 SIAM International Conference on Data Mining, Calgary, AB, Canada, 2–4 May 2019; pp. 585–593. [Google Scholar]
- Gao, J.; Fan, W.; Turaga, D.; Verscheure, O.; Meng, X.; Su, L.; Han, J. Consensus extraction from heterogeneous detectors to improve performance over network traffic anomaly detection. In Proceedings of the 2011 Proceedings IEEE Infocom, Shanghai, China, 10–15 April 2011; pp. 181–185. [Google Scholar]
- Alippi, C.; Boracchi, G.; Roveri, M. Ensembles of change-point methods to estimate the change point in residual sequences. Soft Comput. 2013, 17, 1971–1981. [Google Scholar] [CrossRef]
- Alippi, C.; Boracchi, G.; Puig, V.; Roveri, M. An ensemble approach to estimate the fault-time instant. In Proceedings of the 2013 Fourth International Conference on Intelligent Control and Information Processing (ICICIP), Beijing, China, 9–11 June 2013; pp. 836–841. [Google Scholar]
- Faithfull, W.J.; Rodríguez, J.J.; Kuncheva, L.I. Combining univariate approaches for ensemble change detection in multivariate data. Inf. Fusion 2019, 45, 202–214. [Google Scholar] [CrossRef] [Green Version]
- Truong, C.; Oudre, L.; Vayatis, N. Selective review of offline change point detection methods. Signal Process. 2020, 167, 107299. [Google Scholar] [CrossRef] [Green Version]
- Katser, I.; Kozitsin, V.; Maksimov, I. NPP Equipment Fault Detection Methods. Izvestiya vuzov. Yadernaya Energetika 2019, 4, 5–27. [Google Scholar] [CrossRef]
- Advanced Surveillance, Diagnostic and Prognostic Techniques in Monitoring Structures, Systems and Components in Nuclear Power Plants; Number NP-T-3.14 in Nuclear Energy Series; International Atomic Energy Agency: Vienna, Austria, 2013.
- Lu, Y. Industry 4.0: A survey on technologies, applications and open research issues. J. Ind. Inf. Integr. 2017, 6, 1–10. [Google Scholar] [CrossRef]
- Vaidya, S.; Ambad, P.; Bhosle, S. Industry 4.0—A Glimpse. Procedia Manuf. 2018, 20, 233–238. [Google Scholar] [CrossRef]
- Kuncheva, L.I. Combining Pattern Classifiers: Methods and Algorithms; John Wiley & Sons: Hoboken, NJ, USA, 2014. [Google Scholar]
- Nguyen, V.L.; Hüllermeier, E.; Rapp, M.; Mencía, E.L.; Fürnkranz, J. On Aggregation in Ensembles of Multilabel Classifiers. In Proceedings of the 23rd International Conference, DS 2020, Thessaloniki, Greece, 19–21 October 2020; pp. 533–547. [Google Scholar]
- Costa, V.S.; Farias, A.D.S.; Bedregal, B.; Santiago, R.H.; Canuto, A.M.d.P. Combining multiple algorithms in classifier ensembles using generalized mixture functions. Neurocomputing 2018, 313, 402–414. [Google Scholar] [CrossRef] [Green Version]
- Downs, J.J.; Vogel, E.F. A plant-wide industrial process control problem. Comput. Chem. Eng. 1993, 17, 245–255. [Google Scholar] [CrossRef]
- Chiang, L.H.; Russell, E.L.; Braatz, R.D. Fault Detection and Diagnosis in Industrial Systems; Springer: London, UK; Science & Business Media: Berlin, Germany, 2000. [Google Scholar]
- Katser, I.D.; Kozitsin, V.O. Skoltech Anomaly Benchmark (SKAB). 2020. Available online: https://www.kaggle.com/dsv/1693952 (accessed on 8 May 2021).
- Guédon, Y. Exploring the latent segmentation space for the assessment of multiple change-point models. Comput. Stat. 2013, 28, 2641–2678. [Google Scholar] [CrossRef]
- Fryzlewicz, P. Wild binary segmentation for multiple change-point detection. Ann. Stat. 2014, 42, 2243–2281. [Google Scholar] [CrossRef]
- Bai, J. Least absolute deviation estimation of a shift. In Econometric Theory; Cambridge University Press: Cambridge, UK, 1995; pp. 403–436. [Google Scholar] [CrossRef]
- Xing, E.P.; Jordan, M.I.; Russell, S.J.; Ng, A.Y. Distance metric learning with application to clustering with side-information. In Advances in Neural Information Processing Systems; MIT Press: Cambridge, MA, USA, 2003; pp. 521–528. [Google Scholar]
- Mahalanobis, P.C. On the Generalized Distance in Statistics. In Proceedings of the National Institute of Sciences of India, Calcutta, India, 16 April 1936; Volume 2, pp. 49–55. [Google Scholar]
- Bai, J.; Perron, P. Critical values for multiple structural change tests. Econom. J. 2003, 6, 72–78. [Google Scholar] [CrossRef]
- Bai, J. Vector Autoregressive Models with Structural Changes in Regression Coefficients and in Variance-Covariance Matrices; Technical Report; China Economics and Management Academy, Central University of Finance: Beijing, China, 2000. [Google Scholar]
- Shao, J.D.; Rong, G.; Lee, J.M. Generalized orthogonal locality preserving projections for nonlinear fault detection and diagnosis. Chemom. Intell. Lab. Syst. 2009, 96, 75–83. [Google Scholar] [CrossRef]
- Odiowei, P.E.; Cao, Y. Nonlinear Dynamic Process Monitoring Using Canonical Variate Analysis and Kernel Density Estimations. IEEE Trans. Ind. Inform. 2010, 6, 36–45. [Google Scholar] [CrossRef] [Green Version]
- Yin, S.; Ding, S.X.; Haghani, A.; Hao, H.; Zhang, P. A comparison study of basic data-driven fault diagnosis and process monitoring methods on the benchmark Tennessee Eastman process. J. Process. Control. 2012, 22, 1567–1581. [Google Scholar] [CrossRef]
- Lavin, A.; Ahmad, S. Evaluating Real-Time Anomaly Detection Algorithms—The Numenta Anomaly Benchmark. In Proceedings of the 2015 IEEE 14th International Conference on Machine Learning and Applications (ICMLA), Miami, FL, USA, 9–11 December 2015. [Google Scholar] [CrossRef] [Green Version]
- Safin, A.M.; Burnaev, E. Conformal kernel expected similarity for anomaly detection in time-series data. Adv. Syst. Sci. Appl. 2017, 17, 22–33. [Google Scholar]
- Ishimtsev, V.; Bernstein, A.; Burnaev, E.; Nazarov, I. Conformal k-NN Anomaly Detector for Univariate Data Streams. In Proceedings of the Machine Learning Research, Stockholm, Sweden, 13–16 June 2017; Volume 60: Conformal and Probabilistic Prediction and Applications, pp. 213–227. [Google Scholar]
- Kozitsin, V.; Katser, I.; Lakontsev, D. Online Forecasting and Anomaly Detection Based on the ARIMA Model. Appl. Sci. 2021, 11, 3194. [Google Scholar] [CrossRef]
- Filonov, P.; Kitashov, F.; Lavrentyev, A. Rnn-based early cyber-attack detection for the tennessee eastman process. arXiv 2017, arXiv:1709.02232. [Google Scholar]
- Zhang, C.; Song, D.; Chen, Y.; Feng, X.; Lumezanu, C.; Cheng, W.; Ni, J.; Zong, B.; Chen, H.; Chawla, N.V. A deep neural network for unsupervised anomaly detection and diagnosis in multivariate time series data. In Proceedings of the AAAI Conference on Artificial Intelligence, Honolulu, HI, USA, 27 January–1 February 2019; Volume 33, pp. 1409–1416. [Google Scholar]
- Filonov, P.; Lavrentyev, A.; Vorontsov, A. Multivariate industrial time series with cyber-attack simulation: Fault detection using an lstm-based predictive data model. arXiv 2016, arXiv:1612.06676. [Google Scholar]
- Hotelling, H. Multivariate Quality Control Illustrated by Air Testing of Sample Bombsights. In Techniques of Statistical Analysis; Eisenhart, C., Hastay, M.W., Wallis, W.A., Eds.; McGraw-Hill: New York, NY, USA, 1947; pp. 111–184. [Google Scholar]
Metric | TP Weight | FP Weight | TN Weight | FN Weight
---|---|---|---|---
Standard | 1.0 | −0.11 | 1.0 | −1.0 |
LowFP | 1.0 | −0.22 | 1.0 | −1.0 |
LowFN | 1.0 | −0.11 | 1.0 | −2.0 |
Standard | LowFP | LowFN | Standard | LowFP | LowFN | Standard | LowFP | LowFN | ||
---|---|---|---|---|---|---|---|---|---|---|
Model | ||||||||||
ar (1) | 30.15 | 28.89 | 32.8 | 13.23 | 12.19 | 13.59 | 30.15 | 28.89 | 32.8 | |
mahalanobis | 36.88 | 35.82 | 37.29 | 27.79 | 27 | 28.05 | 36.88 | 35.82 | 37.29 | |
l1 | 32.53 | 31.98 | 32.8 | 20.63 | 19.85 | 21.69 | 32.53 | 31.98 | 32.8 | |
l2 | 30.3 | 29.52 | 31.31 | 22.09 | 21.68 | 22.66 | 30.3 | 29.52 | 31.31 | |
linear | 4.5 | 4.24 | 4.59 | - | - | - | 4.5 | 4.24 | 4.59 | |
Aggregation | Scaling | |||||||||
Min | MinMax | 41.81 | 41 | 42.16 | 23.55 | 23.28 | 23.63 | 41.81 | 41 | 42.16 |
Znorm | 25.66 | 24.9 | 26.63 | 23.28 | 22.76 | 23.46 | 25.66 | 24.9 | 26.63 | |
MinAbs | 22.85 | 21.8 | 24.76 | 23.82 | 23.35 | 25.4 | 22.85 | 21.82 | 24.76 | |
Rank | 41.81 | 41 | 42.16 | 22.93 | 22.37 | 23.23 | 41.81 | 41 | 42.16 | |
Sum | MinMax | 34.8 | 34 | 35.9 | 23.54 | 23.28 | 23.63 | 34.8 | 34 | 35.9 |
Znorm | 34.83 | 34.03 | 35.92 | 23.55 | 23.28 | 23.63 | 34.83 | 34.03 | 35.92 | |
MinAbs | 34.73 | 33.68 | 35.85 | 23.68 | 22.96 | 25.31 | 34.8 | 34 | 35.9 | |
Rank | 30.47 | 29.95 | 31.42 | 22.62 | 22.27 | 23.02 | 30.47 | 29.95 | 31.42 | |
WeightedSum | MinMax | 34.8 | 34 | 35.9 | 23.55 | 23.28 | 23.63 | 34.8 | 34 | 35.9 |
Znorm | 34.83 | 34.03 | 35.92 | 23.55 | 23.28 | 23.63 | 34.83 | 34.03 | 35.92 | |
MinAbs | 34.73 | 33.68 | 35.85 | 25.14 | 24.33 | 26.29 | 34.8 | 34 | 35.9 | |
Rank | 25.59 | 25.06 | 26.59 | 23.28 | 22.76 | 23.46 | 25.59 | 25.06 | 26.59 | |
ThresholdSum | MinMax | 33.51 | 32.58 | 35.04 | 11.46 | 10.95 | 12.4 | 33.64 | 32.73 | 35.13 |
Znorm | 34.83 | 34.03 | 35.92 | 11.46 | 10.95 | 12.4 | 34.83 | 34.03 | 35.92 | |
MinAbs | −5.5 | −11 | −3.67 | 13.75 | 13.22 | 13.93 | 33.64 | 32.73 | 35.13 | |
Rank | −5.5 | −11 | −3.67 | 6.38 | 5.59 | 7.43 | 0 | 0 | 0 |
Standard | LowFP | LowFN | Standard | LowFP | LowFN | Standard | LowFP | LowFN | ||
---|---|---|---|---|---|---|---|---|---|---|
Model | ||||||||||
ar (1) | 19.4 | 16.83 | 20.63 | 12.36 | 9.58 | 13.62 | 21.39 | 18.89 | 22.72 | |
mahalanobis | 22.37 | 19.9 | 23.37 | 15.55 | 13.44 | 16.27 | 24.1 | 21.69 | 25.04 | |
l1 | 18.64 | 15.99 | 20.12 | 18.4 | 16.22 | 19.19 | 17.87 | 15.1 | 19.09 | |
l2 | 18.96 | 16.5 | 20.33 | 14.78 | 12.4 | 16.01 | 17.46 | 14.81 | 18.82 | |
linear | 9.37 | 6.6 | 10.61 | - | - | - | 9.53 | 6.7 | 10.97 | |
Aggregation | Scaling | |||||||||
Min | MinMax | 19.77 | 17.04 | 20.87 | 14.41 | 11.88 | 15.51 | 0.18 | −4.69 | 1.91
Znorm | 17.71 | 15.01 | 18.99 | 15.85 | 13.19 | 16.98 | 13.03 | 10.85 | 14.07 | |
MinAbs | 19.33 | 16.67 | 20.83 | 16.51 | 13.92 | 17.68 | 7.05 | 3.98 | 8.55 | |
Rank | 19.77 | 17.04 | 20.87 | 16.08 | 13.22 | 17.39 | 0.6 | −3.84 | 2.19 |
Sum | MinMax | 20.52 | 18.09 | 21.88 | 16.7 | 14.54 | 17.54 | 15.71 | 13 | 16.89 |
Znorm | 20.89 | 18.46 | 22.13 | 16.14 | 13.85 | 16.91 | 11.64 | 10.24 | 12.12 | |
MinAbs | 20.25 | 17.95 | 21.45 | 19.38 | 17.03 | 20.35 | 15.84 | 13.26 | 16.97 | |
Rank | 21.53 | 18.98 | 22.82 | 14.25 | 11.5 | 15.39 | 10.15 | 8.29 | 11.12 | |
WeightedSum | MinMax | 21.24 | 18.77 | 22.62 | 15.31 | 12.94 | 16.1 | 16.41 | 13.95 | 17.6 |
Znorm | 20.89 | 18.46 | 22.13 | 14.08 | 11.6 | 15.03 | 15.3 | 12.74 | 16.61 | |
MinAbs | 20.16 | 17.78 | 21.39 | 18.58 | 16.19 | 19.57 | 16.41 | 13.95 | 17.6 | |
Rank | 23.07 | 20.52 | 24.35 | 12.9 | 10.22 | 13.99 | 18.1 | 15.36 | 19.51 | |
ThresholdSum | MinMax | 21.2 | 18.69 | 22.6 | 9.5 | 7.31 | 10.69 | 15.71 | 13 | 16.89 |
Znorm | 21.7 | 19.32 | 22.93 | 10.06 | 7.93 | 11.32 | 11.64 | 10.24 | 12.12 | |
MinAbs | −5.5 | −11 | −3.67 | 9.96 | 6.91 | 11 | 15.8 | 13.17 | 16.94 | |
Rank | −5.5 | −11 | −3.67 | 10.87 | 7.98 | 12.12 | 0 | 0 | 0 |
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
© 2021 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Share and Cite
Katser, I.; Kozitsin, V.; Lobachev, V.; Maksimov, I. Unsupervised Offline Changepoint Detection Ensembles. Appl. Sci. 2021, 11, 4280. https://doi.org/10.3390/app11094280