Entropy in Data Analysis II

A special issue of Entropy (ISSN 1099-4300). This special issue belongs to the section "Signal and Data Analysis".

Deadline for manuscript submissions: closed (15 October 2023) | Viewed by 7266

Special Issue Editors


Guest Editor
Research Fellow in Biomedical Signal Processing & Machine Learning, University of Toronto, Toronto, ON M5S 1A1, Canada
Interests: biomedical signal processing; nonlinear analysis; brain–computer interface; machine learning

Guest Editor
Department of Medicine, Brigham and Women's Hospital, Harvard Medical School, Boston, MA 02115, USA
Interests: Alzheimer's disease; biomedical signal processing; cardiovascular dynamics; fractal physiology; healthy aging; sleep

Guest Editor
Computational NeuroEngineering Lab, University of Florida, Gainesville, FL 32611, USA
Interests: information theoretic learning; kernel methods; adaptive signal processing; brain machine interfaces

Special Issue Information

Dear Colleagues,

Entropy is a powerful nonlinear metric widely used to assess the dynamical characteristics of data. A number of methods, such as sample entropy, fuzzy entropy, permutation entropy, distribution entropy, and dispersion entropy, have been introduced to quantify the irregularity or uncertainty of signals and images. Multiscale extensions of these methods have been developed to quantify the complexity of data across the multiple time scales inherent in such signals and images. For a better understanding of the underlying signal-generating system, multivariate multiscale entropy methods have also been proposed, which take the temporal and spatial domains into account simultaneously.

These entropy approaches have been used in a wide range of real-world applications, from neuroscience and biomedical engineering to mechanical and financial studies. In particular, they have been successfully applied to physiological signals, such as electrocardiograms (ECG), electroencephalograms (EEG), electromyograms (EMG), electrooculograms (EOG), gait fluctuations, and respiratory signals, to aid in the diagnosis of diseases such as Alzheimer's disease, Parkinson's disease, ALS, and ataxia.

The main goal of this Special Issue is to disseminate new and original research based on entropy analyses, in order to foster a better understanding of physiology and data-generating mechanisms, the early diagnosis of disorders and diseases, treatment monitoring, and the planning of healthcare strategies needed to prevent certain pathologies. Another goal is to address practical challenges in using these entropy-based approaches, such as the effects of various noises, quantization, data length, and parameter tuning. This Special Issue also seeks contributions on signal analysis based on correntropy, mutual information, divergences, and related measures, which can capture higher-order statistics and the information content of signals.
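As a concrete illustration of the divergence measures mentioned above, the following is a minimal sketch of the Kullback–Leibler divergence (relative entropy) between two discrete distributions. It uses plain NumPy; the function name and the example probability vectors are our own illustrative choices, not taken from any contribution to this Special Issue.

```python
import numpy as np

def kl_divergence(p, q):
    """D_KL(p || q) = sum_i p_i * log(p_i / q_i), in nats.

    Assumes p and q are valid probability vectors with q_i > 0
    wherever p_i > 0; terms with p_i == 0 contribute zero."""
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    mask = p > 0                      # 0 * log(0/q) is taken as 0
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

uniform = np.array([0.25, 0.25, 0.25, 0.25])
skewed  = np.array([0.70, 0.10, 0.10, 0.10])
print(kl_divergence(skewed, uniform))   # positive: distributions differ
print(kl_divergence(uniform, uniform))  # 0.0: identical distributions
```

Note that D_KL is not symmetric: D_KL(skewed || uniform) generally differs from D_KL(uniform || skewed), which is one reason symmetrized variants and other divergences also appear in signal analysis.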

Potential topics include but are not limited to the following:

  • Spectral, sample, fuzzy, permutation, distribution, dispersion, and fluctuation dispersion entropies;
  • Kullback–Leibler divergence (relative entropy), correntropy, and causality analysis;
  • Analysis of physiological signals at multiple temporal, frequency, and spatial scales (ECG, EEG, EMG, MEG, etc.);
  • Underlying mechanisms behind entropy-based results for physiological data, to improve our understanding of disease diagnosis, pathogenesis, and progression;
  • Psychophysiological signals (physical/mental/emotional analysis), especially in newborns and the elderly;
  • Univariate and multivariate multiscale entropy and complexity loss theory for diagnosing diseases and monitoring treatments;
  • Complexity loss theory in different diseases and disorders, especially dementia, epilepsy, and sleep disorders;
  • Practical considerations: data length, embedding dimension, time delay, noise power, and signal modality characterization for health;
  • Two- and three-dimensional entropy methods for image analysis;
  • Mechanical and financial applications of entropy methods.
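As an illustration of one of the entropy measures listed above, the following is a minimal NumPy sketch of sample entropy, following the common formulation of Richman and Moorman (Chebyshev distance, self-matches excluded, tolerance r as a fraction of the standard deviation). The function name and default parameters are illustrative choices, not a reference implementation.

```python
import numpy as np

def sample_entropy(x, m=2, r=None):
    """Sample entropy of a 1-D series: -ln(A/B), where B counts
    template pairs of length m within tolerance r, and A counts
    pairs of length m + 1, excluding self-matches."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    if r is None:
        r = 0.2 * np.std(x)           # common convention: r = 0.2 * SD

    def count_matches(k):
        # Use n - m templates for both lengths so A and B are comparable
        templates = np.array([x[i:i + k] for i in range(n - m)])
        count = 0
        for i in range(len(templates)):
            # Chebyshev distance from template i to every template
            d = np.max(np.abs(templates - templates[i]), axis=1)
            count += int(np.sum(d <= r)) - 1   # exclude the self-match
        return count

    b = count_matches(m)
    a = count_matches(m + 1)
    return -np.log(a / b) if a > 0 and b > 0 else float("inf")

rng = np.random.default_rng(0)
white = rng.standard_normal(1000)                    # irregular signal
regular = np.sin(np.linspace(0, 20 * np.pi, 1000))   # highly regular signal
print(sample_entropy(white) > sample_entropy(regular))  # expect True
```

As the final comparison suggests, an irregular signal such as white noise yields a higher sample entropy than a smooth periodic one; the multiscale and multivariate extensions discussed above build on this same template-matching idea.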

Dr. Hamed Azami
Dr. Peng Li 
Prof. Dr. Jose C. Principe
Guest Editors

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, go to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Entropy is an international peer-reviewed open access monthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 2600 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.


Published Papers (3 papers)


Research

15 pages, 3388 KiB  
Article
A K-Means Classification and Entropy Pooling Portfolio Strategy for Small and Large Capitalization Cryptocurrencies
by Jules Clement Mba and Ehounou Serge Eloge Florentin Angaman
Entropy 2023, 25(8), 1208; https://doi.org/10.3390/e25081208 - 14 Aug 2023
Cited by 1 | Viewed by 844
Abstract
In this study, we propose three portfolio strategies: allocation based on the normality assumption, the skewed-Student t distribution, and the entropy pooling (EP) method for 14 small- and large-capitalization (cap) cryptocurrencies. We categorize our portfolios into three groups: portfolio 1, consisting of three large-cap cryptocurrencies and four small-cap cryptocurrencies from various K-means classification clusters; and portfolios 2 and 3, consisting of seven small-cap and seven large-cap cryptocurrencies, respectively. Then, we investigate the performance of the proposed strategies on these portfolios by performing a backtest during a crypto market crash. Our backtesting covers April 2022 to October 2022, when many cryptocurrencies experienced significant losses. Our results indicate that the wealth progression under the normality assumption exceeds that of the other two strategies, though they all exhibit losses in terms of final wealth. In addition, we found that portfolio 3 is the best-performing portfolio in terms of wealth progression and performance measures, followed by portfolios 1 and 2, respectively. Hence, our results suggest that investors will benefit from investing in a portfolio consisting of large-cap cryptocurrencies. In other words, it may be safer to invest in large-cap cryptocurrencies than in small-cap cryptocurrencies. Moreover, our results indicate that adding large- and small-cap cryptocurrencies to a portfolio could improve the diversification benefit and risk-adjusted returns. Therefore, while cryptocurrencies may offer potentially high returns and diversification benefits in a portfolio, investors should be aware of the risks and carefully consider their investment objectives and risk tolerance before investing in them.
(This article belongs to the Special Issue Entropy in Data Analysis II)

11 pages, 2156 KiB  
Article
Forecasting Tourist Arrivals for Hainan Island in China with Decomposed Broad Learning before the COVID-19 Pandemic
by Jingyao Chen, Jie Yang, Shigao Huang, Xin Li and Gang Liu
Entropy 2023, 25(2), 338; https://doi.org/10.3390/e25020338 - 12 Feb 2023
Cited by 1 | Viewed by 1576
Abstract
This study proposes a decomposed broad learning model to improve the forecasting accuracy for tourism arrivals on Hainan Island in China. With decomposed broad learning, we predicted monthly tourist arrivals from 12 countries to Hainan Island. We compared the actual tourist arrivals to Hainan from the US with the predicted tourist arrivals using three models (FEWT-BL: fuzzy entropy empirical wavelet transform-based broad learning; BL: broad Learning; BPNN: back propagation neural network). The results indicated that US foreigners had the most arrivals in 12 countries, and FEWT-BL had the best performance in forecasting tourism arrivals. In conclusion, we establish a unique model for accurate tourism forecasting that can facilitate decision-making in tourism management, especially at turning points in time.
(This article belongs to the Special Issue Entropy in Data Analysis II)

13 pages, 321 KiB  
Article
A Quantile General Index Derived from the Maximum Entropy Principle
by Tomonari Sei
Entropy 2022, 24(10), 1431; https://doi.org/10.3390/e24101431 - 8 Oct 2022
Viewed by 3715
Abstract
We propose a linear separation method of multivariate quantitative data in such a way that the average of each variable in the positive group is larger than that of the negative group. Here, the coefficients of the separating hyperplane are restricted to be positive. Our method is derived from the maximum entropy principle. The composite score obtained as a result is called the quantile general index. The method is applied to the problem of determining the top 10 countries in the world based on the 17 scores of the Sustainable Development Goals (SDGs).
(This article belongs to the Special Issue Entropy in Data Analysis II)
