Entropy, Volume 22, Issue 4 (April 2020) – 120 articles

Cover Story (view full-size image): Here, we motivate a geometric perspective on the concept of information flow between components of a complex dynamical system. The most popular methods in this area are probabilistic in nature, including the Nobel-prize-winning work on Granger causality and the recently highly popular transfer entropy. Beyond conceptual advancement, a geometric description of causality also allows for new and efficient computational methods of causality inference. In this direction, we introduce a new measure of causal inference based on contrasting fractal correlation dimensions, applied conditionally to compare competing explanations of future forecasts. In this setting, we believe our geometric interpretation of information flow can contribute positively to many fields of science, both through its computational efficiency and through its theoretical interpretability. View this paper.
  • Issues are regarded as officially published after their release is announced to the table of contents alert mailing list.
  • You may sign up for e-mail alerts to receive the table of contents of newly released issues.
  • Articles are published in both HTML and PDF formats; PDF is the official version. To view a paper in PDF format, click on the "PDF Full-text" link and open it with the free Adobe Reader.
22 pages, 392 KiB  
Article
Robust Change Point Test for General Integer-Valued Time Series Models Based on Density Power Divergence
by Byungsoo Kim and Sangyeol Lee
Entropy 2020, 22(4), 493; https://doi.org/10.3390/e22040493 - 24 Apr 2020
Cited by 12 | Viewed by 3253
Abstract
In this study, we consider the problem of testing for a parameter change in general integer-valued time series models whose conditional distribution belongs to the one-parameter exponential family when the data are contaminated by outliers. In particular, we use a robust change point test based on the density power divergence (DPD), which serves as the objective function of the minimum density power divergence estimator (MDPDE). The results show that, under regularity conditions, the limiting null distribution of the DPD-based test is a function of a Brownian bridge. Monte Carlo simulations are conducted to evaluate the performance of the proposed test and show that the test inherits the robust properties of the MDPDE and DPD. Lastly, we demonstrate the proposed test through a real data analysis of the return times of extreme events related to Goldman Sachs Group stock. Full article
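For readers unfamiliar with the DPD, the standard form introduced by Basu et al., on which the MDPDE objective is built, is sketched below; the tuning parameter α controls the trade-off between robustness and efficiency. This is background context only, and the exact objective used in the paper may differ in detail:

$$ d_\alpha(g,f) = \int \Big\{ f^{1+\alpha}(z) - \Big(1+\tfrac{1}{\alpha}\Big) g(z)\, f^{\alpha}(z) + \tfrac{1}{\alpha}\, g^{1+\alpha}(z) \Big\}\, dz, \qquad \alpha > 0, $$

which tends to the Kullback–Leibler divergence as α → 0 and down-weights the influence of outlying observations for larger α.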
20 pages, 1065 KiB  
Article
An Efficient, Parallelized Algorithm for Optimal Conditional Entropy-Based Feature Selection
by Gustavo Estrela, Marco Dimas Gubitoso, Carlos Eduardo Ferreira, Junior Barrera and Marcelo S. Reis
Entropy 2020, 22(4), 492; https://doi.org/10.3390/e22040492 - 24 Apr 2020
Cited by 8 | Viewed by 3553
Abstract
In Machine Learning, feature selection is an important step in classifier design. It consists of finding a subset of features that is optimum for a given cost function. One way to solve feature selection is to organize all possible feature subsets into a Boolean lattice and to exploit the fact that the costs of chains in that lattice describe U-shaped curves. Minimization of such a cost function is known as the U-curve problem. Recently, a study proposed U-Curve Search (UCS), an optimal algorithm for that problem, which was successfully used for feature selection. However, despite the algorithm's optimality, the time required by UCS in computational assays was exponential in the number of features. Here, we report that this scalability issue arises from the fact that the U-curve problem is NP-hard. We then introduce the Parallel U-Curve Search (PUCS), a new algorithm for the U-curve problem. In PUCS, we present a novel way to partition the search space into smaller Boolean lattices, thus rendering the algorithm highly parallelizable. We also provide computational assays with both synthetic data and Machine Learning datasets, in which PUCS performance was assessed against UCS and other gold-standard feature selection algorithms. Full article
(This article belongs to the Special Issue Information-Theoretical Methods in Data Mining)
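As a concrete illustration of the kind of cost function named in the title, the sketch below estimates the mean conditional entropy of a class label given a candidate feature subset from a sample. This is one common form of conditional-entropy cost for feature selection and is an assumption here, not necessarily the exact cost used by UCS or PUCS.

```python
from collections import Counter
import math

def conditional_entropy(X, y, subset):
    """Estimate H(Y | X_S) from samples, where `subset` indexes the features S."""
    groups = {}
    for xi, yi in zip(X, y):
        key = tuple(xi[j] for j in subset)
        groups.setdefault(key, []).append(yi)
    n = len(y)
    h = 0.0
    for labels in groups.values():
        p_key = len(labels) / n
        counts = Counter(labels)
        h_given_key = -sum((c / len(labels)) * math.log2(c / len(labels))
                           for c in counts.values())
        h += p_key * h_given_key
    return h

# Toy usage: two binary features; the first fully determines the label.
X = [(0, 0), (0, 1), (1, 0), (1, 1)]
y = [0, 0, 1, 1]
print(conditional_entropy(X, y, (0,)))  # 0.0 -> feature 0 is informative
print(conditional_entropy(X, y, (1,)))  # 1.0 -> feature 1 alone is useless
```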
13 pages, 416 KiB  
Article
Useful Dual Functional of Entropic Information Measures
by Angelo Plastino, Mario Carlos Rocca and Flavia Pennini
Entropy 2020, 22(4), 491; https://doi.org/10.3390/e22040491 - 24 Apr 2020
Cited by 1 | Viewed by 2680
Abstract
There are entropic functionals galore, but no simple objective measures to distinguish between them. We remedy this situation here by appeal to Born’s proposal, of almost a hundred years ago, that the square modulus of any wave function, |ψ|², be regarded as a probability distribution P. The usefulness of information measures like Shannon’s in this pure-state context was highlighted in [Phys. Lett. A 1993, 181, 446]. Here we apply this notion to generate a dual functional F_{αR}: {S_Q} → ℝ⁺, which maps entropic functionals onto positive real numbers. In such an endeavor, we use as standard ingredients the coherent states of the harmonic oscillator (CHO), which are unique in the sense of possessing minimum uncertainty. This use is greatly facilitated by the fact that the CHO can be given an analytic, compact, closed form, as shown in [Rev. Mex. Fis. E 2019, 65, 191]. Rewarding insights are obtained regarding the comparison between several standard entropic measures. Full article
(This article belongs to the Special Issue Entropic Forces in Complex Systems)
12 pages, 303 KiB  
Article
Limitations to Estimating Mutual Information in Large Neural Populations
by Jan Mölter and Geoffrey J. Goodhill
Entropy 2020, 22(4), 490; https://doi.org/10.3390/e22040490 - 24 Apr 2020
Cited by 4 | Viewed by 4767
Abstract
Information theory provides a powerful framework to analyse the representation of sensory stimuli in neural population activity. However, estimating the quantities involved, such as entropy and mutual information, from finite samples is notoriously hard, and any direct estimate is known to be heavily biased. This is especially true when considering large neural populations. We study a simple model of sensory processing and show, through a combinatorial argument, that, with high probability, for large neural populations any finite number of samples of neural activity in response to a set of stimuli is mutually distinct. As a consequence, the mutual information, when estimated directly from empirical histograms, will be equal to the stimulus entropy. Importantly, this is the case irrespective of the precise relation between stimulus and neural activity and corresponds to a maximal bias. This argument is general and applies to any application of information theory where the state space is large and one relies on empirical histograms. Overall, this work highlights the need for alternative approaches for an information theoretic analysis when dealing with large neural populations. Full article
(This article belongs to the Special Issue Information Theory in Computational Neuroscience)
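The maximal-bias effect described above is easy to reproduce numerically: when every recorded response pattern is unique, the plug-in (histogram) estimate of I(S;R) collapses to the stimulus entropy even when the responses carry no information at all. A minimal sketch with arbitrary toy numbers follows.

```python
import numpy as np

rng = np.random.default_rng(0)

n_stimuli, n_trials, n_neurons = 4, 50, 100
stimuli = rng.integers(n_stimuli, size=n_stimuli * n_trials)
# Responses are pure noise: they carry no information about the stimulus.
responses = rng.integers(2, size=(stimuli.size, n_neurons))

def entropy(labels):
    _, counts = np.unique(labels, return_counts=True)
    p = counts / counts.sum()
    return -np.sum(p * np.log2(p))

# Collapse each population response to a hashable pattern for the histogram.
patterns = [r.tobytes() for r in responses]
joint = [str((s, p)) for s, p in zip(stimuli, patterns)]

H_S = entropy(stimuli)
I_plugin = H_S + entropy(patterns) - entropy(joint)

print(f"stimulus entropy H(S)    = {H_S:.3f} bits")
print(f"plug-in estimate I(S;R)  = {I_plugin:.3f} bits")  # ~= H(S) despite zero true MI
```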
28 pages, 1036 KiB  
Article
Energy Dissipation and Decoherence in Solid-State Quantum Devices: Markovian versus non-Markovian Treatments
by Rita Claudia Iotti and Fausto Rossi
Entropy 2020, 22(4), 489; https://doi.org/10.3390/e22040489 - 24 Apr 2020
Cited by 4 | Viewed by 3093
Abstract
The design and optimization of new-generation solid-state quantum hardware absolutely requires reliable dissipation versus decoherence models. Depending on the device operational conditions, these may range from Markov-type schemes (both phenomenological and microscopic) to quantum-kinetic approaches. The primary goal of this paper is to review, in a cohesive way, the virtues and limitations of the most popular approaches, focussing on a few critical issues recently pointed out (see, e.g., Phys. Rev. B 90, 125140 (2014); Eur. Phys. J. B 90, 250 (2017)) and linking them within a common framework. By means of properly designed simulated experiments on a prototypical quantum-dot nanostructure (described via a two-level electronic system coupled to a phonon bath), we shall show that both conventional (i.e., non-Lindblad) Markov models and density-matrix-based non-Markov approaches (i.e., quantum-kinetic treatments) may lead to significant positivity violations. While in the former case the problem is easily avoided by choosing genuine Lindblad-type dissipation models, for the latter a general strategy is still missing. Full article
(This article belongs to the Special Issue Open Quantum Systems (OQS) for Quantum Technologies)
14 pages, 603 KiB  
Article
Comparison of Outlier-Tolerant Models for Measuring Visual Complexity
by Adrian Carballal, Carlos Fernandez-Lozano, Nereida Rodriguez-Fernandez, Iria Santos and Juan Romero
Entropy 2020, 22(4), 488; https://doi.org/10.3390/e22040488 - 24 Apr 2020
Cited by 6 | Viewed by 3364
Abstract
Providing the visual complexity of an image, in terms of impact or aesthetic preference, can be of great applicability in areas such as psychology or marketing. To this end, certain areas such as Computer Vision have focused on identifying features and computational models that allow for satisfactory results. This paper studies the application of recent machine learning (ML) models using input images evaluated by humans and characterized by features related to visual complexity. According to the experiments carried out, one of these methods, Correlation by Genetic Search (CGS), based on the search for minimum sets of features that maximize the correlation of the model with respect to the input data, predicted human ratings of image visual complexity better than any other model referenced to date in terms of correlation, RMSE, or the minimum number of features required by the model. In addition, the variability of these terms was studied after eliminating images considered outliers in previous studies, confirming the robustness of the method in selecting the most important variables for the prediction. Full article
18 pages, 634 KiB  
Article
An Improved Total Uncertainty Measure in the Evidence Theory and Its Application in Decision Making
by Miao Qin, Yongchuan Tang and Junhao Wen
Entropy 2020, 22(4), 487; https://doi.org/10.3390/e22040487 - 24 Apr 2020
Cited by 12 | Viewed by 3669
Abstract
Dempster–Shafer evidence theory (DS theory) has some superiorities in uncertain information processing for a large variety of applications. However, the problem of how to quantify the uncertainty of a basic probability assignment (BPA) in the DS theory framework remains unresolved. The goal of this paper is to define a new belief entropy with desirable properties for measuring the uncertainty of a BPA. The new entropy can be helpful for uncertainty management in practical applications such as decision making. The proposed uncertainty measure has two components. The first component is an improved version of the Dubois–Prade entropy, which aims to capture the non-specificity portion of uncertainty with a consideration of the number of elements in the frame of discernment (FOD). The second component is adopted from the Nguyen entropy, which captures the conflict in a BPA. We prove that the proposed entropy satisfies some desired properties proposed in the literature. In addition, the proposed entropy reduces to the Shannon entropy if the BPA is a probability distribution. Numerical examples are presented to show the efficiency and superiority of the proposed measure, as well as an application in decision making. Full article
(This article belongs to the Section Signal and Data Analysis)
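The proposed entropy itself is not reproduced here; as a reference point, the sketch below computes the two classical ingredients that the abstract says the new measure builds on — the Dubois–Prade non-specificity term and the Nguyen (discord) term — for a BPA over a frame of discernment. The improved FOD-dependent weighting introduced in the paper is not included.

```python
import math

def dubois_prade(bpa):
    """Non-specificity: sum over focal elements of m(A) * log2 |A|."""
    return sum(m * math.log2(len(A)) for A, m in bpa.items() if m > 0)

def nguyen(bpa):
    """Discord/conflict: -sum over focal elements of m(A) * log2 m(A)."""
    return -sum(m * math.log2(m) for A, m in bpa.items() if m > 0)

# BPA over the frame of discernment {a, b, c}; focal elements as frozensets.
bpa = {
    frozenset({"a"}): 0.5,
    frozenset({"b"}): 0.2,
    frozenset({"a", "b", "c"}): 0.3,
}
print("non-specificity:", dubois_prade(bpa))
print("discord:", nguyen(bpa))
# For a Bayesian BPA (all singletons) the non-specificity term vanishes and
# the discord term reduces to Shannon entropy, consistent with the abstract.
```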
11 pages, 284 KiB  
Article
A New Limit Theorem for Quantum Walk in Terms of Quantum Bernoulli Noises
by Caishi Wang, Suling Ren and Yuling Tang
Entropy 2020, 22(4), 486; https://doi.org/10.3390/e22040486 - 24 Apr 2020
Cited by 2 | Viewed by 2672
Abstract
In this paper, we consider limit probability distributions of the quantum walk recently introduced by Wang and Ye (C.S. Wang and X.J. Ye, Quantum walk in terms of quantum Bernoulli noises, Quantum Inf. Process. 15 (2016), no. 5, 1897–1908). We first establish several technical theorems, which themselves are also interesting. Then, by using these theorems, we prove that, for a wide range of choices of the initial state, the above-mentioned quantum walk has a limit probability distribution of standard Gauss type, which actually gives a new limit theorem for the walk. Full article
(This article belongs to the Special Issue Quantum Information Processing)
15 pages, 3589 KiB  
Article
Cooperation on Interdependent Networks by Means of Migration and Stochastic Imitation
by Sayantan Nag Chowdhury, Srilena Kundu, Maja Duh, Matjaž Perc and Dibakar Ghosh
Entropy 2020, 22(4), 485; https://doi.org/10.3390/e22040485 - 23 Apr 2020
Cited by 57 | Viewed by 4014
Abstract
Evolutionary game theory in the realm of network science appeals to many research communities, as it constitutes a popular theoretical framework for studying the evolution of cooperation in social dilemmas. Recent research has shown that cooperation is markedly more resistant in interdependent networks, where traditional network reciprocity can be further enhanced due to various forms of interdependence between different network layers. However, the role of mobility in interdependent networks is yet to gain its well-deserved attention. Here we consider an interdependent network model, where individuals in each layer follow different evolutionary games, and where each player is considered a mobile agent that can move locally inside its own layer to improve its fitness. Probabilistically, we also consider the possibility of imitating a neighbor on the other layer. We show that, by considering migration and stochastic imitation, further fascinating gateways to cooperation on interdependent networks can be observed. Notably, cooperation can be promoted on both layers, even if cooperation without interdependence would be improbable on one of the layers due to adverse conditions. Our results provide a rationale for engineering better social systems at the interface of networks and human decision making under testing dilemmas. Full article
(This article belongs to the Special Issue Dynamic Processes on Complex Networks)
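The abstract does not spell out the stochastic imitation rule; a common choice in this literature — stated here only as an assumption, not as the authors' rule — is the Fermi function, under which player x adopts the strategy of neighbour y with probability

$$ W(s_x \leftarrow s_y) = \frac{1}{1 + \exp\!\big[(P_x - P_y)/K\big]}, $$

where $P_x$ and $P_y$ are the accumulated payoffs and $K$ quantifies the uncertainty (noise) of the imitation: better-performing neighbours are imitated with probability above one half.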
23 pages, 5690 KiB  
Article
Melanoma and Nevus Skin Lesion Classification Using Handcraft and Deep Learning Feature Fusion via Mutual Information Measures
by Jose-Agustin Almaraz-Damian, Volodymyr Ponomaryov, Sergiy Sadovnychiy and Heydy Castillejos-Fernandez
Entropy 2020, 22(4), 484; https://doi.org/10.3390/e22040484 - 23 Apr 2020
Cited by 122 | Viewed by 9319
Abstract
In this paper, a new Computer-Aided Detection (CAD) system for the detection and classification of dangerous skin lesions (melanoma type) is presented, based on a fusion of handcrafted features related to the medical ABCD rule (Asymmetry, Borders, Colors, Dermatoscopic structures) and deep learning features, employing Mutual Information (MI) measurements. The steps of a CAD system can be summarized as preprocessing, feature extraction, feature fusion, and classification. During the preprocessing step, a lesion image is enhanced, filtered, and segmented, with the aim of obtaining the Region of Interest (ROI); in the next step, feature extraction is performed. Handcrafted features such as shape, color, and texture are used as the representation of the ABCD rule, and deep learning features are extracted using a Convolutional Neural Network (CNN) architecture pre-trained on ImageNet (the ILSVRC task). An MI measurement is used as the fusion rule, gathering the most important information from both types of features. Finally, in the classification step, several methods are employed, such as Linear Regression (LR), Support Vector Machines (SVMs), and Relevance Vector Machines (RVMs). The designed framework was tested using the ISIC 2018 public dataset. The proposed framework demonstrates improved performance in comparison with other state-of-the-art methods in terms of the accuracy, specificity, and sensitivity obtained in the training and test stages. Additionally, we propose and justify a novel procedure for adjusting the evaluation metrics for imbalanced datasets, which are common for different kinds of skin lesions. Full article
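The exact MI-based fusion rule is described in the paper; as an illustration of the general idea only, the sketch below ranks concatenated handcrafted and CNN features by their estimated mutual information with the class label and keeps the top k, using scikit-learn. The array shapes, feature counts, and k are placeholders.

```python
import numpy as np
from sklearn.feature_selection import mutual_info_classif

rng = np.random.default_rng(0)

# Placeholder feature matrices: handcrafted (ABCD-style) and CNN descriptors.
handcrafted = rng.normal(size=(200, 30))
cnn_features = rng.normal(size=(200, 512))
labels = rng.integers(2, size=200)           # 0 = nevus, 1 = melanoma

fused = np.hstack([handcrafted, cnn_features])

# Rank every fused feature by its estimated MI with the label, keep the top k.
mi = mutual_info_classif(fused, labels, random_state=0)
k = 64
selected = np.argsort(mi)[::-1][:k]
fused_selected = fused[:, selected]
print(fused_selected.shape)                   # (200, 64)
```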
2 pages, 172 KiB  
Correction
Correction: Li, Q.; Liang, S.Y. Incipient Fault Diagnosis of Rolling Bearings Based on Impulse-Step Impact Dictionary and Re-Weighted Minimizing Nonconvex Penalty Lq Regular Technique. Entropy 2017, 19, 421
by Qing Li and Steven Y. Liang
Entropy 2020, 22(4), 483; https://doi.org/10.3390/e22040483 - 23 Apr 2020
Viewed by 2060
Abstract
The authors were not aware of some errors and imprecise descriptions introduced in the proofreading phase; therefore, we wish to make the following corrections to this paper [...] Full article
24 pages, 905 KiB  
Article
On the Structure of the World Economy: An Absorbing Markov Chain Approach
by Olivera Kostoska, Viktor Stojkoski and Ljupco Kocarev
Entropy 2020, 22(4), 482; https://doi.org/10.3390/e22040482 - 23 Apr 2020
Cited by 6 | Viewed by 5882
Abstract
The expansion of global production networks has raised many important questions about the interdependence among countries and how future changes in the world economy are likely to affect the countries’ positioning in global value chains. We approach the structure and lengths of value chains from a completely different perspective than has been available so far. By assigning a random endogenous variable to a network linkage, representing the number of intermediate sales/purchases before absorption (final use or value added), the discrete-time absorbing Markov chains proposed here shed new light on world input/output networks. The variance of this variable can help assess the risk when shaping the chain length and optimize the level of production. Contrary to what might be expected simply on the basis of comparative advantage, the results reveal that both the input and output chains exhibit the same quasi-stationary product distribution. Put differently, the expected proportion of time spent in a state before absorption is invariant to changes of the network type. Finally, the several global metrics proposed here, including the probability distribution of global value added/final output, provide guidance for policy makers when estimating the resilience of the world trading system and forecasting macroeconomic developments. Full article
(This article belongs to the Special Issue Dynamic Processes on Complex Networks)
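The key quantities mentioned above — the expected number of intermediate transactions before absorption and its variance — follow from the fundamental matrix of an absorbing Markov chain. A minimal sketch with a toy transition matrix (not actual input–output data) is given below.

```python
import numpy as np

# Toy absorbing Markov chain: 3 transient states (industries) and 1 absorbing
# state (final use / value added). P is row-stochastic.
P = np.array([
    [0.2, 0.3, 0.1, 0.4],
    [0.1, 0.2, 0.3, 0.4],
    [0.2, 0.2, 0.2, 0.4],
    [0.0, 0.0, 0.0, 1.0],
])

Q = P[:3, :3]                         # transient-to-transient block
N = np.linalg.inv(np.eye(3) - Q)      # fundamental matrix
t = N @ np.ones(3)                    # expected steps before absorption
var = (2 * N - np.eye(3)) @ t - t**2  # variance of the number of steps

print("expected chain length per starting industry:", t)
print("variance:", var)
```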
23 pages, 2929 KiB  
Article
A Study on Non-Linear DPL Model for Describing Heat Transfer in Skin Tissue during Hyperthermia Treatment
by Sunil Kumar Sharma and Dinesh Kumar
Entropy 2020, 22(4), 481; https://doi.org/10.3390/e22040481 - 22 Apr 2020
Cited by 18 | Viewed by 4032
Abstract
The article studies the simulation-based mathematical modeling of bioheat transfer under a Dirichlet boundary condition. We used a complex non-linear dual-phase-lag bioheat transfer (DPLBHT) model to analyze the temperature distribution in skin tissue during hyperthermia treatment of infected cells. The perfusion term, the metabolic heat source, and the external heat source were the three parts of the volumetric heat source used in the model. The non-linear DPLBHT model predicted a more accurate temperature within skin tissue. The finite element Runge–Kutta (4,5) (FERK (4,5)) method, which combines the finite difference and Runge–Kutta (4,5) techniques, was applied to compute the solution of our non-linear problem. The problem is studied and presented in non-dimensional form. Thermal damage to normal tissue was observed to be near zero during hyperthermia treatment. The effects of the non-dimensional time, non-dimensional space coordinate, location parameter, regional parameter, relaxation and thermalization times, metabolic heat source, associated metabolic heat source parameter, perfusion rate, associated perfusion heat source parameter, and external heat source coefficient on the dimensionless temperature profile were studied in detail during the hyperthermia treatment process. Full article
(This article belongs to the Special Issue Biological Statistical Mechanics)
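For orientation, a commonly used linear dual-phase-lag bioheat equation is reproduced below; this is background only, and the paper's non-linear DPLBHT model generalizes it with temperature-dependent terms and the three-part source described above:

$$ \tau_q \frac{\partial^2 T}{\partial t^2} + \frac{\partial T}{\partial t} = \alpha \nabla^2 T + \alpha \tau_T \frac{\partial}{\partial t} \nabla^2 T + \frac{1}{\rho c}\left( Q + \tau_q \frac{\partial Q}{\partial t} \right), \qquad Q = Q_{\text{perf}} + Q_{\text{met}} + Q_{\text{ext}}, $$

where $\tau_q$ and $\tau_T$ are the heat-flux and temperature-gradient phase lags, $\alpha$ is the thermal diffusivity, and $Q$ collects the perfusion, metabolic, and external heat sources.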
26 pages, 12560 KiB  
Article
Hall Effect on Radiative Casson Fluid Flow with Chemical Reaction on a Rotating Cone through Entropy Optimization
by Wejdan Deebani, Asifa Tassaddiq, Zahir Shah, Abdullah Dawar and Farhad Ali
Entropy 2020, 22(4), 480; https://doi.org/10.3390/e22040480 - 22 Apr 2020
Cited by 29 | Viewed by 3250
Abstract
Magnetohydrodynamic (MHD) flow with Hall current has numerous applications in industrial areas such as Hall current accelerators, MHD power generators, planetary dynamics, and Hall current sensors. In this paper, the analysis of an unsteady MHD Casson fluid with chemical reaction over a rotating cone is presented. The impacts of Hall current, Joule heating, thermal radiation, and viscous dissipation are analyzed. Entropy optimization is also considered in the present analysis. The system of coupled equations is tackled with the homotopy analysis method (HAM). The convergence of HAM is also shown through figures. Deviations in the flow due to the dimensionless parameters are shown graphically. Similarly, the variations in skin friction, Nusselt number, and Sherwood number are presented in tables. A validation of the current results is also presented. Full article
(This article belongs to the Special Issue Entropy Generation and Heat Transfer II)
18 pages, 1318 KiB  
Article
Binary Expression Enhances Reliability of Messaging in Gene Networks
by Leonardo R. Gama, Guilherme Giovanini, Gábor Balázsi and Alexandre F. Ramos
Entropy 2020, 22(4), 479; https://doi.org/10.3390/e22040479 - 22 Apr 2020
Cited by 2 | Viewed by 3226
Abstract
The promoter state of a gene and its expression levels are modulated by the amounts of transcription factors interacting with its regulatory regions. Hence, one may interpret a gene network as a communicating system in which the state of the promoter of a gene (the source) is communicated by the amounts of transcription factors that it expresses (the message) to modulate the state of the promoter and expression levels of another gene (the receptor). The reliability of the gene network dynamics can be quantified by Shannon’s entropy of the message and the mutual information between the message and the promoter state. Here we consider a stochastic model for a binary gene and use its exact steady-state solutions to calculate the entropy and mutual information. We show that a slow-switching promoter with long, equally lasting ON and OFF states maximizes the mutual information and reduces the entropy. That is, a binary gene expression regime generates a high-variance message governed by a bimodal probability distribution with peaks of the same height. Our results indicate that Shannon’s theory can be a powerful framework for understanding how bursty gene expression is reconciled with the striking spatio-temporal precision exhibited in pattern formation of developing organisms. Full article
(This article belongs to the Special Issue Information Flow and Entropy Production in Biomolecular Networks)
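The quantities used above are easy to illustrate numerically. The sketch below computes the Shannon entropy of the message (protein copy number) and the mutual information between promoter state and message for a hypothetical bimodal joint distribution built from two Poisson modes; this toy distribution is an assumption and does not reproduce the exact steady-state solution of the binary-gene model.

```python
import numpy as np
from scipy.stats import poisson

n_max = 60
n = np.arange(n_max)

# Hypothetical slow-switching promoter: ON and OFF equally likely, with
# well-separated Poisson expression modes standing in for the exact solution.
p_on, lam_on, lam_off = 0.5, 30.0, 2.0
joint = np.vstack([
    (1 - p_on) * poisson.pmf(n, lam_off),   # promoter OFF
    p_on * poisson.pmf(n, lam_on),          # promoter ON
])
joint /= joint.sum()

def H(p):
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

p_msg = joint.sum(axis=0)            # marginal of the message
p_prom = joint.sum(axis=1)           # marginal of the promoter state
mi = H(p_msg) + H(p_prom) - H(joint.ravel())

print(f"message entropy  H(M)      = {H(p_msg):.3f} bits")
print(f"mutual information I(G;M)  = {mi:.3f} bits")  # ~1 bit for well-separated peaks
```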
16 pages, 4406 KiB  
Article
Mechanical Fault Diagnosis of a High Voltage Circuit Breaker Based on High-Efficiency Time-Domain Feature Extraction with Entropy Features
by Jiajin Qi, Xu Gao and Nantian Huang
Entropy 2020, 22(4), 478; https://doi.org/10.3390/e22040478 - 22 Apr 2020
Cited by 27 | Viewed by 4028
Abstract
Fault samples of high voltage circuit breakers are scarce and their vibration signals are complex; existing research methods cannot extract the effective information in the features and suffer from problems such as overfitting and slow training. To improve the efficiency of feature extraction from circuit breaker vibration signals and the accuracy of circuit breaker state recognition, a Light Gradient Boosting Machine (LightGBM) method based on time-domain feature extraction with multi-type entropy features is proposed for mechanical fault diagnosis of the high voltage circuit breaker. First, the original vibration signal of the high voltage circuit breaker is segmented in the time domain; then, 16 features, including 5 kinds of entropy features, are extracted directly from each part of the original signal after time-domain segmentation, and the original feature set is constructed. Second, the Split importance value of each feature is calculated, and the optimal feature subset is determined by forward feature selection, taking the classification accuracy of LightGBM as the decision variable. After that, the LightGBM classifier is constructed based on the feature vector of the optimal feature subset, which can accurately distinguish the mechanical fault state of the high voltage circuit breaker. The experimental results show that the new method offers high efficiency of feature extraction and high accuracy of fault identification. Full article
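The pipeline above can be sketched schematically: segment the vibration signal in the time domain, compute a few per-segment features (one entropy-type feature is shown; the paper uses 16 features including five entropy variants), and train a LightGBM classifier. Segment length, feature choices, and data below are placeholders, not the paper's settings.

```python
import numpy as np
from lightgbm import LGBMClassifier

def segment_features(signal, n_segments=8, bins=16):
    """Per-segment RMS, peak, and Shannon entropy of the amplitude histogram."""
    feats = []
    for seg in np.array_split(signal, n_segments):
        hist, _ = np.histogram(seg, bins=bins)
        p = hist / hist.sum()
        p = p[p > 0]
        shannon = -np.sum(p * np.log2(p))
        feats.extend([np.sqrt(np.mean(seg**2)), np.max(np.abs(seg)), shannon])
    return feats

# Placeholder dataset: random "vibration signals" with three mechanical states.
rng = np.random.default_rng(0)
X = np.array([segment_features(rng.normal(scale=1 + c, size=2048))
              for c in range(3) for _ in range(40)])
y = np.repeat([0, 1, 2], 40)

clf = LGBMClassifier(n_estimators=100)
clf.fit(X, y)
print("training accuracy:", clf.score(X, y))
```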
22 pages, 993 KiB  
Article
Higher-Order Cumulants Drive Neuronal Activity Patterns, Inducing UP-DOWN States in Neural Populations
by Roman Baravalle and Fernando Montani
Entropy 2020, 22(4), 477; https://doi.org/10.3390/e22040477 - 22 Apr 2020
Cited by 3 | Viewed by 3177
Abstract
A major challenge in neuroscience is to understand the role of the higher-order correlation structure of neuronal populations. The dichotomized Gaussian model (DG) generates spike trains by thresholding a multivariate Gaussian random variable. The DG inputs are Gaussian distributed and thus have no interactions beyond the second order; however, they can induce higher-order correlations in the outputs. We propose a combination of analytical and numerical techniques to estimate the cumulants of the firing probability distributions above the second order. Our findings show that a large amount of pairwise interaction in the inputs can drive the system into two possible regimes, one with low activity (“DOWN state”) and another with high activity (“UP state”), and that the appearance of these states is due to a combination of the third- and fourth-order cumulants. This could be part of a mechanism that would help the neural code to upgrade specific information about the stimuli, motivating us to examine the behavior of the critical fluctuations through the Binder cumulant close to the critical point. We show, using the Binder cumulant, that higher-order correlations in the outputs generate a critical neural system that portrays a second-order phase transition. Full article
(This article belongs to the Special Issue Information Theoretic Measures and Their Applications)
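A minimal sketch of the dichotomized Gaussian construction and of the Binder cumulant used in the analysis is shown below; the threshold, correlation strength, and population size are arbitrary toy values, not the paper's parameters.

```python
import numpy as np

rng = np.random.default_rng(1)

n_neurons, n_samples, rho, threshold = 50, 20000, 0.3, 0.0

# Dichotomized Gaussian: threshold correlated Gaussian inputs to obtain spikes.
cov = (1 - rho) * np.eye(n_neurons) + rho * np.ones((n_neurons, n_neurons))
gauss = rng.multivariate_normal(np.zeros(n_neurons), cov, size=n_samples)
spikes = (gauss > threshold).astype(int)

# Population activity and its Binder cumulant U4 = 1 - <m^4> / (3 <m^2>^2).
m = spikes.mean(axis=1) - 0.5           # centred population rate per sample
U4 = 1.0 - np.mean(m**4) / (3.0 * np.mean(m**2) ** 2)

counts = spikes.sum(axis=1)
print("fraction of low-activity (DOWN-like) samples:", np.mean(counts < 10))
print("fraction of high-activity (UP-like) samples:", np.mean(counts > 40))
print("Binder cumulant:", U4)
```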
20 pages, 3436 KiB  
Article
Analysis of an Integrated Solar Combined Cycle with Recuperative Gas Turbine and Double Recuperative and Double Expansion Propane Cycle
by Antonio Rovira, Rubén Abbas, Marta Muñoz and Andrés Sebastián
Entropy 2020, 22(4), 476; https://doi.org/10.3390/e22040476 - 21 Apr 2020
Cited by 6 | Viewed by 3539
Abstract
The main objective of this paper is to present and analyze an innovative configuration of an integrated solar combined cycle (ISCC). As novelties, the plant includes a recuperative gas turbine, and the conventional bottoming Rankine cycle is replaced by a recently developed double recuperative double expansion (DRDE) cycle. The configuration results in a fuel saving in the combustion chamber at the expense of a decreased exhaust gas temperature, which is just adequate to feed the DRDE cycle that uses propane as the working fluid. The solar contribution comes from a solar field of parabolic trough collectors, with oil as the heat transfer fluid. The optimum integration point for the solar contribution is addressed. The performance of the proposed ISCC-R-DRDE configuration under design conditions and in off-design operation was assessed (daily and yearly) at two different locations. All results were compared to those obtained under the same conditions by a conventional ISCC, as well as by similar configurations without solar integration. The proposed configuration achieves a lower heat rate on a yearly basis at the studied locations and a lower levelized cost of energy (LCOE) than the conventional ISCC, which indicates that it could become a promising technology. Full article
(This article belongs to the Special Issue Thermodynamic Optimization of Complex Energy Systems)
14 pages, 4954 KiB  
Article
Constructal Design of an Arrow-Shaped High Thermal Conductivity Channel in a Square Heat Generation Body
by Fengyin Zhang, Huijun Feng, Lingen Chen, Jiang You and Zhihui Xie
Entropy 2020, 22(4), 475; https://doi.org/10.3390/e22040475 - 20 Apr 2020
Cited by 15 | Viewed by 2832
Abstract
A heat conduction model with an arrow-shaped high thermal conductivity channel (ASHTCC) in a square heat generation body (SHGB) is established in this paper. Taking the minimum maximum temperature difference (MMTD) as the optimization goal, constructal designs of the ASHTCC are conducted based on single, two, and three degrees of freedom optimizations under the condition of fixed ASHTCC material. The outcomes illustrate that the heat conduction performance (HCP) of the SHGB is better when the structure of the ASHTCC tends to be flat. Increasing the thermal conductivity ratio and the area fraction of the ASHTCC material can improve the HCP of the SHGB. In the discussed numerical examples, the MMTD obtained by three degrees of freedom optimization is reduced by 8.42% and 4.40%, respectively, compared with those obtained by single and two degrees of freedom optimizations. Therefore, three degrees of freedom optimization can further improve the HCP of the SHGB. Comparing the HCPs of SHGBs with the ASHTCC and with a T-shaped channel, the MMTD of the former is reduced by 13.0%. Thus, the structure of the ASHTCC is proven to be superior to the T-shaped one. The optimization results obtained in this paper provide reference value for the optimal structural design of heat dissipation in various electronic devices. Full article
20 pages, 32574 KiB  
Article
Modification of the Logistic Map Using Fuzzy Numbers with Application to Pseudorandom Number Generation and Image Encryption
by Lazaros Moysis, Christos Volos, Sajad Jafari, Jesus M. Munoz-Pacheco, Jacques Kengne, Karthikeyan Rajagopal and Ioannis Stouboulos
Entropy 2020, 22(4), 474; https://doi.org/10.3390/e22040474 - 20 Apr 2020
Cited by 42 | Viewed by 4335
Abstract
A modification of the classic logistic map is proposed, using fuzzy triangular numbers. The resulting map is analysed through its Lyapunov exponent (LE) and bifurcation diagrams. It shows higher complexity compared to the classic logistic map and showcases phenomena such as antimonotonicity and crisis. The map is then applied to the problem of pseudo-random bit generation, using a simple rule to generate the bit sequence. The resulting random bit generator (RBG) successfully passes the National Institute of Standards and Technology (NIST) statistical tests and is then applied to the problem of image encryption. Full article
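The fuzzy modification itself is defined in the paper; to illustrate only the bit-generation step, the sketch below applies a simple threshold rule to orbits of the classic (unmodified) logistic map. This is not the authors' modified map, and a real RBG would additionally need validation with the NIST test suite.

```python
def logistic_bits(x0=0.4, r=3.99, n_bits=64, burn_in=1000):
    """Generate a bit sequence by thresholding a logistic-map orbit."""
    x = x0
    for _ in range(burn_in):               # discard the transient
        x = r * x * (1 - x)
    bits = []
    for _ in range(n_bits):
        x = r * x * (1 - x)
        bits.append(1 if x >= 0.5 else 0)  # simple threshold rule
    return bits

print("".join(map(str, logistic_bits())))
```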
16 pages, 940 KiB  
Article
Cross-Domain Recommendation Based on Sentiment Analysis and Latent Feature Mapping
by Yongpeng Wang, Hong Yu, Guoyin Wang and Yongfang Xie
Entropy 2020, 22(4), 473; https://doi.org/10.3390/e22040473 - 20 Apr 2020
Cited by 10 | Viewed by 4282
Abstract
Cross-domain recommendation is a promising solution in recommendation systems by using relatively rich information from the source domain to improve the recommendation accuracy of the target domain. Most of the existing methods consider the rating information of users in different domains, the label information of users and items, and the review information of users on items. However, they do not effectively use the latent sentiment information to find an accurate mapping of latent features in reviews between domains. User reviews usually include the user’s subjective views, which can reflect the user’s preferences and sentiment tendencies towards various attributes of the items. Therefore, in order to solve the cold-start problem in the recommendation process, this paper proposes a cross-domain recommendation algorithm (CDR-SAFM) based on sentiment analysis and latent feature mapping, combining the sentiment information implicit in user reviews in different domains. Different from previous sentiment research, this paper divides sentiment into three categories based on three-way decision ideas—namely, positive, negative, and neutral—by conducting sentiment analysis on user review information. Furthermore, Latent Dirichlet Allocation (LDA) is used to model the user’s semantic orientation and generate the latent sentiment review features. Moreover, a Multilayer Perceptron (MLP) is used to obtain the cross-domain non-linear mapping function to transfer the user’s sentiment review features. Finally, this paper proves the effectiveness of the proposed CDR-SAFM framework by comparing it with existing recommendation algorithms in a cross-domain scenario on the Amazon dataset. Full article
(This article belongs to the Special Issue Computation in Complex Networks)
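As a schematic of the two components named in the abstract — LDA to obtain latent review-topic features per domain and an MLP to learn the cross-domain mapping for overlapping users — the sketch below uses scikit-learn. The vectorization, dimensions, and data are placeholders, and the sentiment-analysis stage is omitted.

```python
import numpy as np
from sklearn.decomposition import LatentDirichletAllocation
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)

# Placeholder bag-of-words matrices for the same 100 overlapping users
# in the source and target domains (vocabulary size 500).
source_counts = rng.integers(0, 5, size=(100, 500))
target_counts = rng.integers(0, 5, size=(100, 500))

lda_src = LatentDirichletAllocation(n_components=10, random_state=0)
lda_tgt = LatentDirichletAllocation(n_components=10, random_state=0)
z_src = lda_src.fit_transform(source_counts)   # latent review features, source
z_tgt = lda_tgt.fit_transform(target_counts)   # latent review features, target

# Non-linear mapping from source latent features to target latent features.
mapper = MLPRegressor(hidden_layer_sizes=(32,), max_iter=2000, random_state=0)
mapper.fit(z_src, z_tgt)

# For a cold-start user known only in the source domain:
predicted_target_features = mapper.predict(z_src[:1])
print(predicted_target_features.shape)          # (1, 10)
```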
12 pages, 1548 KiB  
Article
Residue Cluster Classes: A Unified Protein Representation for Efficient Structural and Functional Classification
by Fernando Fontove and Gabriel Del Rio
Entropy 2020, 22(4), 472; https://doi.org/10.3390/e22040472 - 20 Apr 2020
Cited by 7 | Viewed by 3597
Abstract
Proteins are characterized by their structures and functions, and these two fundamental aspects of proteins are assumed to be related. To model such a relationship, a single representation to model both protein structure and function would be convenient, yet so far, the most effective models for protein structure or function classification do not rely on the same protein representation. Here we provide a computationally efficient implementation for large datasets to calculate residue cluster classes (RCCs) from protein three-dimensional structures and show that such representations enable a random forest algorithm to effectively learn the structural and functional classifications of proteins, according to the CATH and Gene Ontology criteria, respectively. RCCs are derived from residue contact maps built from different distance criteria, and we show that 7 or 8 Å with or without amino acid side-chain atoms rendered the best classification models. The potential use of a unified representation of proteins is discussed and possible future areas for improvement and exploration are presented. Full article
(This article belongs to the Special Issue Statistical Inference from High Dimensional Data)
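The contact-map step mentioned above is straightforward to reproduce; the sketch below builds a binary residue contact map from residue coordinates at an 8 Å cut-off. Deriving RCCs from the map (via its cluster structure) is the paper's contribution and is not reimplemented here, and the toy coordinates are placeholders.

```python
import numpy as np

def contact_map(coords, cutoff=8.0):
    """Binary residue contact map from an (n_residues, 3) coordinate array."""
    diff = coords[:, None, :] - coords[None, :, :]
    dist = np.sqrt((diff ** 2).sum(axis=-1))
    contacts = (dist <= cutoff) & ~np.eye(len(coords), dtype=bool)
    return contacts.astype(int)

# Toy coordinates standing in for C-alpha positions of a short chain.
rng = np.random.default_rng(0)
coords = np.cumsum(rng.normal(scale=2.0, size=(30, 3)), axis=0)
cm = contact_map(coords, cutoff=8.0)
print("residues:", cm.shape[0], "contacts:", int(cm.sum() // 2))
```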
24 pages, 459 KiB  
Article
Time-Dependent Pseudo-Hermitian Hamiltonians and a Hidden Geometric Aspect of Quantum Mechanics
by Ali Mostafazadeh
Entropy 2020, 22(4), 471; https://doi.org/10.3390/e22040471 - 20 Apr 2020
Cited by 23 | Viewed by 4184
Abstract
A non-Hermitian operator H defined in a Hilbert space with inner product ⟨·|·⟩ may serve as the Hamiltonian for a unitary quantum system if it is η-pseudo-Hermitian for a metric operator (positive-definite automorphism) η. The latter defines the inner product ⟨·|η·⟩ of the physical Hilbert space H_η of the system. For situations where some of the eigenstates of H depend on time, η becomes time-dependent. Therefore, the system has a non-stationary Hilbert space. Such quantum systems, which are also encountered in the study of quantum mechanics in cosmological backgrounds, suffer from a conflict between the unitarity of time evolution and the unobservability of the Hamiltonian. Their proper treatment requires a geometric framework which clarifies the notion of the energy observable and leads to a geometric extension of quantum mechanics (GEQM). We provide a general introduction to the subject, review some of the recent developments, offer a straightforward description of the Heisenberg-picture formulation of the dynamics for quantum systems having a time-dependent Hilbert space, and outline the Heisenberg-picture formulation of dynamics in GEQM. Full article
(This article belongs to the Special Issue Quantum Dynamics with Non-Hermitian Hamiltonians)
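For reference, the defining relation behind the terminology above: H is η-pseudo-Hermitian with respect to a metric operator η when

$$ H^{\dagger} = \eta\, H\, \eta^{-1}, $$

which makes H Hermitian with respect to the modified inner product $\langle \psi | \eta\, \phi \rangle$ and hence a generator of unitary evolution in the physical Hilbert space $\mathcal{H}_\eta$.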
13 pages, 260 KiB  
Article
Entropy-Based Measure of Statistical Complexity of a Game Strategy
by Fryderyk Falniowski
Entropy 2020, 22(4), 470; https://doi.org/10.3390/e22040470 - 20 Apr 2020
Cited by 2 | Viewed by 2829
Abstract
In this note, we introduce excess strategic entropy—an entropy-based measure of the complexity of a strategy. It measures the complexity and predictability of a player's (mixed) strategy. We show and discuss properties of this measure and its possible applications. Full article
18 pages, 1086 KiB  
Article
Equation of State of Four- and Five-Dimensional Hard-Hypersphere Mixtures
by Mariano López de Haro, Andrés Santos and Santos B. Yuste
Entropy 2020, 22(4), 469; https://doi.org/10.3390/e22040469 - 20 Apr 2020
Cited by 4 | Viewed by 2957
Abstract
New proposals for the equation of state of four- and five-dimensional hard-hypersphere mixtures in terms of the equation of state of the corresponding monocomponent hard-hypersphere fluid are introduced. Such proposals (which are constructed so as to yield the exact third virial coefficient) extend, on the one hand, recent similar formulations for hard-disk and (three-dimensional) hard-sphere mixtures and, on the other hand, two of our previous proposals that also link the mixture equation of state to that of the monocomponent fluid but are unable to reproduce the exact third virial coefficient. The old and new proposals are tested by comparison with published molecular dynamics and Monte Carlo simulation results, and their relative merit is evaluated. Full article
(This article belongs to the Special Issue Statistical Mechanics and Thermodynamics of Liquids and Crystals)
20 pages, 4705 KiB  
Article
Feature Extraction of Ship-Radiated Noise Based on Enhanced Variational Mode Decomposition, Normalized Correlation Coefficient and Permutation Entropy
by Dongri Xie, Hamada Esmaiel, Haixin Sun, Jie Qi and Zeyad A. H. Qasem
Entropy 2020, 22(4), 468; https://doi.org/10.3390/e22040468 - 20 Apr 2020
Cited by 26 | Viewed by 3535
Abstract
Due to the complexity and variability of underwater acoustic channels, ship-radiated noise (SRN) detected using passive sonar is prone to distortion. Entropy-based feature extraction can improve this situation to some extent. However, it is impractical to directly extract entropy features from the detected SRN signals. In addition, the existing conventional methods lack suitable de-noising processing in the presence of marine environmental noise. To this end, this paper proposes a novel feature extraction method based on enhanced variational mode decomposition (EVMD), the normalized correlation coefficient (norCC), permutation entropy (PE), and a particle swarm optimization-based support vector machine (PSO-SVM). Firstly, EVMD is utilized to obtain a group of intrinsic mode functions (IMFs) from the SRN signals. The noise-dominant IMFs are then eliminated by a de-noising step prior to the PE calculation. Next, the correlation coefficient between each signal-dominant IMF and the raw signal, and the PE of each signal-dominant IMF, are calculated. After this, the norCC is used to weight the corresponding PE, and the sum of these weighted PEs is taken as the final feature parameter. Finally, the feature vectors are fed into the PSO-SVM multi-class classifier to classify the SRN samples. The experimental results demonstrate that the recognition rate of the proposed methodology is up to 100%, which is much higher than that of currently existing methods. Hence, the method proposed in this paper is more suitable for the feature extraction of SRN signals. Full article
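Permutation entropy, the last ingredient of the feature described above, is standard; a minimal sketch is given below, with the usual free parameters (embedding order and delay). The EVMD, norCC weighting, and PSO-SVM stages are not reproduced here.

```python
import math
from itertools import permutations

def permutation_entropy(x, order=3, delay=1):
    """Normalized permutation entropy of a 1-D sequence."""
    patterns = {p: 0 for p in permutations(range(order))}
    n = len(x) - (order - 1) * delay
    for i in range(n):
        window = [x[i + j * delay] for j in range(order)]
        pattern = tuple(sorted(range(order), key=lambda k: window[k]))
        patterns[pattern] += 1
    probs = [c / n for c in patterns.values() if c > 0]
    pe = -sum(p * math.log2(p) for p in probs)
    return pe / math.log2(math.factorial(order))   # normalize to [0, 1]

print(permutation_entropy([4, 7, 9, 10, 6, 11, 3], order=3, delay=1))
```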
9 pages, 262 KiB  
Article
Weyl Prior and Bayesian Statistics
by Ruichao Jiang, Javad Tavakoli and Yiqiang Zhao
Entropy 2020, 22(4), 467; https://doi.org/10.3390/e22040467 - 20 Apr 2020
Cited by 1 | Viewed by 2661
Abstract
When using Bayesian inference, one needs to choose a prior distribution for the parameters. The well-known Jeffreys prior is based on the Riemann metric tensor on a statistical manifold. Takeuchi and Amari defined the α-parallel prior, which generalizes the Jeffreys prior by exploiting a higher-order geometric object known as a Chentsov–Amari tensor. In this paper, we propose a new prior based on the Weyl structure on a statistical manifold. It turns out that our prior is a special case of the α-parallel prior with the parameter α equal to −n, where n is the dimension of the underlying statistical manifold and the minus sign is a result of conventions used in the definition of α-connections. This makes the choice of the parameter α more canonical. We calculate the Weyl prior for the univariate and multivariate Gaussian distributions. The Weyl prior of the univariate Gaussian turns out to be the uniform prior. Full article
(This article belongs to the Section Information Theory, Probability and Statistics)
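For context, the Jeffreys prior that both the α-parallel family and the Weyl prior generalize is

$$ \pi_J(\theta) \propto \sqrt{\det g(\theta)}, $$

where $g(\theta)$ is the Fisher information metric on the statistical manifold. For example, for the univariate Gaussian with parameters $(\mu,\sigma)$ it gives $\pi_J(\mu,\sigma)\propto 1/\sigma^{2}$, whereas the Weyl prior of the abstract turns out to be uniform.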
17 pages, 440 KiB  
Article
Symbolic Analysis Applied to the Specification of Spatial Trends and Spatial Dependence
by Maryna Makeienko
Entropy 2020, 22(4), 466; https://doi.org/10.3390/e22040466 - 20 Apr 2020
Cited by 5 | Viewed by 2481
Abstract
This article provides symbolic analysis tools for specifying spatial econometric models. It firstly considers testing for spatial dependence in the presence of potential leading deterministic spatial components (similar to time-series tests for unit roots in the presence of a temporal drift and/or time trend) and secondly considers how to econometrically model spatial economic relations that might contain unobserved spatial structure of unknown form. Hypothesis testing is conducted with a symbolic-entropy-based non-parametric statistical procedure, recently proposed by Garcia-Cordoba, Matilla-Garcia, and Ruiz (2019), which does not rely on prior weight-matrix assumptions. It is shown that the use of geographically restricted semiparametric spatial models is a promising modeling strategy for cross-sectional datasets that are compatible with some types of spatial dependence. The results indicate that models that merely incorporate space coordinates might be sufficient to capture spatial dependence. Hedonic models for the Baltimore, Boston, and Toledo housing price datasets are revisited, studied with the newly proposed procedures, and compared with standard spatial econometric methodologies. Full article
(This article belongs to the Special Issue Information theory and Symbolic Analysis: Theory and Applications)
21 pages, 2602 KiB  
Article
Early Detection of Alzheimer’s Disease: Detecting Asymmetries with a Return Random Walk Link Predictor
by Manuel Curado, Francisco Escolano, Miguel A. Lozano and Edwin R. Hancock
Entropy 2020, 22(4), 465; https://doi.org/10.3390/e22040465 - 19 Apr 2020
Cited by 10 | Viewed by 3851
Abstract
Alzheimer’s disease has been extensively studied using undirected graphs to represent the correlations of BOLD signals in different anatomical regions through functional magnetic resonance imaging (fMRI). However, there has been relatively little analysis of this kind of data using directed graphs, which offer the potential to capture asymmetries in the interactions between different anatomical brain regions. The detection of these asymmetries is relevant for detecting the disease at an early stage. For this reason, in this paper, we analyze data extracted from fMRI images using the net4Lap algorithm to infer a directed graph from the available BOLD signals, and then seek to determine asymmetries between the left and right hemispheres of the brain using a directed version of the Return Random Walk (RRW). Experimental evaluation of this method reveals that it leads to the identification of anatomical brain regions known to be implicated in the early development of Alzheimer’s disease in clinical studies. Full article
(This article belongs to the Special Issue Information Theory in Computational Neuroscience)
14 pages, 1663 KiB  
Article
Tsallis Entropy, Likelihood, and the Robust Seismic Inversion
by Igo Pedro de Lima, Sérgio Luiz E. F. da Silva, Gilberto Corso and João M. de Araújo
Entropy 2020, 22(4), 464; https://doi.org/10.3390/e22040464 - 19 Apr 2020
Cited by 14 | Viewed by 4101
Abstract
The nonextensive statistical mechanics proposed by Tsallis has been successfully used to model and analyze many complex phenomena. Here, we study the role of the generalized Tsallis statistics in inverse problem theory. Most inverse problems are formulated as an optimisation problem that aims to estimate the physical parameters of a system from indirect and partial observations. In the conventional approach, the misfit function to be minimized is based on the least-squares distance between the observed data and the modelled data (residuals or errors), in which the residuals are assumed to follow a Gaussian distribution. However, in many real situations, the error is typically non-Gaussian, and therefore this technique tends to fail. This problem has motivated us to study misfit functions based on non-Gaussian statistics. In this work, we derive a misfit function based on the q-Gaussian distribution associated with the maximum entropy principle in the Tsallis formalism. We tested our method on a typical geophysical inverse problem, called post-stack inversion (PSI), in which the physical parameter to be estimated is the Earth's reflectivity. Our results show that PSI based on Tsallis statistics outperforms the conventional PSI, especially in the case of non-Gaussian noisy data. Full article
(This article belongs to the Section Multidisciplinary Applications)
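To sketch how such a misfit arises (the paper's derivation may differ in its normalization): assuming the residuals r_i follow a q-Gaussian with width parameter β and taking the negative log-likelihood up to constants gives, for q > 1,

$$ \phi_q(\mathbf{m}) = \frac{1}{q-1} \sum_i \ln\!\left[\, 1 + (q-1)\,\beta\, r_i^{2}(\mathbf{m}) \right], $$

which reduces to the usual least-squares misfit $\beta \sum_i r_i^2$ in the limit $q \to 1$ and grows only logarithmically for large residuals, which is the source of the robustness to non-Gaussian noise.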