Journal Description
Entropy
Entropy is an international and interdisciplinary peer-reviewed open access journal of entropy and information studies, published monthly online by MDPI. The International Society for the Study of Information (IS4SI) and the Spanish Society of Biomedical Engineering (SEIB) are affiliated with Entropy, and their members receive a discount on the article processing charge.
- Open Access: free for readers, with article processing charges (APC) paid by authors or their institutions.
- High Visibility: indexed within Scopus, SCIE (Web of Science), Inspec, PubMed, PMC, Astrophysics Data System, and other databases.
- Journal Rank: JCR - Q2 (Physics, Multidisciplinary) / CiteScore - Q1 (Mathematical Physics)
- Rapid Publication: manuscripts are peer-reviewed and a first decision is provided to authors approximately 20.8 days after submission; acceptance to publication takes 2.9 days (median values for papers published in this journal in the second half of 2023).
- Recognition of Reviewers: reviewers who provide timely, thorough peer-review reports receive vouchers entitling them to a discount on the APC of their next publication in any MDPI journal, in appreciation of the work done.
- Testimonials: See what our editors and authors say about Entropy.
- Companion journals for Entropy include: Foundations, Thermo and MAKE.
Impact Factor: 2.7 (2022)
5-Year Impact Factor: 2.6 (2022)
Latest Articles
On the Stress–Strength Reliability of Transmuted GEV Random Variables with Applications to Financial Assets Selection
Entropy 2024, 26(6), 441; https://doi.org/10.3390/e26060441 - 23 May 2024
Abstract
In reliability contexts, probabilities of the type R = P(X < Y), where X and Y are random variables, have been shown to be useful tools to compare the performance of these stochastic entities. By considering that both X and
[...] Read more.
In reliability contexts, probabilities of the type R = P(X < Y), where X and Y are random variables, have been shown to be useful tools to compare the performance of these stochastic entities. By considering that both X and Y follow a transmuted generalized extreme-value (TGEV) distribution, new analytical relationships were derived for R in terms of special functions. The results obtained here are more flexible than similar results found in the literature. To highlight the applicability and correctness of our results, we conducted a Monte Carlo simulation study and investigated the use of the reliability measure to select among financial assets whose returns were characterized by the random variables X and Y. Our results highlight that R is an interesting alternative to modern portfolio theory, which usually contrasts the random variables involved by a simple comparison of their means and standard deviations.
Full article
(This article belongs to the Special Issue Stochastic Models and Statistical Inference: Analysis and Applications)
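The reliability measure R = P(X < Y) discussed above can be approximated numerically in a few lines. A minimal Monte Carlo sketch, substituting ordinary normal distributions for the paper's TGEV laws (all names and parameter values here are illustrative):

```python
import random

def reliability_mc(sample_x, sample_y, n=100_000, seed=0):
    """Monte Carlo estimate of the stress-strength reliability R = P(X < Y)."""
    rng = random.Random(seed)
    return sum(sample_x(rng) < sample_y(rng) for _ in range(n)) / n

# Illustrative stand-ins for the TGEV variables: X ~ N(0, 1), Y ~ N(1, 1),
# for which the exact value is Phi(1 / sqrt(2)), roughly 0.760.
r = reliability_mc(lambda rng: rng.gauss(0, 1), lambda rng: rng.gauss(1, 1))
```

With 100,000 samples the estimate lands within about 0.003 of the exact value; the paper's analytical expressions remove this sampling error entirely.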
Open Access Article
Causal Structure Learning with Conditional and Unique Information Groups-Decomposition Inequalities
by
Daniel Chicharro and Julia K. Nguyen
Entropy 2024, 26(6), 440; https://doi.org/10.3390/e26060440 - 23 May 2024
Abstract
The causal structure of a system imposes constraints on the joint probability distribution of variables that can be generated by the system. Archetypal constraints consist of conditional independencies between variables. However, particularly in the presence of hidden variables, many causal structures are compatible
[...] Read more.
The causal structure of a system imposes constraints on the joint probability distribution of variables that can be generated by the system. Archetypal constraints consist of conditional independencies between variables. However, particularly in the presence of hidden variables, many causal structures are compatible with the same set of independencies inferred from the marginal distributions of observed variables. Additional constraints allow further testing for the compatibility of data with specific causal structures. An existing family of causally informative inequalities compares the information about a set of target variables contained in a collection of variables, with a sum of the information contained in different groups defined as subsets of that collection. While procedures to identify the form of these groups-decomposition inequalities have been previously derived, we substantially enlarge the applicability of the framework. We derive groups-decomposition inequalities subject to weaker independence conditions, with weaker requirements in the configuration of the groups, and additionally allowing for conditioning sets. Furthermore, we show how constraints with higher inferential power may be derived with collections that include hidden variables, and then converted into testable constraints using data processing inequalities. For this purpose, we apply the standard data processing inequality of conditional mutual information and derive an analogous property for a measure of conditional unique information recently introduced to separate redundant, synergistic, and unique contributions to the information that a set of variables has about a target.
Full article
(This article belongs to the Special Issue Synergy and Redundancy Measures: Theory and Applications to Characterize Complex Systems and Shape Neural Network Representations)
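For reference, the standard data processing inequality of conditional mutual information invoked in the abstract can be stated in textbook form (not the paper's notation): if W is obtained by processing Y alone, so that X and W are conditionally independent given (Y, Z), then

```latex
% Data processing inequality for conditional mutual information:
% W a (possibly stochastic) function of Y alone, i.e., X and W
% conditionally independent given (Y, Z):
I(X; W \mid Z) \;\le\; I(X; Y \mid Z)
```

This is what lets constraints derived for collections that include hidden variables be converted into testable constraints on observed variables.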
Open Access Article
Fluctuation Theorems for Heat Exchanges between Passive and Active Baths
by
Massimiliano Semeraro, Antonio Suma and Giuseppe Negro
Entropy 2024, 26(6), 439; https://doi.org/10.3390/e26060439 - 23 May 2024
Abstract
In addition to providing general constraints on probability distributions, fluctuation theorems allow us to infer essential information on the role played by temperature in heat exchange phenomena. In this numerical study, we measure the temperature of an out-of-equilibrium active bath using a fluctuation
[...] Read more.
In addition to providing general constraints on probability distributions, fluctuation theorems allow us to infer essential information on the role played by temperature in heat exchange phenomena. In this numerical study, we measure the temperature of an out-of-equilibrium active bath using a fluctuation theorem that relates the fluctuations in the heat exchanged between two baths to their temperatures. Our setup consists of a single particle moving between two wells of a quartic potential accommodating two different baths. The heat exchanged between the two baths is monitored according to two definitions: as the kinetic energy carried by the particle whenever it jumps from one well to the other and as the work performed by the particle on one of the two baths when immersed in it. First, we consider two equilibrium baths at two different temperatures and verify that a fluctuation theorem featuring the baths' temperatures holds for both heat definitions. Then, we introduce an additional Gaussian coloured noise in one of the baths, so as to make it effectively an active (out-of-equilibrium) bath. We find that a fluctuation theorem is still satisfied with both heat definitions. Interestingly, in this case, the temperature obtained through the fluctuation theorem for the active bath corresponds to the kinetic temperature under the first heat definition, while it is larger under the second one. We interpret these results by looking at the particle jump phenomenology.
Full article
(This article belongs to the Section Non-equilibrium Phenomena)
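For context, the textbook form of a heat-exchange fluctuation theorem of this kind, due to Jarzynski and Wójcik for two equilibrium baths at temperatures T_h and T_c (the paper's exact statement may differ), reads:

```latex
% Probability of heat Q flowing from the hot to the cold bath over a long
% observation time, relative to the reverse transfer:
\frac{P(Q)}{P(-Q)}
  = \exp\!\left[\left(\frac{1}{k_B T_c} - \frac{1}{k_B T_h}\right) Q\right]
```

Fitting the statistics of the measured heat to a relation of this form is what allows an effective temperature to be assigned to the active bath.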
Open Access Review
Ising Paradigm in Isobaric Ensembles
by
Claudio A. Cerdeiriña and Jacobo Troncoso
Entropy 2024, 26(6), 438; https://doi.org/10.3390/e26060438 - 22 May 2024
Abstract
We review recent work on Ising-like models with “compressible cells” of fluctuating volume that, as such, are naturally treated in isobaric ensembles. Besides volumetric phenomena, local entropic effects crucially underlie the models. We focus on “compressible cell
[...] Read more.
We review recent work on Ising-like models with “compressible cells” of fluctuating volume that, as such, are naturally treated in isobaric ensembles. Besides volumetric phenomena, local entropic effects crucially underlie the models. We focus on “compressible cell gases” (CCG), namely, lattice gases with fluctuating cell volumes, and “compressible cell liquids” (CCL) with singly occupied cells and fluctuating cell volumes. CCGs contemplate singular diameters and “Yang–Yang features” predicted by the “complete scaling” formulation of asymmetric fluid criticality, with a specific version incorporating “ice-like” hydrogen bonding further describing the “singularity-free scenario” for the unusual low-temperature thermodynamics of supercooled water. In turn, suitable CCL variants constitute adequate prototypes of water-like liquid–liquid criticality and the freezing transition of a system of hard spheres. On incorporating vacant cells into such two-state CCL variants, one obtains three-state, BEG-like models providing a satisfactory description of water’s “second-critical-point scenario” and the whole phase behavior of a simple substance like argon. Future challenges comprise water’s crystal–fluid phase behavior and metastable states.
Full article
(This article belongs to the Special Issue Matter-Aggregating Systems at a Classical vs. Quantum Interface)
Open Access Article
Loss Control-Based Key Distribution under Quantum Protection
by
Nikita Kirsanov, Valeria Pastushenko, Aleksei Kodukhov, Aziz Aliev, Michael Yarovikov, Daniel Strizhak, Ilya Zarubin, Alexander Smirnov, Markus Pflitsch and Valerii Vinokur
Entropy 2024, 26(6), 437; https://doi.org/10.3390/e26060437 - 22 May 2024
Abstract
Quantum cryptography revolutionizes secure information transfer, providing defense against both quantum and classical computational attacks. The primary challenge in extending the reach of quantum communication comes from the exponential decay of signals over long distances. We meet this challenge by experimentally realizing the
[...] Read more.
Quantum cryptography revolutionizes secure information transfer, providing defense against both quantum and classical computational attacks. The primary challenge in extending the reach of quantum communication comes from the exponential decay of signals over long distances. We meet this challenge by experimentally realizing the Quantum-Protected Control-Based Key Distribution (QCKD) protocol, utilizing physical control over signal losses. By ensuring significant non-orthogonality of the leaked quantum states, this control severely constrains eavesdroppers’ capacities. We demonstrate the performance and scale of our protocol by experiments over a 1707 km long fiber line. The scalability of the QCKD opens the route for globally secure quantum-resistant communication.
Full article
(This article belongs to the Special Issue Advanced Technology in Quantum Cryptography)
Open Access Article
Enhancing Person Re-Identification through Attention-Driven Global Features and Angular Loss Optimization
by
Yihan Bi, Rong Wang, Qianli Zhou, Ronghui Lin and Mingjie Wang
Entropy 2024, 26(6), 436; https://doi.org/10.3390/e26060436 - 21 May 2024
Abstract
To address challenges related to the inadequate representation and inaccurate discrimination of pedestrian attributes, we propose a novel method for person re-identification, which leverages global feature learning and classification optimization. Specifically, this approach integrates a Normalization-based Channel Attention Module into the fundamental ResNet50
[...] Read more.
To address challenges related to the inadequate representation and inaccurate discrimination of pedestrian attributes, we propose a novel method for person re-identification, which leverages global feature learning and classification optimization. Specifically, this approach integrates a Normalization-based Channel Attention Module into the fundamental ResNet50 backbone, utilizing a scaling factor to prioritize and enhance key pedestrian feature information. Furthermore, dynamic activation functions are employed to adaptively modulate the parameters of ReLU based on the input convolutional feature maps, thereby bolstering the nonlinear expression capabilities of the network model. By incorporating Arcface loss into the cross-entropy loss, the supervised model is trained to learn pedestrian features that exhibit significant inter-class variance while maintaining tight intra-class coherence. The evaluation of the enhanced model on two popular datasets, Market1501 and DukeMTMC-ReID, reveals improvements in Rank-1 accuracy by 1.28% and 1.4%, respectively, along with corresponding gains in the mean average precision (mAP) of 1.93% and 1.84%. These findings indicate that the proposed model is capable of extracting more robust pedestrian features, enhancing feature discriminability, and ultimately achieving superior recognition accuracy.
Full article
(This article belongs to the Section Multidisciplinary Applications)
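As background for the loss design above, here is a minimal sketch of the additive angular margin that ArcFace applies to the true-class logit before softmax (scale and margin values are conventional defaults, not the paper's settings):

```python
import math

def arcface_logit(cos_theta, target, scale=30.0, margin=0.5):
    """Apply an additive angular margin to the target-class cosine logit.

    cos_theta: cosine similarities between an embedding and each class
    weight vector; target: index of the true class.
    """
    logits = []
    for j, c in enumerate(cos_theta):
        c = max(-1.0, min(1.0, c))  # numerical safety for acos
        if j == target:
            c = math.cos(math.acos(c) + margin)  # penalize the true class
        logits.append(scale * c)
    return logits

# The margin makes the true-class logit harder to satisfy, pushing
# same-identity embeddings closer together on the hypersphere, which is
# exactly the large inter-class / tight intra-class effect sought above.
logits = arcface_logit([0.8, 0.3], target=0)
```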
Open Access Article
Effect of Al Content on Microstructure and Properties of AlxCr0.2NbTiV Refractory High-Entropy Alloys
by
Rongbin Li, Qianqian Li, Zhixi Zhang, Rulin Zhang, Yue Xing and Doudou Han
Entropy 2024, 26(6), 435; https://doi.org/10.3390/e26060435 - 21 May 2024
Abstract
High-temperature creep refers to the slow and continuous plastic deformation of materials under the effects of high temperatures and mechanical stress over extended periods, which can lead to the degradation or even failure of the components’ functionality. AlxCr0.2NbTiV (x
[...] Read more.
High-temperature creep refers to the slow and continuous plastic deformation of materials under the effects of high temperatures and mechanical stress over extended periods, which can lead to the degradation or even failure of the components’ functionality. AlxCr0.2NbTiV (x = 0.2, 0.5, or 0.8) refractory high-entropy alloys were fabricated by arc melting. The effects of Al content on the microstructure of AlxCr0.2NbTiV alloys were studied using X-ray diffraction, scanning electron microscopy, and electron backscatter diffraction. The microhardness, compression properties, and nanoindentation creep properties of AlxCr0.2NbTiV alloys were also tested. The results show that the AlxCr0.2NbTiV series exhibits a BCC single-phase structure. As the Al content increases, the lattice constant of the alloys gradually decreases, and the intensity of the (110) crystal plane diffraction peak increases. Adding aluminum enhances the effect of solution strengthening; however, due to grain coarsening, the microhardness and room temperature compressive strength of the alloy are only slightly improved. Additionally, because the effect of solution strengthening is diminished at high temperatures, the compressive strength of the alloy at 1000 °C is significantly reduced. The creep mechanism of the alloys is predominantly governed by dislocation creep. Moreover, increasing the Al content helps to reduce the sensitivity of the alloy to the loading rate during the creep process. At a loading rate of 2.5 mN/s, the Al0.8Cr0.2NbTiV alloy exhibits the lowest creep strain rate sensitivity index (m), which is 0.0758.
Full article
(This article belongs to the Special Issue Recent Advances in Refractory High Entropy Alloys)
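For context, the creep strain-rate sensitivity index m reported above is commonly extracted from nanoindentation data as follows (one standard definition, assumed here; H is the hardness and the dotted epsilon the indentation strain rate):

```latex
m = \frac{\partial \ln H}{\partial \ln \dot{\varepsilon}}
```

A smaller m, such as the 0.0758 quoted for Al0.8Cr0.2NbTiV, indicates that the measured hardness depends only weakly on the loading rate.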
Open Access Article
Research on Active Safety Situation of Road Passenger Transportation Enterprises: Evaluation, Prediction, and Analysis
by
Lili Zheng, Shiyu Cao, Tongqiang Ding, Jian Tian and Jinghang Sun
Entropy 2024, 26(6), 434; https://doi.org/10.3390/e26060434 - 21 May 2024
Abstract
The road passenger transportation enterprise is a complex system, requiring a clear understanding of its active safety situation (ASS), trends, and influencing factors. This helps transportation authorities promptly receive signals and take effective measures. Through exploratory factor analysis and confirmatory factor analysis,
[...] Read more.
The road passenger transportation enterprise is a complex system, requiring a clear understanding of its active safety situation (ASS), trends, and influencing factors. This helps transportation authorities promptly receive signals and take effective measures. Through exploratory factor analysis and confirmatory factor analysis, we delved into potential factors for evaluating ASS and extracted an ASS index. To obtain a higher information rate when predicting ASS, we compared multiple time series models, including GRU (gated recurrent unit), LSTM (long short-term memory), ARIMA, Prophet, Conv_LSTM, and TCN (temporal convolutional network). This paper proposes the WDA-DBN (Water Drop Algorithm-Deep Belief Network) model and employs DEEPSHAP to identify factors with higher ASS information content. TCN and GRU performed well in the prediction. Compared to the other models, WDA-DBN exhibited the best performance in terms of MSE and MAE. Overall, the deep learning models outperform the econometric models in terms of information processing. The total time spent processing alarms positively influences ASS, while variables such as fatigue driving occurrences, abnormal driving occurrences, and nighttime driving alarm occurrences have a negative impact on ASS.
Full article
(This article belongs to the Special Issue Recent Advances in Statistical Inference for High Dimensional Data)
Open Access Article
Link Prediction in Complex Networks Using Average Centrality-Based Similarity Score
by
Y. V. Nandini, T. Jaya Lakshmi, Murali Krishna Enduri and Hemlata Sharma
Entropy 2024, 26(6), 433; https://doi.org/10.3390/e26060433 - 21 May 2024
Abstract
Link prediction plays a crucial role in identifying future connections within complex networks, facilitating the analysis of network evolution across various domains such as biological networks, social networks, recommender systems, and more. Researchers have proposed various centrality measures, such as degree, clustering coefficient,
[...] Read more.
Link prediction plays a crucial role in identifying future connections within complex networks, facilitating the analysis of network evolution across various domains such as biological networks, social networks, and recommender systems. Researchers have proposed various centrality measures, such as degree, clustering coefficient, betweenness, and closeness centralities, to compute similarity scores for predicting links in these networks. These centrality measures leverage both the local and global information of nodes within the network. In this study, we present a novel approach to link prediction using similarity scores based on average centrality measures drawn from local and global centralities, namely Similarity based on Average Degree, Similarity based on Average Betweenness, Similarity based on Average Closeness, and Similarity based on Average Clustering Coefficient. Our approach involves determining centrality scores for each node, calculating the average centrality for the entire graph, and deriving similarity scores through common neighbors. We then apply centrality scores to these common neighbors and identify nodes with above-average centrality. To evaluate our approach, we compared the proposed measures with existing local similarity-based link prediction measures, including common neighbors, the Jaccard coefficient, Adamic–Adar, resource allocation, and preferential attachment, as well as recent measures such as the common neighbor and Centrality-based Parameterized Algorithm, and keyword network link prediction. We conducted experiments on four real-world datasets. The proposed similarity scores based on average centralities demonstrate significant improvements: an average enhancement of 24% in terms of the Area Under the Receiver Operating Characteristic curve (AUROC) compared to existing local similarity measures, and a 31% improvement over recent measures. Furthermore, we observed average improvements of 49% and 51% in the Area Under the Precision–Recall curve (AUPR) compared to existing and recent measures, respectively. Our comprehensive experiments highlight the superior performance of the proposed method.
Full article
(This article belongs to the Special Issue Advances in Complex Networks and Artificial Intelligence)
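The scoring idea described above can be illustrated with degree centrality. A hypothetical minimal sketch (the paper's exact scoring rule may differ): score a candidate pair by counting its common neighbors whose degree exceeds the graph-average degree.

```python
def avg_degree_similarity(adj, u, v):
    """Count common neighbors of (u, v) whose degree is above the
    graph-average degree -- a sketch of the 'Similarity based on
    Average Degree' idea; the published definition may differ."""
    degrees = {node: len(nbrs) for node, nbrs in adj.items()}
    avg_deg = sum(degrees.values()) / len(degrees)
    common = adj[u] & adj[v]
    return sum(1 for w in common if degrees[w] > avg_deg)

# Toy graph as an adjacency dict of neighbor sets.
adj = {
    "a": {"b", "c", "d"},
    "b": {"a", "c"},
    "c": {"a", "b", "d", "e"},
    "d": {"a", "c"},
    "e": {"c"},
}
score = avg_degree_similarity(adj, "b", "d")  # common neighbors: a and c
```

Unlike plain common-neighbor counting, low-degree shared neighbors contribute nothing here, which is the filtering effect the average-centrality threshold provides.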
Open Access Article
Revisiting the Characterization of Resting Brain Dynamics with the Permutation Jensen–Shannon Distance
by
Luciano Zunino
Entropy 2024, 26(5), 432; https://doi.org/10.3390/e26050432 - 20 May 2024
Abstract
Given the complexity of human brain dynamics, the appropriate characterization of any brain state is a challenge not easily met. Indeed, even the discrimination of simple behavioral tasks, such as resting with eyes closed or eyes open, represents an intricate
[...] Read more.
Given the complexity of human brain dynamics, the appropriate characterization of any brain state is a challenge not easily met. Indeed, even the discrimination of simple behavioral tasks, such as resting with eyes closed or eyes open, represents an intricate problem, and many efforts have been and are being made to overcome it. In this work, this issue is carefully addressed by performing multiscale analyses of electroencephalogram records with the permutation Jensen–Shannon distance. The influence that linear and nonlinear temporal correlations have on the discrimination is unveiled. The results obtained lead to significant conclusions that help to achieve an improved distinction between these resting brain states.
Full article
(This article belongs to the Section Entropy and Biology)
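A minimal sketch of the permutation Jensen–Shannon distance used above: estimate ordinal-pattern distributions for two series and take the square root of their Jensen–Shannon divergence. The order-3 patterns and toy series are illustrative, not the paper's multiscale setup.

```python
import math
from collections import Counter
from itertools import permutations

def ordinal_dist(series, order=3):
    """Empirical distribution of ordinal (permutation) patterns."""
    pats = Counter(
        tuple(sorted(range(order), key=lambda k: series[i + k]))
        for i in range(len(series) - order + 1)
    )
    total = sum(pats.values())
    return {p: pats.get(p, 0) / total for p in permutations(range(order))}

def perm_jsd(x, y, order=3):
    """Permutation Jensen-Shannon distance between two time series."""
    p, q = ordinal_dist(x, order), ordinal_dist(y, order)
    def kl(a, b):  # Kullback-Leibler divergence in bits
        return sum(a[k] * math.log2(a[k] / b[k]) for k in a if a[k] > 0)
    m = {k: 0.5 * (p[k] + q[k]) for k in p}
    return math.sqrt(0.5 * kl(p, m) + 0.5 * kl(q, m))

x = [0, 1, 2, 3, 4, 5, 6, 7]   # monotone series: a single pattern
y = [0, 2, 1, 3, 1, 4, 1, 5]   # alternating structure
d = perm_jsd(x, y)
```

Because the two toy series share no ordinal patterns, the distance attains its maximum of 1 bit^(1/2); identical series give 0.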
Open Access Article
AM-MSFF: A Pest Recognition Network Based on Attention Mechanism and Multi-Scale Feature Fusion
by
Meng Zhang, Wenzhong Yang, Danny Chen, Chenghao Fu and Fuyuan Wei
Entropy 2024, 26(5), 431; https://doi.org/10.3390/e26050431 - 20 May 2024
Abstract
Traditional methods for pest recognition have certain limitations in addressing the challenges posed by diverse pest species, varying sizes, diverse morphologies, and complex field backgrounds, resulting in a lower recognition accuracy. To overcome these limitations, this paper proposes a novel pest recognition method
[...] Read more.
Traditional methods for pest recognition have certain limitations in addressing the challenges posed by diverse pest species, varying sizes, diverse morphologies, and complex field backgrounds, resulting in a lower recognition accuracy. To overcome these limitations, this paper proposes a novel pest recognition method based on attention mechanism and multi-scale feature fusion (AM-MSFF). By combining the advantages of attention mechanism and multi-scale feature fusion, this method significantly improves the accuracy of pest recognition. Firstly, we introduce the relation-aware global attention (RGA) module to adaptively adjust the feature weights of each position, thereby focusing more on the regions relevant to pests and reducing the background interference. Then, we propose the multi-scale feature fusion (MSFF) module to fuse feature maps from different scales, which better captures the subtle differences and the overall shape features in pest images. Moreover, we introduce generalized-mean pooling (GeMP) to more accurately extract feature information from pest images and better distinguish different pest categories. In terms of the loss function, this study proposes an improved focal loss (FL), known as balanced focal loss (BFL), as a replacement for cross-entropy loss. This improvement aims to address the common issue of class imbalance in pest datasets, thereby enhancing the recognition accuracy of pest identification models. To evaluate the performance of the AM-MSFF model, we conduct experiments on two publicly available pest datasets (IP102 and D0). Extensive experiments demonstrate that our proposed AM-MSFF outperforms most state-of-the-art methods. On the IP102 dataset, the accuracy reaches 72.64%, while on the D0 dataset, it reaches 99.05%.
Full article
(This article belongs to the Section Entropy and Biology)
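As background for the balanced focal loss proposed above, here is a sketch of the standard focal loss it modifies (alpha and gamma are conventional defaults, not the paper's settings):

```python
import math

def focal_loss(p_true, alpha=0.25, gamma=2.0):
    """Focal loss for true-class probability p_true:
    FL = -alpha * (1 - p_true)**gamma * log(p_true).
    The (1 - p_true)**gamma factor down-weights easy examples."""
    return -alpha * (1.0 - p_true) ** gamma * math.log(p_true)

def cross_entropy(p_true):
    return -math.log(p_true)

# A well-classified sample (p = 0.9) contributes far less to the focal
# loss than to plain cross-entropy, so hard and rare-class examples
# dominate training -- the imbalance remedy discussed above.
easy, hard = focal_loss(0.9), focal_loss(0.1)
```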
Open Access Review
Information Theory, Living Systems, and Communication Engineering
by
Dragana Bajić
Entropy 2024, 26(5), 430; https://doi.org/10.3390/e26050430 - 18 May 2024
Abstract
Mainstream research on information theory within the field of living systems involves the application of analytical tools to understand a broad range of life processes. This paper is dedicated to an opposite problem: it explores the information theory and communication engineering methods that
[...] Read more.
Mainstream research on information theory within the field of living systems involves the application of analytical tools to understand a broad range of life processes. This paper is dedicated to an opposite problem: it explores the information theory and communication engineering methods that have counterparts in the data transmission process by way of DNA structures and neural fibers. Considering the requirements of modern multimedia, transmission methods chosen by nature may be different, suboptimal, or even far from optimal. However, nature is known for rational resource usage, so its methods have a significant advantage: they are proven to be sustainable. Perhaps understanding the engineering aspects of methods of nature can inspire a design of alternative green, stable, and low-cost transmission.
Full article
(This article belongs to the Section Entropy Reviews)
Open Access Article
Heat Bath in a Quantum Circuit
by
Jukka P. Pekola and Bayan Karimi
Entropy 2024, 26(5), 429; https://doi.org/10.3390/e26050429 - 17 May 2024
Abstract
We discuss the concept and realization of a heat bath in solid state quantum systems. We demonstrate that, unlike a true resistor, a finite one-dimensional Josephson junction array or analogously a transmission line with non-vanishing frequency spacing, commonly considered as a reservoir of
[...] Read more.
We discuss the concept and realization of a heat bath in solid-state quantum systems. We demonstrate that, unlike a true resistor, a finite one-dimensional Josephson junction array, or analogously a transmission line with non-vanishing frequency spacing, commonly considered as a reservoir of a quantum circuit, does not strictly qualify as a Caldeira–Leggett type dissipative environment. We then consider a set of quantum two-level systems as a bath, which can be realized as a collection of qubits. We show that only a dense and wide distribution of energies of the two-level systems can secure long Poincaré recurrence times characteristic of a proper heat bath. An alternative for this bath is a collection of harmonic oscillators, for instance, in the form of superconducting resonators.
Full article
(This article belongs to the Special Issue Advances in Quantum Thermodynamics)
Open Access Article
A Novel Fault Diagnosis Method of High-Speed Train Based on Few-Shot Learning
by
Yunpu Wu, Jianhua Chen, Xia Lei and Weidong Jin
Entropy 2024, 26(5), 428; https://doi.org/10.3390/e26050428 - 16 May 2024
Abstract
Ensuring the safe and stable operation of high-speed trains necessitates real-time monitoring and diagnostics of their suspension systems. While machine learning technology is widely employed for industrial equipment fault diagnosis, its effective application relies on the availability of a large dataset with annotated
[...] Read more.
Ensuring the safe and stable operation of high-speed trains necessitates real-time monitoring and diagnostics of their suspension systems. While machine learning technology is widely employed for industrial equipment fault diagnosis, its effective application relies on the availability of a large dataset with annotated fault data for model training. However, in practice, the availability of informational data samples is often insufficient, with most of them being unlabeled. The challenge arises when traditional machine learning methods encounter a scarcity of training data, leading to overfitting due to limited information. To address this issue, this paper proposes a novel few-shot learning method for high-speed train fault diagnosis, incorporating sensor-perturbation injection and meta-confidence learning to improve detection accuracy. Experimental results demonstrate the superior performance of the proposed method, which introduces perturbations, compared to existing methods. The impact of perturbation effects and class numbers on fault detection is analyzed, confirming the effectiveness of our learning strategy.
Full article
(This article belongs to the Special Issue New Trends in Fault Diagnosis and Prognosis for Engineering Applications: From Signal Processing to Machine Learning and Deep Learning)
Open Access Article
Fault Diagnosis Method for Space Fluid Loop Systems Based on Improved Evidence Theory
by
Yue Liu, Zhenxiang Li, Lu Zhang and Hongyong Fu
Entropy 2024, 26(5), 427; https://doi.org/10.3390/e26050427 - 16 May 2024
Abstract
Addressing the challenges posed by the structural complexity of space application fluid loop systems and the multitude of sensor types installed in them, this paper proposes a fault diagnosis method based on an improved D-S evidence theory. The method first employs a Gaussian
[...] Read more.
Addressing the challenges posed by the structural complexity of space application fluid loop systems and the multitude of sensor types installed in them, this paper proposes a fault diagnosis method based on an improved D-S evidence theory. The method first employs a Gaussian membership function to convert the information acquired by sensors into basic probability assignment (BPA) functions. Subsequently, it utilizes the pignistic probability transformation to convert multiple-subset focal elements into single-subset focal elements. Finally, it comprehensively evaluates the credibility and uncertainty factors between pieces of evidence, introducing the Bray–Curtis dissimilarity and belief entropy to achieve the fusion of conflicting evidence. The proposed method is initially validated on the classic Iris dataset, demonstrating its reliability. Furthermore, when applied to fault diagnosis in space application fluid circuit loop pumps, the results indicate that the method can effectively fuse data from multiple sensors and accurately identify faults.
Full article
(This article belongs to the Section Multidisciplinary Applications)
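For background, the classical Dempster's rule of combination that the method above improves upon can be sketched as follows (the toy frame and masses are illustrative; the paper replaces this naive fusion with Bray–Curtis dissimilarity and belief-entropy weighting to handle conflicting evidence):

```python
def dempster_combine(m1, m2):
    """Dempster's rule of combination for two BPAs over the same frame.

    m1, m2 map focal elements (frozensets) to masses summing to 1.
    Mass assigned to empty intersections (conflict) is renormalized away.
    """
    combined, conflict = {}, 0.0
    for a, ma in m1.items():
        for b, mb in m2.items():
            inter = a & b
            if inter:
                combined[inter] = combined.get(inter, 0.0) + ma * mb
            else:
                conflict += ma * mb
    if conflict >= 1.0:
        raise ValueError("total conflict: evidence cannot be combined")
    return {k: v / (1.0 - conflict) for k, v in combined.items()}

# Two sensors reporting on a two-hypothesis frame {A, B}.
A, B = frozenset("A"), frozenset("B")
fused = dempster_combine({A: 0.6, B: 0.4}, {A: 0.7, B: 0.3})
```

The renormalization step is exactly where the classical rule behaves badly under high conflict, which motivates the weighted-fusion refinements described in the abstract.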
Open Access Article
Exploring Simplicity Bias in 1D Dynamical Systems
by
Kamal Dingle, Mohammad Alaskandarani, Boumediene Hamzi and Ard A. Louis
Entropy 2024, 26(5), 426; https://doi.org/10.3390/e26050426 - 16 May 2024
Abstract
Arguments inspired by algorithmic information theory predict an inverse relation between the probability and complexity of output patterns in a wide range of input–output maps. This phenomenon is known as simplicity bias. By viewing the parameters of dynamical systems as inputs, and the
[...] Read more.
Arguments inspired by algorithmic information theory predict an inverse relation between the probability and complexity of output patterns in a wide range of input–output maps. This phenomenon is known as simplicity bias. By viewing the parameters of dynamical systems as inputs, and the resulting (digitised) trajectories as outputs, we study simplicity bias in the logistic map, Gauss map, sine map, Bernoulli map, and tent map. We find that the logistic map, Gauss map, and sine map all exhibit simplicity bias upon sampling of map initial values and parameter values, but the Bernoulli map and tent map do not. The simplicity bias upper bound on the output pattern probability is used to make a priori predictions regarding the probability of output patterns. In some cases, the predictions are surprisingly accurate, given that almost no details of the underlying dynamical systems are assumed. More generally, we argue that studying probability–complexity relationships may be a useful tool when studying patterns in dynamical systems.
Full article
(This article belongs to the Section Complexity)
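The sampling procedure described in the abstract can be sketched in a few lines: draw random parameter and initial values, digitise each trajectory into a binary string, and compare pattern probability against a complexity proxy. The paper uses Lempel-Ziv-style complexity measures; the `zlib` compressed length below is a crude stand-in chosen for brevity, and all numeric choices (trajectory length, sample count) are illustrative.

```python
import random
import zlib
from collections import Counter

def binary_trajectory(mu, x0, n=25):
    """Iterate the logistic map x -> mu*x*(1-x), threshold each iterate at 0.5."""
    bits, x = [], x0
    for _ in range(n):
        x = mu * x * (1.0 - x)
        bits.append('1' if x > 0.5 else '0')
    return ''.join(bits)

def complexity(s):
    """Compressed length as a rough proxy for descriptional complexity."""
    return len(zlib.compress(s.encode()))

random.seed(0)
# Sample parameters and initial conditions; count how often each output occurs
counts = Counter(binary_trajectory(random.uniform(0.0, 4.0), random.random())
                 for _ in range(10_000))
top = counts.most_common(1)[0][0]  # high-probability patterns tend to be simple
```

Running this, the most frequent output is the constant all-zeros pattern (every trajectory with mu < 2 stays below 0.5), which also has the lowest complexity proxy, i.e. the qualitative probability-complexity anticorrelation the paper studies.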
Open Access Article
Memory Corrections to Markovian Langevin Dynamics
by
Mateusz Wiśniewski, Jerzy Łuczka and Jakub Spiechowicz
Entropy 2024, 26(5), 425; https://doi.org/10.3390/e26050425 - 16 May 2024
Abstract
Analysis of non-Markovian systems and memory-induced phenomena poses an everlasting challenge in the realm of physics. As a paradigmatic example, we consider a classical Brownian particle of mass M subjected to an external force and exposed to correlated thermal fluctuations. We show that a recently developed approach to this system, in which its non-Markovian dynamics, given by the Generalized Langevin Equation, is approximated by a memoryless counterpart with an effective particle mass, can be derived within the Markovian embedding technique. Using this method, we calculate the first- and second-order memory corrections to the Markovian dynamics of the Brownian particle for a memory kernel represented as a Prony series. The second-order correction lowers the effective mass of the system further and improves the precision of the approximation. Our work opens the door to the derivation of higher-order memory corrections to Markovian Langevin dynamics.
Full article
(This article belongs to the Collection Foundations of Statistical Mechanics)
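The Markovian embedding mentioned in the abstract can be illustrated for the simplest one-term Prony kernel, Gamma(t) = (g/tau)·exp(-t/tau): the memory integral becomes an auxiliary variable z obeying a linear equation, and the correlated noise becomes an Ornstein-Uhlenbeck process. The Euler-Maruyama sketch below is an assumption-laden toy (free particle, illustrative parameters, units with k_B T = 1), not the paper's calculation; it only checks that the embedded system reproduces equipartition.

```python
import math
import random

random.seed(42)
M, g, tau, kT, dt = 1.0, 1.0, 0.5, 1.0, 0.01
steps = 500_000

v = z = xi = 0.0
# OU noise amplitude chosen so that <xi(t) xi(s)> = kT * Gamma(|t-s|)
amp = math.sqrt(2.0 * kT * g) / tau * math.sqrt(dt)
acc, n = 0.0, 0
for i in range(steps):
    # auxiliary memory variable: z(t) = integral of Gamma(t-s) v(s) ds
    z += (-z / tau + (g / tau) * v) * dt
    # Ornstein-Uhlenbeck thermal noise with exponential correlation
    xi += -xi / tau * dt + amp * random.gauss(0.0, 1.0)
    # momentum equation of the embedded (now Markovian) system
    v += (-z + xi) / M * dt
    if i > steps // 10:  # discard the initial transient
        acc += v * v
        n += 1
mean_v2 = acc / n  # equipartition predicts <v^2> = kT / M
```

The point of the embedding is that the pair (v, z) plus the OU noise is Markovian even though v alone is not; the paper's memory corrections arise from systematically eliminating the fast auxiliary variables of such an embedding.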
Open Access Article
Non-Negative Decomposition of Multivariate Information: From Minimum to Blackwell-Specific Information
by
Tobias Mages, Elli Anastasiadi and Christian Rohner
Entropy 2024, 26(5), 424; https://doi.org/10.3390/e26050424 - 15 May 2024
Abstract
Partial information decompositions (PIDs) aim to categorize how a set of source variables provides information about a target variable redundantly, uniquely, or synergistically. The original proposal for such an analysis used a lattice-based approach and gained significant attention. However, finding a suitable underlying decomposition measure for an arbitrary number of discrete random variables remains an open research question. This work proposes a solution with a non-negative PID that satisfies an inclusion–exclusion relation for any f-information measure. The decomposition is constructed from a pointwise perspective of the target variable to take advantage of the equivalence between the Blackwell and zonogon orders in this setting. Zonogons are the Neyman–Pearson regions for an indicator variable of each target state, and f-information is the expected value of a quantification of their boundary. We prove that the proposed decomposition satisfies the desired axioms and guarantees non-negative partial information results. Moreover, we demonstrate how the obtained decomposition can be transformed between different decomposition lattices and that it directly provides a non-negative decomposition of Rényi information under a transformed inclusion–exclusion relation. Finally, we highlight that the decomposition behaves differently depending on the information measure used and show how it can be used to trace partial information flows through Markov chains.
Full article
(This article belongs to the Special Issue Synergy and Redundancy Measures: Theory and Applications to Characterize Complex Systems and Shape Neural Network Representations)
Open Access Article
Landauer Bound in the Context of Minimal Physical Principles: Meaning, Experimental Verification, Controversies and Perspectives
by
Edward Bormashenko
Entropy 2024, 26(5), 423; https://doi.org/10.3390/e26050423 - 15 May 2024
Abstract
The physical roots, interpretation, controversies, and precise meaning of the Landauer principle are surveyed. The Landauer principle is a physical principle defining the lower theoretical limit of energy consumption necessary for computation. It states that an irreversible change in the information stored in a computer, such as merging two computational paths, dissipates a minimum amount of heat per bit of information to its surroundings. The Landauer principle is discussed in the context of fundamental physical limiting principles, such as the Abbe diffraction limit, the Margolus–Levitin limit, and the Bekenstein limit. Synthesis of the Landauer bound with the Abbe, Margolus–Levitin, and Bekenstein limits yields the minimal time of computation. Decreasing the temperature of the thermal bath decreases the energy consumption of a single computation but, in parallel, slows the computation. The Landauer principle bridges John Archibald Wheeler’s “it from bit” paradigm and thermodynamics. Experimental verifications of the Landauer principle are surveyed, the interrelation between thermodynamic and logical irreversibility is addressed, and the generalization of the principle to quantum and non-equilibrium systems is discussed. The Landauer principle represents a powerful heuristic principle bridging physics, information theory, and computer engineering.
Full article
(This article belongs to the Special Issue The Landauer Principle and Its Implementations in Physics, Chemistry and Biology: Current Status, Critics and Controversies)
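The bound itself is a one-line calculation, E_min = k_B·T·ln 2 per erased bit; the sketch below evaluates it at room temperature using the exact SI value of the Boltzmann constant (the numbers are standard constants, not results from the paper).

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K (exact in the 2019 SI)
T = 300.0           # room temperature, K

# Landauer bound: minimal heat dissipated per irreversibly erased bit
E_min = k_B * T * math.log(2)   # roughly 2.9e-21 J, about 0.018 eV

# Cooling the bath lowers the per-bit cost proportionally to T
E_min_cold = k_B * 77.0 * math.log(2)  # liquid-nitrogen temperature
```

As the abstract notes, the energy saving from cooling is not free: the accompanying limits (Margolus-Levitin, Bekenstein) imply the computation slows as the dissipated energy per operation is reduced.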
Open Access Article
Model Selection for Exponential Power Mixture Regression Models
by
Yunlu Jiang, Jiangchuan Liu, Hang Zou and Xiaowen Huang
Entropy 2024, 26(5), 422; https://doi.org/10.3390/e26050422 - 15 May 2024
Abstract
Finite mixture of linear regression (FMLR) models are among the most exemplary statistical tools for dealing with heterogeneous data. In this paper, we introduce a new procedure to simultaneously determine the number of components and perform variable selection across the component regressions of FMLR models via an exponential power error distribution, which includes the normal and Laplace distributions as special cases. Under some regularity conditions, the consistency of order selection and the consistency of variable selection are established, and the asymptotic normality of the estimators of the non-zero parameters is investigated. In addition, an efficient modified expectation-maximization (EM) algorithm and a majorization-maximization (MM) algorithm are proposed to solve the resulting optimization problem. Furthermore, we use numerical simulations to demonstrate the finite-sample performance of the proposed methodology. Finally, we apply the proposed approach to a baseball salary data set. Results indicate that our proposed method obtains a smaller BIC value than the existing method.
Full article
(This article belongs to the Section Information Theory, Probability and Statistics)
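The BIC-based order selection underlying this kind of procedure can be illustrated on a deliberately simplified case: a plain 1D Gaussian mixture fit by EM, rather than the paper's exponential-power mixture regressions. All parameters and the simulated data below are hypothetical; the sketch only shows BIC picking the true number of components.

```python
import math
import random

def em_gaussian_mixture(xs, k, iters=100):
    """Fit a k-component 1D Gaussian mixture by EM; return (log-lik, BIC)."""
    srt = sorted(xs)
    mus = [srt[int(len(xs) * (j + 0.5) / k)] for j in range(k)]  # quantile init
    sigmas = [1.0] * k
    ws = [1.0 / k] * k

    def pdf(x, w, m, s):
        return w * math.exp(-(x - m) ** 2 / (2 * s * s)) / (s * math.sqrt(2 * math.pi))

    for _ in range(iters):
        # E-step: posterior responsibility of each component for each point
        resp = []
        for x in xs:
            ps = [pdf(x, ws[j], mus[j], sigmas[j]) for j in range(k)]
            tot = sum(ps)
            resp.append([p / tot for p in ps])
        # M-step: re-estimate weights, means, and standard deviations
        for j in range(k):
            nj = max(sum(r[j] for r in resp), 1e-12)
            ws[j] = nj / len(xs)
            mus[j] = sum(r[j] * x for r, x in zip(resp, xs)) / nj
            var = sum(r[j] * (x - mus[j]) ** 2 for r, x in zip(resp, xs)) / nj
            sigmas[j] = math.sqrt(max(var, 1e-6))

    ll = sum(math.log(sum(pdf(x, ws[j], mus[j], sigmas[j]) for j in range(k)))
             for x in xs)
    n_params = 3 * k - 1  # k means, k scales, k-1 free weights
    return ll, n_params * math.log(len(xs)) - 2.0 * ll

random.seed(0)
data = ([random.gauss(-3.0, 1.0) for _ in range(200)]
        + [random.gauss(3.0, 1.0) for _ in range(200)])
bics = {k: em_gaussian_mixture(data, k)[1] for k in (1, 2, 3)}
# BIC penalises the extra parameters, so the two-component fit wins here
```

The paper's method goes further by penalising individual regression coefficients as well, so that order selection and variable selection happen in one pass; the BIC comparison above is only the order-selection half of that idea.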
Topics
Topic in
Algorithms, Diagnostics, Entropy, Information, J. Imaging
Application of Machine Learning in Molecular Imaging
Topic Editors: Allegra Conti, Nicola Toschi, Marianna Inglese, Andrea Duggento, Matthew Grech-Sollars, Serena Monti, Giancarlo Sportelli, Pietro Carra
Deadline: 31 May 2024
Topic in
Education Sciences, Entropy, JAL, Societies, Sustainability
Sustainability in Aging and Depopulation Societies
Topic Editors: Shiro Horiuchi, Gregor Wolbring, Takeshi Matsuda
Deadline: 15 June 2024
Topic in
Buildings, Energies, Entropy, Resources, Sustainability
Advances in Solar Heating and Cooling
Topic Editors: Salvatore Vasta, Sotirios Karellas, Marina Bonomolo, Alessio Sapienza, Uli Jakob
Deadline: 30 June 2024
Topic in
Actuators, Applied Sciences, Entropy
Thermodynamics and Heat Transfers in Vacuum Tube Trains (Hyperloop)
Topic Editors: Suyong Choi, Minki Cho, Jungyoul Lim
Deadline: 30 July 2024
Conferences
22–26 November 2024
2024 International Conference on Science and Engineering of Electronics (ICSEE'2024), Wuhan, China
28–31 May 2024
XXII Conference on Non-equilibrium Statistical Mechanics and Nonlinear Physics—MEDYFINOL 2024
Special Issues
Special Issue in
Entropy
Thermodynamic Evaluation and Optimization of Combustion Processes
Guest Editors: Chun Lou, Jaroslaw Krzywanski, Zhongnong Zhang, Dorian Skrobek
Deadline: 23 May 2024
Special Issue in
Entropy
Entropy, Statistical Evidence, and Scientific Inference: Evidence Functions in Theory and Applications
Guest Editors: Brian Dennis, Mark L. Taper, Jose Miguel Ponciano
Deadline: 31 May 2024
Special Issue in
Entropy
Nonlinear Dynamics in Cardiovascular Signals
Guest Editor: Claudia Lerma
Deadline: 15 June 2024
Special Issue in
Entropy
Non-equilibrium Thermodynamics
Guest Editors: Duc Nguyen-Manh, Abraham Marmur
Deadline: 30 June 2024
Topical Collections
Topical Collection in
Entropy
Algorithmic Information Dynamics: A Computational Approach to Causality from Cells to Networks
Collection Editors: Hector Zenil, Felipe Abrahão
Topical Collection in
Entropy
Wavelets, Fractals and Information Theory
Collection Editor: Carlo Cattani
Topical Collection in
Entropy
Entropy in Image Analysis
Collection Editor: Amelia Carolina Sparavigna