Entropy, Volume 26, Issue 9 (September 2024) – 95 articles

Cover Story: Time possesses a clear asymmetry between the past and the future; entropy increases, life evolves, we remember the past, and we plan the future. Yet fundamentally, our universe is governed by laws that are symmetric with respect to time reversal, making it difficult to identify a physical explanation for our experience of the irreversible passage of time. To reconcile this difference, we propose a toy model of emergent time asymmetries—the causal multibaker maps. After imposing a suitable initial condition and coarse graining on them, a Pearlean locally causal structure emerges. From this structure, we obtain not only the second law of thermodynamics but many additional time asymmetries resembling those of the real universe.
  • Issues are regarded as officially published after their release is announced to the table of contents alert mailing list.
  • You may sign up for e-mail alerts to receive the table of contents of newly released issues.
  • Papers are published in both HTML and PDF forms; PDF is the official version. To view a paper in PDF format, click on the "PDF Full-text" link and open it with the free Adobe Reader.
18 pages, 737 KiB  
Article
One-Photon-Interference Quantum Secure Direct Communication
by Xiang-Jie Li, Min Wang, Xing-Bo Pan, Yun-Rong Zhang and Gui-Lu Long
Entropy 2024, 26(9), 811; https://doi.org/10.3390/e26090811 - 23 Sep 2024
Viewed by 599
Abstract
Quantum secure direct communication (QSDC) is a quantum communication paradigm that transmits confidential messages directly using quantum states. Measurement-device-independent (MDI) QSDC protocols can eliminate the security loopholes associated with measurement devices. To enhance the practicality and performance of MDI-QSDC protocols, we propose a one-photon-interference MDI QSDC (OPI-QSDC) protocol, which removes the need for quantum memory, ideal single-photon sources, or entangled light sources. The security of our OPI-QSDC protocol has also been analyzed using quantum wiretap channel theory. Furthermore, our protocol could double the distance of usual prepare-and-measure protocols, since quantum states sent from adjacent nodes are connected through single-photon interference, which demonstrates its potential to extend the communication distance for point-to-point QSDC.
(This article belongs to the Special Issue Quantum Information: Working towards Applications)
17 pages, 4033 KiB  
Article
Motor Fault Diagnosis Based on Convolutional Block Attention Module-Xception Lightweight Neural Network
by Fengyun Xie, Qiuyang Fan, Gang Li, Yang Wang, Enguang Sun and Shengtong Zhou
Entropy 2024, 26(9), 810; https://doi.org/10.3390/e26090810 - 23 Sep 2024
Viewed by 725
Abstract
Electric motors play a crucial role in self-driving vehicles. Therefore, fault diagnosis in motors is important for ensuring the safety and reliability of vehicles. In order to improve fault detection performance, this paper proposes a motor fault diagnosis method based on vibration signals. Firstly, the vibration signals of each operating state of the motor at different frequencies are measured with vibration sensors. Secondly, Gramian angular field encoding is used to encode the time-domain information, transforming the one-dimensional vibration signals into grayscale images that highlight their features. Finally, the lightweight neural network Xception is chosen as the main tool, and the Convolutional Block Attention Module (CBAM) attention mechanism is introduced into the model to emphasize the characteristic information of the motor faults and realize their accurate identification. Xception is a type of convolutional neural network; its lightweight design maintains excellent performance while significantly reducing the model's size. Without affecting the computational complexity and accuracy of the network, the CBAM attention mechanism is added, and the Gramian angular field encoding is combined with the improved lightweight neural network. The experimental results show that this model achieves a better recognition effect and faster iteration speed compared with the traditional Convolutional Neural Network (CNN), ResNet, and Xception networks.
(This article belongs to the Special Issue Information-Theoretic Methods in Data Analytics)
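To make the image-encoding step concrete, here is a minimal sketch of a Gramian angular (summation) field transform, assuming a synthetic signal in place of the measured vibration data; `gramian_angular_field` is an illustrative helper, not the authors' code:

```python
import numpy as np

def gramian_angular_field(x):
    """Encode a 1-D signal as a Gramian angular summation field (GASF) image."""
    # Rescale the signal to [-1, 1] so each sample can be treated as cos(phi).
    x = np.asarray(x, dtype=float)
    x = 2 * (x - x.min()) / (x.max() - x.min()) - 1
    phi = np.arccos(np.clip(x, -1, 1))          # polar-angle representation
    # GASF(i, j) = cos(phi_i + phi_j), computed for all sample pairs at once.
    return np.cos(phi[:, None] + phi[None, :])

# Synthetic stand-in for a measured vibration signal (not the authors' data).
t = np.linspace(0, 1, 128)
signal = np.sin(2 * np.pi * 50 * t) + 0.3 * np.random.randn(t.size)
image = gramian_angular_field(signal)           # 128 x 128 grayscale image
print(image.shape, image.min(), image.max())
```

The resulting 2D image preserves temporal dependencies along its diagonal structure, which is what makes it a convenient input for a convolutional network such as Xception.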
17 pages, 3333 KiB  
Article
The Application of Pinch Technology to a Novel Closed-Loop Spray Drying System with a Condenser and Reheater
by Zexin Lei, Thomas O’Neill and Timothy Langrish
Entropy 2024, 26(9), 809; https://doi.org/10.3390/e26090809 - 23 Sep 2024
Viewed by 518
Abstract
Spray drying is an energy-intensive process in industrial use, making energy recovery a critical focus for improving overall efficiency. This study investigates the potential of integrating heat-recovery systems, including an innovative air reheater, into a closed-loop spray-drying unit to maximise energy savings. Through detailed pinch analysis, the system achieved a very low approach temperature, averaging 3.48 K, which is significantly lower than that of conventional open-loop systems. The study quantifies the energy-recovery potential by demonstrating that the integration of heat-recovery components can reduce the external heating demand by up to 30%. This not only enhances heat-transfer efficiency but also lowers operational costs and reduces the system’s environmental impact. The results suggest that closed-loop systems with air reheaters offer a scalable solution for improving energy efficiency across different industrial applications. The research highlights a new paradigm: focusing on latent energy within the system rather than adjusting individual operational variables.
(This article belongs to the Special Issue Thermal Science and Engineering Applications)
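For illustration, the approach temperature at the heart of a pinch analysis can be profiled from simple energy balances on a counter-current heat-recovery exchanger; the stream data below are invented and do not reproduce the paper's dryer:

```python
import numpy as np

# Hypothetical counter-current recuperator between the dryer exhaust (hot)
# and the recycled inlet air (cold); all stream data are illustrative only.
m_cp_hot, m_cp_cold = 1.2, 1.0          # heat-capacity flow rates, kW/K
T_hot_in, T_cold_in = 360.0, 300.0      # inlet temperatures, K
duty = 50.0                             # recovered heat, kW

# Outlet temperatures from an energy balance on each stream.
T_hot_out = T_hot_in - duty / m_cp_hot
T_cold_out = T_cold_in + duty / m_cp_cold

# Temperature profiles versus cumulative duty q (counter-current pairing:
# the hot inlet faces the cold outlet).
q = np.linspace(0.0, duty, 200)
T_hot = T_hot_in - q / m_cp_hot
T_cold = T_cold_out - q / m_cp_cold
approach = T_hot - T_cold
print(f"minimum approach temperature: {approach.min():.2f} K")
```

The pinch is wherever this approach profile reaches its minimum; driving that minimum down (as the reported 3.48 K average suggests) is what allows more of the exhaust heat to be recovered.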
21 pages, 1508 KiB  
Article
Autocatalytic Sets and Assembly Theory: A Toy Model Perspective
by Sebastian Raubitzek, Alexander Schatten, Philip König, Edina Marica, Sebastian Eresheim and Kevin Mallinger
Entropy 2024, 26(9), 808; https://doi.org/10.3390/e26090808 - 22 Sep 2024
Viewed by 839
Abstract
Assembly Theory provides a promising framework to explain the complexity of systems such as molecular structures and the origins of life, with broad applicability across various disciplines. In this study, we explore and consolidate different aspects of Assembly Theory by introducing a simplified Toy Model to simulate the autocatalytic formation of complex structures. This model abstracts the molecular formation process, focusing on the probabilistic control of catalysis rather than the intricate interactions found in organic chemistry. We establish a connection between probabilistic catalysis events and key principles of Assembly Theory, particularly the probability of a possible construction path in the formation of a complex object, and examine how the assembly of complex objects is impacted by the presence of autocatalysis. Our findings suggest that the presence of autocatalysis tends to favor longer consecutive construction sequences in environments with a low probability of catalysis, while this bias diminishes in environments with higher catalysis probabilities, highlighting the significant influence of environmental factors on the assembly of complex structures.
(This article belongs to the Section Complexity)
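A schematic of how probabilistic catalysis might be simulated (this is not the authors' model; the growth rule, boost factor, and parameters are all illustrative): objects grow one construction step at a time, and the presence of an object of the target length elsewhere in the population boosts the success probability:

```python
import random

def mean_chain_length(p_cat, boost, steps=5000, chains=20, seed=1):
    """Toy assembly process: at each step one chain may grow by one unit.
    If an object of the target length already exists in the population,
    it acts as a catalyst and multiplies the success probability."""
    random.seed(seed)
    pool = [1] * chains                       # primitive objects of length 1
    for _ in range(steps):
        i = random.randrange(chains)
        target = pool[i] + 1
        p = p_cat * (boost if target in pool else 1.0)
        if random.random() < min(p, 1.0):
            pool[i] = target                  # one more construction step
    return sum(pool) / chains

for p in (0.005, 0.05, 0.5):
    base = mean_chain_length(p, boost=1.0)
    auto = mean_chain_length(p, boost=10.0)
    print(f"p={p}: mean length {base:.2f} without vs {auto:.2f} with autocatalysis")
```

Qualitatively, the relative gain from the catalytic boost should be largest when the base catalysis probability is small, mirroring the trend reported in the abstract.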
16 pages, 3343 KiB  
Article
Transient GI/MSP/1/N Queue
by Andrzej Chydzinski
Entropy 2024, 26(9), 807; https://doi.org/10.3390/e26090807 - 22 Sep 2024
Viewed by 475
Abstract
A non-zero correlation between service times can be encountered in many real queueing systems. An attractive model for correlated service times is the Markovian service process, because it offers powerful fitting capabilities combined with analytical tractability. In this paper, a transient study of the queue length in a model with MSP services and a general distribution of interarrival times is performed. In particular, two theorems are proven: one on the queue length distribution at a particular time t, where t can be arbitrarily small or large, and another on the mean queue length at t. In addition to the theorems, multiple numerical examples are provided. They illustrate the development over time of the mean queue length and the standard deviation, along with the complete distribution, depending on the service correlation strength, initial system conditions, and the interarrival time variance.
(This article belongs to the Section Information Theory, Probability and Statistics)
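The flavor of a transient queue-length analysis can be conveyed on a drastically simplified special case: with Poisson arrivals and a single uncorrelated service phase (M/M/1/N, a degenerate MSP), the distribution at time t follows from the matrix exponential of the generator; parameters below are arbitrary:

```python
import numpy as np
from scipy.linalg import expm

# Transient queue-length distribution for M/M/1/N -- a degenerate special
# case of the GI/MSP/1/N model (exponential interarrivals, one service phase).
lam, mu, N = 0.8, 1.0, 10
Q = np.zeros((N + 1, N + 1))
for n in range(N + 1):
    if n < N:
        Q[n, n + 1] = lam                 # arrival
    if n > 0:
        Q[n, n - 1] = mu                  # service completion
    Q[n, n] = -Q[n].sum()                 # diagonal makes rows sum to zero

p0 = np.zeros(N + 1); p0[0] = 1.0         # start from an empty system
for t in (0.5, 2.0, 10.0, 50.0):
    pt = p0 @ expm(Q * t)                 # p(t) = p(0) exp(Q t)
    print(f"t={t:5.1f}  mean queue length = {pt @ np.arange(N + 1):.3f}")
```

The paper's theorems cover the far more general setting, with non-exponential interarrival times and correlated service phases, where no such elementary generator construction applies.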
12 pages, 2110 KiB  
Article
Improving Transmission Line Fault Diagnosis Based on EEMD and Power Spectral Entropy
by Yuan-Bin Chen, Hui-Shan Cui, Chia-Wei Huang and Wei-Tai Hsu
Entropy 2024, 26(9), 806; https://doi.org/10.3390/e26090806 - 21 Sep 2024
Viewed by 702
Abstract
A fault diagnosis method for transmission lines based on the characteristics of the power spectral entropy is proposed in this article. Ensemble empirical mode decomposition (EEMD) is used to preprocess the experimental measurements, which are nonlinear and non-stationary fault signals, in order to overcome mode mixing. This study focuses on detecting the fault location on transmission lines during faults. The proposed method is applied to different fault types through simulation by collecting current and voltage signals at a distance from the fault point. An analysis and comprehensive evaluation of three-phase measured current and voltage signals at distinct fault locations is conducted. The form and position of the fault are distinguished directly and effectively, thereby significantly improving the efficiency and accuracy of transmission line fault diagnosis.
(This article belongs to the Special Issue Entropy Theory in Energy and Power Systems)
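As a rough illustration of the feature-extraction step, power spectral entropy can be computed from a Welch estimate of the power spectral density; the EEMD preprocessing stage is omitted here, and the signal is a synthetic stand-in rather than a measured line current:

```python
import numpy as np
from scipy.signal import welch

def power_spectral_entropy(x, fs):
    """Shannon entropy (bits) of the normalized power spectral density."""
    f, psd = welch(x, fs=fs, nperseg=256)
    p = psd / psd.sum()                   # treat the PSD as a distribution
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

# Synthetic signal: a 50 Hz fundamental plus a fault-like transient and noise.
fs = 2000
t = np.arange(0, 1, 1 / fs)
healthy = np.sin(2 * np.pi * 50 * t) + 0.05 * np.random.randn(t.size)
faulty = healthy + 0.8 * np.sin(2 * np.pi * 350 * t) * (t > 0.5)
print("healthy:", power_spectral_entropy(healthy, fs))
print("faulty :", power_spectral_entropy(faulty, fs))
```

A fault spreads energy across additional frequency components, so its spectral entropy rises, which is the property such a diagnosis method exploits.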
8 pages, 1685 KiB  
Article
Potential Benefits and Challenges of Quantifying Pseudoreplication in Genomic Data with Entropy Statistics
by Eric J. Ward and Robin S. Waples
Entropy 2024, 26(9), 805; https://doi.org/10.3390/e26090805 - 21 Sep 2024
Viewed by 552
Abstract
Generating vast arrays of genetic markers for evolutionary ecology studies has become routine and cost-effective. However, analyzing data from large numbers of loci associated with a small number of finite chromosomes introduces a challenge: loci on the same chromosome do not assort independently, leading to pseudoreplication. Previous studies have demonstrated that pseudoreplication can substantially reduce the precision of genetic analyses (and make confidence intervals wider), such as F_ST and linkage disequilibrium (LD) measures between pairs of loci. In LD analyses, another type of dependency (overlapping pairs of the same loci) also creates pseudoreplication. Building on previous work, we explore the potential of entropy metrics, particularly total correlation (TC), to assess pseudoreplication in LD studies. Our simulations, performed on a monoecious population with a range of effective population sizes (N_e) and numbers of loci, attempted to isolate the overlapping-pairs-of-loci effect by considering unlinked loci and using entropy to quantify inter-locus relationships. We hypothesized a positive correlation between TC and the number of loci (L), and a negative correlation between TC and N_e. Results from our statistical models predicting TC demonstrate a strong effect of the number of loci, and muted effects of N_e and other predictors, adding support to the use of entropy-based metrics as a tool for estimating the statistical information of complex genetic datasets. Our results also highlight a challenge regarding scalability; computational limitations arise as the number of loci grows, making our current approach limited to smaller datasets. Despite these challenges, this work further refines our understanding of entropy measures, and offers insights into the complex dynamics of genetic information in evolutionary ecology research.
(This article belongs to the Collection Do Entropic Approaches Improve Understanding of Biology?)
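A minimal sketch of the total correlation statistic on simulated genotypes (plug-in entropy estimates on synthetic data, not the authors' pipeline): TC is the sum of marginal entropies minus the joint entropy, and the joint term is exactly what becomes computationally limiting as the number of loci grows:

```python
import numpy as np
from collections import Counter

def entropy(labels):
    """Plug-in Shannon entropy (nats) of a sequence of discrete symbols."""
    counts = np.array(list(Counter(labels).values()), dtype=float)
    p = counts / counts.sum()
    return -np.sum(p * np.log(p))

def total_correlation(genotypes):
    """TC = sum of marginal entropies - joint entropy, for an (n x L) array."""
    marginals = sum(entropy(genotypes[:, j]) for j in range(genotypes.shape[1]))
    joint = entropy(map(tuple, genotypes))    # each row as one joint symbol
    return marginals - joint

# Simulated genotypes (0/1/2 allele copies) at L unlinked loci, n individuals.
rng = np.random.default_rng(0)
n, L = 200, 5
genotypes = rng.integers(0, 3, size=(n, L))
print(f"TC ~ {total_correlation(genotypes):.3f} nats")
```

Even for independent loci the plug-in joint-entropy estimate is biased upward at finite sample sizes, and the joint state space grows exponentially in L, which echoes the scalability limits the abstract reports.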
15 pages, 391 KiB  
Article
Robustness of Entanglement for Dicke-W and Greenberger-Horne-Zeilinger Mixed States
by Ling-Hui Zhu, Zhen Zhu, Guo-Lin Lv, Chong-Qiang Ye and Xiao-Yu Chen
Entropy 2024, 26(9), 804; https://doi.org/10.3390/e26090804 - 21 Sep 2024
Viewed by 600
Abstract
Quantum entanglement is a fundamental characteristic of quantum mechanics, and understanding the robustness of entanglement across different mixed states is crucial for comprehending the entanglement properties of general quantum states. In this paper, the robustness of entanglement of Dicke–W and Greenberger–Horne–Zeilinger (GHZ) mixed states under different mixing ratios is calculated using the entanglement witness method. The robustness of entanglement of Dicke–W and GHZ mixed states differs depending on whether the probability ratio of Dicke to W is greater than or less than 3/2. For a probability ratio of the Dicke and W states greater than or equal to 3/2, we study the robustness of entanglement of Dicke and GHZ mixed states and analyze and calculate their upper and lower bounds. For a probability ratio of the Dicke and W states less than 3/2, we take an equal probability ratio of the Dicke and W states as an example and calculate and analyze the upper and lower bounds of their robustness of entanglement in detail.
18 pages, 340 KiB  
Article
Some Results for Double Cyclic Codes over F_q + vF_q + v^2F_q
by Tenghui Deng and Jing Yang
Entropy 2024, 26(9), 803; https://doi.org/10.3390/e26090803 - 20 Sep 2024
Viewed by 470
Abstract
Let F_q be a finite field with an odd characteristic. In this paper, we present a new result about double cyclic codes over a finite non-chain ring. Specifically, we study the double cyclic code over F_q + vF_q + v^2F_q with v^3 = v, which is isomorphic to F_q × F_q × F_q. This study mainly involves generator polynomials and generator matrices. The generating polynomial of the dual code is also obtained. We show the relationship between the generating polynomials of the double cyclic codes and those of their dual codes. Finally, as an application of these results, we construct some optimal codes over F_3.
(This article belongs to the Section Information Theory, Probability and Statistics)
13 pages, 406 KiB  
Article
An Additively Optimal Interpreter for Approximating Kolmogorov Prefix Complexity
by Zoe Leyva-Acosta, Eduardo Acuña Yeomans and Francisco Hernandez-Quiroz
Entropy 2024, 26(9), 802; https://doi.org/10.3390/e26090802 - 20 Sep 2024
Cited by 1 | Viewed by 444
Abstract
We study practical approximations of Kolmogorov prefix complexity (K) using IMP2, a high-level programming language. Our focus is on investigating the optimality of the interpreter for this language as the reference machine for the Coding Theorem Method (CTM). This method is designed to address applications of algorithmic complexity that differ from the popular traditional lossless compression approach based on the principles of algorithmic probability. The chosen model of computation is proven to be suitable for this task, and a comparison to other models and methods is conducted. Our findings show that CTM approximations using our model do not always correlate with the results from lower-level models of computation. This suggests that some models may require a larger program space to converge to Levin's universal distribution. Furthermore, we compare the CTM with an upper bound on Kolmogorov complexity and find a strong correlation, supporting the CTM's validity as an approximation method with finer-grained resolution of K.
14 pages, 342 KiB  
Article
Assessing Variable Importance for Best Subset Selection
by Jacob Seedorff and Joseph E. Cavanaugh
Entropy 2024, 26(9), 801; https://doi.org/10.3390/e26090801 - 19 Sep 2024
Viewed by 427
Abstract
One of the primary issues that arises in statistical modeling pertains to the assessment of the relative importance of each variable in the model. A variety of techniques have been proposed to quantify variable importance for regression models. However, in the context of best subset selection, fewer satisfactory methods are available. With this motivation, we here develop a variable importance measure expressly for this setting. We investigate and illustrate the properties of this measure, introduce algorithms for the efficient computation of its values, and propose a procedure for calculating p-values based on its sampling distributions. We present multiple simulation studies to examine the properties of the proposed methods, along with an application to demonstrate their practical utility.
9 pages, 637 KiB  
Article
Golf Club Selection with AI-Based Game Planning
by Mehdi Khazaeli and Leili Javadpour
Entropy 2024, 26(9), 800; https://doi.org/10.3390/e26090800 - 19 Sep 2024
Viewed by 678
Abstract
In the dynamic realm of golf, where every swing can make the difference between victory and defeat, the strategic selection of golf clubs has become a crucial factor in determining the outcome of a game. Advancements in artificial intelligence have opened new avenues for enhancing the decision-making process, empowering golfers to achieve optimal performance on the course. In this paper, we introduce an AI-based game planning system that assists players in selecting the best club for a given scenario. The system considers factors such as distance, terrain, wind strength and direction, and quality of lie. A rule-based model provides the four best club options based on the player’s maximum shot data for each club. The player picks a club, shot, and target and a probabilistic classification model identifies whether the shot represents a birdie opportunity, par zone, bogey zone, or worse. The results of our model show that taking into account factors such as terrain and atmospheric features increases the likelihood of a better shot outcome.
(This article belongs to the Special Issue Learning from Games and Contests)
12 pages, 10278 KiB  
Article
Enhanced Magnetocaloric Properties of the (MnNi)0.6Si0.62(FeCo)0.4Ge0.38 High-Entropy Alloy Obtained by Co Substitution
by Zhigang Zheng, Pengyan Huang, Xinglin Chen, Hongyu Wang, Shan Da, Gang Wang, Zhaoguo Qiu and Dechang Zeng
Entropy 2024, 26(9), 799; https://doi.org/10.3390/e26090799 - 19 Sep 2024
Cited by 1 | Viewed by 657
Abstract
In order to improve the magnetocaloric properties of MnNiSi-based alloys, a new type of high-entropy magnetocaloric alloy was constructed. In this work, Mn0.6Ni1−xSi0.62Fe0.4CoxGe0.38 (x = 0.4, 0.45, and 0.5) alloys are found to exhibit magnetostructural first-order phase transitions from high-temperature Ni2In-type phases to low-temperature TiNiSi-type phases, so that the alloys can achieve giant magnetocaloric effects. We investigate why the hexagonal lattice-parameter ratio c_hex/a_hex gradually increases upon Co substitution, while the phase transition temperature (T_tr) and the isothermal magnetic entropy change (ΔS_M) tend to gradually decrease. In particular, the x = 0.4 alloy with remarkable magnetocaloric properties is obtained by tuning Co/Ni, which shows a giant entropy change of 48.5 J·kg^−1·K^−1 at 309 K for 5 T and an adiabatic temperature change (ΔT_ad) of 8.6 K at 306.5 K. Moreover, the x = 0.55 HEA shows great hardness and compressive strength, with values of 552 HV2 and 267 MPa, respectively, indicating that the mechanical properties undergo an effective enhancement. The large ΔS_M and ΔT_ad may enable the MnNiSi-based HEAs to become a potential commercialized magnetocaloric material.
(This article belongs to the Section Multidisciplinary Applications)
19 pages, 1097 KiB  
Article
Nonparametric Expectile Shortfall Regression for Complex Functional Structure
by Mohammed B. Alamari, Fatimah A. Almulhim, Zoulikha Kaid and Ali Laksaci
Entropy 2024, 26(9), 798; https://doi.org/10.3390/e26090798 - 18 Sep 2024
Viewed by 476
Abstract
This paper treats the problem of risk management through a new conditional expected shortfall function. The new risk metric is defined with the expectile as the shortfall threshold. A nonparametric estimator based on the Nadaraya–Watson approach is constructed. The asymptotic property of the constructed estimator is established using a functional time-series structure. We adopt some concentration inequalities to fit this complex structure and to precisely determine the convergence rate of the estimator. The easy implementation of the new risk metric is shown through real and simulated data. Specifically, we show the feasibility of the new model as a risk tool by examining its sensitivity to fluctuations in financial time-series data. Finally, a comparative study between the new shortfall and the standard one is conducted using real data.
(This article belongs to the Section Complexity)
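To sketch the expectile idea, a Nadaraya–Watson-type conditional expectile can be computed by fixed-point iteration on asymmetrically weighted kernel averages; here the functional covariate is reduced to a scalar for illustration, and the subsequent shortfall step (the tail average beyond the expectile threshold) is omitted:

```python
import numpy as np

def kernel_expectile(x0, X, Y, tau=0.95, h=0.5, iters=50):
    """Nadaraya-Watson-style conditional expectile at x0: fixed-point
    iteration on asymmetrically weighted kernel averages."""
    k = np.exp(-0.5 * ((X - x0) / h) ** 2)    # Gaussian kernel weights
    e = np.average(Y, weights=k)              # start from the kernel mean
    for _ in range(iters):
        w = np.where(Y > e, tau, 1 - tau)     # asymmetric expectile weights
        e = np.sum(k * w * Y) / np.sum(k * w)
    return e

# Simulated heteroscedastic data standing in for financial observations.
rng = np.random.default_rng(1)
X = rng.uniform(0, 1, 1000)
Y = 0.5 * X + rng.standard_normal(1000) * (0.2 + 0.3 * X)
print("expectile(0.95) at x=0.8:", kernel_expectile(0.8, X, Y))
```

Unlike a quantile, the expectile weights all observations (asymmetrically), which is what makes it sensitive to the magnitude of tail fluctuations, the property the paper exploits for its shortfall threshold.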
20 pages, 476 KiB  
Review
Forced Friends: Why the Free Energy Principle Is Not the New Hamilton’s Principle
by Bartosz Michał Radomski and Krzysztof Dołęga
Entropy 2024, 26(9), 797; https://doi.org/10.3390/e26090797 - 18 Sep 2024
Viewed by 823
Abstract
The claim that the free energy principle is somehow related to Hamilton’s principle in statistical mechanics is ubiquitous throughout the subject literature. However, the exact nature of this relationship remains unclear. According to some sources, the free energy principle is merely similar to Hamilton’s principle of stationary action; others claim that it is either analogous or equivalent to it, while yet another part of the literature espouses the claim that it is a version of Hamilton’s principle. In this article, we aim to clarify the nature of the relationship between the two principles by investigating the two most likely interpretations of the claims that can be found in the subject literature. According to the strong interpretation, the two principles are equivalent and apply to the same subset of physical phenomena; according to the weak interpretation, the two principles are merely analogous to each other by virtue of their similar formal structures. As we show, adopting the stronger reading would lead to a dilemma that is untenable for the proponents of the free energy principle, thus supporting the adoption of the weaker reading for the relationship between the two constructs.
21 pages, 3199 KiB  
Article
Developing an Early Warning System for Financial Networks: An Explainable Machine Learning Approach
by Daren Purnell, Jr., Amir Etemadi and John Kamp
Entropy 2024, 26(9), 796; https://doi.org/10.3390/e26090796 - 17 Sep 2024
Viewed by 1380
Abstract
Identifying the influential variables that provide early warning of financial network instability is challenging, in part due to the complexity of the system, uncertainty of a failure, and nonlinear, time-varying relationships between network participants. In this study, we introduce a novel methodology to select variables that, from a data-driven and statistical modeling perspective, represent these relationships and may indicate that the financial network is trending toward instability. We introduce a novel variable selection methodology that leverages Shapley values and modified Borda counts, in combination with statistical and machine learning methods, to create an explainable linear model to predict relationship value weights between network participants. We validate this new approach with data collected from the March 2023 Silicon Valley Bank Failure. The models produced using this novel method successfully identified the instability trend using only 14 input variables out of a possible 3160. The use of parsimonious linear models developed by this method has the potential to identify key financial stability indicators while also increasing the transparency of this complex system.
(This article belongs to the Section Multidisciplinary Applications)
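The aggregation idea can be sketched with a generic weighted Borda count over importance rankings produced by several methods; the paper's specific modification of the Borda count is not reproduced here, and the rankings below are hypothetical:

```python
import numpy as np

def borda_scores(rankings, weights=None):
    """Aggregate several importance rankings with a (weighted) Borda count.
    rankings: list of arrays, each listing variable indices from most to
    least important. Higher aggregate score = more important overall."""
    n = len(rankings[0])
    weights = weights if weights is not None else [1.0] * len(rankings)
    scores = np.zeros(n)
    for rank, w in zip(rankings, weights):
        for position, var in enumerate(rank):
            scores[var] += w * (n - 1 - position)   # Borda points by position
    return scores

# Hypothetical rankings of 6 candidate variables from three sources
# (e.g., Shapley values, mutual information, coefficient magnitude).
rankings = [np.array([2, 0, 5, 1, 3, 4]),
            np.array([2, 5, 0, 3, 1, 4]),
            np.array([0, 2, 5, 4, 1, 3])]
print("aggregate scores:", borda_scores(rankings))
```

Rank aggregation of this kind rewards variables that score consistently well across methods, which helps produce the small, stable variable set (14 of 3160) the abstract describes.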
17 pages, 1690 KiB  
Article
Robust Optimization Research of Cyber–Physical Power System Considering Wind Power Uncertainty and Coupled Relationship
by Jiuling Dong, Zilong Song, Yuanshuo Zheng, Jingtang Luo, Min Zhang, Xiaolong Yang and Hongbing Ma
Entropy 2024, 26(9), 795; https://doi.org/10.3390/e26090795 - 17 Sep 2024
Viewed by 663
Abstract
To mitigate the impact of wind power uncertainty and power–communication coupling on the robustness of a new power system, a bi-level mixed-integer robust optimization strategy is proposed. Firstly, a coupled network model is constructed based on complex network theory, taking into account the coupled relationship of energy supply and control dependencies between the power and communication networks. Next, a bi-level mixed-integer robust optimization model is developed to improve power system resilience, incorporating constraints related to the coupling strength, electrical characteristics, and traffic characteristics of the information network. The upper-level model seeks to minimize load shedding by optimizing DC power flow using fuzzy chance constraints, thereby reducing the risk of power imbalances caused by random fluctuations in wind power generation. Furthermore, the deterministic power balance constraints are relaxed into inequality constraints that account for wind power forecasting errors through fuzzy variables. The lower-level model focuses on minimizing traffic load shedding by establishing a topology–function-constrained information network traffic model based on the maximum flow principle in graph theory, thereby improving the efficiency of network flow transmission. Finally, a modified IEEE 39-bus test system with intermittent wind power is used as a case study. Random attack simulations demonstrate that, under the highest link failure rate and wind power penetration, Model 2 outperforms Model 1 by reducing the load loss ratio by 23.6% and improving the node survival ratio by 5.3%.
(This article belongs to the Special Issue Robustness and Resilience of Complex Networks)
30 pages, 4353 KiB  
Review
Is Seeing Believing? A Practitioner’s Perspective on High-Dimensional Statistical Inference in Cancer Genomics Studies
by Kun Fan, Srijana Subedi, Gongshun Yang, Xi Lu, Jie Ren and Cen Wu
Entropy 2024, 26(9), 794; https://doi.org/10.3390/e26090794 - 16 Sep 2024
Viewed by 2329
Abstract
Variable selection methods have been extensively developed for and applied to cancer genomics data to identify important omics features associated with complex disease traits, including cancer outcomes. However, the reliability and reproducibility of the findings are in question if valid inferential procedures are not available to quantify the uncertainty of the findings. In this article, we provide a gentle but systematic review of high-dimensional frequentist and Bayesian inferential tools under sparse models which can yield uncertainty quantification measures, including confidence (or Bayesian credible) intervals, p values and false discovery rates (FDR). Connections in high-dimensional inferences between the two realms have been fully exploited under the “unpenalized loss function + penalty term” formulation for regularization methods and the “likelihood function × shrinkage prior” framework for regularized Bayesian analysis. In particular, we advocate for robust Bayesian variable selection in cancer genomics studies due to its ability to accommodate disease heterogeneity in the form of heavy-tailed errors and structured sparsity while providing valid statistical inference. The numerical results show that robust Bayesian analysis incorporating exact sparsity has yielded not only superior estimation and identification results but also valid Bayesian credible intervals under nominal coverage probabilities compared with alternative methods, especially in the presence of heavy-tailed model errors and outliers.
(This article belongs to the Special Issue Bayesian Learning and Its Applications in Genomics)
16 pages, 755 KiB  
Article
An MLWE-Based Cut-and-Choose Oblivious Transfer Protocol
by Yongli Tang, Menghao Guo, Yachao Huo, Zongqu Zhao, Jinxia Yu and Baodong Qin
Entropy 2024, 26(9), 793; https://doi.org/10.3390/e26090793 - 16 Sep 2024
Viewed by 744
Abstract
The existing lattice-based cut-and-choose oblivious transfer protocol is constructed based on the learning-with-errors (LWE) problem and is generally inefficient. An efficient cut-and-choose oblivious transfer protocol is proposed based on the hard module-learning-with-errors (MLWE) problem. Compression and decompression techniques are introduced into the LWE-based dual-mode encryption system to improve it into an MLWE-based dual-mode encryption framework, which is applied to the protocol as an intermediate scheme. Subsequently, the security and efficiency of the protocol are analysed; the security of the protocol can be reduced to the shortest independent vector problem (SIVP) on lattices, which is resistant to quantum attacks. Since the whole protocol relies on operations over a polynomial ring, the efficiency of polynomial modular multiplication can be improved by using the fast Fourier transform (FFT). Finally, this paper compares the protocol with an LWE-based protocol in terms of computational and communication complexities. The analysis results show that the protocol reduces the computation and communication overheads by at least a factor of n while maintaining the optimal number of communication rounds under malicious adversary attacks.
(This article belongs to the Special Issue Information-Theoretic Cryptography and Security)
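The ring arithmetic such MLWE-based schemes rely on can be illustrated with schoolbook multiplication in Z_q[x]/(x^n + 1); a production implementation would use the NTT/FFT speedup the abstract mentions, and the parameters below are toy values, not the protocol's:

```python
import numpy as np

def polymul_negacyclic(a, b, q):
    """Schoolbook multiplication in Z_q[x]/(x^n + 1), the polynomial ring
    underlying MLWE. An NTT/FFT reduces this from O(n^2) to O(n log n)."""
    n = len(a)
    c = np.zeros(n, dtype=np.int64)
    for i in range(n):
        for j in range(n):
            k = i + j
            if k < n:
                c[k] += a[i] * b[j]
            else:
                c[k - n] -= a[i] * b[j]   # x^n = -1 wraps with a sign flip
    return c % q

# Toy parameters only -- practical MLWE schemes use e.g. n = 256, q = 3329.
rng = np.random.default_rng(0)
n, q = 8, 97
a = rng.integers(0, q, n)
b = rng.integers(0, q, n)
print(polymul_negacyclic(a, b, q))
```

Working over this ring is what distinguishes MLWE from plain LWE: one ring element packs n scalars, which is the source of the factor-of-n savings in computation and communication reported above.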
14 pages, 634 KiB  
Article
Debiasing the Conversion Rate Prediction Model in the Presence of Delayed Implicit Feedback
by Taojun Hu and Xiao-Hua Zhou
Entropy 2024, 26(9), 792; https://doi.org/10.3390/e26090792 - 15 Sep 2024
Viewed by 862
Abstract
The recommender system (RS) has been widely adopted in many applications, including online advertisements. Predicting the conversion rate (CVR) can help in evaluating the effects of advertisements on users and capturing users’ features, playing an important role in RS. In real-world scenarios, implicit rather than explicit feedback data are more abundant. Thus, directly training the RS with collected data may lead to suboptimal performance due to selection bias inherited from the nature of implicit feedback. Methods such as reweighting have been proposed to tackle selection bias; however, these methods omit delayed feedback, which often occurs due to limited observation times. We propose a novel likelihood approach combining the assumed parametric model for delayed feedback and the reweighting method to address selection bias. Specifically, the proposed methods minimize the likelihood-based loss using the multi-task learning method. The proposed methods are evaluated on the real-world Coat and Yahoo datasets. The proposed methods improve the AUC by 5.7% on Coat and 3.7% on Yahoo compared with the best baseline models. The proposed methods successfully debias the CVR prediction model in the presence of delayed implicit feedback.
(This article belongs to the Special Issue Causal Inference in Recommender Systems)
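A sketch of a likelihood-based loss that combines a parametric delay model with inverse-propensity reweighting, assuming an exponential conversion-delay model (a common parametric choice; the paper's exact parametric model and weighting scheme may differ):

```python
import numpy as np

def delayed_feedback_nll(p, lam, converted, delay, elapsed, ips_w):
    """Negative log-likelihood under an exponential conversion-delay model,
    reweighted by inverse propensity scores.
    p: P(conversion | click); lam: delay rate; elapsed: observation window."""
    nll_pos = -(np.log(p) + np.log(lam) - lam * delay)       # converted after delay d
    nll_neg = -np.log(1 - p + p * np.exp(-lam * elapsed))    # not (yet) converted
    nll = np.where(converted, nll_pos, nll_neg)
    return np.mean(ips_w * nll)

# Tiny synthetic batch: two converted clicks, two still unconverted.
converted = np.array([1, 0, 1, 0], dtype=bool)
delay = np.array([2.0, 0.0, 5.0, 0.0])       # days until conversion (if any)
elapsed = np.array([10.0, 3.0, 10.0, 1.0])   # days observed so far
ips_w = np.array([1.2, 0.8, 1.0, 1.1])       # inverse-propensity weights
print(delayed_feedback_nll(p=0.3, lam=0.25, converted=converted,
                           delay=delay, elapsed=elapsed, ips_w=ips_w))
```

The key term is the "not yet converted" likelihood: an unconverted impression may still convert later, so its contribution mixes "never converts" with "converts after the window", while the propensity weights correct for the selection bias in which impressions were shown at all.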
13 pages, 1765 KiB  
Article
Exergy Flow as a Unifying Physical Quantity in Applying Dissipative Lagrangian Fluid Mechanics to Integrated Energy Systems
by Ke Xu, Yan Qi, Changlong Sun, Dengxin Ai, Jiaojiao Wang, Wenxue He, Fan Yang and Hechen Ren
Entropy 2024, 26(9), 791; https://doi.org/10.3390/e26090791 - 14 Sep 2024
Viewed by 670
Abstract
Highly integrated energy systems are on the rise due to increasing global demand. To capture the underlying physics of such interdisciplinary systems, we need a modern framework that unifies all forms of energy. Here, we apply modified Lagrangian mechanics to the description of multi-energy systems. Based on the minimum entropy production principle, we revisit fluid mechanics in the presence of both mechanical and thermal dissipations and propose using exergy flow as the unifying Lagrangian across different forms of energy. We illustrate our theoretical framework by modeling a one-dimensional system with coupled electricity and heat. We map the exergy loss rate in real space and obtain the total exergy changes. Under steady-state conditions, our theory agrees with the traditional formula but incorporates more physical considerations such as viscous dissipation. The integral form of our theory also allows us to go beyond steady-state calculations and visualize the local, time-dependent exergy flow density everywhere in the system. Expandable to a wide range of applications, our theoretical framework provides the basis for developing versatile models in integrated energy systems.
(This article belongs to the Section Multidisciplinary Applications)
21 pages, 2082 KiB  
Review
The Many Roles of Precision in Action
by Jakub Limanowski, Rick A. Adams, James Kilner and Thomas Parr
Entropy 2024, 26(9), 790; https://doi.org/10.3390/e26090790 - 14 Sep 2024
Viewed by 1166
Abstract
Active inference describes (Bayes-optimal) behaviour as being motivated by the minimisation of surprise of one’s sensory observations, through the optimisation of a generative model (of the hidden causes of one’s sensory data) in the brain. One of active inference’s key appeals is its conceptualisation of precision as biasing neuronal communication and, thus, inference within generative models. The importance of precision in perceptual inference is evident—many studies have demonstrated the importance of ensuring precision estimates are correct for normal (healthy) sensation and perception. Here, we highlight the many roles precision plays in action, i.e., the key processes that rely on adequate estimates of precision, from decision making and action planning to the initiation and control of muscle movement itself. Thereby, we focus on the recent development of hierarchical, “mixed” models—generative models spanning multiple levels of discrete and continuous inference. These kinds of models open up new perspectives on the unified description of hierarchical computation, and its implementation, in action. Here, we highlight how these models reflect the many roles of precision in action—from planning to execution—and the associated pathologies if precision estimation goes wrong. We also discuss the potential biological implementation of the associated message passing, focusing on the role of neuromodulatory systems in mediating different kinds of precision.
10 pages, 1762 KiB  
Article
Evanescent Electron Wave-Spin
by Ju Gao and Fang Shen
Entropy 2024, 26(9), 789; https://doi.org/10.3390/e26090789 - 14 Sep 2024
Viewed by 462
Abstract
This study demonstrates the existence of an evanescent electron wave outside both finite and infinite quantum wells by solving the Dirac equation and ensuring the continuity of the spinor wavefunction at the boundaries. We show that this evanescent wave shares the spin characteristics of the wave confined within the well, as indicated by analytical expressions for the current density across all regions. Our findings suggest that the electron cannot be confined to a mathematical singularity and that quantum information, or quantum entropy, can leak through any quantum confinement. These results emphasize that the electron wave, fully characterized by Lorentz-invariant charge and current densities, should be considered the true and sole entity of the electron.
(This article belongs to the Section Quantum Information)
17 pages, 388 KiB  
Article
On the Analysis of Wealth Distribution in the Context of Infectious Diseases
by Tingting Zhang, Shaoyong Lai and Minfang Zhao
Entropy 2024, 26(9), 788; https://doi.org/10.3390/e26090788 - 14 Sep 2024
Viewed by 513
Abstract
A mathematical model is established to investigate the economic effects of infectious diseases. The distribution of wealth among two types of agents in the context of an epidemic is discussed. Using methods from statistical mechanics, the evolution of the entropy weak solutions of the model, which involves wealth density functions for the susceptible and the infectious, is analyzed. We assume that as time tends to infinity, the wealth density function of the infectious is linearly related to that of the susceptible. Our results indicate that the spread of disease significantly affects the wealth distribution. As time tends to infinity, the total wealth density function behaves as an inverse gamma distribution. Through numerical experiments, the distribution of wealth under the epidemic and the extent of wealth inequality among agents are discussed.
(This article belongs to the Section Multidisciplinary Applications)
21 pages, 4992 KiB  
Article
Enhancing Security of Telemedicine Data: A Multi-Scroll Chaotic System for ECG Signal Encryption and RF Transmission
by José Ricardo Cárdenas-Valdez, Ramón Ramírez-Villalobos, Catherine Ramirez-Ubieta and Everardo Inzunza-Gonzalez
Entropy 2024, 26(9), 787; https://doi.org/10.3390/e26090787 - 14 Sep 2024
Viewed by 938
Abstract
Protecting sensitive patient data, such as electrocardiogram (ECG) signals, during RF wireless transmission is essential due to the increasing demand for secure telemedicine communications. This paper presents an innovative chaotic-based encryption system designed to enhance the security and integrity of telemedicine data transmission. The proposed system utilizes a multi-scroll chaotic system for ECG signal encryption based on master–slave synchronization. The ECG signal is encrypted by a master system and securely transmitted to a remote location, where it is decrypted by a slave system using an extended state observer. Synchronization between the master and slave is achieved through the Lyapunov criteria, which ensures system stability. The system also supports Orthogonal Frequency Division Multiplexing (OFDM) and adaptive n-quadrature amplitude modulation (n-QAM) schemes to optimize signal discretization. Experimental validations with a custom transceiver scheme confirmed the system’s effectiveness in preventing channel overlap during 2.5 GHz transmissions. Additionally, a commercial RF Power Amplifier (RF-PA) for LTE applications and a development board were integrated to monitor transmission quality. The proposed encryption system ensures robust and efficient RF transmission of ECG data, addressing critical challenges in the wireless communication of sensitive medical information. This approach demonstrates the potential for broader applications in modern telemedicine environments, providing a reliable and efficient solution for the secure transmission of healthcare data.
19 pages, 353 KiB  
Article
Relative Belief Inferences from Decision Theory
by Michael Evans and Gun Ho Jang
Entropy 2024, 26(9), 786; https://doi.org/10.3390/e26090786 - 14 Sep 2024
Viewed by 441
Abstract
Relative belief inferences are shown to arise as Bayes rules or limiting Bayes rules. These inferences are invariant under reparameterizations and possess a number of optimal properties. In particular, relative belief inferences are based on a direct measure of statistical evidence.
(This article belongs to the Special Issue Bayesianism)
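The relative belief ratio, the direct measure of statistical evidence such inferences are based on, is the posterior-to-prior ratio; a minimal discrete illustration:

```python
import numpy as np
from scipy.stats import binom

# Relative belief ratio RB(theta) = posterior(theta) / prior(theta):
# values above 1 are evidence in favor of theta, below 1 evidence against.
theta = np.array([0.2, 0.5, 0.8])            # discrete parameter values
prior = np.array([1 / 3, 1 / 3, 1 / 3])      # uniform prior
like = binom.pmf(7, 10, theta)               # 7 successes in 10 trials
posterior = prior * like / np.sum(prior * like)
rb = posterior / prior
for th, r in zip(theta, rb):
    verdict = "evidence for" if r > 1 else "evidence against"
    print(f"theta={th}: RB={r:.2f}  ({verdict})")
```

Because prior and posterior transform identically under a reparameterization, their ratio is invariant, which is the invariance property the abstract highlights.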
13 pages, 415 KiB  
Article
Sampled-Data Exponential Synchronization of Complex Dynamical Networks with Saturating Actuators
by Runan Guo and Wenshun Lv
Entropy 2024, 26(9), 785; https://doi.org/10.3390/e26090785 - 14 Sep 2024
Viewed by 503
Abstract
This paper investigates the problem of exponential synchronization control for complex dynamical networks (CDNs) with input saturation. Considering the effects of transmission delay, a memory sampled-data controller is designed. A modified two-sided looped functional is constructed that takes into account the entire sampling period, including both current state information and delayed state information. This functional only needs to be positive definite at the sampling instants. Sufficient criteria and the controller design method are provided to ensure the exponential synchronization of CDNs with input saturation under the influence of transmission delay, as well as an estimation of the basin of attraction. Additionally, an optimization algorithm for enlarging the region of attraction is proposed. Finally, a numerical example is presented to verify the effectiveness of the conclusions.
(This article belongs to the Section Complexity)
15 pages, 2307 KiB  
Article
Information-Theoretic Modeling of Categorical Spatiotemporal GIS Data
by David Percy and Martin Zwick
Entropy 2024, 26(9), 784; https://doi.org/10.3390/e26090784 - 13 Sep 2024
Viewed by 499
Abstract
An information-theoretic data mining method is employed to analyze categorical spatiotemporal Geographic Information System land use data. Reconstructability Analysis (RA) is a maximum-entropy-based data modeling methodology that works exclusively with discrete data such as those in the National Land Cover Database (NLCD). The NLCD is organized into a spatial (raster) grid and data are available in a consistent format for every five years from 2001 to 2021. An NLCD tool reports how much change occurred for each category of land use; for the study area examined, the most dynamic class is Evergreen Forest (EFO), so the presence or absence of EFO in 2021 was chosen as the dependent variable that our data modeling attempts to predict. RA predicts the outcome with approximately 80% accuracy using a sparse set of cells from a spacetime data cube consisting of neighboring lagged-time cells. When the predicting cells are all Shrubs and Grasses, there is a high probability for a 2021 state of EFO, while when the predicting cells are all EFO, there is a high probability that the 2021 state will not be EFO. These findings are interpreted as detecting forest clear-cut cycles that show up in the data and explain why this class is so dynamic. This study introduces a new approach to analyzing GIS categorical data and expands the range of applications that this entropy-based methodology can successfully model.
(This article belongs to the Section Multidisciplinary Applications)
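Very loosely, the kind of relation such a model extracts can be pictured as conditional probabilities of the 2021 class given a lagged neighboring cell; the snippet below uses synthetic stand-in data and a plain contingency table, not Reconstructability Analysis itself:

```python
import numpy as np
import pandas as pd

# Synthetic stand-in for NLCD categories with a built-in clear-cut cycle:
# shrubs/grasses tend to become forest, and forest tends to be cut.
rng = np.random.default_rng(0)
lagged = rng.choice(["EFO", "ShrubGrass", "Other"], size=5000,
                    p=[0.4, 0.3, 0.3])
p_efo = np.select([lagged == "ShrubGrass", lagged == "EFO"],
                  [0.8, 0.3], 0.4)
efo_2021 = rng.random(5000) < p_efo

# P(EFO in 2021 | lagged neighbor class), row-normalized.
print(pd.crosstab(lagged, efo_2021, normalize="index"))
```

RA goes well beyond a single two-way table, searching over many-variable maximum-entropy decompositions of the full spacetime cube, but the asymmetry visible in this table (ShrubGrass mostly becomes EFO, EFO mostly does not stay EFO) is the clear-cut signature the abstract describes.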
12 pages, 917 KiB  
Article
Time Sequence Deep Learning Model for Ubiquitous Tabular Data with Unique 3D Tensors Manipulation
by Adaleta Gicic, Dženana Đonko and Abdulhamit Subasi
Entropy 2024, 26(9), 783; https://doi.org/10.3390/e26090783 - 12 Sep 2024
Viewed by 652
Abstract
Although deep learning (DL) algorithms have proved effective in diverse research domains, their application to models for tabular data remains limited. Models trained on tabular data often achieve higher efficacy with traditional machine learning methods than with DL models, which is largely attributed to the size and structure of tabular datasets and the specific application contexts in which they are utilized. Thus, the primary objective of this paper is to propose a method that exploits the strength of Stacked Bidirectional LSTM (Long Short-Term Memory) deep learning algorithms in pattern discovery, incorporating tabular data with customized 3D tensor modeling when feeding neural networks. Our findings are empirically validated using six diverse, publicly available datasets, each varying in size and learning objectives. This paper shows that the proposed model, based on time-sequence DL algorithms generally described as inadequate for tabular data, yields satisfactory results and competes effectively with algorithms specifically designed for tabular data. An additional benefit of this approach is that it preserves simplicity while ensuring fast model training, even with large datasets. Even with extremely small datasets, the models achieve exceptional predictive results and fully utilize their capacity.
(This article belongs to the Section Multidisciplinary Applications)
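A sketch of the 3D tensor manipulation, assuming rows of a tabular dataset are reshaped into pseudo-timesteps and fed to a stacked bidirectional LSTM (PyTorch is used for illustration; the layer sizes and split into steps are arbitrary, not the authors' configuration):

```python
import torch
import torch.nn as nn

class TabularBiLSTM(nn.Module):
    """Stacked bidirectional LSTM over tabular rows reshaped into a 3D
    tensor (batch, pseudo-timesteps, features-per-step)."""
    def __init__(self, n_features, steps, hidden=64, n_classes=2):
        super().__init__()
        assert n_features % steps == 0, "features must split evenly into steps"
        self.steps = steps
        self.lstm = nn.LSTM(n_features // steps, hidden, num_layers=2,
                            bidirectional=True, batch_first=True)
        self.head = nn.Linear(2 * hidden, n_classes)

    def forward(self, x):                       # x: (batch, n_features)
        x = x.view(x.size(0), self.steps, -1)   # -> (batch, steps, feat/step)
        out, _ = self.lstm(x)                   # out: (batch, steps, 2*hidden)
        return self.head(out[:, -1])            # read out the last pseudo-step

model = TabularBiLSTM(n_features=32, steps=4)
logits = model(torch.randn(8, 32))              # a batch of 8 tabular rows
print(logits.shape)                             # torch.Size([8, 2])
```

The design choice is that groups of columns are treated as an artificial sequence, letting the recurrent layers model interactions across groups in both directions even though no real time axis exists.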
17 pages, 3321 KiB  
Article
Sensitivity Analysis of Excited-State Population in Plasma Based on Relative Entropy
by Yosuke Shimada and Hiroshi Akatsuka
Entropy 2024, 26(9), 782; https://doi.org/10.3390/e26090782 - 12 Sep 2024
Viewed by 938
Abstract
A highly versatile evaluation method is proposed for transient plasmas based on statistical physics. It would be beneficial in various industrial sectors, including semiconductors and automobiles. Our research focused on low-energy plasmas in laboratory settings, and they were assessed via our proposed method, which incorporates relative entropy and fractional Brownian motion, based on a revised collisional–radiative model. By introducing an indicator to evaluate how far a system is from its steady state, both the trend of entropy and the radiative process’ contribution to the lifetime of excited states were considered. The high statistical weight of some excited states may act as a bottleneck in the plasma’s energy relaxation throughout the system to a steady state. By deepening our understanding of how energy flows through plasmas, we anticipate potential contributions to resolving global environmental issues and fostering technological innovation in plasma-related industrial fields.
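The relative entropy underlying such a distance-from-steady-state indicator can be illustrated directly: the Kullback–Leibler divergence of a transient excited-state population distribution from its steady state shrinks to zero as the plasma relaxes (the populations below are illustrative, not measured data):

```python
import numpy as np

def relative_entropy(p, q):
    """Kullback-Leibler divergence D(p || q), in nats: how far the current
    excited-state population distribution p is from the steady state q."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    mask = p > 0
    return np.sum(p[mask] * np.log(p[mask] / q[mask]))

# Illustrative population fractions over four excited levels:
# q is the steady-state distribution, p a transient snapshot.
q = np.array([0.70, 0.20, 0.08, 0.02])
p = np.array([0.50, 0.30, 0.15, 0.05])
print(f"D(p||q) = {relative_entropy(p, q):.4f} nats")  # -> 0 as p relaxes to q
```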