Entropy, Volume 26, Issue 7 (July 2024) – 32 articles

  • Issues are regarded as officially published after their release is announced to the table of contents alert mailing list.
  • You may sign up for e-mail alerts to receive the table of contents of newly released issues.
  • PDF is the official format for papers published in both HTML and PDF forms. To view a paper in PDF format, click on the "PDF Full-text" link and open it with the free Adobe Reader.
20 pages, 2137 KiB  
Article
Bounding Quantum Correlations: The Role of the Shannon Information in the Information Causality Principle
by Natasha Oughton and Christopher G. Timpson
Entropy 2024, 26(7), 562; https://doi.org/10.3390/e26070562 (registering DOI) - 29 Jun 2024
Viewed by 87
Abstract
The Information Causality principle was proposed to re-derive the Tsirelson bound, an upper limit on the strength of quantum correlations, and has been suggested as a candidate law of nature. The principle states that the Shannon information about Alice’s distant database gained by Bob after receiving an m-bit message cannot exceed m bits, even when Alice and Bob share non-local resources. As originally formulated, it can be shown that the principle is violated exactly when the strength of the shared correlations exceeds the Tsirelson bound. However, we demonstrate here that when an alternative measure of information, one of the Rényi measures, is chosen, the Information Causality principle no longer arrives at the correct value for the Tsirelson bound. We argue that neither the assumption of particular ‘intuitive’ properties of uncertainty measures, nor pragmatic choices about how to optimise costs associated with communication, is sufficient to uniquely motivate the choice of the Shannon measure from amongst the more general Rényi measures. We conclude that the dependence of the success of Information Causality on mere convention undermines its claimed significance as a foundational principle. Full article
(This article belongs to the Special Issue Information-Theoretic Concepts in Physics)
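As background on the measures the abstract contrasts, here is a minimal sketch of the Rényi entropy family; the example distribution and orders are illustrative, and the order α → 1 recovers the Shannon measure:

```python
import math

def renyi_entropy(p, alpha):
    """Rényi entropy of order alpha, in bits; alpha -> 1 gives the Shannon entropy."""
    if abs(alpha - 1.0) < 1e-9:
        return -sum(x * math.log2(x) for x in p if x > 0)
    return math.log2(sum(x ** alpha for x in p)) / (1.0 - alpha)

p = [0.5, 0.25, 0.25]
shannon = renyi_entropy(p, 1.0)    # Shannon measure: 1.5 bits
collision = renyi_entropy(p, 2.0)  # Rényi-2 (collision) entropy: smaller for non-uniform p
```

For a uniform distribution all Rényi orders coincide, so any disagreement between the measures is a statement about non-uniform (e.g., correlated) statistics — the situation the paper examines.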
21 pages, 1947 KiB  
Article
IoT Privacy Risks Revealed
by Kai-Chih Chang, Haoran Niu, Brian Kim and Suzanne Barber
Entropy 2024, 26(7), 561; https://doi.org/10.3390/e26070561 (registering DOI) - 29 Jun 2024
Viewed by 111
Abstract
A user’s devices, such as their phone and computer, are constantly bombarded by IoT devices and associated applications seeking connection to them. These IoT devices may or may not seek explicit user consent, leaving users either completely unaware that the IoT device is collecting, using, and/or sharing their personal data, or only marginally informed if they consented to the connecting IoT device but did not read the associated privacy policies. Privacy policies are intended to inform users of what personally identifiable information (PII) will be collected about them and how that PII will be used and shared. This paper presents novel tools, and the underlying algorithms, employed by the Personal Privacy Assistant app (UTCID PPA) developed by the University of Texas at Austin Center for Identity to inform users of IoT devices seeking to connect to their devices and to notify them of the potential privacy risks posed by each IoT device. The assessment of these privacy risks must deal with the uncertainty associated with sharing the user’s personal data. If privacy risk (R) equals the consequences (C) of an incident (i.e., personal data exposure) multiplied by the probability (P) of those consequences occurring (C × P), then efforts to control risk must seek to reduce both the possible consequences of an incident and the uncertainty of the incident and its consequences occurring. This research classifies risk according to two parameters: the expected value of the incident’s consequences and the uncertainty (entropy) of those consequences. It calculates the entropy of the privacy incident consequences by evaluating (1) the data sharing policies governing the IoT resource and (2) the type of personal data exposed.
The data sharing policies of an IoT resource are scored by the UTCID PrivacyCheck, which uses machine learning to read and score the IoT resource’s privacy policies against metrics set forth by best practices and international regulations. The UTCID Identity Ecosystem uses empirical identity theft and fraud cases to assess the entropy of privacy incident consequences involving a specific type of personal data, such as name, address, Social Security number, fingerprint, or user location. By understanding the entropy of a privacy incident posed by a given IoT resource seeking to connect to a user’s device, the UTCID PPA offers actionable recommendations that enhance the user’s control over IoT connections, interactions, and their personal data, ultimately enabling user-centric privacy control. Full article
(This article belongs to the Special Issue Information Security and Privacy: From IoT to IoV II)
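The abstract's risk decomposition (R = C × P, plus the entropy of the consequence distribution) can be sketched in a few lines; the outcome costs and probabilities below are illustrative assumptions, not the UTCID scoring model:

```python
import math

# Hypothetical outcomes of one personal-data exposure: (consequence cost, probability)
outcomes = [(100.0, 0.05), (1000.0, 0.01), (0.0, 0.94)]

# Expected risk: R = sum over possible consequences of C x P
expected_risk = sum(c * p for c, p in outcomes)

# Uncertainty of the consequences, measured as Shannon entropy (bits)
entropy = -sum(p * math.log2(p) for _, p in outcomes if p > 0)
```

Two exposures with the same expected risk can then still differ in entropy, which is why the paper treats the two parameters separately.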
18 pages, 2884 KiB  
Article
(HTBNet) Arbitrary Shape Scene Text Detection with Binarization of Hyperbolic Tangent and Cross-Entropy
by Zhao Chen
Entropy 2024, 26(7), 560; https://doi.org/10.3390/e26070560 (registering DOI) - 29 Jun 2024
Viewed by 94
Abstract
Existing segmentation-based scene text detection methods mostly need complicated post-processing, and because this post-processing is separated from the training process, detection performance suffers. A previous method, DBNet, successfully simplified post-processing and integrated it into the segmentation network. However, training took as long as 1200 epochs, and the model lacked sensitivity to texts of various scales, causing some text instances to be missed. To address these two problems, we design a text detection Network with Binarization of Hyperbolic Tangent (HTBNet). First, we propose the Binarization of Hyperbolic Tangent (HTB); optimized jointly with it, the segmentation network converges faster, reducing the number of training epochs from 1200 to 600. Because features of different channels in the same-scale feature map focus on information from different regions of the image, we devise Multi-Scale Channel Attention (MSCA) to better represent the important features of all objects in the image. Meanwhile, since multi-scale objects in an image cannot be detected simultaneously, we propose a novel module named Fused Module with Channel and Spatial (FMCS), which fuses multi-scale feature maps across the channel and spatial dimensions. Finally, we adopt cross-entropy as the loss function, which measures the difference between predicted values and ground truths. The experimental results show that HTBNet achieves competitive performance and speed compared with lightweight models on Total-Text (F-measure: 86.0%, FPS: 30) and MSRA-TD500 (F-measure: 87.5%, FPS: 30). Full article
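The two loss ingredients named in the abstract can be illustrated with a toy sketch: a differentiable tanh-based binarization (the scale parameter k and threshold are illustrative assumptions, not the paper's HTB module) and a binary cross-entropy term:

```python
import math

def htb_binarize(prob, thresh, k=50.0):
    """Soft binarization: approaches a 0/1 step as k grows, but stays differentiable."""
    return 0.5 * (1.0 + math.tanh(k * (prob - thresh)))

def bce(pred, target, eps=1e-7):
    """Binary cross-entropy between a predicted probability and a 0/1 label."""
    pred = min(max(pred, eps), 1.0 - eps)
    return -(target * math.log(pred) + (1 - target) * math.log(1 - pred))

b = htb_binarize(0.8, 0.5)  # well above threshold, so close to 1
loss = bce(b, 1.0)          # small loss for a confident correct prediction
```

Because the binarization is smooth, gradients of the loss flow through it during training, which is what lets post-processing be absorbed into the segmentation network.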
16 pages, 1092 KiB  
Article
Fast Finite-Time Observer-Based Event-Triggered Consensus Control for Uncertain Nonlinear Multiagent Systems with Full-State Constraints
by Kewei Zhou and Xin Wang
Entropy 2024, 26(7), 559; https://doi.org/10.3390/e26070559 (registering DOI) - 29 Jun 2024
Viewed by 98
Abstract
This article studies a class of uncertain nonlinear multiagent systems (MASs) with state restrictions. Radial basis function neural networks (RBFNNs) are utilized to estimate the uncertainty of the system. A state observer and a disturbance observer are proposed to approximate the unknown states and disturbances. Moreover, a fast finite-time consensus control technique is suggested in order to accomplish fast finite-time stability without violating the full-state constraints. It is demonstrated that every signal remains stable and bounded, and an event-triggered controller is considered to save resources. Ultimately, a simulated example demonstrates the validity of the developed approach. Full article
(This article belongs to the Special Issue Nonlinear Dynamical Behaviors in Complex Systems)
16 pages, 828 KiB  
Article
Statistical Interdependence between Daily Precipitation and Extreme Daily Temperature in Regions of Mexico and Colombia
by Álvaro Zabaleta-Ortega, Teobaldis Mercado-Fernández, Israel Reyes-Ramírez, Fernando Angulo-Brown and Lev Guzmán-Vargas
Entropy 2024, 26(7), 558; https://doi.org/10.3390/e26070558 (registering DOI) - 29 Jun 2024
Viewed by 117
Abstract
We study the statistical interdependence between daily precipitation and daily extreme temperature for regions of Mexico (14 climatic stations, period 1960–2020) and Colombia (7 climatic stations, period 1973–2020) using linear (cross-correlation and coherence) and nonlinear (global phase synchronization index, mutual information, and cross-sample entropy) synchronization metrics. The information shared between these variables is relevant and exhibits changes when comparing regions with different climatic conditions. We show that precipitation and temperature records from La Mojana are characterized by high persistence, while data from Mexico City exhibit lower persistence (less memory). We find that the information exchange and the level of coupling between the precipitation and temperature are higher for the case of the La Mojana region (Colombia) compared to Mexico City (Mexico), revealing that regions where seasonal changes are almost null and with low temperature gradients (less local variability) tend to display higher synchrony compared to regions where seasonal changes are very pronounced. The interdependence characterization between precipitation and temperature represents a robust option to characterize and analyze the collective dynamics of the system, applicable in climate change studies, as well as in changes not easily identifiable in future scenarios. Full article
(This article belongs to the Section Multidisciplinary Applications)
19 pages, 2526 KiB  
Review
Episodic Visual Hallucinations, Inference and Free Energy
by Daniel Collerton, Ichiro Tsuda and Shigetoshi Nara
Entropy 2024, 26(7), 557; https://doi.org/10.3390/e26070557 (registering DOI) - 28 Jun 2024
Viewed by 189
Abstract
Understandings of how visual hallucinations appear have been highly influenced by generative approaches, in particular Friston’s Active Inference conceptualization. Their core proposition is that these phenomena occur when hallucinatory expectations outweigh actual sensory data. This imbalance occurs as the brain seeks to minimize informational free energy, a measure of the distance between predicted and actual sensory data in a stationary open system. We review this approach in the light of old and new information on the role of environmental factors in episodic hallucinations. In particular, we highlight the possible relationship of specific visual triggers to the onset and offset of some episodes. We use an analogy from phase transitions in physics to explore factors which might account for intermittent shifts between veridical and hallucinatory vision. In these triggered forms of hallucinations, we suggest that there is a transient disturbance in the normal one-to-one correspondence between a real object and its counterpart perception, such that the correspondence comes to hold between the real object and a hallucination. Generative models propose that a lack of information transfer from the environment to the brain is one of the key features of hallucinations. In contrast, we submit that specific information transfer is required at onset and offset in these cases. We propose that this transient one-to-one correspondence between environment and hallucination is mediated more by aberrant discriminative than by generative inference. Discriminative inference can be conceptualized as a process for maximizing shared information between the environment and perception within a self-organizing nonstationary system. We suggest that generative inference plays the greater role in established hallucinations and in the persistence of individual hallucinatory episodes.
We further explore whether thermodynamic free energy may be an additional factor in why hallucinations are temporary. Future empirical research could productively concentrate on three areas. Firstly, subjective perceptual changes and parallel variations in brain function during specific transitions between veridical and hallucinatory vision to inform models of how episodes occur. Secondly, systematic investigation of the links between environment and hallucination episodes to probe the role of information transfer in triggering transitions between veridical and hallucinatory vision. Finally, changes in hallucinatory episodes over time to elucidate the role of learning on phenomenology. These empirical data will allow the potential roles of different forms of inference in the stages of hallucinatory episodes to be elucidated. Full article
18 pages, 3851 KiB  
Article
Diffusion-Based Causal Representation Learning
by Amir Mohammad Karimi Mamaghan, Andrea Dittadi, Stefan Bauer, Karl Henrik Johansson and Francesco Quinzan
Entropy 2024, 26(7), 556; https://doi.org/10.3390/e26070556 (registering DOI) - 28 Jun 2024
Viewed by 187
Abstract
Causal reasoning can be considered a cornerstone of intelligent systems. Having access to an underlying causal graph comes with the promise of cause–effect estimation and the identification of efficient and safe interventions. However, learning causal representations remains a major challenge, due to the complexity of many real-world systems. Previous works on causal representation learning have mostly focused on Variational Auto-Encoders (VAEs). These methods only provide representations from a point estimate, and they are less effective at handling high dimensions. To overcome these problems, we propose a Diffusion-based Causal Representation Learning (DCRL) framework which uses diffusion-based representations for causal discovery in the latent space. DCRL provides access to both single-dimensional and infinite-dimensional latent codes, which encode different levels of information. In a first proof of principle, we investigate the use of DCRL for causal representation learning in a weakly supervised setting. We further demonstrate experimentally that this approach performs comparably well in identifying the latent causal structure and causal variables. Full article
(This article belongs to the Special Issue Deep Generative Modeling: Theory and Applications)
21 pages, 3067 KiB  
Article
Tail Risk Dynamics under Price-Limited Constraint: A Censored Autoregressive Conditional Fréchet Model
by Tao Xu, Lei Shu and Yu Chen
Entropy 2024, 26(7), 555; https://doi.org/10.3390/e26070555 (registering DOI) - 28 Jun 2024
Viewed by 117
Abstract
This paper proposes a novel censored autoregressive conditional Fréchet (CAcF) model with a flexible evolution scheme for the time-varying parameters, which allows deciphering tail risk dynamics constrained by price limits from the viewpoints of different risk preferences. The proposed model can well accommodate many important empirical characteristics of financial data, such as heavy-tailedness, volatility clustering, extreme event clustering, and price limits. We then investigate tail risk dynamics via the CAcF model in the price-limited stock markets, taking entropic value at risk (EVaR) as a risk measurement. Our findings suggest that tail risk will be seriously underestimated in price-limited stock markets when the censored property of limit prices is ignored. Additionally, the evidence from the Chinese Taiwan stock market shows that widening price limits would lead to a decrease in the incidence of extreme events (hitting limit-down) but a significant increase in tail risk. Moreover, we find that investors with different risk preferences may make opposing decisions about an extreme event. In summary, the empirical results reveal the effectiveness of our model in interpreting and predicting time-varying tail behaviors in price-limited stock markets, providing a new tool for financial risk management. Full article
12 pages, 632 KiB  
Article
The Statistics of q-Statistics
by Deniz Eroglu, Bruce M. Boghosian, Ernesto P. Borges and Ugur Tirnakli
Entropy 2024, 26(7), 554; https://doi.org/10.3390/e26070554 (registering DOI) - 28 Jun 2024
Viewed by 161
Abstract
Almost two decades ago, Ernesto P. Borges and Bruce M. Boghosian embarked on the intricate task of composing a manuscript to honor the profound contributions of Constantino Tsallis to the realm of statistical physics, coupled with a concise exploration of q-Statistics. Fast-forward to Constantino Tsallis’ illustrious 80th birthday celebration in 2023, where Deniz Eroglu and Ugur Tirnakli delved into Constantino’s collaborative network, injecting renewed vitality into the project. With hearts brimming with appreciation for Tsallis’ enduring inspiration, Eroglu, Boghosian, Borges, and Tirnakli proudly present this meticulously crafted manuscript as a token of their gratitude. Full article
25 pages, 1299 KiB  
Article
On Entropic Learning from Noisy Time Series in the Small Data Regime
by Davide Bassetti, Lukáš Pospíšil and Illia Horenko
Entropy 2024, 26(7), 553; https://doi.org/10.3390/e26070553 (registering DOI) - 28 Jun 2024
Viewed by 139
Abstract
In this work, we present a novel methodology for performing the supervised classification of time-ordered noisy data; we call this methodology Entropic Sparse Probabilistic Approximation with Markov regularization (eSPA-Markov). It is an extension of entropic learning methodologies, allowing the simultaneous learning of segmentation patterns, entropy-optimal feature space discretizations, and Bayesian classification rules. We prove the conditions for the existence and uniqueness of the learning problem solution and propose a one-shot numerical learning algorithm that—in the leading order—scales linearly in dimension. We show how this technique can be used for the computationally scalable identification of persistent (metastable) regime affiliations and regime switches from high-dimensional non-stationary and noisy time series, i.e., when the size of the data statistics is small compared to their dimensionality and when the noise variance is larger than the variance in the signal. We demonstrate its performance on a set of toy learning problems, comparing eSPA-Markov to state-of-the-art techniques, including deep learning and random forests. We show how this technique can be used for the analysis of noisy time series from DNA and RNA Nanopore sequencing. Full article
18 pages, 10185 KiB  
Article
Rise and Fall of Anderson Localization by Lattice Vibrations: A Time-Dependent Machine Learning Approach
by Yoel Zimmermann, Joonas Keski-Rahkonen, Anton M. Graf and Eric J. Heller
Entropy 2024, 26(7), 552; https://doi.org/10.3390/e26070552 - 28 Jun 2024
Viewed by 162
Abstract
The intricate relationship between electrons and the crystal lattice is a linchpin in condensed matter, traditionally described by the Fröhlich model encompassing the lowest-order lattice-electron coupling. Recently developed quantum acoustics, emphasizing the wave nature of lattice vibrations, has enabled the exploration of previously uncharted territories of electron–lattice interaction not accessible with conventional tools such as perturbation theory. In this context, our agenda here is two-fold. First, we showcase the application of machine learning methods to categorize various interaction regimes within the subtle interplay of electrons and the dynamical lattice landscape. Second, we shed light on a nebulous region of electron dynamics identified by the machine learning approach and then attribute it to transient localization, where strong lattice vibrations result in a momentary Anderson prison for electronic wavepackets, which are later released by the evolution of the lattice. Overall, our research illuminates the spectrum of dynamics within the Fröhlich model, such as transient localization, which has been suggested as a pivotal factor contributing to the mysteries surrounding strange metals. Furthermore, this paves the way for utilizing time-dependent perspectives in machine learning techniques for designing materials with tailored electron–lattice properties. Full article
(This article belongs to the Special Issue Recent Advances in the Theory of Disordered Systems)
24 pages, 1895 KiB  
Article
Partial Discharge Fault Diagnosis in Power Transformers Based on SGMD Approximate Entropy and Optimized BILSTM
by Haikun Shang, Zixuan Zhao, Jiawen Li and Zhiming Wang
Entropy 2024, 26(7), 551; https://doi.org/10.3390/e26070551 - 27 Jun 2024
Viewed by 166
Abstract
Partial discharge (PD) fault diagnosis is of great importance for ensuring the safe and stable operation of power transformers. To address the low accuracy of traditional PD fault diagnostic methods, this paper proposes a novel method for power transformer PD fault diagnosis. It incorporates the approximate entropy (ApEn) of symplectic geometry mode decomposition (SGMD) into an optimized bidirectional long short-term memory (BILSTM) neural network. The method extracts dominant PD features employing SGMD and ApEn, and improves diagnostic accuracy by optimizing the BILSTM with the golden jackal optimization (GJO) algorithm. Simulation studies evaluate the performance of FFT, EMD, VMD, and SGMD. The results show that SGMD–ApEn outperforms the other methods in extracting dominant PD features. Experimental comparisons with traditional methods verify the effectiveness and superiority of the proposed method, which improves PD fault recognition accuracy, provides a diagnostic rate of 98.6%, and has lower noise sensitivity. Full article
(This article belongs to the Special Issue Information Theory and Nonlinear Signal Processing)
20 pages, 1059 KiB  
Article
Avionics Module Fault Diagnosis Algorithm Based on Hybrid Attention Adaptive Multi-Scale Temporal Convolution Network
by Qiliang Du, Mingde Sheng, Lubin Yu, Zhenwei Zhou, Lianfang Tian and Shilie He
Entropy 2024, 26(7), 550; https://doi.org/10.3390/e26070550 - 27 Jun 2024
Viewed by 130
Abstract
Since the reliability of the avionics module is crucial for aircraft safety, the fault diagnosis and health management of this module are particularly significant. While deep learning-based prognostics and health management (PHM) methods achieve highly accurate fault diagnosis, they have disadvantages such as inefficient data feature extraction and insufficient generalization capability, compounded by a lack of avionics module fault data. Consequently, this study first employs fault injection to simulate various fault types of the avionics module and performs data enhancement to construct the P2020 communications processor fault dataset. Subsequently, a multichannel fault diagnosis method, the Hybrid Attention Adaptive Multi-scale Temporal Convolution Network (HAAMTCN), is proposed for the integrated functional circuit module of the avionics module; it adaptively constructs the optimal convolutional kernel size to efficiently extract features of avionics module fault signals with large information entropy. Further, the combined use of the Interaction Channel Attention (ICA) module and the Hierarchical Block Temporal Attention (HBTA) module allows the HAAMTCN to pay more attention to the critical information in the channel and time-step dimensions. The experimental results show that the HAAMTCN achieves an accuracy of 99.64% in the avionics module fault classification task, outperforming existing methods. Full article
(This article belongs to the Special Issue Methods in Artificial Intelligence and Information Processing II)
18 pages, 1170 KiB  
Article
Early Warning of Systemic Risk in Commodity Markets Based on Transfer Entropy Networks: Evidence from China
by Yiran Zhao, Xiangyun Gao, Hongyu Wei, Xiaotian Sun and Sufang An
Entropy 2024, 26(7), 549; https://doi.org/10.3390/e26070549 - 27 Jun 2024
Viewed by 141
Abstract
This study employs a causal network model based on transfer entropy for the early warning of systemic risk in commodity markets. We analyzed the dynamic causal relationships of prices for 25 commodities related to China (including futures and spot prices of energy, industrial metals, precious metals, and agricultural products), validating the effect of the causal network structure among commodity markets on systemic risk. Our results identify the commodities and categories playing significant roles, revealing that the industrial and precious metal markets possess stronger market information transmission capabilities, with their price fluctuations affecting other commodity markets over a broader range and with greater force. Under different types of crisis events, such as economic crises and the Russia–Ukraine conflict, the causal network structure among commodity markets exhibited distinct characteristics. The effect of external shocks to the causal network structure of commodity markets on the entropy of systemic risk suggests that network structure indicators can provide early warning of systemic risk. This article can assist investors and policymakers in managing systemic risk to avoid unexpected losses. Full article
(This article belongs to the Special Issue Entropy-Based Applications in Economics, Finance, and Management II)
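As background on the quantity underlying such causal networks, here is a toy plug-in estimator of discrete transfer entropy with lag-1 histories; the synthetic bit series is illustrative, not the paper's commodity data:

```python
import math
import random
from collections import Counter

def transfer_entropy(x, y):
    """Plug-in transfer entropy TE_{X->Y} in bits, using lag-1 histories."""
    n = len(x) - 1
    triples = Counter(zip(y[1:], y[:-1], x[:-1]))   # (y_{t+1}, y_t, x_t)
    pairs_yy = Counter(zip(y[1:], y[:-1]))          # (y_{t+1}, y_t)
    pairs_yx = Counter(zip(y[:-1], x[:-1]))         # (y_t, x_t)
    singles_y = Counter(y[:-1])                     # y_t
    # TE = sum p(y1,y0,x0) * log2[ p(y1|y0,x0) / p(y1|y0) ]
    return sum((c / n) * math.log2(c * singles_y[y0] / (pairs_yy[(y1, y0)] * pairs_yx[(y0, x0)]))
               for (y1, y0, x0), c in triples.items())

random.seed(0)
x = [random.randint(0, 1) for _ in range(1000)]
y = [0] + x[:-1]              # y copies x with a one-step delay
te = transfer_entropy(x, y)   # close to 1 bit of information flow X -> Y
```

Transfer entropy is directional: with the delayed-copy construction above, the reverse direction `transfer_entropy(y, x)` stays near zero, which is what lets such estimators define causal (directed) network edges between markets.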
19 pages, 327 KiB  
Article
Relativistic Consistency of Nonlocal Quantum Correlations
by Christian Beck and Dustin Lazarovici
Entropy 2024, 26(7), 548; https://doi.org/10.3390/e26070548 - 27 Jun 2024
Viewed by 168
Abstract
What guarantees the “peaceful coexistence” of quantum nonlocality and special relativity? The tension arises because entanglement leads to locally inexplicable correlations between distant events that have no absolute temporal order in relativistic spacetime. This paper identifies a relativistic consistency condition that is weaker than Bell locality but stronger than the no-signaling condition meant to exclude superluminal communication. While justifications for the no-signaling condition often rely on anthropocentric arguments, relativistic consistency is simply the requirement that joint outcome distributions for spacelike separated measurements (or measurement-like processes) must be independent of their temporal order. This is necessary to obtain consistent statistical predictions across different Lorentz frames. We first consider ideal quantum measurements, derive the relevant consistency condition on the level of probability distributions, and show that it implies no-signaling (but not vice versa). We then extend the results to general quantum operations and derive corresponding operator conditions. This will allow us to clarify the relationships between relativistic consistency, no-signaling, and local commutativity. We argue that relativistic consistency is the basic physical principle that ensures the compatibility of quantum statistics and relativistic spacetime structure, while no-signaling and local commutativity can be justified on this basis. Full article
(This article belongs to the Special Issue Time and Temporal Asymmetries)
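The no-signaling condition discussed in this abstract can be illustrated numerically. Below is a short sketch (not drawn from the paper itself) that checks no-signaling for the textbook Popescu–Rohrlich box, the strongest correlation that condition admits:

```python
import numpy as np

# p[a, b, x, y]: outcome probabilities of a Popescu-Rohrlich (PR) box,
# the maximally nonlocal no-signaling correlation: a XOR b = x AND y.
p = np.zeros((2, 2, 2, 2))
for a in range(2):
    for b in range(2):
        for x in range(2):
            for y in range(2):
                if (a ^ b) == (x & y):
                    p[a, b, x, y] = 0.5

# No-signaling: Alice's marginal p(a|x) must not depend on Bob's setting y,
# and vice versa -- the condition the paper strengthens to relativistic consistency.
marg_a = p.sum(axis=1)   # p(a|x,y), summed over b
marg_b = p.sum(axis=0)   # p(b|x,y), summed over a
assert np.allclose(marg_a[:, :, 0], marg_a[:, :, 1])   # independent of y
assert np.allclose(marg_b[:, 0, :], marg_b[:, 1, :])   # independent of x
print("PR box satisfies no-signaling")
```

The same marginal-independence check generalizes to the temporal-order independence condition the paper derives, applied to joint distributions of spacelike separated measurements.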
1 page, 159 KiB  
Correction
Correction: Toikka et al. Some Remarks on the Boundary of Thermodynamic Stability. Entropy 2023, 25, 969
by Alexander Toikka, Georgii Misikov and Maria Toikka
Entropy 2024, 26(7), 547; https://doi.org/10.3390/e26070547 - 27 Jun 2024
Viewed by 122
Abstract
The authors would like to make a small but important correction to the published paper [...] Full article
23 pages, 1691 KiB  
Article
Partial Information Decomposition: Redundancy as Information Bottleneck
by Artemy Kolchinsky
Entropy 2024, 26(7), 546; https://doi.org/10.3390/e26070546 - 26 Jun 2024
Viewed by 296
Abstract
The partial information decomposition (PID) aims to quantify the amount of redundant information that a set of sources provides about a target. Here, we show that this goal can be formulated as a type of information bottleneck (IB) problem, termed the “redundancy bottleneck” [...] Read more.
The partial information decomposition (PID) aims to quantify the amount of redundant information that a set of sources provides about a target. Here, we show that this goal can be formulated as a type of information bottleneck (IB) problem, termed the “redundancy bottleneck” (RB). The RB formalizes a tradeoff between prediction and compression: it extracts information from the sources that best predict the target, without revealing which source provided the information. It can be understood as a generalization of “Blackwell redundancy”, which we previously proposed as a principled measure of PID redundancy. The “RB curve” quantifies the prediction–compression tradeoff at multiple scales. This curve can also be quantified for individual sources, allowing subsets of redundant sources to be identified without combinatorial optimization. We provide an efficient iterative algorithm for computing the RB curve. Full article
16 pages, 31693 KiB  
Article
A Dynamic Entropy Approach Reveals Reduced Functional Network Connectivity Trajectory Complexity in Schizophrenia
by David Sutherland Blair, Robyn L. Miller and Vince D. Calhoun
Entropy 2024, 26(7), 545; https://doi.org/10.3390/e26070545 - 26 Jun 2024
Viewed by 201
Abstract
Over the past decade and a half, dynamic functional imaging has revealed low-dimensional brain connectivity measures, identified potential common human spatial connectivity states, tracked the transition patterns of these states, and demonstrated meaningful transition alterations in disorders and over the course of development. [...] Read more.
Over the past decade and a half, dynamic functional imaging has revealed low-dimensional brain connectivity measures, identified potential common human spatial connectivity states, tracked the transition patterns of these states, and demonstrated meaningful transition alterations in disorders and over the course of development. Recently, researchers have begun to analyze these data from the perspective of dynamic systems and information theory in the hopes of understanding how these dynamics support less easily quantified processes, such as information processing, cortical hierarchy, and consciousness. Little attention has been paid to the effects of psychiatric disease on these measures, however. We begin to rectify this by examining the complexity of subject trajectories in state space through the lens of information theory. Specifically, we identify a basis for the dynamic functional connectivity state space and track subject trajectories through this space over the course of the scan. The dynamic complexity of these trajectories is assessed along each dimension of the proposed basis space. Using these estimates, we demonstrate that schizophrenia patients display substantially simpler trajectories than demographically matched healthy controls and that this drop in complexity concentrates along specific dimensions. We also demonstrate that entropy generation in at least one of these dimensions is linked to cognitive performance. Overall, the results suggest great value in applying dynamic systems theory to problems of neuroimaging and reveal a substantial drop in the complexity of schizophrenia patients’ brain function. Full article
(This article belongs to the Special Issue Entropy Application in Biomechanics and Biosignal Processing)
15 pages, 531 KiB  
Article
Adaptive Joint Carrier and DOA Estimations of FHSS Signals Based on Knowledge-Enhanced Compressed Measurements and Deep Learning
by Yinghai Jiang and Feng Liu
Entropy 2024, 26(7), 544; https://doi.org/10.3390/e26070544 - 26 Jun 2024
Viewed by 189
Abstract
As one of the most widely used spread spectrum techniques, the frequency-hopping spread spectrum (FHSS) has been widely adopted in both civilian and military secure communications. In this technique, the carrier frequency of the signal hops pseudo-randomly over a large range, compared to [...] Read more.
As one of the most widely used spread spectrum techniques, the frequency-hopping spread spectrum (FHSS) has been widely adopted in both civilian and military secure communications. In this technique, the carrier frequency of the signal hops pseudo-randomly over a large range, compared to the baseband. To capture an FHSS signal, conventional non-cooperative receivers without knowledge of the carrier have to operate at a high sampling rate covering the entire FHSS hopping range, according to the Nyquist sampling theorem. In this paper, we propose an adaptive compressed method for joint carrier and direction of arrival (DOA) estimations of FHSS signals, enabling subsequent non-cooperative processing. The compressed measurement kernels (i.e., non-zero entries in the sensing matrix) have been adaptively designed based on the posterior knowledge of the signal and task-specific information optimization. Moreover, a deep neural network has been designed to ensure the efficiency of the measurement kernel design process. Finally, the signal carrier and DOA are estimated based on the measurement data. Through simulations, the performance of the adaptively designed measurement kernels is proved to be improved over the random measurement kernels. In addition, the proposed method is shown to outperform the compressed methods in the literature. Full article
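As a rough illustration of the compressed-measurement idea in this abstract — recovering a carrier that occupies one frequency bin from far fewer measurements than Nyquist would require — here is a minimal sketch using a random (not adaptively designed) measurement kernel; the dimensions and the active bin index are arbitrary choices for the example:

```python
import numpy as np

rng = np.random.default_rng(2)

# sparse spectrum: the hopped carrier occupies one of N bins
N, M = 256, 128                      # ambient dimension, compressed measurements
s = np.zeros(N)
s[137] = 1.0                         # active carrier bin (hypothetical)
Phi = rng.standard_normal((M, N)) / np.sqrt(M)   # random measurement kernel
y = Phi @ s                          # sub-Nyquist measurements

# recover the active bin by matched filtering -- the first step of
# orthogonal matching pursuit, sufficient for a 1-sparse signal
est = np.argmax(np.abs(Phi.T @ y))
print(est)
```

The paper's contribution is to replace the random `Phi` with kernels designed from posterior knowledge via a neural network; this sketch only shows why far fewer than N measurements can suffice.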
5 pages, 166 KiB  
Editorial
Information Theory in Emerging Wireless Communication Systems and Networks
by Erdem Koyuncu
Entropy 2024, 26(7), 543; https://doi.org/10.3390/e26070543 - 26 Jun 2024
Viewed by 219
Abstract
Wireless communication systems and networks are rapidly evolving to meet the increasing demands for higher data rates, better reliability, and connectivity anywhere, anytime [...] Full article
25 pages, 413 KiB  
Article
A Partial Information Decomposition for Multivariate Gaussian Systems Based on Information Geometry
by Jim W. Kay
Entropy 2024, 26(7), 542; https://doi.org/10.3390/e26070542 - 25 Jun 2024
Viewed by 195
Abstract
There is much interest in the topic of partial information decomposition, both in developing new algorithms and in developing applications. An algorithm, based on standard results from information geometry, was recently proposed by Niu and Quinn (2019). They considered the case of three [...] Read more.
There is much interest in the topic of partial information decomposition, both in developing new algorithms and in developing applications. An algorithm, based on standard results from information geometry, was recently proposed by Niu and Quinn (2019). They considered the case of three scalar random variables from an exponential family, including both discrete distributions and a trivariate Gaussian distribution. The purpose of this article is to extend their work to the general case of multivariate Gaussian systems having vector inputs and a vector output. By making use of standard results from information geometry, explicit expressions are derived for the components of the partial information decomposition for this system. These expressions depend on a real-valued parameter which is determined by performing a simple constrained convex optimisation. Furthermore, it is proved that the theoretical properties of non-negativity, self-redundancy, symmetry and monotonicity, which were proposed by Williams and Beer (2010), are valid for the decomposition Iig derived herein. Application of these results to real and simulated data shows that the Iig algorithm does produce the results expected when clear expectations are available, although in some scenarios, it can overestimate the level of the synergy and shared information components of the decomposition, and correspondingly underestimate the levels of unique information. Comparisons of the Iig and Idep (Kay and Ince, 2018) methods show that they can both produce very similar results, but they also reveal interesting differences. The same may be said about comparisons between the Iig and Immi (Barrett, 2015) methods. Full article
(This article belongs to the Section Information Theory, Probability and Statistics)
24 pages, 605 KiB  
Article
Learning Causes of Functional Dynamic Targets: Screening and Local Methods
by Ruiqi Zhao, Xiaoxia Yang and Yangbo He
Entropy 2024, 26(7), 541; https://doi.org/10.3390/e26070541 - 24 Jun 2024
Viewed by 216
Abstract
This paper addresses the challenge of identifying causes for functional dynamic targets, which are functions of various variables over time. We develop screening and local learning methods to learn the direct causes of the target, as well as all indirect causes up to [...] Read more.
This paper addresses the challenge of identifying causes for functional dynamic targets, which are functions of various variables over time. We develop screening and local learning methods to learn the direct causes of the target, as well as all indirect causes up to a given distance. We first discuss the modeling of the functional dynamic target. Then, we propose a screening method to select the variables that are significantly correlated with the target. On this basis, we introduce an algorithm that combines screening and structural learning techniques to uncover the causal structure among the target and its causes. To tackle the distance effect, where long causal paths weaken correlation, we propose a local method to discover the direct causes of the target in these significant variables and further sequentially find all indirect causes up to a given distance. We show theoretically that our proposed methods can learn the causes correctly under some regularity assumptions. Experiments based on synthetic data also show that the proposed methods perform well in learning the causes of the target. Full article
(This article belongs to the Section Information Theory, Probability and Statistics)
16 pages, 1391 KiB  
Article
GC-STCL: A Granger Causality-Based Spatial–Temporal Contrastive Learning Framework for EEG Emotion Recognition
by Lei Wang, Siming Wang, Bo Jin and Xiaopeng Wei
Entropy 2024, 26(7), 540; https://doi.org/10.3390/e26070540 - 24 Jun 2024
Viewed by 232
Abstract
EEG signals capture information through multi-channel electrodes and hold promising prospects for human emotion recognition. However, the presence of high levels of noise and the diverse nature of EEG signals pose significant challenges, leading to potential overfitting issues that further complicate the extraction [...] Read more.
EEG signals capture information through multi-channel electrodes and hold promising prospects for human emotion recognition. However, the presence of high levels of noise and the diverse nature of EEG signals pose significant challenges, leading to potential overfitting issues that further complicate the extraction of meaningful information. To address this issue, we propose a Granger causality-based spatial–temporal contrastive learning framework, which significantly enhances the ability to capture EEG signal information by modeling rich spatial–temporal relationships. Specifically, in the spatial dimension, we employ a sampling strategy to select positive sample pairs from individuals watching the same video. Subsequently, a Granger causality test is utilized to enhance graph data and construct potential causality for each channel. Finally, a residual graph convolutional neural network is employed to extract features from EEG signals and compute the spatial contrastive loss. In the temporal dimension, we first apply a frequency-domain noise reduction module for data enhancement on each time series. Then, we introduce the Granger–Former model to capture the time-domain representation and calculate the temporal contrastive loss. We conduct extensive experiments on two publicly available emotion recognition datasets (DEAP and SEED), achieving a 1.65% improvement on the DEAP dataset and a 1.55% improvement on the SEED dataset compared to state-of-the-art unsupervised models. Our method outperforms benchmark methods in terms of prediction accuracy as well as interpretability. Full article
(This article belongs to the Section Complexity)
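The pairwise Granger causality test underlying the graph-enhancement step can be sketched from scratch as two nested least-squares fits: does adding lagged values of x reduce the prediction error for y? The lag order and the toy system below are illustrative choices, not taken from the paper:

```python
import numpy as np

def granger_f(y, x, lag=2):
    """F-statistic for 'x Granger-causes y' at a given lag order.

    Restricted model: y_t regressed on its own lags.
    Full model: y_t regressed on its own lags plus lags of x.
    A larger F means past x helps predict y beyond y's own past.
    """
    n = len(y)
    rows = n - lag
    Y = y[lag:]
    # design matrices: constant + lagged y (restricted), + lagged x (full)
    Zr = np.column_stack([np.ones(rows)] + [y[lag - k:n - k] for k in range(1, lag + 1)])
    Zf = np.column_stack([Zr] + [x[lag - k:n - k] for k in range(1, lag + 1)])
    rss = lambda Z: np.sum((Y - Z @ np.linalg.lstsq(Z, Y, rcond=None)[0]) ** 2)
    rss_r, rss_f = rss(Zr), rss(Zf)
    df = rows - Zf.shape[1]
    return (rss_r - rss_f) / lag / (rss_f / df)

# toy system where x drives y with a one-step delay
rng = np.random.default_rng(0)
x = rng.standard_normal(500)
y = np.zeros(500)
for t in range(1, 500):
    y[t] = 0.8 * x[t - 1] + 0.1 * rng.standard_normal()

print(granger_f(y, x), granger_f(x, y))  # forward F should far exceed reverse F
```

In the paper this kind of test is applied channel-by-channel to construct potential causal edges in the EEG graph before contrastive learning.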
25 pages, 2996 KiB  
Article
Causalized Convergent Cross Mapping and Its Implementation in Causality Analysis
by Boxin Sun, Jinxian Deng, Norman Scheel, David C. Zhu, Jian Ren, Rong Zhang and Tongtong Li
Entropy 2024, 26(7), 539; https://doi.org/10.3390/e26070539 - 24 Jun 2024
Viewed by 280
Abstract
Rooted in dynamic systems theory, convergent cross mapping (CCM) has attracted increased attention recently due to its capability in detecting linear and nonlinear causal coupling in both random and deterministic settings. One limitation with CCM is that it uses both past and future [...] Read more.
Rooted in dynamic systems theory, convergent cross mapping (CCM) has attracted increased attention recently due to its capability in detecting linear and nonlinear causal coupling in both random and deterministic settings. One limitation with CCM is that it uses both past and future values to predict the current value, which is inconsistent with the widely accepted definition of causality, where it is assumed that the future values of one process cannot influence the past of another. To overcome this obstacle, in our previous research, we introduced the concept of causalized convergent cross mapping (cCCM), where future values are no longer used to predict the current value. In this paper, we focus on the implementation of cCCM in causality analysis. More specifically, we demonstrate the effectiveness of cCCM in identifying both linear and nonlinear causal coupling in various settings through a large number of examples, including Gaussian random variables with additive noise, sinusoidal waveforms, autoregressive models, stochastic processes with a dominant spectral component embedded in noise, deterministic chaotic maps, and systems with memory, as well as experimental fMRI data. In particular, we analyze the impact of shadow manifold construction on the performance of cCCM and provide detailed guidelines on how to configure the key parameters of cCCM in different applications. Overall, our analysis indicates that cCCM is a promising and easy-to-implement tool for causality analysis in a wide spectrum of applications. Full article
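The core cross-mapping step can be sketched in a few lines. The code below is a simplified illustration, not the authors' implementation: delay vectors are built from past and present values only, in the spirit of the causalized variant, and the embedding parameters and toy coupled maps are arbitrary choices:

```python
import numpy as np

def ccm_skill(x, y, E=2, tau=1):
    """Cross-map skill: correlation between y and its estimate obtained
    from the shadow manifold of x. Each delay vector uses only past and
    present values of x, avoiding the future values used by plain CCM."""
    start = (E - 1) * tau
    # M[t] = (x_t, x_{t-tau}, ..., x_{t-(E-1)tau})
    M = np.column_stack([x[start - k * tau: len(x) - k * tau] for k in range(E)])
    yv = y[start:]
    est = np.empty_like(yv)
    for i in range(len(M)):
        d = np.linalg.norm(M - M[i], axis=1)
        d[i] = np.inf                      # exclude the point itself
        nn = np.argsort(d)[:E + 1]        # E+1 nearest neighbors
        w = np.exp(-d[nn] / max(d[nn][0], 1e-12))
        est[i] = np.sum(w * yv[nn]) / np.sum(w)
    return np.corrcoef(yv, est)[0, 1]

# coupled logistic maps: x drives y, so y's history encodes x and
# cross-mapping from y's shadow manifold recovers x well
n = 400
x = np.empty(n); y = np.empty(n)
x[0], y[0] = 0.4, 0.2
for t in range(n - 1):
    x[t + 1] = x[t] * (3.8 - 3.8 * x[t])
    y[t + 1] = y[t] * (3.5 - 3.5 * y[t] - 0.2 * x[t])

print(ccm_skill(y, x), ccm_skill(x, y))  # high skill from y's manifold indicates x -> y
```

Note the CCM convention: causality from x to y is detected by cross-mapping *from* y's manifold *to* x, since the driven variable's history encodes the driver.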
17 pages, 1365 KiB  
Article
Pedaling Asymmetry Reflected by Bilateral EMG Complexity in Chronic Stroke
by Shi-Chun Bao, Rui Sun and Raymond Kai-Yu Tong
Entropy 2024, 26(7), 538; https://doi.org/10.3390/e26070538 - 23 Jun 2024
Viewed by 222
Abstract
This study examines pedaling asymmetry using the electromyogram (EMG) complexity of six bilateral lower limb muscles for chronic stroke survivors. Fifteen unilateral chronic stroke and twelve healthy participants joined passive and volitional recumbent pedaling tasks using a self-modified stationary bike with a constant [...] Read more.
This study examines pedaling asymmetry using the electromyogram (EMG) complexity of six bilateral lower limb muscles for chronic stroke survivors. Fifteen unilateral chronic stroke and twelve healthy participants joined passive and volitional recumbent pedaling tasks using a self-modified stationary bike with a constant speed of 25 revolutions per minute. The fuzzy approximate entropy (fApEn) was adopted in EMG complexity estimation. EMG complexity values of stroke participants during pedaling were smaller than those of healthy participants (p = 0.002). For chronic stroke participants, the complexity of paretic limbs was smaller than that of non-paretic limbs during the passive pedaling task (p = 0.005). Additionally, there was a significant correlation between clinical scores and the paretic EMG complexity during passive pedaling (p = 0.022, p = 0.028), indicating that the paretic EMG complexity during passive movement might serve as an indicator of stroke motor function status. This study suggests that EMG complexity is an appropriate quantitative tool for measuring neuromuscular characteristics in lower limb dynamic movement tasks for chronic stroke survivors. Full article
(This article belongs to the Special Issue Approximate Entropy and Its Application)
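For readers unfamiliar with fApEn, here is an illustrative from-scratch sketch following the common Chen et al. formulation (the parameter choices are conventional defaults, not necessarily those of the study): the hard threshold of classic ApEn is replaced by a fuzzy membership function, with each template's local mean removed.

```python
import numpy as np

def fapen(u, m=2, r=0.2, n=2):
    """Fuzzy approximate entropy: like ApEn, but similarity between
    templates is graded by exp(-(d/r)^n) rather than a hard threshold,
    and each template is baseline-removed (local mean subtracted)."""
    r *= np.std(u)  # tolerance relative to signal amplitude

    def phi(m):
        # all overlapping templates of length m, local mean removed
        X = np.array([u[i:i + m] for i in range(len(u) - m + 1)])
        X = X - X.mean(axis=1, keepdims=True)
        # pairwise Chebyshev distances -> fuzzy similarity degrees
        d = np.max(np.abs(X[:, None, :] - X[None, :, :]), axis=2)
        D = np.exp(-(d / r) ** n)
        np.fill_diagonal(D, 0.0)          # exclude self-matches
        return np.log(D.sum(axis=1) / (len(X) - 1)).mean()

    return phi(m) - phi(m + 1)

rng = np.random.default_rng(1)
t = np.arange(1000)
regular = np.sin(0.1 * t)               # predictable, low complexity
noisy = rng.standard_normal(1000)       # irregular, high complexity
print(fapen(regular), fapen(noisy))     # noise yields the larger value
```

In the study, a complexity value of this kind is computed per muscle channel of the EMG during pedaling; lower values in paretic limbs reflect reduced neuromuscular complexity.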
17 pages, 665 KiB  
Article
Utilizing TabNet Deep Learning for Elephant Flow Detection by Analyzing Information in First Packet Headers
by Bartosz Kądziołka, Piotr Jurkiewicz, Robert Wójcik and Jerzy Domżał
Entropy 2024, 26(7), 537; https://doi.org/10.3390/e26070537 - 22 Jun 2024
Viewed by 317
Abstract
Rapid and precise detection of significant data streams within a network is crucial for efficient traffic management. This study leverages the TabNet deep learning architecture to identify large-scale flows, known as elephant flows, by analyzing the information in the 5-tuple fields of the [...] Read more.
Rapid and precise detection of significant data streams within a network is crucial for efficient traffic management. This study leverages the TabNet deep learning architecture to identify large-scale flows, known as elephant flows, by analyzing the information in the 5-tuple fields of the initial packet header. The results demonstrate that a TabNet model can accurately identify elephant flows right at the start of a flow, making it possible to reduce the number of flow table entries by up to 20 times while still effectively managing 80% of the network traffic through individual flow entries. The model was trained and tested on a comprehensive dataset from a campus network, demonstrating its robustness and potential applicability to varied network environments. Full article
(This article belongs to the Special Issue Information Theory for Data Science)
36 pages, 12621 KiB  
Article
Research on Variable Parameter Color Image Encryption Based on Five-Dimensional Tri-Valued Memristor Chaotic System
by Pan Wang and Lina Ding
Entropy 2024, 26(7), 536; https://doi.org/10.3390/e26070536 - 22 Jun 2024
Viewed by 284
Abstract
To construct a chaotic system with complex characteristics and to improve the security of image data, a five-dimensional tri-valued memristor chaotic system with high complexity is innovatively constructed. Firstly, a pressure-controlled tri-valued memristor on Liu’s pseudo-four-wing chaotic system is introduced. Through analytical methods, [...] Read more.
To construct a chaotic system with complex characteristics and to improve the security of image data, a five-dimensional tri-valued memristor chaotic system with high complexity is innovatively constructed. Firstly, a voltage-controlled tri-valued memristor is introduced into Liu’s pseudo-four-wing chaotic system. Through analytical methods such as the Lyapunov exponent map, bifurcation map and attractor phase diagram, it is demonstrated that the new system has rich dynamical behaviors, with periodic limit cycles varying with the coupling parameter of the system, a variable-wing phenomenon, as well as transient chaos-to-periodic behavior depending on the system parameter and chaos-to-quasi-periodic behavior depending on the memristor parameter. The system is simulated with dynamic circuits based on Simulink. Secondly, the differently structured synchronous controls of chaotic systems are realized using a nonlinear feedback control method. Finally, based on the newly constructed five-dimensional chaotic system, a variable-parameter color image encryption scheme is proposed that iteratively generates varying chaotic pseudo-random sequences by varying the system parameters, which are used for non-repetitive scrambling, additive modulo left-shift diffusion and DNA encryption of the three RGB components of the color image after chunking. The simulation results are analyzed by histogram, information entropy, adjacent-pixel correlation, etc., and the images are tested using differential, noise and geometric attacks, as well as by analyzing the PSNR and SSIM of the decrypted image quality. The results show that the encryption method has a certain degree of security and can be applied to medical, military and financial fields with more complex environmental requirements. Full article
(This article belongs to the Section Multidisciplinary Applications)
9 pages, 530 KiB  
Article
Parameterized Multipartite Entanglement and Genuine Entanglement Measures Based on q-Concurrence
by Pan-Wen Ma, Hui Zhao, Shao-Ming Fei, Mei-Ming Zhang and Zhi-Xi Wang
Entropy 2024, 26(7), 535; https://doi.org/10.3390/e26070535 - 22 Jun 2024
Viewed by 216
Abstract
We study genuine multipartite entanglement (GME) and multipartite k-entanglement based on q-concurrence. Well-defined parameterized GME measures and measures of multipartite k-entanglement are presented for arbitrary dimensional n-partite quantum systems. Our GME measures show that the GHZ state [...] Read more.
We study genuine multipartite entanglement (GME) and multipartite k-entanglement based on q-concurrence. Well-defined parameterized GME measures and measures of multipartite k-entanglement are presented for arbitrary dimensional n-partite quantum systems. Our GME measures show that the GHZ state is more entangled than the W state. Moreover, our measures are shown to be inequivalent to the existing measures according to entanglement ordering. Detailed examples show that our measures characterize the multipartite entanglement finer than some existing measures, in the sense that our measures identify the difference of two different states while the latter fail. Full article
(This article belongs to the Collection Quantum Information)
31 pages, 7787 KiB  
Article
Enhanced Air Quality Prediction Using a Coupled DVMD Informer-CNN-LSTM Model Optimized with Dung Beetle Algorithm
by Yang Wu, Chonghui Qian and Hengjun Huang
Entropy 2024, 26(7), 534; https://doi.org/10.3390/e26070534 - 21 Jun 2024
Viewed by 242
Abstract
Accurate prediction of air quality is crucial for assessing the state of the atmospheric environment, especially considering the nonlinearity, volatility, and abrupt changes in air quality data. This paper introduces an air quality index (AQI) prediction model based on the Dung Beetle Algorithm [...] Read more.
Accurate prediction of air quality is crucial for assessing the state of the atmospheric environment, especially considering the nonlinearity, volatility, and abrupt changes in air quality data. This paper introduces an air quality index (AQI) prediction model based on the Dung Beetle Algorithm (DBO) aimed at overcoming limitations in traditional prediction models, such as inadequate access to data features, challenges in parameter setting, and accuracy constraints. The proposed model optimizes the parameters of Variational Mode Decomposition (VMD) and integrates the Informer adaptive sequential prediction model with the Convolutional Neural Network-Long Short Term Memory (CNN-LSTM). Initially, the correlation coefficient method is utilized to identify key impact features from multivariate weather and meteorological data. Subsequently, penalty factors and the number of variational modes in the VMD are optimized using DBO. The optimized parameters are utilized to develop a variationally constrained model to decompose the air quality sequence. The data are categorized based on approximate entropy, and high-frequency data are fed into the Informer model, while low-frequency data are fed into the CNN-LSTM model. The predicted values of the subsystems are then combined and reconstructed to obtain the AQI prediction results. Evaluation using actual monitoring data from Beijing demonstrates that the proposed coupled AQI prediction model is superior to other parameter optimization models: the Mean Absolute Error (MAE) decreases by 13.59%, the Root-Mean-Square Error (RMSE) decreases by 7.04%, and the R-square (R2) increases by 1.39%. The model surpasses 11 other models with lower error rates and enhanced prediction accuracy. Compared with mainstream swarm intelligence optimization algorithms, DBO demonstrates higher computational efficiency and yields results closer to the actual values. The proposed coupled model provides a new method for air quality index prediction. Full article
9 pages, 243 KiB  
Article
Flipped Quartification: Product Group Unification with Leptoquarks
by James B. Dent, Thomas W. Kephart, Heinrich Päs and Thomas J. Weiler
Entropy 2024, 26(7), 533; https://doi.org/10.3390/e26070533 - 21 Jun 2024
Viewed by 197
Abstract
The quartification model is an SU(3)^4 extension with a bi-fundamental fermion sector of the well-known SU(3)^3 trinification model. An alternative “flipped” version of the quartification model is obtained by rearrangement of the particle [...] Read more.
The quartification model is an SU(3)^4 extension with a bi-fundamental fermion sector of the well-known SU(3)^3 trinification model. An alternative “flipped” version of the quartification model is obtained by rearrangement of the particle assignments. The flipped model has two standard (trinification) families and one flipped quartification family. In contrast to traditional product group unification models, flipped quartification stands out by featuring leptoquarks and thus allows for new mechanisms to explain the generation of neutrino masses and possible hints of lepton-flavor non-universality. Full article