Journal Description
Entropy
Entropy is an international and interdisciplinary peer-reviewed open access journal of entropy and information studies, published monthly online by MDPI. The International Society for the Study of Information (IS4SI) and the Spanish Society of Biomedical Engineering (SEIB) are affiliated with Entropy, and their members receive a discount on the article processing charge.
- Open Access — free for readers, with article processing charges (APC) paid by authors or their institutions.
- High Visibility: indexed within Scopus, SCIE (Web of Science), Inspec, PubMed, PMC, Astrophysics Data System, and other databases.
- Journal Rank: JCR - Q2 (Physics, Multidisciplinary) / CiteScore - Q1 (Mathematical Physics)
- Rapid Publication: manuscripts are peer-reviewed and a first decision is provided to authors approximately 21.5 days after submission; acceptance to publication takes 2.6 days (median values for papers published in this journal in the second half of 2025).
- Recognition of Reviewers: reviewers who provide timely, thorough peer-review reports receive vouchers entitling them to a discount on the APC of their next publication in any MDPI journal, in appreciation of the work done.
- Testimonials: See what our editors and authors say about Entropy.
- Companion journals for Entropy include: Foundations, Thermo and Complexities.
- Journal Cluster of Atomic, Molecular, and Optical (AMO) Physics: Entropy, Photonics, Atoms, Lights, Optics, Plasma, Physics, Quantum Beam Science and Lasers.
Impact Factor: 2.0 (2024); 5-Year Impact Factor: 2.2 (2024)
Latest Articles
Gradient Systems and Asymmetric Relaxations in View of Riemannian Geometry
Entropy 2026, 28(5), 516; https://doi.org/10.3390/e28050516 (registering DOI) - 2 May 2026
Abstract
In dually flat manifolds, there is a deep connection between gradient flows and pregeodesics. This was one of the many important contributions of Amari to information geometry. In this paper, we extend the study of this relationship to general Riemannian manifolds. Our result does not impose conditions of flatness on the connection or symmetry on its non-metricity tensor, thus broadening the geometric setting beyond Hessian manifolds. Within this framework, we provide a criterion for comparing relaxation along two different gradient descent curves of a function, formulated in terms of the non-metricity tensor of a connection for which the gradient curves are pregeodesics. We use it to study Gaussian chains, whose relaxation trajectories coincide with gradient descent curves in the space of Gaussian distributions. Thus, we recover a recent result that establishes a universal asymmetry: warming up is faster than cooling down. Our work illustrates how geometric insights rooted in Amari’s legacy offer new perspectives for optimization problems and stochastic processes.
Full article
(This article belongs to the Special Issue SUURI of Information Geometry: Dedicated to SUURI Engineer Professor Shun’ichi Amari on the Occasion of His 90th Birthday)
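The relaxation asymmetry discussed above can be illustrated numerically with a sketch far simpler than the paper's Riemannian construction: an explicit-Euler integration of the Euclidean gradient flow dx/dt = −∇f(x) for a quadratic potential whose two directions have different curvatures. The potential, step size, and curvatures below are invented for illustration; the steeper direction relaxes faster, which is the kind of rate asymmetry the paper's criterion quantifies in the general geometric setting.

```python
import numpy as np

def gradient_flow(grad, x0, dt=1e-3, steps=1000):
    """Explicit-Euler integration of the gradient flow dx/dt = -grad f(x)."""
    x = np.array(x0, dtype=float)
    for _ in range(steps):
        x = x - dt * grad(x)
    return x

# Toy quadratic potential f(x, y) = (a*x**2 + b*y**2) / 2 with unequal curvatures.
a, b = 4.0, 1.0
grad_f = lambda p: np.array([a * p[0], b * p[1]])

# Integrate up to t = 1: the stiffer coordinate (curvature a) relaxes much faster.
x = gradient_flow(grad_f, [1.0, 1.0])
```

Along each coordinate the exact flow decays as exp(−a t) and exp(−b t), so the two gradient descent curves starting from symmetric initial conditions relax at different rates.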
Open Access Article
An Algorithmic Treatment of Causal Unit Selection
by
Haiying Huang and Adnan Darwiche
Entropy 2026, 28(5), 515; https://doi.org/10.3390/e28050515 (registering DOI) - 2 May 2026
Abstract
The problem of optimizing a causal objective function emerged in recent work, where the behavior of objects needs to be expressed in terms of interventional or counterfactual probabilities. A key example is the unit selection problem introduced by Li and Pearl, where the goal is to find the individuals who maximize a benefit function that scores their characteristics (called units) using counterfactual probabilities. Previous work on unit selection focused mainly on this specific objective function and on identifying its value using bounds. We complement this line of work by developing a theory that treats unit selection as a computational problem, assuming that a fully specified causal model is available and allowing a more general class of objective functions. At the core of our treatment is a novel reduction that transforms the computation of a broad class of causal objective functions into a classical associational probability on a meta-model called the objective model. Based on this reduction, we propose the first exact algorithm for finding the optimal units by applying Variable Elimination (VE) on the objective model. We then characterize the complexity of causal unit selection, showing that it is -complete, and that the runtime of VE must be exponential in the constrained treewidth of the objective model, which is larger and denser than the original input model. To address this challenge, we compile the objective model into a special class of tractable arithmetic circuits, allowing the optimal units to be computed in time linear in the circuit size. Finally, we present experiments demonstrating the substantial speedup of the circuit-based method over the VE-based method, and of the VE-based method over a baseline search method, together with a case study on a real-world ecology problem.
Full article
(This article belongs to the Special Issue Causal Graphical Models and Their Applications, 2nd Edition)
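The endpoint of the reduction described above is a classical associational query answered by Variable Elimination: multiply the factors that mention a variable, then sum that variable out. A minimal sketch on a hypothetical two-node network A → B (far simpler than an objective model, and not the paper's algorithm) shows the elimination step:

```python
# Toy factors for a two-node network A -> B, stored as dicts over assignments.
pA = {0: 0.6, 1: 0.4}                       # P(A)
pB_given_A = {(0, 0): 0.9, (0, 1): 0.1,     # P(B | A), keyed by (a, b)
              (1, 0): 0.3, (1, 1): 0.7}

def eliminate_A(pA, pB_given_A):
    """Sum A out of the product P(A) * P(B|A), leaving the marginal P(B)."""
    pB = {0: 0.0, 1: 0.0}
    for a, pa in pA.items():
        for b in (0, 1):
            pB[b] += pa * pB_given_A[(a, b)]
    return pB

pB = eliminate_A(pA, pB_given_A)  # {0: 0.66, 1: 0.34}
```

Real VE repeats this multiply-and-sum-out step over an elimination order, which is why its runtime is governed by the (constrained) treewidth of the model being queried.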
Open Access Article
Effective Mode Approximation for Probabilistic Verification of Collective Hamiltonians in Large Continuous-Variable Quantum Systems
by
José R. Rosas-Bustos, Jesse Van Griensven Thé, Roydon Andrew Fraser, Nadeem Said, Sebastian Ratto Valderrama, Mark Pecen, Alexander Truskovsky and Andy Thanos
Entropy 2026, 28(5), 514; https://doi.org/10.3390/e28050514 (registering DOI) - 2 May 2026
Abstract
The Effective Mode Approximation (EMA) is a verification-oriented framework for characterizing collective Hamiltonian dynamics in large continuous-variable (CV) quantum systems from experimentally accessible collective measurements. Rather than reconstructing a full mode-resolved Hamiltonian, EMA maps the observed dynamics onto a canonically normalized collective mode and tests whether summed quadrature trajectories are consistent with an effective harmonic description. We validate EMA using time-resolved homodyne sampling in Gaussian simulations of ring-coupled multi-qumode optical systems with up to 64 modes. One-tone and two-tone sinusoidal models, selected using the Akaike Information Criterion (AIC), recover a stable dominant collective frequency across system size and produce residuals that remain centred near zero. The results show that EMA can verify dominant collective behaviour with a fixed number of effective parameters even when full microscopic reconstruction is impractical. EMA is therefore best understood not as a full-state ansatz, but as a low-overhead tool for validating collective dynamics under realistic measurement constraints in scalable CV hardware.
Full article
(This article belongs to the Section Quantum Information)
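The AIC-based choice between one-tone and two-tone sinusoidal models mentioned above can be sketched with a linear least-squares fit over sine/cosine pairs at candidate frequencies. The signal, frequencies, and noise level below are invented for illustration and have nothing to do with the paper's homodyne data:

```python
import numpy as np

rng = np.random.default_rng(0)
t = np.linspace(0.0, 1.0, 400)
# Synthetic two-tone signal with additive noise (frequencies are arbitrary).
y = np.sin(2*np.pi*5*t) + 0.4*np.sin(2*np.pi*11*t) + 0.1*rng.standard_normal(t.size)

def aic_sinusoid_fit(t, y, freqs):
    """AIC of a least-squares fit of y to an offset plus sin/cos pairs at `freqs`."""
    cols = [np.ones_like(t)]
    for f in freqs:
        cols += [np.sin(2*np.pi*f*t), np.cos(2*np.pi*f*t)]
    X = np.column_stack(cols)
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    rss = float(np.sum((y - X @ beta) ** 2))
    n, k = y.size, X.shape[1]
    return n * np.log(rss / n) + 2 * k   # Gaussian-likelihood AIC up to a constant

aic_one = aic_sinusoid_fit(t, y, [5.0])
aic_two = aic_sinusoid_fit(t, y, [5.0, 11.0])
```

Because the second tone is genuinely present, the two-tone model attains a lower AIC despite its extra parameters; on a one-tone signal the penalty term 2k would favour the simpler model.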
Open Access Article
Degrees, Levels, and Profiles of Contextuality
by
Ehtibar N. Dzhafarov and Víctor H. Cervantes
Entropy 2026, 28(5), 513; https://doi.org/10.3390/e28050513 - 1 May 2026
Abstract
We introduce a new notion, that of a contextuality profile of a system of random variables. Rather than characterizing a system’s contextuality by a single number, its overall degree of contextuality, we show how it can be characterized by a curve relating the degree of contextuality (including nonlocality, as a special case) to the level n at which the system is considered, n ≤ N, where N is the maximum number of variables per system’s context. A system is represented at level n if one only considers the joint distributions with at most n variables, ignoring higher-order joint distributions. We show that the level-wise contextuality analysis can be used in conjunction with any well-constructed measure of contextuality. We present a method of concatenated systems to explore contextuality profiles systematically, and we apply it to construct contextuality profiles for three major measures of contextuality proposed in the literature.
Full article
(This article belongs to the Section Multidisciplinary Applications)
Open Access Article
Chaos and Coexisting Attractors of Kolmogorov-Type Permanent-Magnet Synchronous Generators
by
Dongdong Wang
Entropy 2026, 28(5), 512; https://doi.org/10.3390/e28050512 - 1 May 2026
Abstract
This paper investigates the dynamic behavior of a Kolmogorov-type permanent-magnet synchronous generator for wind power systems. Firstly, the chaotic model of the salient-pole permanent-magnet synchronous generator is derived and subsequently transformed into a Kolmogorov-type system. Secondly, by analyzing the derived Kolmogorov system, the system’s stability is established, and the boundary ellipsoid of the chaotic attractor is determined via the Casimir energy function. Thirdly, the analysis focuses on the mechanisms leading to chaos, including period-doubling bifurcation and the onset of double Hopf bifurcation. Finally, the basins of attraction associated with the coexisting static attractors are determined to characterize their long-term dynamical behavior. The analytical results show good agreement with the numerical simulations.
Full article
Open Access Article
A Hybrid CNN-GRU-SE Forecasting Method for Short-Term Photovoltaic Power Considers AFD and Data Aggregation
by
Keyan Liu, Dongli Jia, Huiyu Zhan, Jun Zhou, Zezhou Wang and Jianfei Bao
Entropy 2026, 28(5), 511; https://doi.org/10.3390/e28050511 - 1 May 2026
Abstract
To enhance the accuracy and robustness of short-term photovoltaic (PV) power forecasting, this paper proposes a novel forecasting method that integrates data aggregation, adaptive frequency decomposition (AFD), modified improved beluga whale optimization (MIBWO), and a CNN-GRU-SE hybrid model. First, the Pearson correlation coefficient and the entropy weight method are combined to screen meteorological features that are strongly correlated with PV power output. Considering geographical distance, a spatial data aggregation strategy is proposed to exploit the spatial correlation among neighboring PV stations and suppress the output volatility of individual stations. Then, AFD is adopted to adaptively decompose the PV power series into trend and seasonal components, and the MIBWO algorithm is utilized to simultaneously optimize the cutoff frequency of AFD and key hyperparameters of the CNN-GRU-SE forecasting model. Finally, the SHAP method is employed for model interpretability analysis to quantify the contribution of each feature to the prediction results. Simulation results verify the forecasting accuracy and robustness of the proposed method. Compared with CNN-GRU and BWO-CNN-GRU-SE, the proposed method reduces MAE by 96.23% and 95.03%, respectively. The method maintains stable performance under both sunny and cloudy conditions.
Full article
(This article belongs to the Special Issue Multivariate Entropy-Informed Fault Diagnosis and Structural Health Monitoring)
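The entropy weight method used above for feature screening assigns larger weights to features whose values are spread more unevenly across samples (lower normalized entropy means more discriminating information). A minimal sketch, assuming non-negative feature columns; the toy matrix is invented:

```python
import numpy as np

def entropy_weights(X):
    """Entropy weight method: each column's weight grows with its divergence
    1 - e_j, where e_j is the normalized Shannon entropy of its value shares."""
    X = np.asarray(X, dtype=float)
    n = X.shape[0]
    P = X / X.sum(axis=0)                      # column-wise share of each sample
    with np.errstate(divide="ignore", invalid="ignore"):
        logP = np.where(P > 0, np.log(P), 0.0)
    e = -(P * logP).sum(axis=0) / np.log(n)    # normalized entropy per feature
    d = 1.0 - e                                # degree of divergence
    return d / d.sum()

# Column 0 is constant (uninformative); column 1 varies strongly.
X = np.array([[1.0, 1.0], [1.0, 2.0], [1.0, 4.0], [1.0, 8.0]])
w = entropy_weights(X)
```

A constant column has maximal entropy (e = 1), so its divergence and hence its weight vanish, while the varying column absorbs essentially all the weight.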
Open Access Article
Clustering-Conditioned Granger Causality Between GDP Growth and Private Financing
by
Roberto Flores-Nava and Edgar Roman-Rangel
Entropy 2026, 28(5), 510; https://doi.org/10.3390/e28050510 - 1 May 2026
Abstract
Whether finance leads growth or growth leads finance remains a century-long debate. We argue that the direction and strength of the GDP–credit nexus are context-dependent and can be systematically uncovered by conditioning causal analysis on macro-structural heterogeneity across countries. We implement a two-stage pipeline: (i) unsupervised clustering of 30 economies (2005–2022, five annual macro indicators) via Agglomerative Clustering to form homogeneous macro-structural groups; and (ii) within-cluster dynamic causal analysis using lagged correlations, Granger causality and explanatory models (quarterly GDP and private credit, year-on-year growth, 1Q2005–3Q2024). Results show non-universality of causality: (a) in “developed and in transition, economically stable” economies, credit → GDP is predominant; (b) in “highly developed, competitive and stable” economies, bidirectionality is predominant; however, the results are not economically intuitive; (c) in “emerging/intermediate with macro risk”, bidirectional links are common, and feedback between both variables is interpretable across distinct scenarios. Post hoc Lasso and XGBoost confirm effect magnitudes and non-linear thresholds. We contribute a macro-segmented causal discovery framework that reconciles conflicting findings in the literature and provides policy-relevant differentiation by economic context.
Full article
(This article belongs to the Special Issue Causal Graphical Models and Their Applications, 2nd Edition)
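A bivariate, one-lag Granger test of the kind used within clusters reduces to comparing the residual sum of squares of a restricted autoregression (own lag only) against an unrestricted one (own lag plus the other series' lag). The sketch below uses plain numpy OLS on synthetic data where x drives y; it omits the lag selection, clustering, and diagnostics of the paper's pipeline:

```python
import numpy as np

def granger_F(x, y):
    """F statistic for 'x Granger-causes y' with a single lag:
    does adding x[t-1] to a regression of y[t] on y[t-1] reduce the RSS?"""
    yt, y1, x1 = y[1:], y[:-1], x[:-1]
    ones = np.ones_like(yt)
    def rss(X):
        beta, *_ = np.linalg.lstsq(X, yt, rcond=None)
        return float(np.sum((yt - X @ beta) ** 2))
    rss_r = rss(np.column_stack([ones, y1]))        # restricted: own lag only
    rss_u = rss(np.column_stack([ones, y1, x1]))    # unrestricted: + lagged x
    df = yt.size - 3                                # n minus parameters fitted
    return (rss_r - rss_u) / (rss_u / df)

# Synthetic example where x drives y but not vice versa.
rng = np.random.default_rng(1)
x = rng.standard_normal(500)
y = np.zeros(500)
for t in range(1, 500):
    y[t] = 0.5 * y[t - 1] + 0.8 * x[t - 1] + 0.3 * rng.standard_normal()
```

Here granger_F(x, y) is large while granger_F(y, x) stays near the null level, recovering the one-directional structure built into the simulation.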
Open Access Article
Area Law for the Entanglement Entropy of Free Fermions in Nonrandom Ergodic Field
by
Leonid Pastur and Mira Shamis
Entropy 2026, 28(5), 509; https://doi.org/10.3390/e28050509 - 1 May 2026
Abstract
The paper deals with the asymptotic behavior of a widely used correlation characteristic in large quantum systems. The correlation is quantum entanglement, the characteristic is entanglement entropy, and the system is an ideal gas of lattice fermions. If the one-body Hamiltonian of fermions is an ergodic finite difference operator with an exponentially decaying spectral projection, then the large-block form of the entanglement entropy is the so-called area law. However, the only class of one-body Hamiltonians for which this spectral condition was verified consists of discrete Schrödinger operators with random potential. In this paper, we prove the area law for several classes of Schrödinger operators whose potentials are ergodic but not random. We begin with quasiperiodic and limit-periodic operators and then move to a highly non-trivial case of potentials generated by subshifts of finite type. These arose in the theory of dynamical systems when studying chaotic phenomena. The corresponding asymptotic study requires involved spectral analysis, which therefore constitutes the bulk of the paper. Specifically, we prove uniform localisation of the eigenfunctions for the Maryland model and exponential decay of the eigenfunction correlator for various models. We believe these properties are of significant independent interest.
Full article
(This article belongs to the Section Quantum Information)
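For free fermions, the entanglement entropy of a block follows from the eigenvalues ν of the two-point correlation matrix restricted to the block: S = −Σ [ν ln ν + (1 − ν) ln(1 − ν)]. The sketch below applies this standard formula to a clean tight-binding chain at half filling, a translation-invariant, critical example chosen only to illustrate the computation; the ergodic potentials treated in the paper are not reproduced here:

```python
import numpy as np

def block_entropy(L, block):
    """Entanglement entropy of the first `block` sites of an L-site
    free-fermion (tight-binding) chain at half filling."""
    H = -(np.eye(L, k=1) + np.eye(L, k=-1))     # nearest-neighbour hopping
    eps, U = np.linalg.eigh(H)
    occ = U[:, : L // 2]                        # fill the lowest L/2 modes
    C = occ @ occ.T                             # correlations <c_i^dag c_j>
    nu = np.linalg.eigvalsh(C[:block, :block])  # spectrum of the block
    nu = nu[(nu > 1e-12) & (nu < 1 - 1e-12)]    # drop deterministic modes
    return float(-np.sum(nu * np.log(nu) + (1 - nu) * np.log(1 - nu)))

s10 = block_entropy(100, 10)
s30 = block_entropy(100, 30)
```

In this critical example the entropy grows logarithmically with block size; the point of the paper is that for localized (exponentially decaying) spectral projections it instead saturates, giving the area law.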
Open Access Article
Landauer-Based Economic Temperature in Blockspace Markets: Evidence from Bitcoin and Ethereum
by
Michael Zouari, Ilan Alon and Zeev Shtudiner
Entropy 2026, 28(5), 508; https://doi.org/10.3390/e28050508 - 1 May 2026
Abstract
The Landauer principle motivates the definition of economic temperature as the monetary price of processing a bit irreversibly. No empirical test of this definition exists in transparent fee markets. This paper fills that gap using daily Bitcoin and Ethereum data, constructing canonical thermodynamic state variables and evaluating five diagnostic layers: state variable behavior, Maxwell-type integrability, Carnot-style efficiency bounds, nonlinear regime separation, and structural break sensitivity to protocol events. Bitcoin’s log-temperature behaves as a persistent mean-reverting process with an AR(1) coefficient of 0.97 and a half-life of 21 days; Ethereum is highly persistent, with weaker formal evidence of stationarity than Bitcoin. Maxwell integrability is frequency-dependent: Bitcoin passes all four relations at monthly frequency, whereas Ethereum passes two of four. Carnot-style evidence is the strongest: realized fee extraction efficiency stays well below the implied bound, with daily compliance exceeding 97% on both chains. Structural breaks around Bitcoin Ordinals, EIP-1559, the Merge, and Shanghai confirm that protocol changes reorganize the temperature relation. The thermodynamic framework provides structure that standard fee market analysis does not, including a first-principles efficiency bound and a state-space coherence test. The findings provide partial, frequency-dependent, and chain-specific empirical support for a Landauer-based thermodynamic description of blockspace markets.
Full article
(This article belongs to the Special Issue Entropy-Based Applications in Economics, Finance, and Management, 4th Edition)
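The half-life quoted for the AR(1) log-temperature follows from the standard relation t₁/₂ = ln(1/2) / ln(φ); with the rounded coefficient φ = 0.97 this gives roughly three weeks (the exact 21-day figure reported above depends on the unrounded estimate). A one-line sketch:

```python
import numpy as np

def ar1_half_life(phi):
    """Horizon at which an AR(1) deviation has decayed to half its size:
    phi**t = 1/2  =>  t = ln(1/2) / ln(phi)."""
    return np.log(0.5) / np.log(phi)

hl = ar1_half_life(0.97)   # about 22.8 periods for phi = 0.97
```

The same formula also makes the contrast with Ethereum concrete: as φ approaches 1, ln(φ) approaches 0 and the half-life diverges, which is what "highly persistent with weaker evidence of stationarity" amounts to.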
Open Access Article
Evaluating Photonic Quantum Memristors in Noisy Environments
by
Jiachao Wang, Wentao Mao, Tengze Yang, Qiming Zhang and Wei Li
Entropy 2026, 28(5), 507; https://doi.org/10.3390/e28050507 - 1 May 2026
Abstract
While photonic quantum memristors (PQMs) offer promising avenues for neuromorphic computing, their performance is inherently affected by hardware noise, particularly photon loss and phase fluctuations. This study systematically investigates the impact of photon loss and phase fluctuations on PQM dynamics by employing the noisy gates approach, which integrates dissipative effects directly into the device evolution. At the device level, we demonstrate that photon loss alters the dynamic trajectory of individual PQMs. It induces evident deformations in the characteristic pinched hysteresis loops, with the degradation of non-Markovian memory effects being particularly pronounced at shorter integration times. To further evaluate system-level implications, we construct a two-PQM network to execute the NARMA2 time-series prediction task. Under noiseless conditions, the network exhibits strong representation capabilities with a normalized mean square error (NMSE) of 0.0448. However, performance degrades markedly under incoherent evolution; the NMSE increases to 0.1552, 0.2567, and 0.3056 for photon loss probabilities of 0.2, 0.4, and 0.5, respectively. Furthermore, at a high photon loss probability of 0.5, extending the integration time fails to compensate for the degradation and instead exacerbates the prediction error. These findings indicate that photon loss impairs both individual device dynamics and network-level processing, emphasizing the critical need for loss-tolerant architectures in deploying PQM networks.
Full article
(This article belongs to the Special Issue Quantum Algorithms and Quantum Machine Learning)
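The NARMA2 benchmark and the NMSE score used above are both short to state. The recurrence below is the commonly used second-order NARMA form; the input range and series length are arbitrary choices, and no PQM model is involved:

```python
import numpy as np

rng = np.random.default_rng(0)
u = rng.uniform(0.0, 0.5, 300)           # exogenous input sequence

# Common second-order NARMA recurrence driven by u.
y = np.zeros_like(u)
for t in range(1, u.size - 1):
    y[t + 1] = 0.4*y[t] + 0.4*y[t]*y[t - 1] + 0.6*u[t]**3 + 0.1

def nmse(target, pred):
    """Normalized mean square error: 0 is a perfect prediction,
    1 matches a constant guess at the target mean."""
    return float(np.sum((target - pred)**2) / np.sum((target - target.mean())**2))
```

On this scale the reported noiseless NMSE of 0.0448 means the two-PQM network explains over 95% of the target variance, while the degradation to 0.3056 at 0.5 photon loss leaves it closer to a mean-level guess.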
Open Access Article
DAGs and GRaSP Causal Inference Algorithms Combined and Applied to the Calculation of Insulin Bolus in Patients with Type 1 Diabetes
by
Rocío Contreras-Jiménez, Juan Carlos Olivares-Rojas, Adriana del Carmen Téllez-Anguiano, Jesús Eduardo Alcaráz-Chávez, José Antonio Gutiérrez-Gnecchi and Enrique Reyes-Archundia
Entropy 2026, 28(5), 506; https://doi.org/10.3390/e28050506 - 1 May 2026
Abstract
Type 1 diabetes mellitus (T1DM) is a chronic, non-preventable, and incurable disease that requires lifelong insulin administration. The principal challenge is calculating the prandial insulin bolus to avoid hypoglycemia and hyperglycemia. Traditional bolus calculators are based on a limited number of variables, yet glucose levels emerge from complex interactions among many factors, such as carbohydrate intake, physical activity, mood, and contextual factors. While recent artificial intelligence (AI) approaches have shown promise in glucose prediction, most remain correlational and offer limited interpretability for clinical decision support. This study evaluates a causal inference-based framework for insulin bolus calculation using Directed Acyclic Graphs (DAGs) and the Greedy Relaxation of the Sparsest Permutation (GRaSP). Historical data from individuals with T1DM were analyzed, incorporating domain knowledge constraints to guide structure learning. A bootstrap-based stability analysis was conducted to evaluate the robustness of inferred relationships. Results show that integrating prior medical knowledge reduces graph complexity and improves interpretability. However, bootstrap stability reflects the robustness of the learning procedure rather than causal validity: the proposed framework is suited to generating plausible causal hypotheses, not to confirming causal relationships, and further refutation-based validation using conditional independence testing, equivalence class analysis, and temporal causal methods is required.
Full article
(This article belongs to the Special Issue Causal Graphical Models and Their Applications, 2nd Edition)
Open Access Article
Multi-Stage Robust Bayesian High-Resolution Identification of Asynchronous Blade Vibrations Using Blade Tip Timing
by
Qinglei Zhang and Xiwen Chen
Entropy 2026, 28(5), 505; https://doi.org/10.3390/e28050505 - 30 Apr 2026
Abstract
Blade Tip Timing (BTT) is an essential non-contact technique for monitoring vibrations in rotating machinery, but its practical accuracy is often degraded by noise, undersampling, and spectral leakage. This paper proposes a multi-stage robust Bayesian high-resolution identification framework that systematically addresses these challenges. A recursive digital algorithm based on Kalman filtering estimates the rotational speed without requiring once-per-revolution probes, effectively suppressing sensor noise. An attention-enhanced dynamic convolutional autoencoder then generates channel-specific window functions to minimize spectral leakage. The core identification algorithm extracts phases via all-phase FFT and employs sub-bin interpolation to overcome the resolution limitation of conventional FFT. A Tukey-biweight-based robust aggregation strategy is used to suppress the influence of abnormal or unequal-quality sensor channels during multi-channel phase fusion. A Bayesian prior distribution over the vibration order guides the estimation toward physically plausible values under noisy conditions. Finally, a coarse-to-fine multi-stage search strategy drastically reduces computational burden while preserving accuracy. Experiments on a rotor-blade test bench at constant and variable speeds show that the method reduces the noise floor by about 60 dB, achieves a maximum frequency identification error of 7.84%, and accelerates the search by approximately 48.6% compared to exhaustive search. The proposed method provides a reliable and efficient solution for blade health monitoring.
Full article
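The Tukey-biweight aggregation step described above can be sketched as an iteratively reweighted location estimate: samples far from the current estimate, measured in units of a MAD-based scale, receive zero weight. The tuning constant 4.685 is the conventional choice for the biweight; the toy "channel" data are invented, and the paper's multi-channel phase fusion is not reproduced:

```python
import numpy as np

def tukey_biweight_mean(x, c=4.685, iters=20):
    """Robust location estimate: points beyond c * scale get zero weight."""
    x = np.asarray(x, dtype=float)
    mu = np.median(x)
    for _ in range(iters):
        mad = np.median(np.abs(x - mu))
        s = mad / 0.6745 if mad > 0 else 1.0          # MAD-consistent scale
        u = (x - mu) / (c * s)
        w = np.where(np.abs(u) < 1.0, (1.0 - u**2)**2, 0.0)
        mu = float(np.sum(w * x) / np.sum(w))
    return mu

# One wildly off channel barely moves the robust estimate.
phases = np.array([1.00, 1.10, 0.90, 1.05, 0.95, 50.0])
mu = tukey_biweight_mean(phases)
```

A plain mean of these values sits near 9.2, dragged by the outlier, whereas the biweight estimate stays near 1.0, which is the behaviour wanted when fusing unequal-quality sensor channels.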

Open Access Article
Measurement of China’s “External Market Provider” Role: Trade-Margin Decomposition and Gravity Determinants
by
Manru Zhao and Yujia Lu
Entropy 2026, 28(5), 504; https://doi.org/10.3390/e28050504 - 30 Apr 2026
Abstract
This paper measures China’s role as an “external market provider” by quantifying, for 168 source countries during 2001–2022, the share of each country’s exports absorbed by China and decomposing that share into extensive (product coverage), quantity, and price margins using the Hummels–Klenow framework. To characterize destination-market concentration, we construct an HHI-based network diversification indicator from export-destination shares and interpret it from a complementary information-theoretic perspective, where higher concentration corresponds to lower diversification and stronger dependence. We document the dynamics of China’s market-provision role and estimate an extended gravity-type model with country- and year-fixed effects. The results show that China’s external market-provider role expanded markedly after WTO accession, with growth driven mainly by the quantity margin and, after 2018, increasingly supported by the price margin. Economic proximity and similarity in global value-chain position are associated with stronger China-absorption shares, while greater destination concentration relative to China is associated with lower China-absorption shares. Free trade agreements are linked to stronger, more extensive, and larger margins. Robustness checks based on lagged covariates, additional controls, higher-dimensional fixed effects, Tobit estimation, and winsorization support the main findings. Overall, the paper provides a replicable framework for measuring destination-market pull and shows how China’s import-side role varies across products, regions, and development groups, while using the information-theoretic perspective as a supplementary interpretation of diversification patterns rather than as a separate empirical tool.
Full article
(This article belongs to the Special Issue Entropy-Based Applications in Economics, Finance, and Management, 4th Edition)
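The HHI-based concentration indicator and its information-theoretic complement mentioned above are straightforward to compute from export-destination shares. A sketch with an invented share vector (the paper's country-level data are not reproduced):

```python
import numpy as np

def hhi(shares):
    """Herfindahl-Hirschman index of destination shares: sum of squared
    shares; higher means more concentrated (less diversified)."""
    s = np.asarray(shares, dtype=float)
    s = s / s.sum()
    return float(np.sum(s ** 2))

def shannon_diversity(shares):
    """Shannon entropy of the same shares: higher means more diversified,
    the information-theoretic complement of the HHI."""
    s = np.asarray(shares, dtype=float)
    s = s / s.sum()
    s = s[s > 0]
    return float(-np.sum(s * np.log(s)))

concentrated = [0.85, 0.05, 0.05, 0.05]   # one dominant destination market
diversified = [0.25, 0.25, 0.25, 0.25]
```

For k equal destinations the HHI attains its minimum 1/k and the entropy its maximum ln k, so the two indicators rank the same share vectors in opposite directions.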
Open Access Article
TraceLAB: A MATLAB Toolbox for Interindividual Synchrony Analysis of Facial Expression and Head Movement Data Acquired via Trace
by
Felix Carter, Mike Richardson, Danaë Stanton Fraser and Iain D. Gilchrist
Entropy 2026, 28(5), 503; https://doi.org/10.3390/e28050503 - 29 Apr 2026
Abstract
Facial expressions transmit information about internal states, both during social interaction and in response to shared stimuli such as films. When individuals view the same content, synchrony in their expressions reflects shared information processing, and the degree to which their expressions correlate indicates how similarly their perceptual and affective systems are responding to the common input. This makes interindividual expression synchrony a potential marker of engagement and subjective experience. However, the acquisition and analysis of facial data pose both ethical and technical challenges to researchers. ‘Trace’ is a research media player implemented in PsychoPy’s online platform Pavlovia, which captures anonymised facial landmark coordinates through a webcam, without the ethical and technical constraints of capturing and storing video images of participants. Nonetheless, its usefulness is currently limited due to the lack of available preprocessing and analysis tools. This paper describes the functionality of TraceLAB, a MATLAB-based toolbox designed for the preprocessing of Trace data: specifically, the formatting, aligning, and filtering of data. In addition, TraceLAB implements some novel analysis techniques to allow researchers to quantify interindividual synchrony of expressions (through correlated component analysis) and head movements (through Surrogate Synchrony), which may be interpreted as measures of shared information processing. These techniques are demonstrated here on both simulated and real datasets.
Full article
(This article belongs to the Special Issue Synchronization and Information Patterns in Human Dynamics)
Open Access Article
Efficient Estimation Methods for the QR Distribution with Type-II Censored Data: An Empirical Validation on Lung Cancer Prognosis
by
Qasim Ramzan, Muhammad Amin, Shuhrah Alghamdi and Randa Alharbi
Entropy 2026, 28(5), 502; https://doi.org/10.3390/e28050502 - 29 Apr 2026
Abstract
The QR distribution, recently introduced for modeling lifetime data under Type-II censoring, offers a flexible framework for survival and reliability analysis. This study provides the first comprehensive evaluation of multiple modern estimation techniques for the QR distribution under Type-II censoring. We systematically compare classical maximum likelihood estimation with stochastic gradient descent variants (Momentum and Adam), Bayesian approaches including Maximum A Posteriori estimation, Markov Chain Monte Carlo, and Variational Inference, as well as machine learning-integrated methods such as amortized neural network inference. Using both synthetic data and the real Veterans’ Administration Lung Cancer dataset, we evaluate these methods in terms of parameter estimation accuracy, computational efficiency, and convergence behavior. The results demonstrate the strengths of optimization-based, Bayesian, and neural approaches, highlighting their practical utility in handling complex censored survival data. This research validates the distribution’s effectiveness in capturing survival dynamics, offering valuable insights for clinical applications and highlighting areas for methodological improvement.
Full article
(This article belongs to the Special Issue Statistical Planning, Inference, and Decision Making in High-Dimensional Data Analysis)
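The estimation problem in the abstract above can be made concrete with a minimal sketch of maximum likelihood under Type-II censoring, where only the first r of n ordered failure times are observed. Since the QR distribution's density is not reproduced here, an exponential lifetime model stands in, and the sample sizes, true rate, and closed-form check are illustrative assumptions, not the paper's setup.

```python
import numpy as np
from scipy.optimize import minimize_scalar

rng = np.random.default_rng(0)
n, r = 100, 70                 # n units on test, stop at the r-th failure
true_rate = 0.5
sample = np.sort(rng.exponential(1 / true_rate, n))[:r]  # first r order statistics

def neg_log_lik(rate):
    # Type-II censored log-likelihood (up to a constant):
    # r observed failure densities plus (n - r) survivors at the r-th failure time
    if rate <= 0:
        return np.inf
    log_f = np.log(rate) - rate * sample   # log density at each observed failure
    log_S = -rate * sample[-1]             # log survival at x_(r)
    return -(log_f.sum() + (n - r) * log_S)

res = minimize_scalar(neg_log_lik, bounds=(1e-6, 10.0), method="bounded")
```

For the exponential stand-in the MLE has a closed form, r divided by the total time on test, which makes the numerical optimum easy to sanity-check; for a distribution without a closed form (as here), the same censored log-likelihood would simply be handed to the optimizer.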
Open Access Article
Thermal-State Continuous-Variable Quantum Key Distribution Under the Effects of Gravity
by
Li Zhang, Jiannan Huang and Jian Zhou
Entropy 2026, 28(5), 501; https://doi.org/10.3390/e28050501 - 28 Apr 2026
Abstract
Continuous-variable quantum key distribution (CVQKD) has gradually shifted from optical-fiber communication to space communication, where the influence of gravity cannot be ignored. This study therefore assesses the efficacy of the thermal-state CVQKD protocol in a non-inertial reference frame, in contrast to the conventional scenario in an inertial frame, where pure vacuum Gaussian states are employed and gravitational effects are neglected. We examine the potential and challenges of quantum key distribution in the presence of gravitational effects, and analyze the feasibility of key generation under gravity through the lens of quantum state transfer in a non-inertial reference frame. The study presents a comprehensive mathematical derivation and simulation of the secret key rate, showing that a positive rate can be maintained under specific conditions, together with a detailed implementation plan for thermal-state quantum key distribution in a non-inertial reference frame. It offers valuable insights into the performance of quantum communication in unconventional settings.
Full article
(This article belongs to the Special Issue Recent Advances in Continuous-Variable Quantum Key Distribution)
Open Access Article
Support Size of ε-Capacity-Achieving Inputs for the Amplitude-Constrained AWGN Channel
by
Luca Barletta and Alex Dytso
Entropy 2026, 28(5), 500; https://doi.org/10.3390/e28050500 - 28 Apr 2026
Abstract
We study the discrete-time amplitude-constrained additive white Gaussian noise (AWGN) channel from the perspective of near-optimal input distributions in the high-SNR, or equivalently large-amplitude, regime. While it is known that the capacity-achieving input is discrete with finitely many mass points, the precise scaling of its support size as a function of the amplitude constraint remains an open problem. In this work, we instead consider the minimal support size required to achieve capacity up to an ε-gap: the smallest support size among discrete inputs, supported on the amplitude-constrained interval, that achieve mutual information within ε of capacity. We show that this relaxed formulation is significantly more tractable and admits sharp characterizations in several vanishing-gap regimes: we establish the asymptotic growth of the minimal support size for polynomially decaying gaps, and upper and lower bounds on its order of growth for exponentially small gaps. Our approach combines approximation-theoretic bounds for Gaussian mixtures with information-theoretic control of entropy via divergence bounds, together with a wrapping argument that relates the problem to approximating the uniform distribution on a circle. Beyond the technical results, our framework provides a conceptual explanation for the variety of scaling laws observed in prior numerical studies, suggesting that these may correspond to different regimes of ε-optimality rather than intrinsic properties of the exact optimizer.
Full article
(This article belongs to the Special Issue Foundations and Frontiers of Information Theory—Dedicated to Professor H. Vincent Poor on the Occasion of His 75th Birthday)
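The ε-gap notion above is easy to probe numerically: for a discrete input over an AWGN channel, the mutual information is h(Y) − h(Y|X), where the output density is a Gaussian mixture centered at the mass points and h(Y|X) is the Gaussian noise entropy. A minimal sketch follows; the function name, the three-point input, the amplitude A = 5, and the unit noise variance are illustrative choices, not the paper's construction.

```python
import numpy as np

def mi_discrete_awgn(points, probs, noise_std=1.0, grid=4000):
    """Mutual information (in nats) of a discrete input over an AWGN channel."""
    pts = np.asarray(points, float)
    p = np.asarray(probs, float)
    # integration grid covering all mass points plus 8 noise deviations
    y = np.linspace(pts.min() - 8 * noise_std, pts.max() + 8 * noise_std, grid)
    # output density f_Y is a Gaussian mixture centered at the mass points
    dens = np.exp(-(y[:, None] - pts) ** 2 / (2 * noise_std ** 2))
    dens /= np.sqrt(2 * np.pi) * noise_std
    f_y = dens @ p
    dy = y[1] - y[0]
    h_y = -np.sum(f_y * np.log(f_y + 1e-300)) * dy       # differential entropy of Y
    h_y_given_x = 0.5 * np.log(2 * np.pi * np.e * noise_std ** 2)
    return h_y - h_y_given_x

A = 5.0                                                   # amplitude constraint (illustrative)
mi3 = mi_discrete_awgn([-A, 0.0, A], [1 / 3, 1 / 3, 1 / 3])
```

With the three points spaced five noise deviations apart, the mutual information sits just below log 3 nats, the entropy of the input itself; sweeping the number of mass points for a fixed amplitude is a direct way to observe how many points are needed before the mutual information stops improving by more than a chosen ε.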
Open Access Article
BEP-IM: A Vehicular Crowdsensing Incentive Mechanism to Drive Sustained Spatial Coverage and Proactive Sensing Shaping
by
Jiamin Zhang, Lisha Shuai, Jiuling Dong, Gaoya Dong, Xiaolong Yang and Keping Long
Entropy 2026, 28(5), 499; https://doi.org/10.3390/e28050499 - 28 Apr 2026
Abstract
In the Internet of Vehicles, vehicular crowdsensing is crucial for alleviating traffic congestion and ensuring the safety of autonomous driving. However, practical vehicular crowdsensing processes face dual challenges of skewed spatial distributions of vehicles and inadequate data quality guidance. These issues cause sensing redundancy in high-participation areas (HPAs) and coverage deficits in low-participation areas (LPAs), while also leading to unstable data quality. Given that participants’ decisions are profoundly influenced by bounded rationality and psychological preferences, this paper proposes a collaborative incentive mechanism integrating behavioral economics and psychology (BEP-IM) to drive sustained spatial coverage and proactive sensing shaping. First, to mitigate coverage deficits in LPAs, a reference-dependent two-sided selection and bidding strategy (RD-TSB) is designed to guide participants toward LPAs via a reference-driven utility evaluation. Concurrently, a loss-aversion-based sustained incentive strategy (LA-RPI) is introduced to enhance their sustained participation within LPAs by amplifying loss perception. Furthermore, to overcome weak data quality constraints, an operant conditioning-based proactive sensing shaping strategy (OC-SFQ) is constructed, utilizing a closed-loop mechanism of relative improvement, variable-ratio reinforcement, and association updating to drive participants to produce high-quality data. Simulation results demonstrate that the proposed mechanism effectively increases participation frequency in LPAs and optimizes sensing data quality.
Full article
(This article belongs to the Section Multidisciplinary Applications)
Open Access Article
Interplay Between Vertical and Horizontal Schemes of Computation: From Bayesian Inference to Quantum Logic via Gluing Boolean Algebras
by
Yukio-Pegio Gunji, Kyoko Nakamura, Kazuto Sasai, Iori Tani, Mayo Kuroki, Alessandro Chiolerio, Andrew Adamatzky and Andrei Khrennikov
Entropy 2026, 28(5), 498; https://doi.org/10.3390/e28050498 - 28 Apr 2026
Abstract
Artificial intelligence is typically formulated as an information-processing system composed of artificial neurons, where computation is understood as recursive operations connecting inputs and outputs. However, real neural systems are materially embodied and continuously reconfigured by metabolic and physical processes, suggesting that computation cannot be reduced to fixed causal structures. In this paper, we propose a theoretical framework that captures the interplay between informational and material processes as the interaction between two computational schemes: a vertical scheme, representing fixed cause–effect relations, and a horizontal scheme, representing transformations between such relations. We show that the vertical scheme corresponds to Bayesian inference, which updates probability distributions over a fixed hypothesis space, and is consistent with the free-energy minimization principle. In contrast, the horizontal scheme is formalized as inverse Bayesian inference, which modifies the hypothesis space itself by updating likelihood structures based on experienced data. We further demonstrate that the interplay between these schemes can be expressed algebraically as a process of continuously gluing Boolean algebras. This construction yields a non-distributive orthomodular lattice, i.e., quantum logic, without invoking Hilbert space formalism. In this view, quantum logic emerges not as a static logical system but as a structural consequence of dynamically reconfiguring causal contexts. This framework provides a unified perspective in which inference is understood not only as optimization within a fixed model but also as a process that generates and transforms the model itself. It offers a formal basis for describing open-ended computation and suggests a connection to approaches such as unconventional computing and Natural Born Intelligence, where computational structures evolve through interaction with material processes. Unlike existing approaches, this framework derives quantum-logic-like structure from the continual reconfiguration of causal contexts rather than from Hilbert-space assumptions or optimization within a fixed hypothesis space.
Full article
(This article belongs to the Special Issue Quantum Information and Probability: From Foundations to Engineering IV)
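The vertical/horizontal distinction in the abstract above can be caricatured in a few lines: a Bayesian update moves probability mass over a fixed hypothesis space, while an inverse-Bayesian step reshapes the likelihood table itself. This is only a toy sketch with a discrete hypothesis space and an illustrative additive "nudge" rule for the inverse step, not the authors' formalism.

```python
import numpy as np

def bayes_update(prior, likelihood, datum):
    # vertical scheme: posterior over a FIXED hypothesis space,
    # P(h | d) proportional to P(h) * P(d | h)
    post = prior * likelihood[:, datum]
    return post / post.sum()

def inverse_bayes_update(likelihood, datum, rate=0.1):
    # horizontal scheme (toy version): nudge each hypothesis's
    # likelihood toward the observed datum and renormalize rows,
    # i.e. the model itself is reshaped by the data
    lik = likelihood.copy()
    lik[:, datum] += rate
    return lik / lik.sum(axis=1, keepdims=True)

# two hypotheses over a binary observable
prior = np.array([0.5, 0.5])
lik = np.array([[0.9, 0.1],    # h0: datum 0 is likely
                [0.2, 0.8]])   # h1: datum 1 is likely
post = bayes_update(prior, lik, datum=0)       # vertical step on observing 0
lik2 = inverse_bayes_update(lik, datum=0)      # horizontal step on observing 0
```

After observing datum 0, the vertical step shifts belief toward h0 while leaving the likelihood table untouched; the horizontal step instead leaves the belief alone and raises every hypothesis's likelihood of datum 0, which is the sense in which it transforms the hypothesis space rather than searching within it.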
Open Access Article
Improved Passive State Preparation–Continuous Variable Quantum Key Distribution Scheme Based on Non-Gaussian Operations
by
Hao Luo, Yijun Wang, Hang Zhang and Jiajia Zhong
Entropy 2026, 28(5), 497; https://doi.org/10.3390/e28050497 - 27 Apr 2026
Abstract
The passive state preparation–continuous-variable quantum key distribution (PSP-CVQKD) protocol inherits the high secret key rate (SKR) of CVQKD while overcoming the complex modulation equipment required by GMCS-CVQKD, and has received extensive experimental attention in recent years. Even so, its short transmission distance remains a prominent issue. In this paper, a scheme for introducing non-Gaussian operations into PSP-CVQKD over optical-fiber links is proposed. We derive the input–output relationship of the system, as well as formulas for the success probability and the SKR, when non-Gaussian operations are introduced at the two sides of the channel, respectively. The results indicate that the improved PSP-CVQKD scheme is feasible, enhances the SKR, and can effectively increase the transmission distance. Our scheme provides useful ideas for further in-depth research on non-Gaussian operations and on improving the performance of other PSP-CVQKD protocols.
Full article
(This article belongs to the Special Issue Recent Advances in Continuous-Variable Quantum Key Distribution)
Journal Menu
- Entropy Home
- Aims & Scope
- Editorial Board
- Reviewer Board
- Topical Advisory Panel
- Video Exhibition
- Instructions for Authors
- Special Issues
- Topics
- Sections & Collections
- Article Processing Charge
- Indexing & Archiving
- Editor’s Choice Articles
- Most Cited & Viewed
- Journal Statistics
- Journal History
- Journal Awards
- Society Collaborations
- Conferences
- Editorial Office
Journal Browser
Forthcoming issue
Current issue - Vol. 28 (2026)
- Vol. 27 (2025)
- Vol. 26 (2024)
- Vol. 25 (2023)
- Vol. 24 (2022)
- Vol. 23 (2021)
- Vol. 22 (2020)
- Vol. 21 (2019)
- Vol. 20 (2018)
- Vol. 19 (2017)
- Vol. 18 (2016)
- Vol. 17 (2015)
- Vol. 16 (2014)
- Vol. 15 (2013)
- Vol. 14 (2012)
- Vol. 13 (2011)
- Vol. 12 (2010)
- Vol. 11 (2009)
- Vol. 10 (2008)
- Vol. 9 (2007)
- Vol. 8 (2006)
- Vol. 7 (2005)
- Vol. 6 (2004)
- Vol. 5 (2003)
- Vol. 4 (2002)
- Vol. 3 (2001)
- Vol. 2 (2000)
- Vol. 1 (1999)
Highly Accessed Articles
Latest Books
E-Mail Alert
News
Topics
Topic in
Algorithms, Energies, Entropy, MAKE, Materials, Laboratories, Complexities
AI and Computational Methods for Modelling, Simulations and Optimizing of Advanced Systems: Innovations in Complexity, 2nd Edition
Topic Editors: Jaroslaw Krzywanski, Marcin Sosnowski, Karolina Grabowska, Dorian Skrobek, Anna Zylka, Agnieszka Kijo-Kleczkowska, Bashar Shboul, Tomasz Czakiert
Deadline: 30 June 2026
Topic in
Atmosphere, Earth, Encyclopedia, Entropy, Fractal Fract, MAKE, Meteorology
Revisiting Butterfly Effect, Multiscale Dynamics, and Predictability Using AI-Enhanced Modeling Framework (AEMF) and Chaos Theory
Topic Editors: Bo-Wen Shen, Roger A. Pielke Sr., Xubin Zeng
Deadline: 31 July 2026
Topic in
Applied Sciences, Computers, Energies, Entropy, Mathematics
Numerical Methods and Computer Simulations in Energy Analysis, 3rd Edition
Topic Editors: Marcin Kamiński, Mateus Mendes
Deadline: 31 August 2026
Topic in
AI, Applied Sciences, Computers, Electronics, Entropy, Future Internet, Information, IoT, Sensors, Telecom
Advances in Sixth Generation and Beyond (6G&B)
Topic Editors: Luis Javier García Villalba, Ana Lucila Sandoval Orozco
Deadline: 31 October 2026
Conferences
30 October–3 November 2026
The 1st International Conference on Modern Mathematical Physics (ICMMP 2026)

Special Issues
Special Issue in
Entropy
Advances in Entropy and Computational Fluid Dynamics, 2nd Edition
Guest Editors: Jorge Arturo Alfaro-Ayala, José de Jesús Ramírez-Minguela
Deadline: 15 May 2026
Special Issue in
Entropy
Rethinking Representation Learning in the Age of Large Models
Guest Editors: Yuhang Liu, Xinyu Zhang, Qingsen Yan
Deadline: 15 May 2026
Special Issue in
Entropy
Decoding Earthquake Complexity: From Earthquake Ruptures and Slip Styles to Seismic Sequences and Faulting
Guest Editors: Davide Zaccagnino, Filippos Vallianatos, Robert Shcherbakov
Deadline: 15 May 2026
Special Issue in
Entropy
Nonlinear Dynamics of Complex Systems
Guest Editors: Alina Cristiana Gavriluţ, Maricel Agop
Deadline: 15 May 2026
Topical Collections
Topical Collection in
Entropy
Advances in Applied Statistical Mechanics
Collection Editor: Antonio M. Scarfone
Topical Collection in
Entropy
Foundations of Statistical Mechanics
Collection Editor: Antonio M. Scarfone
Topical Collection in
Entropy
Advances in Integrated Information Theory
Collection Editors: Larissa Albantakis, Matteo Grasso, Andrew Haun



