Search Results (156)

Search Parameters:
Keywords = symbolic entropy

26 pages, 1838 KB  
Article
Modeling the Emergence of Insight via Quantum Interference on Semantic Graphs
by Arianna Pavone and Simone Faro
Mathematics 2025, 13(19), 3171; https://doi.org/10.3390/math13193171 - 3 Oct 2025
Abstract
Creative insight is a core phenomenon of human cognition, often characterized by the sudden emergence of novel and contextually appropriate ideas. Classical models based on symbolic search or associative networks struggle to capture the non-linear, context-sensitive, and interference-driven aspects of insight. In this work, we propose a computational model of insight generation grounded in continuous-time quantum walks over weighted semantic graphs, where nodes represent conceptual units and edges encode associative relationships. By exploiting the principles of quantum superposition and interference, the model enables the probabilistic amplification of semantically distant but contextually relevant concepts, providing a plausible account of non-local transitions in thought. The model is implemented using standard Python 3.10 libraries and is available both as an interactive fully reproducible Google Colab notebook and a public repository with code and derived datasets. Comparative experiments on ConceptNet-derived subgraphs, including the Candle Problem, 20 Remote Associates Test triads, and Alternative Uses, show that, relative to classical diffusion, quantum walks concentrate more probability on correct targets (higher AUC and peaks reached earlier) and, in open-ended settings, explore more broadly and deeply (higher entropy and coverage, larger expected radius, and faster access to distant regions). These findings are robust under normalized generators and a common time normalization, align with our formal conditions for transient interference-driven amplification, and support quantum-like dynamics as a principled process model for key features of insight. Full article
(This article belongs to the Section E1: Mathematics and Computer Science)
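The abstract above contrasts continuous-time quantum walks with classical diffusion on weighted semantic graphs. As a rough, self-contained illustration of that comparison (not the authors' released Colab code), the sketch below evolves both processes on a small invented concept graph; the five-node weights, the choice of the graph Laplacian as generator, and the start/target nodes and time are all assumptions made for the example.

```python
# Minimal sketch (not the authors' released notebook): continuous-time quantum
# walk vs. classical diffusion on a small weighted "concept" graph. The 5-node
# weights, the Laplacian as generator, the start/target nodes and the time t
# are illustrative assumptions.
import numpy as np
from scipy.linalg import expm

# Symmetric weighted adjacency matrix of a toy semantic graph.
A = np.array([[0.0, 1.0, 0.2, 0.0, 0.0],
              [1.0, 0.0, 0.8, 0.1, 0.0],
              [0.2, 0.8, 0.0, 0.9, 0.3],
              [0.0, 0.1, 0.9, 0.0, 1.0],
              [0.0, 0.0, 0.3, 1.0, 0.0]])
L = np.diag(A.sum(axis=1)) - A          # graph Laplacian (common choice of generator)

start, target, t = 0, 4, 2.0            # cue node, semantically distant node, time

# Classical diffusion: p(t) = exp(-L t) p(0).
p0 = np.zeros(5)
p0[start] = 1.0
p_classical = expm(-L * t) @ p0

# Continuous-time quantum walk: psi(t) = exp(-i L t) psi(0), probabilities |psi|^2.
psi0 = np.zeros(5, dtype=complex)
psi0[start] = 1.0
psi = expm(-1j * L * t) @ psi0
p_quantum = np.abs(psi) ** 2

print(f"P(target) classical: {p_classical[target]:.3f}")
print(f"P(target) quantum:   {p_quantum[target]:.3f}")
```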
17 pages, 618 KB  
Article
Advancing Sustainable Development Goal 4 through Green Education: A Multidimensional Assessment of Turkish Universities
by Bediha Sahin
Sustainability 2025, 17(19), 8800; https://doi.org/10.3390/su17198800 - 30 Sep 2025
Abstract
In this study, we provide, to our knowledge, one of the first multidimensional, data-driven evaluations of green education performance in Turkish higher education, combining the THE Education Score, THE Impact Score, and the UI GreenMetric Education & Research Score (GM-ED) with institutional characteristics, and situating the analysis within SDG 4 (Quality Education). While universities worldwide increasingly integrate sustainability into their missions, systematic evidence from middle-income systems remains scarce. To address this gap, we compile a dataset of 50 Turkish universities combining three global indicators—the Times Higher Education (THE) Education Score, THE Impact Score, and the UI GreenMetric Education & Research Score (GM-ED)—with institutional characteristics such as ownership and student enrollment. We employ descriptive statistics; correlation analysis; robust regression models; composite indices under equal, PCA, and entropy-based weighting; and exploratory k-means clustering. Results show that integration of sustainability into curricula and research is the most consistent predictor of SDG-oriented performance, while institutional size and ownership exert limited influence. In addition, we propose composite indices (GECIs). GECIs confirm stable top performers across methods, but mid-ranked universities are volatile, indicating that governance and strategic orientation matter more than structural capacity. The study contributes to international debates by framing green education as both a measurable indicator and a transformative institutional practice. For Türkiye, our findings highlight the need to move beyond symbolic initiatives toward systemic reforms that link accreditation, funding, and governance with green education outcomes. More broadly, we demonstrate how universities in middle-income contexts can institutionalize sustainability and provide a replicable framework for assessing progress toward SDG 4. Full article
(This article belongs to the Special Issue Sustainable Education for All: Latest Enhancements and Prospects)
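The study above builds composite indices under equal, PCA, and entropy-based weighting. The snippet below sketches only the entropy-weighting step on invented indicator values; it is not the paper's dataset or pipeline, and the min-max normalization is one common convention among several.

```python
# Minimal sketch (not the paper's dataset or pipeline): entropy-based weighting
# of three indicators into a composite index. The five example universities and
# their scores are invented; min-max normalization is one common convention.
import numpy as np

# Rows: institutions; columns: e.g. THE Education, THE Impact, GM-ED (toy values).
X = np.array([[72.0, 81.0, 65.0],
              [55.0, 60.0, 70.0],
              [68.0, 75.0, 80.0],
              [40.0, 52.0, 45.0],
              [60.0, 90.0, 55.0]])

Z = (X - X.min(axis=0)) / (X.max(axis=0) - X.min(axis=0) + 1e-12)   # min-max normalize
P = (Z + 1e-12) / (Z + 1e-12).sum(axis=0)                           # column shares

m = X.shape[0]
entropy = -(P * np.log(P)).sum(axis=0) / np.log(m)   # entropy of each indicator
weights = (1 - entropy) / (1 - entropy).sum()        # more dispersed -> more weight
composite = Z @ weights                              # entropy-weighted composite index

print("indicator weights:", np.round(weights, 3))
print("composite scores: ", np.round(composite, 3))
```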
21 pages, 1247 KB  
Article
ERLD-HC: Entropy-Regularized Latent Diffusion for Harmony-Constrained Symbolic Music Generation
by Yang Li
Entropy 2025, 27(9), 901; https://doi.org/10.3390/e27090901 - 25 Aug 2025
Viewed by 648
Abstract
Deep learning models have recently made remarkable progress in symbolic music generation. However, existing methods often violate musical rules, and their control of harmonic structure in particular is relatively weak. To address these limitations, this paper proposes a novel framework, Entropy-Regularized Latent Diffusion for Harmony-Constrained symbolic music generation (ERLD-HC), which combines a variational autoencoder (VAE) and latent diffusion models with an entropy-regularized conditional random field (CRF). The model first encodes symbolic music into latent representations through the VAE and then introduces the entropy-based CRF module into the cross-attention layer of the UNet during the diffusion process, achieving harmonic conditioning. The proposed model addresses two key limitations of symbolic music generation: the weak theoretical correctness of purely algorithm-driven methods and the limited flexibility of rule-based methods. In particular, the CRF module learns classic harmony rules through learnable feature functions, significantly improving the harmonic quality of the generated Musical Instrument Digital Interface (MIDI) output. Experiments on the Lakh MIDI dataset show that, compared with the baseline VAE+Diffusion model, the harmony-rule violation rates of ERLD-HC under self-generated and controlled inputs decreased by 2.35% and 1.4%, respectively. Meanwhile, the MIDI generated by the model maintains a high degree of melodic naturalness. Importantly, the harmonic guidance in ERLD-HC is derived from an internal CRF inference module, which enforces consistency with music-theoretic priors. While this does not yet provide direct external chord conditioning, it introduces a form of learned harmonic controllability that balances flexibility and theoretical rigor. Full article
(This article belongs to the Section Multidisciplinary Applications)
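The CRF-inside-cross-attention module described above is too involved for a short excerpt, so the sketch below illustrates only the entropy-regularization idea on a toy chord distribution; the rule-violation penalty, the distribution, and the regularization strength are invented for illustration and are not taken from the paper.

```python
# Heavily simplified sketch of the entropy-regularization idea only; the
# paper's CRF module inside the UNet cross-attention layers is not reproduced.
# The chord distribution, the rule-violation penalty, and lam are invented.
import numpy as np

def entropy(p):
    p = np.clip(p, 1e-12, 1.0)
    return -(p * np.log(p)).sum()

# Toy predicted distribution over four candidate chords for one time step.
chord_probs = np.array([0.70, 0.15, 0.10, 0.05])

rule_penalty = 0.8    # stand-in for a harmony-rule violation score from a CRF
lam = 0.1             # entropy-regularization strength (assumed)

# The entropy term discourages overconfident choices that violate harmony rules.
regularized_loss = rule_penalty - lam * entropy(chord_probs)
print(f"H = {entropy(chord_probs):.3f}, regularized loss = {regularized_loss:.3f}")
```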
16 pages, 8860 KB  
Article
Research on Rural Landscape Emotions Based on EEG Technology and VIKOR-GRA Model: A Case Study of Xiedian Ancient Village in Macheng City
by Xinyu Yan and Yifei Li
Buildings 2025, 15(17), 3002; https://doi.org/10.3390/buildings15173002 - 23 Aug 2025
Viewed by 466
Abstract
This study integrates EEG technology with the VIKOR-GRA model to construct a quantitative method for assessing emotional responses to rural landscapes. Taking 94 scenes from Xiedian Ancient Village in Macheng City, Hubei Province, as the research objects, arousal (Arousal) and valence (Valence) were calculated based on the power ratio of α and β frequency bands. The entropy weight method was employed to determine weights and compute group utility value (S), individual regret value (R), and compromise solution (Q). The results indicate that 16 scenes had Q values > 0.75 (Grade IV), reflecting poor emotional experiences, with significantly lower arousal (−2.15 ± 0.38) and valence (−0.87 ± 1.02). Vegetation morphology and water visibility were identified as the primary limiting factors, while graphic symbols and historical culture exhibited strong positive feedback. Optimization strategies are proposed, providing a quantifiable technical pathway for the renewal of rural heritage landscapes. Full article
(This article belongs to the Section Architectural Design, Urban Science, and Real Estate)
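For readers unfamiliar with the entropy-weighted VIKOR scores S, R, and Q mentioned above, the sketch below computes them for a few invented scenes described only by arousal and valence; the EEG preprocessing and the α/β power-ratio step are omitted, and v = 0.5 is the conventional compromise weight, not necessarily the authors' choice.

```python
# Minimal sketch (not the authors' pipeline): entropy-weighted VIKOR scores
# S, R and Q for a few scenes described only by arousal and valence. The
# scene values are invented, both criteria are treated as "higher is better",
# and v = 0.5 is the conventional compromise weight.
import numpy as np

# Rows: scenes; columns: [arousal, valence] (toy values).
X = np.array([[ 1.2,  0.8],
              [-2.1, -0.9],
              [ 0.3,  0.1],
              [ 2.0,  1.5]])

# Entropy weight method on min-max normalized criteria.
Z = (X - X.min(axis=0)) / (X.max(axis=0) - X.min(axis=0) + 1e-12)
P = (Z + 1e-12) / (Z + 1e-12).sum(axis=0)
e = -(P * np.log(P)).sum(axis=0) / np.log(X.shape[0])
w = (1 - e) / (1 - e).sum()

# VIKOR: group utility S, individual regret R, compromise score Q.
f_best, f_worst = X.max(axis=0), X.min(axis=0)
gap = w * (f_best - X) / (f_best - f_worst + 1e-12)
S, R = gap.sum(axis=1), gap.max(axis=1)
v = 0.5
Q = (v * (S - S.min()) / (S.max() - S.min() + 1e-12)
     + (1 - v) * (R - R.min()) / (R.max() - R.min() + 1e-12))
print("S:", np.round(S, 3), "R:", np.round(R, 3), "Q:", np.round(Q, 3))
```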
23 pages, 374 KB  
Article
Empirical Lossless Compression Bound of a Data Sequence
by Lei M. Li
Entropy 2025, 27(8), 864; https://doi.org/10.3390/e27080864 - 14 Aug 2025
Viewed by 1007
Abstract
We consider the lossless compression bound of any individual data sequence. Conceptually, its Kolmogorov complexity is such a bound yet uncomputable. According to Shannon’s source coding theorem, the average compression bound is $nH$, where $n$ is the number of words and $H$ is the entropy of an oracle probability distribution characterizing the data source. The quantity $nH(\hat{\theta}_n)$ obtained by plugging in the maximum likelihood estimate is an underestimate of the bound. Shtarkov showed that the normalized maximum likelihood (NML) distribution is optimal in a minimax sense for any parametric family. Fitting a data sequence—without any a priori distributional assumption—by a relevant exponential family, we apply the local asymptotic normality to show that the NML code length is $nH(\hat{\theta}_n) + \frac{d}{2}\log\frac{n}{2\pi} + \log\int_{\Theta}|I(\theta)|^{1/2}\,d\theta + o(1)$, where $d$ is dictionary size, $|I(\theta)|$ is the determinant of the Fisher information matrix, and $\Theta$ is the parameter space. We demonstrate that sequentially predicting the optimal code length for the next word via a Bayesian mechanism leads to the mixture code whose length is given by $nH(\hat{\theta}_n) + \frac{d}{2}\log\frac{n}{2\pi} + \log\frac{|I(\hat{\theta}_n)|^{1/2}}{w(\hat{\theta}_n)} + o(1)$, where $w(\theta)$ is a prior. The asymptotics apply to not only discrete symbols but also continuous data if the code length for the former is replaced by the description length for the latter. The analytical result is exemplified by calculating compression bounds of protein-encoding DNA sequences under different parsing models. Typically, compression is maximized when parsing aligns with amino acid codons, while pseudo-random sequences remain incompressible, as predicted by Kolmogorov complexity. Notably, the empirical bound becomes more accurate as the dictionary size increases. Full article
(This article belongs to the Section Information Theory, Probability and Statistics)
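As a concrete reading of the bound above (not the paper's code), the sketch below evaluates the plug-in term $nH(\hat{\theta}_n)$ plus the $\frac{d}{2}\log\frac{n}{2\pi}$ complexity term, in bits, for a toy symbol sequence under a categorical model; the family-specific log-integral of $|I(\theta)|^{1/2}$ is omitted, and taking $d$ as the number of free categorical parameters is an interpretive assumption.

```python
# Minimal sketch (not the paper's code): the plug-in term n*H(theta_hat) plus
# the (d/2)*log2(n / (2*pi)) complexity term of the bound above, in bits, for
# a toy symbol sequence under a categorical model. The family-specific
# log-integral of |I(theta)|^(1/2) is omitted, and d is taken as the number of
# free categorical parameters (one reading of "dictionary size").
import math
from collections import Counter

def empirical_bound_bits(sequence):
    n = len(sequence)
    counts = Counter(sequence)
    # Plug-in entropy H(theta_hat) of the empirical symbol distribution, in bits.
    H = -sum((c / n) * math.log2(c / n) for c in counts.values())
    d = len(counts) - 1                      # free parameters of the categorical model
    complexity = (d / 2) * math.log2(n / (2 * math.pi))
    return n * H + complexity

seq = "ATGGCCATTGTAATGGGCCGCTGAAAGGGTGCCCGATAG"   # toy DNA-like string
print(f"approximate bound: {empirical_bound_bits(seq):.1f} bits")
```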
22 pages, 7845 KB  
Article
Military Strategies of Roman Cities Establishment Based on the Space Syntax Analysis Applied to the Vestiges of Timgad
by Marouane Samir Guedouh, Kamal Youcef, Hocine Sami Belmahdi, Mohamed Amine Khadraoui and Selma Saraoui
Heritage 2025, 8(8), 324; https://doi.org/10.3390/heritage8080324 - 12 Aug 2025
Viewed by 799
Abstract
Roman cities represent the Empire’s broader approach to urban planning, characterized by geometric precision and strategic layout. Their spatial organization reflects underlying military and administrative objectives, which can be better understood through new analytical tools. This research investigates the Roman military strategy behind the establishment of Timgad, a Roman archaeological site in Algeria, using Space Syntax Analysis (SSA) to examine its spatial and urban structure. By analyzing the city’s grid-like layout and applying SSA indicators such as Connectivity, Integration, Entropy, Control, Controllability, and Through Vision (via Axial Map and Visibility Graph Analysis), the study highlights how the spatial configuration was intricately linked to military tactics aimed at asserting control and dominance. The results show high values for these indicators, especially in areas where military structures were strategically located along main roads and key urban nodes, demonstrating a deliberate effort to maintain surveillance and authority over space. This spatial configuration reveals a deep synergy between military logic and urban design, supporting the idea that Roman town planning served both functional and symbolic roles in establishing imperial authority. The study concludes that Roman military strategy was not only central to territorial expansion but also instrumental in shaping long-lasting urban models, influencing the structure of colonial cities far beyond their time. Timgad thus serves as an instructive case of how military requirements shaped the built environment of the Roman Empire. Full article
(This article belongs to the Section Archaeological Heritage)
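Space Syntax indicators such as Connectivity and Integration can be approximated directly on an axial graph. The sketch below is a toy, not the authors' workflow: it computes connectivity (degree) and a simple integration proxy (inverse mean depth) for an invented six-line axial map; dedicated space-syntax software adds RA/RRA normalization and the visibility-graph measures.

```python
# Toy sketch (not the authors' SSA workflow): connectivity and a simple
# integration proxy (inverse mean depth) on an invented six-line axial graph.
# Real analyses use dedicated space-syntax software, RA/RRA normalization and
# visibility-graph measures.
import networkx as nx

# Nodes are axial lines; an edge means two lines intersect.
G = nx.Graph([("decumanus", "cardo"), ("decumanus", "forum_axis"),
              ("cardo", "forum_axis"), ("cardo", "east_gate_st"),
              ("forum_axis", "theatre_st"), ("east_gate_st", "theatre_st")])

for node in G.nodes:
    connectivity = G.degree[node]
    depths = nx.single_source_shortest_path_length(G, node)
    mean_depth = sum(depths.values()) / (len(G) - 1)
    integration_proxy = 1.0 / mean_depth       # higher = more integrated
    print(f"{node:13s} connectivity={connectivity} "
          f"mean_depth={mean_depth:.2f} integration~{integration_proxy:.2f}")
```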
17 pages, 3234 KB  
Article
Including the Magnitude Variability of a Signal in the Ordinal Pattern Analysis
by Melvyn Tyloo, Joaquín González and Nicolás Rubido
Entropy 2025, 27(8), 840; https://doi.org/10.3390/e27080840 - 7 Aug 2025
Viewed by 730
Abstract
One of the most popular and innovative methods to analyse signals is by using Ordinal Patterns (OPs). The OP encoding is based on transforming a (univariate) signal into a symbolic sequence of OPs, where each OP represents the number of permutations needed to order a small subset of the signal’s magnitudes. This implies that OPs are conceptually clear, methodologically simple to implement, and robust to noise, and that they can be applied to short signals. Moreover, they simplify the statistical analyses that can be carried out on a signal, such as entropy and complexity quantifications. However, because of the relative ordering, information about the magnitude of the signal at each timestamp is lost—this being one of the major drawbacks of this method. Here, we propose a way to use the signal magnitudes discarded in the OP encoding as a complementary variable to its permutation entropy. To illustrate our approach, we analyse synthetic trajectories from logistic and Hénon maps—with and without added noise—and real-world signals, including intracranial electroencephalographic recordings from rats in different sleep-wake states and frequency fluctuations in power grids. Our results show that, when complementing the permutation entropy with the variability in the signal magnitudes, the characterisation of these signals is improved and the results remain explainable. This implies that our approach can be useful for feature engineering and improving AI classifiers, as typical machine learning algorithms need complementary signal features as inputs to improve classification accuracy. Full article
(This article belongs to the Special Issue Ordinal Patterns-Based Tools and Their Applications)
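A minimal version of the idea described above, ordinal-pattern encoding, permutation entropy, and a magnitude-based complement, can be sketched as follows; the embedding order D = 3, the logistic-map test signal, and the use of the within-window standard deviation as the magnitude feature are illustrative choices, not the authors' exact definitions.

```python
# Minimal sketch (not the authors' implementation): ordinal-pattern encoding,
# normalized permutation entropy, and a simple magnitude complement (the
# standard deviation inside each embedding window, i.e. the information the
# ordinal encoding discards). D = 3 and the logistic-map signal are
# illustrative choices.
import numpy as np
from collections import Counter
from math import factorial

def ordinal_patterns(x, D=3):
    windows = np.lib.stride_tricks.sliding_window_view(x, D)
    return [tuple(np.argsort(w)) for w in windows], windows

def permutation_entropy(patterns, D=3):
    counts = np.array(list(Counter(patterns).values()), dtype=float)
    p = counts / counts.sum()
    return -(p * np.log(p)).sum() / np.log(factorial(D))   # normalized to [0, 1]

# Logistic map trajectory as a test signal.
x = np.empty(2000)
x[0] = 0.4
for i in range(1999):
    x[i + 1] = 4.0 * x[i] * (1.0 - x[i])

patterns, windows = ordinal_patterns(x, D=3)
pe = permutation_entropy(patterns, D=3)
magnitude_spread = windows.std(axis=1).mean()   # amplitude information kept aside

print(f"normalized permutation entropy:      {pe:.3f}")
print(f"mean within-window magnitude spread: {magnitude_spread:.3f}")
```

In a feature-engineering setting, the two printed quantities would simply be supplied to a classifier side by side.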
17 pages, 2680 KB  
Article
Application of Shannon Entropy to Reaction–Diffusion Problems Using the Stochastic Finite Difference Method
by Marcin Kamiński and Rafał Leszek Ossowski
Entropy 2025, 27(7), 705; https://doi.org/10.3390/e27070705 - 30 Jun 2025
Viewed by 441
Abstract
In this study, we introduce Shannon entropy as a key metric for assessing concentration variability in diffusion processes. Shannon entropy quantifies the uncertainty or disorder in the spatial distribution of diffusing particles, providing a novel perspective on diffusion dynamics. This proposed approach enables a more comprehensive characterization of mixing efficiency, equilibrium states, and transient diffusion behavior. Numerical simulations performed using the finite difference method in the MAPLE 2025 symbolic computing environment illustrate how entropy evolution correlates with diffusion kinetics. The computational model used in this study is based on a previously developed framework from our earlier research, ensuring consistency and validation of the results. The findings suggest that Shannon entropy can serve as a robust descriptor of diffusion-driven mixing, with potential applications in engineering, environmental science, and biophysics. Full article
(This article belongs to the Special Issue Uncertainty Quantification and Entropy Analysis)
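The entropy tracking described above can be reproduced in miniature with an explicit finite-difference scheme; the sketch below is written in Python rather than MAPLE, drops the reaction term, and uses an invented 1-D grid, so it only illustrates how the Shannon entropy of the normalized concentration profile grows as an initial pulse spreads.

```python
# Minimal sketch (Python, not the paper's MAPLE model; reaction term omitted):
# explicit finite-difference diffusion on a 1-D grid, tracking the Shannon
# entropy of the normalized concentration profile. Grid size, D, dt and dx
# are illustrative values chosen to satisfy the stability condition
# D * dt / dx**2 <= 0.5.
import numpy as np

n_grid, n_steps = 101, 2001
D, dx, dt = 1.0, 1.0, 0.2
c = np.zeros(n_grid)
c[n_grid // 2] = 1.0            # initial concentration pulse in the middle

def shannon_entropy(conc):
    p = conc / conc.sum()
    p = p[p > 0]
    return -(p * np.log(p)).sum()

for step in range(n_steps):
    # FTCS update of the interior points (boundaries held at zero).
    c[1:-1] += D * dt / dx**2 * (c[2:] - 2 * c[1:-1] + c[:-2])
    if step % 500 == 0:
        print(f"t = {step * dt:7.1f}   entropy = {shannon_entropy(c):.3f}")
```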
18 pages, 1209 KB  
Article
Does Political Risk Affect the Efficiency of the Exchange-Traded Fund Market?—Entropy-Based Analysis Before and After the 2025 U.S. Presidential Inauguration
by Joanna Olbryś
Risks 2025, 13(7), 121; https://doi.org/10.3390/risks13070121 - 26 Jun 2025
Viewed by 982
Abstract
The aim of this research is to thoroughly investigate the influence of the 2025 Donald Trump Presidential Inauguration on the informational efficiency of the U.S. exchange-traded fund market in the context of political risk. The data set includes daily observations for twenty U.S. Exchange-Traded Funds (ETFs). The whole sample comprises the period from 20 October 2024 to 20 April 2025. Since the Presidential Inauguration of Donald Trump took place on 20 January 2025, two sub-samples of equal length are analyzed: (1) the period before the 2025 U.S. Presidential Inauguration, from 20 October 2024 to 19 January 2025, and (2) the period after the 2025 U.S. Presidential Inauguration, from 20 January 2025 to 20 April 2025. Because the whole sample period is relatively short (six months), market efficiency is estimated using a modified Shannon entropy based on symbolic encoding with two thresholds. The empirical findings are visualized by symbol-sequence histograms. The proposed research hypothesis states that the U.S. ETF market’s informational efficiency, as measured by entropy, substantially decreased during the turbulent period after the Donald Trump Presidential Inauguration compared to the period before the Inauguration. The results unambiguously confirm the research hypothesis and indicate that political risk could affect the informational efficiency of markets. To the best of the author’s knowledge, this is the first study exploring the influence of the Donald Trump Presidential Inauguration on the informational efficiency of the U.S. ETF market. Full article
(This article belongs to the Special Issue Risk Analysis in Financial Crisis and Stock Market)
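The core measurement above, symbolic encoding of returns with two thresholds followed by a normalized Shannon entropy, can be sketched as follows; the synthetic return series, the thresholds at plus/minus 0.5 standard deviations, and the word length of two are assumptions for illustration, not the author's exact specification of the modified entropy.

```python
# Minimal sketch (not the author's exact procedure): encode a return series
# into three symbols using two thresholds and compute a normalized Shannon
# entropy over short symbol words. The synthetic returns, the thresholds at
# +/- 0.5 standard deviations, and the word length of 2 are assumptions.
import numpy as np
from collections import Counter

rng = np.random.default_rng(0)
returns = rng.normal(0.0, 0.01, 120)        # stand-in for ~6 months of daily ETF returns

lo, hi = -0.5 * returns.std(), 0.5 * returns.std()
symbols = np.digitize(returns, [lo, hi])    # 0: down move, 1: flat, 2: up move

def normalized_block_entropy(sym, k=2, alphabet=3):
    words = [tuple(sym[i:i + k]) for i in range(len(sym) - k + 1)]
    counts = np.array(list(Counter(words).values()), dtype=float)
    p = counts / counts.sum()
    return -(p * np.log(p)).sum() / (k * np.log(alphabet))   # 1.0 = maximally random

print(f"normalized entropy: {normalized_block_entropy(symbols):.3f}")
```

Values close to 1 indicate a near-random (informationally efficient) symbol sequence; a marked drop after an event would point in the direction of the paper's hypothesis.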
21 pages, 7404 KB  
Article
Multi-Feature AND–OR Mechanism for Explainable Modulation Recognition
by Xiaoya Wang, Songlin Sun, Haiying Zhang, Yuyang Liu and Qiang Qiao
Electronics 2025, 14(12), 2356; https://doi.org/10.3390/electronics14122356 - 9 Jun 2025
Viewed by 616
Abstract
This study addresses the persistent challenge of balancing interpretability and robustness in black-box deep learning models for automatic modulation recognition (AMR), a critical task in wireless communication systems. To bridge this gap, we propose a novel explainable AI (XAI) framework that integrates symbolic feature interaction concepts into communication signal analysis for the first time. The framework combines a modulation primitive decomposition architecture, which unifies Shapley interaction entropy with signal physics principles, and a dual-branch XAI mechanism (feature extraction + interaction analysis) validated on ResNet-based models. This approach explicitly maps signal periodicity to modulation order in high-dimensional feature spaces while mitigating feature coupling artifacts. Quantitative responsibility attribution metrics are introduced to evaluate component contributions through modular adversarial verification, establishing a certified benchmark for AMR systems. Experimental validation on the RML2016.10a dataset demonstrates the effectiveness of the framework: under dynamic signal-to-noise-ratio conditions, with the benchmark ResNet reaching an accuracy of 94.88%, occlusion sensitivity increased by 30% and stability decreased by 22% relative to the SHAP baseline. The work advances AMR research by systematically resolving the transparency–reliability trade-off, offering both theoretical and practical tools for deploying trustworthy AI in real-world wireless scenarios. Full article
(This article belongs to the Special Issue Explainability in AI and Machine Learning)
26 pages, 1708 KB  
Article
Research on Task Complexity Measurements in Human—Computer Interaction in Nuclear Power Plant DCS Systems Based on Emergency Operating Procedures
by Ensheng Pang and Licao Dai
Entropy 2025, 27(6), 600; https://doi.org/10.3390/e27060600 - 4 Jun 2025
Cited by 1 | Viewed by 944
Abstract
Within the scope of digital transformation in nuclear power plants (NPPs), task complexity in human–computer interaction (HCI) has become a critical factor affecting the safe and stable operation of NPPs. This study systematically reviews and analyzes existing complexity sources and assessment methods and suggests that complexity is primarily driven by core factors such as the quantity of, variety of, and relationships between elements. By innovatively introducing Halstead’s E measure, this study constructs a quantitative model of dynamic task execution complexity (TEC), addressing the limitations of traditional entropy-based metrics in analyzing interactive processes. By combining entropy metrics and the E measure, a task complexity quantification framework is established, encompassing both the task execution and intrinsic dimensions. Specifically, Halstead’s E measure focuses on analyzing operators and operands, defining interaction symbols between humans and interfaces to quantify task execution complexity (TEC). Entropy metrics, on the other hand, measure task logical complexity (TLC), task scale complexity (TSC), and task information complexity (TIC) based on the intrinsic structure and scale of tasks. Finally, the weighted Euclidean norm of these four factors determines the task complexity (TC) of each step. Taking the emergency operating procedures (EOP) for a small-break loss-of-coolant accident (SLOCA) in an NPP as an example, the entropy and E metrics are used to calculate the task complexity of each step, followed by experimental validation using NASA-TLX task load scores and step execution time for regression analysis. The results show that task complexity is significantly positively correlated with NASA-TLX subjective scores and task execution time, with the determination coefficients reaching 0.679 and 0.785, respectively. This indicates that the complexity metrics have high explanatory power, showing that the complexity quantification model is effective and has certain application value in improving human–computer interfaces and emergency procedures. Full article
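Halstead's E measure, which the framework above adapts to human-computer interaction, is computed from operator and operand counts via the standard formulas V = N * log2(n), D = (n1/2) * (N2/n2), and E = D * V. The sketch below applies them to a hand-coded toy interaction sequence; which interface actions count as operators and which targets or values count as operands is an assumption of the example, not the authors' coding scheme.

```python
# Minimal sketch (not the authors' coding scheme): Halstead's E measure applied
# to a hand-coded interaction sequence for one procedure step. Which interface
# actions count as "operators" and which targets/values count as "operands"
# is an illustrative assumption of this example.
from math import log2

# Toy encoding of one EOP step: actions = operators, targets/values = operands.
operators = ["click", "read", "compare", "click", "confirm", "read"]
operands  = ["pressurizer_level", "setpoint_50pct", "pump_A", "pressurizer_level"]

n1, n2 = len(set(operators)), len(set(operands))   # distinct operators / operands
N1, N2 = len(operators), len(operands)             # total occurrences

volume = (N1 + N2) * log2(n1 + n2)                 # Halstead volume V = N * log2(n)
difficulty = (n1 / 2) * (N2 / n2)                  # Halstead difficulty D
effort = difficulty * volume                       # Halstead effort E (a TEC proxy)

print(f"V = {volume:.1f}, D = {difficulty:.2f}, E = {effort:.1f}")
```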
14 pages, 873 KB  
Article
Experimental Study of an Approximate Method for Calculating Entropy-Optimal Distributions in Randomized Machine Learning Problems
by Alexey Yu. Popkov, Yuri A. Dubnov, Ilya V. Sochenkov and Yuri S. Popkov
Mathematics 2025, 13(11), 1821; https://doi.org/10.3390/math13111821 - 29 May 2025
Viewed by 364
Abstract
This paper presents an experimental study of an integral approximation method for the entropy optimization problems arising in Randomized Machine Learning. Entropy-optimal probability density functions contain normalizing integrals of multivariate exponential functions, so computing these distributions while solving an optimization problem requires efficient evaluation of these integrals. We investigate an approach based on approximating the integrand functions and apply it to several problem configurations with model and real data, using linear static models and a symbolic computation mechanism. Computational studies were carried out under identical conditions, with the same initial data and hyperparameter values for the models used. They demonstrate the performance and efficiency of the proposed approach in Randomized Machine Learning problems based on linear static models. Full article
27 pages, 2317 KB  
Article
Spatial Agglomeration Differences of Amenities and Causes in Traditional Villages from the Perspective of Tourist Perception
by Haiyan Yan, Rui Dong, Yanbing He, Jianqing Qi and Luna Li
Sustainability 2025, 17(10), 4475; https://doi.org/10.3390/su17104475 - 14 May 2025
Cited by 2 | Viewed by 661
Abstract
Amid global rural tourism growth and rural revitalization policies, the resource protection and tourism development of traditional villages have drawn international academic attention. To guide villages’ resource planning and management, this study constructed an evaluation index system of cultural, ecological, industrial, talent, and organizational amenities in traditional villages from the perspective of tourists’ perceptions using grounded theory, and measured the spatial agglomeration differences and synergistic effects of traditional village amenities, as well as their influencing factors, using location entropy, spatial autocorrelation, and gray correlation degree analysis. The results show that (1) the spatial distributions of cultural, ecological, industrial, and organizational amenities are relatively balanced, while talent amenities exhibit a more concentrated distribution. (2) The spatial concentration of amenities in traditional villages shows a strong positive spatial correlation: high–high concentrations are distributed in clusters, the low–low type tends to be contiguous, and the low–high type is scattered sporadically around the high–high type; there is significant synergy between ecological and industrial amenities, and organizations play a supportive role in the spatial agglomeration of cultural, ecological, and talent amenities. (3) Gross regional product, slope, and distance to 3A-and-above scenic spots significantly influence the spatial agglomeration of amenities. Based on the spatial agglomeration characteristics of traditional village amenities, this study provides a reference for their sustainable development from the perspectives of exerting agglomeration and radiation effects, synergistically promoting village development, constructing a memory symbol system, and integrating the resource structure system. Full article
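The location-entropy measure used above is commonly computed as a location quotient; the short sketch below does so for one amenity type across a handful of invented villages, and is not the authors' data or pipeline.

```python
# Minimal sketch (invented counts, not the authors' data): the location
# quotient commonly used as "location entropy" to measure how strongly one
# amenity type is agglomerated in each village.
import numpy as np

amenity_counts = np.array([12, 3, 7, 1, 5])      # e.g. cultural amenities per village
total_counts   = np.array([40, 25, 30, 10, 20])  # all amenities per village

# LQ > 1: the amenity type is over-represented (agglomerated) in that village.
lq = (amenity_counts / total_counts) / (amenity_counts.sum() / total_counts.sum())
print("location quotient per village:", np.round(lq, 2))
```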
10 pages, 8363 KB  
Article
Improved Reconstruction of Chaotic Signals from Ordinal Networks
by Antonio Politi and Leonardo Ricci
Entropy 2025, 27(5), 499; https://doi.org/10.3390/e27050499 - 6 May 2025
Viewed by 577
Abstract
Permutation entropy is customarily implemented to quantify the intrinsic indeterminacy of complex time series, under the assumption that determinism manifests itself by lowering the (permutation) entropy of the resulting symbolic sequence. We expect this to be roughly true, but, in general, it is not clear to what extent a given ordinal pattern indeed provides a faithful reconstruction of the original signal. Here, we address this question by attempting the reconstruction of the original time series by invoking an ergodic Markov approximation of the symbolic dynamics, thereby inverting the encoding procedure. Using the Hénon map as a testbed, we show that a meaningful reconstruction can also be made in the presence of a small observational noise. Full article
(This article belongs to the Special Issue Ordinal Patterns-Based Tools and Their Applications)
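The reconstruction above starts from the symbolic (ordinal-network) representation of the signal. The sketch below shows only that starting point, namely Hénon-map ordinal patterns and the first-order Markov transition matrix estimated from them, and does not reproduce the authors' inversion of the encoding; D = 3 and the standard map parameters are illustrative choices.

```python
# Minimal sketch (not the authors' inversion scheme): encode a Henon-map time
# series into ordinal patterns and estimate the first-order Markov transition
# matrix between patterns, i.e. the ordinal-network statistics such a
# reconstruction would start from. D = 3 and a = 1.4, b = 0.3 are standard
# illustrative choices; the actual signal reconstruction step is not shown.
import numpy as np
from itertools import permutations

# Henon map trajectory (x component).
a, b, n = 1.4, 0.3, 5000
x, y = np.empty(n), np.empty(n)
x[0], y[0] = 0.1, 0.1
for i in range(n - 1):
    x[i + 1] = 1.0 - a * x[i] ** 2 + y[i]
    y[i + 1] = b * x[i]

D = 3
index = {p: k for k, p in enumerate(permutations(range(D)))}
windows = np.lib.stride_tricks.sliding_window_view(x, D)
symbols = np.array([index[tuple(np.argsort(w))] for w in windows])

# Row-normalized count matrix of pattern-to-pattern transitions.
T = np.zeros((len(index), len(index)))
for s, t in zip(symbols[:-1], symbols[1:]):
    T[s, t] += 1
T /= np.maximum(T.sum(axis=1, keepdims=True), 1)
print(np.round(T, 2))
```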
29 pages, 21938 KB  
Article
Turbulent Flow in Street Canyons: A Complexity Approach
by Csanád Árpád Hubay, Bálint Papp and Tamás Kalmár-Nagy
Entropy 2025, 27(5), 488; https://doi.org/10.3390/e27050488 - 30 Apr 2025
Cited by 1 | Viewed by 524
Abstract
Velocity measurements and simulations in an idealized urban environment were studied, focusing on turbulent flow over street canyons. Time series of fluctuating velocities were considered as marked point processes, and the distribution of mean residence times was characterized using a lognormal fit. The quadrant method was applied to transform the time series into symbolic sequences, enabling the investigation of their information content. By analyzing word frequencies and normalized entropy levels, we compared measured and simulated sequences with periodic symbol sequences with and without noise. Our results indicate that noisy periodic sequences exhibit entropy distributions qualitatively similar to those of the measured and simulated data. Surrogate sequences generated using first- and higher-order Markov statistics also displayed this similarity, with higher-order Markov chains providing a more accurate representation of the information content of the velocity fluctuation series. These findings contribute to the comparison of experimental and simulation techniques in the investigation of turbulence. Full article
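The quadrant method named above maps velocity fluctuation pairs to four symbols, after which word frequencies and a normalized entropy can be computed. The sketch below does this for a synthetic correlated (u', w') series (not the measured or simulated data of the study), with the word length of three chosen arbitrarily.

```python
# Minimal sketch (not the authors' code): the quadrant method maps velocity
# fluctuation pairs (u', w') to symbols 1-4, after which word frequencies and
# a normalized entropy are computed. The synthetic correlated series and the
# word length of 3 are illustrative assumptions.
import numpy as np
from collections import Counter

rng = np.random.default_rng(1)
u = rng.normal(size=4000)                   # streamwise fluctuation u'
w = -0.4 * u + rng.normal(size=4000)        # wall-normal fluctuation w' (correlated)

# Quadrants: 1 = (+,+), 2 = (-,+) ejection-like, 3 = (-,-), 4 = (+,-) sweep-like.
quadrant = np.select([(u > 0) & (w > 0), (u < 0) & (w > 0),
                      (u < 0) & (w < 0), (u > 0) & (w < 0)], [1, 2, 3, 4])

k = 3                                        # word length
words = [tuple(quadrant[i:i + k]) for i in range(len(quadrant) - k + 1)]
p = np.array(list(Counter(words).values()), dtype=float)
p /= p.sum()
H_norm = -(p * np.log(p)).sum() / (k * np.log(4))   # 1.0 = fully random sequence
print(f"normalized word entropy: {H_norm:.3f}")
```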