Search Results (16)

Search Parameters:
Keywords = Rényi transfer entropy

22 pages, 1850 KB  
Article
Tail Risk Spillover Between Global Stock Markets Based on Effective Rényi Transfer Entropy and Wavelet Analysis
by Jingjing Jia
Entropy 2025, 27(5), 523; https://doi.org/10.3390/e27050523 - 14 May 2025
Cited by 1 | Viewed by 689
Abstract
To examine the spillover of tail-risk information across global stock markets, we select nine major stock markets for the period spanning from June 2014 to May 2024 as the sample data. First, we employ effective Rényi transfer entropy to measure the tail-risk information spillover. Second, we construct a Diebold–Yilmaz connectedness table to explore the overall characteristics of tail-risk information spillover across the global stock markets. Third, we integrate wavelet analysis with effective Rényi transfer entropy to assess the multi-scale characteristics of the information spillover. Our findings lead to several key conclusions: (1) US and European stock markets are the primary sources of tail-risk information spillover, while Asian stock markets predominantly act as net information receivers; (2) the intensity of tail-risk information spillover is most pronounced between markets at the medium-high trading frequency, and as trading frequency decreases, information spillover becomes more complex; (3) across all trading frequencies, the US stock market emerges as the most influential, while the Japanese stock market is the most vulnerable. China’s stock market, in contrast, demonstrates relative independence.
(This article belongs to the Special Issue Complexity in Financial Networks)
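Effective Rényi transfer entropy, the core tool of this article, is only named in the listing. The sketch below is a minimal discrete illustration, not the paper's method: it assumes symbolized (binned) series, uses one simplified variant of conditional Rényi entropy (several inequivalent definitions exist; the escort-distribution definition used in the literature differs), and takes "effective" to mean subtracting a shuffled-source baseline. All function names and the toy data are illustrative.

```python
import math
import random
from collections import Counter

def cond_renyi_entropy(samples, alpha):
    """H_alpha(A|B) ~ (1/(1-alpha)) * log( sum_b p(b) * sum_a p(a|b)^alpha ).
    One of several inequivalent definitions of conditional Renyi entropy."""
    n = len(samples)
    ctx = Counter(b for _, b in samples)    # counts of contexts b
    joint = Counter(samples)                # counts of (a, b) pairs
    total = 0.0
    for b, cb in ctx.items():
        inner = sum((c / cb) ** alpha
                    for (a, bb), c in joint.items() if bb == b)
        total += (cb / n) * inner
    return math.log(total) / (1.0 - alpha)

def renyi_te(x, y, alpha=0.8):
    """T_{Y->X}: drop in conditional Renyi entropy of x[t+1] once y[t] is known."""
    hist_x = [(x[t + 1], (x[t],)) for t in range(len(x) - 1)]
    hist_xy = [(x[t + 1], (x[t], y[t])) for t in range(len(x) - 1)]
    return cond_renyi_entropy(hist_x, alpha) - cond_renyi_entropy(hist_xy, alpha)

def effective_renyi_te(x, y, alpha=0.8, shuffles=20, seed=0):
    """'Effective' correction: subtract the mean TE over shuffled sources,
    which removes most of the small-sample bias."""
    rng = random.Random(seed)
    baseline = 0.0
    for _ in range(shuffles):
        ys = y[:]
        rng.shuffle(ys)
        baseline += renyi_te(x, ys, alpha)
    return renyi_te(x, y, alpha) - baseline / shuffles

# Toy example: y drives x with 90% fidelity, so flow Y->X should dominate X->Y.
rng = random.Random(1)
y = [rng.randint(0, 1) for _ in range(2000)]
x = [0] + [y[t] if rng.random() < 0.9 else 1 - y[t] for t in range(1999)]
te_yx = effective_renyi_te(x, y)
te_xy = effective_renyi_te(y, x)
```

On the synthetic pair above, `te_yx` comes out clearly positive while `te_xy` stays near zero, which is the asymmetry the spillover tables in such papers are built from.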

12 pages, 1088 KB  
Article
Impact of the Global Fear Index (COVID-19 Panic) on the S&P Global Indices Associated with Natural Resources, Agribusiness, Energy, Metals, and Mining: Granger Causality and Shannon and Rényi Transfer Entropy
by Pedro Celso-Arellano, Victor Gualajara, Semei Coronado, Jose N. Martinez and Francisco Venegas-Martínez
Entropy 2023, 25(2), 313; https://doi.org/10.3390/e25020313 - 8 Feb 2023
Cited by 2 | Viewed by 4168
Abstract
The Global Fear Index (GFI) is a measure of fear/panic based on the number of people infected and deaths due to COVID-19. This paper aims to examine the interconnection or interdependencies between the GFI and a set of global indexes related to the financial and economic activities associated with natural resources, raw materials, agribusiness, energy, metals, and mining, such as: the S&P Global Resource Index, the S&P Global Agribusiness Equity Index, the S&P Global Metals and Mining Index, and the S&P Global 1200 Energy Index. To this end, we first apply several common tests: Wald exponential, Wald mean, Nyblom, and Quandt Likelihood Ratio. Subsequently, we apply Granger causality using a DCC-GARCH model. Data for the global indices are daily from 3 February 2020 to 29 October 2021. The empirical results obtained show that the volatility of the GFI Granger-causes the volatility of the other global indices, except for the Global Resource Index. Moreover, by considering heteroskedasticity and idiosyncratic shocks, we show that the GFI can be used to predict the co-movement of the time series of all the global indices. Additionally, we quantify the causal interdependencies between the GFI and each of the S&P global indices using Shannon and Rényi transfer entropy flow, which is comparable to Granger causality, to confirm directionality more robustly. The main conclusion of this research is that financial and economic activity related to natural resources, raw materials, agribusiness, energy, metals, and mining were affected by the fear/panic caused by COVID-19 cases and deaths.
(This article belongs to the Special Issue Complexity in Economics and Finance: New Directions and Challenges)

14 pages, 1142 KB  
Article
Asymmetric Information Flow between Exchange Rate, Oil, and Gold: New Evidence from Transfer Entropy Approach
by Moinak Maiti and Parthajit Kayal
J. Risk Financial Manag. 2023, 16(1), 2; https://doi.org/10.3390/jrfm16010002 - 21 Dec 2022
Cited by 8 | Viewed by 2871
Abstract
The present study used transfer entropy and effective transfer entropy to examine the asymmetric information flow between exchange rates, oil, and gold. The dataset is composed of daily data covering the period of 1 January 2018 to 31 December 2021, and it is bifurcated into before-COVID and during-COVID subsamples for analysis. A bidirectional information flow is observed between EUR/USD and Oil for the whole study period, unlike before COVID. During COVID, however, there was a unidirectional information flow from Oil→EUR/USD. The study finds a significant unidirectional information flow from Gold→EUR/USD. The estimates also indicate that before COVID, the direction of information flow was from Oil→Gold; during COVID, it reversed to Gold→Oil. Overall, the direction of information flow among these three variables is asymmetric. The highest transfer entropy was observed for Gold→EUR/USD among all the pairs under consideration.

26 pages, 456 KB  
Article
Some Technical Remarks on Negations of Discrete Probability Distributions and Their Information Loss
by Ingo Klein
Mathematics 2022, 10(20), 3893; https://doi.org/10.3390/math10203893 - 20 Oct 2022
Cited by 2 | Viewed by 2036
Abstract
Negation of a discrete probability distribution was introduced by Yager. To date, several papers have been published discussing generalizations, properties, and applications of negation. The recent work by Wu et al. gives an excellent overview of the literature and the motivation to deal with negation. Our paper focuses on some technical aspects of negation transformations. First, we prove that independent negations must be affine-linear; this had been posed as an open problem by Batyrshin et al. Secondly, we show that repeated application of independent negations leads to a progressive loss of information (called monotonicity). In contrast to the literature, we try to obtain results not only for special cases but also for the general class of ϕ-entropies. In this general framework, we can show that results need to be proven only for the Yager negation and can then be transferred to the entire class of independent (=affine-linear) negations. For general ϕ-entropies with a strictly concave generator function ϕ, we can show that the information loss increases separately for sequences of odd and even numbers of repetitions. By using a Lagrangian approach, this result can be extended, in the neighbourhood of the uniform distribution, to all numbers of repetitions. For the Gini, Shannon, Havrda–Charvát (Tsallis), Rényi and Sharma–Mittal entropies, we prove that the information loss has a global minimum of 0. For dependent negations, it is not easy to obtain analytical results. Therefore, we simulate the entropy distribution and show how different repeated negations affect the Gini and Shannon entropies. The simulation approach has the advantage that the entire simplex of discrete probability vectors can be considered at once, rather than just arbitrarily selected probability vectors.
(This article belongs to the Section D1: Probability and Statistics)
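Yager's negation and the monotone information loss described in the abstract are easy to illustrate numerically. This is a minimal sketch, assuming the standard form of Yager's negation and Shannon entropy in nats; the starting distribution is made up. Consistent with the abstract, monotonicity is checked separately along the even and odd subsequences of repetitions.

```python
import math

def yager_negation(p):
    """Yager's negation: neg(p)_i = (1 - p_i) / (n - 1). Note it is affine-linear."""
    n = len(p)
    return [(1.0 - pi) / (n - 1) for pi in p]

def shannon_entropy(p):
    """Shannon entropy in nats."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

# Repeated negation of an arbitrary starting distribution.
p = [0.7, 0.2, 0.1]
entropies = [shannon_entropy(p)]
for _ in range(8):
    p = yager_negation(p)
    entropies.append(shannon_entropy(p))

# The iterates oscillate around, and converge to, the uniform distribution
# (maximum entropy); entropy rises monotonically along the even and odd
# subsequences of repetitions, i.e. information is progressively lost.
even = entropies[0::2]
odd = entropies[1::2]
```

Because the negation is affine-linear, each application shrinks the deviation from the uniform distribution by a factor of 1/(n-1) while flipping its sign, which is exactly why the even and odd subsequences behave monotonically.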

28 pages, 5822 KB  
Article
Volatility Dynamics of Non-Linear Volatile Time Series and Analysis of Information Flow: Evidence from Cryptocurrency Data
by Muhammad Sheraz, Silvia Dedu and Vasile Preda
Entropy 2022, 24(10), 1410; https://doi.org/10.3390/e24101410 - 2 Oct 2022
Cited by 11 | Viewed by 3849
Abstract
This paper aims to empirically examine long memory and bi-directional information flow between the estimated volatilities of highly volatile time series datasets of five cryptocurrencies. We propose the employment of the Garman and Klass (GK), Parkinson’s, Rogers and Satchell (RS), and Garman and Klass–Yang and Zhang (GK-YZ) Open-High-Low-Close (OHLC) volatility estimators to estimate cryptocurrencies’ volatilities. The study applies methods such as mutual information, transfer entropy (TE), effective transfer entropy (ETE), and Rényi transfer entropy (RTE) to quantify the information flow between estimated volatilities. Additionally, Hurst exponent computations examine the existence of long memory in log returns and OHLC volatilities based on simple R/S, corrected R/S, empirical, corrected empirical, and theoretical methods. Our results confirm the long-run dependence and non-linear behavior of all cryptocurrencies’ log returns and volatilities. In our analysis, TE and ETE estimates are statistically significant for all OHLC estimates. We report the highest information flow from BTC to LTC volatility (RS). Similarly, BNB and XRP share the most prominent information flow between volatilities estimated by GK, Parkinson’s, and GK-YZ. The study presents the practicable addition of OHLC volatility estimators for quantifying the information flow and provides an additional choice to compare with other volatility estimators, such as stochastic volatility models.
(This article belongs to the Collection Advances in Applied Statistical Mechanics)
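Two of the OHLC estimators named above, Parkinson's and Garman–Klass, have simple closed forms per bar. A minimal sketch (per-bar variance of log returns, not annualized; the OHLC bars are made up for illustration):

```python
import math

def parkinson_var(high, low):
    """Parkinson estimator: per-bar variance from the high-low range only."""
    return math.log(high / low) ** 2 / (4.0 * math.log(2.0))

def garman_klass_var(o, h, l, c):
    """Garman-Klass estimator: adds open/close information to the range."""
    return 0.5 * math.log(h / l) ** 2 \
        - (2.0 * math.log(2.0) - 1.0) * math.log(c / o) ** 2

# Hypothetical daily OHLC bars: (open, high, low, close).
bars = [(100.0, 104.0, 98.0, 101.0),
        (101.0, 103.0, 99.5, 100.2),
        (100.2, 106.0, 100.0, 105.1)]

# Volatility estimate = sqrt of the average per-bar variance.
park = math.sqrt(sum(parkinson_var(h, l) for _, h, l, _ in bars) / len(bars))
gk = math.sqrt(sum(garman_klass_var(o, h, l, c) for o, h, l, c in bars) / len(bars))
```

Both estimators exploit the intra-bar range and are therefore considerably more efficient per observation than squared close-to-close returns, which is what makes them attractive for noisy cryptocurrency data.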

32 pages, 18172 KB  
Article
Causal Inference in Time Series in Terms of Rényi Transfer Entropy
by Petr Jizba, Hynek Lavička and Zlata Tabachová
Entropy 2022, 24(7), 855; https://doi.org/10.3390/e24070855 - 22 Jun 2022
Cited by 13 | Viewed by 5454
Abstract
Uncovering causal interdependencies from observational data is one of the great challenges of nonlinear time series analysis. In this paper, we discuss this topic with the help of an information-theoretic concept known as Rényi’s information measure. In particular, we tackle the directional information flow between bivariate time series in terms of Rényi’s transfer entropy. We show that by choosing Rényi’s parameter α, we can appropriately control the information that is transferred only between selected parts of the underlying distributions. This, in turn, is a particularly potent tool for quantifying causal interdependencies in time series, where the knowledge of “black swan” events, such as spikes or sudden jumps, is of key importance. In this connection, we first prove that for Gaussian variables, Granger causality and Rényi transfer entropy are entirely equivalent. Moreover, we also partially extend these results to heavy-tailed α-Gaussian variables. These results allow us to establish a connection between autoregressive and Rényi entropy-based information-theoretic approaches to data-driven causal inference. To aid our intuition, we employed the Leonenko et al. entropy estimator and analyzed Rényi’s information flow between bivariate time series generated from two unidirectionally coupled Rössler systems. Notably, we find that Rényi’s transfer entropy not only allows us to detect a threshold of synchronization but also provides non-trivial insight into the structure of a transient regime that exists between the region of chaotic correlations and the synchronization threshold. In addition, from Rényi’s transfer entropy, we could reliably infer the direction of coupling, and hence causality, only for coupling strengths smaller than the onset value of the transient regime, i.e., when the two Rössler systems are coupled but have not yet entered synchronization.
(This article belongs to the Special Issue Entropy-Based Applications in Economics, Finance, and Management)
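The listing never spells out the quantities involved. As a reminder, the standard definitions are below; the transfer entropy is written schematically as a difference of conditional entropies rather than in the paper's exact escort-distribution form, so treat the second formula as a sketch of the idea, not the authors' definition.

```latex
% Rényi entropy of order \alpha > 0, \alpha \neq 1; \alpha \to 1 recovers Shannon entropy.
H_\alpha(X) = \frac{1}{1-\alpha} \log \sum_i p_i^\alpha,
\qquad
\lim_{\alpha \to 1} H_\alpha(X) = -\sum_i p_i \log p_i .

% Schematic Rényi transfer entropy: the reduction in uncertainty about x_{t+1}
% when the length-l history of y is added to the length-k history of x.
% Choosing \alpha < 1 accentuates low-probability (tail) events such as
% "black swan" spikes; \alpha > 1 emphasizes the central part of the distribution.
T^{(\alpha)}_{Y \to X} = H_\alpha\!\left(x_{t+1} \mid x_t^{(k)}\right)
                       - H_\alpha\!\left(x_{t+1} \mid x_t^{(k)}, y_t^{(l)}\right)
```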

17 pages, 4175 KB  
Article
Estimating Directed Phase-Amplitude Interactions from EEG Data through Kernel-Based Phase Transfer Entropy
by Iván De La Pava Panche, Viviana Gómez-Orozco, Andrés Álvarez-Meza, David Cárdenas-Peña and Álvaro Orozco-Gutiérrez
Appl. Sci. 2021, 11(21), 9803; https://doi.org/10.3390/app11219803 - 20 Oct 2021
Cited by 2 | Viewed by 3034
Abstract
Cross-frequency interactions, a form of oscillatory neural activity, are thought to play an essential role in the integration of distributed information in the brain. Indeed, phase–amplitude interactions are believed to allow for the transfer of information from large-scale brain networks, oscillating at low frequencies, to local, rapidly oscillating neural assemblies. A promising approach to estimating such interactions is the use of transfer entropy (TE), a non-linear, information-theory-based effective connectivity measure. The conventional method involves feeding instantaneous phase and amplitude time series, extracted at the target frequencies, to a TE estimator. In this work, we propose recasting the problem of directed phase–amplitude interaction detection as a phase TE estimation problem, under the hypothesis that estimating TE from data of the same nature, i.e., two phase time series, will improve robustness to the common confounding factors that affect connectivity measures, such as the presence of high noise levels. We implement our proposal using a kernel-based TE estimator, defined in terms of Renyi’s α entropy, which has successfully been used to compute single-trial phase TE. We tested our approach on synthetic data generated through a simulation model capable of producing time series with directed phase–amplitude interactions at two given frequencies, and on EEG data from a cognitive task designed to activate working memory, a memory system whose underpinning mechanisms are thought to include phase–amplitude couplings. Our proposal detected statistically significant interactions between the simulated signals at the desired frequencies, identifying the correct direction of the interaction. It also displayed higher robustness to noise than the alternative methods. The results attained for the working memory data showed that the proposed approach codes connectivity patterns, based on directed phase–amplitude interactions, that allow the different cognitive load levels of the working memory task to be differentiated.
(This article belongs to the Special Issue Research on Biomedical Signal Processing)

26 pages, 6136 KB  
Article
Kernel-Based Phase Transfer Entropy with Enhanced Feature Relevance Analysis for Brain Computer Interfaces
by Iván De La Pava Panche, Andrés Álvarez-Meza, Paula Marcela Herrera Gómez, David Cárdenas-Peña, Jorge Iván Ríos Patiño and Álvaro Orozco-Gutiérrez
Appl. Sci. 2021, 11(15), 6689; https://doi.org/10.3390/app11156689 - 21 Jul 2021
Cited by 9 | Viewed by 3785
Abstract
Neural oscillations are present in the brain at different spatial and temporal scales, and they are linked to several cognitive functions. Furthermore, the information carried by their phases is fundamental for the coordination of anatomically distributed processing in the brain. The concept of phase transfer entropy (phase TE) refers to an information-theory-based measure of directed connectivity among neural oscillations that allows such distributed processes to be studied. Phase TE is commonly obtained from probability estimations carried out over data from multiple trials, which bars its use as a characterization strategy in brain–computer interfaces. In this work, we propose a novel methodology to estimate TE between single pairs of instantaneous phase time series. Our approach combines a kernel-based TE estimator defined in terms of Renyi’s α entropy, which sidesteps the need for probability distribution computation, with phase time series obtained by complex filtering of the neural signals. Besides, a kernel-alignment-based relevance analysis is added to highlight relevant features from effective connectivity-based representations, supporting further classification stages in EEG-based brain–computer interface systems. Our proposal is tested on simulated coupled data and on two publicly available databases containing EEG signals recorded under motor imagery and visual working memory paradigms. The attained results demonstrate how the introduced effective connectivity succeeds in detecting the interactions present in the data for the former, with statistically significant results around the frequencies of interest. It also reflects differences in coupling strength, is robust to realistic noise and signal mixing levels, and captures bidirectional interactions of localized frequency content. The obtained results for the motor imagery and working memory databases show that our approach, combined with the relevance analysis strategy, codes discriminant spatial and frequency-dependent patterns for the different conditions in each experimental paradigm, with classification performances that compare well with those of alternative methods of similar nature.
(This article belongs to the Special Issue Advances in Neuroimaging Data Processing)

10 pages, 4178 KB  
Proceeding Paper
Rényi Transfer Entropy Estimators for Financial Time Series
by Petr Jizba, Hynek Lavička and Zlata Tabachová
Eng. Proc. 2021, 5(1), 33; https://doi.org/10.3390/engproc2021005033 - 30 Jun 2021
Cited by 6 | Viewed by 3152
Abstract
In this paper, we discuss the statistical coherence between financial time series in terms of Rényi’s information measure or entropy. In particular, we tackle the issue of the directional information flow between bivariate time series in terms of Rényi’s transfer entropy. The latter represents a measure of information that is transferred only between certain parts of underlying distributions. This fact is particularly relevant in financial time series, where the knowledge of “black swan” events such as spikes or sudden jumps is of key importance. To put some flesh on the bare bones, we illustrate the essential features of Rényi’s information flow on two coupled GARCH(1,1) processes.
(This article belongs to the Proceedings of The 7th International Conference on Time Series and Forecasting)

23 pages, 730 KB  
Article
On the α-q-Mutual Information and the α-q-Capacities
by Velimir M. Ilić and Ivan B. Djordjević
Entropy 2021, 23(6), 702; https://doi.org/10.3390/e23060702 - 1 Jun 2021
Cited by 2 | Viewed by 3602
Abstract
The measures of information transfer which correspond to non-additive entropies have been studied intensively in previous decades. The majority of the work concerns the ones belonging to the Sharma–Mittal entropy class, such as the Rényi, the Tsallis, the Landsberg–Vedral and the Gaussian entropies. All of the considerations follow the same approach, mimicking some of the various and mutually equivalent definitions of Shannon information measures, and the information transfer is quantified by an appropriately defined measure of mutual information, while the maximal information transfer is considered as a generalized channel capacity. However, all of the previous approaches fail to satisfy at least one of the ineluctable properties which a measure of (maximal) information transfer should satisfy, leading to counterintuitive conclusions and predicting nonphysical behavior even in the case of very simple communication channels. This paper fills the gap by proposing two parametric measures named the α-q-mutual information and the α-q-capacity. In addition to the standard Shannon approaches, special cases of these measures include the α-mutual information and the α-capacity, which are well established in the information theory literature as measures of additive Rényi information transfer, while the cases of the Tsallis, the Landsberg–Vedral and the Gaussian entropies can also be accessed by special choices of the parameters α and q. It is shown that, unlike the previous definitions, the α-q-mutual information and the α-q-capacity satisfy the set of properties, stated as axioms, by which they reduce to zero in the case of totally destructive channels and to the (maximal) input Sharma–Mittal entropy in the case of perfect transmission, which is consistent with the maximum likelihood detection error. In addition, they are non-negative and less than or equal to the input and the output Sharma–Mittal entropies, in general. Thus, unlike the previous approaches, the proposed (maximal) information transfer measures do not manifest nonphysical behaviors such as sub-capacitance or super-capacitance, which could qualify them as appropriate measures of the Sharma–Mittal information transfer.
(This article belongs to the Special Issue The Statistical Foundations of Entropy)

34 pages, 2448 KB  
Article
On the Application of Entropy Measures with Sliding Window for Intrusion Detection in Automotive In-Vehicle Networks
by Gianmarco Baldini
Entropy 2020, 22(9), 1044; https://doi.org/10.3390/e22091044 - 18 Sep 2020
Cited by 18 | Viewed by 4817
Abstract
The evolution of modern automobiles to higher levels of connectivity and automatism has also increased the need to focus on the mitigation of potential cybersecurity risks. Researchers have proven in recent years that attacks on the in-vehicle networks of automotive vehicles are possible, and the research community has investigated various cybersecurity mitigation techniques and intrusion detection systems which can be adopted in the automotive sector. In comparison to conventional intrusion detection systems in large fixed networks and ICT infrastructures in general, in-vehicle systems have limited computing capabilities and other constraints related to data transfer and the management of cryptographic systems. In addition, it is important that attacks are detected in a short time-frame, as cybersecurity attacks in vehicles can lead to safety hazards. This paper proposes an approach for the intrusion detection of cybersecurity attacks in in-vehicle networks which takes into consideration the constraints listed above. The approach is based on an information-entropy method applied over a sliding window, which is computationally efficient, does not require the implementation of complex cryptographic systems, and still provides very high detection accuracy. Different entropy measures are used in the evaluation: Shannon Entropy, Renyi Entropy, Sample Entropy, Approximate Entropy, Permutation Entropy, Dispersion Entropy, and Fuzzy Entropy. This paper evaluates the impact of the different hyperparameters present in the definition of entropy measures on a very large public data set of CAN-bus traffic, with millions of CAN-bus messages and four different types of attacks: Denial of Service, Fuzzy Attack, and two spoofing attacks related to RPM and Gear information. The sliding window approach in combination with entropy measures can detect attacks in a time-efficient way and with great accuracy for specific choices of the hyperparameters and entropy measures.
(This article belongs to the Section Signal and Data Analysis)
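The sliding-window idea is simple enough to sketch for the Shannon case: a Denial-of-Service flood repeats one CAN identifier, which collapses the entropy of the ID distribution inside the window. This is a minimal illustration, not the paper's pipeline; the traffic, window size, and alarm threshold are all made up.

```python
import math
from collections import Counter, deque

def window_entropy(counts, size):
    """Shannon entropy (bits) of the ID distribution inside one window."""
    return -sum((c / size) * math.log2(c / size) for c in counts.values())

def sliding_entropy(ids, size):
    """Entropy of every full sliding window over a stream of CAN IDs."""
    win, counts, out = deque(), Counter(), []
    for i in ids:
        win.append(i)
        counts[i] += 1
        if len(win) > size:          # evict the oldest ID from the window
            old = win.popleft()
            counts[old] -= 1
            if counts[old] == 0:
                del counts[old]
        if len(win) == size:
            out.append(window_entropy(counts, size))
    return out

# Hypothetical traffic: 8 IDs cycling evenly, with a DoS flood of ID 0 injected.
normal = [i % 8 for i in range(800)]
traffic = normal[:400] + [0] * 200 + normal[400:]
entropy = sliding_entropy(traffic, 64)

# Windows fully inside normal traffic sit at 3 bits (8 equiprobable IDs);
# windows inside the flood collapse toward 0 bits and trip the alarm.
alarms = [t for t, h in enumerate(entropy) if h < 1.5]   # threshold is illustrative
```

The eviction bookkeeping keeps each update O(1), which is what makes the approach fit the limited computing budget of an in-vehicle ECU.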

13 pages, 1175 KB  
Article
Susceptibility of Stock Market Returns to International Economic Policy: Evidence from Effective Transfer Entropy of Africa with the Implication for Open Innovation
by Anokye M. Adam
J. Open Innov. Technol. Mark. Complex. 2020, 6(3), 71; https://doi.org/10.3390/joitmc6030071 - 28 Aug 2020
Cited by 44 | Viewed by 4015
Abstract
This study contributes to the scant finance literature on information flow from international economic policy uncertainty to emerging stock markets in Africa. It uses daily US economic policy uncertainty as a proxy and the daily stock market indices of Botswana, Egypt, Ghana, Kenya, Morocco, Nigeria, Namibia, South Africa, and Zambia from 31 December 2010 to 27 May 2020, applying the Rényi effective transfer entropy. International economic policy uncertainty transmits significant information to Egypt, Ghana, Morocco, Namibia, and South Africa, and insignificant information to Botswana, Kenya, Nigeria, and Zambia. The asymmetry in the information transfer tends to make the African market an alternative for the diversification of international portfolios when the uncertainty of global economic policy is on the rise. The findings also have implications for the adoption of open innovation in African stock markets.
(This article belongs to the Special Issue Ambidextrous Open Innovation: Technology, Market and Complexity)

13 pages, 455 KB  
Article
Transfer Entropy between Communities in Complex Financial Networks
by Jan Korbel, Xiongfei Jiang and Bo Zheng
Entropy 2019, 21(11), 1124; https://doi.org/10.3390/e21111124 - 15 Nov 2019
Cited by 29 | Viewed by 7223
Abstract
In this paper, we analyze information flows between communities of financial markets, represented as complex networks. Each community, typically corresponding to a business sector, represents a significant part of the financial market, and the detection of interactions between communities is crucial in the analysis of risk spreading in financial markets. We show that the transfer entropy provides a coherent description of information flows in and between communities, also capturing non-linear interactions. Particularly, we focus on the information transfer of rare events—typically large drops which can spread in the network. These events can be analyzed by the Rényi transfer entropy, which makes it possible to accentuate particular types of events. We analyze transfer entropies between communities of the five largest financial markets and compare the information flows with the correlation network of each market. From the transfer entropy picture, we can also identify the non-linear interactions, which are typical in the case of extreme events. The strongest flows can typically be observed between specific types of business sectors—the financial sector is the most significant example.
(This article belongs to the Section Information Theory, Probability and Statistics)

22 pages, 2900 KB  
Article
On a Dynamical Approach to Some Prime Number Sequences
by Lucas Lacasa, Bartolome Luque, Ignacio Gómez and Octavio Miramontes
Entropy 2018, 20(2), 131; https://doi.org/10.3390/e20020131 - 19 Feb 2018
Cited by 4 | Viewed by 7783
Abstract
We show how the cross-disciplinary transfer of techniques from dynamical systems theory to number theory can be a fruitful avenue for research. We illustrate this idea by exploring, from a nonlinear and symbolic dynamics viewpoint, certain patterns emerging in some residue sequences generated from the prime number sequence. We show that the sequence formed by the residues of the primes modulo k is maximally chaotic and, while lacking forbidden patterns, unexpectedly displays a non-trivial spectrum of Renyi entropies, which suggests that every block of size m > 1, while admissible, occurs with different probability. This non-uniform distribution of blocks for m > 1 contrasts with Dirichlet's theorem, which guarantees equiprobability for m = 1. We then explore in a similar fashion the sequence of prime gap residues. We numerically find that this sequence is again chaotic (positivity of Kolmogorov–Sinai entropy); however, chaos is weaker, as forbidden patterns emerge for every block of size m > 1. We relate the onset of these forbidden patterns with the divisibility properties of integers, and estimate the densities of gap block residues via the Hardy–Littlewood k-tuple conjecture. We use this estimation to argue that the admissible blocks are non-uniformly distributed, which supports the fact that the spectrum of Renyi entropies is again non-trivial in this case. We complete our analysis by applying the chaos game to these symbolic sequences, and comparing the Iterated Function System (IFS) attractors found for the experimental sequences with appropriate null models.
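The contrast the abstract draws, equiprobable residues for blocks of size m = 1 (Dirichlet) versus a non-trivial Rényi spectrum for m > 1, can be checked numerically. A minimal sketch, assuming modulus k = 3 and block size m = 2, with the prime bound chosen for speed; this is an illustration of the phenomenon, not the paper's analysis.

```python
import math
from collections import Counter

def primes_up_to(n):
    """Simple sieve of Eratosthenes."""
    sieve = bytearray([1]) * (n + 1)
    sieve[0:2] = b"\x00\x00"
    for i in range(2, int(n ** 0.5) + 1):
        if sieve[i]:
            sieve[i * i :: i] = bytearray(len(range(i * i, n + 1, i)))
    return [i for i in range(n + 1) if sieve[i]]

def renyi_entropy(probs, alpha):
    """Rényi entropy of order alpha (natural log); alpha = 1 means Shannon."""
    if alpha == 1:
        return -sum(p * math.log(p) for p in probs if p > 0)
    return math.log(sum(p ** alpha for p in probs)) / (1.0 - alpha)

# Residues of the primes modulo k = 3 (primes > 3, so residues lie in {1, 2}).
res = [p % 3 for p in primes_up_to(100_000) if p > 3]

# Block size m = 1: Dirichlet's theorem says residues 1 and 2 are equiprobable.
n1 = len(res)
p_single = [c / n1 for c in Counter(res).values()]

# Block size m = 2: overlapping pairs of consecutive residues.
pairs = Counter(zip(res, res[1:]))
n2 = n1 - 1
p_pairs = [c / n2 for c in pairs.values()]

# Rényi spectrum of the pair distribution; non-constant in alpha iff non-uniform.
spectrum = {a: renyi_entropy(p_pairs, a) for a in (0.5, 1, 2)}
```

All four pairs occur (no forbidden patterns), yet repeated residues such as (1, 1) are visibly rarer than alternations, so the spectrum decreases strictly in α instead of sitting flat at log 4.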

20 pages, 7433 KB  
Article
A Transient Fault Recognition Method for an AC-DC Hybrid Transmission System Based on MMC Information Fusion
by Jikai Chen, Yanhui Dou, Yang Li, Jiang Li and Guoqing Li
Energies 2017, 10(1), 23; https://doi.org/10.3390/en10010023 - 26 Dec 2016
Cited by 13 | Viewed by 5071
Abstract
At present, research on the transfer of fault disturbance energy in modular multilevel converter based high-voltage direct current (MMC-HVDC) systems is still at an early stage. An urgent problem in further studies of the HVDC system is how to extract and analyze the fault features hidden in the electrical information of the MMC. To this end, this article analyzes the influence of AC transient disturbances on the electrical signals of the MMC. It is found that, after the discrete wavelet packet transform (DWPT), the energy distribution of the electrical signals differs between arms of the MMC within the same frequency bands. Renyi wavelet packet energy entropy (RWPEE) and Renyi wavelet packet time entropy (RWPTE) are proposed and applied to extract AC transient fault features from the electrical signals in the MMC. Using the feature extraction results of Renyi wavelet packet entropy (RWPE), a novel recognition method based on information fusion technology is put forward to recognize AC transient faults. Theoretical analysis and experimental results show that the proposed method can recognize transient AC faults.
(This article belongs to the Special Issue Electric Power Systems Research 2017)
