Entropy doi: 10.3390/e20110873

Authors: Zhe Wu Qiang Zhang Lixin Wang Lifeng Cheng Jingbo Zhou

It is difficult to analyze the coupling characteristics of rotating machinery fault signals under the influence of complex, nonlinear interference. This difficulty stems from the strong noise background of rotating machinery fault feature extraction and from weaknesses, such as mode mixing, in the existing Ensemble Empirical Mode Decomposition (EEMD) time–frequency analysis methods. To quantitatively study the nonlinear synchronous coupling and information transfer characteristics of rotating machinery fault signals across frequency scales under such interference, a new nonlinear signal processing method, the harmonic-assisted multivariate empirical mode decomposition method (HA-MEMD), is proposed in this paper. By adding auxiliary high-frequency harmonic channels and later removing them, the decomposition precision of the Intrinsic Mode Functions (IMFs) can be effectively improved and mode aliasing can be mitigated. Analysis of simulated signals confirms the effectiveness of this method. By combining HA-MEMD with the transfer entropy algorithm and applying it to the signal processing of rotating machinery, a fault detection method based on high-frequency harmonic-assisted multivariate empirical mode decomposition-transfer entropy (HA-MEMD-TE) was established. The main features of the mechanical transmission system were extracted by HA-MEMD, and the denoised signal was used for the transfer entropy calculation. An evaluation index of the rotating machinery state based on HA-MEMD-TE was established to quantitatively describe the degree of nonlinear coupling between signals and thereby evaluate and diagnose the operating state of the mechanical system.
By adding noise at different signal-to-noise ratios, the fault detection ability of the HA-MEMD-TE method under a strong noise background is investigated, demonstrating the method's reliability and robustness. In this paper, transfer entropy is applied to the field of rotating machinery fault diagnosis, which provides a new, effective method for early fault diagnosis and performance degradation state recognition of rotating machinery, and leads to relevant research conclusions.
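As a rough sketch of the transfer entropy computation underlying HA-MEMD-TE (the histogram estimator, equal-width binning, and history length of one sample are illustrative assumptions, not the authors' implementation):

```python
import numpy as np

def transfer_entropy(x, y, bins=8):
    """Histogram estimate of transfer entropy TE(X -> Y), in bits.

    Measures how much x[t] reduces uncertainty about y[t+1] beyond what
    y[t] already provides. Equal-width binning and a one-sample history
    are simplifying assumptions for illustration.
    """
    xd = np.digitize(x, np.histogram_bin_edges(x, bins)[1:-1])
    yd = np.digitize(y, np.histogram_bin_edges(y, bins)[1:-1])
    y_next, y_now, x_now = yd[1:], yd[:-1], xd[:-1]
    joint, _ = np.histogramdd((y_next, y_now, x_now), bins=(bins,) * 3)
    p_xyz = joint / joint.sum()                   # p(y', y, x)
    p_yx = p_xyz.sum(axis=0, keepdims=True)       # p(y, x)
    p_yy = p_xyz.sum(axis=2, keepdims=True)       # p(y', y)
    p_y = p_xyz.sum(axis=(0, 2), keepdims=True)   # p(y)
    nz = p_xyz > 0
    # TE = sum p(y', y, x) * log2[ p(y'|y, x) / p(y'|y) ]
    return float(np.sum(p_xyz[nz] * np.log2(
        (p_xyz * p_y)[nz] / (p_yx * p_yy)[nz])))
```

A signal that drives another should show a larger transfer entropy in the driving direction than in the reverse direction.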

Entropy doi: 10.3390/e20110872

Authors: Zhong Li Chenxu Wang Linye Yu Yong Gu Minxiang Pan Xiaohua Tan Hui Xu

The present work examines the effects of Sn addition on the magnetic properties and microstructure of FeCoNi(CuAl)0.8Snx (0 ≤ x ≤ 0.10) high-entropy alloys (HEAs). The results show that all the samples consist of a mixed structure of face-centered-cubic (FCC) and body-centered-cubic (BCC) phases. The addition of Sn promotes the formation of the BCC phase and also affects the shape of the Cu-rich nano-precipitates in the BCC matrix. The Curie temperature (Tc) of the FCC phase and the saturation magnetization (Ms) of the FeCoNi(CuAl)0.8Snx (0 ≤ x ≤ 0.10) HEAs increase greatly, while the remanence (Br) decreases, after the addition of Sn to the FeCoNi(CuAl)0.8 HEA. The thermomagnetic curves indicate that the phases of the FeCoNi(CuAl)0.8Snx (0 ≤ x ≤ 0.10) HEAs transform from FCC with low Tc to BCC with high Tc at temperatures of 600–700 K. This work provides a new idea for the potential application of FeCoNi(CuAl)0.8Snx (0 ≤ x ≤ 0.10) HEAs as soft magnets at high temperatures.

Entropy doi: 10.3390/e20110871

Authors: David Cuesta-Frau Daniel Novák Vacláv Burda Antonio Molina-Picó Borja Vargas Milos Mraz Petra Kavalkova Marek Benes Martin Haluzik

This paper analyses the performance of SampEn and one of its derivatives, Fuzzy Entropy (FuzzyEn), in the context of artifacted blood glucose time series classification. This is a difficult and practically unexplored framework, where the availability of more sensitive and reliable measures could have great clinical impact. Although the advent of new blood glucose monitoring technologies may reduce the incidence of the problems stated above, incorrect device or sensor manipulation, patient adherence, sensor detachment, time constraints, adoption barriers, or affordability can still result in relatively short and artifacted records, such as the ones analyzed in this paper or in other similar works. This study is aimed at characterizing the changes induced by such artifacts, enabling countermeasures to be arranged in advance when possible. Despite the presence of these disturbances, the results demonstrate that SampEn and FuzzyEn are sufficiently robust to achieve significant classification performance, using records obtained from patients with duodenal-jejunal exclusion. The classification results, with areas under the ROC curve (AUC) of up to 0.9, several tests yielding AUC values greater than 0.8, and a leave-one-out average classification accuracy of 80%, confirm the potential of these measures in this context despite the presence of artifacts, with SampEn performing slightly better than FuzzyEn.
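Sample entropy, the baseline measure in the study above, counts template matches of consecutive lengths. A direct O(n²) sketch (tolerance fraction and template counting are the common textbook conventions, not necessarily the authors' exact settings):

```python
import numpy as np

def sample_entropy(x, m=2, r_frac=0.2):
    """Sample entropy SampEn(m, r) of a 1-D series (a sketch).

    Counts template pairs of length m and m+1 within tolerance
    r = r_frac * std(x) under the Chebyshev distance, excluding
    self-matches. Window counts differ very slightly from the
    canonical definition (all overlapping windows are used).
    """
    x = np.asarray(x, dtype=float)
    r = r_frac * x.std()

    def count_matches(length):
        # All overlapping templates of the given length
        templ = np.lib.stride_tricks.sliding_window_view(x, length)
        count = 0
        for i in range(len(templ) - 1):
            # Chebyshev distance from template i to all later templates
            d = np.max(np.abs(templ[i + 1:] - templ[i]), axis=1)
            count += int(np.sum(d <= r))
        return count

    b, a = count_matches(m), count_matches(m + 1)
    return -np.log(a / b) if a > 0 and b > 0 else np.inf
```

Regular signals (many repeating templates) yield low SampEn; irregular signals yield high SampEn.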

Entropy doi: 10.3390/e20110870

Authors: Grace Villacrés Tobias Koch Aydin Sezgin Gonzalo Vazquez-Vilar

This paper studies a bursty interference channel, where the presence/absence of interference is modeled by a block-i.i.d. Bernoulli process that stays constant for a duration of T symbols (referred to as coherence block) and then changes independently to a new state. We consider both a quasi-static setup, where the interference state remains constant during the whole transmission of the codeword, and an ergodic setup, where a codeword spans several coherence blocks. For the quasi-static setup, we study the largest rate of a coding strategy that provides reliable communication at a basic rate and allows an increased (opportunistic) rate when there is no interference. For the ergodic setup, we study the largest achievable rate. We study how non-causal knowledge of the interference state, referred to as channel-state information (CSI), affects the achievable rates. We derive converse and achievability bounds for (i) local CSI at the receiver side only; (ii) local CSI at the transmitter and receiver side; and (iii) global CSI at all nodes. Our bounds allow us to identify when interference burstiness is beneficial and in which scenarios global CSI outperforms local CSI. The joint treatment of the quasi-static and ergodic setup further allows for a thorough comparison of these two setups.

Entropy doi: 10.3390/e20110868

Authors: Jie Liu Zhao Duan

In this study, a comparative analysis of the statistical index (SI), index of entropy (IOE), and weights of evidence (WOE) models was introduced to landslide susceptibility mapping, and the performance of the three models was validated and systematically compared. Shangnan County, one of the most landslide-prone areas of Shaanxi Province, China, was selected as the study area. Firstly, a series of reports, remote sensing images, and geological maps were collected, and field surveys were carried out to prepare a landslide inventory map. A total of 348 landslides were identified in the study area and randomly split into a training dataset (70%, 244 landslides) and a testing dataset (30%, 104 landslides). Thirteen conditioning factors were then employed, and the corresponding thematic data layers and landslide susceptibility maps were generated in ArcGIS software. Finally, the area under the curve (AUC) values were calculated for the training and testing datasets in order to validate and compare the performance of the three models. For the training dataset, the AUC plots showed that the WOE model had the highest accuracy rate of 76.05%, followed by the SI model (74.67%) and the IOE model (71.12%). For the testing dataset, the prediction accuracy rates for the SI, IOE, and WOE models were 73.75%, 63.89%, and 75.10%, respectively. It can be concluded that the WOE model had the best prediction capacity for landslide susceptibility mapping in Shangnan County. The landslide susceptibility map produced by the WOE model has profound geological and engineering significance for landslide hazard prevention and control in the study area and other similar areas.
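The weights-of-evidence model compared above assigns each factor class a positive and a negative weight from landslide/non-landslide pixel counts. A minimal sketch of that calculation (the variable names and the single-class interface are illustrative assumptions):

```python
import math

def weights_of_evidence(n_total, n_landslide, n_class, n_class_landslide):
    """Weights of evidence for one factor class (a sketch).

    n_total: pixels in the study area; n_landslide: landslide pixels;
    n_class: pixels in this factor class; n_class_landslide: landslide
    pixels inside the class. Returns (W_plus, W_minus, contrast), where
    contrast = W_plus - W_minus measures the class-landslide association.
    """
    # Probability of falling in the class, given landslide / no landslide
    p_b_l = n_class_landslide / n_landslide
    p_b_nl = (n_class - n_class_landslide) / (n_total - n_landslide)
    w_plus = math.log(p_b_l / p_b_nl)
    w_minus = math.log((1 - p_b_l) / (1 - p_b_nl))
    return w_plus, w_minus, w_plus - w_minus
```

A class that hosts a disproportionate share of landslides gets a positive W_plus and a positive contrast.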

Entropy doi: 10.3390/e20110869

Authors: Maurice A. de Gosson

We have shown in previous work that the equivalence of the Heisenberg and Schrödinger pictures of quantum mechanics requires the use of the Born and Jordan quantization rules. In the present work we give further evidence that the Born–Jordan rule is the correct quantization scheme for quantum mechanics. For this purpose we use correct short-time approximations to the action functional, initially due to Makri and Miller, and show that these lead to the desired quantization of the classical Hamiltonian.

Entropy doi: 10.3390/e20110867

Authors: Xingbin Liu Di Xiao Cong Liu

Quantum image encryption offers major advantages over its classical counterpart in terms of key space, computational complexity, and so on. A novel double quantum image encryption approach based on the quantum Arnold transform (QAT) and qubit random rotation is proposed in this paper, in which QAT is used to scramble pixel positions while the gray information is changed by random qubit rotation. Specifically, independent random qubit rotations are applied once in the spatial domain and once in the frequency domain, with the help of the quantum Fourier transform (QFT). The encryption process accomplishes pixel confusion and diffusion, and finally a noise-like cipher image is obtained. Numerical simulation and theoretical analysis verify that the method is valid and that it shows superior performance in security and computational complexity.
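The scrambling step relies on the Arnold transform. The paper's QAT acts on position qubits; the underlying classical permutation it implements, (x, y) → (x + y mod N, x + 2y mod N), can be sketched as follows (classical pixel shuffling only, not the quantum circuit):

```python
import numpy as np

def arnold_transform(img, iterations=1):
    """Classical Arnold (cat map) scrambling of a square image (a sketch).

    Applies the area-preserving map (x, y) -> (x + y mod N, x + 2y mod N)
    to pixel coordinates. The map matrix has determinant 1, so it is a
    bijection on the N x N grid and fully invertible.
    """
    assert img.shape[0] == img.shape[1], "Arnold transform needs a square image"
    n = img.shape[0]
    out = img
    for _ in range(iterations):
        x, y = np.meshgrid(np.arange(n), np.arange(n), indexing="ij")
        nx, ny = (x + y) % n, (x + 2 * y) % n
        scrambled = np.empty_like(out)
        scrambled[nx, ny] = out[x, y]  # move each pixel to its new position
        out = scrambled
    return out
```

Because the map is a permutation, scrambling rearranges the pixel values without altering them.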

Entropy doi: 10.3390/e20110866

Authors: Richard Cant Ayodeji Remi-Omosowon Caroline Langensiepen Ahmad Lotfi

In this paper, a novel approach to the container loading problem using a spatial entropy measure to bias a Monte Carlo Tree Search is proposed. The proposed algorithm generates layouts that achieve the goals of both fitting a constrained space and also having “consistency” or neatness that enables forklift truck drivers to apply them easily to real shipping containers loaded from one end. Three algorithms are analysed. The first is a basic Monte Carlo Tree Search, driven only by the principle of minimising the length of container that is occupied. The second is an algorithm that uses the proposed entropy measure to drive an otherwise random process. The third algorithm combines these two principles and produces superior results to either. These algorithms are then compared to a classical deterministic algorithm. It is shown that where the classical algorithm fails, the entropy-driven algorithms are still capable of providing good results in a short computational time.

Entropy doi: 10.3390/e20110865

Authors: Julian Gonzalez-Ayala Moises Santillán Maria Jesus Santos Antonio Calvo Hernández José Miguel Mateos Roco

The local stability of the dynamic evolution of the maximum-power and maximum-compromise (Omega) operation regimes of a low-dissipation heat engine is analyzed. The thermodynamic behavior of the trajectories toward the stationary state, after the operation regime is perturbed, displays a trade-off between stability, entropy production, efficiency, and power output. This allows stability and optimization to be considered as connected pieces of a single phenomenon. Trajectories inside the basin of attraction display the smallest entropy drops. Additionally, it was found that time constraints, related to irreversible and endoreversible behaviors, influence the thermodynamic evolution of the relaxation trajectories. The behavior of the evolution is analyzed in terms of the symmetries of the model and the applied thermal gradients.

Entropy doi: 10.3390/e20110864

Authors: Xuelian Zhou Yongchuan Tang

As a typical risk analysis tool in practical engineering, failure mode and effects analysis (FMEA) is a well-known method for risk prediction and prevention. However, how to quantify the uncertainty of the subjective assessments of FMEA experts and incorporate that uncertainty into the classical FMEA approach still needs further study. In this paper, we argue that the subjective assessments of FMEA experts can be used to model the weight of each expert, which can be regarded as a data-driven method for modeling ambiguous information in FMEA. Based on this new perspective, a modified FMEA approach is proposed in which the subjective uncertainty of FMEA experts is handled in the framework of Dempster–Shafer evidence theory (DST). In the improved approach, the ambiguity measure (AM), an entropy-like uncertainty measure in the DST framework, is applied to quantify the uncertainty degree of each FMEA expert. Then, the classical risk priority number (RPN) model is improved by aggregating an AM-based weight factor into the RPN function. A case study applying the new RPN model to aircraft turbine rotor blades verifies the applicability and usefulness of the proposed FMEA approach.
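One common form of the ambiguity measure is the Shannon entropy of the pignistic probability derived from a basic belief assignment. The sketch below illustrates that idea and one plausible way to fold AM-based expert weights into an RPN; the inverse-ambiguity weighting scheme here is an assumption for illustration, not necessarily the paper's exact formula:

```python
import math

def ambiguity_measure(bba):
    """Ambiguity measure (AM) of a basic belief assignment (a sketch).

    AM is the Shannon entropy of the pignistic probability BetP derived
    from the BBA: each focal element's mass is split evenly among its
    singletons. `bba` maps frozenset focal elements to masses summing to 1.
    """
    betp = {}
    for focal, mass in bba.items():
        for s in focal:
            betp[s] = betp.get(s, 0.0) + mass / len(focal)
    return -sum(p * math.log2(p) for p in betp.values() if p > 0)

def weighted_rpn(expert_scores, expert_bbas):
    """Aggregate (S, O, D) ratings with AM-based expert weights (a sketch).

    Less ambiguous experts (lower AM) receive larger weights; the RPN is
    the product of the weighted-average severity, occurrence, detection.
    """
    cred = [1.0 / (1.0 + ambiguity_measure(b)) for b in expert_bbas]
    w = [c / sum(cred) for c in cred]
    s = sum(wi * sc[0] for wi, sc in zip(w, expert_scores))
    o = sum(wi * sc[1] for wi, sc in zip(w, expert_scores))
    d = sum(wi * sc[2] for wi, sc in zip(w, expert_scores))
    return s * o * d
```

A fully committed expert has AM = 0, while total ignorance over k states gives AM = log2(k).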

Entropy doi: 10.3390/e20110863

Authors: Paulina Trybek Michal Nowakowski Jerzy Salowka Jakub Spiechowicz Lukasz Machura

Information theory provides a spectrum of nonlinear methods capable of grasping the internal structure of a signal together with insight into its complex nature. In this work, we discuss the usefulness of selected entropy techniques for describing the information carried by surface electromyography signals during colorectal cancer treatment. The electrical activity of the external anal sphincter can serve as a potential source of knowledge of the actual state of a patient who underwent a common surgery for rectal cancer in the form of anterior or lower anterior resection. The calculation of the Sample Entropy parameters has been extended to multiple time scales in terms of the Multiscale Sample Entropy. The specific values of the entropy measures and their dependence on the time scales were analyzed with regard to the time elapsed since the operation, the type of surgical treatment, and the depth within the rectal canal. The Mann–Whitney U test and the Friedman ANOVA indicate statistically significant differences in the estimated Sample Entropy among all stages of treatment and for all consecutive depths of the rectal area. Further analysis at multiple time scales signifies substantial differences among the compared stages of treatment in the group of patients who underwent lower anterior resection.
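Multiscale Sample Entropy evaluates Sample Entropy on progressively coarse-grained copies of the signal. The coarse-graining step, which averages consecutive non-overlapping windows, can be sketched as follows (the standard Costa-style procedure, not the authors' code):

```python
import numpy as np

def coarse_grain(x, scale):
    """Coarse-grain a series for Multiscale Sample Entropy (a sketch).

    At scale tau, consecutive non-overlapping windows of tau samples are
    averaged; SampEn is then computed on each coarse-grained series to
    build the multiscale entropy profile.
    """
    x = np.asarray(x, dtype=float)
    n = (len(x) // scale) * scale  # drop the incomplete trailing window
    return x[:n].reshape(-1, scale).mean(axis=1)
```

At scale 1 the original series is recovered; each larger scale shortens the series by that factor.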

Entropy doi: 10.3390/e20110862

Authors: Léo Viallon-Galinier Gaël Combe Vincent Richefeu Allbens Picardi Faria Atman

The probability distribution function (pdf) of grain displacements during the shear of a granular medium displays an unusual dependence on the shear-increment upscaling, as recently evinced (see "experimental validation of a nonextensive scaling law in confined granular media"). Basically, the pdf of grain displacements has clear nonextensive (q-Gaussian) features at small scales but approaches Gaussian characteristics at large shear window scales: the granulence effect. Here, we extend this analysis to a larger system (more grains in the experimental setup), which exhibits a severe shear band fault during the macroscopic straining. We calculate the pdf of grain displacements and the dependence of the q-statistics on the shear increment. This analysis reveals a singular behavior of q at large scales, displaying a non-monotonic dependence on the shear increment. By means of an independent image analysis, we demonstrate that this singular non-monotonicity can be associated with the emergence of a shear band within the confined system. We show that the exact point where the q-value inverts its tendency coincides with the emergence of a giant percolation cluster along the system, caused by the shear band. We believe that this original approach, using statistical mechanics tools to identify shear bands, can be a very useful piece in solving the complex puzzle of the rheology of dense granular systems.

Entropy doi: 10.3390/e20110861

Authors: Yukio Ohsawa

A method is presented to detect earthquake precursors from time series data on earthquakes in a target region. The Regional Entropy of Seismic Information (RESI) is an index that represents the average influence of an earthquake in a target region on the diversity of the clusters over which earthquake foci are distributed. Based on a simple qualitative model of the dynamics of the land crust, it is hypothesized that the saturation occurring after an increase in RESI precedes the activation of earthquakes. This hypothesis is validated against the earthquake catalog: the temporal change was found to correlate with the activation of earthquakes in Japanese regions one to two years ahead of the real activation, more reliably than the compared baseline methods.

Entropy doi: 10.3390/e20110860

Authors: Marcos Hortelano Richard B. Reilly Francisco Castells Raquel Cervigón

Orthostatic intolerance syndrome occurs when the autonomic nervous system is incapacitated and fails to respond to the demands associated with the upright position. Assessing this syndrome among the elderly population is important in order to prevent falls, but remains challenging. The goal of this work was to determine the relationship between orthostatic intolerance (OI) and the cardiovascular response to exercise through the analysis of heart rate and blood pressure. More specifically, the behavior of these cardiovascular variables was evaluated in terms of the refined composite multiscale fuzzy entropy (RCMFE), measured at different scales. The dataset was composed of 65 older subjects, of whom 44.6% (n = 29) were OI symptomatic and 55.4% (n = 36) were not. No significant differences in age or gender were found between symptomatic and asymptomatic OI participants. When heart rate was evaluated, the largest differences between groups were observed during the recovery period immediately after exercise. With respect to blood pressure and other hemodynamic parameters, the most significant results were obtained in the post-exercise stage. In any case, the symptomatic OI group exhibited higher irregularity in the measured parameters, as higher RCMFE levels were obtained at all time scales. This information could be very helpful for a better understanding of cardiovascular instability, as well as for recognizing risk factors for falls and impairment of functional status.

Entropy doi: 10.3390/e20110859

Authors: Maomao Hou Zhiyuan Lin Jingnan Chen Yaming Zhai Qiu Jin Fenglin Zhong

Numerous indicators of the plant-soil system should be taken into consideration when developing an appropriate agricultural water conservancy project, and the entropy evaluation method offers excellent prospects for optimizing agricultural management schemes. To investigate the impact of different buried depths (30, 45, 60, 75, 90, and 105 cm) of subsurface drainage pipes on greenhouse plant-soil systems, tomato was employed as the plant material, and the marketable yield, fruit sugar-to-acid ratio, soil electrical conductivity, nitrogen loss rate, and crop water and fertilizer use efficiency were observed. Based on these indicators, the entropy evaluation method was used to select the optimal buried depth of the subsurface drainage pipes. Both the objective and subjective weight calculations indicated that tomato yield and soil electrical conductivity were relatively more crucial than the other indexes; their comprehensive weights were 0.43 and 0.34, respectively. The 45 cm buried depth provided the optimal comprehensive benefits, with an entropy evaluation value of 0.94. Under the 45 cm buried depth, the loss rate of soil available nitrogen was 13.9%, the decrease rate of soil salinity was 49.2%, and the tomato yield, sugar-to-acid ratio, nitrogen use efficiency, and water use efficiency were 112 kg·ha⁻¹, 8.3, 39.7%, and 42.0%, respectively.
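The objective weights in such entropy evaluation schemes typically come from the entropy weight method: indicators that vary more across the alternatives carry more information and receive larger weights. A minimal sketch (benefit-type indicators assumed; normalization details may differ from the paper's):

```python
import numpy as np

def entropy_weights(data):
    """Objective indicator weights via the entropy weight method (sketch).

    `data` is an (alternatives x indicators) matrix of positive,
    benefit-type values. Each column is turned into a probability
    distribution; low-entropy (more diverse) indicators get more weight.
    """
    data = np.asarray(data, dtype=float)
    n = data.shape[0]
    p = data / data.sum(axis=0)            # column-wise proportions
    with np.errstate(divide="ignore", invalid="ignore"):
        plogp = np.where(p > 0, p * np.log(p), 0.0)
    e = -plogp.sum(axis=0) / np.log(n)     # normalized entropy per indicator
    d = 1.0 - e                            # degree of diversification
    return d / d.sum()                     # weights summing to 1
```

An indicator that is identical across all alternatives carries no information and gets (near-)zero weight.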

Entropy doi: 10.3390/e20110858

Authors: Yimin Huang Xingli Chen Qiuxiang Li Xiaogang Ma

The internet has provided a new means for manufacturers to reach consumers. Against the background of widespread multichannel sales in China, and based on a literature review of service games and multichannel supply chains, this paper builds a multichannel dynamic service game model in which the retailer operates an offline channel and the manufacturer operates an online channel and offers customers the option to buy online and pick up from the retailer's store (BOPS). The manufacturer and the retailer take maximizing their channel profits as their business objectives and play a channel service game under optimal pricing. We carry out a theoretical analysis of the model and perform numerical simulations from the perspectives of entropy theory, game theory, and chaotic dynamics. The results show that the stability of the system weakens as the service elasticity coefficient increases and that it is unaffected by the feedback parameter adjustment of the retailer. The BOPS channel strengthens the cooperation between the manufacturer and the retailer and moderates the conflict between the online and offline channels. The system goes into a chaotic state, and its entropy increases, when the manufacturer adjusts his/her service decision too quickly. In a chaotic state, the system is sensitive to initial conditions and service input is difficult to predict; the manufacturer and retailer need additional information to make the system clear, or can use feedback control to delay or eliminate the onset of chaos.

Entropy doi: 10.3390/e20110857

Authors: Khalil El Hindi Hussien AlSalamn Safwan Qassim Saad Al Ahmadi

Text classification is one domain in which the naive Bayesian (NB) learning algorithm performs remarkably well. However, further improving its performance using ensemble-building techniques has proved to be a challenge because NB is a stable algorithm. This work shows that, while an ensemble of NB classifiers achieves little or no improvement in classification accuracy, an ensemble of fine-tuned NB classifiers can achieve a remarkable improvement in accuracy. We propose a fine-tuning algorithm for text classification that is both more accurate and less stable than the NB algorithm and the fine-tuning NB (FTNB) algorithm. This instability makes it more suitable than the FTNB algorithm for building ensembles of classifiers using bagging. Our empirical experiments, using 16 benchmark text-classification data sets, show significant improvement for most data sets.
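The bagging scheme referred to above trains each base classifier on a bootstrap resample and combines predictions by majority vote; it pays off only when the base learner is unstable, which is why a less stable fine-tuned NB helps. A generic sketch of the voting and resampling machinery (base models here are any objects with a `.predict` method; not the paper's implementation):

```python
from collections import Counter

import numpy as np

def bootstrap_indices(n, rng):
    """Sample n indices with replacement -- one bagging round."""
    return rng.integers(0, n, size=n)

def bagged_predict(models, X):
    """Majority-vote prediction for a bagged ensemble (a sketch).

    Each model in `models` is assumed to expose .predict(X) returning one
    label per sample (e.g. a fine-tuned NB classifier trained on its own
    bootstrap resample); ties go to the first label Counter encounters.
    """
    votes = np.array([m.predict(X) for m in models])  # (n_models, n_samples)
    return np.array([Counter(col).most_common(1)[0][0] for col in votes.T])
```

Training would loop over `bootstrap_indices`, fit one classifier per resample, and pass the fitted list to `bagged_predict`.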

Entropy doi: 10.3390/e20110856

Authors: Leila Schneps Richard Overill David Lagnado

Testing of evidence in criminal cases can be limited by temporal or financial constraints or by the fact that certain tests may be mutually exclusive, so choosing the tests that will have maximal impact on the final result is essential. In this paper, we assume that a main hypothesis, evidence for it and possible tests for existence of this evidence are represented in the form of a Bayesian network, and use three different methods to measure the impact of a test on the main hypothesis. We illustrate the methods by applying them to an actual digital crime case provided by the Hong Kong police. We conclude that the Kullback–Leibler divergence is the optimal method for selecting the tests with the highest impact.
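The Kullback-Leibler divergence favored above measures how far a test result would move the distribution over the main hypothesis. A minimal sketch (here P would be the posterior after a candidate test result and Q the current belief; discrete distributions assumed):

```python
import math

def kl_divergence(p, q):
    """Kullback-Leibler divergence D(P || Q) in bits (a sketch).

    p and q are discrete distributions over the same outcomes; terms
    with p_i = 0 contribute nothing by convention. Larger values mean
    the test result would shift belief in the hypothesis more.
    """
    return sum(pi * math.log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)
```

A test whose result barely moves the posterior scores near zero; a decisive test scores high.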

Entropy doi: 10.3390/e20110855

Authors: Bogeun Gwak

We investigate the laws of thermodynamics and the validity of the cosmic censorship conjecture in the Kerr–Newman–de Sitter black hole under charged particle absorption. Here, the black hole undergoes infinitesimal changes because of the momenta carried by the particle entering it. The cosmic censorship conjecture is tested by whether the black hole can be overcharged beyond the extremal condition under absorption. The changes in the black hole violate the second law of thermodynamics. Furthermore, this is related to the cosmic censorship conjecture. To resolve this violation, we impose a reference energy of the particle at the asymptotic region based on the first law of thermodynamics. Under imposition of the reference energy, the absorption satisfies the laws of thermodynamics, and the extremal black hole cannot be overcharged. Thus, the cosmic censorship conjecture is valid under the absorption.

Entropy doi: 10.3390/e20110854

Authors: Yakir Aharonov Eliahu Cohen Mordecai Waegell Avshalom C. Elitzur

While quantum reality can be probed through measurements, the Two-State Vector Formalism (TSVF) reveals a subtler reality prevailing between measurements. Under special pre- and post-selections, odd physical values emerge. This unusual picture calls for a deeper study. Instead of the common, wave-based picture of quantum mechanics, we suggest a new, particle-based perspective: Each particle possesses a definite location throughout its evolution, while some of its physical variables (characterized by deterministic operators, some of which obey nonlocal equations of motion) are carried by "mirage particles" accounting for its unique behavior. Within the time interval between pre- and post-selection, the particle gives rise to a horde of such mirage particles, of which some can be negative. What appears to be "no-particle", known to give rise to interaction-free measurement, is in fact a self-canceling pair of positive and negative mirage particles, which can be momentarily split and cancel out again. Feasible experiments can give empirical evidence for these fleeting phenomena. In this respect, the Heisenberg ontology is shown to be conceptually advantageous compared to the Schrödinger picture. We review several recent advances, discuss their foundational significance and point out possible directions for future research.

Entropy doi: 10.3390/e20110853

Authors: David Cuesta-Frau Pau Miró-Martínez Sandra Oltra-Crespo Jorge Jordán-Núñez Borja Vargas Paula González Manuel Varela-Entrecanales

Many entropy-related methods for signal classification have been proposed and exploited successfully in the last several decades. However, it is sometimes difficult to find the optimal measure and the optimal parameter configuration for a specific purpose or context. Suboptimal settings may therefore produce subpar results and not even reach the desired level of significance. In order to increase the signal classification accuracy in these suboptimal situations, this paper proposes statistical models created from uncorrelated measures that exploit the possible synergies between them. The methods employed are permutation entropy (PE), approximate entropy (ApEn), and sample entropy (SampEn). Since PE is based on subpattern ordinal differences, whereas ApEn and SampEn are based on subpattern amplitude differences, we hypothesized that combining PE with one of the other methods would enhance their individual performance. The dataset was composed of body temperature records, for which we had not obtained a classification accuracy above 80% with a single measure, either in this study or in previous ones. The results confirmed that the classification accuracy rose to 90% when combining PE and ApEn with a logistic model.
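Permutation entropy, the ordinal half of the combination above, maps each short window to the rank ordering of its values and measures the entropy of the resulting pattern distribution. A minimal sketch (order m = 3 and the normalization by log2(m!) are common conventions, not necessarily the paper's settings):

```python
import math

def permutation_entropy(x, m=3):
    """Normalized permutation entropy of order m (a sketch).

    Each length-m window is mapped to the ordinal pattern of its values;
    the Shannon entropy of the pattern distribution, normalized by
    log2(m!), lies in [0, 1]: 0 for a fully ordered series, close to 1
    for a fully random one.
    """
    counts = {}
    for i in range(len(x) - m + 1):
        # Ordinal pattern: the argsort of the window's values
        pattern = tuple(sorted(range(m), key=lambda k: x[i + k]))
        counts[pattern] = counts.get(pattern, 0) + 1
    total = sum(counts.values())
    h = -sum(c / total * math.log2(c / total) for c in counts.values())
    return h / math.log2(math.factorial(m))
```

Unlike amplitude-based measures such as ApEn and SampEn, this depends only on the ordering of samples, which is what makes the two families complementary.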

Entropy doi: 10.3390/e20110852

Authors: Kenta Yamada Hideki Takayasu Misako Takayasu

We introduce a systematic method to estimate an economic indicator published by the Japanese government by analyzing big Japanese blog data. The explanatory variables are monthly word frequencies. As candidate words to illustrate the economic index, we adopt 1352 words from the economics and industry section of the Nikkei thesaurus. From this large set of words, our method automatically selects those with strong correlation with the economic indicator and resolves statistical difficulties such as spurious correlation and overfitting. As a result, our model reasonably reproduces the real economic index. The announcement of an economic index by the government usually has a time lag, while our proposed method can operate in real time.

Entropy doi: 10.3390/e20110851

Authors: Nasir Shehzad Ahmed Zeeshan Rahmat Ellahi Saman Rashidi

In this paper, the internal energy losses due to entropy generation in the non-Darcy Poiseuille flow of a silver-water nanofluid in porous media are investigated analytically. Spherical silver (Ag) nanoparticles with volume fractions of 0.3%, 0.6%, and 0.9% are utilized. Four illustrative models are considered: (i) heat transfer irreversibility (HTI), (ii) fluid friction irreversibility (FFI), (iii) Joule dissipation irreversibility (JDI), and (iv) non-Darcy porous media irreversibility (NDI). The governing equations of continuity, momentum, energy, and entropy generation are simplified by taking long-wavelength approximations on the channel walls. The resulting highly nonlinear coupled ordinary differential equations are solved analytically with the help of the homotopy analysis method. It is shown that the minimum and maximum averaged entropy generation are observed at nanoparticle volume fractions of 0.3% and 0.9%, respectively. A rise in entropy is also evident with an increase in the pressure gradient. The current analysis provides an adequate theoretical estimate for the low-cost purification of drinking water by silver nanoparticles in an industrial process.

Entropy doi: 10.3390/e20110850

Authors: Huer Sun Chao Wu Xiaohua Liang Qunfeng Zeng

Weak compound fault features are difficult to extract from a gearbox because the signal components are complex and inter-modulated. To solve this problem, an approach, abbreviated MRPE-MOMEDA, for extracting the weak fault features of a transmission based on multipoint optimal minimum entropy deconvolution adjustment (MOMEDA) and permutation entropy is proposed in the present paper. The periodic impact signal has low complexity and thus a relatively small permutation entropy, while the amplitude of the impact is relatively large. Exploiting these properties, the multipoint reciprocal permutation entropy (MRPE) is proposed to track the impact fault source among the weak features of gearbox compound faults; the impact fault period is indicated through MRPE. MOMEDA achieves the signal denoising: the optimal filter coefficients are solved using MOMEDA, which exhibits outstanding performance in suppressing noise in gearbox signals with periodic impacts. The results from the transmission show that the proposed method can simultaneously identify multiple faults on a driving gear in the 4th gear of the transmission.

Entropy doi: 10.3390/e20110849

Authors: Praveen Sathiyamoorthi Jae Wung Bae Peyman Asghari-Rad Jeong Min Park Jung Gi Kim Hyoung Seop Kim

Annealing of severely plastically deformed materials is expected to produce a good combination of strength and ductility, as has been widely demonstrated in conventional materials. In the present study, a high-pressure torsion processed CoCrNi medium entropy alloy, consisting of a single face-centered cubic (FCC) phase with a grain size of ~50 nm, was subjected to different annealing conditions, and the effect on microstructure and mechanical behavior was investigated. The annealed high-pressure torsion processed CoCrNi alloy exhibits partial to nearly full recrystallization depending on the annealing temperature and time. The samples annealed at 700 °C for 2 min exhibit a very fine grain size, a high fraction of low-angle grain boundaries, and a high kernel average misorientation value, indicating a partially recrystallized microstructure. The samples annealed for a longer duration (>2 min) exhibit a relatively larger grain size, a low fraction of low-angle grain boundaries, and a low kernel average misorientation value, indicating a nearly fully recrystallized microstructure. These different microstructures significantly influence the uniform elongation, tensile strength, and work hardening rate. The sample annealed at 700 °C for 15 min exhibits a remarkable combination of tensile strength (~1090 MPa) and strain to failure (~41%).

Entropy doi: 10.3390/e20110848

Authors: Myoung Cho Moo Choi

The minimization of a free energy is often regarded as the key principle in understanding how the brain works and how its structure forms. In particular, a statistical-mechanics-based neural network model is expected to allow one to interpret many aspects of neural firing and learning in terms of general concepts and mechanisms of statistical physics. Nevertheless, defining the free energy of a neural system is usually an intricate problem without an evident solution. After the pioneering work by Hopfield, several statistical-mechanics-based models have suggested a variety of definitions of the free energy or the entropy of a neural system. Among those, the Feynman machine, proposed recently, defines the free energy of a neural system via the Feynman path integral formulation with an explicit time variable. In this study, we first give a brief review of the relevant previous models, paying attention to their problematic aspects, and examine how the Feynman machine overcomes several vulnerable points of those models and derives the firing or learning rule of a (biological) neural system as the extremum state of the free energy. Specifically, the model reveals that the biological learning mechanism known as spike-timing-dependent plasticity is related to the free-energy minimization principle. Fundamentally, computing and learning in the Feynman machine are based on the exact spike timings of neurons, as in a biological neural system. We discuss the consequences of adopting an explicit time variable in modeling a neural system and of applying the free-energy minimization principle to understanding phenomena in the brain.

Entropy doi: 10.3390/e20110847

Authors: Shuting Wan Lei Chen Longjiang Dou Jianping Zhou

As high-voltage circuit breakers (HVCBs) are directly related to the safety and stability of a power grid, it is of great significance to carry out fault diagnoses of HVCBs. To accurately identify the operating states of HVCBs, a novel mechanical fault diagnosis method based on multi-feature entropy fusion (MFEF) and a hybrid classifier is proposed. MFEF involves the decomposition of vibration signals of HVCBs into several intrinsic mode functions using variational mode decomposition (VMD) and the calculation of a multi-feature entropy by integrating three Shannon entropies. Principal component analysis (PCA) is then used to reduce the dimension of the multi-feature entropy to achieve an effective fusion of features for selecting the feature vector. The detection of an unknown fault in HVCBs is achieved using support vector data description (SVDD) trained on normal-state samples and specific fault samples. On this basis, the identification and classification of the known states are realized by a support vector machine (SVM). Three faults (i.e., closing spring force decrease fault, buffer spring invalid fault, and opening spring force decrease fault) were simulated on a real SF6 HVCB to test the feasibility of the proposed method. The detection accuracies for the unknown fault are 100%, 87.5%, and 100%, respectively, when each of the three faults is assumed to be the unknown fault. Comparative experiments show that SVM alone cannot detect the unknown fault, and that a one-class support vector machine (OCSVM) has a weaker ability to detect the unknown fault than SVDD. For known-state classification, the MFEF method achieved an accuracy of 100%, while a single-feature method only achieved 75%. These results indicate that the proposed method combining MFEF with a hybrid classifier is more efficient and robust than traditional methods.
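MFEF fuses several Shannon-entropy features computed over the VMD modes. As an illustration of the general idea (a hypothetical simplification: VMD itself and the specific three entropies of the paper are omitted), the Shannon entropy of the energy distribution across decomposed modes can be sketched as:

```python
import math

def energy_entropy(components):
    """Shannon entropy of the energy distribution across signal
    components (e.g., modes produced by a decomposition such as VMD).
    Uniform energy spread gives maximal entropy; energy concentrated
    in one mode gives zero."""
    energies = [sum(x * x for x in c) for c in components]
    total = sum(energies)
    probs = [e / total for e in energies if e > 0]
    return -sum(p * math.log(p) for p in probs)
```

Such scalar entropy features, stacked per signal, would form the vector that PCA then compresses before classification.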

Entropy doi: 10.3390/e20110846

Authors: Ali J. Chamkha Fatih Selimefendigil

MHD free convection inside a triangular-wave-shaped corrugated porous cavity with Cu-water nanofluid is numerically studied with the finite element method. The influences of the Grashof number (10^4 &le; Gr &le; 10^6), Hartmann number (0 &le; Ha &le; 50), Darcy number (10^&minus;4 &le; Da &le; 10^&minus;1) and solid volume fraction of the nanoparticles (0 &le; ϕ &le; 0.05) on the convective flow features are examined. It is observed that increasing the Grashof number and Darcy number enhances the heat transfer, while the effect is opposite for the Hartmann number. As the corrugation frequency of the triangular wave increases, the local and averaged heat transfer rates decrease, an effect that is more pronounced at higher values of the Grashof and Darcy numbers. Normalized total entropy generation increases as the Darcy number and solid volume fraction of the nanoparticles increase and decreases as the Hartmann number increases, for both flat and corrugated wall configurations.

Entropy doi: 10.3390/e20110845

Authors: Zhu

The flocculation of cohesive sediment plays an important role in morphological changes to coastal areas, dredging operations in navigational canals, sediment siltation in reservoirs and lakes, and the variation of water quality in estuarine waters. Many studies have recently been conducted to formulate a turbulence-induced flocculation model (described by a characteristic floc size as a function of flocculation time) for cohesive sediment by virtue of theoretical analysis, numerical modeling, and/or experimental observation. However, a probabilistic study formulating the flocculation model is still lacking in the literature. The present study, therefore, aims to derive an explicit expression for the flocculation of cohesive sediment in a turbulent fluid environment based on two common entropy theories: Shannon entropy and Tsallis entropy. Treating the characteristic floc size as a random variable, this study derives an explicit expression for it as a function of flocculation time by maximizing the entropy function subject to a constraint equation, using a hypothesis regarding the cumulative distribution function of floc size. It was found that the Shannon entropy and Tsallis entropy theories lead to the same expression. Furthermore, the derived expression was tested with experimental data from the literature and the results were compared with those of existing deterministic models. The expression agrees well with the experimental data and predicts the logarithmic growth pattern of the data more accurately than the other models, whereas, for a sigmoid growth pattern of experimental data, the model of Keyvani and Strom or that of Son and Hsu could be the better choice for floc size prediction. Finally, the maximum capacity of floc size growth, a key parameter in this expression, was found to exhibit an empirical power relationship with the flow shear rate.

Entropy doi: 10.3390/e20110844

Authors: Wen-Hua Cui Jun Ye

In order to quantify the fuzziness in the simplified neutrosophic setting, this paper proposes a generalized distance-based entropy measure and a dimension root entropy measure of simplified neutrosophic sets (NSs) (containing interval-valued and single-valued NSs) and verifies their properties. Then, a comparison with existing related interval-valued NS entropy measures is carried out through a numerical example to demonstrate the feasibility and rationality of the presented generalized distance-based entropy and dimension root entropy measures of simplified NSs. Lastly, a decision-making example is presented to illustrate their applicability; the decision results indicate that the presented entropy measures are effective and reasonable. Hence, this study enriches simplified neutrosophic entropy theory and its measure approaches.

Entropy doi: 10.3390/e20110843

Authors: Congxu Zhu Guojun Wang Kehui Sun

This paper presents an improved cryptanalysis of a chaos-based image encryption scheme that integrates permutation, diffusion, and linear transformation processes. It was found that the equivalent key streams and all the unknown parameters of the cryptosystem can be recovered by our chosen-plaintext attack algorithm. Both a theoretical analysis and an experimental validation are given in detail. Based on the analysis of the defects in the original cryptosystem, an improved color image encryption scheme was further developed. By using an image content&ndash;related approach to generate the diffusion arrays and by interweaving diffusion and confusion, the security of the cryptosystem was enhanced. The experimental results and security analysis demonstrate the security superiority of the improved cryptosystem.

Entropy doi: 10.3390/e20110842

Authors: Lipeng Pan Yong Deng

How to measure the uncertainty of a basic probability assignment (BPA) function is an open issue in Dempster&ndash;Shafer (D&ndash;S) theory. The main contribution of this paper is a new belief entropy for measuring the uncertainty of a BPA. The proposed belief entropy is based on Deng entropy and on a probability interval consisting of lower and upper probabilities. In addition, under certain conditions, it reduces to Shannon entropy. Numerical examples are used to illustrate the efficiency of the new belief entropy in measuring uncertainty.
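The baseline the paper builds on, Deng entropy, has a simple closed form: each focal element's mass is spread over its 2^|A| &minus; 1 non-empty subsets. A minimal sketch of that baseline (not the authors' new interval-based entropy):

```python
import math

def deng_entropy(bpa):
    """Deng entropy of a basic probability assignment.
    bpa: dict mapping frozenset (focal element) -> mass, masses sum to 1.
    E_d = -sum m(A) * log2( m(A) / (2**|A| - 1) ).
    For a BPA on singletons only, this reduces to Shannon entropy."""
    e = 0.0
    for focal, mass in bpa.items():
        if mass > 0:
            e -= mass * math.log2(mass / (2 ** len(focal) - 1))
    return e
```

For example, masses concentrated on singletons recover the Shannon value, while mass on a composite set contributes extra non-specificity.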

Entropy doi: 10.3390/e20110841

Authors: Gonzalo Castañeda Juan Romero-Padilla

In recent years, analytical tools of network theory have provided strong empirical support to the well-known hypothesis that regions develop through the local learning of capabilities (tacit productive knowledge). In this paper, we compare two indexes of competitiveness (or accumulated capabilities) for a subnational database of 32 Mexican states in the period 2004–2014. We find that Endogenous Fitness (i.e., region fitness and product complexity are derived jointly using only a Mexican exports database) has a better performance than Exogenous Fitness (i.e., product complexity comes from a world exports database and fitness is the sum of the complexity scores for the region’s competitive products). The performance criterion is established with the indicator’s capacity to meet a requirement of growth predictability: the existence of at least one laminar (ordered) regime in the fitness–income plane. In the Mexican data, Endogenous Fitness is a reliable predictor of per capita GDP in two distinct areas of the plane: one of continuous progress and opportunities, and another of stagnation and deteriorating fitness. The predictive capacity of this indicator becomes clear only when the metrics’ calculations are filtered by removing raw petroleum or oil-dependent states, while such capacity is robust to the inclusion of tourism—another important industry of the Mexican economy.

Entropy doi: 10.3390/e20110840

Authors: Frédéric Barbaresco

We introduce a poly-symplectic extension of Souriau Lie groups thermodynamics based on the higher-order model of statistical physics introduced by Ingarden. This extended model could be used for small-data analytics and machine learning on Lie groups. Souriau&rsquo;s geometric theory of heat is well adapted to describing the probability density (maximum-entropy Gibbs density) of data living on groups or on homogeneous manifolds. For small-data analytics (rarefied gases, sparse statistical surveys, …), the maximum-entropy density should take higher-order moment constraints into account (the Gibbs density is not defined by the first moment alone; fluctuations require 2nd- and higher-order moments), as introduced by Ingarden. We use a poly-symplectic model introduced by Christian Günther, replacing the symplectic form by a vector-valued form. The poly-symplectic approach generalizes the Noether theorem, the existence of moment mappings, the Lie algebra structure of the space of currents, the (non-)equivariant cohomology, and the classification of G-homogeneous systems. The formalism is covariant, i.e., no special coordinates or coordinate systems on the parameter space are used to construct the Hamiltonian equations. We underline the contextures of these models and the process of building these generic structures. We also introduce a more synthetic Koszul definition of the Fisher metric, based on the Souriau model, which we name the Souriau-Fisher metric. This Lie groups thermodynamics is the bedrock for Lie group machine learning, providing a fully covariant maximum-entropy Gibbs density based on representation theory (the symplectic structure of coadjoint orbits for the Souriau non-equivariant model associated with a class of cohomology).

Entropy doi: 10.3390/e20110839

Authors: Shuntaro Takahashi Kumiko Tanaka-Ishii

Neural language models have drawn a lot of attention for their strong ability to predict natural language text. In this paper, we estimate the entropy rate of natural language with state-of-the-art neural language models. To obtain the estimate, we consider the cross entropy, a measure of the prediction accuracy of neural language models, under the theoretically ideal conditions that they are trained with an infinitely large dataset and receive an infinitely long context for prediction. We empirically verify that the effects of the two parameters, the training data size and context length, on the cross entropy consistently obey a power-law decay with a positive constant for two different state-of-the-art neural language models with different language datasets. Based on the verification, we obtained 1.12 bits per character for English by extrapolating the two parameters to infinity. This result suggests that the upper bound of the entropy rate of natural language is potentially smaller than the previously reported values.
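The extrapolation described above amounts to fitting the observed cross entropies with a power-law decay toward a positive constant, f(x) = a&middot;x^(&minus;b) + c, and reading off c as the estimate at infinity. A rough sketch of such a fit (the function name and the grid-search strategy are illustrative, not the authors' procedure):

```python
import math

def fit_power_law_plus_constant(xs, ys, c_grid):
    """Fit y ~ a * x**(-b) + c by scanning candidate asymptotes c and,
    for each, doing a log-log least-squares fit of (y - c) against x.
    Returns (a, b, c); c is the extrapolated value at x -> infinity."""
    best = None
    for c in c_grid:
        pts = [(math.log(x), math.log(y - c)) for x, y in zip(xs, ys) if y > c]
        if len(pts) < 2:
            continue
        n = len(pts)
        mx = sum(p[0] for p in pts) / n
        my = sum(p[1] for p in pts) / n
        sxx = sum((p[0] - mx) ** 2 for p in pts)
        sxy = sum((p[0] - mx) * (p[1] - my) for p in pts)
        slope = sxy / sxx
        intercept = my - slope * mx
        a, b = math.exp(intercept), -slope
        # Score the candidate by the residual on the original scale
        sse = sum((a * x ** (-b) + c - y) ** 2 for x, y in zip(xs, ys))
        if best is None or sse < best[0]:
            best = (sse, a, b, c)
    return best[1:]
```

In practice one would fit such a curve to cross entropies measured at increasing data sizes (or context lengths) and report the asymptote c.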

Entropy doi: 10.3390/e20110838

Authors: Rudolf Hanel Stefan Thurner

Depending on context, the term entropy is used for a thermodynamic quantity, a measure of available choice, a quantity to measure information, or, in the context of statistical inference, a maximum configuration predictor. For systems in equilibrium or processes without memory, the mathematical expression for these different concepts of entropy appears to be the so-called Boltzmann&ndash;Gibbs&ndash;Shannon entropy, H. For processes with memory, such as driven or self-reinforcing processes, this is no longer true: the different entropy concepts lead to distinct functionals that generally differ from H. Here we focus on the maximum configuration entropy (which predicts empirical distribution functions) in the context of driven dissipative systems. We develop the corresponding framework and derive the entropy functional that describes the distribution of observable states as a function of the details of the driving process. We do this for sample space reducing (SSR) processes, which provide an analytically tractable model for driven dissipative systems with controllable driving. The fact that a consistent framework for a maximum configuration entropy exists for arbitrarily driven non-equilibrium systems opens the possibility of deriving a full statistical theory of driven dissipative systems of this kind. This provides us with the technical means needed to derive a thermodynamic theory of driven processes based on a statistical theory. We discuss the Legendre structure for driven systems.

Entropy doi: 10.3390/e20110837

Authors: Maria Luisa Dalla Chiara Hector Freytes Roberto Giuntini Roberto Leporini Giuseppe Sergioli

Quantum computation theory has inspired new forms of quantum logic, called quantum computational logics, where formulas are supposed to denote pieces of quantum information, while logical connectives are interpreted as special examples of quantum logical gates. The most natural semantics for these logics is a form of holistic semantics, where meanings behave in a contextual way. In this framework, the concept of quantum probability can assume different forms. We distinguish an absolute concept of probability, based on the idea of quantum truth, from a relative concept of probability (a form of transition-probability, connected with the notion of fidelity between quantum states). Quantum information has brought about some intriguing epistemic situations. A typical example is represented by teleportation-experiments. In some previous works we have studied a quantum version of the epistemic operations &ldquo;to know&rdquo;, &ldquo;to believe&rdquo;, &ldquo;to understand&rdquo;. In this article, we investigate another epistemic operation (which is informally used in a number of interesting quantum situations): the operation &ldquo;being probabilistically informed&rdquo;.

Entropy doi: 10.3390/e20110836

Authors: Stephen Fox Adrian Kotelba

Entropy in workplaces is situated amidst workers and their work. In this paper, findings are reported from a study encompassing psychomotor work by three types of workers: human, cyborg and robot; together with three aspects of psychomotor work: setting, composition and uncertainty. The Principle of Least Psychomotor Action (PLPA) is introduced and modelled in terms of situated entropy. PLPA is founded upon the Principle of Least Action. Situated entropy modelling of PLPA is informed by theoretical studies concerned with connections between information theory and thermodynamics. Four contributions are provided in this paper. First, the situated entropy of PLPA is modelled in terms of positioning, performing and perfecting psychomotor skills. Second, with regard to workers, PLPA is related to the state-of-the-art in human, cyborg and robot psychomotor skills. Third, with regard to work, situated entropy is related to engineering of work settings, work composition and work uncertainty. Fourth, PLPA and modelling situated entropy are related to debate about the future of work. Overall, modelling situated entropy is introduced as a means of objectively modelling relative potential of humans, cyborgs, and robots to carry out work with least action. This can introduce greater objectivity into debates about the future of work.

Entropy doi: 10.3390/e20110835

Authors: Yang Wang Kun Zhang Yihui Feng Yansen Li Weiqi Tang Bingchen Wei

CoCrFeCuNi high-entropy alloys (HEAs) prepared by arc melting were irradiated with a 100 keV He+ ion beam. Irradiation-induced volume swelling and hardening were evaluated. When the dose reached 5.0 &times; 10^17 ions/cm^2, the Cu-rich phases exhibited more severe volume swelling than the matrix phases. This result indicates that the Cu-rich phases are favorable sites for the nucleation and gathering of He bubbles. X-ray diffraction showed that all diffraction peak intensities decreased regularly. This reduction suggests a loosening of the irradiated layer, and thus reduced crystallinity, under He+ ion irradiation. The Nix-Gao model was used to fit the measured hardness in order to obtain a hardness value H0 that excludes the indentation size effect. At ion doses of 2.5 &times; 10^17 ions/cm^2 and 5.0 &times; 10^17 ions/cm^2, the HEAs showed obvious hardening, which can be attributed to the formation of large amounts of irradiation defects. At an ion dose of 1.0 &times; 10^18 ions/cm^2, hardening was reduced, owing to the exfoliation of the original irradiation layer combined with recovery induced by long-term thermal spikes. This study is important for exploring the potential uses of HEAs under extreme irradiation conditions.

Entropy doi: 10.3390/e20110833

Authors: Andrea Napoletano Andrea Tacchella Luciano Pietronero

This work contributes to the literature in the field of innovation by proposing a quantitative approach for the prediction of the timing and location of patenting activity. In a recent work, it was shown that focusing on couples of technological codes allows for the formation of testable predictions of innovation events, defined as the first time two codes appear together in a patent. In particular, the construction of the vector space of codes and the introduction of the context similarity metric allows for a quantitative analysis of technological progress. Here, we move from that result and we show that, through context similarity, it is possible to assign to countries a score which measures the probability of being the first to patent a potential innovation. In other words, we show that we can not only estimate the likelihood that a potential innovation will be patented in the imminent future, but also forecast where it will be patented.

Entropy doi: 10.3390/e20110834

Authors: Vinicius M. Netto Edgardo Brigatti João Meirelles Fabiano L. Ribeiro Bruno Pace Caio Cacholas Patricia Sanches

From physics to the social sciences, information is now seen as a fundamental component of reality. However, a form of information seems still underestimated, perhaps precisely because it is so pervasive that we take it for granted: the information encoded in the very environment we live in. We still do not fully understand how information takes the form of cities, and how our minds deal with it in order to learn about the world, make daily decisions, and take part in the complex system of interactions we create as we live together. This paper addresses three related problems that need to be solved if we are to understand the role of environmental information: (1) the physical problem: how can we preserve information in the built environment? (2) The semantic problem: how do we make environmental information meaningful? and (3) the pragmatic problem: how do we use environmental information in our daily lives? Attempting to devise a solution to these problems, we introduce a three-layered model of information in cities, namely environmental information in physical space, environmental information in semantic space, and the information enacted by interacting agents. We propose forms of estimating entropy in these different layers, and apply these measures to emblematic urban cases and simulated scenarios. Our results suggest that ordered spatial structures and diverse land use patterns encode information, and that aspects of physical and semantic information affect coordination in interaction systems.

Entropy doi: 10.3390/e20110832

Authors: Tamás Fülöp Róbert Kovács Ádám Lovas Ágnes Rieth Tamás Fodor Mátyás Szücs Péter Ván Gyula Gróf

The non-Fourier heat conduction phenomenon at room temperature is analyzed from various aspects. The first is experimental: in what form the phenomenon occurs and how we treated it. It is demonstrated that the Guyer-Krumhansl equation can be the next appropriate extension of Fourier&rsquo;s law for room-temperature phenomena in the modeling of heterogeneous materials. The second approach provides an interpretation of generalized heat conduction equations using a simple thermo-mechanical background. Here, Fourier heat conduction is coupled to elasticity via thermal expansion, resulting in a particular generalized heat equation for the temperature field. Both aforementioned approaches show the size dependence of non-Fourier heat conduction. Finally, a third approach, called pseudo-temperature modeling, is presented. It is shown that a non-Fourier temperature history can be produced by mixing different solutions of Fourier&rsquo;s law. This kind of explanation points to the heat conduction mechanics underlying non-Fourier phenomena.

Entropy doi: 10.3390/e20110831

Authors: Özlem Ömer

In this article, we demonstrate that a quantal response statistical equilibrium approach to the US housing market, with the help of the maximum entropy method of modeling, is a powerful way of revealing different characteristics of housing market behavior before, during, and after the recent housing market crash in the US. In this line, a maximum entropy approach to the quantal response statistical equilibrium model (QRSE) is employed to model housing market dynamics in different phases of the most recent housing market cycle, using the S&amp;P Case-Shiller housing price index for the 20 largest metropolitan regions and the Freddie Mac housing price index (FMHPI) for 367 metropolitan cities in the US between 2000 and 2015. The estimated model parameters provide an alternative way to understand and explain the behavior of economic agents and market dynamics, calling into question traditional economic theory, which takes as given a rational, utility-maximizing representative agent with self-fulfilling expectations.

Entropy doi: 10.3390/e20110830

Authors: Xulun Ye Jieyu Zhao Yu Chen

Multi-manifold clustering is among the most fundamental tasks in signal processing and machine learning. Although the existing multi-manifold clustering methods are quite powerful, learning the cluster number automatically from data is still a challenge. In this paper, a novel unsupervised generative clustering approach within the Bayesian nonparametric framework has been proposed. Specifically, our manifold method automatically selects the cluster number with a Dirichlet Process (DP) prior. Then, a DP-based mixture model with constrained Mixture of Gaussians (MoG) is constructed to handle the manifold data. Finally, we integrate our model with the k-nearest neighbor graph to capture the manifold geometric information. An efficient optimization algorithm has also been derived to do the model inference and optimization. Experimental results on synthetic datasets and real-world benchmark datasets exhibit the effectiveness of this new DP-based manifold method.

Entropy doi: 10.3390/e20110829

Authors: Andrei Y. Khrennikov Elena R. Loubenets

We introduce the general class of symmetric two-qubit states guaranteeing the perfect correlation or anticorrelation of Alice and Bob outcomes whenever some spin observable is measured at both sites. We prove that, for all states from this class, the maximal violation of the original Bell inequality is upper bounded by 3/2 and specify the two-qubit states where this quantum upper bound is attained. The case of two-qutrit states is more complicated. Here, for all two-qutrit states, we obtain the same upper bound 3/2 for violation of the original Bell inequality under Alice and Bob spin measurements, but we have not yet been able to show that this quantum upper bound is the least one. We discuss experimental consequences of our mathematical study.

Entropy doi: 10.3390/e20110828

Authors: Jixia Wang Yameng Zhang

This paper is dedicated to the study of geometric average Asian call option pricing under non-extensive statistical mechanics for a time-varying coefficient diffusion model. We employed the non-extensive Tsallis entropy distribution, which can describe the leptokurtosis and fat-tail characteristics of returns, to model the motion of the underlying asset price. Considering that economic variables change over time, we allowed the drift and diffusion terms in our model to be time-varying functions. We used the Itô formula, the Feynman&ndash;Kac formula, and the Padé ansatz to obtain a closed-form solution for geometric average Asian option pricing with a dividend yield for the time-varying model. Moreover, the simulation study shows that the results obtained by our method fit the simulation data better than those of Zhao et al. From the analysis of real data, we identify the value of q that best fits the real stock data, and the result shows that investors underestimate risk when using the Black&ndash;Scholes model compared to our model.

Entropy doi: 10.3390/e20110827

Authors: Chundi Jiang Wei Yang Yu Guo Fei Wu Yinggan Tang

Spatial correlation information between pixels is considered to be very important in thresholding methods. However, it is often ignored, and unsatisfactory segmentation results may therefore be obtained. To overcome this shortcoming, we propose a new image segmentation approach that takes into account not only pixels&rsquo; spatial information but also their gray levels. First, a non-local mean filter is applied to the image. Then the filtered image and the original image together are used to build a two-dimensional histogram, called the non-local mean two-dimensional histogram. Finally, a minimum relative entropy criterion is used to select the ideal thresholding vector. Since the non-local mean filtering is performed in a neighborhood of the current pixel, it carries the spatial information of that pixel. Segmentation results on several images illustrate the effectiveness of the proposed thresholding method, whose segmentation accuracy is greatly improved compared to most existing thresholding methods.
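The minimum relative entropy criterion can be illustrated in the plain one-dimensional setting (the paper's contribution, the non-local mean 2D histogram, is omitted here); a sketch of the closely related minimum cross-entropy (Li) threshold over a gray-level histogram:

```python
import math

def min_cross_entropy_threshold(hist):
    """Li's 1-D minimum cross-entropy threshold.
    hist[g] = number of pixels with gray level g.
    Returns the threshold t minimizing the cross entropy between the
    image and its two-class (below/above t) approximation."""
    best_t, best_eta = None, None
    for t in range(1, len(hist)):
        n0, n1 = sum(hist[:t]), sum(hist[t:])
        if n0 == 0 or n1 == 0:
            continue
        mu0 = sum(g * hist[g] for g in range(t)) / n0
        mu1 = sum(g * hist[g] for g in range(t, len(hist))) / n1
        if mu0 <= 0 or mu1 <= 0:
            continue  # log undefined when a class mean is zero
        # Criterion to minimize: -sum over classes of (first moment) * log(class mean)
        eta = -(n0 * mu0 * math.log(mu0) + n1 * mu1 * math.log(mu1))
        if best_eta is None or eta < best_eta:
            best_t, best_eta = t, eta
    return best_t
```

The proposed method applies the analogous criterion to the joint (original, non-local-mean-filtered) histogram, selecting a thresholding vector instead of a scalar.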

Entropy doi: 10.3390/e20110826

Authors: Conor Finn Joseph T. Lizier

Information is often described as a reduction of uncertainty associated with a restriction of possible choices. Despite appearing in Hartley&rsquo;s foundational work on information theory, there is a surprising lack of a formal treatment of this interpretation in terms of exclusions. This paper addresses the gap by providing an explicit characterisation of information in terms of probability mass exclusions. It then demonstrates that different exclusions can yield the same amount of information and discusses the insight this provides about how information is shared amongst random variables&mdash;lack of progress in this area is a key barrier preventing us from understanding how information is distributed in complex systems. The paper closes by deriving a decomposition of the mutual information which can distinguish between differing exclusions; this provides surprising insight into the nature of directed information.

Entropy doi: 10.3390/e20110825

Authors: Jerry Gibson

Shannon introduced the fields of information theory and rate distortion theory in his landmark 1948 paper [...]

Entropy doi: 10.3390/e20110824

Authors: Vezha Boboeva Romain Brasselet Alessandro Treves

A statistical analysis of semantic memory should reflect the complex, multifactorial structure of the relations among its items. Still, a dominant paradigm in the study of semantic memory has been the idea that the mental representation of concepts is structured along a simple branching tree spanned by superordinate and subordinate categories. We propose a generative model of item representation with correlations that overcomes the limitations of a tree structure. The items are generated through &ldquo;factors&rdquo; that represent semantic features or real-world attributes. The correlation between items has its source in the extent to which items share such factors and the strength of such factors: if many factors are balanced, correlations are overall low; whereas if a few factors dominate, they become strong. Our model allows for correlations that are neither trivial nor hierarchical, but may reproduce the general spectrum of correlations present in a dataset of nouns. We find that such correlations reduce the storage capacity of a Potts network to a limited extent, so that the number of concepts that can be stored and retrieved in a large, human-scale cortical network may still be of order 10^7, as originally estimated without correlations. When this storage capacity is exceeded, however, retrieval fails completely only for balanced factors; above a critical degree of imbalance, a phase transition leads to a regime where the network still extracts considerable information about the cued item, even if not recovering its detailed representation: partial categorization seems to emerge spontaneously as a consequence of the dominance of particular factors, rather than being imposed ad hoc. We argue this to be a relevant model of semantic memory resilience in Tulving&rsquo;s remember/know paradigms.

Entropy doi: 10.3390/e20110823

Authors: Hui Fang Victoria Wang Motonori Yamaguchi

Deep Learning (DL) networks are recent revolutionary developments in artificial intelligence research. Typical networks are stacked by groups of layers that are further composed of many convolutional kernels or neurons. In network design, many hyper-parameters need to be defined heuristically before training in order to achieve high cross-validation accuracies. However, accuracy evaluation from the output layer alone is not sufficient to specify the roles of the hidden units in associated networks. This results in a significant knowledge gap between DL&rsquo;s wider applications and its limited theoretical understanding. To narrow the knowledge gap, our study explores visualization techniques to illustrate the mutual information (MI) in DL networks. The MI is a theoretical measurement, reflecting the relationship between two sets of random variables even if their relationship is highly non-linear and hidden in high-dimensional data. Our study aims to understand the roles of DL units in classification performance of the networks. Via a series of experiments using several popular DL networks, it shows that the visualization of MI and its change patterns between the input/output with the hidden layers and basic units can facilitate a better understanding of these DL units&rsquo; roles. Our investigation on network convergence suggests a more objective manner to potentially evaluate DL networks. Furthermore, the visualization provides a useful tool to gain insights into the network performance, and thus to potentially facilitate the design of better network architectures by identifying redundancy and less-effective network units.

]]>Entropy doi: 10.3390/e20110822

Authors: Yanjie Zheng Yunsheng Zhao Shen Liang Hongfei Zheng

Based on the reversible heat engine model, a theoretical analysis is carried out for the economic performance of a solar tower power plant (STPP) combined with multi-effect desalination (MED). Taking the total revenue of the output power and the fresh water yield per unit investment cost as the economic objective function, the most economical working condition of the system is given by analyzing the influence of the system investment composition, the receiver operating temperature, the concentration ratio, the efficiency of the endoreversible heat engine, and the relative water price on the economic parameters of the system. The variation curves of the economic objective function are given as the main parameters are varied. The results show that the ratio of water price to electricity price, or relative price index, has a significant impact on system economy. When the water price is relatively low, the economic efficiency of the overall system worsens as the number of desalination effects increases. Only when the price of fresh water rises to a certain value does it make sense to increase the number of effects. Additionally, the threshold of the fresh water price to the electricity price ratio is 0.22. Under the conditions of the current price index and the heliostat (or reflector) cost ratio, the system economy can be maximized by selecting the optimum receiver temperature, the endoreversible heat engine efficiency, and the optimum concentration ratio. Given the receiver surface temperature and the endoreversible heat engine efficiency, increasing the concentration ratio of the heliostat system favors the system economy.

]]>Entropy doi: 10.3390/e20110821

Authors: Xiong Gan Hong Lu Guangyou Yang Jing Liu

In this paper, composite multiscale weighted permutation entropy (CMWPE) is proposed to evaluate the complexity of nonlinear time series, and the advantage of the CMWPE method is verified through analyzing a simulated signal. Meanwhile, considering the complex nonlinear dynamic characteristics of fault rolling bearing signals, a rolling bearing fault diagnosis approach based on CMWPE, joint mutual information (JMI) feature selection, and a k-nearest-neighbor (KNN) classifier (CMWPE-JMI-KNN) is proposed. For CMWPE-JMI-KNN, CMWPE is utilized to extract the fault rolling bearing features, JMI is applied for sensitive feature selection, and the KNN classifier is employed for identifying different rolling bearing conditions. Finally, the proposed CMWPE-JMI-KNN approach is used to analyze an experimental dataset; the analysis results indicate that the proposed approach can effectively identify different fault rolling bearing conditions.
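As background, the single-scale building block of CMWPE, weighted permutation entropy, can be sketched as below (variance weighting per the common definition; the composite multiscale extension, not shown, additionally averages this quantity over all coarse-grained versions of the series at each scale):

```python
import numpy as np
from math import factorial

def weighted_permutation_entropy(x, m=3, tau=1):
    """Single-scale weighted permutation entropy, normalised to [0, 1].
    Windows are weighted by their variance (assumes a non-constant signal)."""
    x = np.asarray(x, dtype=float)
    n = len(x) - (m - 1) * tau
    # embedding: one row per length-m window
    windows = np.array([x[i:i + (m - 1) * tau + 1:tau] for i in range(n)])
    weights = windows.var(axis=1)        # variance-based weights
    patterns = windows.argsort(axis=1)   # ordinal pattern of each window
    acc = {}
    for pat, w in zip(map(tuple, patterns), weights):
        acc[pat] = acc.get(pat, 0.0) + w
    p = np.array(list(acc.values()))
    p = p / p.sum()                      # weighted pattern distribution
    return float(-(p * np.log(p)).sum() / np.log(factorial(m)))
```

A monotone ramp produces a single ordinal pattern and hence entropy 0, while white noise approaches the maximum of 1.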

]]>Entropy doi: 10.3390/e20110820

Authors: Lina Yao Wei Wu Yunfeng Kang Lifan Li

In this paper, a fault-tolerant control scheme is presented for a class of stochastic distribution collaborative control systems, which are composed of three subsystems connected in series to complete the control target. The radial basis function neural network is used to approximate the output probability density function of the third subsystem, which is also the output of the entire system. When a fault occurs in the first subsystem, an adaptive diagnostic observer is designed to estimate the value of the fault. Since the first subsystem does not have the ability of self-recovery, minimum rational entropy controllers are designed in the latter subsystems to compensate for the influence of the fault and minimize the entropy of the system output. A numerical simulation is given to verify the effectiveness of the proposed scheme.

]]>Entropy doi: 10.3390/e20110819

Authors: Xiaomin Guo Ripeng Liu Pu Li Chen Cheng Mingchuan Wu Yanqiang Guo

Information-theoretically provable unique true random numbers, which cannot be correlated or controlled by an attacker, can be generated based on quantum measurement of the vacuum state and universal-hashing randomness extraction. Quantum entropy in the measurements decides the quality and security of the random number generator (RNG). At the same time, it directly determines the extraction ratio of true randomness from the raw data; in other words, it strongly affects the generation rate of quantum random bits. In this work, we commit to enhancing the quantum entropy content in the vacuum noise based quantum RNG. We have taken into account the main factors in this proposal to establish the theoretical model of quantum entropy content, including the effects of classical noise, the optimum dynamical analog-digital convertor (ADC) range, the local gain and the electronic gain of the homodyne system. We demonstrate that by amplifying the vacuum quantum noise, abundant quantum entropy is extractable in the post-processing step even when the classical noise excursion, which may be deliberately induced by an eavesdropper, is large. Based on this discussion and the fact that the bandwidth of quantum vacuum noise is infinite, we propose a large dynamical range and moderate transimpedance amplifier (TIA) gain to pursue higher local oscillator (LO) amplification of the vacuum quadrature and broader detection bandwidth in the homodyne system. A high true randomness extraction ratio together with a high sampling rate is attainable. Experimentally, a true randomness extraction ratio of 85.3% is achieved by finite enhancement of the laser power of the LO even when the classical noise excursion of the raw data is significant.

]]>Entropy doi: 10.3390/e20110818

Authors: Yong-qiang Feng Qian-hao Luo Qian Wang Shuang Wang Zhi-xia He Wei Zhang Xin Wang Qing-song An

Mixture working fluids can effectively reduce energy loss at heat sources and heat sinks, and therefore enhance the organic Rankine cycle (ORC) performance. The entropy and entransy dissipation analyses of a basic ORC system to recover low-grade waste heat using three mixture working fluids (R245fa/R227ea, R245fa/R152a and R245fa/pentane) have been investigated in this study. The basic ORC includes four components: an expander, a condenser, a pump and an evaporator. The heat source temperature is 120 &deg;C while the condenser temperature is 20 &deg;C. The effects of four operating parameters (evaporator outlet temperature, condenser temperature, pinch point temperature difference, degree of superheat), as well as the mass fraction, on entransy dissipation and entropy generation were examined. Results demonstrated that the entransy dissipation is insensitive to the mass fraction of R245fa. The entropy generation distributions at the evaporator for R245fa/pentane, R245fa/R152a and R245fa/R227ea are in the ranges of 66&ndash;74%, 68&ndash;80% and 66&ndash;75%, respectively, with the corresponding entropy generation at the condenser in the ranges of 13&ndash;21%, 4&ndash;17% and 11&ndash;21%, respectively, while those at the expander for R245fa/pentane, R245fa/R152a and R245fa/R227ea approach 13%, 15% and 14%, respectively. The optimal mass fraction of R245fa for the minimum entropy generation is 0.6 using R245fa/R152a.

]]>Entropy doi: 10.3390/e20110817

Authors: MHR Khouzani Pasquale Malacaria

Information theory, as the mathematics of communication and storage of information, and game theory, as the mathematics of adversarial and cooperative strategic behaviour, are each successful fields of research on their own. [...]

]]>Entropy doi: 10.3390/e20110816

Authors: Enrico Celeghini Manuel Gadella Mariano A. del Olmo

In this paper, we present recent results in harmonic analysis on the real line R and on the half-line R+, which show a close relation between Hermite and Laguerre functions, respectively, their symmetry groups, and Fourier analysis. This can be done in terms of a unified framework based on the use of rigged Hilbert spaces. We find a relation between the universal enveloping algebras of the symmetry groups and the fractional Fourier transform. The results obtained are relevant in quantum mechanics as well as in signal processing, as Fourier analysis has a close relation with signal filters. In addition, we introduce some new results concerning a discretized Fourier transform on the circle. We introduce new functions on the circle, constructed with the use of Hermite functions, with interesting properties under Fourier transformations.

]]>Entropy doi: 10.3390/e20110815

Authors: João F. D. Rodrigues Michael L. Lahr

When working with economic accounts it may occur that multiple estimates of a single datum exist, with different degrees of uncertainty or data quality. This paper addresses the problem of defining a method that can reconcile conflicting estimates, given best guess and uncertainty values. We proceeded from first principles, using two different routes. First, under an entropy-based approach, the data reconciliation problem is addressed as a particular case of a wider data balancing problem, and an alternative setting is found in which the multiple estimates are replaced by a single one. Afterwards, under an axiomatic approach, a set of properties is defined, which characterizes the ideal data reconciliation method. Under both approaches, the conclusion is that the formula for the reconciliation of best guesses is a weighted arithmetic average, with the inverse of uncertainties as weights, and that the formula for the reconciliation of uncertainties is a harmonic average.
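The closing result translates directly into code. The following sketch (with illustrative inputs, not from the paper) applies both reconciliation formulas: an arithmetic average weighted by inverse uncertainties for the best guess, and a harmonic average for the uncertainty:

```python
def reconcile(best_guesses, uncertainties):
    """Reconcile conflicting estimates of a single datum:
    best guess  -> arithmetic mean weighted by inverse uncertainties,
    uncertainty -> harmonic mean of the uncertainties."""
    weights = [1.0 / u for u in uncertainties]
    total = sum(weights)
    guess = sum(w * g for w, g in zip(weights, best_guesses)) / total
    uncertainty = len(uncertainties) / total   # harmonic mean
    return guess, uncertainty

# two hypothetical estimates of the same datum
g, u = reconcile([10.0, 14.0], [1.0, 2.0])
# the more certain estimate (u = 1) pulls the reconciled value toward 10
```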

]]>Entropy doi: 10.3390/e20110814

Authors: Orazio Angelini Tiziana Di Matteo

Among several developments, the field of Economic Complexity (EC) has notably seen the introduction of two new techniques. One is the Bootstrapped Selective Predictability Scheme (SPSb), which can provide quantitative forecasts of the Gross Domestic Product of countries. The other, Hidden Markov Model (HMM) regularisation, denoises the datasets typically employed in the literature. We contribute to EC along three different directions. First, we prove the convergence of the SPSb algorithm to a well-known statistical learning technique known as Nadaraya-Watson kernel regression. The latter has significantly lower time complexity, produces deterministic results, and is interchangeable with SPSb for the purpose of making predictions. Second, we study the effects of HMM regularisation on the Product Complexity and logPRODY metrics, for which a model of time evolution has been recently proposed. We find confirmation for the original interpretation of the logPRODY model as describing the change in the global market structure of products, with new insights allowing a new interpretation of the Complexity measure, for which we propose a modification. Third, we explore new effects of regularisation on the data. We find that it reduces noise, and observe for the first time that it increases nestedness in the export network adjacency matrix.
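The Nadaraya-Watson regression to which SPSb is shown to converge has a compact closed form; a minimal Gaussian-kernel sketch (bandwidth and data are illustrative, not from the paper):

```python
import numpy as np

def nadaraya_watson(x_train, y_train, x_query, bandwidth=1.0):
    """Nadaraya-Watson kernel regression with a Gaussian kernel:
    the prediction at x is a kernel-weighted average of training targets."""
    d = x_query[:, None] - x_train[None, :]      # pairwise differences
    k = np.exp(-0.5 * (d / bandwidth) ** 2)      # Gaussian kernel weights
    return (k * y_train).sum(axis=1) / k.sum(axis=1)

x = np.linspace(0, 10, 200)
y = np.sin(x)
pred = nadaraya_watson(x, y, np.array([np.pi / 2]), bandwidth=0.3)
# pred[0] is close to sin(pi/2) = 1, slightly smoothed by the kernel
```

Being a closed-form weighted average, each prediction is deterministic and costs one pass over the training set, which is the lower time complexity the abstract refers to.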

]]>Entropy doi: 10.3390/e20110813

Authors: José M. Amigó Sámuel G. Balogh Sergio Hernández

Entropy appears in many contexts (thermodynamics, statistical mechanics, information theory, measure-preserving dynamical systems, topological dynamics, etc.) as a measure of different properties (energy that cannot produce work, disorder, uncertainty, randomness, complexity, etc.). In this review, we focus on the so-called generalized entropies, which from a mathematical point of view are nonnegative functions defined on probability distributions that satisfy the first three Shannon&ndash;Khinchin axioms: continuity, maximality and expansibility. While these three axioms are expected to be satisfied by all macroscopic physical systems, the fourth axiom (separability or strong additivity) is in general violated by non-ergodic systems with long range forces, this having been the main reason for exploring weaker axiomatic settings. Currently, non-additive generalized entropies are being used also to study new phenomena in complex dynamics (multifractality), quantum systems (entanglement), soft sciences, and more. Besides going through the axiomatic framework, we review the characterization of generalized entropies via two scaling exponents introduced by Hanel and Thurner. In turn, the first of these exponents is related to the diffusion scaling exponent of diffusion processes, as we also discuss. Applications are addressed as the description of the main generalized entropies advances.
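As a concrete example (ours, not drawn from the review itself), the Tsallis entropy satisfies the first three Shannon&ndash;Khinchin axioms but replaces additivity with pseudo-additivity for independent systems A and B:

```latex
S_q(p) = \frac{1}{q-1}\left(1 - \sum_i p_i^{\,q}\right), \qquad
S_q(A,B) = S_q(A) + S_q(B) + (1-q)\,S_q(A)\,S_q(B),
```

with the Shannon entropy recovered in the limit q &rarr; 1.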

]]>Entropy doi: 10.3390/e20110812

Authors: Yan-Xin Zhuang Xiu-Lan Zhang Xian-Yu Gu

The effect of annealing temperature on the microstructure, phase constituents and mechanical properties of Al0.5CoCrFeMoxNi high-entropy complex alloys has been investigated at a fixed annealing time (10 h). Annealing at 600 &deg;C has no obvious effect on their microstructures, while annealing at 800&ndash;1200 &deg;C enhances the precipitation of the (Al,Ni)-rich ordered BCC phase or/and the (Cr,Mo)-rich &sigma; phase, and thereby greatly affects the microstructure and mechanical properties of the alloys. All the annealed Al0.5CoCrFeNi alloys are composed of FCC and (Al,Ni)-rich ordered BCC phases; the phase constituent of the Al0.5CoCrFeMo0.1Ni alloy changes from FCC + BCC (600 &deg;C) to FCC + BCC + &sigma; (800 &deg;C) and then back to FCC + BCC (1100 &deg;C); the phase constituents of the Al0.5CoCrFeMo0.2Ni and Al0.5CoCrFeMo0.3Ni alloys change from FCC + BCC + &sigma; to FCC + BCC with the annealing temperature rising from 600 to 1200 &deg;C; while all the annealed Al0.5CoCrFeMo0.4Ni and Al0.5CoCrFeMo0.5Ni alloys consist of FCC, BCC and &sigma; phases. The phase constituents of most of the alloys investigated are in good agreement with the results calculated with the Thermo-Calc program. The alloys annealed at 800 &deg;C under the current investigation conditions have relatively fine precipitates and microstructure, and thereby higher hardness and yield stress.

]]>Entropy doi: 10.3390/e20110811

Authors: Miguel Pineda Michail Stamatakis

Catalytic surface reaction networks exhibit nonlinear dissipative phenomena, such as bistability. Macroscopic rate law descriptions predict that the reaction system resides on one of the two steady-state branches of the bistable region for an indefinite period of time. However, the smaller the catalytic surface, the greater the influence of coverage fluctuations, given that their amplitude normally scales as the square root of the system size. Thus, one can observe fluctuation-induced transitions between the steady states. In this work, a model for the bistable catalytic CO oxidation on small surfaces is studied. After a brief introduction of the average stochastic modelling framework and its corresponding deterministic limit, we discuss the non-equilibrium conditions necessary for bistability. The entropy production rate, an important thermodynamic quantity measuring dissipation in a system, is compared across the two approaches. We conclude that, in our catalytic model, the most favorable non-equilibrium steady state is not necessarily the state with the maximum or minimum entropy production rate.

]]>Entropy doi: 10.3390/e20110810

Authors: Hongling Zhang Lei Zhang Xinyu Liu Qiang Chen Yi Xu

As a classic high-entropy alloy system, CoCrFeNiMn is widely investigated. In the present work, we used ZrH2 powders and atomized CoCrFeNiMn powders as raw materials to prepare CoCrFeNiMnZrx (x = 0, 0.2, 0.5, 0.8, 1.0) alloys by mechanical alloying (MA), followed by spark plasma sintering (SPS). During the MA process, a small amount of Zr (x &le; 0.5) can be completely dissolved into the CoCrFeNiMn matrix; when the Zr content is above 0.5, the ZrH2 is in excess. After SPS, the CoCrFeNiMn alloy remains a single face-centered cubic (FCC) solid solution, and the CoCrFeNiMnZrx (x &ge; 0.2) alloys have two distinct microstructural domains: one is a single FCC phase without Zr, the other is a Zr-rich microstructure composed of FCC phase, B2 phase, Zr2Ni7, and &sigma; phase. The multi-phase microstructures can be attributed to the large lattice strain and negative enthalpy of mixing caused by the addition of Zr. It is worth noting that two types of nanoprecipitates (body-centered cubic (BCC) phase and Zr2Ni7) are precipitated in the Zr-rich region. These can significantly increase the yield strength of the alloys.

]]>Entropy doi: 10.3390/e20110809

Authors: Razieh Rastgoo Kourosh Kiani Sergio Escalera

In this paper, a deep learning approach, the Restricted Boltzmann Machine (RBM), is used to perform automatic hand sign language recognition from visual data. We evaluate how the RBM, as a deep generative model, is capable of generating the distribution of the input data for an enhanced recognition of unseen data. Two modalities, RGB and Depth, are considered in the model input in three forms: original image, cropped image, and noisy cropped image. Five crops of the input image are used and the hands in these cropped images are detected using a Convolutional Neural Network (CNN). After that, three types of detected hand images are generated for each modality and input to RBMs. The outputs of the RBMs for the two modalities are fused in another RBM in order to recognize the output sign label of the input image. The proposed multi-modal model is trained on all or part of the American alphabet and digits from four publicly available datasets. We also evaluate the robustness of the proposal against noise. Experimental results show that the proposed multi-modal model, using crops and the RBM fusing methodology, achieves state-of-the-art results on the Massey University Gesture Dataset 2012, the American Sign Language (ASL) Fingerspelling Dataset from the University of Surrey&rsquo;s Center for Vision, Speech and Signal Processing, the NYU dataset, and the ASL Fingerspelling A dataset.

]]>Entropy doi: 10.3390/e20100808

Authors: Tianhua Ju Xueyong Ding Yingyi Zhang Weiliang Chen Xiangkui Cheng Bo Wang Jingxin Dai Xinlin Yan

It is important to know the activity interaction parameters between components in melts in the process of metallurgy. However, it is considerably difficult to measure them experimentally, so one still relies to a large extent on theoretical calculations. In this paper, the first-order activity interaction parameter e_S^j of element j on sulphur in Fe-based melts at 1873 K is investigated with a calculation model established by combining the Miedema model and the Toop-Hillert geometric model, while also considering excess entropy and mixing enthalpy. We consider two strategies, with or without using excess entropy in the calculations. Our results show that: (1) the predicted values are in good agreement with those recommended by the Japan Society for the Promotion of Science (JSPS); and (2) the agreement is even better when excess entropy is considered in the calculations. In addition, the deviations of our theoretical results from the experimental values, |e_S^j(exp) &minus; e_S^j(cal)|, depend on element j&rsquo;s location in the periodic table.

]]>Entropy doi: 10.3390/e20100807

Authors: Marco Baldovin Fabio Cecconi Massimo Cencini Andrea Puglisi Angelo Vulpiani

The goal of Science is to understand phenomena and systems in order to predict their development and gain control over them. In the scientific process of knowledge elaboration, a crucial role is played by models which, in the language of quantitative sciences, mean abstract mathematical or algorithmic representations. This short review discusses a few key examples from Physics, taken from dynamical systems theory, biophysics, and statistical mechanics, representing three paradigmatic procedures to build models and predictions from available data. In the case of dynamical systems, we show how predictions can be obtained in a virtually model-free framework using the method of analogues, and we briefly discuss other approaches based on machine learning methods. In cases where the complexity of systems is challenging, as in biophysics, we stress the necessity to include part of the empirical knowledge in the models to gain the minimal amount of realism. Finally, we consider many-body systems where many (temporal or spatial) scales are at play&mdash;and show how to derive from data a dimensional reduction in terms of a Langevin dynamics for their slow components.

]]>Entropy doi: 10.3390/e20100806

Authors: Liqiang Jin Hongwen Yang

This paper proposes a distributed joint source-channel coding (DJSCC) scheme using polar-like codes. In the proposed scheme, each distributed source encodes its source message with a quasi-uniform systematic polar code (QSPC) or a punctured QSPC, and only transmits parity bits over its independent channel. These systematic codes play the role of both source compression and error protection. For infinite code length, we show that the proposed scheme approaches the information-theoretical limit by the technique of joint source-channel polarization with side information. For finite code length, the simulation results verify that the proposed scheme outperforms the distributed separate source-channel coding (DSSCC) scheme using polar codes and the DJSCC scheme using classic systematic polar codes.

]]>Entropy doi: 10.3390/e20100805

Authors: Qiuna Lv Liyan Han Yipeng Wan Libo Yin

By introducing net entropy into a stock network, this paper focuses on investigating the impact of network entropy on market returns and trading in the Chinese Growth Enterprise Market (GEM). In this paper, indices of Wu structure entropy (WSE) and SD structure entropy (SDSE) are considered as indicators of network heterogeneity to represent market diversification. A series of dynamic financial networks consisting of 1066 daily nets is constructed by applying the dynamic conditional correlation multivariate GARCH (DCC-MV-GARCH) model with a threshold adjustment. Then, we evaluate the quantitative relationships between the network entropy indices and market trading variables and their bilateral information spillover effects by applying the bivariate EGARCH model. There are two main findings in the paper. Firstly, the evidence significantly ensures that both market returns and trading volumes associate negatively with the network entropy indices, which indicates that stock heterogeneity, which by definition varies inversely with the value of the network entropy indices, can help to improve market returns and increase market trading volumes. Secondly, results show significant information transmission between the indicators of network entropy and stock market trading variables.

]]>Entropy doi: 10.3390/e20100804

Authors: Henrik Jeldtoft Jensen Piergiulio Tempesta

The entropy of Boltzmann-Gibbs, as proved by Shannon and Khinchin, is based on four axioms, where the fourth one concerns additivity. The group theoretic entropies make use of formal group theory to replace this axiom with a more general composability axiom. As has been pointed out before, generalised entropies crucially depend on the number of allowed degrees of freedom N. The functional form of group entropies is restricted (though not uniquely determined) by assuming extensivity on the equal probability ensemble, which leads to classes of functionals corresponding to sub-exponential, exponential or super-exponential dependence of the phase space volume W on N. We review the ensuing entropies, discuss the composability axiom and explain why group entropies may be particularly relevant from an information-theoretical perspective.

]]>Entropy doi: 10.3390/e20100803

Authors: Ming-Yang Zhou Wen-Man Xiong Xiao-Yu Li Hao Liao

When a developing country reaches a relatively average income level, it often stops growing further and its income does not improve. This is known as the middle-income trap. How to overcome this trap is a longstanding problem for developing countries, and has been studied in various research fields. In this work, we use the Fitness-Complexity method (FCM) to analyze the common characteristics of the countries that successfully get through the middle-income trap, and show the origin of the middle-income trap based on the international trade network. In the analysis, a novel method is proposed to characterize the interdependency between products. The results show that some middle-complexity products depend strongly on each other, which indicates that developing countries should focus on them simultaneously, implying that the middle-income trap is difficult to escape. To tackle the middle-income trap, developing countries should learn from the experiences of developed countries that share a similar development history. We then design an effective method to evaluate the similarity between countries and recommend developed countries to a certain developing country. The effectiveness of our method is validated on the international trade network.
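The Fitness-Complexity method referenced above is defined by a nonlinear iterative map on the binary country-product export matrix; a minimal sketch of the standard iteration follows (the per-step mean normalisation is one common convention, assumed here):

```python
import numpy as np

def fitness_complexity(M, iterations=100):
    """Fitness-Complexity iteration on a binary country-product matrix M
    (rows: countries, cols: products). Returns (fitness, complexity)."""
    n_c, n_p = M.shape
    F = np.ones(n_c)   # country fitness
    Q = np.ones(n_p)   # product complexity
    for _ in range(iterations):
        F_new = M @ Q                     # fitness: sum of exported complexities
        Q_new = 1.0 / (M.T @ (1.0 / F))   # complexity: penalised by low-fitness exporters
        F = F_new / F_new.mean()          # normalise to mean 1 each step
        Q = Q_new / Q_new.mean()
    return F, Q

# toy matrix: country 0 exports everything, country 1 only product 0
M = np.array([[1, 1, 1],
              [1, 0, 0]], dtype=float)
F, Q = fitness_complexity(M)
# the diversified country ends up fitter; the ubiquitous product less complex
```

The harmonic-style update for Q is what makes a product exported by low-fitness countries score as low-complexity, which is the mechanism behind the product interdependencies the paper analyzes.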

]]>Entropy doi: 10.3390/e20100802

Authors: Sergey Serdyukov

In this work, we consider extended irreversible thermodynamics in assuming that the entropy density is a function of both common thermodynamic variables and their higher-order time derivatives. An expression for entropy production, and the linear phenomenological equations describing diffusion and chemical reactions, are found in the context of this approach. Solutions of the sets of linear equations with respect to fluxes and their higher-order time derivatives allow the coefficients of diffusion and reaction rate constants to be established as functions of size of the nanosystems in which these reactions occur. The Maxwell-Cattaneo and Jeffreys constitutive equations, as well as the higher-order constitutive equations, which describe the processes in reaction-diffusion systems, are obtained.

]]>Entropy doi: 10.3390/e20100801

Authors: A. A. Karawia

To enhance encryption proficiency and encourage the protected transmission of multiple images, the current work introduces an encryption algorithm for multiple images using the combination of mixed image elements (MIES) and a two-dimensional economic map. Firstly, the original images are grouped into one big image that is split into many pure image elements (PIES); secondly, the logistic map is used to shuffle the PIES; thirdly, they are confused with the sequence produced by the two-dimensional economic map to get the MIES; finally, the MIES are gathered into a big encrypted image that is split into many images of the same size as the original images. The proposed algorithm has a huge key space, which makes it secure against attackers. Moreover, the encryption results obtained by the proposed algorithm outperform those of existing algorithms in the literature. A comparison between the proposed algorithm and similar algorithms is made. The analysis of the experimental results and the proposed algorithm shows that the proposed algorithm is efficient and secure.
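The logistic-map shuffling step can be illustrated as follows; the parameters x0 and r are illustrative stand-ins for the secret key, and the element ordering is ours, not the paper's:

```python
import numpy as np

def logistic_permutation(n, x0=0.61, r=3.99):
    """Key-dependent permutation of n elements driven by the logistic map
    x_{k+1} = r * x_k * (1 - x_k); (x0, r) act as the secret key."""
    x = x0
    seq = np.empty(n)
    for k in range(n):
        x = r * x * (1.0 - x)
        seq[k] = x
    return np.argsort(seq)   # ranking of the chaotic sequence gives the permutation

perm = logistic_permutation(16)
blocks = np.arange(16)            # stand-in for the pure image elements (PIES)
shuffled = blocks[perm]           # shuffle step
restored = np.empty(16, dtype=int)
restored[perm] = shuffled         # inverse permutation recovers the original
```

Because the permutation is fully determined by (x0, r), the receiver regenerates the same ranking from the key and inverts the shuffle.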

]]>Entropy doi: 10.3390/e20100800

Authors: Yan Qiang Liejiang Wei Xiaomei Luo Hongchao Jian Wenan Wang Fenfen Li

Heat transfer performances and flow structures of laminar impinging slot jets with power-law non-Newtonian fluids and corresponding typical industrial fluids (Carboxyl Methyl Cellulose (CMC) solutions and Xanthan gum (XG) solutions) have been studied in this work. Investigations are performed for Reynolds number Re less than 200, power-law index n ranging from 0.5 to 1.5 and consistency index K varying from 0.001 to 0.5 to explore the heat transfer and flow structure of shear-thinning and shear-thickening fluids. Results indicate that with the increase of n and K for a given Re, the wall Nusselt number increases, mainly owing to the increase of the inlet velocity U. For a given inlet velocity, the wall Nusselt number decreases with the increase of n and K, which is mainly attributed to the increase of apparent viscosity and the reduction of momentum diffusion. For the same Re, U and Pr, the wall Nusselt number decreases with the increase of n. Among the industrial power-law shear-thinning fluids studied, the CMC solution with 100 ppm shows the best heat transfer performance at a given velocity. Moreover, a new Nusselt number correlation for these industrial fluids is proposed. In general, for the heat transfer of a laminar confined impinging jet, it is best to use a working fluid with low viscosity.
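The power-law (Ostwald-de Waele) model underlying the study relates apparent viscosity to shear rate as mu_app = K * gamma_dot^(n-1); a one-line sketch with illustrative parameter values makes the shear-thinning/thickening asymmetry concrete:

```python
def apparent_viscosity(K, n, shear_rate):
    """Apparent viscosity of a power-law (Ostwald-de Waele) fluid:
    mu_app = K * gamma_dot**(n - 1).
    n < 1: shear-thinning (e.g. CMC/XG solutions); n > 1: shear-thickening."""
    return K * shear_rate ** (n - 1)

# at gamma_dot = 100 1/s, a shear-thinning fluid (n = 0.5) is far less viscous
mu_thin = apparent_viscosity(K=0.1, n=0.5, shear_rate=100.0)   # 0.1 * 100**-0.5 = 0.01
mu_thick = apparent_viscosity(K=0.1, n=1.5, shear_rate=100.0)  # 0.1 * 100**0.5 = 1.0
```

This is why, at fixed inlet velocity, raising n or K raises the apparent viscosity and suppresses momentum diffusion, lowering the wall Nusselt number as reported above.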

]]>Entropy doi: 10.3390/e20100799

Authors: George Livadiotis

The paper derives the polytropic indices over the last two solar cycles (years 1995&ndash;2017) for the solar wind proton plasma near Earth (~1 AU). We use ~92-s datasets of proton plasma moments (speed, density, and temperature), measured from the Solar Wind Experiment instrument onboard Wind spacecraft, to estimate the moving averages of the polytropic index, as well as their weighted means and standard errors as a function of the solar wind speed and the year of measurements. The derived long-term behavior of the polytropic index agrees with the results of other previous methods. In particular, we find that the polytropic index remains quasi-constant with respect to the plasma flow speed, in agreement with earlier analyses of solar wind plasma. It is shown that most of the fluctuations of the polytropic index appear in the fast solar wind. The polytropic index remains quasi-constant, despite the frequent entropic variations. Therefore, on an annual basis, the polytropic index of the solar wind proton plasma near ~1 AU can be considered independent of the plasma flow speed. The estimated all-year weighted mean and its standard error is &gamma; = 1.86 &plusmn; 0.09.
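The weighted-mean bookkeeping described above can be sketched minimally, assuming inverse-variance weights (an assumption of ours; the paper's exact weighting scheme is not given in the abstract):

```python
import numpy as np

def weighted_mean_and_se(values, weights):
    """Weighted mean with its standard error, for inverse-variance weights
    w_i = 1/sigma_i^2, for which SE = 1/sqrt(sum of weights)."""
    values = np.asarray(values, dtype=float)
    weights = np.asarray(weights, dtype=float)
    mean = (weights * values).sum() / weights.sum()
    se = 1.0 / np.sqrt(weights.sum())
    return mean, se

# two hypothetical estimates of the polytropic index, sigma = 0.1 and 0.2
mean, se = weighted_mean_and_se([1.8, 1.9], [1 / 0.1**2, 1 / 0.2**2])
```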

]]>Entropy doi: 10.3390/e20100798

Authors: Sean Devine

Algorithmic information theory in conjunction with Landauer&rsquo;s principle can quantify the cost of maintaining a reversible real-world computational system distant from equilibrium. As computational bits are conserved in an isolated reversible system, bit flows can be used to track the way a highly improbable configuration trends toward a highly probable equilibrium configuration. In an isolated reversible system, all microstates within a thermodynamic macrostate have the same algorithmic entropy. From a thermodynamic perspective, when these bits primarily specify stored energy states, corresponding to a fluctuation from the most probable set of states, they represent &ldquo;potential entropy&rdquo;. However, these bits become &ldquo;realised entropy&rdquo; when, under the second law of thermodynamics, they become bits specifying the momentum degrees of freedom. The distance of a fluctuation from equilibrium is identified as the number of computational bits that move from stored energy states to momentum states to define a highly probable or typical equilibrium state. When reversibility applies, from Landauer&rsquo;s principle, it costs k_B T ln 2 joules to move a bit within the system from the stored energy states to the momentum states.

]]>Entropy doi: 10.3390/e20100797

Authors: Chris Fields

The concept of a &ldquo;system&rdquo; is foundational to physics, but the question of how observers identify systems is seldom addressed. Classical thermodynamics restricts observers to finite, finite-resolution observations with which to identify the systems on which &ldquo;pointer state&rdquo; measurements are to be made. It is shown that system identification is at best approximate, even in a finite world, and that violations of the Leggett&ndash;Garg and Bell/CHSH (Clauser-Horne-Shimony-Holt) inequalities emerge naturally as requirements for successful system identification.

]]>Entropy doi: 10.3390/e20100796

Authors: Jan Naudts

Section 4 of &ldquo;Naudts J. Quantum Statistical Manifolds. Entropy 2018, 20, 472&rdquo; contains errors. They have limited consequences for the remainder of the paper. A new version of this Section is found here. Some smaller shortcomings of the paper are taken care of as well. In particular, the proof of Theorem 3 was not complete, and is therefore amended. Also, a few missing references are added.

]]>Entropy doi: 10.3390/e20100795

Authors: Daiyi Luo Weifeng Pan Yifan Li Kaicheng Feng Guanzheng Liu

Congestive heart failure (CHF) is a cardiovascular disease associated with autonomic dysfunction, where sympathovagal imbalance was reported in many studies using heart rate variability (HRV). To learn more about the dynamic interaction in the autonomic nervous system (ANS), we explored the directed interaction between the sympathetic nervous system (SNS) and the parasympathetic nervous system (PNS) with the help of transfer entropy (TE). This article included 24-h RR interval signals of 54 healthy subjects (31 males and 23 females, 61.38 &plusmn; 11.63 years old) and 44 CHF subjects (8 males and 2 females, the gender of 19 subjects was unknown, 55.51 &plusmn; 11.44 years old, 4 in class I, 8 in class II and 32 in class III~IV, according to the New York Heart Association Function Classification), obtained from the PhysioNet database and then segmented into 5-min non-overlapping epochs using cubic spline interpolation. For each segment in the normal group and CHF group, frequency-domain features including low-frequency (LF) power, high-frequency (HF) power and the LF/HF ratio were extracted as classical estimators of autonomic activity. In the nonlinear domain, the TE between LF and HF was calculated to quantify the information exchange between the SNS and PNS. Compared with the normal group, an extreme decrease in the LF/HF ratio (p = 0.000) and extreme increases in both TE(LF&rarr;HF) (p = 0.000) and TE(HF&rarr;LF) (p = 0.000) were observed in the CHF group. Moreover, in both the normal and CHF groups, TE(LF&rarr;HF) was substantially greater than TE(HF&rarr;LF) (p = 0.000), revealing that TE was able to distinguish the difference in the amount of directed information transfer within the ANS. The extracted features were further applied in discriminating CHF using IBM SPSS Statistics discriminant analysis. The combination of the LF/HF ratio, TE(LF&rarr;HF) and TE(HF&rarr;LF) reached the highest screening accuracy (83.7%).
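The transfer entropy used here can be illustrated with a minimal plug-in estimator (our sketch, using quantile binning and history length 1; the study's estimator settings are not specified in the abstract):

```python
import numpy as np
from collections import Counter

def transfer_entropy(x, y, bins=4):
    """Plug-in estimate (bits) of TE(X -> Y) with history length 1:
    TE = sum p(y1, y0, x0) * log2[ p(y1 | y0, x0) / p(y1 | y0) ]."""
    dx = np.digitize(x, np.quantile(x, np.linspace(0, 1, bins + 1)[1:-1]))
    dy = np.digitize(y, np.quantile(y, np.linspace(0, 1, bins + 1)[1:-1]))
    triples = Counter(zip(dy[1:], dy[:-1], dx[:-1]))   # (y1, y0, x0) counts
    pairs_yx = Counter(zip(dy[:-1], dx[:-1]))          # (y0, x0) counts
    pairs_yy = Counter(zip(dy[1:], dy[:-1]))           # (y1, y0) counts
    singles = Counter(dy[:-1])                         # y0 counts
    n = len(dy) - 1
    te = 0.0
    for (y1, y0, x0), c in triples.items():
        p_joint = c / n
        p_cond_full = c / pairs_yx[(y0, x0)]
        p_cond_y = pairs_yy[(y1, y0)] / singles[y0]
        te += p_joint * np.log2(p_cond_full / p_cond_y)
    return te

rng = np.random.default_rng(1)
x = rng.normal(size=5000)
y = np.roll(x, 1) + 0.3 * rng.normal(size=5000)   # y is driven by the past of x
te_xy = transfer_entropy(x, y)   # large: x predicts future y
te_yx = transfer_entropy(y, x)   # near zero: y carries no extra info about future x
```

The asymmetry te_xy > te_yx is exactly the directed-information signature the study exploits to separate SNS&rarr;PNS from PNS&rarr;SNS influence.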
Our results suggested that TE could serve as a complement to traditional index LF/HF in CHF screening.
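The directed-interaction measure used above can be illustrated with a minimal histogram-based plug-in estimator of transfer entropy (a generic sketch, not the paper's exact pipeline; the equiprobable binning and the `transfer_entropy` name are illustrative choices):

```python
import numpy as np

def transfer_entropy(x, y, bins=4, lag=1):
    """Estimate TE(x -> y) in bits from two 1-D series via histogram binning.

    Computes TE = H(y_t | y_{t-lag}) - H(y_t | y_{t-lag}, x_{t-lag})
    after discretising both series into equiprobable symbols.
    """
    def symbolise(s):
        # Map values to `bins` equiprobable symbols using quantile edges.
        edges = np.quantile(s, np.linspace(0, 1, bins + 1)[1:-1])
        return np.digitize(s, edges)

    xs, ys = symbolise(np.asarray(x)), symbolise(np.asarray(y))
    y_t, y_p, x_p = ys[lag:], ys[:-lag], xs[:-lag]

    def entropy(*cols):
        # Joint Shannon entropy (bits) of the tuple of symbol columns.
        joint = np.stack(cols, axis=1)
        _, counts = np.unique(joint, axis=0, return_counts=True)
        p = counts / counts.sum()
        return -np.sum(p * np.log2(p))

    # TE = H(y_t, y_p) - H(y_p) - H(y_t, y_p, x_p) + H(y_p, x_p)
    return (entropy(y_t, y_p) - entropy(y_p)
            - entropy(y_t, y_p, x_p) + entropy(y_p, x_p))
```

On a toy pair of series where y is driven by lagged x, TE(x&rarr;y) comes out clearly larger than TE(y&rarr;x), which is the asymmetry the study exploits.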

Entropy doi: 10.3390/e20100793

Authors: Fernando Rosas Pedro A.M. Mediano Martín Ugarte Henrik J. Jensen

Self-organisation lies at the core of fundamental but still unresolved scientific questions, and holds the promise of de-centralised paradigms crucial for future technological developments. While self-organising processes have been traditionally explained by the tendency of dynamical systems to evolve towards specific configurations, or attractors, we see self-organisation as a consequence of the interdependencies that those attractors induce. Building on this intuition, in this work we develop a theoretical framework for understanding and quantifying self-organisation based on coupled dynamical systems and multivariate information theory. We propose a metric of global structural strength that identifies when self-organisation appears, and a multi-layered decomposition that explains the emergent structure in terms of redundant and synergistic interdependencies. We illustrate our framework on elementary cellular automata, showing how it can detect and characterise the emergence of complex structures.
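The elementary cellular automata used as the illustration above can be generated in a few lines (a minimal sketch; the rule encoding follows the standard Wolfram numbering, and the function name is illustrative):

```python
import numpy as np

def eca_run(rule, init, steps):
    """Evolve a 1-D elementary cellular automaton with periodic boundaries.

    `rule` is the Wolfram rule number (0-255); each row of the returned
    array is one time step of the spacetime diagram whose emergent
    structure a framework like the one above would quantify.
    """
    # Rule table: neighbourhood value 4*left + 2*centre + right -> next state.
    table = [(rule >> i) & 1 for i in range(8)]
    state = np.asarray(init, dtype=int)
    history = [state.copy()]
    for _ in range(steps):
        l, r = np.roll(state, 1), np.roll(state, -1)
        state = np.array([table[4 * a + 2 * b + c]
                          for a, b, c in zip(l, state, r)])
        history.append(state.copy())
    return np.array(history)
```

Rule 204 (the identity rule) leaves any configuration unchanged, while rule 90 (XOR of neighbours) grows the familiar Sierpinski pattern from a single seed.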

Entropy doi: 10.3390/e20100794

Authors: Anne Humeau-Heurtier

n/a

Entropy doi: 10.3390/e20100792

Authors: Sebastian Poledna Abraham Hinteregger Stefan Thurner

The notions of systemic importance and systemic risk of financial institutions are closely related to the topology of financial liability networks. In this work, we reconstruct and analyze the financial liability network of an entire economy using data of 50,159 firms and banks. Our analysis contains 80.2% of the total liabilities of firms towards banks and all interbank liabilities in the Austrian banking system. The combination of firm-bank networks and interbank networks allows us to extend the concept of systemic risk to the real economy. In particular, the systemic importance of individual companies can be assessed, and for the first time, the financial ties between the financial and the real economy become explicitly visible. We find that firms contribute to systemic risk in similar ways as banks do. We identify a set of mid-sized companies that carry substantial systemic risk. Their default would affect up to 40% of the Austrian financial market. We find that all firms together create more systemic risk than the entire financial sector. In 2008, the total systemic risk of the Austrian interbank network amounted to only 29% of the total systemic risk of the entire financial network consisting of firms and banks. The work demonstrates that the notions of systemically important financial institutions (SIFIs) can be directly extended to firms.

Entropy doi: 10.3390/e20100791

Authors: Heinz Herwig

In order to teach heat transfer systematically and with a clear physical background, it is recommended that entropy should not be ignored as a fundamental quantity. Heat transfer processes are characterized by introducing the so-called &ldquo;entropic potential&rdquo; of the transferred energy, and an assessment number is based on this new quantity.

Entropy doi: 10.3390/e20100790

Authors: Jiří Náprstek Cyril Fischer

In this study, we consider a method for investigating the stochastic response of a nonlinear dynamical system affected by a random seismic process. We present the solution of the probability density of a single/multiple-degree of freedom (SDOF/MDOF) system with several statically stable equilibrium states and with possible jumps of the snap-through type. The system is a Hamiltonian system with weak damping excited by a system of non-stationary Gaussian white noise. The solution based on the Gibbs principle of the maximum entropy of probability could potentially be implemented in various branches of engineering. The search for the extreme of the Gibbs entropy functional is formulated as a constrained optimization problem. The secondary constraints follow from the Fokker&ndash;Planck equation (FPE) for the system considered or from the system of ordinary differential equations for the stochastic moments of the response derived from the relevant FPE. In terms of the application type, this strategy is most suitable for SDOF/MDOF systems containing polynomial type nonlinearities. Thus, the solution links up with the customary formulation of the finite elements discretization for strongly nonlinear continuous systems.

Entropy doi: 10.3390/e20100789

Authors: Sylvain Barbay Saliya Coulibaly Marcel G. Clerc

Out-of-equilibrium systems exhibit complex spatiotemporal behaviors when they present a secondary bifurcation to an oscillatory instability. Here, we investigate the complex dynamics shown by a pulsing regime in an extended, one-dimensional semiconductor microcavity laser whose cavity is composed of integrated gain and saturable absorber media. This system is known to give rise experimentally and theoretically to extreme events characterized by rare and high amplitude optical pulses following the onset of spatiotemporal chaos. Based on a theoretical model, we reveal a dynamical behavior characterized by the chaotic alternation of phase and amplitude turbulence. The highest amplitude pulses, i.e., the extreme events, are observed in the phase turbulence zones. This chaotic alternation between different turbulent regimes is in contrast to what is usually observed in a generic amplitude equation model such as the Ginzburg&ndash;Landau model. Hence, these regimes provide some insight into the poorly known properties of the complex spatiotemporal dynamics exhibited by secondary instabilities of an Andronov&ndash;Hopf bifurcation.

Entropy doi: 10.3390/e20100788

Authors: Xiao Zhang Xia Liu Yanyan Yang

The information entropy developed by Shannon is an effective measure of uncertainty in data, and rough set theory is a useful tool in computer applications for dealing with vague and uncertain data. At present, information entropy has been extensively applied in rough set theory, and different information entropy models have also been proposed in rough sets. In this paper, building on an existing feature selection method that uses a fuzzy rough set-based information entropy, a corresponding fast algorithm is provided to achieve an efficient implementation, in which the fuzzy rough set-based information entropy taken as the evaluation measure for selecting features is computed by an improved mechanism with lower complexity. The essence of the acceleration algorithm is to use iteratively reduced instances to compute the lambda-conditional entropy. Numerical experiments are further conducted to show the performance of the proposed fast algorithm, and the results demonstrate that the algorithm acquires the same feature subset as its original counterpart, but with significantly less time.

Entropy doi: 10.3390/e20100787

Authors: Hervé Bergeron Jean-Pierre Gazeau

Any quantization linearly maps functions on a phase space to symmetric operators in a Hilbert space. Covariant integral quantization combines an operator-valued measure with the symmetry group of the phase space. Covariant means that the quantization map intertwines classical (geometric operations) and quantum (unitary transformations) symmetries. Integral means that we use all the resources of integral calculus, in order to implement the method when we apply it to singular functions, or distributions, for which integral calculus is an essential ingredient. We first review this quantization scheme before revisiting the cases where symmetry covariance is described by the Weyl-Heisenberg group and the affine group, respectively, and we emphasize the fundamental role played by the Fourier transform in both cases. As an original outcome of our generalisations of the Wigner-Weyl transform, we show that many properties of the Weyl integral quantization, commonly viewed as optimal, are actually shared by a large family of integral quantizations.

Entropy doi: 10.3390/e20100786

Authors: William Cruz-Santos Salvador E. Venegas-Andraca Marco Lanzagorta

In this paper, we propose a methodology to solve the stereo matching problem through quantum annealing optimization. Our proposal takes advantage of the existing Min-Cut/Max-Flow network formulation of computer vision problems. Based on this network formulation, we construct a quadratic pseudo-Boolean function and then optimize it through the use of the D-Wave quantum annealing technology. Experimental validation using two kinds of stereo image pairs, random dot stereograms and gray-scale images, shows that our methodology is effective.
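A quadratic pseudo-Boolean function of the kind constructed above can be written as a QUBO matrix and, on toy instances, minimised exhaustively; the brute-force search below is only a stand-in for the annealer (the D-Wave API is not shown, and the function name is illustrative):

```python
import itertools
import numpy as np

def qubo_bruteforce(Q):
    """Minimise E(x) = x^T Q x over binary vectors by exhaustive search.

    Q is an upper-triangular QUBO matrix with linear terms on the
    diagonal and pairwise couplings off-diagonal. Exhaustive search is
    only feasible for small n; an annealer handles the large instances.
    """
    n = Q.shape[0]
    best_x, best_e = None, np.inf
    for bits in itertools.product((0, 1), repeat=n):
        x = np.array(bits)
        e = x @ Q @ x
        if e < best_e:
            best_x, best_e = x, e
    return best_x, best_e
```

For example, two variables that each lower the energy individually but are penalised for co-occurring settle on exactly one of them being active.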

Entropy doi: 10.3390/e20100785

Authors: Matteo Bruno Fabio Saracco Tiziano Squartini Marco Dueñas

In this paper, we analyse the bipartite Colombian firms-products network over a period of five years, from 2010 to 2014. Our analysis depicts a strongly modular system, with several groups of firms specializing in the export of specific categories of products. These clusters have been detected by running the bipartite variant of traditional modularity maximization, revealing a bi-modular structure. Interestingly, this finding is refined by applying a recently proposed algorithm for projecting bipartite networks onto the layer of interest and then running the Louvain algorithm on the resulting monopartite representations. Important structural differences emerge upon comparing the Colombian firms-products network with the World Trade Web; in particular, the bipartite representation of the latter is not characterized by a similar block-structure, as modularity maximization fails to reveal (bipartite) node clusters. This points out that economic systems behave differently at different scales: while countries tend to diversify their production&mdash;potentially exporting a large number of different products&mdash;firms specialize in exporting (substantially limited) baskets of basically homogeneous products.

Entropy doi: 10.3390/e20100784

Authors: Peter Harremoës

We study entropy inequalities for variables that are related by functional dependencies. Although the powerset on four variables is the smallest Boolean lattice with non-Shannon inequalities, there exist lattices with many more variables where the Shannon inequalities are sufficient. We search for conditions that exclude the existence of non-Shannon inequalities. The existence of non-Shannon inequalities is related to the question of whether a lattice is isomorphic to a lattice of subgroups of a group. In order to formulate and prove the results, one has to bridge lattice theory, group theory, the theory of functional dependencies and the theory of conditional independence. It is demonstrated that the Shannon inequalities are sufficient for planar modular lattices. The proof applies a gluing technique that uses the fact that if the Shannon inequalities are sufficient for the pieces, then they are also sufficient for the whole lattice. It is conjectured that the Shannon inequalities are sufficient if and only if the lattice does not contain a special lattice as a sub-semilattice.

Entropy doi: 10.3390/e20100783

Authors: Vito D. P. Servedio Paolo Buttà Dario Mazzilli Andrea Tacchella Luciano Pietronero

We present a new metric estimating the fitness of countries and the complexity of products by exploiting a non-linear, non-homogeneous map applied to the publicly available information on the goods exported by a country. The non-homogeneous terms guarantee both convergence and stability. After a suitable rescaling of the relevant quantities, the non-homogeneous terms are eventually set to zero, so that this new metric is parameter free. This new map almost reproduces the results of the original homogeneous metrics already defined in the literature and allows for an approximate analytic solution in the case of actual binarized matrices based on the Revealed Comparative Advantage (RCA) indicator. This solution is connected with a new quantity describing the neighborhood of nodes in bipartite graphs, representing in this work the relations between countries and exported products. Moreover, we define a new indicator of country net-efficiency quantifying how efficiently a country invests in capabilities able to generate innovative complex high quality products. Eventually, we demonstrate analytically the local convergence of the algorithm involved.
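The iteration behind such fitness-complexity metrics can be sketched as follows; the exact non-homogeneous form and rescaling used in the paper are not reproduced, so the placement of the `delta` terms here is an illustrative assumption:

```python
import numpy as np

def fitness_complexity(M, delta=1e-3, iters=200):
    """Iterate a non-homogeneous fitness-complexity map on a binary
    country-product matrix M (rows: countries, columns: products).

    `delta` is a small non-homogeneous term added for convergence and
    stability, in the spirit of the paper (the exact form is a sketch).
    Both vectors are renormalised to unit mean at each step.
    """
    n_c, n_p = M.shape
    F, Q = np.ones(n_c), np.ones(n_p)
    for _ in range(iters):
        # Fitness grows with the complexity of the exported products;
        # complexity is suppressed by exports from low-fitness countries.
        F_new = delta + M @ Q
        Q_new = 1.0 / (delta + M.T @ (1.0 / F))
        F = F_new / F_new.mean()
        Q = Q_new / Q_new.mean()
    return F, Q
```

On a toy matrix, a fully diversified country ends up fitter than a one-product country, and a product exported only by the fit country ends up more complex than one also exported by the weak country.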

Entropy doi: 10.3390/e20100782

Authors: Christos Papadimitriou Georgios Piliouras

In 1950, Nash proposed a natural equilibrium solution concept for games, since called the Nash equilibrium, and proved that all finite games have at least one. The proof is through a simple yet ingenious application of Brouwer&rsquo;s (or, in another version, Kakutani&rsquo;s) fixed point theorem, the most sophisticated result in his era&rsquo;s topology&mdash;in fact, recent algorithmic work has established that Nash equilibria are computationally equivalent to fixed points. In this paper, we propose a new class of universal non-equilibrium solution concepts arising from an important theorem in the topology of dynamical systems that was unavailable to Nash. This approach starts with both a game and a learning dynamics, defined over mixed strategies. The Nash equilibria are fixpoints of the dynamics, but the system behavior is captured by an object far more general than the Nash equilibrium, known in dynamical systems theory as the chain recurrent set. Informally, once we focus on this solution concept&mdash;this notion of &ldquo;the outcome of the game&rdquo;&mdash;every game behaves like a potential game, with the dynamics converging to these states. In other words, unlike Nash equilibria, this solution concept is algorithmic in the sense that it has a constructive proof of existence. We characterize this solution for simple benchmark games under replicator dynamics, arguably the best-known evolutionary dynamics in game theory. For (weighted) potential games, the new concept coincides with the fixpoints/equilibria of the dynamics. However, in (variants of) zero-sum games with fully mixed (i.e., interior) Nash equilibria, it covers the whole state space, as the dynamics admit specific information-theoretic constants of motion. We discuss numerous novel computational, as well as structural, combinatorial questions raised by this chain recurrence conception of games.

Entropy doi: 10.3390/e20100781

Authors: Wojciech Chmiel Joanna Kwiecień

The paper focuses on the application of a quantum-inspired evolutionary algorithm for determining the minimal assignment cost in the quadratic assignment problem. The idea behind the paper is to present how the algorithm has to be adapted to this problem, including crossover and mutation operators, and how quantum principles are introduced in particular procedures. The results have shown that the performance of our approach in terms of converging to the best solutions is satisfactory. Moreover, we have presented the effects of selected parameters of the approach on the quality of the obtained solutions.

Entropy doi: 10.3390/e20100780

Authors: Mohamed Hatifi Ralph Willox Samuel Colin Thomas Durt

Recently, the properties of bouncing oil droplets, also known as &ldquo;walkers,&rdquo; have attracted much attention because they are thought to offer a gateway to a better understanding of quantum behavior. They indeed constitute a macroscopic realization of wave-particle duality, in the sense that their trajectories are guided by a self-generated surrounding wave. The aim of this paper is to try to describe walker phenomenology in terms of de Broglie&ndash;Bohm dynamics and of a stochastic version thereof. In particular, we first study how a stochastic modification of the de Broglie pilot-wave theory, &agrave; la Nelson, affects the process of relaxation to quantum equilibrium, and we prove an H-theorem for the relaxation to quantum equilibrium under Nelson-type dynamics. We then compare the onset of equilibrium in the stochastic and the de Broglie&ndash;Bohm approaches and we propose some simple experiments by which one can test the applicability of our theory to the context of bouncing oil droplets. Finally, we compare our theory to actual observations of walker behavior in a 2D harmonic potential well.

Entropy doi: 10.3390/e20100779

Authors: Renaldas Urniezius Vytautas Galvanauskas Arnas Survyla Rimvydas Simutis Donatas Levisauskas

For historic reasons, industrial knowledge of reproducibility and restrictions imposed by regulations, open-loop feeding control approaches dominate in industrial fed-batch cultivation processes. In this study, a generic gray-box biomass modeling procedure uses relative entropy as a key to approach the posterior, similarly to how a prior distribution approaches the posterior distribution along the multivariate path of Lagrange multipliers, for which a description of a nuisance time is introduced. The ultimate purpose of this study was to develop a numerical semi-global convex optimization procedure dedicated to the calculation of feeding rate time profiles during fed-batch cultivation processes. The proposed numerical semi-global convex optimization of relative entropy is restricted neither to the gray-box model nor to the bioengineering application. From the bioengineering perspective, the proposed bioprocess design technique has benefits for both regular feed-forward control and advanced adaptive control systems, in which a model for biomass growth prediction is compulsory. After identification of the gray-box model parameters, the options and alternatives in controllable industrial biotechnological processes are described. The main aim of this work is to achieve high reproducibility, controllability, and the desired process performance. Glucose concentration measurements, which were used for the development of the model, become unnecessary for the development of the desired microbial cultivation process.

Entropy doi: 10.3390/e20100778

Authors: Yeqiang Bu Shenyou Peng Shiwei Wu Yujie Wei Gang Wang Jiabin Liu Hongtao Wang

The bulk high-entropy alloys (HEAs) exhibit similar deformation behaviours as traditional metals. These bulk behaviours are likely an averaging of the behaviours exhibited at the nanoscale. Herein, in situ atomic-scale observation of deformation behaviours in nanoscaled CoCrCuFeNi face-centred cubic (FCC) HEA was performed. The deformation behaviours of this nanoscaled FCC HEA (i.e., nanodisturbances and phase transformations) were distinct from those of nanoscaled traditional FCC metals and corresponding bulk HEA. First-principles calculations revealed an obvious fluctuation of the stacking fault energy and stability difference at the atomic scale in the HEA. The stability difference was highlighted only in the nanoscaled HEA and induced unconventional deformation behaviours. Our work suggests that the nanoscaled HEA may provide more chances to discover the long-expected essential distinction between the HEAs and traditional metals.

Entropy doi: 10.3390/e20100777

Authors: Matúš Medo Manuel Sebastian Mariani Linyuan Lü

Real networks typically studied in various research fields&mdash;ecology and economic complexity, for example&mdash;often exhibit a nested topology, which means that the neighborhoods of high-degree nodes tend to include the neighborhoods of low-degree nodes. Focusing on nested networks, we study the problem of link prediction in complex networks, which aims at identifying likely candidates for missing links. We find that a new method that takes network nestedness into account outperforms well-established link-prediction methods not only when the input networks are sufficiently nested, but also for networks where the nested structure is imperfect. Our study paves the way to search for optimal methods for link prediction in nested networks, which might be beneficial for World Trade and ecological network analysis.
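As context for the comparison above, the common-neighbours index is one of the well-established baselines against which nestedness-aware methods are evaluated; the paper's own method is not reproduced here, and the function name is illustrative:

```python
import numpy as np

def common_neighbors_scores(A):
    """Score all absent links by the common-neighbours index.

    A is a symmetric binary adjacency matrix; the score of a candidate
    link (i, j) is the number of neighbours i and j share. Higher-scored
    absent pairs are the likelier candidates for missing links.
    """
    A = np.asarray(A)
    S = A @ A                      # S[i, j] = number of shared neighbours
    np.fill_diagonal(S, 0)         # ignore self-links
    S[A > 0] = 0                   # only score currently absent links
    return S
```

On a path graph 0-1-2, the only scored candidate is the absent link (0, 2), whose endpoints share the single neighbour 1.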

Entropy doi: 10.3390/e20100776

Authors: Angelica Sbardella François Perruchas Lorenzo Napolitano Nicolò Barbieri Davide Consoli

The present study provides an analysis of empirical regularities in the development of green technology. We use patent data to examine inventions that can be traced to the environment-related catalogue (ENV-Tech) covering technologies in environmental management, water-related adaptation and climate change mitigation. Furthermore, we employ the Economic Fitness-Complexity (EFC) approach to assess their development and geographical distribution across countries between 1970 and 2010. This allows us to identify three typologies of countries: leaders, laggards and catch-up. While, as expected, there is a direct relationship between GDP per capita and invention capacity, we also document the remarkable growth of East Asia countries that started from the periphery and rapidly established themselves as key actors. This geographical pattern coincides with higher integration across domains so that, while the relative development of individual areas may have peaked, there is now demand for greater interoperability across green technologies.

Entropy doi: 10.3390/e20100775

Authors: Michiel Straat Fthi Abadi Christina Göpfert Barbara Hammer Michael Biehl

We introduce a modeling framework for the investigation of on-line machine learning processes in non-stationary environments. We exemplify the approach in terms of two specific model situations: In the first, we consider the learning of a classification scheme from clustered data by means of prototype-based Learning Vector Quantization (LVQ). In the second, we study the training of layered neural networks with sigmoidal activations for the purpose of regression. In both cases, the target, i.e., the classification or regression scheme, is considered to change continuously while the system is trained from a stream of labeled data. We extend and apply methods borrowed from statistical physics which have been used frequently for the exact description of training dynamics in stationary environments. Extensions of the approach allow for the computation of typical learning curves in the presence of concept drift in a variety of model situations. First results are presented and discussed for stochastic drift processes in classification and regression problems. They indicate that LVQ is capable of tracking a classification scheme under drift to a non-trivial extent. Furthermore, we show that concept drift can cause the persistence of sub-optimal plateau states in gradient based training of layered neural networks for regression.

Entropy doi: 10.3390/e20100774

Authors: Yimin Yin Xiaojun Duan

In this paper, a rigorous formalism of information transfer within a multi-dimensional deterministic dynamic system is established for both continuous flows and discrete mappings. The underlying mechanism is derived from entropy change and transfer during the evolutions of multiple components. While this work is mainly focused on three-dimensional systems, the analysis of information transfer among state variables can be generalized to high-dimensional systems. Explicit formulas are given and verified in the classical Lorenz and Chua&rsquo;s systems. The uncertainty of information transfer is quantified for all variables, with which a dynamic sensitivity analysis could be performed statistically as an additional benefit. The generalized formalisms can be applied to study dynamical behaviors as well as asymptotic dynamics of the system. The simulation results can help to reveal some underlying information for understanding the system better, which can be used for prediction and control in many diverse fields.
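The classical Lorenz system mentioned above, on which the formulas are verified, can be integrated with a standard fourth-order Runge-Kutta scheme (the information-transfer formulas themselves are not reproduced; this sketch only generates the benchmark trajectories):

```python
import numpy as np

def lorenz_trajectory(n_steps=5000, dt=0.01,
                      sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """Integrate the Lorenz system with fourth-order Runge-Kutta.

    Uses the classical chaotic parameter set; returns an (n_steps, 3)
    array of (x, y, z) states starting from (1, 1, 1).
    """
    def f(s):
        x, y, z = s
        return np.array([sigma * (y - x),
                         x * (rho - z) - y,
                         x * y - beta * z])

    s = np.array([1.0, 1.0, 1.0])
    traj = np.empty((n_steps, 3))
    for i in range(n_steps):
        k1 = f(s)
        k2 = f(s + 0.5 * dt * k1)
        k3 = f(s + 0.5 * dt * k2)
        k4 = f(s + dt * k3)
        s = s + dt * (k1 + 2 * k2 + 2 * k3 + k4) / 6.0
        traj[i] = s
    return traj
```

The resulting trajectory stays bounded on the familiar butterfly attractor, providing the multi-component time series to which the information-transfer analysis would be applied.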
