Entropy doi: 10.3390/e20100721

Authors: Kalliopi Chochlaki Georgios Michas Filippos Vallianatos

The Yellowstone Park volcanic field is one of the most active volcanic systems in the world, presenting intense seismic activity that is characterized by several earthquake swarms over the last decades. In the present work, we focused on the spatiotemporal properties of the recent earthquake swarms that occurred in December 2008–January 2009 and the 2010 Madison Plateau swarm, using the approach of Non-Extensive Statistical Physics (NESP). Our approach is based on Tsallis entropy and is used to describe the behavior of complex systems where fracturing and strong correlations exist, such as in tectonic and volcanic environments. This framework is based on the maximization of the non-additive Tsallis entropy Sq, introducing the q-exponential function and the entropic parameter q that expresses the degree of non-extensivity of the system. The estimated q-parameters can be used as a measure of the degree of correlation among the events in the spatiotemporal evolution of seismicity. Using the seismic data provided by the University of Utah Seismograph Stations (UUSS), we analyzed the inter-event time (T) and distance (r) distributions of successive earthquakes that occurred during the two swarms, fitting the observed data with the q-exponential function, resulting in the estimation of the Tsallis entropic parameters qT and qr for the inter-event time and distance distributions, respectively. Furthermore, we studied the magnitude-frequency distribution of the released earthquake energies E as formulated in the frame of NESP, which results in the estimation of the qE parameter. Our analysis provides the triplet (qE, qT, qr) that describes the magnitude-frequency distribution and the spatiotemporal scaling properties of each of the studied earthquake swarms. In addition, the spatial variability of qE throughout the Yellowstone Park volcanic area is presented and correlated with the regional hydrothermal features.
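The q-exponential fitting step described above can be sketched as follows. The data here are synthetic (a heavy-tailed stand-in for observed inter-event times whose tail is exactly a q-exponential with q = 1.4), and the crude grid search is a minimal illustration, not the authors' estimation procedure.

```python
import numpy as np

def q_exp(x, q, x0):
    """Tsallis q-exponential exp_q(-x/x0) = [1 + (q-1) x/x0]^(-1/(q-1)), for q > 1."""
    return (1.0 + (q - 1.0) * x / x0) ** (-1.0 / (q - 1.0))

# Synthetic stand-in for observed inter-event times: CCDF (1 + t/10)^(-2.5),
# which is a q-exponential with q = 1.4 and x0 = 4.
rng = np.random.default_rng(0)
tau = np.sort(10.0 * rng.pareto(2.5, 5000))
ccdf = 1.0 - np.arange(1, tau.size + 1) / tau.size   # empirical P(>T)

# Crude grid search for (q_T, x0) minimizing the squared CCDF misfit.
qs = np.linspace(1.05, 1.95, 91)
x0s = np.linspace(0.5, 20.0, 40)
err, q_T, x0_T = min(
    (np.sum((q_exp(tau[:-1], q, x0) - ccdf[:-1]) ** 2), q, x0)
    for q in qs for x0 in x0s
)
```

The recovered q_T should land near the generating value 1.4; in practice a nonlinear least-squares fit would replace the grid search.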

Entropy doi: 10.3390/e20100722

Authors: Rabha W. Ibrahim Maslina Darus

In this paper, we study Tsallis' fractional entropy (TFE) in a complex domain by applying the definition of complex probability functions. We study the upper and lower bounds of TFE based on some special functions. Moreover, applications in complex neural networks (CNNs) are illustrated to assess the accuracy of CNNs.

Entropy doi: 10.3390/e20100720

Authors: Adel Ouannas Xiong Wang Amina-Aicha Khennaoui Samir Bendoukha Viet-Thanh Pham Fawaz E. Alsaadi

In this paper, we investigate the dynamics of a fractional-order chaotic map corresponding to a recently developed standard map that exhibits chaotic behavior with no fixed point. This is the first study to explore a fractional chaotic map without a fixed point. In our investigation, we use phase plots and bifurcation diagrams to examine the dynamics of the fractional map and assess the effect of varying the fractional order. We also use the approximate entropy measure to quantify the level of chaos in the fractional map. In addition, we propose a one-dimensional stabilization controller and establish its asymptotic convergence by means of the linearization method.
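The approximate entropy measure used above can be computed as in the following sketch (Pincus' ApEn, applied here to the classical integer-order logistic map as a stand-in; the fractional map itself is not reproduced).

```python
import numpy as np

def approx_entropy(x, m=2, r=None):
    """Approximate entropy ApEn(m, r) of a 1-D series."""
    x = np.asarray(x, dtype=float)
    if r is None:
        r = 0.2 * x.std()                                # common tolerance choice
    def phi(mm):
        n = len(x) - mm + 1
        emb = np.array([x[i:i + mm] for i in range(n)])  # embedded template vectors
        # Chebyshev distances between all pairs of templates
        d = np.max(np.abs(emb[:, None, :] - emb[None, :, :]), axis=2)
        c = (d <= r).mean(axis=1)                        # match fractions
        return np.log(c).mean()
    return phi(m) - phi(m + 1)

# Chaotic logistic map versus a constant signal: chaos yields higher ApEn.
xs = [0.4]
for _ in range(499):
    xs.append(3.9 * xs[-1] * (1.0 - xs[-1]))
ap_chaos = approx_entropy(np.array(xs))
ap_flat = approx_entropy(np.zeros(500))
```

A perfectly regular signal gives ApEn = 0, while the chaotic orbit gives a clearly positive value, which is the sense in which ApEn "quantifies the level of chaos".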

Entropy doi: 10.3390/e20090719

Authors: Jesús Gutiérrez-Gutiérrez Marta Zárraga-Rodríguez Pedro M. Crespo Xabier Insausti

In this paper, we obtain an integral formula for the rate distortion function (RDF) of any Gaussian asymptotically wide sense stationary (AWSS) vector process. Applying this result, we also obtain an integral formula for the RDF of Gaussian moving average (MA) vector processes and of Gaussian autoregressive MA (ARMA) AWSS vector processes.

Entropy doi: 10.3390/e20090718

Authors: Hao Liao Xiao-Min Huang Alexandre Vidmer Yi-Cheng Zhang Ming-Yang Zhou

The Belt and Road Initiative (BRI) was announced in 2013 by the Chinese government. Its goal is to promote cooperation between European and Asian countries, as well as to enhance trust between members and unify the market. Since its creation, more and more developing countries have joined the initiative. Based on the geographical location characteristics of the countries in this initiative, we propose an improvement of a popular recommendation algorithm that includes geographic location information. This recommendation algorithm is able to make suitable recommendations of products for countries in the BRI. Then, Fitness and Complexity metrics are used to evaluate the impact of the recommendation results and measure each country's competitiveness. The aim of this work is to provide countries with insights into their ideal development direction. By following the recommendations, countries can quickly increase their international competitiveness.

Entropy doi: 10.3390/e20090717

Authors: Maël Dugast Guillaume Bouleux Eric Marcon

In this work, we propose a new view of stochastic processes through the geometry induced by dilation. The dilation matrices of a given process are obtained by a composition of rotation matrices built with respect to partial correlation coefficients. Particularly interesting is the fact that the dilation matrices can be obtained regardless of the stationarity of the underlying process. When the process is stationary, only one dilation matrix is obtained, and it corresponds to the Naimark dilation. When the process is nonstationary, a set of dilation matrices is obtained, corresponding to the Kolmogorov decomposition. In this work, the nonstationary class of periodically correlated processes was of interest. The underlying periodicity of the correlation coefficients is then transmitted to the set of dilation matrices. Because this set lives on the Lie group of rotation matrices, we can see it as the points of a closed curve on the Lie group. Geometrical aspects can then be investigated through the shape of the obtained curves; to give a complete insight into the space of curves, a metric and the derived geodesic equations are provided. The general results are adapted to the more specific case where the base manifold is the Lie group of rotation matrices, and because the metric in the space of curves naturally extends to the space of shapes, this enables a comparison between curves' shapes and thus allows the classification of the measures of random processes.

Entropy doi: 10.3390/e20090716

Authors: Shuqin Zhu Congxu Zhu Wenhong Wang

In order to overcome the difficulty of key management in "one-time pad" encryption schemes and also to resist chosen-plaintext attacks, a new image encryption algorithm based on chaos and SHA-256 is proposed in this paper. The architecture of confusion and diffusion is adopted. Firstly, the plaintext image is surrounded by a sequence generated from the SHA-256 hash value of the plaintext, ensuring that each encrypted result is different. Secondly, the image is scrambled according to a random sequence obtained by adding a plaintext-related disturbance term to the chaotic sequence. Thirdly, a ciphertext (plaintext) feedback mechanism with a dynamic index is adopted in the diffusion stage; that is, the location index of the ciphertext (plaintext) used for feedback is dynamic. These measures ensure that the algorithm can resist chosen-plaintext attacks and overcome the difficulty of key management in "one-time pad" encryption schemes. Experimental results, including key space analysis, key sensitivity analysis, differential analysis, histograms, information entropy, and correlation coefficients, show that the image encryption algorithm is safe and reliable, with high application potential.
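The plaintext-dependent keystream idea, i.e., perturbing a chaotic sequence with the SHA-256 hash of the plaintext so that every encryption differs, can be illustrated with a minimal sketch. The logistic map, parameter values and byte-folding below are illustrative assumptions, not the paper's full confusion-diffusion cipher, and are not secure on their own.

```python
import hashlib

def keystream_from_digest(digest: bytes, n: int, key_x0: float = 0.3) -> bytes:
    """Logistic-map keystream whose initial condition is perturbed by a
    SHA-256 digest of the plaintext (illustrative sketch, not a secure cipher)."""
    delta = int.from_bytes(digest[:8], "big") / 2 ** 64   # fold hash into (0, 1)
    x = (key_x0 + delta) % 1.0 or 0.5                     # avoid the x = 0 fixed point
    out = bytearray()
    for _ in range(n):
        x = 3.99 * x * (1.0 - x)                          # chaotic logistic map
        out.append(int(x * 256) % 256)
    return bytes(out)

msg = b"plain image bytes"
digest = hashlib.sha256(msg).digest()                     # kept with the ciphertext
ks = keystream_from_digest(digest, len(msg))
cipher = bytes(m ^ k for m, k in zip(msg, ks))
# decryption regenerates the same keystream from the stored digest
plain = bytes(c ^ k for c, k in zip(cipher, keystream_from_digest(digest, len(cipher))))
```

Because the keystream depends on the plaintext hash, identical images encrypted under the same key still produce different ciphertexts, which is the property the scheme uses against chosen-plaintext attacks.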

Entropy doi: 10.3390/e20090715

Authors: Ming Zhang Jinghong Zhou Runjuan Zhou

The sustainability of regional water resources provides important supporting data for establishing policies on the sustainable development of the social economy. The purpose of this paper is to propose an assessment method that accurately reflects the sustainability of regional water resources in various areas. The method is based on the relative entropy of information entropy theory and proceeds as follows. Firstly, the evaluation sample data are pretreated; the relative entropy of each standard evaluation sample and evaluation grade (SEG) is then calculated to obtain the entropy weight of each evaluation index. After this, the entropy-weighted comprehensive index (WCI) of the standard evaluation grade sample is obtained. The functional relation between WCI and SEG can be fitted by a cubic polynomial to construct the evaluation function. Using the above steps, a generalized entropy method (GEM) for the sustainability assessment of regional water resources is established, and it is used to evaluate the sustainability of water resources in the Pingba and Huai River areas in China. The results show that the proposed GEM model can accurately reflect the sustainability of water resources in the two regions. Compared with other evaluation models, such as the Shepherd method, artificial neural networks and fuzzy comprehensive evaluation, the GEM model produces more clearly differentiated, and more reasonable, evaluation results. Thus, the proposed GEM model can provide scientific data support for coordinating the relationship between the sustainable development and utilization of regional water resources, in order to improve the development of regional population, society and economy.
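The entropy-weighting step at the core of such methods can be sketched as follows. The indicator matrix is invented for illustration, and the formula is the standard entropy-weight construction, which may differ in detail from the paper's relative-entropy variant.

```python
import numpy as np

def entropy_weights(X):
    """Entropy weights for an (n_samples, n_indicators) benefit-type matrix."""
    X = np.asarray(X, dtype=float)
    P = X / X.sum(axis=0)                          # column-wise proportions
    n = X.shape[0]
    # Shannon entropy per indicator, with the convention 0 * log 0 := 0
    logs = np.where(P > 0, np.log(np.where(P > 0, P, 1.0)), 0.0)
    E = -(P * logs).sum(axis=0) / np.log(n)        # normalized to [0, 1]
    d = 1.0 - E                                    # degree of divergence
    return d / d.sum()                             # entropy weights

# Hypothetical scores of three regions on three indicators.
X = np.array([[0.8, 0.2, 0.5],
              [0.6, 0.9, 0.5],
              [0.7, 0.4, 0.5]])
w = entropy_weights(X)
```

An indicator that is constant across samples (the third column) carries no discriminating information, so its entropy is maximal and its weight is (numerically) zero, which is exactly the behavior the entropy-weight construction is chosen for.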

Entropy doi: 10.3390/e20090714

Authors: Emanuel Guariglia

The aim of this paper is to investigate the generalization of the Sierpinski gasket through the harmonic metric. In particular, this work presents an antenna based on such a generalization. In fact, the harmonic Sierpinski gasket is used as a geometric configuration of small antennas. As with fractal antennas and Rényi entropy, their performance is characterized by the associated entropy, which is studied and discussed here.

Entropy doi: 10.3390/e20090713

Authors: Dor Cohen Ofer Strichman

We present a new characterization of propositional formulas called entropy, which approximates the freedom we have in assigning the variables. Like several other such measures (e.g., back-door and back-door-key variables), it is computationally expensive to compute. Nevertheless, for small and medium-size satisfiable formulas, it enables us to study the effect of this freedom on the impact of various SAT heuristics, following up on a recent study by C. Oh (Oh, SAT'15, LNCS 9340, 307–323). Oh's findings were that the expected success of various heuristics depends on whether the input formula is satisfiable or not. With entropy, and also with the measure of solution density, we are able to refine these findings for the case of satisfiable formulas. Specifically, we found empirically that satisfiable formulas with small entropy "behave" similarly to unsatisfiable formulas.
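For small formulas, both solution density and one plausible formalization of this "freedom" can be computed by brute force. The per-variable entropy below is an illustrative reading, not necessarily the paper's exact definition.

```python
from itertools import product
from math import log2

def solutions(clauses, n_vars):
    """Enumerate satisfying assignments of a CNF formula (brute force).
    A clause is a list of nonzero ints: k means x_k, -k means NOT x_k."""
    for bits in product([False, True], repeat=n_vars):
        if all(any(bits[abs(l) - 1] == (l > 0) for l in c) for c in clauses):
            yield bits

def mean_variable_entropy(clauses, n_vars):
    """Average binary entropy of each variable's marginal over the solution set."""
    sols = list(solutions(clauses, n_vars))
    h = 0.0
    for i in range(n_vars):
        p = sum(s[i] for s in sols) / len(sols)
        if 0.0 < p < 1.0:
            h -= p * log2(p) + (1.0 - p) * log2(1.0 - p)
    return h / n_vars

# (x1 OR x2) AND (NOT x1 OR x3): 4 of the 8 assignments are satisfying.
clauses = [[1, 2], [-1, 3]]
density = len(list(solutions(clauses, 3))) / 2 ** 3
h = mean_variable_entropy(clauses, 3)
```

High entropy means each variable is nearly free over the solution set; formulas with entropy near zero have tightly constrained variables, which is the regime the abstract compares to unsatisfiable formulas.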

Entropy doi: 10.3390/e20090712

Authors: Edward Bormashenko

The notion of three-phase (line) tension remains one of the most disputable notions in surface science. A very broad range of its values has been reported. Experts do not even agree on the sign of line tension. The polymer-chain-like model of the three-phase (triple) line enables a rough estimation of the entropic input into the value of line tension, estimated as Γ_en ≅ k_BT/d_m ≅ 10^(−11) N, where d_m is the diameter of the liquid molecule. The introduction of the polymer-chain-like model of the triple line is justified by the "water string" model of the liquid state, predicting strong orientation effects for liquid molecules located near hydrophobic moieties. The estimated value of the entropic input into the line tension is close to experimental findings reported by various groups, and seems to be relevant for the understanding of the elastic properties of biological membranes.

Entropy doi: 10.3390/e20090711

Authors: Yuze Su Xiangru Meng Qiaoyan Kang Xiaoyang Han

Network virtualization can offer more flexibility and better manageability for the next generation Internet. With the increasing deployment of virtual networks in military and commercial networks, a major challenge is to ensure virtual network survivability against hybrid multiple failures. In this paper, we study the problem of recovering virtual networks affected by hybrid multiple failures in substrate networks and provide an integer linear programming formulation to solve it. We propose a heuristic algorithm to tackle the complexity of the integer linear programming formulation, which includes a faulty virtual network reconfiguration ranking method based on weighted relative entropy, a hybrid multiple failures ranking algorithm, and a virtual node migration method based on weighted relative entropy. In both weighted-relative-entropy-based methods, multiple ranking indicators are combined in a suitable way. In the hybrid multiple failures ranking algorithm, the virtual node and its connecting virtual links are re-embedded first. Evaluation results show that our heuristic method not only has the best acceptance ratio and normal operation ratio, but also achieves the highest long-term average revenue-to-cost ratio compared with other virtual network reconfiguration methods.

Entropy doi: 10.3390/e20090710

Authors: Samir Bendoukha Adel Ouannas Xiong Wang Amina-Aicha Khennaoui Viet-Thanh Pham Giuseppe Grassi Van Van Huynh

This paper is concerned with the co-existence of different synchronization types for fractional-order discrete-time chaotic systems with different dimensions. In particular, we show that through appropriate nonlinear control, projective synchronization (PS), full state hybrid projective synchronization (FSHPS), and generalized synchronization (GS) can be achieved simultaneously. A second nonlinear control scheme is developed whereby inverse full state hybrid projective synchronization (IFSHPS) and inverse generalized synchronization (IGS) are shown to co-exist. Numerical examples are presented to confirm the findings.

Entropy doi: 10.3390/e20090709

Authors: Anton M. Unakafov Karsten Keller

This paper is devoted to change-point detection using only the ordinal structure of a time series. A statistic based on the conditional entropy of ordinal patterns, which characterize the local ups and downs in a time series, is introduced and investigated. The statistic requires only minimal a priori information on the given data and shows good performance in numerical experiments. By the nature of ordinal patterns, the proposed method does not detect pure level changes but changes in the intrinsic pattern structure of a time series, and so it could be interesting in combination with other methods.
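The building blocks, ordinal patterns and their conditional entropy, can be sketched as follows. The change-point statistic itself is not reproduced; this only illustrates that structured and unstructured segments yield clearly different conditional entropies, which is what the statistic exploits.

```python
import numpy as np
from collections import Counter

def ordinal_patterns(x, d=3):
    """Encode each length-d window by the ordinal pattern (rank order) of its values."""
    return [tuple(np.argsort(x[i:i + d])) for i in range(len(x) - d + 1)]

def conditional_entropy(pats):
    """Empirical conditional entropy H(next pattern | current pattern), in nats."""
    pairs = Counter(zip(pats[:-1], pats[1:]))
    firsts = Counter(pats[:-1])
    n = len(pats) - 1
    return -sum((c / n) * np.log(c / firsts[a]) for (a, _), c in pairs.items())

rng = np.random.default_rng(1)
noise = rng.normal(size=400)                                       # no intrinsic structure
trend = np.linspace(0.0, 5.0, 400) + 0.001 * rng.normal(size=400)  # near-monotone
h_noise = conditional_entropy(ordinal_patterns(noise))
h_trend = conditional_entropy(ordinal_patterns(trend))
```

A level shift alone would not change the pattern statistics, while a change in local dynamics would, matching the abstract's remark that the method detects pattern-structure changes rather than pure level changes.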

Entropy doi: 10.3390/e20090708

Authors: Jonatan Zischg Wolfgang Rauch Robert Sitzenfrei

Cities and their infrastructure networks are always in motion, permanently changing in structure and function. This paper presents a methodology for automatically creating future water distribution networks (WDNs) that are stressed step by step by the disconnection and connection of WDN parts. The associated effects of demand shifting and flow rearrangement are simulated and assessed with hydraulic performance measures. With the methodology, it is possible to test various planning and adaptation options for the future WDN, where the unknown (future) network is approximated via the co-located and known (future) road network, and hence different topological characteristics (branched vs. strongly looped layout) can be investigated. The reliability of the planning options is evaluated with the flow entropy, a measure based on Shannon's informational entropy. Uncertainties regarding future water consumption and water loss management are included in a scenario analysis. To avoid insufficient water supply to customers during the transition process from an initial to a final WDN state, an adaptation concept is proposed in which critical WDN components are replaced over time. Finally, the method is applied to the drastic urban transition of Kiruna, Sweden. Results show that, without adaptation measures, severe performance drops will occur after the 2023 WDN state, mainly caused by the disconnection of WDN parts. However, with low adaptation efforts that consider 2–3% pipe replacement, sufficient pressure performance is achieved. Furthermore, by using an entropy-cost comparison, the best planning options are determined.
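The flow entropy idea can be sketched in a few lines: for a demand node, supply spread over several alternative paths gives higher Shannon entropy than a single supply path, which is why looped layouts score as more reliable. The inflow values below are invented, and the full network-wide flow entropy additionally aggregates over all nodes.

```python
import math

def flow_entropy(inflows):
    """Shannon flow entropy of a demand node supplied by several pipes;
    more evenly shared supply paths give higher entropy (higher reliability)."""
    total = sum(inflows)
    return -sum((q / total) * math.log(q / total) for q in inflows if q > 0)

looped = flow_entropy([10.0, 10.0, 10.0])   # strongly looped layout: three equal paths
branched = flow_entropy([30.0])             # branched layout: single supply path
```

Here the looped node attains the maximum ln 3 while the branched node scores zero, reflecting that a single-path node has no redundancy if its pipe fails.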

Entropy doi: 10.3390/e20090707

Authors: Matthew E. Quenneville David A. Sivak

A stochastic system under the influence of a stochastic environment is correlated with both present and future states of the environment. Such a system can be seen as implicitly implementing a predictive model of future environmental states. The non-predictive model complexity has been shown to lower-bound the thermodynamic dissipation. Here we explore these statistical and physical quantities at steady state in simple models. We show that under quasi-static driving this model complexity saturates the dissipation. Beyond the quasi-static limit, we demonstrate a lower bound on the ratio of this model complexity to total dissipation, which is realized in the limit of weak driving.

Entropy doi: 10.3390/e20090706

Authors: Khalid Sayood

We examine how information theory has been used to study cognition over the last seven decades. After an initial burst of activity in the 1950s, the backlash that followed stopped most work in this area. The last couple of decades have seen both a revival of interest and a more firmly grounded, experimentally justified use of information theory. We can view cognition as the process of transforming perceptions into information, where we use information in the colloquial sense of the word. This last clarification points to one of the problems we run into when trying to use information-theoretic principles to understand or analyze cognition. Information theory is mathematical, while cognition is a subjective phenomenon. It is relatively easy to discern a subjective connection between cognition and information; it is a different matter altogether to apply the rigor of information theory to the process of cognition. In this paper, we look at the many ways in which people have tried to alleviate this problem. These approaches range from narrowing the focus to only quantifiable aspects of cognition to borrowing conceptual machinery from information theory to address issues of cognition. We describe applications of information theory across a range of cognition research, from neural coding to cognitive control and predictive coding.

Entropy doi: 10.3390/e20090705

Authors: Juan López-Sauceda Jorge López-Ortega Gerardo Abel Laguna Sánchez Jacobo Sandoval Gutiérrez Ana Paola Rojas Meza José Luis Aragón

A basic pattern in the body plan architecture of many animals, plants and some molecular and cellular systems is five-part units. This pattern has been understood as a result of genetic blueprints in development and as a widely conserved evolutionary character. Despite some efforts, a definitive explanation of the abundance of pentagonal symmetry at so many levels of complexity is still missing. Based on both a computational platform and a statistical spatial-organization argument, we show that five-fold morphology is substantially different from other abundant symmetries, like three-fold, four-fold and six-fold symmetries, in terms of spatially interacting elements. We develop a measuring system to determine levels of spatial organization in 2D polygons (homogeneous or heterogeneous partitions of defined areas) based on principles of regularity in a morphospace. We found that the spatial organization of five-fold symmetry is statistically higher than that of all other symmetries studied here (3- to 10-fold symmetries) in terms of spatial homogeneity. The significance of our findings rests on the statistical constancy of geometrical constraints derived from the spatial organization of shapes, beyond the material or complexity level of the many different systems where pentagonal symmetry occurs.

Entropy doi: 10.3390/e20090704

Authors: Jürn Schmelzer Timur Tropin

A response is given to the comment of Zanotto and Mauro on our paper published in Entropy 20, 103 (2018). The arguments presented in our paper are largely ignored by them, and no new considerations are outlined in the comment that would require a revision of our conclusions. For this reason, we restrict ourselves here to a brief response, supplementing it with some additional arguments in favor of our point of view not included in the above-cited paper.

Entropy doi: 10.3390/e20090703

Authors: Edgar D. Zanotto John C. Mauro

In a recent article, Schmelzer and Tropin [Entropy 2018, 20, 103] presented a critique of several aspects of modern glass science, including various features of glass transition and relaxation, crystallization, and the definition of glass itself. We argue that these criticisms are at odds with well-accepted knowledge in the field from both theory and experiments. The objective of this short comment is to clarify several of these issues.

Entropy doi: 10.3390/e20090702

Authors: Binghan Liu Zhongguang Fu Pengkai Wang Lu Liu Manda Gao Ji Liu

The energy use analysis of coal-fired power plant units is of significance for energy conservation and consumption reduction. One of the most serious problems attributed to Chinese coal-fired power plants is coal waste. Several units in one plant may experience a practical rated output situation at the same time, which may increase the coal consumption of the power plant. Here, we propose a new hybrid methodology for plant-level load optimization to minimize the coal consumption of coal-fired power plants. The proposed methodology includes two parts. One part determines the reference values of the controllable operating parameters for net coal consumption under typical load conditions, based on an improved K-means algorithm and the Hadoop platform. The other part utilizes a support vector machine to determine the sensitivity coefficients of the various operating parameters for net coal consumption under different load conditions. Additionally, the fuzzy rough set attribute reduction method was employed to obtain a minimal set of parameters and reduce the complexity of the dataset. This work is based on continuously measured information system data from a 600 MW coal-fired power plant in China. The results show that the proposed strategy achieves high energy conservation performance. Taking the 600 MW load optimization value as an example, the optimized power supply coal consumption is 307.95 g/(kW·h), compared to the actual operating value of 313.45 g/(kW·h). It is important for coal-fired power plants to reduce their coal consumption.
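The reference-value extraction step can be sketched with plain Lloyd's k-means, a simple stand-in for the paper's improved K-means. The operating records below (main steam temperature and feedwater flow near one load condition) are invented for illustration; each cluster center then serves as a candidate reference value for its operating regime.

```python
import numpy as np

def kmeans(X, k, iters=50):
    """Plain Lloyd's k-means with deterministic spread initialization."""
    centers = X[np.linspace(0, len(X) - 1, k, dtype=int)].copy()
    for _ in range(iters):
        # squared distances of every sample to every center
        d = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=-1)
        labels = d.argmin(axis=1)
        for j in range(k):
            if np.any(labels == j):
                centers[j] = X[labels == j].mean(axis=0)
    return centers, labels

# Toy operating records near one load condition:
# columns = [main steam temperature (degC), feedwater flow (t/h)] -- invented values.
rng = np.random.default_rng(1)
X = np.vstack([rng.normal([566.0, 240.0], [2.0, 5.0], (100, 2)),
               rng.normal([540.0, 260.0], [2.0, 5.0], (100, 2))])
centers, labels = kmeans(X, 2)
```

The two recovered centers separate the two operating regimes; in the paper's pipeline, the regime with the lower net coal consumption would supply the reference values.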

Entropy doi: 10.3390/e20090701

Authors: Beige Ye Taorong Qiu Xiaoming Bai Ping Liu

In view of the nonlinear characteristics of the electroencephalography (EEG) signals collected in driving fatigue state recognition research, and given that the recognition accuracy of EEG-based driving fatigue recognition methods remains unsatisfactory, this paper proposes a driving fatigue recognition method based on sample entropy (SE) and kernel principal component analysis (KPCA). The method combines the high recognition accuracy of sample entropy with the strengths of KPCA in nonlinear dimensionality reduction and its strong nonlinear processing capability. Using a support vector machine (SVM) classifier, the proposed method (called SE_KPCA) is tested on the EEG data and compared with methods based on fuzzy entropy (FE), combination entropy (CE), and the three entropies SE, FE and CE merged with KPCA. Experiment results show that the method is effective.
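The sample entropy feature can be computed as in the following sketch (Richman and Moorman's SampEn; the EEG preprocessing and KPCA/SVM stages are not reproduced, and the test signals here are synthetic stand-ins).

```python
import numpy as np

def sample_entropy(x, m=2, r=None):
    """SampEn(m, r) = -ln(A/B), counting template-pair matches without self-matches."""
    x = np.asarray(x, dtype=float)
    if r is None:
        r = 0.2 * x.std()                            # common tolerance choice
    def match_pairs(mm):
        emb = np.array([x[i:i + mm] for i in range(len(x) - mm)])
        # Chebyshev distances between all pairs of templates
        d = np.max(np.abs(emb[:, None, :] - emb[None, :, :]), axis=2)
        return ((d <= r).sum() - len(emb)) / 2       # exclude the diagonal
    return -np.log(match_pairs(m + 1) / match_pairs(m))

rng = np.random.default_rng(0)
se_noise = sample_entropy(rng.normal(size=400))                        # irregular
se_sine = sample_entropy(np.sin(np.linspace(0.0, 20.0 * np.pi, 400)))  # regular
```

Regular signals yield low SampEn and irregular ones high SampEn, which is what makes it usable as a fatigue-sensitive EEG feature.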

Entropy doi: 10.3390/e20090700

Authors: Michail Vlysidis Yiannis N. Kaznessis

The time evolution of stochastic reaction networks can be modeled with the chemical master equation of the probability distribution. Alternatively, the numerical problem can be reformulated in terms of probability moment equations. Herein we present a new alternative method for numerically solving the time evolution of stochastic reaction networks. Based on the assumption that the entropy of the reaction network is maximal, Lagrange multipliers are introduced. The proposed method derives equations that model the time derivatives of these Lagrange multipliers. We present detailed steps to transform the moment equations into Lagrange multiplier equations. In order to demonstrate the method, we present examples of non-linear stochastic reaction networks of varying degrees of complexity, including multistable and oscillatory systems. We find that the new approach is as accurate as, and significantly more efficient than, Gillespie's original exact algorithm for systems with a small number of interacting species. This work is a step towards solving stochastic reaction networks accurately and efficiently.

Entropy doi: 10.3390/e20090699

Authors: Guolong Chen

The Koch curve exciting coil eddy current sensor is a novel flexible planar eddy current probe. In this study, an intersection angle spectrum entropy index and a radial direction energy spectrum entropy index are proposed to evaluate the eddy current distribution. Eddy current distributions induced by one turn of a circular coil and one turn of a second-order Koch curve coil, fed with alternating currents at different exciting frequencies and at different lift-off distances, were simulated, and the eddy current distributions varying with lift-off distance at different exciting frequencies were compared using the two proposed indices. With an increase of the lift-off distance or a decrease of the exciting frequency, the similarity between the shape of the Koch curve and the eddy current distribution weakened, and the concentration of the eddy current distribution in the specimen under the exciting coil loosened.

Entropy doi: 10.3390/e20090698

Authors: Alberto Muñoz Nicolás Hernández Javier M. Moguerza Gabriel Martos

The combination of different sources of information is a problem that arises in several situations, for instance, when data are analysed using different similarity measures. Often, each source of information is given as a similarity, distance, or a kernel matrix. In this paper, we propose a new class of methods which consists of producing, for anomaly detection purposes, a single Mercer kernel (that acts as a similarity measure) from a set of local entropy kernels and, at the same time, avoids the task of model selection. This kernel is used to build an embedding of data in a variety that will allow the use of a (modified) one-class Support Vector Machine to detect outliers. We study several information combination schemes and their limiting behaviour when the data sample size increases within an Information Geometry context. In particular, we study the variety of the given positive definite kernel matrices to obtain the desired kernel combination as belonging to that variety. The proposed methodology has been evaluated on several real and artificial problems.

Entropy doi: 10.3390/e20090697

Authors: Jiri Petrzela

This paper presents an analysis of a multiple-valued memory system (MVMS) composed of a pair of resonant tunneling diodes (RTD). The ampere-voltage characteristic (AVC) of both diodes is approximated in the operational voltage range, as is common in practice, by a polynomial scalar function. The mathematical model of the MVMS is an autonomous deterministic dynamical system with three degrees of freedom and a smooth vector field. Based on very recent results achieved for the piecewise-linear MVMS, numerical values of the parameters are calculated such that funnel and double-spiral chaotic attractors are observed. The existence of these types of strange attractors is proved both numerically, using the concept of the largest Lyapunov exponent (LLE), and experimentally, by computer-aided simulation of the designed lumped circuit using only commercially available active elements.

Entropy doi: 10.3390/e20090696

Authors: Sergio Davis Diego González Gonzalo Gutiérrez

A general framework for inference in dynamical systems is described, based on the language of Bayesian probability theory and making use of the maximum entropy principle. Taking the concept of a path as fundamental, the continuity equation and Cauchy's equation for fluid dynamics arise naturally, while the specific information about the system can be included using the maximum caliber (or maximum path entropy) principle.

Entropy doi: 10.3390/e20090695

Authors: Gamaliel Blé Domingo González

This paper discusses some properties of the topological entropy of systems generated by polynomials of degree d on their Hubbard trees. An optimization of Thurston's core entropy algorithm is developed for a family of polynomials of degree d.

Entropy doi: 10.3390/e20090694

Authors: Teresa C. M. Dias Marcio A. Diniz Carlos A. de B. Pereira Adriano Polpo

The 37th edition of MaxEnt was held in Brazil, hosting several distinguished researchers and students. The workshop offered four tutorials, nine invited talks, twenty-four oral presentations and twenty-seven poster presentations. All submissions received their first choice between oral and poster presentations. The event held a celebration of Julio Stern's 60th birthday and awarded two prizes to young researchers. As customary, the workshop had one free afternoon, in which participants visited the city's surroundings and experienced Brazilian food and traditions.

Entropy doi: 10.3390/e20090693

Authors: Juan Wang Qun Ding

Based on the keyword abstract extraction function of the Natural Language Processing and Information Retrieval Sharing Platform (NLPIR), a design method for a dynamic-rounds chaotic block cipher is presented in this paper, which takes into account both security and efficiency. The cipher combines chaotic theory with a Feistel-structure block cipher, and uses the randomness of a chaotic sequence and the nonlinearity of a chaotic S-box to dynamically generate the number of encryption rounds: more rounds are applied to the important information marked by NLPIR, and fewer rounds to the unmarked, non-important information. Through linear and differential cryptanalysis, ciphertext information entropy, "0–1" balance and National Institute of Standards and Technology (NIST) tests, and comparison with other traditional and lightweight block ciphers, the results indicate that the dynamic variation of encryption rounds can provide different levels of encryption for different information, enhancing the anti-attack ability while reducing the number of encryption rounds. Therefore, the dynamic-rounds chaotic block cipher can guarantee the security of information transmission and realize a lightweight cryptographic algorithm.

Entropy doi: 10.3390/e20090692

Authors: Margarita A. Man’ko Vladimir I. Man’ko

We study an analog of Bayes' formula and the nonnegativity property of mutual information for systems with one random variable. For single-qudit states, we present new entropic inequalities in the form of subadditivity and a condition corresponding to hidden correlations in quantum systems. We present qubit states in the quantum suprematism picture, where these states are identified with three probability distributions describing the states of three classical coins, and illustrate the states by the Triada of Malevich's squares with areas satisfying the quantum constraints. We consider arbitrary quantum states belonging to an N-dimensional Hilbert space as N^2 − 1 fair probability distributions describing the states of N^2 − 1 classical coins. We illustrate the geometrical properties of the qudit states by a set of Triadas of Malevich's squares. We obtain new entropic inequalities for the matrix elements of an arbitrary N×N density matrix of qudit systems, using the constructed maps of the density matrix onto a set of probability distributions. In addition, to construct the bijective map of the qudit state onto the set of probabilities describing the positions of classical coins, we show that there exists a bijective map of any quantum observable onto a set of dichotomic classical random variables with statistics determined by the above classical probabilities. Finally, we discuss the physical meaning of the derived inequalities and the possibility of checking them in experiments with superconducting circuits based on Josephson junction devices.
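The identification of a qubit state with three "coin" probabilities can be checked numerically: p_k = (1 + Tr(ρσ_k))/2 for the spin projections along x, y, z, subject to the Bloch-ball constraint (p1 − 1/2)^2 + (p2 − 1/2)^2 + (p3 − 1/2)^2 ≤ 1/4, with equality for pure states. A minimal sketch (the specific state chosen below is illustrative):

```python
import numpy as np

# Pauli matrices
sx = np.array([[0, 1], [1, 0]], dtype=complex)
sy = np.array([[0, -1j], [1j, 0]], dtype=complex)
sz = np.array([[1, 0], [0, -1]], dtype=complex)

def coin_probabilities(rho):
    """Three 'classical coin' probabilities p_k = (1 + Tr(rho sigma_k)) / 2."""
    return [float(np.real(np.trace(rho @ s)) + 1.0) / 2.0 for s in (sx, sy, sz)]

# Pure state |+> = (|0> + |1>)/sqrt(2): density matrix with all entries 1/2.
rho_plus = np.array([[0.5, 0.5], [0.5, 0.5]], dtype=complex)
p1, p2, p3 = coin_probabilities(rho_plus)
# Quantum constraint on the three coins; equals 1/4 exactly for pure states.
constraint = (p1 - 0.5) ** 2 + (p2 - 0.5) ** 2 + (p3 - 0.5) ** 2
```

For this pure state the x-coin is deterministic (p1 = 1) while the y- and z-coins are fair, and the constraint saturates at 1/4, illustrating the "areas satisfying the quantum constraints" mentioned above.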

Entropy doi: 10.3390/e20090691

Authors: Irina Popova Alexandr Rozhnoi Maria Solovieva Danila Chebrov Masashi Hayakawa

The neural network approach is proposed for studying very-low- and low-frequency (VLF and LF) subionospheric radio wave variations in the time vicinities of magnetic storms and earthquakes, with the purpose of recognizing anomalies of different types. We also examined the days with quiet geomagnetic conditions in the absence of seismic activity, in order to distinguish between the disturbed signals and the quiet ones. To this end, we trained the neural network (NN) on the examples of the representative database. The database included both the VLF/LF data that was measured during four-year monitoring at the station in Petropavlovsk-Kamchatsky, and the parameters of seismicity in the Kuril-Kamchatka and Japan regions. It was shown that the neural network can distinguish between the disturbed and undisturbed signals. Furthermore, the prognostic behavior of the VLF/LF variations indicative of magnetic and seismic activity has a different appearance in the time vicinity of the earthquakes and magnetic storms.

Entropy doi: 10.3390/e20090690

Authors: Ali H. Alkhaldi Mohd. Aquib Aliya Naaz Siddiqui Mohammad Hasan Shahid

In this paper, we obtain the upper bounds for the normalized δ-Casorati curvatures and generalized normalized δ-Casorati curvatures for statistical submanifolds in Sasaki-like statistical manifolds with constant curvature. Further, we discuss the equality case of the inequalities. Moreover, we give the necessary and sufficient condition for a Sasaki-like statistical manifold to be η-Einstein. Finally, we provide the condition under which the metric of Sasaki-like statistical manifolds with constant curvature is a solution of the vacuum Einstein field equations.

Entropy doi: 10.3390/e20090689

Authors: Ming Li Jialin Wang Ying Li Yingcheng Xu

Disclosure of sustainability information is important for stockholders and governments. In order to evaluate the quality of sustainability information disclosure in heavily polluting industries, the quality of disclosure is evaluated in terms of completeness, adequacy, relevance, reliability, normativeness and clarity, and the corresponding evaluation indicator system is constructed. Due to the ambiguity and complexity of the evaluation information, intuitionistic fuzzy sets are applied to model the linguistic ratings. Entropy is used to derive the weights of the experts as well as the objective and subjective weights of the indicators, which are integrated when aggregating the evaluation information. The quality of sustainability information disclosure of seven representative companies in heavily polluting industries is evaluated, and the importance of the indicators and the ranking of the companies are derived. Based on the evaluation results, discussion and suggestions are also provided.

Entropy doi: 10.3390/e20090688

Authors: Albert No

Let X^n be a memoryless uniform Bernoulli source and Y^n be its output through a binary symmetric channel. Courtade and Kumar conjectured that the Boolean function f : {0, 1}^n → {0, 1} that maximizes the mutual information I(f(X^n); Y^n) is a dictator function, i.e., f(x^n) = x_i for some i. We propose an equivalent clustering problem, emphasizing an information geometry aspect of the equivalent formulation. Moreover, we define a normalized geometric mean of measures and derive some of its interesting properties. We also show that the conjecture is true when the arithmetic and geometric means coincide in a specific set of measures.
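As a quick illustration of the quantity in this conjecture (a sketch of our own, not code from the paper), I(f(X^n); Y^n) can be brute-forced for small n; for the dictator f(x^n) = x_1 it reduces to the single-letter value 1 − h(p) of the BSC:

```python
from itertools import product
from math import log2

def h2(p):
    """Binary entropy function in bits."""
    return -p * log2(p) - (1 - p) * log2(1 - p)

def mutual_information(f, n, p):
    """Brute-force I(f(X^n); Y^n) for a uniform X^n sent through a BSC(p)."""
    joint = {}  # joint distribution of (f(x), y)
    for x in product([0, 1], repeat=n):
        for y in product([0, 1], repeat=n):
            flips = sum(a != b for a, b in zip(x, y))
            p_y_given_x = p**flips * (1 - p)**(n - flips)
            key = (f(x), y)
            joint[key] = joint.get(key, 0.0) + 2**-n * p_y_given_x
    p_u, p_y = {}, {}  # marginals of f(X^n) and Y^n
    for (u, y), pr in joint.items():
        p_u[u] = p_u.get(u, 0.0) + pr
        p_y[y] = p_y.get(y, 0.0) + pr
    return sum(pr * log2(pr / (p_u[u] * p_y[y]))
               for (u, y), pr in joint.items() if pr > 0)

# For the dictator f(x) = x_1, the mutual information is exactly 1 - h2(p).
mi = mutual_information(lambda x: x[0], n=3, p=0.1)
```

For instance, the XOR function f(x) = x_1 ⊕ x_2 gives 1 − h2(2p(1−p)), which is strictly smaller, consistent with the conjecture for this small case.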

Entropy doi: 10.3390/e20090687

Authors: Gerard Milburn Sally Shrapnel

Characterising causal structure is an activity that is ubiquitous across the sciences. Causal models are representational devices that can be used as oracles for future interventions, to predict how values of some variables will change in response to interventions on others. Recent work has generalised concepts from this field to situations involving quantum systems, resulting in a new notion of quantum causal structure. A key concept in both the classical and quantum context is that of an intervention. Interventions are the controlled operations required to identify causal structure and ultimately the feature that endows causal models with empirical meaning. Although interventions are a crucial feature of both the classical and quantum causal modelling frameworks, to date there has been no discussion of their physical basis. In this paper, we consider interventions from a physical perspective and show that, in both the classical and quantum case, they are constrained by the thermodynamics of measurement and feedback in open systems. We demonstrate that the perfect “atomic” or “surgical” interventions characterised by Pearl’s famous do-calculus are physically impossible, and this is the case for both classical and quantum systems.

Entropy doi: 10.3390/e20090686

Authors: Borzoo Rassouli Morteza Varasteh Deniz Gündüz

The capacity region of a two-transmitter Gaussian multiple access channel (MAC) under average input power constraints is studied, when the receiver employs a zero-threshold one-bit analogue-to-digital converter (ADC). It is proven that the input distributions of the two transmitters that achieve the boundary points of the capacity region are discrete. Based on the position of a boundary point, upper bounds on the number of the mass points of the corresponding distributions are derived. Furthermore, a lower bound on the sum capacity is proposed that can be achieved by time division with power control. Finally, inspired by the numerical results, the proposed lower bound is conjectured to be tight.

Entropy doi: 10.3390/e20090685

Authors: Jiang You Huijun Feng Lingen Chen Zhihui Xie

A heat conduction model of a radial-pattern disc with non-uniform heat generation (NUHG) is established in this paper. A series of high conductivity channels (HCCs) are attached at the rim of the disc and extended to its center. Constructal optimizations of discs with constant and variable cross-sectional HCCs are carried out, respectively, and their maximum temperature differences (MTDs) are minimized using an analytical method and the finite element method. Besides, the influences of the NUHG coefficient, the HCC number and the width coefficient on the optimal results are studied. The results indicate that the deviation between the optimal constructs obtained from the analytical method and the finite element method is comparatively slight. When the NUHG coefficient is equal to 10, the minimum MTD of the disc with 25 constant cross-sectional HCCs is reduced by 48.8% compared to that with 10 HCCs. As a result, the heat conduction performance (HCP) of the disc can be efficiently improved by properly increasing the number of HCCs. The minimum MTD of the disc with a variable cross-sectional HCC is decreased by 15.0% when the width coefficient is changed from 1 to 4. Therefore, variable cross-sectional HCC geometries can be applied in the constructal design of the disc for better heat transfer performance. The constructal results obtained by investigating the non-uniform heat generation case in this paper can contribute to the design of practical electronic devices with better heat transfer performance.

Entropy doi: 10.3390/e20090684

Authors: Fernando Jiménez Carlos Martínez Luis Miralles-Pechuán Gracia Sánchez Guido Sciavicco

The ease of interpretation of a classification model is essential for the task of validating it. Sometimes it is required to clearly explain the classification process of a model’s predictions. Models which are inherently easier to interpret can be effortlessly related to the context of the problem, and their predictions can be, if necessary, ethically and legally evaluated. In this paper, we propose a novel method to generate rule-based classifiers from categorical data that can be readily interpreted. Classifiers are generated using a multi-objective optimization approach focusing on two main objectives: maximizing the performance of the learned classifier and minimizing its number of rules. The multi-objective evolutionary algorithms ENORA and NSGA-II have been adapted to optimize the performance of the classifier based on three different machine learning metrics: accuracy, area under the ROC curve, and root mean square error. We have extensively compared the generated classifiers using our proposed method with classifiers generated using classical methods such as PART, JRip, OneR and ZeroR. The experiments have been conducted in full training mode, in 10-fold cross-validation mode, and in train/test splitting mode. To make results reproducible, we have used the well-known and publicly available datasets Breast Cancer, Monk’s Problem 2, Tic-Tac-Toe-Endgame, Car, kr-vs-kp and Nursery. After performing an exhaustive statistical test on our results, we conclude that the proposed method is able to generate highly accurate and easy to interpret classification models.

Entropy doi: 10.3390/e20090683

Authors: Ying Yang Huaixin Cao

Einstein–Podolsky–Rosen (EPR) steering is a very important quantum correlation of a composite quantum system: an intermediate type of nonlocal correlation between entanglement and Bell nonlocality. In this paper, after introducing definitions and characterizations of EPR steering, some EPR steering inequalities are derived. With these inequalities, the steerability of the maximally entangled state is checked and some conditions for the steerability of X-states are obtained.

Entropy doi: 10.3390/e20090682

Authors: Huimin Zhao Rui Yao Ling Xu Yu Yuan Guangyu Li Wu Deng

A damage degree identification method based on high-order difference mathematical morphology gradient spectrum entropy (HMGSEDI) is proposed in this paper to address the problem that fault signals of rolling bearings are weak and difficult to measure quantitatively. In the HMGSEDI method, on the basis of the mathematical morphology gradient spectrum and spectrum entropy, the influence of the scale of the structure elements on damage degree identification is thoroughly analyzed to determine the optimal scale range. The high-order difference mathematical morphology gradient spectrum entropy is then defined in order to quantitatively describe the fault damage degree of a bearing. A discrimination concept of fault damage degree is defined to quantitatively describe the difference between the high-order difference mathematical entropy and the general mathematical morphology entropy, leading to a fault damage degree identification method. Vibration signals of motors under no-load and load states are used to verify the effectiveness of the proposed HMGSEDI method. The experiments show that high-order difference mathematical morphology entropy can more effectively identify the fault damage degree of bearings and that the identification accuracy can be greatly improved. Therefore, the HMGSEDI method is an effective quantitative identification method, and provides a new way to identify the fault damage degree and predict faults of rotating machinery.

Entropy doi: 10.3390/e20090681

Authors: Pedro J. Zufiria Iker Barriales-Valbuena

Time evolving Random Network Models are presented as a mathematical framework for modelling and analyzing the evolution of complex networks. This framework allows the analysis over time of several network characterizing features such as link density, clustering coefficient, degree distribution, as well as entropy-based complexity measures, providing new insight on the evolution of random networks. First, some simple dynamic network models, based only on edge density, are analyzed to serve as a baseline reference for assessing more complex models. Then, a model that depends on network structure with the aim of reflecting some characteristics of real networks is also analyzed. Such model shows a more sophisticated behavior with two different regimes, one of them leading to the generation of high clustering coefficient/link density ratio values when compared with the baseline values, as it happens in many real networks. Simulation examples are discussed to illustrate the behavior of the proposed models.

Entropy doi: 10.3390/e20090680

Authors: Alexander Felix Beckmann Anirudh Singh Rana Manuel Torrilhon Henning Struchtrup

Due to the failure of the continuum hypothesis for higher Knudsen numbers, rarefied gases and microflows of gases are particularly difficult to model. Macroscopic transport equations compete with particle methods, such as the Direct Simulation Monte Carlo (DSMC) method, to find accurate solutions in the rarefied gas regime. Due to growing interest in microflow applications, such as micro fuel cells, it is important to model and understand evaporation in this flow regime. Here, evaporation boundary conditions for the R13 equations, which are macroscopic transport equations applicable in the rarefied gas regime, are derived. The new equations utilize Onsager relations, linear relations between thermodynamic fluxes and forces, with constant coefficients that need to be determined. For this, the boundary conditions are fitted to DSMC data and compared to other R13 boundary conditions from kinetic theory and to Navier–Stokes–Fourier (NSF) solutions for two one-dimensional steady-state problems. Overall, the suggested fittings of the new phenomenological boundary conditions show better agreement with DSMC than the alternative kinetic theory evaporation boundary conditions for R13. Furthermore, the new evaporation boundary conditions for R13 are implemented in a code for the numerical solution of complex, two-dimensional geometries and compared to NSF solutions. Different flow patterns between R13 and NSF for higher Knudsen numbers are observed.

Entropy doi: 10.3390/e20090679

Authors: David Ellerman

Logical information theory is the quantitative version of the logic of partitions just as logical probability theory is the quantitative version of the dual Boolean logic of subsets. The resulting notion of information is about distinctions, differences and distinguishability and is formalized using the distinctions (“dits”) of a partition (a pair of points distinguished by the partition). All the definitions of simple, joint, conditional and mutual entropy of Shannon information theory are derived by a uniform transformation from the corresponding definitions at the logical level. The purpose of this paper is to give the direct generalization to quantum logical information theory that similarly focuses on the pairs of eigenstates distinguished by an observable, i.e., qudits of an observable. The fundamental theorem for quantum logical entropy and measurement establishes a direct quantitative connection between the increase in quantum logical entropy due to a projective measurement and the eigenstates (cohered together in the pure superposition state being measured) that are distinguished by the measurement (decohered in the post-measurement mixed state). Both the classical and quantum versions of logical entropy have simple interpretations as “two-draw” probabilities for distinctions. The conclusion is that quantum logical entropy is the simple and natural notion of information for quantum information theory focusing on the distinguishing of quantum states.
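To make the “two-draw” interpretation concrete (our own minimal sketch, not taken from the paper), the classical logical entropy of a partition with block probabilities p_i is h(p) = 1 − Σ_i p_i², the probability that two independent draws are distinguished by the partition:

```python
def logical_entropy(block_probs):
    # Logical entropy h(p) = 1 - sum_i p_i^2: the probability that two
    # independent draws land in different blocks of the partition.
    return 1.0 - sum(p * p for p in block_probs)

# Two equiprobable blocks: two draws are distinguished half the time.
h = logical_entropy([0.5, 0.5])  # 0.5
```

A partition with a single block makes no distinctions, so its logical entropy is 0; the uniform partition into k blocks gives the maximum value 1 − 1/k.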

Entropy doi: 10.3390/e20090678

Authors: Michail Vlysidis Yiannis N. Kaznessis

Deterministic and stochastic models of chemical reaction kinetics can give starkly different results when the deterministic model exhibits more than one stable solution. For example, in the stochastic Schlögl model, the bimodal stationary probability distribution collapses to a unimodal distribution when the system size increases, even for kinetic constant values that result in two distinct stable solutions in the deterministic Schlögl model. Using the zero-information (ZI) closure scheme, an algorithm for solving chemical master equations, we compute stationary probability distributions for varying system sizes of the Schlögl model. With ZI-closure, system sizes can be studied that have previously been unattainable by stochastic simulation algorithms. We observe and quantify paradoxical discrepancies between stochastic and deterministic models and explain this behavior by postulating that the entropy of non-equilibrium steady states (NESS) is maximum.

Entropy doi: 10.3390/e20090677

Authors: Salim Lahmiri Stelios Bekiros

The risk–return trade-off is a fundamental relationship that has received a large amount of attention in financial and economic analysis. Indeed, it has important implications for understanding the linear dynamics of price returns and for active quantitative portfolio optimization. The main contributions of this work include, firstly, examining this relationship in five major fertilizer markets through different time periods: a period of low variability in returns and a period of high variability, such as that during which the recent global financial crisis occurred. Secondly, we explore how entropy in those markets varies over the investigated time periods, which requires assessing their inherent informational dynamics. The empirical results show that higher volatility is associated with larger returns in the diammonium phosphate, potassium chloride, triple superphosphate, and urea markets, but not in rock phosphate. In addition, the magnitude of this relationship is low during the period of high variability. It is concluded that key statistical patterns of returns and the relationship between return and volatility are affected during high variability periods. Our findings indicate that entropy in the return and volatility series of each fertilizer market increases significantly during time periods of high variability.

Entropy doi: 10.3390/e20090676

Authors: Bofeng Xu Junheng Feng Tongguang Wang Yue Yuan Zhenzhou Zhao Wei Zhong

A trailing-edge flap control strategy for mitigating rotor power fluctuations of a 5 MW offshore floating wind turbine is developed under turbulent wind inflow. The wind shear must be considered because of the large rotor diameter. The trailing-edge flap control strategy is based on the turbulent wind speed, the blade azimuth angle, and the platform motions. The rotor power is predicted using the free vortex wake method, coupled with the control strategy. The effect of the trailing-edge flap control on the rotor power is determined by a comparison with the rotor power of a turbine without a trailing-edge flap control. The optimal values of the three control factors are obtained. The results show that the trailing-edge flap control strategy is effective for improving the stability of the output rotor power of the floating wind turbine under the turbulent wind condition.

Entropy doi: 10.3390/e20090675

Authors: MHR. Khouzani Pasquale Malacaria

This paper studies the problem of optimal channel design. For a given input probability distribution and for hard and soft design constraints, the aim here is to design a (probabilistic) channel whose output leaks minimally from its input. To analyse this problem, general notions of entropy and information leakage are introduced. It can be shown that, for all notions of leakage here defined, the optimal channel design problem can be solved using convex programming with zero duality gap. Subsequently, the optimal channel design problem is studied in a game-theoretical framework: games allow for analysis of optimal strategies of both the defender and the adversary. It is shown that all channel design problems can be studied in this game-theoretical framework, and that the defender’s Bayes–Nash equilibrium strategies are equivalent to the solutions of the convex programming problem. Moreover, the adversary’s equilibrium strategies correspond to a robust inference problem.

Entropy doi: 10.3390/e20090674

Authors: Xiaohong Wang Yidi He Lizhi Wang

In this study, the redundant and irrelevant features contained in a multi-dimensional feature parameter set reduce the information fusion performance of the subspace learning algorithm. To solve this problem, an unsupervised feature parameter selection method based on mutual information (MI) and fractal dimension is proposed. The key to this method is an importance ordering algorithm that comprehensively considers the relevance and redundancy of features; a fractal dimension-based evaluation criterion for feature parameter subsets is then adopted to obtain the optimal feature parameter subset. To verify the validity of the proposed method, a brushless direct current (DC) motor performance degradation test was designed. Vibration sample data recorded during motor performance degradation were used as the data source, and motor health-fault diagnosis capacity and motor state prediction effect were used as evaluation indexes to compare the information fusion performance of the subspace learning algorithm before and after the use of the proposed method. According to the comparison results, the proposed method is able to eliminate highly redundant parameters that are less correlated with the feature parameters, thereby enhancing the information fusion performance of the subspace learning algorithm.

Entropy doi: 10.3390/e20090673

Authors: David M. Leitner

We review a theory that predicts the onset of thermalization in a quantum mechanical coupled non-linear oscillator system, which models the vibrational degrees of freedom of a molecule. A system of N non-linear oscillators perturbed by cubic anharmonic interactions exhibits a many-body localization (MBL) transition in the vibrational state space (VSS) of the molecule. This transition can occur at rather high energy in a sizable molecule because the density of states coupled by cubic anharmonic terms scales as N^3, in marked contrast to the total density of states, which scales as exp(aN), where a is a constant. The emergence of a MBL transition in the VSS is seen by analysis of a random matrix ensemble that captures the locality of coupling in the VSS, referred to as local random matrix theory (LRMT). Upon introducing higher order anharmonicity, the location of the MBL transition of even a sizable molecule, such as an organic molecule with tens of atoms, still lies at an energy that may exceed the energy to surmount a barrier to reaction, such as a barrier to conformational change. Illustrative calculations are provided, and some recent work on the influence of thermalization on thermal conduction in molecular junctions is also discussed.

Entropy doi: 10.3390/e20090672

Authors: Majdoleen Abu Qamar Nasruddin Hassan

The idea of the Q-neutrosophic soft set emerges from the neutrosophic soft set by upgrading the membership functions to a two-dimensional entity that indicates uncertainty, indeterminacy and falsity. Hence, it is able to deal with two-dimensional inconsistent, imprecise, and indeterminate information appearing in real-life situations. In this study, tools that measure the similarity, distance and degree of fuzziness of Q-neutrosophic soft sets are presented. The definitions of distance, similarity and entropy measures are introduced, and some formulas for Q-neutrosophic soft entropy are presented. The well-known Hamming and Euclidean distances and their normalized versions are generalized to make them compatible with the idea of the Q-neutrosophic soft set. The distance measure is subsequently used to define the measure of similarity. Lastly, we expound three applications of the measures of Q-neutrosophic soft sets by applying entropy and the similarity measure to medical diagnosis and decision-making problems.

Entropy doi: 10.3390/e20090671

Authors: António M. Lopes J. A. Tenreiro Machado

n/a

Entropy doi: 10.3390/e20090670

Authors: Tomasz Kapitaniak S. Alireza Mohammadi Saad Mekhilef Fawaz E. Alsaadi Tasawar Hayat Viet-Thanh Pham

In this paper, we introduce a new, three-dimensional chaotic system with one stable equilibrium. This system is a multistable dynamic system in which the strange attractor is hidden. We investigate its dynamic properties through equilibrium analysis, a bifurcation diagram and Lyapunov exponents. Such multistable systems are important in engineering. We perform an entropy analysis, parameter estimation and circuit design using this new system to show its feasibility and ability to be used in engineering applications.

Entropy doi: 10.3390/e20090669

Authors: Hongjun Guan Zongli Dai Shuang Guan Aiwu Zhao

Most existing high-order prediction models abstract logical rules based on historical discrete states, without considering historical inconsistency and fluctuation trends. In fact, these two characteristics are important for describing historical fluctuations. This paper proposes a model based on logical rules abstracted from historical dynamic fluctuation trends and the corresponding inconsistencies. In the logical rule training stage, the dynamic trend states of up and down are mapped to the truth-membership and false-membership dimensions of neutrosophic sets, respectively. Meanwhile, information entropy is employed to quantify the inconsistency of a period of history, which is mapped to the indeterminacy-membership of the neutrosophic sets. In the forecasting stage, the similarities among the neutrosophic sets are employed to locate the most similar left-hand side of the logical relationships. Therefore, the two characteristics of fluctuation trends and inconsistency assist in forecasting. The proposed model extends existing high-order fuzzy logical relationships (FLRs) to neutrosophic logical relationships (NLRs). Compared with traditional discrete high-order FLRs, the proposed NLRs have higher generality and handle the problem caused by the lack of rules. The proposed method is then applied to forecast the Taiwan Stock Exchange Capitalization Weighted Stock Index and the Hang Seng Index. The experimental conclusions indicate that the model has stable prediction ability for different data sets. Simultaneously, comparing the prediction error with other approaches also proves that the model has outstanding prediction accuracy and universality.

Entropy doi: 10.3390/e20090668

Authors: Umer Farooq Muhammad Idrees Afridi Muhammad Qasim D. C. Lu

The present research work explores the effects of suction/injection and viscous dissipation on entropy generation in the boundary layer flow of a hybrid nanofluid (Cu–Al2O3–H2O) over a nonlinear radially stretching porous disk. The energy dissipation function is added to the energy equation in order to incorporate the effects of viscous dissipation. The Tiwari and Das model is used in this work. The flow, heat transfer, and entropy generation analyses have been performed using a modified form of the Maxwell Garnett (MG) and Brinkman nanofluid models for the effective thermal conductivity and dynamic viscosity, respectively. Suitable transformations are utilized to obtain a set of self-similar ordinary differential equations. Numerical solutions are obtained using the shooting method and the MATLAB bvp4c solver, and the comparison of solutions shows excellent agreement. To examine the effects of the principal flow parameters, namely suction/injection, the Eckert number, and the solid volume fraction, different graphs are plotted and discussed. It is concluded that entropy generation inside the boundary layer of a hybrid nanofluid is high compared to that of a conventional nanofluid.

Entropy doi: 10.3390/e20090667

Authors: Fen Yang Ziming Kou Juan Wu Tengyu Li

In this paper, a novel weak fault feature extraction scheme is proposed to extract weak fault features of head sheave bearings of floor-type multi-rope friction mine hoists in strong noise environments. A mutual information-based sample entropy (MI-SE) is proposed to select the effective intrinsic mode function (IMF). The numerical simulation presented in this paper demonstrates that the improved complete ensemble empirical mode decomposition with adaptive noise (ICEEMDAN) performs poorly on weak signals under a strong noise background, and fault features cannot be identified clearly. The de-noised signal is decomposed into several IMFs by the ICEEMDAN method with the help of minimum entropy deconvolution (MED), which works as a pre-filter and increases the kurtosis value by about 3.2 times. The envelope spectrum of the effective IMF selected by the MI-SE method shows almost all fault features clearly. An analogous experimental system was built to verify the feasibility of the proposed scheme; its results also show that the proposed hybrid scheme performs better than ICEEMDAN or MED alone on weak fault feature extraction under a strong noise background. This paper provides a novel method to diagnose weak faults of slow-speed, heavy-load rolling bearings in strong noise environments.

Entropy doi: 10.3390/e20090666

Authors: Julien Ramousse Christophe Goupil

The operation of thermoelectric systems needs careful attention to ensure optimal power conversion depending on the application aims. As a ternary diagram of bithermal systems allows a synthetic graphical analysis of the performance attainable by any work–heat conversion system, thermoelectric system operation is plotted as a parametric curve as a function of the operating conditions (electric current and reservoir temperatures), based on the standard model of Ioffe. The thresholds of each operating mode (heat engine, heat pump, thermal dissipation, and forced thermal transfer), along with the optimal efficiencies and powers of the heat pump and heat engine modes, are characterized graphically and analytically as functions of the material properties and the operating conditions. The sensitivity of the performance aims (maximum efficiency vs. maximum power) to the operating conditions is thus highlighted. In addition, the specific contributions of each phenomenon involved in the semiconductor (reversible Seebeck effect, irreversible heat leakage by conduction and irreversible thermal dissipation by the Joule effect) are discussed in terms of entropy generation. Finally, the impact of the exo-irreversibilities on the performance is analyzed by taking the external thermal resistances into account.

Entropy doi: 10.3390/e20090665

Authors: Massimiliano Zanin Alejandro Rodríguez-González Ernestina Menasalvas Ruiz David Papo

Time irreversibility, i.e., the lack of invariance of the statistical properties of a system under time reversal, is a fundamental property of all systems operating out of equilibrium. Time reversal symmetry is associated with important statistical and physical properties and is related to the predictability of the system generating the time series. Over the past fifteen years, various methods to quantify time irreversibility in time series have been proposed, but these can be computationally expensive. Here, we propose a new method, based on permutation entropy, which is essentially parameter-free, temporally local, yields straightforward statistical tests, and has fast convergence properties. We apply this method to the study of financial time series, showing that stocks and indices present a rich irreversibility dynamics. We illustrate the comparative methodological advantages of our method with respect to a recently proposed method based on visibility graphs, and discuss the implications of our results for financial data analysis and interpretation.
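As a minimal illustration of the basic ingredient of such a method (a sketch of our own, not code from the paper), permutation entropy is the Shannon entropy of a series' ordinal-pattern distribution; irreversibility tests then compare the pattern statistics of a series against those of its time-reversed copy:

```python
from collections import Counter
from math import log

def permutation_entropy(series, order=3):
    # Map each sliding window of length `order` to its ordinal pattern
    # (the permutation that sorts it), then take the Shannon entropy
    # of the empirical pattern distribution (in nats).
    patterns = Counter(
        tuple(sorted(range(order), key=lambda k: series[i + k]))
        for i in range(len(series) - order + 1)
    )
    n = sum(patterns.values())
    return -sum((c / n) * log(c / n) for c in patterns.values())

# A monotone series realizes a single ordinal pattern: entropy 0.
pe = permutation_entropy(list(range(100)))  # 0.0
```

An alternating series such as 1, 2, 1, 2, … realizes exactly two equiprobable patterns, giving entropy log 2; a fully random series approaches the maximum log(order!).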

Entropy doi: 10.3390/e20090664

Authors: Ammar Alsabery Muneer Ismael Ali Chamkha Ishak Hashim

This numerical study considers the mixed convection and the inherent entropy generation in an Al2O3–water nanofluid filling a cavity containing a rotating conductive cylinder. The vertical walls of the cavity are wavy and are cooled isothermally. The horizontal walls are thermally insulated, except for a heat source segment located at the bottom wall. The dimensionless governing equations subject to the selected boundary conditions are solved numerically using the Galerkin finite-element method. The study is accomplished by inspecting different ranges of the physical and geometrical parameters, namely, the Rayleigh number (10^3 ≤ Ra ≤ 10^6), angular rotational velocity (0 ≤ Ω ≤ 750), number of undulations (0 ≤ N ≤ 4), volume fraction of Al2O3 nanoparticles (0 ≤ φ ≤ 0.04), and the length of the heat source (0.2 ≤ H ≤ 0.8). The results show that the rotation of the cylinder boosts the rate of heat exchange when the Rayleigh number is less than 5 × 10^5. The number of undulations affects the average Nusselt number for a still cylinder. The rate of heat exchange increases with the volume fraction of the Al2O3 nanoparticles and the length of the heater segment.

Entropy doi: 10.3390/e20090663

Authors: Xudong Wang Xiaofeng Hui

This paper applies effective transfer entropy to research the information transfer in the Chinese stock market around its crash in 2015. According to the market states, the entire period is divided into four sub-phases: the tranquil, bull, crash, and post-crash periods. Kernel density estimation is used to calculate the effective transfer entropy. Then, the information transfer network is constructed. Nodes' centralities and the directed maximum spanning trees of the networks are analyzed. The results show that, in the tranquil period, the information transfer is weak in the market. In the bull period, the strength and scope of the information transfer increases. The utility sector outputs a great deal of information and is the hub node for the information flow. In the crash period, the information transfer grows further. The market efficiency in this period is worse than that in the other three sub-periods. The information technology sector is the biggest information source, while the consumer staples sector receives the most information. The interactions of the sectors become more direct. In the post-crash period, information transfer declines but is still stronger than the tranquil time. The financial sector receives the largest amount of information and is the pivot node.
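
For intuition, transfer entropy can be estimated with a simple histogram (binned) estimator using first-order histories; the paper itself uses kernel density estimation, so the sketch below is only a simplified stand-in:

```python
from collections import Counter
from math import log2

def transfer_entropy(x, y, bins=2):
    """Binned transfer entropy TE(X -> Y) in bits, with one-step histories.

    A plain histogram estimator for illustration; the paper uses kernel
    density estimation instead.
    """
    def discretize(s):
        lo, hi = min(s), max(s)
        return [min(int((v - lo) / (hi - lo + 1e-12) * bins), bins - 1) for v in s]

    xs, ys = discretize(x), discretize(y)
    triples = Counter(zip(ys[1:], ys[:-1], xs[:-1]))   # (y_{t+1}, y_t, x_t)
    pairs_yy = Counter(zip(ys[1:], ys[:-1]))           # (y_{t+1}, y_t)
    pairs_yx = Counter(zip(ys[:-1], xs[:-1]))          # (y_t, x_t)
    singles = Counter(ys[:-1])                         # y_t
    n = len(ys) - 1
    te = 0.0
    for (y1, y0, x0), c in triples.items():
        p_joint = c / n                                # p(y_{t+1}, y_t, x_t)
        p_cond_full = c / pairs_yx[(y0, x0)]           # p(y_{t+1} | y_t, x_t)
        p_cond_self = pairs_yy[(y1, y0)] / singles[y0] # p(y_{t+1} | y_t)
        te += p_joint * log2(p_cond_full / p_cond_self)
    return te

# y copies x with a one-step lag, so x_t fully determines y_{t+1}
x = [0, 1, 1, 0, 1, 0, 0, 1, 1, 0] * 20
y = [0] + x[:-1]
print(transfer_entropy(x, y))  # close to 1 bit
```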

Entropy doi: 10.3390/e20090662

Authors: Edgar Parker

After the 2008 financial collapse, the now popular measure of implied systemic risk called the absorption ratio was introduced. This statistic measures how closely the economy's markets are coupled. The more closely financial markets are coupled, the more susceptible they are to systemic collapse. A new alternative measure of financial market health, the implied information processing ratio or entropic efficiency of the economy, was derived using concepts from information theory. This new entropic measure can also be useful in predicting economic downturns and measuring systematic risk. In the current work, the relationship between these two ratios and the types of risk they capture is explored. Potential methods of the joint use of these different measures to optimally reduce systemic and systematic risk are introduced.
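
In its usual formulation (due to Kritzman and co-authors), the absorption ratio is the fraction of total return variance "absorbed" by the first few eigenvectors of the asset covariance matrix; a minimal sketch with synthetic data:

```python
import numpy as np

def absorption_ratio(returns, n_vectors):
    """Fraction of total return variance captured by the largest
    n_vectors eigenvalues of the asset covariance matrix."""
    eigvals = np.sort(np.linalg.eigvalsh(np.cov(returns, rowvar=False)))[::-1]
    return eigvals[:n_vectors].sum() / eigvals.sum()

rng = np.random.default_rng(0)
market = rng.normal(size=(500, 1))           # one shared market factor
noise = 0.1 * rng.normal(size=(500, 10))     # small idiosyncratic moves
coupled_returns = market + noise             # ten tightly coupled "assets"
print(absorption_ratio(coupled_returns, 1))  # close to 1: markets move together
```

A high ratio means a single source of risk drives most of the variance, which is the coupling the abstract refers to.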

Entropy doi: 10.3390/e20090661

Authors: Andrea Matera Rahif Kassab Osvaldo Simeone Umberto Spagnolini

This paper considers the coexistence of Ultra Reliable Low Latency Communications (URLLC) and enhanced Mobile BroadBand (eMBB) services in the uplink of a Cloud Radio Access Network (C-RAN) architecture based on the relaying of radio signals over analog fronthaul links. While Orthogonal Multiple Access (OMA) to the radio resources enables the isolation and the separate design of different 5G services, Non-Orthogonal Multiple Access (NOMA) can enhance the system performance by sharing wireless and fronthaul resources. This paper provides an information-theoretic perspective on the performance of URLLC and eMBB traffic under both OMA and NOMA. The analysis focuses on standard cellular models with additive Gaussian noise links and a finite inter-cell interference span, and it accounts for different decoding strategies such as puncturing, Treating Interference as Noise (TIN) and Successive Interference Cancellation (SIC). Numerical results demonstrate that, for the considered analog fronthauling C-RAN architecture, NOMA achieves higher eMBB rates with respect to OMA, while guaranteeing reliable low-rate URLLC communication with minimal access latency. Moreover, NOMA under SIC is seen to achieve the best performance, while, unlike the case with digital capacity-constrained fronthaul links, TIN always outperforms puncturing.

Entropy doi: 10.3390/e20090660

Authors: Román Baravalle Osvaldo A. Rosso Fernando Montani

The electroencephalogram (EEG) is an electrophysiological monitoring method that allows us to glimpse the electrical activity of the brain. Neural oscillation patterns are perhaps the most salient feature of EEG, as they are rhythmic activities of the brain that can be generated by interactions across neurons. Large-scale oscillations can be measured by EEG as the different oscillation patterns reflected within the different frequency bands, and can provide us with new insights into brain functions. In order to understand how information about the rhythmic activity of the brain during visuomotor/imagined cognitive tasks is encoded in the brain, we precisely quantify the different features of the oscillatory patterns considering the Shannon–Fisher plane H × F. This allows us to distinguish the dynamics of rhythmic activities of the brain, showing that the Beta band facilitates information transmission during visuomotor/imagined tasks.

Entropy doi: 10.3390/e20090659

Authors: Stephen Fox Adrian Kotelba Ilkka Niskanen

Entropy in factories is situated. For example, there can be numerous different ways of picking, orientating, and placing physical components during assembly work. Physical components can be redesigned to increase the Information Gain they provide and so reduce situated entropy in assembly work. Also, situated entropy is affected by the extent of knowledge of those doing the work. For example, work can be done by knowledgeable experts or by beginners who lack knowledge about physical components, etc. The number of different ways that work can be done and the knowledge of the worker combine to affect cognitive load. Thus, situated entropy in factories relates to situated cognition, within which knowledge is bound to physical contexts and knowing is inseparable from doing. In this paper, six contributions are provided for modelling situated entropy in factories. First, theoretical frameworks are brought together to provide a conceptual framework for modelling. Second, the conceptual framework is related to physical production using practical examples. Third, Information Theory mathematics is applied to the examples and a preliminary methodology is presented for modelling in practice. Fourth, physical artefacts in factory production are reframed as carriers of Information Gain and situated entropy, which may or may not combine as Net Information Gain. Fifth, situated entropy is related to different types of cognitive factories that involve different levels of uncertainty in production operations. Sixth, the need to measure Net Information Gain in the introduction of new technologies for embodied and extended cognition is discussed in relation to a taxonomy for distributed cognition situated in factory production. Overall, modelling of situated entropy is introduced as an opportunity for improving the planning and control of factories that deploy human cognition and cognitive technologies, including assembly robotics.

Entropy doi: 10.3390/e20090658

Authors: Łukasz Kuśmierz Bartłomiej Dybiec Ewa Gudowska-Nowak

Scale-free Lévy motion is a generalized analogue of the Wiener process. Its time derivative extends the notion of "white noise" to non-Gaussian noise sources, and as such, it has been widely used to model natural signal variations described by an overdamped Langevin stochastic differential equation. Here, we consider the dynamics of an archetypal model: a Brownian-like particle is driven by external forces, and noise is represented by uncorrelated Lévy fluctuations. An unperturbed system of that form eventually attains a steady state which is uniquely determined by the set of parameter values. We show that the analyzed Markov process with the stability index α < 2 violates detailed balance, i.e., its stationary state is characterized by a stationary probability density together with a nonvanishing probability current. We discuss consequences of the non-Gibbsian character of the stationary state of the system and its impact on the general form of the fluctuation–dissipation theorem derived for weak external forcing.

Entropy doi: 10.3390/e20090657

Authors: Young-Sik Kim

Since entropy is a popular randomness measure, there have been many studies on the estimation of entropies from given random samples. In this paper, we propose an estimation method for the Rényi entropy of order α. Since the Rényi entropy of order α is a generalized entropy measure that includes the Shannon entropy as a special case, the proposed estimation method can detect any significant deviation of an ergodic stationary random source's output. It is shown that the expected test value of the proposed scheme is equivalent to the Rényi entropy of order α. After deriving a general representation of the parameters of the proposed estimator, we discuss particular orders of the Rényi entropy, such as α → 1, α = 1/2, and α = 2. Because the Rényi entropy of order 2 is the most popular one, we present an iterative estimation method for applications with stringent resource restrictions.
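
For reference, the Rényi entropy of order α of a discrete distribution is H_α(p) = (1/(1−α)) log Σᵢ pᵢ^α, recovering the Shannon entropy in the limit α → 1. A direct plug-in computation (not the paper's estimator, which works from samples of an unknown source):

```python
from math import log

def renyi_entropy(p, alpha):
    """Rényi entropy of order alpha (in nats) of a probability vector p."""
    if alpha == 1.0:
        # The limit alpha -> 1 recovers the Shannon entropy
        return -sum(pi * log(pi) for pi in p if pi > 0)
    return log(sum(pi ** alpha for pi in p)) / (1.0 - alpha)

uniform = [0.25] * 4
# For a uniform distribution, every order gives log(n)
print(renyi_entropy(uniform, 2.0))   # = log(4)
print(renyi_entropy(uniform, 0.5))   # = log(4)
print(renyi_entropy(uniform, 1.0))   # = log(4)
```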

Entropy doi: 10.3390/e20090656

Authors: Arkady Plotnitsky

The article reconsiders quantum theory in terms of the following principle, which can be symbolically represented as QUANTUMNESS → PROBABILITY → ALGEBRA and will be referred to as the QPA principle. The principle states that the quantumness of physical phenomena, that is, the specific character of physical phenomena known as quantum, implies that our predictions concerning them are irreducibly probabilistic, even in dealing with quantum phenomena resulting from the elementary individual quantum behavior (such as that of elementary particles), which in turn implies that our theories concerning these phenomena are fundamentally algebraic, in contrast to more geometrical classical or relativistic theories, although these theories, too, have an algebraic component to them. It follows that one needs to find an algebraic scheme able to make these predictions in a given quantum regime. Heisenberg was the first to accomplish this in the case of quantum mechanics, as matrix mechanics, whose matrix character testified to his algebraic method, as Einstein characterized it. The article explores the implications of the Heisenberg method and of the QPA principle for quantum theory, and for the relationships between mathematics and physics there, from a nonrealist or, in terms of this article, "reality-without-realism" or RWR perspective, defining the RWR principle, thus joined to the QPA principle.

Entropy doi: 10.3390/e20090655

Authors: Yuji Ikeda Fritz Körmann Isao Tanaka Jörg Neugebauer

Medium and high entropy alloys (MEAs and HEAs) based on 3d transition metals, such as face-centered cubic (fcc) CrCoNi and CrMnFeCoNi alloys, reveal remarkable mechanical properties. The stacking fault energy (SFE) is one of the key ingredients that controls the underlying deformation mechanism and hence the mechanical performance of materials. Previous experiments and simulations have therefore been devoted to determining the SFEs of various MEAs and HEAs. The impact of the local chemical environment in the vicinity of the stacking faults is, however, still not fully understood. In this work, we investigate the impact of compositional fluctuations in the vicinity of stacking faults for two prototype fcc MEAs and HEAs, namely CrCoNi and CrMnFeCoNi, by employing first-principles calculations. Depending on the chemical composition close to the stacking fault, the intrinsic SFEs vary over a range of more than 150 mJ/m² for both alloys, which indicates the presence of a strong driving force to promote particular types of chemical segregation towards the intrinsic stacking faults in MEAs and HEAs. Furthermore, the dependence of the intrinsic SFEs on local chemical fluctuations reveals a highly non-linear behavior, resulting in a non-trivial interplay of local chemical fluctuations and SFEs. This sheds new light on the importance of controlling chemical fluctuations via tuning, e.g., the annealing conditions to obtain the desired mechanical properties for MEAs and HEAs.

Entropy doi: 10.3390/e20090654

Authors: Sebastian Haas Mike Mosbacher Oleg N. Senkov Michael Feuerbacher Jens Freudenberger Senol Gezgin Rainer Völkl Uwe Glatzel

We determined the entropy of high entropy alloys by investigating single-crystalline nickel and five high entropy alloys: two fcc alloys, two bcc alloys and one hcp alloy. Since the configurational entropy of these single-phase alloys differs from that of alloys based on a single base element, it is important to quantify the entropy. Using differential scanning calorimetry, cp measurements were carried out from −170 °C to the materials' solidus temperatures TS. From these experiments, we determined the thermal entropy and compared it to the configurational entropy for each of the studied alloys. We applied the rule of mixtures to predict molar heat capacities of the alloys at room temperature, which were in good agreement with the Dulong-Petit law. The molar heat capacity of the studied alloys was about three times the universal gas constant, hence the thermal entropy was the major contribution to the total entropy. The configurational entropy, due to the chemical composition and number of components, contributes less on the absolute scale. The thermal entropy has approximately equal values for all alloys tested by DSC, while the crystal structure shows a small effect on their order. Finally, the contributions of entropy and enthalpy to the Gibbs free energy were calculated and examined, and it was found that the stabilization of the solid solution phase in high entropy alloys is mostly caused by the increased configurational entropy.
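
The two reference quantities named here have simple closed forms: the ideal configurational mixing entropy S_conf = −R Σ xᵢ ln xᵢ, and the Dulong-Petit molar heat capacity of roughly 3R per mole of atoms. A quick check of the numbers:

```python
from math import log

R = 8.314  # J/(mol·K), universal gas constant

def configurational_entropy(mole_fractions):
    """Ideal configurational mixing entropy: S_conf = -R * sum(x_i ln x_i)."""
    return -R * sum(x * log(x) for x in mole_fractions if x > 0)

# Equiatomic five-component alloy: S_conf = R ln 5
print(round(configurational_entropy([0.2] * 5), 2))  # 13.38 J/(mol·K)

# Dulong-Petit molar heat capacity, about 3R per mole of atoms
print(round(3 * R, 2))                                # 24.94 J/(mol·K)
```

The comparison in the abstract is between these two scales: 3R per kelvin of heating dominates the fixed R ln n mixing term.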

Entropy doi: 10.3390/e20090653

Authors: Yu Yao Junhui Zhao Lenan Wu

A new strategy for optimizing the waveforms of cognitive radar under a transmitted power constraint is presented. Our scheme enhances the performance of target estimation by minimizing the mean-square error (MSE) of the estimates of the target scattering coefficients (TSC) based on Kalman filtering, and then minimizing the mutual information (MI) between the radar target echoes at successive time instants. The two steps are the optimal design of the transmission waveform and the selection of a reasonable waveform from the ensemble for emission, respectively. The waveform design technique addresses the problems of target detection and parameter estimation in intelligent transportation systems (ITS), where the features of target information obtained from different sensors need to be extracted. As the number of iterations increases, simulation results show better TSC estimation from the radar scene with the proposed approach as compared with the traditional waveform optimization algorithm. In addition, the proposed algorithm results in an improved target detection probability.

Entropy doi: 10.3390/e20090652

Authors: Rusya Iryanti Yahaya Norihan Md Arifin Siti Suzilliana Putri Mohamed Isa

Two-dimensional magnetohydrodynamic (MHD) stagnation point flow of an incompressible Casson fluid over a shrinking sheet is studied. In the present study, homogeneous-heterogeneous reactions, suction and slip effects are considered. Similarity variables are introduced to transform the governing partial differential equations into non-linear ordinary differential equations. The transformed equations and boundary conditions are then solved using the bvp4c solver in MATLAB. The local skin friction coefficient is tabulated for different values of the suction and shrinking parameters. The profiles of fluid velocity and concentration for various parameters are illustrated. Two solutions were found within certain ranges of the parameters. The bvp4c solver was then used to perform a stability analysis on the dual solutions. Based on the results, the first solution was more stable and physically meaningful than the other solution. The skin friction coefficient increased when suction increased, but decreased when the magnitude of the shrinking parameter increased. Meanwhile, the velocity and concentration profiles increased in the presence of a magnetic field. It is also noted that the higher the strength of the homogeneous-heterogeneous reactions, the lower the concentration of reactants.

Entropy doi: 10.3390/e20090651

Authors: Piotr Weber Piotr Bełdowski Martin Bier Adam Gadomski

We study the entropy production that is associated with the growing or shrinking of a small granule in, for instance, a colloidal suspension or in an aggregating polymer chain. A granule will fluctuate in size when the energy of binding is comparable to kBT, which is the "quantum" of Brownian energy. Especially for polymers, the conformational energy landscape is often rough and has been commonly modeled as being self-similar in its structure. The subdiffusion that emerges in such a high-dimensional, fractal environment leads to a Fokker–Planck equation with a fractional time derivative. We set up such a so-called fractional Fokker–Planck equation for the aggregation into granules. From that Fokker–Planck equation, we derive an expression for the entropy production of a growing granule.

Entropy doi: 10.3390/e20090650

Authors: Paul M. Baggenstoss

The maximum entropy principle introduced by Jaynes proposes that a data distribution should maximize the entropy subject to constraints imposed by the available knowledge. Jaynes provided a solution for the case when constraints were imposed on the expected value of a set of scalar functions of the data. These expected values are typically moments of the distribution. This paper describes how the method of maximum entropy PDF projection can be used to generalize the maximum entropy principle to constraints on the joint distribution of this set of functions.

Entropy doi: 10.3390/e20090649

Authors: Miguel A. Fuentes

In this work, we show that it is possible to obtain important ubiquitous physical characteristics when an aggregation of many systems is taken into account. We discuss the possibility of obtaining not only an anomalous diffusion process, but also a non-linear diffusion equation that leads to a probability distribution, when using a set of non-Markovian processes. This probability distribution shows power law behavior in the structure of its tails. It also reflects the anomalous transport characteristics of the ensemble of particles. This ubiquitous behavior, with a power law in the diffusive transport and in the structure of the probability distribution, is related to a fast fluctuating phenomenon present in the noise parameter. We discuss all of the previous results using a financial time series example.

Entropy doi: 10.3390/e20090648

Authors: Chun-Huei Tsau Meng-Chi Tsai

The effects of niobium and molybdenum additions on the microstructures, hardness and corrosion behavior of CrFeCoNi(Nb,Mo) alloys were investigated. All of the CrFeCoNi(Nb,Mo) alloys displayed dendritic microstructures. The dendrites of the CrFeCoNiNb and CrFeCoNiNb0.5Mo0.5 alloys were a hexagonal close-packed (HCP) phase, and the interdendrites were a eutectic structure of HCP and face-centered cubic (FCC) phases. Additionally, the dendrites of the CrFeCoNiMo alloy were a simple cubic (SC) phase, and the interdendrites were a eutectic structure of SC and FCC phases. The volume fractions of dendrites and interdendrites in these alloys were calculated. The influence of the dendrite volume fraction on the overall hardness of the alloys is also discussed. The CrFeCoNiNb alloy had the largest volume fraction of dendrites and thus the highest hardness among these alloys. The CrFeCoNi(Nb,Mo) alloys also showed better corrosion resistance in 1 M H2SO4 and 1 M NaCl solutions compared with commercial 304 stainless steel. The CrFeCoNiNb0.5Mo0.5 alloy possessed the best corrosion resistance in these solutions among the CrFeCoNi(Nb,Mo) alloys.

Entropy doi: 10.3390/e20090647

Authors: Stefano Antonio Gattone Angela De Sanctis Stéphane Puechmorel Florence Nicol

In this paper, the problem of clustering rotationally invariant shapes is studied and a solution using Information Geometry tools is provided. Landmarks of a complex shape are defined as probability densities in a statistical manifold. Then, in the setting of shape clustering through a K-means algorithm, the discriminative power of two different shape distances is evaluated. The first, derived from the Fisher–Rao metric, is related to the minimization of information in the Fisher sense, and the other is derived from the Wasserstein distance, which measures the minimal transportation cost. A modification of the K-means algorithm is also proposed, which allows the variances to vary not only among the landmarks but also among the clusters.
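
For univariate Gaussian densities, the 2-Wasserstein distance has a particularly simple closed form (the Fisher–Rao distance, by contrast, involves hyperbolic geometry). A sketch of this special case only, not the paper's full landmark-clustering procedure:

```python
from math import sqrt

def wasserstein2_gaussian(mu1, sigma1, mu2, sigma2):
    """2-Wasserstein distance between two univariate Gaussians:
    W2^2 = (mu1 - mu2)^2 + (sigma1 - sigma2)^2."""
    return sqrt((mu1 - mu2) ** 2 + (sigma1 - sigma2) ** 2)

# The distance grows with both location mismatch and scale mismatch
print(wasserstein2_gaussian(0, 1, 3, 1))  # 3.0 (pure location shift)
print(wasserstein2_gaussian(0, 1, 0, 2))  # 1.0 (pure scale change)
```

Plugging such a closed-form distance into the assignment step of K-means is what makes density-valued landmarks computationally tractable.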

Entropy doi: 10.3390/e20090646

Authors: Anna Maria Manzoni Sebastian Haas Haneen Daoud Uwe Glatzel Christiane Förster Nelia Wanderka

Compositionally complex alloys, or high entropy alloys, are good candidates for applications at higher temperatures in gas turbines. After their introduction, the equiatomic Al17Co17Cr17Cu17Fe17Ni17 (at.%) alloy served as a starting material, and a long optimization road finally led to the recently optimized Al10Co25Cr8Fe15Ni36Ti6 (at.%) alloy, which shows promising mechanical properties. Investigations of the as-cast state and after different heat treatments focus on the evolution of the microstructure and provide an overview of some mechanical properties. The dendritic solidification provides two phases in the dendritic cores and two different ones in the interdendritic regions. Three of the four phases remain after heat treatments. Homogenization and subsequent annealing produce a γ-γ′ based microstructure, similar to Ni-based superalloys. The γ phase is Co-Cr-Fe rich and the γ′ phase is Al-Ni-Ti rich. The understanding of the mechanical behavior of the investigated alloy is supported and enhanced by the study of the different phases and their nanohardness measurements. The observations are compared with mechanical and microstructural data from commercial Ni-based superalloys, Co-based alloys, and Co-Ni-based alloys at the desired application temperature of ~800 °C.

Entropy doi: 10.3390/e20090645

Authors: Paolo De Gregorio Lamberto Rondoni

From basic principles, we review some fundamentals of entropy calculations, some of which are implicit in the literature. We mainly deal with microcanonical ensembles to effectively compare the counting of states in continuous and discrete settings. When dealing with non-interacting elements, this effectively reduces the calculation of the microcanonical entropy to counting the number of certain partitions, or compositions of a number. This is true in the literal sense, when quantization is assumed, even in the classical limit. Thus, we build on a moderately dated, ingenious mathematical work of Haselgrove and Temperley on counting the partitions of an arbitrarily large positive integer into a fixed (but still large) number of summands, and show that it allows us to exactly calculate the low energy/temperature entropy of a one-dimensional Bose–Einstein gas in a box. Next, aided by the asymptotic analysis of the number of compositions of an integer as the sum of three squares, we estimate the entropy of the three-dimensional problem. For each selection of the total energy, there is a very sharp optimal number of particles to realize that energy. Therefore, the entropy is 'large' and almost independent of the number of particles, when the particles exceed that number. This number scales as the energy to the power 2/3 in one dimension, and 3/5 in three dimensions. In the one-dimensional case, the threshold approaches zero temperature in the thermodynamic limit, but it is finite for mesoscopic systems. Below that value, we studied the intermediate stage, before the number of particles becomes a strong limiting factor for entropy optimization. We apply the results on moments of partitions by Coons and Kirsten to calculate the relative fluctuations of the ground state and excited states occupation numbers. At temperatures much lower than the threshold, these fluctuations vanish in all dimensions. We briefly review some of the same results in the grand canonical ensemble to show to what extent they differ.
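
The counting problem at the core of this argument, partitions of an integer n into exactly k positive parts, satisfies the classical recurrence p(n, k) = p(n−1, k−1) + p(n−k, k) (remove a 1-part, or subtract 1 from every part). A direct implementation, purely to illustrate the combinatorial object, not the Haselgrove-Temperley asymptotics:

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def partitions(n, k):
    """Number of partitions of n into exactly k positive parts.
    Recurrence: p(n, k) = p(n-1, k-1) + p(n-k, k)."""
    if k == 0:
        return 1 if n == 0 else 0
    if n < k:
        return 0
    return partitions(n - 1, k - 1) + partitions(n - k, k)

# Partitions of 7 into 3 parts: 5+1+1, 4+2+1, 3+3+1, 3+2+2
print(partitions(7, 3))  # 4
```

In the microcanonical picture, n plays the role of the quantized total energy and k the number of particles sharing it.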

Entropy doi: 10.3390/e20090644

Authors: Andrew J. Majda Nan Chen

Complex multiscale systems are ubiquitous in many areas. This research expository article discusses the development and applications of a recent information-theoretic framework as well as novel reduced-order nonlinear modeling strategies for understanding and predicting complex multiscale systems. The topics include the basic mathematical properties and qualitative features of complex multiscale systems, statistical prediction and uncertainty quantification, state estimation or data assimilation, and coping with the inevitable model errors in approximating such complex systems. Here, the information-theoretic framework is applied to rigorously quantify the model fidelity, model sensitivity and information barriers arising from different approximation strategies. It also succeeds in assessing the skill of filtering and predicting complex dynamical systems and overcomes the shortcomings in traditional path-wise measurements such as the failure in measuring extreme events. In addition, information theory is incorporated into a systematic data-driven nonlinear stochastic modeling framework that allows effective predictions of nonlinear intermittent time series. Finally, new efficient reduced-order nonlinear modeling strategies combined with information theory for model calibration provide skillful predictions of intermittent extreme events in spatially-extended complex dynamical systems. The contents here include the general mathematical theories, effective numerical procedures, instructive qualitative models, and concrete models from climate, atmosphere and ocean science.

Entropy doi: 10.3390/e20090643

Authors: Jaad Tannous Lina Anouti Rabih Sultan

When we examine the random growth of trees along a linear alley in a rural area, we wonder what governs the location of those trees, and hence the distance between adjacent ones. The same question arises when we observe the growth of metal electro-deposition trees along a linear cathode in a rectangular film of solution. We carry out different sets of experiments wherein zinc trees are grown by electrolysis from a linear graphite cathode in a 2D film of zinc sulfate solution toward a thick zinc metal anode. We measure the distance between adjacent trees, calculate the average for each set, and correlate the latter with probability and entropy. We also obtain a computational image of the grown trees as a function of parameters such as the cell size, number of particles, and sticking probability. The dependence of average distance on concentration is studied and assessed.

Entropy doi: 10.3390/e20090642

Authors: Erlandson Ferreira Saraiva Adriano Kamimura Suzuki Luis Aparecido Milan

In this paper, we study the performance of Bayesian computational methods for estimating the parameters of a bivariate survival model based on the Ali–Mikhail–Haq copula with marginal distributions given by Weibull distributions. The estimation procedure is based on Markov chain Monte Carlo (MCMC) algorithms. We present three versions of the Metropolis–Hastings algorithm: Independent Metropolis–Hastings (IMH), Random Walk Metropolis (RWM) and Metropolis–Hastings with a natural-candidate generating density (MH). Since the creation of a good candidate generating density in IMH and RWM may be difficult, we also describe how to update a parameter of interest using the slice sampling (SS) method. A simulation study was carried out to compare the performance of IMH, RWM and SS. A comparison was made using the sample root mean square error as an indicator of performance. Results obtained from the simulations show that the SS algorithm is an effective alternative to the IMH and RWM methods when simulating values from the posterior distribution, especially for small sample sizes. We also applied these methods to a real data set.
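
A Random Walk Metropolis step of the kind compared here can be sketched generically as follows; this is a textbook sampler applied to a toy target, not the authors' copula-model implementation:

```python
import random

def random_walk_metropolis(log_target, x0, n_samples, step=0.5):
    """Random Walk Metropolis: Gaussian proposals around the current state,
    accepted with probability min(1, target(proposal) / target(current))."""
    x, chain = x0, []
    for _ in range(n_samples):
        proposal = x + random.gauss(0.0, step)
        # Accept if log U < log acceptance ratio, using log U = -Exp(1)
        if log_target(proposal) - log_target(x) > -random.expovariate(1.0):
            x = proposal
        chain.append(x)  # on rejection, the current state is repeated
    return chain

random.seed(1)
# Toy target: a standard normal, specified only up to its normalizing constant
chain = random_walk_metropolis(lambda v: -0.5 * v * v, x0=0.0, n_samples=20000)
print(sum(chain) / len(chain))  # sample mean, near 0
```

IMH differs in drawing proposals from a fixed candidate density, and slice sampling avoids tuning a proposal altogether, which is the trade-off the simulation study examines.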

Entropy doi: 10.3390/e20090641

Authors: Olivier Rioul

Following a recent proof of Shannon's entropy power inequality (EPI), a comprehensive framework for deriving various EPIs for the Rényi entropy is presented that uses transport arguments from normal densities and a change of variable by rotation. Simple arguments are given to recover the previously known Rényi EPIs and derive new ones, by unifying a multiplicative form with constant c and a modification with exponent α of previous works. In particular, for log-concave densities, we obtain a simple transportation proof of a sharp varentropy bound.
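
For context, Shannon's EPI, of which the Rényi versions discussed here are generalizations, states that for independent random vectors X and Y in R^n:

```latex
N(X + Y) \;\ge\; N(X) + N(Y),
\qquad
N(X) \;=\; \frac{1}{2\pi e}\, e^{2 h(X)/n},
```

where h(X) is the differential entropy and N(X) the entropy power; equality holds if and only if X and Y are Gaussian with proportional covariances.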

Entropy doi: 10.3390/e20090640

Authors: Jorge F. Silva Milan S. Derpich

This work demonstrates a formal connection between density estimation with a data-rate constraint and the joint objective of fixed-rate universal lossy source coding and model identification introduced by Raginsky in 2008 (IEEE TIT, 2008, 54, 3059–3077). Using an equivalent learning formulation, we derive a necessary and sufficient condition over the class of densities for the achievability of the joint objective. The learning framework used here is the skeleton estimator, a rate-constrained learning scheme that offers achievable results for the joint coding and modeling problem by optimally adapting its learning parameters to the specific conditions of the problem. The results obtained with the skeleton estimator significantly extend the context where universal lossy source coding and model identification can be achieved, allowing for applications that move from the known case of parametric collections of densities with some smoothness and learnability conditions to the rich family of non-parametric L1-totally bounded densities. In addition, in the parametric case we are able to remove one of the assumptions that constrain the applicability of the original result, obtaining similar performance in terms of the distortion redundancy and per-letter rate overhead.

Entropy doi: 10.3390/e20090639

Authors: Osamah Abdullah

Modern indoor positioning system services are important technologies that play vital roles in modern life, providing many services such as recruiting emergency healthcare providers and for security purposes. Several large companies, such as Microsoft, Apple, Nokia, and Google, have researched location-based services. Wireless indoor localization is key for pervasive computing applications and network optimization. Different approaches have been developed for this technique using WiFi signals. WiFi fingerprinting-based indoor localization has been widely used due to its simplicity, and algorithms that fingerprint WiFi signals at separate locations can achieve accuracy within a few meters. However, a major drawback of WiFi fingerprinting is the variance in received signal strength (RSS), as it fluctuates with time and the changing environment. As the signal changes, so does the fingerprint database, which can change the distribution of the RSS (multimodal distribution). Thus, in this paper, we propose the use of the symmetrized Hölder divergence, a statistical entropy-based model that encapsulates both the skew Bhattacharyya divergence and the Cauchy–Schwarz divergence; these are closed-form formulas that can be used to measure the statistical dissimilarities between members of the same exponential family for signals that have multivariate distributions. Since the Hölder divergence is asymmetric, we used both left-sided and right-sided data so that the centroid can be symmetrized to obtain the minimizer of the proposed algorithm. The experimental results showed that the symmetrized Hölder divergence consistently outperformed the traditional k-nearest neighbor and probabilistic neural network methods. In addition, with the proposed algorithm, the position error accuracy was about 1 m in buildings.

]]>Entropy doi: 10.3390/e20090638

Authors: Michele Greco Giovanni Martino

Water discharge assessment in open channel flow is one of the most crucial issues for hydraulic engineers in the fields of water resource management, river dynamics, ecohydraulics, irrigation, and hydraulic structure design, among others. Recent studies state that the entropy velocity law allows an expeditious methodology for discharge estimation and rating curve development due to its simple mathematical formulation and implementation. Many works have been developed based on the one-dimensional (1-D) formulation of the entropy velocity profile, supporting measurements in the lab and the field for rating curve assessment; in recent years, the two-dimensional (2-D) formulation was also proposed and applied in studies of regular ditch flow, showing good performance. The present work compares the 1-D and 2-D approaches in order to give a general framework of the threats and opportunities related to the robust operational application of such laws. The analysis was carried out on a laboratory ditch with regular roughness, under controlled boundary conditions and at different stages, generating an exhaustive dashboard for better appraisal of the two approaches.
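
A common 1-D form of the entropy velocity law (Chiu-type) expresses the velocity profile through a single entropy parameter M; the sketch below assumes this standard form and the usual dimensionless vertical coordinate, and is illustrative rather than the exact formulation compared in the paper:

```python
import math

def entropy_velocity(xi, u_max, M):
    """1-D entropy velocity profile (Chiu-type law).

    xi    -- dimensionless coordinate in [0, 1] (0 at the bed, 1 where u = u_max)
    u_max -- maximum velocity in the cross-section
    M     -- entropy parameter controlling the profile shape
    """
    return (u_max / M) * math.log(1.0 + (math.exp(M) - 1.0) * xi)

def phi(M):
    """Mean-to-maximum velocity ratio implied by the 1-D entropy law."""
    return math.exp(M) / (math.exp(M) - 1.0) - 1.0 / M
```

The ratio `phi(M)` is what makes the law attractive for expeditious discharge estimation: once M is calibrated for a site, a single maximum-velocity measurement yields the mean velocity and hence the discharge.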

]]>Entropy doi: 10.3390/e20090637

Authors: Ricardo T. Páez-Hernández Juan Carlos Chimal-Eguía Delfino Ladino-Luna Juan Manuel Velázquez-Arcos

This paper presents a finite-time thermodynamic optimization based on three different optimization criteria: Maximum Power Output (MP), Maximum Efficient Power (MEP), and Maximum Power Density (MPD), for a simplified Curzon&ndash;Ahlborn engine that was first proposed by Agrawal. The results obtained for the MP are compared with those obtained using the MEP and MPD criteria. The results show that when a Newton heat transfer law is used, the efficiency values of the engine working in the MP regime are lower than the efficiency values obtained with the MEP and MPD regimes for all values of the parameter &tau; = T 2 / T 1 , where T 1 and T 2 are the hot and cold temperatures of the engine reservoirs ( T 2 &lt; T 1 ) , respectively. However, when a Dulong&ndash;Petit heat transfer law is used, the efficiency values of the engine working at MEP are larger than those obtained with the MP and the MPD regimes for all values of &tau; . Notably, when 0 &lt; &tau; &lt; 0.68 , the efficiency values for the MP regime are larger than those obtained with the MPD regime. Also, when 0.68 &lt; &tau; &lt; 1 , the efficiency values for the aforementioned regimes are similar. Importantly, the parameter &tau; plays a crucial role in the engine performance, providing guidance during the design of real power plants.
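
For the Newton (linear) heat transfer law, the maximum-power efficiency takes the well-known Curzon&ndash;Ahlborn form 1 &minus; &radic;&tau;, which lies below the Carnot limit 1 &minus; &tau; for every 0 &lt; &tau; &lt; 1, consistent with the MP regime giving the lowest efficiencies in that case. A minimal numerical sketch (the MEP and MPD expressions are not reproduced here):

```python
import math

def eta_carnot(tau):
    """Carnot (reversible) efficiency for reservoir temperature ratio tau = T2/T1."""
    return 1.0 - tau

def eta_mp_newton(tau):
    """Efficiency at Maximum Power under a Newton (linear) heat transfer law:
    the Curzon-Ahlborn value 1 - sqrt(tau)."""
    return 1.0 - math.sqrt(tau)
```

For example, at &tau; = 0.25 the Carnot limit is 0.75 while the maximum-power efficiency is only 0.5.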

]]>Entropy doi: 10.3390/e20090636

Authors: Kang-Seok Lee Hosung Park Jong-Seon No

In this paper, a new family of binary LRCs (BLRCs) with locality 2 and uneven availabilities for hot data is proposed, which has a high information symbol availability and low parity symbol availabilities for the local repair of distributed storage systems. Each information symbol of the proposed codes can be repaired locally by accessing only parity symbols, without accessing other information symbols. The proposed BLRCs with k = 4 achieve optimality on the information length for their given code length, minimum Hamming distance, locality, and availability in terms of the well-known theoretical upper bound.

]]>Entropy doi: 10.3390/e20090635

Authors: Riccardo Rao Massimiliano Esposito

We present a general method to identify an arbitrary number of fluctuating quantities which satisfy a detailed fluctuation theorem for all times within the framework of time-inhomogeneous Markovian jump processes. In doing so, we provide a unified perspective on many fluctuation theorems derived in the literature. By complementing the stochastic dynamics with a thermodynamic structure (i.e., using stochastic thermodynamics), we also express these fluctuating quantities in terms of physical observables.

]]>Entropy doi: 10.3390/e20090634

Authors: Nobuoki Eshima Minoru Tabata Claudio Giovanni Borroni

In factor analysis, factor contributions of latent variables are assessed conventionally by the sums of the squared factor loadings related to the variables. First, the present paper considers issues in the conventional method. Second, an alternative entropy-based approach for measuring factor contributions is proposed. The method measures the contribution of the common factor vector to the manifest variable vector and decomposes it into contributions of factors. A numerical example is also provided to demonstrate the present approach.
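
The conventional method the paper examines, factor contributions as column sums of squared loadings, can be sketched as follows (the loading matrix below is hypothetical, purely for illustration):

```python
import numpy as np

# Illustrative loading matrix: 4 manifest variables x 2 common factors
L = np.array([[0.8, 0.1],
              [0.7, 0.2],
              [0.2, 0.9],
              [0.1, 0.6]])

# Conventional factor contributions: sum of squared loadings per factor (column)
contributions = (L ** 2).sum(axis=0)
```

The entropy-based alternative proposed in the paper instead measures the contribution of the common factor vector to the manifest variable vector and decomposes it across factors; that construction is not reproduced here.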

]]>Entropy doi: 10.3390/e20090633

Authors: Airton Deppman Tobias Frederico Eugenio Megías Debora P. Menezes

The role played by non-extensive thermodynamics in physical systems has been under intense debate for the last decades. With many applications in several areas, Tsallis statistics have been discussed in detail in many works, triggering an interesting discussion on the deeper meaning of entropy and its role in complex systems. Several possible mechanisms that could give rise to non-extensive statistics have been formulated over the last several years; in particular, a fractal structure in thermodynamic functions was recently proposed as a possible origin for non-extensive statistics in physical systems. In the present work, we investigate the properties of such a fractal thermodynamic system and propose a diagrammatic method for the calculation of relevant quantities related to it. It is shown that a system with the fractal structure described here presents temperature fluctuations following an Euler gamma function, in accordance with previous works that provided evidence of the connection between those fluctuations and Tsallis statistics. Finally, the scale invariance of the fractal thermodynamic system is discussed in terms of the Callan&ndash;Symanzik equation.
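
The connection between gamma-distributed temperature fluctuations and Tsallis statistics can be checked numerically: averaging the Boltzmann factor over a Gamma-distributed inverse temperature of shape k yields exactly a q-exponential with q = 1 + 1/k. This is the standard superstatistics identity, shown here as an illustration rather than the paper's diagrammatic method:

```python
import numpy as np

rng = np.random.default_rng(0)
k, beta0, E = 5.0, 1.0, 2.0          # Gamma shape, mean inverse temperature, energy

# Inverse temperature fluctuating with a Gamma distribution of mean beta0
beta = rng.gamma(shape=k, scale=beta0 / k, size=1_000_000)
mc_average = float(np.exp(-beta * E).mean())   # Monte Carlo <exp(-beta * E)>

# Equivalent Tsallis q-exponential with entropic index q = 1 + 1/k
q = 1.0 + 1.0 / k
q_exponential = (1.0 + (q - 1.0) * beta0 * E) ** (1.0 / (1.0 - q))
```

With one million samples the Monte Carlo average agrees with the closed-form q-exponential to within sampling error.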

]]>Entropy doi: 10.3390/e20090632

Authors: Kei Hirose Hiroki Masuda

Relative error estimation has recently been used in regression analysis. A crucial issue with existing relative error estimation procedures is that they are sensitive to outliers. To address this issue, we employ the &gamma; -likelihood function, which is constructed through &gamma; -cross entropy while keeping the original statistical model in use. The estimating equation has a redescending property, a desirable property in robust statistics, for a broad class of noise distributions. To find a minimizer of the negative &gamma; -likelihood function, a majorize-minimization (MM) algorithm is constructed. The proposed algorithm is guaranteed to decrease the negative &gamma; -likelihood function at each iteration. We also derive the asymptotic normality of the corresponding estimator together with a simple consistent estimator of the asymptotic covariance matrix, so that we can readily construct approximate confidence sets. A Monte Carlo simulation is conducted to investigate the effectiveness of the proposed procedure. Real data analysis illustrates the usefulness of the proposed procedure.

]]>Entropy doi: 10.3390/e20090631

Authors: Marc Harper Dashiell Fryer

We propose the entropy of random Markov trajectories originating and terminating at the same state as a measure of the stability of a state of a Markov process. These entropies can be computed in terms of the entropy rates and stationary distributions of Markov processes. We apply this definition of stability to local maxima and minima of the stationary distribution of the Moran process with mutation and show that variations in population size, mutation rate, and strength of selection all affect the stability of the stationary extrema.
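
The statement that these trajectory entropies follow from entropy rates and stationary distributions corresponds, for return trajectories, to the Ekroot&ndash;Cover identity H_ii = H&#772;/&pi;_i, where H&#772; is the entropy rate of the chain. A sketch for a two-state chain (the transition probabilities are illustrative):

```python
import numpy as np

P = np.array([[0.9, 0.1],
              [0.2, 0.8]])           # illustrative 2-state transition matrix

# Stationary distribution: left eigenvector of P with eigenvalue 1
eigvals, eigvecs = np.linalg.eig(P.T)
pi = np.real(eigvecs[:, np.argmax(np.real(eigvals))])
pi = pi / pi.sum()

def row_entropy(row):
    """Shannon entropy (bits) of one row of the transition matrix."""
    p = row[row > 0]
    return float(-(p * np.log2(p)).sum())

entropy_rate = sum(pi[i] * row_entropy(P[i]) for i in range(len(P)))

# Entropy of random trajectories starting and ending at state i (Ekroot-Cover)
H_return = entropy_rate / pi
```

States with small stationary probability thus have large return-trajectory entropy, matching the use of these entropies as an (in)stability measure for extrema of the stationary distribution.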

]]>Entropy doi: 10.3390/e20090630

Authors: Julio Alberto López-Saldívar Octavio Castaños Eduardo Nahmad-Achar Ramón López-Peña Margarita A. Man’ko Vladimir I. Man’ko

A new geometric representation of qubit and qutrit states based on probability simplexes is used to describe the separability and entanglement properties of density matrices of two qubits. The Peres&ndash;Horodecki positive partial transpose (PPT) criterion and the concurrence inequalities are formulated as conditions that the introduced probability distributions must satisfy for entanglement to be present. A four-level system, where one or two states are inaccessible, is considered as an example of applying the elaborated probability approach in an explicit form. The areas of the three triadas of Malevich&rsquo;s squares for entangled states of two qubits are defined through the qutrit state, and the critical values of the sum of their areas are calculated. We always find an interval for the sum of the square areas, which provides the possibility of an experimental check of the entanglement of the system in terms of the probabilities.
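
In its standard matrix form (the paper recasts it through probability distributions), the Peres&ndash;Horodecki criterion states that a two-qubit density matrix is entangled exactly when its partial transpose has a negative eigenvalue. A minimal check on a Bell state:

```python
import numpy as np

# Bell state |Phi+> = (|00> + |11>)/sqrt(2) as a two-qubit density matrix
psi = np.zeros(4)
psi[0] = psi[3] = 1.0 / np.sqrt(2.0)
rho = np.outer(psi, psi)

def partial_transpose(rho):
    """Transpose the second qubit's indices of a 4x4 two-qubit density matrix."""
    r = rho.reshape(2, 2, 2, 2)          # indices (a, b, a', b'), b/b' on qubit B
    return r.transpose(0, 3, 2, 1).reshape(4, 4)

min_eig = np.linalg.eigvalsh(partial_transpose(rho)).min()
# A negative eigenvalue certifies entanglement (for 2x2 systems the criterion is exact)
```

For the maximally entangled Bell state the smallest eigenvalue of the partial transpose is &minus;1/2.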

]]>Entropy doi: 10.3390/e20090629

Authors: Yan-Qun Zhuo Yanshuang Guo Shunyun Chen Yuntao Ji Jin Ma

Field and experimental observations have shown that preslip undergoes a transition from multiple to single preslip zones, which implies the existence of a linkage of preslip zones before fault instability. However, observation of this linkage process, which is significant for understanding the mechanism of earthquake preparation, remained unrealized due to the limitations of the observation methods used in previous studies. In our experiments, detailed spatiotemporal evolutions of preslip were observed via a high-speed camera and a digital image correlation method. The normalized length of the preslip zones shows an increasing trend, while the normalized number of preslip zones (NN) shows an increase followed by a decrease, which indicates that the expansion of the preslip undergoes a transition from growth to linkage of the isolated preslip zones. The peak NN marks the initiation of the linkage of preslip zones. Both the linkage of the preslip zones and the decrease in the normalized information entropy of the fault displacement direction indicate a reduction in the spatial complexity of preslip as instability approaches. Furthermore, the influences of the dynamic adjustment of stress along the fault and the interactions between the asperities and preslip on the spatial complexity of preslip were also observed and analyzed.

]]>Entropy doi: 10.3390/e20090628

Authors: Yan Li Luca Pezzè Manuel Gessner Zhihong Ren Weidong Li Augusto Smerzi

Frequentist and Bayesian phase estimation strategies lead to conceptually different results on the state of knowledge about the true value of an unknown parameter. We compare the two frameworks and their sensitivity bounds for the estimation of an interferometric phase shift limited by quantum noise, considering both the case of a fixed and that of a fluctuating parameter. We point out that frequentist precision bounds, such as the Cram&eacute;r&ndash;Rao bound, do not apply to Bayesian strategies, and vice versa. In particular, we show that the Bayesian variance can overcome the frequentist Cram&eacute;r&ndash;Rao bound, which appears to be a paradoxical result if the conceptual difference between the two approaches is overlooked. Similarly, bounds for fluctuating parameters make no statement about the estimation of a fixed parameter.

]]>Entropy doi: 10.3390/e20090627

Authors: Andrea Murari Michele Lungaroni Emmanuele Peluso Pasquale Gaudio Ernesto Lerche Luca Garzotti Michela Gelfusa JET Contributors

Understanding the details of the correlation between time series is an essential step on the route to assessing the causal relation between systems. Traditional statistical indicators, such as the Pearson correlation coefficient and the mutual information, have some significant limitations. More recently, transfer entropy has been proposed as a powerful tool to understand the flow of information between signals. In this paper, the comparative advantages of transfer entropy, for determining the time horizon of causal influence, are illustrated with the help of synthetic data. The technique has been specifically revised for the analysis of synchronization experiments. The investigation of experimental data from thermonuclear plasma diagnostics proves the potential and limitations of the developed approach.
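
For discrete signals, transfer entropy reduces to a plug-in estimate over observed symbol histories. Below is a minimal sketch with history length 1 on synthetic binary data; it is illustrative only, and the technique revised in the paper for synchronization experiments is more elaborate:

```python
from collections import Counter
import math
import random

def transfer_entropy(x, y):
    """Plug-in estimate of TE_{X->Y} (bits) for discrete series, history length 1."""
    triples = Counter(zip(y[1:], y[:-1], x[:-1]))     # (y_{t+1}, y_t, x_t)
    pairs_yy = Counter(zip(y[1:], y[:-1]))            # (y_{t+1}, y_t)
    pairs_yx = Counter(zip(y[:-1], x[:-1]))           # (y_t, x_t)
    singles_y = Counter(y[:-1])
    n = len(x) - 1
    te = 0.0
    for (y1, y0, x0), c in triples.items():
        p_joint = c / n
        p_cond_full = c / pairs_yx[(y0, x0)]          # p(y_{t+1} | y_t, x_t)
        p_cond_y = pairs_yy[(y1, y0)] / singles_y[y0] # p(y_{t+1} | y_t)
        te += p_joint * math.log2(p_cond_full / p_cond_y)
    return te

random.seed(1)
x = [random.randint(0, 1) for _ in range(5000)]
# y follows x with a one-step lag (plus noise), so information flows X -> Y
y = [0] + [xi if random.random() < 0.9 else 1 - xi for xi in x[:-1]]
```

With `y` lagging `x`, the estimate is large in the X&rarr;Y direction and near zero in the reverse, matching the intuition of a directed flow of information with a definite time horizon.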

]]>Entropy doi: 10.3390/e20090626

Authors: Wenlong Fu Jiawen Tan Chaoshun Li Zubing Zou Qiankun Li Tie Chen

As crucial equipment in industrial manufacturing, the health status of rotating machinery affects production efficiency and device safety. Hence, it is of great significance to diagnose rotating machinery faults, which can help guarantee running stability and plan maintenance, thus promoting production efficiency and economic benefits. For this purpose, a hybrid fault diagnosis model with entropy-based feature extraction and a support vector machine (SVM) optimized by a chaos quantum sine cosine algorithm (CQSCA) is developed in this research. Firstly, the state-of-the-art variational mode decomposition (VMD) is utilized to decompose the vibration signals into sets of components, during which process the preset parameter K is confirmed with the central frequency observation method. Subsequently, the permutation entropy values of all components are computed to constitute the feature vectors corresponding to different kinds of signals. Later, the newly developed sine cosine algorithm (SCA) is employed and improved with chaotic initialization by a Duffing system and a quantum technique to optimize the SVM model, with which the fault pattern is recognized. Additionally, the availability of the SVM optimized with CQSCA was revealed in pattern recognition experiments. Finally, the proposed hybrid fault diagnosis approach was employed in engineering applications as well as contrastive analysis. The comparative results show that the proposed method achieved the best training accuracy of 99.5% and the best testing accuracy of 97.89%. Furthermore, it can be concluded from the boxplots of the different diagnosis methods that the stability and precision of the proposed method are superior to those of the others.
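
The permutation entropy used for the feature vectors is the Shannon entropy of Bandt&ndash;Pompe ordinal patterns. A minimal version, with the embedding delay fixed to 1:

```python
import math
from collections import Counter

def permutation_entropy(series, order=3):
    """Normalized permutation entropy of a 1-D series (Bandt-Pompe ordinal patterns)."""
    patterns = Counter(
        tuple(sorted(range(order), key=lambda k: series[i + k]))
        for i in range(len(series) - order + 1)
    )
    n = sum(patterns.values())
    h = -sum((c / n) * math.log2(c / n) for c in patterns.values())
    return h / math.log2(math.factorial(order))   # normalize to [0, 1]
```

A monotonic series produces a single ordinal pattern and hence zero entropy, while irregular vibration components spread probability over many patterns and score closer to 1.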

]]>Entropy doi: 10.3390/e20090625

Authors: Jieting Wu Feiyu Zhu Xin Liu Hongfeng Yu

Edge bundling is a promising graph visualization approach to simplifying the visual result of a graph drawing. Plenty of edge bundling methods have been developed to generate diverse graph layouts. However, it is difficult to defend an edge bundling method and its resulting layout against other edge bundling methods, as a clear theoretical evaluation framework is absent from the literature. In this paper, we propose an information-theoretic framework to evaluate the visual results of edge bundling techniques. We first illustrate the advantage of edge bundling visualizations for large graphs and pinpoint the ambiguity that arises in the drawn results. Second, we define and quantify the amount of information delivered by an edge bundling visualization from the underlying network using information theory. Third, we propose a new algorithm to evaluate the resulting layouts of edge bundling using the amount of mutual information between a raw network dataset and its edge bundling visualization. Comparison examples based on the proposed framework between different edge bundling techniques are presented.

]]>Entropy doi: 10.3390/e20090624

Authors: Yong Zhang Xue-Hui Yan Wei-Bing Liao Kun Zhao

In this study, (Al0.5CrFeNiTi0.25)Nx high-entropy films were prepared by reactive direct current (DC) magnetron sputtering at different N2 flow rates on silicon wafers. It is found that the structure of the (Al0.5CrFeNiTi0.25)Nx high-entropy films is amorphous when x = 0, and it transforms from amorphous to a face-centered-cubic (FCC) structure with increasing nitrogen content, while the bulk Al0.5CrFeNiTi0.25 counterpart prepared by casting features a body-centered-cubic (BCC) phase structure. The phase formation can be explained by the atomic size difference (&delta;). Without nitrogen, &delta; is approximately 6.4% for the five metallic elements, which is relatively large and might favor a BCC or ordered-BCC structure; with nitrogen, the metallic elements in this alloy system all tend to form nitrides such as TiN, CrN, AlN, and FeN. These nitride components are very similar in size and structure and dissolve in each other easily; thus, an FCC (Al-Cr-Fe-Ni-Ti)N solid solution forms. The calculated value of &delta; is approximately 23% for this multicomponent nitride solid solution. The (Al0.5CrFeNiTi0.25)Nx films achieve a pronounced hardness of 21.45 GPa and a Young&rsquo;s modulus of 253.8 GPa, which are obviously much higher than those of the as-cast Al0.5CrFeNiTi0.25 bulk alloys.
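
The atomic size difference follows the standard high-entropy-alloy formula &delta; = 100 &middot; sqrt(&Sigma;_i c_i (1 &minus; r_i / r&#772;)&sup2;), with r&#772; the composition-weighted mean radius. A sketch of the formula only; any radii passed in are the caller's data, and the values in the test are illustrative placeholders, not the paper's:

```python
import math

def size_difference(radii, fractions=None):
    """Atomic size difference delta (%): 100 * sqrt(sum_i c_i * (1 - r_i/r_bar)^2),
    where r_bar is the molar-fraction-weighted mean atomic radius.
    Equal molar fractions are assumed when `fractions` is omitted."""
    n = len(radii)
    c = fractions if fractions is not None else [1.0 / n] * n
    r_bar = sum(ci * ri for ci, ri in zip(c, radii))
    return 100.0 * math.sqrt(sum(ci * (1.0 - ri / r_bar) ** 2
                                 for ci, ri in zip(c, radii)))
```

Identical radii give &delta; = 0, and &delta; grows with the spread of radii, which is why the metallic mixture (&delta; &asymp; 6.4%) and the nitride mixture (&delta; &asymp; 23%) are reported as different regimes.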

]]>Entropy doi: 10.3390/e20090623

Authors: Alexander V. Mantzaris John A. Marich Tristin W. Halfman

The Schelling model of segregation allows for a general description of residential movements in an environment modeled by a lattice. The key factor is that occupants change positions until they are surrounded by a designated minimum number of similarly labeled residents. An analogy to the Ising model has been made in previous research, primarily due to the assumption that state changes depend upon the adjacent cell positions. This allows concepts produced in statistical mechanics to be applied to the Schelling model. Here, a methodology is presented to estimate the entropy of the model for different states of the simulation. A Monte Carlo estimate is obtained for the set of macrostates, defined as the different aggregate homogeneity satisfaction values across all residents, which allows an entropy value to be produced for each state. This produces a trace of the estimated entropy value for the states of the lattice configurations, displayed with each iteration. The results show that the initial random placements of residents have larger entropy values than the final states of the simulation, when the overall homogeneity of the residential locality is increased.
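
The Monte Carlo estimate over satisfaction macrostates can be sketched as below, assuming a torus lattice, a 4-cell neighborhood, a satisfaction threshold of 2, and uniformly random placements; all of these parameters are illustrative choices, not the paper's exact setup:

```python
import math
import random
from collections import Counter

random.seed(42)
SIZE, SAMPLES = 10, 5000

def satisfied_count(grid, threshold=2):
    """Number of residents with at least `threshold` same-label neighbors
    (4-neighborhood on a torus)."""
    total = 0
    for i in range(SIZE):
        for j in range(SIZE):
            same = sum(
                grid[(i + di) % SIZE][(j + dj) % SIZE] == grid[i][j]
                for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1))
            )
            total += same >= threshold
    return total

# Macrostate = aggregate satisfaction; estimate its entropy over random configurations
counts = Counter(
    satisfied_count([[random.randint(0, 1) for _ in range(SIZE)] for _ in range(SIZE)])
    for _ in range(SAMPLES)
)
entropy = -sum((c / SAMPLES) * math.log2(c / SAMPLES) for c in counts.values())
```

Random placements spread probability over many macrostates and so score a high entropy; as the dynamics drive residents toward homogeneous neighborhoods, the macrostate distribution concentrates and the estimated entropy falls, which is the trend the abstract reports.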

]]>