Entropy doi: 10.3390/e20070532

Authors: Leslie Glasser H. Donald Brooke Jenkins

n/a

Entropy doi: 10.3390/e20070531

Authors: Karmele Lopez-de-Ipina Jordi Solé-Casals Marcos Faúndez-Zanuy Pilar M. Calvo Enric Sesa Josep Roure Unai Martinez-de-Lizarduy Blanca Beitia Elsa Fernández Jon Iradi Joseba Garcia-Melero Alberto Bergareche

Among neural disorders related to movement, essential tremor has the highest prevalence; in fact, it is twenty times more common than Parkinson's disease. Drawing the Archimedes spiral is the gold-standard test to distinguish between the two pathologies. The aim of this paper is to select non-linear biomarkers based on the analysis of digital drawings. It belongs to a larger cross study for early diagnosis of essential tremor that also includes genetic information. The proposed automatic analysis system is a hybrid solution: Machine Learning paradigms combined with automatic selection of features based on statistical tests using medical criteria. Moreover, the selected biomarkers comprise not only commonly used linear features (static and dynamic), but also non-linear ones: Shannon entropy and Fractal Dimension. The results are promising, and the developed tool can easily be adapted to users; from social and economic points of view, it could be very helpful in real, complex environments.
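A minimal sketch of one of the non-linear biomarkers named above, Shannon entropy, computed from the amplitude histogram of a sampled drawing signal; the bin count and the toy signals are illustrative assumptions, not the authors' pipeline:

```python
import numpy as np

def shannon_entropy(signal, bins=16):
    """Shannon entropy (in bits) of a 1-D signal's amplitude histogram."""
    counts, _ = np.histogram(signal, bins=bins)
    p = counts / counts.sum()
    p = p[p > 0]  # drop empty bins to avoid log(0)
    return -np.sum(p * np.log2(p))

# Toy stand-ins for pressure samples from a digitized spiral drawing
rng = np.random.default_rng(0)
uniform_like = rng.uniform(size=1000)   # irregular, high-entropy signal
constant_like = np.full(1000, 0.5)      # constant, zero-entropy signal
print(shannon_entropy(uniform_like) > shannon_entropy(constant_like))  # True
```

A more irregular drawing signal spreads its samples over more histogram bins and therefore yields a higher entropy value.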

Entropy doi: 10.3390/e20070530

Authors: Amina-Aicha Khennaoui Adel Ouannas Samir Bendoukha Xiong Wang Viet-Thanh Pham

In this paper, we propose a fractional map based on the integer-order unified map. The chaotic behavior of the proposed map is analyzed by means of bifurcation plots, and experimental bounds are placed on the parameters and fractional order. Different control laws are proposed to force the states to zero asymptotically and to achieve the complete synchronization of a pair of fractional unified maps with identical or nonidentical parameters. Numerical results are used throughout the paper to illustrate the findings.

Entropy doi: 10.3390/e20070529

Authors: Simona Decu Stefan Haesen Leopold Verstraelen Gabriel-Eduard Vîlcu

In this article, we consider statistical submanifolds of Kenmotsu statistical manifolds of constant ϕ-sectional curvature. For such submanifolds, we investigate curvature properties. We establish some inequalities involving the normalized δ-Casorati curvatures (extrinsic invariants) and the scalar curvature (intrinsic invariant). Moreover, we prove that the equality cases of the inequalities hold if and only if the imbedding curvature tensors h and h∗ of the submanifold (associated with the dual connections) satisfy h = −h∗, i.e., the submanifold is totally geodesic with respect to the Levi–Civita connection.

Entropy doi: 10.3390/e20070528

Authors: Wiktor Jakowluk

In this paper, a novel method is proposed to design a free final time input signal, which is then used in the robust system identification process. The solution of the constrained optimal input design problem is based on the minimization of an extra state variable representing the free final time scaling factor, formulated in the Bolza functional form, subject to the D-efficiency constraint as well as the input energy constraint. The objective function used for the system identification model provides robustness with respect to outlying data and was constructed using the so-called Entropy-Like estimator. The perturbation time interval has a significant impact on the cost of a real-life system identification experiment. The contribution of this work is to examine the economic trade-off between the constraints imposed on the input signal design and the experiment duration when undertaking an identification experiment under real operating conditions. The methodology is applicable to a general class of systems and is supported by numerical examples. Illustrative examples of the Least Squares and the Entropy-Like estimators for system parameter validation, where measurements include additive white noise, are compared using ellipsoidal confidence regions.

Entropy doi: 10.3390/e20070527

Authors: Romain Brasselet Angelo Arleo

Categorization is a fundamental information processing phenomenon in the brain. It is critical for animals to compress an abundance of stimulations into groups to react quickly and efficiently. In addition to labels, categories possess an internal structure: the goodness measures how well any element belongs to a category. Interestingly, this categorization leads to an altered perception referred to as categorical perception: for a given physical distance, items within a category are perceived closer than items in two different categories. A subtler effect is the perceptual magnet: discriminability is reduced close to the prototypes of a category and increased near its boundaries. Here, starting from predefined abstract categories, we naturally derive the internal structure of categories and the phenomenon of categorical perception, using an information theoretical framework that involves both probabilities and pairwise similarities between items. Essentially, we suggest that pairwise similarities between items are to be tuned to render some predefined categories as well as possible. However, constraints on these pairwise similarities only produce an approximate matching, which explains concurrently the notion of goodness and the warping of perception. Overall, we demonstrate that similarity-based information theory may offer a global and unified principled understanding of categorization and categorical perception simultaneously.

Entropy doi: 10.3390/e20070526

Authors: Kevin Brown Paul Allopenna William Hunt Rachael Steiner Elliot Saltzman Ken McRae James Magnuson

Human speech perception involves transforming a continuous acoustic signal into discrete linguistically meaningful units (phonemes) while simultaneously causing a listener to activate words that are similar to the spoken utterance and to each other. The Neighborhood Activation Model posits that phonological neighbors (two forms [words] that differ by one phoneme) compete significantly for recognition as a spoken word is heard. This definition of phonological similarity can be extended to an entire corpus of forms to produce a phonological neighbor network (PNN). We study PNNs for five languages: English, Spanish, French, Dutch, and German. Consistent with previous work, we find that the PNNs share a consistent set of topological features. Using an approach that generates random lexicons with increasing levels of phonological realism, we show that even random forms with minimal relationship to any real language, combined with only the empirical distribution of language-specific phonological form lengths, are sufficient to produce the topological properties observed in the real language PNNs. The resulting pseudo-PNNs are insensitive to the level of linguistic realism in the random lexicons but quite sensitive to the shape of the form length distribution. We therefore conclude that “universal” features seen across multiple languages are really string universals, not language universals, and arise primarily due to limitations in the kinds of networks generated by the one-step neighbor definition. Taken together, our results indicate that caution is warranted when linking the dynamics of human spoken word recognition to the topological properties of PNNs, and that the investigation of alternative similarity metrics for phonological forms should be a priority.

Entropy doi: 10.3390/e20070525

Authors: Eesa Al Solami Musheer Ahmad Christos Volos Mohammad Najam Doja Mirza Mohd Sufyan Beg

In this paper, we present a novel method to construct cryptographically strong bijective substitution-boxes based on the complicated dynamics of a new hyperchaotic system. The new hyperchaotic system was found to have good characteristics when compared with other systems utilized for S-box construction. The performance assessment of the proposed S-box method was carried out based on criteria such as high nonlinearity, a good avalanche effect, the bit independence criterion, and low differential uniformity. The proposed method was also analyzed for the batch-generation of 8 × 8 S-boxes. The analyses found that through the proposed purely chaos-based method, an 8 × 8 S-box with a maximum average high nonlinearity of 108.5, or S-boxes with differential uniformity as low as 8, can be retrieved. Moreover, small-sized S-boxes with high nonlinearity and low differential uniformity are also obtainable. A performance comparison of the proposed method with recent S-box proposals proved its dominance and effectiveness for strong bijective S-box construction.

Entropy doi: 10.3390/e20070524

Authors: Yizhak Marcus

The standard entropies S°298 of deep eutectic solvents (DESs), which are liquid binary mixtures of a hydrogen bond acceptor component and a hydrogen bond donor one, are calculated from their molecular volumes, derived from their densities or crystal structures. These values are compared with those of the components, pro-rated according to the DES composition, to obtain the standard entropies of DES formation ΔfS. These quantities are positive, due to the increased number and kinds of hydrogen bonds present in the DESs relative to those in the components. The ΔfS values are also compared with the freezing point depressions of the DESs, ΔfusT/K, but no general conclusions on their mutual relationship could be drawn.
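The pro-rating step described above amounts to simple arithmetic: the entropy of formation is the DES entropy minus the mole-fraction-weighted component entropies. A sketch, with made-up entropy values and mole fractions (the numbers are illustrative, not from the paper):

```python
def entropy_of_des_formation(s_des, components):
    """Delta_f S = S(DES) - sum_i x_i * S(component_i), pro-rated by mole fraction."""
    return s_des - sum(x * s for x, s in components)

# Hypothetical 1:2 hydrogen bond acceptor / donor DES (mole fractions 1/3 and 2/3);
# entropy values in J K^-1 mol^-1 are illustrative only.
dfs = entropy_of_des_formation(250.0, [(1/3, 280.0), (2/3, 190.0)])
print(round(dfs, 1))  # 30.0
```

A positive result, as here, corresponds to the positive ΔfS values reported in the abstract.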

Entropy doi: 10.3390/e20070523

Authors: Jinshan Ma

A novel generalized grey target decision method for mixed attributes, based on the Kullback-Leibler (K-L) distance, is proposed. The proposed approach involves the following steps: first, all indices are converted into index binary connection number vectors; second, the two-tuple (determinacy, uncertainty) numbers originating from the index binary connection number vectors are obtained; third, the positive and negative target centers of the two-tuple (determinacy, uncertainty) numbers are calculated; then, the K-L distances of all alternatives to their positive and negative target centers are integrated by the Technique for Order Preference by Similarity to an Ideal Solution (TOPSIS) method; the final decision is based on the integrated value on a bigger-the-better basis. A case study exemplifies the proposed approach.
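The distance-integration steps above can be sketched as follows; the two-tuple construction is omitted and plain discrete distributions stand in for the connection-number vectors, so this is an illustrative reduction of the method, not the full procedure:

```python
import numpy as np

def kl(p, q, eps=1e-12):
    """Kullback-Leibler divergence D(p || q) for discrete distributions,
    with a small smoothing constant to avoid division by zero."""
    p, q = np.asarray(p, float) + eps, np.asarray(q, float) + eps
    p, q = p / p.sum(), q / q.sum()
    return float(np.sum(p * np.log(p / q)))

def topsis_closeness(alternatives, positive_center, negative_center):
    """Integrate K-L distances to both target centers TOPSIS-style;
    a larger closeness value is better (bigger-the-better)."""
    scores = []
    for a in alternatives:
        d_pos = kl(a, positive_center)
        d_neg = kl(a, negative_center)
        scores.append(d_neg / (d_pos + d_neg))
    return scores

alts = [[0.9, 0.1], [0.5, 0.5], [0.2, 0.8]]   # hypothetical alternatives
pos, neg = [1.0, 0.0], [0.0, 1.0]             # hypothetical target centers
scores = topsis_closeness(alts, pos, neg)
print(scores.index(max(scores)))  # index of the alternative nearest the positive center
```

The alternative closest to the positive target center and farthest from the negative one receives the highest integrated value.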

Entropy doi: 10.3390/e20070522

Authors: Yuanyuan Li Yanjing Sun Xinhua Huang Guanqiu Qi Mingyao Zheng Zhiqin Zhu

Multi-modality image fusion provides more comprehensive and sophisticated information in modern medical diagnosis, remote sensing, video surveillance, etc. Traditional multi-scale transform (MST) based image fusion solutions have difficulties in the selection of the decomposition level and suffer contrast loss in the fused image. At the same time, traditional sparse-representation (SR) based image fusion methods suffer from the weak representation ability of a fixed dictionary. In order to overcome these deficiencies of MST- and SR-based methods, this paper proposes an image fusion framework which integrates the nonsubsampled contourlet transform (NSCT) into sparse representation. In this fusion framework, NSCT is applied to decompose the source images into corresponding low- and high-pass coefficients. The low- and high-pass coefficients are fused using SR and the Sum-Modified-Laplacian (SML), respectively. The inverse NSCT then transforms the fused coefficients to obtain the final fused image. In this framework, principal component analysis (PCA) is implemented in dictionary training to reduce the dimension of the learned dictionary and the computation costs. A novel high-pass fusion rule based on SML is applied to suppress pseudo-Gibbs phenomena around singularities of the fused image. Compared to three mainstream image fusion solutions, the proposed solution achieves better performance on structural similarity and detail preservation in fused images.

Entropy doi: 10.3390/e20070521

Authors: Wenan Cai Zhaojian Yang Zhijian Wang Yiliang Wang

Due to the weak entropy of the vibration signal in a strong noise environment, it is very difficult to extract compound fault features. EMD (Empirical Mode Decomposition), EEMD (Ensemble Empirical Mode Decomposition) and LMD (Local Mean Decomposition) are widely used in compound fault feature extraction. Although they can decompose different characteristic components into separate IMFs (Intrinsic Mode Functions), serious mode mixing still occurs because of the noise. VMD (Variational Mode Decomposition) rests on a rigorous mathematical framework that can alleviate this mode mixing, and each characteristic component it extracts contains a unique center frequency; however, it is a parametric decomposition method whose number of modes K must be set in advance. An improper value of K leads to over-decomposition or under-decomposition, so the number of decomposition levels of VMD needs to be determined adaptively. The commonly used adaptive methods, particle swarm optimization and the ant colony algorithm, consume a lot of computing time. This paper proposes a compound fault feature extraction method based on Multipoint Kurtosis (MKurt)-VMD. Firstly, MED (Minimum Entropy Deconvolution) denoises the vibration signal in the strong noise environment. Secondly, multipoint kurtosis extracts the periodic multiple faults, and a multi-periodic vector is further constructed to determine the number of impulse periods, which in turn determines the K value of VMD. Thirdly, the noise-reduced signal is processed by VMD and the fault features are further identified by FFT. The proposed compound fault feature extraction method alleviates mode mixing in comparison with EEMD. Its validity is further confirmed by processing a measured signal and extracting compound fault features such as gear spalling and a roller fault; their fault periods are 22.4 and 111.2, respectively, and the corresponding frequencies are 360 Hz and 72 Hz, respectively.

Entropy doi: 10.3390/e20070520

Authors: Jan Smrek Kurt Kremer

Active matter consists of particles that dissipate energy, from their own sources, in the form of mechanical work on their surroundings. Recent interest in active-passive polymer mixtures has been driven by their relevance to the phase separation of active (e.g., transcriptionally active) and inactive (transcriptionally silent) DNA strands in the nuclei of living cells. In this paper, we study the interfacial properties of the phase-separated steady states of active-passive polymer mixtures and compare them with equilibrium phase separation. We model the active constituents by assigning them stronger-than-thermal fluctuations. We demonstrate that the entropy production is an accurate indicator of the phase transition. We then construct phase diagrams and analyze kinetic properties of the particles as a function of the distance from the interface. Studying the interface fluctuations, we find that they follow the capillary waves spectrum. This allows us to establish a mechanistic definition of the interfacial stiffness and its dependence on the relative level of activity with respect to the passive constituents. We show how the interfacial width depends on the activity ratio and comment on the finite size effects. Our results highlight similarities and differences of the non-equilibrium steady states with an equilibrium phase-separated polymer mixture with a lower critical solution temperature. We present several directions in which the non-equilibrium system can be studied further and point out interesting observations that indicate general principles behind non-equilibrium phase separation.

Entropy doi: 10.3390/e20070519

Authors: Li He Haifei Zhu Tao Zhang Honghong Yang Yisheng Guan

In kernel methods, the Nyström approximation is a popular way of calculating out-of-sample extensions and can be further applied to large-scale data clustering and classification tasks. Given a new data point, Nyström employs its empirical affinity vector, k, for the calculation. This vector is assumed to be a proper measurement of the similarity between the new point and the training set. In this paper, we suggest replacing the affinity vector by its projection onto the leading eigenvectors learned from the training set, i.e., using k* = ∑_{i=1}^{c} (kᵀuᵢ)uᵢ instead, where uᵢ is the i-th eigenvector of the training set and c is the number of eigenvectors used, typically equal to the number of classes specified by users. Our work is motivated by the constraints that, in kernel space, the kernel-mapped new point should (a) also lie on the unit sphere defined by the Gaussian kernel and (b) generate training set affinity values close to k. These two constraints define a Quadratic Optimization Over a Sphere (QOOS) problem. In this paper, we prove that the projection onto the leading eigenvectors, rather than the original affinity vector, is the solution to the QOOS problem. The experimental results show that the proposed replacement of k by k* slightly improves the performance of the Nyström approximation. Compared with other affinity matrix modification methods, our k* obtains comparable or higher clustering performance in terms of accuracy and Normalized Mutual Information (NMI).
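The replacement of k by its projection can be sketched directly from the formula k* = ∑_{i=1}^{c} (kᵀuᵢ)uᵢ; the Gaussian kernel, the random data, and c = 2 below are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(1)

# Symmetric Gaussian affinity matrix of a hypothetical training set
X = rng.standard_normal((50, 3))
K = np.exp(-np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1))

# Leading c eigenvectors of the training affinity matrix
c = 2
eigvals, eigvecs = np.linalg.eigh(K)   # eigenvalues in ascending order
U = eigvecs[:, -c:]                    # top-c eigenvectors u_1..u_c

# Empirical affinity vector k of a new point, and its projection k*
x_new = rng.standard_normal(3)
k = np.exp(-np.sum((X - x_new) ** 2, axis=1))
k_star = U @ (U.T @ k)                 # k* = sum_i (k^T u_i) u_i

# The projection never increases the norm of the affinity vector
print(np.linalg.norm(k - k_star) <= np.linalg.norm(k))  # True
```

Since k* is an orthogonal projection, the residual k − k* is orthogonal to every uᵢ, which is the geometric content of the QOOS solution sketched in the abstract.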

Entropy doi: 10.3390/e20070518

Authors: Heonsoo Lee Zirui Huang Xiaolin Liu UnCheol Lee Anthony G. Hudetz

Theoretical consideration predicts that the alteration of local and shared information in the brain is a key element in the mechanism of anesthetic-induced unconsciousness. Ordinal pattern analyses, such as permutation entropy (PE) and symbolic mutual information (SMI), have been successful in quantifying local and shared information in neurophysiological data; however, they have rarely been applied to altered states of consciousness, especially to data obtained with functional magnetic resonance imaging (fMRI). PE and SMI analysis, together with the superb spatial resolution of fMRI recording, enables us to explore the local information of specific brain areas, the shared information between the areas, and the relationship between the two. Given the spatially divergent action of anesthetics on regional brain activity, we hypothesized that anesthesia would differentially influence entropy (PE) and shared information (SMI) across various brain areas, which may represent fundamental, mechanistic indicators of loss of consciousness. fMRI data were collected from 15 healthy participants during four states: wakefulness (W), light (conscious) sedation (L), deep (unconscious) sedation (D), and recovery (R). Sedation was produced by the common, clinically used anesthetic, propofol. Firstly, we found that global PE decreased from W to D, and increased from D to R. The PE was differentially affected across the brain areas; specifically, the PE in the subcortical network was reduced more than in the cortical networks. Secondly, SMI was also differentially affected in different areas, as revealed by the reconfiguration of its spatial pattern (topographic structure). The topographic structures of SMI in the conscious states W, L, and R were distinctively different from that of the unconscious state D. Thirdly, PE and SMI were positively correlated in W, L, and R, whereas this correlation was disrupted in D. Lastly, PE changes occurred preferentially in highly connected hub regions. These findings advance our understanding of brain dynamics and information exchange, emphasizing the importance of topographic structure and the relationship of local and shared information in anesthetic-induced unconsciousness.
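A minimal sketch of the permutation entropy measure used above (Bandt-Pompe ordinal patterns); order 3 and unit delay are illustrative choices, not the study's settings:

```python
import math
import random

def permutation_entropy(series, order=3, delay=1):
    """Normalized permutation entropy from Bandt-Pompe ordinal patterns."""
    n = len(series) - (order - 1) * delay
    counts = {}
    for i in range(n):
        window = series[i:i + order * delay:delay]
        # ordinal pattern: the ranking of the samples inside the window
        pattern = tuple(sorted(range(order), key=window.__getitem__))
        counts[pattern] = counts.get(pattern, 0) + 1
    h = -sum((c / n) * math.log(c / n) for c in counts.values())
    return h / math.log(math.factorial(order))  # normalize to [0, 1]

random.seed(0)
noise = [random.random() for _ in range(1000)]   # irregular signal, PE near 1
monotone = list(range(1000))                     # single ordinal pattern, PE = 0
print(permutation_entropy(noise) > permutation_entropy(monotone))  # True
```

A drop in PE, as reported between wakefulness and deep sedation, corresponds to the signal visiting fewer ordinal patterns less uniformly.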

Entropy doi: 10.3390/e20070517

Authors: Tianchen Li Bin Liu Yong Liu Wenmin Guo Ao Fu Liangsheng Li Nie Yan Qihong Fang

A novel metal matrix composite based on the NbMoCrTiAl high entropy alloy (HEA) was designed by the in-situ formation method. The microstructure, phase evolution, and room-temperature compressive mechanical properties of the composite were investigated in detail. The results confirmed that the composite was primarily composed of a body-centered cubic solid solution with a small amount of titanium carbides and alumina. With the presence of approximately 7.0 vol. % Al2O3 and 32.2 vol. % TiC reinforcing particles, the compressive fracture strength of the composite (1542 MPa) was increased by approximately 50% compared with that of the as-cast NbMoCrTiAl HEA. Considering its superior oxidation resistance, the P/M NbMoCrTiAl high entropy alloy composite could be considered a promising high-temperature structural material.

Entropy doi: 10.3390/e20070516

Authors: Lukas Mairhofer Sandra Eibenberger Armin Shayeghi Markus Arndt

Matter-wave near-field interference can imprint a nano-scale fringe pattern onto a molecular beam, which allows observing its shifts in the presence of even very small external forces. Here we demonstrate quantum interference of the provitamin 7-dehydrocholesterol and discuss the conceptual challenges of magnetic deflectometry in a near-field interferometer as a tool to explore photochemical processes within molecules whose center of mass is quantum delocalized.

Entropy doi: 10.3390/e20070515

Authors: Edoardo Milotti Sergio Bartalucci Sergio Bertolucci Massimiliano Bazzi Mario Bragadireanu Michael Cargnelli Alberto Clozza Catalina Curceanu Luca De Paolis Jean-Pierre Egger Carlo Guaraldo Mihail Iliescu Matthias Laubenstein Johann Marton Marco Miliucci Andreas Pichler Dorel Pietreanu Kristian Piscicchia Alessandro Scordo Hexi Shi Diana Laura Sirghi Florin Sirghi Laura Sperandio Oton Vázquez Doce Eberhard Widmann Johann Zmeskal

The VIolation of Pauli (VIP) experiment (and its upgraded version, VIP-2) uses the Ramberg and Snow (RS) method (Phys. Lett. B 1990, 238, 438) to search for violations of the Pauli exclusion principle in the Gran Sasso underground laboratory. The RS method consists of feeding a copper conductor with a high direct current, so that the large number of newly-injected conduction electrons can interact with the copper atoms and possibly cascade electromagnetically to an already occupied atomic ground state if their wavefunction has the wrong symmetry with respect to the atomic electrons, emitting characteristic X-rays as they do so. In their original data analysis, RS considered a very simple path for each electron, which is sure to return a bound, albeit a very weak one, because it ignores the meandering random walks of the electrons as they move from the entrance to the exit of the copper sample. These complex walks bring the electrons close to many more atoms than in the RS calculation. Here, we consider the full description of these walks and show that this leads to a nontrivial and nonlinear X-ray emission rate. Finally, we obtain an improved bound, which sets much tighter constraints on the violation of the Pauli exclusion principle for electrons.

Entropy doi: 10.3390/e20070514

Authors: Arieh Ben-Naim

It is well known that the statistical mechanical theory of liquids has been lagging far behind the theory of either gases or solids; see, for example, Ben-Naim (2006), Fisher (1964), Guggenheim (1952), Hansen and McDonald (1976), Hill (1956), Temperley, Rowlinson and Rushbrooke (1968), and O'Connell (1971). Information theory was recently used to derive and interpret the entropy of an ideal gas of simple particles (i.e., non-interacting and structure-less particles). Starting with Shannon's measure of information (SMI), one can derive the entropy function of an ideal gas, the same function as derived by Sackur (1911) and Tetrode (1912). The new derivation of the same entropy function, based on SMI, has several advantages, as listed in Ben-Naim (2008, 2017). Here we mention two: First, it provides a simple interpretation of the various terms in this entropy function. Second, and more important for our purpose, this derivation may be extended to any system of interacting particles, including liquids and solutions. The main idea is that once one adds intermolecular interactions between the particles, one also adds correlations between the particles. These correlations may be cast in terms of mutual information (MI). Hence, we can start with the information-theoretical interpretation of the entropy of an ideal gas, and then add a correction due to correlations in the form of MI between the locations of the particles. This process preserves the interpretation of the entropy of liquids and solutions as a measure of information (or as an average uncertainty about the locations of the particles). It is well known that the entropy of a liquid, any liquid for that matter, is lower than the entropy of the corresponding gas. Traditionally, this fact is interpreted in terms of order-disorder: the lower entropy of the liquid is attributed to a higher degree of order compared with that of the gas. However, unlike the transition from a solid to either a liquid or a gaseous phase, where the order-disorder interpretation works well, the same interpretation does not work for the liquid-gas transition. It is hard, if not impossible, to argue that the liquid phase is more “ordered” than the gaseous phase. In this article, we interpret the lower entropy of liquids in terms of SMI. One outstanding liquid, known to be a structured liquid, is water, according to Ben-Naim (2009, 2011). In addition, heavy water, as well as aqueous solutions of simple solutes such as argon or methane, will be discussed in this article.
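For reference, the Sackur–Tetrode entropy function mentioned above, which the SMI derivation reproduces for an ideal monatomic gas of N particles of mass m in volume V at temperature T, can be written in its standard form as:

```latex
\frac{S}{N k_B} \;=\; \ln\!\left[\frac{V}{N}\left(\frac{2\pi m k_B T}{h^{2}}\right)^{3/2}\right] \;+\; \frac{5}{2}
```

In the program sketched in the abstract, a correction term cast as the mutual information between particle locations then lowers this ideal-gas value for interacting systems such as liquids and solutions.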

Entropy doi: 10.3390/e20070513

Authors: Honorio Martin Pedro Martin-Holgado Pedro Peris-Lopez Yolanda Morilla Luis Entrena

The effects of ionizing radiation on field-programmable gate arrays (FPGAs) have been investigated in depth during the last decades. The impact of these effects is typically evaluated on implementations that have a deterministic behavior. In this article, two well-known true random number generators (TRNGs) based on sampling jittery signals have been exposed to a Co-60 radiation source, as in the standard tests for space conditions. The effects of the accumulated dose on these TRNGs, and in particular its repercussion on their randomness quality (e.g., entropy or linear complexity), have been evaluated by using two National Institute of Standards and Technology (NIST) statistical test suites. The obtained results clearly show how the degradation of the statistical properties of these TRNGs increases with the accumulated dose. It is also notable that the deterioration of the TRNG (the non-deterministic component) appears before the degradation of the deterministic elements in the FPGA, which compromises the integrated circuit lifetime.

Entropy doi: 10.3390/e20070512

Authors: Takuya Isomura

The mutual information between the state of a neural network and the state of the external world represents the amount of information stored in the neural network that is associated with the external world. In contrast, the surprise of the sensory input indicates the unpredictability of the current input. In other words, this is a measure of inference ability, and an upper bound of the surprise is known as the variational free energy. According to the free-energy principle (FEP), a neural network continuously minimizes the free energy to perceive the external world. For the survival of animals, inference ability is considered to be more important than simply memorized information. In this study, the free energy is shown to represent the gap between the amount of information stored in the neural network and that available for inference. This concept involves both the FEP and the infomax principle, and will be a useful measure for quantifying the amount of information available for inference.

Entropy doi: 10.3390/e20070511

Authors: Arthur Matsuo Yamashita Rios de Sousa Hideki Takayasu Misako Takayasu

We use the definition of statistical symmetry as the invariance of a probability distribution under a given transformation and apply the concept to the underlying probability distribution of stochastic processes. To measure the degree of statistical asymmetry, we take the Kullback–Leibler divergence of a given probability distribution with respect to the corresponding transformed one and study it for the Gaussian autoregressive process using transformations on the temporal correlations' structure. We then illustrate the employment of this notion as a time series analysis tool by measuring local statistical asymmetries of foreign exchange market price data for three transformations that capture distinct autocorrelation behaviors of the series (independence, non-negative correlations and Markovianity), obtaining a characterization of price movements in terms of each statistical symmetry.
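As an illustrative reduction of the approach, the sketch below measures the asymmetry of a Gaussian AR(1) process under an independence transformation: the K-L divergence of the bivariate Gaussian of consecutive samples from the product of its marginals, which for Gaussians has the closed form −½ ln(1 − ρ²). The estimator is empirical and the AR(1) parameters are assumptions, not values from the paper:

```python
import math
import random

def ar1(phi, n, seed=0):
    """Simulate a Gaussian AR(1) process x_t = phi * x_{t-1} + eps_t."""
    random.seed(seed)
    x, out = 0.0, []
    for _ in range(n):
        x = phi * x + random.gauss(0.0, 1.0)
        out.append(x)
    return out

def asymmetry_vs_independence(series):
    """K-L divergence of the bivariate Gaussian of (x_t, x_{t+1}) from the
    product of its marginals, i.e. -0.5 * ln(1 - rho^2) for lag-1 rho."""
    x, y = series[:-1], series[1:]
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sx = math.sqrt(sum((v - mx) ** 2 for v in x) / n)
    sy = math.sqrt(sum((v - my) ** 2 for v in y) / n)
    rho = sum((a - mx) * (b - my) for a, b in zip(x, y)) / (n * sx * sy)
    return -0.5 * math.log(1.0 - rho ** 2)

print(asymmetry_vs_independence(ar1(0.8, 5000)) >
      asymmetry_vs_independence(ar1(0.1, 5000)))  # stronger correlation, larger asymmetry
```

A series with stronger temporal correlation departs more from the independence-transformed distribution, so its asymmetry measure is larger.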

Entropy doi: 10.3390/e20070510

Authors: Jialong Wang Lingli Cui Yonggang Xu

Aiming to solve the problem of accurate diagnosis of the size and location of rolling bearing faults, a novel quantitative and localization fault diagnosis method of the rolling bearing is proposed based on the quantitative mapping model (QMM). The fault size and location of the rolling bearing affect the impulse type and the modulation degree of the vibration signal, which subsequently changes the complexity and randomness of the time-domain distribution of the vibration signal. According to the relationship between the multiscale permutation entropy (MPE) of the vibration signal and rolling bearing fault size, an average MPE (A-MPE) index is proposed to establish linear and nonlinear QMMs through the regression function. The proper QMM is selected through the error rate of fault size prediction to achieve a quantitative fault diagnosis of the rolling bearing. Due to the mathematical characteristics of the QMM, the localization fault diagnosis is realized. The multiscale morphological filtering (MMF) method is also introduced to extract the time-domain geometric feature of the fault bearing vibration signal and to improve the QMM accuracy of the fault size prediction. The results show that the QMM has a great effect on the quantitative fault size prediction and localization diagnosis of the rolling bearing.
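A minimal sketch of the MPE and A-MPE quantities described above: permutation entropy computed on coarse-grained copies of the signal, then averaged over scales. Order 3, scales 1 to 5, and the test signal are illustrative assumptions, not the paper's settings:

```python
import math
import random

def perm_entropy(x, order=3):
    """Normalized permutation entropy (ordinal patterns of length `order`)."""
    n = len(x) - order + 1
    counts = {}
    for i in range(n):
        pat = tuple(sorted(range(order), key=x[i:i + order].__getitem__))
        counts[pat] = counts.get(pat, 0) + 1
    h = -sum((c / n) * math.log(c / n) for c in counts.values())
    return h / math.log(math.factorial(order))

def coarse_grain(x, scale):
    """Non-overlapping window averages of the series at the given scale."""
    return [sum(x[i:i + scale]) / scale for i in range(0, len(x) - scale + 1, scale)]

def average_mpe(x, max_scale=5):
    """A-MPE: mean of the permutation entropies over scales 1..max_scale."""
    return sum(perm_entropy(coarse_grain(x, s)) for s in range(1, max_scale + 1)) / max_scale

random.seed(1)
signal = [random.gauss(0, 1) for _ in range(2000)]  # stand-in for a vibration signal
print(0.0 < average_mpe(signal) <= 1.0)  # True
```

In the paper's scheme, such an A-MPE index would then be regressed against known fault sizes to fit the quantitative mapping model.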

Entropy doi: 10.3390/e20070509

Authors: Nan Chen Andrew J. Majda

A conditional Gaussian framework for understanding and predicting complex multiscale nonlinear stochastic systems is developed. Despite the conditional Gaussianity, such systems are nevertheless highly nonlinear and are able to capture the non-Gaussian features of nature. The special structure of the system allows closed analytical formulae for solving the conditional statistics and is thus computationally efficient. A rich gallery of examples of conditional Gaussian systems is illustrated here, which includes data-driven physics-constrained nonlinear stochastic models, stochastically coupled reaction–diffusion models in neuroscience and ecology, and large-scale dynamical models in turbulence, fluids and geophysical flows. Making use of the conditional Gaussian structure, efficient statistically accurate algorithms involving a novel hybrid strategy for different subspaces, a judicious block decomposition and statistical symmetry are developed for solving the Fokker–Planck equation in large dimensions. The conditional Gaussian framework is also applied to develop extremely cheap multiscale data assimilation schemes, such as the stochastic superparameterization, which use particle filters to capture the non-Gaussian statistics on the large-scale part whose dimension is small whereas the statistics of the small-scale part are conditional Gaussian given the large-scale part. Other topics of the conditional Gaussian systems studied here include designing new parameter estimation schemes and understanding model errors.

Entropy doi: 10.3390/e20070508

Authors: Henryk Gzyl German Molina Enrique ter Horst

Risk neutral measures are defined such that the basic random assets in a portfolio are martingales. Hence, when the market model is complete, valuing other financial instruments whose underlying assets are those basic random assets is a relatively straightforward task. To determine the risk neutral measure, it is assumed that the current prices of the basic assets are known exactly. However, oftentimes all we know about the current price, or that of a derivative having it as underlying, is a bid-ask range. The question then arises as to how to determine the risk neutral measure from that information. We may want to determine risk neutral measures from that information in order to use them, for example, to price other derivatives on the same asset. In this paper we propose an extended version of the maximum entropy method to carry out that task. This approach provides a novel solution to this problem, which is computationally simple and fast.

]]>Entropy doi: 10.3390/e20070507

Authors: António M. Lopes J. A. Tenreiro Machado

Complex systems (CS) are pervasive in many areas of science and technology, namely in financial markets, transportation, telecommunication and social networks, world and country economies, immunological systems, living organisms, computational systems, and electrical and mechanical structures [...]

]]>Entropy doi: 10.3390/e20070506

Authors: Elizabeth Shumbayawonda Pinar Deniz Tosun Alberto Fernández Michael Pycraft Hughes Daniel Abásolo

Maturation and ageing, which can be characterised by dynamic changes in brain morphology, can have an impact on the physiology of the brain. As such, it is possible that these changes affect the magnetic activity of the brain recorded using magnetoencephalography. In this study, changes in the resting state brain (magnetic) activity due to healthy ageing were investigated by estimating the complexity of magnetoencephalogram (MEG) signals. The main aim of this study was to identify whether the complexity of background MEG signals changed significantly across the human lifespan for both males and females. A sample of 177 healthy participants (79 males and 98 females, aged between 21 and 80 and grouped into 3 categories, i.e., early-, mid- and late-adulthood) was used in this investigation. The investigation also evaluated whether complexity values remained relatively stable during the 5 min recording. Complexity was estimated using permutation Lempel-Ziv complexity, a recently introduced complexity metric, with a motif length of 5 and a lag of 1. Effects of age and gender were investigated in the MEG channels over 5 brain regions, i.e., anterior, central, left lateral, posterior, and right lateral, with the highest complexity values observed in the signals recorded by the channels over the anterior and central regions of the brain. Results showed that, while changes due to age had a significant effect on the complexity of the MEG signals recorded over the 5 brain regions, gender did not have a significant effect on complexity values in any of the age groups investigated. Moreover, although some changes in complexity were observed between the different minutes of recording, their magnitude was so small that practical significance might outweigh statistical significance in this instance. The results from this study can contribute to forming a fingerprint of the characteristics of healthy ageing in MEGs that could be useful when investigating changes to the resting state activity due to pathology.
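Permutation Lempel-Ziv complexity of the kind used above combines two standard steps: symbolise the signal by the rank ordering of each window (motif length 5, lag 1, as in the abstract), then count the phrases of a Lempel-Ziv (1976-style) parsing of the symbol sequence. The sketch below is illustrative only; the parsing variant and the normalisation are common choices, not necessarily those of the authors.

```python
import math

def ordinal_symbols(signal, motif=5, lag=1):
    """Replace each window by an integer encoding its rank ordering."""
    symbols, codebook = [], {}
    for i in range(len(signal) - (motif - 1) * lag):
        w = signal[i:i + motif * lag:lag]
        pattern = tuple(sorted(range(motif), key=w.__getitem__))
        symbols.append(codebook.setdefault(pattern, len(codebook)))
    return symbols

def lz76_phrases(seq):
    """Number of phrases in a Lempel-Ziv (1976-style) parsing of `seq`."""
    i, count, n = 0, 0, len(seq)
    while i < n:
        length = 1
        # grow the phrase while it can be copied from earlier in the sequence
        while i + length <= n and any(seq[j:j + length] == seq[i:i + length]
                                      for j in range(i)):
            length += 1
        count += 1
        i += length
    return count

def plzc(signal, motif=5, lag=1):
    """Permutation Lempel-Ziv complexity, normalised by n / ln(n)."""
    s = ordinal_symbols(signal, motif, lag)
    n = len(s)
    return lz76_phrases(s) * math.log(n) / n
```

A regular signal collapses to a few repeated symbols and a short parsing, while noise yields many distinct phrases and a larger value.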

]]>Entropy doi: 10.3390/e20070505

Authors: Martin Löbel Thomas Lindner Thomas Mehner Thomas Lampke

The novel alloying concept of high-entropy alloys (HEAs) has been the focus of many recent investigations revealing an interesting combination of properties. Alloying with aluminium and titanium showed a strong influence on microstructure and phase composition. However, detailed investigations on the influence of titanium are lacking. In this study, the influence of titanium in the alloy system AlCoCrFeNiTix was studied in a wide range (molar ratios x = 0.0; 0.2; 0.5; 0.8; 1.0; 1.5). Detailed studies investigating the microstructure, chemical composition, phase composition, solidification behaviour, and wear behaviour were carried out. Alloying with titanium showed a strong influence on the resulting microstructure and led to an increase of microstructural heterogeneity. Phase analyses revealed the formation of one body-centred cubic (bcc) phase for the alloy without titanium, whereas alloying with titanium caused the formation of two different bcc phases as main phases. Additional phases were detected for alloys with increased titanium content. For x &ge; 0.5, a minor phase with face-centred cubic (fcc) structure was formed. Further addition of titanium led to the formation of complex phases. Investigation of wear behaviour revealed a superior wear resistance of the alloy AlCoCrFeNiTi0.5 as compared to a bearing steel sample.

]]>Entropy doi: 10.3390/e20070504

Authors: Alireza Khalili Golmankhaneh Arran Fernandez Ali Khalili Golmankhaneh Dumitru Baleanu

In this paper, we study C&zeta;-calculus on generalized Cantor sets, which have self-similar properties and fractional dimensions that exceed their topological dimensions. Functions with fractal support are not differentiable or integrable in terms of standard calculus, so we must involve local fractional derivatives. We have generalized the C&zeta;-calculus on the generalized Cantor sets known as middle-&xi; Cantor sets. We have suggested a calculus on the middle-&xi; Cantor sets for different values of &xi; with 0&lt;&xi;&lt;1. Differential equations on the middle-&xi; Cantor sets have been solved, and we have presented the results using illustrative examples. The conditions for super-, normal, and sub-diffusion on fractal sets are given.

]]>Entropy doi: 10.3390/e20070503

Authors: Yuanpu Xia Ziming Xiong Zhu Wen Hao Lu Xin Dong

Uncertainty is one of the main sources of risk of geological hazards in tunnel engineering. Uncertainty information not only affects the accuracy of evaluation results, but also affects the reliability of decision-making schemes. Therefore, it is necessary to evaluate and control the impact of uncertainty on risk. In this study, the problems in the existing entropy-hazard model, such as inefficient decision-making and failure of decision-making, are analysed, and an improved uncertainty evaluation and control process is proposed. Then, the tolerance cost, the key factor in the decision-making model, is discussed. It is considered that the amount of change in risk value (R1) can better reflect the psychological behaviour of decision-makers. Thirdly, common multi-attribute decision-making models, such as the expected utility-entropy model, are analysed, and the viewpoint that different types of decision-making issues require different decision methods is proposed. The well-known Allais paradox is explained by the proposed methods. Finally, the engineering application results show that the uncertainty control idea proposed here is accurate and effective. This research indicates a direction for further research into uncertainty and risk control issues affecting underground engineering works.

]]>Entropy doi: 10.3390/e20070502

Authors: Oleg N. Kirillov

Sets in the parameter space corresponding to complex exceptional points (EP) have high codimension, and for this reason they are difficult objects to locate numerically. However, complex EPs play an important role in the problems of the stability of dissipative systems, where they are frequently considered as precursors to instability. We propose to locate the set of complex EPs using the fact that the global minimum of the spectral abscissa of a polynomial is attained at the EP of the highest possible order. Applying this approach to the problem of self-stabilization of a bicycle, we find explicitly the EP sets that suggest scaling laws for the design of robust bikes that agree with the design of the known experimental machines.

]]>Entropy doi: 10.3390/e20070501

Authors: Miron Kaufman

We present a mean field model of a gel consisting of P polymers, each of length L, and Nz polyfunctional monomers. Each polyfunctional monomer forms z covalent bonds with the 2P bifunctional monomers at the ends of the linear polymers. We find that the entropy dependence on the number of polyfunctional monomers exhibits an abrupt change at Nz = 2P/z due to the saturation of possible crosslinks. This non-analytical dependence of entropy on the number of polyfunctionals generates a first-order phase transition between two gel phases: one poor and the other rich in polyfunctional molecules.

]]>Entropy doi: 10.3390/e20070500

Authors: Avishy Carmi Eliahu Cohen

The characterization of quantum correlations, being stronger than classical, yet weaker than those appearing in non-signaling models, still poses many riddles. In this work, we show that the extent of binary correlations in a general class of nonlocal theories can be characterized by the existence of a certain covariance matrix. The set of quantum realizable two-point correlators in the bipartite case then arises from a subtle restriction on the structure of this general covariance matrix. We also identify a class of theories whose covariance has neither a quantum nor an &ldquo;almost quantum&rdquo; origin, but which nevertheless produce the accessible two-point quantum mechanical correlators. Our approach leads to richer Bell-type inequalities in which the extent of nonlocality is intimately related to a non-additive entropic measure. In particular, it suggests that the Tsallis entropy with parameter q=1/2 is a natural operational measure of non-classicality. Moreover, when generalizing this covariance matrix, we find novel characterizations of the quantum mechanical set of correlators in multipartite scenarios. All these predictions might be experimentally validated when adding weak measurements to the conventional Bell test (without adding postselection).

]]>Entropy doi: 10.3390/e20070499

Authors: Maurice A. de Gosson

Poincar&eacute;&rsquo;s Recurrence Theorem implies that any isolated Hamiltonian system evolving in a bounded Universe returns infinitely many times arbitrarily close to its initial phase space configuration. We discuss this and related recurrence properties from the point of view of recent advances in symplectic topology which have not yet reached the Physics community. These properties are closely related to Emergent Quantum Mechanics since they belong to a twilight zone between classical (Hamiltonian) mechanics and its quantization.

]]>Entropy doi: 10.3390/e20070498

Authors: Francisco Valverde-Albacete Carmen Peláez-Moreno

Data transformation, e.g., feature transformation and selection, is an integral part of any machine learning procedure. In this paper, we introduce an information-theoretic model and tools to assess the quality of data transformations in machine learning tasks. In an unsupervised fashion, we analyze the transformation of a discrete, multivariate source of information X¯ into a discrete, multivariate sink of information Y¯ related by a distribution PX¯Y¯. The first contribution is a decomposition of the maximal potential entropy of (X¯,Y¯), which we call a balance equation, into its (a) non-transferable, (b) transferable, but not transferred, and (c) transferred parts. Such balance equations can be represented in (de Finetti) entropy diagrams, our second set of contributions. The most important of these, the aggregate channel multivariate entropy triangle, is a visual exploratory tool to assess the effectiveness of multivariate data transformations in transferring information from input to output variables. We also show how these decomposition and balance equations also apply to the entropies of X¯ and Y¯, respectively, and generate entropy triangles for them. As an example, we present the application of these tools to the assessment of information transfer efficiency for Principal Component Analysis and Independent Component Analysis as unsupervised feature transformation and selection procedures in supervised classification tasks.
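The balance equations above are built from the standard joint entropies; in particular, mutual information is the "transferred" component, while the gaps between these quantities and the maximal (uniform) entropy give the remaining parts. A minimal sketch of those ingredients for a discrete joint distribution, not the authors' triangle coordinates themselves, might look like this:

```python
import math

def channel_entropies(joint):
    """H(X), H(Y), H(X,Y) and the transferred part I(X;Y), in bits,
    from a joint pmf given as {(x, y): probability}."""
    px, py = {}, {}
    for (x, y), p in joint.items():
        px[x] = px.get(x, 0.0) + p
        py[y] = py.get(y, 0.0) + p
    H = lambda pmf: -sum(p * math.log2(p) for p in pmf.values() if p > 0)
    hx, hy, hxy = H(px), H(py), H(joint)
    return hx, hy, hxy, hx + hy - hxy  # I(X;Y) = H(X) + H(Y) - H(X,Y)
```

For a noiseless binary channel all of the one-bit input entropy is transferred, whereas for independent variables the transferred part is zero.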

]]>Entropy doi: 10.3390/e20070497

Authors: Ming-Xia Xiao Hai-Cheng Wei Ya-Jie Xu Hsien-Tsai Wu Cheuk-Kwan Sun

The present study aimed at testing the hypothesis that application of multiscale cross-approximate entropy (MCAE) analysis in the study of nonlinear coupling behavior of two synchronized time series of different natures [i.e., R-R interval (RRI) and crest time (CT, the time interval from foot to peak of a pulse wave)] could yield information on complexity related to diabetes-associated vascular changes. Signals of a single waveform parameter (i.e., CT) from photoplethysmography and RRI from electrocardiogram were simultaneously acquired within a period of one thousand cardiac cycles for the computation of different multiscale entropy indices from healthy young adults (n = 22) (Group 1), upper-middle aged non-diabetic subjects (n = 34) (Group 2) and diabetic patients (n = 34) (Group 3). The demographic (i.e., age), anthropometric (i.e., body height, body weight, waist circumference, body-mass index), hemodynamic (i.e., systolic and diastolic blood pressures), and serum biochemical (i.e., high- and low-density lipoprotein cholesterol, total cholesterol, and triglyceride) parameters were compared with different multiscale entropy indices including small- and large-scale multiscale entropy indices for CT and RRI [MEISS(CT), MEILS(CT), MEISS(RRI), MEILS(RRI), respectively] as well as small- and large-scale multiscale cross-approximate entropy indices [MCEISS, MCEILS, respectively]. The results demonstrated that both MEILS(RRI) and MCEILS significantly differentiated between Group 2 and Group 3 (all p &lt; 0.017). Multivariate linear regression analysis showed significant associations of MEILS(RRI) and MCEILS(RRI,CT) with age and glycated hemoglobin level (all p &lt; 0.017). The findings highlight the successful application of a novel multiscale cross-approximate entropy index in non-invasively identifying diabetes-associated subtle changes in vascular functional integrity, which is of clinical importance in preventive medicine.

]]>Entropy doi: 10.3390/e20070496

Authors: Lajos Diósi

The concept of universal gravity-related irreversibility began in quantum cosmology. The ultimate reason for universal irreversibility is thought to come from black holes close to the Planck scale. Quantum state reductions, unrelated to gravity or relativity but related to measurement devices, are completely different instances of irreversibilities. However, an intricate relationship between Newton gravity and quantized matter might result in fundamental and spontaneous quantum state reduction&mdash;in the non-relativistic Schr&ouml;dinger&ndash;Newton context. The above two concepts of fundamental irreversibility emerged and evolved with few or even no interactions. The purpose here is to draw a parallel between the two approaches first, and to ask rather than answer the question: can both the Planckian and the Schr&ouml;dinger&ndash;Newton indeterminacies/irreversibilities be two faces of the same universe. A related personal note of the author&rsquo;s 1986 meeting with Aharonov and Bohm is appended.

]]>Entropy doi: 10.3390/e20070495

Authors: Marina Santacroce Paola Siri Barbara Trivellato

We use maximal exponential models to characterize a suitable polar cone in a mathematical convex optimization framework. A financial application of this result is provided, leading to a duality minimax theorem related to portfolio exponential utility maximization.

]]>Entropy doi: 10.3390/e20070494

Authors: Xinying Xu Yalan Zhao Mifeng Ren Lan Cheng Mingyue Gong

In this paper, a novel data-driven single neuron predictive control strategy is proposed for non-Gaussian networked control systems with metrology delays in the information theory framework. Firstly, survival information potential (SIP), instead of minimum entropy, is used to formulate the performance index characterizing the randomness of the considered systems; it is calculated by an oversampling method. Then, the minimum values can be computed by optimizing the SIP-based performance index. Finally, the proposed strategy, the minimum entropy method and the mean square error (MSE) criterion are applied to a networked motor control system, and the results demonstrate the effectiveness of the proposed strategy.

]]>Entropy doi: 10.3390/e20070493

Authors: William Seager

Although David Bohm&rsquo;s interpretation of quantum mechanics is sometimes thought to be a kind of regression towards classical thinking, it is in fact an extremely radical metaphysics of nature. The view goes far beyond the familiar but perennially peculiar non-locality and entanglement of quantum systems. In this paper, a philosophical exploration, I examine three core features of Bohm&rsquo;s metaphysical views, which have been both supported by features of quantum mechanics and integrated into a comprehensive system. These are the holistic nature of the world, the role of a unique kind of information as the ontological basis of the world, and the integration of mentality into this basis as an essential and irreducible aspect of it.

]]>Entropy doi: 10.3390/e20070492

Authors: Dimiter Prodanov

The present work is concerned with the study of a generalized Langevin equation and its link to the physical theories of statistical mechanics and scale relativity. It is demonstrated that the form of the coefficients of the Langevin equation depends critically on the assumption of continuity of the reconstructed trajectory. This in turn requires the fluctuations of the diffusion term to be discontinuous in time. This paper further investigates the connection between the scale-relativistic and stochastic mechanics approaches through the study of the Burgers equation, which in this case appears as a stochastic geodesic equation for the drift. By further demanding time reversibility of the drift, the Langevin equation can also describe equivalent quantum-mechanical systems in a path-wise manner. The resulting statistical description obeys the Fokker&ndash;Planck equation of the probability density of the differential system, which can be readily estimated from simulations of the random paths. Based on the Fokker&ndash;Planck formalism, a new derivation of the transient probability densities is presented. Finally, stochastic simulations are compared to the theoretical results.

]]>Entropy doi: 10.3390/e20070491

Authors: Ester Bonmati Anton Bardera Miquel Feixas Imma Boada

Brain networks are widely used models to understand the topology and organization of the brain. These networks can be represented by a graph, where nodes correspond to brain regions and edges to structural or functional connections. Several measures have been proposed to describe the topological features of these networks, but unfortunately, it is still unclear which measures give the best representation of the brain. In this paper, we propose a new set of measures based on information theory. Our approach interprets the brain network as a stochastic process where impulses are modeled as a random walk on the graph nodes. This new interpretation provides a solid theoretical framework from which several global and local measures are derived. Global measures provide quantitative values for the whole brain network characterization and include entropy, mutual information, and erasure mutual information. The latter is a new measure based on mutual information and erasure entropy. On the other hand, local measures are based on different decompositions of the global measures and provide different properties of the nodes. Local measures include entropic surprise, mutual surprise, mutual predictability, and erasure surprise. The proposed approach is evaluated using synthetic model networks and structural and functional human networks at different scales. Results demonstrate that the global measures can characterize new properties of the topology of a brain network and, in addition, for a given number of nodes, an optimal number of edges is found for small-world networks. Local measures show different properties of the nodes, such as the uncertainty associated with a node, or the uniqueness of the path to which the node belongs. Finally, the consistency of the results across healthy subjects demonstrates the robustness of the proposed measures.
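The stochastic-process view above, impulses modelled as a random walk on the graph, admits a compact global entropy. For a simple random walk on a connected undirected graph, the stationary probability of node v is deg(v)/2|E| and each step from v contributes log2 deg(v) bits. The sketch below computes only this basic entropy rate, not the paper's full family of global and local measures:

```python
import math

def walk_entropy_rate(adj):
    """Entropy rate (bits per step) of a simple random walk on an
    undirected graph given as {node: [neighbours]}.

    With pi_v = deg(v) / 2|E| and uniform moves over neighbours,
    H = sum_v pi_v * log2(deg(v)).
    """
    degrees = {v: len(nbrs) for v, nbrs in adj.items()}
    two_edges = sum(degrees.values())  # each undirected edge counted twice
    return sum(d / two_edges * math.log2(d) for d in degrees.values() if d > 0)
```

On the complete graph K4 every node has degree 3, so the rate is log2(3) bits per step; on a cycle every step carries exactly 1 bit.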

]]>Entropy doi: 10.3390/e20070490

Authors: Amin Hosseinpoor Milaghardan Rahim Ali Abbaspour Christophe Claramunt

The rapid proliferation of sensors and big data repositories offer many new opportunities for data science. Among many application domains, the analysis of large trajectory datasets generated from people’s movements at the city scale is one of the most promising research avenues still to explore. Extracting trajectory patterns and outliers in urban environments is a direction still requiring exploration for many management and planning tasks. The research developed in this paper introduces a spatio-temporal framework, called STE-SD (Spatio-Temporal Entropy for Similarity Detection), based on the initial concept of entropy as introduced by Shannon in his seminal theory of information and as recently extended to the spatial and temporal dimensions. Our approach considers several complementary trajectory descriptors whose distributions in space and time are quantitatively evaluated. The trajectory primitives considered include curvatures, stop-points, self-intersections and velocities. These primitives are identified and then qualified using the notion of entropy as applied to the spatial and temporal dimensions. The whole approach is evaluated and applied to urban trajectories derived from the Geolife dataset, a reference benchmark collected in the city of Beijing.

]]>Entropy doi: 10.3390/e20070489

Authors: N. Alex Cayco-Gajic Joel Zylberberg Eric Shea-Brown

Correlations in neural activity have been demonstrated to have profound consequences for sensory encoding. To understand how neural populations represent stimulus information, it is therefore necessary to model how pairwise and higher-order spiking correlations between neurons contribute to the collective structure of population-wide spiking patterns. Maximum entropy models are an increasingly popular method for capturing collective neural activity by including successively higher-order interaction terms. However, incorporating higher-order interactions in these models is difficult in practice due to two factors. First, the number of parameters exponentially increases as higher orders are added. Second, because triplet (and higher) spiking events occur infrequently, estimates of higher-order statistics may be contaminated by sampling noise. To address this, we extend previous work on the Reliable Interaction class of models to develop a normalized variant that adaptively identifies the specific pairwise and higher-order moments that can be estimated from a given dataset for a specified confidence level. The resulting &ldquo;Reliable Moment&rdquo; model is able to capture cortical-like distributions of population spiking patterns. Finally, we show that, compared with the Reliable Interaction model, the Reliable Moment model infers fewer strong spurious higher-order interactions and is better able to predict the frequencies of previously unobserved spiking patterns.

]]>Entropy doi: 10.3390/e20070488

Authors: Sephira Riva Shahin Mehraban Nicholas P. Lavery Stefan Schwarzmüller Oliver Oeckler Stephen G. R. Brown Kirill V. Yusenko

We investigate the effect of alloying with scandium on microstructure, high-temperature phase stability, electron transport, and mechanical properties of the Al2CoCrFeNi, Al0.5CoCrCuFeNi, and AlCoCrCu0.5FeNi high-entropy alloys. Out of the three model alloys, Al2CoCrFeNi adopts a disordered CsCl structure type. Both of the six-component alloys contain a mixture of body-centered cubic (bcc) and face centered cubic (fcc) phases. The comparison between in situ high-temperature powder diffraction data and ex situ data from heat-treated samples highlights the presence of a reversible bcc to fcc transition. The precipitation of a MgZn2-type intermetallic phase along grain boundaries following scandium addition affects all systems differently, but especially enhances the properties of Al2CoCrFeNi. It causes grain refinement, increases hardness and electrical conductivity (by up to 20% and 14%, respectively), and shifts the CsCl-type &rarr; fcc equilibrium to noticeably higher temperatures. The maximum dimensionless thermoelectric figure of merit (ZT) of 0.014 is reached for Al2CoCrFeNi alloyed with 0.3 wt.% Sc at 650 &deg;C.

]]>Entropy doi: 10.3390/e20070487

Authors: Sicheng Zhai Wen Wang Juan Xu Shuai Xu Zitang Zhang Yan Wang

FeSiBAlNi (W5), FeSiBAlNiCo (W6-Co), and FeSiBAlNiGd (W6-Gd) high entropy alloys (HEAs) were prepared using a copper-mold casting method. Effects of Co and Gd additions combined with subsequent annealing on microstructures and magnetism were investigated. The as-cast W5 consists of BCC solid solution and FeSi-rich phase. The Gd addition induces the formation of body-centered cubic (BCC) and face-centered cubic (FCC) solid solutions for the W6-Gd HEA. In contrast, the as-cast W6-Co is composed of the FeSi-rich phase. During annealing, no new phases arise in the W6-Co HEA, indicating a good phase stability. The as-cast W5 has the highest hardness (1210 HV), which is mainly attributed to the strengthening effect of the FeSi-rich phase evenly distributed in the solid solution matrix. The tested FeSiBAlNi-based HEAs possess soft magnetism. The saturated magnetization and remanence ratio of W6-Gd are distinctly enhanced from 10.93 emu/g to 62.78 emu/g and from 1.44% to 15.50% after the annealing treatment, respectively. The good magnetism of the as-annealed W6-Gd can be ascribed to the formation of Gd-oxides.

]]>Entropy doi: 10.3390/e20070486

Authors: Zhiyuan Li Juan Du Xavier Ottavy Hongwu Zhang

A local loss model and an integral loss model are proposed to study the irreversible flow loss mechanism in a linear compressor cascade. The detached eddy simulation model based on the Menter shear stress transport turbulence model (SSTDES) was used to perform the high-fidelity simulations. The flow losses in the cascade with an incidence angle of 2&deg;, 4&deg; and 7&deg; were analyzed. The contours of the local loss coefficient can be explained well by the three-dimensional flow structures. The trend of flow loss varying with incidence angle predicted by the integral loss is the same as that calculated by the total pressure loss coefficient. The integral loss model was used to evaluate the irreversible loss generated in different regions and its varying trend with the flow condition. It was found that the boundary layer shear losses generated near the endwall, the pressure surface and the suction surface are almost identical for the three incidence angles. The secondary flow loss in the wake-flow and blade-passage regions changes dramatically with the flow condition due to the occurrence of corner stall. For this cascade, the secondary flow loss accounts for 26.1%, 48.3% and 64.3% of the total loss when the incidence angles are 2&deg;, 4&deg; and 7&deg;, respectively. Lastly, the underlying reason for the variation of the secondary flow loss with the incidence angle is explained using the Lc iso-surface method.

]]>Entropy doi: 10.3390/e20070485

Authors: Angelo Carollo Bernardo Spagnolo Davide Valenti

In this article, we derive a closed form expression for the symmetric logarithmic derivative of Fermionic Gaussian states. This provides a direct way of computing the quantum Fisher Information for Fermionic Gaussian states. Applications range from quantum metrology with thermal states to non-equilibrium steady states of Fermionic many-body systems.

]]>Entropy doi: 10.3390/e20070484

Authors: Mohammad H. Ahmadi Mirhadi S. Sadaghiani Fathollah Pourfayaz Mahyar Ghazvini Omid Mahian Mehdi Mehrpooya Somchai Wongwises

An exergy analysis of a novel integrated power system is presented in this study. A Solid Oxide Fuel Cell (SOFC), which has been assisted with a Gas Turbine (GT) and Organic Rankine Cycle (ORC) by employing liquefied natural gas (LNG) as a heat sink in a combined power system is simulated and investigated. Initially in this paper, the integrated power system and the primary concepts of the simulation are described. Subsequently, results of the simulation, exergy analysis, and composite curves of heat exchangers are presented and discussed. The equations of the exergy efficiency and destruction for the main cycle&rsquo;s units such as compressors, expanders, pumps, evaporators, condensers, reformers, and reactors are presented. According to the results, the highest exergy destruction is attributed to the SOFC reactor, despite its acceptable exergy efficiency, which is equal to 75.7%. Moreover, the exergy efficiencies of the ORC cycle and the whole plant are determined to be 64.9% and 39.9%, respectively. It is worth noting that the rational efficiency of the integrated power system is 53.5%. Among all units, the exergy efficiency of the LNG pump is determined to be 11.7%, the lowest exergy efficiency among the investigated components, indicating a great potential for improvement.

]]>Entropy doi: 10.3390/e20070483

Authors: Louis Kauffman

This paper reviews results about discrete physics and non-commutative worlds and explores further the structure and consequences of constraints linking classical calculus and discrete calculus formulated via commutators. In particular, we review how the formalism of generalized non-commutative electromagnetism follows from a first order constraint and how, via the Kilmister equation, relationships with general relativity follow from a second order constraint. It is remarkable that a second order constraint, based on interlacing the commutative and non-commutative worlds, leads to an equivalent tensor equation at the pole of geodesic coordinates for general relativity.

]]>Entropy doi: 10.3390/e20070482

Authors: Bin Pang Yuling He Guiji Tang Chong Zhou Tian Tian

The impulsive fault feature signal of rolling bearings at the early failure stage is easily contaminated by the fundamental frequency (i.e., the rotation frequency of the shaft) signal and background noise. To address this problem, this paper puts forward a rolling bearing weak fault diagnosis method with the combination of optimal notch filter and enhanced singular value decomposition. Firstly, in order to eliminate the interference of the fundamental frequency signal, the original signal was processed by the notch filter with the fundamental frequency as the center frequency and with a varying bandwidth to get a series of corresponding notch filter signals. Secondly, the Teager energy entropy index was adopted to adaptively determine the optimal bandwidth to complete the optimal notch filter analysis on the raw vibration signal and obtain the corresponding optimal notch filter signal. Thirdly, an enhanced singular value decomposition de-noising method was employed to de-noise the optimal notch filter signal. Finally, the envelope spectrum analysis was conducted on the de-noised signal to extract the fault characteristic frequencies. The effectiveness of the presented method was demonstrated via simulation and experiment verifications. In addition, the minimum entropy deconvolution, Kurtogram and Infogram methods were employed for comparisons to show the advantages of the presented method.

]]>Entropy doi: 10.3390/e20070481

Authors: Philip Tee George Parisis Luc Berthouze Ian Wakeman

Combinatoric measures of entropy capture the complexity of a graph but rely upon the calculation of its independent sets, or collections of non-adjacent vertices. This decomposition of the vertex set is a known NP-Complete problem and for most real-world graphs is an inaccessible calculation. Recent work by Dehmer et al. and Tee et al. identified a number of vertex level measures that do not suffer from this pathological computational complexity, but that can be shown to be effective at quantifying graph complexity. In this paper, we consider whether these local measures are fundamentally equivalent to global entropy measures. Specifically, we investigate the existence of a correlation between vertex level and global measures of entropy for a narrow subset of random graphs. We use the greedy algorithm approximation for calculating the chromatic information and therefore K&ouml;rner entropy. We are able to demonstrate strong correlation for this subset of graphs and outline how this may arise theoretically.
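The greedy approximation mentioned above can be sketched simply: colour vertices largest-degree first with the smallest free colour (an upper bound on the chromatic number), then take the entropy of the resulting colour-class partition. This is one simple vertex-partition entropy in the spirit of chromatic information measures; the exact definition used in the paper may differ.

```python
import math

def greedy_colouring(adj):
    """Colour vertices largest-degree first with the smallest free colour."""
    colour = {}
    for v in sorted(adj, key=lambda u: -len(adj[u])):
        used = {colour[u] for u in adj[v] if u in colour}
        c = 0
        while c in used:
            c += 1
        colour[v] = c
    return colour

def chromatic_information(adj):
    """Entropy (bits) of the colour-class size distribution of a greedy colouring."""
    colour = greedy_colouring(adj)
    n = len(colour)
    sizes = {}
    for c in colour.values():
        sizes[c] = sizes.get(c, 0) + 1
    return -sum(s / n * math.log2(s / n) for s in sizes.values())
```

A triangle needs three singleton colour classes (entropy log2(3)), while a star is 2-colourable with one highly unbalanced class.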

]]>Entropy doi: 10.3390/e20060480

Authors: Saran Chen Xin Lu Zhong Liu Zhongwei Jia

With the increasing use of online social networking platforms, online surveys are widely used in many fields, e.g., public health, business and sociology, to collect samples and to infer population characteristics through the self-reported data of respondents. Although online surveys can protect the privacy of respondents, self-reporting is challenged by a low response rate and unreliable answers when the survey contains sensitive questions, such as drug use, sexual behaviors, abortion or criminal activity. To overcome this limitation, this paper develops an approach that collects second-order information from the respondents, i.e., asking them about the characteristics of their friends instead of asking directly about their own characteristics. Then, we generate the inference about the population variable with the Hansen-Hurwitz estimator for the two classic sampling strategies (simple random sampling or random walk-based sampling). The method is evaluated by simulations on both artificial and real-world networks. Results show that the method is able to generate population estimates with high accuracy without knowing the respondents&rsquo; own characteristics, and the biases of estimates under various settings are relatively small and within acceptable limits. The new method offers an alternative way to implement surveys online and is expected to collect more reliable data with improved population inference on sensitive variables.
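
The inference step rests on the Hansen-Hurwitz estimator, which for a with-replacement sample inflates each observed value by the inverse of its selection probability. A generic sketch follows; the paper's specific handling of second-order (friend-reported) data is not reproduced here.

```python
def hansen_hurwitz(values, probs):
    """Hansen-Hurwitz estimator of a population total from a
    with-replacement sample: the mean of y_i / p_i over the draws."""
    assert len(values) == len(probs)
    return sum(y / p for y, p in zip(values, probs)) / len(values)

# population of N = 5 units with trait indicators; true total = 3
N = 5
y = [1, 0, 1, 1, 0]
# under simple random sampling with replacement each draw has p_i = 1/N;
# "sampling" every unit exactly once then reproduces the true total
est_total = hansen_hurwitz(y, [1.0 / N] * N)
```

With unequal selection probabilities (e.g., degree-proportional, as in random walk-based sampling), the same formula remains unbiased as long as each p_i is known.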

]]>Entropy doi: 10.3390/e20060479

Authors: Yongqi Wang Kolumban Hutter

n/a

]]>Entropy doi: 10.3390/e20060478

Authors: Daniel Rohrlich Guy Hetzroni

We review an argument that bipartite &ldquo;PR-box&rdquo; correlations, though designed to respect relativistic causality, in fact violate relativistic causality in the classical limit. As a test of this argument, we consider Greenberger&ndash;Horne&ndash;Zeilinger (GHZ) correlations as a tripartite version of PR-box correlations, and ask whether the argument extends to GHZ correlations. If it does&mdash;i.e., if it shows that GHZ correlations violate relativistic causality in the classical limit&mdash;then the argument must be incorrect (since GHZ correlations do respect relativistic causality in the classical limit.) However, we find that the argument does not extend to GHZ correlations. We also show that both PR-box correlations and GHZ correlations can be retrocausal, but the retrocausality of PR-box correlations leads to self-contradictory causal loops, while the retrocausality of GHZ correlations does not.

]]>Entropy doi: 10.3390/e20060477

Authors: Alejandro Ramírez-Rojas Elsa Leticia Flores-Márquez Nicholas V. Sarlis Panayiotis A. Varotsos

We analyse seismicity during the 6-year period 2012&ndash;2017 in the Chiapas region of Mexico, where the M8.2 earthquake, the country&rsquo;s largest earthquake in more than a century, occurred, in the new time domain termed natural time, in order to study the complexity measures associated with fluctuations of entropy as well as with entropy change under time reversal. We find that almost three months before the M8.2 earthquake, i.e., on 14 June 2017, the complexity measure associated with the fluctuations of the entropy change under time reversal shows an abrupt increase, which, however, does not hold for the complexity measure associated with the fluctuations of entropy in forward time. On the same date, the entropy change under time reversal has been previously found to exhibit a minimum [Physica A 506, 625&ndash;634 (2018)]; we thus find here that this minimum is also accompanied by increased fluctuations of the entropy change under time reversal. In addition, we find a simultaneous increase of the Tsallis entropic index q.
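
The entropy in natural time and its change under time reversal can be sketched directly from the standard definitions of the natural time literature (S = &lt;chi ln chi&gt; &minus; &lt;chi&gt; ln&lt;chi&gt;, with chi_k = k/N and averages weighted by the normalized event energies); this is a generic illustration, not the authors' code.

```python
import math

def natural_time_entropy(energies):
    """S = <chi ln chi> - <chi> ln<chi>, with chi_k = k/N and
    averages weighted by p_k = Q_k / sum(Q)."""
    N = len(energies)
    total = sum(energies)
    p = [q / total for q in energies]
    chi = [(k + 1) / N for k in range(N)]
    mean_chi = sum(pk * ck for pk, ck in zip(p, chi))
    mean_chi_ln = sum(pk * ck * math.log(ck) for pk, ck in zip(p, chi))
    return mean_chi_ln - mean_chi * math.log(mean_chi)

def entropy_change_under_time_reversal(energies):
    """Delta S = S - S_reversed, where S_reversed reverses the event order."""
    return natural_time_entropy(energies) - natural_time_entropy(energies[::-1])
```

A uniform energy sequence is symmetric under reversal, so its Delta S vanishes; a sequence that concentrates energy late in the window gives a nonzero Delta S, which is the kind of asymmetry the complexity measures quantify.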

]]>Entropy doi: 10.3390/e20060476

Authors: Ming Li Wendan Wei Jialin Wang Xiaoyu Qi

Accounting informatization is an important part of enterprise informatization, and it affects the operational efficiency of accounting and finance. With a comprehensive evaluation of accounting informatization from multiple aspects, the strengths and weaknesses of corporate accounting informatization can be found and then improved precisely. In this paper, an evaluation approach for accounting informatization is proposed. Firstly, the evaluation index system is constructed from the aspects of strategic position, infrastructure construction, implementation of accounting informatization, informatization guarantee, and application efficiency. Considering the complexity and ambiguity of the indexes, experts are required to give linguistic ratings, which are then converted into intuitionistic fuzzy numbers. Then, an entropy and cross-entropy method based on intuitionistic fuzzy sets is proposed to derive the weights of the experts, so as to reduce the error caused by personal bias. By combining the index weights and the weighted ratings, the evaluation results are obtained. Finally, a case of accounting informatization evaluation is given to illustrate the feasibility of the proposed approach.

]]>Entropy doi: 10.3390/e20060475

Authors: Alejandro Linares-Barranco Hongjie Liu Antonio Rios-Navarro Francisco Gomez-Rodriguez Diederik P. Moeys Tobi Delbruck

Taking inspiration from biology to solve engineering problems using the organizing principles of biological neural computation is the aim of the field of neuromorphic engineering. This field has demonstrated success in sensor-based applications (vision and audition) as well as in cognition and actuators. This paper is focused on mimicking the approaching detection functionality of the retina that is computed by one type of Retinal Ganglion Cell (RGC), and on its application to robotics. These RGCs transmit action potentials when an expanding object is detected. In this work we compare the software and hardware logic FPGA implementations of this approaching function, as well as the hardware latency when applied to robots as an attention/reaction mechanism. The visual input for these cells comes from an asynchronous event-driven Dynamic Vision Sensor, which leads to an end-to-end event-based processing system. The software model has been developed in Java, and computed with an average processing time per event of 370 ns on a NUC embedded computer. The output firing rate for an approaching object depends on the cell parameters that represent the needed number of input events to reach the firing threshold. For the hardware implementation, on a Spartan 6 FPGA, the processing time is reduced to 160 ns/event with the clock running at 50 MHz. The entropy has been calculated to demonstrate that the system is not totally deterministic in response to approaching objects because of several bioinspired characteristics. It has been measured that a Summit XL mobile robot can react to an approaching object in 90 ms, which can be used as an attentional mechanism. This is faster than similar event-based approaches in robotics and equivalent to human reaction latencies to visual stimuli.

]]>Entropy doi: 10.3390/e20060474

Authors: Thomas Filk

The article describes an interpretation of the mathematical formalism of standard quantum mechanics in terms of relations. In particular, the wave function &psi;(x) is interpreted as a complex-valued relation between an entity (often called &ldquo;particle&rdquo;) and a second entity x (often called &ldquo;spatial point&rdquo;). Such complex-valued relations can also be formulated for classical physical systems. Entanglement is interpreted as a relation between two entities (particles or properties of particles). Such relations define the concept of &ldquo;being next to each other&rdquo;, which implies that entangled entities are close to each other, even though they might appear to be far away with respect to a classical background space. However, when space is also considered to be a network of relations (of which the classical background space is a large-scale continuum limit), such nearest neighbor configurations are possible. The measurement problem is discussed from the perspective of this interpretation. It should be emphasized that this interpretation is not meant to be a serious attempt to describe the ontology of our world, but its purpose is to make it obvious that, besides Bohmian mechanics, presumably many other ontological interpretations of quantum theory exist.

]]>Entropy doi: 10.3390/e20060473

Authors: Claudia Zander Angel Ricardo Plastino

We revisit the concept of entanglement within the Bohmian approach to quantum mechanics. Inspired by Bohmian dynamics, we introduce two partial measures for the amount of entanglement corresponding to a pure state of a pair of quantum particles. One of these measures is associated with the statistical correlations exhibited by the joint probability density of the two Bohmian particles in configuration space. The other partial measure corresponds to the correlations associated with the phase of the joint wave function, and describes the non-separability of the Bohmian velocity field. The sum of these two components is equal to the total entanglement of the joint quantum state, as measured by the linear entropy of the single-particle reduced density matrix.
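
The total entanglement mentioned in the closing sentence is the linear entropy of the single-particle reduced density matrix. For a two-qubit pure state this is a few lines of code; the sketch below is generic and, in particular, is not the authors' Bohmian decomposition into statistical and phase components.

```python
def linear_entropy_two_qubits(psi):
    """Linear entropy S_L = 1 - Tr(rho_A^2) of the reduced state of
    particle A, for amplitudes psi = (c00, c01, c10, c11)."""
    c = [[psi[0], psi[1]],   # c[i][j] = amplitude of |i>_A |j>_B
         [psi[2], psi[3]]]
    # partial trace over B: (rho_A)_{i,k} = sum_j c[i][j] * conj(c[k][j])
    rho = [[sum(c[i][j] * c[k][j].conjugate() for j in range(2))
            for k in range(2)] for i in range(2)]
    purity = sum(rho[i][k] * rho[k][i] for i in range(2) for k in range(2))
    return 1.0 - purity.real

# the product state |00> is unentangled (S_L = 0); the Bell state
# (|00> + |11>)/sqrt(2) is maximally entangled for this measure (S_L = 1/2)
product = (1.0, 0.0, 0.0, 0.0)
bell = (2 ** -0.5, 0.0, 0.0, 2 ** -0.5)
```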

]]>Entropy doi: 10.3390/e20060472

Authors: Jan Naudts

Quantum information geometry studies families of quantum states by means of differential geometry. A new approach is followed with the intention to facilitate the introduction of a more general theory in subsequent work. To this purpose, the emphasis is shifted from a manifold of strictly positive density matrices to a manifold of faithful quantum states on the C*-algebra of bounded linear operators. In addition, ideas from the parameter-free approach to information geometry are adopted. The underlying Hilbert space is assumed to be finite-dimensional. In this way, technicalities are avoided so that strong results are obtained, which one can hope to prove later on in a more general context. Two different atlases are introduced, one in which it is straightforward to show that the quantum states form a Banach manifold, the other which is compatible with the inner product of Bogoliubov and which yields affine coordinates for the exponential connection.

]]>Entropy doi: 10.3390/e20060471

Authors: Fanrong Meng Xiaobin Rui Zhixiao Wang Yan Xing Longbing Cao

Attributed networks consist of not only a network structure but also node attributes. Most existing community detection algorithms focus only on network structures and ignore node attributes, which are also important. Although some algorithms using both node attributes and network structure information have been proposed in recent years, the complex hierarchical coupling relationships within and between attributes, nodes and network structure have not been considered. Such hierarchical couplings are driving factors in community formation. This paper introduces a novel coupled node similarity (CNS) to involve and learn attribute and structure couplings and compute the similarity within and between nodes with categorical attributes in a network. CNS learns and integrates the frequency-based intra-attribute coupled similarity within an attribute, the co-occurrence-based inter-attribute coupled similarity between attributes, and the coupled attribute-to-structure similarity based on the homophily property. CNS is then used to generate the weights of edges and transform a plain graph into a weighted graph. Clustering algorithms then detect community structures that are topologically well-connected and semantically coherent on the weighted graphs. Extensive experiments on several data sets verify the effectiveness of CNS-based community detection algorithms by comparing them with state-of-the-art node similarity measures, both those that do and those that do not involve node attribute information and hierarchical interactions, across various levels of network structure complexity.

]]>Entropy doi: 10.3390/e20060470

Authors: Ting Yang Shujun Liu Wenguo Liu Jishun Guo Pin Wang

In this paper, a noise enhanced binary hypothesis-testing problem is studied for a variable detector under certain constraints, in which the detection probability can be increased and the false-alarm probability decreased simultaneously. According to the constraints, three alternative cases are proposed: the first two concern minimization of the false-alarm probability and maximization of the detection probability, respectively, without deterioration of one by the other, while the third is achieved by a randomization of the two optimal noise enhanced solutions obtained in the first two limit cases. Furthermore, the noise enhanced solutions that satisfy the three cases are determined both when randomization between different detectors is allowed and when it is not. In addition, the practicality of the third case is proven from the perspective of Bayes risk. Finally, numerous examples and conclusions are presented.

]]>Entropy doi: 10.3390/e20060469

Authors: Gerardo Valadez Huerta Vincent Flasbart Tobias Marquardt Pablo Radici Stephan Kabelac

The calculation of the entropy production rate within an operational high temperature solid oxide fuel cell (SOFC) is necessary to design and improve heating and cooling strategies. However, due to a lack of information, most of the studies are limited to empirical relations, which are not in line with the more general approach given by non-equilibrium thermodynamics (NET). The SOFC 1D-model presented in this study is based on non-equilibrium thermodynamics and we parameterize it with experimental data and data from molecular dynamics (MD). The validation of the model shows that it can effectively describe the behavior of a SOFC at 1300&nbsp;K. Moreover, we show that the highest entropy production is present in the electrolyte and the catalyst layers, and that the Peltier heat transfer is considerable for the calculation of the heat flux in the electrolyte and cannot be neglected. To our knowledge, this is the first validated model of a SOFC based on non-equilibrium thermodynamics, and this study can be extended to analyze SOFCs with other solid oxide electrolytes, with perovskite electrolytes, or even other electrochemical systems like solid oxide electrolysis cells (SOECs).

]]>Entropy doi: 10.3390/e20060468

Authors: Jonathan N. Blakely Marko S. Milosavljevic Ned J. Corron

Chaotic evolution is generally too irregular to be captured in an analytic solution. Nonetheless, some dynamical systems do have such solutions, enabling more rigorous analysis than can be achieved with numerical solutions. Here, we introduce a method of coupling solvable chaotic oscillators that maintains solvability. In fact, an analytic solution is given for an entire network of coupled oscillators. Importantly, a valid chaotic solution is shown even when the coupling topology is complex and the population of oscillators is heterogeneous. We provide a specific example of a solvable chaotic network with star topology and a hub that oscillates much faster than its leaves. We present analytic solutions as the coupling strength is varied, showing states of varying degrees of global organization. The covariance of the network is derived explicitly from the analytic solution, characterizing the degree of synchronization across the network as the coupling strength varies. This example suggests that analytic solutions may constitute a new tool in the study of chaotic network dynamics generally.

]]>Entropy doi: 10.3390/e20060467

Authors: Jaume del Olmo Alos Javier Rodríguez Fonollosa

Asymptotic secrecy-capacity achieving polar coding schemes are proposed for the memoryless degraded broadcast channel under different reliability and secrecy requirements: layered decoding or layered secrecy. In these settings, the transmitter wishes to send multiple messages to a set of legitimate receivers keeping them masked from a set of eavesdroppers. The layered decoding structure requires receivers with better channel quality to reliably decode more messages, while the layered secrecy structure requires eavesdroppers with worse channel quality to be kept ignorant of more messages. Practical constructions for the proposed polar coding schemes are discussed and their performance evaluated by means of simulations.

]]>Entropy doi: 10.3390/e20060466

Authors: Dennis Dieks

A consensus seems to have developed that the Gibbs paradox in classical thermodynamics (the discontinuous drop in the entropy of mixing when the mixed gases become equal to each other) is unmysterious: in any actual situation, two gases can either be separated or not, and the associated harmless discontinuity from &ldquo;yes&rdquo; to &ldquo;no&rdquo; is responsible for the discontinuity in the entropy of mixing. By contrast, the Gibbs paradox in statistical physics continues to attract attention. Here, the problem is that standard calculations in statistical mechanics predict a non-vanishing value of the entropy of mixing even when two gases of the same kind are mixed, in conflict with thermodynamic predictions. This version of the Gibbs paradox is often seen as a sign that there is something fundamentally wrong with either the traditional expression S = k ln W or with the way W is calculated. It is the aim of this article to review the situation from the orthodox (as opposed to information theoretic) standpoint. We demonstrate how the standard formalism is not only fully capable of dealing with the paradox, but also provides an intuitively clear picture of the relevant physical mechanisms. In particular, we pay attention to the explanatory relevance of the existence of particle trajectories in the classical context. We also discuss how the paradox survives the transition to quantum mechanics, in spite of the symmetrization postulates.
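
For orientation, the quantitative content of the paradox can be stated with standard textbook counting (not quoted from this article): mixing two distinguishable ideal gases of N particles each, initially occupying volumes V, into a joint volume 2V gives

```latex
\Delta S_{\mathrm{mix}}
\;=\; k\ln\frac{W_{\mathrm{mixed}}}{W_{\mathrm{separate}}}
\;=\; k\ln 2^{2N}
\;=\; 2Nk\ln 2 ,
```

since each particle's accessible volume doubles. Naive classical counting yields the same value even when the two gases are identical, in conflict with thermodynamics; dividing each W by N! (correct Boltzmann counting) and applying Stirling's formula makes the mixing entropy of two samples of the same gas vanish.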

]]>Entropy doi: 10.3390/e20060465

Authors: Tim Maudlin

Quantum physics demands some radical revision of our fundamental beliefs about physical reality. We know that because there are certain verified physical phenomena&mdash;two-slit interference, the disappearance of interference upon monitoring, violations of Bell&rsquo;s inequality&mdash;that have no classical analogs. But the exact nature of that revision has been under dispute since the foundation of quantum theory. I offer a method of clarifying what the commitments of a clearly formulated physical theory are, and apply it to a discussion of some options available to account for another non-classical phenomenon: the Aharonov&ndash;Bohm effect.

]]>Entropy doi: 10.3390/e20060464

Authors: Marianthi Markatou Yang Chen

One natural way to measure model adequacy is by using statistical distances as loss functions. A related fundamental question is how to construct loss functions that are scientifically and statistically meaningful. In this paper, we investigate non-quadratic distances and their role in assessing the adequacy of a model and/or in performing model selection. We first present the definition of a statistical distance and its associated properties. Three popular distances, total variation, the mixture index of fit and the Kullback-Leibler distance, are studied in detail, with the aim of understanding their properties and potential interpretations that can offer insight into their performance as measures of model misspecification. A small simulation study exemplifies the performance of these measures, and their application to different scientific fields is briefly discussed.
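
Two of the three distances studied, total variation and Kullback-Leibler, are simple to compute for discrete distributions; a minimal sketch (the mixture index of fit requires an optimization and is omitted):

```python
import math

def total_variation(p, q):
    """Total variation distance between discrete distributions:
    (1/2) * sum |p_i - q_i|; lies in [0, 1]."""
    return 0.5 * sum(abs(pi - qi) for pi, qi in zip(p, q))

def kullback_leibler(p, q):
    """KL divergence sum p_i log(p_i / q_i), with 0 log 0 := 0;
    infinite when q_i = 0 but p_i > 0."""
    total = 0.0
    for pi, qi in zip(p, q):
        if pi > 0.0:
            if qi == 0.0:
                return math.inf
            total += pi * math.log(pi / qi)
    return total

model = [0.5, 0.5]   # hypothesized model
data  = [0.8, 0.2]   # empirical distribution
tv = total_variation(data, model)
kl = kullback_leibler(data, model)
```

Both vanish exactly when the model matches the data, which is the defining property of a statistical distance used as a loss.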

]]>Entropy doi: 10.3390/e20060463

Authors: Shouliang Li Weikang Ding Benshun Yin Tongfeng Zhang Yide Ma

With the popularity of the Internet, the transmission of images has become more frequent, so it is of great significance to study efficient and secure image encryption algorithms. Based on the traditional Logistic map and the consideration of delay, we propose a new one-dimensional (1D) delay and linearly coupled Logistic chaotic map (DLCL) in this paper. Time delay is a common phenomenon in various complex systems in nature, and it greatly changes the dynamic characteristics of a system. The map is analyzed in terms of trajectory, Lyapunov exponent (LE) and permutation entropy (PE). The results show that this map has a wide chaotic range, better ergodicity and a larger maximum LE in comparison with some existing chaotic maps. A new method of color image encryption is put forward based on the DLCL. Various analyses show that the proposed encryption algorithm has good encryption performance, and the key used for scrambling is related to the original image. Simulation results illustrate that the ciphered images produced by our method have good pseudo-randomness. The proposed encryption algorithm has a large key space and can effectively resist differential attack and chosen plaintext attack.
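
Permutation entropy, one of the diagnostics applied to the map, can be sketched with Bandt-Pompe ordinal patterns. Since the abstract does not give the DLCL formula, the fully chaotic Logistic map x_{n+1} = 4x_n(1 &minus; x_n) stands in as the test signal:

```python
import math
from collections import Counter

def perm_entropy(series, order=3):
    """Normalized permutation entropy: Shannon entropy of the ordinal
    patterns of length `order`, divided by log(order!); lies in [0, 1]."""
    patterns = Counter()
    for i in range(len(series) - order + 1):
        window = series[i:i + order]
        # ordinal pattern: ranks of the values within the window
        pattern = tuple(sorted(range(order), key=window.__getitem__))
        patterns[pattern] += 1
    total = sum(patterns.values())
    h = -sum((c / total) * math.log(c / total) for c in patterns.values())
    return h / math.log(math.factorial(order))

# fully chaotic Logistic map as a stand-in signal (bounded in (0, 1))
x, traj = 0.3, []
for _ in range(5000):
    x = 4.0 * x * (1.0 - x)
    traj.append(x)
```

A monotone signal produces a single ordinal pattern (PE = 0), while a chaotic trajectory spreads mass over many patterns, which is why PE serves as a complexity indicator for the map.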

]]>Entropy doi: 10.3390/e20060462

Authors: Roderich Tumulka

The biggest and most lasting among David Bohm&rsquo;s (1917&ndash;1992) many achievements is to have proposed a picture of reality that explains the empirical rules of quantum mechanics. This picture, known as pilot wave theory or Bohmian mechanics among other names, is still the simplest and most convincing explanation available. According to this theory, electrons are point particles in the literal sense and move along trajectories governed by Bohm&rsquo;s equation of motion. In this paper, I describe some more recent developments and extensions of Bohmian mechanics, concerning in particular relativistic space-time and particle creation and annihilation.

]]>Entropy doi: 10.3390/e20060461

Authors: Yijun Wang Xudong Wang Duan Huang Ying Guo

We show that a noiseless linear amplifier (NLA) can be placed properly at the receiver&rsquo;s end to improve the performance of self-referenced (SR) continuous variable quantum key distribution (CV-QKD) when the reference pulses are weak. In SR CV-QKD, the imperfections of the amplitude modulator limit the maximal amplitude of the reference pulses, while the performance of SR CV-QKD is positively related to the amplitude of the reference pulses. An NLA can compensate for the impact of the large phase noise introduced by the weak reference pulses. Simulation results derived from collective attacks show that this scheme can improve the performance of SR CV-QKD with weak reference pulses, in terms of extending the maximum transmission distance. An NLA with a gain of g can increase the maximum transmission distance by the equivalent of 20 log10(g) dB of losses.

]]>Entropy doi: 10.3390/e20060460

Authors: George Ruppeiner

Black holes pose great difficulties for theory since gravity and quantum theory must be combined in some as yet unknown way. An additional difficulty is that detailed black hole observational data to guide theorists is lacking. In this paper, I sidestep the difficulties of combining gravity and quantum theory by employing black hole thermodynamics augmented by ideas from the information geometry of thermodynamics. I propose a purely thermodynamic agenda for choosing correct candidate black hole thermodynamic scaled equations of state, parameterized by two exponents. These two adjustable exponents may be set to accommodate additional black hole information, either from astrophysical observations or from some microscopic theory, such as string theory. My approach assumes implicitly that the as yet unknown microscopic black hole constituents have strong effective interactions between them, of a type found in critical phenomena. In this picture, the details of the microscopic interaction forces are not important, and the essential macroscopic picture emerges from general assumptions about the number of independent thermodynamic variables, types of critical points, boundary conditions, and analyticity. I use the simple Kerr and Reissner-Nordstr&ouml;m black holes for guidance, and find candidate equations of state that embody several of the features of these purely gravitational models. My approach may offer a productive new way to select black hole thermodynamic equations of state representing both gravitational and quantum properties.

]]>Entropy doi: 10.3390/e20060459

Authors: Yuli Zhao Yin Zhang Bin Zhang Kening Gao Pengfei Li

Since complex search tasks are usually divided into subtasks, providing subtask-oriented query recommendations is an effective way to support complex search tasks. Currently, most subtask-oriented query recommendation methods extract subtasks from plain-form search logs consisting of only queries and clicks, providing limited clues for identifying subtasks. Meanwhile, for several decades, the Computer Human Interface (CHI)/Human Computer Interaction (HCI) communities have been working on new complex search tools for the purpose of supporting rich user interactions beyond just queries and clicks, and thus providing rich-form search logs with more clues for subtask identification. In this paper, we research the provision of subtask-oriented query recommendations by extracting thematic experiences from the rich-form search logs of complex search tasks logged in a proposed visual data structure. We introduce the tree structure of the visual data structure and propose a visual-based subtask identification method based on it. We then introduce a personalized PageRank-based method to recommend queries by ranking nodes on the network derived from the identified subtasks. We evaluated the proposed methods in experiments consisting of informative and tentative search tasks.
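
The ranking step can be illustrated with a bare-bones personalized PageRank power iteration on a toy graph (the node and edge semantics of the paper's subtask network are assumptions here):

```python
def personalized_pagerank(adj, seed, alpha=0.3, iters=100):
    """Power iteration for personalized PageRank: with probability alpha
    restart at `seed`, otherwise follow a random out-edge.
    Assumes every node has at least one out-edge."""
    nodes = list(adj)
    rank = {v: 1.0 / len(nodes) for v in nodes}
    for _ in range(iters):
        nxt = {v: (alpha if v == seed else 0.0) for v in nodes}
        for v in nodes:
            share = (1.0 - alpha) * rank[v] / len(adj[v])
            for u in adj[v]:
                nxt[u] += share
        rank = nxt
    return rank

# toy query graph: queries linked by a shared subtask, seeded at query "a"
graph = {"a": ["b"], "b": ["a", "c"], "c": ["b"]}
ppr = personalized_pagerank(graph, seed="a")
```

Ranking nodes by their PPR score then yields recommendations biased toward the seed's subtask neighborhood, which is the intuition behind using a personalized rather than global PageRank.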

]]>Entropy doi: 10.3390/e20060458

Authors: Gerhard Grössing Siegfried Fussy Johannes Mesa Pascasio Herbert Schwabl

In the quest for an understanding of nonlocality with respect to an appropriate ontology, we propose a &ldquo;cosmological solution&rdquo;. We assume that from the beginning of the universe each point in space has been the location of a scalar field representing a zero-point vacuum energy that nonlocally vibrates at a vast range of different frequencies across the whole universe. A quantum, then, is a nonequilibrium steady state in the form of a &ldquo;bouncer&rdquo; coupled resonantly to one of those (particle type dependent) frequencies, in remote analogy to the bouncing oil drops on an oscillating oil bath as in Couder&rsquo;s experiments. A major difference to the latter analogy is given by the nonlocal nature of the vacuum oscillations. We show with the examples of double- and n-slit interference that the assumed nonlocality of the distribution functions alone suffices to derive the de Broglie&ndash;Bohm guiding equation for N particles with otherwise purely classical means. In our model, no influences from configuration space are required, as everything can be described in 3-space. Importantly, the setting up of an experimental arrangement limits and shapes the forward and osmotic contributions and is described as vacuum landscaping.

]]>Entropy doi: 10.3390/e20060457

Authors: Michal Pavelka Václav Klika Miroslav Grmela

Landau damping is the tendency of solutions to the Vlasov equation towards spatially homogeneous distribution functions. The distribution functions, however, approach the spatially homogeneous manifold only weakly, and Boltzmann entropy is not changed by the Vlasov equation. On the other hand, density and kinetic energy density, which are integrals of the distribution function, approach spatially homogeneous states strongly, which is accompanied by growth of the hydrodynamic entropy. Such a behavior can be seen when the Vlasov equation is reduced to the evolution equations for density and kinetic energy density by means of the Ehrenfest reduction.

]]>Entropy doi: 10.3390/e20060456

Authors: Branko Ristic Jennifer L. Palmer

This short note addresses the problem of autonomous on-line path-planning for exploration and occupancy-grid mapping using a mobile robot. The underlying algorithm for simultaneous localisation and mapping (SLAM) is based on random-finite set (RFS) modelling of ranging sensor measurements, implemented as a Rao-Blackwellised particle filter. Path-planning in general must trade off between exploration (which reduces the uncertainty in the map) and exploitation (which reduces the uncertainty in the robot pose). In this note we propose a reward function based on the R&eacute;nyi divergence between the prior and the posterior densities, with RFS modelling of sensor measurements. This approach results in a joint map-pose uncertainty measure without a need to scale and tune their weights.
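
The reward function is built on the R&eacute;nyi divergence, which for discrete distributions p and q and order alpha &ne; 1 is D_alpha(p||q) = (alpha &minus; 1)^(-1) ln &Sigma; p_i^alpha q_i^(1&minus;alpha). A minimal sketch, with the RFS particle-filter densities of the paper replaced by illustrative histograms:

```python
import math

def renyi_divergence(p, q, alpha=0.5):
    """Renyi divergence of order alpha != 1 between discrete
    distributions; assumes alpha in (0, 1) so zero entries are safe."""
    s = sum(pi ** alpha * qi ** (1.0 - alpha) for pi, qi in zip(p, q))
    return math.log(s) / (alpha - 1.0)

# reward: divergence between prior and posterior occupancy distributions,
# here simple 4-bin stand-ins for the RFS densities
prior     = [0.25, 0.25, 0.25, 0.25]
posterior = [0.70, 0.10, 0.10, 0.10]
reward = renyi_divergence(posterior, prior)
```

The divergence is zero when a measurement teaches the robot nothing (posterior equals prior) and grows with the information gained, which is what makes it usable as a single joint map-pose reward.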

]]>Entropy doi: 10.3390/e20060455

Authors: Mingtao Ge Jie Wang Fangfang Zhang Ke Bai Xiangyang Ren

According to the dynamic characteristics of the rolling bearing vibration signal and the distribution characteristics of its noise, a fault identification method based on the adaptive filtering empirical wavelet transform (AFEWT) and a kernel density estimation mutual information (KDEMI) classifier is proposed. First, we use AFEWT to extract the features of the rolling bearing vibration signal. A hypothesis test of the Gaussian distribution is carried out for the sub-modes that are obtained by two successive EWT decompositions, and Gaussian noise is filtered out according to the test results. In this way, we can overcome the noise interference and avoid the mode selection problem when we extract the features of the signal. We then combine the advantages of kernel density estimation (KDE) and mutual information (MI) and put forward a KDEMI classifier. The mutual information between the probability density of the unknown signal&rsquo;s feature vector and the probability density of each known signal type is calculated, and the type of the unknown signal is determined via the value of the mutual information, so as to achieve fault identification of the rolling bearing. In order to verify the effectiveness of AFEWT in feature extraction, we extract signal features using three methods, AFEWT, EWT, and EMD, and then use the same classifier to identify the fault signals. Experimental results show that the fault signal has the highest recognition rate when AFEWT is used for feature extraction. At the same time, in order to verify the performance of the AFEWT-KDEMI method, we compare two classical fault signal identification methods, SVM and the BP neural network, with the AFEWT-KDEMI method. Through experimental analysis, we found that the AFEWT-KDEMI method is more stable and effective.

]]>Entropy doi: 10.3390/e20060454

Authors: Fabricio Toscano Daniel S. Tasca Łukasz Rudnicki Stephen P. Walborn

Uncertainty relations involving incompatible observables are one of the cornerstones of quantum mechanics. Aside from their fundamental significance, they play an important role in practical applications, such as detection of quantum correlations and security requirements in quantum cryptography. In continuous variable systems, the spectra of the relevant observables form a continuum and this necessitates the coarse graining of measurements. However, these coarse-grained observables do not necessarily obey the same uncertainty relations as the original ones, a fact that can lead to false results when considering applications. That is, one cannot naively replace the original observables in the uncertainty relation for the coarse-grained observables and expect consistent results. As such, several uncertainty relations that are specifically designed for coarse-grained observables have been developed. In recognition of the 90th anniversary of the seminal Heisenberg uncertainty relation, celebrated last year, and all the subsequent work since then, here we give a review of the state of the art of coarse-grained uncertainty relations in continuous variable quantum systems, as well as their applications to fundamental quantum physics and quantum information tasks. Our review is meant to be balanced in its content, since both theoretical considerations and experimental perspectives are put on an equal footing.

]]>Entropy doi: 10.3390/e20060453

Authors: Georg J. Schmitz

Different notions of entropy can be identified in different scientific communities: (i) the thermodynamic sense; (ii) the information sense; (iii) the statistical sense; (iv) the disorder sense; and (v) the homogeneity sense. Especially the &ldquo;disorder sense&rdquo; and the &ldquo;homogeneity sense&rdquo; relate to and require the notion of space and time. One of the few prominent examples relating entropy to both geometry and space is the Bekenstein-Hawking entropy of a Black Hole. Although this was developed for describing a physical object&mdash;a black hole&mdash;having a mass, a momentum, a temperature, an electrical charge, etc., absolutely no information about this object&rsquo;s attributes can ultimately be found in the final formulation. In contrast, the Bekenstein-Hawking entropy in its dimensionless form is a positive quantity only comprising geometric attributes such as an area A&mdash;the area of the event horizon of the black hole, a length LP&mdash;the Planck length, and a factor 1/4. A purely geometric approach to this formulation will be presented here. The approach is based on a continuous 3D extension of the Heaviside function which draws on the phase-field concept of diffuse interfaces. Entropy enters into the local and statistical description of contrast or gradient distributions in the transition region of the extended Heaviside function definition. The structure of the Bekenstein-Hawking formulation is ultimately derived for a geometric sphere based solely on geometric-statistical considerations.
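
For reference, the dimensionless Bekenstein-Hawking entropy discussed above takes the standard form (textbook convention, not quoted from the article):

```latex
\frac{S_{BH}}{k_B} \;=\; \frac{A}{4\,L_P^{2}},
\qquad
L_P \;=\; \sqrt{\frac{\hbar G}{c^{3}}}
```

so that only the horizon area A, the Planck length L_P and the factor 1/4 appear, which is exactly the purely geometric structure the article sets out to derive.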

]]>Entropy doi: 10.3390/e20060452

Authors: Ran Gao Xiaolong Yin Zhiping Li

In multiphase (≥3) equilibrium calculations, when the Newton method is used to solve the material balance (Rachford-Rice) equations, a poorly conditioned Jacobian can lead to false convergence. We present a robust successive substitution method that solves the multiphase Rachford-Rice equations sequentially using the method of bisection, while considering the monotonicity of the equations and the locations of the singular hyperplanes. Although this method is slower than the Newton solution, it does not rely on Jacobians that can become poorly conditioned, and it can therefore be inserted into Newton iterations upon detection of a poorly conditioned Jacobian. Testing shows that the embedded successive substitution steps effectively improve robustness, while the fast convergence of the Newton method is maintained.
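
For the familiar two-phase case, the bisection idea can be sketched as follows; this is a simplified illustration assuming the K-values straddle unity, not the authors' multiphase implementation:

```python
def rachford_rice_bisection(z, K, tol=1e-12, max_iter=200):
    """Solve the two-phase Rachford-Rice equation
        f(beta) = sum_i z_i * (K_i - 1) / (1 + beta * (K_i - 1)) = 0
    by bisection, exploiting that f is monotonically decreasing between
    its singular points (assumes max(K) > 1 > min(K))."""
    lo = 1.0 / (1.0 - max(K))  # left asymptote (negative)
    hi = 1.0 / (1.0 - min(K))  # right asymptote (> 1)

    def f(beta):
        return sum(zi * (Ki - 1.0) / (1.0 + beta * (Ki - 1.0))
                   for zi, Ki in zip(z, K))

    for _ in range(max_iter):
        mid = 0.5 * (lo + hi)
        if f(mid) > 0.0:  # f is decreasing: a positive value means beta is too small
            lo = mid
        else:
            hi = mid
        if hi - lo < tol:
            break
    return 0.5 * (lo + hi)
```

For an equimolar binary mixture with K-values (2, 1/2), the vapour fraction is 1/2 by symmetry, which the bisection recovers.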

]]>Entropy doi: 10.3390/e20060451

Authors: Ángel Sanz

Bohmian mechanics, widely known within the field of the quantum foundations, has been a quite useful resource for computational and interpretive purposes in a wide variety of practical problems. Here, it is used to establish a comparative analysis at different levels of approximation in the problem of the diffraction of helium atoms from a substrate consisting of a defect with axial symmetry on top of a flat surface. The motivation behind this work is to determine which aspects of one level survive in the next level of refinement and, therefore, to get a better idea of what we usually denote as quantum-classical correspondence. To this end, a quantum treatment of the problem is first performed with an approximate hard-wall model and then with a realistic interaction potential model. The interpretation and explanation of the features displayed by the corresponding diffraction intensity patterns are then revisited with a series of trajectory-based approaches: Fermatian trajectories (optical rays), Newtonian trajectories and Bohmian trajectories. As is seen, while Fermatian and Newtonian trajectories show some similarities, Bohmian trajectories behave quite differently due to their implicit non-classicality.

]]>Entropy doi: 10.3390/e20060450

Authors: Robert H. Swendsen

Two distinct puzzles, which are both known as Gibbs&rsquo; paradox, have interested physicists since they were first identified in the 1870s. They each have significance for the foundations of statistical mechanics and have led to lively discussions with a wide variety of suggested resolutions. Most proposed resolutions have involved quantum mechanics, although the original puzzles were entirely classical and were posed before quantum mechanics was invented. In this paper, I show that, contrary to what has often been suggested, quantum mechanics is not essential for resolving the paradoxes. I present a resolution of the paradoxes that does not depend on quantum mechanics and includes the case of colloidal solutions, for which quantum mechanics is not relevant.

]]>Entropy doi: 10.3390/e20060449

Authors: Anastasiia Bakhchina Karina Arutyunova Alexey Sozinov Alexander Demidovsky Yurii Alexandrov

Cardiac activity is involved in the processes of organization of goal-directed behaviour. Each behavioural act is aimed at achieving an adaptive outcome and it is subserved by the actualization of functional systems consisting of elements distributed across the brain and the rest of the body. This paper proposes a system-evolutionary view on the activity of the heart and its variability. We have compared the irregularity of the heart rate, as measured by sample entropy (SampEn), in behaviours that are subserved by functional systems formed at different stages of individual development, which implement organism-environment interactions with different degrees of differentiation. The results have shown that SampEn of the heart rate was higher while performing tasks that involved later-acquired knowledge (foreign language vs. native language; mathematical vocabulary vs. general vocabulary) and decreased in the stress and alcohol conditions, as well as at the beginning of learning. These results are in line with the hypothesis that irregularity of the heart rate reflects the properties of the set of functional systems subserving current behaviour, with higher irregularity corresponding to later-acquired and more complex behaviour.
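
The sample entropy measure used above admits a compact definition; the following is a generic textbook-style sketch (the parameters m and r are illustrative defaults, not the study's settings):

```python
import math

def sample_entropy(x, m=2, r=0.2):
    """SampEn(m, r) = -ln(A / B), where B counts pairs of length-m templates
    within Chebyshev distance r (self-matches excluded) and A counts the
    same for length m + 1. Lower values indicate a more regular series."""
    n = len(x)

    def count_pairs(mm):
        templates = [x[i:i + mm] for i in range(n - m)]
        c = 0
        for i in range(len(templates)):
            for j in range(i + 1, len(templates)):
                if max(abs(a - b) for a, b in zip(templates[i], templates[j])) <= r:
                    c += 1
        return c

    B = count_pairs(m)
    A = count_pairs(m + 1)
    return -math.log(A / B) if A > 0 and B > 0 else float('inf')
```

A perfectly periodic series yields SampEn = 0, while a series with no repeating templates yields an undefined (infinite) value.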

]]>Entropy doi: 10.3390/e20060448

Authors: Bing Li Mingliang Liu Zijian Guo Yamin Ji

The mechanical fault diagnosis results for high voltage circuit breakers (HVCBs) are mainly determined by the feature vector and classifier used. In order to obtain more distinctive signal characteristics and a robust classifier suitable for small-sample classification, a new mechanical fault diagnosis method is proposed in this paper. Firstly, the vibration signals of HVCBs are collected by a designed acquisition system, and the noise in the signals is eliminated by a soft-threshold de-noising method. Secondly, the empirical wavelet transform (EWT) is adopted to decompose the signals into a series of physically meaningful modes, and then the improved time-frequency entropy (ITFE) method is used to extract the characteristics of the vibration signals. Finally, a generalized regression neural network (GRNN) is used to identify four types of vibration signals of HVCBs, with the smoothing parameter &delta; of the GRNN optimized by a loop traversal method. The experimental results show that, using this optimal classifier, the proposed fault diagnosis method has good generalization performance, with a recognition rate above 95% on unknown samples, and the signal features obtained by the ITFE method are more significant than those of the traditional TFE method.
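
In its simplest generic form, a time-frequency entropy reduces to the Shannon entropy of a normalized energy distribution over time-frequency blocks; the sketch below shows this generic form, not the paper's improved ITFE:

```python
import math

def time_frequency_entropy(energies):
    """Shannon entropy (in nats) of an energy distribution over
    time-frequency blocks; higher values mean energy is spread more evenly."""
    total = sum(energies)
    p = [e / total for e in energies]  # normalize to a probability distribution
    return -sum(pi * math.log(pi) for pi in p if pi > 0)
```

Energy concentrated in one block gives zero entropy; a uniform spread over k blocks gives ln(k), the maximum.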

]]>Entropy doi: 10.3390/e20060447

Authors: Yubing Huang Wei Dai Lianxi Liu Yu Zhao

During the assembly process, there are inevitable variations and noise factors in the material properties, process parameters and screening scheme, which may affect the quality of the product. Using the stress-strength model, a method for evaluating screening schemes, based on analyzing the variation of the defect density during the assembly process, is proposed and discussed. The influence of screening stress on product defects is considered in order to determine the screening scheme. We perform a defect stream analysis by calculating the recursive relations of the residual defect density under multi-stress conditions. We find that the probability density function, which describes how defects change from latent to dominant over time, agrees very well with the historical data. We also calculate the risk as the entropy of the assembly task. Finally, we verify our method by analyzing the assembly process of a certain product.

]]>Entropy doi: 10.3390/e20060446

Authors: Wei Zhou Jin Chen Bingqing Ding

One important element of military supply transportation is concealment, especially during war preparations and warfare periods. By introducing entropy to calculate the degree of transportation concealment, we investigate the problem of concealed military supply transportation over a whole road network and propose an optimal flow distribution model. The model&rsquo;s objective function is to maximize the concealment of military supply transportation. After analyzing the road network, classifying the different nodes, summarizing the constraint conditions based on the properties and assumptions of the transportation process, and combining the general parameter limits, the optimal flow distribution model is further transformed into a calculable non-linear programming model. Based on this non-linear programming model, we can obtain the optimal distribution scheme of military supply transportation from the perspectives of network analysis and concealment measurement. Lastly, an example of military supply transportation in Jiangsu province, China is presented to demonstrate the feasibility of the proposed model. The managerial implication is that, by utilizing the proposed flow distribution model, military supplies can be efficiently transported to the required destinations while maximizing the concealment degree. Not only can this model be utilized in real military supply transportation, it can also be applied in other transportation fields that require time efficiency and concealment.

]]>Entropy doi: 10.3390/e20060445

Authors: Chunlei Fan Qun Ding

In this paper, a novel image encryption scheme is proposed for the secure transmission of image data. A self-synchronous chaotic stream cipher is designed to resist active attacks and to ensure limited error propagation in the image data. The two-dimensional discrete wavelet transform and Arnold mapping are used to scramble the pixel values of the original image. A four-dimensional hyperchaotic system with four positive Lyapunov exponents serves as the chaotic sequence generator of the self-synchronous stream cipher in order to enhance the security and complexity of the image encryption system. Finally, simulation results show that this image encryption scheme is both reliable and secure.
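
The Arnold mapping used for scrambling acts on pixel coordinates; a minimal sketch of one iteration on an n x n image, using the standard cat map in its simplest form (the paper may use a generalized variant), is:

```python
def arnold_map(x, y, n):
    """One iteration of the Arnold cat map on an n x n pixel grid:
    (x, y) -> ((x + y) mod n, (x + 2y) mod n). The map is a bijection,
    so repeated application scrambles pixel positions without loss."""
    return (x + y) % n, (x + 2 * y) % n
```

Because the map is periodic on a finite grid, descrambling can simply iterate the map until the period completes.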

]]>Entropy doi: 10.3390/e20060444

Authors: Mirko Polato Ivano Lauriola Fabio Aiolli

Kernel-based classifiers, such as the SVM, are considered state-of-the-art algorithms and are widely used on many classification tasks. However, such methods are hardly interpretable and for this reason are often considered black-box models. In this paper, we propose a new family of Boolean kernels for categorical data, in which features correspond to propositional formulas applied to the input variables. The idea is to create human-readable features to ease the extraction of interpretation rules directly from the embedding space. Experiments on artificial and benchmark datasets show the effectiveness of the proposed family of kernels with respect to established ones, such as the RBF kernel, in terms of classification accuracy.
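
One concrete member of such a Boolean-kernel family, given here as an illustrative sketch rather than the paper's exact construction, is a monotone conjunctive kernel: it counts the conjunctions of d positive literals satisfied by both binary input vectors, which can be computed in closed form from the dot product:

```python
from math import comb

def monotone_conjunctive_kernel(x, y, d):
    """k(x, y) = C(<x, y>, d): the number of monotone conjunctions of
    d positive literals that are true in both binary input vectors."""
    dot = sum(a & b for a, b in zip(x, y))  # literals active in both inputs
    return comb(dot, d)
```

Each implicit feature is a readable formula such as "x1 AND x4", which is what makes rule extraction from the embedding space feasible.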

]]>Entropy doi: 10.3390/e20060443

Authors: Olivier Darrigol

This article is a detailed history of the Gibbs paradox, with philosophical morals. It purports to explain the origins of the paradox, to describe and criticize solutions of the paradox from the early times to the present, to use the history of statistical mechanics as a reservoir of ideas for clarifying foundations and removing prejudices, and to relate the paradox to broad misunderstandings of the nature of physical theory.

]]>Entropy doi: 10.3390/e20060442

Authors: Jack Jewson Jim Q. Smith Chris Holmes

When it is acknowledged that all candidate parameterised statistical models are misspecified relative to the data generating process, the decision maker (DM) must currently concern themselves with inference for the parameter value minimising the Kullback&ndash;Leibler (KL)-divergence between the model and this process (Walker, 2013). However, it has long been known that minimising the KL-divergence places a large weight on correctly capturing the tails of the sample distribution. As a result, the DM is required to worry about the robustness of their model to tail misspecifications if they want to conduct principled inference. In this paper we alleviate these concerns for the DM. We advance recent methodological developments in general Bayesian updating (Bissiri, Holmes &amp; Walker, 2016) to propose a statistically well-principled Bayesian updating of beliefs targeting the minimisation of more general divergence criteria. We improve both the motivation and the statistical foundations of existing Bayesian minimum divergence estimation (Hooker &amp; Vidyashankar, 2014; Ghosh &amp; Basu, 2016), allowing the well-principled Bayesian to target predictions from the model that are close to the genuine model in terms of some alternative divergence measure to the KL-divergence. Our principled formulation allows us to consider a broader range of divergences than have previously been considered. In fact, we argue that defining the divergence measure forms an important, subjective part of any statistical analysis, and we aim to provide some decision-theoretic rationale for this selection. We illustrate how targeting alternative divergence measures can impact the conclusions of simple inference tasks, and then discuss how our methods might apply to more complicated, high-dimensional models.
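
On a discrete parameter grid, the general Bayesian update of Bissiri, Holmes &amp; Walker takes the form posterior proportional to prior times exp(-w * loss); replacing the negative log-likelihood loss with a loss derived from an alternative divergence gives the targeting described above. A minimal sketch (the grid, weight and losses are illustrative):

```python
import math

def general_bayes_update(prior, losses, w=1.0):
    """General Bayesian (Gibbs) update on a discrete parameter grid:
    posterior_i is proportional to prior_i * exp(-w * loss_i),
    where loss_i is the cumulative loss at grid point i."""
    unnorm = [p * math.exp(-w * l) for p, l in zip(prior, losses)]
    z = sum(unnorm)
    return [u / z for u in unnorm]
```

With w = 1 and the negative log-likelihood as loss, this reduces to the usual Bayes rule; other losses target other divergence criteria.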

]]>Entropy doi: 10.3390/e20060441

Authors: Ingrid Rotter

The aim of this paper is to study the question of whether or not equilibrium states exist in open quantum systems that are embedded in at least two environments and are described by a non-Hermitian Hamilton operator H. The eigenfunctions of H contain the influence of exceptional points (EPs) and external mixing (EM) of the states via the environment. As a result, equilibrium states exist (far from EPs). They are different from those of the corresponding closed system. Their wavefunctions are orthogonal even though the Hamiltonian is non-Hermitian.

]]>Entropy doi: 10.3390/e20060440

Authors: Oliver Passon

We discuss a common misconception regarding the de Broglie&ndash;Bohm (dBB) theory; namely, that it not only assigns a position to each quantum object but also contains the momenta as &ldquo;hidden variables&rdquo;. Sometimes this alleged property of the theory is even used to argue that the dBB theory is inconsistent with quantum theory. We explain why this claim is unfounded and show in particular how this misconception veils the true novelty of the dBB theory.

]]>Entropy doi: 10.3390/e20060439

Authors: Hongzhi Zhang Xiao Liang Guangluan Xu Kun Fu Feng Li Tinglei Huang

Automatic question answering (QA), which can greatly facilitate the access to information, is an important task in artificial intelligence. Recent years have witnessed the development of QA methods based on deep learning. However, a great amount of data is needed to train deep neural networks, and it is laborious to annotate training data for factoid QA of new domains or languages. In this paper, a distantly supervised method is proposed to automatically generate QA pairs. Additional effort is made to let the generated questions reflect the query interests and expression styles of users by exploring community QA. Specifically, the generated questions are selected according to the estimated probabilities that they are asked. Diverse paraphrases of questions are mined from community QA data, considering that a model trained on monotonous synthetic questions is very sensitive to variants of question expressions. Experimental results show that the model solely trained on data generated via distant supervision and mined paraphrases could answer real-world questions with an accuracy of 49.34%. When limited annotated training data is available, significant improvements can be achieved by incorporating the generated data. An improvement of 1.35 absolute points is still observed on WebQA, a dataset with large-scale annotated training samples.

]]>Entropy doi: 10.3390/e20060438

Authors: Tatsuaki Tsuruyama

Kullback&ndash;Leibler divergence (KLD) is a type of extended mutual entropy, used as a measure of information gain when moving from a prior distribution to a posterior distribution. In this study, KLD is applied to the thermodynamic analysis of cell signal transduction cascades and serves as an alternative to mutual entropy. When KLD is minimized, the divergence is given by the ratio of the prior selection probability of the signaling molecule to the posterior selection probability. Moreover, the information gain over the entire channel is shown to be adequately described by the average KLD production rate. Thus, this approach provides a framework for the quantitative analysis of signal transduction, and it can identify effective cascades within a signaling network.
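
For discrete distributions, the divergence in question is computed directly from its definition; a minimal sketch:

```python
import math

def kl_divergence(p, q):
    """Kullback-Leibler divergence D(p || q) in nats for discrete
    distributions given as equal-length probability lists; terms with
    p_i = 0 contribute nothing by convention."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)
```

The divergence vanishes exactly when the prior and posterior coincide, so its size quantifies the information gained by the update.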

]]>Entropy doi: 10.3390/e20060437

Authors: António M. Lopes J. A. Tenreiro Machado

Climate has complex dynamics due to the plethora of phenomena underlying its evolution. These characteristics pose challenges to conducting solid quantitative analysis and reaching assertive conclusions. In this paper, the global temperature time series (TTS) is viewed as a manifestation of the climate evolution, and its complexity is calculated by means of four different indices, namely the Lempel&ndash;Ziv complexity, sample entropy, signal harmonics power ratio, and fractal dimension. In the first phase, the monthly mean TTS is pre-processed by means of empirical mode decomposition, and the TTS trend is calculated. In the second phase, the complexity of the detrended signals is estimated. The four indices capture distinct features of the TTS dynamics in a 4-dim space. Hierarchical clustering is adopted for dimensional reduction and visualization in the 2-dim space. The results show that TTS complexity exhibits space-time variability, suggesting the presence of distinct climate forcing processes in both dimensions. Numerical examples with real-world data demonstrate the effectiveness of the approach.
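
Of the four indices, the Lempel&ndash;Ziv complexity is perhaps the least standard to compute; a compact sketch of the classic exhaustive-parsing (LZ76) phrase count on a symbol string is:

```python
def lempel_ziv_complexity(s):
    """LZ76 complexity: the number of phrases in the left-to-right
    exhaustive parsing of s, where each new phrase is the shortest
    substring not previously seen in the preceding text."""
    i, c, n = 0, 0, len(s)
    while i < n:
        l = 1
        # grow the phrase while it still occurs earlier in the sequence
        while i + l <= n and s[i:i + l] in s[:i + l - 1]:
            l += 1
        c += 1
        i += l
    return c
```

Regular sequences parse into very few phrases, while irregular ones accumulate many, so the count serves as a complexity index after suitable normalization.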

]]>Entropy doi: 10.3390/e20060436

Authors: Antonio M. Scarfone Hiroshi Matsuzoe Tatsuaki Wada

In this paper, we present a review of recent developments on the &kappa;-deformed statistical mechanics in the framework of information geometry. Three different geometric structures are introduced in the &kappa;-formalism, obtained starting from three non-equivalent divergence functions corresponding to the &kappa;-deformed versions of the Kullback&ndash;Leibler, &ldquo;Kerridge&rdquo; and Bregman divergences. The first statistical manifold, derived from the &kappa;-Kullback&ndash;Leibler divergence, forms an invariant geometry with a positive curvature that vanishes in the &kappa; &rarr; 0 limit. The other two statistical manifolds are related to each other by means of a scaling transform and are both dually flat. They have a dualistic Hessian structure endowed with a deformed Fisher metric and an affine connection that are consistent with a statistical scalar product based on the &kappa;-escort expectation. These flat geometries admit dual potentials corresponding to the thermodynamic Massieu and entropy functions that induce a Legendre structure of &kappa;-thermodynamics in the picture of information geometry.
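
The &kappa;-formalism rests on the Kaniadakis &kappa;-logarithm; as a quick numerical illustration of the standard definition (not specific to this review):

```python
def kappa_log(x, kappa):
    """Kaniadakis kappa-logarithm: (x**kappa - x**(-kappa)) / (2 * kappa).
    In the limit kappa -> 0 it reduces to the ordinary natural logarithm."""
    return (x ** kappa - x ** (-kappa)) / (2.0 * kappa)
```

Like the ordinary logarithm, the deformed version vanishes at x = 1, and for small kappa it approaches ln(x) from above.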

]]>Entropy doi: 10.3390/e20060435

Authors: Alessandro Bisio Giacomo Mauro D’Ariano Nicola Mosco Paolo Perinotti Alessandro Tosini

We study the solutions of an interacting Fermionic cellular automaton which is the analogue of the Thirring model with both space and time discrete. We present a derivation of the two-particle solutions of the automaton recently presented in the literature, which exploits the symmetries of the evolution operator. In the two-particle sector, the evolution operator is given by the sequence of two steps, the first one corresponding to a unitary interaction activated by a two-particle excitation at the same site, and the second one to two independent one-dimensional Dirac quantum walks. The interaction step can be regarded as the discrete-time version of the interacting term of some Hamiltonian integrable system, such as the Hubbard or the Thirring model. The present automaton exhibits scattering solutions with nontrivial momentum transfer, jumping between different regions of the Brillouin zone, which can be interpreted as Fermion-doubled particles, in stark contrast with the customary momentum exchange of one-dimensional Hamiltonian systems. A further difference compared to the Hamiltonian model is that bound states exist for every value of the total momentum and of the coupling constant. Even in the special case of vanishing coupling, the walk manifests bound states for finitely many isolated values of the total momentum. As a complement to the analytical derivations, we show numerical simulations of the interacting evolution.

]]>Entropy doi: 10.3390/e20060434

Authors: Jiuli Yin Cui Su Yongfen Zhang Xinghua Fan

Carbon markets provide a market-based way to reduce climate pollution. Subject to general market regulations, the major existing emission trading markets present complex characteristics. This paper analyzes the complexity of carbon markets using multi-scale entropy, taking the pilot carbon markets in China as an example. A moving average is adopted to extract the scales, owing to the short length of the data set. Results show a low level of complexity, implying that China&rsquo;s pilot carbon markets are still immature and lack market efficiency. However, the complexity varies across time scales: China&rsquo;s carbon markets (except for the Chongqing pilot) are more complex over short periods than in the long term. Furthermore, the complexity level in most pilot markets has increased as the markets developed, indicating an improvement in market efficiency. All these results demonstrate that an effective carbon market is required for emission trading to fulfil its function.
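
The moving-average scale extraction mentioned above can be sketched in a few lines (a window mean with unit stride; an illustrative form, not necessarily the authors' exact procedure):

```python
def coarse_grain_moving_average(x, tau):
    """Coarse-grain a time series at scale tau using a moving average,
    which preserves more points than block averaging on short series."""
    return [sum(x[i:i + tau]) / tau for i in range(len(x) - tau + 1)]
```

Multi-scale entropy then evaluates an entropy measure, such as sample entropy, on each coarse-grained series, giving one complexity value per scale.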

]]>Entropy doi: 10.3390/e20060433

Authors: Nabor O. Castillo Diego I. Gallardo Heleno Bolfarine Héctor W. Gómez

This paper focuses on studying a truncated positive version of the power-normal (PN) model considered in Durrans (1992). The truncation point is considered to be zero so that the resulting model is an extension of the half normal distribution. Some probabilistic properties are studied for the proposed model along with maximum likelihood and moments estimation. The model is fitted to two real datasets and compared with alternative models for positive data. Results indicate good performance of the proposed model.
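
Assuming the Durrans power-normal density alpha * phi(x) * Phi(x)^(alpha - 1), truncation at zero amounts to dividing by 1 - Phi(0)^alpha = 1 - (1/2)^alpha; a sketch of the resulting density follows (our reading of the construction, not code from the paper):

```python
import math

def norm_pdf(x):
    """Standard normal density phi(x)."""
    return math.exp(-0.5 * x * x) / math.sqrt(2.0 * math.pi)

def norm_cdf(x):
    """Standard normal distribution function Phi(x)."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def truncated_power_normal_pdf(x, alpha):
    """Assumed density of the power-normal model truncated to x >= 0:
    alpha * phi(x) * Phi(x)**(alpha - 1) / (1 - 0.5**alpha).
    alpha = 1 recovers the half-normal distribution."""
    if x < 0:
        return 0.0
    return alpha * norm_pdf(x) * norm_cdf(x) ** (alpha - 1.0) / (1.0 - 0.5 ** alpha)
```

The shape parameter alpha skews the positive density, which is what gives the model extra flexibility over the half normal.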

]]>