Table of Contents

Entropy, Volume 19, Issue 11 (November 2017)

  • Issues are regarded as officially published after their release is announced to the table of contents alert mailing list.
  • You may sign up for e-mail alerts to receive the table of contents of newly released issues.
  • Papers are published in both HTML and PDF forms; PDF is the official format. To view a paper in PDF form, click its "PDF Full-text" link and open it with the free Adobe Reader.
Cover Story: Leonardo da Vinci (1452–1519) already realized that turbulent flows can be basically separated in [...]
Displaying articles 1-65
Open Access Article Dynamic and Thermodynamic Properties of a CA Engine with Non-Instantaneous Adiabats
Entropy 2017, 19(11), 632; https://doi.org/10.3390/e19110632
Received: 24 October 2017 / Revised: 15 November 2017 / Accepted: 17 November 2017 / Published: 22 November 2017
Cited by 1 | PDF Full-text (320 KB) | HTML Full-text | XML Full-text
Abstract
This paper presents an analysis of a Curzon and Ahlborn thermal engine model in which both internal irreversibilities and non-instantaneous adiabatic branches are considered, operating in the maximum ecological function and maximum power output regimes. Its thermodynamic properties are shown, and an analysis of its local dynamic stability is performed. The results are compared throughout with those obtained previously for a case in which the adiabatic branches were assumed to be instantaneous. They indicate better thermodynamic performance for the model with instantaneous adiabatic branches, whereas robustness improves when non-instantaneous adiabatic branches are considered.
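The standard baseline such models extend is the Curzon–Ahlborn efficiency at maximum power, η_CA = 1 − √(T_c/T_h). The sketch below shows only this textbook result next to the Carnot bound; it is not the authors' model, which adds internal irreversibilities and non-instantaneous adiabats on top of it.

```python
import math

def carnot_efficiency(t_cold, t_hot):
    """Ideal (reversible) Carnot efficiency between two reservoirs."""
    return 1.0 - t_cold / t_hot

def curzon_ahlborn_efficiency(t_cold, t_hot):
    """Efficiency at maximum power output of an endoreversible engine."""
    return 1.0 - math.sqrt(t_cold / t_hot)

# Example reservoirs (hypothetical values): 300 K cold, 600 K hot.
t_c, t_h = 300.0, 600.0
eta_carnot = carnot_efficiency(t_c, t_h)          # 0.5
eta_ca = curzon_ahlborn_efficiency(t_c, t_h)      # ~0.293, always below Carnot
```

As expected, the maximum-power efficiency is strictly below the reversible Carnot limit for any finite temperature ratio.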
(This article belongs to the Section Thermodynamics)

Open Access Article On Normalized Mutual Information: Measure Derivations and Properties
Entropy 2017, 19(11), 631; https://doi.org/10.3390/e19110631
Received: 26 October 2017 / Revised: 12 November 2017 / Accepted: 20 November 2017 / Published: 22 November 2017
PDF Full-text (278 KB) | HTML Full-text | XML Full-text
Abstract
Starting with a new formulation for the mutual information (MI) between a pair of events, this paper derives alternative upper bounds and extends those to the case of two discrete random variables. Normalized mutual information (NMI) measures are then obtained from those bounds, emphasizing the use of least upper bounds. Conditional NMI measures are also derived for three different events and three different random variables. Since the MI formulation for a pair of events is always nonnegative, it can properly be extended to include weighted MI and NMI measures for pairs of events or for random variables that are analogous to the well-known weighted entropy. This weighted MI is generalized to the case of continuous random variables. Such weighted measures have the advantage over previously proposed measures of always being nonnegative. A simple transformation is derived for the NMI, such that the transformed measures have the value-validity property necessary for making various appropriate comparisons between values of those measures. A numerical example is provided.
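As a rough illustration of the kind of normalization discussed here, the sketch below computes MI for a discrete joint distribution and divides by min(H(X), H(Y)), one common entropy upper bound on MI. The paper derives its own bounds, so this normalizer is only an assumed stand-in.

```python
import math
from collections import Counter

def marginals(joint):
    """Marginal pmfs of a joint pmf given as {(x, y): probability}."""
    px, py = Counter(), Counter()
    for (x, y), p in joint.items():
        px[x] += p
        py[y] += p
    return px, py

def entropy(pmf_values):
    return -sum(p * math.log2(p) for p in pmf_values if p > 0)

def mutual_information(joint):
    px, py = marginals(joint)
    return sum(p * math.log2(p / (px[x] * py[y]))
               for (x, y), p in joint.items() if p > 0)

def normalized_mi(joint):
    """MI divided by min(H(X), H(Y)), an upper bound on MI."""
    px, py = marginals(joint)
    bound = min(entropy(px.values()), entropy(py.values()))
    return mutual_information(joint) / bound if bound > 0 else 0.0

# X = Y uniform on {0, 1}: full dependence, so NMI = 1.
dependent = {(0, 0): 0.5, (1, 1): 0.5}
# Two independent fair bits: NMI = 0.
independent = {(x, y): 0.25 for x in (0, 1) for y in (0, 1)}
```

Any normalizer that upper-bounds MI maps the measure into [0, 1], with 1 reached at full dependence.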
(This article belongs to the Special Issue Entropy: From Physics to Information Sciences and Geometry)
Open Access Article Metacomputable
Entropy 2017, 19(11), 630; https://doi.org/10.3390/e19110630
Received: 5 September 2017 / Revised: 10 November 2017 / Accepted: 14 November 2017 / Published: 22 November 2017
PDF Full-text (228 KB) | HTML Full-text | XML Full-text
Abstract
The paper introduces the notion of “metacomputable” processes as those which are the product of computable processes. This notion is interesting in the case where metacomputable processes may not be computable themselves but are produced by computable ones. The notion of computability used here relies on Turing computability; when we talk about something being non-computable, this can be viewed as computation that incorporates Turing’s oracle, such as a true randomizer (perhaps a quantum one). The notion of “processes” is used broadly, so that it also covers “objects” under a functional description; for the purposes of this paper, an object is seen as computable if the processes that fully describe the relevant aspects of its functioning are computable. The paper also introduces a distinction between phenomenal content and the epistemic subject which holds that content. The distinction provides an application of the notion of the metacomputable. In accordance with the functional definition of computable objects sketched above, it is possible to think of objects, such as brains, as being computable. If we take the relevant functionality of brains to be their supposed ability to generate first-person consciousness, and if they were computable in this regard, it would mean that brains, as generators of consciousness, could be described, straightforwardly, by Turing-computable mathematical functions. If there were other, perhaps artificial, generators of first-person consciousness, then we could hope to design those as Turing-computable machines as well. However, thinking of such generators of consciousness as computable does not preclude the stream of consciousness being non-computable. This is the main point of this article: computable processes, including functionally described machines, may be able to generate incomputable products. Those processes, while not computable, are metacomputable, by the regulative definition introduced in this article.
Another example of a metacomputable process that is not also computable would be a true randomizer, if we were able to build one. Presumably, it would be built according to a computable design, e.g., by a machine designed using AutoCAD that could be programmed into an industrial robot. Yet its product, a perfect randomizer, would be incomputable. The last point belongs to ontology within the theory of computability. The claim that computable objects or processes may produce incomputable ones does not commit us to what I call computational monism: the idea that non-computable processes may, strictly speaking, be transformed into computable ones. Metacomputable objects or processes may originate from computable systems (systems are understood here as complex, structured objects or processes) that have non-computable admixtures. Such processes are computable as long as those non-computable admixtures are latent, or otherwise irrelevant for a given functionality, and they are non-computable if the admixtures become active and relevant. An ontology in which computational processes or objects can produce non-computable processes or objects iff the former have non-computable components may be termed computational dualism. Such objects or processes may be computable despite containing non-computable elements, in particular if there is an on/off switch for those non-computable processes and it is off. One kind of such switch is provided, in biology, by latent genes that become active only in specific environmental situations or at a given age. Both ontologies, computational dualism and computational monism, are compatible with some non-computable processes being metacomputable.
Open Access Article Parametric PET Image Reconstruction via Regional Spatial Bases and Pharmacokinetic Time Activity Model
Entropy 2017, 19(11), 629; https://doi.org/10.3390/e19110629
Received: 29 September 2017 / Revised: 2 November 2017 / Accepted: 20 November 2017 / Published: 22 November 2017
PDF Full-text (2659 KB) | HTML Full-text | XML Full-text | Supplementary Files
Abstract
The reconstruction of a Positron Emission Tomography (PET) image from sinogram data is known to be very sensitive to measurement noise, and reconstructing PET images with high signal-to-noise ratios remains an important research topic. In this paper, we propose a new method for reconstructing a temporal series of PET images from a temporal series of sinogram data. In the proposed method, PET images are reconstructed by minimizing the Kullback–Leibler divergence between the observed sinogram data and sinogram data derived from a parametric model of PET images. The contributions are as follows: (1) regions of targets in images are explicitly expressed using a set of spatial bases in order to ignore noise in the background; (2) a parametric time activity model of PET images is explicitly introduced as a constraint; and (3) an algorithm for solving the optimization problem is clearly described. To demonstrate the advantages of the proposed method, quantitative evaluations are performed using both synthetic and clinical data of human brains.
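In its simplest unconstrained form, minimizing the KL divergence between observed sinogram counts and a forward projection is the classical MLEM iteration; the paper adds regional spatial bases and a kinetic time-activity model on top of this. Below is a toy MLEM sketch on a tiny hypothetical system matrix, a simplification rather than the authors' full method.

```python
import numpy as np

def mlem(A, y, n_iter=200):
    """Multiplicative EM updates that decrease the KL divergence
    between observed counts y and the forward projection A @ x."""
    x = np.ones(A.shape[1])
    sens = A.T @ np.ones(A.shape[0])   # sensitivity image (column sums)
    for _ in range(n_iter):
        proj = A @ x
        ratio = np.where(proj > 0, y / proj, 0.0)
        x = x / sens * (A.T @ ratio)   # stays nonnegative by construction
    return x

# Tiny toy system: 3 detector bins, 2 image pixels (hypothetical).
A = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [0.5, 0.5]])
x_true = np.array([2.0, 4.0])
y = A @ x_true                # noise-free "sinogram"
x_hat = mlem(A, y)            # converges to x_true on consistent data
```

On consistent noise-free data the iteration recovers the true activity; on noisy counts it converges to the Poisson maximum-likelihood (minimum-KL) solution instead.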
(This article belongs to the Special Issue Information Theory Applied to Physiological Signals)

Open Access Article Design of Rate-Compatible Parallel Concatenated Punctured Polar Codes for IR-HARQ Transmission Schemes
Entropy 2017, 19(11), 628; https://doi.org/10.3390/e19110628
Received: 16 August 2017 / Revised: 2 November 2017 / Accepted: 20 November 2017 / Published: 21 November 2017
PDF Full-text (1182 KB) | HTML Full-text | XML Full-text
Abstract
In this paper, we propose rate-compatible (RC) parallel concatenated punctured polar (PCPP) codes for incremental redundancy hybrid automatic repeat request (IR-HARQ) transmission schemes, which can transmit multiple data blocks over a time-varying channel. The PCPP coding scheme provides RC polar coding blocks that adapt to channel variations. First, we investigate an improved random puncturing (IRP) pattern for the PCPP coding scheme, motivated by the code-rate and block-length limitations of conventional polar codes. The proposed IRP algorithm selects puncturing bits only from the frozen-bit set and keeps the information bits unchanged, improving decoding performance by 0.2–1 dB over the existing random puncturing (RP) algorithm. Then, we develop an RC IR-HARQ transmission scheme based on PCPP codes. By analyzing the overhead of the previously successfully decoded PCPP coding block in our IR-HARQ scheme, the optimal initial code-rate can be determined for each new PCPP coding block over time-varying channels. Simulation results show that the average number of transmissions is about 1.8 per PCPP coding block in our RC IR-HARQ scheme with a 2-level PCPP encoding construction, roughly halving the average number of transmissions compared with existing RC polar coding schemes.
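The key IRP constraint stated in the abstract, puncture only positions carrying frozen bits and never information bits, can be sketched in a few lines. The code length and bit positions below are hypothetical; a real design would derive the frozen set from channel polarization.

```python
import random

def irp_pattern(frozen_positions, num_punctured, seed=0):
    """Draw puncturing positions only from the frozen set, so the
    information bits are never punctured (the IRP idea above)."""
    if num_punctured > len(frozen_positions):
        raise ValueError("more punctures requested than frozen positions")
    rng = random.Random(seed)
    return sorted(rng.sample(sorted(frozen_positions), num_punctured))

# Hypothetical length-8 polar code with information set {3, 5, 6, 7}.
frozen = [0, 1, 2, 4]
info = [3, 5, 6, 7]
punct = irp_pattern(frozen, 2)   # two punctured positions, all frozen
```

Plain random puncturing would sample from all code positions and could hit information bits; restricting the draw to the frozen set is what the abstract credits for the 0.2–1 dB gain.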
(This article belongs to the Section Information Theory)

Open Access Feature Paper Article Variational Characterization of Free Energy: Theory and Algorithms
Entropy 2017, 19(11), 626; https://doi.org/10.3390/e19110626
Received: 25 September 2017 / Revised: 7 November 2017 / Accepted: 15 November 2017 / Published: 20 November 2017
Cited by 2 | PDF Full-text (479 KB) | HTML Full-text | XML Full-text
Abstract
The article surveys and extends variational formulations of the thermodynamic free energy and discusses their information-theoretic content from the perspective of mathematical statistics. We revisit the well-known Jarzynski equality for nonequilibrium free energy sampling within the framework of importance sampling and Girsanov change-of-measure transformations. The implications of the different variational formulations for designing efficient stochastic optimization and nonequilibrium simulation algorithms for computing free energies are discussed and illustrated.
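The Jarzynski equality, <exp(-βW)> = exp(-βΔF), turns an average over nonequilibrium work samples into an equilibrium free-energy difference. A minimal estimator sketch follows; Gaussian work is used as a convenient synthetic test case because the identity then gives ΔF = μ − βσ²/2 in closed form. This illustrates only the basic estimator, not the paper's variational or Girsanov machinery.

```python
import numpy as np

def jarzynski_free_energy(work, beta=1.0):
    """Jarzynski estimator: dF = -(1/beta) * ln <exp(-beta * W)>.
    Uses a log-sum-exp shift for numerical stability."""
    w = -beta * np.asarray(work, dtype=float)
    m = w.max()
    return -(m + np.log(np.mean(np.exp(w - m)))) / beta

rng = np.random.default_rng(0)
mu, sigma, beta = 2.0, 0.5, 1.0
work = rng.normal(mu, sigma, 200_000)     # synthetic work samples
dF = jarzynski_free_energy(work, beta)    # ~ mu - beta*sigma**2/2 = 1.875
```

Note that the naive average <W> = μ would overestimate ΔF; the exponential reweighting subtracts the dissipated work βσ²/2.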
(This article belongs to the Special Issue Understanding Molecular Dynamics via Stochastic Processes)

Open Access Article Robust-BD Estimation and Inference for General Partially Linear Models
Entropy 2017, 19(11), 625; https://doi.org/10.3390/e19110625
Received: 10 October 2017 / Revised: 15 November 2017 / Accepted: 16 November 2017 / Published: 20 November 2017
PDF Full-text (725 KB) | HTML Full-text | XML Full-text
Abstract
The classical quadratic loss for the partially linear model (PLM) and the likelihood function for the generalized PLM are not resistant to outliers. This inspires us to propose a class of “robust-Bregman divergence (BD)” estimators of both the parametric and nonparametric components in the general partially linear model (GPLM), which allows the distribution of the response variable to be partially specified, without being fully known. Using the local-polynomial function estimation method, we propose a computationally efficient procedure for obtaining “robust-BD” estimators and establish the consistency and asymptotic normality of the “robust-BD” estimator of the parametric component β_o. For inference procedures on β_o in the GPLM, we show that the Wald-type test statistic W_n constructed from the “robust-BD” estimators is asymptotically distribution-free under the null, whereas the likelihood ratio-type test statistic Λ_n is not. This provides an insight into the distinction from the asymptotic equivalence (Fan and Huang 2005) between W_n and Λ_n in the PLM constructed from profile least-squares estimators using the non-robust quadratic loss. Numerical examples illustrate the computational effectiveness of the proposed “robust-BD” estimators and the robust Wald-type test in the presence of outlying observations.

Open Access Article Re-Evaluating Electromyogram–Force Relation in Healthy Biceps Brachii Muscles Using Complexity Measures
Entropy 2017, 19(11), 624; https://doi.org/10.3390/e19110624
Received: 9 August 2017 / Revised: 26 September 2017 / Accepted: 17 October 2017 / Published: 19 November 2017
PDF Full-text (1209 KB) | HTML Full-text | XML Full-text
Abstract
The objective of this study is to re-evaluate the relation between the surface electromyogram (EMG) and muscle contraction torque in the biceps brachii (BB) muscles of healthy subjects using two different complexity measures. Ten healthy subjects were recruited and asked to complete a series of elbow flexion tasks at isometric muscle contraction levels ranging from 10% to 80% of maximum voluntary contraction (MVC) in increments of 10%. Both the elbow flexion torque and the surface EMG data from the muscle were recorded. The root mean square (RMS), sample entropy (SampEn) and fuzzy entropy (FuzzyEn) of the corresponding EMG data were analyzed for each contraction level, and the relation between EMG and muscle torque was quantified accordingly. The experimental results showed a nonlinear relation between the traditional RMS amplitude of the EMG and the muscle torque. By contrast, the FuzzyEn of the EMG exhibited a stronger linear correlation with the muscle torque than the RMS amplitude, which indicates its value for estimating BB muscle strength in a simple and straightforward manner. In addition, the SampEn of the EMG was found to be insensitive to varying muscle torque, remaining almost flat as muscle force increased. This character of the SampEn implies its potential as a surface EMG biomarker for examining neuromuscular changes while remaining robust to differences in muscle strength.
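Sample entropy, one of the two complexity measures used here, counts template matches of length m and m+1 under a Chebyshev tolerance and reports −ln(A/B). A compact sketch follows; the parameters m = 2 and r = 0.2·SD are common defaults, not necessarily the authors' settings, and the signals are synthetic.

```python
import numpy as np

def sample_entropy(x, m=2, r=0.2):
    """SampEn(m, r): -ln(A/B), where B and A count template matches of
    length m and m+1 (Chebyshev distance, self-matches excluded)."""
    x = np.asarray(x, dtype=float)
    tol = r * x.std()

    def count_matches(mm):
        templates = np.array([x[i:i + mm] for i in range(len(x) - mm + 1)])
        count = 0
        for i in range(len(templates)):
            d = np.max(np.abs(templates[i + 1:] - templates[i]), axis=1)
            count += int(np.sum(d <= tol))
        return count

    b = count_matches(m)
    a = count_matches(m + 1)
    return -np.log(a / b) if a > 0 and b > 0 else np.inf

rng = np.random.default_rng(1)
regular = np.sin(np.linspace(0, 20 * np.pi, 1000))  # highly predictable
noisy = rng.standard_normal(1000)                   # irregular
```

A regular, predictable signal scores a much lower SampEn than white noise, which is exactly the ordering the complexity-based EMG analysis relies on.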
(This article belongs to the Special Issue Information Theory Applied to Physiological Signals)

Open Access Article Digital Image Stabilization Method Based on Variational Mode Decomposition and Relative Entropy
Entropy 2017, 19(11), 623; https://doi.org/10.3390/e19110623
Received: 11 September 2017 / Revised: 15 November 2017 / Accepted: 16 November 2017 / Published: 18 November 2017
PDF Full-text (6250 KB) | HTML Full-text | XML Full-text
Abstract
Cameras mounted on vehicles frequently suffer from image shake due to the vehicles’ motions. To remove jitter motions while preserving intentional motions, a hybrid digital image stabilization method is proposed that uses variational mode decomposition (VMD) and relative entropy (RE). In this paper, the global motion vector (GMV) is initially decomposed into several narrow-band modes by VMD. REs, which quantify the difference between the probability distributions of two modes, are then calculated to identify the intentional and jitter motion modes. Finally, the sum of the jitter motion modes constitutes the jitter motions, whereas subtracting this sum from the GMV yields the intentional motions. The proposed stabilization method is compared with several known methods, namely the median filter (MF), Kalman filter (KF), wavelet decomposition (WD) method, empirical mode decomposition (EMD)-based method, and enhanced EMD-based method, to evaluate stabilization performance. Experimental results show that the proposed method outperforms the other stabilization methods.
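The relative-entropy step is easy to illustrate on its own: given histograms of two candidate motion modes, D(p‖q) is large when their distributions differ, which is what lets the method tell a slow intentional pan from fast jitter. The sketch below shows just that step; the VMD decomposition itself is omitted and the two "modes" are synthetic.

```python
import numpy as np

def relative_entropy(p, q, eps=1e-12):
    """Discrete KL divergence D(p || q) between two histograms."""
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    p = p / p.sum()
    q = q / q.sum()
    return float(np.sum(p * np.log((p + eps) / (q + eps))))

def mode_histogram(mode, bins, value_range):
    hist, _ = np.histogram(mode, bins=bins, range=value_range)
    return hist + 1  # Laplace smoothing so the support never vanishes

# Toy "modes": a slow intentional pan vs. high-frequency jitter.
t = np.linspace(0, 1, 500)
intentional = 5 * t                            # smooth trend
jitter = 0.5 * np.sin(2 * np.pi * 40 * t)      # fast oscillation
value_range = (-1.0, 6.0)
h_int = mode_histogram(intentional, 30, value_range)
h_jit = mode_histogram(jitter, 30, value_range)
re = relative_entropy(h_int, h_jit)            # large: very different modes
```

D(p‖p) is zero, while two modes with very different value distributions yield a large RE, giving a threshold for labeling a mode as jitter.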

Open Access Feature Paper Article Inquiry Calculus and the Issue of Negative Higher Order Informations
Entropy 2017, 19(11), 622; https://doi.org/10.3390/e19110622
Received: 20 September 2017 / Revised: 1 November 2017 / Accepted: 10 November 2017 / Published: 18 November 2017
Cited by 1 | PDF Full-text (1015 KB) | HTML Full-text | XML Full-text
Abstract
In this paper, we give the derivation of an inquiry calculus or, equivalently, a Bayesian information theory. From simple orderings follow lattices or, equivalently, algebras. Lattices admit a quantification or, equivalently, algebras may be extended to calculi. The general rules of quantification are the sum and chain rules. Probability theory follows from a quantification on the specific lattice of statements that has an upper context. Inquiry calculus follows from a quantification on the specific lattice of questions that has a lower context. We give a relevance measure and a product rule for relevances which, taken together with the sum rule of relevances, allow us to perform inquiry analyses in an algorithmic manner.
(This article belongs to the Special Issue Maximum Entropy and Bayesian Methods)

Open Access Article Surface Interaction of Nanoscale Water Film with SDS from Computational Simulation and Film Thermodynamics
Entropy 2017, 19(11), 620; https://doi.org/10.3390/e19110620
Received: 15 September 2017 / Revised: 15 November 2017 / Accepted: 15 November 2017 / Published: 18 November 2017
PDF Full-text (2736 KB) | HTML Full-text | XML Full-text
Abstract
Foam systems have been attracting extensive attention due to their importance in a variety of applications, e.g., in the cleaning industry and in bubble flotation. In the context of flotation chemistry, flotation performance is strongly affected by bubble coalescence, which in turn relies significantly on the surface forces acting on the liquid film between bubbles. The unusually strong short-range repulsive surface interactions observed in Newton black films (NBFs) between two interfaces less than 5 nm apart cannot be accommodated by the classical Derjaguin, Landau, Verwey, and Overbeek (DLVO) theory. This non-DLVO interaction increases exponentially as film thickness decreases and plays a crucial role in determining liquid film stability, yet its mechanism and origin are still unclear. In the present work, we investigate the surface interaction of free-standing sodium dodecyl sulfate (SDS) nanoscale black films in terms of disjoining pressure using molecular simulation. For an aqueous nanoscale film consisting of a water core coated with SDS surfactants, the disjoining pressure and film tension of the SDS-NBF were quantitatively determined as functions of film thickness by a post-processing technique derived from film thermodynamics.
(This article belongs to the Special Issue Mesoscopic Thermodynamics and Dynamics)

Open Access Article An Analysis of Information Dynamic Behavior Using Autoregressive Models
Entropy 2017, 19(11), 612; https://doi.org/10.3390/e19110612
Received: 20 September 2017 / Revised: 8 November 2017 / Accepted: 10 November 2017 / Published: 18 November 2017
PDF Full-text (963 KB) | HTML Full-text | XML Full-text
Abstract
Information Theory is a branch of mathematics, more specifically of probability theory, that studies the quantification of information. Recently, several studies have successfully used Information Theoretic Learning (ITL) as a new technique for unsupervised learning, with information measures serving as the optimality criterion during learning. In this article, we analyze a still-unexplored aspect of these information measures: their dynamic behavior. Autoregressive models (linear and non-linear) are used to represent the dynamics of the information measures. As a source of dynamic information, videos with different characteristics, such as fading and monotonous sequences, are used.
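The core tool, an autoregressive model of a scalar information-measure series, can be sketched with a plain least-squares fit. The series below is synthetic AR(1) data, not an actual entropy trace extracted from video, and the fitting routine is a generic sketch rather than the authors' estimator.

```python
import numpy as np

def fit_ar(series, p):
    """Least-squares fit of x[t] ~ a_1*x[t-1] + ... + a_p*x[t-p]."""
    x = np.asarray(series, dtype=float)
    # Column j holds the lag-(j+1) regressor aligned with targets x[p:].
    X = np.column_stack([x[p - j - 1 : -j - 1] for j in range(p)])
    y = x[p:]
    coeffs, *_ = np.linalg.lstsq(X, y, rcond=None)
    return coeffs

# Noise-free AR(1) series with coefficient 0.8; the fit recovers it.
series = [1.0]
for _ in range(60):
    series.append(0.8 * series[-1])
a = fit_ar(series, 1)   # a[0] == 0.8 up to numerical precision
```

On a real entropy-per-frame sequence one would choose the order p by a criterion such as AIC and inspect the residuals for the nonlinear effects the article mentions.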

Open Access Article Fault Detection for Vibration Signals on Rolling Bearings Based on the Symplectic Entropy Method
Entropy 2017, 19(11), 607; https://doi.org/10.3390/e19110607
Received: 25 September 2017 / Revised: 29 October 2017 / Accepted: 9 November 2017 / Published: 18 November 2017
Cited by 2 | PDF Full-text (1833 KB) | HTML Full-text | XML Full-text
Abstract
Bearing vibration response studies are crucial for the condition monitoring of bearings and the quality inspection of rotating machinery systems. However, it is still very difficult to diagnose bearing faults, especially rolling element faults, due to the complex, high-dimensional and nonlinear characteristics of vibration signals and the strong background noise. A novel nonlinear analysis method, the symplectic entropy (SymEn) measure, is proposed to analyze the measured signals for fault monitoring of rolling bearings. The core of the SymEn approach is entropy analysis based on the symplectic principal components. The dynamical characteristics of the rolling bearing data are analyzed using the SymEn method. Unlike techniques that rely on high-dimensional features in the time domain, frequency domain, or empirical mode decomposition (EMD)/wavelet domain, the SymEn approach constructs low-dimensional (i.e., two-dimensional) features based on the SymEn estimate. Vibration signals from our experiments and from the Case Western Reserve University Bearing Data Center are used to verify the effectiveness of the proposed method. It is also found that faulty bearings have a strong influence on the other, normal bearings. In summary, the results indicate that the proposed method can be used to detect rolling bearing faults.
(This article belongs to the Section Complexity)

Open Access Feature Paper Essay Thermodynamics: The Unique Universal Science
Entropy 2017, 19(11), 621; https://doi.org/10.3390/e19110621
Received: 28 June 2017 / Revised: 3 October 2017 / Accepted: 7 November 2017 / Published: 17 November 2017
PDF Full-text (1121 KB) | HTML Full-text | XML Full-text
Abstract
Thermodynamics is the branch of physical science that governs the thermal behavior of dynamical systems, from those as simple as refrigerators to those as complex as our expanding universe. The laws of thermodynamics, involving conservation of energy and nonconservation of entropy, are without a doubt two of the most useful and general laws in all of science. The first law of thermodynamics, according to which energy can be neither created nor destroyed, merely transformed from one form into another, and the second law of thermodynamics, according to which the usable energy in an adiabatically isolated dynamical system is always diminishing even though energy is conserved, have had an impact far beyond science and engineering. In this paper, we trace the history of thermodynamics from its classical to its postmodern forms, and present a tutorial and didactic exposition of thermodynamics as it pertains to some of the deepest secrets of the universe.

Open Access Feature Paper Article Modal Strain Energy-Based Debonding Assessment of Sandwich Panels Using a Linear Approximation with Maximum Entropy
Entropy 2017, 19(11), 619; https://doi.org/10.3390/e19110619
Received: 21 September 2017 / Revised: 5 November 2017 / Accepted: 14 November 2017 / Published: 17 November 2017
PDF Full-text (10936 KB) | HTML Full-text | XML Full-text
Abstract
Sandwich structures are very attractive due to their high strength at minimum weight, and, therefore, their applications have increased rapidly. Nevertheless, these structures may present imperfect bonding or debonding between the skins and the core as a result of manufacturing defects or impact loads, degrading their mechanical properties. To improve both the safety and functionality of these systems, structural damage assessment methodologies can be implemented. This article presents a damage assessment algorithm to localize and quantify debonds in sandwich panels. The proposed algorithm uses damage indices derived from the modal strain energy method and a linear approximation with a maximum-entropy algorithm. Full-field vibration measurements of the panels were acquired using a high-speed 3D digital image correlation (DIC) system. Since the number of damage indices per panel is too large to be used directly in a regression algorithm, the data were reprocessed using principal component analysis (PCA) and kernel PCA. The results demonstrate that the proposed methodology accurately identifies debonding in composite panels.
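The dimensionality-reduction step can be illustrated with plain PCA via the SVD: many correlated damage indices per panel are projected onto a few principal components before regression. This generic sketch uses synthetic "damage index" vectors with a single dominant latent direction; it stands in for, but is not, the paper's kernel-PCA and maximum-entropy pipeline.

```python
import numpy as np

def pca_reduce(X, k):
    """Project the rows of X onto the top-k principal components."""
    Xc = X - X.mean(axis=0)                     # center each index
    U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:k].T                        # scores, shape (n, k)

rng = np.random.default_rng(2)
# Hypothetical data: 40 panels x 200 damage indices, with most variance
# along one latent direction plus small measurement noise.
latent = rng.standard_normal((40, 1))
direction = rng.standard_normal((1, 200))
X = latent @ direction + 0.01 * rng.standard_normal((40, 200))
Z = pca_reduce(X, 2)   # 200 indices compressed to 2 scores per panel
```

The first component captures essentially all the variance here, so a downstream regressor sees 2 features instead of 200.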
(This article belongs to the Special Issue Entropy for Characterization of Uncertainty in Risk and Reliability)