Search Results (5)

Search Parameters:
Keywords = weighted quasi-arithmetic means

33 pages, 449 KB  
Article
Bounds of Different Integral Operators in Tensorial Hilbert and Variable Exponent Function Spaces
by Waqar Afzal, Mujahid Abbas and Omar Mutab Alsalami
Mathematics 2024, 12(16), 2464; https://doi.org/10.3390/math12162464 - 9 Aug 2024
Cited by 15 | Viewed by 1589
Abstract
In dynamical systems, Hilbert spaces provide a useful framework for analyzing and solving problems because they can handle infinite-dimensional spaces. Many dynamical systems are described by linear operators acting on a Hilbert space, so understanding the spectrum, eigenvalues, and eigenvectors of these operators is crucial. Functional analysis typically uses tensors to represent multilinear mappings between Hilbert spaces, which gives rise to inequalities in tensor Hilbert spaces. In this paper, we study two types of function spaces and use convex and harmonic convex mappings to establish various operator inequalities and their bounds. In the first part of the article, we develop the operator Hermite–Hadamard inequality and upper and lower bounds for weighted discrete Jensen-type inequalities in Hilbert spaces, using relational properties and arithmetic operations from tensor analysis. We then use the Riemann–Liouville fractional integral to develop several new identities, which are applied in operator Milne-type inequalities to obtain new bounds for different classes of generalized mappings, including differentiable, quasi-convex, and convex mappings. Examples and consequences for logarithmic and exponential functions are also provided, along with an interesting example of a physical dynamical model for the harmonic mean. Lastly, we develop the Hermite–Hadamard inequality in variable exponent function spaces, specifically in the mixed-norm space ℓ^{q(·)}(L^{p(·)}); we also derive it in the classical Lebesgue space L^p, in which the exponent is constant. This inequality not only refines the Jensen and triangle inequalities in the norm sense, but we also impose specific conditions on the exponent functions to determine when it holds.
(This article belongs to the Special Issue Variational Problems and Applications, 2nd Edition)
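The operator Hermite–Hadamard inequality developed in this paper specializes, in the scalar case, to the classical bound f((a+b)/2) ≤ (1/(b−a))∫_a^b f(x) dx ≤ (f(a)+f(b))/2 for convex f. A minimal numeric sketch of that scalar case (Python, not from the paper; exp on [0, 1] is an illustrative convex function):

```python
import math

def hermite_hadamard_bounds(f, a, b, n=10_000):
    """Return (midpoint value, average integral, endpoint average) of f on [a, b].

    For convex f the three values are non-decreasing:
    f((a+b)/2) <= (1/(b-a)) * int_a^b f(x) dx <= (f(a)+f(b))/2.
    """
    h = (b - a) / n
    # Trapezoidal approximation of the integral of f over [a, b].
    integral = h * (0.5 * f(a) + sum(f(a + i * h) for i in range(1, n)) + 0.5 * f(b))
    return f((a + b) / 2), integral / (b - a), (f(a) + f(b)) / 2

lo, mid, hi = hermite_hadamard_bounds(math.exp, 0.0, 1.0)
assert lo <= mid <= hi  # Hermite-Hadamard holds for the convex exp
```

For exp on [0, 1] the three values are e^{1/2}, e − 1, and (1 + e)/2, so the chain of inequalities is strict.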

25 pages, 642 KB  
Article
Generalizing the Alpha-Divergences and the Oriented Kullback–Leibler Divergences with Quasi-Arithmetic Means
by Frank Nielsen
Algorithms 2022, 15(11), 435; https://doi.org/10.3390/a15110435 - 17 Nov 2022
Cited by 1 | Viewed by 3528
Abstract
The family of α-divergences, including the oriented forward and reverse Kullback–Leibler divergences, is often used in signal processing, pattern recognition, and machine learning, among other fields. Choosing a suitable α-divergence can either be done beforehand according to some prior knowledge of the application domain or learned directly from data sets. In this work, we generalize the α-divergences using a pair of strictly comparable weighted means. Our generalization allows us to obtain in the limit case α → 1 the 1-divergence, which provides a generalization of the forward Kullback–Leibler divergence, and in the limit case α → 0 the 0-divergence, which corresponds to a generalization of the reverse Kullback–Leibler divergence. We then analyze the condition for a pair of weighted quasi-arithmetic means to be strictly comparable and describe the family of quasi-arithmetic α-divergences, including its subfamily of power homogeneous α-divergences. In particular, we study the generalized quasi-arithmetic 1-divergences and 0-divergences and show that these counterpart generalizations of the oriented Kullback–Leibler divergences can be rewritten as equivalent conformal Bregman divergences using strictly monotone embeddings. Finally, we discuss the applications of these novel divergences to k-means clustering by studying the robustness property of the centroids.
(This article belongs to the Special Issue Machine Learning for Pattern Recognition)
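The classical α-divergence that this paper generalizes can be written, for α ∉ {0, 1}, as the scaled gap between the weighted arithmetic and weighted geometric means of the two densities, so non-negativity is exactly the AM–GM inequality. A minimal sketch of that classical case on discrete distributions (the distributions p and q are hypothetical examples):

```python
def alpha_divergence(p, q, alpha):
    """Classical alpha-divergence for alpha not in {0, 1}:
    D_alpha(p:q) = 1/(alpha*(1-alpha)) * sum_i (alpha*p_i + (1-alpha)*q_i
                                                - p_i**alpha * q_i**(1-alpha)).
    Each summand is the gap between a weighted arithmetic mean and the matching
    weighted geometric mean, so D_alpha >= 0 by the AM-GM inequality.
    """
    gap = sum(alpha * pi + (1 - alpha) * qi - (pi ** alpha) * (qi ** (1 - alpha))
              for pi, qi in zip(p, q))
    return gap / (alpha * (1 - alpha))

p = [0.2, 0.5, 0.3]
q = [0.3, 0.3, 0.4]
assert alpha_divergence(p, q, 0.5) >= 0.0       # AM-GM gap is non-negative
assert abs(alpha_divergence(p, p, 0.5)) < 1e-12  # zero iff the distributions agree
```

Letting α → 1 (resp. α → 0) recovers the forward (resp. reverse) Kullback–Leibler divergence, the limit cases the abstract refers to.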

18 pages, 319 KB  
Article
Refinement of Discrete Lah–Ribarič Inequality and Applications on Csiszár Divergence
by Đilda Pečarić, Josip Pečarić and Jurica Perić
Mathematics 2022, 10(5), 755; https://doi.org/10.3390/math10050755 - 26 Feb 2022
Cited by 1 | Viewed by 1547
Abstract
In this paper we give a new refinement of the Lah–Ribarič inequality and, using the same technique, a refinement of the Jensen inequality. Using these results, we obtain a refinement of the discrete Hölder inequality and refinements of some inequalities for discrete weighted power means and discrete weighted quasi-arithmetic means. We also give applications in information theory: namely, some interesting estimates for the discrete Csiszár divergence and its important special cases.
(This article belongs to the Special Issue Mathematical Inequalities with Applications)
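The discrete Csiszár divergence estimated in the paper is D_f(p‖q) = Σ_i q_i f(p_i/q_i) for a convex generator f with f(1) = 0; Jensen's inequality immediately gives D_f ≥ 0. A minimal sketch with two standard special cases (the example distributions are hypothetical):

```python
import math

def csiszar_divergence(p, q, f):
    """Discrete Csiszar f-divergence D_f(p||q) = sum_i q_i * f(p_i / q_i).
    For convex f with f(1) = 0, Jensen's inequality yields
    D_f(p||q) >= f(sum_i q_i * (p_i / q_i)) = f(1) = 0.
    """
    return sum(qi * f(pi / qi) for pi, qi in zip(p, q))

kl = lambda t: t * math.log(t)    # generator of the Kullback-Leibler divergence
chi2 = lambda t: (t - 1.0) ** 2   # generator of the Pearson chi-squared divergence

p = [0.2, 0.5, 0.3]
q = [0.3, 0.3, 0.4]
assert csiszar_divergence(p, q, kl) >= 0.0
assert csiszar_divergence(p, q, chi2) >= 0.0
```

With the `kl` generator the sum telescopes to Σ_i p_i log(p_i/q_i); the refinements in the paper sharpen the generic lower bound 0 produced by Jensen's inequality.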
19 pages, 366 KB  
Article
Stochastic Order and Generalized Weighted Mean Invariance
by Mateu Sbert, Jordi Poch, Shuning Chen and Víctor Elvira
Entropy 2021, 23(6), 662; https://doi.org/10.3390/e23060662 - 25 May 2021
Viewed by 2579
Abstract
In this paper, we present order invariance theoretical results for weighted quasi-arithmetic means of a monotonic series of numbers. The quasi-arithmetic mean, or Kolmogorov–Nagumo mean, generalizes the classical mean and appears in many disciplines, from information theory to physics, from economics to traffic flow. Stochastic orders are defined on weights (or equivalently, discrete probability distributions). They were introduced to study risk in economics and decision theory, and have recently found utility in Monte Carlo techniques and in image processing. We show in this paper that, if two distributions of weights are ordered under first stochastic order, then for any monotonic series of numbers their weighted quasi-arithmetic means share the same order. This means, for instance, that the arithmetic and harmonic means for two different distributions of weights must be aligned if the weights are stochastically ordered; that is, either both means increase or both decrease. We explore the invariance properties when convex (concave) functions define both the quasi-arithmetic mean and the series of numbers, show its relationship with increasing concave order and increasing convex order, and observe the important role played by a newly defined mirror property of stochastic orders. We also give some applications to entropy and cross-entropy and present an example of the multiple importance sampling Monte Carlo technique that illustrates the usefulness and transversality of our approach. Invariance theorems are useful when a system is represented by a set of quasi-arithmetic means and we want to change the distribution of weights so that all means evolve in the same direction.
(This article belongs to the Special Issue Measures of Information)
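The central invariance claim can be checked numerically in a small case: if weights w first-order stochastically dominate weights v (every prefix sum of w is at most the matching prefix sum of v, i.e. w shifts mass toward larger indices), then for an increasing sequence every weighted quasi-arithmetic mean under w is at least the one under v. A sketch with the arithmetic and harmonic means (the weights and sequence are hypothetical examples, not from the paper):

```python
from itertools import accumulate

def fsd_dominates(w, v):
    """True if w first-order stochastically dominates v on indices 0..n-1:
    every prefix sum of w is <= the matching prefix sum of v."""
    return all(cw <= cv + 1e-12 for cw, cv in zip(accumulate(w), accumulate(v)))

def quasi_arithmetic_mean(x, w, f, f_inv):
    """Weighted quasi-arithmetic (Kolmogorov-Nagumo) mean f^{-1}(sum_i w_i f(x_i))."""
    return f_inv(sum(wi * f(xi) for wi, xi in zip(w, x)))

x = [1.0, 2.0, 3.0, 4.0]   # increasing sequence of numbers
v = [0.4, 0.3, 0.2, 0.1]   # weights concentrated on small values
w = [0.1, 0.2, 0.3, 0.4]   # weights shifted toward large values
assert fsd_dominates(w, v)

ident = (lambda t: t, lambda t: t)               # f = id gives the arithmetic mean
recip = (lambda t: 1.0 / t, lambda t: 1.0 / t)   # f = 1/t gives the harmonic mean
for f, f_inv in (ident, recip):
    # Both means move in the same direction under the stochastically larger weights.
    assert quasi_arithmetic_mean(x, w, f, f_inv) >= quasi_arithmetic_mean(x, v, f, f_inv)
```

Here the arithmetic means are 3.0 versus 2.0 and the harmonic means 2.5 versus about 1.56, so both means are aligned, as the invariance theorem predicts.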
23 pages, 417 KB  
Article
On the Jensen–Shannon Symmetrization of Distances Relying on Abstract Means
by Frank Nielsen
Entropy 2019, 21(5), 485; https://doi.org/10.3390/e21050485 - 11 May 2019
Cited by 160 | Viewed by 18392
Abstract
The Jensen–Shannon divergence is a renowned bounded symmetrization of the unbounded Kullback–Leibler divergence which measures the total Kullback–Leibler divergence to the average mixture distribution. However, the Jensen–Shannon divergence between Gaussian distributions is not available in closed form. To bypass this problem, we present a generalization of the Jensen–Shannon (JS) divergence using abstract means which yields closed-form expressions when the mean is chosen according to the parametric family of distributions. More generally, we define the JS-symmetrizations of any distance using parameter mixtures derived from abstract means. In particular, we first show that the geometric mean is well-suited for exponential families, and report two closed-form formulas for (i) the geometric Jensen–Shannon divergence between probability densities of the same exponential family; and (ii) the geometric JS-symmetrization of the reverse Kullback–Leibler divergence between probability densities of the same exponential family. As a second illustrative example, we show that the harmonic mean is well-suited for the scale Cauchy distributions, and report a closed-form formula for the harmonic Jensen–Shannon divergence between scale Cauchy distributions. Applications to clustering with respect to these novel Jensen–Shannon divergences are touched upon.
(This article belongs to the Section Information Theory, Probability and Statistics)
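What makes the geometric Jensen–Shannon divergence closed-form for Gaussians is that the normalized weighted geometric mean p^{1−α} q^α of two Gaussian densities is again Gaussian, with precision-averaged parameters. A univariate sketch under that construction (parameter names are mine; the paper treats the general exponential-family case):

```python
import math

def kl_gauss(m1, s1, m2, s2):
    """Closed-form KL(N(m1, s1^2) || N(m2, s2^2)) for univariate Gaussians."""
    return math.log(s2 / s1) + (s1 ** 2 + (m1 - m2) ** 2) / (2 * s2 ** 2) - 0.5

def geometric_jsd_gauss(m1, s1, m2, s2, alpha=0.5):
    """Geometric JS divergence between univariate Gaussians: the skewed average of
    KL divergences to the normalized geometric mean density p^{1-alpha} q^{alpha},
    which is Gaussian with precision-averaged parameters."""
    prec = (1 - alpha) / s1 ** 2 + alpha / s2 ** 2
    var = 1.0 / prec
    mu = var * ((1 - alpha) * m1 / s1 ** 2 + alpha * m2 / s2 ** 2)
    g_s = math.sqrt(var)
    return (1 - alpha) * kl_gauss(m1, s1, mu, g_s) + alpha * kl_gauss(m2, s2, mu, g_s)

assert geometric_jsd_gauss(0.0, 1.0, 1.0, 2.0) >= 0.0        # non-negative
assert abs(geometric_jsd_gauss(0.0, 1.0, 0.0, 1.0)) < 1e-12  # zero for equal Gaussians
```

With α = 1/2 the construction is symmetric in its two arguments, mirroring the classical Jensen–Shannon symmetrization.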
