Search Results (2)

Search Parameters:
Keywords = tensor metric of nonindependent variables

16 pages, 384 KiB  
Article
Optimal and Efficient Approximations of Gradients of Functions with Nonindependent Variables
by Matieyendou Lamboni
Axioms 2024, 13(7), 426; https://doi.org/10.3390/axioms13070426 - 25 Jun 2024
Cited by 2 | Viewed by 1328
Abstract
Gradients of smooth functions with nonindependent variables are relevant for exploring complex models and for the optimization of functions subject to constraints. In this paper, we investigate new and simple approximations and computations of such gradients by making use of independent, central, and symmetric variables. Such approximations are well suited for applications in which the computation of the gradients is too expensive or impossible. The derived upper bounds of the biases of our approximations do not suffer from the curse of dimensionality for any 2-smooth function, and they theoretically improve the known results. Also, our estimators of such gradients reach the optimal (mean squared error) rate of convergence (i.e., O(N^{-1})) for the same class of functions. Numerical comparisons based on a test case and a high-dimensional PDE model show the efficiency of our approach.
(This article belongs to the Special Issue Recent Research on Functions with Non-Independent Variables)
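To make the idea of the abstract concrete, below is a minimal sketch (not the paper's exact estimator) of gradient approximation via independent, centered, and symmetric perturbations: drawing u ~ N(0, I) and averaging the antithetic central differences (f(x + h u) − f(x − h u)) u / (2h), which is unbiased for the gradient of a linear function and cancels even-order error terms for smooth f. The function name `smoothed_gradient` and all parameter choices are illustrative assumptions.

```python
# Sketch only (not the paper's exact scheme): a randomized central-difference
# gradient estimator using independent, centered, symmetric perturbations.
# For smooth f, E[(f(x + h*u) - f(x - h*u)) * u / (2h)] -> grad f(x) as h -> 0;
# the symmetric (antithetic) pair of evaluations cancels even-order terms.
import random

def smoothed_gradient(f, x, h=1e-3, n_samples=20000, seed=0):
    """Estimate grad f(x) from n_samples Gaussian perturbations u ~ N(0, I)."""
    rng = random.Random(seed)
    d = len(x)
    grad = [0.0] * d
    for _ in range(n_samples):
        u = [rng.gauss(0.0, 1.0) for _ in range(d)]
        xp = [xi + h * ui for xi, ui in zip(x, u)]  # forward point  x + h*u
        xm = [xi - h * ui for xi, ui in zip(x, u)]  # backward point x - h*u
        scale = (f(xp) - f(xm)) / (2.0 * h)
        for i in range(d):
            grad[i] += scale * u[i] / n_samples
    return grad

# Sanity check on a linear test function, where the estimator is unbiased:
# the sample mean should approach the true gradient a = (1, -2, 0.5).
a = [1.0, -2.0, 0.5]
f = lambda x: sum(ai * xi for ai, xi in zip(a, x))
g = smoothed_gradient(f, [0.0, 0.0, 0.0])
```

Note that only two function evaluations are used per sample regardless of dimension, which is the practical appeal of such estimators when gradients are too expensive to compute directly.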
17 pages, 338 KiB  
Article
Derivative Formulas and Gradient of Functions with Non-Independent Variables
by Matieyendou Lamboni
Axioms 2023, 12(9), 845; https://doi.org/10.3390/axioms12090845 - 30 Aug 2023
Cited by 2 | Viewed by 1182
Abstract
Stochastic characterizations of functions subject to constraints result in treating them as functions with non-independent variables. By using the distribution function or copula of the input variables that comply with such constraints, we derive two types of partial derivatives of functions with non-independent variables (i.e., actual and dependent derivatives) and argue in favor of the latter. Dependent partial derivatives of functions with non-independent variables rely on the dependent Jacobian matrix of non-independent variables, which is also used to define a tensor metric. The differential geometric framework allows us to derive the gradient, Hessian, and Taylor-type expansions of functions with non-independent variables.
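The notion of a dependent partial derivative can be illustrated in the simplest special case, where one variable is an explicit function of another, x2 = r(x1): differentiating along the constraint gives the chain rule df/dx1 = ∂f/∂x1 + (∂f/∂x2) r'(x1). The paper derives this in far greater generality via the distribution or copula of the inputs; the sketch below (with the hypothetical helper `dependent_derivative` and an assumed linear constraint) only checks the chain rule numerically on one example.

```python
# Illustrative sketch only: when x2 is tied to x1 by a constraint x2 = r(x1),
# the "dependent" derivative of f along the constraint is the chain rule
#     df/dx1 = df/dx1 (partial) + (df/dx2 partial) * r'(x1).
# Here we approximate it by a central difference along the constrained path.

def dependent_derivative(f, r, x1, h=1e-6):
    """Central difference of x1 -> f(x1, r(x1)) along the constraint."""
    return (f(x1 + h, r(x1 + h)) - f(x1 - h, r(x1 - h))) / (2.0 * h)

f = lambda x1, x2: x1 * x2   # f(x1, x2) = x1 * x2
r = lambda x1: 1.0 - x1      # assumed constraint: x2 = 1 - x1

# Chain rule: df/dx1 = x2 + x1 * r'(x1) = (1 - x1) - x1 = 1 - 2*x1,
# which equals 0.4 at x1 = 0.3.
num = dependent_derivative(f, r, 0.3)
```

The actual partial derivative ∂f/∂x1 = x2 ignores the dependence and gives 0.7 at this point, so the two notions genuinely differ under the constraint.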