Search Results (2)

Search Parameters:
Keywords = symmetrized and perturbed hyperbolic tangent activation function

59 pages, 1417 KB  
Article
Symmetrized Neural Network Operators in Fractional Calculus: Caputo Derivatives, Asymptotic Analysis, and the Voronovskaya–Santos–Sales Theorem
by Rômulo Damasclin Chaves dos Santos, Jorge Henrique de Oliveira Sales and Gislan Silveira Santos
Axioms 2025, 14(7), 510; https://doi.org/10.3390/axioms14070510 - 30 Jun 2025
Viewed by 515
Abstract
This work presents a comprehensive mathematical framework for symmetrized neural network operators operating under the paradigm of fractional calculus. By introducing a perturbed hyperbolic tangent activation, we construct a family of localized, symmetric, and positive kernel-like densities, which form the analytical backbone for three classes of multivariate operators: quasi-interpolation, Kantorovich-type, and quadrature-type. A central theoretical contribution is the derivation of the Voronovskaya–Santos–Sales Theorem, which extends classical asymptotic expansions to the fractional domain, providing rigorous error bounds and normalized remainder terms governed by Caputo derivatives. The operators exhibit key properties such as partition of unity, exponential decay, and scaling invariance, which are essential for stable and accurate approximations in high-dimensional settings and systems governed by nonlocal dynamics. The theoretical framework is thoroughly validated through applications in signal processing and fractional fluid dynamics, including the formulation of nonlocal viscous models and fractional Navier–Stokes equations with memory effects. Numerical experiments demonstrate a relative error reduction of up to 92.5% when compared to classical quasi-interpolation operators, with observed convergence rates reaching O(n^{-1.5}) under Caputo derivatives, using parameters λ = 3.5, q = 1.8, and n = 100. This synergy between neural operator theory, asymptotic analysis, and fractional calculus not only advances the theoretical landscape of function approximation but also provides practical computational tools for addressing complex physical systems characterized by long-range interactions and anomalous diffusion.
(This article belongs to the Special Issue Advances in Fuzzy Logic and Computational Intelligence)
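Both papers build their operators from the same activation. As a point of reference, the following is a minimal Python sketch, assuming the definitions commonly used in this line of work: the perturbed (q-deformed, λ-parametrized) hyperbolic tangent g_{q,λ}(x) = (e^{λx} − q·e^{−λx})/(e^{λx} + q·e^{−λx}), a kernel-like density taken as a scaled difference of shifted activations, and symmetrization by averaging the q and 1/q densities. None of the function names below come from either paper, and λ = 3.5, q = 1.8 are simply the values quoted in the abstract above.

import numpy as np

# Perturbed (q-deformed, lambda-parametrized) hyperbolic tangent:
#   g_{q,lam}(x) = (e^{lam*x} - q*e^{-lam*x}) / (e^{lam*x} + q*e^{-lam*x})
# This reduces to the ordinary tanh at q = 1.
def g(x, q=1.8, lam=3.5):
    ep, em = np.exp(lam * x), np.exp(-lam * x)
    return (ep - q * em) / (ep + q * em)

# Kernel-like density: a scaled difference of shifted activations.
def M(x, q=1.8, lam=3.5):
    return 0.25 * (g(x + 1, q, lam) - g(x - 1, q, lam))

# Symmetrization: averaging the q and 1/q densities yields an even function,
# which is the property the "symmetrized" label refers to.
def Phi(x, q=1.8, lam=3.5):
    return 0.5 * (M(x, q, lam) + M(x, 1.0 / q, lam))

x = np.linspace(-4.0, 4.0, 9)
print(np.allclose(Phi(x), Phi(-x)))  # True: Phi is even

Evenness of Phi is the "symmetrized" property; its exponential decay away from the origin is what the localization and stability claims in both abstracts rest on.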

25 pages, 320 KB  
Article
Multivariate Smooth Symmetrized and Perturbed Hyperbolic Tangent Neural Network Approximation over Infinite Domains
by George A. Anastassiou
Mathematics 2024, 12(23), 3777; https://doi.org/10.3390/math12233777 - 29 Nov 2024
Cited by 1 | Viewed by 692
Abstract
In this article, we study the multivariate quantitative smooth approximation under differentiation of functions. The approximators here are multivariate neural network operators activated by the symmetrized and perturbed hyperbolic tangent activation function. All domains used here are infinite. The multivariate neural network operators are of quasi-interpolation type: the basic type, the Kantorovich type, and the quadrature type. We give pointwise and uniform multivariate approximations with rates. We finish with illustrations.
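To show the mechanism of the quasi-interpolation type in one dimension, here is a hedged sketch of the basic operator A_n(f, x) = Σ_k f(k/n)·Φ(nx − k). The paper's operators are multivariate over infinite domains, so this univariate truncated sum, including the helper names A_n and Phi and the window size, is purely illustrative. It uses the algebraic identity g_{q,λ}(x) = tanh(λx − ln(q)/2) to keep the snippet self-contained.

import numpy as np

# Same symmetrized density as in the sketch above, written compactly via the
# identity g_{q,lam}(x) = tanh(lam*x - ln(q)/2).
def Phi(x, q=1.8, lam=3.5):
    def M(t, qq):
        gg = lambda s: np.tanh(lam * s - 0.5 * np.log(qq))
        return 0.25 * (gg(t + 1) - gg(t - 1))
    return 0.5 * (M(x, q) + M(x, 1.0 / q))

# Hypothetical univariate instance of the basic quasi-interpolation operator
#   A_n(f, x) = sum_k f(k/n) * Phi(n*x - k),
# truncated to a window around n*x because Phi decays exponentially.
def A_n(f, x, n=100, q=1.8, lam=3.5, half_width=60):
    k0 = int(round(n * x))
    ks = np.arange(k0 - half_width, k0 + half_width + 1)
    return float(np.sum(f(ks / n) * Phi(n * x - ks, q, lam)))

print(A_n(np.sin, 0.7), np.sin(0.7))  # the two values should be close

Truncating the sum is justified by the exponential decay of Phi; widening half_width tightens the approximation at the cost of more terms.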