1. Introduction
The author of [1,2], see Sections 2–5, was the first to establish neural network approximation to continuous functions with rates, via very specific neural network operators of Cardaliaguet–Euvrard and “Squashing” types, by using the modulus of continuity of the engaged function or its high-order derivative and producing very tight Jackson-type inequalities. He treats both the univariate and multivariate cases. The “bell-shaped” and “squashing” functions of these operators are assumed to have compact support.
The author, inspired by [3], continued his studies on neural network approximation by introducing and using the proper quasi-interpolation operators of sigmoidal and hyperbolic tangent types, which resulted in [4,5], treating both the univariate and multivariate cases. He also studied the corresponding fractional cases [5].
A parametrized activation function kills far fewer neurons than the original one.
Therefore, here, the author obtains parametrized Richards-curve-activated neural network approximations of differentiated functions from ℝ into ℝ, going beyond functions on a bounded domain.
We present real and complex, ordinary and fractional, quasi-interpolation quantitative approximations. The fractional case is studied extensively because of the applications of fractional calculus to the interpretation of many natural phenomena and to engineering. We derive Jackson-type inequalities that are close to sharp.
Real feed-forward neural networks (FNNs) with one hidden layer, the ones we use here, are mathematically expressed by
$$N_n(x) = \sum_{j=0}^{n} c_j\,\sigma\left(\langle a_j \cdot x\rangle + b_j\right), \qquad x \in \mathbb{R}^{s},\ s \in \mathbb{N},$$
where, for $0 \le j \le n$, $b_j \in \mathbb{R}$ are the thresholds, $a_j \in \mathbb{R}^{s}$ are the connection weights, $c_j \in \mathbb{R}$ are the coefficients, $\langle a_j \cdot x\rangle$ is the inner product of $a_j$ and $x$, and $\sigma$ is the activation function of the network. For more information about neural networks in general, see [6,7,8]. Due to their efficiency, neural network approximations are widely used in many areas, such as differential equations, numerical analysis, statistics, and AI.
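Purely as an illustration of the expression above (not of the specific operators studied below), the following Python sketch evaluates a one-hidden-layer FNN of the assumed standard form N_n(x) = Σ_{j=0}^{n} c_j σ(⟨a_j · x⟩ + b_j); the weights and the logistic choice of σ are arbitrary placeholders.

import numpy as np

def sigmoid(t):
    # Generic sigmoidal activation; the Richards-curve-based activation of [9]
    # would be substituted here.
    return 1.0 / (1.0 + np.exp(-t))

def fnn_one_hidden_layer(x, a, b, c, activation=sigmoid):
    # Evaluate N_n(x) = sum_j c_j * activation(<a_j . x> + b_j),
    # with a of shape (n+1, s), b and c of shape (n+1,), and x in R^s.
    return float(np.sum(c * activation(a @ x + b)))

# Illustrative usage with arbitrary weights (n + 1 = 3 hidden units, s = 3).
rng = np.random.default_rng(0)
a = rng.normal(size=(3, 3))
b = rng.normal(size=3)
c = rng.normal(size=3)
print(fnn_one_hidden_layer(np.array([0.5, -1.0, 2.0]), a, b, c))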
2. Preliminaries
The following come from [9], pp. 2–6.
The Richards curve is as follows:
which is of great interest when
The function increases in terms of and is a sigmoid function; specifically, it is a generalized logistic function [10].
It should be noted that
We consider the following activation function:
which is
, all
.
The function has many applications in epidemiology, especially in COVID-19 modeling of infection trajectories [11].
We can see that
We notice that
Therefore, G is an even function.
We can observe that
that is
Let
. We can observe that
Therefore, for
and
decrease.
Let then, and and then so that and decreases over
Thus, decreases at .
Clearly, increases at , and
We can observe that
That is, the x-axis is the horizontal asymptote for G.
In conclusion, G is a symmetric bell-shaped function with the following maximum:
We need to use the following theorems.
Remark 1. Because G is even, it holds that
Hence
and
Proof. We can observe that
So,
is a density function. □
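The exact Richards-curve-based kernel G is defined in [9] and its formula is not reproduced above; the sketch below only illustrates, numerically, the density property of Remark 1 for a generic bell-shaped kernel of the type used in this line of work, namely the assumed stand-in G(x) = (1/2)(σ(x + 1) − σ(x − 1)) with σ a logistic sigmoid.

import numpy as np

def sigma(t):
    # Logistic sigmoid standing in for the Richards curve of [9].
    return 1.0 / (1.0 + np.exp(-t))

def G(x):
    # Assumed bell-shaped, even kernel built from a sigmoid (not the exact G of [9]).
    return 0.5 * (sigma(x + 1.0) - sigma(x - 1.0))

# Density property (Remark 1): the integral of G over the real line equals 1.
x = np.linspace(-60.0, 60.0, 1200001)
dx = x[1] - x[0]
print(np.sum(G(x)) * dx)        # ~ 1.0

# This kernel is also a partition of unity: sum_k G(x - k) = 1 for every x.
k = np.arange(-200, 201)
print(np.sum(G(0.37 - k)))      # ~ 1.0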
Remark 2. We can obtain that
Let . That is, . Applying the mean value theorem, we obtain
where
We need the following definitions.
Definition 1. In this article, we study the smooth approximation properties of the following quasi-interpolation neural network operators acting on (continuous and bounded functions):
- (i)
- (ii)
The Kantorovich-type operators:
- (iii)
Let , , , , and . We also consider the quadrature-type operators:
We will be using the first modulus of continuity:
where , which is bounded and/or uniformly continuous. We are motivated by the following result.
Theorem 3 ([9], p. 13). Let , , , , . Then,
For (the uniformly continuous and bounded functions), we can obtain , both pointwise and uniformly.
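The displayed formulas of Definition 1 and the exact constants of Theorem 3 are given in [9] and are omitted above. Purely as a hypothetical sketch of the mechanism, the following Python code implements the basic quasi-interpolation form commonly used in this setting, A_n(f, x) = Σ_k f(k/n) G(nx − k) (truncated to finitely many k), together with a grid estimate of the first modulus of continuity ω₁(f, δ) = sup{|f(x) − f(y)| : |x − y| ≤ δ}. The kernel G is the same assumed stand-in as above, and the operator A_n, the truncation bound K, and the grid parameters are illustrative choices only.

import numpy as np

def sigma(t):
    return 1.0 / (1.0 + np.exp(-t))

def G(x):
    # Assumed bell-shaped kernel (stand-in for the activation-based kernel of [9]).
    return 0.5 * (sigma(x + 1.0) - sigma(x - 1.0))

def A_n(f, x, n, K=200):
    # Basic quasi-interpolation operator A_n(f, x) = sum_k f(k/n) G(n x - k),
    # truncated to |k - n x| <= K, since G decays rapidly away from 0.
    k = np.arange(int(np.floor(n * x)) - K, int(np.ceil(n * x)) + K + 1)
    return float(np.sum(f(k / n) * G(n * x - k)))

def omega1(f, delta, a=-5.0, b=5.0, m=4001):
    # Grid estimate of the first modulus of continuity omega_1(f, delta) on [a, b].
    t = np.linspace(a, b, m)
    h = t[1] - t[0]
    smax = max(1, int(delta / h))
    return max(float(np.max(np.abs(f(t[s:]) - f(t[:-s])))) for s in range(1, smax + 1))

# Pointwise convergence A_n(f, x0) -> f(x0), in the spirit of Theorem 3.
f, x0 = np.cos, 0.7
for n in (10, 100, 1000):
    print(n, abs(A_n(f, x0, n) - f(x0)), omega1(f, 1.0 / np.sqrt(n)))

As n grows, the printed error shrinks; this is the qualitative behavior that Theorem 3 quantifies via the modulus of continuity.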
3. Main Results
Here, we study the approximation properties of neural network operators , , under differentiation.
Theorem 4. Here, , , , with , . Then,
- (i)
- (ii)
Assume all ; we have that, at high speed,
- (iii)
- (iv)
Proof. Using Taylor’s theorem, we have
Use the following equation:
(I) Let . Then,
(i) Case :
We found that
(ii) Case : then,
Consequently, we prove that
Next, we can observe
In case , we obtain
Consequently, it holds
Next, we treat
Notice that
Therefore, we have
Hence, it holds that
Thus, we have
and the following holds:
Finally, we estimate
The theorem is proved. □
Next comes
Theorem 5. Here , , , with , . Then,
(ii) Assume all ; then, at high speed,
Proof. One can write
Let now
with
,
.
Therefore, we can write
where
(I) Let ().
(i) If , then
(ii) If , then
Therefore, when , then
Clearly, now the following holds:
(II) Let .
(i) If , then
(ii) If , then
Hence, when , then
Clearly, then
(by [9], p. 6, Theorem 1.4)
We have found that
Therefore, the following holds:
Finally, we estimate
(… as earlier)
Therefore,
The theorem is proved. □
Therefore, the following theorem holds.
Theorem 6. Here, , , , with , . Then,
(ii) Assume all ; then, at high speed,
Proof. We have that
and
Furthermore, the following holds:
where
Use the following equation:
(I) Let .
(i) If , then
(ii) If , then
Therefore, when , then
Clearly, now the following holds:
(II) Let .
(i) If , then
(ii) If , then
So, in general, we obtain
Therefore, the following holds:
Next, we estimate
The theorem is proved. □
We need the following definition.
Definition 2. A function is absolutely continuous over , if is absolutely continuous, for every . We can write , if (absolutely continuous functions over ) and .
Definition 3. Let , ( is the ceiling of the number); . We use the left Caputo fractional derivative ([12,13,14], pp. 49–52), defined as the following function:
, ; where Γ is the gamma function. Note that and exists, i.e., on , ∀.
We set , ∀
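For the reader's convenience, the standard left Caputo fractional derivative from the cited sources ([12,13,14]) reads as follows; the notation (ν for the order, n = ⌈ν⌉, and the base point a) is assumed here, since the displayed formula is omitted above.

% Left Caputo fractional derivative of order \nu > 0, n = \lceil \nu \rceil, f \in AC^{n}([a, b]):
D_{*a}^{\nu} f(x) = \frac{1}{\Gamma(n - \nu)} \int_{a}^{x} (x - t)^{\,n - \nu - 1} f^{(n)}(t)\, dt,
\qquad x \ge a,
% with the conventions D_{*a}^{n} f = f^{(n)} when \nu = n \in \mathbb{N}, and D_{*a}^{0} f = f.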
Lemma 1 (see also [15]). Let , , , and . Then, for any .
Definition 4 (see also [13,16,17]). Let , , . The right Caputo fractional derivative of order is given by
, . We set .
Note that and exists a.e. on , ∀.
Lemma 2 (see also [15]). Let , , . Then, for any .
Proposition 2 (see also [15]). Let , , . Then, is continuous in , . Also, we have
Proposition 3 (see also [15]). Let , , . Then, is continuous in , . We further mention
Proposition 4 (see also [15]). Let , , , and let . Then, is continuous in .
Proposition 5 (see also [15]). Let , , , and let . Then, is continuous in .
Proposition 6 (see also [15]). Let , , ; . Then, are jointly continuous functions in from .
Fractional results follow.
Theorem 7. Let , , , , , , , , . Assume also that both . Then,
(II) Given , , we have
(IV) Adding , we obtain
As shown above, when , the sum
As we can see here, we obtain fractional pointwise and uniform convergence with rates of , with the unit operator, as
Proof. Let . We have that .
From [12], p. 54, we can derive the left Caputo fractional Taylor’s formula:
for all
Also, from [16], using the right Caputo fractional Taylor’s formula, we can obtain the following:
for all
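In their standard form (from [12], p. 54, and [16]; the expansion point x₀, the order ν, and N = ⌈ν⌉ are assumed notation here, since the displays are omitted above), the two fractional Taylor formulas invoked read:

% Left Caputo fractional Taylor formula (valid for x \ge x_0):
f(x) = \sum_{j=0}^{N-1} \frac{f^{(j)}(x_0)}{j!} (x - x_0)^{j}
     + \frac{1}{\Gamma(\nu)} \int_{x_0}^{x} (x - t)^{\nu - 1} D_{*x_0}^{\nu} f(t)\, dt .

% Right Caputo fractional Taylor formula (valid for x \le x_0):
f(x) = \sum_{j=0}^{N-1} \frac{f^{(j)}(x_0)}{j!} (x - x_0)^{j}
     + \frac{1}{\Gamma(\nu)} \int_{x}^{x_0} (t - x)^{\nu - 1} D_{x_0 -}^{\nu} f(t)\, dt .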
Hence, the following holds:
for all , iff
and
for all , iff
We have that
Therefore, the following holds:
and
Adding the last two equalities (100) and (101), we can obtain:
where
with
and
Furthermore, let
for
and
for
Let
; we derive the following:
Also, we obtain the following:
and
Therefore, the following holds:
Next, we estimate
(by
)
Hence, the following holds:
Furthermore, we can obtain:
As was shown earlier, we can obtain that
We have that
and
.
Therefore, the following holds:
Thus, it is reasonable to assume that .
Consequently, the following holds:
The theorem is now proved. □
4. Applications for
We obtain the following results:
Corollary 1. Here, , , , , with ; . Then,
(II) Assume ; we have that, at high speed,
Proof. Use Theorem 4 for □
Corollary 2. Here, , , , , with ; . Then,
(II) Assume ; we have that, at high speed,
Proof. Use Theorems 5 and 6 for □
Corollary 3. Let , , , , , , . Assume also that . Then,
Proof. Use Theorem 7 for □
Next is the case of
Corollary 4. Let , , , , , . Assume that . Then,
Proof. Use Corollary 3. □
5. Complex Neural Network Approximation
Remark 3. Let , with real and imaginary parts , . Clearly, f is continuous if and are continuous.
Also, holds for all , given that , . Here, the following are defined:
We observe here that
and
and
Using , we can denote the space of continuous and bounded functions . Clearly, f is bounded if both are bounded from into , where
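Remark 3 reduces the complex case to the real one componentwise; a minimal sketch of this reduction (reusing the illustrative operator A_n and kernel G defined in the earlier sketch, which are assumed stand-ins rather than the paper's exact operators) follows.

import numpy as np

def complex_operator(op, f, x, n):
    # Apply a real neural network operator 'op' to f = f1 + i f2 componentwise,
    # as in Remark 3: L_n(f, x) = L_n(f1, x) + i L_n(f2, x).
    f1 = lambda t: np.real(f(t))
    f2 = lambda t: np.imag(f(t))
    return op(f1, x, n) + 1j * op(f2, x, n)

# Illustrative usage with the A_n sketch from above and f(t) = e^{i t},
# whose real and imaginary parts are cos and sin.
f = lambda t: np.exp(1j * t)
print(complex_operator(A_n, f, 0.7, 200), f(0.7))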
Theorem 8. Let , such that . Assume , , with , . Here, , , . Then,
(II) Given that , we have
Theorem 9. All are presented as in Theorem 8. Then,
(II) Given that , we have
Proof. Use Theorems 5 and 6. □
We finish with the following fractional result.
Theorem 10. Let , such that . Assume , , . Here , , , , , , . Suppose also that . Then,
(II) Given , , we have
and
Conclusions: The author used parametrized Richards-curve-activated neural network approximations of differentiated functions from ℝ into ℝ, going beyond functions on a bounded domain. He presented real and complex, ordinary and fractional, quasi-interpolation quantitative approximations. The results are totally new.