Abstract
In this paper, we consider the almost sure exponential stability of uncertain stochastic Hopfield neural networks based on subadditive measures. First, we deduce two corollaries of the Itô–Liu formula. Then, we introduce the concept of almost sure exponential stability for uncertain stochastic Hopfield neural networks. Next, we investigate the almost sure exponential stability of such networks using the Lyapunov method, the Liu inequality, the Liu lemma, and the exponential martingale inequality, and we prove two sufficient conditions for almost sure exponential stability. Furthermore, we consider stabilization with linear uncertain stochastic perturbation and present some illustrative examples. Finally, we present our conclusions.
Keywords:
Hopfield neural networks; chance theory; almost sure exponential stability; Lyapunov method
MSC:
65C99; 82C32
1. Introduction
An artificial neural network (ANN) is a computational model inspired by the human brain. ANNs comprise interconnected neurons that process and transmit information. ANNs excel in parallel processing and handling complex, nonlinear problems. ANNs learn from data, recognize patterns, and solve tasks like image recognition and natural language processing. With different architectures such as feedforward, recurrent, and convolutional networks, ANNs have become a crucial component of modern artificial intelligence, enabling machines to learn, adapt, and perform tasks that have traditionally required human intelligence. The Hopfield neural network, as a type of ANN [1], has witnessed steady advancement and intensive investigation over the past few decades, leading to a rich reservoir of research outcomes that have found widespread applications across diverse domains, including combinatorial optimization [2], signal processing [3], pattern recognition [4], and robust control [5]. However, the successful application of neural networks in these fields is closely linked to their dynamic behavior, and stochastic stability is the most important property [6,7,8,9,10,11,12,13]. The above literature shows that the ability of a neural network to maintain stochastic stability (exponential stability and instability [6], exponential stability with time delay [7,8], global stability of stochastic high-order neural networks [9], mean square exponential stability with time-varying delays [10], mean square global asymptotic stability with distributed delays [11], and almost sure exponential stability [12,13]) is crucial for its overall performance, especially when dealing with complex processes. Hence, significant efforts have been directed towards exploring and enhancing the stability of neural networks.
It is well known that stability is the crucial property of stochastic neural networks, which are often affected simultaneously by parameter uncertainties and random interference factors that can impact their stability due to reasons such as system modeling, measurement errors, and system linearization, as documented in Refs. [14,15,16,17]. For example, Huang et al. [14] examined the exponential stability analysis of uncertain stochastic neural networks with multiple delays, and Wang et al. [15] studied the exponential stability of uncertain stochastic neural networks with mixed time delays. Chen et al. [16] investigated the mean square exponential stability of uncertain stochastic delayed neural networks, and Syed [17] surveyed the stochastic stability of uncertain recurrent neural networks with Markovian jumping parameters. However, these studies [14,15,16,17] only focused on the robust stability and asymptotic stability of stochastic neural networks with uncertain parameters, while the almost sure exponential stability of neural networks with both uncertain and random disturbances remains unexplored.
As noted above, the stochastic differential equation is a good tool for describing the stability of a stochastic neural network, and the dynamics of the stochastic differential system may be influenced by many other unknown, uncertain, and random disturbances. To address these, Itô [18] established the theory of stochastic analysis and stochastic differential equations with the Wiener process based on additive measures. Over the past 70 years, stochastic differential equations have matured, both in theory and practice, and they have become a vital tool in fields such as physics, systems science, management science, finance, and space science, especially in the development of stochastic stability, as in [19,20,21,22]. An uncertain process, on the other hand, is a sequence of uncertain variables, with subadditive measures, that change over time. Liu [23] introduced the concept of a Liu process, which is the uncertain counterpart of the Wiener process, in 2008. The Liu process is a Lipschitz continuous process with stationary and independent increments, and its increments follow a normal uncertainty distribution. Based on this process, Liu [24] introduced the chain rule of uncertainty analysis to study the differentials and integrals of functions of uncertain processes, as well as a class of differential equations driven by standard Liu processes, called uncertain differential equations [25]. Consequently, the stability of uncertain differential equations was discussed. When faced with a system that exhibits both uncertainty and randomness simultaneously, the noise should be modeled using the Wiener–Liu process, and the system evolution can be described through a hybrid differential equation, leading to the development of uncertain stochastic hybrid neural network systems [26]. In 2013, Liu [27] first introduced chance theory to investigate such uncertain stochastic systems based on subadditive measures, and subsequent works by Fei et al.
[28,29] have further explored the use of the Wiener–Liu process and the Itô–Liu formula in uncertain stochastic differential equations. Researchers have made progress in studying various forms of the stability of stochastic neural networks based on additive measures, but the analysis of indeterminate neural networks, including both random and uncertain factors, requires chance theory’s subadditive measures. This paper will review some research results based on chance theory, exploring the stability of uncertain stochastic neural networks using the Itô–Liu formula and the Lyapunov method. The main contributions of this paper are the extension of two corollaries of the Itô–Liu formula under subadditive measures, the introduction of the concept of almost sure exponential stability for uncertain stochastic Hopfield neural networks for the first time, and the consideration of sufficient conditions for almost sure exponential stability and stabilization with linear uncertain stochastic perturbation.
In Section 2, we recall some results about Hopfield neural networks and some concepts, lemmas, theorems, and corollaries about chance theory, which are essential for our analysis. In Section 3, we present our main results about the almost sure exponential stability of uncertain stochastic neural networks. In Section 4, we present our conclusion.
2. Preliminaries
2.1. The Explanation of Symbols
We add a table of nomenclature so that the symbols used in this paper can be referenced easily (Table 1).
Table 1.
The explanation of symbols related to this paper.
2.2. The Basic Knowledge
A Hopfield neural network [1] can be described in the form of an ordinary differential equation as follows:
where denotes the voltage on the input of the ith neuron, denotes the input capacitance, is the connection matrix element, is a nondecreasing transfer function, see Table 1, and ; the following is the slope of at , satisfying
where determines the upper bound of the function and is denoted by
then,
where
Furthermore,
It is easy to see that, for any given initial condition , the equation has a unique solution. In particular, the equation has a unique equilibrium solution . In other words, the zero point is the equilibrium point of the neural network system. The aim of this paper is to investigate the effects of uncertain stochastic perturbations on stability. The following reviews chance theory, including some concepts, lemmas, theorems, and corollaries that are essential for our analysis.
Let be a nonempty set, and a -algebra over . Each element in is called an event and is the belief degree. The uncertain measure dealing with belief degree satisfies the following axioms [23,25]:
- Axiom 1 (Normality Axiom). for the universal set .
- Axiom 2 (Duality Axiom). for any event .
- Axiom 3 (Subadditivity Axiom). For every countable sequence of events ,holds.
- Axiom 4 (Product Axiom). Let be uncertainty spaces for . The product uncertain measure is an uncertain measure satisfyingwhere are arbitrary events chosen from for , respectively.
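Since the displayed formulas above were lost in extraction, and the four axioms are standard in uncertainty theory [23,25], they can be written explicitly as:

```latex
\begin{align*}
&\text{Axiom 1 (Normality):} && \mathcal{M}\{\Gamma\} = 1 \ \text{ for the universal set } \Gamma,\\
&\text{Axiom 2 (Duality):} && \mathcal{M}\{\Lambda\} + \mathcal{M}\{\Lambda^{c}\} = 1 \ \text{ for any event } \Lambda,\\
&\text{Axiom 3 (Subadditivity):} && \mathcal{M}\Big\{\bigcup_{i=1}^{\infty} \Lambda_i\Big\} \le \sum_{i=1}^{\infty} \mathcal{M}\{\Lambda_i\},\\
&\text{Axiom 4 (Product):} && \mathcal{M}\Big\{\prod_{k=1}^{\infty} \Lambda_k\Big\} = \bigwedge_{k=1}^{\infty} \mathcal{M}_k\{\Lambda_k\},
\end{align*}
```

where $\wedge$ denotes the minimum operation.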
Remark 1.
Axioms 1 and 2 are analogous to those of probability theory, while Axioms 3 and 4 are fundamentally different. In particular, Axiom 3 embodies subadditivity, in contrast to the additivity of probability theory, and the product axiom (Axiom 4) employs the minimum operation, in contrast to the product rule of probability theory. A detailed analysis can be found in Refs. [23,25].
Definition 1
([23]). An uncertain variable is a measurable function ξ from an uncertainty space to the set of real numbers, i.e., for any Borel set B of real numbers, the set
is an event.
Definition 2
([23]). Let T be an index set and an uncertainty space. An uncertain process is a measurable function from to the set of real numbers such that is an event for any Borel set B for each time k.
Definition 3
([23]). An uncertain process is said to be a Liu process if
- (i) and almost all sample paths are Lipschitz continuous;
- (ii) has stationary and independent increments;
- (iii) every increment is a normal uncertain variable with expected value 0 and variance , whose uncertainty distribution is
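In the standard formulation [23], the increment $C_{s+t} - C_s$ has expected value $0$ and variance $t^2$, with normal uncertainty distribution:

```latex
\Phi_t(x) = \left( 1 + \exp\!\left( \frac{-\pi x}{\sqrt{3}\, t} \right) \right)^{-1}, \qquad x \in \mathbb{R}.
```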
Definition 4
([23]). Let be an uncertain process with respect to time k and be a Liu process with respect to time k. For any partition of closed interval with , the mesh is written as
Then, the uncertain integral of with respect to is
provided that the limit exists almost surely and is finite. In this case, the uncertain process is said to be integrable.
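With the mesh $\Delta$ as above, the uncertain integral in this definition is the limit of Riemann-type sums, in the standard notation of [23]:

```latex
\int_{a}^{b} X_t \,\mathrm{d}C_t \;=\; \lim_{\Delta \to 0} \sum_{i=1}^{n} X_{t_i}\,\bigl(C_{t_{i+1}} - C_{t_i}\bigr).
```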
Lemma 1
([25] (Liu inequality)). Let be a Liu process on uncertainty space . Then, there exists an uncertain variable K such that is a Lipschitz constant of the sample path for each γ,
and
Lemma 2
([26] (Liu lemma)). Suppose that is a Liu process, and is an integrable uncertain process on with respect to k. Then, the inequality
holds, where is the Lipschitz constant of the sample path .
Let be a complete probability space with a filtration satisfying the usual conditions, that is, it is increasing and right continuous while contains all -null sets.
Let be an uncertainty space on which the normality, duality, subadditivity, and product measure axioms hold. Let be a Liu process defined on . The Liu process filtration is the sub-σ-field family () of satisfying the usual conditions. It is generated by and the -null sets of , .
Liu [27] first introduced chance theory to investigate hybrid systems exhibiting both uncertainty (belief degree) and randomness. To investigate uncertain stochastic differential systems, Fei [29] constructed a filtered chance space, on which some concepts and theorems are presented as follows.
Definition 5 ([29]).
(i) Let B be a Borel set; an uncertain random variable is a measurable function (or ) from a chance space
to (or ), that is, (or ), so the set
(ii) , is an uncertain random event with chance measure
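The chance measure of an uncertain random event $\Theta$ is defined in [27,29] by averaging the uncertain measure over the probability space:

```latex
\mathrm{Ch}\{\Theta\} \;=\; \int_{0}^{1} \Pr\bigl\{\, \omega \in \Omega : \mathcal{M}\{\gamma \in \Gamma : (\gamma,\omega) \in \Theta\} \ge x \,\bigr\}\,\mathrm{d}x.
```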
Definition 6 ([29]).
(a) An uncertain stochastic process is essentially a sequence of uncertain random variables indexed by time. For each time k, if is an uncertain random variable, then we call an uncertain stochastic process (or hybrid process). If the sample paths of are continuous functions of k for almost all , then we call it continuous.
(b) If is -measurable for all then we call it -adapted. Further, if is -measurable for all then we call it -adapted (or adapted).
(c) If the uncertain stochastic process is measurable related to the σ-algebra
then we call it progressively measurable.
Further, if the uncertain stochastic process (or is progressively measurable and satisfies , then we call it -progressively measurable, where (or )) denotes the set of -progressively measurable uncertain random processes.
Definition 7
([28]). Let be a Wiener process and a Liu process. Then, is called a Wiener–Liu process. The Wiener–Liu process is said to be standard if both and are standard.
Definition 8
([28]). Let , where and are scalar uncertain stochastic processes, and let be a standard Wiener–Liu process. For any partition of a closed interval with , the mesh is written as
Then, the uncertain stochastic integral of with respect to is
provided that the limit exists almost surely and is finite. In this case, the uncertain stochastic process is said to be integrable.
Remark 2.
The uncertain stochastic integral may also be written as follows:
The following theorem gives the Itô–Liu formula in the one-dimensional case.
Theorem 1
([28] (Itô–Liu formula)). Let be a Wiener–Liu process given by
Let be a Wiener process and a Liu process, and a twice continuously differentiable function. Define Then, we have the following chain rule:
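In the notation of [28], writing $Z_t = h(t, W_t, C_t)$ for a twice continuously differentiable $h$, the Itô–Liu chain rule reads:

```latex
\mathrm{d}Z_t = \frac{\partial h}{\partial t}(t, W_t, C_t)\,\mathrm{d}t
+ \frac{\partial h}{\partial w}(t, W_t, C_t)\,\mathrm{d}W_t
+ \frac{\partial h}{\partial c}(t, W_t, C_t)\,\mathrm{d}C_t
+ \frac{1}{2}\frac{\partial^2 h}{\partial w^2}(t, W_t, C_t)\,\mathrm{d}t.
```

Note that the second-order correction involves only the Wiener component: the Liu process has Lipschitz continuous sample paths and hence zero quadratic variation.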
Using Theorem 1, we can easily obtain the following two corollaries.
Corollary 1.
The infinitesimal increments and may be replaced with the derived Wiener–Liu process,
where and are absolutely integrable uncertain stochastic processes, and is a square integrable uncertain stochastic process; then, ( denotes twice continuously differentiable), thus producing
Let and be a p-dimensional standard Wiener process and a q-dimensional standard Liu process, respectively. If and are absolute integrable hybrid processes, and are square integrable hybrid processes, for then the m-dimensional hybrid process is given by
or, in matrix notation, simply
where
Corollary 2.
Assume m-dimensional hybrid process is given by
Let be a multivariate continuously differentiable function. Define . Then,
where , , for And
In other words, it can be expressed as
Definition 9
([28]). Suppose is a standard Wiener process, is a standard Liu process, and , , and h are given functions. Then,
is called an uncertain stochastic differential equation.
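In generic notation matching the structure in [28] (the paper's own symbols were lost in extraction), such an equation has the form:

```latex
\mathrm{d}Z_t = f(t, Z_t)\,\mathrm{d}t + g(t, Z_t)\,\mathrm{d}W_t + h(t, Z_t)\,\mathrm{d}C_t,
```

where $W_t$ is a standard Wiener process and $C_t$ is a standard Liu process.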
3. Main Results
Let us consider a hypothetical scenario in which an uncertain stochastic perturbation is introduced to the neural network, and as a result, the perturbed network can be modeled using an uncertain stochastic differential equation.
where denotes an n-dimensional Wiener process and (i.e., . Additionally, let and i.e., . In addition, and satisfy the Lipschitz condition and the linear growth condition. Consequently, we can deduce from Refs. [28,29] that for , Equation (8) possesses a unique global solution , assuming for the sake of stability in this paper. As a result, Equation (8) possesses an equilibrium solution . Additionally, when , uniqueness holds with chance measure one, that is, for all almost surely.
In contrast to Equation (3), Equation (8) represents a system with an uncertain stochastic perturbation. It is intriguing to explore the influence of uncertain stochastic perturbation on the stability characteristics of the neural network. In the next section, we will delve into these issues in great depth.
3.1. Almost Sure Exponential Stability
Definition 10.
Firstly, we assume that Equation (8) has a solution . Further, we assume that there exist two measure sets, and , such that for any and for all and , the nonzero solution of Equation (8) when satisfies the following condition:
then, we call the uncertain stochastic neural network (8) almost surely exponentially stable, simply denoted as
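In the classical probabilistic setting [6,12], almost sure exponential stability of the trivial solution is expressed by the sample Lyapunov exponent being negative; the chance-theoretic condition used here has the same shape:

```latex
\limsup_{t \to \infty} \frac{1}{t} \log \bigl| x(t; x_0) \bigr| < 0 \quad \text{almost surely},
```

with "almost surely" understood with respect to the chance measure $\mathrm{Ch}$ rather than a probability measure.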
Theorem 2.
Proof.
Take the Lyapunov function
Choose any nonzero value of and define as . By the uniqueness of the solution, is almost surely nonzero for all . The Itô–Liu formula implies that
Considering condition (11), we obtain
where
for all , where is an uncertain process and , and
which is a continuous martingale that vanishes at . Its quadratic variation is denoted by . That is,
By condition (12), we obtain
Let and be arbitrary. The exponential martingale inequality implies
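For reference, the classical exponential martingale inequality for a continuous martingale $M_t$ with $M_0 = 0$ and quadratic variation $\langle M \rangle_t$ states that, for any $\alpha, \beta > 0$ and $T > 0$:

```latex
\Pr\left\{ \sup_{0 \le t \le T} \left[ M_t - \frac{\alpha}{2} \langle M \rangle_t \right] > \beta \right\} \le e^{-\alpha\beta}.
```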
Therefore, according to the Borel–Cantelli lemma, it follows that there exists a random integer for almost every , such that for all , the following holds:
that is,
By condition (13), for any event , we have
where , is a Lipschitz constant of . By Lemma 1, for , there exists positive , such that
namely, , , such that
Substituting this into (15) yields
for all and , almost surely. By (16), we can obtain that
for all and , almost surely. So, for almost all , if and , then
Letting , we obtain
Because P is a symmetric positive definite matrix, the minimum eigenvalue , and then
Thus
Thus
We complete the proof. □
By Theorem 2, the following two sufficient conclusions can be obtained.
Theorem 3.
Suppose (2) is satisfied, and there exists a diagonal matrix where for all i. Let , be real numbers, and let the constant such that
for all . Denote by (Q) the largest eigenvalue of the symmetric matrix , where is defined as follows:
Then, the solution of Equation (8) satisfies
(i) if
(ii) if
whenever .
Proof.
It holds from (2) that
Thus, when ,
We can easily arrive at conclusion (18) by applying Theorem 2. Additionally, when ,
By utilizing Theorem 2 once more, we can arrive at conclusion (19). Hence, we complete the proof. □
Theorem 4.
Proof.
By condition, we can obtain that
Hence
So, by Theorem 2 again, we complete the proof. □
Theorem 5.
Moreover, assume
hold for all , where and are constants. Then, the solution to Equation (8) holds that
, or
, whenever , where
Proof.
By condition, we can obtain that
and
Therefore, in the case ,
When , applying Theorem 2 with P being the identity matrix, we can deduce that
When ,
It follows from Theorem 2 again that
We complete the proof. □
3.2. Stabilization by Linear Uncertain Stochastic Perturbation
We are aware that neural network
can sometimes be unstable. One might assume that subjecting an unstable neural network to an uncertain stochastic perturbation would make it behave even worse, or become more unstable. However, this is not always the case: uncertain stochastic perturbation can actually stabilize an unstable neural network. In this section, we will demonstrate that any neural network of the form (3) can be stabilized by uncertain stochastic perturbation. For practical purposes, we consider only linear uncertain stochastic perturbations, that is, perturbations of the form:
i.e., where are all matrices. In this case, the uncertain stochastic perturbed network (8) becomes
Note that
and
The proof can be obtained easily by Theorem 2, which we omit here.
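As a purely illustrative numerical sketch, outside the paper's formal development: the stabilizing effect of multiplicative noise can already be seen on the scalar test equation $dx = x\,dt + \sigma x\,dW_t$, whose almost sure Lyapunov exponent $1 - \sigma^2/2$ becomes negative once $\sigma^2 > 2$. The Liu-process component is omitted here, since simulating uncertain processes requires separate machinery; all names below are hypothetical.

```python
import math
import random

def lyapunov_estimate(sigma, T=50.0, dt=1e-3, seed=0):
    """Euler-Maruyama estimate of the sample Lyapunov exponent
    (1/T) * log|x(T)| for dx = x dt + sigma * x dW, x(0) = 1."""
    rng = random.Random(seed)
    n = int(T / dt)
    sqrt_dt = math.sqrt(dt)
    log_x = 0.0  # track log|x| to avoid overflow/underflow
    for _ in range(n):
        xi = rng.gauss(0.0, 1.0)  # scaled Wiener increment
        # multiplicative Euler step: x -> x * (1 + dt + sigma*sqrt(dt)*xi)
        log_x += math.log(abs(1.0 + dt + sigma * sqrt_dt * xi))
    return log_x / T

print(lyapunov_estimate(0.0))  # no noise: exponent close to +1 (unstable)
print(lyapunov_estimate(2.0))  # strong noise: theory predicts 1 - sigma^2/2 = -1
```

With $\sigma = 0$ the unperturbed equation grows exponentially, while a sufficiently strong noise intensity drives the sample Lyapunov exponent negative, mirroring the stabilization-by-perturbation phenomenon discussed in this section.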
3.3. Some Examples
Example 1.
The parameters denote the strength of the stochastic and uncertain perturbations, respectively. By selecting the identity matrix as the value of P, we observe that
and
Similarly, we have
Remark 3.
If we set for , then Equation (32) simplifies even further to
here, we rely only on a scalar Wiener–Liu process as the source of the uncertain stochastic perturbation. This uncertain stochastic network is almost surely exponentially stable provided
The neural network described by can be stabilized by incorporating a sufficiently strong uncertain stochastic perturbation in a suitable way. In other words, this simple example illustrates the following corollary.
Corollary 3.
If (2) is satisfied, a Wiener–Liu process can stabilize any neural network with the given form
Notably, it is also feasible to utilize a single scalar Wiener–Liu process for this purpose.
Example 2.
For each l, choose a positive definite matrix and such that
Numerous matrices satisfy these conditions. Let ζ be a real number and define . Let ϑ be a real number and define . Then, Equation (29) becomes
And let P be the identity matrix, noting that
and
Example 3.
We examine the scenario where the network’s dimension, denoted as m, is an even number, specifically . Suppose we set n to 1, meaning we select a scalar Wiener–Liu process . Additionally, let ζ be a real number and P the identity matrix again; then, we define that
Then, Equation (29) becomes
Note that
and
Remark 4.
Different from the almost sure exponential stability of stochastic Hopfield neural networks based on the probability theory of additive measures [6,12], uncertain stochastic Hopfield neural networks are more complex in terms of the conditions and processes required for almost sure exponential stability, such as the conditions of Theorems 2–6. In addition, we use the Itô–Liu formula, the Liu inequality (Lemma 1), the Liu lemma (Lemma 2), etc., and these conclusions are all obtained using subadditive measures.
Remark 5.
The practical significance of almost sure exponential stability in uncertain stochastic Hopfield neural networks is that it ensures robust and reliable performance in real-world applications, such as image or speech recognition, financial analysis, or control systems. Almost sure exponential stability enables the network to reliably handle uncertainties and variations in the input data. It improves the neural network’s ability to generalize and make accurate predictions, even when faced with Liu noises and Wiener noises. This stability increases the neural network’s practical usefulness and applicability in real-world scenarios.
4. Conclusions
The main focus of this paper is the stability of Hopfield neural network dynamical systems with uncertain stochastic perturbations. The paper presents a theorem for judging the stability of such systems, along with two sufficient conditions for stability. The stability of neural network systems with linear uncertain stochastic perturbations is studied in order to facilitate the discussion. We note that uncertain stochastic neural networks can be divided into two types: one has uncertain stochastic neuron activation functions, such as the Boltzmann machine model, and the other has uncertain stochastic weighted connections. Therefore, when considering uncertain stochastic neural networks, both of these cases should be considered. The uncertain stochastic neural network model studied in this paper is of the second type, involving neural networks with uncertain stochastic weighted connections. Overall, this paper provides a valuable contribution to the field of neural networks by considering the effects of both stochastic and uncertain elements on network stability and proposing methods for analyzing such systems. This work can also be extended to the two-layer cellular neural network, the impulsive model, or the reaction-diffusion model, as in Refs. [30,31,32]. There are currently no corresponding research results for neural networks with uncertain stochastic neuron activation functions, uncertain stochastic two-layer cellular neural networks, the uncertain stochastic impulsive model, or the reaction-diffusion model, and researchers can develop these areas in the near future.
Author Contributions
Conceptualization, Z.J. and C.L.; methodology, Z.J. and C.L.; software, Z.J. and C.L.; validation, Z.J. and C.L.; formal analysis, Z.J.; investigation, Z.J. and C.L.; writing—original draft preparation, Z.J.; writing—review and editing, Z.J. and C.L.; supervision, C.L. All authors have read and agreed to the published version of the manuscript.
Funding
This work was supported in part by the Natural Science Foundation of Ningxia (no. 2020AAC03242), Major Projects of North Minzu University (no. ZDZX201805), the Governance and Social Management Research Center of Northwest Ethnic Regions, and the First-Class Disciplines Foundation of Ningxia (Grant No. NXYLXK2017B09).
Data Availability Statement
Not applicable.
Conflicts of Interest
The authors declare no conflict of interest.
References
- Hopfield, J.; Tank, D. Neural computation of decision in optimization problems. Biol. Cybern. 1985, 52, 141–152. [Google Scholar] [CrossRef] [PubMed]
- Haykin, S. Neural Networks: A Comprehensive Foundation; Prentice Hall: Hoboken, NJ, USA, 1998. [Google Scholar]
- Joya, G.; Atencia, M.A.; Soval, F. Hopfield neural networks for optimization: Study of the different dynamics. Neurocomputing 2002, 43, 219–237. [Google Scholar] [CrossRef]
- Young, S.S.; Scott, P.D.; Nasrabadi, N.M. Object recognition using multilayer Hopfield neural network. IEEE Trans. Image Process. 1997, 6, 357–372. [Google Scholar] [CrossRef] [PubMed]
- Wang, Y.Y.; Xie, L.H.; De Souza, C.E. Robust control of a class of uncertain nonlinear systems. Syst. Control Lett. 1992, 19, 139–149. [Google Scholar] [CrossRef]
- Liao, X.; Mao, X. Exponential stability and instability of stochastic neural networks. Stoch. Ann. Appl. 1996, 14, 165–185. [Google Scholar] [CrossRef]
- He, Y.; Liu, G.P.; Rees, D.; Wu, M. Stability analysis for neural networks with time-varying interval delay. IEEE Trans. Neural Netw. 2007, 18, 1850–1854. [Google Scholar] [CrossRef]
- Wang, Q.; Liu, X.Z. Exponential stability of impulsive cellular neural networks with time delays via Lyapunov functions. Appl. Math. Comput. 2007, 194, 186–198. [Google Scholar]
- Wang, Z.D.; Fang, J.A.; Liu, X.H. Global stability of stochastic high-order neural networks with discrete and distributed delays. Chaos Solitons Fractals 2008, 36, 388–396. [Google Scholar] [CrossRef]
- Huang, C.X.; He, Y.G.; Wang, H.N. Mean square exponential stability of stochastic recurrent neural networks with time-varying delays. Comput. Math. Appl. 2008, 56, 1773–1778. [Google Scholar] [CrossRef]
- Guo, Y.X. Mean square global asymptotic stability of stochastic recurrent neural networks with distributed delays. Appl. Math. Comput. 2009, 215, 791–795. [Google Scholar] [CrossRef]
- Liu, L.; Zhu, Q.X. Almost sure exponential stability of numerical solutions to stochastic delay Hopfield neural networks. Appl. Math. Comput. 2015, 266, 698–712. [Google Scholar]
- Zhao, Y.; Zhu, Q.X. Stabilization of stochastic highly nonlinear delay systems with neutral term. IEEE Trans. Autom. Control 2023, 68, 2544–2551. [Google Scholar] [CrossRef]
- Huang, H.; Cao, J. Exponential stability analysis of uncertain stochastic neural networks with multiple delays. Nonlinear Anal. Real World Appl. 2007, 8, 646–653. [Google Scholar] [CrossRef]
- Wang, Z.; Lauria, S.; Fang, J.; Liu, X. Exponential stability of uncertain stochastic neural networks with mixed time-delays. Chaos Solitons Fractals 2007, 32, 62–72. [Google Scholar] [CrossRef]
- Chen, W.H.; Lu, X.M. Mean square exponential stability of uncertain stochastic delayed neural networks. Phys. Lett. A 2008, 372, 1061–1069. [Google Scholar] [CrossRef]
- Ali, M.S. Stochastic stability of uncertain recurrent neural networks with Markovian jumping parameters. Acta Math. Sci. 2015, 35, 1122–1136. [Google Scholar]
- Itô, K. On stochastic differential equations. Mem. Am. Math. Soc. 1951, 4, 1–51. [Google Scholar]
- Yu, J.J.; Zhang, K.J.; Fei, S.M. Further results on mean square exponential stability of uncertain stochastic delayed neural networks. Commun. Nonlinear Sci. Numer. Simul. 2009, 14, 1582–1589. [Google Scholar] [CrossRef]
- Deng, F.Q.; Luo, Q.; Mao, X.R. Stochastic stabilization of hybrid differential equations. Automatica 2012, 48, 2321–2328. [Google Scholar] [CrossRef]
- Guo, Q.; Mao, X.R.; Yue, R.X. Almost sure exponential stability of stochastic differential delay equations. SIAM J. Control Optim. 2016, 54, 1919–1933. [Google Scholar] [CrossRef]
- Zhu, Q.X. Stabilization of stochastic nonlinear delay systems with exogenous disturbances and the event-triggered feedback control. IEEE Trans. Autom. Control 2019, 64, 3764–3771. [Google Scholar] [CrossRef]
- Liu, B. Fuzzy process, hybrid process and uncertain process. J. Uncertain Syst. 2008, 2, 3–16. [Google Scholar]
- Liu, B. Some research problems in uncertainty theory. J. Uncertain Syst. 2009, 3, 3–10. [Google Scholar]
- Chen, X.; Liu, B. Existence and uniqueness theorem for uncertain differential equations. Fuzzy Optim. Decis. Mak. 2010, 9, 69–81. [Google Scholar] [CrossRef]
- Yao, K.; Gao, J.; Gao, Y. Some stability theorems of uncertain differential equation. Fuzzy Optim. Decis. Mak. 2013, 12, 3–13. [Google Scholar] [CrossRef]
- Liu, Y. Uncertain random variables: A mixture of uncertainty and randomness. Soft Comput. 2013, 17, 625–634. [Google Scholar] [CrossRef]
- Fei, W. Optimal control of uncertain stochastic systems with markovian switching and its applications to portfolio decisions. Cybern. Syst. 2014, 45, 69–88. [Google Scholar] [CrossRef]
- Fei, W. On existence and uniqueness of solutions to uncertain backward stochastic differential equations. Appl. Math. 2014, 29, 53–66. [Google Scholar] [CrossRef]
- Arena, P.; Baglio, S.; Fortuna, L.; Manganaro, G. Self-organization in a two-layer CNN. IEEE Trans. Autom. Control 1998, 45, 157–162. [Google Scholar] [CrossRef]
- Zhang, T.W.; Xiong, L.L. Periodic motion for impulsive fractional functional differential equations with piecewise Caputo derivative. Appl. Math. Lett. 2020, 101, 106072. [Google Scholar] [CrossRef]
- Huang, H.; Zhao, K.; Liu, X. On solvability of BVP for a coupled Hadamard fractional systems involving fractional derivative impulses. AIMS Math. 2022, 7, 19221–19236. [Google Scholar] [CrossRef]