Article

Dynamics of Hopfield-Type Neural Networks with Modulo Periodic Unpredictable Synaptic Connections, Rates and Inputs

by Marat Akhmet 1,*, Madina Tleubergenova 2,3 and Akylbek Zhamanshin 1,2
1 Department of Mathematics, Middle East Technical University, Ankara 06531, Turkey
2 Department of Mathematics, Aktobe Regional University, Aktobe 030000, Kazakhstan
3 Institute of Information and Computational Technologies CS MES RK, Almaty 050010, Kazakhstan
* Author to whom correspondence should be addressed.
Entropy 2022, 24(11), 1555; https://doi.org/10.3390/e24111555
Submission received: 5 September 2022 / Revised: 26 October 2022 / Accepted: 26 October 2022 / Published: 29 October 2022
(This article belongs to the Special Issue Dynamical Systems, Differential Equations and Applications)

Abstract:
In this paper, we rigorously prove that unpredictable oscillations take place in the dynamics of Hopfield-type neural networks (HNNs) whose synaptic connections, rates and external inputs are modulo periodic unpredictable. The synaptic connections, rates and inputs are synchronized to obtain the convergence of outputs on compact subsets of the real axis. The existence, uniqueness and exponential stability of such motions are discussed. The method of included intervals and the contraction mapping principle are applied to attain the theoretical results. In addition to the analysis, we provide supporting numerical simulations in which all the assumed conditions are satisfied. It is shown how a new parameter, the degree of periodicity, affects the dynamics of the neural network.

1. Introduction

It is well known that HNNs [1,2] are widely used in the fields of signal and image processing, pattern recognition, associative memory and optimization computation, among others [3,4,5,6,7,8]. Hence, they have been the object of intensive analysis by numerous authors in recent decades. As neural networks continue to develop, these systems are being modernized, and the dynamics of models with various types of coefficients are being investigated [9,10,11,12,13]. Special attention is being paid to the problem of the existence and stability of periodic and almost periodic solutions of HNNs [14,15,16,17,18,19,20,21], for which appropriate coefficients and conditions are necessary.
A few years ago, the boundaries of the classical theory of dynamical systems, founded by H. Poincaré [22] and G. Birkhoff [23], were expanded by the concepts of unpredictable points and unpredictable functions [24]. It was proven that an unpredictable point leads to the existence of chaos in a quasi-minimal set; that is, a proof of unpredictability simultaneously confirms Poincaré chaos. Unpredictable functions were defined as unpredictable points of the Bebutov dynamical system [25], where the topology of uniform convergence on compact sets of the real axis is used instead of a metric. The use of such convergence significantly simplifies the problem of proving the existence of unpredictable solutions for differential equations and neural networks, and a new method of included intervals has been introduced and developed in several papers [26,27,28,29,30,31].
Let us commence with the main definitions.
Definition 1
([25]). A uniformly continuous and bounded function $\psi:\mathbb{R}\to\mathbb{R}$ is unpredictable if there exist positive numbers $\epsilon_0$, $\delta$ and sequences $t_n$, $s_n$, both of which diverge to infinity, such that $|\psi(t+t_n)-\psi(t)|\to 0$ as $n\to\infty$ uniformly on compact subsets of $\mathbb{R}$ and $|\psi(t+t_n)-\psi(t)|>\epsilon_0$ for each $t\in[s_n-\delta,s_n+\delta]$ and $n\in\mathbb{N}$.
In Definition 1, the sequences $t_n$, $s_n$, $n=1,2,\ldots$, are said to be the convergence and divergence sequences of the function $\psi(t)$, respectively. We call the uniform convergence on compact subsets of $\mathbb{R}$ the convergence property, and the existence of the sequence $s_n$ and the positive numbers $\epsilon_0$, $\delta$ is called the separation property. It is known [32] that a function with the convergence property but without the separation property is a Poisson stable function.
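As a rough numerical illustration of Definition 1 (not part of the original analysis), the two properties can be estimated on finite grids. The helper names below and the use of a periodic test function are our own; for a genuinely unpredictable function the convergence error should shrink along $t_n$ while the separation margin stays above some $\epsilon_0$.

```python
import numpy as np

def convergence_error(psi, t_shift, interval=(-10.0, 10.0), num=2001):
    """Sup of |psi(t + t_shift) - psi(t)| over a compact interval (convergence property)."""
    t = np.linspace(*interval, num)
    return np.max(np.abs(psi(t + t_shift) - psi(t)))

def separation_margin(psi, t_shift, s, delta=0.1, num=201):
    """Inf of |psi(t + t_shift) - psi(t)| over [s - delta, s + delta] (separation property)."""
    t = np.linspace(s - delta, s + delta, num)
    return np.min(np.abs(psi(t + t_shift) - psi(t)))

# A periodic function satisfies the convergence property along multiples of its period,
# but its separation margin collapses there, so it is Poisson stable yet not unpredictable.
psi = np.sin
print(convergence_error(psi, 2 * np.pi))        # ~0
print(separation_margin(psi, 2 * np.pi, 5.0))   # ~0
```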
Let us introduce a new type of unpredictable functions, which are important objects for investigation in the paper.
Definition 2.
The sum ϕ ( t ) + ψ ( t ) is said to be a modulo periodic unpredictable function if ϕ ( t ) is a continuous periodic function and ψ ( t ) is an unpredictable function.
In this study, we focus on the Hopfield-type neural network with two-component coefficients and inputs:
$$x_i'(t)=-\big(a_i(t)+b_i(t)\big)x_i(t)+\sum_{j=1}^{p}\big(c_{ij}(t)+d_{ij}(t)\big)f_j(x_j(t))+u_i(t)+v_i(t),\quad i=1,2,\ldots,p, \qquad (1)$$
where $x_i(t)$ stands for the state of the $i$th unit at time $t$. The synaptic connections, rates and external inputs are modulo periodic unpredictable: each consists of two components such that $a_i(t)$, $c_{ij}(t)$, $u_i(t)$ are periodic and $b_i(t)$, $d_{ij}(t)$, $v_i(t)$ are unpredictable. The terms $c_{ij}(t)$ and $d_{ij}(t)$ denote the components of the synaptic connection weight of the $j$th unit on the $i$th unit at time $t$, and the functions $f_j(x_j(t))$ denote the measures of activation of the $j$th unit to its incoming potentials at time $t$.
Consider the convergence sequence $t_n$ of the unpredictable function $\psi(t)$. For a fixed real number $\omega>0$, one can write $t_n\equiv\tau_n\ (\mathrm{mod}\ \omega)$, where $0\le\tau_n<\omega$ for all $n\ge 1$. The boundedness of the sequence $\tau_n$ implies that there exists a subsequence $\tau_{n_l}$ which converges to a number $\tau_\omega$, $0\le\tau_\omega<\omega$. That is, there exist a subsequence $t_{n_l}$ of the convergence sequence $t_n$ and a number $\tau_\omega$ such that $t_{n_l}\to\tau_\omega\ (\mathrm{mod}\ \omega)$ as $l\to\infty$. We call the number $\tau_\omega$ a Poisson shift of the convergence sequence $t_n$ with respect to $\omega$ [33]. Denote by $\mathcal{T}_\omega$ the set of all Poisson shifts. The number $\kappa_\omega=\inf\mathcal{T}_\omega$, $0\le\kappa_\omega<\omega$, is said to be the Poisson number of the convergence sequence $t_n$. If $\kappa_\omega=0$, we say that the sequence $t_n$ satisfies the kappa property.
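The kappa property is easy to test numerically for the sequences used later in Section 4, where the convergence sequence consists of multiples of the step length $h$. The short script below is our own illustration (the function name is not from the paper): it computes the residues $\tau_n=t_n\ (\mathrm{mod}\ \omega)$ and confirms that they accumulate only at $0$, so $\kappa_\omega=0$.

```python
import numpy as np

def poisson_shifts(t_seq, omega):
    """Residues tau_n = t_n mod omega; their limit points are the Poisson shifts."""
    return np.mod(t_seq, omega)

# Convergence sequence made of multiples of h, as in Example 1 below,
# where h = 4*pi is an integer multiple of the period omega = pi/2.
h, omega = 4 * np.pi, np.pi / 2
tau = poisson_shifts(h * np.arange(1, 101), omega)

# Distance of every residue to the nearest multiple of omega: up to floating-point
# rounding all t_n are congruent to 0, hence the Poisson number kappa_omega is 0
# and the kappa property holds.
print(np.max(np.minimum(tau, omega - tau)))   # ~1e-13
```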

2. Methods

Due to the development of neural networks and their applications, classical classes of functions such as periodic and almost periodic ones are no longer sufficient for the study of their dynamics. This is especially evident in the analysis of chaotic behavior. Therefore, richer classes of functions are needed. To meet this demand, we combine periodic and unpredictable components in the rates and inputs: the periodic component is inserted to serve stability, while the unpredictable component guarantees chaotic dynamics. According to Definition 1, verification of the convergence and separation properties is necessary to prove the existence of unpredictable solutions. To provide constructive conditions for the existence of unpredictable solutions, we use the kappa property of the convergence sequence $t_n$ with respect to the period $\omega$.
The method of included intervals, which was introduced in paper [26] and developed in [27,28,29,33], is a powerful instrument for verifying convergence properties. This technique has been applied in the study of continuous unpredictable solutions of Hopfield-type neural networks with delayed and advanced arguments [30] and in the study of discontinuous unpredictable solutions of impulsive neural networks with Hopfield structure [31]. All the previous models in [30,31] were considered with constant rates, whereas in the present research the rates are variable, and we have designed a new model of Hopfield-type neural networks with modulo periodic unpredictable rates $a_i(t)+b_i(t)$, connection weights $c_{ij}(t)+d_{ij}(t)$ and external inputs $u_i(t)+v_i(t)$. The periodic components $a_i(t)$ ensure the stability of the model, while the unpredictable components $b_i(t)$ and $v_i(t)$ cause chaotic outputs.

3. Main Results

Throughout the paper, we will use the norm $\|v\|=\max_{1\le i\le p}|v_i|$, where $|\cdot|$ is the absolute value, $v=(v_1,\ldots,v_p)$ and $v_i\in\mathbb{R}$, $i=1,2,\ldots,p$.
Following the results in [34], it can be shown that the function $y(t)=(y_1(t),y_2(t),\ldots,y_p(t))$ is a solution of (1) if and only if it satisfies the following integral equation:
$$y_i(t)=\int_{-\infty}^{t}e^{-\int_s^t a_i(u)\,du}\Big[-b_i(s)y_i(s)+\sum_{j=1}^{p}\big(c_{ij}(s)+d_{ij}(s)\big)f_j(y_j(s))+u_i(s)+v_i(s)\Big]ds, \qquad (2)$$
for all $i=1,\ldots,p$.
Denote by $S$ the set of vector-functions $\varphi=(\varphi_1,\varphi_2,\ldots,\varphi_p)$ whose coordinates $\varphi_i$, $i=1,2,\ldots,p$, satisfy the convergence property with the common convergence sequence $t_n$. Moreover, $|\varphi_i(t)|<H$, $i=1,2,\ldots,p$, where $H$ is a positive number. On the set $S$, define the norm $\|\varphi(t)\|_0=\max_{i}|\varphi_i(t)|$.
Define on $S$ the operator $T$ such that $T\phi(t)=(T_1\phi(t),T_2\phi(t),\ldots,T_p\phi(t))$, where:
$$T_i\phi(t)\equiv\int_{-\infty}^{t}e^{-\int_s^t a_i(u)\,du}\Big[-b_i(s)\phi_i(s)+\sum_{j=1}^{p}\big(c_{ij}(s)+d_{ij}(s)\big)f_j(\phi_j(s))+u_i(s)+v_i(s)\Big]ds, \qquad (3)$$
for each $i=1,2,\ldots,p$. We will need the following conditions:
(C1)
The functions $a_i(t)$, $c_{ij}(t)$ and $u_i(t)$ are continuous $\omega$-periodic, and $\int_0^{\omega}a_i(u)\,du>0$ for each $i,j=1,\ldots,p$;
(C2)
The functions $b_i(t)$, $d_{ij}(t)$ and $v_i(t)$, $i,j=1,2,\ldots,p$, are unpredictable with the same convergence and divergence sequences $t_n$, $s_n$. Moreover, $|v_i(t+t_n)-v_i(t)|>\epsilon_0$ for all $t\in[s_n-\delta,s_n+\delta]$, $i=1,2,\ldots,p$, and positive numbers $\delta$, $\epsilon_0$;
(C3)
The convergence sequence t n satisfies the kappa property with respect to the period ω ;
(C4)
There exists a positive number $m_f$ such that $\sup_{|s|<H}|f(s)|=m_f$;
(C5)
There exists a positive number $L$ such that the function $f(s)$ satisfies the inequality $|f(s_1)-f(s_2)|\le L|s_1-s_2|$ if $|s_1|<H$, $|s_2|<H$.
According to condition (C1), for each $i=1,\ldots,p$ there exist numbers $K_i\ge 1$ and $\lambda_i>0$ such that
$$e^{-\int_s^t a_i(u)\,du}\le K_i e^{-\lambda_i(t-s)}. \qquad (4)$$
For convenience, we introduce the following notations:
$$m_i^a=\sup_{t\in\mathbb{R}}|a_i(t)|,\quad m_i^b=\sup_{t\in\mathbb{R}}|b_i(t)|,\quad m_i^u=\sup_{t\in\mathbb{R}}|u_i(t)|,\quad m_i^v=\sup_{t\in\mathbb{R}}|v_i(t)|,\quad m_{ij}^c=\sup_{t\in\mathbb{R}}|c_{ij}(t)|,\quad m_{ij}^d=\sup_{t\in\mathbb{R}}|d_{ij}(t)|,$$
for each $i,j=1,2,\ldots,p$.
The following conditions are required:
(C6)
$$\frac{K_i}{\lambda_i-K_i m_i^b}\Big(\sum_{j=1}^{p}(m_{ij}^c+m_{ij}^d)\,m_f+m_i^u+m_i^v\Big)<H;$$
(C7)
$$K_i\Big(m_i^b+\sum_{j=1}^{p}(m_{ij}^c+m_{ij}^d)L\Big)<\lambda_i;$$
(C8)
$$H m_i^b+\sum_{j=1}^{p}m_{ij}^d\,m_f<\frac{\epsilon_0}{4};$$
for all $i,j=1,\ldots,p$.
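Conditions (C6) and (C7) are straightforward to verify numerically once the bounds $K_i$, $\lambda_i$, $m_i^b$, $m_{ij}^c$, $m_{ij}^d$, $m_i^u$, $m_i^v$ are known. The sketch below (ours, not the authors' code) plugs in the bounds listed for Example 1 in Section 4; since $\epsilon_0$ is not specified there, (C8) is reported as a lower bound that $\epsilon_0$ would have to exceed.

```python
import numpy as np

# Bounds taken from Example 1 in Section 4 (K_i = 1, H = 1, m_f = L = 0.2).
K, H, m_f, L = np.ones(3), 1.0, 0.2, 0.2
lam = np.array([5 * np.pi / 4, 6 * np.pi / 4, 9 * np.pi / 4])
m_b = np.array([0.08, 0.24, 0.16])
m_u = np.array([1.0, 1.0, 1.0])
m_v = np.array([1.2, 2.0, 1.6])
m_c = np.array([[0.1, 0.3, 0.1], [0.2, 0.05, 0.4], [0.3, 0.5, 0.1]])
m_d = np.array([[0.008, 0.02, 0.012], [0.016, 0.004, 0.024], [0.024, 0.024, 0.02]])

row = (m_c + m_d).sum(axis=1)                         # sum_j (m_ij^c + m_ij^d)
C6 = K / (lam - K * m_b) * (row * m_f + m_u + m_v) < H
C7 = K * (m_b + row * L) < lam
eps0_min = 4 * (H * m_b + m_d.sum(axis=1) * m_f)      # (C8) requires eps_0 above this

print(C6.all(), C7.all(), eps0_min)                   # True True [...]
```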
Lemma 1.
The set S is a complete space.
Proof. 
Consider a Cauchy sequence $\phi^k(t)$ in $S$ which converges to a limit function $\phi(t)$ on $\mathbb{R}$. Fix a closed and bounded interval $I\subset\mathbb{R}$. We obtain:
$$\|\phi(t+t_n)-\phi(t)\|\le\|\phi(t+t_n)-\phi^k(t+t_n)\|+\|\phi^k(t+t_n)-\phi^k(t)\|+\|\phi^k(t)-\phi(t)\|. \qquad (5)$$
One can choose sufficiently large $n$ and $k$ such that each term on the right-hand side of (5) is smaller than $\epsilon/3$ for an arbitrary $\epsilon>0$ and $t\in I$. Thus, we conclude that the sequence $\phi(t+t_n)$ converges uniformly to $\phi(t)$ on $I$. That is, the set $S$ is complete. □
Lemma 2.
The operator T is invariant in S .
Proof. 
For a function $\varphi(t)\in S$ and fixed $i=1,2,\ldots,p$, we have that
$$
\begin{aligned}
|T_i\varphi(t)| &= \Big|\int_{-\infty}^{t} e^{-\int_s^t a_i(u)\,du}\Big[-b_i(s)\varphi_i(s)+\sum_{j=1}^{p}\big(c_{ij}(s)+d_{ij}(s)\big)f_j(\varphi_j(s))+u_i(s)+v_i(s)\Big]ds\Big|\\
&\le \int_{-\infty}^{t} K_i e^{-\lambda_i(t-s)}\Big[|b_i(s)\varphi_i(s)|+\sum_{j=1}^{p}\big(|c_{ij}(s)|+|d_{ij}(s)|\big)|f_j(\varphi_j(s))|+|u_i(s)|+|v_i(s)|\Big]ds\\
&\le \frac{K_i}{\lambda_i}\Big(m_i^b H+\sum_{j=1}^{p}(m_{ij}^c+m_{ij}^d)m_f+m_i^u+m_i^v\Big).
\end{aligned}
$$
The last inequality and condition (C6) imply that $\|T\varphi\|_0<H$.
Next, applying the method of included intervals, we will show that $T\varphi(t+t_n)\to T\varphi(t)$ as $n\to\infty$ uniformly on compact subsets of $\mathbb{R}$.
Let us fix an arbitrary $\epsilon>0$ and an interval $[\alpha,\beta]$, $-\infty<\alpha<\beta<\infty$. There exist numbers $\gamma<\alpha$ and $\xi>0$ which satisfy the following inequalities:
$$\frac{K_i}{\lambda_i}e^{-\lambda_i(\alpha-\gamma)}\Big(m_i^b H+\frac{1}{4}\sum_{j=1}^{p}(m_{ij}^c+m_{ij}^d)(LH+m_f)+m_i^u+m_i^v\Big)<\frac{\epsilon}{8}, \qquad (6)$$
$$\frac{K_i}{\lambda_i}\big(e^{\xi(\beta-\gamma)}-1\big)\Big(m_i^b H+\sum_{j=1}^{p}(m_{ij}^c+m_{ij}^d)m_f+m_i^u+m_i^v\Big)<\frac{\epsilon}{4}, \qquad (7)$$
and
$$\frac{K_i\,\xi}{\lambda_i}\Big(m_i^b+H+\sum_{j=1}^{p}(m_{ij}^c+m_{ij}^d)L+2p\,m_f+2\Big)<\frac{\epsilon}{4}, \qquad (8)$$
for all $i=1,2,\ldots,p$.
The functions $b_i(t)$, $d_{ij}(t)$ and $v_i(t)$, $i,j=1,2,\ldots,p$, are unpredictable, $\varphi(t)$ belongs to $S$, and the convergence sequence $t_n$ is common to all of them and satisfies the kappa property. Consequently, for sufficiently large $n$ the following inequalities are true: $|b_i(t+t_n)-b_i(t)|<\xi$, $|d_{ij}(t+t_n)-d_{ij}(t)|<\xi$, $|v_i(t+t_n)-v_i(t)|<\xi$ and $|\varphi_i(t+t_n)-\varphi_i(t)|<\xi$ for $t\in[\gamma,\beta]$. Moreover, applying condition (C3), one can attain that $|a_i(t+t_n)-a_i(t)|<\xi$, $|c_{ij}(t+t_n)-c_{ij}(t)|<\xi$ and $|u_i(t+t_n)-u_i(t)|<\xi$ for $t\in\mathbb{R}$, $i,j=1,2,\ldots,p$. We have that:
$$
\begin{aligned}
|T_i\varphi(t+t_n)-T_i\varphi(t)| ={}& \Big|\int_{-\infty}^{t} e^{-\int_s^t a_i(u+t_n)\,du}\Big(-b_i(s+t_n)\varphi_i(s+t_n)+\sum_{j=1}^{p}\big(c_{ij}(s+t_n)+d_{ij}(s+t_n)\big)f_j(\varphi_j(s+t_n))+u_i(s+t_n)+v_i(s+t_n)\Big)ds\\
&-\int_{-\infty}^{t} e^{-\int_s^t a_i(u)\,du}\Big(-b_i(s)\varphi_i(s)+\sum_{j=1}^{p}\big(c_{ij}(s)+d_{ij}(s)\big)f_j(\varphi_j(s))+u_i(s)+v_i(s)\Big)ds\Big|\\
\le{}& \int_{-\infty}^{t}\Big|e^{-\int_s^t a_i(u+t_n)\,du}-e^{-\int_s^t a_i(u)\,du}\Big|\,\Big|-b_i(s+t_n)\varphi_i(s+t_n)+\sum_{j=1}^{p}\big(c_{ij}(s+t_n)+d_{ij}(s+t_n)\big)f_j(\varphi_j(s+t_n))+u_i(s+t_n)+v_i(s+t_n)\Big|\,ds\\
&+\int_{-\infty}^{t} e^{-\int_s^t a_i(u)\,du}\Big|-b_i(s+t_n)\varphi_i(s+t_n)+b_i(s)\varphi_i(s)+\sum_{j=1}^{p}\big(c_{ij}(s+t_n)+d_{ij}(s+t_n)\big)\big(f_j(\varphi_j(s+t_n))-f_j(\varphi_j(s))\big)\\
&\quad+\sum_{j=1}^{p}\big(c_{ij}(s+t_n)-c_{ij}(s)+d_{ij}(s+t_n)-d_{ij}(s)\big)f_j(\varphi_j(s))+u_i(s+t_n)+v_i(s+t_n)-u_i(s)-v_i(s)\Big|\,ds,
\end{aligned}
$$
for all $i=1,2,\ldots,p$. Consider the terms in the last inequality separately on the intervals $(-\infty,\gamma]$ and $(\gamma,t]$. By using inequalities (6)–(8), we obtain:
$$
\begin{aligned}
I_1 ={}& \int_{-\infty}^{\gamma}\Big|e^{-\int_s^t a_i(u+t_n)\,du}-e^{-\int_s^t a_i(u)\,du}\Big|\,\Big|-b_i(s+t_n)\varphi_i(s+t_n)+\sum_{j=1}^{p}\big(c_{ij}(s+t_n)+d_{ij}(s+t_n)\big)f_j(\varphi_j(s+t_n))+u_i(s+t_n)+v_i(s+t_n)\Big|\,ds\\
&+\int_{-\infty}^{\gamma} e^{-\int_s^t a_i(u)\,du}\Big|-b_i(s+t_n)\varphi_i(s+t_n)+b_i(s)\varphi_i(s)+\sum_{j=1}^{p}\big(c_{ij}(s+t_n)+d_{ij}(s+t_n)\big)\big(f_j(\varphi_j(s+t_n))-f_j(\varphi_j(s))\big)\\
&\quad+\sum_{j=1}^{p}\big(c_{ij}(s+t_n)-c_{ij}(s)+d_{ij}(s+t_n)-d_{ij}(s)\big)f_j(\varphi_j(s))+u_i(s+t_n)+v_i(s+t_n)-u_i(s)-v_i(s)\Big|\,ds\\
\le{}& \int_{-\infty}^{\gamma} 2K_i e^{-\lambda_i(t-s)}\Big(m_i^b H+\sum_{j=1}^{p}(m_{ij}^c+m_{ij}^d)m_f+m_i^u+m_i^v\Big)ds\\
&+\int_{-\infty}^{\gamma} K_i e^{-\lambda_i(t-s)}\Big(2m_i^b H+\sum_{j=1}^{p}(m_{ij}^c+m_{ij}^d)LH+2\sum_{j=1}^{p}(m_{ij}^c+m_{ij}^d)m_f+2m_i^u+2m_i^v\Big)ds\\
\le{}& \frac{2K_i}{\lambda_i}e^{-\lambda_i(\alpha-\gamma)}\Big(m_i^b H+\sum_{j=1}^{p}(m_{ij}^c+m_{ij}^d)m_f+m_i^u+m_i^v\Big)+\frac{K_i}{\lambda_i}e^{-\lambda_i(\alpha-\gamma)}\Big(2m_i^b H+\sum_{j=1}^{p}(m_{ij}^c+m_{ij}^d)(LH+2m_f)+2m_i^u+2m_i^v\Big)\\
\le{}& \frac{4K_i}{\lambda_i}e^{-\lambda_i(\alpha-\gamma)}\Big(m_i^b H+\frac{1}{4}\sum_{j=1}^{p}(m_{ij}^c+m_{ij}^d)(LH+m_f)+m_i^u+m_i^v\Big)<\frac{\epsilon}{2},
\end{aligned}
$$
and
$$
\begin{aligned}
I_2 ={}& \int_{\gamma}^{t}\Big|e^{-\int_s^t a_i(u+t_n)\,du}-e^{-\int_s^t a_i(u)\,du}\Big|\,\Big|-b_i(s+t_n)\varphi_i(s+t_n)+\sum_{j=1}^{p}\big(c_{ij}(s+t_n)+d_{ij}(s+t_n)\big)f_j(\varphi_j(s+t_n))+u_i(s+t_n)+v_i(s+t_n)\Big|\,ds\\
&+\int_{\gamma}^{t} e^{-\int_s^t a_i(u)\,du}\Big|-b_i(s+t_n)\varphi_i(s+t_n)+b_i(s)\varphi_i(s)+\sum_{j=1}^{p}\big(c_{ij}(s+t_n)+d_{ij}(s+t_n)\big)\big(f_j(\varphi_j(s+t_n))-f_j(\varphi_j(s))\big)\\
&\quad+\sum_{j=1}^{p}\big(c_{ij}(s+t_n)-c_{ij}(s)+d_{ij}(s+t_n)-d_{ij}(s)\big)f_j(\varphi_j(s))+u_i(s+t_n)+v_i(s+t_n)-u_i(s)-v_i(s)\Big|\,ds\\
\le{}& \int_{\gamma}^{t} K_i e^{-\lambda_i(t-s)}\big(e^{\xi(\beta-\gamma)}-1\big)\Big(m_i^b H+\sum_{j=1}^{p}(m_{ij}^c+m_{ij}^d)m_f+m_i^u+m_i^v\Big)ds+\int_{\gamma}^{t} K_i e^{-\lambda_i(t-s)}\Big((m_i^b+H)\xi+\sum_{j=1}^{p}(m_{ij}^c+m_{ij}^d)L\xi+2\xi p\,m_f+2\xi\Big)ds\\
\le{}& \frac{K_i}{\lambda_i}\big(e^{\xi(\beta-\gamma)}-1\big)\Big(m_i^b H+\sum_{j=1}^{p}(m_{ij}^c+m_{ij}^d)m_f+m_i^u+m_i^v\Big)+\frac{K_i}{\lambda_i}\Big((m_i^b+H)\xi+\sum_{j=1}^{p}(m_{ij}^c+m_{ij}^d)L\xi+2\xi p\,m_f+2\xi\Big)<\frac{\epsilon}{4}+\frac{\epsilon}{4}=\frac{\epsilon}{2},
\end{aligned}
$$
for each $i=1,2,\ldots,p$. Therefore, for all $t\in[\alpha,\beta]$ and $i=1,2,\ldots,p$, we have that $|T_i\varphi(t+t_n)-T_i\varphi(t)|\le I_1+I_2<\epsilon$. Thus, the function $T\varphi(t+t_n)$ converges uniformly to $T\varphi(t)$ on compact subsets of $\mathbb{R}$, and it is true that $T:S\to S$. □
Lemma 3.
The operator $T$ is contractive in $S$, provided that conditions (C1)–(C7) are valid.
Proof. 
For two functions $\varphi,\psi\in S$ and fixed $i=1,2,\ldots,p$, it is true that
$$
\begin{aligned}
|T_i\varphi(t)-T_i\psi(t)| &\le \int_{-\infty}^{t} e^{-\int_s^t a_i(u)\,du}\Big(|b_i(s)|\,|\varphi_i(s)-\psi_i(s)|+\sum_{j=1}^{p}|c_{ij}(s)|\,|f_j(\varphi_j(s))-f_j(\psi_j(s))|+\sum_{j=1}^{p}|d_{ij}(s)|\,|f_j(\varphi_j(s))-f_j(\psi_j(s))|\Big)ds\\
&\le \int_{-\infty}^{t} K_i e^{-\lambda_i(t-s)}\Big(m_i^b|\varphi_i(s)-\psi_i(s)|+\sum_{j=1}^{p}m_{ij}^c L|\varphi_j(s)-\psi_j(s)|+\sum_{j=1}^{p}m_{ij}^d L|\varphi_j(s)-\psi_j(s)|\Big)ds\\
&\le \frac{K_i}{\lambda_i}\Big(m_i^b+\sum_{j=1}^{p}(m_{ij}^c+m_{ij}^d)L\Big)\|\varphi-\psi\|_0.
\end{aligned}
$$
The last inequality yields $\|T\varphi(t)-T\psi(t)\|_0\le\max_i\frac{K_i}{\lambda_i}\big(m_i^b+\sum_{j=1}^{p}(m_{ij}^c+m_{ij}^d)L\big)\|\varphi(t)-\psi(t)\|_0$. Hence, in accordance with condition (C7), the operator $T$ is contractive in $S$. □
Theorem 1.
The neural network (1) admits a unique exponentially stable unpredictable solution provided that conditions (C1)–(C8) are fulfilled.
Proof. 
By Lemma 1, the set $S$ is complete; by Lemma 2, the operator $T$ is invariant in $S$; and by Lemma 3, the operator $T$ is contractive in $S$. Applying the contraction mapping theorem, we obtain that there exists a fixed point $\omega\in S$ of the operator $T$, which is a solution of the neural network (1) and satisfies the convergence property.
Next, we show that the solution ω ( t ) of (1) satisfies the separation property.
Applying the relations
$$\omega_i(t)=\omega_i(s_n)-\int_{s_n}^{t}a_i(s)\omega_i(s)\,ds-\int_{s_n}^{t}b_i(s)\omega_i(s)\,ds+\int_{s_n}^{t}\sum_{j=1}^{p}c_{ij}(s)f_j(\omega_j(s))\,ds+\int_{s_n}^{t}\sum_{j=1}^{p}d_{ij}(s)f_j(\omega_j(s))\,ds+\int_{s_n}^{t}u_i(s)\,ds+\int_{s_n}^{t}v_i(s)\,ds$$
and
$$\omega_i(t+t_n)=\omega_i(s_n+t_n)-\int_{s_n}^{t}a_i(s+t_n)\omega_i(s+t_n)\,ds-\int_{s_n}^{t}b_i(s+t_n)\omega_i(s+t_n)\,ds+\int_{s_n}^{t}\sum_{j=1}^{p}c_{ij}(s+t_n)f_j(\omega_j(s+t_n))\,ds+\int_{s_n}^{t}\sum_{j=1}^{p}d_{ij}(s+t_n)f_j(\omega_j(s+t_n))\,ds+\int_{s_n}^{t}u_i(s+t_n)\,ds+\int_{s_n}^{t}v_i(s+t_n)\,ds,$$
we obtain:
$$
\begin{aligned}
\omega_i(t+t_n)-\omega_i(t) ={}& \omega_i(s_n+t_n)-\omega_i(s_n)-\int_{s_n}^{t}a_i(s+t_n)\big(\omega_i(s+t_n)-\omega_i(s)\big)ds-\int_{s_n}^{t}\omega_i(s)\big(a_i(s+t_n)-a_i(s)\big)ds\\
&-\int_{s_n}^{t}b_i(s+t_n)\big(\omega_i(s+t_n)-\omega_i(s)\big)ds-\int_{s_n}^{t}\omega_i(s)\big(b_i(s+t_n)-b_i(s)\big)ds\\
&+\int_{s_n}^{t}\sum_{j=1}^{p}c_{ij}(s+t_n)\big(f_j(\omega_j(s+t_n))-f_j(\omega_j(s))\big)ds+\int_{s_n}^{t}\sum_{j=1}^{p}\big(c_{ij}(s+t_n)-c_{ij}(s)\big)f_j(\omega_j(s))\,ds\\
&+\int_{s_n}^{t}\sum_{j=1}^{p}d_{ij}(s+t_n)\big(f_j(\omega_j(s+t_n))-f_j(\omega_j(s))\big)ds+\int_{s_n}^{t}\sum_{j=1}^{p}\big(d_{ij}(s+t_n)-d_{ij}(s)\big)f_j(\omega_j(s))\,ds\\
&+\int_{s_n}^{t}\big(u_i(s+t_n)-u_i(s)\big)ds+\int_{s_n}^{t}\big(v_i(s+t_n)-v_i(s)\big)ds.
\end{aligned}
$$
There exist a positive number $\delta_1$ and integers $l$, $k$ such that, for each $i=1,2,\ldots,p$, the following inequalities are satisfied:
$$\frac{6}{l}<\delta_1<\delta; \qquad (9)$$
$$|a_i(t+s)-a_i(s)|<\epsilon_0\Big(\frac{1}{l}+\frac{2}{k}\Big),\quad t\in\mathbb{R}, \qquad (10)$$
$$|c_{ij}(t+s)-c_{ij}(s)|<\epsilon_0\Big(\frac{1}{l}+\frac{2}{k}\Big),\quad t\in\mathbb{R}, \qquad (11)$$
$$|u_i(t+s)-u_i(s)|<\epsilon_0\Big(\frac{1}{l}+\frac{2}{k}\Big),\quad t\in\mathbb{R}, \qquad (12)$$
$$\Big(m_i^a+H+m_i^b+\sum_{j=1}^{p}(m_{ij}^c+m_{ij}^d)L+1\Big)\Big(\frac{1}{l}+\frac{2}{k}\Big)<\frac{1}{4}, \qquad (13)$$
$$|\omega_i(t+s)-\omega_i(t)|<\epsilon_0\min\Big(\frac{1}{k},\frac{1}{4l}\Big),\quad t\in\mathbb{R},\ |s|<\delta_1. \qquad (14)$$
Let the numbers $\delta_1$, $l$ and $k$, as well as the numbers $n\in\mathbb{N}$ and $i=1,\ldots,p$, be fixed. Consider the following two alternatives: (i) $|\omega_i(t_n+s_n)-\omega_i(s_n)|<\epsilon_0/l$; (ii) $|\omega_i(t_n+s_n)-\omega_i(s_n)|\ge\epsilon_0/l$.
(i) Using (14), one can show that
$$|\omega_i(t+t_n)-\omega_i(t)|\le|\omega_i(t+t_n)-\omega_i(s_n+t_n)|+|\omega_i(s_n+t_n)-\omega_i(s_n)|+|\omega_i(s_n)-\omega_i(t)|<\frac{\epsilon_0}{l}+\frac{\epsilon_0}{k}+\frac{\epsilon_0}{k}=\epsilon_0\Big(\frac{1}{l}+\frac{2}{k}\Big),\quad i=1,2,\ldots,p, \qquad (15)$$
if $t\in[s_n,s_n+\delta_1]$. Therefore, condition (C8) and inequalities (9)–(15) imply that
$$
\begin{aligned}
|\omega_i(t+t_n)-\omega_i(t)| \ge{}& \int_{s_n}^{t}|v_i(s+t_n)-v_i(s)|\,ds-\int_{s_n}^{t}|\omega_i(s)|\,|b_i(s+t_n)-b_i(s)|\,ds-\int_{s_n}^{t}\sum_{j=1}^{p}|d_{ij}(s+t_n)-d_{ij}(s)|\,|f_j(\omega_j(s))|\,ds\\
&-\int_{s_n}^{t}|a_i(s+t_n)|\,|\omega_i(s+t_n)-\omega_i(s)|\,ds-\int_{s_n}^{t}|\omega_i(s)|\,|a_i(s+t_n)-a_i(s)|\,ds-\int_{s_n}^{t}|b_i(s+t_n)|\,|\omega_i(s+t_n)-\omega_i(s)|\,ds\\
&-\int_{s_n}^{t}\sum_{j=1}^{p}|c_{ij}(s+t_n)|\,|f_j(\omega_j(s+t_n))-f_j(\omega_j(s))|\,ds-\int_{s_n}^{t}\sum_{j=1}^{p}|c_{ij}(s+t_n)-c_{ij}(s)|\,|f_j(\omega_j(s))|\,ds\\
&-\int_{s_n}^{t}\sum_{j=1}^{p}|d_{ij}(s+t_n)|\,|f_j(\omega_j(s+t_n))-f_j(\omega_j(s))|\,ds-\int_{s_n}^{t}|u_i(s+t_n)-u_i(s)|\,ds-|\omega_i(s_n+t_n)-\omega_i(s_n)|\\
\ge{}& \delta_1\frac{\epsilon_0}{2}-2\delta_1 H m_i^b-2\delta_1\sum_{j=1}^{p}m_{ij}^d m_f-\delta_1 m_i^a\epsilon_0\Big(\frac{1}{l}+\frac{2}{k}\Big)-\delta_1 H\epsilon_0\Big(\frac{1}{l}+\frac{2}{k}\Big)-\delta_1 m_i^b\epsilon_0\Big(\frac{1}{l}+\frac{2}{k}\Big)\\
&-\delta_1\sum_{j=1}^{p}m_{ij}^c L\epsilon_0\Big(\frac{1}{l}+\frac{2}{k}\Big)-\delta_1\sum_{j=1}^{p}m_{ij}^d L\epsilon_0\Big(\frac{1}{l}+\frac{2}{k}\Big)-\delta_1\epsilon_0\Big(\frac{1}{l}+\frac{2}{k}\Big)-\frac{\epsilon_0}{l}\\
={}& \delta_1\Big[\frac{\epsilon_0}{2}-2Hm_i^b-2\sum_{j=1}^{p}m_{ij}^d m_f-\Big(m_i^a+H+m_i^b+\sum_{j=1}^{p}(m_{ij}^c+m_{ij}^d)L+1\Big)\epsilon_0\Big(\frac{1}{l}+\frac{2}{k}\Big)\Big]-\frac{\epsilon_0}{l}>\frac{\epsilon_0}{2l}
\end{aligned}
$$
for $t\in[s_n,s_n+\delta_1]$.
(ii) If $|\omega_i(t_n+s_n)-\omega_i(s_n)|\ge\epsilon_0/l$, it is not difficult to find that (14) implies:
$$|\omega_i(t+t_n)-\omega_i(t)|\ge|\omega_i(t_n+s_n)-\omega_i(s_n)|-|\omega_i(s_n)-\omega_i(t)|-|\omega_i(t+t_n)-\omega_i(t_n+s_n)|>\frac{\epsilon_0}{l}-\frac{\epsilon_0}{4l}-\frac{\epsilon_0}{4l}=\frac{\epsilon_0}{2l},\quad i=1,2,\ldots,p,$$
if $t\in[s_n-\delta_1,s_n+\delta_1]$ and $n\in\mathbb{N}$. Thus, it can be concluded that $\omega(t)$ is an unpredictable solution with the sequences $t_n$, $s_n$ and the positive numbers $\delta_1/2$, $\epsilon_0/(2l)$.
Next, we will prove the stability of the solution ω ( t ) . It is true that
$$\omega_i(t)=e^{-\int_{t_0}^{t}a_i(u)\,du}\,\omega_i(t_0)+\int_{t_0}^{t}e^{-\int_s^t a_i(u)\,du}\Big[-b_i(s)\omega_i(s)+\sum_{j=1}^{p}\big(c_{ij}(s)+d_{ij}(s)\big)f_j(\omega_j(s))+u_i(s)+v_i(s)\Big]ds,$$
for all $i=1,\ldots,p$.
Let $y(t)=(y_1(t),y_2(t),\ldots,y_p(t))$ be another solution of system (1). Then,
$$y_i(t)=e^{-\int_{t_0}^{t}a_i(u)\,du}\,y_i(t_0)+\int_{t_0}^{t}e^{-\int_s^t a_i(u)\,du}\Big[-b_i(s)y_i(s)+\sum_{j=1}^{p}\big(c_{ij}(s)+d_{ij}(s)\big)f_j(y_j(s))+u_i(s)+v_i(s)\Big]ds,$$
for all $i=1,\ldots,p$.
Making use of the relation:
$$y_i(t)-\omega_i(t)=e^{-\int_{t_0}^{t}a_i(u)\,du}\big(y_i(t_0)-\omega_i(t_0)\big)+\int_{t_0}^{t}e^{-\int_s^t a_i(u)\,du}\Big(-b_i(s)y_i(s)+b_i(s)\omega_i(s)+\sum_{j=1}^{p}c_{ij}(s)f_j(y_j(s))-\sum_{j=1}^{p}c_{ij}(s)f_j(\omega_j(s))+\sum_{j=1}^{p}d_{ij}(s)f_j(y_j(s))-\sum_{j=1}^{p}d_{ij}(s)f_j(\omega_j(s))\Big)ds,$$
we obtain that:
$$|y_i(t)-\omega_i(t)|\le K_i e^{-\lambda_i(t-t_0)}|y_i(t_0)-\omega_i(t_0)|+\int_{t_0}^{t}K_i e^{-\lambda_i(t-s)}\Big(m_i^b+\sum_{j=1}^{p}(m_{ij}^c+m_{ij}^d)L\Big)|y_i(s)-\omega_i(s)|\,ds,$$
for all $i=1,2,\ldots,p$.
Applying the Gronwall–Bellman Lemma, one can obtain:
$$|y_i(t)-\omega_i(t)|\le K_i|y_i(t_0)-\omega_i(t_0)|\,e^{\big(K_i\big(m_i^b+L\sum_{j=1}^{p}(m_{ij}^c+m_{ij}^d)\big)-\lambda_i\big)(t-t_0)}, \qquad (17)$$
for each $i=1,2,\ldots,p$. So, condition (C7) implies that $\omega(t)=(\omega_1(t),\omega_2(t),\ldots,\omega_p(t))$ is an exponentially stable unpredictable solution of the neural network (1). The theorem is proven. □

4. Numerical Examples

Let $\psi_i$, $i\in\mathbb{Z}$, be a solution of the logistic discrete equation:
$$\lambda_{i+1}=\mu\lambda_i(1-\lambda_i),$$
with $\mu=3.91$.
In the paper [25], an example of an unpredictable function $\Theta(t)$ was constructed: $\Theta(t)=\int_{-\infty}^{t}e^{-3(t-s)}\Omega(s)\,ds$, where $\Omega(t)$ is a piecewise constant function defined on the real axis through the equation $\Omega(t)=\psi_i$ for $t\in[i,i+1)$, $i\in\mathbb{Z}$.
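The following sketch (ours; the parameter names and the quadrature scheme are not from the paper) approximates this construction numerically: it iterates the logistic map, builds the piecewise constant function $\Omega$, and evaluates the convolution integral by a Riemann sum. The decay rate and step length are arguments, since the examples below use different values.

```python
import numpy as np

def theta(t_grid, h=1.0, mu=3.91, decay=3.0, psi0=0.4, burn=60.0, ds=0.02):
    """Approximate Theta(t) = int_{-inf}^t exp(-decay*(t - s)) * Omega(s) ds, where
    Omega is piecewise constant with step length h and takes the values of an orbit
    of the logistic map lambda_{i+1} = mu * lambda_i * (1 - lambda_i)."""
    t0 = t_grid[0] - burn                     # start far enough in the past that
    s = np.arange(t0, t_grid[-1], ds)         # exp(-decay*burn) is negligible
    n_steps = int((s[-1] - t0) // h) + 1
    psi = np.empty(n_steps + 1)               # logistic orbit supplying Omega's values
    psi[0] = psi0
    for i in range(n_steps):
        psi[i + 1] = mu * psi[i] * (1.0 - psi[i])
    omega_vals = psi[((s - t0) // h).astype(int)]

    out = np.empty_like(t_grid, dtype=float)
    for k, t in enumerate(t_grid):            # simple Riemann-sum quadrature
        mask = s <= t
        out[k] = np.sum(np.exp(-decay * (t - s[mask])) * omega_vals[mask]) * ds
    return out

# The function of [25]: decay 3 and step length h = 1.
print(theta(np.linspace(0.0, 20.0, 401))[:5])
```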
In what follows, we will define the piecewise constant function $\Omega(t)$ for $t\in[ih,(i+1)h)$, where $i\in\mathbb{Z}$ and $h$ is a positive real number. The number $h$ is said to be the length of the step of the functions $\Omega(t)$ and $\Theta(t)$. We call the ratio $\omega/h$ of the period to the length of the step the degree of periodicity.
Below, using numerical simulations, we will show how the degree of periodicity affects the dynamics of a neural network.
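For concreteness, the three examples that follow correspond to the values
$$\frac{\omega}{h}=\frac{\pi/2}{4\pi}=\frac{1}{8},\qquad \frac{\omega}{h}=\frac{1}{1}=1,\qquad \frac{\omega}{h}=\frac{10\pi}{0.1\pi}=100,$$
so the simulations cover degrees of periodicity smaller than, equal to, and greater than one.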
Example 1.
Let us consider the following Hopfield-type neural network:
$$x_i'(t)=-\big(a_i(t)+b_i(t)\big)x_i(t)+\sum_{j=1}^{3}\big(c_{ij}(t)+d_{ij}(t)\big)f_j(x_j(t))+u_i(t)+v_i(t), \qquad (19)$$
where $i=1,2,3$ and $f(x(t))=0.2\tanh(x(t))$. The functions $a_i(t)$, $c_{ij}(t)$ and $u_i(t)$ are $\pi/2$-periodic, with $a_1(t)=2+\sin^2(2t)$, $a_2(t)=3+\cos(4t)$, $a_3(t)=4+\cos^2(2t)$, $c_{11}(t)=0.1\cos(4t)$, $c_{12}(t)=0.3\sin(2t)$, $c_{13}(t)=0.1\cos(8t)$, $c_{21}(t)=0.2\sin(8t)$, $c_{22}(t)=0.05\cos(4t)$, $c_{23}(t)=0.4\sin(2t)$, $c_{31}(t)=0.3\cos(2t)$, $c_{32}(t)=0.5\sin(4t)$, $c_{33}(t)=0.1\sin(8t)$, $u_1(t)=\sin(8t)$, $u_2(t)=\sin(4t)$, $u_3(t)=\cos(4t)$. The unpredictable functions $b_i(t)$, $d_{ij}(t)$ and $v_i(t)$ are such that $b_1(t)=0.2\Theta(t)$, $b_2(t)=0.6\Theta(t)$, $b_3(t)=0.4\Theta(t)$, $d_{11}(t)=0.02\Theta(t)$, $d_{12}(t)=0.05\Theta(t)$, $d_{13}(t)=0.03\Theta(t)$, $d_{21}(t)=0.04\Theta(t)$, $d_{22}(t)=0.01\Theta(t)$, $d_{23}(t)=0.06\Theta(t)$, $d_{31}(t)=0.06\Theta(t)$, $d_{32}(t)=0.06\Theta(t)$, $d_{33}(t)=0.05\Theta(t)$, $v_1(t)=3\Theta(t)$, $v_2(t)=5\Theta(t)$, $v_3(t)=4\Theta(t)$, where $\Theta(t)=\int_{-\infty}^{t}e^{-2.5(t-s)}\Omega(s)\,ds$ with the length of step $h=4\pi$. Condition (C1) is valid with $K_i=1$, $i=1,2,3$, $\lambda_1=5\pi/4$, $\lambda_2=6\pi/4$, $\lambda_3=9\pi/4$. Since the elements of the convergence sequence are multiples of $h=4\pi$ and the period $\omega$ is equal to $\pi/2$, condition (C3) is valid. The degree of periodicity is equal to $1/8$. Conditions (C4)–(C8) are satisfied with $H=1$, $m_f=0.2$, $L=0.2$, $m_1^b=0.08$, $m_2^b=0.24$, $m_3^b=0.16$, $m_{11}^c=0.1$, $m_{12}^c=0.3$, $m_{13}^c=0.1$, $m_{21}^c=0.2$, $m_{22}^c=0.05$, $m_{23}^c=0.4$, $m_{31}^c=0.3$, $m_{32}^c=0.5$, $m_{33}^c=0.1$, $m_{11}^d=0.008$, $m_{12}^d=0.02$, $m_{13}^d=0.012$, $m_{21}^d=0.016$, $m_{22}^d=0.004$, $m_{23}^d=0.024$, $m_{31}^d=0.024$, $m_{32}^d=0.024$, $m_{33}^d=0.02$, $m_1^u=m_2^u=m_3^u=1$, $m_1^v=1.2$, $m_2^v=2$, $m_3^v=1.6$. According to Theorem 1, the neural network (19) admits a unique asymptotically stable unpredictable solution $\omega(t)=(\omega_1(t),\omega_2(t),\omega_3(t))$. Figure 1 and Figure 2 show the coordinates and the trajectory of a solution of the neural network (19), which asymptotically converge to the coordinates and trajectory of the unpredictable solution $\omega(t)$. Moreover, utilizing (17), we have that:
$$|x_1(t)-\omega_1(t)|\le|x_1(0)-\omega_1(0)|e^{-3.62(t-t_0)}\le 2e^{-3.62(t-t_0)},\quad |x_2(t)-\omega_2(t)|\le|x_2(0)-\omega_2(0)|e^{-4.26(t-t_0)}\le 2e^{-4.26(t-t_0)},\quad |x_3(t)-\omega_3(t)|\le|x_3(0)-\omega_3(0)|e^{-6.72(t-t_0)}\le 2e^{-6.72(t-t_0)}.$$
Thus, if $t>\frac{1}{3.62}(5\ln 10+\ln 2)\approx 3.38$, then $\|x(t)-\omega(t)\|_0<10^{-5}$. In other words, what is seen in Figure 1 and Figure 2 for sufficiently large time can be accepted as parts of the graph and trajectory of the unpredictable solution.
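As an independent cross-check of Example 1 (a sketch of our own, not the authors' code), system (19) can be integrated with a standard ODE solver once $\Theta(t)$ is tabulated. The snippet reuses the hypothetical theta helper sketched after the construction of $\Theta(t)$ above; the second initial condition is arbitrary and serves only to visualize the exponential convergence predicted by (17).

```python
import numpy as np
from scipy.integrate import solve_ivp

# Coefficients of system (19) from Example 1: f(x) = 0.2*tanh(x), h = 4*pi, decay 2.5.
t_grid = np.linspace(0.0, 40.0, 2001)
Th_tab = theta(t_grid, h=4 * np.pi, decay=2.5)      # helper from the earlier sketch
Th = lambda t: np.interp(t, t_grid, Th_tab)

a = lambda t: np.array([2 + np.sin(2*t)**2, 3 + np.cos(4*t), 4 + np.cos(2*t)**2])
b = lambda t: np.array([0.2, 0.6, 0.4]) * Th(t)
c = lambda t: np.array([[0.1*np.cos(4*t), 0.3*np.sin(2*t), 0.1*np.cos(8*t)],
                        [0.2*np.sin(8*t), 0.05*np.cos(4*t), 0.4*np.sin(2*t)],
                        [0.3*np.cos(2*t), 0.5*np.sin(4*t), 0.1*np.sin(8*t)]])
d = lambda t: np.array([[0.02, 0.05, 0.03], [0.04, 0.01, 0.06], [0.06, 0.06, 0.05]]) * Th(t)
u = lambda t: np.array([np.sin(8*t), np.sin(4*t), np.cos(4*t)])
v = lambda t: np.array([3.0, 5.0, 4.0]) * Th(t)
f = lambda x: 0.2 * np.tanh(x)

def rhs(t, x):
    # Right-hand side of (19): x' = -(a + b) x + (c + d) f(x) + u + v.
    return -(a(t) + b(t)) * x + (c(t) + d(t)) @ f(x) + u(t) + v(t)

sol1 = solve_ivp(rhs, (0, 40), [0.5, 0.7, 0.3], t_eval=t_grid, max_step=0.05)
sol2 = solve_ivp(rhs, (0, 40), [-0.5, 0.0, 0.9], t_eval=t_grid, max_step=0.05)
print(np.max(np.abs(sol1.y[:, -1] - sol2.y[:, -1])))  # trajectories merge, as (17) predicts
```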
Example 2.
Let us show the simulation results for the following Hopfield-type neural network:
$$y_i'(t)=-\big(a_i(t)+b_i(t)\big)y_i(t)+\sum_{j=1}^{3}\big(c_{ij}(t)+d_{ij}(t)\big)f_j(y_j(t))+u_i(t)+v_i(t), \qquad (20)$$
where $i=1,2,3$ and $f(y(t))=0.5\arctan(y(t))$.
The functions $a_i(t)$, $c_{ij}(t)$ and $u_i(t)$ are periodic with common period $\omega=1$: $a_1(t)=5+\cos(2\pi t)$, $a_2(t)=4+\sin^2(\pi t)$, $a_3(t)=6+0.5\sin(2\pi t)$, $c_{11}(t)=0.4\cos(2\pi t)$, $c_{12}(t)=0.2\sin(4\pi t)$, $c_{13}(t)=0.1\cos(8\pi t)$, $c_{21}(t)=0.1\cos(4\pi t)$, $c_{22}(t)=0.4\cos(2\pi t)$, $c_{23}(t)=0.4\sin(4\pi t)$, $c_{31}(t)=0.3\sin(2\pi t)$, $c_{32}(t)=0.5\cos(4\pi t)$, $c_{33}(t)=0.2\cos(2\pi t)$, $u_1(t)=\cos(2\pi t)$, $u_2(t)=0.5\sin(4\pi t)$, $u_3(t)=\sin(2\pi t)$. The functions $b_i(t)$, $d_{ij}(t)$ and $v_i(t)$ are unpredictable, with $b_1(t)=0.5\Theta(t)$, $b_2(t)=0.3\Theta(t)$, $b_3(t)=0.7\Theta(t)$, $d_{11}(t)=0.3\Theta(t)$, $d_{12}(t)=0.6\Theta(t)$, $d_{13}(t)=0.2\Theta(t)$, $d_{21}(t)=0.3\Theta(t)$, $d_{22}(t)=0.5\Theta(t)$, $d_{23}(t)=0.3\Theta(t)$, $d_{31}(t)=0.1\Theta(t)$, $d_{32}(t)=0.2\Theta(t)$, $d_{33}(t)=0.5\Theta(t)$, $v_1(t)=6\Theta(t)$, $v_2(t)=8\Theta(t)$, $v_3(t)=7\Theta(t)$, where $\Theta(t)=\int_{-\infty}^{t}e^{-3(t-s)}\Omega(s)\,ds$ with the length of step $h=1$. Condition (C1) is valid with $K_i=1$, $i=1,2,3$, $\lambda_1=5$, $\lambda_2=4.5$, $\lambda_3=6$. Conditions (C2) and (C3) are satisfied since the elements of the convergence sequence are multiples of $h=1$ and the period $\omega$ is equal to $1$. The degree of periodicity equals $1$. Conditions (C4)–(C8) are satisfied with $H=1$, $m_f=\pi/4$, $L=0.5$, $m_1^b=1/6$, $m_2^b=1/10$, $m_3^b=7/30$, $m_{11}^c=0.4$, $m_{12}^c=0.2$, $m_{13}^c=0.1$, $m_{21}^c=0.1$, $m_{22}^c=0.4$, $m_{23}^c=0.4$, $m_{31}^c=0.3$, $m_{32}^c=0.5$, $m_{33}^c=0.2$, $m_{11}^d=0.1$, $m_{12}^d=0.2$, $m_{13}^d=0.07$, $m_{21}^d=0.1$, $m_{22}^d=0.17$, $m_{23}^d=0.1$, $m_{31}^d=0.34$, $m_{32}^d=0.07$, $m_{33}^d=0.17$, $m_1^u=1$, $m_2^u=0.5$, $m_3^u=1$, $m_1^v=2$, $m_2^v=8/3$, $m_3^v=7/3$. Figure 3 and Figure 4 demonstrate the coordinates and the trajectory of the solution $y(t)=(y_1(t),y_2(t),y_3(t))$ of the neural network (20) with initial values $y_1(0)=0.2$, $y_2(0)=0.4$, $y_3(0)=0.6$. The solution $y(t)$ asymptotically converges to the unpredictable solution $\omega(t)$. By estimate (17), one can obtain that $\|y(t)-\omega(t)\|_0<10^{-6}$ for $t>\frac{1}{4.175}(6\ln 10+\ln 2)\approx 3.48$.
Example 3.
Finally, we will show how a degree of periodicity greater than $1$ affects the dynamics of the Hopfield-type neural network:
$$z_i'(t)=-\big(a_i(t)+b_i(t)\big)z_i(t)+\sum_{j=1}^{3}\big(c_{ij}(t)+d_{ij}(t)\big)f_j(z_j(t))+u_i(t)+v_i(t), \qquad (21)$$
where $i=1,2,3$ and $f(z(t))=0.25\arctan(z(t))$. The functions $a_i(t)$, $c_{ij}(t)$ and $u_i(t)$ are periodic with common period $\omega=10\pi$: $a_1(t)=5+\sin(2t)$, $a_2(t)=6+\cos(4t)$, $a_3(t)=4+0.5\sin(2t)$, $c_{11}(t)=0.01\sin(2t)$, $c_{12}(t)=0.04\cos(4t)$, $c_{13}(t)=0.02\sin(8t)$, $c_{21}(t)=0.05\cos(4t)$, $c_{22}(t)=0.03\sin(2t)$, $c_{23}(t)=0.03\cos(8t)$, $c_{31}(t)=0.02\sin(4t)$, $c_{32}(t)=0.05\cos(2t)$, $c_{33}(t)=0.01\cos(4t)$, $u_1(t)=\sin(0.4t)$, $u_2(t)=\cos(0.4t)$, $u_3(t)=\cos(0.2t)$. The unpredictable functions $b_i(t)$, $d_{ij}(t)$ and $v_i(t)$ are such that $b_1(t)=0.8\Theta(t)$, $b_2(t)=0.3\Theta(t)$, $b_3(t)=0.4\Theta(t)$, $d_{11}(t)=0.04\Theta(t)$, $d_{12}(t)=0.05\Theta(t)$, $d_{13}(t)=0.02\Theta(t)$, $d_{21}(t)=0.05\Theta(t)$, $d_{22}(t)=0.01\Theta(t)$, $d_{23}(t)=0.06\Theta(t)$, $d_{31}(t)=0.01\Theta(t)$, $d_{32}(t)=0.06\Theta(t)$, $d_{33}(t)=0.03\Theta(t)$, $v_1(t)=1.6\Theta(t)$, $v_2(t)=1.4\Theta(t)$, $v_3(t)=1.8\Theta(t)$, where $\Theta(t)=\int_{-\infty}^{t}e^{-2(t-s)}\Omega(s)\,ds$ with the length of step $h=0.1\pi$. All conditions (C1)–(C8) are valid with $K_i=1$, $i=1,2,3$, $\lambda_1=50\pi$, $\lambda_2=40\pi$, $\lambda_3=60\pi$, $H=1$, $m_f=\pi/4$, $L=0.25$, $m_1^b=0.4$, $m_2^b=0.15$, $m_3^b=0.2$, $m_{11}^c=0.01$, $m_{12}^c=0.04$, $m_{13}^c=0.02$, $m_{21}^c=0.05$, $m_{22}^c=0.03$, $m_{23}^c=0.03$, $m_{31}^c=0.02$, $m_{32}^c=0.05$, $m_{33}^c=0.01$, $m_{11}^d=0.02$, $m_{12}^d=0.025$, $m_{13}^d=0.01$, $m_{21}^d=0.025$, $m_{22}^d=0.005$, $m_{23}^d=0.03$, $m_{31}^d=0.005$, $m_{32}^d=0.03$, $m_{33}^d=0.015$, $m_1^u=m_2^u=m_3^u=1$, $m_1^v=0.8$, $m_2^v=0.7$, $m_3^v=0.9$. The degree of periodicity is equal to $100$. In Figure 5 and Figure 6, we depict the coordinates and the trajectory of the solution $z(t)=(z_1(t),z_2(t),z_3(t))$ of the neural network (21) with initial values $z_1(0)=0.8$, $z_2(0)=0.2$, $z_3(0)=0.5$. The solution $z(t)$ asymptotically converges to the unpredictable solution $\omega(t)$.
Observing the graphs in Figure 1 and Figure 3, we see that the unpredictability prevails when the degree of periodicity is at most $1$. More precisely, periodicity appears only locally on separated intervals when the degree of periodicity is less than $1$, and it is not seen at all when the degree equals $1$. Conversely, when the degree of periodicity is greater than $1$, one can see in Figure 5 that the solution admits a clear periodic shape, which is enveloped by the unpredictability.

5. Conclusions

In this paper, we considered HNNs with variable two-component synaptic connections, rates and external inputs. Sufficient conditions were obtained to ensure the existence of exponentially stable unpredictable solutions for HNNs. We introduced and utilized a quantitative characteristic, the degree of periodicity, which differentiates the contributions of the two components, periodicity and unpredictability, in the outputs of the model. The obtained results make it possible to detect effects of periodicity in chaotic oscillations, which is very important for the synchronization, stabilization and control of chaos.

Author Contributions

M.A.: conceptualization; investigation; validation; writing—original draft. M.T.: investigation; writing—review and editing. A.Z.: investigation; software; writing—original draft. All authors have read and agreed to the published version of the manuscript.

Funding

M.A. and A.Z. have been supported by the 2247-A National Leading Researchers Program of TUBITAK, Turkey, N 120C138. M. Tleubergenova has been supported by the Science Committee of the Ministry of Education and Science of the Republic of Kazakhstan, grant No. AP08856170.

Data Availability Statement

Not applicable.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Hopfield, J.J. Neural networks and physical systems with emergent collective computational abilities. Proc. Natl. Acad. Sci. USA 1982, 79, 2554–2558.
  2. Hopfield, J.J. Neurons with graded response have collective computational properties like those of two-stage neurons. Proc. Natl. Acad. Sci. USA 1984, 81, 3088–3092.
  3. Pajares, G. A Hopfield Neural Network for Image Change Detection. IEEE Trans. Neural Netw. 2006, 17, 1250–1264.
  4. Koss, J.E.; Newman, F.D.; Johnson, T.K.; Kirch, D.L. Abdominal organ segmentation using texture transforms and a Hopfield neural network. IEEE Trans. Med. Imaging 1999, 18, 640–648.
  5. Cheng, K.C.; Lin, Z.C.; Mao, C.W. The Application of Competitive Hopfield Neural Network to Medical Image Segmentation. IEEE Trans. Med. Imaging 1996, 15, 560–567.
  6. Soni, N.; Sharma, E.K.; Kapoor, A. Application of Hopfield neural network for facial image recognition. IJRTE 2019, 8, 3101–3105.
  7. Sang, N.; Zhang, T. Segmentation of FLIR images by Hopfield neural network with edge constraint. Pattern Recognit. 2001, 34, 811–821.
  8. Amartur, S.C.; Piraino, D.; Takefuji, Y. Optimization neural networks for the segmentation of magnetic resonance images. IEEE Trans. Med. Imaging 1992, 11, 215–220.
  9. Mohammad, S. Exponential stability in Hopfield-type neural networks with impulses. Chaos Solitons Fractals 2007, 32, 456–467.
  10. Chen, T.; Amari, S.I. Stability of asymmetric Hopfield networks. IEEE Trans. Neural Netw. 2001, 12, 159–163.
  11. Shi, P.L.; Dong, L.Z. Existence and exponential stability of anti-periodic solutions of Hopfield neural networks with impulses. Appl. Math. Comput. 2010, 216, 623–630.
  12. Juang, J. Stability analysis of Hopfield type neural networks. IEEE Trans. Neural Netw. 1999, 10, 1366–1374.
  13. Yang, H.; Dillon, T.S. Exponential stability and oscillation of Hopfield graded response neural network. IEEE Trans. Neural Netw. 1994, 5, 719–729.
  14. Liu, B. Almost periodic solutions for Hopfield neural networks with continuously distributed delays. Math. Comput. Simul. 2007, 73, 327–335.
  15. Liu, Y.; Huang, Z.; Chen, L. Almost periodic solution of impulsive Hopfield neural networks with finite distributed delays. Neural Comput. Appl. 2012, 21, 821–831.
  16. Guo, S.J.; Huang, L.H. Periodic oscillation for a class of neural networks with variable coefficients. Nonlinear Anal. Real World Appl. 2005, 6, 545–561.
  17. Liu, B.W.; Huang, L.H. Existence and exponential stability of almost periodic solutions for Hopfield neural networks with delays. Neurocomputing 2005, 68, 196–207.
  18. Liu, Y.G.; You, Z.S.; Cao, L.P. On the almost periodic solution of generalized Hopfield neural networks with time-varying delays. Neurocomputing 2006, 69, 1760–1767.
  19. Yang, X.F.; Liao, X.F.; Evans, D.J.; Tang, Y. Existence and stability of periodic solution in impulsive Hopfield neural networks with finite distributed delays. Phys. Lett. A 2005, 343, 108–116.
  20. Zhang, H.; Xia, Y. Existence and exponential stability of almost periodic solution for Hopfield type neural networks with impulse. Chaos Solitons Fractals 2008, 37, 1076–1082.
  21. Bai, C. Existence and stability of almost periodic solutions of Hopfield neural networks with continuously distributed delays. Nonlinear Anal. Theory Methods Appl. 2009, 71, 5850–5859.
  22. Poincaré, H. New Methods of Celestial Mechanics; Dover Publications: New York, NY, USA, 1957.
  23. Birkhoff, G. Dynamical Systems; American Mathematical Society: Providence, RI, USA, 1927.
  24. Akhmet, M.; Fen, M.O. Unpredictable points and chaos. Commun. Nonlinear Sci. Numer. Simul. 2016, 40, 1–5.
  25. Akhmet, M.; Fen, M.O. Poincaré chaos and unpredictable functions. Commun. Nonlinear Sci. Numer. Simul. 2017, 48, 85–94.
  26. Akhmet, M.; Tleubergenova, M.; Zhamanshin, A. Poincaré chaos for a hyperbolic quasilinear system. Miskolc Math. Notes 2019, 20, 33–44.
  27. Akhmet, M.; Seilova, R.; Tleubergenova, M.; Zhamanshin, A. Shunting inhibitory cellular neural networks with strongly unpredictable oscillations. Commun. Nonlinear Sci. Numer. Simul. 2020, 89, 105287.
  28. Akhmet, M.; Tleubergenova, M.; Akylbek, Z. Inertial neural networks with unpredictable oscillations. Mathematics 2020, 8, 1797.
  29. Akhmet, M. Domain Structured Dynamics: Unpredictability, Chaos, Randomness, Fractals, Differential Equations and Neural Networks; IOP Publishing: Bristol, UK, 2021.
  30. Akhmet, M.; Çinçin, D.A.; Tleubergenova, M.; Nugayeva, Z. Unpredictable oscillations for Hopfield-type neural networks with delayed and advanced arguments. Mathematics 2021, 9, 571.
  31. Akhmet, M.; Tleubergenova, M.; Nugayeva, Z. Unpredictable Oscillations of Impulsive Neural Networks with Hopfield Structure. Lect. Notes Data Eng. Commun. Technol. 2021, 76, 625–642.
  32. Sell, G. Topological Dynamics and Ordinary Differential Equations; Van Nostrand Reinhold Company: London, UK, 1971.
  33. Akhmet, M.; Tleubergenova, M.; Zhamanshin, A. Modulo periodic Poisson stable solutions of quasilinear differential equations. Entropy 2021, 23, 1535.
  34. Hartman, P. Ordinary Differential Equations; Birkhauser: Boston, MA, USA, 2002.
Figure 1. The time series of the coordinates $x_1(t)$, $x_2(t)$ and $x_3(t)$ of the solution of system (19) with the initial conditions $x_1(0)=0.5$, $x_2(0)=0.7$, $x_3(0)=0.3$ and degree of periodicity $1/8$.
Figure 2. The trajectory of the neural network (19).
Figure 3. The time series of the coordinates $y_1(t)$, $y_2(t)$ and $y_3(t)$ of the solution of system (20) with the initial conditions $y_1(0)=0.5$, $y_2(0)=0.7$, $y_3(0)=0.3$ and degree of periodicity $1$.
Figure 4. The trajectory of the neural network (20).
Figure 5. The coordinates $z_1(t)$, $z_2(t)$ and $z_3(t)$ of the solution of system (21) with the initial conditions $z_1(0)=0.8$, $z_2(0)=0.2$, $z_3(0)=0.5$ and degree of periodicity $100$.
Figure 6. The trajectory of the neural network (21).