Exponential convergence of Cohen-Grossberg neural networks with continuously distributed leakage delays

Abstract

This paper is concerned with the global exponential convergence of Cohen-Grossberg neural networks with continuously distributed leakage delays. Using the Lyapunov functional method and differential inequality techniques, we establish new sufficient conditions ensuring that all solutions of the networks converge exponentially to the zero point. Our results complement some recent ones.

MSC: 34C25, 34K13, 34K25.

1 Introduction

It is well known that Cohen-Grossberg neural networks (CGNNs) have been successfully applied in many fields such as pattern recognition, parallel computing, associative memory, and combinatorial optimization (see [1–5]). Such applications rely heavily on global exponential convergence, since it allows the exponential convergence rate to be estimated. Many good results on the global exponential convergence of equilibria and periodic solutions of CGNNs are given in the literature; we refer the reader to [6–13] and the references cited therein. Recently, motivated by real applications, a typical time delay called a leakage (or 'forgetting') delay has been introduced in the negative feedback terms of neural network systems; these terms are variously known as forgetting or leakage terms (see [14–16]). Gopalsamy [17] investigated the stability of the equilibrium for bidirectional associative memory (BAM) neural networks with a constant delay in the leakage term. Following this, the authors of [18–22] dealt with the existence and stability of equilibria and periodic solutions for neural network models involving constant leakage delays. In particular, Peng [23] established delay-dependent criteria for the existence of globally attractive periodic solutions of BAM neural networks with continuously distributed delays in the leakage terms. However, to the best of our knowledge, few authors have considered the exponential convergence behavior of all solutions of CGNNs with continuously distributed delays in the leakage terms. Motivated by the arguments above, in the present paper we consider the following CGNNs with time-varying coefficients and continuously distributed delays in the leakage terms:

$$
x_i'(t) = -a_i\bigl(t,x_i(t)\bigr)\Bigl[b_i\Bigl(t,\int_0^{\infty}\delta_i(s)x_i(t-s)\,ds\Bigr) - \sum_{j=1}^{n}c_{ij}(t)f_j\bigl(x_j(t-\tau_{ij}(t))\bigr) - \sum_{j=1}^{n}d_{ij}(t)\int_0^{\infty}K_{ij}(u)g_j\bigl(x_j(t-u)\bigr)\,du + I_i(t)\Bigr], \quad i=1,2,\ldots,n,
$$
(1.1)

where $a_i$ and $b_i$ are continuous functions on $\mathbb{R}^2$, and $\delta_i$, $\tau_{ij}$, $f_j$, $g_j$, $c_{ij}$, $d_{ij}$ and $I_i$ are continuous functions on $\mathbb{R}$; $n$ corresponds to the number of units in the neural network; $x_i(t)$ denotes the potential (or voltage) of cell $i$ at time $t$; $a_i$ represents an amplification function; $b_i$ is an appropriately behaved function; $c_{ij}(t)$ and $d_{ij}(t)$ denote the strengths of connectivity between cells $i$ and $j$ at time $t$; the activation functions $f_i(\cdot)$ and $g_i(\cdot)$ show how the $i$th neuron reacts to the input; $\tau_{ij}(t)\ge 0$ corresponds to the transmission delays; $K_{ij}(u)$ and $\delta_i(u)\ge 0$ correspond to the transmission and leakage delay kernels, respectively; and $I_i(t)$ denotes the external input introduced from outside the network to cell $i$ at time $t$, for $i,j\in F=\{1,2,\ldots,n\}$.
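To make the dynamics of (1.1) concrete, the following minimal sketch integrates a one-neuron instance of the model by the forward Euler method, truncating the two infinite-delay integrals at a finite horizon. All concrete choices here (the amplification $a$, the feedback coefficient, the connection weights $0.01$, the constant delay $\tau$, the kernels, the input $I(t)=e^{-t}$, and the constant initial history) are illustrative assumptions, not values prescribed by the paper.

```python
import numpy as np

h, T = 0.01, 20.0                # Euler step size and time horizon
N = round(T / h)                 # number of integration steps
H = round(30.0 / h)              # truncation length for the infinite kernels
s = h * np.arange(1, H + 1)      # quadrature nodes for the delay integrals
delta = np.exp(-10.0 * s)        # leakage kernel delta(s) = e^{-10 s} (assumption)
Kker = np.exp(-s)                # transmission kernel K(u) = e^{-u} (assumption)
tau = 2.0                        # constant transmission delay (assumption)
f = g = lambda u: u * np.sin(u) ** 2   # activations with |f(u)| <= |u|

x = np.zeros(H + N + 1)          # trajectory buffer, history included
x[:H + 1] = 1.0                  # initial history phi(s) = 1 on (-infty, 0]
for k in range(N):
    t, i = k * h, H + k          # current time and the buffer index of x(t)
    past = x[i - H:i][::-1]      # x(t - s) for s = h, 2h, ..., Hh
    leak = h * np.dot(delta, past)       # int_0^inf delta(s) x(t-s) ds
    trans = h * np.dot(Kker, g(past))    # int_0^inf K(u) g(x(t-u)) du
    a = 2.0 + 0.1 * np.arctan(x[i])      # bounded amplification, cf. (H2)
    rhs = -a * (4.0 * leak - 0.01 * f(x[i - round(tau / h)])
                - 0.01 * trans + np.exp(-t))   # bracket of (1.1), I(t) = e^{-t}
    x[i + 1] = x[i] + h * rhs
print("x(T) =", x[-1])           # decays toward zero for these parameters
```

For these parameter values the leakage term dominates the delayed terms, and the computed trajectory exhibits the kind of exponential decay that Theorem 2.1 below guarantees under (H1)-(H5).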

Throughout this paper, for $i,j\in F$, it will be assumed that $\delta_i:[0,+\infty)\to[0,+\infty)$ and $K_{ij}:[0,+\infty)\to\mathbb{R}$ are continuous functions, and that there exist constants $\tau_{ij}^{+}$, $\bar{I}_i$, $\bar{c}_{ij}$, and $\bar{d}_{ij}$ such that

$$
\tau_{ij}^{+} = \sup_{t\in\mathbb{R}}\tau_{ij}(t), \qquad \bar{I}_i = \sup_{t\in\mathbb{R}}|I_i(t)|, \qquad \bar{c}_{ij} = \sup_{t\in\mathbb{R}}|c_{ij}(t)|, \qquad \bar{d}_{ij} = \sup_{t\in\mathbb{R}}|d_{ij}(t)|.
$$
(1.2)

We also make the following assumptions.

(H1) For each $j\in F$, there exist nonnegative constants $\beta$, $\alpha$, $\tilde{L}_j$ and $L_j$ such that

$$
0 \le \beta \le 1, \qquad 0 \le \alpha \le 1, \qquad |f_j(u)| \le \tilde{L}_j|u|^{\beta}, \qquad |g_j(u)| \le L_j|u|^{\alpha} \quad \text{for all } u\in\mathbb{R}.
$$
(1.3)

(H2) For $i\in F$, there exist positive constants $\underline{a}_i$ and $\bar{a}_i$ such that

$$
\underline{a}_i \le a_i(t,u) \le \bar{a}_i \quad \text{for all } t>0,\ u\in\mathbb{R}.
$$

(H3) For $i\in F$, $b_i(t,0)\equiv 0$, and there exist positive constants $\underline{b}_i$ and $\bar{b}_i$ such that

$$
\underline{b}_i|u-v| \le \operatorname{sgn}(u-v)\bigl(b_i(t,u)-b_i(t,v)\bigr) \le \bar{b}_i|u-v| \quad \text{for all } t>0,\ u,v\in\mathbb{R}.
$$

(H4) For all $t>0$ and $i,j\in F$, there exist constants $\eta>0$ and $\lambda>0$ such that

$$
\int_0^{\infty}s\,\delta_i(s)e^{\lambda s}\,ds < +\infty, \qquad \int_0^{\infty}|K_{ij}(u)|e^{\lambda u}\,du < +\infty
$$

and

$$
-\eta > -\Bigl[\underline{a}_i\underline{b}_i\int_0^{\infty}\delta_i(s)e^{\lambda s}\,ds - \lambda\Bigl(1+\bar{a}_i\bar{b}_i\int_0^{\infty}s\,\delta_i(s)e^{\lambda s}\,ds\Bigr) - \bar{a}_i\bar{b}_i\int_0^{\infty}s\,\delta_i(s)e^{\lambda s}\,ds\cdot\bar{a}_i\bar{b}_i\int_0^{\infty}\delta_i(s)e^{\lambda s}\,ds\Bigr] + \bar{a}_i\Bigl[\sum_{j=1}^{n}\tilde{L}_j\Bigl(|c_{ij}(t)|e^{\lambda\beta\tau_{ij}(t)} + \bar{a}_i\bar{b}_i\int_0^{\infty}s\,\delta_i(s)e^{\lambda s}\,ds\,\bar{c}_{ij}e^{\lambda\beta\tau_{ij}^{+}}\Bigr)e^{\lambda(1-\beta)t} + \sum_{j=1}^{n}L_j\int_0^{\infty}|K_{ij}(u)|e^{\lambda\alpha u}\,du\Bigl(|d_{ij}(t)| + \bar{d}_{ij}\bar{a}_i\bar{b}_i\int_0^{\infty}s\,\delta_i(s)e^{\lambda s}\,ds\Bigr)e^{\lambda(1-\alpha)t}\Bigr].
$$
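For orientation, both integrability conditions in (H4) are easy to check for exponential kernels. As a worked example (an illustrative choice, not one required by the theorem), take $\delta_i(s)=e^{-\sigma s}$ and $K_{ij}(u)=e^{-\kappa u}$ with $\sigma,\kappa>0$; then, for any $0<\lambda<\min\{\sigma,\kappa\}$,

$$
\int_0^{\infty}\delta_i(s)e^{\lambda s}\,ds = \frac{1}{\sigma-\lambda}, \qquad \int_0^{\infty}s\,\delta_i(s)e^{\lambda s}\,ds = \frac{1}{(\sigma-\lambda)^2}, \qquad \int_0^{\infty}|K_{ij}(u)|e^{\lambda u}\,du = \frac{1}{\kappa-\lambda},
$$

so both integrals are finite, and the inequality above can then be checked with these closed forms (as is done in Example 3.1).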

(H5) $I_i(t) = O(e^{-\lambda t})$ as $t\to\pm\infty$, $i\in F$.

The initial conditions associated with system (1.1) are of the form

$$
x_i(s) = \varphi_i(s), \quad s\in(-\infty,0],\ i\in F,
$$
(1.4)

where $\varphi_i(\cdot)$ denotes a real-valued bounded continuous function defined on $(-\infty,0]$.

The remainder of this paper is organized as follows. In Section 2, we present new sufficient conditions ensuring that all solutions of CGNNs (1.1) with initial conditions (1.4) converge exponentially to the zero point. In Section 3, we give an example and a remark to illustrate the results obtained in Section 2.

2 Main results

Theorem 2.1 Let (H1)-(H5) hold. Then, for every solution $Z(t) = (x_1(t), x_2(t), \ldots, x_n(t))^T$ of CGNNs (1.1) with initial conditions (1.4), there exists a positive constant $K$ such that

$$
|x_i(t)| \le Ke^{-\lambda t} \quad \text{for all } t>0,\ i\in F.
$$

Proof Let $Z(t) = (x_1(t), x_2(t), \ldots, x_n(t))^T$ be a solution of system (1.1) with initial conditions (1.4), and set

$$
X_i(t) = e^{\lambda t}x_i(t), \quad i\in F.
$$

In view of (1.1) and the fact that $X_i'(t) = \lambda X_i(t) + e^{\lambda t}x_i'(t)$, we have

$$
\begin{aligned}
X_i'(t) ={}& \lambda X_i(t) + e^{\lambda t}a_i\bigl(t,x_i(t)\bigr)\Bigl[-b_i\Bigl(t,\int_0^{\infty}\delta_i(s)e^{\lambda(s-t)}X_i(t-s)\,ds\Bigr) \\
&+ \sum_{j=1}^{n}c_{ij}(t)f_j\bigl(x_j(t-\tau_{ij}(t))\bigr) + \sum_{j=1}^{n}d_{ij}(t)\int_0^{\infty}K_{ij}(u)g_j\bigl(x_j(t-u)\bigr)\,du - I_i(t)\Bigr] \\
={}& \lambda X_i(t) + e^{\lambda t}a_i\bigl(t,x_i(t)\bigr)\Bigl[-b_i\Bigl(t,\int_0^{\infty}\delta_i(s)e^{\lambda(s-t)}X_i(t)\,ds\Bigr) \\
&+ \Bigl(b_i\Bigl(t,\int_0^{\infty}\delta_i(s)e^{\lambda(s-t)}X_i(t)\,ds\Bigr) - b_i\Bigl(t,\int_0^{\infty}\delta_i(s)e^{\lambda(s-t)}X_i(t-s)\,ds\Bigr)\Bigr) \\
&+ \sum_{j=1}^{n}c_{ij}(t)f_j\bigl(x_j(t-\tau_{ij}(t))\bigr) + \sum_{j=1}^{n}d_{ij}(t)\int_0^{\infty}K_{ij}(u)g_j\bigl(x_j(t-u)\bigr)\,du - I_i(t)\Bigr], \quad i=1,2,\ldots,n.
\end{aligned}
$$
(2.1)

Let

$$
M = \max_{i=1,2,\ldots,n}\,\sup_{s\le 0}\bigl\{e^{\lambda s}|\varphi_i(s)|\bigr\}.
$$
(2.2)

From (1.2) and (H5), we can choose a positive constant $K>M+1$ such that

$$
\eta > \Bigl[1+\bar{a}_i\bar{b}_i\int_0^{\infty}s\,\delta_i(s)e^{\lambda s}\,ds\Bigr]\bar{a}_i\,\frac{\sup_{t\in\mathbb{R}}|e^{\lambda t}I_i(t)|}{K}, \quad i\in F.
$$
(2.3)
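Equivalently (a restatement of (2.3) for clarity; note that (1.2) and (H5) guarantee $\sup_{t\in\mathbb{R}}|e^{\lambda t}I_i(t)| < +\infty$), it suffices to pick

$$
K > \max\Bigl\{M+1,\ \max_{i\in F}\frac{1}{\eta}\Bigl[1+\bar{a}_i\bar{b}_i\int_0^{\infty}s\,\delta_i(s)e^{\lambda s}\,ds\Bigr]\bar{a}_i\sup_{t\in\mathbb{R}}\bigl|e^{\lambda t}I_i(t)\bigr|\Bigr\}.
$$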

Then it is easy to see that

$$
|X_i(t)| \le M < K \quad \text{for all } t\le 0,\ i=1,2,\ldots,n.
$$

We now claim that

$$
|X_i(t)| < K \quad \text{for all } t>0,\ i\in F.
$$
(2.4)

Otherwise, one of the following two cases must occur.

(1) There exist $i\in F$ and $t^*>0$ such that

$$
X_i(t^*) = K, \qquad |X_j(t)| < K \quad \text{for all } t<t^*,\ j\in F.
$$
(2.5)

(2) There exist $i\in F$ and $t^{**}>0$ such that

$$
X_i(t^{**}) = -K, \qquad |X_j(t)| < K \quad \text{for all } t<t^{**},\ j\in F.
$$
(2.6)

Now, we distinguish two cases to finish the proof.

Case (1). Suppose (2.5) holds. Then, from (2.1), (2.3), and (H1)-(H4), we have

$$
\begin{aligned}
0 \le{}& X_i'(t^*) \\
={}& \lambda X_i(t^*) + e^{\lambda t^*}a_i\bigl(t^*,x_i(t^*)\bigr)\Bigl[-b_i\Bigl(t^*,\int_0^{\infty}\delta_i(s)e^{\lambda(s-t^*)}X_i(t^*)\,ds\Bigr) \\
&+ \Bigl(b_i\Bigl(t^*,\int_0^{\infty}\delta_i(s)e^{\lambda(s-t^*)}X_i(t^*)\,ds\Bigr) - b_i\Bigl(t^*,\int_0^{\infty}\delta_i(s)e^{\lambda(s-t^*)}X_i(t^*-s)\,ds\Bigr)\Bigr) \\
&+ \sum_{j=1}^{n}c_{ij}(t^*)f_j\bigl(x_j(t^*-\tau_{ij}(t^*))\bigr) + \sum_{j=1}^{n}d_{ij}(t^*)\int_0^{\infty}K_{ij}(u)g_j\bigl(x_j(t^*-u)\bigr)\,du - I_i(t^*)\Bigr] \\
\le{}& \lambda X_i(t^*) - \underline{a}_i\underline{b}_i\int_0^{\infty}\delta_i(s)e^{\lambda s}\,ds\,X_i(t^*) + \bar{a}_i\bar{b}_i\int_0^{\infty}\delta_i(s)e^{\lambda s}\Bigl|\int_{t^*-s}^{t^*}X_i'(u)\,du\Bigr|\,ds \\
&+ \bar{a}_i\sum_{j=1}^{n}|c_{ij}(t^*)|\tilde{L}_je^{\lambda\beta\tau_{ij}(t^*)}e^{\lambda(1-\beta)t^*}\bigl|X_j(t^*-\tau_{ij}(t^*))\bigr|^{\beta} \\
&+ \bar{a}_i\sum_{j=1}^{n}|d_{ij}(t^*)|L_je^{\lambda(1-\alpha)t^*}\int_0^{\infty}|K_{ij}(u)|e^{\lambda\alpha u}\bigl|X_j(t^*-u)\bigr|^{\alpha}\,du + \bar{a}_ie^{\lambda t^*}|I_i(t^*)| \\
\le{}& \lambda X_i(t^*) - \underline{a}_i\underline{b}_i\int_0^{\infty}\delta_i(s)e^{\lambda s}\,ds\,X_i(t^*) \\
&+ \bar{a}_i\bar{b}_i\int_0^{\infty}\delta_i(s)e^{\lambda s}\int_{t^*-s}^{t^*}\Bigl|\lambda X_i(u) + e^{\lambda u}a_i\bigl(u,x_i(u)\bigr)\Bigl[-b_i\Bigl(u,\int_0^{\infty}\delta_i(v)e^{\lambda(v-u)}X_i(u-v)\,dv\Bigr) \\
&+ \sum_{j=1}^{n}c_{ij}(u)f_j\bigl(x_j(u-\tau_{ij}(u))\bigr) + \sum_{j=1}^{n}d_{ij}(u)\int_0^{\infty}K_{ij}(v)g_j\bigl(x_j(u-v)\bigr)\,dv - I_i(u)\Bigr]\Bigr|\,du\,ds \\
&+ \bar{a}_i\sum_{j=1}^{n}|c_{ij}(t^*)|\tilde{L}_je^{\lambda\beta\tau_{ij}(t^*)}e^{\lambda(1-\beta)t^*}\bigl|X_j(t^*-\tau_{ij}(t^*))\bigr|^{\beta} \\
&+ \bar{a}_i\sum_{j=1}^{n}|d_{ij}(t^*)|L_je^{\lambda(1-\alpha)t^*}\int_0^{\infty}|K_{ij}(u)|e^{\lambda\alpha u}\bigl|X_j(t^*-u)\bigr|^{\alpha}\,du + \bar{a}_ie^{\lambda t^*}|I_i(t^*)| \\
\le{}& \lambda X_i(t^*) - \underline{a}_i\underline{b}_i\int_0^{\infty}\delta_i(s)e^{\lambda s}\,ds\,X_i(t^*) + \lambda\bar{a}_i\bar{b}_i\int_0^{\infty}s\,\delta_i(s)e^{\lambda s}\,ds\,X_i(t^*) \\
&+ \bar{a}_i\bar{b}_i\int_0^{\infty}s\,\delta_i(s)e^{\lambda s}\,ds\cdot\bar{a}_i\bar{b}_i\int_0^{\infty}\delta_i(s)e^{\lambda s}\,ds\,X_i(t^*) \\
&+ \bar{a}_i\bar{b}_i\int_0^{\infty}\delta_i(s)e^{\lambda s}\int_{t^*-s}^{t^*}\bar{a}_i\Bigl[\sum_{j=1}^{n}\bar{c}_{ij}\tilde{L}_je^{\lambda\beta\tau_{ij}^{+}}e^{\lambda(1-\beta)u}\bigl|X_j(u-\tau_{ij}(u))\bigr|^{\beta} \\
&+ \sum_{j=1}^{n}\bar{d}_{ij}L_je^{\lambda(1-\alpha)u}\int_0^{\infty}|K_{ij}(v)|e^{\lambda\alpha v}\bigl|X_j(u-v)\bigr|^{\alpha}\,dv + \sup_{t\in\mathbb{R}}\bigl|e^{\lambda t}I_i(t)\bigr|\Bigr]\,du\,ds \\
&+ \bar{a}_i\sum_{j=1}^{n}|c_{ij}(t^*)|\tilde{L}_je^{\lambda\beta\tau_{ij}(t^*)}e^{\lambda(1-\beta)t^*}\bigl|X_j(t^*-\tau_{ij}(t^*))\bigr|^{\beta} \\
&+ \bar{a}_i\sum_{j=1}^{n}|d_{ij}(t^*)|L_je^{\lambda(1-\alpha)t^*}\int_0^{\infty}|K_{ij}(u)|e^{\lambda\alpha u}\bigl|X_j(t^*-u)\bigr|^{\alpha}\,du + \bar{a}_ie^{\lambda t^*}|I_i(t^*)| \\
\le{}& \Bigl\{-\Bigl[\underline{a}_i\underline{b}_i\int_0^{\infty}\delta_i(s)e^{\lambda s}\,ds - \lambda\Bigl(1+\bar{a}_i\bar{b}_i\int_0^{\infty}s\,\delta_i(s)e^{\lambda s}\,ds\Bigr) - \bar{a}_i\bar{b}_i\int_0^{\infty}s\,\delta_i(s)e^{\lambda s}\,ds\cdot\bar{a}_i\bar{b}_i\int_0^{\infty}\delta_i(s)e^{\lambda s}\,ds\Bigr] \\
&+ \bar{a}_i\Bigl[\sum_{j=1}^{n}\tilde{L}_j\Bigl(|c_{ij}(t^*)|e^{\lambda\beta\tau_{ij}(t^*)} + \bar{a}_i\bar{b}_i\int_0^{\infty}s\,\delta_i(s)e^{\lambda s}\,ds\,\bar{c}_{ij}e^{\lambda\beta\tau_{ij}^{+}}\Bigr)e^{\lambda(1-\beta)t^*} \\
&+ \sum_{j=1}^{n}L_j\int_0^{\infty}|K_{ij}(u)|e^{\lambda\alpha u}\,du\Bigl(|d_{ij}(t^*)| + \bar{d}_{ij}\bar{a}_i\bar{b}_i\int_0^{\infty}s\,\delta_i(s)e^{\lambda s}\,ds\Bigr)e^{\lambda(1-\alpha)t^*}\Bigr]\Bigr\}K \\
&+ \Bigl[1+\bar{a}_i\bar{b}_i\int_0^{\infty}s\,\delta_i(s)e^{\lambda s}\,ds\Bigr]\bar{a}_i\sup_{t\in\mathbb{R}}\bigl|e^{\lambda t}I_i(t)\bigr| \\
<{}& -\eta K + \Bigl[1+\bar{a}_i\bar{b}_i\int_0^{\infty}s\,\delta_i(s)e^{\lambda s}\,ds\Bigr]\bar{a}_i\sup_{t\in\mathbb{R}}\bigl|e^{\lambda t}I_i(t)\bigr| \\
<{}& 0,
\end{aligned}
$$

where the last two strict inequalities follow from $X_i(t^*)=K$ together with (H4) and (2.3), respectively.

This contradiction implies that (2.5) does not hold.

Case (2). If (2.6) holds, then, from (2.1), (2.3), and (H1)-(H4), an argument similar to that in Case (1) yields a contradiction, which shows that (2.6) does not hold.

Therefore, (2.4) is proved, and

$$
|x_i(t)| \le Ke^{-\lambda t} \quad \text{for all } t>0,\ i\in F.
$$

The proof of Theorem 2.1 is now complete. □

3 An example

Example 3.1 Consider the following CGNNs with continuously distributed delays in the leakage terms:

$$
\begin{cases}
x_1'(t) = -\Bigl(2+e^{\cos^2 t}\dfrac{1}{10\pi}\arctan x_1(t)\Bigr)\Bigl[\Bigl(4-\dfrac{|t||\sin t|}{1+2|t|}\Bigr)\displaystyle\int_0^{\infty}\delta_1(s)x_1(t-s)\,ds \\
\qquad {}+\dfrac{1}{70}\dfrac{|t|\sin t}{1+40|t|}f_1\bigl(x_1(t-2\sin^2 t)\bigr)+\dfrac{1}{70}\dfrac{|t|\sin t}{1+36|t|}f_2\bigl(x_2(t-3\sin^2 t)\bigr) \\
\qquad {}+\dfrac{1}{70}\dfrac{|t|\sin t}{1+40|t|}\displaystyle\int_0^{\infty}e^{-u}g_1\bigl(x_1(t-u)\bigr)\,du+\dfrac{1}{70}\dfrac{|t|^2\sin t}{1+36|t|^2}\displaystyle\int_0^{\infty}e^{-u}g_2\bigl(x_2(t-u)\bigr)\,du \\
\qquad {}+20{,}000e^{-3t}\sin t\Bigr], \\
x_2'(t) = -\Bigl(2+e^{\sin^2 t}\dfrac{1}{10\pi}\arctan x_2(t)\Bigr)\Bigl[\Bigl(4-\dfrac{|t||\cos t|}{1+2|t|}\Bigr)\displaystyle\int_0^{\infty}\delta_2(s)x_2(t-s)\,ds \\
\qquad {}+\dfrac{1}{70}\dfrac{|t|\cos t}{1+40|t|}f_1\bigl(x_1(t-2\sin^2 t)\bigr)+\dfrac{1}{70}\dfrac{|t|\cos t}{1+36|t|}f_2\bigl(x_2(t-5\sin^2 t)\bigr) \\
\qquad {}+\dfrac{1}{70}\dfrac{|t|\cos t}{1+40|t|}\displaystyle\int_0^{\infty}e^{-u}g_1\bigl(x_1(t-u)\bigr)\,du+\dfrac{1}{70}\dfrac{|t|\cos t}{1+36|t|}\displaystyle\int_0^{\infty}e^{-u}g_2\bigl(x_2(t-u)\bigr)\,du \\
\qquad {}+30{,}000e^{-t}\cos t\Bigr],
\end{cases}
$$
(3.1)

where $f_i(x) = g_i(x) = x\sin^2(ix)$ and $\delta_i(t) = e^{-10t}$, $i=1,2$.

Since $|\arctan u| \le \pi/2$, $1 \le e^{\cos^2 t},\,e^{\sin^2 t} \le e$, and $0 \le |t||\sin t|/(1+2|t|),\,|t||\cos t|/(1+2|t|) \le 1/2$, it follows that

$$
1 \le \underline{a}_i \le \bar{a}_i \le 3, \qquad 3 \le \underline{b}_i \le \bar{b}_i \le 4, \quad i=1,2,
$$

and

$$
\underline{b}_i|u| \le \operatorname{sgn}(u)\,b_i(t,u) \quad \text{for all } t,u\in\mathbb{R},\ i=1,2.
$$

Define a continuous function $\Gamma_i(\omega)$ by setting

$$
\Gamma_i(\omega) = -\Bigl[\underline{a}_i\underline{b}_i\int_0^{\infty}\delta_i(s)e^{\omega s}\,ds - \omega\Bigl(1+\bar{a}_i\bar{b}_i\int_0^{\infty}s\,\delta_i(s)e^{\omega s}\,ds\Bigr) - \bar{a}_i\bar{b}_i\int_0^{\infty}s\,\delta_i(s)e^{\omega s}\,ds\cdot\bar{a}_i\bar{b}_i\int_0^{\infty}\delta_i(s)e^{\omega s}\,ds\Bigr] + \bar{a}_i\Bigl[\sum_{j=1}^{n}\tilde{L}_j\Bigl(|c_{ij}(t)|e^{\omega\beta\tau_{ij}(t)} + \bar{a}_i\bar{b}_i\int_0^{\infty}s\,\delta_i(s)e^{\omega s}\,ds\,\bar{c}_{ij}e^{\omega\beta\tau_{ij}^{+}}\Bigr)e^{\omega(1-\beta)t} + \sum_{j=1}^{n}L_j\int_0^{\infty}|K_{ij}(u)|e^{\omega\alpha u}\,du\Bigl(|d_{ij}(t)| + \bar{d}_{ij}\bar{a}_i\bar{b}_i\int_0^{\infty}s\,\delta_i(s)e^{\omega s}\,ds\Bigr)e^{\omega(1-\alpha)t}\Bigr] \quad \text{for all } t>0,\ i=1,2.
$$

Since $\Gamma_i(\omega)$ is continuous and $\Gamma_i(0)<0$, we can choose constants $\eta=0.1$ and $\lambda>0$ such that

$$
\Gamma_i(\lambda) < -\eta \quad \text{for all } t>0,\ i=1,2,
$$

which implies that CGNNs (3.1) satisfy (H1)-(H5). Hence, by Theorem 2.1, all solutions of CGNNs (3.1) with initial values $(\varphi_1, \varphi_2)$ converge exponentially to the zero point $(0,0)$.
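As a numerical sanity check of this choice (a sketch, not part of the proof: the constants below are crude bounds read off from (3.1), namely $\tilde{L}_j = L_j = 1$, $\alpha=\beta=1$, $\bar{c}_{ij},\bar{d}_{ij}\le 1/2520$, and $\tau_{ij}^{+}\le 5$), the integrals in $\Gamma_i$ have closed forms for $\delta_i(s)=e^{-10s}$ and $K_{ij}(u)=e^{-u}$, so an upper bound on $\Gamma_i(\lambda)$ can be evaluated directly:

```python
import numpy as np

# Crude parameter bounds for Example 3.1 (estimates, not exact values)
a_lo, a_hi = 1.0, 3.0       # bounds on a_i(t, u)
b_lo, b_hi = 3.0, 4.0       # bounds on b_i(t, u)/u
L = 1.0                     # L_j = L~_j = 1 since |x sin^2(jx)| <= |x|
cbar = dbar = 1.0 / 2520.0  # common upper bound on |c_ij(t)|, |d_ij(t)|
tau_p = 5.0                 # largest transmission delay in (3.1)
sigma = 10.0                # leakage kernel delta_i(s) = e^{-10 s}

def Gamma(lam):             # upper bound on Gamma_i(lam), alpha = beta = 1
    I0 = 1.0 / (sigma - lam)        # int_0^inf e^{-10 s} e^{lam s} ds
    I1 = 1.0 / (sigma - lam) ** 2   # int_0^inf s e^{-10 s} e^{lam s} ds
    IK = 1.0 / (1.0 - lam)          # int_0^inf e^{-u} e^{lam u} du
    bracket = a_lo * b_lo * I0 - lam * (1 + a_hi * b_hi * I1) \
              - a_hi * b_hi * I1 * a_hi * b_hi * I0
    sum_c = 2 * L * cbar * np.exp(lam * tau_p) * (1 + a_hi * b_hi * I1)
    sum_d = 2 * L * IK * dbar * (1 + a_hi * b_hi * I1)
    return -bracket + a_hi * (sum_c + sum_d)

for lam in (0.0, 0.01, 0.02):
    print(f"Gamma({lam}) = {Gamma(lam):.4f}")  # stays below -eta = -0.1
```

The printed values are roughly $-0.15$, $-0.14$, and $-0.13$, all below $-\eta=-0.1$, so any sufficiently small $\lambda>0$ (say, $\lambda\in(0,0.02]$) works for this example.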

Remark 3.1 It is easy to check that the results in [17–23] and [24–34] cannot be applied to obtain the global exponential convergence of (3.1), since its leakage delays are continuously distributed.

References

  1. Cohen M, Grossberg S: Absolute stability of global pattern formation and parallel memory storage by competitive neural networks. IEEE Trans. Syst. Man Cybern. 1983, 13: 815-826.

  2. Kennedy MP, Chua LO: Neural networks for nonlinear programming. IEEE Trans. Circuits Syst. 1988, 35: 554-562. 10.1109/31.1783

  3. Cao J, Liang J: Boundedness and stability for Cohen-Grossberg neural networks with time-varying delays. J. Math. Anal. Appl. 2004, 296: 665-685. 10.1016/j.jmaa.2004.04.039

  4. Chen T, Rong L: Delay-independent stability analysis of Cohen-Grossberg neural networks. Phys. Lett. A 2003, 317: 436-439. 10.1016/j.physleta.2003.08.066

  5. Li Y: Existence and stability of periodic solutions for Cohen-Grossberg neural networks with multiple delays. Chaos Solitons Fractals 2004, 20: 459-466. 10.1016/S0960-0779(03)00406-5

  6. Liao X, Li C, Wong K: Criteria for exponential stability of Cohen-Grossberg neural networks. Neural Netw. 2004, 17: 1401-1406. 10.1016/j.neunet.2004.08.007

  7. Wang L: Stability of Cohen-Grossberg neural networks with distributed delays. Appl. Math. Comput. 2005, 160: 93-110. 10.1016/j.amc.2003.09.014

  8. Wang L, Zou X: Exponential stability of Cohen-Grossberg neural networks. Neural Netw. 2002, 15: 415-422. 10.1016/S0893-6080(02)00025-4

  9. Liu B: New convergence behavior of solutions to Cohen-Grossberg neural networks with delays and time-varying coefficients. Phys. Lett. A 2008, 372(2): 117-123. 10.1016/j.physleta.2007.09.066

  10. Gong S: Anti-periodic solutions for a class of Cohen-Grossberg neural networks. Comput. Math. Appl. 2009, 58(2): 341-347. 10.1016/j.camwa.2009.03.105

  11. Zhang Z, Zhou D: Global robust exponential stability for second-order Cohen-Grossberg neural networks with multiple delays. Neurocomputing 2009, 73(1-3): 213-218. 10.1016/j.neucom.2009.09.003

  12. Zhang ZQ, Yang Y, Huang YS: Global exponential stability of interval general BAM neural networks with reaction-diffusion terms and multiple time-varying delays. Neural Netw. 2011, 24: 457-465. 10.1016/j.neunet.2011.02.003

  13. Zhang Z, Liu W, Zhou D: Existence and global exponential stability of periodic solution to Cohen-Grossberg BAM neural networks with time-varying delays. Abstr. Appl. Anal. 2012, 2012: Article ID 805846. 10.1155/2012/805846

  14. Haykin S: Neural Networks. Prentice Hall, New York; 1994.

  15. Kosko B: Neural Networks and Fuzzy Systems. Prentice Hall, New Delhi; 1992.

  16. Gopalsamy K: Stability and Oscillations in Delay Differential Equations of Population Dynamics. Kluwer Academic, Dordrecht; 1992.

  17. Gopalsamy K: Leakage delays in BAM. J. Math. Anal. Appl. 2007, 325: 1117-1132. 10.1016/j.jmaa.2006.02.039

  18. Li X, Cao J: Delay-dependent stability of neural networks of neutral type with time delay in the leakage term. Nonlinearity 2010, 23: 1709-1726. 10.1088/0951-7715/23/7/010

  19. Li X, Rakkiyappan R, Balasubramaniam P: Existence and global stability analysis of equilibrium of fuzzy cellular neural networks with time delay in the leakage term under impulsive perturbations. J. Franklin Inst. 2011, 348: 135-155. 10.1016/j.jfranklin.2010.10.009

  20. Balasubramaniam P, Vembarasan V, Rakkiyappan R: Leakage delays in T-S fuzzy cellular neural networks. Neural Process. Lett. 2011, 33: 111-136. 10.1007/s11063-010-9168-3

  21. Liu B: Global exponential stability for BAM neural networks with time-varying delays in the leakage terms. Nonlinear Anal., Real World Appl. 2012. 10.1016/j.nonrwa.2012.07.016

  22. Gan Q, Liang Y: Synchronization of chaotic neural networks with time delay in the leakage term and parametric uncertainties based on sampled-data control. J. Franklin Inst. 2012, 349(6): 1955-1971. 10.1016/j.jfranklin.2012.05.001

  23. Peng S: Global attractive periodic solutions of BAM neural networks with continuously distributed delays in the leakage terms. Nonlinear Anal., Real World Appl. 2010, 11: 2141-2151. 10.1016/j.nonrwa.2009.06.004

  24. Liu B: Global exponential stability for BAM neural networks with time-varying delays in the leakage terms. Nonlinear Anal., Real World Appl. 2013, 14: 559-566. 10.1016/j.nonrwa.2012.07.016

  25. Chen Z, Yang M: Exponential convergence for HRNNs with continuously distributed delays in the leakage terms. Neural Comput. Appl. 2012. 10.1007/s00521-012-1172-2

  26. Chen Z, Meng J: Exponential convergence for cellular neural networks with time-varying delays in the leakage terms. Abstr. Appl. Anal. 2012, 2012: Article ID 941063. 10.1155/2012/941063

  27. Xiong W, Meng J: Exponential convergence for cellular neural networks with continuously distributed delays in the leakage terms. Electron. J. Qual. Theory Differ. Equ. 2013, 2013: Article ID 10. http://www.math.u-szeged.hu/ejqtde/

  28. Xu Y: Anti-periodic solutions for HCNNs with time-varying delays in the leakage terms. Neural Comput. Appl. 2012. 10.1007/s00521-012-1330-6

  29. Chen Z: A shunting inhibitory cellular neural network with leakage delays and continuously distributed delays of neutral type. Neural Comput. Appl. 2012. 10.1007/s00521-012-1200-2

  30. Chen Z, Meng J: Exponential convergence for cellular neural networks with time-varying delays in the leakage terms. Abstr. Appl. Anal. 2012, 2012: Article ID 941063. 10.1155/2012/941063

  31. Zhang H, Yang M: Global exponential stability of almost periodic solutions for SICNNs with continuously distributed leakage delays. Abstr. Appl. Anal. 2013, 2013: Article ID 307981.

  32. Lakshmanan S, Park JH, Lee TH, Jung HY, Rakkiyappan R: Stability criteria for BAM neural networks with leakage delays and probabilistic time-varying delays. Appl. Math. Comput. 2013, 219(17): 9408-9423. 10.1016/j.amc.2013.03.070

  33. Li Y, Li Y: Existence and exponential stability of almost periodic solution for neutral delay BAM neural networks with time-varying delays in leakage terms. J. Franklin Inst. 2013, 350(9): 2808-2825. 10.1016/j.jfranklin.2013.07.005

  34. Baštinec J, Diblík J, Khusainov DY, Ryvolová A: Exponential stability and estimation of solutions of linear differential systems of neutral type with constant coefficients. Bound. Value Probl. 2010, 2010: Article ID 956121.


Acknowledgements

The authors would like to express their sincere appreciation to the reviewers for their helpful comments, which improved the presentation and quality of the paper. This work was supported by the National Natural Science Foundation of China (grant nos. 51375160, 11201184) and the Scientific Research Fund of Hunan Provincial Natural Science Foundation of China (grant no. 12JJ3007).

Author information

Correspondence to Shuhua Gong.

Additional information

Competing interests

The authors declare that they have no competing interests.

Authors’ contributions

ZC gave the proof of Theorem 2.1 and drafted the manuscript. SG proved and gave the example to illustrate the effectiveness of the obtained results. All authors read and approved the final manuscript.

Rights and permissions

Open Access This article is distributed under the terms of the Creative Commons Attribution 2.0 International License (https://creativecommons.org/licenses/by/2.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.


About this article

Cite this article

Chen, Z., Gong, S. Exponential convergence of Cohen-Grossberg neural networks with continuously distributed leakage delays. J Inequal Appl 2014, 48 (2014). https://doi.org/10.1186/1029-242X-2014-48
