Journal
NONLINEAR ANALYSIS-REAL WORLD APPLICATIONS
Volume 9, Issue 4, Pages 1535-1557
Publisher
PERGAMON-ELSEVIER SCIENCE LTD
DOI: 10.1016/j.nonrwa.2007.03.018
Keywords
recurrent neural networks; Lagrange stability; global exponential attractivity; delays
Abstract
In this paper, we study global exponential stability in the Lagrange sense for continuous recurrent neural networks (RNNs) with multiple time delays. Three different types of activation functions are considered, including both bounded and unbounded activation functions. By constructing appropriate Lyapunov-like functions, we provide easily verifiable criteria for the boundedness and global exponential attractivity of RNNs. These results can be applied to analyze monostable as well as multistable neural networks. (C) 2007 Elsevier Ltd. All rights reserved.
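The abstract does not reproduce the model equations, but Lagrange stability of a delayed RNN can be illustrated numerically. The sketch below simulates a standard delayed Hopfield-type model, dx/dt = -Dx + A g(x(t)) + B g(x(t - tau)) + I, with the bounded activation g = tanh, using explicit Euler integration and a history buffer for the delay. All parameter values (matrices A, B, delay tau, inputs I) are hypothetical and chosen only so that trajectories from distant initial states enter a common bounded region; this is not the paper's actual criterion, merely a demonstration of the boundedness behavior the criteria guarantee.

```python
import numpy as np

def simulate(x0, T=20.0, h=0.01, tau=1.0):
    """Euler-integrate a 2-neuron delayed RNN:
    dx/dt = -D x + A tanh(x(t)) + B tanh(x(t - tau)) + I.
    All parameters below are illustrative, not from the paper."""
    D = np.array([1.0, 1.0])                      # positive self-decay rates
    A = np.array([[0.5, -0.3], [0.2, 0.4]])       # instantaneous connection weights
    B = np.array([[0.3, 0.1], [-0.2, 0.3]])       # delayed connection weights
    I = np.array([0.5, -0.5])                     # constant external inputs
    delay_steps = int(round(tau / h))
    n_steps = int(round(T / h))
    # Constant initial history on [-tau, 0], as is common for delay equations.
    hist = [np.array(x0, dtype=float)] * (delay_steps + 1)
    for _ in range(n_steps):
        x = hist[-1]
        x_del = hist[-1 - delay_steps]            # state tau time units ago
        dx = -D * x + A @ np.tanh(x) + B @ np.tanh(x_del) + I
        hist.append(x + h * dx)
    return np.array(hist)

traj = simulate([5.0, -4.0])
```

Because tanh is bounded by 1, each component of dx/dt is eventually dominated by the -D x term, so trajectories from any initial state are attracted into a fixed bounded set (here roughly |x_i| <= (row sums of |A| and |B| plus |I_i|) / D_i, about 1.7). Running `simulate` from widely separated initial conditions shows both trajectories settling inside that region, which is the Lagrange-stability picture the paper's criteria formalize.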