Dynamics in Artificial Higher Order Neural Networks with Delays


Jinde Cao, Fengli Ren, Jinling Liang
Copyright: © 2009 | Pages: 41
DOI: 10.4018/978-1-59904-897-0.ch018

Abstract

This chapter concentrates on the dynamics of artificial higher order neural networks (HONNs) with delays. Both stability analysis and periodic oscillation are discussed for a class of delayed HONNs with (or without) impulses. Most of the sufficient conditions obtained in this chapter are presented as linear matrix inequalities (LMIs), and so can be easily computed and checked in practice using the Matlab LMI Toolbox. In practice, stability is a necessary feature of any applied artificial neural network. Periodic solutions also play an important role in characterizing the dynamical behavior of all solutions, even though other dynamics such as bifurcation and chaos coexist. We therefore focus mainly on the stability and periodic solutions of artificial HONNs with (or without) impulses. First, stability analysis and periodic oscillation are analyzed for higher order bidirectional associative memory (BAM) neural networks without impulses. Second, global exponential stability and exponential convergence are studied for a class of impulsive higher order BAM neural networks with time-varying delays. The main methods and tools used in this chapter are LMIs, Lyapunov stability theory, and coincidence degree theory.
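As a rough illustration of how an LMI-based stability condition can be checked numerically, the following minimal sketch uses Python's cvxpy as a stand-in for the Matlab LMI Toolbox mentioned above. The system matrix A and the Lyapunov-type condition A^T P + P A < 0, P > 0 are illustrative assumptions (the prototype of such LMI criteria), not the chapter's actual conditions:

```python
# Minimal LMI feasibility check with cvxpy, standing in for the
# Matlab LMI Toolbox. The matrix A and the Lyapunov-type LMI below
# are illustrative assumptions, not the chapter's actual criteria.
import numpy as np
import cvxpy as cp

A = np.array([[-2.0, 0.5],
              [0.3, -1.5]])   # assumed (Hurwitz) system matrix
n = A.shape[0]
eps = 1e-6                    # margin for strict definiteness

P = cp.Variable((n, n), symmetric=True)  # candidate Lyapunov matrix
S = cp.Variable((n, n), symmetric=True)  # symmetric alias for A^T P + P A
constraints = [
    S == A.T @ P + P @ A,
    P >> eps * np.eye(n),     # P positive definite
    S << -eps * np.eye(n),    # A^T P + P A negative definite
]
prob = cp.Problem(cp.Minimize(0), constraints)  # pure feasibility problem
prob.solve()

print("LMI feasible:", prob.status == cp.OPTIMAL)
```

If the solver reports feasibility, the returned P certifies stability of the assumed linear system, in the same spirit as checking the chapter's LMI conditions with `feasp`-style routines in Matlab.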

Introduction

In recent years, Hopfield neural networks and their various generalizations have attracted the attention of many scientists (e.g., mathematicians, physicists, and computer scientists), due to their potential for classification, associative memory, and parallel computation, and their ability to solve difficult optimization problems (Hopfield, 1984; Chua and Yang, 1988; Marcus and Westervelt, 1989; Cohen and Grossberg, 1983; Driessche and Zou, 1998; Cao and Tao, 2001; Cao, 2001; Cao and Wang, 2004; Cao, Wang and Liao, 2003). For Hopfield neural networks characterized by first order differential equations, Abu-Mostafa and Jacques (1985), McEliece, Posner, Rodemich and Venkatesh (1987), and Baldi (1988) pointed out intrinsic limitations. As a consequence, architectures with higher order interactions (Personnaz, Guyon and Dreyfus, 1987; Psaltis, Park and Hong, 1988; Simpson, 1990; Peretto and Niez, 1986; Ho, Lam, Xu and Tam, 1999) have been successively introduced to design neural networks with stronger approximation properties, faster convergence rates, greater storage capacity, and higher fault tolerance than lower order neural networks. Meanwhile, the stability properties of these models have been investigated in Dembo, Farotimi and Kailath (1991); Kamp and Hasler (1990); Kosmatopoulos, Polycarpou, Christodoulou and Ioannou (1995); Xu, Liu and Liao (2003); Ren and Cao (2006); Ren and Cao (2007); and Ren and Cao (in press). In this chapter, we give some criteria for higher order BAM neural networks.

BAM neural networks were proposed by Kosko (1988). This model generalizes the single-layer auto-associative circuit and has good application prospects in pattern recognition and in signal and image processing. The circuit diagram and connection pattern implementing the delayed BAM networks can be found in Cao and Wang (2002). From a mathematical viewpoint, although the system in this chapter can be regarded as a network of dimension n+m, it possesses many nice properties, due to the special structure of its connection weights and its practical use in storing paired patterns recalled in both directions: forward and backward. When a neural circuit is employed as an associative memory, the existence of many equilibrium points is a necessary feature. However, when it is applied to parallel computation and signal processing involving the solution of optimization problems, there must be a well-defined computable solution for all possible initial states; that is, the network should have a unique equilibrium point that is globally attractive. Indeed, earlier applications in optimization have suffered from the existence of a complicated set of equilibria. Thus, the global attractiveness of such systems is of great importance for both practical and theoretical reasons. For more details about BAM neural networks, see Cao (2003); Cao and Dong (2003); Liao and Yu (1998); Mohamad (2001); Chen, Cao and Huang (2004).

In this chapter, firstly, we investigate the following second order BAM neural network with time delays:

$$
\begin{aligned}
\frac{dx_i(t)}{dt} &= -a_i x_i(t) + \sum_{j=1}^{m} b_{ij} f_j\big(y_j(t-\tau_1)\big) + \sum_{j=1}^{m}\sum_{l=1}^{m} b_{ijl}\, f_j\big(y_j(t-\tau_1)\big) f_l\big(y_l(t-\tau_1)\big) + I_i,\\
\frac{dy_j(t)}{dt} &= -c_j y_j(t) + \sum_{i=1}^{n} d_{ji} g_i\big(x_i(t-\tau_2)\big) + \sum_{i=1}^{n}\sum_{l=1}^{n} d_{jil}\, g_i\big(x_i(t-\tau_2)\big) g_l\big(x_l(t-\tau_2)\big) + J_j,
\end{aligned}
$$

where i = 1, 2, …, n; j = 1, 2, …, m; t > 0; x_i(t), y_j(t) denote the potential (or voltage) of cell i and cell j at time t; a_i, c_j are positive constants; the time delays τ_1, τ_2 are non-negative constants, which correspond to the finite speed of axonal signal transmission; b_{ij}, d_{ji} and b_{ijl}, d_{jil} are the first and second order connection weights of the neural network, respectively; f_j(·), g_i(·) are the activation functions; I_i, J_j denote the i-th and j-th components of the external input sources introduced from outside the network to cells i and j, respectively; and τ = max{τ_1, τ_2}.
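For intuition about how such a system evolves, the following minimal sketch simulates a small instance of this second order BAM model with a fixed-step Euler scheme for the delay terms. All parameters (weights, tanh activations, delays, initial histories) are illustrative assumptions, not values from the chapter:

```python
# Forward-Euler simulation of a small second order BAM network with
# constant delays tau1, tau2. All parameters are made up for
# illustration; constant initial history is assumed on [-tau, 0].
import numpy as np

np.random.seed(0)
n, m = 2, 2                          # sizes of the two layers
a = np.array([1.0, 1.2])             # decay rates a_i > 0
c = np.array([0.9, 1.1])             # decay rates c_j > 0
B = 0.4 * np.random.randn(n, m)      # first order weights b_ij
D = 0.4 * np.random.randn(m, n)      # first order weights d_ji
B2 = 0.1 * np.random.randn(n, m, m)  # second order weights b_ijl
D2 = 0.1 * np.random.randn(m, n, n)  # second order weights d_jil
I, J = np.zeros(n), np.zeros(m)      # external inputs
f = g = np.tanh                      # bounded activation functions

h = 0.01                             # Euler step size
tau1, tau2 = 0.5, 0.3                # transmission delays
k1, k2 = int(tau1 / h), int(tau2 / h)
steps = 5000
x = np.zeros((steps, n)); y = np.zeros((steps, m))
x[0] = [0.5, -0.3]; y[0] = [0.2, 0.4]

for t in range(steps - 1):
    yd = f(y[max(t - k1, 0)])        # f_j(y_j(t - tau1))
    xd = g(x[max(t - k2, 0)])        # g_i(x_i(t - tau2))
    dx = -a * x[t] + B @ yd + np.einsum('ijl,j,l->i', B2, yd, yd) + I
    dy = -c * y[t] + D @ xd + np.einsum('jil,i,l->j', D2, xd, xd) + J
    x[t + 1] = x[t] + h * dx
    y[t + 1] = y[t] + h * dy

print("x(T) =", x[-1], " y(T) =", y[-1])
```

Under the stability criteria developed in this chapter, trajectories of such a system converge to a unique equilibrium; with periodic inputs or impulses, the chapter's results instead guarantee periodic oscillation.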
