Locally Recurrent Neural Networks and Their Applications

Todor D. Ganchev
DOI: 10.4018/978-1-60566-766-9.ch009

Abstract

In this chapter we review various computational models of locally recurrent neurons and discuss the architecture of some archetypal locally recurrent neural networks (LRNNs) that are built upon them. Generalizations of these structures are discussed as well. Furthermore, we point to a number of real-world applications of LRNNs that have been reported in past and recent publications. These applications involve classification or prediction of temporal sequences, discovery and modeling of spatial and temporal correlations, process identification and control, etc. Validation experiments reported in these developments provide evidence that locally recurrent architectures are capable of identifying and exploiting temporal and spatial correlations (i.e., the context in which events occur), which is the main reason for their advantageous performance compared with that of their non-recurrent counterparts or other comparable machine learning techniques.

Introduction

After the introduction of the simplified computational model of a neuron in the early 1940s (McCulloch & Pitts, 1943), various advanced models of neurons and derivative neural network structures were developed. Today, some sixty-five years later, there is a huge number of different neurons and an immense number of neural networks, which differ in their building blocks, linkage, and structure. This abundance of architectures reflects the widespread use of neural networks on real-life problems, and the diversity of these problems attests to the inherent flexibility and enormous potential of neural networks.

In the present chapter we focus our attention on one class of dynamic neural networks, namely the locally recurrent architectures, which were introduced at the beginning of the 1990s and have regained significant attention in the past few years.

The emergence of locally recurrent architectures contributed to a significant advance in the theory and practice of neural networks. In the years following their introduction, numerous derivative architectures flourished, and locally recurrent neurons became important building blocks in many original neural network designs developed for the needs of particular real-world applications. In these applications, LRNNs proved successful in dealing with spatial correlations, time sequences, and structured data. Moreover, the engagement of LRNNs in real-world applications was repaid with improved training strategies and a variety of new architectures. The considerable research interest in LRNNs led to a significant advance of gradient-based learning techniques, which had to be adapted to the needs of temporal sequence processing. Furthermore, numerous new training methods based on genetic algorithms, particle swarm optimization (PSO), and other evolutionary techniques appeared, diversifying the nomenclature of training methods and improving the applicability of LRNNs, both on specific domain-oriented tasks and in general.

Before proceeding with the main exposition, we would like to define a few terms that we use throughout the text, in order to avoid potential ambiguity:

First of all, we call dynamic neural networks those networks that incorporate dynamic synaptic or feedback weights among some or all of their neurons. Such networks are capable of expressing dynamic behaviors.

We label as recurrent neural networks those architectures that incorporate feedback connections among the layers of the network, or those that do not have a straightforward layered input-output architecture but in which the signals instead flow back and forth among the nodes of the network.

Feedforward architectures that incorporate a layer of recurrent neurons, where these neurons receive feedback only from neurons belonging to the same layer, are referred to as locally connected recurrent neural networks; a sketch of such a layer is given below. Locally connected architectures are also referred to as non-local recurrent structures.
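As an illustration, the following minimal NumPy sketch implements a single locally connected recurrent layer, in which each neuron receives feedback from the outputs of all neurons in the same layer at the previous time step. The function and variable names are our own illustrative assumptions; they do not come from the chapter or from any particular library.

```python
import numpy as np

def locally_connected_recurrent_layer(x, W_in, W_rec, f=np.tanh):
    """Layer whose neurons receive feedback only from neurons of the
    same layer (illustrative sketch; names are our own).

    x     : (T, d) input sequence
    W_in  : (h, d) input weights
    W_rec : (h, h) intra-layer feedback weights
    """
    T, h = x.shape[0], W_in.shape[0]
    y = np.zeros((T + 1, h))      # y[0] is the zero initial state
    for t in range(T):
        # each unit sees the previous outputs of the whole layer
        y[t + 1] = f(W_in @ x[t] + W_rec @ y[t])
    return y[1:]
```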

Next, we use the term locally recurrent neural network for any network that possesses neurons with local feedback connections. At the level of individual neurons, local recurrence is identified with the presence of one or more feedbacks that encompass one or more elements in the structure of a neuron. The following section discusses this matter in detail.
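To make the distinction concrete, here is a minimal sketch of a single neuron with local output feedback: the feedback loop encompasses only the neuron's own past outputs, through a short tapped delay line, rather than other neurons or layers. All names, and the choice of output feedback (rather than, e.g., activation or synaptic feedback), are illustrative assumptions on our part, not a specification from the chapter.

```python
import numpy as np

def locally_recurrent_neuron(x, w, a, f=np.tanh):
    """Neuron with local output feedback (illustrative sketch).

    x : (T, d) input sequence
    w : (d,)  synaptic weights
    a : (K,)  feedback taps on the neuron's own past outputs
    """
    T, K = x.shape[0], a.shape[0]
    y = np.zeros(T + K)                # K leading zeros: initial conditions
    for t in range(T):
        past = y[t:t + K][::-1]        # y[t-1], ..., y[t-K]
        y[t + K] = f(x[t] @ w + a @ past)
    return y[K:]
```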

Furthermore, at the level of neural networks (we assume multilayer neural networks), the locally recurrent neurons might be organized in feedforward, non-local recurrent, or globally recurrent architectures, depending on the implementation of the network linkage; the feedforward case is sketched below. For further reference on terminology we direct the interested reader to the work of Tsoi and Back (1997), which offers a comprehensive overview of terms and architectures related to recurrent neural networks.
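For instance, the feedforward organization mentioned above can be sketched by stacking layers of locally recurrent neurons, where the recurrence stays inside each unit (a per-unit delay line on its own output) and no feedback exists between layers. As before, this is only a sketch under our own naming assumptions.

```python
import numpy as np

def locally_recurrent_layer(x, W, A, f=np.tanh):
    """Layer of locally recurrent neurons: each unit feeds back only
    its own past outputs (illustrative sketch).

    x : (T, d) inputs; W : (h, d) input weights; A : (h, K) feedback taps.
    """
    T, (h, K) = x.shape[0], A.shape
    y = np.zeros((T + K, h))
    for t in range(T):
        past = y[t:t + K][::-1]                    # rows: t-1, ..., t-K
        y[t + K] = f(W @ x[t] + np.einsum('hk,kh->h', A, past))
    return y[K:]

def lrnn_feedforward(x, layers, f=np.tanh):
    """Feedforward stack of locally recurrent layers: all recurrence is
    local to the neurons; there is no feedback between layers."""
    for W, A in layers:
        x = locally_recurrent_layer(x, W, A, f)
    return x
```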

In the following sections we overview various neural network architectures that are based on locally recurrent neurons. We place the emphasis on detailing the structure of these neural networks; therefore, the numerous training methods developed for these architectures are not covered in detail. The interested reader should follow the comprehensive references provided in order to obtain more information about the training methods used and their performance on real-world problems.

Key Terms in this Chapter

Locally Connected Recurrent Neural Networks: Feedforward architectures that incorporate a layer of recurrent neurons, where these neurons receive feedback only from neurons belonging to the same layer.

Dynamic Neural Networks: Networks that incorporate dynamic synaptic or feedback weights among some or all of their neurons. These networks are capable of expressing dynamic behaviors.

Locally Recurrent Neural Networks: Networks that possess neurons with local feedback connections. At the level of individual neurons, local recurrence is identified with the presence of one or more feedbacks that encompass one or more elements in the structure of a neuron.

Non-Local Recurrent: Another term denoting the locally connected recurrent architectures.

Recurrent Neural Networks: Architectures that incorporate feedback connections among the layers of the network, or those that do not have a straightforward layered input-output architecture but in which the signals instead flow back and forth among the nodes of the network.
