Abstract

Proceedings Abstracts of the Twenty-Fourth International Joint Conference on Artificial Intelligence

Equivalence Results between Feedforward and Recurrent Neural Networks for Sequences / 3827
Alessandro Sperduti

In the context of sequence processing, we study the relationship between single-layer feedforward neural networks, which have simultaneous access to all the items composing a sequence, and single-layer recurrent neural networks, which access information one step at a time. We treat both linear and nonlinear networks, describing a constructive procedure, based on linear autoencoders for sequences, that, given a feedforward neural network, shows how to define a recurrent neural network implementing the same function in time. Upper bounds on the number of hidden units required by the recurrent network are given as a function of some features of the feedforward network. By separating the functional component from the memory component, the proposed procedure suggests new efficient learning procedures, as well as interpretation procedures, for recurrent neural networks.
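
The abstract only summarizes the construction; as an informal illustration of the underlying idea of separating a lossless linear memory of the sequence from a functional readout, the NumPy sketch below shows a fixed-length case in which a single-layer feedforward map over a whole sequence is reproduced by a recurrence with a purely linear state update. The block matrices A and B implement a naive shift-register memory, not the paper's linear-autoencoder-based construction, and all names and dimensions are illustrative assumptions.

```python
# Illustrative sketch only: a naive lossless linear memory standing in for the
# paper's linear autoencoder for sequences, showing how a feedforward map over
# a whole (fixed-length) sequence can be reproduced by a recurrence that reads
# one item per time step. Names and dimensions are arbitrary assumptions.
import numpy as np

seq_len, item_dim, out_dim = 5, 3, 2
rng = np.random.default_rng(0)

# Feedforward network: simultaneous access to the flattened sequence.
W_ff = rng.normal(size=(out_dim, seq_len * item_dim))

def feedforward(seq):
    """seq: array of shape (seq_len, item_dim), seen all at once."""
    return W_ff @ seq.reshape(-1)

# Linear memory component: state update y_t = A x_t + B y_{t-1}.
# A writes the current item into the top block, B shifts earlier items down,
# so after seq_len steps the state holds [x_T; x_{T-1}; ...; x_1] losslessly.
state_dim = seq_len * item_dim
A = np.zeros((state_dim, item_dim))
A[:item_dim, :] = np.eye(item_dim)
B = np.zeros((state_dim, state_dim))
B[item_dim:, :-item_dim] = np.eye(state_dim - item_dim)

def recurrent(seq):
    """Same function computed one item at a time: memory + functional readout."""
    y = np.zeros(state_dim)
    for x_t in seq:
        y = A @ x_t + B @ y                                   # memory component (linear)
    ordered = y.reshape(seq_len, item_dim)[::-1].reshape(-1)  # restore x_1..x_T order
    return W_ff @ ordered                                     # functional component

seq = rng.normal(size=(seq_len, item_dim))
assert np.allclose(feedforward(seq), recurrent(seq))
print("feedforward and recurrent outputs match:", recurrent(seq))
```

The naive state used here grows linearly with the sequence length; the abstract's construction instead relies on linear autoencoders for sequences, which is what the stated upper bounds on the number of hidden units of the recurrent network refer to.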