
A recurrent multiscale architecture for long-term memory prediction task

Squartini, S.; Hussain, A.; Piazza, F.

Authors

S. Squartini

A. Hussain

F. Piazza



Abstract

In the past few years, researchers have extensively studied the application of recurrent neural networks (RNNs) to tasks that require the detection of long-term dependencies. This paper proposes an original architecture, termed the Recurrent Multiscale Network (RMN), to address this class of problems. Its key property is that it retains conventional RNNs' capability for storing information while reducing their typical drawback when trained by gradient descent algorithms, namely the vanishing gradient effect. The RMN achieves this by preprocessing the original signal with a suitable DSP tool to separate information at different temporal scales, handling each scale with an autonomous recurrent architecture, and combining the results in a nonlinear reconstruction section. In time series prediction tasks involving long-range dependencies, the network has shown markedly improved generalization performance over conventional RNNs.
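
The abstract describes a three-stage pipeline: multiscale decomposition of the input signal, one independent recurrent subnetwork per scale, and a nonlinear reconstruction stage. The following is a minimal sketch of that idea, assuming a discrete wavelet decomposition as the DSP preprocessing step (suggested by the paper's keywords) and plain RNN subnetworks; the class and parameter names are hypothetical illustrations, not the authors' implementation.

# Hedged sketch of the Recurrent Multiscale Network idea from the abstract.
# Assumptions: wavelet decomposition as the multiscale DSP tool, one small RNN
# per band, and an MLP as the nonlinear reconstruction section.
import pywt
import torch
import torch.nn as nn


class RecurrentMultiscaleSketch(nn.Module):
    def __init__(self, levels=3, hidden_size=16, wavelet="db4"):
        super().__init__()
        self.levels = levels
        self.wavelet = wavelet
        # One autonomous recurrent subnetwork per temporal scale
        # (levels detail bands + 1 approximation band).
        self.subnets = nn.ModuleList(
            [nn.RNN(input_size=1, hidden_size=hidden_size, batch_first=True)
             for _ in range(levels + 1)]
        )
        # Nonlinear reconstruction section combining the per-scale summaries.
        self.reconstruct = nn.Sequential(
            nn.Linear(hidden_size * (levels + 1), hidden_size),
            nn.Tanh(),
            nn.Linear(hidden_size, 1),
        )

    def forward(self, series):
        # series: 1-D numpy array; split it into approximation + detail bands.
        coeffs = pywt.wavedec(series, self.wavelet, level=self.levels)
        summaries = []
        for band, rnn in zip(coeffs, self.subnets):
            x = torch.tensor(band, dtype=torch.float32).view(1, -1, 1)
            _, h = rnn(x)                  # final hidden state summarises the band
            summaries.append(h.squeeze(0))
        # Nonlinear recombination of the per-scale summaries into a prediction.
        return self.reconstruct(torch.cat(summaries, dim=-1))

Under these assumptions, calling the module on a 1-D series, e.g. RecurrentMultiscaleSketch()(series), yields a single one-step-ahead prediction built from the per-scale hidden states; the point of the design is that each subnetwork only has to track dependencies at its own temporal scale, shortening the paths over which gradients must propagate.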

Presentation Conference Type Conference Paper (Published)
Conference Name 2003 IEEE International Conference on Acoustics, Speech, and Signal Processing, 2003. Proceedings. (ICASSP '03).
Start Date Apr 6, 2003
End Date Apr 10, 2003
Online Publication Date Jun 5, 2003
Publication Date 2003
Deposit Date Oct 17, 2019
Pages 789-792
Series ISSN 1520-6149
Book Title 2003 IEEE International Conference on Acoustics, Speech, and Signal Processing, 2003. Proceedings. (ICASSP '03).
ISBN 0-7803-7663-3
DOI https://doi.org/10.1109/ICASSP.2003.1202485
Keywords recurrent neural nets, neural net architecture, time series, prediction theory, discrete wavelet transforms
Public URL http://researchrepository.napier.ac.uk/Output/1793724