Learning from Few Samples with Memory Network

Zhang, Shufei; Huang, Kaizhu; Zhang, Rui; Hussain, Amir


Abstract

Neural networks (NNs) have achieved great success in pattern recognition and machine learning. However, this success usually relies on the availability of a sufficiently large number of training samples, and when fed with a limited data set, a NN's performance may degrade significantly. In this paper, a novel NN structure, called a memory network, is proposed. It is inspired by the cognitive mechanism of human beings, who can learn effectively even from limited data. By taking advantage of the memory of previous samples, the new model achieves a remarkable improvement in performance when trained with limited data. The memory network is demonstrated here with the multi-layer perceptron (MLP) as the base model; however, the idea extends straightforwardly to other neural networks, e.g., convolutional neural networks (CNNs). In this paper, the memory network structure is detailed, the training algorithm is presented, and a series of experiments is conducted to validate the proposed framework. Experimental results show that the proposed model outperforms traditional MLP-based models as well as other competitive algorithms on two real benchmark data sets.
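The abstract does not spell out the architecture, but as a rough intuition for how a "memory of previous samples" might augment an MLP classifier, here is a minimal sketch. Everything in it (the cosine-similarity read-out over stored hidden features, the `alpha` blending weight, and the class/parameter names) is an illustrative assumption for this record page, not the method published in the paper.

```python
# Illustrative sketch only: a hypothetical memory-augmented MLP in the spirit of
# the abstract, NOT the authors' published architecture or training algorithm.
import numpy as np


class MemoryAugmentedMLP:
    """One-hidden-layer MLP whose class probabilities are blended with a
    similarity-weighted vote over stored (hidden feature, label) pairs."""

    def __init__(self, in_dim, hidden_dim, n_classes, alpha=0.5, seed=0):
        rng = np.random.default_rng(seed)
        self.W1 = rng.normal(0.0, 0.1, (in_dim, hidden_dim))
        self.b1 = np.zeros(hidden_dim)
        self.W2 = rng.normal(0.0, 0.1, (hidden_dim, n_classes))
        self.b2 = np.zeros(n_classes)
        self.alpha = alpha          # weight of the memory term (assumed hyper-parameter)
        self.mem_keys = None        # hidden features of previously seen samples
        self.mem_labels = None      # their one-hot labels
        self.n_classes = n_classes

    def _hidden(self, X):
        return np.tanh(X @ self.W1 + self.b1)

    def remember(self, X, y):
        """Store hidden representations of training samples as the 'memory'."""
        H = self._hidden(X)
        Y = np.eye(self.n_classes)[y]
        self.mem_keys = H if self.mem_keys is None else np.vstack([self.mem_keys, H])
        self.mem_labels = Y if self.mem_labels is None else np.vstack([self.mem_labels, Y])

    def predict_proba(self, X):
        H = self._hidden(X)
        logits = H @ self.W2 + self.b2
        mlp_prob = np.exp(logits - logits.max(axis=1, keepdims=True))
        mlp_prob /= mlp_prob.sum(axis=1, keepdims=True)
        if self.mem_keys is None:
            return mlp_prob
        # Cosine-similarity attention over the memory, then a soft label vote.
        Hn = H / np.linalg.norm(H, axis=1, keepdims=True)
        Mn = self.mem_keys / np.linalg.norm(self.mem_keys, axis=1, keepdims=True)
        sim = Hn @ Mn.T
        attn = np.exp(sim) / np.exp(sim).sum(axis=1, keepdims=True)
        mem_prob = attn @ self.mem_labels
        # Blend the parametric MLP prediction with the non-parametric memory vote.
        return (1 - self.alpha) * mlp_prob + self.alpha * mem_prob
```

With very few training samples the memory vote acts much like a soft nearest-neighbour prior that regularises the under-trained MLP; as more data become available, a smaller `alpha` would let the parametric model dominate. Again, this trade-off is an assumption of the sketch, not a claim from the paper.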

Journal Article Type Article
Acceptance Date Sep 6, 2017
Online Publication Date Oct 25, 2017
Publication Date Feb 2018
Deposit Date Jul 26, 2019
Journal Cognitive Computation
Print ISSN 1866-9956
Publisher Springer
Peer Reviewed Peer Reviewed
Volume 10
Issue 1
Pages 15-22
DOI https://doi.org/10.1007/s12559-017-9507-z
Keywords Memory, Multi-layer perceptron, Neural network, Recognition, Prior knowledge
Public URL http://researchrepository.napier.ac.uk/Output/1792284
Related Public URLs https://www.storre.stir.ac.uk/handle/1893/26262