Multiple-Weight Recurrent Neural Networks

Zhu Cao, Linlin Wang, Gerard de Melo

Proceedings of the Twenty-Sixth International Joint Conference on Artificial Intelligence
Main track. Pages 1483-1489. https://doi.org/10.24963/ijcai.2017/205

Recurrent neural networks (RNNs) have enjoyed great success in speech recognition, natural language processing, and other domains. Many variants of RNNs have been proposed, including vanilla RNNs, LSTMs, and GRUs. However, current architectures are not particularly adept at dealing with tasks involving multi-faceted content. In this work, we address this problem by proposing Multiple-Weight RNNs and LSTMs, which rely on multiple weight matrices in an attempt to mimic the human ability to switch between contexts. We present a framework for adapting RNN-based models and analyze the properties of this approach. Our detailed experimental results show that our model outperforms previous work across a range of different tasks and datasets.
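To make the core idea concrete, the following is a minimal, hypothetical sketch (not the authors' exact formulation, whose details are in the paper) of a recurrent cell that keeps several recurrent weight matrices and softly switches between them with a learned selector, mimicking the context-switching behavior described in the abstract. All names (`MultiWeightRNNCell`, `W_s`, etc.) are illustrative assumptions.

```python
import numpy as np

def softmax(x):
    e = np.exp(x - np.max(x))
    return e / e.sum()

class MultiWeightRNNCell:
    """Hypothetical multi-weight RNN cell: K recurrent weight matrices
    instead of one, mixed by a learned soft selector at each step."""

    def __init__(self, input_size, hidden_size, num_weights, seed=0):
        rng = np.random.default_rng(seed)
        # K candidate recurrent weight matrices
        self.W_h = rng.normal(0, 0.1, (num_weights, hidden_size, hidden_size))
        self.W_x = rng.normal(0, 0.1, (hidden_size, input_size))
        # selector maps the concatenated [h; x] to K switching logits
        self.W_s = rng.normal(0, 0.1, (num_weights, hidden_size + input_size))
        self.b = np.zeros(hidden_size)

    def step(self, h, x):
        # soft "context switch": a mixture over the K weight matrices
        alpha = softmax(self.W_s @ np.concatenate([h, x]))  # shape (K,)
        W_mix = np.tensordot(alpha, self.W_h, axes=1)       # shape (H, H)
        return np.tanh(W_mix @ h + self.W_x @ x + self.b)

# Usage: run the cell over a short random sequence.
cell = MultiWeightRNNCell(input_size=4, hidden_size=8, num_weights=3)
h = np.zeros(8)
for x in np.random.default_rng(1).normal(size=(5, 4)):
    h = cell.step(h, x)
print(h.shape)
```

In this sketch the switch is soft (a convex combination via softmax), so the cell remains differentiable end to end; a hard selection over the K matrices would be an alternative design at the cost of gradient flow.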
Keywords:
Machine Learning: Neural Networks
Natural Language Processing: Natural Language Processing
Machine Learning: New Problems
Machine Learning: Deep Learning