Adaptive locally connected recurrent unit (ALCRU)
Abstract
Research has shown that adaptive locally connected neurons outperform their fully connected (dense) counterparts, motivating the development of the Adaptive Locally Connected Recurrent Unit (ALCRU) presented in this study. ALCRU modifies the Simple Recurrent Neuron Model (SimpleRNN) by incorporating spatial coordinate spaces for the input and hidden state vectors, enabling each neuron to learn a parametric local receptive field. These modifications add four trainable parameters per neuron, resulting in only a minor increase in computational complexity. ALCRU is implemented using standard frameworks and trained with back-propagation-based optimizers. We evaluate ALCRU on diverse benchmark datasets, including IMDb for sentiment analysis, AdditionRNN for sequence modelling, and the Weather dataset for time-series forecasting. Results show that ALCRU achieves accuracy and loss metrics comparable to those of GRU and LSTM while consistently outperforming SimpleRNN. In particular, experiments with longer sequence lengths on AdditionRNN and larger input dimensions on IMDb highlight ALCRU's superior scalability and efficiency in processing complex data sequences. In terms of computational efficiency, ALCRU demonstrates a considerable speed advantage over gated models such as LSTM and GRU, though it is slower than SimpleRNN. These findings suggest that adaptive local connectivity enhances both the accuracy and efficiency of recurrent neural networks, offering a promising alternative to standard architectures.
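To make the described mechanism concrete, the sketch below shows one way such a unit might be realized in PyTorch. It assumes a fixed 1-D coordinate space in [0, 1] for both the input and hidden vectors, and a Gaussian window per neuron parameterized by a centre and a width over each space, accounting for the four extra trainable parameters per neuron mentioned above. The class name ALCRUCell, the Gaussian window, and all initializations are illustrative assumptions, not the paper's confirmed formulation.

```python
import torch
import torch.nn as nn

class ALCRUCell(nn.Module):
    """Hypothetical sketch of an adaptive locally connected recurrent cell.

    Each neuron learns a parametric local receptive field over a 1-D
    coordinate space for the input vector and another for the hidden
    state: a centre and a (log-)width per field, i.e. four extra
    trainable parameters per neuron. All choices here are assumptions
    made for illustration.
    """

    def __init__(self, input_size: int, hidden_size: int):
        super().__init__()
        # Dense SimpleRNN-style weights, to be masked by the local fields.
        self.W_ih = nn.Parameter(torch.randn(hidden_size, input_size) * 0.1)
        self.W_hh = nn.Parameter(torch.randn(hidden_size, hidden_size) * 0.1)
        self.bias = nn.Parameter(torch.zeros(hidden_size))
        # Fixed coordinate spaces in [0, 1] for input and hidden units.
        self.register_buffer("x_coord", torch.linspace(0, 1, input_size))
        self.register_buffer("h_coord", torch.linspace(0, 1, hidden_size))
        # Four trainable parameters per neuron: centre and log-width of
        # the receptive field over each coordinate space.
        self.mu_x = nn.Parameter(torch.rand(hidden_size))
        self.log_sigma_x = nn.Parameter(torch.zeros(hidden_size))
        self.mu_h = nn.Parameter(torch.rand(hidden_size))
        self.log_sigma_h = nn.Parameter(torch.zeros(hidden_size))

    def _mask(self, mu, log_sigma, coord):
        # Gaussian window of shape (hidden_size, coord_len). It is
        # differentiable, so the field's centre and width are learned
        # by ordinary back-propagation, as the abstract indicates.
        sigma = log_sigma.exp().unsqueeze(1)
        d = mu.unsqueeze(1) - coord.unsqueeze(0)
        return torch.exp(-0.5 * (d / sigma) ** 2)

    def forward(self, x, h):
        # Mask the dense weights so each neuron attends to a local window,
        # then apply the SimpleRNN update h' = tanh(W_ih x + W_hh h + b).
        W_ih = self.W_ih * self._mask(self.mu_x, self.log_sigma_x, self.x_coord)
        W_hh = self.W_hh * self._mask(self.mu_h, self.log_sigma_h, self.h_coord)
        return torch.tanh(x @ W_ih.T + h @ W_hh.T + self.bias)

# Example: one forward step on random data.
cell = ALCRUCell(input_size=32, hidden_size=64)
x = torch.randn(8, 32)   # batch of 8 input vectors
h = torch.zeros(8, 64)   # initial hidden state
h_next = cell(x, h)      # shape: (8, 64)
```

Under these assumptions the per-step cost matches SimpleRNN plus the two element-wise masks, which is consistent with the abstract's claim of only a minor increase in complexity while remaining cheaper than the multiple gate computations of LSTM or GRU.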