Master Thesis – Bidirectional Neural Networks


Computational Analysis of the Bidirectional Activation-based Learning in Autoencoder Task [article IJCNN, pdf]

Master thesis [pdf]

Public git repository with source code [git]


In our work, we used computational simulations to analyse supervised artificial neural networks based on the Generalized Recirculation algorithm (GeneRec) by O’Reilly (1996) and the Bidirectional Activation-based Learning algorithm (BAL) by Farkaš and Rebrová (2013). The main idea of both algorithms is to update weights based on the difference between forward and backward propagation of neuron activations, rather than on error backpropagation (BP) between layers, which is considered biologically implausible. However, both algorithms struggle to learn low-dimensional mappings that BP learns easily. The aim of this work is to fill this gap. Several modifications of BAL are proposed, and after systematic analysis a Two Learning Rates (TLR) version is introduced. TLR uses a different learning rate for each weight matrix. The simulations demonstrate an increase in success rate and show a smooth relation between success and the learning rates. For the networks with the highest success rate, the ratio of the two learning rates can reach 10^6.
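To make the idea concrete, the following is a minimal sketch of a GeneRec-style weight update with two learning rates, one per weight matrix (the TLR idea). It is a simplified, single-step version with sigmoid units and no recurrent settling; the function and parameter names (`generec_tlr_update`, `lr_ih`, `lr_ho`) are illustrative, not taken from the thesis code.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def generec_tlr_update(W_ih, W_ho, x, target, lr_ih, lr_ho):
    """One simplified GeneRec-style update with two learning rates.

    Minus phase: the input is propagated forward through the network.
    Plus phase: the target is clamped on the output layer and its
    activation flows back to the hidden layer through W_ho.
    Weights change in proportion to the plus/minus activation
    difference, with a separate learning rate per weight matrix.
    """
    # Minus phase: forward propagation of the input
    h_minus = sigmoid(x @ W_ih)
    o_minus = sigmoid(h_minus @ W_ho)
    # Plus phase: target clamped on the output, fed back to the hidden layer
    h_plus = sigmoid(x @ W_ih + target @ W_ho.T)
    # Updates driven by phase differences; lr_ih and lr_ho may differ by
    # orders of magnitude, which is the point of TLR
    W_ih += lr_ih * np.outer(x, h_plus - h_minus)
    W_ho += lr_ho * np.outer(h_minus, target - o_minus)
    return W_ih, W_ho

# Toy usage on the classic 4-2-4 autoencoder task
rng = np.random.default_rng(0)
W_ih = rng.normal(0.0, 0.5, (4, 2))
W_ho = rng.normal(0.0, 0.5, (2, 4))
for _ in range(1000):
    for p in np.eye(4):
        W_ih, W_ho = generec_tlr_update(W_ih, W_ho, p, p,
                                        lr_ih=0.5, lr_ho=0.05)
```

Note that full GeneRec lets the network settle into equilibrium in each phase; the one-step approximation above only illustrates how the two learning rates enter the update rule.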

Further, the idea of TLR is applied to GeneRec. Finally, additional experiments on momentum, weight initialization, hidden activations, and a dynamic learning rate are analysed. We believe that the idea of TLR could also improve the performance of other artificial neural network models, including multi-layered networks. Intuitively, a further increase in success rate could be achieved by generalizing the idea of TLR to additional parameters, such as momentum or weight initialization. Further experiments are outlined.

Table of Contents

  • Overview of Neural Nets
    • Motivation
    • Standard notation
    • Existing models
      • Backpropagation
      • Almeida-Pineda Algorithm
      • Recirculation algorithm
      • Boltzmann machines
      • Contrastive Hebbian Learning
      • Deep architectures
      • Other
    • GeneRec
      • Model
      • Attributes and Applications
  • Our Research
    • Our Motivation
    • Considered modifications
      • Regression
      • Ballard
      • Dropout
      • Deep architectures
    • Comparison
  • Implementation
    • Description
    • Code
  • Conclusion

