
Training Neural Network Elements Created From Long Short-Term Memory


Affiliations
Department of Informatics, Novi Sad, Serbia
 

This paper presents the application of stochastic search algorithms to training artificial neural networks. The methodology developed in this work is aimed primarily at training complex recurrent neural networks, since training recurrent networks is known to be more difficult than training feedforward networks. Signal propagation from input to output is realized by simulating the recurrent network, and training is carried out as a stochastic search in the parameter space. The performance of this type of algorithm is superior to that of most training algorithms based on the gradient concept. The efficiency of these algorithms is demonstrated by training networks built from units characterized by long-term and long short-term memory. The presented methodology is effective and relatively simple.
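The details of the paper's stochastic direct search procedure are not given in this abstract. As an illustration only, the following is a minimal sketch of the general idea, training a tiny single-unit recurrent cell by random perturbation of its parameters with accept/reject on the loss, rather than by gradient descent; the perturbation scheme, step-size schedule, and toy task are assumptions, not the authors' algorithm.

```python
import numpy as np

# Sketch of stochastic direct search (SDS) for a tiny recurrent cell.
# Assumed scheme: perturb all parameters with Gaussian noise, keep the
# trial only if it lowers the loss, and slowly shrink the search radius.

rng = np.random.default_rng(0)

def rnn_forward(params, xs):
    """Simulate the recurrent cell over a sequence; return final output."""
    w_in, w_rec, w_out, b = params
    h = 0.0
    for x in xs:
        h = np.tanh(w_in * x + w_rec * h + b)
    return w_out * h

def loss(params, data):
    """Mean squared error over (sequence, target) pairs."""
    return np.mean([(rnn_forward(params, xs) - y) ** 2 for xs, y in data])

# Toy task: reproduce the FIRST element of each sequence, which forces
# the recurrent state to carry information across time steps.
data = [(seq, seq[0]) for seq in rng.uniform(-1, 1, size=(20, 5))]

params = rng.normal(0.0, 0.5, size=4)   # [w_in, w_rec, w_out, b]
best = loss(params, data)
init_loss = best
step = 0.5
for _ in range(2000):
    trial = params + rng.normal(0.0, step, size=4)  # random perturbation
    trial_loss = loss(trial, data)
    if trial_loss < best:                           # accept improvements only
        params, best = trial, trial_loss
    else:
        step = max(step * 0.999, 1e-3)              # shrink search radius

print(f"loss: {init_loss:.4f} -> {best:.4f}")
```

Because the search uses only loss evaluations of the simulated network, it needs no backpropagation through time, which is one reason such direct-search methods are attractive for recurrent architectures.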

Keywords

Artificial Neural Networks (ANN), Feed-Forward Neural Networks (FNN), Recurrent Neural Networks (RNN), Simulation Graph Model (SGM) Networks, Stochastic Direct Search (SDS), Long Term Memory (LTM), Long Short-Term Memory (LSTM) Units.

Authors

Kostantin P. Nikolic
Department of Informatics, Novi Sad, Serbia


DOI: https://doi.org/10.13005/ojcst/10.01.01