

Training Neural Network Elements Created From Long Short-Term Memory
This paper presents the application of stochastic search algorithms to the training of artificial neural networks. The methodology was developed primarily for training complex recurrent neural networks, whose training is known to be more difficult than that of feedforward networks. Simulation of the recurrent network propagates the signal from input to output, while the training process performs a stochastic search in the parameter space. The performance of this type of algorithm is superior to most training algorithms based on the gradient concept. The efficiency of these algorithms is demonstrated by training networks built from units characterized by long-term and long short-term memory. The presented methodology is effective and relatively simple.
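The core idea, a stochastic direct search over network parameters instead of gradient-based training, can be illustrated with a minimal sketch. The example below is an assumption-laden toy, not the paper's actual method: it uses a single tanh recurrent unit, a made-up running-sum task, and a simple accept-if-better random perturbation rule, with a slowly shrinking search radius.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy task (illustrative): predict the sum of a short input sequence.
X = rng.normal(size=(20, 5))   # 20 sequences of length 5
Y = X.sum(axis=1)              # target: sum of each sequence

def rnn_forward(params, seq):
    """Single recurrent unit: h_t = tanh(w_in*x_t + w_rec*h_{t-1} + b)."""
    w_in, w_rec, w_out, b = params
    h = 0.0
    for x in seq:
        h = np.tanh(w_in * x + w_rec * h + b)
    return w_out * h

def loss(params):
    preds = np.array([rnn_forward(params, seq) for seq in X])
    return np.mean((preds - Y) ** 2)

# Stochastic direct search: perturb all parameters at random and
# keep the trial point only if it lowers the loss. No gradients,
# so signal propagation through the recurrent loop is only simulated
# forward, never differentiated.
params = rng.normal(size=4)
best = init = loss(params)
step = 0.5
for it in range(2000):
    trial = params + step * rng.normal(size=4)
    trial_loss = loss(trial)
    if trial_loss < best:
        params, best = trial, trial_loss
    step *= 0.999  # gradually shrink the search radius

print(f"initial MSE: {init:.4f}, final MSE: {best:.4f}")
```

Because only forward simulation is required, the same accept-if-better loop applies unchanged to more complex units such as LSTM cells, which is the setting the paper targets.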
Keywords
Artificial Neural Networks (ANN), Feed-Forward Neural Networks (FNN), Recurrent Neural Networks (RNN), Simulation Graph Model (SGM) Networks, Stochastic Direct Search (SDS), Long-Term Memory (LTM), Long Short-Term Memory (LSTM) Units.