Hybrid Algorithm for Neural Network Training
In supervised learning, the BP (Back-Propagation) algorithm has proven to be the best-established algorithm for neural network training. However, BP learning often takes a long time to converge and may fall into local minima; the local-minima problem can be tackled by varying the learning rate. In this paper, we describe a hybrid learning approach to optimizing the neural network training process. In this approach, the global minimum is found first, and local minima are then located with reference to it: the global minimum corresponds to minimizing the overall error of the network, while the local minima correspond to generalization over all patterns. Simulation results show that the proposed hybrid algorithm outperforms the traditional BP algorithm.
Keywords
Back-Propagation, Local Minima, Global Minima, Generalization.
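The sketch below illustrates the two-stage idea described in the abstract: a coarse global stage that minimizes the overall network error, followed by local refinement with back-propagation and a varying learning rate. The concrete choices here, a tiny 2-4-1 NumPy network on the XOR patterns, a random-restart global search over initial weights, and a geometrically decaying learning rate, are illustrative assumptions, not the authors' exact algorithm.

```python
# A minimal sketch of a hybrid training scheme, assuming: XOR data, a 2-4-1
# sigmoid network, random-restart global search, and a decaying learning rate.
# This is NOT the paper's exact algorithm; it only illustrates the two stages.
import numpy as np

rng = np.random.default_rng(0)

# XOR training patterns (illustrative data set).
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def init_weights(n_in=2, n_hidden=4, n_out=1):
    """Random weights for a small 2-4-1 network."""
    return (rng.normal(size=(n_in, n_hidden)),
            rng.normal(size=(n_hidden, n_out)))

def forward(W1, W2, X):
    h = sigmoid(X @ W1)
    out = sigmoid(h @ W2)
    return h, out

def overall_error(W1, W2):
    """Sum-of-squares error over all patterns (the 'global' objective)."""
    _, out = forward(W1, W2, X)
    return float(np.sum((y - out) ** 2))

# Stage 1: crude global search -- sample many random weight settings and keep
# the one with the lowest overall error, so BP starts away from poor basins.
W1, W2 = min((init_weights() for _ in range(200)),
             key=lambda w: overall_error(*w))

# Stage 2: local refinement with back-propagation; the learning rate decays
# over epochs, one simple way of "varying the learning rate".
lr = 0.5
for epoch in range(5000):
    h, out = forward(W1, W2, X)
    # Gradients of the sum-of-squares error through the sigmoid units.
    d_out = (out - y) * out * (1 - out)
    d_hid = (d_out @ W2.T) * h * (1 - h)
    W2 -= lr * (h.T @ d_out)
    W1 -= lr * (X.T @ d_hid)
    lr *= 0.9995  # gradually shrink the step size

print("final error:", overall_error(W1, W2))
print("predictions:", forward(W1, W2, X)[1].ravel().round(3))
```

Running the script first prints the best error found by the random-restart stage's winner after refinement, then the rounded network outputs for the four XOR patterns; the decay factor and restart count are tunable assumptions.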