Backpropagation Training Algorithm with Adaptive Parameters to Solve Digital Problems
An efficient technique, backpropagation training with adaptive parameters based on Lyapunov stability theory, is proposed for training a single-hidden-layer feedforward network. A three-layered feedforward neural network architecture is used to solve the selected problems, and the network is trained in sequential mode. Lyapunov stability theory is employed to ensure fast, steady error convergence and to construct an energy surface with a single global minimum point through the adaptive adjustment of the weights and the adaptive parameter β. Because the algorithm drives the error toward this single global minimum, entrapment in local minima is avoided. The adaptive learning parameter is chosen so as to accelerate error convergence, and the resulting error converges asymptotically to zero in accordance with Lyapunov stability theory. The performance of the adaptive backpropagation algorithm is evaluated on the parity, half adder and full adder problems.
Keywords
Single Hidden Layer, Lyapunov Stability Theory, Adaptive Learning Parameter.
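The abstract does not spell out the Lyapunov-derived update rule for β, so the following is only a rough illustrative sketch: a single-hidden-layer sigmoid network trained in sequential (pattern-by-pattern) mode on the 2-bit parity (XOR) problem, with a learning rate that is heuristically adapted from the change in sum-squared error. The network size, the grow/shrink rule for the learning rate, and all constants are assumptions for illustration, not the algorithm proposed in the paper.

import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# 2-bit parity (XOR) patterns: inputs and targets.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
T = np.array([[0], [1], [1], [0]], dtype=float)

rng = np.random.default_rng(0)
n_in, n_hid, n_out = 2, 4, 1
W1 = rng.uniform(-0.5, 0.5, (n_in, n_hid))   # input -> hidden weights
b1 = np.zeros(n_hid)
W2 = rng.uniform(-0.5, 0.5, (n_hid, n_out))  # hidden -> output weights
b2 = np.zeros(n_out)

eta = 0.5          # adaptive learning rate (illustrative stand-in for beta)
prev_sse = np.inf

for epoch in range(5000):
    sse = 0.0
    for x, t in zip(X, T):            # sequential training mode
        # forward pass
        h = sigmoid(x @ W1 + b1)
        y = sigmoid(h @ W2 + b2)
        e = t - y
        sse += float(e @ e)

        # backward pass (standard delta rule for sigmoid units)
        delta_out = e * y * (1 - y)
        delta_hid = (delta_out @ W2.T) * h * (1 - h)

        # weight updates
        W2 += eta * np.outer(h, delta_out)
        b2 += eta * delta_out
        W1 += eta * np.outer(x, delta_hid)
        b1 += eta * delta_hid

    # Illustrative adaptive rule: treat the sum-squared error as a
    # Lyapunov-like candidate V; grow eta while V decreases, shrink it
    # when V increases (assumed heuristic, not the paper's update for beta).
    if sse < prev_sse:
        eta = min(eta * 1.05, 2.0)
    else:
        eta = max(eta * 0.7, 1e-3)
    prev_sse = sse
    if sse < 1e-4:
        break

print(f"stopped after {epoch + 1} epochs, SSE = {sse:.6f}")

The same sketch extends to the half adder and full adder problems by replacing X and T with the corresponding truth tables and widening the output layer to two outputs (sum and carry).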