Objectives: This article focuses on improving the convergence rate and reducing the number of operations required to train an adaptive filter with the Least Mean Square (LMS) algorithm. Methods/Statistical Analysis: Two modifications to LMS training are proposed: one based on the initialization of weights and the other on early termination of training for a sequence. Findings: Conventionally, the optimum weights of an adaptive filter are found by initializing the weights to zeros, providing several random sequences as input, and updating the weights according to the error; the weights are updated over the entire sequence even after they have converged. In the proposed algorithm, the weights are initialized to zeros only once, for the first sequence. The optimum weights obtained for one sequence are used as the initial weights for the subsequent sequence, which improves the convergence rate. Further, to reduce the number of operations, the weight-update process for a sequence is terminated once the error falls below a prescribed threshold. Applications/Improvements: Results show that with these modifications, the convergence rate increases and the number of multiplications decreases.
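The two modifications described above can be sketched in a few lines of Python. This is a minimal illustration, not the authors' implementation: the function name `lms_train`, the step size `mu`, the error threshold, and the simulated unknown system are all assumptions chosen for the example. The warm start (reusing converged weights as the next sequence's initial weights) and the threshold-based early exit correspond to the two proposed changes.

```python
import numpy as np

def lms_train(x, d, w, mu=0.05, threshold=1e-6):
    """One LMS training pass over a sequence.

    x: input sequence, d: desired sequence, w: initial weights.
    Stops early (saving multiplications) once |error| < threshold.
    """
    n_taps = len(w)
    for i in range(n_taps - 1, len(x)):
        u = x[i - n_taps + 1 : i + 1][::-1]  # tap-input vector [x[i], x[i-1], ...]
        y = w @ u                            # filter output
        e = d[i] - y                         # estimation error
        w = w + mu * e * u                   # standard LMS weight update
        if abs(e) < threshold:               # early termination on convergence
            break
    return w

# Hypothetical system-identification setup: recover true_w from input/output data.
rng = np.random.default_rng(0)
true_w = np.array([0.5, -0.3, 0.1])          # unknown system to identify
w = np.zeros(3)                              # zeros only for the first sequence
for _ in range(5):                           # several random input sequences
    x = rng.standard_normal(200)
    d = np.convolve(x, true_w)[: len(x)]     # desired signal from unknown system
    w = lms_train(x, d, w)                   # warm start from previous weights
```

With the warm start, each new sequence begins near the optimum found so far, so the error drops below the threshold sooner and later passes terminate after only a few updates.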

Keywords

Convergence Rate, Initialization of Weights, LMS Algorithm, Multiplications, Mean Square Error, Threshold.