
Assessment of Accuracy Enhancement of Back Propagation Algorithm by Training the Model using Deep Learning


Affiliations
Department of Computer Science & Engineering, Jamia Hamdard, New Delhi, India
 

Abstract

Deep learning is a branch of machine learning that has recently been gaining considerable attention due to its efficiency in solving a number of AI problems. The aim of this research is to assess the accuracy enhancement obtained by using deep learning with the back propagation algorithm. For this purpose, two techniques have been used. In the first technique, the simple back propagation algorithm is used and the designed model is tested for accuracy. In the second technique, the model is first trained using deep learning via deep belief nets, so that it learns and improves its parameter values, and back propagation is then applied over it. Both methods take advantage of the softmax function. Both methods have been tested on images of handwritten digits and their accuracy calculated. It has been observed that there is a significant increase in the accuracy of the model when deep learning is applied for training.
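The first technique above (back propagation with a softmax output) can be sketched as follows. This is an illustrative toy example in NumPy, not the authors' implementation: two synthetic 2-D clusters stand in for the handwritten-digit images, and a single softmax layer is trained by gradient descent.

```python
import numpy as np

def softmax(z):
    # Subtract the row-wise max for numerical stability.
    e = np.exp(z - z.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)

def train_softmax(X, y, n_classes, lr=0.5, epochs=200):
    """Train a softmax classifier by back-propagating the
    cross-entropy gradient to the weights (toy sketch)."""
    rng = np.random.default_rng(0)
    W = rng.normal(0.0, 0.01, size=(X.shape[1], n_classes))
    b = np.zeros(n_classes)
    Y = np.eye(n_classes)[y]            # one-hot targets
    for _ in range(epochs):
        P = softmax(X @ W + b)          # forward pass: class probabilities
        grad = (P - Y) / len(X)         # gradient of cross-entropy w.r.t. logits
        W -= lr * (X.T @ grad)          # back-propagate to weights
        b -= lr * grad.sum(axis=0)      # ... and to biases
    return W, b

if __name__ == "__main__":
    # Two well-separated 2-D clusters in place of MNIST digit images.
    rng = np.random.default_rng(1)
    X = np.vstack([rng.normal(0.0, 0.3, (50, 2)),
                   rng.normal(2.0, 0.3, (50, 2))])
    y = np.array([0] * 50 + [1] * 50)
    W, b = train_softmax(X, y, n_classes=2)
    acc = (softmax(X @ W + b).argmax(axis=1) == y).mean()
    print(f"training accuracy: {acc:.2f}")
```

In the paper's second technique, the weights would first be initialized by unsupervised deep belief net pre-training (stacked restricted Boltzmann machines) before this supervised back propagation step, rather than starting from small random values as done here.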


Keywords

Machine Learning, Deep Learning, Deep Belief Nets, Back Propagation, Restricted Boltzmann Machines, Artificial Neural Networks, Softmax Function.



Authors

Baby Kahkeshan
Department of Computer Science & Engineering, Jamia Hamdard, New Delhi, India
Syed Imtiaz Hassan
Department of Computer Science & Engineering, Jamia Hamdard, New Delhi, India







DOI: https://doi.org/10.13005/ojcst/10.02.07