Boosting the Accuracy of Optimisation Chatbot by Random Forest With Halving Grid Search Hyperparameter Tuning
Computer science, engineering, and technology play a vital role in meeting the challenging demands of users. Artificial intelligence, machine learning, and robotic process automation strive to improve the intelligent behaviour of computers. A text chatbot can deliver fast, human-like responses only if it is optimized, and hyperparameter optimization methods are widely used to boost a model's overall performance. In this paper, we build a chatbot using a random forest classifier and optimize its performance with halving grid search hyperparameter tuning. We propose three models: chatbot model 1 without optimization, chatbot model 2 with optimization, and chatbot model 3 with optimization and the best values of key performance indicators. Accuracy, precision, recall, and F1-score are computed before and after optimization, and the three models are compared on each of these measures.
Optimization Chatbot, Artificial Intelligence, Machine Learning, Halving Grid Search Hyperparameter Tuning, Robotic Process Automation.
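The tuning approach the abstract describes can be sketched with scikit-learn, which provides `HalvingGridSearchCV` for successive-halving grid search over a `RandomForestClassifier`. This is an illustrative sketch only: the synthetic dataset, the parameter grid, and the split sizes are assumptions, not the paper's actual chatbot corpus or search space.

```python
# Sketch: untuned random forest (model 1) vs. a halving grid search tuned
# forest (models 2/3). Dataset and grid are illustrative assumptions.
from sklearn.experimental import enable_halving_search_cv  # noqa: F401
from sklearn.model_selection import HalvingGridSearchCV, train_test_split
from sklearn.ensemble import RandomForestClassifier
from sklearn.datasets import make_classification
from sklearn.metrics import accuracy_score, precision_score, recall_score, f1_score

# Stand-in for vectorized chatbot intent data (e.g. TF-IDF features).
X, y = make_classification(n_samples=400, n_features=20, n_informative=8,
                           n_classes=3, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)

# Model 1: random forest with default hyperparameters (no optimization).
base = RandomForestClassifier(random_state=0).fit(X_tr, y_tr)

# Halving grid search: each round keeps the best 1/factor of the candidate
# configurations and re-evaluates them on more training samples.
param_grid = {
    "n_estimators": [50, 100, 200],
    "max_depth": [None, 10, 20],
    "min_samples_split": [2, 5],
}
search = HalvingGridSearchCV(
    RandomForestClassifier(random_state=0),
    param_grid, factor=2, cv=3, scoring="accuracy", random_state=0,
).fit(X_tr, y_tr)

# Compare the key performance indicators before and after optimization.
for name, model in [("untuned", base), ("tuned", search.best_estimator_)]:
    pred = model.predict(X_te)
    print(name,
          "acc=%.3f" % accuracy_score(y_te, pred),
          "prec=%.3f" % precision_score(y_te, pred, average="macro"),
          "rec=%.3f" % recall_score(y_te, pred, average="macro"),
          "f1=%.3f" % f1_score(y_te, pred, average="macro"))
```

Note that `HalvingGridSearchCV` still sits behind the `sklearn.experimental` import flag, so the `enable_halving_search_cv` import must precede it.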