Online Learning and Saliency Effects On CNN-Based Gait Recognizers
Authentication through gait analysis offers a reliable and easy-to-use alternative to common authentication methods. This paper presents a novel gait recognizer that exploits online learning in a convolutional neural network (CNN). The features that make the algorithm promising are its high recognition accuracy and low computational cost, together with its adaptability, flexibility, and applicability. In parallel, the effect of saliency detection as a means of generating global features is examined.
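As a concrete illustration, the sketch below shows what such an online-learning pipeline could look like. It is an assumption for illustration only, not the paper's reported architecture: a small 1D CNN over fixed-length windows of smartphone inertial data, updated one window at a time with stochastic gradient descent. The channel count (6), window length (128), layer sizes, and the `GaitCNN` / `online_step` names are all hypothetical.

```python
# Minimal sketch (assumed, not the paper's architecture): a small 1D CNN over
# smartphone inertial windows, updated online one window at a time so the
# recognizer keeps adapting to the owner's gait.
import torch
import torch.nn as nn

class GaitCNN(nn.Module):
    def __init__(self, in_channels: int = 6, n_classes: int = 2):
        super().__init__()
        # Two convolutional blocks map the raw inertial window to a compact
        # feature vector that a linear layer then classifies.
        self.features = nn.Sequential(
            nn.Conv1d(in_channels, 32, kernel_size=5, padding=2), nn.ReLU(), nn.MaxPool1d(2),
            nn.Conv1d(32, 64, kernel_size=5, padding=2), nn.ReLU(), nn.AdaptiveAvgPool1d(1),
        )
        self.classifier = nn.Linear(64, n_classes)  # owner vs. impostor logits

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, channels, samples), e.g. 3-axis accelerometer + 3-axis gyroscope
        return self.classifier(self.features(x).squeeze(-1))

model = GaitCNN()
optimizer = torch.optim.SGD(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

def online_step(window: torch.Tensor, label: torch.Tensor) -> float:
    """One online update from a single (1, channels, samples) gait window."""
    optimizer.zero_grad()
    loss = loss_fn(model(window), label)
    loss.backward()
    optimizer.step()
    return loss.item()

# Example: a stream of 10 windows, 6 sensor channels x 128 samples each.
for _ in range(10):
    online_step(torch.randn(1, 6, 128), torch.randint(0, 2, (1,)))
```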
In this paper, inertial measurements (instead of visual data) are used for person authentication: the smartphone's inertial sensors continuously assess whether the device is actually in the hands of its rightful owner. Three approaches (saliency detection, offline learning, and online learning) are proposed, implemented, and examined. The last two use convolutional neural networks to map the raw sensor readings into a new feature vector that can be classified more reliably, while the first applies saliency detection algorithms to extract the most salient regions of the gait. Models for these approaches were built, and various experiments were conducted on them. The results are promising and show that gait recognition can provide implicit continuous authentication, especially when online learning is relied upon, where identification accuracy reaches 98.7%.
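The saliency-based approach could be sketched as below, assuming the inertial gait window is first rendered as a 2D grayscale "signal image" (channels x time) before a standard spectral-residual saliency detector is applied; the paper's exact representation and detector choice are not specified here, and the shapes used are illustrative (OpenCV's saliency module requires the opencv-contrib-python package).

```python
# Minimal sketch of a saliency step on gait data (assumptions noted above):
# render the inertial window as a grayscale image, compute a spectral-residual
# saliency map, and keep only the most salient regions as candidate features.
import cv2
import numpy as np

# Hypothetical signal image: 6 sensor channels stacked over a 128-sample window,
# rescaled to 8-bit so OpenCV can treat it like an ordinary grayscale image.
window = np.random.randn(6, 128).astype(np.float32)
signal_image = cv2.normalize(window, None, 0, 255, cv2.NORM_MINMAX).astype(np.uint8)
signal_image = cv2.resize(signal_image, (128, 64))  # upscale rows to get a usable map

detector = cv2.saliency.StaticSaliencySpectralResidual_create()
ok, saliency_map = detector.computeSaliency(signal_image)
if ok:
    # Threshold the map and keep the salient pixels as a crude "global feature" set.
    mask = (saliency_map > saliency_map.mean()).astype(np.uint8)
    salient_values = signal_image[mask == 1]
    print(saliency_map.shape, salient_values.shape)
```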