
Navigation with a Cooperative Social Robot for a Group of Visitors using Face Detection and a ‘Stop and Wait’ Scheme


Affiliations
1 National Institute of Technology Mizoram, Mizoram 796 012, India
2 Atal Bihari Vajpayee Indian Institute of Information Technology and Management Gwalior, Madhya Pradesh 474 015, India
 

This work addresses the use of a robot as a guide that takes visitors on a guided tour around a facility. A past project of the research group proposed a robot guide that enacted a pre-recorded tour; however, it had limited applicability, as the humans rarely followed the guided tour and the robot did not respond to the movement of the visitors. A robotic guide must ensure that the visitors are taken along and not left behind, while typically maintaining comfortable distances from them. The width and height of the human face are computed; these are inversely proportional to the distance of the person from the robot. Further, we construct a method that guides visitors cooperatively. The robot moves sequentially to different locations with the visitors, and if any visitor is found missing, the robot stops and waits for that visitor. When the visitor becomes visible again, the robot resumes the journey. The robot navigates as a guide for a group of visitors, maintaining appropriate distances from the visitors using the distance measurement methodology. The results are demonstrated by making the robot take visitors on a guided tour of the Robotics and Machine Intelligence Laboratory. The robot waits if a visitor leaves the group to take a call or for any other reason, and it also waits if the visitors lag behind. The work demonstrates the ability of a robot to be socially compliant while taking a group of visitors on a guided tour.
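The abstract's two key mechanisms, estimating a visitor's distance from the detected face size and the 'stop and wait' tour scheme, can be sketched as below. This is a minimal illustration under a pinhole-camera assumption; the focal length, face-width constant, comfort threshold, and function names are illustrative assumptions, not the authors' implementation.

```python
# Hypothetical sketch of the distance estimate and 'stop and wait' scheme.
# All constants and names below are illustrative assumptions.

AVG_FACE_WIDTH_M = 0.15  # assumed average human face width in metres

def estimate_distance(face_width_px, focal_length_px=600.0):
    """Distance is inversely proportional to the detected face width
    (pinhole approximation): d = f * W / w."""
    return focal_length_px * AVG_FACE_WIDTH_M / face_width_px

def tour_step(waypoint, detected_faces, group_size, move, wait,
              comfort_dist_m=3.0):
    """Advance toward the next waypoint only while the whole group is
    visible and close enough; otherwise stop and wait.

    detected_faces: list of (width_px, height_px) face bounding boxes.
    move/wait: callbacks that command the robot base.
    Returns True if the robot moved, False if it waited.
    """
    if len(detected_faces) < group_size:
        wait()                       # a visitor is missing: stop and wait
        return False
    distances = [estimate_distance(w) for (w, h) in detected_faces]
    if max(distances) > comfort_dist_m:
        wait()                       # visitors lag behind: stop and wait
        return False
    move(waypoint)                   # everyone present and close enough
    return True
```

A caller would invoke tour_step once per perception cycle for each waypoint of the tour, advancing to the next waypoint only after it returns True.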

Keywords

Behavioural robotics, Convolutional neural network, Group detection, Robotic tour, Service robot.



Authors

Vaibhav Malviya
National Institute of Technology Mizoram, Mizoram 796 012, India
Rahul Kala
Atal Bihari Vajpayee Indian Institute of Information Technology and Management Gwalior, Madhya Pradesh 474 015, India


