
EEG Signal-Based Movement Control for Mobile Robots


Affiliations
1 Department of Computer Science and Engineering and Information Technology, School of Technology, Assam Don Bosco University, Azara - 781 017, India

Although wheelchairs with joystick control are available, people whose hands are paralysed cannot use a joystick and need other forms of assistance to move. This article presents the design and analysis of a prototype mobile robot controlled using a single-electrode commercial electroencephalogram (EEG) headset. We examine the possibility of detecting P300 and blink signals for use as inputs to control the prototype robot. From the captured EEG signals, P300 and non-P300 segments are classified using an artificial neural network. In another experiment, we classify signals captured during intentional eye blinks against signals containing no blink, and we further classify whether the user intentionally blinks two, three or four times. From the experiments, we found that P300 cannot be reliably detected with a single dry electrode at the Fp1 position. However, signals that contain a blink can be distinguished from those that do not using an artificial neural network, and different numbers of blinks can likewise be classified. Different numbers of blinks are used to move forward and to turn left and right, while the model trained to distinguish blink from non-blink signals is used to apply the brake. Experiments show that, using a single-electrode commercial headset and eye blinks, a user can successfully control the prototype to reach a predefined destination.
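The blink-based control scheme described above can be sketched in a few lines. This is a hypothetical illustration, not the authors' implementation: the threshold-based peak counter stands in for their trained neural-network classifier, and the amplitude threshold, refractory gap and command mapping are illustrative assumptions inferred from the abstract (two, three and four blinks for forward, left and right; the blink/non-blink decision for the brake).

```python
import numpy as np

# Assumed command mapping: two/three/four blinks -> forward/left/right.
COMMANDS = {2: "forward", 3: "left", 4: "right"}

def count_blinks(window, threshold=150.0, refractory=50):
    """Count blink artefacts in a 1-D EEG window by simple peak detection.

    A blink appears as a large-amplitude deflection at Fp1; we count
    threshold crossings separated by a refractory gap (in samples).
    This stands in for the article's trained neural-network classifier.
    """
    count, last = 0, -refractory
    for i, v in enumerate(window):
        if v > threshold and i - last >= refractory:
            count += 1
            last = i
    return count

def window_to_command(window):
    """Map a classified blink count to a robot command."""
    n = count_blinks(window)
    if n == 0:
        return "idle"
    return COMMANDS.get(n, "brake")  # unmapped counts fall back to braking

# Synthetic window: three blink-like spikes, well separated.
w = np.zeros(512)
for start in (50, 200, 350):
    w[start:start + 20] = 300.0
print(window_to_command(w))  # three blinks -> "left"
```

In the article's actual pipeline, `count_blinks` would be replaced by the artificial neural network trained to discriminate blink counts, with the blink/non-blink model gating the brake.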

Keywords

Electroencephalography, Machine Learning, Brain–Computer Interface, Neural Networks.



Authors

Yumlembam Rahul
Department of Computer Science and Engineering and Information Technology, School of Technology, Assam Don Bosco University, Azara - 781 017, India
Rupam Kumar Sharma
Department of Computer Science and Engineering and Information Technology, School of Technology, Assam Don Bosco University, Azara - 781 017, India



DOI: https://doi.org/10.18520/cs/v116/i12/1993-2000