Robot Perception Based on Vision and Haptic Feedback for Fighting the COVID-19 Pandemic

Written By

Ahmad Hoirul Basori, Omar M. Barukab, Sharaf Jameel Malebary and Andi Besse Firdausiah Mansur

Submitted: 26 November 2021 Reviewed: 19 April 2022 Published: 18 May 2022

DOI: 10.5772/intechopen.104983

From the Edited Volume

Haptic Technology - Intelligent Approach to Future Man-Machine Interaction

Edited by Ahmad Hoirul Basori, Sharaf J. Malebary and Omar M. Barukab

Abstract

Robot perception can be further enhanced by combining vision and haptic feedback to give the robot a richer impression of its surroundings. This chapter combines vision and haptics for robot navigation while tracking the robot's movement. The pandemic has forced humans to avoid direct contact, so using robots as delivery tools is one possible solution. As the initial experiment shows, the angular deviation is quite low, and the success rate of arriving at the destination is fairly high, at around 76%. Future work can improve the success rate by monitoring the robot's track more closely.

Keywords

  • robot perception
  • vision
  • haptic
  • navigation

1. Introduction

The COVID-19 pandemic has affected human life in general. Many preventive measures are used to stop the virus from spreading, including maintaining social distance and wearing a facemask. To ensure people's safety, robots have been used, for example, to control a COVID-19 patient's bed using an Arduino robot [1]. The robot is designed to reduce contact with patients and thus the chance of infection. The coronavirus spreads via saliva droplets or nasal fluid when a sick person coughs or sneezes [2]. Due to the high rate of infection, mobile robots can be an alternative solution to reduce contact with patients [2]. Support-service robots are one way to maintain public awareness of the virus spread. The uses of mobile robots during the pandemic vary, including delivery services, population awareness, and disinfection facilities [2]. This chapter aims to provide new insight into robot perception through deep-learning-based vision and haptic feedback that can augment the robot's response to its environment.

2. Related works

Artificial intelligence (AI) and robotics are valuable resources for supporting patient treatment and assisting doctors, nurses, and other front-line staff. Intelligent robots can provide good service for a particular task when they are well planned and designed [3, 4]. However, due to the high cost and complexity of the technology, not every country can afford to adopt this approach [2]. Furthermore, other researchers have utilized a nursing robot for patient monitoring and for dispensing medicine according to the medication schedule. Another robot, Lio, has a multi-functional arm with the capability of human-robot interaction and personal care assistance (Figures 1 and 2) [5].

Figure 1.

Robot consultation [2].

Figure 2.

Lio robot as a personal assistant [5].

The Lio robot has visual and audio sensors for receiving commands, and laser and ultrasound sensors for navigation and monitoring its surroundings. It also has mechanical sensors for handling the tasks given to it, and it is capable of autonomous action through automatic navigation and recharging [5]. Other researchers use robots and realistic virtual reality to enhance interaction between humans and machines; the interaction can take the form of a gaming-based system, a brain-computer interface, or a 3D simulation [6, 7, 8, 9, 10, 11, 12, 13, 14]. Virtual navigation using augmented reality or sensors also helps a robot reach the desired destination along the path set up for it [15, 16, 17, 18, 19, 20, 21, 22, 23, 24, 25, 26, 27, 28]. Machine learning algorithms such as reinforcement learning are also used to manage the crowd behavior of pedestrians, which benefits robotic navigation [29, 30, 31, 32, 33]. In addition, the Kinect camera, with its gesture-tracking capability, is beneficial for medical applications and for robotics that help COVID-19 patients, doctors, and nurses [34, 35, 36, 37, 38, 39, 40, 41].

3. Methodology

The robot's visual perception can be enhanced through a special camera, such as the Kinect depth camera, which provides a depth image stream as input. This image is useful for robot navigation because it provides a 3D representation of the objects in front of the robot.
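To make this concrete, the sketch below shows one way a single depth frame could be reduced to a navigation cue. It is a minimal sketch in Python, assuming the Kinect depth frame has already been acquired as a NumPy array of millimeter distances; the three-way region split and the 1200 mm range threshold are illustrative assumptions, not values taken from the experiment.

```python
import numpy as np

def nearest_obstacle_direction(depth_mm: np.ndarray, max_range_mm: int = 1200):
    """Locate the closest object in a Kinect depth frame.

    depth_mm: 2D array of millimeter distances, with 0 marking invalid
    (unmeasured) pixels, as delivered by the Kinect depth stream.
    Returns (region, distance_mm) or None if nothing is within range.
    """
    # Mask out invalid pixels so they never win the minimum search.
    valid = np.where(depth_mm > 0, depth_mm, np.iinfo(np.uint16).max)
    # Split the frame into three vertical bands: left, center, right.
    bands = np.array_split(valid, 3, axis=1)
    distances = [int(band.min()) for band in bands]
    idx = int(np.argmin(distances))
    if distances[idx] > max_range_mm:
        return None  # nothing close enough to matter for navigation
    return ("left", "center", "right")[idx], distances[idx]
```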

4. Materials

There are three main hardware components involved in the experiments:

  • A wheeled robot built on an Arduino Uno R3 board (refer to Figure 3).

  • A Kinect camera attached to the robot body.

  • A laptop for processing.

Figure 3.

System setup.

The Kinect camera streams the scene in front of the robot and sends it to the PC. Once the PC receives the image, it processes the image to decide whether it shows one of the robot's goals or an obstacle. If it is an obstacle, the robot must avoid the object; if it is a goal, the robot must grasp the object using its gripper. The detailed methodology is shown in Figure 4, and a minimal sketch of the loop follows the figure.

Figure 4.

System methodology.
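The sketch below illustrates this loop in Python. Only the overall flow (stream a frame, classify it as goal or obstacle, send a motor command to the Arduino) follows the methodology above; the serial port name, the single-byte command protocol, and the classify/pick_turn callbacks are hypothetical stand-ins for the chapter's actual model and firmware.

```python
import serial  # pyserial: pip install pyserial

# Hypothetical serial link to the Arduino Uno that drives wheels and gripper.
arduino = serial.Serial("/dev/ttyUSB0", 9600, timeout=1)

def control_step(frame, classify, pick_turn):
    """Run one iteration of the perception-action loop of Figure 4.

    classify(frame)  -> 'goal' | 'obstacle' | 'clear'   (placeholder model)
    pick_turn(frame) -> b'L' or b'R'                    (avoidance heuristic)
    """
    label = classify(frame)
    if label == "obstacle":
        arduino.write(pick_turn(frame))  # turn left or right to avoid it
    elif label == "goal":
        arduino.write(b"G")              # close the gripper on the goal
    else:
        arduino.write(b"F")              # path is clear: keep moving forward
```

In this design the laptop keeps all heavy processing, and the Arduino only has to interpret single-byte commands, which matches the division of labor between the three hardware components listed above.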

5. Results and discussion

The initial test of the Kinect camera tracks the user's skeleton. It can follow the user's movements, especially hand gestures, which can be interpreted as control commands. The tracked skeleton is shown as a segmented line superimposed on the human body, as shown in Figure 5. Body parts such as the arms, torso, and head are tracked in real time. These body parts are later used as reference controls to manipulate the robot; a sketch of such a mapping follows Figure 5.

Figure 5.

Skeleton tracking.
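As an illustration of how tracked joints can become command controls, the sketch below maps the right-hand position relative to the shoulder to a simple robot instruction. The normalized joint coordinates and the dead-zone threshold are assumptions; in practice the Kinect SDK (or a library such as OpenNI) supplies the skeleton joint positions.

```python
def gesture_command(hand_xy, shoulder_xy, dead_zone=0.15):
    """Map the tracked right hand to a robot command, relative to the shoulder.

    hand_xy, shoulder_xy: (x, y) joint positions in normalized image
    coordinates from the skeleton tracker; dead_zone suppresses jitter.
    """
    dx = hand_xy[0] - shoulder_xy[0]
    dy = shoulder_xy[1] - hand_xy[1]  # positive when the hand is raised
    if dy > dead_zone:
        return "forward"      # hand raised above the shoulder
    if dx > dead_zone:
        return "turn_right"   # hand extended to the right
    if dx < -dead_zone:
        return "turn_left"    # hand extended to the left
    return "stop"             # hand resting near the shoulder
```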

Figure 6 shows how the depth image stream of the object is streamed to the PC for further processing, while the depth image with a recognized person is shown in Figure 7.

Figure 6.

Depth image stream.

Figure 7.

Person recognition.

The process of person recognition with a pre-trained model is quite interesting: the robot recognizes the person by receiving the information from the main process running on the PC. The haptic device helps the robot avoid obstacles by sending a vibration as an alert signal, so the robot can turn left or right to avoid the obstacle. We performed several experiments with robot movement, as shown in Figure 8; a sketch combining recognition and the haptic alert follows the figure.

Figure 8.

Robot movement tracks.
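The sketch below combines both ideas in one step: a pre-trained pedestrian detector (OpenCV's default HOG people detector, used here as a stand-in for the chapter's PC-side recognition model) and a proximity check on the depth frame that triggers the vibration alert. The 'V' command byte and the 600 mm alert threshold are our assumptions.

```python
import cv2
import numpy as np

# Pre-trained person detector shipped with OpenCV, standing in for the
# recognition model running in the main process on the PC.
hog = cv2.HOGDescriptor()
hog.setSVMDetector(cv2.HOGDescriptor_getDefaultPeopleDetector())

def perceive(color_frame, depth_mm, arduino, alert_mm=600):
    """Detect people and raise a haptic alert when an obstacle is close."""
    rects, _ = hog.detectMultiScale(color_frame, winStride=(8, 8))
    # Closest valid depth reading in front of the robot.
    valid = depth_mm[depth_mm > 0]
    if valid.size and valid.min() < alert_mm:
        arduino.write(b"V")  # hypothetical 'vibrate' command: haptic alert
    return len(rects) > 0, rects
```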

6. Conclusions

Visual and haptic enhancement of robot perception is very important to the success rate of the robot's task. In this chapter, we presented a combination of haptics with vision to enhance robot navigation while performing a delivery task for a user. Robot delivery is one of the essential tools during the COVID-19 pandemic for avoiding direct contact between humans. As the initial experiment in the previous section shows, the angular deviation is quite low, and the success rate of arriving at the destination is fairly high, at around 76%. Future work can improve the success rate by monitoring the robot's track more closely.

Acknowledgments

This work was supported by the Deanship of Scientific Research (DSR), King Abdulaziz University, Jeddah, Saudi Arabia. The authors therefore gratefully acknowledge the DSR's technical and financial support.

References

  1. Hadi HA. Line follower robot Arduino (using robot to control patient bed who was infected with Covid-19 virus). In: 2020 4th International Symposium on Multidisciplinary Studies and Innovative Technologies (ISMSIT). Istanbul, Turkey: IEEE; 2020
  2. Cardona M, Cortez F, Palacios A, Cerros K. Mobile robots application against COVID-19 pandemic. In: 2020 IEEE ANDESCON. Quito, Ecuador: IEEE; 2020
  3. Hussain K, Wang X, Omar Z, Elnour M, Ming Y. Robotics and artificial intelligence applications in manage and control of COVID-19 pandemic. In: 2021 International Conference on Computer, Control and Robotics (ICCCR). Shanghai, China: IEEE; 2021
  4. Lay-Ekuakille A, Chiffi C, Celesti A, Rahman MZU, Singh SP. Infrared monitoring of oxygenation process generated by robotic verticalization in bedridden people. IEEE Sensors Journal. 2021;21(13):14426-14433
  5. Mišeikis J, Caroni P, Duchamp P, Gasser A, Marko R, Mišeikienė N, et al. Lio-A personal robot assistant for human-robot interaction and care applications. IEEE Robotics and Automation Letters. 2020;5(4):5339-5346
  6. Abdullasim N, Basori AH, Salam S, Bade A. Velocity perception: Collision handling technique for agent avoidance behavior. Indonesian Journal of Electrical Engineering. 2013;11:2264-2270
  7. Afif FN, Basori AH. Orientation control for indoor virtual landmarks based on hybrid-based markerless augmented reality. Procedia—Social and Behavioral Sciences. 2013;97:648-655
  8. Afif FN, Basori AH, Saari N. Vision-based tracking technology for augmented reality: A survey. International Journal of Interactive Digital Media. 2013;1(1):46-49
  9. Ahmed MA, Basori AH. The influence of beta signal toward emotion classification for facial expression control through EEG sensors. Procedia—Social and Behavioral Sciences. 2013;97:730-736
  10. AlJahdali HMA, Basori AH. Emotional contagion driven of parent-child's agents in crowd during panic situation. International Journal of Computer Science and Network Security. 2019;19(1):1-7
  11. Basori AH. End-effector wheeled robotic arm gaming prototype for upper limb coordination control in home-based therapy. TELKOMNIKA Telecommunication, Computing, Electronics and Control. 2020;18(4):2080-2086
  12. Basori AH, Aljahdali H, Abdullah OS. Forced-driven wet cloth simulation based on external physical dynamism. International Journal of Advanced Computer Science and Applications. 2017;8:572-581
  13. Basori AH, Aljahdali H. TOU-AR: Touchable interface for interactive interaction in augmented reality environment. Computer Engineering and Applications Journal. 2017;6:45-50
  14. Ahmed MA, Basori AH, Saari N. Brain computer interface control for facial animation. International Journal of Interactive Digital Media. 2013;1(2):58-61
  15. Albaqami NN, Allehaibi KH, Basori AH. Augmenting pilgrim experience and safety with geo-location way finding and mobile augmented reality. International Journal of Computer Science and Network Security. 2018;18(2):23-32
  16. Allehaibi KHS, Basori AH, Albaqami NN. B-COV: Bio-inspired virtual interaction for 3D articulated robotic arm for post-stroke rehabilitation during pandemic of COVID-19. International Journal of Computer Science and Network Security. 2021;21(2):110-119
  17. Arpit J, Abhinav S, Jianwu W, Mangey R. Use of AI, Robotics, and Modern Tools to Fight Covid-19. River Publishers; 2021. pp. ii-xxx
  18. Basori AH. Emotion walking for humanoid avatars using brain signals. International Journal of Advanced Robotic Systems. 2013;10(1):29
  19. Basori AH. HapAR: Handy intelligent multimodal haptic and audio-based mobile AR navigation for the visually impaired. In: Paiva S, editor. Technological Trends in Improved Mobility of the Visually Impaired. Cham: Springer International Publishing; 2020. pp. 319-334
  20. Basori AH, AlJahdali HMA. Telerobotic 3D articulated arm-assisted surgery tools with augmented reality for surgery training. In: Heston TF, editor. Telehealth. IntechOpen; 2018
  21. Basori AH, AlJahdali HMA. Performance driven-biped control for animated human model with motion synthesis data. Journal of Information Systems Engineering and Business Intelligence. 2018;4(2):162-168
  22. Basori AH, Almagrabi A. Towards racing gamification with natural interface for post stroke rehabilitation. Computer Engineering and Applications Journal. 2017;6:21-28
  23. Basori AH, Almagrabi AO. Real time physical force-driven hair animation. International Journal of Computer Science and Network Security. 2018;18(10):71-77
  24. Basori AH, Al-Sharif AMA-G, AL-O AOF, Almagrabi A, Ba-Rukab O. Intelligence context aware mobile navigation using augmented reality technology. Journal of Information Systems Engineering and Business Intelligence. 2018;4:65-72
  25. Basori AH, Abdul Kadir MR, Ali RM, Mohamed F, Kadiman S. Kinect-based gesture recognition in volumetric visualisation of heart from cardiac magnetic resonance (CMR) imaging. In: Ma M, Jain LC, Anderson P, editors. Virtual, Augmented Reality and Serious Games for Healthcare 1. Berlin, Heidelberg: Springer Berlin Heidelberg; 2014. pp. 79-92
  26. Basori AH, Qasim AZ. Extreme expression of sweating in 3D virtual human. Computers in Human Behavior. 2014;35:307-314
  27. Basori AH, Tenriawaru A, Mansur ABF. Intelligent avatar on E-learning using facial expression and haptic. TELKOMNIKA. 2011;9(1):115-124
  28. Basori AH, Malebary SJ. iDriveAR: In-vehicle driver awareness and drowsiness framework based on facial tracking and augmented reality. In: Gupta N, Prakash A, Tripathi R, editors. Internet of Vehicles and its Applications in Autonomous Driving. Cham: Springer International Publishing; 2021. pp. 93-103
  29. Basori AH, Malebary SJ. Deep reinforcement learning for adaptive cyber defense and attacker's pattern identification. In: Shandilya SK, Wagner N, Nagar AK, editors. Advances in Cyber Security Analytics and Decision Systems. Cham: Springer International Publishing; 2020. pp. 15-25
  30. Basori AH, Malebary SJ, Firdausiah Mansur AB, Tenriawaru A, Yusof N, Yunianta A, et al. Intelligent socio-emotional control of pedestrian crowd behaviour inside smart city. Procedia Computer Science. 2021;182:80-88
  31. Malebary SJ, Basori AH, Alkayal ES. Reinforcement learning for pedestrian evacuation simulation and optimization during pandemic and panic situation. Journal of Physics: Conference Series. 2021;1817(1):012008
  32. Mohd Rahim MS, Fata AZA, Basori AH, Rosman AS, Nizar TJ, Yusof FWM. Development of 3D tawaf simulation for hajj training application using virtual environment. In: Visual Informatics: Sustaining Research and Innovations. Berlin, Heidelberg: Springer Berlin Heidelberg; 2011
  33. Tenriawaru A, Mansur ABF, Basori AH, Al-Qurashi Q, Al-Muhaimeed A, Al-Hazmi M. Social awareness and safety assistance of COVID-19 based on DLN face mask detection and AR distancing. International Journal of Artificial Intelligence Research. 2021;5(2):111-122
  34. Manikandan P, Ramesh G, Likith G, Sreekanth D, Prasad GD. Smart nursing robot for COVID-19 patients. In: 2021 International Conference on Advance Computing and Innovative Technologies in Engineering (ICACITE). Greater Noida, India: IEEE; 2021
  35. Murphy R. How robots are helping with COVID-19 and how they can do more in the future. In: 2020 IEEE International Symposium on Safety, Security, and Rescue Robotics (SSRR). Abu Dhabi, United Arab Emirates: IEEE; 2020
  36. Ravuri P, Yenikapati T, et al. Design and simulation of medical assistance robot for combating COVID-19. In: 2021 6th International Conference on Communication and Electronics Systems (ICCES). Coimbatore, India: IEEE; 2021
  37. Riduwan M, Basori AH, Mohamed F. Finger-based gestural interaction for exploration of 3D heart visualization. Procedia—Social and Behavioral Sciences. 2013;97:684-690
  38. Sayed AS, Ammar HH, Shalaby R. Centralized multi-agent mobile robots SLAM and navigation for COVID-19 field hospitals. In: 2020 2nd Novel Intelligent and Leading Emerging Sciences Conference (NILES). Abu Dhabi, United Arab Emirates: IEEE; 2020
  39. Siriwardhana Y, Gür G, Ylianttila M, Liyanage M. The role of 5G for digital healthcare against COVID-19 pandemic: Opportunities and challenges. ICT Express. 2021;7(2):244-252
  40. Yusoff YA, Basori AH, Mohamed F. Interactive hand and arm gesture control for 2D medical image and 3D volumetric medical visualization. Procedia—Social and Behavioral Sciences. 2013;97:723-729
  41. Zedda A, Gusai E, Caruso M, Bertuletti S, Baldazzi G, Spanu S, et al. DoMoMEA: A home-based telerehabilitation system for stroke patients. In: 2020 42nd Annual International Conference of the IEEE Engineering in Medicine & Biology Society (EMBC). New York: IEEE; 2020
