Abstract:
The aim of this paper is to present a novel architecture for hand gesture-based control of mobile robots. The research activity was focused mainly on the design of a new method for hand gesture recognition in unconstrained scenes. Our method addresses some of the most important problems that current HRI (Human-Robot Interaction) systems struggle with: changes in lighting, long distances, and speed. Like any other HRI-specific method, the one we developed operates in real time and in real environments. It is a robust and adaptive method, able to cope with changes in lighting, and it is also capable of recognizing hand gestures at long distances. Another important issue we focused on was the integration of our method into a more complex HRI system, in which a human operator can drive a mobile robot using hand gestures alone. In such systems, communication between human operators and robotic systems should take place in the most natural way possible. Typically, communication is carried out through voice and through hand/head postures and gestures. Our method was designed so that it can recognize hand gestures even when there are certain deviations from the ideal cases.