THE 3rd INTERNATIONAL SCIENTIFIC CONFERENCES OF STUDENTS AND YOUNG RESEARCHERS dedicated to the 99th anniversary of the National Leader of Azerbaijan Heydar Aliyev
TOUCHLESS HUMAN-MACHINE INTERFACE BASED ON INFRARED PROXIMITY SENSING
Ali Asgarov
Baku Higher Oil School, Baku, Azerbaijan
ali.asgarov.std@bhos.edu.az
Supervisor: Ph.D., Associate Professor Ali Parsayan
Keywords: hand gesture recognition, Haar-like features, deep learning, z-axis filtering, 3-D images, depth sensor, infrared (IR) images, convolutional neural network (CNN).
Introduction
Many researchers have attempted to make machines interpret people's intents and information through non-contact means, including voice, facial expressions, body motion, and gestures. Gesture is the most essential component of human language, and its evolution influences the nature and flexibility of human-robot interaction. Touchless HMI can therefore be seen as a viable technology in today's world, with particular promise in areas where sanitation or outdoor operation are important considerations [1].
In this paper, a design exploring techniques for hand and finger movement detection is presented, using the depth-sensing infrared camera embedded in the Xbox Kinect module. First, 3-D images are generated and filtered along the z-axis; then two distinct techniques, Haar-like features and deep learning with a convolutional neural network (CNN), are applied to the images to detect hand movement.
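The z-axis filtering step can be illustrated with a short sketch. This is not the authors' implementation: it simply assumes the Kinect depth frame is available as a NumPy array of per-pixel distances in millimetres, and the 500-900 mm window used here is an illustrative value, not one taken from the paper.

# Minimal sketch of z-axis filtering on a Kinect depth frame (illustrative only).
import numpy as np

def filter_along_z(depth_mm: np.ndarray, z_min: int = 500, z_max: int = 900) -> np.ndarray:
    """Keep only pixels whose depth lies inside [z_min, z_max] millimetres.

    Pixels outside the window (background, sensor noise, out-of-range readings
    reported as 0) are zeroed, leaving a mask of the region closest to the camera,
    which is where the hand is expected to be.
    """
    mask = (depth_mm >= z_min) & (depth_mm <= z_max)
    return np.where(mask, depth_mm, 0).astype(depth_mm.dtype)

# Example: a synthetic 4x4 depth frame; only the 550-800 mm pixels survive.
frame = np.array([[0, 1200, 650, 700],
                  [400, 600, 800, 950],
                  [0, 0, 550, 1500],
                  [300, 620, 0, 640]], dtype=np.uint16)
print(filter_along_z(frame))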
Materials and Method
The detection system consists of three main units: the input unit, the processing unit, and the output unit. The input unit is the Kinect sensor, which generates an RGB image, an infrared image, and depth information. The processing unit detects hands in the infrared images, detects the fingers within the detected hands, and tracks the finger movements. The output unit displays the result of the hand and finger movement detection.
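To make the processing-unit steps concrete, the following sketch pairs a Haar-cascade hand detector operating on the infrared frame with a small CNN gesture classifier. It is an illustrative outline only: "hand_cascade.xml" stands in for some pre-trained hand cascade (OpenCV does not ship one), the 64x64 crop size and layer sizes are assumptions, and TensorFlow/Keras is used merely as one possible CNN framework; none of these details come from the paper itself.

# Illustrative processing-unit sketch: Haar-cascade hand detection + CNN classification.
import cv2
import numpy as np
from tensorflow import keras
from tensorflow.keras import layers

# Hand detection in the IR image with Haar-like features.
hand_cascade = cv2.CascadeClassifier("hand_cascade.xml")  # placeholder cascade file

def detect_hands(ir_frame: np.ndarray):
    """Return bounding boxes (x, y, w, h) of hand candidates in an 8-bit IR frame."""
    return hand_cascade.detectMultiScale(ir_frame, scaleFactor=1.1, minNeighbors=5)

# Gesture classification of each detected hand crop with a small CNN.
def build_cnn(num_gestures: int = 5) -> keras.Model:
    """Toy CNN over 64x64 single-channel hand crops; architecture is illustrative."""
    model = keras.Sequential([
        layers.Input(shape=(64, 64, 1)),
        layers.Conv2D(16, 3, activation="relu"),
        layers.MaxPooling2D(),
        layers.Conv2D(32, 3, activation="relu"),
        layers.MaxPooling2D(),
        layers.Flatten(),
        layers.Dense(64, activation="relu"),
        layers.Dense(num_gestures, activation="softmax"),
    ])
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    return model

def classify_hand(model: keras.Model, ir_frame: np.ndarray, box) -> int:
    """Crop the detected hand, resize it to the CNN input size, and predict a gesture id."""
    x, y, w, h = box
    crop = cv2.resize(ir_frame[y:y + h, x:x + w], (64, 64)).astype("float32") / 255.0
    probs = model.predict(crop[None, ..., None], verbose=0)
    return int(np.argmax(probs))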