Hand Gesture Recognition

 

Introduction

The Hand Gesture Recognition (HGR) engine provides functions for registering hand gestures from grayscale/IR video input and recognizing them.

It receives video images from a camera or from video files saved in storage, and delivers the recognition results to the application.

In addition to recognizing hand gestures, it can read configuration information about the video data from the input and use it when analyzing the video.

 

The HGR engine provided by ThinQ.AI supports the following features:

  • Few-shot gesture control — learns gestures from short videos 1 to 3 seconds long (learning from few-shot samples).
  • Low-power CPU — runs deep-learning-based hand gesture recognition on a low-power CPU (e.g., Raspberry Pi 4).
  • DNN-based pipeline — tracks the palm and hand skeleton through a DNN-based pipeline to classify hand gestures.
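The few-shot behavior described above can be sketched as a nearest-prototype classifier: each registered gesture is stored as the average embedding of its few sample clips, and recognition picks the closest stored gesture. This is only an illustrative sketch; `embed_clip`, `FewShotGestureRecognizer`, and all names below are hypothetical and not the actual ThinQ.AI HGR API.

```python
import math

def embed_clip(clip):
    """Stand-in for the DNN that maps a video clip to a feature vector.
    Here a clip is a list of per-frame feature vectors, and we simply
    average them (the real engine would run its DNN pipeline instead)."""
    dim = len(clip[0])
    return [sum(frame[i] for frame in clip) / len(clip) for i in range(dim)]

class FewShotGestureRecognizer:
    """Hypothetical few-shot recognizer: one prototype per gesture."""

    def __init__(self):
        self.prototypes = {}  # gesture name -> prototype embedding

    def register(self, name, clips):
        """Register a gesture from a few (e.g., 1-3) short sample clips."""
        embeddings = [embed_clip(c) for c in clips]
        dim = len(embeddings[0])
        self.prototypes[name] = [
            sum(e[i] for e in embeddings) / len(embeddings) for i in range(dim)
        ]

    def recognize(self, clip):
        """Return the registered gesture whose prototype is nearest."""
        query = embed_clip(clip)

        def dist(proto):
            return math.sqrt(sum((a - b) ** 2 for a, b in zip(query, proto)))

        return min(self.prototypes, key=lambda n: dist(self.prototypes[n]))

# Usage with toy 2-D "features": register two gestures, then recognize.
hgr = FewShotGestureRecognizer()
hgr.register("wave", [[[1.0, 0.0], [0.9, 0.1]]])
hgr.register("fist", [[[0.0, 1.0], [0.1, 0.9]]])
print(hgr.recognize([[0.95, 0.05]]))  # -> wave
```

Storing a single mean prototype per gesture keeps memory and compute low, which matches the engine's low-power CPU target.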

 

Engine Structure

The HGR engine receives video image data as input, recognizes hand gestures, and outputs the results.
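The flow above (video in, palm and skeleton tracking, gesture label out) can be sketched as three chained stages. All function names and the toy per-frame logic below are hypothetical, illustrating the structure rather than the engine's real implementation.

```python
# Hypothetical three-stage HGR pipeline: palm detection -> hand-skeleton
# tracking -> gesture classification, applied frame by frame.

def detect_palm(frame):
    """Stage 1: return a palm bounding box if a hand is present, else None."""
    return frame.get("palm_box")

def track_skeleton(frame, palm_box):
    """Stage 2: return hand-skeleton keypoints within the palm region."""
    return frame.get("keypoints", [])

def classify_gesture(keypoints):
    """Stage 3: map skeleton keypoints to a gesture label (toy rule)."""
    return "open_hand" if len(keypoints) >= 5 else "unknown"

def run_engine(frames):
    """Run the pipeline over a video, producing one result per frame."""
    results = []
    for frame in frames:
        box = detect_palm(frame)
        if box is None:
            results.append("no_hand")
            continue
        keypoints = track_skeleton(frame, box)
        results.append(classify_gesture(keypoints))
    return results

# Usage: one frame with a full 21-point hand skeleton, one empty frame.
video = [
    {"palm_box": (10, 10, 60, 60), "keypoints": [(0, 0)] * 21},
    {},
]
print(run_engine(video))  # -> ['open_hand', 'no_hand']
```

Running palm detection first lets the later, heavier stages be skipped on frames with no hand, which helps on low-power CPUs.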

[Figure: HGR engine structure (HGR엔진구조.png)]

Examples of Use 

It can be applied to various services with an embedded camera.

  • You can use it for gesture recognition by embedding it in a smart camera, XR device, digital signage, or home appliance.
  • You can increase the share of non-contact diagnosis and treatment by applying it to health care.
  • You can reduce the risk of injury by applying it to manufacturing processes and minimizing direct contact.

[Figure: non-contact medical treatment use case (활용_비대면진료.png)]