A Gesture Recognition and Classification User Interface for RGB-based Videos (Bachelor Thesis, Ongoing)


Joel Gschwind


Gestures are an inseparable part of everyday communication and a crucial component of human-machine interaction. Several capture methods can simplify gesture recognition and detection, for instance color markers, motion sensors, or depth cameras. In many applications, however, gestures must be recognized from footage captured with ordinary cameras. The goal of the Deepmime Project is to develop a system that performs real-time gesture recognition based only on RGB camera feeds.
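As a rough illustration of what real-time recognition over an RGB feed involves, the sketch below buffers consecutive frames into fixed-length windows and passes each window to a classifier. Everything here is hypothetical: the toy brightness-based classifier stands in for a learned model, and a synthetic frame sequence stands in for a live camera; it is not the Deepmime implementation.

```python
import numpy as np

WINDOW = 8  # hypothetical number of consecutive frames per prediction

def classify_window(frames):
    """Toy stand-in for a gesture classifier over a clip of RGB frames.

    A real system would run a trained model here; this placeholder just
    thresholds mean brightness to return one of two dummy labels.
    """
    return "wave" if np.mean(frames) > 127 else "idle"

def recognize_stream(frame_iter, window=WINDOW):
    """Yield one gesture label for each full window of frames."""
    buf = []
    for frame in frame_iter:
        buf.append(frame)
        if len(buf) == window:
            yield classify_window(np.stack(buf))
            buf.clear()

# Synthetic stream: 8 dark frames, then 8 bright frames (64x64 RGB).
dark = [np.zeros((64, 64, 3), dtype=np.uint8)] * 8
bright = [np.full((64, 64, 3), 255, dtype=np.uint8)] * 8
labels = list(recognize_stream(dark + bright))
print(labels)  # ['idle', 'wave']
```

In a live setting the synthetic list would be replaced by a camera capture loop, and latency would depend on how long the classifier takes relative to the frame rate.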

The Deepmime System is envisioned to have three components:

The Bachelor’s Thesis has the following objectives:

Start / End Dates

2019/02/25 - 2019/06/25


Research Topics