Multiple Finger Gesture Recognition Using a Single Finger Ring

Ubiquitous Interaction for the Internet of Things

Competence Center: Next Generation Services

Contact: Prof. Dr. Sahin Albayrak, Dipl.-Inform. Mathias Wilhelm


The 'Internet of Things' (IoT) refers to a wide variety of connected devices with built-in sensors that collect and exchange information. Most of these objects are not equipped with dedicated user interface technology. Interacting with the IoT therefore requires new interaction paradigms based on ubiquitous interaction devices: devices that are seamlessly present throughout the users' daily lives and not bound to any specific thing.

Gestures are used naturally in inter-human communication. They also represent a promising modality for interacting with the IoT, as they can communicate spatial relations referring to the physical side of the IoT. Finger rings equipped with sensors are a promising basis for ubiquitous gesture detection. However, most existing sensor rings can only track the motion of one finger or are severely limited in the gestures they can recognize. Additionally, gesture recognition algorithms, if applied continuously, can easily exceed the capacity of a battery small enough to be integrated into a finger ring. These disadvantages limit the applicability and utility of current sensor rings.

The objective of this project is to explore and evaluate the capabilities and limits of capacitive proximity sensing for enabling multiple-finger gesture recognition with a single finger ring over the whole day. For this purpose we follow a three-fold research agenda by answering the following questions: (1) Which configuration of capacitive proximity sensing and other sensors is best suited to detect gestures, and what are the limits of this technology? (2) How can the produced data streams be mapped to a skeleton hand model to enable finger tracking? (3) How far can power usage be optimized by a prefiltering algorithm? To answer these questions, we develop multiple generations of ring prototypes and optimize them, in combination with the finger tracking and prefiltering algorithms, based on recorded gestures.
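One way to approach question (2), mapping sensor data streams to a skeleton hand model, is a calibrated regression from capacitance readings to joint angles. The sketch below is purely illustrative and not the project's actual method: it assumes a hypothetical ring with 8 capacitive channels, a simplified hand model with one flexion angle per finger, and ground-truth angles recorded during a calibration session (e.g. with a camera-based tracker), and fits a linear least-squares map between the two.

```python
import numpy as np

# Hypothetical setup (not from the project): 8 capacitive proximity
# channels on the ring, 5 flexion angles (one per finger) in a
# simplified skeleton hand model.
N_CHANNELS, N_JOINTS = 8, 5

def fit_linear_map(readings, angles):
    """Least-squares map from sensor readings to joint angles.

    readings: (samples, N_CHANNELS) calibration capacitance values
    angles:   (samples, N_JOINTS) ground-truth flexion angles recorded
              alongside the readings
    Returns a (N_CHANNELS + 1, N_JOINTS) weight matrix incl. bias row.
    """
    X = np.hstack([readings, np.ones((readings.shape[0], 1))])  # bias column
    W, *_ = np.linalg.lstsq(X, angles, rcond=None)
    return W

def predict_angles(W, reading):
    """Map one reading vector to estimated flexion angles."""
    return np.append(reading, 1.0) @ W

# Synthetic self-check: recover a known linear relation from clean data.
rng = np.random.default_rng(0)
true_W = rng.normal(size=(N_CHANNELS + 1, N_JOINTS))
cal_X = rng.normal(size=(200, N_CHANNELS))
cal_Y = np.hstack([cal_X, np.ones((200, 1))]) @ true_W
W = fit_linear_map(cal_X, cal_Y)
```

In practice the mapping from proximity readings to finger pose is nonlinear and person-dependent, so a linear map like this would at best serve as a baseline against more expressive models.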
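Question (3), reducing power draw with a prefiltering algorithm, typically amounts to gating the expensive recognizer behind a cheap activity detector. The following sketch illustrates that general idea under assumed parameters (window size and threshold are illustrative, not measured): a short-term variance check on the raw signal decides whether the full gesture recognition pipeline needs to run at all.

```python
from collections import deque

class MotionGate:
    """Cheap prefilter: wake the full gesture recognizer only when the
    short-term variance of the raw sensor stream exceeds a threshold.
    Window size and threshold are illustrative assumptions.
    """
    def __init__(self, window=16, threshold=0.5):
        self.buf = deque(maxlen=window)
        self.threshold = threshold

    def update(self, sample: float) -> bool:
        """Feed one raw sample; return True if the recognizer should run."""
        self.buf.append(sample)
        if len(self.buf) < self.buf.maxlen:
            return False  # not enough history yet
        mean = sum(self.buf) / len(self.buf)
        var = sum((s - mean) ** 2 for s in self.buf) / len(self.buf)
        return var > self.threshold

# Hand at rest: constant signal, recognizer never wakes.
gate = MotionGate()
idle_hits = sum(gate.update(0.0) for _ in range(32))

# Oscillating signal: recognizer wakes once the window fills.
gate = MotionGate()
moving_hits = sum(gate.update(s) for s in [0.0, 4.0] * 16)
```

The power saving comes from the asymmetry: the gate costs a handful of arithmetic operations per sample, while the recognizer it guards may involve feature extraction and classification that would otherwise run continuously.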

Funded By: