Interfaces – Work Bench

Interaction through sounds, facial expressions and gestures

The human senses of sight, hearing and touch are mirrored in gaze and gesture control, voice processing, and audio and haptic feedback. Interfaces recognise these human abilities, interpret them and respond to them. Thanks to machine learning, this intuitive interaction can now be continuously refined.

Avatars and malleable displays

Visitors can experience this intuitive interaction first-hand, beyond the familiar mouse and keyboard. Avatars, for example, can be controlled purely through gaze and gestures. Data literally becomes tactile on a malleable display, where it can be rearranged at will by pressing and pulling. And with sounds and clapping, visitors can steer a teddy bear through an intergalactic game.

Visitors can also discover for themselves how machines perform the tasks assigned to them. Sensory gloves and VR headsets, which serve as interfaces, are on display.
