
Achieving a Flexible Interaction System


A fully flexible interaction system

Today, most interaction systems are built around buttons, sticks and levers. These inputs are usually mapped one-to-one to processes on the machine being controlled, and are therefore designed to match that machine's exact input needs. To cover every required function, some control systems and machines end up with hundreds of buttons and sticks. The result is a complex control system that demands a substantial time investment to master. One-to-one button mappings clearly constrain both the flexibility and the simplicity of interaction systems.

Using sensor-based Smart Gloves with Gesture Recognition software, Ntention can capture snapshots of hand poses and record dynamic gestures and movements. Through Machine Learning and Neural Networks, these gestures are analyzed and recognized, and can then be used in interaction systems to trigger a wide range of functionality. Simple gestures can be pre-defined, recorded and personalized.
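As a rough illustration of how pre-defined, recorded gestures might be matched against new hand-pose snapshots, here is a minimal Python sketch. It uses a nearest-centroid matcher as a simple stand-in for the neural-network recognizer described above; the sensor layout (five flex sensors plus one orientation value), the gesture names and the threshold are illustrative assumptions, not Ntention's actual implementation.

```python
import numpy as np

class GestureRecognizer:
    """Toy recognizer: stores recorded hand-pose snapshots as templates
    and matches new snapshots by nearest centroid (Euclidean distance).
    A real system would use a trained neural network instead."""

    def __init__(self, threshold=0.5):
        self.templates = {}          # gesture name -> centroid vector
        self.threshold = threshold   # reject poses too far from every template

    def record(self, name, samples):
        """Personalize a gesture from several recorded snapshots.
        Each sample is a vector of normalized sensor readings
        (here assumed: five flex sensors plus an orientation value)."""
        self.templates[name] = np.mean(np.asarray(samples, dtype=float), axis=0)

    def recognize(self, snapshot):
        """Return the best-matching gesture name, or None if nothing is close."""
        snapshot = np.asarray(snapshot, dtype=float)
        best_name, best_dist = None, float("inf")
        for name, centroid in self.templates.items():
            dist = np.linalg.norm(snapshot - centroid)
            if dist < best_dist:
                best_name, best_dist = name, dist
        return best_name if best_dist <= self.threshold else None


# Example: record two personalized gestures, then classify a new snapshot.
rec = GestureRecognizer(threshold=0.6)
rec.record("fist", [[0.90, 0.95, 0.90, 0.92, 0.88, 0.00],
                    [0.92, 0.90, 0.93, 0.90, 0.90, 0.05]])
rec.record("open_hand", [[0.05, 0.10, 0.08, 0.06, 0.10, 0.00],
                         [0.10, 0.05, 0.07, 0.10, 0.08, 0.02]])
print(rec.recognize([0.91, 0.92, 0.90, 0.90, 0.89, 0.01]))  # -> "fist"
```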

While this is a solid foundation for intuitive interaction systems, it is still not fully flexible: it relies on a pre-defined set of input methods and a pre-developed set of functions tied to each input. Ntention seeks to solve this problem by creating intelligent software that combines multiple input methods with contextual information, understanding the intention behind each input event and performing the correct output. This solution is a Fully Flexible Interaction System.
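The key idea here is that the same input can mean different things in different contexts. The sketch below is a hypothetical, rule-based illustration of that idea: the `Context` fields, gesture names and actions are invented for the example, and a real intent-resolution system would use far richer context and learned models rather than hand-written rules.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Context:
    """Hypothetical contextual information attached to each input event."""
    active_tool: str       # e.g. "drone", "robot_arm", "cad_viewer"
    object_in_focus: bool  # whether the user is pointing at something

class IntentResolver:
    """Resolves (input event, context) pairs into output actions.
    The point of this sketch is that the same gesture maps to different
    outputs depending on the context it arrives in."""

    def resolve(self, gesture: str, ctx: Context) -> str:
        if gesture == "pinch" and ctx.active_tool == "cad_viewer":
            return "zoom"
        if gesture == "pinch" and ctx.active_tool == "robot_arm" and ctx.object_in_focus:
            return "grip_object"
        if gesture == "open_hand" and ctx.active_tool == "drone":
            return "hover"
        return "no_op"  # unknown intent: do nothing rather than guess

resolver = IntentResolver()
print(resolver.resolve("pinch", Context("cad_viewer", False)))  # -> "zoom"
print(resolver.resolve("pinch", Context("robot_arm", True)))    # -> "grip_object"
```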


Contact person

Håvard Pedersen Brandal
Lead Software Engineer