Control a flying drone with natural hand gestures
The Drone Glove is a sensor-based glove that enables the user to control a flying drone with natural and intuitive hand gestures.
More broadly, the glove enables the user to interact with software and machines through intuitive motions. It features a set of sensors that capture the movements of the hand, along with a custom-designed microchip. The sensors are placed on the fingers and the back of the hand, and the hand's movements are translated into the corresponding control commands. The glove can communicate over Bluetooth, Bluetooth Low Energy, and RF signals.
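As a rough illustration of how hand movements can be translated into commands, the sketch below maps normalized flex-sensor readings and hand tilt to discrete control commands. The sensor names, thresholds, and command labels are assumptions for illustration, not Ntention's actual firmware logic.

```python
# Hypothetical sketch: translating raw glove readings into discrete
# control commands. Thresholds and labels are illustrative only.

def classify_gesture(finger_flex, hand_tilt_deg):
    """Map normalized finger flex values (0 = open, 1 = closed) and
    forward hand tilt (degrees) to a simple command string."""
    avg_flex = sum(finger_flex) / len(finger_flex)
    if avg_flex > 0.8:
        return "FIST"          # e.g. hold position
    if hand_tilt_deg > 20:
        return "TILT_FORWARD"  # e.g. move forward
    if hand_tilt_deg < -20:
        return "TILT_BACK"     # e.g. move backward
    return "NEUTRAL"

print(classify_gesture([0.9, 0.85, 0.95, 0.9], 0))  # → FIST
```

In a real system the classification would run continuously on streamed sensor data, but the idea of mapping a sensor state to a named command is the same.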
Ntention has developed a flexible foundation that allows us to tailor our technological solution to different hobby or industry use cases. For each case, we develop a set of movements that correspond to specific commands, emphasizing an intuitive control scheme and a unique user experience. Long term, we aim to use machine learning and AI to recognize the user's intention behind each movement and provide a fully intuitive experience; for now, we use a pre-defined set of movements.
As a proof of concept for our interaction technology, we developed a prototype that allowed the user to control drones in the air. The glove registers the yaw, pitch, and roll of the hand, and it also registers how open or closed the hand is. Advanced sensors on the back of the hand register movements over the x-, y-, and z-axes, which are then communicated to the drone to control its rotation and movement. These sensors are Inertial Measurement Units (IMUs), each combining an accelerometer, a gyroscope, and a magnetometer. How open the hand is controls the throttle, which in turn governs speed, lift-off, and landing.
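The control mapping described above can be sketched roughly as follows: pitch and roll estimated from the IMU's accelerometer (a common input to orientation filters), and hand openness mapped linearly to throttle. The function names and the 0–100 throttle scale are assumptions for illustration, not the glove's actual firmware.

```python
import math

# Minimal sketch of the described control mapping; names and
# scales are illustrative assumptions.

def orientation_from_accel(ax, ay, az):
    """Estimate pitch and roll (degrees) from a 3-axis accelerometer
    reading while the hand is roughly static."""
    pitch = math.degrees(math.atan2(-ax, math.hypot(ay, az)))
    roll = math.degrees(math.atan2(ay, az))
    return pitch, roll

def throttle_from_openness(openness):
    """Map hand openness (0 = closed fist, 1 = fully open) to a
    0-100 throttle command."""
    return max(0.0, min(1.0, openness)) * 100.0
```

A production system would fuse the gyroscope and magnetometer as well (e.g. with a complementary or Kalman filter) to get a stable yaw estimate, which the accelerometer alone cannot provide.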
Ntention developed applications for Android and iOS that enable the Drone Glove to control DJI and Parrot drones. The glove connects to the applications through Bluetooth Low Energy, and the apps communicate with the connected drones through Wi-Fi signals. Our apps feature glove calibration and options to tailor the drone control methods (e.g. signal smoothing). The applications also enable the user to take photos or record video through the drone camera, if available.
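One plausible form of the "signal smoothing" option mentioned above is an exponential moving average applied to each control axis before the value is sent to the drone; the class below is a hedged sketch of that idea, with the `alpha` parameter and structure chosen for illustration.

```python
# Illustrative smoothing filter for one control axis; not the
# apps' actual implementation.

class AxisSmoother:
    def __init__(self, alpha=0.2):
        self.alpha = alpha   # lower alpha = heavier smoothing, more lag
        self.value = None    # no sample seen yet

    def update(self, sample):
        """Blend the new sample into the running average and return it."""
        if self.value is None:
            self.value = sample
        else:
            self.value = self.alpha * sample + (1 - self.alpha) * self.value
        return self.value
```

The trade-off exposed to the user is responsiveness versus jitter: a twitchy hand produces a steadier drone with heavier smoothing, at the cost of slightly delayed reactions.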
We made sure the Drone Glove would be as easy to use as possible, following a "Plug & Play" philosophy. To use the Drone Glove with a drone, the user needs to: put on the glove, pair it with the application, calibrate it in-app so the sensors are fitted to the hand size, and connect the app to the drone. Then, the user can lift off and control the drone in the air.
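The in-app calibration step can be sketched as recording each sensor's raw range while the user opens and closes their hand, then normalizing live readings into a 0–1 range regardless of hand size. The class and field names below are illustrative assumptions, not the apps' actual code.

```python
# Hedged sketch of per-finger calibration: learn each sensor's
# raw min/max, then normalize live readings to 0-1.

class FlexCalibration:
    def __init__(self):
        self.raw_min = {}
        self.raw_max = {}

    def record(self, finger, raw):
        """Widen the known raw range for this finger's sensor."""
        self.raw_min[finger] = min(self.raw_min.get(finger, raw), raw)
        self.raw_max[finger] = max(self.raw_max.get(finger, raw), raw)

    def normalize(self, finger, raw):
        """Map a raw reading into 0 (fully open) .. 1 (fully closed)."""
        lo, hi = self.raw_min[finger], self.raw_max[finger]
        if hi == lo:
            return 0.0  # no range observed yet
        return max(0.0, min(1.0, (raw - lo) / (hi - lo)))
```

This is why the calibration gesture matters: without observing the user's full open and closed positions, the normalized values would not span the whole control range.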
After developing the proof of concept, we moved on to other use cases. We began with a simple 3D visualization: an on-screen hand model that mirrors the user's hand movements. Ntention then continued development by building a small portfolio of proof-of-concept projects. With the help and expertise of BLJ Engineering, we built a 7-joint robot arm and developed a way to steer it and perform simple tasks with the Drone Glove.
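Driving a 3D hand model from glove orientation boils down to building a rotation matrix from the measured yaw, pitch, and roll and applying it to the model's vertices; the sketch below shows a minimal version of that step, not the actual visualization code.

```python
import math

# Illustrative sketch: rotate a 3D model vertex by the glove's
# yaw/pitch/roll (Z-Y-X convention, angles in radians).

def rotation_matrix(yaw, pitch, roll):
    cy, sy = math.cos(yaw), math.sin(yaw)
    cp, sp = math.cos(pitch), math.sin(pitch)
    cr, sr = math.cos(roll), math.sin(roll)
    return [
        [cy * cp, cy * sp * sr - sy * cr, cy * sp * cr + sy * sr],
        [sy * cp, sy * sp * sr + cy * cr, sy * sp * cr - cy * sr],
        [-sp,     cp * sr,                cp * cr],
    ]

def rotate(vertex, m):
    """Apply a 3x3 rotation matrix to an (x, y, z) vertex."""
    x, y, z = vertex
    return tuple(m[i][0] * x + m[i][1] * y + m[i][2] * z for i in range(3))
```

Applying this per frame to every vertex of the hand model (or, in practice, setting the model's transform in the rendering engine) makes the on-screen hand follow the glove's orientation.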
In parallel, we developed our app for the Epson Moverio AR glasses, enabling users to control drones with the glove while wearing them. Users receive the video feed from the drone directly in the AR glasses while controlling the drone with the Drone Glove. Going forward, we aim to develop pilot projects for industry use cases and integrate the Drone Glove into additional AR/VR systems, with a focus on creating the most intuitive interaction systems possible.