
XR Interface for drone path planning

Ntention recently completed an exciting project in collaboration with IFE Halden and MIL, supported by RFF Viken. The project focused on designing and prototyping a virtual reality (VR) user interface tailored to address multiple identified use cases. By working closely with end users and customers, the team aimed to validate the concept and lay the foundation for a versatile product that could be further developed and marketed across various industries.
Date
September 2022 – November 2024
Type of Project
Research and Development

Background

The market for robots and task automation is growing fast. Key driving factors in industry are reduced operating costs and removing personnel from dangerous situations. While the ideal solution in many cases would be to fully automate work processes, the reality is that the technology is not yet ready, on either the robotics side or the automation side.

As the demand for mobile robotics and drones increases across industries and applications, so does the need for more effective teleoperation systems. Traditional interfaces, like control panels or joysticks, may limit operator efficiency and increase cognitive load. They may also restrict operation to trained individuals, hampering the rate of adoption. VR interfaces present an opportunity to address these challenges by providing immersive environments that support improved decision-making and task performance.

After several years of developing intuitive user interfaces for interaction with drones, robotics, and large 3D building models, Ntention identified several industries where the lack of easy-to-use interfaces appeared to be a roadblock to the adoption of robotics. Common to these industries were use cases with dynamic environments where full automation was not an option. Such scenarios included space exploration, search-and-rescue missions, and inspection and maintenance work across several different industries.

Control room and drone controller screen

Goal

Assess user needs and preferences

Assess specific user needs and preferences to inform the types of interactions the system should support, including data visualization, data source identification, control actions, and manual/automatic control options.

Integrate VR and robotics

Integrate VR and the robotic system so that it is possible to communicate with robotics through a VR application, and update and implement a natural user interface that can perform the interaction scenario defined in the project.

Test and validate

Test and validate whether the natural user interface is simpler and more efficient, and provides better situational awareness, than current commercial solutions.

Picture of a small drone hovering over a hand

Concept

The concept we developed centered on creating a series of waypoints that, when combined, formed a path for the drone to follow. The key principles behind the design were simplicity and direct interaction with objects, reducing reliance on complex menus.
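To make this concrete, such a path can be modelled as little more than an ordered list of waypoints. The Python sketch below is a minimal illustration of that idea; the class and method names are our own assumptions and do not reflect the project's actual code.

```python
from dataclasses import dataclass, field

@dataclass
class Waypoint:
    """A single point on the planned path, in world coordinates (metres)."""
    x: float
    y: float
    z: float

@dataclass
class FlightPath:
    """An ordered sequence of waypoints; together they form the drone's path."""
    waypoints: list = field(default_factory=list)

    def add(self, wp: Waypoint) -> None:
        self.waypoints.append(wp)

    def remove(self, index: int) -> None:
        del self.waypoints[index]

# Example: a simple three-waypoint path
path = FlightPath()
path.add(Waypoint(0.0, 0.0, 2.0))   # take-off hover point
path.add(Waypoint(5.0, 0.0, 2.5))
path.add(Waypoint(5.0, 5.0, 3.0))
```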

Illustration of a VR path planning concept

Object-Based Interaction

We implemented an object-based interaction system to minimize ambiguity and make the experience seamless. This approach allowed users to interact directly with objects in the environment:

  • To interact with the drone, users could click directly on it.
  • To interact with a waypoint, users simply clicked on the waypoint itself to move, modify, or delete it.

This design eliminated the need for a traditional menu system, where users often struggle with mapping between menu options and objects in the environment.
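One common way to realise this pattern is to let every selectable object carry its own selection handler, so a controller click is routed straight to whatever the hand ray hits. The sketch below assumes that approach; the class names and actions are hypothetical, and a real implementation would obtain the hit object from the VR runtime's raycast.

```python
from dataclasses import dataclass
from typing import Optional

class Interactable:
    """Anything in the scene that responds directly to being clicked."""
    def on_select(self) -> None:
        raise NotImplementedError

@dataclass
class Drone(Interactable):
    name: str
    def on_select(self) -> None:
        print(f"{self.name}: show drone actions (take off, fly path, ...)")

@dataclass
class WaypointMarker(Interactable):
    index: int
    def on_select(self) -> None:
        print(f"waypoint {self.index}: move / modify / delete")

def handle_click(hit: Optional[Interactable]) -> None:
    """Route a click straight to the object that was hit.
    No menu lookup: the object itself decides what selection means."""
    if hit is not None:
        hit.on_select()

# In the real system, `hit` would come from a raycast along the hand;
# here we simulate two clicks:
handle_click(Drone("drone-1"))
handle_click(WaypointMarker(2))
```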

Hand Menu for Extended Interaction

To complement object-based interaction, we introduced a hand menu that allowed users to:

  • Create waypoints from a distance.
  • Access key settings quickly and conveniently.

The hand menu ensured that users could interact efficiently even when far away from the drone or a specific waypoint. Users could bring it up effortlessly by simply looking at their own palm, a natural and intuitive way to open the menu.
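A typical way to implement this look-at-palm trigger is to test whether the palm's normal vector points toward the headset, for example with a simple dot-product threshold. The sketch below assumes that approach; the threshold value is our own choice for illustration.

```python
import numpy as np

def palm_menu_visible(palm_normal, palm_pos, head_pos, threshold=0.7):
    """Return True when the user's palm faces their head.

    palm_normal: unit vector pointing out of the palm
    palm_pos, head_pos: world-space positions of palm and headset
    threshold: cosine of the largest accepted angle (~45 degrees here)
    """
    to_head = np.asarray(head_pos, float) - np.asarray(palm_pos, float)
    to_head /= np.linalg.norm(to_head)
    return float(np.dot(palm_normal, to_head)) >= threshold

# Palm roughly facing the headset -> the hand menu appears
print(palm_menu_visible(palm_normal=np.array([0.0, 0.0, 1.0]),
                        palm_pos=np.array([0.0, 1.2, 0.0]),
                        head_pos=np.array([0.0, 1.6, 0.5])))   # True
```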

Navigation: The World-in-Miniature (WIM) Mental Model

For movement within the virtual 3D environment, we adopted a World-in-Miniature (WIM) mental model. In this approach, users perceive the digital environment as a miniature world that can be moved around them, rather than navigating their physical presence through the world. This differs from conventional navigation systems in digital environments.

Dragging for Movement

We enabled users to "drag themselves" around the environment using their hands:

  • Users could use one hand or both hands to grab and drag anywhere in the space to move toward their desired location.
  • This interaction was intuitive and allowed precise control over navigation; a minimal sketch of the idea follows below.
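Under the WIM model, the drag gesture does not move the user at all; it shifts the scene root by the hand's displacement each frame. The following sketch captures this idea, assuming a single grabbing hand (a two-handed grab would typically add rotation and scaling, omitted here for brevity).

```python
import numpy as np

class WorldDrag:
    """Grab-and-drag navigation: the scene moves with the grabbing hand,
    so the user stays physically in place while the world is pulled past."""

    def __init__(self):
        self.world_offset = np.zeros(3)   # translation applied to the scene root
        self._last_hand_pos = None

    def grab(self, hand_pos):
        """Pinch/grab gesture started."""
        self._last_hand_pos = np.asarray(hand_pos, float).copy()

    def drag(self, hand_pos):
        """Called every frame while the grab is held."""
        if self._last_hand_pos is None:
            return
        hand_pos = np.asarray(hand_pos, float)
        self.world_offset += hand_pos - self._last_hand_pos
        self._last_hand_pos = hand_pos.copy()

    def release(self):
        self._last_hand_pos = None

# Dragging the hand 0.3 m to the left pulls the whole scene 0.3 m left:
nav = WorldDrag()
nav.grab([0.2, 1.0, 0.4])
nav.drag([-0.1, 1.0, 0.4])
print(nav.world_offset)   # [-0.3  0.   0. ]
```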

Use Case for the Energy Sector

In the energy sector, we tested the concept with key stakeholders to explore its potential applications. The VR user interface showed significant promise for control centers managing the remote operation of drones and robotics. It could serve as a valuable add-on interface to enhance remote control facilities, providing experts with intuitive tools to interact seamlessly with robotic systems. This approach underscores the importance of creating interfaces that empower professionals to maximize the efficiency and precision of robotics in critical operations.

Result

We successfully integrated the VR interface with robotics using the Robot Operating System (ROS), creating a functional demo that allows users to plan a drone's path in VR, similar to desktop autopilot software. The video showcases this capability, demonstrating how users can navigate a virtual 3D environment—represented here by a 3D map—and set a flight plan for the drone. The software's modular design ensures that the drone simulator featured in the demo can easily be replaced with a real drone, highlighting the system's adaptability for practical applications.
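For readers curious what the ROS side of such an integration can look like, the sketch below publishes a VR-planned path as a standard nav_msgs/Path message using rospy. The node name, topic, and frame are assumptions for illustration; the actual topics depend on the autopilot stack in use, and running the script requires a ROS environment.

```python
#!/usr/bin/env python
"""Minimal sketch: hand a VR-planned path to ROS as a nav_msgs/Path.
Node, topic, and frame names are illustrative assumptions."""
import rospy
from nav_msgs.msg import Path
from geometry_msgs.msg import PoseStamped

def publish_path(waypoints):
    """waypoints: list of (x, y, z) tuples produced in the VR interface."""
    rospy.init_node("vr_path_planner")
    # latch=True so a late-starting subscriber still receives the plan
    pub = rospy.Publisher("/drone/planned_path", Path, queue_size=1, latch=True)

    path = Path()
    path.header.frame_id = "map"
    path.header.stamp = rospy.Time.now()
    for x, y, z in waypoints:
        pose = PoseStamped()
        pose.header = path.header
        pose.pose.position.x = x
        pose.pose.position.y = y
        pose.pose.position.z = z
        pose.pose.orientation.w = 1.0   # identity orientation
        path.poses.append(pose)

    pub.publish(path)
    rospy.loginfo("published %d waypoints", len(path.poses))
    rospy.spin()  # keep the node alive so the latched message stays available

if __name__ == "__main__":
    publish_path([(0.0, 0.0, 2.0), (5.0, 0.0, 2.5), (5.0, 5.0, 3.0)])
```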