Innovative System Boosts Drone-Viewpoint Mixed Reality Apps

A research group at Osaka University has developed an innovative positioning system that correctly aligns the coordinates of the real and virtual worlds without the need to define flight routes in advance. This is achieved by integrating two vision-based self-localization methods: visual positioning systems (VPS) and natural feature-based tracking. The development paves the way for versatile mixed reality (MR) built on commercially available drones. Drone-based MR is expected to see use in a variety of applications, such as urban landscape simulation and support for maintenance and inspection work, contributing to the further development of drone applications, especially in architecture, engineering, and construction (AEC).

In recent years, interest has grown in integrating drones across diverse sectors, particularly AEC, where their advantages in time, accuracy, safety, and cost have driven adoption. Combining drones with MR stands out as a promising avenue because it is not restricted by the user's range of motion and is effective for landscape simulations of large-scale spaces such as cities and buildings. Previous studies proposed methods to integrate MR with commercial drones using versatile technologies such as screen sharing and streaming delivery; however, these methods required predefined drone flight routes to keep the movements of the real and virtual worlds in sync, reducing the versatility of the applications and limiting the use cases of MR.
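The core idea of fusing a global fix with incremental tracking can be illustrated with a minimal sketch. This is not the paper's implementation; the pose values, frame names, and the simple matrix composition below are illustrative assumptions: a VPS-style query anchors the drone in world coordinates once, and subsequent relative motion from natural-feature tracking is composed onto that anchor so the virtual camera follows the real drone.

```python
import numpy as np

def pose_matrix(R, t):
    """Build a 4x4 homogeneous transform from a 3x3 rotation and a translation."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

# Hypothetical global fix (e.g. from a VPS query): drone pose in the world frame.
T_world_drone = pose_matrix(np.eye(3), np.array([10.0, 5.0, 30.0]))

# Hypothetical incremental motion from natural-feature tracking:
# the drone moved 2 m forward along its own z-axis since the fix.
T_delta = pose_matrix(np.eye(3), np.array([0.0, 0.0, 2.0]))

# Compose the two so the virtual camera stays aligned with the real drone.
T_world_now = T_world_drone @ T_delta
print(T_world_now[:3, 3])  # updated drone position in world coordinates
```

In practice the global fix would be refreshed periodically to bound the drift that pure feature tracking accumulates, which is the motivation for combining the two methods rather than using either alone.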

While this research does not implement a drone-based MR application for actual use, the proposed alignment system is highly versatile and has the potential for various additional functionalities in the future. This brings us one step closer to realizing drone-centric MR applications that can be utilized throughout the entire lifecycle of architectural projects, from the initial stages of design and planning to later stages such as maintenance and inspection.

First author Airi Kinoshita mentions, "The integration of drones and MR has the potential to solve various social issues, such as those in urban planning and infrastructure development and maintenance, disaster response and humanitarian aid, cultural protection and tourism, and environmental conservation by freeing MR users from the constraints of experiencing only their immediate vicinity, enabling MR expression from a freer perspective."


Fig. 1

Overview of the proposed method. While previous methods operate drones according to predefined flight routes, the proposed method integrates VPS and natural feature-based tracking to estimate the drone's position, thereby matching real-world and virtual-world motion.

Credit: ©2023 Airi Kinoshita et al., Drone Systems and Applications


Fig. 2

A comparison of positioning accuracy between the system proposed in this study and the system of the previous study. Relative to the previous study's results (right column), the proposed system achieves higher positioning accuracy.

Credit: ©2023 Airi Kinoshita et al., Drone Systems and Applications


Fig. 3

To quantitatively evaluate the results in Figure 2, alignment accuracy is measured with a metric called Intersection over Union (IoU) and presented as a graph.
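IoU measures how well the virtual overlay coincides with its real-world target: the overlapping area divided by the combined area, giving 1.0 for perfect alignment and 0.0 for no overlap. The paper's exact computation is not reproduced here; the axis-aligned-box version below is a generic sketch of the metric.

```python
def iou(box_a, box_b):
    """Intersection over Union of two axis-aligned boxes given as (x1, y1, x2, y2)."""
    # Intersection rectangle (empty if the boxes do not overlap).
    ix1, iy1 = max(box_a[0], box_b[0]), max(box_a[1], box_b[1])
    ix2, iy2 = min(box_a[2], box_b[2]), min(box_a[3], box_b[3])
    inter = max(0, ix2 - ix1) * max(0, iy2 - iy1)

    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    union = area_a + area_b - inter
    return inter / union if union else 0.0

print(iou((0, 0, 2, 2), (1, 1, 3, 3)))  # 1 / 7, about 0.143
```

A higher IoU in Figure 3 thus corresponds directly to the virtual content sitting more precisely on top of the real scene.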

Credit: ©2023 Airi Kinoshita et al., Drone Systems and Applications

The article, "Drone-Based Mixed Reality: Enhancing Visualization for Large-Scale Outdoor Simulations with Dynamic Viewpoint Adaptation Using Vision-Based Pose Estimation Methods," was published in Drone Systems and Applications at DOI: https://doi.org/10.1139/dsa-2023-0135
