Smart Mobility Digital Twin Boosts Hybrid, Remote Driving

Key Points

  • Real-time replication of real-world traffic conditions in cyberspace using the Smart Mobility Digital Twin
  • Successful implementation of a hybrid autonomous and remote driving system using the Smart Mobility Digital Twin
  • The hybrid driving system enhances traffic safety and efficiency simultaneously

Summary

Research groups led by Prof. Kei Sakaguchi from the School of Engineering at Tokyo Institute of Technology and Prof. Walid Saad from Virginia Tech have jointly realized a Smart Mobility Digital Twin[1] that replicates real-world traffic conditions in cyberspace in real time. Using this digital twin, they successfully demonstrated a hybrid autonomous driving system that combines self-driving and remote operation.

While digital twin technology, which replicates physical objects and systems in cyberspace, has seen rapid growth in fields like manufacturing and construction, it had not been applied to the dynamic mobility sector until now.

In this research, the Smart Mobility Education & Research Field at Tokyo Tech's Ookayama Campus was utilized to build a smart mobility digital twin. Furthermore, a demonstration system for hybrid autonomous driving, combining self-driving and remote control, was developed using this digital twin. In the demonstration, the digital twin was able to identify safer and more efficient routes for autonomous vehicles in real-time and relay this information back to the vehicles. This confirmed that hybrid autonomous driving, integrating both local autonomy and remote guidance, is feasible.

This research enables the fusion of local path planning based on the vehicle's own sensors with global path planning based on the digital twin's broader view of the environment, achieved through V2X[2] communication. This fusion improves both traffic safety and efficiency simultaneously.

The results of this research were published in IEEE Transactions on Intelligent Vehicles (vol. 9, no. 3, March 2024).

Background

Digital twins, which reproduce objects and systems from physical space in cyberspace, have developed rapidly in secondary industries such as manufacturing and construction. More recently, they have been applied to tertiary industries such as healthcare, education, and e-commerce, and are now extending to primary industries such as agriculture and fisheries. The advantages of digital twins include not only visualization in cyberspace using computer vision technology, but also real-time monitoring through sensors and IoT technology, prediction using simulation and AI, and optimal control and anomaly avoidance based on those predictions. The difficulty of constructing a digital twin varies with the dynamics of the object or system being replicated: in manufacturing and construction, where dynamics are low, implementation is relatively easy, whereas in mobility, where dynamics are high, realizing a digital twin has been challenging.

Against this backdrop, Tokyo Institute of Technology and Virginia Tech have been working since 2022 on a joint research project commissioned by Japan's National Institute of Information and Communications Technology (NICT) and the U.S. National Science Foundation (NSF). This project, titled "Research and Development of Wireless Edge Computing Service Platforms for IoFDT (Internet of Federated Digital Twin) to Realize Society 5.0," aims to construct a Smart Mobility Digital Twin and has successfully implemented the world's first hybrid autonomous and remote driving using this digital twin.

Research Achievements

1. Smart Mobility Digital Twin

Tokyo Institute of Technology, in collaboration with members of the Super Smart Society Promotion Consortium, has been constructing the Smart Mobility Education & Research Field at its Ookayama Campus since 2019. The field is equipped with two vehicles capable of Level 4/5 autonomous driving and four roadside units (RSUs)[3] intended for next-generation ITS (Intelligent Transportation Systems). The RSUs carry sensors such as LiDAR[4] and cameras, V2X (vehicle-to-everything) communication supporting the 760 MHz, 5.7 GHz, and 60 GHz bands, multi-access edge computing (MEC), and backhaul networks to the cloud, enabling infrastructure-coordinated safe-driving support. The Smart Mobility Digital Twin reproduces this physical mobility field in cyberspace in real time, allowing real-time collision prediction and route planning on the digital twin and thereby enabling safe-driving support.
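As a rough illustration of this infrastructure, the sketch below models an RSU's capabilities as the article describes them. The RoadsideUnit class and its field names are purely illustrative assumptions, not the project's actual configuration schema.

```python
# Illustrative sketch of the field configuration described above. The class
# and field names are assumptions for exposition, not the project's schema.
from dataclasses import dataclass

@dataclass
class RoadsideUnit:
    name: str
    sensors: list[str]            # e.g. LiDAR and cameras
    v2x_bands_ghz: list[float]    # 760 MHz, 5.7 GHz, and 60 GHz per the article
    has_edge_server: bool = True  # MEC for on-site detection
    cloud_backhaul: bool = True   # uplink feeding the wide-area twin

# Four RSUs, as installed in the Ookayama field.
rsus = [
    RoadsideUnit(f"RSU-{i}", ["LiDAR", "camera"], [0.76, 5.7, 60.0])
    for i in range(1, 5)
]
print(len(rsus), "roadside units configured")
```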

The system configuration of the Smart Mobility Digital Twin is shown in Fig. 1. It consists of autonomous vehicles and RSUs in the physical space; edge and cloud servers; a virtualization platform orchestrating the entire network; the ROS (Robot Operating System) and Autoware software packages for autonomous driving, operating in cyberspace; static information such as the Ookayama point-cloud map and 3D models; 3D visualization software such as Unity; and dynamic smart mobility applications operating on this infrastructure.

Fig. 1. System Architecture of Smart Mobility Digital Twin

Fig. 2. Ookayama Smart Mobility Digital Twin

Edge servers in the autonomous vehicles and RSUs use sensors such as LiDAR and cameras to detect surrounding traffic participants, including vehicles, bicycles, and pedestrians, and construct localized digital twins. Information detected by multiple vehicles and RSUs is aggregated in the cloud and superimposed on the point-cloud/3D maps to construct a wide-area digital twin of the entire field. This hierarchical structure of local and wide-area digital twins (with any number of layers) makes it possible to accommodate smart mobility use cases with differing requirements, such as collision avoidance and delivery optimization.
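To make the hierarchy concrete, here is a minimal, hypothetical sketch of how detections from multiple local twins could be fused into one wide-area twin, keeping the freshest observation of each object. All names (Detection, LocalTwin, GlobalTwin) are assumptions for exposition, not the system's actual code.

```python
# Minimal sketch (not the project's actual code) of fusing detections from
# multiple local digital twins into one wide-area twin.
from dataclasses import dataclass
import time

@dataclass
class Detection:
    obj_id: str   # e.g. "pedestrian-17"
    kind: str     # "vehicle" | "bicycle" | "pedestrian"
    x: float      # position in a shared map frame (meters)
    y: float
    stamp: float  # detection time (UNIX seconds)

class LocalTwin:
    """Per-vehicle or per-RSU twin holding only nearby detections."""
    def __init__(self, source: str):
        self.source = source
        self.detections: list[Detection] = []

class GlobalTwin:
    """Cloud-side wide-area twin that fuses all local twins."""
    def __init__(self, max_age_s: float = 0.5):
        self.max_age_s = max_age_s          # drop stale observations
        self.objects: dict[str, Detection] = {}

    def merge(self, twin: LocalTwin) -> None:
        now = time.time()
        for det in twin.detections:
            if now - det.stamp > self.max_age_s:
                continue                    # too old to trust
            known = self.objects.get(det.obj_id)
            # Keep the freshest observation when two sources see one object.
            if known is None or det.stamp > known.stamp:
                self.objects[det.obj_id] = det

# Usage: two RSUs observe an overlapping area; the cloud keeps one
# consistent picture containing the freshest sample per object.
rsu1, rsu2 = LocalTwin("RSU-1"), LocalTwin("RSU-2")
t = time.time()
rsu1.detections.append(Detection("ped-1", "pedestrian", 3.0, 4.0, t - 0.1))
rsu2.detections.append(Detection("ped-1", "pedestrian", 3.1, 4.1, t))
cloud = GlobalTwin()
cloud.merge(rsu1)
cloud.merge(rsu2)
print(cloud.objects["ped-1"])  # the newer RSU-2 observation wins
```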

Fig. 2 shows an example of the Ookayama Smart Mobility Digital Twin. The bottom part displays photos of vehicles and RSUs in the physical space, while the top part shows real-time information on vehicles (blue) and pedestrians (pink) superimposed on a 3D map in cyberspace. The middle part shows detection results superimposed on the point cloud, along with the detection ranges of LiDAR and other sensors; detection results from multiple RSUs can be seen fused together. Despite delays of approximately 10 ms for the local digital twins and 100 ms for the global digital twin, physical space and cyberspace remain almost synchronized in real time.
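As a toy illustration of what such delays mean in practice, the sketch below extrapolates a tracked object forward by the pipeline delay under a constant-velocity assumption. Per-object velocity estimates are our assumption here; the article reports only the delays themselves.

```python
# Minimal sketch of compensating for the reported twin latency (~10 ms local,
# ~100 ms global) by extrapolating each tracked object forward. Assumes the
# twin keeps a velocity estimate per object, which the article does not state.
def compensate(x: float, y: float, vx: float, vy: float,
               delay_s: float) -> tuple[float, float]:
    """Project a position observed delay_s ago to 'now' (constant velocity)."""
    return x + vx * delay_s, y + vy * delay_s

# A pedestrian at (3.0, 4.0) walking 1.4 m/s in +y, seen through the global
# twin's ~100 ms pipeline, is actually about 0.14 m further along.
print(compensate(3.0, 4.0, 0.0, 1.4, delay_s=0.100))  # -> (3.0, 4.14)
```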

2. Hybrid Autonomous Driving

Hybrid autonomous driving integrates path planning based on the local environment observed by the autonomous vehicle itself with path planning based on the global environment observed by the digital twin, delivered through V2X communication. This enables simultaneous improvements in traffic safety and efficiency. Fig. 3 shows the demonstration system: a digital twin of the autonomous vehicle is constructed in cyberspace, path planning is performed on the global digital twin, the optimized path is sent back to the vehicle in physical space, and the vehicle drives autonomously using the selected path and its own sensors. This is the first time in the world that such a hybrid autonomous driving system has been practically implemented.

While an autonomous vehicle's view, like a human driver's, is limited to its immediate surroundings, the global digital twin observes road conditions in real time from a bird's-eye view, allowing safer and more efficient routes to be selected. In the demonstration experiment, the global digital twin in cyberspace detected a parked vehicle and many pedestrians on the vehicle's planned route, selected a safer and more efficient route on the surrounding roads, and fed this change back to the physical autonomous vehicle, confirming that hybrid autonomous driving can be realized. For details, please refer to Video 1.
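The decision logic of such a hybrid planner can be caricatured in a few lines: follow the route computed on the global digital twin while it is fresh, and fall back to the locally planned route when V2X advice is stale or missing. This is a hedged sketch under assumed names, not the paper's implementation.

```python
# Illustrative sketch (assumed names, not the paper's implementation) of the
# hybrid decision: prefer a fresh globally planned route delivered over V2X,
# otherwise fall back to the route planned from the vehicle's own sensors.
from dataclasses import dataclass
from typing import Optional
import time

Route = list  # a list of (x, y) waypoints in the shared map frame

@dataclass
class GlobalAdvice:
    route: Route   # route planned on the wide-area digital twin
    stamp: float   # when the twin computed it (UNIX seconds)

def choose_route(local_route: Route,
                 advice: Optional[GlobalAdvice],
                 max_age_s: float = 1.0) -> Route:
    """Trust the twin's bird's-eye route while it is fresh; degrade
    gracefully to onboard sensing when V2X advice is late or lost."""
    if advice is not None and time.time() - advice.stamp <= max_age_s:
        return advice.route   # e.g. the twin saw a parked car ahead
    return local_route

# Usage: the twin reroutes around a blocked street; the vehicle adopts it.
local = [(0.0, 0.0), (0.0, 50.0)]  # straight ahead
detour = GlobalAdvice([(0.0, 0.0), (20.0, 0.0), (20.0, 50.0)], time.time())
print(choose_route(local, detour))  # -> the detour waypoints
```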

Fig. 3. Hybrid Driving System

Video 1 Hybrid Driving Enabled by Smart Mobility Digital Twin

Social Impact and Future Work

The Smart Mobility Digital Twin and the hybrid autonomous and remote driving system developed in this study not only enhance the safety and efficiency of autonomous driving, but also contribute significantly to creating safe and secure living environments and to optimizing the overall transportation system.

Acknowledgement

This research was partly supported by the NICT-JUNO "Advanced Research & Development of Communications and Broadcasting (#22404)."

Terms

[1] Digital Twin: A virtual model that precisely replicates objects or systems from physical space in cyberspace in real time, enabling prediction and optimization.

[2] V2X: Technology that allows vehicles to communicate with other vehicles, traffic infrastructure, pedestrians, etc.

[3] Roadside Unit (RSU): Device installed along roads as part of traffic infrastructure to recognize the traffic environment and exchange information with vehicles and pedestrians, aiming to improve traffic safety and efficiency.

[4] LiDAR: Sensor that uses lasers to measure distances and shapes of objects. It is used in autonomous vehicles to understand the surrounding environment in 3D.

Reference

Authors: Kui Wang1, Zongdian Li1, Kazuma Nonomura1, Tao Yu1, Kei Sakaguchi1*, Omar Hashash2, and Walid Saad2,3
Title: Smart Mobility Digital Twin Based Automated Vehicle Navigation System: A Proof of Concept
Journal: IEEE Transactions on Intelligent Vehicles
DOI:
Affiliations:
1Department of Electrical and Electronic Engineering, School of Engineering, Tokyo Institute of Technology, Japan

2Bradley Department of Electrical and Computer Engineering, Virginia Tech, USA

3Artificial Intelligence & Cyber Systems Research Center, Lebanese American University, Lebanon
