Fast, Comfortable Robot-Human Handover for Mobile Systems

Beijing Institute of Technology Press Co., Ltd

A research paper by scientists at The Chinese University of Hong Kong proposed a method that enables a mobile robot to hand over objects to humans efficiently and safely by combining mobile navigation with visual perception.

The new research paper, published on Aug. 13 in the journal Cyborg and Bionic Systems, introduced a comprehensive handover framework tailored for mobile robots, designed to seamlessly manage the entire handover process, from locating the object to grasping it and delivering it to the human recipient.

Human-robot interaction has become a cornerstone of contemporary society, with applications permeating diverse sectors including manufacturing, healthcare, and personal assistance. Nevertheless, achieving a handover process that matches the efficiency and fluidity of exchanges between humans remains a formidable challenge for the robotics community. "The impetus for a robot-to-human handover is derived from the human's need to acquire an object for a specific task. The object in question may be situated within the robot's immediate operational area, such as an operating table, or it may be located some distance away," explained study author Tin Lun Lam, a professor at The Chinese University of Hong Kong. Consequently, the robot must execute a series of actions: navigate effectively and safely to the object's location, secure the object, and then return to deliver it to the human collaborator.

The handover navigation process is divided into two sequential stages: the initial detection and acquisition of the object, followed by the robot's traversal to the human recipient. This sequence encompasses four core components: localization, exploration, object grasping, and path planning. For the final delivery, model-based human body and hand reconstruction techniques allow the robot to perceive the recipient. "Our robotic system can map its environment in real time and locate objects to pick up. It uses advanced algorithms to grasp objects in a way that suits human preference and employs path planning and obstacle avoidance to navigate back to the human user. The robot adjusts its movements during handover by analyzing the human's posture and movements through visual sensors, ensuring a smooth and collision-free handover. Tests of our system show that it can successfully hand over various objects to humans and adapt to changes in the human's hand position, highlighting improvements in safety and versatility for robotic handovers," said study author Chongxi Meng.
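
To make that four-component pipeline concrete, the sketch below outlines how such a handover sequence might be organized in code. The interface and all names in it (HandoverRobot, explore_until_found, estimate_hand_pose, and so on) are hypothetical placeholders standing in for whatever perception, grasping, and navigation stack a real system provides; this is an illustrative sketch, not the authors' implementation.

```python
"""Illustrative sketch of a mobile robot-to-human handover pipeline.
Every name below is a hypothetical placeholder, not the authors' API."""

from typing import Protocol, Tuple

Pose = Tuple[float, float, float]  # simplified x, y, yaw (a real system would use a full 6-DoF pose)


class HandoverRobot(Protocol):
    def explore_until_found(self, object_name: str) -> Pose: ...
    def navigate_to(self, pose: Pose) -> None: ...
    def select_grasp(self, object_pose: Pose) -> Pose: ...
    def execute_grasp(self, grasp: Pose) -> None: ...
    def locate_human(self) -> Pose: ...
    def estimate_hand_pose(self) -> Pose: ...
    def move_gripper_towards(self, hand_pose: Pose) -> None: ...
    def object_released(self) -> bool: ...


def run_handover(robot: HandoverRobot, object_name: str) -> None:
    """Step through the four core components named in the article:
    localization, exploration, object grasping, and path planning/delivery."""
    # 1. Localization & exploration: map the scene until the target object is detected.
    object_pose = robot.explore_until_found(object_name)

    # 2. Path planning: drive to the object while avoiding obstacles.
    robot.navigate_to(object_pose)

    # 3. Grasping: choose a grasp that leaves a comfortable region of the
    #    object free for the human's hand, then execute it.
    grasp = robot.select_grasp(object_pose)
    robot.execute_grasp(grasp)

    # 4. Delivery: return to the person, then track their hand with the
    #    visual sensors and keep refining the handover pose until release.
    robot.navigate_to(robot.locate_human())
    while not robot.object_released():
        hand_pose = robot.estimate_hand_pose()  # updated from camera input each cycle
        robot.move_gripper_towards(hand_pose)
```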

A prominent feature of the framework is its vision-based system, which recognizes and interprets a wide range of human hand postures through a specialized detection and reconstruction approach, allowing accurate estimation of hand poses in various configurations. Identifying the optimal grasp type is crucial: it ensures that the robot selects both a hand posture that is safe for the human recipient and a grasp configuration that reliably secures the object. To demonstrate the versatility and effectiveness of the system across different scenarios, the researchers tested their robot-to-human handover algorithm on both single-arm and dual-arm robots. The results, which showcase the algorithm's adaptability and performance in varied settings, are presented in a video included in the paper's Supplementary Materials and further substantiate the robustness and general applicability of the algorithms constituting the handover system.
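
As an illustration of how grasp-type selection could be conditioned on the recipient's estimated hand pose, the minimal sketch below filters candidate grasps by the clearance they leave around the predicted hand contact point and then picks the most reliable remaining one. The candidate list, clearance threshold, and scoring are assumptions made for illustration, not the method described in the paper.

```python
"""Hypothetical sketch of grasp-type selection conditioned on the recipient's
estimated hand pose. Candidates, threshold, and scoring are illustrative only."""

from dataclasses import dataclass
from typing import List, Tuple
import math


@dataclass
class GraspCandidate:
    name: str                          # e.g. "handle", "body", "rim"
    robot_contact: Tuple[float, ...]   # 3-D point the gripper would occupy on the object
    success_score: float               # 0..1, how reliably the robot can execute this grasp


def choose_grasp(candidates: List[GraspCandidate],
                 predicted_hand_point: Tuple[float, ...],
                 min_clearance: float = 0.08) -> GraspCandidate:
    """Keep only grasps that leave at least `min_clearance` metres between the
    gripper and where the human hand is expected to take hold, then pick the
    grasp the robot is most likely to execute successfully."""
    safe = [c for c in candidates
            if math.dist(c.robot_contact, predicted_hand_point) >= min_clearance]
    if not safe:
        raise RuntimeError("No grasp leaves enough room for the human's hand")
    return max(safe, key=lambda c: c.success_score)


# Example: hand over a mug by its body so the handle stays free for the person.
mug_grasps = [
    GraspCandidate("handle", (0.10, 0.00, 0.05), success_score=0.9),
    GraspCandidate("body",   (0.00, 0.00, 0.05), success_score=0.8),
]
hand_point = (0.11, 0.01, 0.05)                     # from the reconstructed hand pose
print(choose_grasp(mug_grasps, hand_point).name)    # -> "body"
```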

Authors of the paper are Chongxi Meng, Tianwei Zhang, Da Zhao, and Tin Lun Lam.

This work was supported by the National Natural Science Foundation of China (grant nos. 62306185 and 62073274), the Guangdong Basic and Applied Basic Research Foundation (grant no. 2023B1515020089), and the Shenzhen Science and Technology Program (grant no. JSGGKQTD20221101115656029).

The paper, "Fast and Comfortable Robot-to-Human Handover for Mobile Cooperation Robot System" was published in the journal Cyborg and Bionic Systems on Aug 13, 2024, at DOI: 10.34133/cbsystems.0120.
