New Tactile Sensors Boost Biometric Tech

Abstract

Decoupling dynamic touch signals in optical tactile sensors is highly desired for behavioral tactile applications yet challenging, because typical optical sensors mostly measure only static normal force and rely on imprecise multi-image averaging for dynamic force sensing. Here, we report a highly sensitive upconversion-nanocrystal-based behavioral biometric optical tactile sensor that instantaneously and quantitatively decomposes dynamic touch signals into individual components of vertical normal and lateral shear force from a single image in real time. By mimicking the sensory architecture of human skin, the sensor produces a unique luminescence signal that is axisymmetric for static normal forces and non-axisymmetric for dynamic shear forces. Our sensor screens small objects and recognizes fingerprints for authentication with high spatio-temporal resolution. Using a dynamic force discrimination machine learning framework, we realized a Braille-to-Speech translation system and a next-generation dynamic biometric recognition system for handwriting.

Optical tactile sensors are gaining significant attention as next-generation biometric recognition technologies. Capable of analyzing dynamic forces from a single image, these sensors transcend the limitations of existing optical systems, creating potential applications in diverse fields, such as handwriting emotion analysis, surface characterization, and anti-counterfeiting measures.

A collaborative research team, comprising Professor Jiseok Lee, Professor Hyunhyub Ko, and Professor Donghyuk Kim from the School of Energy and Chemical Engineering at UNIST, alongside Professor Jungwook Kim from Seoul National University, has developed an optical tactile sensor that analyzes dynamic touch signals in real time.

Previous sensors were limited to measuring either static or dynamic forces; however, this research team has pioneered a technology that can separate and analyze these forces simultaneously. Notably, this advancement opens new possibilities for visually representing variations in handwriting speed and pressure, as well as for individual identification through machine learning analysis.

At the core of this innovative technology are upconversion nanoparticles, which facilitate high-resolution measurements of dynamic forces and accurately detect external stimuli by absorbing near-infrared light.

To enhance data analysis, the research team incorporated machine learning techniques to process the sensor data more precisely. Their algorithm separated vertical pressure from frictional shear forces within dynamic touch signals and accurately identified the direction of those forces. The validity of the force transmission path and of signal changes within the sensor was further confirmed through finite element analysis.
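The published details of the team's pipeline are not reproduced here, but the core idea of reading both forces from one frame can be illustrated with a toy sketch. The snippet below (an assumption-laden stand-in, not the authors' code) simulates a luminescence spot as a Gaussian, then uses total intensity as a normal-force proxy and the centroid's offset from the image center as a shear proxy: an axisymmetric spot yields near-zero shear, while a displaced spot yields a lateral shear vector.

```python
import numpy as np

def make_spot(shift=(0.0, 0.0), amplitude=1.0, size=64, sigma=8.0):
    """Synthetic luminescence spot: an axisymmetric Gaussian, optionally
    displaced to mimic the asymmetry a shear force would introduce.
    (Illustrative model only, not the sensor's actual optics.)"""
    y, x = np.mgrid[0:size, 0:size]
    cx = (size - 1) / 2 + shift[0]
    cy = (size - 1) / 2 + shift[1]
    return amplitude * np.exp(-((x - cx) ** 2 + (y - cy) ** 2) / (2 * sigma ** 2))

def decompose(img):
    """Decompose a single frame into a normal-force proxy (total
    intensity) and a shear proxy (intensity centroid's offset from
    the image center)."""
    total = img.sum()
    y, x = np.mgrid[0:img.shape[0], 0:img.shape[1]]
    cx = (x * img).sum() / total
    cy = (y * img).sum() / total
    center_x = (img.shape[1] - 1) / 2
    center_y = (img.shape[0] - 1) / 2
    return total, np.array([cx - center_x, cy - center_y])

# Static press: symmetric spot, near-zero shear component.
n0, s0 = decompose(make_spot())
# Dynamic touch: displaced spot reads as a lateral shear vector.
n1, s1 = decompose(make_spot(shift=(5.0, 0.0)))
print(np.linalg.norm(s0) < 0.5, s1[0] > 3.0)
```

In the real system, features of this kind would feed a learned classifier rather than a fixed rule; the sketch only shows why a single asymmetric image carries enough information to separate the two force components.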

The sensor's design mimics the sensory structure of human skin, which amplifies force detection. It distinguishes vertical pressure from frictional shear forces concurrently in a single optical image, detects minute forces as low as 0.05 N generated by gentle pressure on an object, and boasts an impressive response time of 9.12 milliseconds.

The developed sensor has potential applications not only for handwriting analysis but also for fingerprint recognition and braille interpretation. In practice, the research team has implemented a system that converts braille to voice, demonstrating the sensor's utility in dynamic biometric systems and anti-counterfeiting scenarios.
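The braille-to-voice demonstration ultimately reduces to mapping detected dot patterns to characters before handing the text to a speech engine. A hypothetical minimal sketch of that lookup step (the dot numbering follows standard six-dot braille; the dictionary, function names, and truncated alphabet are illustrative, not the team's implementation):

```python
# Each braille cell is modeled as the set of raised dots (numbered 1-6,
# left column 1-3, right column 4-6) that the sensor would register as
# localized force peaks. Only the first few letters are shown.
BRAILLE = {
    frozenset({1}): "a",
    frozenset({1, 2}): "b",
    frozenset({1, 4}): "c",
    frozenset({1, 4, 5}): "d",
    frozenset({1, 5}): "e",
}

def cells_to_text(cells):
    """Map a sequence of detected dot patterns to characters; unknown
    patterns become '?' rather than raising an error."""
    return "".join(BRAILLE.get(frozenset(c), "?") for c in cells)

print(cells_to_text([{1, 2}, {1}, {1, 4}]))  # → "bac"
```

The resulting string could then be passed to any text-to-speech backend, which is the "voice" half of the demonstrated system.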

Professor Lee noted, "This is the first study to simultaneously visualize static pressure and dynamic friction by mimicking the sensory structures of human skin, enabling real-time analysis through the separation of these two forces via machine learning."

First author Changil Son emphasized, "This simple sensor structure holds promise for future applications in dynamic pressure quantification, particularly in high-sensitivity handwriting detection." Co-author Chaeyong Ryu expressed optimism, stating, "These advancements will contribute to the development of AI learning-based sensors applicable in robotics."

The findings of this research have been published in Nature Communications on September 12, 2024. This research was supported by the Samsung Future Technology Promotion Project and the National Research Foundation of Korea (NRF).

Journal Reference

Changil Son, Jinyoung Kim, Dongwon Kang, et al., "Behavioral biometric optical tactile sensor for instantaneous decoupling of dynamic touch signals in real time," Nature Communications (2024).

/Public Release. This material from the originating organization/author(s) may be of a point-in-time nature and has been edited for clarity, style, and length. Mirage.News does not take institutional positions or sides; all views, positions, and conclusions expressed herein are solely those of the author(s).