Neuromorphic System Boosts Machine Vision in Harsh Light

NEC core: achieving rapid and efficient exposure control by breaking the feedback-loop dependency with neuromorphic events

A research team led by Professor Jia Pan of the Department of Computer Science and Professor Yifan Evan Peng of the Department of Electrical & Electronic Engineering, both under the Faculty of Engineering at the University of Hong Kong (HKU), in collaboration with researchers at the Australian National University, has developed a groundbreaking neuromorphic exposure control (NEC) system that revolutionizes machine vision under extreme lighting variations. Published in Nature Communications, this biologically inspired system mimics human peripheral vision to achieve unprecedented speed and robustness in dynamic perception environments.

Traditional automatic exposure (AE) systems rely on iterative image feedback, creating a chicken-and-egg dilemma that fails under sudden brightness shifts (e.g., tunnels, glare). The NEC system solves this by integrating event cameras — sensors that capture per-pixel brightness changes as asynchronous "events" — with a novel Trilinear Event Double Integral (TEDI) algorithm. The approach processes 130 million events per second on a single CPU, enabling edge deployment.
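To illustrate the underlying event-camera principle (this is a minimal sketch, not the paper's TEDI algorithm): each event signals a fixed step in log-intensity at one pixel, so summing event polarities approximates the net brightness change without waiting for a full image frame. All names, the `Event` layout, and the contrast-threshold and gain values below are illustrative assumptions.

```python
# Minimal sketch (NOT the paper's TEDI algorithm): estimate the net scene
# brightness change from an asynchronous event stream and counter it by
# adjusting exposure, with no image-feedback loop in between.
import math
from typing import Iterable, NamedTuple

class Event(NamedTuple):
    x: int          # pixel column
    y: int          # pixel row
    t: float        # timestamp in seconds
    polarity: int   # +1 = brighter, -1 = darker

def log_brightness_change(events: Iterable[Event],
                          contrast_threshold: float = 0.2) -> float:
    """Each event marks one fixed log-intensity step of size
    `contrast_threshold`; summing polarities gives the net change."""
    return contrast_threshold * sum(e.polarity for e in events)

def update_exposure(exposure_s: float, events: Iterable[Event],
                    gain: float = 1.0,
                    contrast_threshold: float = 0.2) -> float:
    """If the scene got brighter (net positive polarity), shorten the
    exposure, and vice versa; exposure scales inversely with brightness
    in log space."""
    d_log_b = log_brightness_change(events, contrast_threshold)
    return exposure_s * math.exp(-gain * d_log_b)

# Example: a burst of brightening events, e.g., exiting a tunnel into sunlight.
burst = [Event(10, 20, 0.001 * i, +1) for i in range(50)]
new_exposure = update_exposure(0.01, burst)  # drops well below 10 ms
```

Because events arrive asynchronously and the update needs only a running sum, this kind of control can react within the event stream itself rather than waiting for the next fully exposed frame — the loop-breaking idea the NEC system builds on.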

"Like how our pupils instantly adapt to light, NEC mimics biological synergy between retinal pathways," explained Mr. Shijie Lin, the first-author of the article. "By fusing event streams with physical light metrics, we bypass traditional bottlenecks to deliver lighting-agnostic vision."

The team validated NEC across four mission-critical scenarios:

  1. Autonomous Driving: Improved detection accuracy (mAP +47.3%) when vehicles exit tunnels into blinding sunlight.
  2. Augmented Reality (AR): Achieved 11% higher pose estimation (PCK) for hand tracking under surgical lights.
  3. 3D Reconstruction: Enabled continuous SLAM in overexposed environments where conventional methods fail.
  4. Medical AR Assistance: Maintained clear intraoperative visualization despite dynamic spotlight adjustments.

Professor Jia Pan said, "This breakthrough represents a significant leap in machine vision by bridging the gap between biological principles and computational efficiency. The NEC system not only addresses the limitations of traditional exposure control but also paves the way for more adaptive and resilient vision systems in real-world applications, from autonomous vehicles to medical robotics."

Professor Evan Y. Peng commented, "Our collaborative work has been instrumental in pushing the boundaries of neuromorphic engineering. By leveraging event-based sensing and bio-inspired algorithms, we've created a system that is not only faster but also more robust under extreme conditions. This is a testament to the power of interdisciplinary research in solving diverse complex engineering challenges."

In the long term, the NEC paradigm offers a novel event-frame processing scheme that reduces the processing burden of high-resolution events and images and incorporates bio-plausible principles into the low-level control of machine eyes. This opens new avenues for camera design, system control, and downstream algorithms. The team's success in embodying neuromorphic synergy in diverse systems is a milestone that can inspire many optical, image, and neuromorphic processing pipelines, with direct economic and practical implications for industry.

For details about the research article, please visit:

https://www.nature.com/articles/s41467-024-54789-8

About Professor Jia Pan

Jia Pan is an Associate Professor in the Department of Computer Science at the University of Hong Kong (HKU). His research focuses on robotics, artificial intelligence, motion planning, and human-robot interaction. Professor Pan is particularly known for his work in developing algorithms for robot motion, collision detection, and optimization, with applications in autonomous systems and industrial robotics. He has published extensively in top-tier conferences and journals in robotics and AI, earning recognition for his innovative contributions to the field.

About Professor Yifan "Evan" Peng

Yifan Evan Peng is an Assistant Professor in Electrical & Electronic Engineering and Computer Science at HKU, where he leads the Computational Imaging & Mixed Representation Laboratory. He was previously a Postdoctoral Research Scholar at Stanford University. He received his PhD in Computer Science from the University of British Columbia, and both his MS and BS in Optical Science and Engineering from the State Key Lab of Modern Optical Instrumentation, Zhejiang University. Professor Peng's research lies at the interdisciplinary intersection of optics, graphics, vision, and artificial intelligence, with a focus on computational optics, imagers, sensors, and displays; holography and VR/AR/MR; and human-centered visual and sensory systems.

/Public Release. This material from the originating organization/author(s) may be point-in-time in nature and edited for clarity, style, and length. Mirage.News does not take institutional positions or sides, and all views, positions, and conclusions expressed herein are solely those of the author(s).