Elon Musk has promised fully driverless vehicles, without pedals or a steering wheel, from 2026, even as the US government's road safety agency investigates Tesla's 'Full Self-Driving' system after a pedestrian was killed in a low-visibility crash. An RMIT expert comments.
Professor Reza Hoseinnezhad, autonomous systems
"While Tesla's ambition to rely solely on cameras for vehicle sensing is an intriguing approach, I believe it is fundamentally flawed.
"The argument often made is that humans navigate the world with just our eyes, without the need for laser scanners or additional sensors.
"While true, this analogy overlooks a crucial factor. Behind our eyes is an extraordinary supercomputer: the human brain.
"Our brains not only process vast amounts of sensory data but are also experts in transfer learning, allowing us to generalise from limited experiences and apply those lessons to novel situations.
"Current AI models, including those used in Tesla's self-driving system, are still far from matching this capability. This shortfall is particularly evident when considering edge cases like low-light conditions or unusual visual obstacles-scenarios that have led to several fatal accidents involving Tesla's Autopilot.
"Unlike humans, these systems struggle to identify and react appropriately to unfamiliar hazards, especially in challenging environments like dusk or dawn. By relying solely on cameras and forgoing complementary technologies like radar or lidar-which stands for 'light detection and ranging'- Tesla is missing critical redundancy that could make autonomous driving safer.
"Until camera-based AI systems are capable of true generalisation across all potential driving scenarios, a more robust, multi-sensor approach remains the safest path forward."
Professor Reza Hoseinnezhad is a researcher in autonomous systems at the RMIT School of Engineering, with expertise in statistical information fusion for situational awareness in autonomous vehicles.
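As a simple illustration of the redundancy argument, the sketch below fuses two hypothetical, independent range estimates of the same pedestrian using inverse-variance weighting. It is not a description of Tesla's or RMIT's actual systems; the sensor readings, variances and function names are illustrative only.

```python
# Minimal sketch (illustrative, not any production pipeline): fusing two
# independent Gaussian range estimates by inverse-variance weighting.

def fuse(est_a, var_a, est_b, var_b):
    """Fuse two independent Gaussian estimates of the same quantity."""
    w_a = 1.0 / var_a
    w_b = 1.0 / var_b
    fused_var = 1.0 / (w_a + w_b)
    fused_est = fused_var * (w_a * est_a + w_b * est_b)
    return fused_est, fused_var

# Hypothetical numbers: at dusk the camera's range estimate is noisy
# (variance 4.0 m^2), while the lidar stays precise (variance 0.25 m^2).
camera_est, camera_var = 27.0, 4.0   # metres, metres^2
lidar_est, lidar_var = 25.0, 0.25

est, var = fuse(camera_est, camera_var, lidar_est, lidar_var)
print(f"fused distance: {est:.2f} m, variance: {var:.3f} m^2")

# The fused variance (about 0.24 m^2) is lower than either sensor's alone,
# and the estimate leans toward the more reliable lidar reading. With a
# camera-only system there is no second, independent measurement available
# to correct a degraded one.
```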
***