A review article about the future of neuromorphic computing by a team of 23 researchers, including two authors from UTSA, was published today in Nature. Dhireesha Kudithipudi, the Robert F. McDermott Endowed Chair in Engineering and founding director of MATRIX: The UTSA AI Consortium for Human Well-Being, served as the lead author, and Tej Pandit, a UTSA doctoral candidate in computer engineering, is a co-author. The review article, titled "Neuromorphic Computing at Scale," examines the state of neuromorphic technology and presents a strategy for building large-scale neuromorphic systems.
The research is part of a broader effort to advance neuromorphic computing, a field that applies principles of neuroscience to computing systems to mimic the brain's function and structure. Neuromorphic chips have the potential to outpace traditional computers in energy and space efficiency as well as performance, presenting substantial advantages across various domains, including artificial intelligence, health care and robotics. As the electricity consumption of AI is projected to double by 2026, neuromorphic computing emerges as a promising solution.
The authors say that neuromorphic systems are reaching a "critical juncture," with scale being a key metric to track the progress of the field. Neuromorphic systems are rapidly growing, with Intel's Hala Point already containing 1.15 billion neurons. The authors argue that these systems will still need to grow considerably larger to tackle highly complex, real-world challenges.
"Neuromorphic computing is at a pivotal moment, reminiscent of the AlexNet-like moment for deep learning," said Kudithipudi. "We are now at a point where there is a tremendous opportunity to build new architectures and open frameworks that can be deployed in commercial applications. I strongly believe that fostering tight collaboration between industry and academia is the key to shaping the future of this field. This collaboration is reflected in our team of co-authors."
Kudithipudi has done extensive work in the field of neuromorphic computing. Last year, she secured a $4 million grant from the National Science Foundation to launch THOR: The Neuromorphic Commons, a first-of-its-kind research network providing access to open neuromorphic computing hardware and tools in support of interdisciplinary and collaborative research. Catherine Schuman (University of Tennessee, Knoxville) and Gert Cauwenberghs (University of California, San Diego), co-authors on the article, are also co-investigators on THOR.
In addition to expanded access, the team also calls for the development of a wider array of user-friendly programming languages to lower the barrier of entry into the field. They believe this would foster increased collaboration, particularly across disciplines and industries.
Steve Furber, emeritus professor of computer engineering at the University of Manchester, is among the authors on the project. Furber specializes in neural systems engineering and asynchronous systems. He led the development of the million-core SpiNNaker1 neuromorphic computing platform at Manchester and co-developed SpiNNaker2 with TU Dresden.
"Twenty years after the launch of the SpiNNaker project, it seems that the time for neuromorphic technology has finally come, and not just for brain modeling, but also for wider AI applications, notably to address the unsustainable energy demands of large, dense AI models," said Furber. "This paper captures the state of neuromorphic technology at this key point in its development, as it is poised to emerge into full-scale commercial use."
To achieve scale in neuromorphic computing, the team proposes several key features that must be optimized, including sparsity, a feature observed in biological brains. The brain develops by forming numerous neural connections (densification) and then selectively pruning most of them away. This strategy optimizes spatial efficiency while retaining information at high fidelity. If successfully emulated, this feature could enable neuromorphic systems that are significantly more energy-efficient and compact.
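The densify-then-prune idea can be illustrated with a minimal sketch: start from a dense connection matrix and keep only the strongest-magnitude connections. This is a loose, hypothetical analogy to the biological process the article describes, not code from the paper; the function name, the 10% keep ratio, and the use of NumPy are all assumptions made for the example.

```python
import numpy as np

def prune_by_magnitude(weights, keep_fraction=0.1):
    """Zero out all but the largest-magnitude fraction of connections.

    A simple stand-in for biological pruning: dense connectivity is
    formed first, then most connections are removed, retaining only
    the strongest. The keep_fraction value is illustrative.
    """
    flat = np.abs(weights).ravel()
    k = max(1, int(keep_fraction * flat.size))
    # Threshold at the k-th largest magnitude.
    threshold = np.partition(flat, flat.size - k)[flat.size - k]
    mask = np.abs(weights) >= threshold
    return weights * mask, mask

rng = np.random.default_rng(0)
dense = rng.standard_normal((256, 256))        # "densified" connections
sparse, mask = prune_by_magnitude(dense, 0.1)  # prune roughly 90% of them

print(f"density after pruning: {mask.mean():.2f}")  # → density after pruning: 0.10
```

Sparsity pays off in hardware because zeroed connections need neither storage nor computation, which is one reason the authors single it out as a lever for energy-efficient, compact systems at scale.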
"This paper is one of the most collaborative efforts to date toward outlining the field of neuromorphic computing with emphasis on scale, ecosystem and outreach between researchers, students, consumers and industry," said Pandit. "Representatives of many key research groups came together to share crucial information about the current state and future of the field with the goal of making large-scale neuromorphic systems more mainstream."
Pandit is pursuing his doctoral degree at UTSA under Kudithipudi. His research focuses on training AI systems to learn continually without overwriting previously acquired information. He recently published work on the topic.
"UTSA is deeply invested in developing knowledge in this field, which has the potential to catalyze a number of technologies and address grand challenges in the world today such as energy waste and trustworthy AI," said JoAnn Browning, UTSA interim vice president for research. "I am extremely proud to see Dr. Kudithipudi and Tej Pandit making such significant contributions to harness the power of this promising technology, particularly on the heels of the launch of UTSA's new neuromorphic commons, THOR."
The UTSA researchers worked with esteemed authors from various institutions, national laboratories and industry partners. These include the University of Tennessee, Knoxville, Sandia National Laboratories, Rochester Institute of Technology, the University of Pittsburgh, Intel Labs, Technische Universität Dresden, the U.S. Naval Research Laboratory, Google DeepMind, the Italian Institute of Technology, UC San Diego, the Institute of Neuroinformatics at the University of Zürich and ETH Zürich, the National Institute of Standards and Technology, Oak Ridge National Laboratory, SpiNNcloud Systems GmbH, the Indian Institute of Science, Royal Holloway, University of London, and The University of Manchester. This collaboration underscores the extensive network and interdisciplinary approach taken by UTSA researchers to advance their groundbreaking work.