New Formula May Boost Chip Speed and Power

UNSW Sydney

Research from UNSW has overturned a long-standing assumption in communication, a result that could make computer chips faster and more efficient.

Communication complexity is the study of how much information must be passed between two or more parties to solve a particular problem. It measures the minimum number of bits that have to be exchanged when each party holds only part of the data needed. These studies are important in computer chip design because they help minimise the amount of data that needs to be transferred between different parts of a processor. Reducing communication speeds up computation and improves overall efficiency, which is crucial for designing faster and more powerful chips, especially in modern multi-core and distributed systems.

It had previously been thought that doing multiple communication tasks at once is no more efficient than doing each task individually, a belief known as the Direct Sum Conjecture. But research by Dr Simon Mackenzie, from UNSW's School of Computer Science and Engineering, and Dr Abdallah Saffidine has now shown that savings can indeed be made by bundling complex communication tasks together.

"The existing belief has been that any communication task had a certain cost, and batching them together would not save you anything," said Dr Mackenzie.

"An example would be the amount of information that needs to be exchanged if two people each toss a coin and want to know how the other's coin landed. In that case it has to be two bits of information – each one saying whether they got a head or a tail – and complex communication in computer systems was believed to work the same way.

"Our new paper challenges that notion. It shows with a formula that batching tasks can actually be beneficial, which could potentially improve computer chip design and other large-scale systems by making communication more efficient."
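Dr Mackenzie's coin-toss example can be sketched in a few lines of Python. This is a toy illustration, not code from the paper: each party sends one bit per toss, so the cost is exactly two bits per task, no matter how many tasks are run.

```python
import random

def exchange_coin_tosses(n_tasks: int) -> int:
    """Toy protocol: two parties each toss a coin per task, and each
    sends one bit so both learn the other's outcome.

    Returns the total number of bits exchanged.
    """
    bits_sent = 0
    for _ in range(n_tasks):
        alice = random.randint(0, 1)  # Alice's toss (0 = tails, 1 = heads)
        bob = random.randint(0, 1)    # Bob's toss
        bits_sent += 1  # Alice sends her bit to Bob
        bits_sent += 1  # Bob sends his bit to Alice
    return bits_sent
```

For this particular task batching genuinely cannot help: each outcome is independent and must be communicated, so n tasks cost exactly 2n bits – which is why it was long assumed that more structured tasks behave the same way.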
Bread and bananas

One way of thinking about the work is to imagine needing to buy both a loaf of bread and a bunch of bananas. If you go to the shop and buy them together, you save time compared to making two separate trips. However, there are no cost savings – the price of bread and bananas is the same whether you go to the shop once or twice. In computer science, the existing belief has been that for complex communication tasks the 'cost' is the same whether you do them separately or combine them.

"For certain computational tasks it was thought that putting them together doesn't actually save anything compared to doing them separately," said Dr Mackenzie, who will present the paper alongside Dr Saffidine at the Symposium on Theory of Computing (STOC 2025) in Prague in June.

"In communication complexity, because the specific setting doesn't have much structure, it's been widely accepted that when you put the tasks together you aren't being more efficient. What we show is that's not true – you actually can make savings by batching."

In the paper, the researchers set up a so-called Alternating Communicating Game featuring a number of 'players', in which each player communicates in turn over a given number of rounds until the game is solved. Through a series of intricate formulas, they ultimately show that by constraining how the players communicate with each other, it can be proven that tasks are harder to solve individually than when they are batched together.

"The impact of this work on computer science more widely is to give people a better understanding of how to batch tasks together so there can be efficiencies in solving them," Dr Mackenzie said.

"There is an implication that this will help in the design of computer chips and potentially make them faster.
"But it also applies to any large-scale system where communication might be the bottleneck – things like cloud computing, large distributed networks and even massive-scale artificial intelligence systems."

Unconditional lower bound

The researchers also say their work proves the absolute minimum amount of communication needed to solve a problem, known as an unconditional lower bound. That is important because it establishes a fundamental limit on the resources (such as time, space or circuit size) required to solve a problem, independent of any unproven assumptions.

"In physics, most people understand that you can't go faster than the speed of light, so that is an absolute limit and it shapes how we navigate, communicate and explore the universe," Dr Mackenzie said.

"An unconditional lower bound is just as important in theoretical computer science, so that we understand what we can and cannot do from a fundamental perspective."
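A simple way to see that a bundle can sometimes cost less than the sum of its parts is an encoding analogy (this is an illustrative sketch, not the paper's construction): sending one symbol from a 3-letter alphabet needs 2 whole bits, but n symbols packed together need only the ceiling of n·log2(3) bits. The function names and parameters below are hypothetical, chosen for the example.

```python
import math

def bits_separately(alphabet_size: int, n_items: int) -> int:
    # Each item is encoded on its own: ceil(log2 m) whole bits per item.
    return n_items * math.ceil(math.log2(alphabet_size))

def bits_batched(alphabet_size: int, n_items: int) -> int:
    # All items are encoded as a single number in [0, m**n),
    # so the rounding-up to whole bits happens only once.
    return math.ceil(math.log2(alphabet_size ** n_items))

print(bits_separately(3, 10))  # 20 bits, one item at a time
print(bits_batched(3, 10))     # 16 bits when packed together
```

This rounding saving is elementary compared with the paper's result, but it conveys the principle: costs that look fixed per task can shrink when tasks are combined.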

Key Facts:

Researchers from UNSW Sydney present a proof that could optimise algorithms and resource allocation in computing systems.
