IBM predicts a computation revolution
Quantum and AI are ushering in a whole new range of algorithmic possibilities, says IBM’s research lead
'Tis the season to be predictive, and Alessandro Curioni, director of IBM's Research Lab in Zurich, has some bold takes on the future of computation.
According to Curioni, we're on the cusp of two simultaneous technological revolutions which in combination will fundamentally change how the world uses computers to solve difficult problems.
There have only been two comparable developments in computing history, Curioni contends - the microprocessor and the internet - but those breakthroughs happened decades apart. In contrast, the current contenders, AI and quantum computing, are evolving in tandem. "We are really going through two of the largest transformational technological changes in IT in the past 30 years," he said during a media event in November.
AI will change how we abstract reality
The physical world is typically simulated by describing it with mathematical equations and then discretising the variables and functions involved so they can be processed numerically. The future, according to IBM, will bypass these steps. Rather than approximating reality with equations, multi-modal frontier AI models learn directly from observational data and then generate surrogate models that fit reality much more closely for a particular use case. "I'm not saying it's better, I'm saying it's different," Curioni stated. "Adding a different way to do things enables a lot of new applications."
As an approach, unsupervised learning is also much more generalisable, since a frontier model trained on many modes of data (text, images, video, etc.) can be used to create any number of specialised models for individual use cases. Improvements in both speed and accuracy will be dramatic, Curioni said.
"The way weather forecast has been done for 30 years, was to take the data, put in the equation, solve the equation and get the weather forecast. Today, you can create models that are based directly on the data. That becomes much more efficient, because with the new abstraction you can create models that solve your problems in minutes instead than hours or days. This change of representation is a game changer."
Quantum will change how information is represented
Quantum computing's predicted impact is even more radical. Curioni envisions a fundamental shift in how information itself is represented. Instead of being restricted to classical binary digits (one or zero), qubits can exist in superpositions of both values at once, allowing quantum systems to represent reality at a far higher resolution. "The mapping of what is very complex in [classical computing] can be very easy on [quantum]. So this change of representation is going to make some of the problems that were very difficult on a classical computer much easier. Then simulating nature, optimisation and certain types of machine learning are going to become much faster, much more sustainable."
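A small illustration of that representational difference, written with Qiskit, IBM's open-source quantum SDK: three qubits placed in superposition and entangled are described by 2^3 = 8 complex amplitudes, and the state space doubles with every extra qubit. The circuit below is a generic textbook example, not a claim about how IBM maps any specific problem.

```python
# Illustrative only: a generic 3-qubit register, not an IBM problem mapping.
from qiskit import QuantumCircuit
from qiskit.quantum_info import Statevector

qc = QuantumCircuit(3)
qc.h(0)        # put each qubit into an equal superposition of 0 and 1
qc.h(1)
qc.h(2)
qc.cx(0, 2)    # entangle qubit 0 with qubit 2

state = Statevector.from_instruction(qc)
print(len(state.data))  # 2**3 = 8 complex amplitudes describe this register
print(state)            # contrast with 3 classical bits, which hold one 3-bit value
```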
This doesn't mean quantum will replace classical computing; rather, it is another tool in the box - a specialist tool today, but one that IBM sees as becoming more central over time.
Quantum + HPC + AI will combine to enable the development of new algorithms
Ultimately, Curioni foresees a quantum-centric computing future, where quantum processors are central (a position occupied today by the GPU), supported by classical high-performance computing (HPC) and unified by AI.
Combining AI (learning directly from data), quantum (representing reality more accurately) and HPC (raw compute power and memory) will enable the development of entirely new algorithms. We'll move beyond choosing between classical, quantum or AI approaches to problem solving, toward integrated systems where each technology handles what it does best.
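One shape such integration already takes is the variational hybrid loop, sketched below as an illustration rather than an IBM workflow: a classical optimiser (the HPC role) repeatedly tunes the parameters of a small quantum circuit (the quantum role). The two-qubit ansatz and the toy Hamiltonian are invented for the example.

```python
# Hedged sketch of a hybrid quantum-classical loop: a classical optimiser
# drives a quantum subroutine. Generic variational pattern, toy problem.
import numpy as np
from scipy.optimize import minimize
from qiskit import QuantumCircuit
from qiskit.quantum_info import Statevector, SparsePauliOp

# Toy Hamiltonian whose ground-state energy we want to approximate.
hamiltonian = SparsePauliOp.from_list([("ZZ", 1.0), ("XI", 0.5), ("IX", 0.5)])

def ansatz(params):
    # Small parameterised trial circuit (the "quantum" side of the loop).
    qc = QuantumCircuit(2)
    qc.ry(params[0], 0)
    qc.ry(params[1], 1)
    qc.cx(0, 1)
    return qc

def energy(params):
    # Quantum step: prepare the trial state and evaluate the expectation value.
    state = Statevector.from_instruction(ansatz(params))
    return float(np.real(state.expectation_value(hamiltonian)))

# Classical step: an ordinary gradient-free optimiser steers the quantum routine.
result = minimize(energy, x0=[0.1, 0.1], method="COBYLA")
print("estimated ground-state energy:", result.fun)
```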
Curioni believes that with multi-modal AI and quantum, a mere four classes of mathematical algorithm (partial differential equations; linear algebra; optimisation and combinatorics; and probability and statistics) will cover 95% of the application use cases that matter in the future, and that new, more generalisable approaches will become possible as the underlying hardware and AI models mature.
Longstanding problems will become solvable
Intractable problems will become solvable, and complex, compute-heavy calculations will be completed much more quickly. The former category includes modelling matter at the atomic level; the latter takes in atmospheric modelling, fluid dynamics, supply chain management, risk analysis, climate change modelling and mitigation, personalised medicine, financial models and rapid prototyping. Perhaps most ambitiously, IBM forecasts the emergence of true digital twins - not merely digital representations but accurate, real-time mappings of physical reality.
The new computational era will arrive in 2029
Curioni's predictions are surprisingly specific. Quantum computing is already being applied to problems in materials science and optimisation, but IBM's timeline pinpoints the arrival of a fault-tolerant device in 2029. Fault-tolerant quantum computers (capable of around 100 million quantum operations before errors take hold, compared with the current 5,000 or so - a roughly 20,000-fold increase) will exponentially increase algorithmic capabilities.
Before that milestone is reached, Curioni predicts that by 2027 we'll see early hybrid algorithms taking advantage of quantum processors for specific computational tasks, with classical systems handling broader workflows, all orchestrated by AI. As to who will make the key breakthroughs, it's a race between IBM (unsurprisingly), Google and the Chinese Academy of Sciences.
That's all well and good, but will end-of-year predictions become any more accurate?
Even with the most advanced technology, this seems highly doubtful.