The future of computing is hurtling towards us at an ever-increasing rate.
Emerging technologies such as quantum computing, DNA storage and all-optical computing will change business and society fundamentally, and sooner than many people realise.
But the classical, silicon-based CPU that we are accustomed to won't be replaced - the coming technologies will combine as a hybrid computing superpower providing distinct capabilities.
In this article, I'll provide an overview of the computing technologies that will arrive in the next two to ten years.
The defining trend, of course, is that the historical rate of progress in processing speed and power efficiency is in jeopardy. Specifically, achieving ever-smaller chip geometries is becoming increasingly time-consuming and costly.
Another key driver of change is the fact that some problems are simply too difficult to solve by applying the bits and bytes of classical compute. Meanwhile, concerns about the sustainability of conventional applications are accelerating the desire for new approaches.
Novel silicon architectures offer a compelling avenue to progress. With the physical limitations of shrinking foundational electronic components, the future of silicon-based electronics will be multi-layered designs (3D integration). These designs promise faster, more power-efficient and denser chips at lower cost. They also open the door to customised chip designs tailored to the needs of AI algorithms.
I'm very optimistic about the future of neuromorphic computing. It doesn't follow traditional sequential processing; instead, it tries to imitate how human neurons process information. Neurons are stimulated by spikes that can trigger one or multiple neurons, which can then activate others connected to them.
Like neurons, neuromorphic computing will be capable of parallel and targeted processing across a defined set of neurons and interconnections. Its potential will be explored through algorithms and sensing systems that try to imitate human thinking.
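The spiking behaviour described above can be sketched in software. The following is a minimal, illustrative leaky integrate-and-fire model (the class and function names are my own, not from any neuromorphic toolkit): each neuron accumulates input, leaks potential over time, and emits a spike when it crosses a threshold, stimulating the neurons it connects to.

```python
class LIFNeuron:
    """A leaky integrate-and-fire neuron: a common simple spiking model."""

    def __init__(self, threshold=1.0, leak=0.9):
        self.threshold = threshold  # potential at which the neuron fires
        self.leak = leak            # fraction of potential kept each step
        self.potential = 0.0

    def step(self, input_current):
        """Advance one time step; return True if the neuron spikes."""
        self.potential = self.potential * self.leak + input_current
        if self.potential >= self.threshold:
            self.potential = 0.0    # reset after firing
            return True
        return False


def run(neurons, connections, inputs, steps):
    """Propagate spikes through a weighted network for `steps` time steps."""
    spikes_per_step = []
    pending = [0.0] * len(neurons)  # current arriving from last step's spikes
    for t in range(steps):
        fired = []
        for i, n in enumerate(neurons):
            if n.step(inputs[i] if t == 0 else pending[i]):
                fired.append(i)
        pending = [0.0] * len(neurons)
        for i in fired:
            for j, weight in connections.get(i, []):
                pending[j] += weight  # a spike stimulates connected neurons
        spikes_per_step.append(fired)
    return spikes_per_step


# Three neurons: 0 feeds 1 and 2; an initial input makes 0 fire first.
net = [LIFNeuron() for _ in range(3)]
wiring = {0: [(1, 1.5), (2, 1.5)]}
print(run(net, wiring, inputs=[1.2, 0.0, 0.0], steps=3))  # [[0], [1, 2], []]
```

Note how processing is event-driven: in the final step no spikes arrive, so no neuron does any work, which is the source of neuromorphic hardware's claimed power efficiency.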
Moving on to optical innovation takes us to photonic computing, which replaces the electrons of digital computers with photons, using light waves to process and store data instead. The speed of light is unsurpassable, so photonics theoretically minimises latency, making light a better computing medium than electricity.
Biological computing uses molecules like DNA and proteins for computation rather than electrical signals. Thanks to progress in nanobiotechnology, researchers have found ways to programme living cells to respond predictably to chemical inputs. Although engineering the first cells is costly and time-consuming, growing billions more copies is then cheap.
The approach has already proved successful in cold data storage by encoding information in DNA. Longer term, it is expected to process data extremely reliably.
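The core idea behind DNA data storage can be sketched in a few lines: every two bits of binary data map to one of the four nucleotide bases. This is a simplified illustration of the encoding principle only; real schemes add error correction and avoid sequences that are hard to synthesise.

```python
# Map two bits to one DNA base (an illustrative convention, not a standard).
BITS_TO_BASE = {"00": "A", "01": "C", "10": "G", "11": "T"}
BASE_TO_BITS = {base: bits for bits, base in BITS_TO_BASE.items()}


def encode(data: bytes) -> str:
    """Encode bytes as a DNA sequence, two bits per base."""
    bits = "".join(f"{byte:08b}" for byte in data)
    return "".join(BITS_TO_BASE[bits[i:i + 2]] for i in range(0, len(bits), 2))


def decode(sequence: str) -> bytes:
    """Recover the original bytes from a DNA sequence."""
    bits = "".join(BASE_TO_BITS[base] for base in sequence)
    return bytes(int(bits[i:i + 8], 2) for i in range(0, len(bits), 8))


dna = encode(b"Hi")
print(dna)                   # CAGACGGC
assert decode(dna) == b"Hi"  # round-trips back to the original bytes
```

At four bases per byte, density on paper is enormous: a gram of DNA can in principle hold on the order of hundreds of petabytes, which is why the technique suits rarely accessed cold data.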
Quantum computing changes the basic idea of storing and processing information: the binary bit (0 or 1) gives way to the qubit, which can exist in a superposition of 0 and 1. Particles can occupy any of a continuum of states and interact with other particles in correspondingly many ways. The result is that quantum computers have the potential to solve very complex problems far faster than traditional computers.
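A qubit's superposition can be illustrated with a tiny classical simulation (simulating amplitudes on an ordinary CPU, which is nothing like real quantum hardware): the state is a pair of amplitudes for 0 and 1, a Hadamard gate creates an equal superposition, and measurement probabilities come from squaring the amplitudes.

```python
import math

# A single qubit as amplitudes (alpha, beta) for the states |0> and |1>.
# The probability of measuring 0 is |alpha|^2 and of measuring 1 is |beta|^2.


def hadamard(state):
    """Apply the Hadamard gate, which puts a basis state into superposition."""
    a, b = state
    s = 1 / math.sqrt(2)
    return (s * (a + b), s * (a - b))


def probabilities(state):
    """Measurement probabilities for 0 and 1; they always sum to 1."""
    a, b = state
    return (abs(a) ** 2, abs(b) ** 2)


zero = (1.0, 0.0)          # the classical bit 0, written as a qubit
plus = hadamard(zero)       # now an equal superposition of 0 and 1
print(probabilities(plus))  # roughly (0.5, 0.5): a 50/50 measurement outcome
```

The catch is scale: simulating n qubits classically needs 2^n amplitudes, which is exactly why genuine quantum hardware is expected to outpace classical machines on certain problems.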
The majority, if not all, of the technologies I've described are being developed for specific capabilities - and more broadly to overcome the particular shortcomings of current CPU chips.
Classical computing will not be replaced entirely. The future of computing will be characterised by a hybrid approach - the right technology in the right place at the right time.
Dr Aidong Xu is head of semiconductor capability at Cambridge Consultants.
He has over 30 years' experience across diverse industries, including leading semiconductor companies. He has managed large international engineering teams and brought products to the global market that have achieved rapid and sustained business growth. Aidong holds a PhD in power electronics and power semiconductors.