Peter Cochrane: Has computing become too challenging for any one person to fully comprehend?

Peter Cochrane looks at how computing has changed in his lifetime, from analogue to digital and, soon, quantum

As a young university student, I entered a world of analogue and digital computing: feedback, feed-forward, gain blocks, loops and instability points; logic, Boolean algebra, gates, shift registers and machine code programming.

We designed, we built, we tested and we understood at every level in depth. This was the era of machine-specific computing. Payroll, ballistics, estuary and river water flows and the like each warranted a dedicated machine at vast cost.

And then, within six or so years, came stored-program control, general-purpose digital computers and high-level coding.


Even at this stage, there were many who had a strong grasp of every aspect of all the fundamentals. Seymour Cray, for example, was remarkable in his ability to design machine architectures, circuits, chips, and mechanical and thermal management systems, and to write the system machine code and applications.

But he was the last man standing. Since his time, the level of complexity and sophistication in every element and dimension of computing has become a challenge, and each aspect has become a highly specialised topic.

So I think we can safely say that no one understands (in detail) how computers work any more.

Fast forward to today, and computing has become ever more clouded by the rise of artificial intelligence (AI), robotics and quantum computing. I find myself sitting in meetings and conferences listening to people present confused and confusing pitches across the board, with claims about what quantum computing and AI can and cannot do.

Sadly, this often extends to our digital computing, networking and application layers, where people overlook the combinatorial complexity and inherent non-linearities.


In short, none of this is straightforward, and we are now deep into a sphere where even exhaustive testing is fundamentally impossible. We tend to characterise all this as 'emergent behaviours' or, perhaps more graphically, 'surprises'.
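A rough back-of-the-envelope sketch in Python shows why combinatorial complexity rules out exhaustive testing; the test rate assumed here (a billion cases per second) is my own illustrative figure, not one from this article:

```python
# Back-of-the-envelope: why exhaustive testing breaks down.
# A function taking a single 64-bit input has 2**64 possible cases.
cases = 2 ** 64
rate = 1_000_000_000                 # assumed: one billion test cases per second
seconds_per_year = 60 * 60 * 24 * 365

years = cases / (rate * seconds_per_year)
print(f"{cases:.3e} cases -> about {years:,.0f} years at {rate:,} tests/second")
# Roughly 585 years for one 64-bit argument; add a second argument and the
# input space squares, which is why 'surprises' become unavoidable.
```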

To be clear, let us categorise the key features of each technology as follows:

Analogue computers

Able to cope with accurately dimensioned and scaled linear or non-linear problems, producing analogue outputs that are probabilistically variable. Their primary use is in the modelling of complex physical systems and situations.
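Purely as an illustration, the kind of scaled physical model an analogue machine handles with integrators and feedback loops can be emulated digitally. The damped oscillator and its coefficients below are arbitrary choices of mine, not anything specified in the article:

```python
# A minimal digital emulation of an analogue-computer patch:
# two integrators in a feedback loop solving a damped oscillator,
#   x'' + 2*zeta*omega*x' + omega**2 * x = 0
# (coefficients are illustrative values only).
omega, zeta = 2.0, 0.1      # natural frequency and damping, scaled as on an analogue machine
x, v = 1.0, 0.0             # initial displacement and velocity
dt = 0.001                  # integration step standing in for continuous time

for step in range(10_000):
    a = -2 * zeta * omega * v - omega ** 2 * x   # summing junction: feedback of v and x
    v += a * dt                                  # first integrator: acceleration -> velocity
    x += v * dt                                  # second integrator: velocity -> displacement
    if step % 2000 == 0:
        print(f"t={step * dt:5.2f}  x={x:+.4f}")
```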

Digital computers

Mostly best suited to well-defined, deterministic, linear and contained problems with high-definition and consistent digital output. They can be configured to operate in serial or parallel modes to suit a particular problem.

The most powerful, sophisticated and influential technology invented to date; even if quantum computing matures, digital computers will still be required and may well remain the dominant mode by far.
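A minimal sketch of those two properties, using Python's standard library and a stand-in workload of my own choosing: the same well-defined problem run in serial and in parallel modes yields identical, consistent digital output.

```python
# Deterministic digital computing: the same work done serially and in parallel
# gives bit-for-bit identical results (the workload here is just a stand-in).
from concurrent.futures import ProcessPoolExecutor

def work(n: int) -> int:
    return sum(i * i for i in range(n))   # a contained, well-defined sub-problem

if __name__ == "__main__":
    inputs = [100_000, 200_000, 300_000, 400_000]

    serial = [work(n) for n in inputs]            # serial mode
    with ProcessPoolExecutor() as pool:           # parallel mode
        parallel = list(pool.map(work, inputs))

    assert serial == parallel                     # consistent digital output
    print(serial)
```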

Quantum computers

A new technology that is still in its infancy. In principle, it is capable of coping with massively complex, ill-defined, non-linear and unbounded problems: integer factoring, data sorting, many-body problems, pattern recognition, chemical reaction and biological modelling and prediction. This includes protein folding, complex network traffic flows, system modelling and so on, but always with a probabilistic output.
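To illustrate only the "always with a probabilistic output" point, here is a toy classical simulation of repeatedly measuring one qubit held in an equal superposition. It assumes nothing about real quantum hardware, and the shot count is arbitrary:

```python
# Toy illustration of probabilistic output: simulating repeated measurement of
# a single qubit in the equal superposition (|0> + |1>)/sqrt(2).
# This is a classical simulation, not a claim about any real quantum machine.
import random
from collections import Counter

amp0, amp1 = 2 ** -0.5, 2 ** -0.5     # amplitudes of |0> and |1>
p1 = abs(amp1) ** 2                   # Born rule: probability of measuring 1

shots = 10_000
counts = Counter("1" if random.random() < p1 else "0" for _ in range(shots))
print(counts)   # e.g. Counter({'0': 5023, '1': 4977}) - varies run to run
```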

Artificial intelligence

Dominated by digital computing technology, although analogue and quantum computing may become a part of the platforms at some future time.

It is still only at a very fundamental phase, but is already providing some powerful pattern and image recognition, as well as showing promise for rule and behaviour abstraction, observational learning, and solving problems way beyond human abilities.

It will become mobile when embodied in robots, but it is a long, long way from any form of sentience. There is no general-purpose AI today: AI is all task-specific.

Robots

Mainly static, purpose-specific or purpose-programmable machines using digital computers on production lines and in machine shops and warehouses. At the extreme edge, robots are being powered by AI and becoming mobile, with a small percentage now almost entirely autonomous and working buddy-buddy with people.

"Nature isn't classical, dammit, and if you want to make a simulation of nature, you'd better make it quantum mechanical…."Richard Feynman, 1982

This, then, is the computing technology scene today, full of uncertainty and possibilities, and just how it all fits with us and biology in general is depicted in the figure below.

Peter Cochrane OBE is an ex-CTO of BT who now works as a consultant focusing on solving problems and improving the world through the application of technology
