Peter Cochrane: Metaverse versus the meta-pundits

The Metaverse: We've been here before

The Metaverse concept is deliberately vague to cover for the lack of hardware support, demand and practical applications, but hacks have lapped it up

As the hysteria of 'digitalisation' refuses to subside, a new technological trope has been launched amid huge amounts of uncertainty and confusion. What the heck is the 'Metaverse'? Judging by the media, blogs, conference presentations and discussions in industry and academia, it would appear that no one actually knows! There is no definitive description.

At its launch in 2021, the Metaverse was poorly defined despite all the graphics and demos, but it would appear to be virtual environments akin to computer games and Second Life, with VR and AR networking on a global scale. We've seen and heard all the promises and predictions many times before.

It all started back in the 1800s with early photography and the creation of 3D worlds using mirrors and paired photographs.

A century or so later this mutated into 3D cinema and head-up displays in the 1950s. Latterly, 3D TV, VR and AR offerings initially seemed compelling, but Joe Public soon lost interest and they either died on the vine or migrated to niche applications.

Why have 3D projection, VR, AR and virtual worlds struggled to become mainstream commercial propositions at any scale? It's about inadequate hardware, niche technologies, niche applications and niche markets. Sure, it all has a place in the games world, medicine, astronomy and hi-tech maintenance for the aerospace and oil industries, but what ought to be obvious is that relabelling it all 'the Metaverse' does not address the fundamental display technology limitations, or indeed, conjure applications and demand.

Physiologically, none of the display and interactive interfaces work well for more than relatively short periods without some discomfort for the users. Since the creation of the first VR headsets way back in the early 80s, all we have done is make them a little smaller, lighter, higher definition and lower energy.

What we have not done is address the visual contention created between the left and right eye due to habitual misalignment, and that of simultaneously trying to focus on near- and far-field objects. We have known how to solve these problems for over 30 years, and it doesn't involve mini TV screens mounted on your head! What does work is the laser projection of images written directly onto the retina. This has the added advantage of allowing you to occupy the virtual and real worlds at the same time whilst overcoming the near- and far-field contention.

Today, there are groups working on active contact lenses that provide this form of 'eye writing', but another challenging problem may be a show-stopper: how do you power a contact lens when you can't have wires? Some kind of inductive coupling or other means of projecting power might do it, but for now we may have to settle for a spectacle-mounted laser module.

With the addition of a camera, such direct retinal projection opens the door to several new elements. First, augmented intelligence, with humans sharing experiences with global AI systems. Second, telepresence, with our physical abilities projected into a robot. Third, the bigger picture involving exoskeletons and other sensory projections.

Today it appears that the 'meta-pundits' have sold us short on a workable vision. The Metaverse has to be far more multifaceted, capable of solving real problems whilst satisfying untapped needs.

But before we go there we should probably address the many limitations of the technology and of human physiology: long-term operational comfort, along with the amplification (or not) of human abilities.

Peter Cochrane OBE is Professor of Sentient Systems at the University of Suffolk