Lifting the fog: Computing’s top tech predictions for 2026

AI evolves, infrastructure gets a boost and cyber moves back onshore

Technology changes faster than any other sector. Keeping up with what's happening now – let alone what might in the future – can tax even the most dedicated IT leader.

But don’t worry; the Computing team is here to help you make sense of the chaos. Read on for our informed predictions of the year ahead.

We’ve divided our forecasts into three broad categories: AI, infrastructure and cybersecurity.

This year we’ve even added a confidence rating, so you can see how much we trust our own judgement.

Want to hear more from the Computing team? Tomorrow, 3rd December, we’re running a webinar reviewing 2025’s tech trends and what they mean for 2026. Sign up here.

AI will beat the bubble – Tom

Confidence: 2/10 (revised down from 6, then 4)

I’ve become a lot less confident in this one in just the last eight weeks, but I include it for posterity.

The last three years have been a gold rush for AI, with everyone racing to hitch their wagon to the technology, but the closing months of 2025 have seen a lot of negative chatter about AI investment. Forecasts from leading financial institutions, investors and even tech companies warn the sector will become the next dot-com bubble.

What’s happening around gen AI is not a repeat of the dot-com era. The technology behind the internet wasn't ready for mass consumption in the 1990s (everyone was on dial-up, for a start), whereas the tech powering artificial intelligence is mature, understood and already widely adopted.

There are definitely questions still to be answered about gen AI: it’s not worth all the hype vendors are pushing, and its eventual mass adoption could well end up being much more limited in scope, and take a lot longer, than they envision. On top of that, the circular investing and dodgy accounting practices aren’t filling anyone with confidence.

Whichever way the investment market slides, the underlying technology is here to stay. But companies should remember who actually got rich in California’s gold rush: it wasn't the hapless prospectors, it was the merchants selling them picks and shovels.

...Or will it? OpenAI and Oracle’s circular dependency will cause financial distress – Penny

Confidence: 6/10

This is not me seeking to argue with Tom’s prediction about AI beating the bubble. Despite the legion of ethical, legal and practical challenges AI creates, enterprises are scaling up deployments and pushing ahead with new use cases.

However, whilst talk of the whole bubble exploding in 2026 might be overblown, it’s also true that the financial underpinnings have got messy this year. I’m not a betting woman, but if I were, I’d be betting against OpenAI and Oracle stock in 2026.

OpenAI currently generates around $13 billion in annual revenue but has committed to spending $300 billion with Oracle over five years (this doesn’t even scratch the surface of its total spending commitments as people like Ed Zitron have continued to point out, much to Sam Altman’s visible irritation).

Analysts estimate that 58% of Oracle's future order backlog is tied solely to OpenAI. If OpenAI fails to generate sufficient revenue or secure funding – both highly plausible, given how few of ChatGPT’s billions of users are paying, and the growing cynicism around Altman’s continual demands for money for compute to chase an AGI that is always just around the corner – it could default on the contract, devastating Oracle's financial projections.

Oracle’s debt-to-equity ratio is also worrying investors, and quite understandably. For comparison, Microsoft’s debt to equity ratio is 30%. Amazon’s is 50%. Oracle’s is 500%.

Furthermore, Oracle must borrow about $100 billion over four years to build the infrastructure OpenAI needs. This is why the cost of insuring against Oracle’s debt default has tripled since the deal was announced, and it is why Oracle’s value has declined by $315 billion (a sum greater, you will notice, than the value of the deal itself) over the same period. If Oracle cannot deliver the promised infrastructure, OpenAI's growth plans are in pieces.

Investors can see a circular dependency where OpenAI needs Oracle's infrastructure to generate revenue, but Oracle needs OpenAI's payments to justify its vast debt-financed buildout. The chances of both seeing out the 2020s financially unscathed seem slim.

Workslop is going to become a much bigger problem – Penny

Confidence: 8/10

This year, the UK words of the year have been ‘parasocial’ and ‘vibe coding’. The inclusion of the latter sums up the absurdity of much of AI discussion, not least because ‘vibe coding’ is two words. This is par for the course with AI slop, and workslop specifically: something that sounds plausible, looks polished, but on closer inspection reveals itself to be nonsense.

AI might get a task off your to-do list quickly, but workslop pushes the onus onto colleagues or partners. It doesn’t save time; it simply shifts who has to spend it.

I predict ‘workslop’ will be the Collins or Cambridge word of the year in 2026 because it’s already very much a thing. A study published earlier this year by Stanford University revealed that 40% of US employees reported encountering workslop within the last month.

Workslop could account, at least in part, for the disappointing productivity results MIT reported in a study earlier this year. Anecdotes shared at Computing roundtables throughout the year have added to the suspicion that the problem isn’t the tech itself, it’s how people use it. Put simply, employees are adopting a minimum viable product mentality and using LLMs to take shortcuts. Rather than improving their own productivity, all this does is damage someone else’s – that of the poor individual further downstream who must do quality control.

There are causes for concern beyond another year of most generative AI pilots failing to deliver a boost to corporate profits. The use of generative AI in education, recruitment and job searching risks demotivating talented, hard-working young adults. Why bother to do the challenging work when you can get a chatbot to do it for you? It might not be quite as good, but does it really matter?

It certainly does for employers. Many are investing in training and genuine AI literacy programmes with culture at the centre. Others are just mandating the use of AI to cut costs without establishing any clear guardrails. The results for the latter group are unlikely to be stellar – for them or their customers.

“Agentic” dies a death – Tom

Confidence: 7/10

Everyone is talking about agentic AI, and some pilots have even made it to production – but working out which are really agents, versus a simple rebadging of automation or workflow chaining, is complex.

The term “agentic” has become overused and diluted, with no single market-wide definition; vendors are slapping it on everything from macro-like workflow scripts to glorified chatbots in their rush to secure market share. Hardly a new issue, but it waters down what agentic AI really is and increases scepticism among buyers who don’t know what they’re actually getting when they sign up.

Expect the hype cycle to correct itself as agentic AI dives headfirst into the trough of disillusionment. Some organisations will continue to make progress, but the rest of the market will move away from the term as the gap between marketing claims and real capability becomes too obvious to ignore.

Instead, expect “agentic” to be replaced by more grounded language around automation, orchestration and copilots.

As always, the survivors will be those delivering measurable business value, not speculative demos.

Cloud will come home – Tom

Confidence: 9/10

There has been a surge of interest in data sovereignty through 2025, as the USA has shown itself to be an inconstant ally. A series of political swings, policy reversals and questionable decisions has left CIOs and boards questioning their dependence on US-based cloud providers.

Public entities across Europe have already started to make small moves to divorce themselves from US tech, as they realise "locally hosted" doesn’t mean “locally controlled.” Cloud providers’ local offerings may be badged as sovereign clouds, but in most cases they are simply data residency agreements.

In 2026, driven by growing regulatory pressure and a shift in risk appetite (see below), we’ll see a quiet boom in hybrid and even on-prem processing for strategic workloads, as businesses switch from “cloud first” to “cloud where appropriate.”

Maynard Williams, MD in the UK&I at Accenture, told me, “The focus is shifting from simply where data ‘sits’ to who controls it, how it’s used, and under what risk framework. This is driving interest in sovereign data and increasing investment in sovereign IT solutions.”

Infrastructure gets a new lease of life – Tom

Confidence: 8/10

The mass outages of 2025 – AWS, Azure, Cloudflare – revealed the fragility of the global internet. Although it was built as a distributed system with resilience at the core of its design, the over-reliance on large providers at every level of the web has changed the game.

Being cloud native has always been understood to mean a higher level of robustness, but the largest providers’ ability to lose service through a single point of failure calls that claim into question. Resilience, it turns out, has to be designed rather than assumed.

Next year will bring a renewed focus on resilience, redundancy and business continuity, with boards suddenly willing to fund infrastructure modernisation plans that have been delayed for years – if you can make the case to them. The emphasis will be on multi-region failover, network diversity and backup discipline – with observability thrown in for good measure.

The knock-on effect of infrastructure returning as a strategic concern, rather than a cost to minimise, will be a growing need for architects and engineers with a grounding in the fundamentals of network construction, not just platform tooling.

Offshoring cyber comes under the microscope – Penny

Confidence: 6/10

The offshoring of cybersecurity has really picked up pace in the last couple of years. The CEO of a tech skills-focused non-profit told me recently: “Across the companies that we work with across the past year, I’ve seen 100,000 jobs offshored in technology that used to sit in the UK.”

Anyone in doubt as to why offshoring cybersecurity can be a terrible idea for both the offshorer and the taxpayer should ask M&S, or perhaps JLR, about the long-term costs of the cyberattacks that reached both companies via their India-based cybersecurity outsourcer.

There are other, more immediate issues created by outsourcing such a critical function. Each company saves money and gets the kind of cost predictability investors like. However, the loss of institutional and cross-functional knowledge can be terribly expensive in the long term. Outsourcing extends the attack surface across company and country boundaries. Access control, visibility and, crucially, accountability are diluted. We have seen this happen multiple times this year.

The Cyber Security and Resilience Bill aims to boost the UK’s resilience, but implementing it is going to be much harder when so many of the organisations securing UK business and CNI are based overseas. The CEO quoted above thinks the government is in denial about the scale of the problem.

“Offshoring is just not spoken about in government,” she said. “It’s purely a cost-cutting measure for companies and if you were to ask them why they would say it’s in response to the strengthening of labour laws since Labour came into power and the increase in national insurance requirements.

“But if the government don’t keep an eye on this, all entry-level talent will be offshored in the future. It does go in cycles but to bring that talent back onshore will cost an epic amount of money.”

I think that as the Bill proceeds through the legislative process, as third-party attacks keep landing into next year, and as mid-career talent becomes harder to find, both government and private industry will start to re-examine the long-term costs of outsourcing cyber and follow the lead of M&S, which terminated its contract with TCS earlier this year.

At least, I hope they will.

Quantum computing sees a surge in investment and use cases – John

Confidence: 7/10

Quantum computing, like AI, is nothing new. Unlike the world's current hottest tech, though, it has yet to have a ‘ChatGPT moment’ to bring it to the attention of the masses. That may be about to change.

To clarify, few of us are ever likely to wake on Christmas morning to find a gift-wrapped quantum computer waiting under the tree. Quantum's user base consists mainly of scientists and researchers in areas like materials science, energy, power grids and engineering - and this is not about to change. But within those fields there is a growing expectation that we are on the cusp of a step-change in the reliability and capabilities of quantum computers.

This optimism comes from advances in the past 12 months by separate groups of researchers in the field of quantum error correction (QEC), which have both accelerated the process and reduced the number of errors in quantum calculations by orders of magnitude. There has also been progress – albeit still most definitely at the lab stage – in addressing quantum’s scalability issues, with tentative steps to link quantum processors together into clusters, and in the use of AI to draft and test promising algorithmic approaches. Early signs of "quantum advantage" (capabilities beyond classical computing) have been claimed in some specialised areas.

Amidst these advances, industry players are starting to publish roadmaps to more generally useful machines by the end of the decade. Stir in a pinch of fear factor (who wants to be caught on the hop when a rival power can crack widely used cryptosystems?), a large dollop of potential geopolitical and commercial advantage and the possibility of cracking problems like carbon capture that have so far proved intractable, and you have a tempting treat for capital. Public and private money is starting to flow into quantum computing research, and we expect 2025's rapid progress to continue on its hockey-stick-shaped trajectory on several fronts.

The datacentre pushback will begin – John

Confidence: 5/10

The main character in many of JG Ballard's unsettling dystopias is the modern built environment - bland holiday complexes, underpasses, top-end high-rise blocks, vast identikit shopping malls. In these highly engineered spaces, the human actors gradually degenerate, losing the patina of civilisation and turning to recreational violence, sexual perversion, wanton destruction and rampant drug use to get their kicks, eventually turning on the facilities themselves. Fiction, of course, but what better representative of the modern built environment in 2026 than the datacentre: the sprawling, impenetrable, unknowable box, examples of which are popping up around urban spaces like mushrooms, their customer base, power draw and water use often closely guarded secrets?

In the US, which has far more datacentres than anywhere else, as well as looser planning laws and bigger tax incentives, residents are starting to get angry, complaining of water shortages, noise, pollution and higher electricity prices, and protesting that promised jobs have failed to materialise or gone to outsiders. Even some Republican politicians are calling for a pause.

The UK, which has a lot less land, has designated datacentres as critical national infrastructure, to be pushed through as a planning priority. Expect the first real flames of resistance to be lit in 2026.