UKAEA unveils plans for digital twin for nuclear fusion reactor

'We need to change the engineering design process'

Together with Cambridge University, Intel and Dell, the agency is working on virtual systems to accelerate the development of fusion power

The UK Atomic Energy Authority is working on an "industrial metaverse" to simulate the processes in a nuclear fusion reactor in collaboration with Intel, Dell and the University of Cambridge.

UKAEA aims to have a fully functioning fusion power plant providing electricity to the grid by 2040, as part of the UK's net zero goals.

"We've got 17 years to stand up STEP [Spherical Tokamak for Energy Production, the UKAEA's prototype power plant] and plug it into the grid." said Dr Rob Akers, head of advanced computing at UKAEA, during a press call last week.

"This means that we need to think differently, we need to change the engineering design process."

Fusion power could provide an invaluable source of baseload power to back up intermittent renewables. However, despite decades of development, a viable reactor has yet to be produced. The extreme physical conditions required to fuse the nuclei of hydrogen atoms together mean that physical experiments are difficult, expensive and hugely demanding. This makes the technology an excellent candidate for analysis via a digital twin.

Producing a digital twin of STEP with all its complex interacting forces will require some cutting-edge technologies of its own, namely powerful supercomputing, AI modelling and specialised software.

Akers described the task as a "system of systems Grand Challenge".

"A fusion reactor is an incredibly complex, strongly coupled system and the models that underpin the operation of fusion power plants are somewhat limited in their accuracy," Akers said. "It means that we need to exploit the world's largest supercomputers to handle all of this physics and all of this complexity."

Over the next decade, UKAEA plans to use exascale supercomputers, machine learning and other technologies to analyse experimental findings and feed them back into the simulations, turning simulation into "an engineering design tool" and accelerating development.

See also: UK pledges £900 million towards exascale supercomputer

Nuclear fusion is extremely challenging to analyse, and exascale computing brings problems of its own, said Dr Paul Calleja, director of research computing services at the University of Cambridge. As well as being incredibly expensive to purchase and run, supercomputers are very difficult to use effectively, requiring hardware providers, scientists and application developers to work closely together and look at the problem in the round.

"That doesn't happen often," he said."

Calleja's team has been scoping out the problem with UKAEA for four years. They realised they would need GPU technology to provide the required performance, but programming for GPUs is problematic, particularly if one is to avoid being locked into a single vendor's technology.

UKAEA has now contracted Intel and Dell to work on overcoming these issues. Intel's oneAPI environment allows code written to run on an Intel GPU to be adapted to run on GPUs from Nvidia or AMD with very little refactoring, according to Calleja.
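To give a flavour of what that portability looks like in practice, the sketch below is a minimal, hypothetical SYCL/C++ kernel of the kind the oneAPI toolchain compiles. It is not UKAEA code, just a simple vector addition that runs on whichever accelerator the runtime exposes, so the same source can in principle be rebuilt for Intel, Nvidia or AMD back ends.

    #include <sycl/sycl.hpp>
    #include <iostream>
    #include <vector>

    int main() {
        // The default selector picks whatever accelerator the runtime exposes;
        // the same source can be built for Intel, Nvidia or AMD devices.
        sycl::queue q{sycl::default_selector_v};
        std::cout << "Running on: "
                  << q.get_device().get_info<sycl::info::device::name>() << "\n";

        std::vector<float> a(1024, 1.0f), b(1024, 2.0f), c(1024, 0.0f);
        {
            sycl::buffer<float> A(a.data(), sycl::range<1>(a.size()));
            sycl::buffer<float> B(b.data(), sycl::range<1>(b.size()));
            sycl::buffer<float> C(c.data(), sycl::range<1>(c.size()));
            q.submit([&](sycl::handler& h) {
                sycl::accessor ra{A, h, sycl::read_only};
                sycl::accessor rb{B, h, sycl::read_only};
                sycl::accessor wc{C, h, sycl::write_only};
                // Element-wise addition across 1024 work items
                h.parallel_for(sycl::range<1>(1024),
                               [=](sycl::id<1> i) { wc[i] = ra[i] + rb[i]; });
            });
        }   // buffers go out of scope here and copy results back to the host
        std::cout << "c[0] = " << c[0] << "\n";  // expect 3
        return 0;
    }

The refactoring Calleja describes is then largely a matter of recompiling for a different device back end rather than rewriting the kernel itself, though real fusion codes are of course far more involved than this toy example.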

The Cambridge Open Zettascale Lab, a partnership between the university, Intel and Dell, will also deploy DAOS, Intel's software-defined object store, which Calleja said makes more efficient use of NVMe storage than traditional file systems. For its part, Dell provides the HPC servers and is working on improving data centre efficiency through advanced water cooling techniques.

Keeping the work vendor-agnostic and open source is vital, Akers said. The objective of the industrial metaverse is to enable scientists to interact easily with the system and build on its findings.

"UKAEA isn't going to design and build stack," he said. "It'll be a massively diverse supply chain that we're assembling and nurturing as we speak. So we've got to make sure that the world has access to everything that UKAEA develops with Intel, Cambridge and Dell. We rest upon an army of developers around the world through an open source approach, so we won't be locking up IP as a commercial product, we'll be making sure that everybody has access to it."

Another area of development at the Cambridge Open Zettascale Lab is a private cloud environment for developing middleware, which is built on OpenStack.

"This is another part of the democratisation; you have to make these systems accessible to companies and scientists that are not used to supercomputer technology," Calleja said. "So that middleware layer is really important."

Adam Roe, EMEA HPC technical director at Intel, said there was now a confluence of compute, technology and know-how to inform the simulations, feed in AI-inferred data and accelerate development.

"Now we're at a point where actually the technology available through hardware and software exists for us to be able to start to simulate some of these extremely complex problems like fusion energy."