Software definition is the next supercomputing step for the Met Office

Software-defined infrastructure can solve the partitioning challenge and boost security

The Met Office is the UK's weather service, responsible for weather predictions across the country – everything from consumer-level intelligence to shipping and aviation forecasts that allow ships and planes to keep moving.

That is a huge amount of work, and it depends on top-of-the-line compute power. The Met Office began working with Microsoft on a £1.2 billion supercomputing project last year, and has also started investing in software-defined projects to balance productivity and protection.

Speaking at Computing's Cybersecurity Festival last week, Charles Ewen, the Met Office's Director of Technology and CIO, described the organisation's 'Lab & Factory' approach:

"There are all classes of use cases [for Met Office predictions], from the low-level but frequent to the unique but very important, and in that we can't afford to miss a detail anymore. As a cyber and continuity challenge that's super tricky, because on the one hand I need a highly collaborative environment that will allow us to maintain our status as a world-leading research institute in weather and climate; and on the other hand, have a data factory that never misses a skip."

Think of a supercomputer - in greatly simplified terms - as a "great big Linux box". Partitioning in Linux is "tricky," said Ewen, which makes it difficult to combine approaches that both meet the needs of the scientific community and are highly trusted.

One of the benefits of the partnership with Microsoft, where work can be abstracted in the public cloud, is the ability to move into a service and software-configured environment. "Things that I couldn't possibly do with constraints of the number of people and amount of resource and so on and so on, can be done in the future - which will allow me to effectively deliver a laboratory and a factory on the same infrastructure, via configuration."
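As a rough illustration of what "a laboratory and a factory on the same infrastructure, via configuration" could look like, the sketch below defines one shared platform and two configuration profiles. It is a minimal, hypothetical example; none of the names, settings or tooling are the Met Office's own.

```python
# Minimal sketch (not the Met Office's actual tooling): one infrastructure
# template, two configuration profiles. All names and values are illustrative.
from dataclasses import dataclass

@dataclass(frozen=True)
class Profile:
    """A configuration profile applied to the shared infrastructure."""
    name: str
    isolation: str        # how strictly workloads are partitioned
    change_control: str   # how changes are approved
    scheduling: str       # how compute is handed out

# The same underlying platform, differentiated purely by configuration.
LAB = Profile(
    name="lab",
    isolation="project-level sandboxes",
    change_control="self-service, peer review",
    scheduling="best-effort, pre-emptible",
)

FACTORY = Profile(
    name="factory",
    isolation="hard partitions, no shared tenancy",
    change_control="gated release pipeline",
    scheduling="guaranteed, deadline-driven",
)

def provision(profile: Profile) -> dict:
    """Render the shared platform definition under a given profile."""
    return {
        "platform": "shared-hpc-and-cloud",   # one underlying estate
        "profile": profile.name,
        "isolation": profile.isolation,
        "change_control": profile.change_control,
        "scheduling": profile.scheduling,
    }

if __name__ == "__main__":
    # Lab and factory come from the same definition; only configuration differs.
    for profile in (LAB, FACTORY):
        print(provision(profile))
```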

Running both lab and factory on the same infrastructure brings huge benefits in workload migration.

"Anyone who's worked in technology will be familiar with the concept of a dev stack, a pre-prod stack and a production stack. The problem with that, in the past, has been being able to afford to keep three insulated stacks like one another and relevant for the production environment.

"My experience in the past has been you'll develop something in dev, you'll test it in pre-prod and then you'll push it live; and for some unforeseen reason, largely to do with the fact that the pre-prod environment isn't fully representative of the production environment, something big goes wrong."

Working through hyperscale cloud providers like Microsoft Azure and AWS makes it "very easy" to layer on security regimes, which means that workload migration then becomes the risk.

The Met Office will continue to develop the Lab & Factory approach over the next decade. To learn more, watch the video of Ewen's conversation with Computing's Stuart Sumner below.
