Met Office CIO Charles Ewen: better forecasts with extremely big data and seriously super computers
Few organisations have a bigger data challenge than the Met Office, but as Charles Ewen explains, the organisation has never been better equipped to tackle it than it is now
"I'm a Devon native who started life as an electronic systems engineer, and now I'm managing supercomputers," says Charles Ewen, the CIO at the Met Office, who is ultimately in charge of making sure that the most sophisticated computer systems in the UK deliver results - including the organisation's latest toy, its $87m Cray XC40 supercomputer.
And those results aren't daily, hourly or tied to scheduled weather bulletins, but must be churned out continually for a range of "customers" that, increasingly, are automated rather than human. Furthermore, the Met Office is not solely interested in the weather and climate across the UK, but across the world and is even taking an interest in "space weather".
Indeed, it's hard to describe the Met Office's "business" - and its IT requirements - simply these days, so diverse have its activities become. And it's no longer enough for the Met Office simply to crunch the numbers and spit out accurate weather forecasts, says Ewen.
Super power
The Met Office's supercomputer purchase is intended to enable it to follow and forecast weather and climate globally at an ever more granular level. "Effectively, what we do is take the globe and, over the surface of the globe, lay grid squares. That grid is then pulled up into the atmosphere in more than 70 layers - and down into the oceans for climate work. If you do the sums, you come up with about 500,000 points of calculation," says Ewen.
"So, you're applying those differential equations at each of those nodes, then you're stepping that model through time. The time-steps then vary inside that model. There are some things, such as cloud formation, that you do need to recalculate every few minutes; while other things, like pressure, that do not have to be calculated as frequently.
"And every node of calculation kind-of needs to know the answer for every other adjacent node before it can confidently progress because their calculations are dependent upon the answer of their near neighbours," says Ewen.
Those kinds of interconnected, deeply complex calculations don't lend themselves to the cloud, due to the fundamental mathematics and its implications in terms of input/output and latency. "You need a very highly parallel architecture with very well-known characteristics. The computational capacity isn't the defining problem.
"It's more about data exchange between nodes. Getting the data, quickly and timely, from one node to another is more of an issue than the sheer calculating capacity," he says.
Furthermore, with the retirement of the Met Office's IBM supercomputers after five years of service and their replacement with Cray XC40s, the level of compute capacity at the fingertips of the organisation's many scientists has increased several times over. That has an impact not just on the detail, but also on the complexity of the forecasts and climate work that the Met Office is able to do.
"The IBM supercomputer has a combined computational capacity of about 1.2 petaflops. That's 1.2 times 10 to the power of 15 floating-point operations per second. The new supercomputer, when it's fully commissioned in 2018, will have approaching 20 petaflops. Our existing mass storage system is 60 petabytes in capacity. Over the same period, our mass storage systems will approach an exabyte," says Ewen.
One way to use that new computing power will be to shrink the "grid squares", giving a higher-resolution, more finely grained forecasting capability - although Ewen is keen to stress that this alone does not necessarily mean more accurate weather forecasts. "But every time you make those grid squares smaller you are generating hugely more data," he says.
"As a rule of thumb, a doubling of resolution requires about 10 times the computational capacity. So moving from five-and-a-half kilometre squares to two-and-a-half kilometre grid squares means 10 times the computational capacity. Mathematically, it ought to be eight times, but then you have to take into account inefficiency overheads," says Ewen.
That more fine-grained model of global weather is one way in which the Met Office is planning on "spending" its new supercomputer resources. It will enable the Met Office to better forecast weather features, such as convective thunder storms, which are typically just a few kilometres in scale and hard to forecast with any accuracy - until now. The increased capacity will enable more scientific advances in this area to be made, he believes.
Already, though, Ewen and the Met Office's not inconsiderable IT department (you can scarcely run a supercomputer on just a handful of staff) are looking forward to the next generation of supercomputing and how this might completely change their approach to weather forecasting. The supercomputer itself is only one part (albeit a big part) of the wider IT platform at the Met Office.
"As you increase resolution, eventually this model of putting grids over squares breaks down as the points of convergence of the grids becomes too dense. So we're working on a science programme to look at future dynamical core structures," says Ewen.
The Met Office's models also encompass information that reaches into deep space. "We monitor the sun as part of what we do, and we are looking for big solar events like coronal mass ejections - massive bursts of gas and magnetic field arising from the solar corona and released into the solar wind - and so we need to quickly pick up unusual characteristics in data streams."
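Ewen doesn't say which detection techniques the Met Office uses, but a rolling z-score is one classic way to flag unusual characteristics in a live stream. A minimal sketch:

```python
# Minimal anomaly-detection sketch (an illustration -- Ewen doesn't
# specify the Met Office's techniques). A rolling z-score flags values
# that sit far outside the recent history of a data stream.
from collections import deque
import math

class StreamMonitor:
    def __init__(self, window=100, threshold=4.0):
        self.history = deque(maxlen=window)   # recent observations
        self.threshold = threshold            # std-devs that count as unusual

    def observe(self, value):
        """Return True if this reading looks anomalous versus recent history."""
        anomalous = False
        if len(self.history) >= 10:           # wait for a little history first
            mean = sum(self.history) / len(self.history)
            var = sum((x - mean) ** 2 for x in self.history) / len(self.history)
            std = math.sqrt(var)
            anomalous = std > 0 and abs(value - mean) / std > self.threshold
        self.history.append(value)
        return anomalous
```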
Extremely big data
Today, says Ewen, a typical weather forecast weighs in at around 400GB per run, and the Met Office runs the forecast several times a day using a number of different model configurations within its 'unified model' regime.
"We run weather forecasts using the 'Ensemble technique'. Rather than running a forecast once, we run the same model a number of times and, every time we run it, we make slight variations in initial conditions to explore how much that might change the forecast. After all, the weather is a classic example of a chaotic system," says Ewen.
"If we get an 'ensemble' of forecasts that all say very different things, then we can't really have any confidence in any of them. But if we run that forecast a number of times with slightly different assumptions and get much the same answer, then we can have a high degree of confidence in all of them," says Ewen.
The approach generates useful information around probabilities and confidence, which can be delivered as part of the forecast.
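A toy version of the idea - using the logistic map as a stand-in chaotic system, not a real weather model - shows how ensemble spread translates into confidence:

```python
# Toy ensemble forecast, for illustration only. The logistic map stands in
# for the atmosphere as a classic chaotic system.
import random

def model(x, steps=50, r=3.9):
    """Step a chaotic system forward in time from initial condition x."""
    for _ in range(steps):
        x = r * x * (1 - x)   # chaotic behaviour for r around 3.9
    return x

initial = 0.5
# Run the same model many times with tiny perturbations to the initial
# conditions, exactly as the ensemble technique prescribes.
ensemble = [model(initial + random.uniform(-1e-6, 1e-6)) for _ in range(20)]

spread = max(ensemble) - min(ensemble)
print(f"ensemble spread: {spread:.3f}")   # small spread = high confidence
```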
And it's not just about generating a forecast that the Met Office's scientists can be confident with, but also transmitting that forecast to the right people both inside and outside the Met Office.
"It's not just that the forecast is big, but also, no one wants yesterday's weather forecast. It's not just about moving 400GB of data, it's about moving that data quickly enough so that the Met Office and others can do something useful with it. That's a growing challenge," says Ewen.
So, as weather forecasts get bigger and more complex, the Met Office has to devise ways to deal with ever-bigger data. That is easier said than done. The aim is to pass on as much information as quickly as possible and, at the same time, reduce data-set size to a minimum.
"There are two fundamental approaches to achieve this. The first is to be more selective about the data that is sent. The second is to bring problems - or algorithms - to data," he says.
"We are beginning to use geospatial standards. Instead of taking large datasets and extracting the information that you need from them at destination, increasingly we're subsetting data of interest at source. The subsetting domain could be geospatial or temporal," says Ewen.
"Over the course of the next decade or so, even the data required to do that will get too big and too unwieldy," he warns. "We're looking at emerging technologies that are about truly bringing problems to data.
"There will be applications in which the most efficient method will be to operate algorithms and smaller data against very big data, at source. We already have some early examples of these using emerging technologies," he says.
And the Met Office also has many different customers demanding different weather and climate data, from the armed forces and the Ministry of Defence, of which it used to be a part, to commercial organisations with their own considerations. Furthermore, demand for its services is growing all the time.
All of that raises questions about big data. Indeed, given the humongous volumes of data that the Met Office generates and deals with every day, Ewen has seen a broad shift from the "firehose" approach to more targeted "intelligent data services", in which organisations pull in data dynamically to solve specific problems or challenges.
Ewen is also looking at approaches in which an organisation might provide its algorithm to run against the Met Office's vast dataset, feeding back the answer(s) accordingly. "We've already had some examples of people sending us their smaller datasets and their algorithm. We operate that algorithm against our bigger dataset with their dataset, and give them back the answer. In other words, bringing the problem to the data, rather than the other way round."
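A stripped-down sketch of the pattern - a schematic illustration of the general idea, not the Met Office's actual service:

```python
# 'Bringing the problem to the data', reduced to its essentials.
# Schematic illustration only, not the Met Office's implementation.
import numpy as np

BIG_DATASET = np.random.rand(1_000_000, 4)   # stands in for a huge archive

def run_at_source(algorithm, small_dataset):
    """Run a customer's algorithm next to the big dataset and return only
    the (small) answer -- the archive itself never leaves the building."""
    return algorithm(BIG_DATASET, small_dataset)

# The customer ships a small dataset plus an algorithm, and gets a result.
thresholds = np.array([0.5, 0.9, 0.99])
answer = run_at_source(
    lambda big, small: [float((big[:, 0] > t).mean()) for t in small],
    thresholds,
)
print(answer)   # e.g. the fraction of records exceeding each threshold
```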
Bringing external problems (algorithms) to data held at the Met Office introduces issues of security. An external organisation will obviously expect its intellectual property - the algorithm - to be protected throughout the process. Thinking ahead, Ewen believes that technologies like blockchain and secure ledgers could have a role to play. Another area of immediate interest is the rapidly developing field of containerisation.
Data science in a scientific organisation
Obviously, with all this technology generating multiple petabytes of data for analysis, the question of data science arises: does an organisation that does so much proper science actually need data scientists?
For Ewen, the answer is yes and no.
"You can think about data at the Met Office in three 'domains'," says Ewen. "The first thing is data analysis to describe what has happened and what is happening. The Met Office's observations programme, for example, is all about establishing a picture of the atmosphere. That's one domain.
"The second is why did it happen? And that's a different question to 'what's happened'. Then, based on understanding what happened and why it happened, you stand a fighting chance of predicting what will happen. So, three clear domains: what happened, why did it happen and what will happen?
"If you break those domains down, and you ask the question, what would a data scientist add? Well, in the area of what's happened, potentially quite a lot because there's quite a lot of value to be added in the 'what happened?' area. And, because it's largely the realm of statistics, and big data and all the kinds of things that data scientists are about, that's good."
If the question is 'why did it happen?', continues Ewen, that's largely driven by the laws of physics and highly specialist knowledge, "so rather than have a data scientist answer that question, you're much more likely to get a better answer if you have a physicist answering that question", he says.
"Consequently, in the domain of 'why did it happen?' for us - and this won't be the same for everybody - data scientists probably wouldn't be much use.
"In the domain of 'what will happen?', purely on weather forecasts, once again, they'd probably not be much help. But if you think more broadly that's not the question that people are often asking. People are asking questions like 'will my high street likely be busier tomorrow than it was today?'," says Ewen.
That might involve basic climate data, but mixed with the business data that the average data scientist ought to be most comfortable working with. "If you're looking for an answer to that kind of question, a data scientist potentially has tremendous use, because they can use their statistical, correlative, non-diagnostic techniques to make predictions - but in the day-to-day business of weather forecasting and climate projections? Not so much."
The key to data science for most organisations, believes Ewen, is making sure that the data science specialists they recruit really understand the organisation and its mission from top to bottom - which means assessing not only their skills, but also their ability to apply those skills usefully.
After all, any number of correlations can be drawn from a wealth of data, but what a business (or any organisation) really needs is causation - real connections that are more than just coincidence. "If your business is science, then you predominantly need scientists," advises Ewen.
The Met Office's deputy director of applied science and scientific consultancies will be presenting at Computing's Big Data & Analytics Summit, along with a number of other big-name experts from the public and private sectors. Attendance is free to qualifying end-users, so book your place now before they all go.