Advanced areas of science such as astronomy are pushing the boundaries of big data in mind-blowing ways, thanks to the scale of the data being collected by astronomers and cosmologists.
That's what Robert Bath, vice president of engineering Europe for data centre solutions provider Digital Realty, told the audience at Computing's Big Data Summit 2014, as part of his talk on big data storage architecture.
"Astronomy is a very interesting area that's pushing the big data issue in a way that's quite mind-blowing," he said.
"A good example of that is the SKA telescope [the Square Kilometre Array] which is effectively a square kilometre radio telescope which is being deployed in Australia and South Africa," Bath told the audience, explaining that the amount of information that needs to be collected and processed is colossal compared with the vast majority of other big data projects.
"It effectively creates circa one exabyte of data per day and the challenge is how that's addressed in terms of the distributed dishes and the arrays. All of that is a uni-directional dataflow which travels back to supercomputer centres. Then it's about how it's distributed to a global network which is a very interesting part of the space."
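To put the "circa one exabyte of data per day" figure into perspective, a quick back-of-envelope calculation shows the sustained throughput it implies. This is purely illustrative arithmetic based on the speaker's rough estimate, using decimal (SI) units:

```python
# Back-of-envelope: converting "one exabyte per day" into a
# sustained data rate. Illustrative only; the exabyte figure is
# the speaker's "circa" estimate, and decimal units are assumed.

EXABYTE = 10**18                    # bytes (decimal definition)
SECONDS_PER_DAY = 24 * 60 * 60     # 86,400 seconds

bytes_per_second = EXABYTE / SECONDS_PER_DAY
terabytes_per_second = bytes_per_second / 10**12

print(f"{terabytes_per_second:.2f} TB/s")  # roughly 11.57 TB/s sustained
```

In other words, the distributed dishes and arrays would need to move on the order of tens of terabytes every second back to the supercomputer centres, around the clock.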
Bath went on to describe the Square Kilometre Array as "a truly incredible proposition", adding: "they're anticipating that the central computers, which will be situated in Cape Town and Perth, will effectively have the processing power of one hundred million PCs".