The biggest big data projects

By Stuart Dommett
20 Jun 2014

From the dawn of time until 2003, humankind created roughly five exabytes of data. Yet 90 per cent of the world's data has been created in the last few years, and this digital universe is set to double in size every two years.

So, whether we read about it or work with it, big data is now part of our daily lives. But some big data is bigger than others, as these projects illustrate.

 

CERN and the Large Hadron Collider


Credit: http://cdsweb.cern.ch/record/628469

CERN's Large Hadron Collider, the world's largest and most powerful particle accelerator, generates around 600 million particle collisions a second, producing 25 petabytes of data a year.

That is too much for CERN's own 90,000 processor cores, so the data is distributed to and analysed by a network of 150 computing facilities across 40 countries.
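To put the annual figure in perspective, a rough back-of-envelope calculation, assuming purely for illustration that the 25 petabytes were spread evenly across a full year, gives the average sustained rate:

```python
# Back-of-envelope arithmetic: what 25 petabytes a year looks like as an
# average data rate. A rough illustration only, not a description of CERN's
# actual (highly bursty) data pipeline.

PETABYTE = 10**15                      # bytes
data_per_year = 25 * PETABYTE          # annual figure quoted above
seconds_per_year = 365 * 24 * 3600

average_rate = data_per_year / seconds_per_year
print(f"Average rate: {average_rate / 10**6:.0f} MB per second")
# roughly 790 MB every second, around the clock
```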

 

United Nations Global Pulse


Credit: https://openclipart.org/detail/23722

Of the seven billion mobile phone subscriptions worldwide, 5.4 billion, or 78 per cent, are in developing countries, and all of this mobile activity generates a mountain of data. As the use of mobiles and mobile services continues to rise in the developing world, there is an opportunity to use some of this wealth of data to gain real-time insight into human well-being.

Global Pulse functions as an R&D lab, working on a project-by-project basis with UN agency partners to test new approaches and data sources.

 

NASA


Credit: extracted from http://www.hq.nasa.gov/office/pao/History/alsj/a410/AS8-14-2383HR.jpg

Between laser ranging to the Moon, surveying polar ice and simulating the climate, NASA has to analyse nearly an exabyte of data.

The NASA Center for Climate Simulation alone stores 32 petabytes of climate observations and simulations on its Discover supercomputing cluster, which uses more than 35,000 processor cores to perform more than 400 trillion floating-point operations per second.

 

Procter and Gamble and Monte Carlo simulation


Credit: National Nuclear Security Administration / Nevada Field Office

Rather than relying on human judgement, Procter and Gamble uses Monte Carlo simulations to gauge the demand and risk of new products and estimate their average returns.

The method, named after Monaco's famous gambling district, was developed in the 1940s to estimate the probability that the chain reaction needed for the atomic bomb would proceed as intended.
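A Monte Carlo analysis of this kind is straightforward to sketch in code. The example below is a minimal, hypothetical illustration of the approach, not P&G's actual model: the demand, price and cost distributions, figures and variable names are all illustrative assumptions.

```python
import random

# Hypothetical Monte Carlo sketch: repeatedly simulate a product launch with
# uncertain demand, price and cost, then summarise the average return and the
# risk of a loss. All figures are made up for illustration.

TRIALS = 100_000

def simulate_one_launch():
    """Draw one possible outcome for a hypothetical product launch."""
    demand = max(random.gauss(1_000_000, 250_000), 0)   # units sold (uncertain)
    price = random.uniform(2.50, 3.50)                   # selling price per unit
    unit_cost = random.uniform(1.80, 2.40)               # cost per unit
    fixed_costs = 400_000                                # launch and marketing spend
    return demand * (price - unit_cost) - fixed_costs

results = [simulate_one_launch() for _ in range(TRIALS)]

average_return = sum(results) / TRIALS
risk_of_loss = sum(r < 0 for r in results) / TRIALS

print(f"Average return: {average_return:,.0f}")
print(f"Probability of a loss: {risk_of_loss:.1%}")
```

Running many thousands of such trials turns a handful of uncertain inputs into a full distribution of outcomes, which is what lets the technique quantify risk rather than produce a single point estimate.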


Self-driving cars

self-driving-car

Credit: https://www.flickr.com/photos/44124348109@N01

Big data is driving autonomous vehicles.

Google's self-driving car project, which uses a LIDAR (light detection and ranging) system, numerous sensors and high-resolution maps, is the best-known example, but many major automotive manufacturers, such as Volkswagen, Nissan and BMW, are experimenting with the same technology.
