In what is inevitably (but erroneously) being touted as a victory for "big data" over political punditry, US statistician and blogger Nate Silver has been thrust, blinking through erudite specs, into the media spotlight. The reason? Mr Silver correctly predicted the results in every single one of the 50 US states in yesterday's presidential election.
What's more, he did so well before the media frenzy that marked the climax of Barack Obama's re-election on Tuesday. If that achievement weren't enough, in 2008 Silver correctly foresaw the results of all 35 US Senate races – and the presidential results in all but one state.
Refreshingly, in the febrile world of political analysis, Silver's secret is most definitely of the "what you know" rather than "who you know" variety. Indeed, in his New York Times blog Silver cheerfully lays out the basis of his methodology, in contrast to the usual media practice of jealously protecting one's sources and means.
With the success of his recent predictions, and the mathematical rather than journalistic basis of his analyses, Nate Silver and his ilk represent a challenge to 24-hour rolling news channels, whose currency is immediacy and which depend on credible presenters painting a picture of sands constantly shifting beneath the protagonists' feet. After all, it's hard to fill a 12-hour TV slot with a foregone conclusion.
Rather than supporting this news-friendly picture of voters stricken by last-minute indecision, Silver's numbers, drawn from historical election data, point to political opinions that are generally pretty stable. They also suggest that the "undecided voter" is, to some extent, a phantom created by putting undue weight on the most recent polls. Allegiances do change, for sure, but once other seemingly unrelated factors are taken into account, they do so in a way that is more predictable than we are often led to believe.
It is this aggregation of many data sources to define the bigger picture that has led many headline writers to reach for the fashionable term "big data", but in truth what Silver has achieved could be done in something as unfashionable as Excel. A deep understanding of the data available to him and a tried and tested statistical model have been the key factors in his success, rather than an obsession with the technology deployed to process it.
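To illustrate the point that the core idea fits in a spreadsheet, the sketch below shows one simple way to aggregate polls: weight each poll by its sample size and discount older polls. This is a hypothetical illustration, not Silver's actual model; the half-life parameter and the poll figures are invented for the example.

```python
# A minimal, hypothetical poll-aggregation sketch (not Nate Silver's model).
# Each poll: (candidate's reported share in %, sample size, days before election).
polls = [
    (51.0, 800, 2),
    (49.5, 1200, 5),
    (52.0, 600, 10),
]

def weight(sample_size, days_old, half_life=7.0):
    """Weight a poll by sample size, halving its influence every `half_life` days."""
    recency = 0.5 ** (days_old / half_life)
    return sample_size * recency

# Weighted average of the polled shares.
total_weight = sum(weight(n, d) for _, n, d in polls)
estimate = sum(share * weight(n, d) for share, n, d in polls) / total_weight
print(f"Aggregated estimate: {estimate:.1f}%")
```

Even this toy version captures the key trade-off: recent polls count for more, but a single fresh poll cannot swamp the accumulated evidence of the rest, which is why the aggregate moves more slowly and smoothly than any individual poll.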
That said, as big data techniques enable practitioners to correlate ever more diverse and disparate data sets, predictive models will become increasingly accurate.
But before political pundits despair of ever being able to voice an opinion again, they should remember that banks' over-reliance on risk models developed by quants (quantitative analysts) not dissimilar to Silver led to the financial crash of 2008.