The adoption of big data analytics has allowed insurers to get a more accurate picture of the risk profiles of individual buildings.
That's what Nigel Davis, head of platforms and delivery at insurance firm Willis, said during his "Big data for risk analytics" presentation at Computing's Big Data Summit, explaining how data sources including satellite imagery, simulation models and even blueprints of building interiors are giving insurers more certainty when it comes to managing risk.
"Technology was definitely a barrier. We had to do things in much fewer simulations or analysis was coarser. If you were thinking about modelling a flood the risk assessment would be much more coarse resolution than what we can achieve now," he said, responding to a question from the audience.
But thanks to big data analytics, insurers can get access to highly precise geographic information which can allow them to assess the flood risks that apply to individual buildings rather than a general area, Davis explained.
"Now it's being worried about individual buildings and flood modelling touching parts of buildings, rather than a dot on a map that's either in or out. Previously we were very, very limited with the simulations and the resolution of the data we'd got.
"There's less uncertainty in the modelling now and the science behind it, because it's more computational, we can produce a systemic risk assessment," he said.
That means those who wish to have their home insured will need to provide more than just the postcode.
"Previously as a homeowner, you're insuring your property and you're not very precise about where you live if you're only giving it at postcode level," said Davis.
"Your policy was only telling me this postcode and there isn't much point using that for a flood risk assessment because the uncertainty about where the building is is so big."
Now, however, big data analytics can pinpoint the exact location of a building and assess the specific risks to it before a policy is priced.
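Conceptually, the shift Davis describes is from a postcode-level lookup to a geometric test against a building's actual coordinates. The sketch below is illustrative only: the flood-zone polygon, addresses and coordinates are invented, and real insurers would use far richer models. It uses a standard ray-casting point-in-polygon test to decide whether each building footprint falls inside a modelled flood zone, rather than treating a whole postcode as simply "in or out".

```python
def point_in_polygon(x, y, polygon):
    """Ray-casting test: is the point (x, y) inside the polygon
    given as a list of (x, y) vertices?"""
    inside = False
    n = len(polygon)
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]
        # Count crossings of a horizontal ray cast rightwards from (x, y)
        if (y1 > y) != (y2 > y):
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:
                inside = not inside
    return inside

# Hypothetical flood-zone polygon and two buildings in the same postcode
flood_zone = [(0.0, 0.0), (4.0, 0.0), (4.0, 3.0), (0.0, 3.0)]
buildings = {"1 River Lane": (1.5, 1.0), "9 Hill Road": (6.0, 5.0)}

for address, location in buildings.items():
    at_risk = point_in_polygon(*location, flood_zone)
    print(f"{address}: {'inside' if at_risk else 'outside'} flood zone")
```

The point is that two properties sharing a postcode can receive different risk assessments once the model operates on exact coordinates, which is the reduction in uncertainty Davis refers to.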
"It's definitely reduced the uncertainty levels that we have," Davis concluded.