Despite being an early adopter of high-volume processing, data warehousing and analytics, the financial sector is largely behind the curve when it comes to utilising big data technologies. That's according to Robert Toguri, partner at EY Enterprise Intelligence Services.
Speaking at a CODE_n press event on big data start-ups in London, Toguri said that since the 2008 crash and the tighter regulations that followed, the financial sector has been looking inward, overly focused on ensuring its internal data is compliant with the rules.
"I'm not sure if there's been an impact from big data on the financial services industry yet," he said.
"By 2010/11 the regulators said ‘we don't think you really understand your internal data', and they started a roadmap around data management so they could report their risk positions to investors and the market."
While there are exceptions to this rule - notably in fraud detection and insurance - Toguri said that in general the financial sector has fallen behind.
Before the crash, the investment banking sector was a leader in data warehousing and in-memory databases, being among the biggest users of technologies such as Teradata, Pyramid and nCube to process large volumes of transactions. However, Toguri said, failing to get to grips with the ever-increasing volumes of external data has meant that the sector is missing opportunities both to get its own house in order and to open up new markets.
Citing the examples of the recent LIBOR-rigging, money-laundering and PPI-misselling scandals that have beset the financial sector, Toguri asked: "Is there a case for big data to be used to measure conduct against the legislation? Is there value in watching two traders talking across Twitter or on the phone to see whether or not they are trying to manipulate the market?"
Opportunities are being missed, he suggested, in making more intelligent trades.
"Is there an opportunity for traders to assess risk based on all the social media information that's floating around out there?"
There are, said Toguri, plenty of opportunities for big data start-ups to get engaged with the financial sector in areas such as data governance, visualisation, modelling and management, and to fill skills shortages that exist in these firms.
The big banks have sensed the need to change, he said, and are starting to appoint chief data officers (CDOs) and to bring in experienced data management people from outside the financial sector to sit on the board in COO and similar roles.
"We've seen new entries into this market, new banks and financial institutions and cross-pollination of skillsets."
However, he went on, it is hard for large organisations that have grown rapidly through acquisitions to change quickly.
"They are used to asking the same old questions," Toguri said. "What they need to do is ask a question, get an answer and then ask two more questions. How do you get applications and data to behave in that manner?"
Actuaries are also behind the times, Toguri said, suggesting that a new way of thinking might have helped them prepare for the recent floods.
"They are constrained by relational database thinking. Perhaps if they could have taken the scientific data and leveraged that into a use case where they shift assets according to risk, that would help in natural disasters like this."
Computing's Big Data Summit 2014 takes place in London on 27 March. Register today.