"Fast data" represents the next stage in the evolution of big data, Tibco Software CTO Matt Quinn has told Computing, and offers a real advantage for businesses, especially those in the competitive retail sector.
Quinn made the comments in an interview with Computing in Paris at Transform 2014, one of the stops on the US software solutions provider's European tour.
Earlier in the day he'd spoken about the rise of the Internet of Things and the challenges it represents, but argued that, if properly harnessed, the power of rapid analytical tools can give an organisation the extra edge it needs to gain what he called a "two-second advantage" over its rivals.
That, Quinn said, marks an evolution from the established way of deploying a big data solution, which he argued too often sits apart from the rest of an organisation's business strategy.
"I think the biggest challenge that people have had with big data is it's been a bit of a science experiment," he told Computing.
"You take a bunch of data, you build a bit of analytics on top of it with new tools, new technologies, answering questions you couldn't answer before, which is fantastic, but it was separate from business operations."
Quinn argued that in order for an organisation to properly take advantage of its data – which he said could have a shelf life lasting only minutes – getting analytical results in the fastest time possible is key.
"There's two aspects to fast data," he explained. "Yes, it's the blending of real-time information with historical context to make the right decisions. But the important thing is making decisions and acting on it in real time, which pushes it all the way into business operations."
Using the example of a retail environment, Quinn explained how he believed fast data about a customer could be used at the checkout to make an extra sale.
"So the way to think about it is if I'm a retailer and I want to upsell or cross-sell, that's really what we're talking about, fast data enables this," he said.
"But in order to make a real-time upsale or cross-sale, you have to do it at the moment of interaction, so it pushes big data squarely into real-time operations, enabling decisions that are going to benefit the business versus some sort of offline analysis."
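The checkout scenario Quinn describes boils down to a decision made at the moment of interaction, blending a real-time signal (the current basket) with historical context (the customer's profile). A minimal sketch of that pattern might look like the following; all names, data, and thresholds here are illustrative assumptions, not Tibco's implementation.

```python
# Hypothetical sketch of the "fast data" pattern: blend a real-time
# event with historical context and decide at the moment of interaction.
# The data and threshold logic below are purely illustrative.

HISTORY = {  # historical context, e.g. from an offline analytics store
    "cust-42": {"favourite_category": "coffee", "avg_basket": 18.50},
}

OFFERS = {  # offers keyed by product category (illustrative)
    "coffee": "10% off a bag of espresso beans",
}

def upsell_at_checkout(customer_id, basket_total):
    """Return an upsell offer, or None, at the moment of interaction."""
    profile = HISTORY.get(customer_id)
    if profile is None:
        return None  # no historical context, so make no offer
    # Real-time signal: the basket is below this customer's usual spend,
    # so there is headroom for a cross-sell.
    if basket_total < profile["avg_basket"]:
        return OFFERS.get(profile["favourite_category"])
    return None

print(upsell_at_checkout("cust-42", 12.00))  # offers the coffee deal
print(upsell_at_checkout("cust-42", 25.00))  # None: basket already large
```

The point of the sketch is timing, not sophistication: the same lookup done in an overnight batch job would produce the same offer, but only a decision made while the customer is still at the till can change the sale.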
Quinn argued that this marks a massive improvement on the established methods of analysing big data.
"That's been the big issue with big data – we can find the needle in the haystack, but what are you going to do next with it? So fast data isn't the next thing, it's the evolution of big data," Quinn continued. "This is the point of inflection where big data becomes real, where people are actually going to interact with it in real time versus just doing some offline analysis."
Quinn said "fast data" should not be dismissed as yet another buzz-phrase, arguing that it represented something that Tibco customers have been doing – and apparently benefiting from – for some time.
"What we've done here with fast data is we've put a label on stuff our customers already do today. This isn't about news and new things, this is about putting out a solid message that's easily consumed for what our customers have already been doing for years and years," he said.
"Yes, we continue to invest in the technology, yes we've made great strides around things like in-memory data architectures, which is a key piece. But this is really about taking all the good stuff our customers have done – for decades in some cases – and giving it a label."