Businesses are still grappling with how to unlock big data. A recent report by SAS blamed a lack of skills for holding them back, arguing that education is needed for big data to realise its potential.
Not everyone agrees that skills are the key factor. Autonomy Chief Architect Fernando Lucini, for example, told Computing that a skills-based approach to big data - one built on hiring data scientists - will lead to a world where "we're going to have a middle layer of people" who might not actually be needed.
He does believe, however, that we need a new way of processing big data.
"The more I think about it the more it becomes clear - and certainly it is with my customer base - that we're all working together in the IT industry, all creating this data and at the same time we're the consumers of it," he said, arguing that training staff to manually process and analyse big data is a counter-evolutionary step.
"Technology moves on, makes life easier, skills transfer - all that is going in one direction. So do we really think that we're going to dial back this evolutionary move where we create technology that allows us to have a better reach, and make it more manual?
"The world around us is going to get better and the challenges of having more information will be solved by technology. We as individuals will get better and better at using these technologies. When I say better and better, we need technology that actually understands information. Why? Because understanding information at a rate that a human can't do means the human can concentrate on the highest added-value part of their world and let the machine bring the key items of information."
Lucini likens data scientists to the telephone desk operators or typing pools of the 1950s - employees who will be made obsolete by the march of time and technology. He argues that people adapt to new tools whenever possible, and it'll be the same for big data software.
"We're all using desktop tools to get the information on our desktops. We're not employing people to do that for us, we're employing technology! But how different is big data from the challenge of the data in everybody's laptop? If you tell me that's little data then I don't think so! When I do backup there are gigs and gigs of incompressible data! I'm not employing ten people to make sense of it for me; I'm using technology to do that."
Lucini argues that the right software gives users the power to process big data for themselves, cutting out a potentially inefficient middle layer of manual operatives. With the correct software in place, he says, machines rather than humans can do the hard work.
"We're going to have technology that can reach. Ours does it today, we can run it and it'll consume any volume of information, take it all, form an understanding and get ready to solve problems."
"Then, when the user is trying to create a marketing campaign, or is trying to look at DNA strands for research, we're ready to provide the needle in the haystack. That sounds like the romantic notion that the information was lost: it wasn't. It was always there, it just didn't need fifty people to find it, it just needed the user."