IBM is hoping the launch of its long-awaited Shark storage subsystem will give it some much-needed bite in the storage marketplace, writes Peter Branton.
A multi-platform storage subsystem, Shark will replace Tarpon, the IBM Versatile Storage Server launched last year. Unlike products from rivals Hitachi Data Systems and EMC, Tarpon did not allow users to connect to mainframe platforms, which left IBM at a disadvantage.
IBM claims Shark will solve this, allowing users to connect to S/390, Windows NT and different flavours of Unix, and share information between them. Big Blue expects to unveil Shark, which is currently in beta test, at the end of July, although it is not clear when the product will become generally available.
'Shark will be a major advance on what is in the marketplace,' said Christoph von Gamm, communications manager for IBM's technology group. 'We have made enhancements in a number of areas, including data sharing.'
'IBM is long overdue to have a product in this space,' said Claus Egge, storage analyst for IDC. 'It has lost market share for not having it.'
Like Tarpon, Shark will be built around an intelligent controller, based on an RS/6000 server, which allows it to perform functions such as data recovery and backup independently of the platforms to which it is attached.
'The direction IBM is taking appears to be head-on. The question is, can it perform properly?' said Carl Greiner, director of data centre strategies for Meta Group. 'IBM will need to make sure Shark can perform better than Tarpon and is robust enough.'
While IBM claims to have a number of UK and European users beta testing Shark, Greiner said he had yet to see any products in the field. 'Until we can see it in performance I reserve judgement,' he said. 'If Shark is successful, it should help IBM reverse its sales slip of the last few years.'