Ahead of their presentation at the Computing IT Leaders Festival this week, we spoke to Tony Stranack, HPE's head of information and data strategies EMEA, and Nick Dyer, global storage field CTO and technologist, about developments in hybrid cloud, the impact of the pandemic on cloud strategies, and their personal and professional take on what to look for next. The interview has been edited for brevity.
Computing: Hybrid cloud and multi-cloud seem to be back in favour again, at least if the press releases we receive are anything to go by. Why do you think that is?
Nick Dyer: You saw major uptake of public cloud during the pandemic, but you can't get rid of the legacy, you can't just switch it off, so now everyone's admitted that the right way is hybrid cloud. HPE said in 2017 that the future would be hybrid cloud, and we're seeing that now.
It will be interesting to see what comes out of this pandemic era because most organisations are admitting that working from home might be the best thing, although of course mental health is a major concern. So, the digital transformation journey everyone was on is having to shift, with new policies for remote working and so on.
How does that affect hybrid cloud strategy?
ND: The focus now is not just on how can IT accelerate my business but also how can IT help my staff work and be more productive, which is a shift.
Tony Stranack: Cloud is not the destination, it's a selection of things you want: flexibility, scalability, pay as you go, broad access. In the pandemic era you don't want to go through a nine-month procurement cycle and then put something on the floor for five years and only use 20 per cent of it in the first year. It's ‘we need to deploy some servers almost instantaneously' and that's what hybrid cloud with HCI [hyperconverged infrastructure] enables you to do.
Did you observe a major move to cloud when lockdown was first announced, when engineers sometimes couldn't get into the data centres, and if so will this stick?
ND: Without a doubt. At one point, Microsoft Azure stated there was no more capacity in its London data centres. Public cloud is so easy to adopt, but the problem is that data has gravity. As vendors, we're all just building new islands of infrastructure, now with someone else managing it, and there's no real way of getting the data up and down, to say ‘I want this data here at this certain time of day'. That's one of the things that we at HPE are trying to tackle. The edge-core-cloud strategy may sound like a buzzword, but it's true. The key thing is going to be moving the data across all three areas as and when you need it, and allowing the application to move somewhere else when it's not needed.
And when that happens multi-cloud will really take off?
ND: Moving data is just very hard at this point; even backing up data from one cloud is hard. IT just isn't built for it at the moment, but when we crack it, things like data recovery across multiple clouds are going to explode. We see other vendors trying to tackle that at the moment, too. Data portability is going to be the next thing consumers look for.
TS: We're going to see much more data processed outside the data centre, so you want the flexibility to change the characteristics of the storage, the compute and everything else, which is why you need those cloud attributes.
Is it about more and better APIs?
TS: As you move towards a cloud consumption model, especially across multiple clouds, APIs and automation are key so you can harmonise doing one thing across multiple areas with set standards for the requirements and outputs. It also requires IT operations to shift their mindset. If you look at some of the real innovators in this area such as HashiCorp, some of their work is astonishing.
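The idea of harmonising one operation across multiple clouds can be sketched in a few lines of Python. This is a minimal illustration, not a real vendor API: the provider classes and their methods are hypothetical stand-ins for the cloud SDK calls an automation tool would make.

```python
# Hypothetical sketch: one set of standards (inputs and outputs) behind a
# shared interface, so a single call does the same thing on every cloud.
from abc import ABC, abstractmethod


class CloudProvider(ABC):
    """Common contract: same requirements and outputs regardless of provider."""

    @abstractmethod
    def provision_volume(self, name: str, size_gb: int) -> dict:
        ...


class AwsProvider(CloudProvider):
    def provision_volume(self, name, size_gb):
        # Real automation would call the AWS SDK here.
        return {"provider": "aws", "name": name, "size_gb": size_gb}


class AzureProvider(CloudProvider):
    def provision_volume(self, name, size_gb):
        # Real automation would call the Azure SDK here.
        return {"provider": "azure", "name": name, "size_gb": size_gb}


def provision_everywhere(providers, name, size_gb):
    """One request, applied uniformly across every cloud in the list."""
    return [p.provision_volume(name, size_gb) for p in providers]


results = provision_everywhere([AwsProvider(), AzureProvider()], "logs", 100)
print([r["provider"] for r in results])  # ['aws', 'azure']
```

Tools like HashiCorp Terraform apply the same principle at scale: providers expose differing APIs, and a common declarative layer harmonises them.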
Why do companies tend to struggle on the hybrid cloud journey? Where do cloud strategies come unstuck?
TS: Legacy or traditional systems that won't lift-and-shift into cloud, and peaky workloads that don't suit the public cloud cost model, which is often priced at the instance level. A peaky workload requires a big instance that may only be running at 10 per cent utilisation most of the time.
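The economics of that point can be made concrete with a back-of-the-envelope calculation. The hourly rate below is purely hypothetical, and the 10 per cent utilisation figure is the one from the interview:

```python
# Illustrative sketch with a made-up price: why instance-level pricing
# penalises peaky workloads.
HOURLY_RATE = 2.00       # hypothetical price for a large instance, per hour
HOURS_PER_MONTH = 730
UTILISATION = 0.10       # the instance does useful work 10% of the time

# Instance-level pricing: you pay for the whole instance, busy or idle.
instance_cost = HOURLY_RATE * HOURS_PER_MONTH

# The effective cost per *useful* hour is inflated by all the idle time.
cost_per_useful_hour = HOURLY_RATE / UTILISATION

print(f"monthly bill: ${instance_cost:.2f}")          # $1460.00
print(f"rate per useful hour: ${cost_per_useful_hour:.2f}")  # $20.00
```

At 10 per cent utilisation, every useful hour effectively costs ten times the list rate, which is the gap consumption-based models aim to close.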
ND: Organisations make the jump to something - it might be hyperconverged or private cloud or public cloud - then they realise that some applications don't fit, but they still need to be maintained, so there's this gap that widens over time with big cost implications for both Capex and Opex.
TS: It's the ‘one plus one equals three' thing. To maintain the two infrastructures isn't just double the cost, there's more to it. You're doubling up on skills and you may have to refresh that hardware because it's still running core business applications, but you didn't budget for that because you planned for it all to be consumption-based.
Where does HPE see itself fitting in?
TS: We think very well! Organisations need something that bridges the gap between traditional IT and cloud. We call it disaggregated HCI: infrastructure that supports both those worlds. You run traditional workloads alongside HCI, whether that's through VMs or containers, on a single platform that can accommodate all of them. So instead of ‘one plus one equals three', we're making it ‘one plus one is less than two'. We take that idea with a commercial wrapper and put it into HPE GreenLake, which is our pay-per-use approach, and you've got the best of both worlds.
So are you moving towards being a software-only company?
ND: HPE is absolutely a hardware provider, but we've rationalised our portfolio right down to what customers want. We've innovated and built our new disaggregated HCI platform based on ProLiant servers and Nimble storage, but we also have storage-as-a-service inside the public cloud vendors, so customers can connect on-prem dHCI to AWS, Azure or Google, and we can move data from the core to the cloud and back again. We said in 2018 that everything would be available as a consumption model by 2022; we're on track to do that.
TS: Software will always need hardware to run on. It's a shift in approach: instead of shipping equipment to a customer and saying ‘pay me now', it's shipping equipment to a customer and saying ‘you only pay as you consume it'. People think of the cloud as a destination rather than what they want to get from it, and really a lot of our focus is on how to deliver those attributes, whether that's in the public cloud, on-prem in a data centre or in a colo, using that consumption-based model. If we can bring those cloud attributes not just to workloads that fit naturally with cloud but also to traditional workloads, then that's a good place to be.