Cloud, mobile, social media, big data, IoT and real-time global operations have driven sweeping changes to the database arena.
Traditional database technology was never intended for the multi-cloud environment in which most businesses now operate. As developers seek to build responsive, flexible applications at scale, they need databases that reflect these principles.
The advent of cloud, mobile, social media, big data, IoT and real-time global operations means there is now more choice of database than ever before.
One of the biggest changes has been the flow of data across clouds and on-premises deployments. Data ubiquity means that support for hybrid and multi-cloud deployments is no longer optional, and many organisations now want to manage their database technology from a single pane of glass, across all cloud providers.
They also want to link their control plane to their database vendor while keeping the data infrastructure within their own virtual private cloud (VPC) - splitting the control plane from the data plane and keeping everything behind their own standardised firewall. No VPC peering, no VPNs.
Alongside increased security and stability, businesses can now leverage their own buying power with cloud providers, avoid having infrastructure resold to them, and steer clear of vendor lock-in.
A forthcoming Computing webinar on Tuesday 5th May will uncover what businesses prioritise when choosing a database vendor, and to what extent the shift to multi-cloud is informing these decisions.
It will also examine the main pain points organisations encounter with database infrastructure today - and aim to shine a light on a better way forward.
We hope you'll join us on Tuesday 5th May.
Stuart Sumner, editorial director, enterprise IT, Incisive Media
Andrew Hobbs, content strategist & research analyst, enterprise technology, Computing
Perry Krug, director of customer success, Couchbase