Why open - not closed - is best practice both in the data centre and at the edge

No single vendor can provide a complete edge computing solution - the edge requires collaboration

Edge computing is growing rapidly. By 2022, there will be an estimated 55 billion edge devices on the market - by 2025, this is expected to grow to 150 billion. As the amount of data that businesses and enterprises hold in their cloud systems continues to grow, and with issues of data protection and security becoming ever more prominent, businesses are being pressed to deploy a complete edge computing solution.

Beyond simply deploying edge computing solutions, however, it is becoming business-critical to understand this technology, to incorporate it as part of a broader hybrid cloud strategy, and to ensure that all of this is open.

So what is edge computing, and how does it work?

Cloud computing has led many organisations to centralise their services within large data centres. However, many modern systems - such as the Internet of Things (IoT) - require service provisioning closer to the outer "edges" of a network where the physical devices exist. Edge computing itself refers to a model that distributes computing resources out to the edge of a network when necessary, while continuing to centralise resources in a cloud model when possible.

Edge computing is a model that distributes resources out to the edge of a network when necessary, while centralising resources in a cloud when possible

Edge computing addresses the use cases that cannot be adequately served by the centralised approach of cloud computing. Instead, it encourages the creation of multiple small computing sites that reduce network costs, avoid bandwidth constraints and better control the movement of sensitive data. It also allows companies to create faster, more consistent user experiences that limit service failures.

In practice, edge computing is usually defined by one of two use cases. The first involves collecting more data at the periphery of the network than can reasonably be sent to the core for processing, necessitating computing capacity at the edge to filter inputs before they're sent to the core. The second involves making workloads (either complete applications, or only microservices that are part of a larger application) available at the edge, because latency to connect to the core is too high.
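The first of these use cases - filtering inputs at the edge before they are sent to the core - can be sketched in a few lines. The function name, thresholds and data below are illustrative assumptions, not part of any specific product: an edge gateway reduces a batch of raw sensor readings to a compact summary, forwarding only that summary plus any out-of-range values to the core.

```python
# Hypothetical sketch of edge-side filtering: aggregate raw readings locally,
# forward only a summary and any anomalies to the core. Thresholds are
# illustrative, not drawn from any real deployment.

def summarise_readings(readings, low=10.0, high=90.0):
    """Reduce a batch of raw readings to a summary plus out-of-range values."""
    anomalies = [r for r in readings if r < low or r > high]
    summary = {
        "count": len(readings),
        "mean": sum(readings) / len(readings),
        "min": min(readings),
        "max": max(readings),
    }
    return summary, anomalies

# Five raw readings become one summary record; only 97.2 is flagged for the core.
batch = [42.0, 43.5, 41.8, 97.2, 42.9]
summary, anomalies = summarise_readings(batch)
print(summary["count"], anomalies)
```

The point of the sketch is the shape of the data flow, not the arithmetic: the volume sent upstream no longer scales with the sampling rate at the edge, only with the size of the summary and the number of anomalies.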

Where does hybrid cloud come in?

On the surface, the concept of edge computing is a marked contrast to the cloud computing philosophy. Edge computing is focused on spreading computing resources out geographically, whereas "traditional" cloud deployments attempt to centralise computational power to a single location that can scale up as business needs dictate.

However, there is more overlap than you might think, particularly when it comes to coordinating the computing resources of an edge configuration. It's impossible to manage all the deployments in an edge setup without a secure control plane that provides automation, management and orchestration.

This is where the hybrid cloud enters the picture. A hybrid cloud deployment gives the varied components of an edge system a common foundation to rest on - whether it's Linux, Kubernetes or Ansible - which allows teams to manage thousands of networked devices just as they would a centralised server. From edge devices to the centralised data centre, a hybrid cloud deployment systemises and streamlines an otherwise anarchic ecosystem.

As an added benefit, going further to an open, hybrid cloud approach allows businesses to avoid vendor lock-in. This lets them fully oversee and control their data and infrastructure across all elements of the solution, in the knowledge that components may be safely adopted and switched out as technologies and requirements change.

Why does edge computing need to be open?

Edge computing cannot operate in the way that cloud computing historically has. No single vendor can provide a complete edge computing solution - the distributed nature of edge infrastructure means it has to be formed by multiple components, often from multiple vendors.

Coordinating these parts requires a secure control plane, and the only way to maintain one across the vast set of components in an edge environment is through universal and open standards. Open source supports the creation of such shared standards and foundations, which can be used to configure the hybrid clouds used for orchestration. Without such standards, edge computing will inevitably become either too complex and costly to be viable, or its potential will be hindered by businesses becoming locked in to a handful of proprietary vendors.

The foundations of edge computing must be kept open, or it will fail

Put simply, the foundations of edge computing must be kept open, or it will fail. The best way to guarantee the success of the edge, then, is through an open source philosophy that ensures that there is an open language for this computing frontier.

This, alongside an open hybrid cloud strategy, allows companies to manage their edge infrastructure centrally and efficiently. An open source ecosystem brings the visibility and control that businesses need over their edge infrastructure, which is key to ensuring that edge computing is scalable and secure.

Martin Percival is commercial pre-sales manager at Red Hat