Surviving the container revolution
Containers have helped to drive the adoption of open-source technologies, argues SolarWinds 'head geek' Patrick Hubbard
Big data analytics, machine learning, and artificial intelligence are redefining the landscape in which companies compete with one another. Along with cloud-native and agile operations (DevOps), IT departments have been thrust into the centre of enterprises' strategic decision-making processes in the space of just a decade.
In general, nobody misses the bad old days of help desk-centric IT: VPN token keys, upcycling beaten-up old laptops one more time, and racking and stacking servers in dingy, half-forgotten basements. Today, IT departments sit at the heart of the modern organisation, with new tools and technologies upending the way business is conducted.
It's therefore no surprise that today, with a rate of technological innovation comparable to no other time in history save the industrial revolution, IT pros are grappling with developing the right skills and tools in-house to help their businesses flourish. Extreme growth in data alone (which needs to be captured, stored, and analysed if its value is to be unlocked) has driven IT pros to identify novel uses of digital technology.
Digital transformation isn't mysterious to IT pros, who've been challenged to solve traditional business problems in new ways for years. In this context - and with the promise of boosted business agility, fewer outages, less downtime, and therefore more scope for innovation - it's easy to understand why investments in cloud and hybrid IT remain a top priority for IT pros in the UK today.
The next big efficiency thing
A good indicator of new technology adoption is the way IT pros are flocking to containers. Unlike VMs, containers are a lightweight virtualisation approach: they share a common OS kernel and were designed from the outset for automated orchestration. The result? The popularity of open-source software development tools and techniques is soaring.
According to SolarWinds' 2018 IT Trends Report: The Intersection of Hype & Performance, the proportion of IT pros who cited containers as their most pressing investment priority leapt from 15 per cent in 2017 to 49 per cent a year later. At the same time, 451 Research estimates the container software market will surge to a staggering $2.7 billion by 2020.
It's not hard to see why containers are so attractive. IT teams can more easily build, or at least package, application services to support their current technology investments, and more safely meet businesses' growing demand for innovation. Docker rose to early dominance because both developers and ops teams discovered they could pack, ship, and roll out applications as lightweight, portable, self-sufficient containers that could run just about anywhere. It's an implementation strategy that mimics the business strategy of decoupling: breaking large, existing components down among smaller agile teams accelerates progress. More broadly, this agility can make container technology a significant competitive advantage, allowing IT to respond rapidly to external changes, such as shifting customer needs or an opportunity to test a new market.
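As a rough sketch of that pack-ship-run workflow (assuming a local Docker daemon and the third-party docker Python SDK, neither of which this article prescribes), a few lines are enough to package an application into a portable image and run it as a self-contained container:

```python
# Sketch: build and run a self-contained service with the docker SDK
# (pip install docker). Assumes a Docker daemon is running and that
# ./myapp contains a Dockerfile; both are illustrative assumptions.
import docker

client = docker.from_env()  # connect to the local Docker daemon

# "Pack": build a portable image from the application directory
image, _build_logs = client.images.build(path="./myapp", tag="myapp:1.0")

# "Run": start the packaged app as a lightweight, isolated container
container = client.containers.run(
    "myapp:1.0",
    detach=True,               # run in the background
    ports={"8080/tcp": 8080},  # map a hypothetical app port to the host
    name="myapp",
)
print(container.name, container.status)
```

The same image can then be pushed to a registry and started unchanged on any other host, which is the portability described above.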
But as containers become increasingly integrated into hybrid and cloud environments, IT pros must ensure they can successfully navigate the potential challenges that come with app development in this new landscape.
Now you see me…
Delving into containers is an exciting prospect, but IT pros often find they don't know what to do after a first "HelloContainer" deployment. A major challenge for these pioneering IT teams transitioning towards DevOps is a pernickety legacy architecture that can't simply be swept aside. Maintaining visibility across the resulting hybrid environment is a big challenge and can become a monitoring headache. The tools available to monitor and manage containers are still limited, and only a few vendors offer monitoring capabilities suited to these new environments without making an already difficult situation more complicated.
Just as with any other element of an IT infrastructure, containers require oversight: they have to be monitored and controlled, otherwise IT pros simply have no idea what's running on their servers. The best advice for IT pros is to ensure that their monitoring solutions cover the entire stack. While container technology is new, it should make life easier rather than add complexity.
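At the container layer itself, basic visibility is available straight from the runtime. The snippet below is a minimal sketch (again using the docker Python SDK as an assumed tooling choice, not a full-stack monitoring suite) of how to take stock of what's actually running and what it's consuming:

```python
# Sketch: enumerate running containers and sample their resource usage.
# Assumes the docker SDK (pip install docker) and a local Docker daemon.
import docker

client = docker.from_env()

for container in client.containers.list():  # running containers only
    stats = container.stats(stream=False)   # one-off snapshot, not a stream
    mem = stats["memory_stats"].get("usage", 0)           # bytes in use
    cpu = stats["cpu_stats"]["cpu_usage"]["total_usage"]  # cumulative ns
    print(f"{container.name}: mem={mem} bytes, cpu_total={cpu} ns")
```

Runtime snapshots like this cover only one layer; a monitoring solution still needs to correlate them with the hosts, VMs, and services that make up the rest of the stack.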
SLA-shing back
Service level agreements will eventually be consigned to the dustbin of history. Ambitious IT teams will ditch thinking about the minimum level of service they're willing to set in favour of metrics which emphasise the ways they are doing everything they possibly can to drive their businesses forward. For enterprises willing to embrace new technologies such as containers, IT must align its own goals to those of the business and carefully plan a digital transformation strategy in order to minimise risk.
Key performance indicators designed for monolithic applications have been adequate until now, and on-premises the ability to measure I/O remains critical. Containerised, distributed apps behave differently, however: individual containers will inevitably fail, and they will see periods of intense usage, demanding processing and storage in uneven patterns. When that happens, the application's settings need to be adjusted accordingly, which makes the ability to monitor those patterns crucial.
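What those settings look like varies by platform. As one hedged example at the Docker level (the image name and the limit values are illustrative assumptions), resource caps and a restart policy are the knobs that keep uneven load and inevitable failures predictable:

```python
# Sketch: run a container with explicit resource limits and a restart
# policy, so spikes can't starve neighbouring workloads and failures
# recover automatically. Values here are illustrative assumptions.
import docker

client = docker.from_env()

container = client.containers.run(
    "myapp:1.0",
    detach=True,
    mem_limit="256m",        # hard memory cap for this container
    nano_cpus=500_000_000,   # 0.5 CPU, expressed in units of 1e-9 CPUs
    restart_policy={
        "Name": "on-failure",    # restart automatically when it crashes...
        "MaximumRetryCount": 5,  # ...but give up after five attempts
    },
)
```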
Securing the insecure
A major issue when it comes to containers is security. Precisely because containers drive new levels of innovation, the focus tends to fall on developing new applications rather than on securing them. In reality, a security breach of a container is every bit as serious as a breach of a virtual machine, so it's crucial for IT teams to identify areas of potential vulnerability.
Access should be limited where appropriate, and controls should be implemented to ensure that data doesn't fall into the wrong hands and that applications can't be erased outright. The right monitoring tools and processes are essential for this.
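At the runtime level, some of those controls are simple to express. The sketch below shows standard Docker hardening options following the least-privilege idea (the image name and the restricted network are assumptions):

```python
# Sketch: least-privilege container start. Runs as an unprivileged user,
# drops all Linux capabilities, and mounts the root filesystem read-only
# so a compromised process can't tamper with or erase the application.
import docker

client = docker.from_env()

container = client.containers.run(
    "myapp:1.0",
    detach=True,
    user="1000:1000",    # non-root user and group inside the container
    read_only=True,      # root filesystem mounted read-only
    cap_drop=["ALL"],    # drop every Linux capability by default
    network="internal",  # hypothetical restricted network, created separately
)
```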
On a related note, IT teams will need to make sure each computing resource, including every container, has its own machine identity. This helps teams stay organised and maintain control, because they know which applications have access to which databases.
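Full machine identity usually means certificates or a secrets manager rather than anything built into the container runtime. Still, as a loose illustration (the label keys and values here are invented), even simple runtime labels give teams a way to record which identity a container runs under and to audit which workloads claim access to which databases:

```python
# Sketch: tag containers with an identity label at launch, then audit by
# label. A real deployment would bind a certificate or token to each
# identity; the label keys and values are illustrative assumptions.
import docker

client = docker.from_env()

client.containers.run(
    "myapp:1.0",
    detach=True,
    labels={"machine-identity": "billing-svc-01", "db-access": "orders"},
)

# Audit: which running containers claim access to the "orders" database?
for c in client.containers.list(filters={"label": "db-access=orders"}):
    print(c.name, c.labels.get("machine-identity"))
```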
Early days
The benefits of containers have been much discussed, but their implications shouldn't be underestimated. Containers allow teams to run more applications on the same hardware. Vitally, they give developers the ability to respond to sudden shifts in customer needs or market forces by letting them rapidly create applications and deploy them with ease.
For mid-to-late adopters, these are relatively early days, but the container revolution has begun. There is enormous value in being able to configure a container once and run it on any platform without restriction. While embracing containers throws up a new set of challenges for IT pros, the benefits significantly outweigh the costs. Critical to success are careful planning, forward thinking, and the ability to maintain visibility of the entire IT infrastructure.
Patrick Hubbard is 'head geek' and technical product marketing manager at SolarWinds