Harnessing 'dark' DevOps data to improve software delivery experiences

Rethinking DevOps testing and data capabilities gives a unique opportunity to improve the software delivery experience and governance

Just as Pink Floyd's legendary The Dark Side of the Moon album and its atmospheric soundscapes created a world of its own (and has remained a download favourite down the years), so unexplored, or 'dark', DevOps data remains something of a parallel universe for software development teams.

Yes, it's a riddle, even an enigma, but CIOs and developers know the 'dark side of DevOps data' holds a world of transformational possibilities for their organisation, if only they can understand and harness it.

The clamour for new insights from DevOps is becoming deafening as software development pipelines grow and become the core of global innovation. Developers and operations teams, along with business analysts and value stream architects, now want better data analytics to ensure a great software delivery experience for everyone.

This not only means improved code quality and governance but also more intelligent assessment of risks, faster eradication of vulnerabilities and clearer pathways to performance improvement.

Until now, it has been down to proprietary and open source software providers and standards bodies to work together on compatible integrations, so that DevOps teams can collaborate effectively across toolchains and everyone contributes to a better software delivery experience. Our view, however, is that there are real opportunities for DevOps teams, vendors and standards bodies to accelerate enhanced collaboration and governance.

Take the most recent DevOps Research and Assessment (DORA) report: it found that high-performing software development teams deploy 973 times more frequently than low performers, have a 6,570 times faster lead time for changes, and a three times lower change failure rate. These outliers also recover from incidents an impressive 6,570 times faster when failures happen.

Until recently, in IT organisations where separate teams handled observability with AI tools, developers often received the information needed for remediation only after vulnerabilities had been found. But if DevOps data streams could be harnessed to predict where errors will occur, such issues could be identified earlier in the lifecycle, reducing software delivery delays.
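As a minimal illustration of the idea (the file names and scoring approach here are hypothetical, not a description of any particular product), a team could mine its historical defect fixes to flag which files in a new changeset are most likely to cause trouble, focusing review and testing effort earlier in the lifecycle:

```python
# Hypothetical sketch: rank files in a changeset by historical defect density,
# so reviews and tests can focus where failures are most likely.
from collections import Counter

def risk_scores(changed_files, past_defect_files):
    """Score each changed file by how often it appeared in past defect fixes."""
    defect_counts = Counter(past_defect_files)
    total = sum(defect_counts.values()) or 1  # avoid division by zero
    return {f: defect_counts[f] / total for f in changed_files}

# Example history: billing.py was involved in 3 of 4 past defect fixes.
history = ["billing.py", "billing.py", "billing.py", "auth.py"]
scores = risk_scores(["billing.py", "ui.py"], history)
print(scores)  # billing.py scores 0.75, ui.py scores 0.0
```

Real predictive tooling would of course draw on far richer signals (code churn, ownership, test results, observability data), but even a simple frequency model like this shows how 'dark' historical data can be turned into an early-warning signal.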

Organisations that can harness the dark data of DevOps will become more competitive as they de-risk software models, automate testing and simplify dependency checks. And teams that shift processes left will improve their software delivery experience for all their different stakeholders.

But harnessing the dark data of DevOps is no longer a matter of dark arts or of struggling to interpret data.

We are seeing organisations build best-of-breed value stream engineering models or use end-to-end DevOps platforms to equip their CIOs and developer teams with the broader, scalable analytics they need.

Three examples show what can be done.

First, there is real scope to automate key aspects of software delivery ecosystems that were previously dependent on multiple tools from multiple vendors, where sharing delivery data and orchestrating software deliveries was slow and complex.

Second, we can reduce software testing 'effort' with tools that increase test coverage while automating away complexity and cost. Breakthroughs include delivery cycles being cut from months to minutes. One European company even reduced its testing 'effort' by 80% while still increasing its test coverage by 49%.

Thirdly, toolchain bottlenecks, inevitable as the number of stakeholders across software pipelines grows, can be identified and resolved more effectively. Through DevOps data lake analysis, organisations can combine delivery data, business metrics and operational status information to find hold-ups faster. One global company, which regularly saw 50-75% surges in demand from unpredictable weather conditions and application performance compromised as a result, successfully embedded observability data streams into its software toolchain to address issues faster and reduce mean time to resolution.
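To sketch what that kind of data lake analysis might look like in its simplest form (the stage names, timestamps and event schema below are invented for illustration), bottleneck-finding can start with nothing more than timing each pipeline stage and surfacing the slowest one:

```python
# Hypothetical sketch: find the slowest stage in a delivery pipeline from
# timestamped stage events pulled out of a DevOps data lake.
from datetime import datetime

# Each event: (stage name, start time, end time) — an assumed schema.
events = [
    ("build",  datetime(2024, 1, 1, 9, 0),   datetime(2024, 1, 1, 9, 10)),
    ("test",   datetime(2024, 1, 1, 9, 10),  datetime(2024, 1, 1, 10, 40)),
    ("deploy", datetime(2024, 1, 1, 10, 40), datetime(2024, 1, 1, 10, 55)),
]

def bottleneck(events):
    """Return the stage with the longest duration, in minutes."""
    durations = {s: (end - start).total_seconds() / 60 for s, start, end in events}
    stage = max(durations, key=durations.get)
    return stage, durations[stage]

print(bottleneck(events))  # ('test', 90.0)
```

In practice the same query would run over thousands of pipeline runs, correlated with business metrics and operational status, but the principle is the same: once stage timings are in one queryable place, hold-ups stop being invisible.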

These are software delivery and bottom-line breakthroughs. Other pathways, such as single integrated DevOps platforms that simplify toolchains' value stream analytics models, could yield more, as we are highlighting in our customer discussions and field events with DevOps professionals.

Pink Floyd warned on The Dark Side of the Moon album that the speed of life can cause us to freeze like deer in the headlights or fail to action our cherished plans at all. Those prescient lyrics seem more relevant than ever for today's software development and delivery teams facing daily competitive, testing and remediation pressures.

But by rethinking DevOps testing and data capabilities, we have a unique opportunity to improve the software delivery experience and strengthen organisations' toolchain data measurement and governance in response to those demands. Wish you were here?

James Hunter is head of IBM DevOps