Kickstarting digital transformation and DevOps at a major bank

Tech lead Ömer Saatcioglu talks through the successful pilot project to introduce digital onboarding at an established European bank

Traditional banks are in danger of losing customers to new cloud-native fintech players like Monzo, N26 and Revolut. Students and young professionals - the next generation of customers - are attracted to these new online options; they make it simple to set up an account, the user experience is slick and fees are generally low or non-existent.

What's more, new regulations like the EU Payment Services Directive (PSD2) are changing the relationship between banks and customers, knocking down the castle walls by forcing banks to open up access to relevant account information via APIs and allowing new players to offer competitive services.

Banks are well aware of the need to become more agile, but they are burdened by legacy technology and inflexible working practices in an industry where regulation makes experimentation particularly difficult.

However, some financial institutions have risen to the challenge of digital transformation. ING, Nordea and Barclays are examples of European banks known to have successfully transformed many aspects of their business, moving from Waterfall - with its silos, handoffs between business units and long release cycles - to Agile, and making use of the flexibility and interoperability of private and public cloud platforms. But many banks have barely altered their working practices in the last 20 years, and those banks need to get a move on.

"Startups are coming, and they are showing they can do much better [than traditional banks] with the newer technologies," said Ömer Saatcioglu, a software developer working for the consultancy McKinsey & Company, during a presentation at the Open Infrastructure Summit in Denver last week.

All across Europe banks have no choice but to bite the digital bullet.

Saatcioglu is currently tech lead for both the McKinsey team and the in-house developers at a long-established bank in southern Europe (which he was not permitted to name). The bank had made some semi-successful attempts at introducing Agile and DevOps but, hidebound by inflexible structures, had failed to advance them.

Digital transformation is not about the technology, it's about a change in mentality - Ömer Saatcioglu

The bank makes no use of cloud at all, relying on in-house Windows servers to support its applications. However, as a foundation to build on, this technology was perfectly serviceable, said Saatcioglu.

"Digital transformation is not fundamentally about the technology; it's about a change in mentality to use the new tech for business purposes," he said.

He and his team set up a pilot project to provide digital on-boarding capabilities using mobile and web services for the bank's new customers.

Focusing on four key areas - organisation, automation, methodology, and legacy software - the aim was to show how a significant software project could be delivered more quickly and with better quality and long-term sustainability.

Introducing Agile

First, they put together three Scrum teams consisting of a total of 14 developers and testers plus five business people. "It's better to keep below twenty," Saatcioglu told Computing. "More than that and you lose the personal touch and things get lost in the background noise."

This cross-functional group got straight into Scrum ceremonies, practising sprint planning, daily stand-up meetings, and so on. This was all a bit mystifying for the business people, and the team was fortunate to be led by an excellent Scrum master who was able to keep the communication channels open.

Automation

From there they developed a full CI/CD pipeline using Microsoft TFS, which was widely used in the bank for source code management and version control. Rather than focusing on the tooling, the important thing was to embed an understanding of the importance of automatically deploying and testing the code.

"We needed to have a move fast, fail fast mentality and automation was critical," Saatcioglu said. "So we made sure the developers made commits to the main branch every day."

Changing the methodology

Even more crucial was getting the developers to use modern practices such as test-driven development (TDD).

"It's all very well having nice tools, but you have to know how to use them to get results," said Saatcioglu.

The developers were taught to write their own failing test cases first, then refactor and iterate the code until it passed them. Another learning process was using git-flow to represent each user story in a different branch of the code.
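The red-green-refactor cycle described above can be sketched as follows. This is an illustrative example only, not the bank's actual code: the IBAN validator stands in for whatever onboarding logic the teams were building, and the tests are the kind written first, before the implementation existed.

```python
# A minimal sketch of the TDD cycle: the tests below were conceptually
# written first ("red"), then is_valid_iban was iterated until they
# passed ("green"). The validator is a hypothetical stand-in for the
# bank's onboarding logic, using the standard ISO 13616 mod-97 check.

def is_valid_iban(iban: str) -> bool:
    """Validate an IBAN using the mod-97 checksum."""
    iban = iban.replace(" ", "").upper()
    if len(iban) < 15 or not iban[:2].isalpha():
        return False
    # Move the country code and check digits to the end, then convert
    # letters to digits (A=10 ... Z=35) and take the remainder mod 97.
    rearranged = iban[4:] + iban[:4]
    digits = "".join(str(int(ch, 36)) for ch in rearranged)
    return int(digits) % 97 == 1

def test_is_valid_iban():
    # Written before the implementation: a known-good example IBAN
    # must pass, malformed input must be rejected.
    assert is_valid_iban("GB82 WEST 1234 5698 7654 32")
    assert not is_valid_iban("NOT-AN-IBAN")

test_is_valid_iban()
```

Each new test starts out failing, and the developer writes just enough code to make it pass before cleaning the implementation up, which keeps the test suite an honest record of what the code is supposed to do.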

While the team does not yet ship live code at the end of the pipeline (for now it's published to a development server), they always have code ready for delivery should it be required.

Dealing with legacy code

Because it was not possible to start from scratch, the team had to encapsulate the legacy code in the repository so that external libraries and backend APIs did not interfere with the test cases. To do this they created mock libraries and mock APIs as a 'buffer' between the legacy code and the new functionality, and ran unit tests in the pipeline to check for changes so they could avoid breaking anything when creating new features.

"It also helped us to create concise, functional unit test cases because we didn't have to write unit test cases to check the external functionality," Saatcioglu explained.

Speed, quality and ownership

Five months in, the initial stage of the pilot study is over. The team now merges to the main branch three times a day, and the code at the end of the pipeline is always in a deployable state. Dependency problems have been overcome by treating the build server as the "single source of truth", and everyone feels more responsible for the state of the code and its ongoing development.

"The owners are starting to feel more responsible as they see their changes rolled out quickly, and the developers said this is the highest quality of code they've seen in such a project," said Saatcioglu.

The most positive end result of the pilot has been improving the working experience of both the developers and the business owners, he added.

Next steps

Planned next steps are to improve the testing, monitoring and pipeline automation to allow for single-click deployments to the production server, and ultimately migration to microservices and a hybrid cloud environment.

"The strategy is to improve internal ways of working, and this is what we're trying to achieve in the first phase. Then in the second we're thinking of looking at containers so we can see the advantages of the new technologies and that should also help us move to private and public cloud," said Saatcioglu.
