How Confluent data in motion tech is giving the personal touch

5 min read

An online electrical retailer is driving hyper-personalised experiences with the help of Confluent and Apache Kafka, furthering its mission to be the global destination for electronics. The online electrical retail specialist, which serves millions of customers across the UK and Germany, saw a sharp increase in growth due to the dramatic shift in consumer shopping habits during the pandemic, and needed its technology to support this surge whilst continuing to turn every customer visit to its website into a one-to-one marketing opportunity.

The retailer utilised the flexible, extensible architecture provided by the Confluent Platform, which has the power and smarts to combine historical customer data with real-time digital signals from customers. "With Confluent powering our Kafka deployment, we can liberate data from our heritage systems and combine it with real-time signals from our customers to deliver a hyper-personalised experience," says Jon Vines, the retailer's Head of Data Engineering and Integration.

Data in motion unlocks a world of opportunities

Having started out with a self-managed environment based on the Confluent Platform, the retailer recently moved to Confluent Cloud, a fully managed cloud service, enabling it to continue its goal of innovating customer experiences through event streaming.

Vines says that by treating onsite clickstream data as a real-time feed, the system can push an appropriate voucher to the customer in real time, creating more compelling propositions. "You just can't do that with a data lake and batch processing," he says. The retailer has discovered that real-time data in motion provides unique customer intelligence that unlocks opportunities and greater efficiency, critical to delivering a superior brand and customer experience. But it wasn't always this way.

When the online retailer first started with event streaming, it ran a proof of concept to extract data from heritage systems, such as order processing, using change data capture (CDC) connectors to track updates in Microsoft SQL Server commit logs. This created raw event streams, handled by a homegrown Kafka cluster hosted on multiple AWS EC2 instances (since replaced by Confluent Cloud). Kafka propagated these events to a set of .NET services, which processed the data for several targeted use cases and stored the results in MongoDB.
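The CDC pattern described above can be sketched in miniature. The following is an illustrative Python sketch, not the retailer's actual code (which runs on .NET): the event type names, topic name and field names are all assumptions. It shows the essential step a CDC pipeline performs, turning a row-level change captured from a database commit log into a keyed, raw event ready for a Kafka topic:

```python
from datetime import datetime, timezone

def cdc_to_raw_event(change):
    """Turn a CDC change record (the shape a commit-log connector might
    emit: an operation code plus the row before and after the change)
    into a raw order event keyed by order id."""
    op_names = {"c": "OrderCreated", "u": "OrderUpdated", "d": "OrderDeleted"}
    # Deletes carry only the 'before' row state; creates only 'after'.
    row = change["after"] or change["before"]
    return {
        "topic": "orders.raw",           # assumed topic name
        "key": str(row["order_id"]),     # keying by order preserves per-order ordering
        "value": {
            "type": op_names[change["op"]],
            "order_id": row["order_id"],
            "status": row.get("status"),
            "captured_at": datetime.now(timezone.utc).isoformat(),
        },
    }

# Example: an order row updated in the source database
event = cdc_to_raw_event({
    "op": "u",
    "before": {"order_id": 42, "status": "placed"},
    "after": {"order_id": 42, "status": "dispatched"},
})
```

Keying each event by order id matters because Kafka only guarantees ordering within a partition, so all changes to one order land on the same partition in commit order.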

After the success of its initial phase, the retailer decided to leverage the power of the Kafka Streams API to enrich its raw event data with additional context, creating enriched event streams. Both the raw and enriched topics are sent via connectors to downstream consumers and S3 buckets. The event bucket is used by the retailer's data scientists for research and analysis, while the downstream consumers apply additional business logic before propagating the results to MongoDB.
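The enrichment step is, at heart, a stream-table join: each raw event is combined with reference data to produce an enriched event. A minimal Python sketch of that idea, with invented customer ids, segments and field names (the retailer's real pipeline uses the Kafka Streams API in its .NET/Java stack):

```python
# A changelog-backed lookup table, in the spirit of a Kafka Streams KTable.
customers = {
    "c-1001": {"name": "A. Shopper", "segment": "returning"},
}

def enrich(raw_event, customer_table):
    """Join a raw order event with customer context to form an enriched event.
    Unknown customers still pass through, flagged with segment 'unknown'."""
    customer = customer_table.get(raw_event["customer_id"], {})
    return {
        **raw_event,
        "customer_segment": customer.get("segment", "unknown"),
        "enriched": True,
    }

raw = {"order_id": 42, "customer_id": "c-1001", "total_gbp": 299.0}
enriched = enrich(raw, customers)
```

Publishing both the raw and the enriched topics, as the retailer does, means data scientists can always re-derive enrichments from the untouched raw stream while downstream consumers read the convenient enriched one.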

This brought the retailer closer to its end goal: real-time hyper-personalisation. To achieve it, the retailer deployed Confluent to collect clickstream events from its web server, again producing both raw and enriched topics. The enriched topic feeds the retailer's backend Lambda/MongoDB/S3 architecture as before, and Kafka then streams the resulting events back to the web server, injecting rich, hyper-personalised content into the customer experience.
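To make the round trip concrete, here is a toy decision step of the kind that might sit at the end of such a clickstream pipeline. The rules, voucher codes and event shapes below are entirely invented for illustration; the point is only that a decision is computed from the visitor's live session events and streamed back to the site:

```python
def choose_voucher(session_events):
    """Pick a voucher (or None) from a visitor's real-time clickstream.
    Each event is a dict with an 'action' and, for views, a 'product'."""
    viewed = [e["product"] for e in session_events if e["action"] == "view"]
    abandoned = any(e["action"] == "cart_abandon" for e in session_events)
    if abandoned:
        # Win back a visitor who left items in the basket.
        return {"code": "COMEBACK5", "reason": "abandoned cart"}
    if len(viewed) >= 3 and len(set(viewed)) == 1:
        # Repeated views of one product signal strong purchase intent.
        return {"code": "TV10", "reason": f"repeated interest in {viewed[0]}"}
    return None

session = [{"action": "view", "product": "tv-55"}] * 3
offer = choose_voucher(session)
```

In production such logic runs continuously against the event stream, so the offer can be injected into the page while the visitor is still browsing, which is exactly the "you can't do that with batch processing" point Vines makes above.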

Customers like what they see: the retailer has found they respond positively to the personal touch, and conversions have increased. "Our hyper-personalised approach is delivering measurable results," says Vines. "That's proof that our decision to adopt a real-time event streaming approach was the right one."

Unlocking opportunities and efficiencies

After the successful deployment of its first event streaming use case, focused on hyper-personalisation, the retailer worked with Confluent Professional Services to progress rapidly in event streaming maturity, building to the point where reuse of data, efficiencies of scale and the platform effect reinforce one another. This has allowed the retailer to accelerate innovation across the board without expensive or time-consuming technology upgrade and transformation projects. "Using the Kafka Streams API allows us to build up different views and create new stream processing applications. And with Schema Registry, we get a clean separation between producers and consumers, so we can easily add new types of data without worrying about breaking existing applications," Vines says.
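The Schema Registry guarantee Vines describes rests on a compatibility rule checked whenever a producer registers a new schema version. A miniature sketch of the backward-compatibility check (simplified: real registries evaluate full Avro/JSON Schema/Protobuf semantics, and these field names are invented):

```python
def backward_compatible(old_schema, new_schema):
    """In miniature, the rule a schema registry enforces for BACKWARD
    compatibility: a consumer on the new schema must still be able to
    read data written with the old one, so any field that is new in
    the new schema must carry a default value."""
    old_fields = {f["name"] for f in old_schema["fields"]}
    return all(
        f["name"] in old_fields or "default" in f
        for f in new_schema["fields"]
    )

v1 = {"fields": [{"name": "order_id"}, {"name": "status"}]}
# Adding 'channel' WITH a default keeps old data readable...
v2_ok = {"fields": [{"name": "order_id"}, {"name": "status"},
                    {"name": "channel", "default": "web"}]}
# ...adding it WITHOUT one would break consumers, so it is rejected.
v2_bad = {"fields": [{"name": "order_id"}, {"name": "channel"}]}
```

Because incompatible changes are rejected at registration time, producers can evolve their data without coordinating releases with every consumer, which is the "clean separation" the quote refers to.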

Having Confluent manage its event streaming infrastructure has removed an operational burden from the retailer, freeing up its developers to focus on building new applications. It also allows the retailer to leverage Confluent's Kafka expertise and benefit from seamless upgrades, giving it easy access to the latest features.

"Before Confluent Cloud, when we had broker outages, it required rebuilds," he says. "With the resulting context switching, it could take up to three days of developers' time to resolve. Now, Confluent takes care of everything for us, so our developers can focus on building new features and applications."

Ultimately, the retailer has seen clear benefits from Confluent's expertise in data in motion, built on the Kafka technology developed by the company's founders. It is helping the business deliver superior customer experiences in real time. Some of the business outcomes include:

  • Customer conversion rates increased
  • Developers focused on value-add features, not operations, including the rollout of new business capabilities
  • Data at the speed of business: integrating stock availability data to better guide customer journeys

Vines sums it up well: "The most important outcome is that we can deliver capabilities at pace. Pace became even more crucial during the pandemic because the world moved so rapidly from predominantly in-store shopping to online. The speed at which we can create new use cases that improve the customer journey with Confluent Cloud is helping us to cement our online market leadership position. And that is because it allows us to treat each moment as a one-on-one opportunity to provide a great customer experience. And we're not done yet. The potential is almost limitless as we continue to learn and innovate."

Discover how Confluent technology could help you innovate and develop hyper-personalised customer experiences in real time to maximise customer satisfaction and revenue growth.

