How Merck solved its clinical data challenges

A massive clinical trial disrupted other work being done at the company, highlighting the need for a scalable solution

Data collection has never been easier, but data management presents a new challenge for businesses. A data map is essential for making full use of the information - and for complying with GDPR - but too much data can turn this important tool into a bottleneck.

That was the problem pharmaceutical giant Merck faced when running its largest-ever clinical trial, which created inconsistent data flows for analysis.

Although the firm had the infrastructure needed to handle small and medium-sized trials, the ‘mega trial’ was an order of magnitude larger, accounting for half of all clinical data collected in one year.

Legacy processes and tools struggled to support the project. Merck was using a single map for all data that had to be deposited in the clinical data repository. Because of the scope of the trial, this map grew to an extreme size.

Frequent changes to the study meant that Merck had to continually update its map, slowing data movement and reducing efficiency.

Merck set a series of goals in its search for a solution: reduce the time needed to move clinical data for analysis; lower costs; improve data quality; make both the data-flow work process and the technical mapping components scalable; and implement a system that would support cloud-based services in the future.

Following a six-month trial, Merck chose to implement a solution from Liaison, based on the company's Contivo mapping platform. The hosted, scalable system is able to run more than 200 simultaneous trials, says Liaison, with a library of standard, forms-based maps that can be used repeatedly.

Since adopting the interface mapping solution for all new clinical trials, Merck says it has cut its cycle time from 125 days to 39 and lowered costs by reducing the mapping resources required to support trials. Other benefits include higher data quality and fewer disruptions to data flows.

As new methods of gathering, analysing and storing data emerge, every industry continues to face challenges in data control and complexity.