Social networking site Twitter told delegates at Oracle's MySQL Connect event in San Francisco today that it uses MySQL because its engineers understand the system and because it can handle the high volume of queries the site receives.
Jeremy Cole, DBA Team Manager at Twitter, said that the site processes around 400 million tweets per day, all of which have to be stored in the firm's databases.
"Processing 20,000 tweets per second is our current record," he said, explaining the scale of the task before his team.
Cole described his mission as to "keep the tweets flowing". He explained that this involves adding capacity, fixing broken databases and replacing hardware across the firm's several thousand MySQL servers.
One of his recent projects was to add a server-side statement timeout that kills queries running for more than five seconds, preventing a broken or inefficient query from tying up precious resources and delaying everything queued behind it.
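The idea behind such a query killer can be sketched as follows. This is a hypothetical client-side illustration, not Twitter's actual implementation (theirs ran inside the server itself): it inspects the output of MySQL's SHOW FULL PROCESSLIST and issues KILL QUERY for any statement that has run longer than the threshold. The five-second limit comes from the article; the function and variable names are invented for the example.

```python
TIMEOUT_SECONDS = 5  # queries running longer than this get killed

def queries_to_kill(processlist, timeout=TIMEOUT_SECONDS):
    """Given rows from SHOW FULL PROCESSLIST (as dicts with 'Id',
    'Command' and 'Time' keys), return the connection IDs of
    statements that have exceeded the timeout. Idle connections
    ('Sleep') are left alone; only running queries are candidates."""
    return [
        row["Id"]
        for row in processlist
        if row["Command"] == "Query" and row["Time"] > timeout
    ]

# Driving it from a monitoring client would look roughly like this
# (PyMySQL usage is an assumption for illustration):
#
#   import pymysql
#   conn = pymysql.connect(host="db1", user="admin", password="...")
#   with conn.cursor(pymysql.cursors.DictCursor) as cur:
#       cur.execute("SHOW FULL PROCESSLIST")
#       for thread_id in queries_to_kill(cur.fetchall()):
#           # KILL QUERY aborts the statement but keeps the
#           # client's connection open
#           cur.execute("KILL QUERY %s", (thread_id,))
```

Later MySQL releases (5.7 onwards) ship a built-in `max_execution_time` system variable that enforces a per-statement timeout inside the server, making external watchdogs like this largely unnecessary.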
He is also working on optimising Twitter's databases to run on SSDs (solid-state drives), to which most of the firm's server fleet is due to migrate.
Twitter uses MySQL in preference to other relational databases principally because it has the in-house expertise to run it. Cole said that he has contributed to MySQL's design (as an open source project it welcomes contributions from its community) and that his team has extensive experience operating the system.
"MySQL is operable, we know how to use it and upgrade and downgrade it, to push out new releases and fix bugs.
"It's also a high-performance system, which means we can have most of our fleet running tens of thousands of queries per second per server. Twitter is all about real time, so if a query takes 50 seconds to run, that doesn't help us. We measure latency in terms of microseconds, so if it's fast, that helps us. Other RDBMS [relational database management systems] tend not to be faster despite their claims."