09 May 2011
“The wonderful thing about standards is that there are so many to choose from,” goes the old saying. There seems to be a lot of argument around whether de jure standards (those agreed by a body of people tasked with agreeing a standard) or de facto standards (those agreed by the market itself) are what matters. Lying awake one night, it struck me that the “evolution” of standards is pretty much the same as the biblical story of Babel, followed by the history of language itself through the ages.
If we start with the first computer ever made, it had to have a single means of passing information within itself, so it was, by definition, fully internally standardised. Here, we have the equivalent of the biblical common language that everyone is supposed to have spoken in the early days. However, as the people built their tower to try to reach their god, their god had a volte-face and decided to sow confusion by making them all speak different languages.
As the tower of technology has been built, we also found that another god decided that we needed to be brought down a peg or two and so visited on us the idea of multiple standards, making it impossible for computers to talk between themselves.
To this end, just as Babel ended up with groups of people who could not converse with each other, the world ended up with mainframe computers speaking EBCDIC that could not talk to the distributed world speaking ASCII.
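The gap is easy to see in practice. As a quick sketch using Python's built-in codecs (cp037 is one common EBCDIC code page; the choice of "HELLO" is just an illustration), the same letters map to entirely different bytes under the two encodings:

```python
# The letter "A" is byte 0x41 in ASCII but 0xC1 in EBCDIC (code page 037),
# so bytes written by one world are gibberish to the other without translation.
text = "HELLO"

ascii_bytes = text.encode("ascii")
ebcdic_bytes = text.encode("cp037")  # IBM EBCDIC, US/Canada code page

print(ascii_bytes.hex())   # 48454c4c4f
print(ebcdic_bytes.hex())  # c8c5d3d3d6

# A translation step recovers the same text from either form.
assert ebcdic_bytes.decode("cp037") == ascii_bytes.decode("ascii") == "HELLO"
```

Nothing in the byte streams themselves says which "language" they are in, which is why an explicit translation layer between the two worlds was unavoidable.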
In the real world, the availability of dictionaries allowed word-by-word translations to be carried out, and the use of third parties evolved to carry out translations of conversations in near real time. In the technology world, direct conversions from one standard to another have been available through point-to-point connectivity systems, and approaches such as GXS and Ricoh i-Invoicing have attempted to put in place a means of dealing with on-the-fly interactions by accepting a range of incoming standards and publishing to a range of others.
In the real world, the problem is that each language often has major internal differences through dialects: an English speaker in one place may know what a canal is but not that it is called a cut elsewhere, and a burn may mean a hospital visit in one part of the country or an opportunity to get the fishing rod out in another. On a broader scale, it gets worse; the US and the UK have been described as “countries separated by a common language”. Drive on the pavement in the UK and you will get arrested; drive off the pavement in the US and you should probably be in an off-road vehicle. Indeed, there are broader issues here – the US gallon is not the same as the UK gallon, and conversions between imperial and metric measures have led to disastrous results in the past. The technical world has the same problem: different versions of the same standard result in different dialects being spoken. Everyone thinks they are speaking the same language, but things have unfortunately changed. With bifurcated standards, we run into the US/UK issue: the same terms can mean completely different things.
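The gallon mismatch is concrete enough to put numbers on. A minimal sketch (the litre equivalents are the official definitions of the two units; the 100-gallon reading is an invented example):

```python
US_GALLON_L = 3.785411784   # litres per US gallon (exact definition)
UK_GALLON_L = 4.54609       # litres per imperial (UK) gallon (exact definition)

# If a UK reading of 100 gallons is mistakenly treated as US gallons,
# nearly 17% of the volume silently disappears.
volume_uk_gal = 100
true_litres = volume_uk_gal * UK_GALLON_L      # 454.609 L
misread_litres = volume_uk_gal * US_GALLON_L   # about 378.5 L
error_pct = (true_litres - misread_litres) / true_litres * 100
print(f"{error_pct:.1f}% of the volume lost to a 'shared' unit name")
```

The two readings share a unit name and nothing else, which is exactly the bifurcated-standards trap described above.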
Languages also tend to borrow from each other, and this can also lead to problems. In English, a guerrilla war is one in which attacks are carried out in short, sharp bursts, biting at the heels of the opposing force. In Spanish, a guerrilla war (aside from being a tautology) is just a little war – quite different things. And in the technology world, where standards emerge or evolve in a vacuum, the same thing can happen: the same terms can be used to mean completely different things. For example, IP can be the internet protocol at the network level but intellectual property at the information management level, and SaaS can be software, storage or security as a service, depending on who is having the conversation.
Since Babel, there has been an ongoing battle over which language should be the lingua franca. We have gone through various ages, including those when Greek and Latin were the languages of power, through French and Spanish, to a point at which English is broadly accepted as a common language in which a degree of understanding can be agreed across many groups. However, we have various issues even here: when spoken as a second language, English can lead to misunderstandings, the use of colloquialisms can leave many confused, and the speed of comprehension may not be the same as that of delivery. So we try to speak at a lowest-common-denominator level, using as simple a set of terms and words as possible.
This is once again exactly the same as in the technical world – as standards translators try to keep up with the different versions and strands of a standard, it can be easy to get tripped up on occasion. Finding the lowest common denominator has been tried – for example, by using SQL as the basis for the vast majority of databases in use today, yet every vendor has an “SQL+” language which is different from everyone else’s SQL dialect.
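A small sketch of the point, using Python's bundled SQLite engine (the table and data are invented for illustration): the common-denominator query runs happily, while the same intent expressed in another vendor's dialect (SQL Server's TOP spelling, here) is rejected outright:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT)")
conn.executemany("INSERT INTO users VALUES (?)", [("Ada",), ("Brian",)])

# Widely shared SQL: understood by SQLite, MySQL, PostgreSQL and others.
rows = conn.execute("SELECT name FROM users ORDER BY name LIMIT 1").fetchall()
print(rows)  # [('Ada',)]

# The same question asked in SQL Server's dialect fails on this engine.
try:
    conn.execute("SELECT TOP 1 name FROM users ORDER BY name")
except sqlite3.OperationalError as e:
    print("dialect mismatch:", e)
```

Both statements ask the same question; only one is in a dialect this particular engine speaks, which is the “SQL+” problem in miniature.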
Finally, we have the ultimate approach: a language that enables everyone to talk easily amongst themselves. This has been the goal of Esperanto, a language invented in the late 19th century to bring everyone together. Unfortunately, even the most optimistic figures show that fewer than 2 million people speak it to any level, leaving a mere 6.773 billion more to go.
It is estimated that about 6,700 distinct languages are spoken around the world. I would hesitate to guess how many technical standards there are, but feel that putting a zero or two on the end of that number may not be out of the question.
IT has created its own Tower of Babel and has been afflicted with a wealth of different standards. Instead of working to minimise this, it seems hellbent on creating more and more. Maybe it is time to stop creating new standards and instead work on how existing ones can best be used in order to achieve what the actual goal of IT should be: to help organisations in facilitating their tasks and processes.
Clive Longbottom is service director, business process analysis at Quocirca