From Fortran to Swift: The world's most important programming languages

The fine art of telling machines what to do, from 1957 to 2015

From the days when they were loaded into gargantuan machines via punchcards and tape right up to the present, programming languages have always had their passionate advocates - and equally passionate haters. Like their linguistic counterparts, programming languages are frequently described as elegant or clumsy, as beautiful or ugly. They can be verbose and flowery or almost rudely terse. Some are loose and flexible like English; others are restrictive but precise like German.

Languages come with their own histories and cultural baggage. Each has its own followers and each represents a slightly different philosophical view of the world.

There is no one-size-fits-all, no right and wrong. Like people, some are great at maths and others excel in the graphic arts. There are low-level languages that allow specialist programmers to speak directly to the memory, CPU and disks, and high-level languages that mediate this conversation via layers of abstraction so that even young children can write a program with ease.

The website Rosetta Code demonstrates how coding tasks (such as the classic "Hello World" program) are performed in many different languages. The site is "aware of 582 languages", but there are certainly many thousands more. Some have been around for decades and are still in use, others have seemingly sprung from nowhere and now have hundreds of thousands of contributors on the developers' site GitHub, while still others are little more than specialist tools used by a handful of people. So how can success be judged?

"There are only two kinds of programming languages: those people always bitch about and those nobody uses," said Bjarne Stroustrup, inventor of C++.

So which are the languages that people have bitched about over the years, and which are they bitching about now? Join Computing as we find out.

Fortran

1957, IBM, team led by John Backus

Generally considered to be the first widely used high-level programming language, FORTRAN (which later dropped the capitals to become Fortran) is still in use to this day, with the next revision of the standard, informally dubbed Fortran 2015, due in the next few years.

The original FORTRAN featured the first optimising compiler. This enabled code to run almost as fast as hand-coded assembly language while reducing the number of statements required by a factor of 20, making the practice of programming hugely more efficient at a stroke.

Early versions were loaded via punchcards, as interactive keyboards and terminals had yet to come into general use.

Fortran was developed to perform large-scale numerical computations on mainframe computers, and that remains its core (indeed only) purpose today; it is still used by scientists for such tasks. Fortran is used in around half of the high-performance computing programs that run on supercomputers.

Although it is considered a legacy language, its longevity has been increased by the fact that later languages, such as C++, were designed to interface with it.

COBOL
1959, CODASYL

If you are a developer, one blessed with a calm and patient demeanour, you could do worse than learn a bit of COBOL. Why? Because most banks and large businesses still have many lines of COBOL running on their systems (those that they haven't been able to port to Java), and the majority of technicians who understand it have either retired or been driven mad by years spent coding in COBOL. So those skills are at a premium.

What Fortran was to science, COBOL was to business. Designed by the Conference on Data Systems Languages (CODASYL) with backing from the US military, and based on a previous language called FLOW-MATIC, which was created by the redoubtable Grace Hopper, it emerged in 1959 and took the world by storm. By 1970 it was the most widely used programming language on Earth. As recently as 2001 the language was being used to process 85 per cent of business data, according to Gartner.

Which is not the same as saying it was ever popular with programmers.

The language is verbose and clunky. The famous Dutch computer scientist Edsger Dijkstra said of it: "The use of COBOL cripples the mind; its teaching should, therefore, be regarded as a criminal offence." (Mind you, he was equally unkind about Fortran, labelling it "the infantile disorder".)

COBOL was implicated in the Y2K Bug débâcle that threatened to crash the world's mainframes at the end of the last century, and the infamous masses of COBOL "spaghetti code" are blamed for banking glitches to this day.

To be fair, many of the problems associated with COBOL are a result of it being an early language and therefore subject to bad coding practices due to the absence of recognised norms.

That said, the mainframes running COBOL in banks, retailers, government departments and (gulp) nuclear power stations are remarkably stable and excel at high-volume, repetitive tasks. The "If it ain't broke..." maxim applies here. And yes, new COBOL code is still being written almost 60 years after the language's debut. So, if you fancy a challenge...

C
1972, AT&T Bell Labs, Dennis Ritchie

We've arrived at the most influential language of them all: C.

C is the mother of many other languages that came later: the reference implementations of Mathematica, MATLAB, Python, Perl and PHP are written largely in C, as are large parts of the Windows, Unix and Linux operating systems. Many of the libraries used by software programs are written in C, as are virtually all device drivers.

Fast and simple, C allows tight control over memory, files and storage. It is very efficient, makes little demand on memory, and the way it implements algorithms and data structures makes C useful for programs that are computation heavy. Through extensions (Embedded C) it has also become a mainstay of embedded computing.
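
To give a flavour of that control, here is a minimal illustrative sketch of our own (not from any C reference): the programmer claims a block of memory by hand, writes into it via direct addressing and must release it again - responsibilities C leaves entirely to the programmer:

#include <stdio.h>
#include <stdlib.h>

int main(void)
{
    /* Claim a raw block of memory: nothing initialises or guards it for us */
    int *block = malloc(4 * sizeof *block);
    if (block == NULL)
        return 1;

    for (int i = 0; i < 4; i++)
        block[i] = i * i;            /* direct writes into the block */

    for (int i = 0; i < 4; i++)
        printf("%d ", block[i]);     /* prints: 0 1 4 9 */
    printf("\n");

    free(block);                     /* the programmer, not a runtime, frees it */
    return 0;
}

Power like this is exactly what makes C fast - and what makes its mistakes so catastrophic.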

C was developed by Dennis Ritchie of AT&T's Bell Labs between 1969 and 1972. Ritchie and colleague Ken Thompson were building the Unix kernel and wanted a more efficient alternative to coding in assembly language: a "higher level" language (i.e. one more abstracted from the inner workings of the computer) that nevertheless kept fine control over data and memory. And so, after some trial and error, Ritchie created C.

In 1978, Ritchie and Brian Kernighan published the manual The C Programming Language. Considered a classic of its kind, this work also popularised the famous "Hello, World!" program, which has appeared in the tutorials of almost every language since.

In the 1980s C was taken up by start-ups like Microsoft. The rest is history, but C remains important to this day.

Today C is considered a low- or medium-level language rather than a high-level one. It is still widely used for systems programming where efficient use of resources is key, but is not the ideal choice for developing more complex user-facing applications, and while it is relatively simple in form, that doesn't equate to it being easy to use.

Later languages based on C addressed these shortcomings.

C++
1983, AT&T Bell Labs, Bjarne Stroustrup

C++ was created between 1979 and 1983 by Danish computer scientist Bjarne Stroustrup (he of the "There are only two kinds of programming languages: those people always bitch about and those nobody uses" quote we opened with).

Designed to do everything that C could do and more - for example, providing features that enabled large applications to be built with it - C++ added object-oriented programming capabilities to a C-like language, pulling in many features from an earlier language called Simula. Its mix of high- and low-level features and its introduction of classes made it more versatile than C.
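
As a hedged illustration of what those classes buy you (a toy example of our own devising, with invented names), the snippet below bundles data and the behaviour that acts on it into a single type - something plain C has no direct way to express:

#include <iostream>
#include <string>

// A class bundles state (owner, balance) with the behaviour that acts on it.
class Account {
    std::string owner;
    double balance;
public:
    Account(const std::string& o, double b) : owner(o), balance(b) {}
    void deposit(double amount) { balance += amount; }
    void report() const { std::cout << owner << " holds " << balance << "\n"; }
};

int main() {
    Account acct("Ada", 100.0);   // construct an object of the class
    acct.deposit(50.0);           // invoke behaviour attached to the data
    acct.report();                // prints: Ada holds 150
}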

Many office suites, such as Microsoft Office and OpenOffice, were written mainly in C++, as were many database management systems. The video games industry largely grew up on C++, and indeed it remains widely used there.

Other use cases include real-time and low-latency financial trading systems, embedded and partly embedded systems in aviation and military hardware, anti-virus software, and high-traffic websites such as Facebook and Google. Like C, it is used for writing other languages, drivers, libraries and systems tools. Anywhere that fast and efficient code is needed, you are likely to find C++.

It is largely backward compatible with C and even interfaces with Fortran, enabling it to provide a useful bridge between past and present.

While it has a lot of fans, it has a fairly steep learning curve on account of its mix of high- and low-level features. Certainly, it has some high-profile critics.

"C++ is a horrible language," said Linux creator Linus Torvalds in 2007, inadvertently providing grist to Stroustrup's mill. "It's made more horrible by the fact that a lot of sub-standard programmers use it, to the point where it's much much easier to generate total and utter crap with it."

Objective-C
1983, Stepstone, Brad Cox and Tom Love

At the same time that Stroustrup was creating C++, Brad Cox and Tom Love of software company Stepstone were also adding object-oriented programming capabilities to C, in their case, borrowing many features from Xerox's Smalltalk language rather than from Simula.

Lacking AT&T's marketing clout, Objective-C initially failed to take off in the way that C++ did. However, in 1988 it was licensed by Californian workstation vendor NeXT (founder and CEO: one Steve Jobs), which extended the compiler and added libraries, eventually using it in the creation of the NeXTstep programming platform/operating system, as well as the OpenStep API, a specification of NeXTstep's programming interfaces developed with Sun so that it could run on Sun's SPARC-based hardware.

In 1996 NeXT was acquired by Apple, which used OpenStep as the basis of its Mac OS X operating system. OS X therefore includes Objective-C and the libraries and add-ons designed by NeXT.

These components also found their way into the Xcode IDE and Cocoa API, which are used to develop applications for Apple's OS X and mobile iOS operating systems.

So, Objective-C found its way to prominence by a very roundabout route. Mac and iPhone developers currently use it as their primary development language, making it one of the world's most-used languages. But outside the Apple ecosystem it is less popular than other variants of C, and its role within Apple may be diminished by the arrival of Swift. While still new, Swift has a simpler syntax and is likely, ultimately, to be favoured over Objective-C by developers.

Haskell

1990, Multiple developers

Haskell? I've barely heard of it - and it's from 1990! Surely that makes it one of Bjarne Stroustrup's languages that no one uses? Well, until three or four years ago you'd have had a point. But there has been a steady rise of interest in functional programming languages such as Haskell in recent times, hence its inclusion here.

Functional programming is hard to explain in a few paragraphs, but one of its advantages is that it reduces the size of the code, making it easier to read and thus maintain. Functional programming languages are constrained. Unlike in C, say, whole classes of mistakes that can cause a catastrophe at runtime are impossible to make - Haskell's compiler simply won't let you - and testing time is cut as errors are caught along the way. Overall development time can be reduced as a result.
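
To make that concrete, here is a hedged toy example of our own (not from any particular Haskell tutorial). The type signature is enforced by the compiler, so calling the function with anything other than a list of numbers is rejected before the program can run at all:

-- A complete program. Calling `average` with, say, a string would be
-- a compile-time error, not a runtime crash.
average :: [Double] -> Double
average xs = sum xs / fromIntegral (length xs)

main :: IO ()
main = print (average [1, 2, 3, 4])   -- prints 2.5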

Despite these advantages, programming in functional languages is quite different from using imperative languages like C++ and Java, hence it is still a minority pursuit, albeit a minority that's growing fast.

Haskell began life in response to a call in 1987 to define an open standard for such functional programming languages. Academics have been developing the language to that standard for 25 years. Recently, new libraries and extra tools have been added, taking it out of the realms of academia and into wider usage.

Thanks to the new libraries and tools, and because of the increasing requirement to scale programs up to run on multicore processors, Haskell currently has a buzz about it. Along with other languages designed with concurrency in mind, such as Erlang, Go and Clojure, Haskell can make much fuller use of the CPU resources available. Millions of lightweight threads can run concurrently, multiplexed over a handful of operating-system threads, making it ideal for web programming (e.g. chat applications) and high-performance multi-threaded programs, as each request can be given its own thread, simplifying the logic of the architecture.
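
As a minimal sketch of those lightweight threads (an illustrative example of our own; the thread count is picked arbitrarily):

import Control.Concurrent (forkIO, threadDelay)
import Control.Monad (forM_)

-- Each forkIO spawns a lightweight thread scheduled by the Haskell
-- runtime rather than the operating system; tens of thousands are routine.
main :: IO ()
main = do
  forM_ [1 .. 10000 :: Int] $ \_ ->
    forkIO (return ())        -- a trivial unit of work per thread
  threadDelay 100000          -- a 0.1s pause to let the threads finish
  putStrLn "spawned 10,000 lightweight threads"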

It has found favour with investment banks, such as Standard Chartered, a big Haskell user, that need to produce trading software quickly with minimal bugs and errors. Other use cases are in artificial intelligence and in writing compilers, services and tools. Web giants Facebook and Google, among others, use Haskell for internal tools.

Some languages are still young at 25.

Python
1991, CWI, Guido van Rossum

Created by Dutch programmer and British comedy fan Guido van Rossum, Python was designed to be readable and easy to use. Python is the one kids learn at school. You don't need all those curly braces as with C, or the boilerplate demanded by Java, and you don't have to compile it. The learning curve is shallow. As an added bonus, some Python tutorials teach the basics of Monty Python at the same time:

The Argument Clinic

paid_for_argument = True   # defined here so the snippet actually runs

print("An argument is a connected series of statements intended to establish a proposition.")  # Palin
if paid_for_argument:
    print("No it isn't!")  # Cleese
    print("Yes it is!")    # Palin
    # ...and so on

Python is an open-source language that's freely usable and distributable, even for commercial use. It comes with huge numbers of modules and libraries, making it very versatile. It's certainly not just for kids: Python is one of the main languages for data mining, machine learning and scientific computing.

"It is often used like the mortar in a wall" an IT architect told Computing. You can wrap "bricks" of code from more challenging but faster lower-level languages, like C++, in easy-to-manage Python, so you get speed combined with usability.

So what's not to like? Not much, except perhaps in the eyes of some snooty developers (has anyone asked Mr Torvalds?), who object that it allows non-programmers to write passable code. It's not the fastest language out there, but then again you wouldn't use it to build a real-time trading system.

Python seems to be a rare example of a language that is widely used and yet not bitched about (much).

Java
1991 (first released 1995), Sun Microsystems, lead developer James Gosling

Java is probably the most used programming language in the world, depending on who you believe. It was developed as an alternative to C and C++ with the idea that you could "write once, run anywhere". This it did by running on a Java virtual machine (JVM) - although in the early days at least the realisation of write-once-run-anywhere was a bit virtual too.

It was also designed to be highly compatible with C/C++, and similar enough that C++ developers could easily make the switch. Because it was essentially backward-facing, later developments had to be bolted on, with the result that it is something of a hotchpotch.

However, it can run on any computer or mobile device able to run the JVM; it comes with a huge range of libraries and tools; and it does a great many things pretty well. All this has made Java the enterprise language of choice (something also helped by the massive marketing spend of Sun, and now Oracle).

So Java is the language big business turns to for server-side development, integration, back-office functions, messaging and the like. It is mature, stable and well understood, and there are millions of developers who know it. There is basically nothing you can't do in Java, although it might not always be the best choice. And because it is open source, there is no risk of vendor lock-in - as there was until very recently with Microsoft's C#.

In short, Java is just the sort of thing that enterprise likes.

It is also the core language of Google's Android mobile operating system, bringing in another huge user base. Hadoop and Apache Cassandra are written in Java, and it is heavily used by web giants Google, eBay, LinkedIn and Amazon. The popular game Minecraft is written in Java, too.

Developers seem to have a bit of a love-hate thing with Java, though. It's certainly not cool, and it is tiresomely verbose, but it works well and it certainly pays the bills (and much else besides).

JavaScript
1995, Netscape, Brendan Eich

If you want a reasonably authentic 1995 web experience, install the NoScript browser plugin, which blocks JavaScript by default, and go for a whirl. Well, it's not a whirl. Everything is still. Nothing moves. It's a little dead and eerie.

JavaScript brought much-needed movement to web pages and changed the look and feel of the web forever. Emerging just as the web took off, it enabled all sorts of whizzy "fun stuff" to happen on a web page - including those annoying cursor trails that followed you around when you inadvertently moused over a portion of a page. Such japes. From a developer's point of view there were many other annoyances resulting from JavaScript's rapid and unplanned growth. However, these have mostly been dealt with.

JavaScript has now grown up and is a ubiquitous part of the web. After HTML (which most people consider to be a mark-up language rather than a programming one - discuss) it is the most common element of website code. It is used to allow client-side scripts to interact with the user, control the browser, communicate asynchronously and alter content. It can handle menus, pop-ups, ads, tracking scripts, social media buttons, etc. If it moves, there's probably JavaScript in there somewhere.
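
For instance, this hedged three-line sketch (assuming a page containing a button with the id greet and a paragraph with the id message) shows the basic pattern of reacting to the user and altering content:

// React to a click and rewrite part of the page - no round trip to the server.
document.querySelector('#greet').addEventListener('click', function () {
  document.querySelector('#message').textContent = 'Hello, web!';
});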

But that's not all it can do.

HTML5 specifies a range of APIs that are accessed through JavaScript, enabling web pages to act as web applications. Google Docs is an example of a web application written in JavaScript. There is also a lot of potential for using JavaScript and HTML5 to write mobile apps.

And there's more. The arrival of Node.js, a server-side runtime built on Google's high-speed V8 JavaScript engine (the one used in the Chrome browser), has allowed the language to be deployed on the server too. JavaScript is moving towards the status of being the "one true language" for web applications, running everywhere and capable of doing pretty much anything asked of it.
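
By way of a hedged illustration (the port number is arbitrary), the few lines below are a complete Node.js web server:

// A complete web server, written in the same language that runs in the browser.
var http = require('http');

http.createServer(function (request, response) {
  response.writeHead(200, { 'Content-Type': 'text/plain' });
  response.end('Hello from server-side JavaScript\n');
}).listen(8080, function () {
  console.log('Listening on http://localhost:8080');
});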

No wonder JavaScript developers are in high demand.

PHP

1995, The PHP Group, Rasmus Lerdorf

While JavaScript animated the web, PHP, which emerged at the same time, let users build dynamic web pages, connect those pages to databases and look after things like form handling.

Like JavaScript, PHP has become deeply embedded in the way the web works. On the server side, PHP is used by 81.9 per cent of all websites whose server-side programming language is known, according to a W3Techs survey.

PHP is the "P" in the LAMP stack (Linux, Apache, MySQL, PHP), the default way of deploying web applications. It is also widely used to design web pages: WordPress, Wikipedia and Etsy are all created with PHP.

PHP is also the cause of more head-shaped dents in more desks than perhaps any other language (COBOL possibly excepted). This is because it has grown organically from something designed to create simple templated web pages (PHP originally stood for Personal Home Page) into something much larger in scope. Because of the disorganised way it has grown (JavaScript had similar problems but seems to have overcome them more quickly than PHP), basics like the naming of functions and the ordering of their parameters are inconsistent. What works in one place doesn't work in another.

What's more, PHP allows for highly insecure programming. Cross-site scripting and SQL injection vulnerabilities have been made much more exploitable by the way many PHP web pages have been written, although these issues have now largely been addressed.

It is easy to get started with PHP - but even easier to get bogged down.

C#
2000, Microsoft, team led by Anders Hejlsberg

C# is yet another language based on C. It was designed for use in Microsoft's .NET framework for creating Windows software. If you are developing applications for the PC, for Windows servers (via ASP.NET) or for Windows Mobile, C# in the Visual Studio IDE is the default choice.

By design it is an easy-to-use, object-oriented language that compiles to bytecode for Microsoft's Common Language Runtime. It shares quite a lot of features with Java, which it closely resembles, such as just-in-time compilation at runtime, garbage collection and the use of curly braces - a legacy of both languages' C/C++ roots.

However, it is leaner than Java and adds some additional features. As a proprietary technology until very recently, C# was restricted to the Windows environment (notwithstanding a few imperfect attempts to port .NET to Linux, notably Mono, begun by Ximian, which provides an open-source C# compiler and runtime). Last year, however, Microsoft open-sourced much of the .NET framework and announced a new open-source C# compiler called Roslyn, which was released under the Apache 2.0 licence.

"Pushing that button [to launch Roslyn] was one of the more impactful clicks of my career," said C#'s lead architect, Anders Hejlsberg.

With Microsoft now backing .NET on Linux and Mac, C# may come to rival Java as the all-purpose language of choice for developing enterprise applications that can run anywhere.

Scala
2003, École Polytechnique Fédérale de Lausanne, Martin Odersky et al

As a general-purpose language that runs everywhere, Java is hard to beat, but it is verbose and it can be hard to follow the logic by reading the code. This has led to the emergence of other languages that are built on Java's runtime, the Java virtual machine (JVM), but that are more stripped down and simpler and quicker to code in. Scala is one of these.

Because it runs on the JVM, Android apps can be developed in Scala. Some see Scala as heir apparent to Java (dubbing it "Java++") since it is compatible with existing Java applications. Like Java, Scala is object-oriented, but it also supports functional programming (see Haskell) and has many other features that Java does not - some of which are shared with another possible Java competitor: C#.
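
As a hedged toy example of that blend (our own, with invented names): a case class gives concise object-orientation in a single line, while filter and map are functional operations, all in far fewer lines than the Java equivalent would take:

// A complete program: one line defines a full class; the collection
// operations below are functional - no loops, no mutation.
case class Person(name: String, age: Int)

object Demo extends App {
  val people = List(Person("Ada", 36), Person("Alan", 41), Person("Grace", 52))

  val seniors = people.filter(_.age > 40).map(_.name)

  println(seniors)   // prints: List(Alan, Grace)
}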

Scala is not alone in being a non-Java language that runs on the JVM (others include Clojure, JRuby and Jython), but when asked in 2008 which JVM language he would use other than Java itself, James Gosling, Java's original designer, answered Scala.

In 2011, the Scala team won a five-year research grant of €2.3m from the European Research Council, and with private funding lead developer Martin Odersky and collaborators launched a commercial company to provide support and training for Scala users.

High-profile users of Scala include Twitter, Coursera and The Guardian. It is also popular with data scientists.

Go
2009, Google, Robert Griesemer, Rob Pike, Ken Thompson

The Go language (or Golang) was developed by Google to meet the needs of a world defined by multicore processors, networked systems, compute clusters and web applications.

Emerging in 2009, it was designed to compile quickly, with modern features such as concurrency and garbage collection built in. Go's design also aimed to make managing dependencies easier, and to enable applications to scale up more readily across the environments listed above.

Go applications are easy to write and easy to install, as they compile to a single executable rather than requiring dependencies to be installed alongside them. They can also take advantage of modern multicore processors without too much tinkering, making Go suitable for creating web applications for use by large numbers of concurrent users.
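
To give a flavour of that built-in concurrency, here is a minimal hedged sketch of our own in which work is fanned out to goroutines - Go's lightweight threads - and the results are collected over a channel:

package main

import "fmt"

// square runs in its own goroutine and reports back over the channel.
func square(n int, results chan<- int) {
    results <- n * n
}

func main() {
    results := make(chan int)

    for n := 1; n <= 4; n++ {
        go square(n, results) // "go" launches each call concurrently
    }

    for i := 0; i < 4; i++ {
        fmt.Println(<-results) // receive the four results (in any order)
    }
}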

And it can run on Windows, Linux and Mac, and even on small devices like the Raspberry Pi. That said, it is still a young language and lacks the depth of libraries and other add-ons that older languages enjoy.

Backed by Google, though, there can be little doubt that Go will "go" places. Hey Google, you can use that one if you like. Please get in touch about fees. Thanks.

Swift
2014, Apple, Chris Lattner

Last, but certainly not least, is Apple's own developed-in-house language, Swift.

Engineer Chris Lattner began working on the new language in 2010 in order to fix aspects of Objective-C that developers often complain about, including its confusing syntax. He also introduced type inference, which allows data types to be worked out by the compiler rather than having to be written into the code (Scala does something similar).
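
A hedged toy example of that inference (the values are invented for illustration): no types are written below, yet each constant gets a precise compile-time type, and mixing them carelessly is still an error:

// The compiler infers String, Int and Double from the right-hand sides.
let greeting = "Hello, Swift"
let year = 2014
let pi = 3.14159

// let sum = year + pi   // would not compile: Int + Double needs an explicit conversion

print("\(greeting), released in \(year)")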

Just seven months after its release in September 2014, it had already made it into the top 20 languages on the development hub GitHub, with thousands of developers creating repositories to work with it.

Swift is likely, eventually, to replace Objective-C as Apple's app development language of choice. There are hundreds of thousands of developers working on apps for Apple's App Store, not all of them professional developers, and a language that makes their life easier is likely to be welcomed with open arms.