From Ada to Zuckerberg: History's most important IT people

Key figures in the development of technology through the ages

Readers of Computing will know better than most how wide-reaching and fantastic the effects of IT have been, and continue to be, on the human race. The leaps and bounds technology continues to make year in, year out are staggering.

But who are the people that made information technology what it is today?

From conceiving the first computing algorithms and building the first transistor-based processors, to turning established practice into the World Wide Web and social networking, Computing has identified the key individuals in an epic chronology spanning more than three centuries.

These are the people who built IT from nothing into the global phenomenon to which we owe our careers and lifestyles today. Who is the king of relational databases, and who is the queen of mobile phones? Who wrote a stack of code as tall as themselves to help send the first humans to the moon?

And who pioneered the induction motor and AC electricity, but ended their life flogging unlikely energy weapons to global superpowers while taking care of injured pigeons?

Read on to find out the potted histories of the smart, brave, visionary and often slightly unhinged souls who made technology what it is today, and without whom we'd still be counting with little stones and making spreadsheets out of sticks and leaves.

Next: Gottfried Wilhelm Leibniz and the 17th century discovery of binary

Gottfried Wilhelm Leibniz

This 17th century German polymath and philosopher was responsible for several of technology's firsts. Leibniz is believed to have developed calculus completely independently of Isaac Newton, and he went on to become a big deal in the nascent world of mechanical calculators.

In 1685, Leibniz successfully described the Pinwheel calculator - something that would not be brought into mass public use until the 1870s.

But it is for binary, that essential computational element, that Leibniz should ultimately be remembered. His interpretation of the hexagrams of the ancient Chinese divination text the I Ching as binary numbers fed into the "characteristica universalis" [universal characteristic] he had been developing as a single framework to express maths, science and metaphysics.

While he didn't quite achieve that lofty goal, his work with the I Ching correctly identified the "on/off" nature of propositions that led to computational logic.
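
As a minimal modern illustration (in Python, purely for the sake of example - Leibniz of course used no such notation), the idea he spotted is that each hexagram's six solid or broken lines can be read as a six-bit binary number:

```python
# A minimal illustration (not Leibniz's own notation): reading an I Ching
# hexagram's six lines as a six-bit binary number, solid line = 1, broken = 0.
def hexagram_to_int(lines):
    """lines: sequence of six booleans, True for a solid (yang) line."""
    value = 0
    for line in lines:
        value = (value << 1) | int(line)
    return value

# Hexagram 1 ("Heaven") is six solid lines -> 0b111111 -> 63
print(hexagram_to_int([True] * 6))    # 63
# Hexagram 2 ("Earth") is six broken lines -> 0b000000 -> 0
print(hexagram_to_int([False] * 6))   # 0
```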

Next: Charles Babbage somehow makes a computer before the arrival of electricity or silicon

Charles Babbage

Another polymath (this list is going to be full of them), Babbage helped to found the Royal Astronomical Society, pioneered an early form of uniform postal rate and kickstarted what is now Ordnance Survey Ireland.

But Babbage is best known for his work in early computers. This is perhaps ironic, as he never actually completed either of the devices most famously linked with his name - the Difference Engine and the Analytical Engine.

The concept of a mechanical device to carry out calculations was actually dreamt up by J. H. Müller, an engineer in the Hessian army, who published his idea in 1786. Unfortunately, there was no funding to make it a reality.

Babbage picked up the idea in June 1822, proposing it to the Royal Astronomical Society and gaining the attention of the British government which, showing that history usually repeats itself, began ploughing money into a scheme that swiftly spiralled out of control.

Babbage asked for £1,700 (around £72,000 in today's money) to begin work in 1823; by 1842 he had spent almost £1m in modern terms and still not produced a working Difference Engine. The government canned the project.

It was a pity, as Babbage's designs anticipated the architecture of the modern computer, with data and program memory, instruction-based input and operation, and a separate I/O unit.

Undeterred by failure and still believing in his ideas, Babbage next conceptualised the Analytical Engine - a more complex and capable take on the Difference Engine that would nonetheless operate in a simpler way (early shades of Moore's Law here, perhaps?).

Packing an arithmetic logic unit (ALU), control flow in the form of conditional branches and loops, and integrated memory, the Analytical Engine was very much a complete computer on paper - with the first programs for it to come later from Ada Lovelace.

It too went unfinished, with Babbage tinkering with various parts of it until his death in 1871. He had also designed a "Difference Engine No. 2", and his difference engine work eventually inspired Per Georg Scheutz - a Swedish lawyer and inventor - to build machines based on it.

By 1859, Scheutz had even managed to sell one of his Difference Engine-inspired machines to the British government, perhaps at least partly justifying Babbage's original eye-watering R&D sinkhole.

Babbage's Difference Engine No. 2 was finally built to his designs in the late 1980s and early 90s, and is now on display in London's Science Museum, along with half of Babbage's brain (the other half was dissected to try to discover why he was so clever).

Charles Babbage left another interesting legacy. His work inspired the 1990 alternate history novel The Difference Engine by William Gibson and Bruce Sterling - a book credited with kicking off the steampunk genre by depicting a 19th century world where IT underpins civilisation.

Next: Ada Lovelace gets the jump on Bill Gates and invents software for Babbage's hardware

Ada Lovelace

Much has been said, written and even drawn about this true luminary of early computing, and most of it is true (except perhaps the ray guns). If Charles Babbage designed the first computer, Lovelace conceptualised the software that would run on it.

As Lord Byron's only legitimate daughter, Augusta "Ada", Countess of Lovelace, grew up under the watchful eyes of her mother, Anne Isabella Noel, who steered her towards maths and logic in an effort to keep her from developing the poetry-fuelled "insanity" of her infamous father.

Lovelace's own obsession with the concept of madness led her to an interest in phrenology, which in turn led her to attempt to create a mathematical model of the brain - a "calculus of the nervous system", as she called it.

While the medical and scientific communities would still kill for such a model today, Lovelace's interest in the inner workings of complex machinery eventually led her to Babbage and his then-unbuilt Analytical Engine.

While Lovelace, in defining what the Analytical Engine could do, perhaps unwisely dismissed artificial intelligence (an objection later taken on by Alan Turing, of whom more later), the copious notes she added to her translation of a paper based on a seminar - one might say these days a "keynote" - given at the University of Turin resulted in the first description not only of a modern computer, but of the software that would run on it.

Lovelace's notes - completed in 1843 after the best part of a year's work - were over a hundred years ahead of their time and, as well as laying the groundwork for programming, predicted the ability of computers to do almost anything, including abstract operations such as weaving textile patterns and writing music.

Ada Lovelace invented the synthesiser. Discuss.
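
Joking aside, the most celebrated of those notes - Note G - set out a step-by-step method for the Analytical Engine to compute Bernoulli numbers, generally regarded as the first published computer program. A loose modern rendering of the same calculation (Python here, nothing like Lovelace's tabular notation) might look like this:

```python
# A loose modern sketch of the calculation in Lovelace's Note G: generating
# Bernoulli numbers via the standard recurrence (not her tabular method).
from fractions import Fraction
from math import comb

def bernoulli(n):
    """Return the Bernoulli numbers B_0..B_n (using the B_1 = -1/2 convention)."""
    B = [Fraction(1)]
    for m in range(1, n + 1):
        total = sum(comb(m + 1, j) * B[j] for j in range(m))
        B.append(-total / (m + 1))
    return B

print([str(b) for b in bernoulli(8)])
# ['1', '-1/2', '1/6', '0', '-1/30', '0', '1/42', '0', '-1/30']
```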

Next: Nikola Tesla - forgotten genius or misguided crank?

Nikola Tesla

How could one person popularise alternating current, the induction motor, x-rays and radio waves, and sketch out a method for clean, free and potentially limitless energy for the whole world, yet end up almost deleted from history?

The answer: media and political spin. The subject in this case was Nikola Tesla.

Born in the village of Smiljan in what was then the Austrian Empire but is now Croatia, Tesla ended up working under phonograph and electric light inventor Thomas Edison.

Tasked with redesigning the Edison Company's direct current (DC) generators, Tesla managed to do so with huge improvements to service and economy. He had apparently been promised $50,000 if he could achieve the task, but Edison instead informed him that he obviously didn't "understand... American humour", after which Tesla quit.

A failed startup and a bit of ditch-digging later, Tesla hooked up with two entrepreneurs who helped him form the Tesla Electric Company. There, building on principles established by Michael Faraday and on the later work of French engineer Hippolyte Pixii in 1832, Tesla perfected his induction motor, introducing the world at large to alternating current (AC) in 1887.

AC was far better suited to long-distance power transmission, but Tesla's licensing of the induction motor system to George Westinghouse's electric company, which competed with Edison, who held all the DC patents, kicked off a period known as the War of Currents, as each company effectively tried to prove its own method of electric current was better than the other's.

By 1888, this 'war' had become so intense that Westinghouse's ongoing development of Tesla's motor had to be put on hold after so many patents had been purchased that there was no money left. Edison's company kept going with its own AC technology, but by 1892 Edison had lost control of his company anyway and it was consolidated into General Electric - still one of the world's largest electrical companies to this day.

Still, none of this friction stopped a (literally) shining moment in Tesla's life, when the 1893 World's Columbian Exposition in Chicago was fully lit by thousands of Westinghouse bulbs powered by the "Tesla Polyphase System". Visitors were wowed by bright bulbs covering an entire exhibition hall - a tantalising glimpse of the future.

Throughout the 1890s, Tesla went on to crack the ongoing problem of utilising Niagara Falls' potential as a power generator - allowing him to fully power a nearby town with distributed AC. He also tackled the matter of "invisible" radiant energy damaging film in his lab experiments, inadvertently discovering x-rays but losing his early experiments in a mysterious fire that burned his lab down in March 1895.

Wilhelm Röntgen officially discovered x-rays later that same year.

Tesla's theories on transmission by radio waves date back to 1893, were widely discussed in the media at the time, and were continued in his experiments at the Gerlach Hotel - where he lived - in 1896. Tesla even showed off a radio-controlled boat in 1898, but it was of course Guglielmo Marconi who made the historic first transatlantic radio transmission in 1901.

An increasingly embittered Tesla claimed 17 of his patents were used to achieve it, but had no official association with that epoch-making moment. A pattern was beginning to emerge.

By 1899, Tesla had moved to his infamous lab in Colorado Springs, and this is where some of his work becomes difficult to follow. He claimed to be conducting wireless telegraphy experiments, but at various points people walking the streets of Cripple Creek, 15 miles away, claimed to see sparks beneath their feet or leaping between metallic objects, and light bulbs within range of the lab glowed strangely.

It's a period of Tesla's career that has been often mythologised, with Christopher Nolan's 2006 film The Prestige seeing Tesla (depicted, fittingly, by David Bowie) surrounding his lab with glowing fields of wirelessly powered light bulbs, and accidentally developing a machine that could clone objects.

Tesla even went on record to say he'd discovered signals from space that equated to alien messages from other worlds.

Setting up at Wardenclyffe in Shoreham, New York, in 1900, Tesla erected an enormous transmitting tower and began talking about the wireless transmission of actual electric power, in the same way radio waves are transmitted. Around the same time, he successfully developed a bladeless turbine and a high-speed oscillator that was claimed to have caused an earthquake in New York.

Tesla even (apparently) missed out on a Nobel Prize in 1915 due to his animosity with Thomas Edison (who was also denied the accolade for the same reason, if stories are to be believed).

Nikola Tesla ended his years living alone in the New Yorker Hotel, trying to sell plans for a "teleforce" directed energy weapon he'd designed after studying the Van de Graaff generator. With the Second World War brewing, various world governments were showing unusual interest and Tesla's claims of having built and even tested the device got wilder.

"It is not an experiment ... I have built, demonstrated and used it. Only a little time will pass before I can give it to the world," he boasted, while asking for increasingly huge sums of money for the blueprints.

Tesla died surrounded by the pigeons he had taken in from the streets to nurse back to health. His life and career had many standout moments, but the whole has too often miscast him as an eccentric and a crackpot. Without Tesla's work on AC and induction motors, though, it's unlikely technology would be quite where it is today.

Next: Alan Turing cracks codes, and sets up Philip K Dick and Arthur C Clarke for life

Alan Turing

Usually recognised as the founder of theoretical computer science, and most directly linked these days with his pioneering work in artificial intelligence - which crops up every few weeks as yet another project is argued to have passed or failed the famous Turing Test of a machine's ability to 'fool' a person into thinking it's human - Alan Turing is still very much a key figure in contemporary conversations about IT.

As depicted by Benedict Cumberbatch in 2014's The Imitation Game, Turing's most celebrated moments perhaps arrived during World War II, when his cryptanalysis work at Bletchley Park produced all manner of machines and methods for breaking enemy codes - most famously the electromechanical "bombe" - and, eventually, solutions to Enigma, the cipher used by the German navy. His work on Enigma is widely thought to have contributed hugely to the Allied victory.

As a student at Cambridge University, Turing took on and reformulated mathematician Kurt Gödel's pioneering work on proof and computation, replacing Gödel's arithmetic-based formal language with hypothetical, abstract "machines" that spat out results on tapes according to whatever rules were fed into them. While simple, these machines could simulate the logic of any computer algorithm - which also proved Babbage and Lovelace had definitely been onto something.
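
To get a feel for how little machinery Turing's idea needs - this is a toy sketch in Python, not anything from his 1936 paper - a handful of states, a tape and a transition table are enough. The trivial machine below just flips every bit on its tape and halts:

```python
# A toy Turing machine sketch: states, a tape, and a transition table.
# This illustrative machine simply inverts a tape of 0s and 1s, then halts.
def run_turing_machine(tape, rules, state="start", head=0):
    tape = list(tape)
    while state != "halt":
        symbol = tape[head] if head < len(tape) else "_"
        new_symbol, move, state = rules[(state, symbol)]
        if head < len(tape):
            tape[head] = new_symbol
        else:
            tape.append(new_symbol)
        head += 1 if move == "R" else -1
    return "".join(tape)

# Transition table: (state, read symbol) -> (write symbol, move, next state)
invert_rules = {
    ("start", "0"): ("1", "R", "start"),
    ("start", "1"): ("0", "R", "start"),
    ("start", "_"): ("_", "R", "halt"),   # blank cell reached: stop
}

print(run_turing_machine("1011", invert_rules))  # -> "0100_"
```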

In 1952, at the age of 39, Turing was rewarded for his overall services to the history of IT and code-breaking feats in World War II by being charged with gross indecency on account of admitting to having a relationship with a man.

With a choice of imprisonment or probation, Turing took the latter, despite its condition that he undergo a treatment of synthetic oestrogen, which made him impotent and caused gynaecomastia. The conviction also took away his government security clearance, meaning he was no longer able to continue consultancy for GCHQ.

Turing was found dead, in his own bed, on 8 June 1954, killed by cyanide poisoning. A half-eaten apple was found next to him, presumed to be laced with cyanide but never tested for it. An inquest delivered an official verdict of suicide.

Turing was officially pardoned for gross indecency by Queen Elizabeth II on 24 December 2013, after being recognised as a hero and inspiration by both society at large and the scientific and LGBT communities for many decades.

Next: Jean E. Sammet writes the first legitimate programming language

Jean E. Sammet

The first person in this list who is still living, New Yorker Sammet first came to prominence in 1951 when, in her early twenties and armed with a mathematics degree (despite an early bump in her education when she was denied admission to the Bronx High School of Science for being a girl), she took a job at the Metropolitan Life Insurance Company as a trainee risk professional. An in-house training scheme introduced her to punch card-driven accounting machines, which fascinated her.

After a couple more false starts in teaching and mathematical analysis, she finally entered programming as a career, joining electronics firm Sperry Gyroscope and building a loading program for its Sperry Electronic Digital Automatic Computer (Speedac).

A company merger with Remington Rand in 1955 made Sammet a colleague of Grace Hopper, who had already conceptualised the idea of machine-independent programming languages, not to mention popularising the term "debugging".

By 1961, Sammet was at IBM, having been a member of the world's first COBOL group, and she went on to develop FORMAC - the FORmula MAnipulation Compiler - an early algebra system built on top of the rather impenetrable FORTRAN language.

FORMAC is credited with being the first widely used general programming language for manipulating non-numeric algebraic expressions.
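
FORMAC itself is long gone, but the idea it pioneered - a program manipulating algebra symbolically rather than crunching numbers - lives on. As a hedged modern analogue (using the Python sympy library, nothing to do with Sammet's own code), this is the kind of manipulation it made possible:

```python
# A modern analogue of what FORMAC did in the 1960s: manipulating an
# algebraic expression symbolically, not numerically (uses sympy, not FORMAC).
from sympy import symbols, expand, diff

x, y = symbols("x y")
expr = (x + y) ** 3

print(expand(expr))   # x**3 + 3*x**2*y + 3*x*y**2 + y**3
print(diff(expr, x))  # 3*(x + y)**2
```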

Next: Margaret Hamilton goes to infinity and beyond (well, the moon anyway)

Margaret Hamilton

During the 1960s, Margaret Hamilton was director of the software engineering division (a phrase she in fact coined) of MIT's Instrumentation Laboratory, where she led the team that developed the on-board flight software for the Apollo space programme.

The work she personally contributed also prevented the Apollo 11 moon landing from aborting at a critical point just before touchdown, when several computer alarms were triggered by an overload of incoming data - the blame later being levelled at a faulty checklist.

However, the robust architecture that Hamilton's team designed meant that the landing system prioritised important tasks over lower-level ones, and was able to keep functioning without auto-aborting due to the errors, which turned out to be false alarms.

Hamilton was famously photographed standing beside a printout of the Apollo Guidance Computer source code which, bound into books and stacked up, equals her own height.

As well as her crucial part in space travel history, Hamilton is also credited with developing the concepts of asynchronous software, priority scheduling and end-to-end testing - all core practices of any decent software development team to this day.
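
To make the priority scheduling idea concrete - and this is a loose, hypothetical sketch of the general concept in Python, not the Apollo Guidance Computer's actual executive - a scheduler that always serves the most important waiting job first fits in a few lines:

```python
import heapq

# A loose sketch of priority scheduling: lower number = more important job.
# Not Apollo code - just the general idea of important tasks running first.
class PriorityScheduler:
    def __init__(self):
        self._queue = []

    def submit(self, priority, name):
        heapq.heappush(self._queue, (priority, name))

    def run(self):
        while self._queue:
            priority, name = heapq.heappop(self._queue)
            print(f"running {name} (priority {priority})")

sched = PriorityScheduler()
sched.submit(5, "update display")
sched.submit(1, "compute landing trajectory")   # most important, runs first
sched.submit(3, "log telemetry")
sched.run()
```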

Next: Larry Ellison builds a much, much bigger boat

Larry Ellison

Fifth richest person in the world, aviator, sailor, quiet philanthropist and founder of a small database company you may have heard of named Oracle, Larry Ellison is one of the IT industry's few players who seems to live up to the myth - for good or ill.

After dropping out of university twice, the New York-born Ellison moved to California at the age of 22. Legend has it that he lived on a small boat in the San Francisco Bay Area trying to get his relational database company off the ground with only $1,200 to his name.

Ellison definitely founded Software Development Laboratories (SDL) in 1977 with two partners, after jumping ship from Ampex, where he'd been working on a database project for the CIA and became enamoured with British computer scientist Edgar Frank Codd's research into relational databases.
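
Codd's relational model is why a developer today can describe data as tables of rows and columns and leave the storage details to the database. A minimal, hedged sketch of the idea using Python's built-in sqlite3 module (standing in for Oracle here purely for illustration):

```python
import sqlite3

# A minimal sketch of Codd's relational model in action (sqlite3, not Oracle):
# data lives in tables of rows and columns, and queries join them by value.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT)")
conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER, item TEXT)")
conn.execute("INSERT INTO customers VALUES (1, 'Ada'), (2, 'Grace')")
conn.execute("INSERT INTO orders VALUES (1, 1, 'Analytical Engine'), (2, 2, 'COBOL manual')")

rows = conn.execute("""
    SELECT customers.name, orders.item
    FROM customers JOIN orders ON orders.customer_id = customers.id
""").fetchall()
print(rows)   # [('Ada', 'Analytical Engine'), ('Grace', 'COBOL manual')]
conn.close()
```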

The rest is history: SDL became Oracle, and Ellison went on to earn sums such as 2013's reported income of $94.6m - despite famously taking a salary of just $1 from Oracle as its CEO (though he finally stepped down in 2014 to hand long-term henchman Mark Hurd the top spot).

Key points in Oracle's history include pipping IBM to the relational database market in the early 1990s when Windows and UNIX grew up, and purchasing Sun Microsystems (and with it, Java and MySQL) in 2010.

Ellison also, however, called the concept of cloud computing "gibberish" in 2008, before being forced, after increasing rivalry from SAP and others, to eat his words several years later, u-turning admirably and declaring that Oracle has always had the cloud anyway, and that it's the entire earth.

Married four times and looking oddly youthful for his 71 years, Ellison shows no sign of letting up anytime soon (despite recently stepping down as Oracle CEO), with industry whispers continually hinting that the IT legend is pouring a substantial amount of his fortune into scientific research to cheat death entirely via cryogenic freezing or even DNA remapping.

Ellison can still be seen popping on a pair of specs to carry out his now famous live tech demos at each year's Oracle OpenWorld conference - though count yourself lucky if he shows up at all, as the avid seafarer has been known to duck a keynote entirely if he's needed on board the Oracle vessel he enters in the America's Cup.

Next: Bill Gates gets inside every PC on the planet

Bill Gates

The bespectacled Microsoft founder somehow became the byword for 'nerd' in the 1990s, but Bill Gates' reality is rather different - he was actually one of the most ruthless and cunning businessmen of the 20th century. From drunkenly stealing an earthmover to being arrested for speeding in 1975, Gates never seemed to enjoy following the rules.

Taking a keen interest in computers from the age of 13, Gates was soon being excused from maths classes to pursue a growing interest in the BASIC language, meeting fellow Microsoft founder Paul Allen and picking up COBOL skills at his private prep school, which he began billing to rewrite its payroll system in his spare time. Reaching Harvard by 1973, Gates met Steve Ballmer, and by 1975 Gates and Allen were tinkering with an Altair 8800 and aiming to be first to market with a consumer-grade BASIC interpreter.

Hilariously (especially taking into account the company's gratis provision of Windows 10), a pre-release version of Microsoft BASIC slipped out and was freely copied among hobbyists, prompting Gates to write an open letter to computer hobbyists in early 1976 urging them not to pirate his software, or Microsoft wouldn't be able to keep making new versions of it.

This, of course, would be an issue that continued to dog Microsoft up to the present day.

It was the 1980 partnership with IBM, however, which catapulted Microsoft, and Gates, into the limelight. It was also the first time Gates could be accused of being 'inspired' by another company's product to build Microsoft's success, though in this case it was more that Gates saw potential in Seattle Computer Products' 86-DOS (or QDOS, standing for "Quick and Dirty Operating System") and exclusively licensed it - eventually using his company's leverage as the middleman to buy it outright for $75,000. Supplying it to IBM as PC DOS, Gates cleverly retained the rights to license it elsewhere and kept developing QDOS into MS-DOS. The operating system thus became a rival, forked product of the very thing Gates had provided to IBM in the first place.

IBM and Microsoft also banded together in 1985 to create a new operating system called OS/2 - a successor to PC DOS that would include protected-mode functionality, allowing the OS to use memory beyond the base 640k - something MS-DOS itself could only ever approximate with workarounds.

The partnership collapsed by 1990, once Windows 3.0 became a huge success by shipping with so many new computers - but more on that later.

After several successful years with MS-DOS, a set of events came to pass whose details change depending on who you choose to listen to. What seems clear is that, sometime around the early 1980s, both Apple and Microsoft got a look at the GUI (graphical user interface) that Xerox had been tinkering with since the late 1970s. Who acted first, who influenced what, when or how is the stuff of conjecture (and eventual lawsuits), but the Apple Lisa launched in 1983 with a GUI and failed miserably at retail.

By 1985, Microsoft Windows had launched, installable on top of MS-DOS and, as mentioned above, shipping with "IBM and compatible" PCs as the de facto way to interact with a computer. Apple sued Microsoft over Windows' look and feel in the late 1980s, but the case was finally decided in Microsoft's favour in the mid-1990s. Gates was also widely accused of leading a drive to build Microsoft's own Java runtime - the "MS JVM" - effectively making Java a part of Windows that could do 'Windows-only' things, thus completely undermining the point of Java.

This became a large part of the antitrust trials that followed Microsoft around like a bad penny until the next millennium, even ‘inspiring’ a dreadful 2001 film starring Tim Robbins as a dodgy Gates-alike who would literally have his employees or rivals murdered if they stood against his flagrantly monopolistic or criminal practices.

As Microsoft’s fortunes snowballed into the 2000s with subsequent Windows builds and a successful enterprise arm, Gates took a backseat and now, still as America’s richest man, spends his working week running a charitable organisation with his wife.

However he got there, you can’t argue with that kind of commitment to philanthropy.

Next: Sophie Wilson becomes the mother of the mobile

Sophie Wilson

Sophie Wilson could easily have had a limited place in history as one of the boffinish Cambridge set involved in the British computer wars between Clive Sinclair and Chris Curry, as both battled to fulfil the BBC's wish for an educational microcomputer.

Curry's Acorn company of course won out - largely due to Wilson's work on improving BASIC to work with the Acorn Proton, which became the BBC Micro.

But it was perhaps buoyed by this success that Sophie Wilson kept working on processors and, in 1983, came up with the instruction set for one of the first RISC (reduced instruction set computing) processors - familiar to anyone who used, say, an Acorn A3000 at school along with its clunky and obstinate RISC OS operating system.

But the real heritage of the Acorn RISC Machine lies in the resulting acronym - ARM.

Wilson's instruction set came into its own as 'computers' began to shrink into the mobile devices we now rely on every second of the day. ARM is now the most widely used instruction set architecture on the globe, found in an estimated 60 per cent of mobile devices; 6.1 billion ARM-based chips were produced in 2010 alone, according to ARM Holdings.
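
As a rough illustration of what 'reduced instruction set' means - the following is a toy register machine in Python, not ARM assembly or anything Wilson designed - the philosophy is a small set of simple, uniform instructions operating on registers:

```python
# A toy register machine illustrating the RISC philosophy: a handful of
# simple instructions over registers (this is not ARM, just the flavour).
def run(program):
    regs = {f"r{i}": 0 for i in range(8)}   # eight general-purpose registers
    for op, dest, a, b in program:
        if op == "MOV":       # load an immediate value into dest
            regs[dest] = a
        elif op == "ADD":     # dest = a + b
            regs[dest] = regs[a] + regs[b]
        elif op == "SUB":     # dest = a - b
            regs[dest] = regs[a] - regs[b]
    return regs

program = [
    ("MOV", "r0", 6, None),
    ("MOV", "r1", 7, None),
    ("ADD", "r2", "r0", "r1"),
]
print(run(program)["r2"])   # 13
```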

Wilson didn't stick with ARM when it split off from Acorn in 1990, but now can be found at semiconductor firm Broadcom as director of IC design in its Cambridge office.

Next: Tim Berners-Lee fathers the cat photo, 4chan and the list feature

Tim Berners-Lee

Tim Berners-Lee, by his own admission, is a “web developer”.

This popular internet meme, which pitches the father of the World Wide Web (though not, as is often assumed, of the internet itself) against AOL's "digital prophet" David Shing in comparative screen captures from different videos, seems to sum the man up to a tee: Berners-Lee may well be one of the most humble characters in IT history.

An Oxford physics graduate, the young Berners-Lee began his career as an engineer at a Dorset-based telecommunications firm. But from June to December of 1980, he did some independent contracting for CERN. There, he took the concept of "hypertext" (a term coined in 1963 for a theoretical method of creating and linking content) and promoted its potential as a system for sharing information among CERN researchers, on account of the facility's already robust internet connection. He returned in 1984, by which time the adoption of TCP/IP in the early 1980s had allowed a worldwide proliferation of internet nodes. Berners-Lee was now ready to roll.

“I just had to take the hypertext idea and connect it to the Transmission Control Protocol and domain name system ideas and—ta-da!—the World Wide Web,” he is quoted as saying.

By 1990, Berners-Lee’s proposal had gone through CERN and he used, of all things, Steve Jobs’ NeXT computer to design the first web browser. The first website went online on 6 August 1991, offering visitors a breakdown of how the World Wide Web worked, and how to access it using a browser and web server (though one presumes they’d have worked that much out if they could see it?).
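
The plumbing Berners-Lee bolted together is still recognisable today: a client resolves a server's name via DNS, opens a TCP connection and asks for a hypertext document over HTTP. A minimal sketch in Python - the host used here is just a placeholder for illustration, not the original CERN machine:

```python
# A minimal sketch of the web's plumbing: resolve a name via DNS, open a TCP
# connection and send an HTTP GET for a hypertext document.
# "example.com" is only a placeholder host for illustration.
import socket

host = "example.com"
request = f"GET / HTTP/1.1\r\nHost: {host}\r\nConnection: close\r\n\r\n"

with socket.create_connection((host, 80)) as sock:   # DNS lookup + TCP connect
    sock.sendall(request.encode("ascii"))
    response = b""
    while chunk := sock.recv(4096):
        response += chunk

print(response.decode("ascii", errors="replace")[:200])  # status line and headers
```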

Soon afterwards, in 1994, Berners-Lee founded the World Wide Web Consortium (W3C) at MIT, which continues to fight for truth, justice and freedom of information via technical standards and recommendations for the ongoing quality of his creation - despite how few governments or businesses tend to listen.

Next: Mark Zuckerberg destroys productivity and has lots of fun doing it

Mark Zuckerberg

Still only 31 years old, the Facebook founder is another CEO who takes a salary of $1.00 a year, but rather more in stock options and bonuses.

In this case, we're talking about a personal worth of $42.5bn.

Facebook itself was the next logical step in Harvard student Zuckerberg's sophomore experiments with social software: in 2002 he had already crafted a program called CourseMatch, which allowed students to look at the course choices of others to inform their own, as well as to form study groups. You can probably see the parallels already. According to Zuckerberg's Harvard roommate Arie Hasit, he "built the site for fun".

According to David Fincher's 2010 film The Social Network, Zuckerberg's real motivation for building social networks was to find ways to connect with women. It's certainly true that his next project - Facemash - was a "Hot or Not"-style site which pitted faces against each other and had users pick the better looking.

Whatever drove him, by February 2004 Zuckerberg had built and launched "Thefacebook", batting away accusations from other students that he had taken their time and ideas to build it instead of their own project, HarvardConnection.com (which later launched as ConnectU).

Starting initially at Harvard, the growing popularity of the site saw it spread to Columbia, Stanford, Yale and several other well-heeled US educational institutions, then to other university networks globally, before opening up to anyone who wanted it in 2006.

Zuckerberg constantly insisted that he had built Facebook to "make the world open" and not for the money, but that didn't stop Facebook filling up with display advertising and spinning off downloadable apps for both the main Facebook environment and Facebook Messenger - all new ways to collect and monetise user data - as the company headed towards an unsteady 2012 IPO.

In 2013, Zuckerberg launched Internet.org, a project to connect the five billion people across the globe who have no internet connection. It's as yet unclear how successful Internet.org has been, but, 11 years on, Facebook remains the de facto social media choice, no matter how much of a person's data it records in perpetuity and how many funny cat videos and photos of babies fill its feeds on a daily basis.

Mark Zuckerberg has just broken into the top 10 richest Americans list.

Did we get them all? Do you see anybody on this list who doesn't belong? Let us know in the comments section below!