How the 1980s changed IT management

A period of disruption like none before

Workers and managers faced great changes in the 1980s, as personal computers finally reached office desks

In the 1980s, 30 years after the first commercial use of programmable digital computers, many businesses were grappling with the strategic application of IT for competitive advantage.

1982 was officially Information Technology Year (IT82), with Kenneth (now Lord) Baker as Minister for Information Technology. The decade saw the arrival of the IBM PC, mobile phones, Microsoft Windows and the World Wide Web. I recall a discussion with my MBA professor about whether email would ever offer business a return on investment. Here, we take a look back: nostalgia for those of a certain age, and perhaps lessons for the younger generation.

In my rear-view mirror is the Butler Cox Foundation, which sought to guide IT directors through the advancing world of technology and help them straddle the corporate and IT systems divide. Butler Cox was an IT advisory group that operated from 1977 to 1991, and Archives of IT was fortunate to be given the complete back catalogue of its Foundation reports by David Butler and Sir George Cox.

We asked Stephen Robertson, a student of the History of Science, Technology and Medicine at the University of Manchester, to comb the documents and oral history from our archive, and were also fortunate to get the recollections of Butler and Cox. They reminded us that as well as the question of "how" to harness technology for strategic advantage, a big question was "who" should be responsible for this new weapon in the business armoury and, very importantly, for the money that was spent on it.

Sadly, David Butler died in July 2022 and the industry lost a key influencer, especially in helping business "weaponise" IT as a strategic tool.

You only have to look at the titles of the reports used in the study to get the gist of the story.

You can find Stephen's full report here, but I offer a summary, possibly coloured by my own recollections of the time:

There was a great divide in approach to, and understanding of, these technological developments, and a distinctly human story: two camps with two distinct approaches and goals. On one hand, 'the business': focused corporate managers who looked for practicality of implementation and commercial incentives before adopting any new system or method.

On the other hand, there were those who worked in the systems departments, keeping up keenly with all the latest developments and popular technology trends. However, their desire to see these introduced did not always seem, to 'the business', to be justified in financial terms. The under-delivery and overrunning of new IT projects added further strain to an already uneasy working relationship.

Others recognised the fractious relationship between managers and IT departments. Andrew Herbert describes how, while working for APM Consultancy during the 1980s, he came across companies being encouraged to follow industry standards in a bid to remain competitive. Managers already concerned with the financial constraints of a struggling business were being asked to back the adoption of new technology and systems. As Herbert himself describes it, "Adopting new technology was sort of the last thing the management wanted to hear". While a new system may have benefited an organisation in the long term, the upfront cost was of considerable concern to managers, especially if the company was already struggling financially.

There was also a cultural and language barrier. When presenting to senior management, systems directors and those responsible for technology-related proposals found it difficult to convince others that the adoption of new systems or technology would be financially viable. With the strategic value of technology not fully understood in the business, the IT manager was not valued in the same way as, say, a sales manager, and so lacked the clout to convince corporate managers that they should be entrusted with financial decisions. Systems directors, who served as the intermediary to senior management, therefore faced a considerable challenge: perhaps that view of the role of the IT director tells a story in itself!

Sir George Cox and David Butler suggested there was also an element of distrust: advances in automation, data processing and communications technology had led to the reorganisation of corporate structures, and to many workers being made redundant.

Decentralisation causes distrust

Just as the IT department was struggling to overcome these issues and assert its position at the centre of business strategy, along came an added complication: decentralisation. As early as its April 1980 report, Butler Cox warned that widespread decentralisation 'may lead to chaos'. By the May 1984 report, 'Managing the Microcomputer in Business', it was predicting that microcomputing would account for 50% of an organisation's total data processing costs by 1991, with decentralised systems at the forefront of many organisations' workflow.

IT departments were no longer the sole repository of information

As control of IT shifted away from the IT department, there was a skills deficit: workers needed to learn to use their new machines and general managers had to understand how and why such machines could be used to improve organisational performance. IT departments were no longer the sole repository of technology and information, but were still seen as responsible for resources over which they had no control. Nightmare!

Whether it was an issue of trust towards employees, or apprehension about the transition process and its associated costs, decentralisation had to be carefully managed. John Handby noted the benefits of a decentralised communication system but equally reported that only through persistence - and careful selection of the employees tasked with implementation and with persuading managers - was he able to succeed in implementing this "huge cultural change". On the other hand, Dr Michael Taylor reported that the numerous different approaches and the variety of systems adopted by individuals working in different departments at the Metropolitan Police had, indeed, led to chaos.

It was a challenging and often unclear period of technological and social change, but what was emerging was the modern office experience. Certainly, it is now recognised that IT plays a pivotal role in almost every business and is a competitive differentiator in many. Personal computing and decentralised systems are de rigueur for almost all end-user computing, whether in office environments or mobile, and everyone relies on the communications technology that emerged in the 1980s.

So, is any of this worth revisiting? Well, it is a moral tale of people-related (is there any other kind?) business change and, as Churchill paraphrased in 1948, "Those who fail to learn from history are condemned to repeat it." But now that IT is ubiquitous and the CIO works seamlessly with the other CXOs, the work is done, is it not? We have definitely learned a lot, but you only need to Google 'CIO role' or 'CIO priorities' to see there is still debate about who should lead strategic digital transformation, or whether there is consensus in the boardroom on the cybersecurity budget.

So, for those experiencing this for the first time, the moral is: It was ever thus, and change is what the job is all about.

Archives of IT is a UK registered charity (Number 1164198) that captures and tells the story of the people of tech industries for insight, interest and education.