The top-ten tenets of software quality assurance, part one: the contract review

Each week, QA specialist Mark Wilson will spell out his top-ten tenets for Quality Assurance - the fundamental things you need to do to build better IT systems

This year, I will have been helping businesses build IT systems for 31 years, more than half of that as a contractor/consultant. I've taken a nice little break recently: I built my band's website, wrote a book about Icons (the holy, artistic kind) and put some serious hours into writing about my musical career.

The music charts were massive to us as kids in the 1970s. We'd listen avidly to someone's smuggled transistor radio in the playground to hear the chart run-down at 12.30pm on Thursday lunchtime. What would be number one? What about number two? It's in that spirit that I offer this run-down of my own. Over the years, I've realised that some things are essential to get right when you're embarking on an IT project, and my view of what they are is wholly and unashamedly biased.

I've worked exclusively in Quality since joining the IT industry in 1989, after a first career as a rock'n'roller. All of my clients and employers have been major PLCs, government organisations or institutions. My first degree was a BA in History, a subject I still love to this day. It gave me a skill for clear and concise argument, and a love of words for their own sake, too.

When I re-trained in computing, I quickly became aware that while the industry needed people who understood the complexities of programming, digital logic, data structures and even the way microchips and digital circuits work (and I did understand, back then), systems engineers would also need people who could write clear, concise English: the sort of people my tutor described to the class one day as "QA people. Yeah, they're the people who look after all the testing and documentation and stuff, and work on the project side of things. They're not proper developers."

It sounded like exactly what I wanted to do: combine software engineering practice with the ability to describe what should be done. I was extremely fortunate, straight after my computing course, to get a job as a QA Officer at the University of Leeds, which was launching commercial enterprises to exploit its wealth of engineering software expertise.

Over the following five years I worked with assorted doctors, professors and other gifted software engineers, developing Ada compilers and devising a conformance testing service for CAD-CAM exchange software. They allowed me to drink deep at the fountain of engineering principles. They were engineers first, and software developers second.

I therefore had the perfect grounding to launch my career as a consultant in ISO 9001, CMMI, ITIL, test management and configuration management. In fact, so strong was the grounding I'd received, and so confident did the principles of engineering make me, that I was able to move from quality management systems (QMS) expertise into live service operation, software configuration management, software testing and, later, hardware asset testing, with little prior experience. This isn't intended as a boast; it's a genuine testament to the power of these rules, combined with the confidence that they work (plus, I have to say, a very buoyant contract market and not a lot of IR35 worries).

As I moved from role to role, I learned a great deal about why so many IT projects fail and why so few succeed. The quality standard BS 5750 (which was later adopted as ISO 9001) has had a lot of bad press over the years, but like engineering, it's based on a set of principles that can be applied anywhere, again and again. Over the following 25 years I found out, the hard way, that both the engineering principles and those behind BS 5750 were true. I worked on loads of expensive, sometimes vast, projects and programmes. Most of them failed, but some succeeded. A lot of them went over the line on time (the only thing most senior managers are really bothered about), but often with shoddy or appalling quality. Some lost a lot of money - in some cases, vast amounts of taxpayers' money.

Using the fundamental principles I'd learned, I began to be able to predict success and failure, simply by seeing if the principles had been applied correctly (if at all). All this led me to realise that almost any project's quality and general chances of success could be determined simply by the application of these engineering and contract management golden rules (I think I'm going to call them "Tenets"). I also realised that taking - and acting on - simple metrics was the key to fixing broken projects and improving good ones.

My Top Ten Tenets are the result of this experience. They cover development as well as operation; software as well as hardware; agile as well as waterfall. They are valid no matter what flavour of fashionable jargon-heavy technique you are using. I hope they are useful.

Number One: The contract review

This is, as they say, a no-brainer.

It's probably also the one which I've found the hardest to control, yet its ability to determine success or disaster is second to none, since failure to write and review a good contract can not only kill a project, it can seriously damage the business itself.

Contract review was a 'super clause' in the original BS 5750 standard back in 1979 (the UK's BS 5750 was the first universal standard for quality management systems, and was eventually re-published as ISO 9001). The writers at the British Standards Institution knew it was probably wishful thinking to expect a lone quality manager to enforce it, so a concomitant clause in the standard, 'management responsibility', required quality management to have backing at board level.

Despite this backing being afforded, by proxy, to the quality manager, irresistible business momentum will always trump prudence, and it's easy for process types urging caution to be portrayed as moaning minnies (bosses don't like people who say no).

Yet getting this wrong is still something that will kill projects from the off. In the 2015 version of the standard, "contract review" is no longer an explicit clause, but the golden rule is still there: it can now be found under section 8.1, "Operational planning and control".

Clause 8.1 states:

"The planning should include methods for:

  1. Determining the customer requirements for the products and services;
  2. Establishing criteria for the processes and the acceptance of products and services;
  3. Determining the resources needed to achieve conformity to the product and service requirements;
  4. Implementing control of the processes in accordance with the criteria;
  5. Determining, maintaining and retaining documented information to the extent necessary to have confidence that the processes have been carried out as planned;
  6. Demonstrating the conformity of products and services to their requirements."

Subsection 8.2.2 also says "the organisation [must ensure it] can meet the claims for the products and services it offers".

So this covers the capture of the business requirements, how they'll be accepted, and how you'll demonstrate your continued ability to meet them. Let's ignore, for a minute, that it's also an outline of the entire quality management system. But at a contractual proposal level - and it's the proposal that's absolutely crucial to success or failure - ISO 9001 is saying: "Make sure all of this stuff is in your proposal."

Note that the proposal also states what resources the contract requires. A lot of people say, "Well, that is what a proposal or tender document does anyway". But read on…
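
To make the idea concrete, here is a minimal sketch, in Python, of what "all of this stuff" might look like as a proposal checklist. It is purely illustrative: the field names are my own shorthand for the clause 8.1 items quoted above, not wording taken from the standard.

  from dataclasses import dataclass, fields

  # Illustrative only: the field names are shorthand for the clause 8.1 planning
  # items quoted above, not wording from the standard.
  @dataclass
  class ProposalPlan:
      customer_requirements: str = ""  # what the customer actually needs
      acceptance_criteria: str = ""    # how the products and services will be accepted
      resources_needed: str = ""       # people, kit and budget needed to achieve conformity
      process_controls: str = ""       # how the processes will be controlled
      documented_evidence: str = ""    # records kept to show the work went to plan and conforms

  def missing_items(plan):
      # Return the planning items the proposal has left blank.
      return [f.name for f in fields(plan) if not getattr(plan, f.name).strip()]

  plan = ProposalPlan(customer_requirements="Replace the ageing admin system",
                      resources_needed="Security-cleared development team")
  print(missing_items(plan))  # ['acceptance_criteria', 'process_controls', 'documented_evidence']

If the list it prints isn't empty, the proposal isn't finished - however exciting the opportunity looks.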

The next thing ISO 9001 says about the proposed contract plan, in section 8.2.3, is:

"The organisation shall conduct a review before committing to supply products and services to a customer, to include:

  1. Requirements specified by the customer, including the requirements for delivery and post-delivery activities;
  2. Requirements not stated by the customer, but necessary for the specified or intended use, when known;
  3. Requirements specified by the organisation;
  4. Statutory and regulatory requirements applicable to the products and services;
  5. Contract or order requirements differing from those previously expressed."

This is a full review, to be done after the customer has agreed to sign up to the proposed work scope or purchase order, but BEFORE the supplier has also signed off and agreed the contract.

To me it's saying: "Take a deep breath and step back for a moment. Yes, it's a big contract, it's very exciting, and you might get a hefty commission, but will this job actually end up ruining us? Can we still deliver?" (It doesn't ask whether the job is still going to be profitable. It should do, though - see my paragraph below on PRINCE2.)

The contract review happens twice: once when the opportunity comes in following your bid and, again, right before you're ready to sign the contract.
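
As a sketch of what that double-check might look like in practice, here is an illustrative two-stage gate built on the five clause 8.2.3 review items above. The question wording is my paraphrase, and the gate itself is my own illustration, not part of the standard.

  # Illustrative only: the questions paraphrase the clause 8.2.3 items quoted
  # above; the two-stage gate is a sketch, not part of the standard.
  REVIEW_QUESTIONS = [
      "Customer requirements captured, including delivery and post-delivery?",
      "Unstated but necessary requirements identified?",
      "Our own (supplier) requirements recorded?",
      "Statutory and regulatory requirements covered?",
      "Differences from earlier contract or order terms resolved?",
  ]

  def contract_review(stage, answers):
      # Every question must be answered 'yes' before this stage can pass.
      unresolved = [q for q in REVIEW_QUESTIONS if not answers.get(q, False)]
      for q in unresolved:
          print("[" + stage + "] unresolved: " + q)
      return not unresolved

  # Run once at bid stage, and again before anyone signs.
  bid_ok = contract_review("bid", {q: True for q in REVIEW_QUESTIONS})
  sign_ok = contract_review("pre-signature", {REVIEW_QUESTIONS[0]: True})
  print(bid_ok, sign_ok)  # True False: don't sign yet

Nothing about the gate is clever; the point is simply that it runs twice, and that a "no" at either stage should stop the signature.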

Incidentally, I've never had problems getting management to implement test-readiness reviews (the "entry gate" to a formal test phase) as a sensible part of the Waterfall assurance process, yet the same senior managers seemingly lose the ability to ask those same hard questions about their own company's readiness to take on a new job: Are we ready? Do we have the resources and support we'll need throughout? Is there a stable baseline that we're working against? Who will resolve conflicts of opinion over issues?

So, in some ways, they're very similar. Both ask hard questions about the task we're about to undertake. Getting one of them wrong will only foul up your testing; getting the other wrong will foul up your bottom line.

The latest version of PRINCE2 sums this all up under its Number One Principle, continued business justification: is the project, and does it continue to be, desirable, viable and achievable?

Will users end up hating you?

Something that doesn't come leaping out of ISO 9001, PRINCE2 or any other means of not doing a terrible project and wasting loads of money is due consideration of, and buy-in from, end users: the poor folk who will end up having to use the system you're working on.

ISO 9001 hints at what I'm saying by referring to "criteria for acceptance". Agile makes the acceptance criteria explicit in the user story. But how does anyone know if the end users even want a fantastic new system? Shouldn't the acceptance criteria be theirs?

My view is that any supplier (whoever's developing the system) should ask the following questions:

If these questions were always asked, then maybe there wouldn't be cautionary tales like the ones below: examples of where the hard questions weren't asked or, if they were, the answers weren't welcome.

Cautionary tales

Doctors hate what I worked on for 13 years

I am still mortified when I visit my doctor or use the health service to see how much clinicians still hate a huge and complex product I was once involved in delivering, and which they say was foisted on them without their consent.

I began work on this major public service programme in 2003 and I was still involved in maintaining it in 2016. We all did what we could to deliver to the specifications the health service and the system integrator gave us. The company I worked for from 2008 on this system was beyond reproach; exemplary, in fact.

But according to my doctor (and I've heard exactly the same opinion from many other clinicians), we delivered something none of them wanted. They hated not being asked. They hated the GUI we had built; they hated the login system and they detested the performance. They then went away and bought off-the-shelf health products that actually did the job, thereby doubling the waste of taxpayers' money. The waste is probably in the low billions of pounds.

Lawyers hated what I worked on

Some years later I helped deliver a very expensive product that was mandated onto thousands of legal professionals in the UK by their professional body. It was never even used. The lawyers hated it; they'd never been asked what they wanted. The waste was probably £1 million.

If the health service and the legal professional body had each looked at what measurable benefit was going to be delivered to their users by having the decency to ask their opinion, then the waste could have been averted. It's crucial to get the consent of the end-user, or the project will fail.

Reputation before financial disaster

But even assuming a major customer has the users on its side, with clear requirements, what if the people hired to build the system don't review the contract? A major systems integrator I worked for failed to do a proper contract review (or follow PRINCE2 principles), thereby failing to establish that the multi-million pound product they were delivering was "desirable, viable and achievable" over the duration of the contract. The result: they spent the system's entire 10-year budget in the first five years. And still incurred penalties for under-delivery. And all because they felt that walking away would make them look stupid.

Hosing money out of the window

But the best example I have of how to hose money out of the window by failing to do a contract review concerns a UK government department, which requested tenders to replace its ageing administration system.

This system's basic architecture was childishly simple, being merely a list (a database) of accredited personnel, plus the means of deleting or adding new accreditations. It was a replacement, with virtually no major new functional requirements (the changes were largely to deliver better performance, scalability and security). The supplier that won the multi-million pound prize to develop this replacement product was based offshore, but operated in the UK.

To get under the competitors' price and achieve a fantastic gross margin, the supplier planned to write 90 per cent of the software offshore (as its business model usually demanded). However, the contract also contained a requirement for all of the developers to be UK government security-cleared, which meant that all of the software had to be written onshore, not offshore.

At the contract review, not one person in the supplier's senior management team noticed that, because UK security clearance could only be given to individuals resident in the UK, all of the development would have to be done in the UK, thereby blowing the offshore business model out of the water.

No-one questioned why the proposal recommended using Agile. For no reason (other than to appear on-message), the proposal writer had volunteered Agile as the development method, despite it being inappropriate (there were fixed, documented requirements), unwanted (it was not asked for in the invitation to tender) and dependent on co-located product owners and development teams.

By the time anyone not rubbing their hands in glee at the prospect of a massive contract had a look, it was too late. The contract was signed, and the offshore company was on its way to chalking up a humungous loss and learning a terrible lesson in when not to do Agile. They did, in the end, manage to leverage the offshore development teams, but these were obliged to work without any contact with the customer's product owners.

And to crown it all, most of what they produced was scrapped.

More QA larks (and some advice) next week.