The Road to Code
For the Sake of Simplicity
by Kevlin Henney
July 18, 2014
Summary
Fifteen years ago this month a piece of mine appeared in EXE Magazine (RIP) lamenting unnecessary complexity in software. A significant update for the twenty-first century proves to be unnecessary. The message and the examples are the same, so I'm posting it again for posterity, as a caution for the next fifteen years, and for the sake of simplicity.

Software development is about many things, one of which is the management of complexity. It is a lesson that has been oft repeated, but apparently not as often heeded. The inherent complexity of a software system is something that we can do little about. We cannot eliminate it, but we can hide and abstract it. We can also create complexity.

That software systems are getting bigger and more pervasive is testament that we can at times muster the wherewithal to manage the inherent complexity of constructing large systems. At other times, however, it simply supports the view that creating complexity is far easier than hiding it — the creeping featurism, physical size, and bugginess of certain operating systems and word-processing packages are tantamount to a public admission of software engineering failure. To paraphrase Blaise Pascal, "The software to solve your problem is larger and more complex than necessary, because we did not have the ability or resources to make it smaller and simpler."

The inherent complexity of a software system is related to the problem it is trying to solve; the actual complexity of a software system is related to the size and structure of the software system as built. The difference between the two is a measure of our inability to match the solution to the problem.

This deficit (or surplus, depending on your perspective) is caused by — and manifested in — many aspects of the software and its development process. It reflects the structure and culture of the organisation and people that created it. It is reflected in its interface, by which I mean GUI, configuration, public API, and so on. And it is reflected in its reliability and adaptability. A more complex system will inevitably have more bugs and be harder to change — all other things being equal — than an equivalent simple system.

Note that there is a major, but nonetheless subtle, distinction between simple and simplistic. Dumbing things down, sweeping complexity under the carpet, is an example of simplistic — the complexity still leaks out. Far from eliminating the need for skills, when simplistic tools (and mindsets) are applied to non-simple problems they highlight that technology alone cannot solve the problem, and that skills for managing software complexity — skills better known as programming — are needed as much as ever. Zero- and minimal-programming models solve particular categories of problem for particular categories of people, but beyond that scope they are simplistic rather than simple, creating rather than solving problems — keep in mind, for example, that the majority of spreadsheets have significant bugs.

In this context, what is simplicity? Simplicity is related to minimalism, as characterised by John Pawson: "The minimum could be defined as the perfection that an artefact achieves when it is no longer possible to improve it by subtraction. This is the quality that an object has when every component, every detail, and every junction has been reduced or condensed to the essentials. It is the result of the omission of the inessentials."

In other words, simplicity comes from leaving things out, or taking them out: constantly examining the difference between inherent and actual complexity, questioning and reducing the number of features, and questioning and reducing the amount of code. For benighted management that still measures productivity in terms of KLOCs, this is scary: it appears to represent negative productivity. But... less code, fewer bugs... fewer features, greater ease of use... and so on.
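
To make improvement by subtraction concrete, here is a minimal, hypothetical C++ sketch (the function and its options are invented for illustration, not taken from the original article). The before version is weighed down with speculative options; the after version keeps only what its callers demonstrably need:

    #include <string>
    #include <vector>

    namespace before
    {
        // Speculative generality: options added just in case, each one
        // a decision that every caller is now forced to make.
        std::vector<std::string> split(const std::string & text,
                                       char separator = ',',
                                       bool trim_fields = false,
                                       bool keep_empty_fields = true,
                                       int max_fields = -1);
    }

    namespace after
    {
        // The essentials: when there is nothing left to take away.
        std::vector<std::string> split(const std::string & text, char separator)
        {
            std::vector<std::string> fields;
            std::string::size_type from = 0, to;
            while ((to = text.find(separator, from)) != std::string::npos)
            {
                fields.push_back(text.substr(from, to - from));
                from = to + 1;
            }
            fields.push_back(text.substr(from));
            return fields;
        }
    }

Every option removed is a path that no longer needs coding, testing, documenting, or explaining.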

This leads to an interesting marketing possibility, and one that I have seen in action only a few times: the idea that a subsequent release of a product might be smaller or have fewer features than the previous version, and that this property should be considered a selling point. Perhaps the market is not mature enough to accept it yet, but it remains a promising and classic ideal — less is more.

In software development we have already had these principles enshrined in our terminology for many decades: low coupling and high cohesion. As David Parnas noted, "Partition to minimize interfaces." The wisdom has been captured. Most developers today, however, either do not know what these terms mean or do not understand their significance. The practice of aggressively managing dependencies between parts of a software system applies at every level: function to function; class to class; component to component. A separation of concerns leads to separation in software architecture — space to breathe, space to think, space to work.
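
As a minimal sketch of partitioning to minimise interfaces (the classes and names here are invented for illustration), consider a report generator that could depend on an entire concrete database class, but instead depends only on the one capability it needs:

    #include <string>

    // A narrow interface: the only thing a report needs from the
    // outside world is a way to turn a customer id into a name.
    class customer_source
    {
    public:
        virtual ~customer_source() = default;
        virtual std::string name_of(int customer_id) const = 0;
    };

    // Low coupling: report knows nothing of connections, transactions,
    // or schemas, so any source of names (a database, a file, a test
    // stub) will do.
    class report
    {
    public:
        explicit report(const customer_source & customers)
          : customers(customers)
        {
        }
        std::string title_for(int customer_id) const
        {
            return "Statement for " + customers.name_of(customer_id);
        }
    private:
        const customer_source & customers;
    };

The same reduction applies between components and between systems: the narrower the interface, the less there is to couple to, and the less there is to break.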

A software system reflects the principles used to build it. Personally, I find it quite disturbing to see some of the files being copied or replaced when an application is installed on Windows — "Hey, isn't that file part of the operating system? Why is the application layer messing around with the operating system layer? And why do I need to install this completely unrelated piece of software to make this other one work? ... Uh, why don't some of my other applications work any more?"

It is not technology that solves problems, it is understanding: simplicity and minimalism are not criteria to apply to our understanding, but criteria that should be applied to the products of our understanding. We are often in the situation of having technology before understanding, and without principles to guide us we can end up with logical absurdities. To illustrate this, and reinforce the message of inherent versus actual complexity, I will leave you with the following thought: why is it that when I install new software on a Windows machine it typically requires me to reboot (or, as is often the case, rerereboot) the operating system, and yet plugging in a new piece of hardware does not? The hardware often seems softer than the software.



This article was first published in the July 1999 issue of EXE Magazine.


About the Blogger

Kevlin is an independent consultant and trainer based in the UK. His development interests are in patterns, programming, practice and process. He has been a columnist for various magazines and web sites, including Better Software, The Register, Application Development Advisor, Java Report and the C/C++ Users Journal. Kevlin is co-author of A Pattern Language for Distributed Computing and On Patterns and Pattern Languages, two volumes in the Pattern-Oriented Software Architecture series. He is also editor of the 97 Things Every Programmer Should Know site and book.

This weblog entry is Copyright © 2014 Kevlin Henney. All rights reserved.
