Notes from Underfoot
Are Programmers People? And If So, What to Do About It?
by Ken Arnold
April 9, 2003
Summary
...Wherein we explore the radical notion that programmers are people, and then look at what this might mean for deep tools, such as APIs, protocols, and programming languages.

My experience is that significant advances are triggered by radical notions. My current radical notion is this: Programmers are people.

Some will be skeptical, but I think there is sufficient evidence to accept this, at least for the sake of argument. Briefly, though, I will point out that programmers are well known to sleep on occasion, and to smell bad if they do not bathe. Beyond these basic animal traits, we can also note that they use language (although often cryptically), a trait commonly accepted to be a hallmark of humans.

If we accept that programmers are humans, one primary and interesting consequence is that human factors issues can be properly applied towards the tools they use. I'm not talking here about IDEs, which have GUIs that are clearly subject to human factors analysis. (Answer: They mostly suck.) I am speaking about the more basic tools programmers use every minute they do their work: programming languages and APIs.

I first started to think about this when writing a paper on C++. In the first (and only) draft, I declared that a programming language is a user interface to the language's abstract programming model. I was surprised to discover that this view was not common, although it was easy to sell. It was easy to sell because most people quickly see the point: Users (in this case programmers) are interfacing with the programming model through the medium of the code text.

The striking thing was that it wasn't already widely accepted. I thought I was stating an obvious truism. But for many, I was stating a wholly new idea. Clearly the practice of programming language design does not look to human factors.

C++ is a grand example of this. As a small example, consider the following:

There are two ways to declare variables: the way used for function parameters and the way used everywhere else: foo(int a, int b) vs. int a, b;. One is comma-separated, the other semicolon-terminated. And only in a parameter list is there no way to declare two variables of the same type without repeating the type. Basic human factors says that an interface should have one way to do a single thing, and that it should always look the same.
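
A minimal illustration (the function names are invented for the example):

    // Parameter list: comma-separated, and the type must be repeated for each name.
    int add(int a, int b) {
        return a + b;
    }

    void demo() {
        // Ordinary declaration: one type, comma-separated names, semicolon-terminated.
        int a, b;
        a = add(1, 2);
        b = add(a, a);
        // There is no legal way to write add's parameter list as "int a, b".
    }
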
Is this earth-shattering? Hardly. It's just easy to describe in a paragraph. And it shows the basic premise at work.

More profoundly, consider garbage collection. One can argue up and down about the overhead, etc., but consider this: Two of the best-selling C++ books are Scott Meyers's Effective C++ and More Effective C++. The single largest category of the 85 tips -- about one quarter of them -- deals with potential memory leaks. In a garbage-collected environment, none of those tips need to be considered. Whatever other costs you weigh, you must also weigh the bug rate when those hints are not used, or are used incorrectly. This is fundamentally a human factors problem: You can tell people how to avoid the whirling knives of the abattoir, or you can close the abattoir door.
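
To make the flavor of those tips concrete, here is a sketch (the functions are invented for illustration) of the kind of leak they guard against:

    #include <string>

    // Caller takes ownership of the returned object and must delete it.
    std::string* build_report() {
        return new std::string("report text");
    }

    void log_report(bool verbose) {
        std::string* report = build_report();
        if (!verbose) {
            return;            // early return: the report is never freed -- a leak
        }
        // ... use *report ...
        delete report;         // only this path cleans up
    }

Under garbage collection the early return simply is not a bug; there is no delete to forget.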

I do not claim to be original in worrying about such issues: The classic Brian Kernighan [1] paper "Why Pascal Is Not My Favorite Programming Language" does not say so, but many of its critiques are human factors based. But I think our lives as programmers could be a lot better if we took this seriously, actually understanding human factors and applying them, instead of just doing our best to exercise personal good taste.

Consider the following human factors principles (the numbers are only so I can refer back to them; these are in no particular order):

  1. There should be one way things are done (whether or not it appears in multiple places). For example, the login dialog box should look the same, no matter where and why it pops up.
  2. Similar things should be done similarly. This helps people use their learning from one place to work in another.
  3. Different things should be clearly different.
  4. Dangerous things should be impossible, or where necessary, clearly labeled as dangerous.
  5. Default behavior should be harmless.
  6. Users should be able to predict what the results of actions will be.

My first example fails (1) and (2). A C++ example of (3) (and (6)) might be implicit conversion. Without extensive knowledge of the local environment (or a very good IDE) nobody can tell if return x + y (where x and y are objects) creates one new object, or three, or twenty. It all depends on how many conversions are necessary on y to make it usable in some operator+ method on x, and then how many more are required to turn the result into an object compatible with the return type. In one example I uncovered, the total number of objects created and discarded was 12. The difference between x + y and x + Foo(Bar(Gogin(y))) is pretty important, and completely invisible.
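
A sketch of how the temporaries pile up; these classes only echo the names above and are not from any real code base:

    struct Bar { };

    struct Foo {
        Foo(const Bar&) { }          // implicit conversion: Bar -> Foo
    };

    struct Result {
        Result(const Foo&) { }       // implicit conversion: Foo -> Result
    };

    // operator+ accepts Foo, so a Bar argument is silently wrapped in a temporary Foo.
    Foo operator+(const Foo& x, const Foo& /*y*/) {
        return x;                    // returning by value copies yet another Foo
    }

    Result combine(const Foo& x, const Bar& y) {
        // Reads as a single addition, but the compiler quietly builds a temporary
        // Foo from y, copies the Foo that operator+ returns, and then converts
        // that into the Result being returned -- none of it visible in the source.
        return x + y;
    }
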

Many of C++'s most egregious violations of human factors violate (5). Consider that a default copy constructor will be created for you if you do not provide one. This constructor is simple (a field-by-field copy), but often enough it is wrong. You can stop the compiler from giving you one by providing your own, but you must do this whether or not you want a copy constructor at all. If you don't want one, you declare one anyway and make it private. In other words, the default is dangerous (and the workaround backhanded).
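
A sketch of that backhanded workaround, on an invented class that holds a raw resource (in 2003-era C++; today one would write = delete instead):

    class Connection {
    public:
        Connection() : buffer(new char[1024]) { }
        ~Connection() { delete[] buffer; }

    private:
        // The compiler-supplied copy constructor would copy the pointer field by
        // field, and both objects would eventually delete the same buffer. To
        // forbid copying you must declare a copy constructor anyway -- private
        // and never defined -- and do the same dance for assignment.
        Connection(const Connection&);
        Connection& operator=(const Connection&);

        char* buffer;
    };
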

And compare that behavior to a violation of (6): One would expect that there would be a default overload for the == operator. Why? Because in almost every situation where the default copy constructor was valid, an analogous field-by-field comparison for equality would be valid as well. But you don't get a default operator==. And surely, if you do provide one, it is obvious that 99.99% of the time you can define x != y as !(x == y). This is far more likely to be correct than the default copy constructor, but in one place C++ is "helpful" and in the other place we must trust people to know Scott's Effective C++ tip that you should always overload == and != as a pair. This is a rich source of bugs.
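
The pairing that tip asks every programmer to remember, sketched on an invented type:

    struct Point {
        int x, y;
    };

    bool operator==(const Point& a, const Point& b) {
        return a.x == b.x && a.y == b.y;   // the field-by-field test the language never writes for you
    }

    bool operator!=(const Point& a, const Point& b) {
        return !(a == b);                  // almost always right, but only if you remember to write it
    }
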

Note in addition that these two language choices take fundamentally different views of the human programmer. Is C++ a "tries to be helpful" language for default method implementations or a "conservative choice" language? What would you guess: Does C++ provide an automatic = (assignment) operator? In this and many other ways it is schizophrenic, and hence unpredictable.

I don't want to pick on C++. At least not exclusively. Smalltalk's unusual syntax contributed greatly to its demise. Java chose to keep C's "fall through" default for switch statements, which is Just Plain Wrong™. Perl has so many ways to do the same thing that most Perl programmers commonly encounter chunks of Perl they cannot comprehend: each Perl programmer has a sub-dialect they understand, and much they've never seen. I could go on for a long time about languages alone, never mind APIs. And so could you.
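
The fall-through complaint, for instance, looks like this (shown here in C++, which shares the behavior Java inherited; the function is invented):

    #include <cstdio>

    void describe(int code) {
        switch (code) {
        case 1:
            std::printf("hangup\n");
            // no break: control silently falls into the next case
        case 2:
            std::printf("interrupt\n");
            break;
        default:
            std::printf("other\n");
            break;
        }
    }

    // describe(1) prints both "hangup" and "interrupt".
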

No matter what, every language will have human factors problems. Tradeoffs are inevitable even with the best design, so you can always find one principle violated to support another. But I want us to be trading them off against each other, not ignoring them. Let us think of each other as human (well, with a few exceptions) and then design like it.


[1] Originally my editor misattributed this to Dennis Ritchie. Unfortunately for me, I'm my own editor so it's still my bad.


About the Blogger

Ken Arnold is a recognized loose cannon in the software business, whose previous fusillades include being an inventor of Jini, designing JavaSpaces, writing books on Java and distributed systems, helping design CORBA 1.0, and (while at Berkeley on the BSD project) writing the curses library package, co-authoring rogue, and generally enjoying himself. His interests include designing APIs and programming languages using general principles of human factors design because of his radical hypothesis that programmers are human, and other applications of this same principle to software design, management, and production.

This weblog entry is Copyright © 2003 Ken Arnold. All rights reserved.
