The Artima Developer Community

Heron-Centric: Ruminations of a Language Designer
Myths of Memory Management
by Christopher Diggins
September 4, 2005
There is a widespread belief that garbage collection gives you automatic memory management, and that without a GC memory management is necessarily complex. Both beliefs are false.


In every piece of non-trivial software, regardless of the language, resource management is a concern. When a resource is no longer needed, you need to be assured that it is returned to the system. This is as true in GC languages as it is in non-GC languages. Some programmers seem to think that because the language has a GC they can simply turn a blind eye to resource management. I think this accounts for why so many Java programs I come across consume staggering amounts of resources and have miserable performance, despite all the research which says how efficient Java is compared to C++.

In a GC language, the garbage collector will only reclaim resources which are no longer live. Ensuring a resource is reclaimed is no easy task in a non-trivial piece of software. The best way to ensure a resource is released is for the programmer to set references to null when they no longer refer to a resource (or let them go out of scope), and to minimize the sharing of objects (which is just good practice).
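To make this concrete, here is a minimal Java sketch of the point above. `Buffer` is a hypothetical stand-in for any heavyweight resource; the names are illustrative, not from any real API.

```java
import java.util.ArrayList;
import java.util.List;

public class ReachabilitySketch {
    // Hypothetical stand-in for any heavyweight resource.
    static class Buffer {
        final byte[] data = new byte[1024 * 1024];
    }

    public static void main(String[] args) {
        List<Buffer> live = new ArrayList<>();
        for (int i = 0; i < 10; i++) {
            live.add(new Buffer()); // each Buffer stays reachable via the list
        }
        // The GC cannot reclaim any Buffer here: the list still refers
        // to all of them. Dropping the references is the programmer's
        // job, GC or no GC:
        live.clear(); // now all ten Buffers are unreachable and collectible
        System.out.println("buffers still referenced: " + live.size());
        // prints "buffers still referenced: 0"
    }
}
```

The GC only helps once the last reference is gone; deciding *when* to drop that reference is exactly the resource-management work the GC is often assumed to do for you.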

So in a GC language, you have to pay attention to resource management just as you would in a non-GC language, but one major difference exists. If you make a resource management mistake in your design using a GC language such as Java, everything still works; it is simply a bit less efficient than it should be. In C++ something magical happens: undefined behaviour. Of course undefined behaviour is the greatest of the programming evils, and it is primarily responsible for the mass exodus of programmers from C++.

Given that a programmer in Java or C# should be setting their references to null, what is the difference between that and explicitly calling delete? The answer is more or less nothing. The real question is what happens when the programmer makes the inevitable mistake. Undefined behaviour is unacceptable for most of us sane people, but silently keeping things alive is a very hard to detect resource leak. I think a programmer would have the best of both worlds if their language avoided undefined behaviour, threw errors when things aren't deleted when they should be, and prevented things from being deleted which shouldn't be.
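The "silently keeping things alive" failure mode is worth illustrating. Below is a hedged Java sketch: `EventBus` and `Listener` are made-up names standing in for any observer-style registry, which is a classic place where forgetting one call produces a leak that no compiler, runtime, or GC will ever flag.

```java
import java.util.ArrayList;
import java.util.List;

public class SilentLeakSketch {
    interface Listener { void onEvent(String msg); }

    // Hypothetical registry; any long-lived collection of callbacks
    // behaves the same way.
    static class EventBus {
        private final List<Listener> listeners = new ArrayList<>();
        void register(Listener l)   { listeners.add(l); }
        void unregister(Listener l) { listeners.remove(l); }
        int  count()                { return listeners.size(); }
    }

    public static void main(String[] args) {
        EventBus bus = new EventBus();
        Listener l = msg -> { /* react to msg */ };
        bus.register(l);
        // Forgetting the next line is the "silent leak": no error, no
        // crash, the listener (and everything it references) simply
        // stays reachable from the bus for the bus's entire lifetime.
        bus.unregister(l);
        System.out.println("registered listeners: " + bus.count());
        // prints "registered listeners: 0"
    }
}
```

In C++ the matching mistake (deleting the listener while the registry still points at it) is undefined behaviour; in Java the mistake runs cleanly and just quietly consumes memory, which is precisely the trade-off described above.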

As for the complexity of resource management, the design issue of who owns which resource is inescapable whether or not your language has a GC. You can always choose to ignore it, but it will have repercussions. The only difference for your language is whether this results in poorly performing software or undefined behaviour.

Postscript: Moore's Law

Something which drives me up the wall is programmers who quote Moore's Law and say resource management is unimportant. What they conveniently ignore is that the demands placed on computers, and the functionality expected of software, have followed the same curve. My point: you can't use Moore's Law as an excuse to write bad software.


About the Blogger

Christopher Diggins is a software developer and freelance writer. Christopher loves programming, but is eternally frustrated by the shortcomings of modern programming languages. As would any reasonable person in his shoes, he decided to quit his day job to write his own ( ). Christopher is the co-author of the C++ Cookbook from O'Reilly. Christopher can be reached through his home page at

This weblog entry is Copyright © 2005 Christopher Diggins. All rights reserved.

Copyright © 1996-2018 Artima, Inc. All Rights Reserved.