The Artima Developer Community

Angle Brackets and Curly Braces
Architecture the Accelerator
by Bill Venners
February 23, 2005
Writing mediocre code may help you move fast in the short term, but it can slow you down in the long term. A good architecture can help you move fast in the long term, but slows you down in the short term. How is a developer to decide how good is good enough?


In the discussion of my recent weblog post, Can You Write Quality Software Quickly, Bob Martin wrote:

The only way to go really fast is to write the best code you can possibly write, with the best design you can possibly give it. Any other technique will slow you down. ... In my estimation the belief that quality is an expediter rather than an impediment is what separates professionals from amateurs.

My experience has been that writing mediocre code helps me move quickly in the short term, but often slows me down in the long term. In addition, I have observed that it is usually quality architecture, not quality code in general, that helps me move quickly in the long term. And herein lies the tradeoff: although a quality architecture can help me move quickly in the long term, building a good architecture slows me down in the short term.

I agree with Bob Martin that quality code is an expediter, but I think it is important to appreciate that quality has a cost, and development teams need to make cost/benefit analyses. In some situations, the "best code you can possibly write" will be the right level of quality to aim for; in others it will not. You have to judge the return on investment of your development efforts. The higher the quality, the higher the cost. As you invest more in quality, at some point you will start getting diminishing returns.

Unfortunately, you never have complete information about exactly where that point of diminishing returns is, because to know where that point is you have to predict the future. So you need to make a judgment, given each situation, as to the level of quality that gives the optimum return, and aim for that. Moreover, you have to make a judgment as to which parts of your system deserve more attention, because the return on investment from quality is not spread homogeneously throughout your system. Lately, my intuition has been telling me that the place I should really be investing in quality is in the architecture, because that will help me move quickly in the long term.

The Meaning of Architecture

By "architecture," I mean the basic technologies, structure, and programming conventions of a software system. In his interview, Growing, Pruning, and Spiking your Architecture, Luke Hohmann defined architecture like this:

...Architecture is some reasonable regularity about the structure of the system, the way the team goes about building its software, and how the software responds and adapts to its own environment. How well the architecture responds and adapts, and how well it goes through that construction process, is a measure of whether that architecture is any good.

Architecture includes technology choices that you use intact. For example, at Artima we're in the process of defining a new architecture, and we have already made some technology choices: Linux on commodity PCs, Java, Tomcat, Hibernate, Velocity, PostgreSQL, Lucene, Jini, JavaSpaces. These choices are part of our architecture.

Architecture also encompasses the basic organization of the system, the kind of thing you'd see represented in a block diagram. A crucial aspect of our architecture, for example, is how we will be distributing functionality across clusters of servers so that changes are as easy as possible to make and scalability is easy to attain. Although we have done a bit of investigation and planning, we haven't made many decisions in that area yet.

Architecture also involves the infrastructure you build to support your development processes. You'll likely invest a significant amount of time creating test harnesses and writing build scripts. You may create domain-specific languages that generate code. Such tools represent a structure in which you work together with others to create the software.

The rest of an architecture is embodied in conventions. In our new architecture, for example, we had many questions to answer. How will we render web pages? How will we version entities in the database? How will we name variables? I believe it is worthwhile to come up with one answer to each of these questions that we then use throughout the system. I foresee deciding upon and enforcing many, many such conventions.

Moving Really Quickly

An architecture can make you fast to the extent that it supports the kind of changes and enhancements you need to make in the future. What are the key ingredients that enable you to build such an architecture? In my experience, you need a combination of domain knowledge and luck.

For example, Artima started out in 1996 as a single HTML file that I edited by hand. Over time I added more pages, also edited by hand. Before long I found I was quite regularly needing to make global changes, such as altering the headers and footers on all pages. I realized I needed a way to quickly and easily make changes to every page. So I added HTML comments in each page that essentially marked up the header, footer, and other logical areas of the page. Whenever I needed to make a global change, I wrote a Perl script that iterated through all the HTML files and made changes based on the markup comments. If I wanted to update the header of all pages, for example, I'd write a Perl script to look for the HTML comments that marked the beginning and end of the header on each page, and replace what was between them.
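To make the marker-comment technique concrete, here is a minimal sketch of the core replace step. The original scripts were Perl and are not shown in the post, so this Java version, its marker format, and the method name are all illustrative assumptions, not Bill's actual code.

```java
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class HeaderReplacer {

    // Replaces everything between <!-- BEGIN section --> and
    // <!-- END section --> with new content, keeping the marker
    // comments in place so the page can be updated again later.
    static String replaceSection(String page, String section, String newContent) {
        Pattern p = Pattern.compile(
            "(<!-- BEGIN " + section + " -->).*?(<!-- END " + section + " -->)",
            Pattern.DOTALL);
        Matcher m = p.matcher(page);
        if (!m.find()) {
            // A page missing its markers is silently left unchanged --
            // exactly the failure mode that made this approach unwieldy.
            return page;
        }
        return m.replaceFirst("$1" + Matcher.quoteReplacement(newContent) + "$2");
    }

    public static void main(String[] args) {
        String page = "<html><!-- BEGIN HEADER -->old header"
                    + "<!-- END HEADER --><body></body></html>";
        System.out.println(replaceSection(page, "HEADER", "new header"));
    }
}
```

A real script would loop this over every HTML file in the site's directory tree; the silent no-op on missing markers is what motivates the Page Pumper described next.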

The Perl approach worked for quite a while, but as the growth in the number of pages accelerated, it became unwieldy. Some pages ended up without the comments in the right places, so they wouldn't get updated, and these failures wouldn't be obvious to me when I ran the Perl script. As a result, I eventually replaced the Perl technique with a small Java application I called "Page Pumper." Page Pumper read an XML descriptor and a raw HTML file for each page, and generated HTML files that were served up by Apache on the live web site. I still use a descendant of that original Page Pumper today, and it still helps me move quickly when I need to make global changes to the Artima Developer website. When I decided to put a search box at the bottom of every page on the site, for example, the entire process of updating the code of Page Pumper, testing the change, and regenerating and deploying all pages on the live site took 15 minutes.
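The Page Pumper idea can be sketched in a few lines. The post doesn't show the real tool's descriptor format or templates, so the fields and method below are assumptions chosen only to illustrate the shape of the approach: per-page content goes in, site-wide chrome is applied in one place, and a complete static page comes out.

```java
public class PagePumper {

    // Combines a per-page descriptor (here, just a title) and the page's
    // raw body HTML with the site-wide header and footer. Changing the
    // header or footer once and regenerating every page gives the global
    // update that the marker-comment scripts did unreliably.
    static String pumpPage(String title, String rawBody,
                           String siteHeader, String siteFooter) {
        return "<html>\n<head><title>" + title + "</title></head>\n<body>\n"
             + siteHeader + "\n" + rawBody + "\n" + siteFooter
             + "\n</body>\n</html>";
    }

    public static void main(String[] args) {
        System.out.println(pumpPage("Home", "<p>Welcome</p>",
                                    "<div>header</div>", "<div>footer</div>"));
    }
}
```

Because generation happens in one program rather than in per-page markup, a page can no longer be silently skipped: every page is rebuilt from its descriptor on every run.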

I knew how to write Page Pumper because I had domain knowledge. I had experienced that in the domain of publishing a website, the site-wide look and feel of pages tends to be constantly tweaked and massaged. The trouble is that I didn't have complete knowledge of the future. For example, when I wrote Page Pumper, Artima was composed primarily of static web pages. Page Pumper worked very well for generating static pages. I didn't foresee that I would eventually be generating pages dynamically with Java code in a servlet container. When that time came, I did manage to hack Page Pumper so that it worked with JSPs too, but it wasn't a very good fit. In the new architecture, the plan is to have all pages generated dynamically, and I don't expect we'll need a Page Pumper anymore.

The reason I said you need luck as well as domain knowledge to create a good architecture is that you can't see everything that's coming in the future. Domain knowledge helps you predict the future only to the extent that the future resembles the past. Today Artima is a web site, but tomorrow it may be running in car dashboards and cell phones, or something else I have never imagined. When that time comes, the architecture I'm building today will likely not, unless I am lucky, be ready to support those new requirements gracefully. When those new requirements come, I will likely need to invest more time in architecture building.

I'm curious to what extent your experience matches my own. Have you found that a quality architecture is the primary accelerator? To what extent have you found that general code quality enables you to move quickly? Have you found that the quality of one part of your system is more important than the quality of other parts? If so, which parts and why? Has code quality mattered more on some teams you've worked on than on others? Does the return on code quality increase as the size of the team increases? Please add to the discussion by posting on the forum for this post.


About the Blogger

Bill Venners is president of Artima, Inc., publisher of Artima Developer. He is author of the book Inside the Java Virtual Machine, a programmer-oriented survey of the Java platform's architecture and internals. His popular columns in JavaWorld magazine covered Java internals, object-oriented design, and Jini. Active in the Jini Community since its inception, Bill led the Jini Community's ServiceUI project, whose ServiceUI API became the de facto standard way to associate user interfaces with Jini services. Bill is also the lead developer and designer of ScalaTest, an open source testing tool for Scala and Java developers, and coauthor, with Martin Odersky and Lex Spoon, of the book Programming in Scala.

This weblog entry is Copyright © 2005 Bill Venners. All rights reserved.
