Weblogs Forum
Architecture the Accelerator

59 replies on 4 pages. Most recent reply: Mar 5, 2005 8:49 AM by Isaac Gouy

Bill Venners

Architecture the Accelerator
Posted: Feb 22, 2005 5:18 PM
Summary
Writing mediocre code may help you move fast in the short term, but it can slow you down in the long term. A good architecture can help you move fast in the long term, but slows you down in the short term. How is a developer to decide how good is good enough?

In the discussion of my recent weblog post, Can You Write Quality Software Quickly, Bob Martin wrote:

The only way to go really fast is to write the best code you can possibly write, with the best design you can possibly give it. Any other technique will slow you down. ... In my estimation the belief that quality is an expediter rather than an impediment is what separates professionals from amateurs.

My experience has been that writing mediocre code helps me move quickly in the short term, but often slows me down in the long term. In addition, I have observed that it is usually quality architecture, not quality code in general, that helps me move quickly in the long term. And herein lies the tradeoff: although a quality architecture can help me move quickly in the long term, building a good architecture slows me down in the short term.

I agree with Bob Martin that quality code is an expediter, but I think it is important to appreciate that quality has a cost, and development teams need to make cost/benefit analyses. In some situations, the "best code you can possibly write" will be the right level of quality to aim for; in others it will not. You have to judge the return on investment of your development efforts. The higher the quality, the higher the cost. As you invest more in quality, at some point you will start getting diminishing returns.

Unfortunately you never have complete information about exactly where that point of diminishing returns is, because to know where that point is you have to predict the future. So you need to make a judgment, given each situation, as to the level of quality that gives the optimum return and aim for that. Moreover, you have to make a judgment as to which parts of your system deserve more attention, because the return on investment from quality is not spread homogeneously throughout your system. Lately, my intuition has been telling me that the place I should really be investing in quality is in the architecture, because that will help me move quickly in the long term.

The Meaning of Architecture

By "architecture," I mean the basic technologies, structure, and programming conventions of a software system. In his interview, Growing, Pruning, and Spiking your Architecture, Luke Hohmann defined architecture like this:

...Architecture is some reasonable regularity about the structure of the system, the way the team goes about building its software, and how the software responds and adapts to its own environment. How well the architecture responds and adapts, and how well it goes through that construction process, is a measure of whether that architecture is any good.

Architecture includes technology choices that you use intact. For example, at Artima we're in the process of defining a new architecture, and we have already made some technology choices: Linux on commodity PCs, Java, Tomcat, Hibernate, Velocity, PostgreSQL, Lucene, Jini, JavaSpaces. These choices are part of our architecture.

Architecture also encompasses the basic organization of the system, the kind of thing you'd see represented in a block diagram. A crucial aspect of our architecture, for example, is how we will be distributing functionality across clusters of servers so that changes are as easy as possible to make and scalability is easy to attain. Although we have done a bit of investigation and planning, we haven't made many decisions in that area yet.

Architecture also involves the infrastructure you build to support your development processes. You'll likely invest a significant amount of time creating test harnesses and writing build scripts. You may create domain-specific languages that generate code. Such tools represent a structure in which you work together with others to create the software.

The rest of an architecture is embodied in conventions. In our new architecture, for example, we had many questions to answer. How will we render web pages? How will we version entities in the database? How will we name variables? I believe it is worthwhile to come up with one answer to each of these questions that we then use throughout the system. I foresee deciding upon and enforcing many, many such conventions.
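To make that concrete, here is a minimal Java sketch of what one such convention, versioning entities in the database, might look like. The names are illustrative guesses, not Artima's actual design.

    // Hypothetical convention: every persistent entity carries a version
    // number that the persistence layer bumps on each update, so stale
    // writes can be detected. Illustrative only, not Artima's real code.
    public abstract class VersionedEntity {

        private long id;
        private int version;

        public long getId() { return id; }
        public void setId(long id) { this.id = id; }

        // Read by the persistence layer to detect concurrent edits.
        public int getVersion() { return version; }

        // Called once per successful update.
        public void incrementVersion() { version++; }
    }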

Moving Really Quickly

An architecture can make you fast to the extent that it supports the kind of changes and enhancements you need to make in the future. What are the key ingredients that enable you to build such an architecture? In my experience, you need a combination of domain knowledge and luck.

For example, Artima.com started out in 1996 as a single HTML file that I edited by hand. Over time I added more pages, also edited by hand. Before long I found I was quite regularly needing to make global changes, such as altering the headers and footers on all pages. I realized I needed a way to quickly and easily make changes to every page. So I added HTML comments in each page that essentially marked up the header, footer, and other logical areas of the page. Whenever I needed to make a global change, I wrote a Perl script that iterated through all the HTML files and made changes based on the markup comments. If I wanted to update the header of all pages, for example, I'd write a Perl script to look for the HTML comments that marked the beginning and end of the header on each page, and replace what was between them.
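Here is a minimal sketch of that marker-comment technique, translated to Java for consistency with the rest of the discussion (the original scripts were Perl, and the marker names below are assumptions for illustration):

    import java.io.IOException;
    import java.nio.file.*;
    import java.util.regex.*;

    // Walks every HTML file in a directory and replaces whatever lies
    // between the header marker comments with fresh header markup.
    public class HeaderReplacer {

        private static final Pattern HEADER = Pattern.compile(
            "(<!-- BEGIN HEADER -->).*?(<!-- END HEADER -->)", Pattern.DOTALL);

        public static void main(String[] args) throws IOException {
            String newHeader = Files.readString(Path.of("new-header.html"));
            try (DirectoryStream<Path> pages =
                     Files.newDirectoryStream(Path.of("site"), "*.html")) {
                for (Path page : pages) {
                    String html = Files.readString(page);
                    Matcher m = HEADER.matcher(html);
                    if (!m.find()) {
                        // The silent failure mode Bill mentions: a page
                        // missing its markers never gets updated.
                        System.err.println("No header markers in " + page);
                        continue;
                    }
                    Files.writeString(page, m.replaceFirst(
                        "$1" + Matcher.quoteReplacement(newHeader) + "$2"));
                }
            }
        }
    }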

The Perl approach worked for quite a while, but as the growth in the number of pages accelerated, it became unwieldy. Some pages ended up without the comments in the right places, so they wouldn't get updated, and these failures weren't obvious to me when I ran the Perl script. As a result, I eventually replaced the Perl technique with a small Java application I called "Page Pumper." Page Pumper read an XML descriptor and a raw HTML file for each page, and generated the HTML files that were served up by Apache on the live web site. I still use a descendant of that original Page Pumper today, and it still helps me move quickly when I need to make global changes to the Artima Developer website. When I decided to put a search box at the bottom of every page on the site, for example, the entire process of updating the code of Page Pumper, testing the change, and regenerating and deploying all pages on the live site took 15 minutes.
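For readers who never saw it, a Page Pumper-style generator might look roughly like this sketch; the real Page Pumper was never published, so the file layout and descriptor format here are guesses:

    import java.io.File;
    import java.nio.file.*;
    import javax.xml.parsers.DocumentBuilderFactory;
    import org.w3c.dom.Document;

    // Stamps a site-wide header and footer around the raw body of each
    // page, taking per-page metadata (here, just a title) from an XML
    // descriptor. A global change then means editing one template and
    // regenerating every page.
    public class PagePumperSketch {

        public static void main(String[] args) throws Exception {
            String header = Files.readString(Path.of("templates/header.html"));
            String footer = Files.readString(Path.of("templates/footer.html"));

            Document descriptor = DocumentBuilderFactory.newInstance()
                .newDocumentBuilder().parse(new File("pages/index.xml"));
            String title = descriptor.getDocumentElement()
                .getElementsByTagName("title").item(0).getTextContent();

            String body = Files.readString(Path.of("pages/index.raw.html"));
            Files.writeString(Path.of("live/index.html"),
                header.replace("${title}", title) + body + footer);
        }
    }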

I knew how to write Page Pumper because I had domain knowledge. I had experienced that in the domain of publishing a website, the site-wide look and feel of pages tends to be constantly tweaked and massaged. The trouble is that I didn't have complete knowledge of the future. For example, when I wrote Page Pumper, Artima was composed primarily of static web pages. Page Pumper worked very well for generating static pages. I didn't foresee that I would eventually be generating pages dynamically with Java code in a servlet container. When that time came, I did manage to hack Page Pumper so that it worked with JSPs too, but it wasn't a very good fit. In the new architecture, the plan is to have all pages generated dynamically, and I don't expect we'll need a Page Pumper anymore.

The reason I said you need luck as well as domain knowledge to create a good architecture is that you can't see everything that's coming in the future. Domain knowledge helps you predict the future only to the extent that the future resembles the past. Today Artima is a web site, but tomorrow it may be running in car dashboards and cell phones, or something else I have never imagined. When that time comes, the architecture I'm building today will likely not, unless I am lucky, be ready to support those new requirements gracefully. When those new requirements come, I will likely need to invest more time in architecture building.

I'm curious to what extent your experience matches my own. Have you found that a quality architecture is the primary accelerator? To what extent have you found general code quality enables you to move quickly? Have you found that the quality of one part of your system is more important than the quality of other parts? If so, which parts and why? Has code quality mattered more on some teams you've worked on than on others? Does the return on code quality increase as the size of the team increases? Please add to the discussion by posting on the forum for this post.


Patrick Wilson-Welsh

Re: Architecture the Accelerator Posted: Feb 22, 2005 8:23 PM
> ... I have observed that it is usually quality architecture, not quality
> code in general, that helps me move quickly in the long term. And herein
> lies the tradeoff: although a quality architecture can help me move
> quickly in the long term, building a good architecture slows me down in
> the short term.
>
> I agree with Bob Martin that quality code is an expediter, but I think
> it is important to appreciate that quality has a cost, and development
> teams need to make cost/benefit analyses. In some situations, the "best
> code you can possibly write" will be the right level of quality to aim
> for; in others it will not. You have to judge the return on investment
> of your development efforts. The higher the quality, the higher the
> cost. As you invest more in quality, at some point you will start
> getting diminishing returns.

Though I agree that unfortunate architectural choices may slow you down more than bad code in general, I believe that only the latter is truly preventable. As your own story of the evolution of Artima aptly illustrates, Bill, architectures often evolve as projects scale up. Anticipatory, over-complex architecture is likelier to hamper me later than over-simple architecture.

And if we must refactor our architectures, we have to focus on the quality of the code itself. Well-factored design, within any overall architecture or any set of third-party frameworks, is what enables us to change those architectural or framework choices most easily. By anticipating nothing in particular, we anticipate everything in general. Like a Tai Chi practitioner, we are pretty tolerant of small and big changes coming at us from all directions (which is where they come from).

And I consider it a tad dangerous to warn, today, against the dangers of diminishing returns when pursuing the highest quality code possible. We still live in an age where the average codebase has very little test coverage, is rarely refactored, has high coupling and low cohesion, has gigantic classes and methods, hides its intentions, and drapes ugly procedural code in a thin OO nightgown. Most programmers still seem to tacitly accept that as a codebase grows, they will cave to insane production pressures, and quality will be the first thing out the window. They will spend more and more of their time interacting with that code via debuggers, defect rates will continually rise anyway, and the code will steadily slip out of their control. Sooner rather than later, the whole shebang will require a massive overhaul.

In such an age, I think we have plenty to learn about how poor quality has slowed us down, and everything to learn about how extremely high quality might speed us up. Refactoring vigorously as we go is a skill that most of us are just starting to learn. It has so many of its own tricks, and its own feel to it. I know programmers who, being much better at it than I am, and assisted by automated refactoring tools of recent vintage, seem not at all slowed down by feverishly improving the code as they go. Certainly not in the long term, but apparently not in the short term either. (My goodness, how much will we all end up owing to Extract Method and Rename Method, by themselves?)

We are just starting down this road of pushing high quality to its limits. My hope is that we spend a few years feverishly trying to see how the highest possible code quality affects our short run and long run returns. Until I see that most of us are skilled in the aggressive pursuit of code quality, I for one won't accept that we know much about how and when those returns might diminish.

Bill Venners

Re: Architecture the Accelerator Posted: Feb 22, 2005 10:06 PM
> Though I agree that unfortunate architectural choices may
> slow you down more than bad code in general, I believe
> that only the latter is truly preventable. As your own
> story of the evolution of Artima aptly illustrates, Bill,
> architectures often evolve as projects scale up.
> Anticipatory, over-complex architecture is likelier to
> hamper me later than over-simple architecture.
>
I believe you, and I have had that experience myself, but that doesn't mean you shouldn't make any bets that something will be useful in the future based on your domain knowledge. The choice isn't just between an over-complex architecture and an over-simple one. There is also middle ground. No matter what you do you are making bets about the future, and I think well-placed bets about the future based on past experience are prudent, because in many ways the future does repeat the past. At least it has in the past.

> And if we must refactor our architectures, we have to
> focus on the quality of the code itself. Well-factored
> design, within any overall architecture or any set of
> third-party frameworks, is what enables us to change
> those architectural or framework choices most easily. By
> anticipating nothing in particular, we anticipate
> everything in general. Like a Tai Chi practitioner, we are
> pretty tolerant of small and big changes coming at us from
> all directions (which is where they come from).
>
This has not been my experience. Even if I had invested the time it would have taken to make Artima's code on the existing architecture excellent, that wouldn't help me much now that I'm moving to a new architecture. For example, Artima's current web tier is composed of JSPs that call into Jive Software's forums API. How does well factored JSP code that calls into the Jive API help me create well factored controllers that call into a new Artima API (which will be very different from the Jive API), which uses Velocity templates for rendering? There is nothing I can reuse from the existing JSPs except the domain experience I gained working with them. So the code quality of those JSPs is really irrelevant to the speed with which I can perform this re-architecture.

> And I consider it a tad dangerous to warn, today, against
> the dangers of diminishing returns when pursuing the
> highest quality code possible. We still live in an age
> where the average codebase has very little test coverage,
> is rarely refactored, has high coupling and low cohesion,
> has gigantic classes and methods, intention is
> inscrutable, and ugly procedural code wears a thin OO
> nightgown. Most programmers still seem to tacitly accept
> that as a codebase grows, they will cave to insane
> production pressures, and quality will be the first thing
> out the window. They will spend more and more of their
> time interacting with that code via debuggers, defect
> rates will continually rise anyway, and the code will
> steadily slip out of their control. Sooner rather than
> later, the whole shebang will require a massive
> overhaul.
>
I agree there is danger that people would use such discussion as an excuse for bad code. However, I would like to hear what people have experienced in practice about the return on investment in quality code. I'm hoping that the knowledge that will emerge from this discussion will be a clearer indication of when it actually makes sense to point to this discussion not as an excuse, but as a reason for not investing more in the quality of a particular piece of code. Just because some people "can't handle the truth" doesn't mean that we shouldn't seek the truth so the rest of us can benefit from it. And who knows, the truth may be that you should always strive for the utmost code quality.

Nevertheless, the attitude that you should always strive for the most excellent code all the time reminds me of the attitude that you should always strive for the fastest performing code all the time. In practice, only 10% to 20% of the code matters to performance, and time spent speeding up the rest is simply wasted. It is money down the drain. As much as I prefer personally to create quality code, I have found in practice that when I didn't, it often made no difference. Good enough code in the right places was good enough, and any more time spent making it better would have ended up having little or no return on investment. A good example is those JSPs I mentioned earlier. Had I spent a few weeks cleaning those up, it would not have helped me in the current architectural change. It would have been wasted effort, money down the drain.

> In such an age, I think we have plenty to learn about how
> poor quality has slowed us down, and everything to learn
> about how extremely high quality might speed us up.
> Refactoring vigorously as we go is a skill that most of
> us are just starting to learn. It has so many of its
> own tricks, and its own feel to it. I know programmers
> who, much better at it than I, and assisted by automated
> refactoring tools of recent vintage, seem not at all
> slowed down by feverishly improving the code as they go.
> Certainly not in the long term, but apparently not in the
> short term either. (My goodness, how much will we all
> end up owing to Extract Method and Rename Method, by
> themselves?)
>
> We are just starting down this road of pushing high
> quality to its limits. My hope is that we spend a few
> years feverishly trying to see how the highest possible
> code quality affects our short run and long run returns.
> Until I see that most of us are skilled in the aggressive
> pursuit of code quality, I for one won't accept that we
> know much about how and when those returns might diminish.

I don't want to wait, and I don't think we need to wait. I think a lot of folks already have a lot of experience they could share on this question that would help begin to illuminate the answers, and that's what I'm hoping to get out of reading the discussion that ensues here.

Parag Shah

Re: Architecture the Accelerator Posted: Feb 23, 2005 1:57 AM
Bill,
There are two forces that software usually has to reckon with: changes in business requirements and changes in technology. Let's say we build an ERP system. Some business rules in the accounting module might have to be modified due to changes in tax laws. Perhaps the client might want to produce more reports or change some existing ones. The client might want to change the UI. Brainstorming with domain experts and clients will most likely help us make a comprehensive (though not all-encompassing) list of requirements that can change in the next, let's say, three years. With some further thought we may also be able to attach a probability of change. If we incorporate enough flexibility for high-probability changes then we will get a good return on investment on the architecture. I think this is the middle ground between an overly simplified architecture and a very complex one.
The other force that acts on our architecture is changes in technology. Had we created our ERP system with an amateur homegrown MVC kernel, then we might at some point feel the need to use Struts or an equivalent framework. The refactoring effort will depend on our choices of classes and their responsibilities. If we have used good OO principles and ensured that each class has a coherent set of responsibilities and the methods are well structured, then we might actually be able to reuse a large part of our existing code base with minor modifications in the new architecture.
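A minimal sketch of that separation, with illustrative names: the domain logic lives in plain classes with no framework imports, so only the thin framework-facing layer is rewritten when the homegrown kernel gives way to Struts.

    // Pure domain logic: no servlet, Struts, or homegrown-MVC imports,
    // so it can be reused unchanged under a new web framework.
    public class TaxCalculator {
        public double taxFor(double amount, double rate) {
            return amount * rate;
        }
    }

    // The only framework-facing piece. Rewriting this thin adapter for
    // a new framework leaves TaxCalculator and its tests intact.
    class TaxController {
        private final TaxCalculator calculator = new TaxCalculator();

        public String handle(double amount, double rate) {
            return String.valueOf(calculator.taxFor(amount, rate));
        }
    }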
What if we have to face a drastic technological change that will render our current code base useless? This is possible, though not very likely. Technological changes do not usually happen overnight. There is enough indication before a promising technology becomes mainstream. If we adopt the practice of constant refactoring then we should be able to adapt to the new technology gradually.
I think there is definite value to a well-thought-out architecture. This value is maintained if the thought on architecture is not limited to the architecture phase, but rather is a continuous process.
Bill, you mention diminishing returns on code quality. How do you define code quality, and why does creating good quality code take more time? Good quality code is not bulkier than mediocre code. I think it is the effort spent thinking about what to code, or how to structure code, that is time consuming. If we constantly practice writing good code then this thought process becomes second nature, and it does not take a lot of time. The extra time it does take is worth more than the effort we would have to put in to fix bugs.
Is it possible that if programmers had a lot of discipline and diligence then they would produce good quality code by default?

Vincent O'Sullivan

Re: Architecture the Accelerator Posted: Feb 23, 2005 3:16 AM
> Writing mediocre code may help you move fast in the
> short term, but it can slow you down in the long term.
> A good architecture can help you move fast in the
> long term, but slows you down in the short term.

I don't think the problem lies in the code, nor in the technologies, nor the architecture. The hard truth is that we are both the problem and the solution. Us, the developers, the designers, the architects. Not the tools. Good tools have been around for years. Some people use them well, some badly. The difference between good usage and bad usage is not the tool but the user.

Too many claims are made that language A is ten (it's always ten) times better than language B, etc. But, in my humble experience, no language developed in the last twenty years has any significant new features, just different ways of implementing them. A good C/VB/Java/Python programmer will develop better stuff quicker than a poor C/VB/Java/Python programmer (select any combination or any other language).

> How is a developer to decide how good is good enough?

To the advice "do the best you can", I would add "train, practice, learn, repeat".

Vince.

Mark Wilkinson

Re: Architecture the Accelerator Posted: Feb 23, 2005 6:10 AM
I have a fairly simple measure for how good the architecture of a system is: the architecture is 'right' when it's clear which parts of the system need to be modified to implement any given requirement.

So, for example, your users ask for a new field to be added to the customers in the system. If it's obvious how to do this then your architecture is right. Your users ask for the system to be deployed with the web front-end and the business logic on separate machines. Again, if it's obvious how to do this then your architecture is right.

The measure obviously leads to the conclusion that the 'rightness' of the architecture depends on the future requirements, so I think much of the trade-off that you're describing actually comes back to being able to anticipate future requirements and not over-committing on architectural decisions where you don't need to.

Darryl Mataya

Re: Architecture the Accelerator Posted: Feb 23, 2005 8:58 AM
Architecture is a fabric that unifies individual strands of technology into something with recognizable properties and a distinguishable shape.

A colleague and I constructed this definition years ago when we were trying to define, for management at that time, an analogy that explained why we weren’t coding for the first three months of a new project. Of course architecture exists for every project you work on. If unattended, it will likely resemble a patchwork quilt, although a gifted architect will often default to an elegant work without any prior planning.

The short run problem is recognizing that you have architecture, whether defined or not. Then you have to assess whether your architecture is helping or hindering you in your project’s goals. But even good design and architecture cannot help you if you have ambiguous specifications. I think misunderstood objectives are a much bigger barrier to project success than bad architecture. Bill states that he did not "have complete knowledge of the future" when working with a static web site. That is often true, but can’t usually be solved by architecture decisions – though sometimes you get lucky.

I have never been involved in any coding project that would not have had an immediate return on time investment from spending more time early with architectural assumptions and definition. I think a primary reason why every architecture benefits from additional investment has to do with the nature of the software development personality. Most developers tend to over-solve problems. By that I mean we have a hard time initially imagining the simplest possible solution to each specific problem. We have all worked with the inexperienced developer who views a language or operating system as nothing more than a series of levers and pulleys that are fun to move.

Another overlooked benefit of "time spent architecting" is that it builds a shared understanding of the architecture among the humans on the development team. I think this is part of what Vincent meant when he pointed out the human part of the problem. Software architecture is still expressed primarily in words or other language symbols. There is of course still no universal software blueprint methodology. Take for example the statement that "any user tool or component shall not access the underlying database or data tables directly, but will always access logical data via the business object properties exposed in the AI namespaces." That statement will probably be fairly clear to most experienced developers, but even they will occasionally screw up, perhaps because of a disagreement about what the term "user tool" means. Does it also mean the quick and dirty utility that was written to fix database incompatibilities? (We originally used the phrase "end user interface" and changed it for this very reason.)
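Expressed in code, the kind of rule quoted above might look like this sketch; the names are hypothetical stand-ins for the business objects in those namespaces:

    // User tools see only business object properties, never the tables
    // behind them; the interface may be backed by one table, several,
    // or a cache, and callers cannot tell the difference.
    public interface Customer {
        String getName();
        double getBalance();
    }

    class ReportTool {
        // Allowed: depends only on the exposed business object.
        // Forbidden by the convention: opening a JDBC connection here
        // and reading the customer tables directly.
        double totalBalance(java.util.List<Customer> customers) {
            double total = 0;
            for (Customer c : customers) {
                total += c.getBalance();
            }
            return total;
        }
    }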

Bill Venners

Re: Architecture the Accelerator Posted: Feb 23, 2005 9:03 AM
> Bill, you mention diminishing returns on code quality. How
> do you define code quality, and why does creating good
> quality code take more time? Good quality code is not
> bulkier than mediocre code. I think it is effort that is
> spent thinking on what to code, or how to structure code
> that is time consuming. If we constantly practice writing
> good code then this thought process becomes second nature,
> and it does not take a lot of time. The extra time it does
> take is worth more than the effort we would have to put in
> to fix bugs.
>
I don't have a good definition of quality, other than perhaps suggesting reading Zen and the Art of Motorcycle Maintenance. It is subjective, a "we know it when we see it" kind of thing. Some things I would recognize as quality code in general are well-factored code, with focused, well-named classes and methods and minimal duplication, and code that is adequately covered by tests. What is good really depends on the situation, but in general, quality code is easy to understand and change as well as correct and robust.

Where does the cost come from? I think you're right in that it is mostly time, and time is money because someone is paying you for your time. If I need to implement a feature, I can go in quickly and do it and test it by hand. Or I can think about it a bit, then go in quickly and do it and test it by hand. Or I can think about it a bit, then go in quickly and do it and write good automated test coverage along the way, and test it by hand at the end. Or I can think about it a bit, do some preparatory refactoring, perhaps build a test harness and spend time on the build process so that there's a one-button build, then write good tests and code, refactor the code after, and do some testing by hand at the end. Each time I invest a bit more time, which costs a bit more money, but with that extra investment in time I can produce higher quality code.
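As one concrete illustration of the "good automated test coverage along the way" level of investment, here is a minimal test in the JUnit 3 style current when this thread was written; the class under test is hypothetical:

    import junit.framework.TestCase;

    public class AccountTest extends TestCase {

        // Costs a little time now; repays it every time the class is
        // changed later and the suite is re-run instead of hand-testing.
        public void testDepositIncreasesBalance() {
            Account account = new Account();
            account.deposit(100.0);
            assertEquals(100.0, account.getBalance(), 0.0001);
        }
    }

    // The hypothetical class under test, included so the sketch compiles.
    class Account {
        private double balance;
        public void deposit(double amount) { balance += amount; }
        public double getBalance() { return balance; }
    }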

You wrote "The extra time it does take is worth more than the effort we would have to put in to fix bugs." I think it is good to think like that, because that's return on investment thinking. What's important to estimate is not just the cost up front, but the total cost of ownership. The question I'm asking is where's how do you decide how much to do up front? The more you put in up front, the lower the total cost of ownership, but only to a point. At some point, you will start getting diminishing returns. In practice, I curious to learn what people have observed about the actual return gained on varying levels of quality code.

> Is it possible that if programmers had a lot of discipline
> and diligence then they would produce good quality code
> by default?
>
One thing we can do is constantly try to increase the quality of the code we generate in a given amount of time. This is something we as professionals should do, and it is something that I'd like Artima to help facilitate through discussions like this. But from the perspective of the person who is paying you, this still has a cost that has to be factored into his or her thinking. Are you being paid to read this? Have you ever been sent by your company to a class or seminar, or submitted an expense report for a tech book? That costs money.

You can also say that only the best and brightest should be hired in the first place, folks who will generate higher quality code in a given time frame. One problem there is that the best and brightest often work somewhere else, so you can't hire them. Those that are available to work for you are often hard to recognize. It also takes time to do interviews, and time spent doing interviews costs a lot of money too, both because you're paying people to do the interviews and because when people are interviewing they aren't writing code. Finally, when you do hire one of these hot shot developers, they will likely cost more than an average developer. In general, though, they aren't able to charge as much more as they are worth, so they are still a much better deal.

In short, no matter how you get at improving the quality of your codebase--whether through better processes or better people or both--it will cost you money.

Darryl Mataya

Re: Architecture the Accelerator Posted: Feb 23, 2005 9:26 AM
Regarding software quality, let me paraphrase Bill's question. Since quality takes time and time is money, how much quality do you want to buy?

That is an unanswerable question to management, because their answer will either be "as much as it takes" or "all of it." (I'm paraphrasing on their behalf;)

I prefer a software quality definition based on fitness for purpose - as opposed to anything oriented towards "good, better, best", or worse yet, zero defects. The entire organization can then get involved in a fitness for purpose definition/evaluation and allow designers to focus on architecture questions.

With that background, in my experience, we usually under-invest in architecture decisions. When we know we've got a piece good enough for the purpose, we quit. But most of our later refactoring is due to incorrect specifications of the purpose, not to choosing an incorrect data structure somewhere along the way.

Joel Neely

Re: Architecture the Accelerator Posted: Feb 23, 2005 9:32 AM
Discussions of this nature always remind me of the old saw:

Good judgement comes from experience;
Experience comes from bad judgement.


because much of the advice regarding extreme programming seems to boil down to:

Use good judgement (or good taste!)

This doesn't imply condemnation of agile development approaches; it is simply my experience that experts in most domains behave differently than beginners or other non-experts. Often they have learned from, and internalized, the lessons from long practice of their craft/profession to the point that they perceive, make decisions, and perform many tasks without laborious conscious analysis. Discussions of agile practices - including this one - are full of phrases that seem to appeal to this level of performance: "code smells", "YAGNI", "just enough", "right balance", "when it's clear", etc.

IT currently seems to share with many other areas of commercial endeavor the pressure to commoditize skills and products. Taken to the extreme, this pressure even has the appearance of mistrust toward individual expertise (a point raised many times in E. W. Dijkstra's writings). However, expertise seldom can be reduced to a handful of simple recipes decoupled from experience.

So how does this apply to the architecture/accelerator question, at least for those of us who don't claim a master's intuitive grasp of development?

I've come to regard a software development project as a literary endeavor. The code (and related documents) must tell a story that is intelligible to those who come after (including the original authors after enough time has passed!). A project journal may be one of the most under-rated artifacts that can enhance this intelligibility. A narrative of major decision points that clarifies why each alternative was taken or rejected is a risky thing (it is easy for choices to be second-guessed with 20-20 hindsight, and for political issues to appear... political), but can offer great insight both into "How did it get into this state?" and into "How will we do it differently next time?"

An honest log of conscious choices (e.g. how much architecture to introduce at what points) would be a great asset to weighing the pros and cons the next time around.

Robert C. Martin

Re: Architecture the Accelerator Posted: Feb 23, 2005 3:48 PM
I make the following statements without supporting argument. They are maxims of professionalism that I will not compromise. Holding to them is how I keep my self-respect as a programmer.

* The cost of quality is negative. Always.
* You never go faster by reducing quality. Not even in the very short term.
* If the code is bad, then so is the architecture.
* The right architecture today is the wrong architecture tomorrow.
* Good architecture is so well written that it can incrementally evolve with your system.

Again, these are maxims. They are articles of faith backed up by years of experience. I have adopted these maxims because I have tried the other course and found it lacking.

Too many quick hacks became long disasters.
Too many trade-show demos crashed.
Too many prototypes were shipped and maintained for years.
Too many quick and dirty changes became multi-week debugging sessions.
Too many 2AM coding sessions became long-term architectural impediments.

The day a programmer stands his ground and says: "The only way to go fast is to go well." is the day he becomes a professional.

I will not accept the argument that says we have to reduce quality to achieve time-to-market.

Robert C. Martin

Re: Architecture the Accelerator Posted: Feb 23, 2005 4:03 PM
> An architecture can make you fast to the extent that it
> supports the kind of changes and enhancements you need to
> make in the future.

This is a true statement, but IMHO it implies an incorrect conclusion. It implies that in order to build a good architecture you must be able to predict the future. If that were true, then building effective architectures would be hopeless.

An architecture that is right for today is wrong for tomorrow. Architectures are transient things that support where we are now rather than where we think we might need to go.

To support change, an architecture must be changeable. This means that it must be very well written, simple, and elegant. Such an architecture does not predict the future. Rather it adapts to the future by being easy to change; and by hiding the agents of change from the application.

For example, an architecture that is designed around J2EE entity beans is hard to change. The system running within that architecture is bound to the J2EE APIs almost inextricably.

On the other hand an architecture that uses J2EE, but hides it from the application, frees the application, and is much easier to change than the application itself.
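A minimal sketch of that structure, with illustrative names: the application owns the interface, and the technology-specific code lives behind it.

    // The application programs against its own interface and never
    // imports the persistence technology's APIs.
    public interface OrderRepository {
        Order findById(long id);
        void save(Order order);
    }

    class Order {
        long id;
        double total;
    }

    // The one place that knows how orders are stored. Swap this class
    // (entity beans, Hibernate, plain JDBC) without touching callers.
    // An in-memory map stands in here for illustration.
    class InMemoryOrderRepository implements OrderRepository {
        private final java.util.Map<Long, Order> store =
            new java.util.HashMap<Long, Order>();

        public Order findById(long id) { return store.get(id); }

        public void save(Order order) { store.put(order.id, order); }
    }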

Frank Sommers

Re: Architecture the Accelerator Posted: Feb 23, 2005 4:23 PM
> I make the following statements without supporting
> argument.

That's interesting, because that statement sounds like a hallmark of a religion.

> They are maxims of professionalism that I will
> not compromise. Holding to them is how I keep my self
> respect as a programmer.
>
> * The cost of quality is negative. Always.
>

What does that mean? So "quality" makes you money? How so? Ask Microsoft. Isn't it happy customers that make you money? And if so, what if a customer is happy with a given time-to-market (because otherwise they'd go out of business), and would not be willing to pay for work that takes more time but that would result in "higher" quality?

> * You never go faster by reducing quality. Not
> even in the very short term.


Well, sure you do. Just press the accelerator on your car to the metal - you *will* go faster, but you might also hit a tree. Now, if you're at a railroad crossing, and a train is fast approaching, you still would want to hit the accelerator as fast as you can. That train might be your competitor.

> * If the code is bad, then so is the architecture.

I agree that bad architecture encourages bad code, but I've seen a lot of bad code written to an otherwise good architecture.

> * The right architecture today is the wrong architecture
> tomorrow.
> * Good architecture is so well written that it can
> incrementally evolve with your system.

Hmm, these two points seem to contradict.


> The day a programmer stands his ground and says: "The
> only way to go fast is to go well." is the day he
> becomes a professional.

Not in my book. The day the programmer becomes a professional is the day he can judge the exact amount of time and quality required for a given task.

> I will not accept the argument that says we have to
> reduce quality to achieve time-to-market.


I don't know if you've ever run a business, but that's what I do, and I can tell you that it's a dangerous world out there. When it rains, you just want an umbrella - if you waited for a hut to be built, you'd die of hypothermia. So you sometimes do have to reduce quality to reach an objective.

Just look at many successful Web sites today, and you will notice many didn't have such a great quality architecture from the start. For instance, eBay had problems serving the great many visitors they got after a few years. But by then they had the money to develop the exact architecture required to serve those people, and that's what they did. What they focused on first, instead, was the quality of user experience needed to get all those people to trade there.

Bill Venners

Re: Architecture the Accelerator Posted: Feb 23, 2005 5:13 PM
> I make the following statements without supporting
> argument. They are maxims of professionalism that I will
> not compromise. Holding to them is how I keep my
> self-respect as a programmer.
>
I'm not sure why, and I'd like to investigate this with Kent Beck some day if I get to interview him, but I hear this kind of attitude often from XP enthusiasts. These are the kind of statements filled with religious fervor that give me the impression of XP as a cult. I think the success of XP has done a huge amount of good by promoting many very good ideas, and building a religious fervor around the ideas did help market them. But the downside to me is that it also causes some people to not think. I see tons of discussion on the XP mailing list, which I think is great. People are discussing and thinking, but I also hear a lot of people opening their mouths and just uttering the XP party line seemingly without much thought. Here the party line would be that aiming for the utmost in code quality is always the right choice, but I think it is useful to question that assumption and discuss it, so we can all gain from each other's experience.

As a person, I really want to create high quality everything. I want this site to be high quality. I want the code to be high quality. But as a professional, if someone is paying me, I want their return on investment to be high quality. That's what they're paying me for. And if that sometimes means hacking together a quick demo for that person to meet a deadline, even if it is more likely to crash, then I do that and I still respect myself in the morning.

> * The cost of quality is negative. Always.

This is not my experience. Frank and I just got off the phone with a happy Artima customer, with whom we were talking about what their advertising goals are for the next year. They didn't say a thing about Artima's code quality. They couldn't care less if our JSPs are ugly. And the users never bring it up either. The JSP I'm using now to post to this forum, which was originally written by Jive Software, is quite messy, but it is apparently good enough to host this discussion.

I know that Artima has given its users a great deal of pain in occasionally slow pages and reboots of Tomcat. I am working to fix that, but it actually has little to do with code quality as far as I can tell. It has more to do with a combination of the hardware the site is currently running on and the fact that the app isn't clustered. The problem also is almost always caused by unintentional denial of service attacks from web copiers or unfriendly spiders.

> * You never go faster by reducing quality. Not
> even in the very short term.


Well, I just don't see this. Can you explain a bit, or give an example? An example of how time and quality are related is how we all post to this forum. I use the Preview button to make sure it all looks good before I post. I don't just whip it out and press Post Message. I go back and forth iteratively between the edit box and the preview until I'm happy. That takes more time than just whipping out the post and pressing Post Message.

> * If the code is bad, then so is the architecture.

I don't see this either. Maybe we have a different definition of architecture. To me, there are architectural parts and non-architectural parts. The theory I'm putting up for debate in this weblog post, which is what I'm attempting to put into practice right now at Artima, is that if I pay very close attention to ensuring a high quality architecture up front, I won't have to be so attentive or micro-managing down the road. The architecture establishes an inertia. My theory is that if we establish an architecture that points people towards high quality work, they will lean towards that. And even if they need to hack some stuff together one week to meet a deadline, the cruft will still be contained enough that it is easy to deal with later.

> * The right architecture today is the wrong architecture
> tomorrow.

I believe I agree with that statement. I think you're basically saying that it is hard to predict the future requirements of the system, so the architecture that facilitates changes in the near term is likely to not facilitate changes that will be needed in the long term.

> * Good architecture is so well written that it can
> incrementally evolve with your system.
>
That sounds good too, but I don't know how to do that. I think you're referring to the best practices of having tests and clean, well-factored code. But I just don't believe that when significant changes in business requirements and technologies come about, that well-tested, well-factored code will help much.

> Again, these are maxims. They are articles of faith
> backed up by years of experience. I have adopted these
> maxims because I have tried the other course and found it
> lacking.
>
> Too many quick hacks became long disasters.

I have seen quick hacks become long disasters, but I have also seen quick hacks that meet a business need and then are either cleaned up shortly thereafter or discarded. I have also seen quick hacks that meet a need that don't ever really need much changing, and these stay hacks and continue to serve a useful purpose.

> Too many trade-show demos crashed.

True again, but my theory is that this has less to do with code quality than with a race of invisible demo gremlins that go around wreaking havoc. Actually, those crashes probably were due to poor quality in most cases. But I have also seen a lot of demos that worked, and I somehow doubt that none of them were hacks.

> Too many prototypes were shipped and maintained for years.
>
That's true.

> Too many quick and dirty changes became multi-week
> debugging sessions.

I have found this to be true more in the days when I was using C or C++ and doing memory management. Now that I use Java and have a garbage collector, the debugging sessions don't seem to last very long.

> Too many 2AM coding sessions became long-term
> architectural impediments.
>
> The day a programmer stands his ground and says: "The
> only way to go fast is to go well." is the day he
> becomes a professional.
>
> I will not accept the argument that says we have to
> reduce quality to achieve time-to-market.


I wouldn't say that any of us should reduce quality so much as be satisfied with varying degrees of quality depending on the circumstance. Let me give you a metaphor in the building architecture world.

I spend so much time sitting in front of computers that, to get exercise, I prefer to climb stairs rather than take elevators. I was in China last November, and our room was on the 15th floor. So I went up and down the stairs many times. These stairs were made out of cement and were dirty. The railing was metal and had paint that was peeling off. Their intended use is as an emergency exit in case of fire or a power outage that renders the elevators inoperable. I also noticed they were used by the cleaning crew to go between floors. By contrast, in the lobby of that same hotel there was a staircase that was nicely curved and carpeted. The railing was wooden and polished. It was a higher quality set of stairs.

For the purpose the emergency stairs were intended for, I don't think they need to be as high quality as the lobby stairs. The main quality aspect that matters in the case of the emergency stairs is that the stairs are usable and dependable. They don't have to be clean or pretty. In the case of the lobby stairs, the desired quality is higher because the purpose of the stairs is not only to move people up and down every day (i.e., not just in emergencies), but also to form part of the initial impression on customers. If customers walk in the door and see a dirty cement staircase, they will assume the hotel rooms are dirty and low quality, and many of them will turn around and look elsewhere.

If that hotel hired you to build the side stairs, would you insist that your professional integrity depended upon only the highest quality mahogany rails and plush carpet? I'm sure that sounds silly to you, but to my ears, what you're saying about insisting on the utmost quality in code no matter what the situation sounds similar.

Isaac Gouy

Re: Architecture the Accelerator Posted: Feb 23, 2005 5:43 PM
> I make the following statements without supporting
> argument.

Frank Sommers wrote: "That's interesting, because that statement sounds like a hallmark of a religion."

Bill Venners wrote: "These are the kind of statements filled with religious fervor that give me the impression of XP as cult."

There are many religious folk who are deeply thoughtful.

In contrast we might say that these statements are an expression of dogma.
