The Artima Developer Community

.NET Buzz Forum
XP covers TDD's a$$


Udi Dahan


Udi Dahan is The Software Simplist
XP covers TDD's a$$ Posted: Mar 5, 2004 3:58 PM

This post originated from an RSS feed registered with .NET Buzz by Udi Dahan.
Feed Title: Udi Dahan - The Software Simplist
Feed URL: http://feeds.feedburner.com/UdiDahan-TheSoftwareSimplist
Feed Description: I am a software simplist. I make this beast of architecting, analysing, designing, developing, testing, managing, deploying software systems simple. This blog is about how I do it.


I think I've already said this about a million times, but it bears repeating: test-driven development does not a well-tested system make. OK, "So what good is TDD, then?" you ask.

First, let's prove my original claim. To test even a simple method like "long add(int a, int b)", you have to write a ridiculous number of tests. Checking range and sign is just the start of it, and at best that tells you the method behaves correctly. But what about performance? Is it fast enough? Does it behave in multi-threaded situations? What about when you're out of memory? And on, and on.
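To make the point concrete, here is a minimal sketch (translated to Python for illustration; the widening-to-long behavior is simulated, since Python ints don't overflow) of how the tests pile up for that one-line method:

```python
# 32-bit int bounds, mirroring the int parameters of add(int a, int b).
INT_MAX = 2**31 - 1
INT_MIN = -(2**31)

def add(a: int, b: int) -> int:
    """Adds two 32-bit-style ints, widening so the sum never overflows."""
    return a + b  # Python ints are arbitrary precision, like returning a long

# Sign and range checks are just the start...
assert add(2, 3) == 5
assert add(-2, -3) == -5
assert add(INT_MAX, 1) == INT_MAX + 1        # past the 32-bit boundary
assert add(INT_MIN, -1) == INT_MIN - 1
assert add(INT_MAX, INT_MAX) == 2 * INT_MAX
# ...and none of these says anything about performance, thread safety,
# or behavior under memory pressure.
```

Even this list is incomplete, which is exactly the problem.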

Properly testing anything, so that you're as close to 100% sure as you can be that any possible use of the code will work as expected, requires a lot of thought, time, and code.

If you're going to do all this for every bit of code you write, you'll never finish anything! And, of course, there will be some tests you forget to write, so you can't really be sure your tests will catch all the potential problems.

The TDD gurus will tell you that TDD is not about writing every possible test for every bit of code. BTW, studies have shown that testing is less effective at reducing defects in code than code inspection. Of course, XP prescribes pair programming as constant code inspection for exactly this reason.

So, if TDD doesn't give you a well-tested system or particularly reduce defects, why the hell should you run around writing all these tests?

There are two main reasons.

1. TDD is a design technique. It makes you think about how you'll use your code before you write it, which leads to higher-quality code. Couple it with YAGNI ("you aren't gonna need it") from XP, and you prevent over-engineering.
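A hypothetical sketch of the design effect (all names invented): the test is written first, so it dictates the shape of the API before any implementation exists, and YAGNI keeps the implementation minimal.

```python
# Written first: this test decides what the API looks like.
def test_cart_total_is_sum_of_item_prices():
    cart = Cart()
    cart.add_item("book", 1250)   # prices in cents to avoid float trouble
    cart.add_item("pen", 125)
    assert cart.total() == 1375

# Only now do we write the *minimal* Cart that passes -- YAGNI says no
# discounts, currencies, or persistence until a test demands them.
class Cart:
    def __init__(self):
        self._items = []

    def add_item(self, name, price_cents):
        self._items.append((name, price_cents))

    def total(self):
        return sum(price for _, price in self._items)

test_cart_total_is_sum_of_item_prices()
```

The point isn't the cart; it's that `add_item` and `total` were chosen from the caller's perspective, before a line of implementation existed.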

2. TDD pushes the creation of many tests that show the system behaves as expected under given assumptions. This is important! Document your assumptions. In code. In your tests. As assertions. Without the assumptions, you'll be bumping your way around a dark room, forgetting everything after each bump.
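For instance, a minimal sketch (names invented) of an assumption recorded as an assertion, so it lives in code rather than in someone's head:

```python
def parse_price(text: str) -> int:
    # Assumption: prices arrive as plain "dollars.cents" strings,
    # with no currency symbols or thousands separators.
    dollars, cents = text.split(".")
    return int(dollars) * 100 + int(cents)

def test_assumes_plain_decimal_no_currency_symbol():
    # The test name and body together document the assumption.
    assert parse_price("19.99") == 1999
    assert parse_price("0.05") == 5

test_assumes_plain_decimal_no_currency_symbol()
```

If some other code later starts passing "$19.99", the failure lands right here, at the recorded assumption.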

But here's the point: the tests act as change detectors.

Whenever something changes (usually one of the assumptions you made has been found to be false as a result of some other code being written), the place where that assumption appears blows up.

Hooray!

There's nothing better than the code standing up and yelling, "Hey, over here! This here's broken! Come fix it."

On the other hand, it could be that the assumption is still valid, in which case the person who wrote the new code will see that their use of it is wrong. Hopefully sooner rather than later; continuous integration would let them know right away. (It's getting pretty obvious why in XP you see all the techniques together: by themselves, they have holes; together, each covers the others'.)

So, although TDD is a good technique, it doesn't give you much in the way of actually testing the system. Enter XP, yet again. Integration testing and an on-site customer who checks (tests) that a story is complete both contribute to the "coverage". However, one place where XP has no built-in technique is serious testing: performance, scalability, and out-of-resource situations are all areas that must be specifically tested for. They are not part of the rhythm that makes up XP.

This defines a family of systems for which XP, out of the box, does not quite fit. Systems with stringent performance or scalability requirements will need the XP methodology tweaked so that they don't run into problems.

One common workaround is to create customer stories for performance. It's simple, easy, and doesn't break the rhythm. But woe to those who forget; that's the only reason I bring it up, because XP doesn't take care of it for you like all the rest.
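One way to keep such a story from being forgotten is to make it executable. A hypothetical sketch (the operation and the budget are invented; a real budget would come from the customer story):

```python
import time

def lookup(table, key):
    """The operation the performance story is about (stand-in example)."""
    return table.get(key)

def test_lookup_meets_latency_budget():
    table = {i: i * i for i in range(100_000)}
    start = time.perf_counter()
    for _ in range(10_000):
        lookup(table, 99_999)
    elapsed = time.perf_counter() - start
    # The budget comes from the customer story, not from a developer's guess.
    assert elapsed < 1.0

test_lookup_meets_latency_budget()
```

Run like any other test, it turns the performance story into the same kind of change detector as the rest of the suite.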

Final point: if you're doing TDD, but not XP, you're missing out. XP covers TDD's ass.

Copyright © 1996-2019 Artima, Inc. All Rights Reserved.