The Artima Developer Community

News & Ideas Forum (Closed for new topic posts)
Test-Driven Development

12 replies on 1 page. Most recent reply: Feb 1, 2005 2:06 AM by Vincent O'Sullivan

Bill Venners

Posts: 2284
Nickname: bv
Registered: Jan, 2002

Test-Driven Development Posted: Dec 2, 2002 8:58 AM
Artima.com has published Part V of an interview in which Martin Fowler discusses the unhurried quality of test-first design, defines monological thinking, and distinguishes between unit and functional testing.

http://www.artima.com/intv/testdriven.html

Here's an excerpt:

The thing I like about taking small steps and writing tests first is that it gives me a simple to-do list of the things I've got to do. At each end point I have a certain amount of closure. I say, OK, this stuff works. Check it in. It's all there. It does what it does and it does it correctly.

There's an impossible-to-express quality about test-first design that gives you a sense of unhurriedness. You are actually moving very quickly, but there's an unhurriedness because you are creating little micro-goals for yourself and satisfying them. At each point you know you are doing one micro-goal piece of work, and it's done when the test passes. That is a very calming thing. It reduces the scope of what you have to think about. You don't have to think about everything you have to do in the class. You just have to think about one little piece of responsibility. You make that work and then you refactor it so everything is very nicely designed. Then you add in the next piece of responsibility. I previously used the kind of approach you describe. I'd ask, "What's the interface of this?" I've now switched and I much more prefer incremental design.


Do you think you can come up with a good design by only looking at one piece of the problem at a time? If you practice test-first, incremental design, how much does your design tend to change through refactoring once you have finished all the little pieces and take a step back to look at your design through a larger lens?


Chris Dailey

Posts: 56
Nickname: mouse
Registered: Dec, 2002

Re: Test-Driven Development Posted: Dec 3, 2002 1:51 PM
> Do you think you can come up with a good design by only
> looking at one piece of the problem at a time? If you
> practice test-first, incremental design, how much does
> your design tend to change through refactoring once you
> have finished all the little pieces and take a step back
> to look at your design through a larger lens?

I have to wonder about this. You do some testing, then some coding, and you get it to work. Do this a few times. After a while, all of your methods work, but they've gotten pretty large.

Say you combine this methodology with constantly testing metrics using a tool such as javancss. You run javancss -function *.java, and decide there are some methods that you have to split up. You refactor that, and the method metrics look much better. Then you run javancss -object *.java, and you see that your object has too much responsibility. Then you have to refactor, changing your object model and all of your tests. That's a lot of refactoring work.

I don't think that "it works" is a good enough measure. This is a good solution for "in the small" programming, but it falls apart when the scope of the software is larger. Some thought has to be put into design up-front to determine if a given approach makes sense.

Mr. Fowler seems to assert (and I admit I'm paraphrasing liberally) that a lot of the project has to be figured out as you go along, and I agree to an extent that is the case. It's impossible to know everything beforehand for all but the most trivial of software. But the amount of up-front figuring out has to increase as the size of the project gets larger in order to lower the amount of refactoring that will be required during the course of development.

Martin Fowler

Posts: 1573
Nickname: mfowler
Registered: Nov, 2002

Re: Test-Driven Development Posted: Dec 3, 2002 10:49 PM
> I have to wonder about this. You do some testing, then
> some coding, and you get it to work. Do this a few times.
> After a while, all of your methods work, but they've
> gotten pretty large.
>
> Say you combine this methodology with constantly testing
> metrics using a tool such as javancss. You run
> javancss -function *.java, and decide there
> are some methods that you have to split up. You refactor
> that, and the method metrics look much better. Then you
> run javancss -object *.java, and you see that
> your object has too much responsibility. Then you have to
> refactor, changing your object model and all of your
> tests. That's a lot of refactoring work.

There's an important point about TDD that's missing here. When using TDD you consider refactoring after every single test, to ensure the design quality is very high at all times. I don't know of anyone who uses a metrics tool when working on it - they mostly make judgments based on looking at the code.
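
To make that rhythm concrete, here is a minimal sketch of one pass through the cycle with JUnit (the Money class and its tiny API are invented for illustration, not taken from the interview):

    import junit.framework.TestCase;

    // Step 1: write the test first and watch it fail (red).
    // Step 2: write the simplest Money.add that makes it pass (green).
    // Step 3: look at Money and the test, and refactor before writing the next test.
    public class MoneyTest extends TestCase {
        public void testAddingTwoAmounts() {
            Money five = new Money(5);
            Money seven = new Money(7);
            assertEquals(12, five.add(seven).amount());
        }
    }

    // The tiny class driven out one test at a time.
    class Money {
        private final int amount;
        Money(int amount) { this.amount = amount; }
        Money add(Money other) { return new Money(amount + other.amount); }
        int amount() { return amount; }
    }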

> Mr. Fowler seems to assert (and I admit I'm paraphrasing
> liberally) that a lot of the project has to be figured out
> as you go along, and I agree to an extent that is the
> case. It's impossible to know everything beforehand for
> all but the most trivial of software. But the amount of
> up-front figuring out has to increase as the size of the
> project gets larger in order to lower the amount of
> refactoring that will be required during the course of
> development.

To some extent that's true, although I think it's less than most people think. There is a lot of refactoring involved in TDD, but it's not wise to get hung up on the amount of refactoring. The real question is how good your design is and how quickly you get there.

Martin

Arnold deVos

Posts: 18
Nickname: arnoldd
Registered: Dec, 2002

Tests inhibit evolution? Posted: Dec 4, 2002 1:40 AM
It is surprising how quickly good design emerges when you resist the urge to generalise early and rely on refactoring instead. The main speedup for me seems to be due to avoiding writers' block by working on the immediate problem instead of guessing about the generalisations that will be needed later.

But I, for one, am guilty of working without a safety net because JUnit-style tests seem to get right into the joints of a system and inhibit flexibility.

Imagine a case where a refactoring affects just two collaborators A and B. A "mathematical" refactoring can be performed quickly and reliably. But if there are tests on A and on B and associated test cases there is a lot more work to do.

Perhaps it's all in the art of choosing your tests? Any comments, Martin?

Martin Fowler

Posts: 1573
Nickname: mfowler
Registered: Nov, 2002

Re: Tests inhibit evolution? Posted: Dec 4, 2002 7:24 AM
> Imagine a case where a refactoring affects just two
> collaborators A and B. A "mathematical" refactoring can
> be performed quickly and reliably. But if there are tests
> on A and on B and associated test cases there is a lot
> more work to do.
>
> Perhaps it's all in the art of choosing your tests? Any
> comments, Martin?

I'm too much of a coward to do much refactoring without tests because I'm too likely to break things. Automated tools can reduce this a lot, but there are too many refactorings that aren't automated yet.

I don't find that tests get in the way much. You do have to update them from time to time, but I've not found it burdensome.

Martin

Bill Venners

Posts: 2284
Nickname: bv
Registered: Jan, 2002

Re: Tests inhibit evolution? Posted: Dec 4, 2002 9:02 AM
> But I, for one, am guilty of working without a safety net
> because JUnit-style tests seem to get right into the
> joints of a system and inhibit flexibility.
>
> Imagine a case where a refactoring affects just two
> collaborators A and B. A "mathematical" refactoring can
> be performed quickly and reliably. But if there are tests
> on A and on B and associated test cases there is a lot
> more work to do.

I look at the question of whether or not to write tests as an ROI question. If I invest some time writing tests, will I get back a return that is worth the investment? That's why I asked Martin to make the business case for writing unit tests and refactoring, because such activities do cost money in programmer time. His response was that time spent writing unit tests increases the quality of the software, reduces the time developers spend debugging, and speeds future development. I believe Martin is right about the kinds of return on investment you can get from refactoring and testing, but I still ask myself in each case whether I am getting enough return to warrant the investment. Sometimes the answer is yes. Sometimes no.

For example, I recently wrote a little app that goes into the database of this forum and cleans out watches for threads that have been silent for 30 days. It now runs as a cron job once a week. I didn't write any unit tests for that app, because it was simple enough that I didn't feel it warranted the time spent stubbing out a database or creating a test database and filling it with test data. I just tested the app by hand using a local copy of the real database. In that case, I didn't feel writing unit tests would return enough on the investment, because a database was involved, which made unit tests harder to write, and because the app was so simple.

I think unit testing is an important tool for developers, one that should be used often, but not always. Unit testing makes sense when you expect a decent return on your investment of time writing the unit tests.

Randal Hanford

Posts: 1
Nickname: rh
Registered: Dec, 2002

Re: Test-Driven Development Posted: Dec 7, 2002 10:22 PM
I read your article and also read Kent Beck's new book on TDD. I had a question about whether, and how, UML fits in. I have co-workers who firmly believe that design is done using class/interaction diagrams.

Peter Pascale

Posts: 1
Nickname: artisan
Registered: Jan, 2003

Re: Test-Driven Development Posted: Jan 2, 2003 10:19 AM
Having examined test-driven development from a number of angles (personal experience, an OOPSLA tutorial, Kent Beck's new book), I am excited by its potential when combined with agile methodologies. Essentially, if a process encourages/allows incremental design, test-driven development seems a natural fit.

I happen to work in an environment that is moving towards processes that discount, even disallow, incremental design and test-driven development. We are focusing our process-improvement initiatives around CMM, training developers in PSP (Personal Software Process), inspections (requirements, design, and code), and design-by-contract. We don't have to debate the value of these (although we could); instead, my question is:

If an organization is focusing on requirements up-front, design up-front, and lots of quality/verification steps pre-implementation, how valuable will unit tests be?

It would seem that in this CMM/inspection-heavy approach, design quality and bug reduction are handled prior to unit testing. It has been suggested that unit tests would have a more limited role - write them when they provide coverage of a particularly tricky or important aspect of the implementation, or when doing so assists with some part of the implementation. It seems the value of unit testing is eroded. Does anyone out there see value in unit tests in this type of process?

Bill Kidwell

Posts: 1
Nickname: bkidwell
Registered: Mar, 2003

Re: Test-Driven Development Posted: Mar 6, 2003 9:12 PM
I believe that unit tests still play an important role in quality, even if there are a number of pre-implementation steps to verify the quality of the code. They may play less of a role in driving the design in this situation, but not every aspect of the code will be covered by the up-front design.

Design-by-contract may help limit the number of tests you write, since the contracts are a kind of built-in test themselves. There would still be value in executing the assertions to ensure that they are correct.
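
A rough sketch in Java of how the two can fit together (the Account class is invented for illustration; run the JVM with -ea so the assertions are enabled):

    // Hypothetical class with contract-style pre- and postconditions.
    class Account {
        private int balance;

        void deposit(int amount) {
            assert amount > 0 : "precondition: amount must be positive";
            int before = balance;
            balance += amount;
            assert balance == before + amount : "postcondition: balance grew by amount";
        }

        int balance() { return balance; }
    }

    // A JUnit test that drives the code through the contract,
    // so the assertions are actually exercised.
    public class AccountTest extends junit.framework.TestCase {
        public void testDepositIncreasesBalance() {
            Account account = new Account();
            account.deposit(100);
            assertEquals(100, account.balance());
        }
    }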

Inspections and tests uncover some of the same code faults, but in some cases a fault is easier to find with one method than with the other, so the two complement each other. And what better way to end an inspection than by executing test code that adds further confidence in the state of the code?

It may be a little harder to justify the time necessary to write unit tests. One approach might be to use the metrics collected on two or more similar projects, some with unit tests and some without, to determine the value of the tests. Keep in mind that some of the value of the tests lies in the maintainability of the system: they make it easy to come back to a system after some time, make changes, and know that you have not broken anything.

Patrick Lisser

Posts: 2
Nickname: pat
Registered: Mar, 2003

Re: Test-Driven Development Posted: Mar 19, 2003 4:23 AM
> Venners: What's the difference between unit and
> functional testing?
>
> Fowler: The current name in XP circles for functional
> tests is acceptance tests. Acceptance tests view the
> system more as a black box and test more end to end
> across the whole system. You would have unit tests for
> individual classes in the domain and database mapping
> layers. In most of these unit tests, you might not even
> connect the database. You would stub the database out.
> But with the functional tests, which go end to end, you
> would want everything connected.

What do you think about a distinction between more than two types of test?

In our project (Web application suite integrating several data sources) I'd see four of them (unfortunately not all of them are implemented yet):

1. unit tests as described in your conversation: testing classes and their collaborations using data source stubs and, for example, JUnit (a rough sketch follows after this list).

2. functional tests: testing use cases but still using data source stubs because of (at least) better performance, (more) reliable test data, and off-line capability (demonstrations, travelling). We use WebTest and HttpUnit. The former is so simple that the customer can write them herself. Thus, these functional tests actually serve as acceptance tests.

3. data source tests: testing the data source and its connection only. Here again we use JUnit.

4. GUI tests: testing the page layouts, distances between widgets, etc. This is done by hand (eye), but we reuse the HTML responses kept during the functional tests (WebTest) so we don't have to wait (again) for the system's response.
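
As a rough illustration of point 1 (the class and interface names here are invented, not from our project), a unit test can wire the class under test to a hand-written stub instead of the real data source:

    import junit.framework.TestCase;

    // Interface that both the real data source adapter and the stub implement.
    interface CustomerSource {
        String nameFor(int customerId);
    }

    // Hand-written stub standing in for the real data source.
    class StubCustomerSource implements CustomerSource {
        public String nameFor(int customerId) { return "Alice"; }
    }

    // Class under test: builds a greeting from data-source data.
    class GreetingService {
        private final CustomerSource source;
        GreetingService(CustomerSource source) { this.source = source; }
        String greetingFor(int customerId) {
            return "Hello, " + source.nameFor(customerId) + "!";
        }
    }

    public class GreetingServiceTest extends TestCase {
        public void testGreetingUsesCustomerName() {
            GreetingService service = new GreetingService(new StubCustomerSource());
            assertEquals("Hello, Alice!", service.greetingFor(42));
        }
    }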


Unit and functional tests are automated and must always pass 100% (another reason why we want to use data source stubs for the functional tests). A build (in the sense of continuous integration) fails if any unit or functional test fails.

The data source tests are automated as well but can fail without causing the build to fail.

The GUI tests are the worst, because we didn't find a reasonable way to automate them. As an optimization, we simply reuse the HTML responses kept during the functional tests (WebTest), as described above.


Greg Grivas

Posts: 1
Nickname: stratos
Registered: Jan, 2005

Re: Test-Driven Development Posted: Jan 31, 2005 8:41 AM
This is all well and good philosophically, and I believe it is good in practice up to a point. What I would like to hear is real examples of bug-free code generated by TDD. My view is that TDD is great, but traditional testing should also continue in parallel with it:
Stress
Perf
TTL
Func
End to End
Black Box
etc.

It is my belief that TDD will create stronger code and that the easy-to-find bugs will be eliminated, but that more testing, not less, is required. This is good for the customer and for the company. But the idea that TDD alone will give us better code is absurd.

Vincent O'Sullivan

Posts: 724
Nickname: vincent
Registered: Nov, 2002

Re: Test-Driven Development Posted: Feb 1, 2005 2:06 AM
> My view is that TDD is great, but traditional testing
> should also be continued parallel to it.

I don't think that contradicts anything in TDD. TDD concentrates on unit testing and addresses the problem that, generally speaking, unit testing during development is inadequate, if it is done at all.

It makes sense that units of code (objects or whatever) should be able to demonstrate that they pass their unit tests before they are submitted for integration testing, which is largely what the 'traditional' methods amount to.

Vince.
