The Artima Developer Community

News & Ideas Forum (Closed for new topic posts)
Refactoring with Martin Fowler

10 replies on 1 page. Most recent reply: Aug 23, 2016 10:55 AM by Michael Goosen


Bill Venners

Posts: 2284
Nickname: bv
Registered: Jan, 2002

Refactoring with Martin Fowler Posted: Nov 3, 2002 12:31 AM
Artima.com has published an interview with Martin Fowler, chief scientist at Thoughtworks, Inc. and author of numerous books on software design and process, in which Martin discusses refactoring, testing, and design.

http://www.artima.com/intv/refactor.html

Here's an excerpt:

Refactoring is about saying, "Let's restructure this system in order to make it easier to change it." The corollary is that it's pointless to refactor a system you will never change, because you'll never get a payback. But if you will be changing the system, either to fix bugs or add features, keeping the system well factored or making it better factored will give you a payback as you make those changes.

What do you think of Martin's comments?




Posts: 20
Nickname: johnnyo
Registered: Oct, 2002

Re: Refactoring with Martin Fowler Posted: Nov 5, 2002 9:34 AM
What I find most striking about comprehensive testing is that I spend almost no time in my IDE's debugger. The tests find almost everything. I did not expect that at all, yet it is very welcome. (So long, debugger, nice knowin' ya...)

Larry Hohm

Posts: 1
Nickname: hohm
Registered: Nov, 2002

Re: Refactoring with Martin Fowler Posted: Nov 19, 2002 3:25 PM
How do you manage unit tests that are data-dependent? Unit tests for server-side components often rely on specific values in the database, and the tests often break because someone changes the data.

For example, suppose I am testing a method such as getCaseDetails(String caseNumber), and I need to test a scenario in which the case has no participants. I enter my test data in the database, and all is well. Next week, I find that my test is broken, because some other developer added some participants for the case number I was using.

We really need to manage the test data upon which the unit tests depend. Otherwise our unit tests are not reliable. But this can turn into a big job.

Do you have any thoughts on managing data-dependent unit tests?

Rick Salsa

Posts: 1
Nickname: rsal
Registered: Nov, 2002

Re: Refactoring with Martin Fowler Posted: Nov 20, 2002 7:20 AM
You should use Fowler's Service Stub pattern, also known as a Mock Object to XP'ers.

Basically, you create an interface for your service and implement it in your concrete class. For testing purposes, you also implement the interface in a service stub, where you hard-code the values, and use that stub class from your unit tests. This removes your dependency on any external service.

In your case, you'd have your caseNumber coded into the service stub.
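As a rough sketch (the names `CaseService`, `CaseServiceStub`, and the case numbers here are invented for illustration, not taken from Fowler's book), the stub might look like this:

```java
import java.util.Arrays;
import java.util.Collections;
import java.util.List;

// The service the production code depends on. The real implementation
// would query the database; the stub below hard-codes its answers.
interface CaseService {
    List<String> getParticipants(String caseNumber);
}

// Hypothetical stub used only by the unit tests: each interesting
// scenario (empty case, populated case) gets a canned response.
class CaseServiceStub implements CaseService {
    public List<String> getParticipants(String caseNumber) {
        if ("CASE-EMPTY".equals(caseNumber)) {
            return Collections.emptyList();   // the "no participants" scenario
        }
        return Arrays.asList("alice", "bob"); // an ordinary populated case
    }
}

public class CaseServiceStubDemo {
    public static void main(String[] args) {
        CaseService service = new CaseServiceStub();
        // The code under test sees exactly the data the test intends,
        // no matter what anyone has done to the shared database.
        System.out.println(service.getParticipants("CASE-EMPTY").isEmpty());
        System.out.println(service.getParticipants("CASE-42"));
    }
}
```

Because the stub's answers live in the test code, no database row can drift out from under the test.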

This is just one way of doing it.

HTH,
/rick

MikeD

Posts: 15
Nickname: mike
Registered: Apr, 2002

Re: Refactoring with Martin Fowler Posted: Nov 22, 2002 10:08 AM
You can also unit test database-centric code by having the unit test set up and control the data in the database, just as normal unit tests set up and control the data under test - the "fixture" in JUnit. Of course, you need a separate "test instance" of your database. There is good information on it here:

http://www.dallaway.com/acad/dbunit.html
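The idea can be sketched without a real database. In this hedged example, a plain in-memory class stands in for the test instance (in real code these would be JDBC calls against a dedicated test schema), and `CaseStore`, `insertCase`, and `deleteCase` are invented names:

```java
import java.util.ArrayList;
import java.util.Collections;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

// Stand-in for a test database instance, so the sketch runs on its own.
class CaseStore {
    private final Map<String, List<String>> participants = new HashMap<>();

    void insertCase(String caseNumber, List<String> people) {
        participants.put(caseNumber, new ArrayList<>(people));
    }

    void deleteCase(String caseNumber) {
        participants.remove(caseNumber);
    }

    List<String> getParticipants(String caseNumber) {
        return participants.getOrDefault(caseNumber, Collections.<String>emptyList());
    }
}

public class CaseDetailsFixtureDemo {
    public static void main(String[] args) {
        CaseStore db = new CaseStore();
        String caseNumber = "TEST-NO-PARTICIPANTS";
        // setUp: the test inserts the exact data it needs.
        db.insertCase(caseNumber, Collections.<String>emptyList());
        try {
            if (!db.getParticipants(caseNumber).isEmpty()) {
                throw new AssertionError("expected a case with no participants");
            }
            System.out.println("no-participant scenario verified");
        } finally {
            // tearDown: leave the test instance clean for the next test.
            db.deleteCase(caseNumber);
        }
    }
}
```

Because the test inserts and deletes its own rows, another developer editing shared data can no longer break it.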

Julian Harris

Posts: 1
Nickname: kiwi
Registered: Nov, 2002

Re: Refactoring with Martin Fowler Posted: Nov 28, 2002 6:52 PM
Mr Fowler says:
"Refactoring improves the design. What is the business case of good design? To me, it's that you can make changes to the software more easily in the future. "

The problem I have is I've never encountered a customer who accepts this business case. They always ask 'what changes?'. Also, doesn't this run counter to the XP idea of 'only write what you need to'?

Mike Spille

Posts: 25
Nickname: mspille
Registered: Nov, 2002

Re: Refactoring with Martin Fowler Posted: Nov 29, 2002 1:18 PM
> You should use Fowler's Service Stub pattern, also known
> as a Mock Object to XP'ers.
>
> Basically, create an interface for your service then
> implement this in your concrete class. For testing
> purposes, you will implement the interface as well in a
> service stub, which you can hard code values in the class
> and use this class from your unit tests. This takes away
> your dependency on any external service.
>
> In your case, you'd have your caseNumber coded into the
> service stub.
>

Hmmm... this is rather difficult if my interface is JDBC talking to a real database. You wanna create a mock JDBC driver?

> This just one way of doing it.
>

This sort of thing is very useful for the developer, but (almost) useless for the project. In the scenario you describe, you've validated the code, but things break just as badly when a true integration test is run (because the database data is bad).

IMHO XP'ers think too much in terms of their own code, and not enough about the total system. Unit tests, stubs, drivers and what have you are good tools for developers, but ultimately you're not delivering classes to your customer, you're delivering a full system. Configuration management, reference databases, canned test data, test servers for real time data, etc are all needed to verify that your app works "round trip" from one end to the other.

This may rankle some people, but I'll throw it out anyway. No one but the developer cares if a test driver works. No one but the developer cares if you pass unit tests. Why? Because the developer wrote them. What people care about is passing a full integration test, a functional regression test, performance tests, coverage tests. IMHO, XP should focus less on unit tests and developers working in isolation, and more on developers working directly on truly integrated code with real reference data.

Unit tests are a tiny indicator that something is right. It's a good thing, but tiny in the great scheme of things. Passing a unit test based on a faked up driver will not get you kudos if you break the integration build, or throw runtime errors when the real system tries to use it.

Going back to the original poster's comment - using drivers is good _until you have a real concrete implementation to talk to_. A driver/proxy/Mock Object is good when the other side doesn't exist yet. When the other side _does_ exist, using your made-up object gets you caught up in circular verification. Once you have a real DB interface with real reference data, it's in your best interest to switch to that to catch both code and data problems early. If you try not to and use internal drivers instead, all you're doing is delaying the pain to QA, the customer, or production.

> HTH,
> /rick

-Mike

Bill Venners

Posts: 2284
Nickname: bv
Registered: Jan, 2002

Re: Refactoring with Martin Fowler Posted: Nov 30, 2002 3:16 PM
> Mr Fowler says:
> "Refactoring improves the design. What is the business
> case of good design? To me, it's that you can make changes
> to the software more easily in the future. "
>
> The problem I have is I've never encountered a customer
> who accepts this business case. They always ask 'what
> changes?'. Also, doesn't this run counter to the XP idea
> of 'only write what you need to'?

I believe that XP's "You're not going to need it" (YAGNI) principle mostly is saying you won't need a feature, a piece of functionality, or a particular flexibility. YAGNI isn't really saying you won't need an improved, refactored design. It is saying be conservative about adding functionality unless you are absolutely positive it is actually needed.

I agree that it is hard to sell a major refactor of the sort where you set aside a few weeks just to do nothing but improve a design. I think that probably should be a hard sell in most cases. For example, I have long wanted to spend a day or two refactoring the build process that generates the pages of this web site, but I haven't done it yet. I always feel something else is higher priority, like fixing bugs or adding enhancements. But on the other hand, when I go in to fix a bug or add an enhancement, I sometimes do a refactor or two. That's when I tend to do them. I think that kind of refactoring is easy to sell to customers, because you don't actually have to ask their permission. They want a bug fixed, and you in your professional opinion decide to make an incremental improvement in the overall design while fixing the bug. The refactoring you do is part of how you fix the bug, and you have the customer's buy in on the bug fix.

Kevin Klinemeier

Posts: 7
Nickname: zipwow
Registered: Dec, 2002

Re: Refactoring with Martin Fowler Posted: Jan 23, 2003 6:10 PM
> Hmmm....this is rather difficult is my interface is JDBC
> talking to a real database. You wanna create a mock JDBC
> driver?

Don't let the name fool you; you should be doing nothing of the sort. If you have some code that should select a particular thing from the database, make a mock connection that returns exactly that thing, every time.

True, you'll be making a lot of MockConnections, probably anonymous ones, but they're all completely trivial. Additionally, they describe exactly what you expect to happen.

There's a good discussion of this on this wiki server:
http://c2.com/cgi/wiki?MockDatabase

One important example about database access is concurrency. Where I work, we use a database server in development that is expensive, and so each developer doesn't have his/her own instance.

Consider, then, that I have a test that:

1. Creates an object as its setup
2. Attempts to insert a duplicate, expecting to fail
3. Deletes the object in question as part of its teardown

A race condition exists, whereby the first dev may delete the second dev's object, or the second dev's insertion will fail. Believe me, it is not at all clear why this is happening the first time you run into it.
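One hedged workaround when per-developer instances aren't an option is to have each test run own its keys: generate a unique case number per run, so two developers' fixtures can never collide on the same row. The names here are illustrative:

```java
import java.util.UUID;

public class UniqueKeyDemo {
    // Append a per-run random suffix, so concurrent test runs insert,
    // duplicate-check, and delete entirely separate rows.
    static String uniqueCaseNumber(String base) {
        return base + "-" + UUID.randomUUID();
    }

    public static void main(String[] args) {
        String mine = uniqueCaseNumber("DUP-TEST");
        String theirs = uniqueCaseNumber("DUP-TEST");
        // Distinct keys: my teardown can no longer delete their object.
        System.out.println(mine.equals(theirs));
    }
}
```

This doesn't help tests that must assert on shared reference data, but it removes the insert/delete race for rows a test creates itself.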

> This sort of thing is very useful for the developer, but
> (almost) useless for the project. In the scenario you
> describe, you've validated the code but things break just
> as bad when a true integration test is run (because the
> database data is bad).

As I understand it, XP is about having tests everywhere. Unit test your individual code (that is, after all, the 'unit' in JUnit), then use that well-tested code in your integration test.

As you point out, unit tests don't replace integration tests. However, good unit tests make integration testing much easier.

> IMHO XP'ers think too much in terms of their own code, and
> not enough about the total system.

I'd agree with a paraphrase: "Many XP'ers think that unit testing is all the testing that needs to be done."

> This may rankle some people, but I'll throw it out anyone.
> No one but the developer cares if a test driver works.
> No one but the developer cares if you pass unit tests.

Are you seriously arguing that unit tests aren't helpful? I'd say that a lot of people care if you pass unit tests. The proof of this is the converse. Does no one care if you fail unit tests? I'm assuming here that if your unit tests fail, your integration tests don't have a chance.

> IMHO, XP should focus less on unit tests and
> developers working in isolation, and more on developers
> working directly on truly
> integrated code with real reference data.

Unit Tests != XP. People need to say that statement a lot louder. XP is all about integration, continuous integration, in fact. XP is not about developers 'in isolation'. In the process as described, code is constantly integrated, with tests written at the unit level, at the integration level, and even as customer-driven acceptance tests.

> If you try not to and use internal
> drivers instead, all you're doing is delaying the pain to
> QA, the customer, or production.

Not if you have integration tests. Then you catch the integration bugs in the integration tests. Which is where it makes sense to catch them.

-Kevin

Joel Francia

Posts: 1
Nickname: joelperu
Registered: Aug, 2003

Null treatment Posted: Aug 6, 2003 10:20 PM
Hi

I was reading your book, and on page 260 you describe null treatment. I am working with C#, and I found that we can use "NullableTypes". What's your opinion of it?

Thanks

Joel

Michael Goosen

Posts: 1
Nickname: michaelaja
Registered: Aug, 2016

Re: Refactoring with Martin Fowler Posted: Aug 23, 2016 10:55 AM
What stands out for me is the simple statement made by Martin Fowler that "What's left of the old method reads like documentation..."

The goal of good software design is to hide complexity, and in code, this one statement makes the case for refactoring identical to the outward goal, but on the inside, at the code level. Perfect.


Copyright © 1996-2019 Artima, Inc. All Rights Reserved. - Privacy Policy - Terms of Use