This post originated from an RSS feed registered with .NET Buzz
by Steve Hebert.
Original Post: Code Coverage Analysis and non-TDD projects
Feed Title: Steve Hebert's Development Blog
Feed Description: .Steve's .Blog - Including .Net, SQL Server, .Math and everything in between
I'm looking at code coverage analysis, and everyone, regardless of methodology, says to 'shoot for 80-90% coverage'.
I'm very curious about non-TDD projects where only acceptance testing is automated. On this type of project, you come up with a test plan, implement it, run code coverage analysis during the test run, and come out with a number. What is that initial number typically?
The reason I ask: if the initial coverage average is something like 40%, then I have to spend a tremendous amount of effort during the test/debug phase just to meet my 80% minimum guideline, not to mention the effort to figure out which test patterns will return the biggest coverage gain. And how does that initial number compare to what TDD projects see?
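To make the gap concrete, here's a minimal, purely hypothetical C# sketch (the OrderProcessor class and its branches are invented for illustration, not taken from any real project): a scripted acceptance run tends to drive only the happy path, so validation and edge-case branches sit uncovered in the coverage report until someone writes targeted tests for them.

```csharp
using System;

// Hypothetical example of why acceptance-test-only coverage starts low:
// an acceptance run exercises the happy path, leaving the other branches
// unvisited and therefore uncovered.
public class OrderProcessor
{
    public decimal CalculateTotal(decimal price, int quantity, string discountCode)
    {
        // Validation branches: rarely reached by a scripted acceptance run,
        // so they show up as uncovered lines.
        if (quantity <= 0)
            throw new ArgumentOutOfRangeException("quantity");
        if (price <= 0m)
            throw new ArgumentOutOfRangeException("price");

        // Discount branch: only covered if the test plan includes a discount scenario.
        if (discountCode == "SAVE10")
            return price * quantity * 0.9m;

        // Happy path: the one line an acceptance test is almost guaranteed to hit.
        return price * quantity;
    }
}

public static class Program
{
    public static void Main()
    {
        // An acceptance-style run: a single happy-path call covers only a
        // fraction of the branches above; closing the gap to 80% means
        // writing targeted tests for each remaining branch.
        var processor = new OrderProcessor();
        Console.WriteLine(processor.CalculateTotal(19.99m, 3, null));
    }
}
```

Multiply that kind of gap across an entire codebase and it's easy to see how a coverage number in the 40% range could take a lot of test-writing effort to push up to 80%.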