Most developers have read at least a few of the many available code improvement books, displayed them prominently on their bookshelves, and felt that they were doing a good deed by recommending them to other developers. When developers read the coding rules these books suggest, they typically nod in agreement and concede that the rules are wonderful.
...
Sooner or later, most developers run across a static analysis tool that can read code and check whether it follows a set of coding rules.
Most developers are curious enough to give such a tool a test run, hoping it will confirm how great their code is and perhaps yield a few helpful hints. However, the results are usually far from pretty. In fact, the developer often learns that his code violates many of the coding rules the tool was designed to check.
Guess what the developer does next?
First, he insists that his code cannot possibly be that bad and starts examining a few of the reported violations in hopes that the hundreds of violation messages are all a mistake. After realizing that the code does in fact violate the rules, the developer then starts attacking the rules.
Sad but very, very true. I've been guilty of this myself, and I'm sure most developers, if they are honest with themselves, would admit to being guilty of it as well.