
Agile Buzz Forum
Too many bad ideas to count

James Robertson

Posts: 29924
Nickname: jarober61
Registered: Jun, 2003

Too many bad ideas to count Posted: Jun 10, 2005 2:17 PM

This post originated from an RSS feed registered with Agile Buzz by James Robertson.
Original Post: Too many bad ideas to count
Feed Title: Cincom Smalltalk Blog - Smalltalk with Rants
Feed URL: http://www.cincomsmalltalk.com/rssBlog/rssBlogView.xml
Feed Description: James Robertson comments on Cincom Smalltalk, the Smalltalk development community, and IT trends and issues in general.

Richard Mansfield has been scarred by the "C" language family. In this article, he tries to explain why OOP is bad. What he mainly explains is something very simple - he's only been exposed to OOP done wrong, and has drawn bad conclusions from that exposure:

To the extent that OOP is involved in components such as text boxes (not much, really), it's very successful. GUI components are great time-savers, and they work well. But don't confuse them with OOP itself. Few people attempt to modify the methods of components. You may change a text box's font size, but you don't change how a text box changes its font size.

What he's doing here is arguing against a concept that predates OO - encapsulation: you program against an interface, and the mechanism behind that interface stays hidden. To see what that means for his own text-box example, here's a minimal sketch (a hypothetical TextBox class, not any real toolkit's API):
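public class TextBox {
    private int fontSize = 12;

    // The public interface: what callers are allowed to do.
    public void setFontSize(int points) {
        if (points <= 0) {
            throw new IllegalArgumentException("font size must be positive");
        }
        fontSize = points;
        repaint(); // an internal detail callers never see or touch
    }

    public int getFontSize() {
        return fontSize;
    }

    // *How* a size change takes effect - hidden, and changeable without
    // breaking a single caller. That hiding is encapsulation.
    private void repaint() {
        // redraw logic would live here
    }
}

I was confused by this trashing of basic programming concepts until I read the short bio at the end of the piece: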

Richard Mansfield has written 32 computer books since 1982, including bestsellers 'Machine Language for Beginners' (COMPUTE! Books) and 'The Second Book of Machine Language' (COMPUTE! Books). From 1981 through 1987, he was editor of COMPUTE! Magazine and from 1987 to 1991 he was editorial director and partner at Signal Research.

It looks to me like Richard started out in assembly, and his exposure to OOP has come through the marvelous examples provided by C++, Java, and C#. In those languages you get OO as the supposed main event, but it's surrounded with stupidity like primitive data types, "final" class definitions, and so on. Here's a quick sketch of those two warts as they appear in Java (hypothetical class names, purely illustrative):
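final class SealedWidget {}   // "final": nobody may ever subclass this
// class MyWidget extends SealedWidget {}  // would be a compile error

public class Warts {
    public static void main(String[] args) {
        int n = 42;           // a primitive: not an object, has no methods
        Integer boxed = n;    // must be (auto)boxed before it can act like one
        System.out.println(boxed.toString());
        // In Smalltalk there is no such seam - 42 is simply an object.
    }
}

It's no wonder he's come away with so many bad ideas: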

For a while it was a success, but things took a turn. In those early days, computer memory was scarce and processors were slow. Processor-intensive programs such as games and CAD had to be written in low-level languages just to compete in the marketplace. To conserve memory and increase execution speed, such programs were written in assembly language and then C, which conformed to the computer's inner structure rather than to the programmer's natural language. For example, people think of addition as 2 + 2, but a computer stack might work faster if its programming looks like this: 2 2 +. Programmers describe it as little Ashley's first birthday party: the computer starts counting from zero, so to the machine it's her zeroth birthday party.
When fast execution and memory conservation were more essential than clarity, zero-based indices, reverse-polish notation, and all kinds of bizarre punctuation and diction rose up into programming languages from the deep structure of the computer hardware itself. Some people don't care about the man-centuries of unnecessary debugging these inefficiencies have caused. I do. Efficiency is the goal of OOP, but the result is too often the opposite.
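As an aside, the "2 2 +" business is postfix (reverse-Polish) notation, and it really is trivial for a machine to evaluate with a stack. Here's a minimal sketch (illustrative only; just two commutative operators, so pop order doesn't matter):

import java.util.ArrayDeque;
import java.util.Deque;

// Evaluate a postfix expression: operands are pushed, each operator pops
// its arguments. No parentheses or precedence rules are needed - which is
// exactly why the form is cheap for the machine, if not for the reader.
public class Postfix {
    static int eval(String expr) {
        Deque<Integer> stack = new ArrayDeque<>();
        for (String token : expr.trim().split("\\s+")) {
            switch (token) {
                case "+" -> stack.push(stack.pop() + stack.pop());
                case "*" -> stack.push(stack.pop() * stack.pop());
                default  -> stack.push(Integer.parseInt(token));
            }
        }
        return stack.pop();
    }

    public static void main(String[] args) {
        System.out.println(eval("2 2 +"));     // 4
        System.out.println(eval("2 3 4 * +")); // 2 + (3 * 4) = 14
    }
}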

He then goes on to argue that data and functionality need to be separate - that this somehow increases the odds that your code will be flexible:

I find that leaving the data in a database and the data processing in the application simplifies my programming. Leaving data separate from processing certainly makes program maintenance easier, particularly when the overall structure of the data changes, as is so often the case in businesses (the most successful of which continually adapt to changing conditions). OOP asks you to build a hierarchical structure and thereafter try to adapt protean reality to that structure.
Encapsulation, too, is a noble goal in theory: you've reached the Platonic ideal for a particular programming job, so you seal it off from any further modification. And to be honest, constructing a class often is fun. It's like building a factory that will endlessly turn out robots that efficiently do their jobs, if you get all the details right. You get to play mad scientist, which can be challenging and stimulating. The catch is that in the real world programming jobs rarely are perfect, nor class details flawless.

This is where I think he runs off the rails. Inheritance is one aspect of OOP, but it's not the be-all and end-all. In fact, most of us recognize that deep inheritance trees lead to obfuscation more than they lead to elegance.

But never mind that - what I want to know is this: if I have a set of functions over here and the database over there - as opposed to a set of objects over here and the database over there - how is the former easier to update than the latter? If the shape of the data changes, guess what? The functions (or methods) need to adapt either way. Richard seems to think that a rigid separation somehow makes this easier - either he's smoking something, or he's never worked on a non-trivial system. I spent years working in C, and I also spent a fair bit of time in Basic (and other similar languages). Believe me, Smalltalk makes it far, far easier to deal with code migration issues than anything else I've ever worked with.
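To make that concrete, here's a sketch with a hypothetical Customer type (illustrative, from neither article). Whether the processing is a free-standing procedure or a method on the data, the same change to the data's shape forces the same edit:

// "Separated" style: data over here, processing over there.
record Customer(String name, String street, String city) {}

class BillingProcedures {
    // If Customer gains a country field, or name splits into first/last,
    // this procedure must change anyway - the separation bought nothing.
    static String mailingLabel(Customer c) {
        return c.name() + "\n" + c.street() + "\n" + c.city();
    }
}

// OO style: the same data with the behavior attached. The same schema
// change forces an equivalent edit - no harder, and it lives right next
// to the data it depends on.
record CustomerWithBehavior(String name, String street, String city) {
    String mailingLabel() {
        return name() + "\n" + street() + "\n" + city();
    }
}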

Mansfield has discovered that the Emperor has no clothes, and he thinks the Emperor is OOP. What he hasn't figured out - most likely through lack of exposure - is that C++, Java, and C# do not define OOP. In fact, they are all fairly ugly hacks pretending to be OO while preserving the nasty familiarity of C-style syntax. If he wants to blame someone, he can look at the supposed giants - Stroustrup, Gosling, and Hejlsberg - who have managed to inflict a few megatons' worth of damage on the software development field during their careers.

Read: Too many bad ideas to count
