The Artima Developer Community

News & Ideas Forum (Closed for new topic posts)
The Good, The Bad, and the DOM

2 replies on 1 page. Most recent reply: Jun 21, 2003 6:46 PM by Mike Champion


Bill Venners

Posts: 2284
Nickname: bv
Registered: Jan, 2002

The Good, The Bad, and the DOM Posted: Jun 17, 2003 6:27 PM
Elliotte Rusty Harold says: "There's a phrase, 'A camel is a horse designed by committee.' That's a slur on a camel. A camel is actually very well adapted to its environment. DOM, on the other hand, is the sort of thing that that phrase was meant to describe."

Read this Artima.com interview with Elliotte Rusty Harold, in which he discusses the problems with the DOM API, and the design lessons he learned from DOM:

http://www.artima.com/intv/dom.html

What do you think of Elliotte Rusty Harold's comments?


Erik Price

Posts: 39
Nickname: erikprice
Registered: Mar, 2003

Re: The Good, The Bad, and the DOM Posted: Jun 18, 2003 3:59 AM
I agree wholeheartedly. So far I've been lucky, in that the only XML structures I've had to work with have been relatively simple. In one project I've been able to use the simple-as-pie Digester (from http://jakarta.apache.org), and in the other it wasn't so complex that I couldn't just use SAX.
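
For structures that simple, a bare-bones SAX handler really is all it takes. A rough sketch (the "item" element name and the file argument are just placeholders):

    import javax.xml.parsers.SAXParser;
    import javax.xml.parsers.SAXParserFactory;
    import org.xml.sax.Attributes;
    import org.xml.sax.helpers.DefaultHandler;

    // Counts <item> elements as the parser streams through the document.
    public class ItemCounter extends DefaultHandler {
        private int count;

        public void startElement(String uri, String localName,
                                 String qName, Attributes attrs) {
            if ("item".equals(qName)) {
                count++;
            }
        }

        public static void main(String[] args) throws Exception {
            ItemCounter handler = new ItemCounter();
            SAXParser parser = SAXParserFactory.newInstance().newSAXParser();
            parser.parse(args[0], handler);   // file name passed on the command line
            System.out.println(handler.count + " items");
        }
    }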

Although I've not heard of XOM before, I have gone through the JWSDP and I think I would far prefer to use JDOM over DOM in an actual project. But I think it's nice that we do have a horse designed by a committee for those strange situations, none of which I can think of offhand, where a language-neutral API is helpful*. Perhaps it's nice just to have DOM to put on a shelf and say "well, we can always fall back on that, but I hope we don't have to". Maybe without DOM we'd never have the more developer-friendly APIs like JDOM.

* Taking into account DOM's use in JavaScript, where I do not know of an alternative.

Mike Champion

Posts: 2
Nickname: mchampion
Registered: Apr, 2003

Re: The Good, The Bad, and the DOM Posted: Jun 21, 2003 6:46 PM
[disclaimer: I was the principal editor of the Level 1 DOM and was on the working group from 1997 to 2002, so take any whiny defensiveness below with as many grains of salt as you wish!]

First, I have to agree with the "camel designed by a committee" gibe. DOM is an ugly beast in a lot of ways, and most of them stem from the fact that when a consensus-driven group has to make a decision between Option A and Option B, "A and B both" is usually the result. I would differ from Harold in one way: DOM is reasonably well suited to, and actually quite successful in, the environments it was designed for -- HTML/XML Web browsers and XML authoring systems. I'd argue that, like a camel, its perverseness is sufficiently constant across environments that it's a good thing to ride on a trip through uncharted terrain. On the other hand, it *does* make a terrible racehorse, and it is too prone to bite the unwary to be suitable for novices. That hasn't stopped people from using camels.

Also, it's important to understand that DOM was not really intended as a high-level API for ordinary Dynamic HTML authors or people just trying to tweak some XML. It's better thought of as an "assembly language for the XML Infoset". The "complexity" of the API comes largely from the fact that it (at least originally) tried to confine itself to the most basic operations on an XML tree and include only the most obviously universally useful "convenience methods" (getElementsByTagName() is the example I remember from our 1997 discussions). The expectation was that libraries of other "convenience methods" would emerge to make life tolerable for ordinary users.

I'm not sure why JDOM came to life as a whole different API rather than a "convenience library" on top of the DOM. I totally agree that it is silly to ask ordinary people to create a "Hello XML" DOM tree by laboriously creating and linking together the DOM nodes (Harold's "Java and XML" book has a great example of how much easier this is in JDOM than in DOM, IIRC). Is it *that* much less efficient to implement such things as a sequence of DOM calls (collected into libraries) rather than define a whole new API? I may be missing something profound here... but the obvious solution (the one I use in my own work) is to package up a set of utilities that alleviate the pain of the DOM's low-level orientation in whatever environment I'm working in. By now, I would have expected these utility libraries to be commoditized / standardized on top of DOM rather than fragmented into contending APIs. What am I missing here?
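
To make the "Hello XML" point concrete, here is roughly what the laborious DOM version looks like in Java; the commented-out JDOM lines are a sketch from memory of its 1.x API, so treat them accordingly:

    import javax.xml.parsers.DocumentBuilderFactory;
    import org.w3c.dom.Document;
    import org.w3c.dom.Element;

    public class HelloDom {
        public static void main(String[] args) throws Exception {
            // DOM: every node is manufactured by the Document and linked by hand.
            Document doc = DocumentBuilderFactory.newInstance()
                    .newDocumentBuilder()
                    .newDocument();
            Element root = doc.createElement("greeting");
            root.appendChild(doc.createTextNode("Hello XML"));
            doc.appendChild(root);

            // JDOM, roughly (from memory of the 1.x API): plain constructors instead.
            // org.jdom.Element r = new org.jdom.Element("greeting");
            // r.setText("Hello XML");
            // org.jdom.Document d = new org.jdom.Document(r);
        }
    }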

I pretty strongly disagree with the points about its language neutrality and its being defined as interfaces rather than concrete classes. I find it useful and even comforting to know that the DOM is *roughly* the same (if you stick close to the actual Recommendation) in JavaScript, Java, Python, PHP, C#, and probably several more languages. I don't know much about PHP, for example, but I can figure out how to do things with the DOM without a whole lot of trouble. I'm sure that PHP geeks are just as appalled by DOM as Java geeks are, but someone just trying to get an XML processing script running in a world where someone has decreed that PHP is the platform of choice is not likely to care.
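
As a small illustration, the Java below reads nearly word for word like the equivalent browser JavaScript or Python minidom code; the file name and element name are placeholders:

    import javax.xml.parsers.DocumentBuilderFactory;
    import org.w3c.dom.Document;
    import org.w3c.dom.NodeList;

    public class ListItems {
        public static void main(String[] args) throws Exception {
            Document doc = DocumentBuilderFactory.newInstance()
                    .newDocumentBuilder()
                    .parse("orders.xml");               // placeholder file
            // getElementsByTagName / item / getFirstChild / getNodeValue:
            // the same names you would type in JavaScript or Python.
            NodeList items = doc.getElementsByTagName("item");
            for (int i = 0; i < items.getLength(); i++) {
                System.out.println(items.item(i).getFirstChild().getNodeValue());
            }
        }
    }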

Similar point about interfaces and classes. With DOM you can write code that works with any implementation, and switch (e.g. from one JAXP implementation to another) if one is better suited to a particular application. With JDOM and XOM, your code works if you can link in your library of choice, or if the application you're working with uses that library ... and if not, you get to rewrite the code. One can (and I have) work with essentially the same DOM application-level code to integrate across three environments simultaneously (e.g., XMetal's DOM implementation, the MS DOM implementation, and Tamino's DOM interface). Maybe that's a corner case for most people, but that kind of application integration is my world!
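
In Java terms, the application only ever touches the org.w3c.dom interfaces; which concrete classes show up behind them is decided by whatever parser the JAXP factory loads, so a sketch like the one below runs unchanged against any compliant implementation:

    import javax.xml.parsers.DocumentBuilder;
    import javax.xml.parsers.DocumentBuilderFactory;
    import org.w3c.dom.Document;

    public class WhichDom {
        public static void main(String[] args) throws Exception {
            // The factory decides which DOM implementation is loaded (it can be
            // switched via the javax.xml.parsers.DocumentBuilderFactory system
            // property); this code never names a concrete class.
            DocumentBuilder builder =
                    DocumentBuilderFactory.newInstance().newDocumentBuilder();
            Document doc = builder.parse(args[0]);
            System.out.println("Root: " + doc.getDocumentElement().getTagName());
            System.out.println("Backed by: " + doc.getClass().getName());
        }
    }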

In any event, the original POINT of the DOM was to be an abstraction of the data structures and internal APIs used in the different browsers and different XML authoring tools. I suppose that could have been done with classes, but interfaces seemed like the "textbook correct" approach at the time. Perhaps that seems a bit quaint today, when DOM-like data structures and APIs are built deeply into most XML products. I don't know ... I suspect that interfaces will rescue us once again as performance becomes critical and the underlying data structures become less and less like "trees of nodes" that are expensive to construct, and more like flat text buffers or optimized "binary" data for which Nodes are created on demand. Again, I suppose this could be done with classes, but the Interface and Factory design patterns seem like the obvious approaches, and that's what the DOM ran with. It's definitely more hassle for the application programmer who does not need that level of abstraction, but it offers immensely more flexibility to the power user. Again, the obvious solution seems to be for application programmers to use convenience libraries that hide the Factory and Interface stuff behind nice classes rather than rebuild the whole API on a class foundation ... but again I may be missing something profound here.
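
A hypothetical convenience library of the kind I mean can be as small as a couple of static helpers that hide the factory and the create-then-append dance while keeping the standard interfaces underneath (the names here are illustrative, not from any real library):

    import javax.xml.parsers.DocumentBuilderFactory;
    import org.w3c.dom.Document;
    import org.w3c.dom.Element;

    // Hypothetical helpers; application code calls these and never sees a factory.
    public final class DomUtil {
        private DomUtil() {}

        public static Element newRoot(String name) throws Exception {
            Document doc = DocumentBuilderFactory.newInstance()
                    .newDocumentBuilder().newDocument();
            Element root = doc.createElement(name);
            doc.appendChild(root);
            return root;
        }

        public static Element addChild(Element parent, String name, String text) {
            Document doc = parent.getOwnerDocument();
            Element child = doc.createElement(name);
            if (text != null) {
                child.appendChild(doc.createTextNode(text));
            }
            parent.appendChild(child);
            return child;
        }
    }

Application code then reads like DomUtil.addChild(root, "greeting", "Hello XML"), and any DOM implementation can sit beneath it.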

I agree that namespaces in DOM are a bit of an abomination, but the namespace spec is a bit of an abomination IMHO -- it is (almost certainly by design) oriented completely toward XML syntax and parse-time implementation rather than the post-parse data model orientation that any reasonable read/write API requires. It is quite hard to model namespaces in a read-write environment; XOM and JDOM do a better job than DOM because they have no pre-namespace legacy to support. (XPath does a *much* better job, but it is not a read/write data model!) Should DOM just toss out non-namespace-aware processing ("Level 1")? That would make life easier for geeks, but as even the most casual reader of the xml-dev mailing list knows too well, there is a substantial amount (perhaps a majority) of real XML processing code that ignores or merely pays lip service to namespaces. The DOM working group made a conscious choice: "Do we force all those Dynamic HTML scripts to either break or become namespace aware, or do we make the Level 2 DOM a bit kludgy and keep those scripts legal?" Lots of geeks flame the resulting inelegance, but I'm not sure even in hindsight whether that was a bad decision.
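
The kludge is visible right in the API: the old non-namespace-aware calls sit beside the namespace-aware ones, and both remain legal. A small sketch (the URI and names are made up):

    import javax.xml.parsers.DocumentBuilderFactory;
    import org.w3c.dom.Document;
    import org.w3c.dom.Element;

    public class NamespaceDemo {
        public static void main(String[] args) throws Exception {
            Document doc = DocumentBuilderFactory.newInstance()
                    .newDocumentBuilder().newDocument();

            // Level 1 style: the prefix is just part of the tag name,
            // which is exactly what old Dynamic HTML scripts rely on.
            Element legacy = doc.createElement("x:thing");

            // Level 2 style: namespace URI and qualified name travel together.
            Element nsAware = doc.createElementNS("http://example.com/ns", "x:thing");

            System.out.println(legacy.getNamespaceURI());   // null
            System.out.println(nsAware.getNamespaceURI());  // http://example.com/ns
        }
    }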

All that said, the DOM is approaching its five-year anniversary as a W3C Recommendation. I wish that the W3C had some sort of "sunset law" making Recommendations subject to reconsideration / refactoring after five years. XML is more than five years old now and long overdue for a vigorous application of Occam's Razor, and the DOM needs the same treatment. Some of the really pointless stuff that Harold points out (e.g. the use of the 'short' type) could be polished out at the same time. In any event, I welcome efforts such as XOM, dom4j, etc. that attempt to shave away the consensus-driven cruft. When the time comes to refactor this stuff (by the W3C, the JCP, the unholy Microsoft-IBM alliance, or whomever), they should take XOM *extremely* seriously.
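
The 'short' business, for anyone who hasn't bumped into it, shows up in the node-type test that nearly every DOM program ends up writing:

    import org.w3c.dom.Node;

    public class NodeKind {
        // getNodeType() returns a short, compared against short constants,
        // rather than anything type-safe; this is the wart Harold means.
        static String describe(Node node) {
            if (node.getNodeType() == Node.ELEMENT_NODE) {
                return "element: " + node.getNodeName();
            } else if (node.getNodeType() == Node.TEXT_NODE) {
                return "text: " + node.getNodeValue();
            }
            return "other node type: " + node.getNodeType();
        }
    }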


