Having whacked the hornet's nest once, I take another swing, trying to clarify what I was trying to say originally. Is that a buzzing I hear?
When I posted my last entry, I realized that I might be stirring up a
hornet's nest. Indeed, I could hear the buzzing just before I pressed
the submit button, and almost decided against it. Such good sense
has never been my hallmark in the past, however, nor did it overcome my
hope that I wouldn't regret the posting. While I don't regret it, it has
been instructive to listen to the replies. Some have been well-thought out,
others not so much. But the replies did make me realize that I had posted
in haste in one respect: the main points I was trying to make do not
seem to have come through clearly.
So I'll try again. I'm hoping that the second time will be the
charm. Either I will make my points more clearly, or I'll realize that they
can't be made. So here goes...
Point one: Just because something is called a standard doesn't
make it open; and something that isn't a standard is not, because of that,
proprietary. The standard/non-standard and open/proprietary
dimensions are orthogonal. Certainly, there are lots of technologies that
are standards and are also open, but there is no causal connection between the
two properties. Just look at the controversy in a number of standards groups right now
as to the granting of intellectual property when contributing to a
standard. If being a standard made something open, there would be no such
controversy. In the same way, there are things that aren't standards that
are quite open...look at most of the open or community software that hasn't
been formally standardized. The source code is free for the taking; the
reuse of that code is determined by more or less restrictive licenses. There
are plenty that are very open, even though they are not standard.
The real point here is that whether something is open or proprietary
depends on things like the licensing model, the intellectual property
grants, and other sorts of legal notions around the technology. Whether or
not something is a standard depends on acceptance by some standards
group. Barring some explicit linkage between the two (which, by the way, a
number of standards groups are trying to construct), there is no inherent
connection between them. A simple point, but one that often gets lost.
Point two: A standards body is often a lousy place in which to
invent a technology. This is the essence of what I was trying to get
at by distinguishing between de facto and de jure
standards. Technology, I would claim, is best invented or designed by one
person or a small group of people, working together for a common goal that
all of them understand and agree to. While there can be a standards group
that meets this description, it is rarely the case that standards groups
do. More typically, such a group has a rather large membership, each
member of which is pushing for the good of his or her own company or
interest group. Such a group can actually be a good thing if you are trying
to determine which of a set of existing technologies will get the widest
adoption, but it tends to be bad at creating a coherent, efficient, and
elegant design from whole cloth.
This doesn't mean that such a group can't ever produce a good
technology; I don't mean to imply any logical impossibility
here. And while I would be surprised, I could be convinced that a good
technology had actually been created by such a diverse group. But I don't
personally know of any examples.
Notice, by the way, that there can be a small group of people with a
common goal who work together towards an understood and agreed-upon end that
calls itself a standards group. Many of the IETF's early working groups
were just such organizations. But once a standard is seen as commercially
important, it is much less likely that the standards group will be made up
of such technologists. Again, this is a personal observation, not a logical
necessity.
Point three: The previous posting was not a veiled (thinly or
otherwise) attack on any particular standards group or collection of
standards groups. I was not trying to pick on J2EE, or XML, or SOAP,
or W3C, or IETF, or... I actually don't watch the standards groups closely
enough these days to want to pick on any particular group. There is blame
enough to go around in this area. I do remember, about a year ago, when I
was paying enough attention to notice that a new standards effort was
starting up on average once a week, and a new standards organization
on average once a month. Many of these groups and
organizations were trying to standardize the same activities. Which leads
to my next point.
Point four: If there are multiple groups competing to write a
standard for the same thing, it is probably a safe bet that the technology
being standardized isn't ready for standardization. This is the
point I was really trying to make, but didn't state explicitly. But this is
the one that I think is important for all of us who are trying to produce
and use technology to understand.
It is unfortunate that many technology businesses have decided that it
is a good competitive strategy to declare their own solution to some
problem a standard, and label other approaches to that problem
proprietary. Often this is done by creating a standards group which will
give that company (or some small set of companies) enough cover that they
can claim that the standard is in fact independent. Recently, this has gone
even a step further, as companies start up standards groups
simply to create a standard technology.
No one company has a monopoly on this sort of thing; indeed, this is
another area where there is plenty of blame to go around. But it doesn't
matter what company or group of companies is doing it; it is a bad thing
for innovation. The whole point of this exercise is to try to pre-empt
any innovation in an area, and instead negotiate some solution that will
advantage the founding companies in an attempt to force the industry into
adopting that solution. The adoption is driven not by the technical
excellence of the approach, but because it is a standard. This
is the sort of thing that I find objectionable, and the sort of thing
that I believe hurts the industry.
Once again (re-read point three), I'm not trying to single out any
particular standardization effort. If you are involved in some standards
group and think I'm talking about you, then perhaps I am, but not about you
only. There are lots of examples, and when I'm writing this I really don't
have any particular example that I am talking about.
What I'm really talking about, I'm afraid, is an industry that has lost
the courage to actually innovate in the belief that a better solution will
win in the marketplace. Instead we now have an industry that seems driven by
the hope that the marketing noise made around establishing a fake standard
will make our customers forget that there is little of value that we are
providing. Such an industry is ripe for takeover by some company (probably
a small one) that introduces something that is really
innovative. We can all think of examples of where this happened in the
past. I don't think things have changed so much that it won't happen again
in the future.
In the SIP world you wouldn't believe the number of vendors that equate 'standards-based' with 'open'. Many times I hear them say their platform is standards-based (SIP and VoiceXML). When you ask them for the ability to change the VoiceXML, all of a sudden they say 'Sorry, we coded it in Java and we don't give out the source.' So much for standards!
The problems you talk about have been around since the '80s. I remember the meetings that came up with X.400 and X.500. Now there was a nightmare! Marketing types all over the place. I immediately went to the IETF, working on SNMP, WinSock, etc. It was nice - technologists that built stuff working on standards. Today the same problems that hit the X.400/X.500 committees are in the IETF and W3C!
It is unreasonable to expect a standards body to create a technology entirely on its own - but then, that is almost never what happens.
Usually, a standards body is created to standardize an existing or recently-created technology. (The W3C is a good example of this, as is the ISO committee governing the C language.) I seriously doubt the JPEG group set out to create the JPEG standard -- they probably already had some preliminary form of the technology developed, and then standardized it.
When standards bodies set out to create entirely new technologies, usually these bodies are comprised of people knowledgeable about the particular task and may be able to "innovate" a little better than your average John Q. Programmer. That isn't to say that John couldn't innovate, but chances are, the experts are usually more qualified to do so. Also, under my previous logic, if he did innovate, he could always have it developed into a standard with his own group later on.
I don't see anything wrong with standards, and I see a lot right with them.
Command driven technology trying to compete with market driven technology, without having a slush fund to distort that process (Microsoft), is in for a tough ride.
Command (or Control) based systems derive from the top and trickle to the bottom. Determine everything that should or will be, plan for it, and then execute the plan. Except there are a million-and-one variables that are unseen and unpredictable that can upset that cart.
Market driven systems start from the bottom and percolate up. The process appears piecemeal and chaotic as it adapts to demands. But the result is something that has adapted to challenges.
The building of the Internet was the nursery for IP, TCP, SMTP, FTP, and other standards. The rise of personal computers was the birthplace of CP/M, DOS, Unix, Linux, the floppy, and so on.
I think standard practices emerge because they work best and allow their adopters to survive against competitors. When you go counter to that and develop a standard first, the resulting standard starts out with a handicap that will be difficult to overcome when it hits the real world.
Yet exceptions do exist. MPEG-2 and MPEG-4 are good examples. But they seem to be exceptions rather than the norm.
In 1999 I thought it would be a good idea to develop an open way of moving business documents around. I'm talking about things like basic orders and invoices. Ideally there would be a simple way to transmit and verify business documents, and they would have a standard format such that most accounting systems could understand them.
The ebXML project was designed to solve exactly this problem. The only thing is that they have created something that meets only the complex needs of major companies. The complexity of the solution makes it virtually unusable, and mind numbingly difficult to implement for a small software company.
My solution was to simply define an XML schema, introduce inheritable schemas to take care of extending the documents for specific purposes, and develop a transport that was easy to use, secure, and automated the authentication process. However, my solution isn't a standard, even though the source was open, and what killed the project was a political difficulty: getting people to consider using it.
The point is that it's four years later - and we still have no easy way to transfer orders and invoices - something so very basic.
The problem is that developers wait for this standard, and then when it arrives it blocks more simple solutions which are not 'standard'. I would love to support ebXML - but the complexity of the thing makes this impossible, since we don't have months or years to develop the implementation.
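The lightweight approach this commenter describes could be sketched roughly as follows. Note that the element names and the `order_total` helper here are invented purely for illustration; they are not part of ebXML or any real standard:

```python
import xml.etree.ElementTree as ET

# A minimal, hypothetical order document -- these element and attribute
# names are made up for this sketch, not taken from any standard.
ORDER_XML = """<order id="PO-1001">
  <buyer>Acme Corp</buyer>
  <line sku="WIDGET-7" qty="3" unit-price="9.50"/>
  <line sku="GADGET-2" qty="1" unit-price="24.00"/>
</order>"""

def order_total(xml_text: str) -> float:
    """Parse an order document and compute its total value."""
    root = ET.fromstring(xml_text)
    return sum(int(line.get("qty")) * float(line.get("unit-price"))
               for line in root.findall("line"))

print(order_total(ORDER_XML))  # prints 52.5
```

The point of the sketch is how little machinery a small software company would need to consume such a document: any XML parser and a few lines of code, as opposed to a months-long ebXML implementation effort.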
> Just because something is called a standard doesn't make it open; and something that isn't a standard is not, because of that, proprietary.
People are more likely to use software that is cheap. It doesn't get cheaper than free. If more people use software, it is more likely to become a standard. I see a link.
> A standards body is often a lousy place in which to invent a technology.
I can't grumble about that - it's true. The way to develop a standard that sticks is to lead and innovate and back-fill it with a standard after it's done. This is the principle of the 'unconscious standards body', i.e. technologists who make a standard but don't know they are making a standard while they are doing it!
The issue doesn't seem to be that standardization is bad/crippling/whatever, but that creating a standard to solve a problem is the wrong approach. It is important, however, to standardize once the problem has been solved. It was great that C was created at AT&T, but it was crucial that it was standardized later so that code can be reliably compiled by any vendor's compiler (ignoring semi-deliberate ambiguities in the standard, such as the size of primitive data types). Similarly, it's great that word processing has gotten to the point that everyone from 6-year-olds to CEOs to grandmothers can handle it, but it's about time the file format got standardized (and opened) so that a document created in (for example) MS Word can be reliably manipulated by any vendor's word processor.
The C++ standard has become huge and hard to implement correctly. It has taken years for any compiler to be able to dependably compile even most compliant code. Now that a few compilers do, including gcc/g++, it is a huge benefit to the developer community. Without that standard, developers would be forced to write to a minimal subset of C++ (see http://www.mozilla.org/hacking/portable-cpp.html). Without that standard, Java "Tiger" wouldn't have anything to aspire to.
Perhaps the best solution is to require, somehow, that anything deployed in the market for over a year be standardized (and opened) to allow for interoperability. After all, the end of the spectrum opposite over-standardization is vendor lock-in, and that's no good either (except for the monopolistic vendor, e.g. old-school IBM or modern-day Microsoft).
Well, your re-explanation apparently has not worked, given the comments. I think people are still missing the point. The point, IMHO, is that if you look at the good standards we can all point to, like the backbone technologies of the Internet, they did not come from standards committees. They became standards only after they solved the problem. When we try to get standards groups to invent solutions to new problems, the results are not likely to be good.
And I think the main point is that managers' tendency to go only with standards, even when there is really nothing in the area, is crippling technology.
BTW - standardizing on message formats is NOT what we are talking about. Of course you should create standard message layouts, XML or otherwise. That is not technology; that is a contract between vendors. These also will not be as good as if two companies agreed on what they specifically need, but they will have the advantage of driving the industry forward. That is not the same as inventing the Persistence service in CORBA before there is even a reference implementation of it.
> And while I would be surprised, I could be convinced that a good technology had actually been created by such a diverse group. But I don't personally know of any examples.
I would proffer MPEG-2 as an example. It was created by precisely the sort of group you describe, i.e. a diverse set of companies each with their own agendas. Even so, the approaches were selected on technical merit, and the technology has been about as widely deployed as any you could name, including the perhaps unanticipated offshoot of MP3.
With respect to point 2 ('A standards body is often a lousy place in which to invent a technology') - it is interesting to note that a fair number of such bodies, such as the IETF, strictly disallow such development - and have instructed their working group chairs to declare such things out of scope or generally to be kicked out.