The Artima Developer Community

Weblogs Forum
Java: Evolutionary Dead End

92 replies on 7 pages. Most recent reply: Jan 3, 2008 9:06 AM by Bruce Eckel

George Sakkis

Posts: 14
Nickname: gsakkis
Registered: Jun, 2007

Re: Java: Evolutionary Dead End Posted: Jan 12, 2008 8:51 PM
I also have to question the obsession with 100% forever-and-ever backwards compatibility. Nobody suggests that large, running, stable systems with codebases of thousands of KLOC have to be upgraded to the latest and greatest release just to be in fashion. And as several commenters pointed out, this often doesn't happen in practice even when backwards compatibility is maintained. The main target audience of every big non-compatible release should be new projects: projects that are still young or not even born yet.

As a Python user, I welcome the 3.0 cleanup even if I don't get to use it for quite some time after it is released. It's not like the 2.x versions will stop working the day after.

Howard Lovatt

Posts: 321
Nickname: hlovatt
Registered: Mar, 2003

Re: Java: Evolutionary Dead End Posted: Jan 13, 2008 12:56 AM
@George,

Well said

Vincent O'Sullivan

Posts: 724
Nickname: vincent
Registered: Nov, 2002

Re: Java: Evolutionary Dead End Posted: Jan 13, 2008 12:48 PM
Do you know what I want? I want to work on two projects in a row without needing to learn a new language, version, IDE, operating system, command line interface (it's the 21st century for god's sake!), tool, procedure, protocol, technique, fad or whimsy.

PS. Where's the guy who always says that Smalltalk could already do it all (and years ago too)? He should be here by now.

George Sakkis

Posts: 14
Nickname: gsakkis
Registered: Jun, 2007

Re: Java: Evolutionary Dead End Posted: Jan 13, 2008 1:26 PM
> Do you know what I want? I want to work on two projects
> in a row without needing to learn a new language, version,
> IDE, operating system, command line interface (it's the
> 21st century for god's sake!), tool, procedure, protocol,
> technique, fad or whimsy.

That's easy, just stick to Windows 3.1 and Java 1.1 and you should be fine.

Vincent O'Sullivan

Posts: 724
Nickname: vincent
Registered: Nov, 2002

Re: Java: Evolutionary Dead End Posted: Jan 13, 2008 11:15 PM
> That's easy, just stick to Windows 3.1 and Java 1.1 and
> you should be fine.

Is that your solution? I think that if I tried that, I'd be unemployed.

Fortunately, I'm still agile enough to keep up with enough of the passing ideas to keep the wolves from the door. Unfortunately, the training and learning involved is a massive drag and a permanent brake on efficiency and productivity. Brilliant for those that sell books, courses, conferences, consultancy and advertising space; not so good for those of us in the trenches.

Achilleas Margaritis

Posts: 674
Nickname: achilleas
Registered: Feb, 2005

Re: Java: Evolutionary Dead End Posted: Jan 14, 2008 3:23 AM
> >
> > That's why Java is going to be a dead end. Further
> > analysis of why this is bad is not required: those who
> > understand type theory understand the problem with
> this.
> >
>
> This looks like a religious impasse. Analysis is futile,
> and agreement is unlikely.

OK, how are you going to infer the types for this?


new HashMap().put(foo, bar);


You have to climb down the AST, reach the symbols 'foo' and 'bar', find their types, then analyze the code of the method 'put', find all associations of 'foo' and 'bar' with other symbols, propagate the types of 'foo' and 'bar' to the local variables of 'put', and then repeat the process until all members of the HashMap instance are type-inferred. That is, IF you are lucky and the function 'put' does not contain any non-terminating constructs.

Isn't it simpler to do the reverse: apply the type arguments to the HashMap and then assign a type to the variable it is assigned to? Existing compilers already do that so they can report type errors. Why should things be more complicated than they need to be?
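
A minimal sketch of that "reverse" direction, assuming nothing beyond plain Java 5 generics (the class and variable names are illustrative only): the type arguments are declared on the HashMap, and the compiler checks each put() call against them without ever looking inside the body of 'put'.


import java.util.HashMap;
import java.util.Map;

public class DeclaredTypesSketch {
    public static void main(String[] args) {
        // The type arguments are stated up front...
        Map<String, Integer> ages = new HashMap<String, Integer>();

        // ...so the compiler only has to check each call site against them.
        ages.put("Alice", 30);
        // ages.put(30, "Alice");   // rejected at compile time

        Integer a = ages.get("Alice");
        System.out.println(a);
    }
}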

Mark Thornton

Posts: 275
Nickname: mthornton
Registered: Oct, 2005

Re: Java: Evolutionary Dead End Posted: Jan 14, 2008 3:32 AM
Doesn't type inference from method parameters necessarily work the other way around (i.e. you already have the type of the 'lhs')?

>
> new HashMap().put(foo, bar);
>


I don't think this is very useful as an example as it wouldn't actually occur in any useful code. You don't need to infer anything here.
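
For what it's worth, Java 5 already does a limited form of inference in that direction for generic methods: the type arguments are inferred from the actual parameters at the call site, without looking inside the method body. A minimal sketch, where the mapOf helper is hypothetical and only there for illustration:


import java.util.HashMap;
import java.util.Map;

public class MethodInferenceSketch {
    // K and V are inferred from the arguments at each call site.
    static <K, V> Map<K, V> mapOf(K key, V value) {
        Map<K, V> m = new HashMap<K, V>();
        m.put(key, value);
        return m;
    }

    public static void main(String[] args) {
        Map<String, Integer> m = mapOf("answer", 42);  // K = String, V = Integer
        System.out.println(m);
    }
}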

Achilleas Margaritis

Posts: 674
Nickname: achilleas
Registered: Feb, 2005

Re: Java: Evolutionary Dead End Posted: Jan 14, 2008 5:16 AM
> Doesn't type inference from method parameters necessarily
> work the other way around (i.e. you already have the type
> of the 'lhs').
>
> >
> > new HashMap().put(foo, bar);
> >

>
> I don't think this is very useful as an example as it
> wouldn't actually occur in any useful code.

How about:


class ComplicatedObject<T1, T2> {
    void put(T1 key, T2 value) { /* ... */ }
}

new ComplicatedObject().put(foo, bar); // raw type: what are T1 and T2?


> You don't need
> to infer anything here.

You need to infer the type of key and value.

Steve Simmons

Posts: 5
Nickname: scs
Registered: Sep, 2007

Re: Java: Evolutionary Dead End Posted: Jan 14, 2008 8:42 AM
George wrote: "I also have to question the obsession with 100% forever-and-ever backwards compatibility. Nobody suggests that large, running, stable systems with codebases of thousands of KLOC have to be upgraded to the latest and greatest release just to be in fashion."

Just to be in fashion? No, that would be stupid. The more frequent reason is that the environment and/or compilers the system requires are no longer available. On one hand, that argues for using open source, because you can maintain the language yourself. On the other hand, the problem doesn't become critical until 7 to 10 years down the road. At that point, the source may no longer be available. Witness how many versions of things disappeared and were never recovered when the FSF repository was hacked. And when you need a 7-year-old version of perl or gcc or python, the community isn't much interested in porting it to a new UNIX flavor. The communities tend to focus on the leading edge, not the trailing edge.

"And as several commenters pointed out, this often doesn't happen in practice even when backwards compatibility is maintained. The main target audience of every big non-compatible release should be new projects: projects that are still young or not even born yet."

True. The closer you stay to 'current', the longer your then-frozen version is going to remain available and working.

P Huber

Posts: 2
Nickname: soylent
Registered: Jan, 2008

Re: Java: Evolutionary Dead End Posted: Jan 17, 2008 1:28 AM
Thanks for this clear and precise statement!
I share your thoughts, and I feel confirmed in my view that nobody ever looks at the effects the current featurism has on projects, teams and their success.
Following many discussions and blogs, I perceive that it's always just throwing code examples back and forth.
The most "enlightening" example for me was the one pitting a closures version against a template version of Doug Lea's parallel processing framework.
But given that templates in Java are, in my view, crude as implemented, the two examples ask me to choose between the plague and cholera. The closure example looks like a good choice, but the trick here is to compare the "nice" closure example to a really "evil" template example...
I would suggest that the R&D departments at Google and Sun, or anybody else, just use Java in a productive way and not destroy it... make libraries, not war... ah, I mean, not syntax changes.

Mark Thornton

Posts: 275
Nickname: mthornton
Registered: Oct, 2005

Re: Java: Evolutionary Dead End Posted: Jan 17, 2008 1:46 AM
> The most "enlightening" example for me was the one pitting
> a closures version against a template version of Doug Lea's
> parallel processing framework.
The fork-join framework is trying to support relatively fine-grained parallelism. In these cases the boilerplate required by regular Java (with or without generics) overwhelms the real content. You are welcome to suggest how this might be done with clarity but without resorting to closures.
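
To give a concrete sense of that boilerplate, here is a minimal sketch of a fine-grained parallel sum written against the fork-join API as it later shipped in java.util.concurrent (Java 7); the class name and threshold are illustrative only, not taken from the framework's own examples.


import java.util.concurrent.ForkJoinPool;
import java.util.concurrent.RecursiveTask;

public class SumTask extends RecursiveTask<Long> {
    private final long[] values;
    private final int from, to;

    SumTask(long[] values, int from, int to) {
        this.values = values;
        this.from = from;
        this.to = to;
    }

    protected Long compute() {
        if (to - from <= 1000) {                  // small enough: sum directly
            long sum = 0;
            for (int i = from; i < to; i++) sum += values[i];
            return sum;
        }
        int mid = (from + to) / 2;                // otherwise split in two
        SumTask left = new SumTask(values, from, mid);
        SumTask right = new SumTask(values, mid, to);
        left.fork();                              // run the left half asynchronously
        return right.compute() + left.join();     // compute the right half, then join
    }

    public static void main(String[] args) {
        long[] data = new long[100000];
        for (int i = 0; i < data.length; i++) data[i] = i;
        long sum = new ForkJoinPool().invoke(new SumTask(data, 0, data.length));
        System.out.println(sum);
    }
}


Even for this tiny task, the interesting logic is a handful of lines; the rest is the subclass-and-constructor ceremony that a closure-based version would remove.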

P Huber

Posts: 2
Nickname: soylent
Registered: Jan, 2008

Re: Java: Evolutionary Dead End Posted: Jan 17, 2008 2:14 AM
I have known Doug Lea's concurrent framework since before Java 5 arrived. There were no closures, there were no templates, and it worked. It worked pretty well. I mention this just to show that even without sugar, things can taste good.
The concurrent framework might not have been optimal in terms of "number of interfaces and classes", but it worked and did a great job, so "number of classes" is not a measure that counts for me in such a discussion.

To come back to your suggestion:
Why not strip off all the "dead freight" and see what a sugar-less Java 1.4 version would look like? I mean, we all were able to live with a simple java.util.List for years; then came templates and type erasure... And I could go with a simple List forever, because I try to avoid using it in the APIs I design. I would rather provide domain objects with specific types, getters, setters and so on (a sketch of that style follows below). And I would rather spend time in a team letting them learn a few more classes and interfaces than telling them which language constructs to favour in which situation (it's easier if there is no choice at all... keep it simple).
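
A minimal sketch of that pre-generics style, with hypothetical Customer/CustomerList names: the raw java.util.List stays hidden behind a small domain class with typed accessors, so callers of the API never touch an untyped collection.


import java.util.ArrayList;
import java.util.List;

class Customer {
    private final String name;
    Customer(String name) { this.name = name; }
    public String getName() { return name; }
}

class CustomerList {
    private final List customers = new ArrayList();   // raw list, Java 1.4 style

    public void add(Customer c) { customers.add(c); }

    // The cast lives in one place, inside the domain class.
    public Customer get(int i) { return (Customer) customers.get(i); }

    public int size() { return customers.size(); }
}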

Jean-Daniel Nicolet

Posts: 13
Nickname: jdnicolet
Registered: Oct, 2003

Re: Java: Evolutionary Dead End Posted: Jan 17, 2008 8:58 AM
In the real world, things are unfortunately not so easy. Let me tell you a true story that happened about a year ago in a big production environment: several hundred Java/J2EE applications running on about as many instances of Borland Application Server (BES) on a Sun/Solaris platform.

It all began with Borland announcing that in twelve months they would no longer support the BES version then in use (version 5.8-something) and that an upgrade to version 6-something should be envisioned. Needless to say, the new version would require a step forward to JDK 1.5 (1.4.2 was in use at that time). That in turn implied that the Solaris version also had to be updated, because the current one was not totally compatible with the new JDK/JVM (so much for Java hardware "independence"...).

But changing the JDK also implied adjusting the J2EE framework that had been developed over the previous years and on which the few hundred applications relied. All in all, a daunting, unsustainable amount of work and risk, because it all had to be done in parallel with new development and in a narrow timeframe (less than a year).

In the end it was decided to abandon Borland and switch to a new application server (open source this time, which allows sticking with a given version longer), because that was economically more feasible and less risky.

Backwards compatibility is a *major* issue, because this story shows that it is illusory to think you can decide for yourself when you want to upgrade. A simple vendor announcement may trigger an unexpected amount of ripple changes that you have no choice but to absorb one way or another. And sooner or later, you'll have to switch to a new version of your XDK, whatever the 'X' stands for.

Several hundred applications means many tens of millions of lines of code, and even a simple recompilation/redeployment is not an easy task, given that you have to guarantee continuity of operations, including interoperability, of course.

In such an environment, there is simply no room for a programming language that significantly breaks compatibility from one version to the next.

Mateusz Fiolka

Posts: 2
Nickname: jau
Registered: Dec, 2007

Re: Java: Evolutionary Dead End Posted: Mar 7, 2008 12:42 AM
The Scala way of declarations:

A map:

val map = Map(3 -> "fdad", 6 -> "blah")
// will become Map[Int, String]


If we want to program to interfaces:

val list: Seq[Int] = List(3, 5)
// will be List[Int], but type of variable will be Seq[Int]


Maybe I'm different, but I prefer this to the Java way. When I come back to Java I'm annoyed that I have to do it by hand (Eclipse alleviates the problem, but for complex constructs it's very annoying). The difference from JRuby and Jython is that Scala is statically typed, so the possibilities for refactoring and IDE support are comparable to Java's.

don groves

Posts: 1
Nickname: dgpdx
Registered: Mar, 2008

Re: Java: Evolutionary Dead End Posted: Mar 9, 2008 5:25 PM
Absolutely right on.

Q: What is the one thing we always learn from doing anything?
A: How to do it right the next time!

By clinging to the past, we damage the potential of the future.
--
don
