The Artima Developer Community

Weblogs Forum
What Are Your Java Pain Points, Really?

264 replies on 18 pages. Most recent reply: Jan 17, 2008 10:07 AM by GJ sonke

Gregg Wonderly

Posts: 317
Nickname: greggwon
Registered: Apr, 2003

Re: Lack of extension Posted: Feb 26, 2007 10:53 PM
> > This is precisely what the applet paradigm should have started out as, and what Java Web Start should have started with. Instead, we have the concept of small applications (applets) and seldom-changing codebases of large, focused applications (Java Web Start's download optimization) as the platform-supported paradigms for distributed application deployment.
>
> What a lot of people do not understand is that Java was produced in a hurry. Sun saw it as a strategic tool in their battle with Microsoft. The last thing the Oak designers had on their minds was the internet; they were more concerned with toasters :^).

Oak was targeted at the set-top box marketplace. The people involved in that project were moved onto the Java project, which was targeted at the internet.

The retargeting was done in a flurry, and the initial APIs visible in the HotJava browser were rough and quite ugly.

Today, we still pay for the decisions made at that time. Remember the original AWT eventing model and how terrible it was to run a complex GUI with that model?
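For readers who never used it: the pain came from AWT 1.0 funneling every event through one `handleEvent`/`action` method that you had to override and switch on, with unhandled events bubbling up the container tree, while the 1.1 delegation model let each component notify registered listeners. The sketch below is schematic (invented classes, not the real `java.awt` API), but it shows the shape of the two styles:

```java
import java.util.ArrayList;
import java.util.List;

class Event {
    final Object target; final String kind;
    Event(Object target, String kind) { this.target = target; this.kind = kind; }
}

// --- Java 1.0 style: subclass the container and switch on every event ---
class OldStylePanel {
    final StringBuilder log = new StringBuilder();
    Object okButton = "ok", cancelButton = "cancel";
    boolean handleEvent(Event e) {          // this method grows with every widget added
        if (e.target == okButton)     { log.append("save;");  return true; }
        if (e.target == cancelButton) { log.append("close;"); return true; }
        return false;                       // unhandled: bubbles up to the parent container
    }
}

// --- Java 1.1 delegation style: each component notifies its own listeners ---
interface ActionListener { void actionPerformed(Event e); }

class Button {
    private final List<ActionListener> listeners = new ArrayList<>();
    void addActionListener(ActionListener l) { listeners.add(l); }
    void click() {
        for (ActionListener l : listeners) l.actionPerformed(new Event(this, "action"));
    }
}
```

In a complex GUI, the 1.0 style meant one giant dispatch method per container and a subclass for nearly everything, which is exactly the pain being remembered here.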

Remember what it was like before there was a JIT? Remember the Symantec JIT and how fast, yet buggy it was?

Things may not be perfect, but they really are much better than they were.

The past five years of JCP focus on everything J2EE have been a mistake that the language and J2SE may yet recover from, if the right steps are taken now.

I'm intrigued by the generic collection mechanisms in LINQ. That definitely provides a nice dynamic typing mechanism where there is real value. The IDE support for it is a powerful mechanism that can make the code independent of the data source.

Jeff Ratcliff

Posts: 242
Nickname: jr1
Registered: Feb, 2006

Re: Lack of extension Posted: Feb 26, 2007 11:09 PM
>
> I'm talking about 64KB-128KB for big applications in that day and age on small devices. Micro-controller applications were largely written by hardware "dudes" in that day and age, and they weren't really software engineers familiar with virtualization techniques of the day. Most of the time, processors capable of going beyond 64KB would only have 32KB or 48KB because of the perceived expense of memory. The demand is what kept the price of memory up. Only when the PC market 1

I'm not sure how developed the virtualization techniques were in those days, but I doubt that anyone at the time could demonstrate an effective implementation that allowed a real-time application to be cost, memory and speed competitive with a typical embedded system of that era. If you have an example from the historical record that indicates otherwise, I'd be happy to hear about it.


>
> > Many of the embedded systems of that era didn't even
> > include OS's, so adding a JVM would increase resource
> > requirements significantly.
>
> If you have to "add" a JVM in a micro environment, you
> haven't designed your micro environment to capitalize on
> Java. Instead you've decided that Java can't add any real
> value and so it should just be an add-on that you can
> pitch when it fails to meet the needs of the users.
>

I think it's a hard case to make that one would include (if you prefer that word to "add") Java as a way of reducing the memory footprint or making the micro system run faster. I suspect that it is used for the same reason that most high-level languages are used - to simplify and accelerate software development.


> This is precisely what has happened in the cell phone industry. J2ME CLDC with MIDP is terrible to use on mobile phones because you can't make it interact with the user without disabling everything else the user wants to use the phone for. The manufacturers don't think Java is a platform. They think that Java is an add-on for games and gimmicks.
>
Keep in mind that Java can't really be used for the core activity of a cell phone: signal processing. They have to use DSPs for that. So for cell phones, it's always going to be an add-on to some degree.

> Java is not an add-on technology! Java is a platform technology. If you are not getting everything that you need out of Java, I'd wager that you have not put it close enough to the problem space to allow it to work effectively.
>
> RTSJ demonstrates that Java can provide real-time programming. Java smart card (every resident of Korea and Brazil carries a Java smart card) demonstrates that Java can be a platform on a very limited device.
>

I think a lot of this is Java "branding". One can always reduce the functionality of any language or platform and then say "It can run on a smart card". One can always make changes to the standard implementation to allow semi-real-time capability. The point is that some compromise will be involved that makes the "new" implementation differ from the norm.

There are two criteria you can use to judge these different implementations: 1) Will every standard program run on the new implementation? 2) Will there be enough in common with the standard implementation that programmers can leverage their knowledge?

Typically, the answer to 1 is no and the answer to 2 is yes.

Paul Beckford

Posts: 51
Nickname: phaedrus
Registered: Feb, 2007

Re: Lack of extension Posted: Feb 27, 2007 1:20 AM
> > > This is precisely what the applet paradigm should have started out as, and what Java Web Start should have started with. Instead, we have the concept of small applications (applets) and seldom-changing codebases of large, focused applications (Java Web Start's download optimization) as the platform-supported paradigms for distributed application deployment.
> >
> > What a lot of people do not understand is that Java was produced in a hurry. Sun saw it as a strategic tool in their battle with Microsoft. The last thing the Oak designers had on their minds was the internet; they were more concerned with toasters :^).
>
> Oak was targeted at the set-top box marketplace. The people involved in that project were moved onto the Java project, which was targeted at the internet.
>
> The retargeting was done in a flurry, and the initial APIs visible in the HotJava browser were rough and quite ugly.
>
> Today, we still pay for the decisions made at that time. Remember the original AWT eventing model and how terrible it was to run a complex GUI with that model?
>
> Remember what it was like before there was a JIT? Remember the Symantec JIT and how fast, yet buggy, it was?
>
> Things may not be perfect, but they really are much better than they were.
>
> The past five years of JCP focus on everything J2EE have been a mistake that the language and J2SE may yet recover from, if the right steps are taken now.
>
> I'm intrigued by the generic collection mechanisms in LINQ. That definitely provides a nice dynamic typing mechanism where there is real value. The IDE support for it is a powerful mechanism that can make the code independent of the data source.

The thing that keeps coming to my mind is: what is the purpose of Java today? It isn't well suited to its original stated purpose, namely applets and an internet language. It isn't that well suited to its designed purpose either: as a low-level embedded language, C/C++ is a much better choice.

So where next for Java? EAI is a strong area, as there are a lot of integration libraries available between Java and legacy systems, but as Ruby and Python catch up in this area too, why use Java?

I can see Java just becoming another legacy language, much like Visual Basic. I think the lesson to be learned is that short-term opportunism and good marketing aren't a basis on which to build a long-term future for a programming language.

Sun knew all the technical issues at the outset IMO, but marketing won out over Software Engineering.

In the final analysis, I think the only people we can blame are ourselves. Hopefully we will take more responsibility for our own destinies moving forward; otherwise we may just find ourselves in the same situation 10 years down the line.

Lisp was created in 1956 and is still far more advanced than Java and C# today. Along the way it has been able to absorb new paradigms such as OOP in a high-fidelity way. I'm not suggesting that we all program in Lisp, but the Lisp-inspired languages like Smalltalk, Ruby, etc. do seem to point the way forward. Static typing is the last argument for languages like Java and C#, but it can be added to Smalltalk/Ruby/Python etc. without affecting their dynamic runtime nature.

Maybe a dynamic language with type annotations is what is needed to finally get the C/C++ programmers who moved to Java/C# to understand what the Smalltalk community were going on about all along.

Paul.

Gregg Wonderly

Posts: 317
Nickname: greggwon
Registered: Apr, 2003

Re: Lack of extension Posted: Feb 28, 2007 6:43 PM
> I'm not sure how developed the virtualization techniques were in those days, but I doubt that anyone at the time could demonstrate an effective implementation that allowed a real-time application to be cost, memory and speed competitive with a typical embedded system of that era. If you have an example from the historical record that indicates otherwise, I'd be happy to hear about it.

My contention is that it wasn't as developed as our technology and knowledge allowed, because only "hardware dudes" were writing software for small devices. They didn't understand what was needed.

There were countless examples of virtualization in the form of P-code and the like. There were small stack-based machines used in the likes of calculators, etc.
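The stack-machine idea fits in very little code, which is why it suited such constrained devices. The sketch below is hypothetical (the opcodes and encoding are invented, not any real P-code dialect), but it shows how small such an interpreter can be:

```java
// A minimal bytecode interpreter in the P-code spirit: a few opcodes,
// a fixed-size operand stack, and a dispatch loop. Invented encoding.
class TinyVm {
    static final byte PUSH = 0;   // next byte is a literal to push
    static final byte ADD  = 1;   // pop two, push sum
    static final byte MUL  = 2;   // pop two, push product
    static final byte HALT = 3;   // stop; result is top of stack

    static int run(byte[] code) {
        int[] stack = new int[16];
        int sp = 0, pc = 0;
        while (true) {
            switch (code[pc++]) {
                case PUSH: stack[sp++] = code[pc++]; break;
                case ADD:  stack[sp - 2] += stack[sp - 1]; sp--; break;
                case MUL:  stack[sp - 2] *= stack[sp - 1]; sp--; break;
                case HALT: return stack[sp - 1];
            }
        }
    }
}
```

The whole engine is a dispatch loop over a byte array, which is the sense in which virtualization was already well within reach of calculator-class hardware.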

The J2SE JVM (then) was "larger" than these because of the native interfaces back to file, socket, and system interfaces. The GC implementation was the complicating factor. At the same time that Java appeared, I was finishing a similar language that is/was used inside the 5ESS switch for procedure automation. It used a simple counting collector and did not have a threading model, which simplifies many things.
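A counting collector of the kind described can be sketched in a few lines. This is an illustrative reconstruction, not the actual 5ESS language implementation: each object carries a reference count, and with no threading model the increments and decrements need no synchronization.

```java
// Reference-counting sketch: the object is reclaimed the instant its
// count drops to zero, with no separate collection pause.
class Counted {
    private int refs = 0;
    private boolean freed = false;

    void retain() { refs++; }           // a new reference was taken

    void release() {                    // a reference was dropped
        if (--refs == 0) {
            freed = true;
            onFree();
        }
    }

    boolean isFreed() { return freed; }

    protected void onFree() { /* return storage to the allocator */ }
}
```

The trade-off, compared to the tracing GC in the J2SE JVM of the time, is that counting cannot reclaim reference cycles, but it is tiny and has predictable latency, which matters in a switch.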

> > > Many of the embedded systems of that era didn't even include OS's, so adding a JVM would increase resource requirements significantly.
> >
> > If you have to "add" a JVM in a micro environment, you haven't designed your micro environment to capitalize on Java. Instead you've decided that Java can't add any real value and so it should just be an add-on that you can pitch when it fails to meet the needs of the users.
>
> I think it's a hard case to make that one would include (if you prefer that word to "add") Java as a way of reducing the memory footprint or making the micro system run faster. I suspect that it is used for the same reason that most high-level languages are used - to simplify and accelerate software development.

I did not say that it would "reduce memory footprint" or "run faster". That would not be the point of using Java in a micro environment. The point of using Java in that environment is to capitalize on language features that would allow you to do testing outside of the micro environment, as well as remove/reduce memory management issues. By using Java, you could also design software that ran on multiple different hardware types by designing it to take advantage of the Java platform, instead of calling into native code that was non-portable.
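The portability-and-testing argument can be made concrete with the usual hardware-abstraction pattern. All the names below are invented for illustration: the application logic depends only on a small Java interface, so it can be exercised on a desktop JVM with a fake implementation, and only the implementation class differs per device.

```java
import java.util.ArrayDeque;

// The hardware boundary: per-device implementations live behind this.
interface Uart {
    void write(byte b);
    int read();                      // -1 when no byte is pending
}

// Host-side test double: queues stand in for the real transmit/receive lines.
class FakeUart implements Uart {
    final ArrayDeque<Byte> tx = new ArrayDeque<>();
    final ArrayDeque<Byte> rx = new ArrayDeque<>();
    public void write(byte b) { tx.add(b); }
    public int read() { return rx.isEmpty() ? -1 : rx.poll() & 0xFF; }
}

// Portable application logic: testable on any JVM, no native code in sight.
class Echoer {
    private final Uart uart;
    Echoer(Uart uart) { this.uart = uart; }
    void poll() {
        int b = uart.read();
        if (b >= 0) uart.write((byte) b);   // echo each pending byte back
    }
}
```

On the target device, `Uart` would be implemented over the real hardware registers; the `Echoer` logic and its tests do not change.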

> > This is precisely what has happened in the cell phone industry. J2ME CLDC with MIDP is terrible to use on mobile phones because you can't make it interact with the user without disabling everything else the user wants to use the phone for. The manufacturers don't think Java is a platform. They think that Java is an add-on for games and gimmicks.
>
> Keep in mind that Java can't really be used for the core activity of a cell phone: signal processing. They have to use DSPs for that. So for cell phones, it's always going to be an add-on to some degree.

DSP software can be written in Java and targeted to a DSP. I've written some DSP algorithms in Java. It is possible, even though Java's strictly specified math can be a problem.
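As an example of the kind of DSP algorithm that ports cleanly, here is a minimal FIR filter. This is a generic textbook kernel, not code from the post's author; the point is that it is plain arithmetic over arrays, with nothing device-specific in it, and Java's fully specified arithmetic means it computes the same results on any JVM.

```java
// Fixed-coefficient FIR filter with a circular delay line.
class Fir {
    private final double[] coeffs;
    private final double[] delay;   // last N input samples
    private int pos = 0;

    Fir(double[] coeffs) {
        this.coeffs = coeffs.clone();
        this.delay = new double[coeffs.length];
    }

    // Feed one input sample, get one filtered output sample.
    double step(double sample) {
        delay[pos] = sample;
        double acc = 0.0;
        int j = pos;
        for (double c : coeffs) {           // dot product of taps and history
            acc += c * delay[j];
            j = (j == 0) ? delay.length - 1 : j - 1;
        }
        pos = (pos + 1) % delay.length;
        return acc;
    }
}
```

With coefficients {0.5, 0.5} this is a two-point moving average; the same loop, compiled for a DSP target, is the sort of thing the post is describing.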

The DSP is not where you should be writing user interfaces and the many other things some developers put there just because there's memory available on the DSP. That just makes you more dependent on that platform, because you have to write the software differently.

> > Java is not an add-on technology! Java is a platform technology. If you are not getting everything that you need out of Java, I'd wager that you have not put it close enough to the problem space to allow it to work effectively.
> >
> > RTSJ demonstrates that Java can provide real-time programming. Java smart card (every resident of Korea and Brazil carries a Java smart card) demonstrates that Java can be a platform on a very limited device.
>
> I think a lot of this is Java "branding". One can always reduce the functionality of any language or platform and then say "It can run on a smart card". One can always make changes to the standard implementation to allow semi-real-time capability. The point is that some compromise will be involved that makes the "new" implementation differ from the norm.

Yes, there are incompatible differences in the platform. Sun has acknowledged this as a problem, and I believe they will start to address the more prominent issues, such as the J2ME Connector mechanisms which don't exist in J2SE.

> There are two criteria you can use to judge these different implementations: 1) Will every standard program run on the new implementation? 2) Will there be enough in common with the standard implementation that programmers can leverage their knowledge?
>
> Typically, the answer to 1 is no and the answer to 2 is yes.

I don't use those criteria, exactly. I ask these questions:

1) Can the new platform support the intended audience without compromising the development practices of that domain?

2) If new functionality is needed for the new platform can that be introduced into the old platform without impacting existing software?

3) Is there a big enough audience to need a new platform or do we just need new features on the old platform that allow it to scale to the new domain?

Jeff Ratcliff

Posts: 242
Nickname: jr1
Registered: Feb, 2006

Re: Lack of extension Posted: Feb 28, 2007 8:08 PM
> > I'm not sure how developed the virtualization techniques were in those days, but I doubt that anyone at the time could demonstrate an effective implementation that allowed a real-time application to be cost, memory and speed competitive with a typical embedded system of that era. If you have an example from the historical record that indicates otherwise, I'd be happy to hear about it.
>
> My contention is that it wasn't as developed as we had technology and knowledge for because only "hardware dudes" were writing software in small devices. They didn't understand what was needed.
>
> There were countless examples of virtualization in the form of P-code and the like. There were small stack-based machines used in the likes of calculators etc.

Yes, I got the "hardware dudes" thing. There were plenty of non-hardware guys involved too, as a matter of fact.

The point is that nobody demonstrated that anything Java-like could have been used in place of assembly or perhaps C in these real-time embedded systems without requiring more resources. In those days memory was so expensive that firmware developers would have to beg the hardware designers to add enough to get the job done. It wasn't a lack of knowledge that excluded Java-like approaches; it was the inability of those approaches to get the job done within economic constraints.

Gregg Wonderly

Posts: 317
Nickname: greggwon
Registered: Apr, 2003

Re: Lack of extension Posted: Feb 28, 2007 10:31 PM
> In those days memory was so expensive firmware developers would have to beg the hardware designers to add enough to get the job done. It wasn't a lack of knowledge that excluded java-like approaches, it was the lack of capability of those approaches to get the job done within economic constraints.

I remember in 1995 when the 5ESS switching project finally figured out that customers were not upgrading equipment because of the cost of the memory required each time they installed the next generation of software. Memory was upgraded in minuscule amounts. It turned out that it was only marginally more expensive to make memory boards with 8x the amount of memory.

The extra margins on memory boards were also unnecessary if memory was priced right and the profits came from more customers upgrading instead.

Memory cost far more than it needed to because nobody was counting on volume. The PC boom driven by the internet in 1996 and later proved that point quite well. Memory is dirt cheap now.

In the late 1980s, when AT&T was talking to the disk vendors about the next hard drives for the 3B computer systems, the disk companies were really put off that AT&T only wanted 300MB disks. They were ready at that point to provide 10GB disks.

Today, we are still paying too much for disks. They can do a lot more than what they are doing now. Applications are not going for big disk and memory usage because there are no vendors selling huge machines at an affordable price.

And because there are no such applications, the vendors are not selling the hardware they could be selling.

Paul Beckford

Posts: 51
Nickname: phaedrus
Registered: Feb, 2007

Re: Lack of extension Posted: Mar 1, 2007 4:57 AM
I've got a degree in electronic systems and have some knowledge of micro-electronics. My understanding of the cost of memory is that it has more to do with yield than volume. On each silicon wafer, only a percentage of the chips are "good"; the others have to be thrown away. This is usually due to impurities in the silicon.

As materials science has improved and the size of gates has been reduced, yields have increased. This is the main driver in reducing both memory (and processor) costs, as I understand it.

Coming back to software: since 1956, hardware has improved by several orders of magnitude, yet Lisp is still the state of the art in software. Software engineering has fallen a long way behind the hardware, IMO, and we are still building software systems more or less the same way we did in the 1970s (C/Unix).

Alan Kay uses the analogy of architecture, compares how we build software today with building a dog house, and says that we need to develop our tools, architectures, and abstractions so that we can build cathedrals. The man who helped invent objects, windowing operating systems, and the internet thinks that the computer revolution hasn't happened yet:

http://video.google.com/videoplay?docid=-2950949730059754521

I tend to agree, and I think history will judge languages like Java as a cul-de-sac along the road to where we need to be.

Paul.

Vincent O'Sullivan

Posts: 724
Nickname: vincent
Registered: Nov, 2002

Re: Lack of extension Posted: Mar 1, 2007 10:21 AM
> ...yet Lisp is still state of the art in software. Software engineering has fallen a long way behind the hardware IMO, and we are still building software systems more or less the same way we did in the 1970's (C/Unix).

Very true. I've been saying the same for some years now. For all the capabilities of current languages, I am still coding by assembling 'for loops' and 'if statements' in much the same way as when I first learnt Fortran and got my first job using COBOL.

In this respect, Java and Python (et al.) are identical and there is still little sign that we have figured out how to break out of this particular box of limitations.

Jeff Ratcliff

Posts: 242
Nickname: jr1
Registered: Feb, 2006

Re: Lack of extension Posted: Mar 1, 2007 10:57 AM
> I've got a degree in Electronics Systems, and have some knowledge of micro-electronics. My understanding of the cost of memory is that it has more to do with yield than volume. In each silicon wafer there is only a percentage of chips that are "good". The others have to be thrown away. This is usually due to impurities in the silicon.
>
> As the material science has improved and the size of gates have reduced, yields have increased. This is the main driver in reducing both memory (and processor) costs as I understand it.

I agree.
>
> Coming back to software - since 1956 hardware has improved several orders of magnitude, yet Lisp is still state of the art in software. Software engineering has fallen a long way behind the hardware IMO, and we are still building software systems more or less the same way we did in the 1970's (C/Unix).
>
> Alan Kay uses the analogy of architecture and compares how we build software today with building a dog house, and says that we need to develop our tools, architectures and abstractions so that we can build cathedrals. The man who helped invent objects, windowing operating systems and the internet thinks that the computer revolution hasn't happened yet:
>
> http://video.google.com/videoplay?docid=-2950949730059754521
>
> I tend to agree, and I think history will judge languages like Java as a cul-de-sac along the road to where we need to be.

Here I disagree with you and Alan Kay.

One of the reasons is that a lot of the hardware improvements have come from what you noted above: materials science and manufacturing improvements. Those kinds of improvements aren't available to software.

It's also helpful to remember that the fundamental reason software exists is to create complex behaviors that would be economically infeasible to create using hardware. So however primitive software methods are, hardware isn't even in the game.

If you tried to implement a video game or a web browser using only hardware, would the techniques used be notably more modular, maintainable, or extensible than what software does today? I greatly doubt it.

How much real change has there been in microprocessor designs over the last few years? We're getting dual core processors because the technology we've used for many years isn't capable of producing significantly faster chips anymore. So software design has to change in order to make applications run faster. Isn't this the tail wagging the dog?

My point is not to knock hardware design or electrical engineers, but I think it's an apples-to-oranges comparison as long as software is the only discipline taking on the most complex problems.

Gregg Wonderly

Posts: 317
Nickname: greggwon
Registered: Apr, 2003

Re: Lack of extension Posted: Mar 1, 2007 11:43 AM
> I've got a degree in Electronics Systems, and have some knowledge of micro-electronics. My understanding of the cost of memory is that it has more to do with yield than volume. In each silicon wafer there is only a percentage of chips that are "good". The others have to be thrown away. This is usually due to impurities in the silicon.
>
> As the material science has improved and the size of gates have reduced, yields have increased. This is the main driver in reducing both memory (and processor) costs as I understand it.

I believe this to be a contributing factor. But I also know that the pricing was not optimal for many types of memory devices because of the perceived volumes that "could" be sold.

> Coming back to software - since 1956 hardware has improved several orders of magnitude, yet Lisp is still state of the art in software. Software engineering has fallen a long way behind the hardware IMO, and we are still building software systems more or less the same way we did in the 1970's (C/Unix).

It has been a long time since I wrote any Lisp code. I don't think it is "Lisp" that is state of the art. What is state of the art are the developer tools built around Lisp and Smalltalk software development.

I contend that this is largely true because there has been limited competition in that arena, which has allowed those interested in those markets to focus on true innovation instead of chasing competitors by creating features of limited value to developer productivity.

James Watson

Posts: 2024
Nickname: watson
Registered: Sep, 2005

Re: Lack of extension Posted: Mar 1, 2007 12:40 PM
> I've got a degree in Electronics Systems, and have some knowledge of micro-electronics. My understanding of the cost of memory is that it has more to do with yield than volume. In each silicon wafer there is only a percentage of chips that are "good". The others have to be thrown away. This is usually due to impurities in the silicon.

Pure silicon does not have the properties required to make transistors. All silicon used for such purposes has impurities by design. What you are saying may be correct, but it is unclear; perhaps you meant imperfections? My understanding was that pushing lithography to its limits was the main problem with chip yields.

Paul Beckford

Posts: 51
Nickname: phaedrus
Registered: Feb, 2007

Re: Lack of extension Posted: Mar 1, 2007 2:29 PM
> > I've got a degree in Electronics Systems, and have some knowledge of micro-electronics. My understanding of the cost of memory is that it has more to do with yield than volume. In each silicon wafer there is only a percentage of chips that are "good". The others have to be thrown away. This is usually due to impurities in the silicon.
>
> Pure silicon does not have the properties required to make transistors. All silicon used for such purposes has impurities by design. What you are saying may be correct but it is unclear. Perhaps you meant imperfections? My understanding was that pushing lithography to its limits was the main problem with chip yields.

Hi James,

You are bringing me way back to my university days, so I'm a bit shaky here. I believe what you are calling "impurities by design" is properly known as dopants. Dopants are materials introduced into the base silicon substrate to create p-type or n-type regions. As far as I can remember, dopants are introduced as a gas using a lithography technique. What I am talking about is impurities in the silicon itself; I think a common impurity is carbon. Just one impurity in a chip means that the whole chip is void. Hence what you want is small chips and a low number of impurities. This way you will yield a lot of chips from a single wafer.

Reducing the size of gates means that you can get the same number of gates onto a smaller chip, so from the same 9 inch wafer you will produce more chips, and the chances are that fewer of those chips will contain impurities.
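A standard first-order way to quantify this (an addition here, not something from the thread) is the Poisson yield model: if defects land randomly with density D per unit area, the probability that a die of area A escapes all of them is e^(-D*A), so shrinking the die raises yield sharply.

```java
// Poisson yield model: fraction of good dies = exp(-defectDensity * dieArea).
// Units just need to agree (e.g. defects per cm^2 and cm^2).
class Yield {
    static double good(double defectsPerCm2, double dieAreaCm2) {
        return Math.exp(-defectsPerCm2 * dieAreaCm2);
    }
}
```

At 1 defect/cm^2, a 1 cm^2 die yields about 37% good chips, while a 0.5 cm^2 die yields about 61%, which is the economics behind wanting small chips and few impurities.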

I hope that helps. BTW please check my facts, this is all off the top of my head and it was a long time ago :(

I'm feeling old!

Paul.

Paul Beckford

Posts: 51
Nickname: phaedrus
Registered: Feb, 2007

Re: Lack of extension Posted: Mar 1, 2007 3:02 PM
> > http://video.google.com/videoplay?docid=-2950949730059754521
> >
> > I tend to agree, and I think history will judge languages like Java as a cul-de-sac along the road to where we need to be.
>
> Here I disagree with you and Alan Kay.
>
> One of the reasons is because a lot of the hardware improvements have come from what you noted above: material science and manufacturing improvements. Those kind of improvements aren't available to software.
>
> It's also helpful to remember that the fundamental reason software exists is to create complex behaviors that would be economically infeasible to create using hardware. So however primitive software methods are, hardware isn't even in the game.
>
> If you tried to implement a video game or a web browser using only hardware, would the techniques used be notably more modular, maintainable, or extensible than what software does today? I greatly doubt it.
>
> How much real change has there been in microprocessor designs over the last few years? We're getting dual core processors because the technology we've used for many years isn't capable of producing significantly faster chips anymore. So software design has to change in order to make applications run faster. Isn't this the tail wagging the dog?
>
> My point is not to knock hardware design or electrical engineers, but I think it's an apples-to-oranges comparison as long as software is the only discipline that is taking on the most complex problems.

I actually agree with you here. Software is not like hardware, and analogies can sometimes be confusing. The point that I was trying to make is that a lot of software language design decisions are based on historical limitations imposed by hardware. I think Gregg was making the same point.


For desktop personal computers, which is what Alan Kay is referring to, the processing power available today is ample. The comparison with software has to do with how well that hardware is being exploited. The hardware guys have exploited semiconductors to a massive degree and have exceeded Moore's law, yet we are still building software the same way we always have. In fact, if you compare Java to Lisp, we have in effect gone backwards.

The challenges facing software are completely different, I agree. The point, though, is that we haven't raised our game. The issues today are the same as they were 30 years ago: complexity, coupling, cohesion, and abstraction.

Count the kinds of abstractions available in Lisp (functions, macros, lambda expressions, etc.) and compare that to Java and you will see what I mean. And all this confusion of type (interface) with class (implementation) in C++ and Java, under the guise of "type safety," has led to massive coupling and early binding, which we are now trying to address with IoC, Spring, AOP, etc.
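The coupling point can be shown in a few lines of Java. The example is invented for illustration: binding a caller to a concrete class with `new` fixes the choice at compile time (early binding), while depending on an interface and injecting the implementation (what IoC containers such as Spring automate) defers the choice and keeps the caller testable.

```java
interface Notifier { String send(String msg); }

class EmailNotifier implements Notifier {
    public String send(String msg) { return "email:" + msg; }
}

// Early-bound: this class can only ever use EmailNotifier,
// and testing it drags the real notifier along.
class AlarmEarlyBound {
    private final EmailNotifier n = new EmailNotifier();
    String raise() { return n.send("alarm"); }
}

// Decoupled: any Notifier can be injected, including a test double;
// the binding happens at construction time, not compile time.
class Alarm {
    private final Notifier n;
    Alarm(Notifier n) { this.n = n; }
    String raise() { return n.send("alarm"); }
}
```

The second form is the late binding that dynamic languages get for free, which is exactly what the frameworks mentioned are retrofitting onto Java.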


Type safety really means program correctness. Program correctness can be improved in several ways; the primary way is through testing. And if you want static type checks, just add type annotations to your dynamic language.

The turning point, IMO, was a basic misunderstanding of objects back with C++. I think we need to explore that blue plane Alan is talking about. Thirty years on and we have barely started.

Paul.

James Watson

Posts: 2024
Nickname: watson
Registered: Sep, 2005

Re: Lack of extension Posted: Mar 1, 2007 6:21 PM
> > > I've got a degree in Electronics Systems, and have some knowledge of micro-electronics. My understanding of the cost of memory is that it has more to do with yield than volume. In each silicon wafer there is only a percentage of chips that are "good". The others have to be thrown away. This is usually due to impurities in the silicon.
> >
> > Pure silicon does not have the properties required to make transistors. All silicon used for such purposes has impurities by design. What you are saying may be correct but it is unclear. Perhaps you meant imperfections? My understanding was that pushing lithography to its limits was the main problem with chip yields.
>
> Hi James,
>
> You are bringing me way back to my University days so I'm a bit shaky here. I believe what you are calling "impurities by design" is properly known as dopants. Dopants are materials introduced into the base silicon substrate to create p-type or n-type regions. As far as I can remember, dopants are introduced as a gas using a lithography technique.

The lithography I was talking about is how the chip's pattern is etched into the wafer. They take a large image of one layer of the chip and reduce it down to shine on the wafer at the actual chip scale. A chemical that reacts to the light (ultraviolet, not visible) then allows the exposed areas of the wafer to be etched.

> What I am talking about is impurities in the silicon
> itself; I think a common impurity is carbon. Just one
> impurity in a chip means that the whole chip is void.
> Hence what you want is small chips and a low number of
> impurities. This way you will yield a lot of chips from a
> single wafer.

I was just perusing a book on quantum mechanics last night that used the exact word 'impurities' to describe the additions to the silicon that give it the properties needed to build a transistor. Pure silicon cannot be used to do anything special. This is why I am saying it is unclear. It may just be a terminology thing. I was a physics major in a previous life and don't know the computer-engineering lingo.

> Reducing the size of gates means that you can get the same
> number of gates onto a smaller chip, so for the same 9
> inch wafer you will produce more chips and the chances are
> that less of those chips will contain impurities.

But reducing the gate size means that the lithography is much harder. As you try to resolve the light on smaller and smaller areas, you start bumping into limits based on the wavelength of the light (newly invented negative-refraction lenses could have an impact on this), such that the edges of the image start to get blurry. There's a range where the chips will mostly all come out OK or mostly all come out bad. In between is where it's a cost-benefit equation.
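The wavelength limit being described is usually summarized by the Rayleigh criterion for photolithography: minimum feature size ≈ k1 × wavelength / NA, where k1 is a process-dependent factor and NA is the numerical aperture of the projection lens. A quick sketch with illustrative numbers (not any specific fab's parameters):

```python
# Rayleigh resolution criterion for photolithography.
# k1 and NA below are plausible illustrative values, not real fab data.

def min_feature_nm(wavelength_nm: float, k1: float, na: float) -> float:
    # Smaller wavelength or larger aperture => finer printable features.
    return k1 * wavelength_nm / na

# A 193 nm deep-UV source with k1 = 0.4 and NA = 0.9:
print(round(min_feature_nm(193.0, 0.4, 0.9), 1))  # 85.8 (nm)
```

This is why shrinking gates below the wavelength of the exposure light required tricks like shorter-wavelength sources and higher-NA optics, exactly the blurriness trade-off described above.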

The larger the die, the more problems, because it's more likely you have a 'smudge' somewhere in that greater area. This is a major part of why the P4 chip was canned: it had such a large die that they had to toss a large number of the chips. This, of course, also supports what you are saying.

I believe this is part of why chips of the same chip-set with lower clock ratings are cheaper. They are etched at lower resolutions and are therefore more likely to come out correct. They can also be produced with older lithography equipment. I'm not 100% sure about that, though.

James Watson

Posts: 2024
Nickname: watson
Registered: Sep, 2005

Re: Lack of extension Posted: Mar 1, 2007 6:27 PM
Reply to this message Reply
> For desktop personal computers which is what Alan Kay is
> refering to, the processing power available today is
> ample. The comparison with software has to do with how
> well that hardware is being exploited. The hardware guys
> have exploited semi-conductors to a massive degree and
> have exceeded moores law - yet we are still building
> Software the same way we always have. In fact if you
> compare Java to Lisp, we have in effect gone backwards.

I've actually decided to teach myself Lisp as a result of reading these comments. I once decided to learn a dynamically typed language for the same reason, and it opened my eyes a great deal, so I hope to have the same thing happen with Lisp. I also have a sneaking suspicion that Scala would make more sense to me if I knew Lisp, but I could be wrong.

To whet my appetite, could one of you try to explain what makes Lisp so much better, in your opinion? If you can prove it, that's great, but I don't require any proof.

If it's just too much to explain here, that's fine.



Copyright © 1996-2017 Artima, Inc. All Rights Reserved. - Privacy Policy - Terms of Use - Advertise with Us