Alas, in his focus on Sun's 'Niagara' SPARC "microsystem" hardware, Greg doesn't even touch the even larger, more complex issue of the walls the software world has run into in trying to make better software. As yet, there's no clear software equivalent to the notion of these hardware "microsystems". There's a lot of exploration of various software facets such as OO, distributed systems, decentralized systems, various takes on functional programming, numerous approaches to concurrency, TDD, DDD, BDD, BVDs (just checking to see if you're awake :-), LOP, dynamic languages, tools, etc. ad nauseam, but nothing that's obviously the next step.
So, what do you think is the next level of software development?
I forget whom to credit for this, but I also read a commentary on the empty promises of computing's future. Not to sound too much like a pessimist, but:
Instead of giving the business world, and therefore us, more time for other tasks because the computer can do them for us, computers have given us a world where we must squeeze every last ounce of productivity out of the individual.
The incredible shrinking PC is the result of a world where we just don't spend enough time at home relaxing. We must take these things from our home with us everywhere we go.
I've heard a lot of buzz about 'aspect-oriented' programming, though to me that just sounds like a kind of metaprogramming geared toward working around awkward interfaces.
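For what the "metaprogramming" reading of AOP looks like in practice, here's a minimal sketch in Python. The decorator name `traced` and the timing concern are my own illustration, not anything from a real AOP framework: the point is that a cross-cutting concern gets woven around a function without touching the function's own code.

```python
import functools
import time

def traced(fn):
    """A tiny 'aspect': weaves timing output around any function,
    leaving the function body itself untouched."""
    @functools.wraps(fn)
    def wrapper(*args, **kwargs):
        start = time.perf_counter()
        result = fn(*args, **kwargs)
        elapsed = time.perf_counter() - start
        print(f"{fn.__name__} took {elapsed:.6f}s")
        return result
    return wrapper

@traced
def add(a, b):
    # The business logic stays oblivious to the tracing concern.
    return a + b

print(add(2, 3))
```

Real AOP frameworks generalize this idea (pointcuts matching many join points at once), but the weaving-behavior-in-from-outside flavor is the same.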
You see, the 'point and click' thing barely works on PDAs and doesn't work on cell phones. The future is the search for new interfaces, and even the World Wide Web needs this.
As innovative as the Web has become with RSS and blogs and all that, the truth is, we still fill out a form and click a 'send' key, which isn't much better than what glass terminals did thirty years ago. We can do better than that.
In a less detailed sentence: the hardware is now officially way ahead of the software.
I will try to post a short answer to your really complex question.
Although I like various programming paradigms (logic, functional, etc. — they are so "cool" and "different"), I think that in practice the imperative paradigm has shown itself to be closest to human thinking and easiest to program in. A breakthrough paradigm that solves all the hardest problems of today's software development, however, looks like a wish that will never come true.
Instead, it seems the imperative paradigm can still be improved a great deal. Today's programming languages can be improved in many areas where support for essential features is insufficient: polymorphism, generics, modules/components, clean syntax, and so on. IDEs can also improve.
Hopefully, when languages get better support for those essential features, they will make libraries really easy to use and write, finally creating a large, useful, standardized base of libraries and components. The features of the language should make it easy to create libraries that are fast, applicable in a wide variety of situations, and easy to integrate. And finally, we should be able to build systems from those components with some minimal code that binds them together, like houses are built from bricks and concrete.
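The bricks-and-concrete idea above can be sketched in a few lines of Python. The components here (`read_words`, `normalize`, `count`) are hypothetical stand-ins for library components; the one-line `word_frequency` function plays the role of the minimal glue code that binds them into a system.

```python
# Hypothetical 'bricks': each component does one job behind a plain interface.
def read_words(text):
    """Input component: break raw text into tokens."""
    return text.split()

def normalize(words):
    """Transformation component: strip punctuation, lowercase."""
    return [w.strip(".,").lower() for w in words]

def count(words):
    """Aggregation component: tally occurrences."""
    counts = {}
    for w in words:
        counts[w] = counts.get(w, 0) + 1
    return counts

# The 'concrete': a line of glue that binds the bricks into a system.
def word_frequency(text):
    return count(normalize(read_words(text)))

print(word_frequency("The cat sat. The cat ran."))
# → {'the': 2, 'cat': 2, 'sat': 1, 'ran': 1}
```

The hard part the comment is wishing for isn't writing such glue — it's getting component interfaces standardized enough that bricks from different vendors actually fit together.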
I think this spells the return of being conscious of the underlying hardware. Everyone has been saying recently that the joyride of the '90s is over. With parallel processing, more than ever, hardware will be right in our face. Of course, there will be many attempts at abstracting this. What scares me is that this is really OLD news. After what, 40-50 years of parallel processing, there's still no consensus?
We're in for a wild ride. Truly understanding threading for different models will become critical. In fact, most people still don't understand threading on a single core vs. threading on multiple cores. These are two completely different strategies, yet people apply the same one to both and wonder why it runs slower, or less efficiently, on the multi-core system. Or worse, it doesn't work at all.
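To make the single-core vs. multi-core distinction concrete, here is a minimal Python sketch of a data-parallel strategy — splitting one computation across threads. The code is my own illustration: on multiple cores (with a runtime that can schedule threads simultaneously) the chunks can actually overlap, while on a single core — or for CPU-bound work under CPython's global interpreter lock — the very same code just adds scheduling overhead. The answer is identical either way; only whether the strategy pays off differs.

```python
import threading

def parallel_sum(data, workers=4):
    """Split a sum across threads: a multi-core strategy.
    On one core this adds thread overhead for zero speedup --
    the strategy, not the code, is what has to change."""
    chunk = len(data) // workers
    partials = [0] * workers

    def work(i):
        lo = i * chunk
        hi = len(data) if i == workers - 1 else lo + chunk
        partials[i] = sum(data[lo:hi])

    threads = [threading.Thread(target=work, args=(i,)) for i in range(workers)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return sum(partials)

data = list(range(1, 1001))
print(parallel_sum(data))  # → 500500, same answer on any core count
```

The other strategy — threading to overlap I/O waits — pays off even on a single core, which is exactly why carrying one habit between the two worlds goes wrong.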
I'm looking forward to it. It'll separate the script kiddies and cut-and-pasters from the real programmers. I'll always consider the '90s, and even up to now, the dark ages of programming.