2007 was the second year Bill Venners and I roamed the JavaOne Pavilion floor in search of fresh and interesting ideas that we thought our readers would find, if not relevant, at least intriguing. One of the main currents we observed this year had to do with scaling applications on multicore CPUs.
Business applications that have so far counted on steadily increasing CPU clock speeds are increasingly executed on multicore architectures. In such architectures, a CPU contains many cores, but each core typically runs at a lower clock speed than yesterday's power-hungry single-core processors did. Enterprise applications must therefore rely on concurrency to scale on multicore CPUs, yet many applications were not written with such explicit concurrency in mind.
Nor should they have been, according to Patrick Leonard, vice president of product development at Rogue Wave Software. In an interview with Artima, Leonard discusses tools that can automate the concurrent execution of an initially single-threaded application, and describes an architecture of software pipelines that, in effect, mimic pipelined CPUs in software.
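The article doesn't show Rogue Wave's actual implementation, but the software-pipeline idea Leonard describes can be sketched in plain Java: each processing stage runs on its own thread, and bounded queues connect the stages so they overlap in time the way pipelined CPU stages do. The stage functions and the `Pipeline` class below are illustrative, not part of any Rogue Wave API.

```java
import java.util.List;
import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.CopyOnWriteArrayList;

// A minimal two-stage software pipeline. Stage 1 normalizes input,
// stage 2 transforms it; a bounded queue connects them, so both
// stages can be busy at once, like stages in a pipelined CPU.
public class Pipeline {
    // Sentinel value signaling the end of the stream.
    static final String POISON = "\u0000EOF";

    public static List<String> run(List<String> input) throws InterruptedException {
        BlockingQueue<String> between = new ArrayBlockingQueue<>(16);
        List<String> output = new CopyOnWriteArrayList<>();

        Thread stage1 = new Thread(() -> {
            try {
                for (String s : input) between.put(s.trim()); // stage 1: normalize
                between.put(POISON);                          // signal end of stream
            } catch (InterruptedException e) { Thread.currentThread().interrupt(); }
        });

        Thread stage2 = new Thread(() -> {
            try {
                for (String s; !(s = between.take()).equals(POISON); )
                    output.add(s.toUpperCase());              // stage 2: transform
            } catch (InterruptedException e) { Thread.currentThread().interrupt(); }
        });

        stage1.start(); stage2.start();
        stage1.join(); stage2.join();
        return output;
    }
}
```

Because the single consumer drains the queue in order, the output preserves the input order while the two stages still execute concurrently.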
At the conclusion of our interview, Leonard described why concurrency should not become an intrinsic aspect of an application's design, but should instead be thought of as a configuration option to take advantage of a deployment environment:
Thirty years ago, it was very common to embed the data model in application logic. Then we figured out that it was a good idea to have a database that was outside the application. That way, you separate the data model from the application logic.
Likewise, it's good practice and good design to separate the concurrency model from the application. The more multithreaded code you write, the more embedded your concurrency model is with your application. The downside is that you lose a lot of flexibility, it's harder to test, and it introduces a lot of complexity into the environment.
The more you can have the concurrency model separate, the more flexibility you have... If the concurrency model is separate, then it becomes much more configurable, much more manageable. And when you change that, you don't have to open up your application source code. That's truly the key. The last thing you'd ever want to do is open up your application source code ... when you want to make a change in your computing environment.
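Leonard's point can be illustrated with a small, hypothetical Java sketch: the application logic stays a plain, single-threaded function, while the concurrency model (here, just the worker count) is read from the deployment environment. The `app.threads` property name and the `score` function are made up for the example; changing the concurrency configuration never requires opening the application source.

```java
import java.util.List;
import java.util.concurrent.Callable;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;

// Application logic and concurrency model kept separate: the logic
// below knows nothing about threads; the thread count is configuration.
public class Externalized {
    // Application logic: pure, no threading code anywhere in it.
    static int score(int n) { return n * n; }

    public static List<Integer> runAll(List<Integer> inputs) throws Exception {
        // Concurrency as configuration: read the worker count from a
        // system property ("app.threads" is a made-up name), default to 4.
        int threads = Integer.getInteger("app.threads", 4);
        ExecutorService pool = Executors.newFixedThreadPool(threads);
        try {
            // Wrap each input in a task; the pool decides the parallelism.
            List<Callable<Integer>> tasks = inputs.stream()
                .map(n -> (Callable<Integer>) () -> score(n))
                .toList();
            return pool.invokeAll(tasks).stream()
                .map(f -> {
                    try { return f.get(); }
                    catch (Exception e) { throw new RuntimeException(e); }
                })
                .toList();
        } finally {
            pool.shutdown();
        }
    }
}
```

Running the same binary with `-Dapp.threads=1` or `-Dapp.threads=16` changes the concurrency model without touching `score` at all, which is the separation Leonard argues for.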
Patrick Leonard, vice president of product development at Rogue Wave Software, discusses why concurrency should be an externalized aspect of an application.
What do you think about Patrick Leonard's views on externalizing concurrency? Post your opinion in the discussion forum.
About the authors
Frank Sommers is Editor-in-Chief of Artima Developer. He also serves as chief editor of the IEEE Technical Committee on Scalable Computing's newsletter, and is an elected member of the Jini Community's Technical Advisory Committee. Prior to joining Artima, Frank wrote the Jiniology and Web services columns for JavaWorld.
Bill Venners is president of Artima, Inc. He is author of the book Inside the Java Virtual Machine, a programmer-oriented survey of the Java platform's architecture and internals. His popular columns in JavaWorld magazine covered Java internals, object-oriented design, and Jini. Bill has been active in the Jini Community since its inception. He led the Jini Community's ServiceUI project, whose ServiceUI API became the de facto standard way to associate user interfaces with Jini services. Bill also served as an elected member of the Jini Community's initial Technical Oversight Committee (TOC), and in this role helped to define the governance process for the community.