The Artima Developer Community

Weblogs Forum
What Virtualization Means to Developers

8 replies on 1 page. Most recent reply: Apr 26, 2006 7:32 AM by Bill Venners

Frank Sommers

Posts: 2642
Nickname: fsommers
Registered: Jan, 2002

What Virtualization Means to Developers
Posted: Apr 24, 2006 9:25 AM
Summary
In a recent blog post, John Clingan traces the evolution of application deployment from one application per server, to a model where an app "runs somewhere on a grid," and on to a dynamic infrastructure that virtualizes application containers and applications as well. These trends hold important implications for developers.

In many ways, Java pioneered a mass-market demand for virtualization: Java's "platform-independence" is the result of Java code running in a Java Virtual Machine, isolating that code from most differences in the underlying operating systems and hardware.

Virtualization, the "abstraction of resources across many aspects of computing" (see Wikipedia), has recently taken several new turns, mainly in response to the need to simplify application deployment and management. While the virtualization of storage and even network resources has been popular for years—think SANs (storage-area networks) or NFS, for instance—more recent tools, such as Xen and Solaris Containers, allow the virtualization of operating-system environments or of complete servers. And it doesn't have to stop there: the virtualization of an entire data-center environment might not be far off.

John Clingan argues that "virtualization is about to hit mainstream, if it hasn't already," fueled by the confluence of several enabling factors such as the price-performance of servers capable of running virtualization software, the maturity of high-quality virtualization tools, and the need to render data centers more dynamic:

The customers I have worked with over the years maintained fairly rigid environments. "This server runs this application". "These servers run these applications". "Our department purchased these servers out of our budget so they run our applications." It has been like this for over a decade now.

[...] The move to virtualization is enabling a more dynamic infrastructure, and I wonder how long it will take customers to move from "these applications run on these virtual servers" to "our applications run somewhere on this grid".

Clingan suggests that beyond grid virtualization lie application-container—and even application—virtualization techniques, part of what he calls a "dynamic infrastructure":

It won't take that long for customers to move from virtualization to a dynamic infrastructure[...] Dynamic infrastructure [...] includes not only the OS and hardware, but application containers and the applications themselves.

Just as standardized application containers, such as those defined by J2EE, allow one to plug various app-server implementations into an environment with no, or at most minimal, changes to the applications running in those containers, standardized application-level interfaces might also emerge over time to allow the virtualization of entire application domains. That could have interesting implications for developers.
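
As a concrete illustration of that kind of container-level portability, here is a minimal servlet sketch (my example, not Clingan's): it depends only on the standard javax.servlet API, so any compliant container can host it without changes to the code.

    import java.io.IOException;
    import javax.servlet.ServletException;
    import javax.servlet.http.HttpServlet;
    import javax.servlet.http.HttpServletRequest;
    import javax.servlet.http.HttpServletResponse;

    // Written only against the standard servlet API, so the same class can be
    // deployed to Tomcat, WebLogic, WebSphere, or any other compliant container.
    public class HelloServlet extends HttpServlet {
        protected void doGet(HttpServletRequest req, HttpServletResponse resp)
                throws ServletException, IOException {
            resp.setContentType("text/plain");
            resp.getWriter().println("Hello from a container-neutral servlet");
        }
    }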

The most important implication might be that more of what used to be called application-layer software is pushed into the infrastructure layer. For instance, the Java Content Repository API (JCR) is an effort, run through the Java Community Process, to standardize the interface through which Java code accesses content repository systems. As more content-repository products implement the JCR API, an organization will be able to plug any JCR-compliant implementation into its data center, without users having to concern themselves with the brand or maker of a particular content-management product. The Java Data Mining API (JDM) is another example of standardizing common functionality required in an application domain: JDM implementations will allow data-mining functionality, much of which is still delivered today by proprietary vendor products, to be pushed increasingly into the enterprise IT infrastructure.
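
To make the JCR point concrete, here is a minimal sketch of code written only to the standard javax.jcr API. The credentials and node names are placeholders; obtaining the Repository instance (via JNDI or a vendor factory) is the one vendor-specific step.

    import javax.jcr.Node;
    import javax.jcr.Repository;
    import javax.jcr.RepositoryException;
    import javax.jcr.Session;
    import javax.jcr.SimpleCredentials;

    // Everything below is standard JCR (JSR 170); any compliant repository
    // implementation can be plugged in behind the Repository reference.
    public class JcrExample {
        public static void storeArticle(Repository repository) throws RepositoryException {
            Session session = repository.login(
                    new SimpleCredentials("username", "password".toCharArray()));
            try {
                Node articles = session.getRootNode().addNode("articles");
                articles.setProperty("title", "What Virtualization Means to Developers");
                session.save();
            } finally {
                session.logout();
            }
        }
    }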

Since virtualization often occurs in the infrastructure layer, moving more application functionality into the infrastructure layer serves virtualization. Indeed, application domain-specific standardization efforts are often driven by the need to virtualize enterprise resources. With standardization efforts under way in domains as diverse as health care and automotive finance, enterprise application development might take on several new meanings.

In one camp will be developers implementing standardized application interfaces, such as a content management system or an insurance claims management system. These developers will likely work for vendors, or in open-source communities, that will need to differentiate from competitors by addressing specific application concerns, such as scalability, ease-of-use, or the need for special configurations. The model here might be existing infrastructure vendors, such as purveyors of application-servers or databases.

In another camp will be developers creating modules or add-ons for such systems. Most of these developers will work for enterprises using those systems, or for commercial offshoots of open-source projects.

In his blog post, John Clingan points out that "virtualization doesn't drive consistency. [...] It won't take that long for customers to move from virtualization to a dynamic infrastructure." But virtualization fueled by open standards and open-source implementations of those standards could lead to consistency, as it already has in the area of application servers and databases, for instance.

Thus, a developer working for, say, an insurance company, will not have to design and implement a claims management system. Rather, there would be one or a handful of standards, and a developer or the IT manager would be able to choose from several competing implementations of those standards that together provide most of the claims management functionality. Because those products, whether open- or closed-source, will implement the standards, an organization will be free to "virtualize" such a resource, and plug in a different implementation at will.
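
As a purely hypothetical sketch of what that might look like (no such standard exists today; the names below are invented for illustration), application code would program against a standardized interface and leave the choice of implementation to configuration:

    // Hypothetical standardized interface -- invented here for illustration only.
    public interface ClaimsSystem {
        String fileClaim(String policyNumber, String description);
        String claimStatus(String claimId);
    }

    // The concrete, vendor-supplied implementation is named in configuration,
    // e.g. -Dclaims.impl=com.vendora.claims.VendorAClaimsSystem, so swapping
    // vendors requires no change to the application code.
    class ClaimsSystemFactory {
        static ClaimsSystem newClaimsSystem() throws Exception {
            String implClass = System.getProperty("claims.impl");
            return (ClaimsSystem) Class.forName(implClass).newInstance();
        }
    }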

Just as few enterprises today would pay their developers to code up an app server, a transaction processing monitor, or an HTTP server, in the future, few enterprise developers might be tasked with writing an inventory management system, a data mining application, blogging software, or insurance claims management code. Instead, developers might increasingly participate in open-source communities centered around implementations of standardized application interfaces, and enterprises would benefit from using the resulting high-quality software, which they could deploy in a virtualized fashion. In addition, enterprises would pay developers mainly to customize or add modules on top of services that are by then pushed deep into the enterprise infrastructure.

I would certainly welcome the day when I no longer would have to code another user management module, inventory system, or workflow application, and instead just rely on high-quality implementations of such components and focus on adding value to my end-users by providing features they really care about.

But how long before standardized application interfaces become widespread? And what kinds of application interfaces should be standardized? The Java Content Repository and Java Data Mining API designers factored common application functionality into an API that most of the vendors in that domain could agree on. Could that work be replicated in other domains, too? How should developers or organizations drive that kind of standardization? Would this be through an organization such as the JCP, or through an open-source community such as Apache? And what would the proliferation of such widely available standards and associated implementations mean to developers?


Rafael Ferreira

Posts: 8
Nickname: rafaeldff
Registered: Feb, 2006

Re: What Virtualization Means to Developers Posted: Apr 24, 2006 2:38 PM
I tend to be skeptical regarding the popularization of domain-specific frameworks. It's not a new idea: even CORBA had a go at it with its Domain Interfaces concept, and it didn't work.

This is one of those ideas that keep reappearing and never seem to succeed (like end-user programming, separating syntax from code structure, and graphical programming environments, a.k.a. programming by dragging boxes, among others).

Frank Sommers

Posts: 2642
Nickname: fsommers
Registered: Jan, 2002

Re: What Virtualization Means to Developers Posted: Apr 24, 2006 5:04 PM
> I tend to be skeptical regarding the popularization of
> domain specific frameworks. It's not a new idea, even
> CORBA had a go at it with their Domain Interfaces concept,
> and it didn't work.
>
> This is one of those ideas that are always reappearing and
> never seem to succeed (like end-user programming,
> separating syntax from code structure, graphical
> programming environments a.k.a programming by dragging
> boxes, among others).


I used to be skeptical about this also, but the fact that standardization was possible with things like transaction processing monitors (e.g., J2EE), or even data access, is really encouraging.

Most important, standardization, to succeed, must be driven by the customers of applications. If customers want an application domain standardized, they will push vendors to follow suit.

My point was that the biggest motivation from the customer side might be virtualization that enables an organization to run its IT operations more effectively. This has already proved beneficial in data management and application servers - you can simply switch to a different app server, if you so decide, and your J2EE application will continue to work. This trend is now occurring in the area of operating systems as well.

A similar principle can apply to more application domains - CRM, ERP, design automation, data mining and decision support, etc. - and even to higher-level line-of-business applications.

Dileban Karunamoorthy

Posts: 9
Nickname: dileban
Registered: Feb, 2006

Re: What Virtualization Means to Developers Posted: Apr 25, 2006 3:17 AM
>
> A similar principle can apply to more application domains
> - CRM, ERP, design automation, data mining and decision
> support, etc. And even higher-level line of business
> applications.
>


But aren't some of these the very things that companies look to do differently? The more streamlined, the more agile and aptly designed a solution is, the better it can meet an organization's specific needs, and this is what gives companies a competitive advantage. I personally feel that applications of a more generic nature, like content repositories, will probably end up being standardized, as there are numerous advantages to this. But for those that are more business-specific, the chances are lower, as organizations will always look to do things differently, to innovate, and to outsmart competitors.

V.H.Indukumar

Posts: 28
Nickname: vhi
Registered: Apr, 2005

Re: What Virtualization Means to Developers Posted: Apr 25, 2006 8:05 AM
I believe that it is possible to standardize across a wide range of domains. An example is how Eclipse has standardized the development of IDEs while still providing enough room for companies to innovate on top of it.

Most domain-specific functionality can be standardized, and different vendors can still compete on top of that standard.

Bill Venners

Posts: 2284
Nickname: bv
Registered: Jan, 2002

Re: What Virtualization Means to Developers Posted: Apr 25, 2006 12:56 PM
Dileban Karunamoorthy wrote:

> But aren't some of these the very things that companies
> look to do differently? The more streamlined, the more
> agile and aptly designed a solution is, the better it can
> meet an organization’s specific needs.., and this is what
> gives them a competitive advantage. I personally feel that
> applications that are of a more generic nature like
> content repositories would probably end up being
> standardized, as there are numerous advantages for this.
> But for those that are more business-specific, the chances
> are less likely, as organizations would always look to do
> things differently, look to innovate and outsmart
> competitors
>
I think that for many business processes, there is so much variety in how people do things that it would be difficult to come up with a standard that isn't overwhelmingly complicated. In such cases, I think people can try and find what is common, and standardize on that. For example, if it is hopeless to try and standardize on the fields of a particular form, then perhaps you can try and standardize on a transmission format for any form.
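
A minimal sketch of that idea (my illustration, not Bill's): instead of standardizing the fields of every form, standardize only a generic envelope carrying named fields, and leave the field vocabulary to the parties involved.

    import java.util.LinkedHashMap;
    import java.util.Map;

    // Generic form envelope: the transmission format is fixed, but the set of
    // fields is not. (Escaping of special XML characters is omitted for brevity.)
    public class GenericForm {
        private final String formType;  // e.g. "claims.first-notice-of-loss"
        private final Map<String, String> fields = new LinkedHashMap<String, String>();

        public GenericForm(String formType) { this.formType = formType; }

        public void set(String name, String value) { fields.put(name, value); }

        public String toWireFormat() {
            StringBuilder sb = new StringBuilder("<form type=\"" + formType + "\">\n");
            for (Map.Entry<String, String> e : fields.entrySet()) {
                sb.append("  <field name=\"").append(e.getKey()).append("\">")
                  .append(e.getValue()).append("</field>\n");
            }
            return sb.append("</form>").toString();
        }
    }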

I also question whether virtualization will not require that the thing being virtualized, application or container, be written with virtualization in mind. I suspect it will.

Frank Sommers

Posts: 2642
Nickname: fsommers
Registered: Jan, 2002

Re: What Virtualization Means to Developers Posted: Apr 25, 2006 3:14 PM
> I think that for many business processes, there is so much
> variety in how people do things that it would be difficult
> to come up with a standard that isn't overwhelmingly
> complicated.

While many businesses do certain processes differently, there's often no strategic advantage in performing many processes in a unique way. However, companies often deal with legacy problems - "because we always did it this way, this is how it's done around here." Given sufficient economic incentives, a business might initiate change and adopt a standard, if there is a suitable standard.

As an example, consider processes such as those defined by ISO 9000, or even unit testing in software, or "best practices" - when there are incentives and tools, companies seem to realize that only a small number of processes are core competencies, and the rest are just necessary supporting processes that can follow some standard.

I'm not just talking about de jure standards, but also de facto ones. For instance, most businesses use industry standard PCs for business computing, and few companies would see the need to equip employees with custom-designed computers. As well, most companies use one of two popular office productivity software packages, and see little need to differentiate themselves from competitors by developing an in-house alternative.

I think there's a lot of myth and smoke and mirrors about business processes and applications beyond word processors, spreadsheets, presentation programs, databases, application servers, operating systems... (and that's a long list already). A lot of people, for instance, are finding that CRM is an area where a core set of functions apply to almost all users, and then you may want to develop modules or add-ons that make your organizations especially competitive. There is no need to develop an entire system from scratch.

In time, I can imagine that those core CRM functions will be standardized - perhaps because users don't want to be locked into a single company's product - and then service providers can push most of that core functionality into the infrastructure layer (and even into the network). That will allow economies of scale, and thus reduce cost to the end-user. (Virtualization is but one means of reducing cost.)

> I also question whether virtualization will not require
> that the thing being virtualized, application or
> container, be written with virtualization in mind. I
> suspect it will.

A lot of virtualization environments work by introducing some amount of inefficiency, and even increased possibility of error, in return for often higher overall availability and total throughput. For instance, many clustering techniques today take off-the-shelf components, e.g., a database, and attempt to deploy instances of that software on multiple servers, creating an illusion from the outside that a client is communicating with a single database server.

Such clustering introduces a new set of possible errors, and may even lengthen response times vis-a-vis having just a single database on the network. But the cluster as a whole might still produce better overall availability and throughput (number of transactions processed). This kind of a poor man's clustering is very popular, because it's cheap - compared with products that were designed from the ground up with cluster-based deployment in mind. Greg Pfister of IBM once called this "lowly parallelism" in the context of clustering. So I think in virtualization, also, people would take just plain old components, such as an OS or an app server, and perform such "lowly virtualization."
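
A small sketch of what that looks like from the client's side (the host name, driver, and credentials are placeholders): the JDBC code below neither knows nor cares whether the address it connects to is a single machine or a virtual address fronting a cluster.

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.ResultSet;
    import java.sql.Statement;

    // The client sees one logical database; whether db.example.com resolves to one
    // server or to a load balancer in front of several replicas is invisible here.
    public class ClientView {
        public static void main(String[] args) throws Exception {
            Class.forName("org.postgresql.Driver");  // any JDBC driver would do
            Connection conn = DriverManager.getConnection(
                    "jdbc:postgresql://db.example.com/claims", "app_user", "secret");
            try {
                Statement stmt = conn.createStatement();
                ResultSet rs = stmt.executeQuery("SELECT count(*) FROM claims");
                if (rs.next()) {
                    System.out.println("claims on file: " + rs.getLong(1));
                }
            } finally {
                conn.close();
            }
        }
    }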

Dileban Karunamoorthy

Posts: 9
Nickname: dileban
Registered: Feb, 2006

Re: What Virtualization Means to Developers Posted: Apr 25, 2006 10:29 PM
>
> As an example, consider processes such as those defined by
> ISO 9000, or even unit testing in software, or "best
> practices" - when there are incentives and tools,
> companies seem to realize that only a small number of
> processes are core competencies, and the rest are just
> necessary supporting processes that can follow some
> standard.

This does make sense. I was involved in the development of an ERP module called Document Management, which is basically a system that allows you to create document records, attach files and objects, define access rights and approval processes, and so on. The overall behavior of this product is much like that of similar products in the market. So the core processes are pretty much standardized, thanks to the initial customer and product research done, but we do have "extra things" that other products probably don't support yet, and this probably gave us and our customers a competitive advantage.

I believe defining business standards would be a long and complicated process, with many organizations making their own extensions to those standard "interfaces" depending on their needs and region, just as we've seen with CORBA. The benefits of vendor "independence" would be hard to come by for organizations with specific needs.

Also, which body do you think would take on the responsibility of standardizing such processes, and are there such initiatives already in progress?


> I think there's a lot of myth and smoke and mirrors about
> business processes and applications beyond word
> processors, spreadsheets, presentation programs,
> databases, application servers, operating systems... (and
> that's a long list already). A lot of people, for
> instance, are finding that CRM is an area where a core set
> of functions apply to almost all users, and then you may
> want to develop modules or add-ons that make your
> organizations especially competitive. There is no need to
> develop an entire system from scratch.

True; in fact, this seems to be the trend now. Many ERP companies develop pre-built components that are often localized to the specific needs of a customer, and these customizations are often requested by other customers from the same geographical region. And even when companies build CRM modules from scratch, I suspect they reuse a lot of their APIs.

Bill Venners

Posts: 2284
Nickname: bv
Registered: Jan, 2002

Re: What Virtualization Means to Developers Posted: Apr 26, 2006 7:32 AM
> I think there's a lot of myth and smoke and mirrors about
> business processes and applications beyond word
> processors, spreadsheets, presentation programs,
> databases, application servers, operating systems... (and
> that's a long list already). A lot of people, for
> instance, are finding that CRM is an area where a core set
> of functions apply to almost all users, and then you may
> want to develop modules or add-ons that make your
> organizations especially competitive. There is no need to
> develop an entire system from scratch.
>
Well, I was talking about standards, not building things from scratch. Some things are hard to standardize because of inherent heterogeneity. That doesn't mean you have to build it from scratch if there isn't a standard, just that if you choose some vendor's proprietary solution, you can't switch vendors as easily later.

A lot of times people do the same thing in spirit everywhere, but if you look closely, you'll see they are doing things differently everywhere. Just about every web application I've seen has some notion of users, creating an account, dealing with forgotten passwords, and so on. But there isn't a standard way to do that. You can use a lot of off-the-shelf implementations, but often they don't quite do what you want. An implementation that would let everyone customize things the way they want would be quite complex, and its complexity would be a deterrent to using it. Sometimes it is cheaper to just build your own than to try to coerce someone else's framework into doing what you want.

Many business processes are like that. Everyone does them, but everyone does them slightly differently. And that's not because they need to get together and figure out one right way to do it. Some things are just heterogeneous like that. What would you say if someone asked, "why do we have so many programming languages? Java, C++, C#, Python, Ruby, Smalltalk, Scala, Perl, LISP, ... They're all Turing-complete. Why don't people just get together and define a standard programming language?"

> > I also question whether virtualization will not require
> > that the thing being virtualized, application or
> > container, be written with virtualization in mind. I
> > suspect it will.
>
> A lot of virtualization environments work by introducing
> some amount of inefficiency, and even increased
> possibility of error, in return for often higher overall
> availability and total throughput. For instance, many
> clustering techniques today take off-the-shelf components,
> e.g., a database, and attempt to deploy instances of that
> software on multiple servers, creating an illusion from
> the outside that a client is communicating with a single
> database server.
>
In general I think scalability and performance are often at odds with each other. You trade off one for the other.

> Such clustering introduces a new set of possible errors,
> and may even lengthen response times vis-a-vis having just a
> single database on the network. But the cluster as a whole
> might still produce better overall availability and
> throughput (number of transactions processed). This kind
> of a poor man's clustering is very popular, because it's
> cheap - compared with products that were designed from the
> ground up with cluster-based deployment in mind. Greg
> Pfister of IBM once called this "lowly parallelism" in the
> context of clustering. So I think in virtualization, also,
> people would take just plain old components, such as an OS
> or an app server, and perform such "lowly virtualization."
>
But I think lowly virtualization is orthogonal to my question. My question is to what extent can virtualization of an application be done without the participation of the developer of the application? Take any servlet application running on one server. How many are really ready to be clustered as is? If you drop them into a clustered J2EE environment, might there not be a place or two in the application where the developer made assumptions that it would be running on one server? If so, virtualizing it would break the application, unless virtualization really is transparent to the application. I guess I'm questioning whether virtualization can really be transparent. I find it more plausible that we need to come up with standard virtualization APIs to which developers write.
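
To make that single-server assumption concrete, here is a minimal sketch (my example, not Bill's) of the kind of code that works fine on one server but breaks under "transparent" clustering: state cached in a static field exists separately on every node.

    import java.io.IOException;
    import java.util.HashMap;
    import java.util.Map;
    import javax.servlet.ServletException;
    import javax.servlet.http.HttpServlet;
    import javax.servlet.http.HttpServletRequest;
    import javax.servlet.http.HttpServletResponse;

    // Fine on a single server. Once the app is clustered, each node has its own
    // copy of this map, so a user's cart appears or vanishes depending on which
    // node serves the request. (Thread-safety issues are ignored for brevity.)
    public class CartServlet extends HttpServlet {
        private static final Map<String, String> cartsByUser = new HashMap<String, String>();

        protected void doPost(HttpServletRequest req, HttpServletResponse resp)
                throws ServletException, IOException {
            cartsByUser.put(req.getParameter("user"), req.getParameter("item"));
            resp.getWriter().println("added");
        }

        protected void doGet(HttpServletRequest req, HttpServletResponse resp)
                throws ServletException, IOException {
            resp.getWriter().println(cartsByUser.get(req.getParameter("user")));
        }
    }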
