The Artima Developer Community

Weblogs Forum
Signs of the Next Paradigm Shift: Part I

24 replies on 2 pages. Most recent reply: Jul 30, 2006 8:05 PM by James O. Coplien

James O. Coplien

Posts: 40
Nickname: cope
Registered: May, 2003

Signs of the Next Paradigm Shift: Part I
Posted: Jul 30, 2006 8:05 PM
Summary
Design techniques good and bad come and go in the industry, often more quickly than educators can foresee. This 'blog looks whimsically at a possible resurgence in Multi-paradigm Design, and Part II reflects on how educators can prepare students for industry design whims.
Hello, everyone,

It's good to be back from a long hiatus, during which time I made a foray into the surreal world of startup software development. Having one's heart, soul and mind in the throes of everyday development makes it difficult to dedicate time to the kind of reflection good writing requires. It's good to be back.

Today's topic is an old one: multi-paradigm design. My book Multi-paradigm Design for C++ first appeared in October 1998. It was a distillation of ideas that emerged from the Software Production Research department at Bell Labs where I had been working on C++ and object-oriented programming, when a new boss named Dave Weiss brought a domain analysis agenda to the department. As a software design researcher, I took an interest in Dave's ideas. My interest became a project when Dave's boss, the well-known Bell Labs manager Sandy Fraser, challenged me to find a way to combine Dave's work with mine. Multi-paradigm design was the answer.

Multi-paradigm Design was a refinement and broad elaboration of notions that go back to the earliest days of the C++ programming language. One-time Dane Bjarne Stroustrup has never called C++ an "object-oriented programming language," and has noted on occasion that if you want to use Smalltalk, then you should use Smalltalk: Smalltalk is the best Smalltalk there is. That good design takes taste, insight, experience, and a rich toolset was one of his more closely held sentiments. He would occasionally use the phrase "multiple paradigms." And when I organized a panel several years ago with panelists including fellow Dane Ole Lehrmann Madsen, he noted that Beta is a language that "transcends paradigms." Hmmm, maybe it's a Danish thing. Anyhow, these ideas begged exploration and expression, and a good part of my career took up this endeavor.

The work took shape, honed by workshops and lectures. One of the earliest lectures was given at a conference in Boston in 1993, and the work already constituted a well-rehearsed set of concepts at that time. Since then I have published a handful of papers on the topic, and the book (which was an outgrowth of the original 1993 lectures) eventually appeared in English in late 1998. Since then the book has also appeared in Japanese (マルチパラダイムデザインとC++における実装), Chinese (译者:鄢爱兰周辉), and Russian (МУЛЬТИПАРАДИГМЕННОЕ ПРОЕКТИРОВАНИЕ ДЛЯ C++).

Briefly, multi-paradigm design (MPD) is this:

  1. Let your intuition guide the formation of groupings, called families, in the problem domain.
  2. Find out what is common among family members; for example, they may all share the same structure and behaviour.
  3. Parameterize how the family members vary; for example, they may all vary in the algorithms they use to implement certain behaviours.
  4. See if some available implementation technique nicely expresses the commonality/variability association you found for each of the given domains. For example, the object paradigm is very good at expressing commonality in behaviour and variation in each behaviour's algorithms (inheritance and virtual functions), while templates are great for expressing commonality in code that must be parameterized with some value or type (see the sketch after this list).
  5. Frame out the structure, and code it up.
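
To make the mapping in step 4 concrete, here is a minimal, hypothetical C++ sketch (the compressor family, its names, and the algorithms are invented for illustration; they are not from the book). The family members share a common behaviour -- compressing a message -- while the algorithm varies, which the object paradigm expresses with inheritance and a virtual function; a second axis of variation, a buffer size that varies by value, is expressed with a template parameter.

#include <cstddef>
#include <iostream>
#include <string>

// Commonality: every family member compresses a message.
// Variability: the algorithm -- expressed via inheritance and a virtual function.
class Compressor {
public:
    virtual ~Compressor() = default;
    virtual std::string compress(const std::string& message) const = 0;
};

class RunLengthCompressor : public Compressor {
public:
    std::string compress(const std::string& message) const override {
        return "RLE(" + message + ")";   // placeholder for the real algorithm
    }
};

class HuffmanCompressor : public Compressor {
public:
    std::string compress(const std::string& message) const override {
        return "HUF(" + message + ")";   // placeholder for the real algorithm
    }
};

// A different axis of variability -- buffer size -- varies by value rather than
// by algorithm, so a template parameter expresses it more directly.
template <std::size_t BufferSize>
class Channel {
public:
    explicit Channel(const Compressor& c) : compressor_(c) {}
    void send(const std::string& message) const {
        std::cout << "[" << BufferSize << " bytes] "
                  << compressor_.compress(message) << '\n';
    }
private:
    const Compressor& compressor_;
};

int main() {
    HuffmanCompressor huffman;
    Channel<512> channel(huffman);   // algorithm and buffer size chosen independently
    channel.send("hello, multi-paradigm world");
}

The point is not the particular classes but that the two kinds of variation map onto different language features; neither inheritance alone nor templates alone would express both naturally.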

This is different from using "object-oriented analysis" -- whatever that is -- which might prejudice the analysis with premature design partitioning. It allows one to come to object-oriented design honestly and offers strong cues about where templates, overloading, and other C++ goodies apply. And it's not only for C++. It's just that having a richly expressive language makes the transition from domain analysis to coding easier, and C++ was one of the few languages back in the early 1990s that offered such expressiveness; Ada was potentially another. Java as we know it today supports most such expressiveness, as does the C# programming language that has since come onto the scene.

In this 'blog I'll ask you to puzzle with me over why there seems to be renewed interest in multi-paradigm ideas these days after more than a decade of indifference and, sometimes, outright criticism or even dismissal of such notions.

Mixed Reviews

Every author's dream is that his or her work receive stellar reviews in a large number of visible publications. While my earlier books had enjoyed good reviews, the reviews for the MPD book were mixed. Francis Glassborow (C-Vu) almost gave it the honor of the top C/C++ book for 1998, edged out by the excellent Generic Programming and the STL by Matt Austern. Angelika Langer and Klaus Kreft lavished praise on the book in their review, calling it "a great book filled with elaborate and compelling ideas". There were favourable reviews by Brad Appleton in Computer Literacy, Steve Berczuk and others on Amazon, and even one in German by Ulrich Eisenecker -- who would later recount the book's role as one of the foundations of Generative Programming. David R. Miller wrote a studious and insightful review of the book. But Robert Glass's review was mixed, saying that though it may be "the definitive book on domain analysis and engineering," it was a difficult and unrewarding read. The Amazon ranking is a lackluster three and a half stars, and many readers' reviews were unflattering.

The bottom line is that the book sales were flat, it was hard to find industrial application of its ideas, and educators didn't touch it with a ten-foot pointer.

Back to the Future

In recent weeks there has been a coincidence of small but encouraging signs of recognition for the now 8-year-old book. The first was in a mail I received from Lise Hvatum at Schlumberger, in which she said that Scott Meyers had been advocating the book in his lectures on C++ for Embedded Systems. Scott in fact has been a long-time advocate of multi-paradigm approaches, and has publications in the area.

The second and most elaborated sign came in a 'blog from Darren Wesemann at SunGard. SunGard is a $4 billion software house that extensively employs what they call a Common Services Architecture, a method and architecture based on reusable components. The 'blog describes how MPD is a central underpinning of their overall development framework.

This post was particularly important for me. While I had seen MPD work in the small many times over the years, and while I still think it's a good model for how great programmers actually think about programming in C++ without being conscious of it, I had uncovered few enterprise-scale applications of MPD techniques. Neil Harrison and I had often discussed the relative failure of domain engineering techniques; the ideas look good on paper, but they seemed always to fall victim to unevenness in the scope and scale of parameters of variation within a given domain. People want to lump all variabilities into the same basket without pausing to think about scoping them, or about how to ascribe the right variabilities to the right abstractions. Most analysts mix parameters of variation from several domains into a single pot, failing to build on even the degree of domain knowledge they possess.

I sent Darren's 'blog to my colleague Theo D'Hondt. I was Theo's Ph.D. student from 1997 to 2000, during which time I both expanded and significantly refined the MPD concepts. I whimsically pointed out that Darren's post provided evidence that University research indeed had industrial benefits. Theo responded by saying that VUB was considering starting a "largish" program centered on variability in software development. So yet a third advance on the MPD front.

The fourth and final sign of encouragement came from my friend Trygve Reenskaug. Trygve is best known for his early work on object-oriented programming and in particular for having raised the importance of roles as an organizing design principle. Out of this work on roles came the famous Model-View-Controller architecture, as well as his OOram method, described in his great book Working with Objects: The OOram Software Engineering Method, which discusses object-oriented programming rather than the class-oriented programming about which one hears so much (see my 'blog on this topic and Trygve's follow-up comments). I hold him in awe, and what he says means much to me. I just got a short mail from Trygve on June 30 that said nothing more than:


Hi,
Been reading "Multi-Paradigm Design for C++". A truly impressive and
scholarly work!

Cheers
--Trygve
Trygve came from a culture close to the work of Kristen Nygaard, one of the fathers of Simula and object-oriented programming. I had given a copy of the book to Kristin while at ROOTS in mid-May 2001, about a year before his death. Coincidentally, I had just visited Trygve at his company in Oslo one week earlier (though the main topic there was patterns rather than multi-paradigm design).

So after thirteen years of skepticism and marginalization, MPD is suddenly coming into some limelight. Does this mean that MPD is maybe not a niche idea after all? And if so, what is going on in the world now that such recognition should start to blossom?

Whither Thirteen Years?

It was thirteen years ago that I made that first MPD presentation at the OBJEX conference in Boston. Thirteen years is an eternity in Internet years. What explains the potential newfound interest in MPD and the long gestation period?

I'd be interested in 'blog readers' thoughts on this. Please add your comments below. In the meantime I'd like to explore three possibilities: that it didn't market itself as a miracle cure, that it hit an optimally bad point in the world economy curve, and that there just might be a necessarily fixed gestation period for new ideas.

Sex Sells

Multi-paradigm design followed no real bandwagon of the time; it was a loner. And it was a loner not pretending to offer miracle cures. Object orientation had promised productivity and reuse; component-oriented programming had later offered the same; CORBA was to offer the miracle of transparent distribution. Promising to do the impossible (none of these technologies ever delivered on their claim -- read here about the failure of CORBA) gained mindshare. It is not necessary to have a great product for it to sell; only that people believe it could be a great product. Sex sells, and it has become increasingly important to be sexy in software.

If MPD fit any emerging design category, it would be domain engineering -- which at the time was relegated mostly to large military projects that lived in the shadows of the object-oriented stuff. It was old hat, dating back to the 1970s. No sane opportunist would trace the roots of their key ideas back to that era -- one would be perceived as way behind the curve or as a fool. Ever trying to avoid being the opportunist, I really didn't care, and I could see the value of the techniques. In the book's preface I say: "I never considered titles containing the words pattern, object, CORBA, component, or Java."

The Generative Programming book by Ulrich Eisenecker and Krzysztof Czarnecki was one of the few popular books whose roots go back to an MPD seminar I gave in Germany in the early 1990s. Ulrich attended that seminar, and it moved him to broaden beyond object orientation and deepen the foundations of what we know as Generative Programming today; much of the work was taken up by his student Krzysztof. A whole generative movement and conference series ensued, but it never took on the same proportions as objects, patterns, or components had ever achieved. It wasn't sex, and a lot of people didn't find it pretty.

One of the first things you learn in marketing is that sex sells, and in this era of object-oriented sexiness Domain Engineering looked like a wrinkled 80-year-old. Indeed the entire software industry is driven by novelty. In MPD, the ties to C++ were novel, but C++ itself had already been around for a decade. The design technique was powerful and the translation from design to implementation was novel, but there was simply no way to make it fit Rose or Software Through Pictures, the big draws of the time. As such, it sat alone, unheralded by other works because of their ideological remoteness.

Instead, MPD is a technique of first principles. It is not the only set of consistent, broad principles of design that one can imagine, and indeed other systems of design are equally tractable as foundations for learning. But it is a fairly simple and straightforward way of thinking about design that, while eschewing the buzzwords, easily can be reconciled with the deepest principles of OOA, OOD, the GoF "patterns," and of course domain engineering. Even better, MPD provided an integrative framework that showed the relationship between these perspectives. That's systems thinking: thinking about relationships between basic concepts from diverse viewpoints. It is difficult to appreciate MPD without having lived through a few life cycles of a large system, during which time one comes to learn MPD principles the hard way.

One reason that sex sells is its universality: everyone feels they know what it is. (There is the old joke that OOP is like teenage sex: everyone thinks they discovered it, and that they do it well, but the reality is otherwise.) If something is outside the comfort zone of the buzzwords of the day, it means that not many people are doing it. That makes it appear risky. Managers want not only low risk but would also be happy with a miracle or two. That's why people who say "Analysis and Implementation Cannot be Unified" will never be listened to. (By the way, what the title says is wrong, but everything in the article is right.)

Last, MPD is open-ended. It is not a methodology or a religion, and is designed to be "universal" in a way that accommodates a large spectrum of programming languages and lower-level design techniques (such as object orientation). Open-ended techniques make people suspicious because they are easily led to believe that generality and leverage are mutually exclusive. The cool thing about MPD is that it powerfully spans the entire spectrum of analysis, design, and programming -- programming at the very lowest level. Yet it is suggestive rather than prescriptive. It is, to my mind, everything that a software guideline should be. I think that part of the problem is that people -- or their managers -- want something more prescriptive. If you look at the ideas that have gained mindshare in our industry in the past years (which is not the same as the ideas that were successful) they have a high control component. Extreme Programming -- one of the highest-ceremony methods in existence -- is more popular than SCRUM for this reason. C++ (with its compile-time type system) was more popular than Smalltalk (which put more of the power of the type system in the hands of the programmer) for this reason. And object-oriented design (which stubbornly insisted on reducing all design constructs to objects) overshadowed MPD (which was viewed as being more laissez-faire).

Design Method and Economics

I was discussing the renewed interest in MPD with Gertrud Bjørnvig, asking: why now? She asked what year the work first appeared, and I told her. She conjectured that work that appears during an economic downturn will always be investigated for conformity to the status quo: people are risk-averse in economically hard times. The first presentation of MPD was in 1993, at the lowest point of European GDP volume in many years (see Figure 2 in this diagram). The book hit the bookshelves in 1998, right in the middle of a slowdown in the world economy. There was an upturn in 2000, but all eyes probably turned to the software ideas that were emerging concurrent with that upturn: it was in 2001 that Stan Lippman would turn from his C++ programming roots, yield to the fashion of C#, and publish a primer on the language, and that Alexandrescu would open up the practical world of template programming to the C++ masses through his Modern C++ Design.

Today, with the economic signs stronger than they have been over the past five years, industry is looking to invest a bit more in the future and to take more risks to outpace competition. That might spark the spirit of innovation and exploration which in turn fuels interest in alternative technology.

The Paradigm Shift Lag

I have often heard it said (but, strangely enough, cannot track down an authoritative citation) that it takes a great idea 13 years to reach adoption after it is first introduced. Maybe the source was David Wheeler at AT&T, who used to maintain that it took an idea 15 years from the time it appeared in a research lab until it became mainstream. Such seems to have been the case with object-oriented programming (Simula in 1967; Smalltalk-80 in 1980), automatic transmissions (first introduced by Oldsmobile in 1940 but not common until the 1950s), and many others. (I'd be interested in hearing about more that you can find.) Maybe this corresponds to the time it takes an old generation to die out, or to be displaced far enough by early adopters of a new technology that it overcomes the inertia of earlier technologies.

From its "coming out" in 1993 to the present day is also about 13 years. Perhaps this wine's time has come, and that we're seeing echoes of it across the board. I'll stand up and take a risk here by saying that C++ had a remarkably prescient view of software design. In a course Tom Cargill designed for Technology Exchange Corporation -- Addison-Wesley's teaching arm back in the 1980s -- he showed how most languages of the time were in fact slowly converging on a relatively stable C++ worldview. Maybe this trend has since converged and yielded to another (it's hard to argue that Python fits this model, for example -- and I will say more about Python later) but it was arguably true at the time.

The problem with being great and alone is that you stand alone. No other popular languages of the time featured the expressiveness of C++. Ada was taking a shot at it but would take too long to come to fruition, and died for lack of cultural compatibility with the large C language base out there. Java was far too immature then and C# was still to have its coming out at the turn of the millennium. From a programming language view C++ was culturally compatible with C in an upward-compatible way, but was not ideologically compatible with any other popular language of the time.

Also, in the 1990s -- as has been true in most software design thinking in the industry since the 1980s -- managers believed that one size fits all. UML is a great example of this disease. They were looking for a common design method that would insulate them from changes in technology. The common theme of the day was objects, and the belief (wildly misplaced) that OOD could save you from the pains of migrating from C++ to Java was a major foundation of management religion. To adopt a design strategy that depended on C++ married you to C++, and managers could easily fear that such inflexibility carried too much risk.

If one looks at a handful of popular languages today, one can argue that Cargill's observation of language convergence has become even more true than he may have predicted. The differences between C#, Java and C++ are much smaller today than were the differences between the main competing languages (Ada, Java, Visual Basic and C++? you decide) in the early 1990s. It is now easier to see, with confidence, how multi-paradigm design transcends the languages of the new millennium. In fact, it was always a broadly applicable technique if you were willing to replace the expressiveness of the C++ type system with some manual discipline, but discipline doesn't sell, either. A design based on MPD would not only survive a language conversion but might help you decide, at the outset, what set of language features (and hence which language) might be best for developing a given product. While such choices were limited at the outset, the development world has slowly converged on this worldview, and viable implementation choices for an MPD design abound.

I have honestly never heard such discussions at an explicit level and doubt that any such rationale has risen to a conscious level of consideration in any project -- the SunGard case above being the only exception I know about. Maybe Microsoft has a conscious MPD strategy that one sees reflected in their hybrid combination of languages in the Windows platform; however, such considerations don't really touch the application programmer. Nonetheless, the ideas have been in the air over the past ten years, and that's enough to nudge decisions one way or the other.

The Next Installment

In the next installment I'll discuss how this thirteen-year hiatus affects curriculum planning, and what academia can do both to build a more general education framework and to be able to integrate new ideas as they emerge.


David Brabant

Posts: 1
Nickname: lisp4ever
Registered: Aug, 2006

Re: Signs of the Next Paradigm Shift: Part I Posted: Aug 1, 2006 1:40 AM
Hi James,

Just a minor correction: this is Kristen Nygaard, not Kristin.

Roland Pibinger

Posts: 93
Nickname: rp123
Registered: Jan, 2006

Re: Signs of the Next Paradigm Shift: Part I Posted: Aug 1, 2006 11:45 AM
The problem of multi-paradigm design is that it requires very knowledgeable and proficient designers. People who can evaluate the "right" solution from a set of different and conflicting paradigms. It's an approach for gurus.
BTW, the concept of paradigm stems from Kuhn. It describes the prevailing basic assumptions in a scientific community. Strictly speaking, 'multi-paradigm' is a contradiction. At any one time there is only one dominating paradigm. A new paradigm replaces the old through a 'scientific revolution'.

Alex Stojan

Posts: 95
Nickname: alexstojan
Registered: Jun, 2005

Re: Signs of the Next Paradigm Shift: Part I Posted: Aug 1, 2006 3:07 PM
> The problem of multi-paradigm design is that it requires
> very knowledgeable and proficient designers. People who
> can evaluate the "right" solution from a set of different
> and conflicting paradigms. It's an approach for gurus.

I'm not sure I agree with this - I would think that multi-paradigm design is more natural than trying to fit everything in a single-paradigm approach, like OOD.

Bjarne Stroustrup

Posts: 60
Nickname: bjarne
Registered: Oct, 2003

Re: Signs of the Next Paradigm Shift: Part I Posted: Aug 1, 2006 5:56 PM
That's Kuhn's opinion, but I'm not sure that it makes sense even for the areas where he applied it. It certainly doesn't fit the way people brandish the word "paradigm" around in the computing world these days. I have seen discussions that listed 14 "paradigms" supported by C++ alone. I consider "paradigm" a pretentious word that I prefer to avoid, but it is hard to avoid a commonly used word. If I use it, I definitely don't use it with the implication that there is only one dominant paradigm and that it is then magically completely replaced by a newer one, leaving nothing valuable behind.

Kristen Nygaard answered the question "what will replace OO?" with "what replaced addition? It wasn't replaced by multiplication; it just found a proper place in the more advanced scheme of things; so will OO". For that to make sense, you of course need a sensible concept of OO -- as Kristen had.

I suspect that we won't have our minds sufficiently wrapped around "the next paradigm" until we can find a better name for it than "multi-paradigm", but until then "multi-paradigm" fits better than any other candidate that I can think of.

Bjarne Stroustrup

Posts: 60
Nickname: bjarne
Registered: Oct, 2003

Re: Signs of the Next Paradigm Shift: Part I Posted: Aug 1, 2006 5:59 PM
A nit: David Wheeler worked in Cambridge. If you don't know of him, look him up -- his contribution is amazing. For example, he wrote a serious and insightful paper on the design of libraries in 1951 (that's right, fifty-one).

disney

Posts: 35
Nickname: juggler
Registered: Jan, 2003

Re: Signs of the Next Paradigm Shift: Part I Posted: Aug 2, 2006 2:21 AM
Hi Cope,

I'm glad MPD seems to be finding an audience at last. About time! It's a book stuffed full of those things that are *so* obvious ... once someone else has written them down and placed them in front of you. Time to dig out my copy and read it again, methinks.

Roland Pibinger

Posts: 93
Nickname: rp123
Registered: Jan, 2006

Re: Signs of the Next Paradigm Shift: Part I Posted: Aug 2, 2006 3:14 AM
> Kristen Nygaard, answered the "what will replace OO" with
> "what replaced addition? It wasn't replaced by
> multiplication, it just found a proper place in the more
> advanced scheme of things; so will OO". For that to make
> sense, you of course need a sensible concept of OO - as
> Kristen had.
> I suspect that we won't have our minds sufficiently
> wrapped about "the next paradigm" until we can find a
> better name for it than "multi-paradigm", but until then
> "multi-paradigm" fits better than any other candidate that
> I can think of.

Consider two popular C++ "paradigms" (simplified):
P1: "Everything is an object"
P2: "Everything is a value"

The question quickly arises: When are objects appropriate and when values? To get an answer one must take an integrative point of view and find the "more advanced scheme of things". One has to transcend the contradictory, conflicting "multi-paradigm" perspective. That integrative point of view is sorely missing from C++ related discussions.
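
For what it's worth, one possible reading of that distinction, sketched in C++ (the names and types here are hypothetical, not Roland's): a value is copied and compared by content, while an entity-style object has identity, carries behaviour, and is referred to rather than copied.

#include <string>
#include <utility>

// A value type: compared and copied by content; identity is irrelevant.
struct Money {
    long long cents;
    bool operator==(const Money& other) const { return cents == other.cents; }
};

// An entity-style object: it has identity and a lifecycle, so copying is
// disabled and clients refer to it rather than duplicate it.
class Account {
public:
    explicit Account(std::string id) : id_(std::move(id)) {}
    Account(const Account&) = delete;
    Account& operator=(const Account&) = delete;
    void deposit(Money m) { balance_.cents += m.cents; }
    Money balance() const { return balance_; }
private:
    std::string id_;
    Money balance_{0};
};

int main() {
    Account acct("AC-1");
    acct.deposit(Money{2500});
    return acct.balance() == Money{2500} ? 0 : 1;   // 0 on success
}

When to use which is exactly the kind of question an integrative view would have to answer.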

Terje Slettebø

Posts: 205
Nickname: tslettebo
Registered: Jun, 2004

Re: Signs of the Next Paradigm Shift: Part I Posted: Aug 2, 2006 6:48 AM
> Consider two popular C++ "paradigms" (simplified):
> P1: "Everything is an object"
> P2: "Everything is a value"

I'm not sure I follow you, here. In C++ standards terminology, values of both built-in types (like "int") and user-defined types (classes) are considered "objects". The difference is only that one type is built-in, and the other is user-defined.

So... maybe you're using a terminology I'm not familiar with. Could you give examples?

Objects are typically classified according to the _kind_ of object they are, such as value objects, entity objects, etc. Maybe that's what you were thinking of?

> The question quickly arises: When are objects appropriate
> and when values?

Or maybe you make a distinction between built-in and user-defined types (it can seem so, here).

> To get an answer one must take an
> integrative point of view and find the "more advanced
> scheme of things". One has to transcend the contradictory,
> conflicting "multi-paradigm" perspective. That integrative
> point of view is sorely missing from C++ related
> discussions.

Unfortunately, I don't understand this part, without knowing what you put in the meaning of "object" and "value".

And about the thread topic: Yes, if MPD is getting renewed interest, I'd say it's about time, too!

Cope's book is a great book, and it complements nicely the "Generative Programming" tome of a book, with its focus on domain engineering and implementation.

Maxim Noah Khailo

Posts: 25
Nickname: mempko
Registered: Nov, 2004

Re: Signs of the Next Paradigm Shift: Part I Posted: Aug 2, 2006 7:43 AM
"multi-paradigm" certainly doesn't make sense if you think of paradigm the way Kuhn does. However, in the context of programming and software development, a paradigm is a model for solving some kind of problem (Although I unfortunately read Multi-Paradigm Design for C++ a long long time ago, so I may be thinking about paradigm in the wrong way now). OO is a model for solving problems dealing with "objects". Functional program deals with problems which are “functional” in nature, etc. “multi-paradigm” certainly makes sense in the context of software development as problems are never just dealing with objects or just dealing with math or functions. Problems can be complex and it is silly to try to use one paradigm to solve them.

One simple example is that there is a Math class in Java. It contains static methods for common mathematical functions such as sqrt (square root). However, it doesn't make any sense to think of math as an object, so the hack in Java is to make it a class with static methods. In C++ the same grouping can be accomplished with a namespace and plain functions. The C++ way is more closely related to the way we think about math than the Java way.
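
A minimal sketch of the C++ side of this contrast (the namespace and function names are hypothetical; the Java counterpart is the standard java.lang.Math class with static methods such as Math.sqrt):

#include <cmath>
#include <iostream>

// In C++ a mathematical grouping can be a namespace of plain functions;
// no class or object is needed just to hold them.
namespace mymath {
    double square_root(double x) { return std::sqrt(x); }
    double hypotenuse(double a, double b) { return std::sqrt(a * a + b * b); }
}

int main() {
    std::cout << mymath::hypotenuse(3.0, 4.0) << '\n';   // prints 5
}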

Maxim Noah Khailo

Posts: 25
Nickname: mempko
Registered: Nov, 2004

Re: Signs of the Next Paradigm Shift: Part I Posted: Aug 2, 2006 7:47 AM
Your post, James, makes me want to go back and reread your book. I read it first as a young lad in a library. Maybe I will pay the publishing gods homage and buy your book this time, as I am older and not so strapped for cash.

Roland Pibinger

Posts: 93
Nickname: rp123
Registered: Jan, 2006

Re: Signs of the Next Paradigm Shift: Part I Posted: Aug 2, 2006 7:58 AM
> Objects are typically classified according to the _kind_
> of object they are, such as value objects, entity objects,
> etc. Maybe that's what you were thinking of?

Yes. Unfortunately there is no uniformly used terminology (e.g. "value objects" are not objects in the OO sense).

Roland Pibinger

Posts: 93
Nickname: rp123
Registered: Jan, 2006

Re: Signs of the Next Paradigm Shift: Part I Posted: Aug 2, 2006 8:11 AM
> OO is a model for solving problems dealing
> with "objects". Functional program deals with problems
> which are “functional” in nature, etc. “multi-paradigm”
> certainly makes sense in the context of software
> development as problems are never just dealing with
> objects or just dealing with math or functions. Problems
> can be complex and it is silly to try to use one paradigm
> to solve them.

When you explain to someone why you use one paradigm for a certain problem (and not another) you already transcend 'multi-paradigm'. For that you need a "more advanced scheme of things".

Terje Slettebø

Posts: 205
Nickname: tslettebo
Registered: Jun, 2004

Re: Signs of the Next Paradigm Shift: Part I Posted: Aug 2, 2006 11:50 AM
> > OO is a model for solving problems dealing
> > with "objects". Functional program deals with problems
> > which are “functional” in nature, etc. “multi-paradigm”
> > certainly makes sense in the context of software
> > development as problems are never just dealing with
> > objects or just dealing with math or functions.
> Problems
> > can be complex and it is silly to try to use one
> paradigm
> > to solve them.
>
> When you explain to someone why you use one paradigm for a
> certain problem (and not another) you already transcend
> 'multi-paradigm'. For that you need a "more advanced
> scheme of things".

I'm curious to know what you're thinking of with this "more advanced scheme of things" :) -- could you elaborate?

Anyway, in languages supporting multiple paradigms, one important point is that the whole is more than the sum of the parts. C++ has been carefully designed to let the various features of the language work well together. With a multi-paradigm language, you may do things by combining paradigms that no paradigm by itself can handle. One example that has been mentioned in other places is iterating through a type-safe container (generic programming) while calling polymorphic member functions on the contained objects (OO).
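
A small sketch of that combination, in more modern C++ terms than were available when this was written (the Shape hierarchy is invented for illustration):

#include <iostream>
#include <memory>
#include <vector>

struct Shape {
    virtual ~Shape() = default;
    virtual void draw() const = 0;
};

struct Circle : Shape { void draw() const override { std::cout << "circle\n"; } };
struct Square : Shape { void draw() const override { std::cout << "square\n"; } };

int main() {
    std::vector<std::unique_ptr<Shape>> shapes;      // type-safe generic container (generic programming)
    shapes.push_back(std::make_unique<Circle>());
    shapes.push_back(std::make_unique<Square>());
    for (const auto& s : shapes)                     // STL-style iteration
        s->draw();                                   // dynamic dispatch (OO)
}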

As an aside, one thing I find especially interesting is the possibility of getting support for generic concepts in C++ (which I've mentioned in other threads, as well). This would allow compile-time checked "duck typing", where you don't have to specify the exact type for an interface, only what properties the type needs to have, as well as enabling arbitrarily fine-grained overloading.
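
For illustration, this is roughly what such compile-time checked "duck typing" looks like with the concepts feature that eventually shipped in C++20 -- only a proposal when this thread was written; the Drawable concept and the names are hypothetical:

#include <concepts>
#include <iostream>
#include <string>

// Drawable names the properties a type must have, not an exact type.
template <typename T>
concept Drawable = requires(const T& t) {
    { t.name() } -> std::convertible_to<std::string>;
};

struct Circle { std::string name() const { return "circle"; } };

// Any type satisfying Drawable is accepted; the check happens at compile time.
void render(const Drawable auto& shape) {
    std::cout << "drawing a " << shape.name() << '\n';
}

int main() { render(Circle{}); }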

Doug Holland

Posts: 1
Nickname: mvpdh
Registered: Aug, 2006

Re: Signs of the Next Paradigm Shift: Part I Posted: Aug 2, 2006 2:46 PM
> is the possibility of getting support for generic concepts in C++ ... This would allow compile-time checked "duck typing", where you don't have to specify the exact type for an interface, only what properties the type needs to have, as well as enabling arbitrarily fine-grained overloading.

Ruby does it too with some benefits, but I could argue either approach (duck vs static) depending on the project, developers and maintenance.
