My Most Important C++ Aha! Moments... Ever
Summary: In this article, Scott Meyers shares his picks for the five most meaningful Aha! moments in his involvement with C++, along with why he chose them.
25 posts.
Most recent reply: October 31, 2006 4:27 PM by Mike
    Bill
     
    Posts: 409 / Nickname: bv / Registered: January 17, 2002 4:28 PM
    My Most Important C++ Aha! Moments...Ever
    September 6, 2006 9:00 AM      
    In this article, Scott Meyers shares his picks for the five most meaningful Aha! moments in his involvement with C++, along with why he chose them.

    http://www.artima.com/cppsource/top_cpp_aha_moments.html

    What were your most important aha! moments in C++?
    • Jitendra
       
      Posts: 1 / Nickname: jkd20 / Registered: September 6, 2006 5:00 PM
      Re: My Most Important C++ Aha! Moments...Ever
      September 6, 2006 10:39 PM      
      My most important Aha! moment was when I came to know about function objects.

      The () operator can be overloaded, so an object can itself act as a function!
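
      A minimal sketch of the idea (the class name and threshold are illustrative):

          #include <algorithm>
          #include <iostream>
          #include <vector>

          // A function object: overloading operator() lets an instance be
          // called like a function, while carrying its own state.
          class GreaterThan {
          public:
              explicit GreaterThan(int threshold) : threshold_(threshold) {}
              bool operator()(int value) const { return value > threshold_; }
          private:
              int threshold_;
          };

          int main() {
              std::vector<int> v;
              for (int i = 0; i < 10; ++i) v.push_back(i);   // 0..9
              // The functor is passed by value and invoked like a function.
              std::cout << std::count_if(v.begin(), v.end(), GreaterThan(5))
                        << " values greater than 5\n";       // prints 4
              return 0;
          }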
    • Terje
       
      Posts: 33 / Nickname: tslettebo / Registered: June 22, 2004 8:48 PM
      Re: My Most Important C++ Aha! Moments...Ever
      September 7, 2006 8:27 AM      
      One of my biggest aha! moments with C++ came when I learned that you could do metaprogramming in C++ (from Todd Veldhuizen's work: http://osl.iu.edu/~tveldhui/papers).

      I was working with things like 3D graphics at the time, and compilers weren't very good at things like automatic loop unrolling and similar optimisations, so I was thrilled to learn that you could actually make the compiler do this yourself... as well as other program transformations and computations.

      As is now well known, C++ contains a Turing-complete language at compile time, which in theory makes any transformation or computation possible where the data is available at compile time.

      You may even do it partially at compile time and partially at run time, taking advantage of any information available at compile time to make execution more efficient or safer (the latter coming more from static typing than from metaprogramming).

      In a sense, the C++ template system - with its non-type template parameters - is a little like a dependent type system.
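
      A small compile-time sketch in that spirit (names are illustrative): template recursion unrolls a fixed-size dot product, so the "loop" is resolved entirely by the compiler:

          #include <iostream>

          // Recursive case: peel off one term; the compiler expands the whole
          // sum while instantiating templates - no run-time loop remains.
          template <int N>
          struct Dot {
              static double apply(const double* a, const double* b) {
                  return a[N - 1] * b[N - 1] + Dot<N - 1>::apply(a, b);
              }
          };

          // Base case: ends the compile-time recursion.
          template <>
          struct Dot<1> {
              static double apply(const double* a, const double* b) {
                  return a[0] * b[0];
              }
          };

          int main() {
              const double a[3] = { 1.0, 2.0, 3.0 };
              const double b[3] = { 4.0, 5.0, 6.0 };
              // Expands to a[2]*b[2] + a[1]*b[1] + a[0]*b[0] at compile time.
              std::cout << Dot<3>::apply(a, b) << "\n";   // prints 32
              return 0;
          }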
    • Niels
       
      Posts: 9 / Nickname: dekker / Registered: September 7, 2006 6:14 AM
      Re: My Most Important C++ Aha! Moments...Ever
      September 7, 2006 0:39 PM      
      Understanding how to write an exception-safe copy assignment operator using a swap helper function. My first attempts to implement a decent assignment for a non-trivial class failed miserably: I tried (quite literally!) to assign all of its data members, catching all possible exceptions, but I ended up with hopelessly corrupted objects. I'm pretty sure I first read about using a non-throwing swap to implement operator= in an article by Herb Sutter. My initial reaction was "huh?!?" But it was quickly followed by a big "Aha!" :-)
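
      The shape of that idiom, as a minimal sketch (the class and its members are illustrative):

          #include <string>
          #include <vector>

          class Widget {
          public:
              Widget& operator=(const Widget& rhs) {
                  Widget temp(rhs);   // copy: may throw, but *this is untouched
                  swap(temp);         // commit: a non-throwing swap
                  return *this;       // temp's destructor frees the old state
              }

              void swap(Widget& other) /* throw() */ {
                  name_.swap(other.name_);   // member swaps don't throw
                  data_.swap(other.data_);
              }

          private:
              std::string name_;
              std::vector<double> data_;
          };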
    • Max
       
      Posts: 18 / Nickname: mlybbert / Registered: April 27, 2005 11:51 AM
      Metaprogramming
      September 7, 2006 3:38 PM      
      I have to pick metaprogramming as well. And RAII. And "wow, this sure has a lot more power than Java does" which was more a Java Aha moment than a C++ Aha moment ("lemme write this quick one-off; should I use Java or C++? in C++ I'd ..., in Java I'd ..., wait a minute, Java gives me fewer choices!")
    • Glenn
       
      Posts: 3 / Nickname: gpuchtel / Registered: August 11, 2006 0:41 AM
      Re: My Most Important C++ Aha! Moments...Ever
      September 8, 2006 10:33 AM      
      While reading 'Object-Oriented Analysis and Design with Applications' by Grady Booch.
    • Sumant
       
      Posts: 2 / Nickname: sutambe / Registered: September 8, 2006 7:43 AM
      Re: My Most Important C++ Aha! Moments...Ever
      September 8, 2006 0:58 PM      
      My first Aha! moment was when I understood how the Singleton design pattern works by making the constructor private and providing a public static method. Another one was pretty recent, when I understood how an execute-around-method-like pattern can be implemented in C++ using the "Double Application of Smart Pointers", given here: http://www.aristeia.com/sdnotes_frames.html
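
      The Singleton shape, as a minimal sketch (the class name is illustrative):

          #include <iostream>

          class Logger {
          public:
              // The public static method is the only way to reach the instance.
              static Logger& instance() {
                  static Logger theInstance;   // constructed on first use
                  return theInstance;
              }
              void log(const char* message) { std::cout << message << "\n"; }

          private:
              Logger() {}                       // private: no outside construction
              Logger(const Logger&);            // declared but never defined,
              Logger& operator=(const Logger&); //   so copying is impossible
          };

          // Usage: Logger::instance().log("starting up");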
    • Nemanja
       
      Posts: 40 / Nickname: ntrif / Registered: June 30, 2004 1:10 AM
      Re: My Most Important C++ Aha! Moments...Ever
      September 8, 2006 8:15 AM      
      Probably RAII. I don't think any other idiom changed my programming style as much.
      • Roland
         
        Posts: 25 / Nickname: rp123 / Registered: January 7, 2006 9:42 PM
        Re: My Most Important C++ Aha! Moments...Ever
        September 8, 2006 11:58 PM      
        > Probably RAII. I don't think any other idiom changed my
        > programming style as much.

        The same for me!
        Private copy constructors and assignment operators are closely related to RAII. But the rest of the Aha-list (2 misnomers, a trick that "had little impact") is rather strange.
    • Vincent
       
      Posts: 40 / Nickname: vincent / Registered: November 13, 2002 7:25 AM
      Re: My Most Important C++ Aha! Moments...Ever
      September 7, 2006 1:02 AM      
      Enough already! It's like hearing the same joke over and over.

      What's next? The Most Important Piece of Cable that's ever carried C++?

      (Please tell me we're not going to go through this a dozen times for every language.)
      • Matt
         
        Posts: 62 / Nickname: matt / Registered: February 6, 2002 7:27 AM
        Re: My Most Important C++ Aha! Moments...Ever
        September 14, 2006 2:29 PM      
        > Enough already! It's like hearing the same joke over and over.

        Well, if you had RTFA, you'd have seen this at the end:

        Okay, that’s it, the last of my “five lists of five.” For the record, here’s a summary of this series of articles: what I believe to be the five most important books, non-book publications, pieces of software, and people in C++ ever, as well as my five most memorable “Aha!” moments. I’ll spare you another such exercise in narcissism for at least another 18 years.

        So anyway, these two strings walk into a bar...
    • Achilleas
       
      Posts: 98 / Nickname: achilleas / Registered: February 3, 2005 2:57 AM
      Re: My Most Important C++ Aha! Moments...Ever
      September 7, 2006 7:14 AM      
      I never had any aha moments...C++ was always a struggle for me...an endless chase of pointers, analysis of crashes, and things like that!
      • Roland
         
        Posts: 25 / Nickname: rp123 / Registered: January 7, 2006 9:42 PM
        Re: My Most Important C++ Aha! Moments...Ever
        September 9, 2006 0:03 AM      
        > I never had any aha moments...C++ was always a struggle
        > for me...an endless chase of pointers, analysis of
        > crashes, and things like that!

        Absorb RAII and the struggle will go away.
        • Cameron
           
          Posts: 26 / Nickname: cpurdy / Registered: December 23, 2004 0:16 AM
          Re: My Most Important C++ Aha! Moments...Ever
          September 10, 2006 1:45 PM      
          > > I never had any aha moments...C++ was always a struggle
          > > for me...an endless chase of pointers, analysis of
          > > crashes, and things like that!

          > Absorb RAII and the struggle will go away.

          OMG. I went and looked up RAII and was [hardly?] surprised to see that it is nothing but "clean up in a destructor". Wow, that was a waste of a half hour of reading.

          What next? Maybe we could use CFront to add methods to structs? ;-)

          The problem in large-scale C++ wasn't cleaning up stuff in a destructor, it was knowing when you could (and when you had to) destroy things. Modern platforms (i.e. what we've been using for the past 10 years) automatically handle this for us perfectly -- in most cases.

          Peace.
          • Terje
             
            Posts: 33 / Nickname: tslettebo / Registered: June 22, 2004 8:48 PM
            Re: My Most Important C++ Aha! Moments...Ever
            September 10, 2006 2:42 PM      
            > > > I never had any aha moments...C++ was always a
            > struggle
            > > > for me...an endless chase of pointers, analysis of
            > > > crashes, and things like that!
            >
            > > Absorb RAII and the struggle will go away.
            >
            > OMG. I went and looked up RAII and was [hardly?] surprised
            > to see that it is nothing but "clean up in a destructor".
            > Wow, that was a waste of a half hour of reading.
            >
            > The problem in large-scale C++ wasn't cleaning up stuff in
            > a destructor, it was knowing when you could (and when you
            > had to) destroy things. Modern platforms (i.e. what we've
            > been using for the past 10 years) automatically handle
            > this for us perfectly -- in most cases.

            And so can C++, but from your posting it seems it may be a while since you've looked at C++, and that you may be judging it by how it was 10 or more years ago. Things have changed a lot since then. It would be like judging today's Java against your experience with Java 1.0.

            If you're referring to garbage collection, then (unless you count things like reference counting - which C++ is perfectly able to handle - as garbage collection) collectors don't - ever - destroy things... Finalisers? Forget it; they are not guaranteed to run, ever, and even if they were, there's not much useful they could do.

            People who confuse lifetime management with memory management, and think GC handles it all, are in for a rude awakening... GC only handles the latter, and isn't necessarily even the best way of doing that. In other words, GC only handles memory. For many other resources (such as file handles, graphics contexts, etc.), you need to manage lifetimes as well, and GC gives you no help with that. Using RAII, you can handle both and get deterministic lifetimes, without ugly try-finally clauses (which really only handle _one_ resource at a time, too).

            Besides, as has been said many times, it's perfectly possible to use a garbage collector for C++. However, there are often better ways of doing things.
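
            A minimal RAII sketch to make that concrete (the class is illustrative): the constructor acquires a non-memory resource and the destructor releases it deterministically - exactly what finalisers cannot promise:

                #include <cstdio>
                #include <stdexcept>

                class File {
                public:
                    explicit File(const char* path) : fp_(std::fopen(path, "r")) {
                        if (!fp_) throw std::runtime_error("cannot open file");
                    }
                    ~File() { std::fclose(fp_); } // runs on scope exit, even on exceptions
                    std::FILE* get() const { return fp_; }

                private:
                    File(const File&);            // non-copyable: one owner per handle
                    File& operator=(const File&);
                    std::FILE* fp_;
                };

                void process() {
                    File f("data.txt");           // resource acquired here
                    // ... use f.get(); an exception below still closes the file
                }                                 // resource released here, deterministically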
          • Roland
             
            Posts: 25 / Nickname: rp123 / Registered: January 7, 2006 9:42 PM
            Re: My Most Important C++ Aha! Moments...Ever
            September 11, 2006 0:48 AM      
            > > Absorb RAII and the struggle will go away.
            >
            > OMG. I went and looked up RAII and was [hardly?] surprised
            > to see that it is nothing but "clean up in a destructor".
            > Wow, that was a waste of a half hour of reading.

            You haven't understood RAII. 'Waste' another half hour to get it.

            > The problem in large-scale C++ wasn't cleaning up stuff in
            > a destructor, it was knowing when you could (and when you
            > had to) destroy things. Modern platforms (i.e. what we've
            > been using for the past 10 years) automatically handle
            > this for us perfectly -- in most cases.

            RAII means automatic and deterministic resource management. Compared to Java, this saves 99% of all try/catch/finally blocks. Allocation and deallocation of resources (any resources, not just new-ed objects) disappear from the surface level of your code (see: http://www.artima.com/intv/modern3.html ). C++ has one asset, one advantage over newer (modern?) platforms: RAII.
          • Morgan
             
            Posts: 37 / Nickname: miata71 / Registered: March 29, 2006 6:09 AM
            When I realized that the name is a bug
            September 12, 2006 1:02 PM      
            It should be ++C.

            C++ is the same result as C, just with a side effect you may not notice till later. Apparently even the designers can't deal with the complexities of the language.
            • Terje
               
              Posts: 33 / Nickname: tslettebo / Registered: June 22, 2004 8:48 PM
              Re: When I realized that the name is a bug
              September 13, 2006 0:52 PM      
              > It should be ++C.
              >
              > C++ is the same result as C, just with a side effect you
              > may not notice till later. Apparently even the designers
              > can't deal with the complexities of the language.

              Yet more perpetuation of the myth that "simple language" = "simple programs"... It's simply not true.

               The essential complexity has to go somewhere, and if it isn't in the language or library, it'll be in the programs - all the programs. I'd rather have it in the language/library, so that I can concentrate on the task at hand and express my intention clearly in the language, thanks to its richness of abstractions.

               Investing in learning an expressive language like C++ really well pays back handsomely later.

              Is English a complex or big language? Would you be better off with fewer words? Reducing it to a few "grunt", "grunt" words surely must be the ultimate in simplicity! Think of the productivity gain from that... or maybe not. :)

              There's a difference between simple and simplistic, and too many people confuse them.
            • Bjarne
               
              Posts: 48 / Nickname: bjarne / Registered: October 17, 2003 3:32 AM
              Re: When I realized that the name is a bug
              September 13, 2006 6:55 PM      
              > It should be ++C.
              >
              > C++ is the same result as C, just with a side effect you
              > may not notice till later. Apparently even the designers
              > can't deal with the complexities of the language.

              Congratulations! You must be about the 100,000th person to note the meaning of ++C and also one of the innumerable people who didn't bother checking the FAQ before trying to show off their cleverness: http://www.research.att.com/~bs/bs_faq.html#name

              After 20+ years, this gets a bit tedious.

              -- Bjarne Stroustrup; http://www.research.att.com/~bs
          • Matt
             
            Posts: 62 / Nickname: matt / Registered: February 6, 2002 7:27 AM
            Re: My Most Important C++ Aha! Moments...Ever
            September 14, 2006 2:38 PM      
            > > > I never had any aha moments...C++ was always a
            > struggle
            > > > for me...an endless chase of pointers, analysis of
            > > > crashes, and things like that!
            >
            > > Absorb RAII and the struggle will go away.
            >
            > OMG. I went and looked up RAII and was [hardly?] surprised
            > to see that it is nothing but "clean up in a destructor"... [snip]

            Actually, it is shocking how much code there is in existence that claims to be C++ code, but uses handles and error codes exclusively. I would guess the idea of RAII is unknown to a large percentage of people who call themselves C++ programmers.
    • Todd
       
      Posts: 27 / Nickname: tblanchard / Registered: May 11, 2003 10:11 AM
      I have a few
      September 28, 2006 6:38 PM      
      The first was when I realized that the compiler was acting somewhat like a backward-chaining rule system, substituting types based on the available constructors and type-conversion operators. If it could find a path to synthesize the type required by a function, it would synthesize one at any cost.
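
      A small sketch of what I mean (the types are illustrative): given a call that needs some type, the compiler hunts through converting constructors and conversion operators for a path, synthesizing temporaries along the way:

          #include <iostream>
          #include <string>

          struct Name {
              Name(const char* s) : text(s) {}        // converting ctor: const char* -> Name
              std::string text;
          };

          struct Badge {
              Badge(const Name& n) : label(n.text) {} // converting ctor: Name -> Badge
              std::string label;
          };

          void admit(const Badge& b) { std::cout << "admitting " << b.label << "\n"; }

          int main() {
              Name n("alice");
              admit(n);        // the compiler synthesizes a temporary Badge from n
              // admit("bob"); // would NOT compile: at most one user-defined
              //               // conversion is allowed in an implicit chain
              return 0;
          }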

      The second was when I was writing endless amounts of binding code to bind UI components to models and it occurred to me that the C++ compiler was throwing away the most useful information in the program.

      The third was when I had decided that the language was a total productivity trainwreck and then spent some time pondering why such an awful language attracted such smart people as fans. My conclusion: C++ programming requires a level of mastery of complexity on par with chess. C++ is the ultimate programmer's chess game with a relatively simple set of rules resulting in a combinatorial explosion of nuances, behaviors, and design options. It is intellectually rewarding to master the subtleties and produce slick code (for C++), new types, clever allocation schemes managed by object lifetimes and all the rest.

      But in the end, it is still a productivity sink and a rotten language in which to GET THINGS DONE.
    • Cleo
       
      Posts: 6 / Nickname: vorlath / Registered: December 16, 2005 1:35 AM
      Re: My Most Important C++ Aha! Moments...Ever
      September 7, 2006 8:08 PM      
      My best Aha! moment wasn't directly related to C++ although it uses it. I realised that programming doesn't need functions, execution points or any execution control statements at all. These things actually make programming restrictive. This makes the Turing machine irrelevant and makes your software instantly portable without any VM or anything like that. You just need a small 'kernel' that will handle the low level execution handling for you which will then be linked into your software's executable. In other words, it'll handle the Turing machine side of it. This can be written in C++ or whatever language. It's a 30 year old idea that will probably take another 30 years to come to fruition by the looks of things.
      • Terje
         
        Posts: 33 / Nickname: tslettebo / Registered: June 22, 2004 8:48 PM
        Re: My Most Important C++ Aha! Moments...Ever
        September 10, 2006 2:46 AM      
        > My best Aha! moment wasn't directly related to C++
        > although it uses it. I realised that programming doesn't
        > need functions, execution points or any execution control
        > statements at all. These things actually make programming
        > restrictive. This makes the Turing machine irrelevant and
        > makes your software instantly portable without any VM or
        > anything like that. You just need a small 'kernel' that
        > will handle the low level execution handling for you which
        > will then be linked into your software's executable. In
        > other words, it'll handle the Turing machine side of it.
        > This can be written in C++ or whatever language. It's a
        > 30 year old idea that will probably take another 30
        > years to come to fruition by the looks of things.

        This sounds interesting, could you elaborate on it, or give any pointers to where to learn about this idea?
        • Cleo
           
          Posts: 6 / Nickname: vorlath / Registered: December 16, 2005 1:35 AM
          Re: My Most Important C++ Aha! Moments...Ever
          September 27, 2006 11:33 AM      
          > > My best Aha! moment wasn't directly related to C++
          > > although it uses it. I realised that programming
          > doesn't
          > > need functions, execution points or any execution
          > control
           > > statements at all. [...] It's a
           > > 30 year old idea that will probably take another 30
           > > years to come to fruition by the looks of things.
          >
          > This sounds interesting, could you elaborate on it, or
          > give any pointers to where to learn about this idea?

          I'm talking about specifying instructions on data and not the machine. It can come in any form. Right now, only data flow "languages" come close, but not completely. I took it for granted. I didn't understand the power that this gives.

          I don't know where I could give you a link because there's nothing out there that fully exploits this power. I'm working on a tool for this though.

          The simplest way I could describe this is by comparison. When we add two numbers, we take for granted that we're asking the computer to do it.

          a = b+c;

          is actually

          a = CPU->Add(b,c);

          I don't like that. I'd rather specify that b and c should be added and sent to a.

           (b,c)->ADD->(a).

          Looks like data flow, but that's just the tip of the iceberg. BTW, the variables are streams, so TONS of data can be pumped through (hence parallel). Data flow doesn't take advantage of the true power that comes with this. It would take a book to explain all the stuff that is possible with this that conventional languages and even data flow can't do.

          I found the majority of people think that the two Add statements above are the same. They're not even equivalent, but that's irrelevant. One is machine specific (low level), the other is not. This seemingly small detail in the previous sentence can open up a whole new area in the computing industry. Understanding this was my AHA! moment.
          • Terje
             
            Posts: 33 / Nickname: tslettebo / Registered: June 22, 2004 8:48 PM
            Re: My Most Important C++ Aha! Moments...Ever
            October 2, 2006 5:36 AM      
            > > > My best Aha! moment wasn't directly related to C++
            > > > although it uses it. I realised that programming
            > > doesn't
            > > > need functions, execution points or any execution
            > > control
             > > > statements at all. [...] It's a
             > > > 30 year old idea that will probably take another 30
             > > > years to come to fruition by the looks of things.
            > >
            > > This sounds interesting, could you elaborate on it, or
            > > give any pointers to where to learn about this idea?
            >
            > I'm talking about specifying instructions on data and not
            > the machine. It can come in any form. Right now, only
            > data flow "languages" come close, but not completely. I
            > took it for granted. I didn't understand the power that
            > this gives.
            >
            > I don't know where I could give you a link because there's
            > nothing out there that fully exploits this power. I'm
            > working on a tool for this though.
            >
            > The simplest way I could describe this is by comparison.
             > When we add two numbers, we take for granted that we're
             > asking the computer to do it.
            >
            > a = b+c;
            >
            > is actually
            >
            > a = CPU->Add(b,c);

             Well, not really, from a functional-programming point of view: "a = b+c" _declares_ that a is the sum of b and c; no instruction is given about how to do that, or in which order (relative to other things). However, that distinction may not be important in this context.

            That's also something like a 30 year old idea.

             Indeed, declarative programming (such as functional programming) is recognised as a powerful way of doing things, but like other "paradigms", there are areas where it can be hard to write things in a purely declarative style.

             HTML is an example: it's declarative (it only specifies that something is a header, something else is a paragraph, etc., and says nothing about how it's supposed to look, unless you use presentational markup). For behaviour, however, JavaScript - once again an imperative language - is typically used.

             However, recent developments of HTML (XHTML 2.0) aim to take care of 90% or so of what we now use scripting for (validation, calculations, etc.) using declarative constructs.

            It seems you're talking about something quite different, though.

            > I don't like that. I'd rather specify that b and c should
            > be added and sent to a.
            >
             > (b,c)->ADD->(a).
            >
            > I found the majority of people think that the two Add
            > statements above are the same. They're not even
            > equivalent, but that's irrelevant. One is machine
            > specific (low level), the other is not.

            Regarding the last paragraph: The same can be said for the distinction between imperative and declarative programming: a=b+c is a statement about the relationship between a, b and c, but it _can_ also be viewed as the instruction: "Add b and c, and assign the result to a".

            > This seemingly
            > small detail in the previous sentence can open up a whole
            > new area in the computing industry. Understanding this
            > was my AHA! moment.

            I see. Hm, you say the variables are _streams_... I still have a little difficulty understanding what is meant by the pseudo-code you gave. How is it different from the FP-interpretation of a=b+c?
    • Mike
       
      Posts: 2 / Nickname: mikepetry / Registered: April 28, 2005 10:31 AM
      Re: My Most Important C++ Aha! Moments...Ever
      October 31, 2006 4:27 PM      
      If you would like a C++ Aha moment, for kicks, consider developing an application in straight C.

      I may be doing some embedded work in the future, and much of that work is still done in C. I like C, but I have come to depend on the benefits of C++/OO, and I am a proponent of interface-based, polymorphic designs. How will I implement my beloved Factory Method and Strategy patterns instead of immense switch statements?
      We all know how modeling data as structs and then passing the structs to functions typically leads to code rot.
      I decided to use C structs with function-pointer members, passing the struct to each of its function members as a parameter (a la Python's self). I now feel a little better, but I still don't have automatic scope management of embedded pointers.
      Instead of interfaces I can use function pointers, but as I create all these functions I am faced with another peril: namespace collisions. I have to invent cheesy function names to get things to work in the one cluttered namespace!
      What about the Observer pattern - please tell me it's not so! I can once again lean on function pointers, but it gets a little messy with my scheme: I need to squirrel away a pair of values for each observer - the function to call and the struct to pass as a parameter. And handily, the struct can be cast to a void*. (This is not worrying at all.)
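
      A sketch of that scheme (all names are illustrative; plain C, which also compiles as C++):

          #include <stdio.h>

          typedef void (*observer_fn)(void* context, int event);

          /* Each observer is a pair: the function to call and the struct to
             pass back, squirreled away as a void* - the worrying cast. */
          struct observer {
              observer_fn notify;
              void* context;
          };

          struct subject {
              struct observer observers[8];
              int count;
          };

          void subject_attach(struct subject* s, observer_fn fn, void* ctx) {
              s->observers[s->count].notify = fn;
              s->observers[s->count].context = ctx;
              s->count++;
          }

          void subject_emit(struct subject* s, int event) {
              int i;
              for (i = 0; i < s->count; ++i)
                  s->observers[i].notify(s->observers[i].context, event);
          }

          struct logger { const char* name; };

          /* cheesy prefixed name, since everything shares one namespace */
          void logger_on_event(void* context, int event) {
              struct logger* self = (struct logger*)context;  /* the void* cast */
              printf("%s saw event %d\n", self->name, event);
          }

          int main(void) {
              struct subject s = { { { 0, 0 } }, 0 };
              struct logger  lg = { "log" };
              subject_attach(&s, logger_on_event, &lg);
              subject_emit(&s, 42);   /* prints: log saw event 42 */
              return 0;
          }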
      Maybe someone else already solved these problems for me. Oh yes and it is called C++.