My Most Important C++ Aha! Moments... Ever
Summary: In this article, Scott Meyers shares his picks for the five most meaningful Aha! moments in his involvement with C++, along with why he chose them.
Terje
Posts: 33 / Nickname: tslettebo / Registered: June 22, 2004 8:48 PM
Re: My Most Important C++ Aha! Moments...Ever
September 10, 2006 2:42 PM      
> > > I never had any aha moments...C++ was always a struggle
> > > for me...an endless chase of pointers, analysis of
> > > crashes, and things like that!
>
> > Absorb RAII and the struggle will go away.
>
> OMG. I went and looked up RAII and was [hardly?] surprised
> to see that it is nothing but "clean up in a destructor".
> Wow, that was a waste of a half hour of reading.
>
> The problem in large-scale C++ wasn't cleaning up stuff in
> a destructor, it was knowing when you could (and when you
> had to) destroy things. Modern platforms (i.e. what we've
> been using for the past 10 years) automatically handle
> this for us perfectly -- in most cases.

And so can C++, but it seems from your posting that it may be a while since you've looked at C++, and that you may be judging it based on how it was 10 or more years ago. Things have changed a lot since then. It would be like judging today's Java by your experience with Java 1.0.

If you're referring to garbage collection: unless you count things like reference counting (which C++ is perfectly able to handle) as garbage collection, collectors don't - ever - destroy things... Finalisers? Forget it; they are not guaranteed to run, ever, and even if they were, there's nothing much useful they could do.

People who confuse lifetime management with memory management, and think GC handles it all, are in for a rude awakening... GC only handles the latter, and isn't necessarily even the best way of doing that. In other words, GC only handles memory. For a lot of other resources (such as file handles, graphics contexts, etc.), you need to manage their lifetimes as well, and GC gives you no help with that. Using RAII, you can handle both and get deterministic lifetimes, without ugly try-finally clauses (which really only handle _one_ resource at a time, too).
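
To make it concrete, here's a minimal sketch of the idiom (FileGuard is a class I just made up for illustration, not something from a particular library):

#include <cstdio>
#include <stdexcept>

class FileGuard {
public:
    explicit FileGuard(const char* name) : f_(std::fopen(name, "r")) {
        if (!f_) throw std::runtime_error("open failed");  // acquisition is initialisation
    }
    ~FileGuard() { std::fclose(f_); }  // cleanup runs deterministically, at scope exit
    std::FILE* get() const { return f_; }
private:
    std::FILE* f_;
    FileGuard(const FileGuard&);             // non-copyable
    FileGuard& operator=(const FileGuard&);
};

void process() {
    FileGuard in("data.txt");  // resource acquired here
    // ... read via in.get() ...
}                              // and released here, even if an exception is thrown

The same shape works for locks, sockets, or anything else with a cleanup step; "clean up in a destructor" is the mechanism, deterministic lifetime is the point.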

Besides, as has been said many times, it's perfectly possible to use a garbage collector for C++. However, there are often better ways of doing things.
Roland
Posts: 25 / Nickname: rp123 / Registered: January 7, 2006 9:42 PM
Re: My Most Important C++ Aha! Moments...Ever
September 11, 2006 0:48 AM      
> > Absorb RAII and the struggle will go away.
>
> OMG. I went and looked up RAII and was [hardly?] surprised
> to see that it is nothing but "clean up in a destructor".
> Wow, that was a waste of a half hour of reading.

You haven't understood RAII. 'Waste' another half hour to get it.

> The problem in large-scale C++ wasn't cleaning up stuff in
> a destructor, it was knowing when you could (and when you
> had to) destroy things. Modern platforms (i.e. what we've
> been using for the past 10 years) automatically handle
> this for us perfectly -- in most cases.

RAII means automatic and deterministic resource management. Compared to Java, this saves 99% of all try/catch/finally blocks. Allocation and deallocation of resources (any resources, not just new-ed objects) disappear from the surface level of your code (see: http://www.artima.com/intv/modern3.html ). C++ has one asset, one advantage over newer (modern?) platforms: RAII.
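
A small made-up example of what that looks like when resources compose (Lock and Connection are toy classes, not from any library):

#include <cstdio>

struct Lock {
    Lock()  { std::puts("lock acquired"); }
    ~Lock() { std::puts("lock released"); }
};

struct Connection {
    Connection()  { std::puts("connection opened"); }
    ~Connection() { std::puts("connection closed"); }
};

// No cleanup code anywhere: members are destroyed automatically,
// in reverse declaration order, even if something throws.
// The Java equivalent is two nested try/finally blocks.
struct Transaction {
    Lock lock_;
    Connection conn_;
};

int main() {
    Transaction t;  // acquires the lock, then the connection
}                   // prints "connection closed", then "lock released"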
Morgan
Posts: 37 / Nickname: miata71 / Registered: March 29, 2006 6:09 AM
When I realized that the name is a bug
September 12, 2006 1:02 PM      
It should be ++C.

C++ is the same result as C, just with a side effect you may not notice till later. Apparently even the designers can't deal with the complexities of the language.
Terje
Posts: 33 / Nickname: tslettebo / Registered: June 22, 2004 8:48 PM
Re: When I realized that the name is a bug
September 13, 2006 0:52 PM      
> It should be ++C.
>
> C++ is the same result as C, just with a side effect you
> may not notice till later. Apparently even the designers
> can't deal with the complexities of the language.

Yet more perpetuation of the myth that "simple language" = "simple programs"... It's simply not true.

The essential complexity has to go somewhere, and if it isn't in the language or library, it'll be in the programs - all the programs. I'd rather have it in the language/library, so that I can concentrate on the task at hand, able to express my intention clearly in the language because of its richness in abstractions.

Investing in learning an expressive language like C++ really well pays back handsomely later.

Is English a complex or big language? Would you be better off with fewer words? Reducing it to a few "grunt", "grunt" words surely must be the ultimate in simplicity! Think of the productivity gain from that... or maybe not. :)

There's a difference between simple and simplistic, and too many people confuse them.
Bjarne
Posts: 48 / Nickname: bjarne / Registered: October 17, 2003 3:32 AM
Re: When I realized that the name is a bug
September 13, 2006 6:55 PM      
> It should be ++C.
>
> C++ is the same result as C, just with a side effect you
> may not notice till later. Apparently even the designers
> can't deal with the complexities of the language.

Congratulations! You must be about the 100,000th person to note the meaning of ++C and also one of the innumerable people who didn't bother checking the FAQ before trying to show off their cleverness: http://www.research.att.com/~bs/bs_faq.html#name

After 20+ years, this gets a bit tedious.

-- Bjarne Stroustrup; http://www.research.att.com/~bs
Matt
Posts: 62 / Nickname: matt / Registered: February 6, 2002 7:27 AM
Re: My Most Important C++ Aha! Moments...Ever
September 14, 2006 2:29 PM      
> Enough already! It's like hearing the same joke over and over.

Well, if you had RTFA, you'd have seen this at the end:

Okay, that’s it, the last of my “five lists of five.” For the record, here’s a summary of this series of articles: what I believe to be the five most important books, non-book publications, pieces of software, and people in C++ ever, as well as my five most memorable “Aha!” moments. I’ll spare you another such exercise in narcissism for at least another 18 years.

So anyway, these two strings walk into a bar...
Matt
Posts: 62 / Nickname: matt / Registered: February 6, 2002 7:27 AM
Re: My Most Important C++ Aha! Moments...Ever
September 14, 2006 2:38 PM      
> > > I never had any aha moments...C++ was always a struggle
> > > for me...an endless chase of pointers, analysis of
> > > crashes, and things like that!
>
> > Absorb RAII and the struggle will go away.
>
> OMG. I went and looked up RAII and was [hardly?] surprised
> to see that it is nothing but "clean up in a destructor"... [snip]

Actually, it is shocking how much code there is in existence that claims to be C++ code, but uses handles and error codes exclusively. I would guess the idea of RAII is unknown to a large percentage of people who call themselves C++ programmers.
Cleo
Posts: 6 / Nickname: vorlath / Registered: December 16, 2005 1:35 AM
Re: My Most Important C++ Aha! Moments...Ever
September 27, 2006 11:33 AM      
> > My best Aha! moment wasn't directly related to C++
> > although it uses it. I realised that programming doesn't
> > need functions, execution points or any execution control
> > statements at all. [...] It's a 30 year old idea that will
> > probably take another 30 years to come to fruition by the
> > looks of things.
>
> This sounds interesting, could you elaborate on it, or
> give any pointers to where to learn about this idea?

I'm talking about specifying instructions on data and not the machine. It can come in any form. Right now, only data flow "languages" come close, but not completely. I took it for granted. I didn't understand the power that this gives.

I don't know where I could give you a link because there's nothing out there that fully exploits this power. I'm working on a tool for this though.

The simplest way I could describe this is by comparison. When we add two numbers, we take for granted that we're asking the computer to do it.

a = b+c;

is actually

a = CPU->Add(b,c);

I don't like that. I'd rather specify that b and c should be added and sent to a.

(b,c)->ADD->(a).

Looks like data flow, but that's just the tip of the iceberg. BTW, the variables are streams, so TONS of data can be pumped through (hence parallel). Data flow doesn't take advantage of the true power that comes with this. It would take a book to explain all the stuff that is possible with this that conventional languages and even data flow can't do.

I found the majority of people think that the two Add statements above are the same. They're not even equivalent, but that's irrelevant. One is machine specific (low level), the other is not. This seemingly small detail in the previous sentence can open up a whole new area in the computing industry. Understanding this was my AHA! moment.
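
As a very rough approximation in conventional C++ terms (only a sketch of the shape; ordinary code like this still ends up telling the machine what to do):

#include <algorithm>
#include <functional>
#include <vector>

int main() {
    // b and c are streams of values, not single memory cells
    std::vector<int> b, c;
    for (int i = 0; i < 1000; ++i) {
        b.push_back(i);
        c.push_back(2 * i);
    }

    // (b,c)->ADD->(a): pump both streams through a single ADD node
    std::vector<int> a(b.size());
    std::transform(b.begin(), b.end(), c.begin(), a.begin(),
                   std::plus<int>());

    // conceptually the ADD node doesn't care when or in what order
    // the additions happen, which is where the parallelism comes from
    return 0;
}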
Todd
Posts: 27 / Nickname: tblanchard / Registered: May 11, 2003 10:11 AM
I have a few
September 28, 2006 6:38 PM      
The first was when I realized that the compiler was acting somewhat like a backward chaining rule system that kept substituting types based on available constructors and type conversion operators. If it could find a path to synthesize the type required by that function, it would synthesize one at any cost.
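
A made-up example of the kind of path-finding I mean:

#include <iostream>

// One converting constructor plus one conversion operator:
// two "rules" the compiler will happily chain together.
struct Celsius {
    Celsius(double d) : v(d) {}            // double -> Celsius
    operator double() const { return v; }  // Celsius -> double
    double v;
};

void log_temp(Celsius c) { std::cout << c.v << '\n'; }

int main() {
    log_temp(20);                  // int -> double -> Celsius, synthesized silently
    double d = Celsius(20) + 1.5;  // Celsius -> double, then built-in addition
    std::cout << d << '\n';
}

Neither call site mentions a conversion; the compiler found the path on its own.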

The second was when I was writing endless amounts of binding code to bind UI components to models and it occurred to me that the C++ compiler was throwing away the most useful information in the program.

The third was when I had decided that the language was a total productivity trainwreck and then spent some time pondering why such an awful language attracted such smart people as fans. My conclusion: C++ programming requires a level of mastery of complexity on par with chess. C++ is the ultimate programmer's chess game with a relatively simple set of rules resulting in a combinatorial explosion of nuances, behaviors, and design options. It is intellectually rewarding to master the subtleties and produce slick code (for C++), new types, clever allocation schemes managed by object lifetimes and all the rest.

But in the end, it is still a productivity sink and a rotten language in which to GET THINGS DONE.
Terje
Posts: 33 / Nickname: tslettebo / Registered: June 22, 2004 8:48 PM
Re: My Most Important C++ Aha! Moments...Ever
October 2, 2006 5:36 AM      
> > > My best Aha! moment wasn't directly related to C++
> > > although it uses it. I realised that programming doesn't
> > > need functions, execution points or any execution control
> > > statements at all. [...] It's a 30 year old idea that will
> > > probably take another 30 years to come to fruition by the
> > > looks of things.
> >
> > This sounds interesting, could you elaborate on it, or
> > give any pointers to where to learn about this idea?
>
> I'm talking about specifying instructions on data and not
> the machine. It can come in any form. Right now, only
> data flow "languages" come close, but not completely. I
> took it for granted. I didn't understand the power that
> this gives.
>
> I don't know where I could give you a link because there's
> nothing out there that fully exploits this power. I'm
> working on a tool for this though.
>
> The simplest way I could describe this is by comparison.
> When we add two numbers, we take for granted that we're
> asking the computer to do it.
>
> a = b+c;
>
> is actually
>
> a = CPU->Add(b,c);

Well, not really, from a functional programming view of things: "a = b+c" _declares_ that a is the sum of b and c; no instruction is given about how to do that, or in which order (compared to other things). However, that distinction may not be important in this context.

That's also something like a 30 year old idea.

Indeed, declarative programming (such as functional programming) is recognised as a powerful way of doing things, but as with other "paradigms", there are areas where it may be hard to write something in a purely declarative style.

We have the example of HTML, which is declarative: it only specifies that something is a header, another thing is a paragraph, etc., and says nothing about how it's supposed to look (unless you use presentational markup...). However, for behaviour, we're typically back to an imperative language again: JavaScript.

However, recent developments of HTML (XHTML 2.0) aim to take care of 90% or so of what we now use scripting for (validation, calculations, etc.) with declarative constructs.

It seems you're talking about something quite different, though.

> I don't like that. I'd rather specify that b and c should
> be added and sent to a.
>
> (b,c)->ADD->(a).
>
> I found the majority of people think that the two Add
> statements above are the same. They're not even
> equivalent, but that's irrelevant. One is machine
> specific (low level), the other is not.

Regarding the last paragraph: The same can be said for the distinction between imperative and declarative programming: a=b+c is a statement about the relationship between a, b and c, but it _can_ also be viewed as the instruction: "Add b and c, and assign the result to a".

> This seemingly
> small detail in the previous sentence can open up a whole
> new area in the computing industry. Understanding this
> was my AHA! moment.

I see. Hm, you say the variables are _streams_... I still have a little difficulty understanding what is meant by the pseudo-code you gave. How is it different from the FP-interpretation of a=b+c?
Mike
Posts: 2 / Nickname: mikepetry / Registered: April 28, 2005 10:31 AM
Re: My Most Important C++ Aha! Moments...Ever
October 31, 2006 4:27 PM      
If you would like a C++ Aha moment, for kicks, consider developing an application in straight C.

I may be doing some embedded work in the future and much of that work is still done in C. I like C but I have come to depend on the benefits of C++/OO and I am a proponent of interface-based / polymorphic designs. How will I implement my beloved Factory Method / Strategy Patterns instead of immense switch statements?
We all know how modeling data as structs and then passing the structs to functions typically leads to code rot.
I decide to use C structs with function pointer members, and I will pass the struct to each function member as a parameter (à la Python's explicit self). I now feel a little better, but I still don't have automatic scope management of embedded pointers.
Instead of interfaces I can use function pointers, but as I create all these functions I am faced with another peril: namespace collisions! I have to create cheesy function names to get things to work in the one cluttered namespace.
What about the Observer pattern - please tell me it's not so! I can once again lean on function pointers, but it gets a little messy with my scheme. I need to squirrel away a pair of values for each observer: the function to call, and the struct to pass as a parameter. And handily, the struct can be cast to a void*. (This is not worrying or anything.)
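
Something like this made-up sketch:

#include <stdio.h>

/* the pair of values squirreled away for each observer */
typedef void (*observer_fn)(void* self, int event);

struct observer {
    observer_fn notify;  /* the function to call */
    void*       self;    /* the struct to pass back, cast to void* */
};

struct logger { const char* name; };

void logger_notify(void* self, int event) {
    struct logger* l = (struct logger*)self;  /* and hope it really is one */
    printf("%s saw event %d\n", l->name, event);
}

int main(void) {
    struct logger log = { "log1" };
    struct observer obs = { logger_notify, &log };
    obs.notify(obs.self, 42);  /* the subject would loop over a list of these */
    return 0;
}

Not a virtual function in sight, and not a shred of type safety either.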
Maybe someone else already solved these problems for me. Oh yes and it is called C++.