> The choice of Scala is potentially contentious because Scala isn't as mature as Haskell or OCaml. However, Scala is chock full of support for advanced programming techniques, it is type safe, and it has a sane syntax.
Just to get a clarification of the above: it's probably not what you meant, but did you mean to imply that Haskell/OCaml aren't type safe, or don't have a sane syntax...? :) I mean, since it was apparently used as part of the justification for not choosing one of them.
> The order of these sections is not intended as hard and fast rules. I am hoping to design the book such that entire sections can be skipped or moved. For example, Pointers and Memory Addressing can be placed at the end, or left out entirely. Another possibility is that Functional Programming can be placed before Object Oriented Programming.
Just off the top of my head, and you've probably thought of this, but I'd start with _what_ a person would get out of this, rather than focusing on features first and use second. I.e. thinking from the "outside-in" rather than "inside-out", as it has been said.
For example, in "Accelerated C++" the authors start with the standard library, and this way let the reader start with something interesting and potentially useful right away. Thus, they consider what is needed for a particular task, and teach that.
For this reason, pointers aren't introduced until late in the book (if even then), after working with iterators and containers.
In other words, the order features are introduced is mainly motivated by what is needed or useful for a specific task, rather than how or where it might fit in some paradigm.
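To make that "task-first" idea concrete, here is a sketch of the kind of complete, useful first program such a book might open with (Python purely for illustration; the text, variable names, and task are all made up for this example). Nothing in it requires pointers, memory, or any notion of the underlying machine:

```python
# Count how often each word occurs in some text: a genuinely useful task,
# expressed entirely with high-level containers and loops.
text = "the quick brown fox jumps over the lazy dog the fox"

counts = {}                      # a dictionary mapping word -> occurrences
for word in text.split():        # split() breaks the text into words
    counts[word] = counts.get(word, 0) + 1

for word in sorted(counts):      # print the tally in alphabetical order
    print(word, counts[word])
```

The learner meets containers and iteration because the task demands them, not because a chapter on data structures says so.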
"Multi-Paradigm Programming in Leda" takes a similar approach, typically showing different ways to solve the same thing in different "paradigms" throughout the book, but then, that book is aimed specifically at teaching multi-paradigm programming. This seems to be the topic of point 5 in your list.
By the way, interesting posting, Cleo. I started with BASIC, too, and found it a very good way of being introduced to programming (really!).
For me, as a beginner, the discoveries were like (aided by a book, of course):
>PRINT 1+2
3
"Aha, so you can use it as a calculator." :)
>A=123
>PRINT A
123
"Aha, so you can store data in named 'somethings'. So 'A' is a kind of container for '123'..."
And so on...
You need to start with the absolute basics! (pun unintended :) )
From this, you learn about types (in BASIC, they are distinguished by the trailing sign: "A%" is an integer variable and "A$" is a string variable), and so on.
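The same discovery can be made in a modern beginner language, just with the type attached to the value instead of to the variable's name. A minimal sketch in Python (the variable names here are made up for illustration):

```python
# In BASIC the type is part of the variable's name (A% is an integer,
# A$ is a string). In Python the type belongs to the value, and the
# learner can simply ask a value what it is:
a = 123
s = "hello"

print(type(a).__name__)   # int
print(type(s).__name__)   # str
```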
I didn't learn about bits until much later, as I didn't need them at all earlier. You can learn about boolean operations without knowing anything about bits (which have more to do with the implementation).
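That point, boolean logic without any mention of bits, fits in a few lines. A sketch in Python (the scenario and names are invented for the example):

```python
# Boolean operations work on truth values, not bits: nothing here
# requires knowing how True and False are represented in memory.
raining = True
have_umbrella = False

stay_dry = (not raining) or have_umbrella
print(stay_dry)            # False: it's raining and we have no umbrella

# Comparisons produce booleans too:
print(3 < 5 and 5 < 10)    # True
```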
Two's complement and all that was something I got into _much_ later, when I started doing assembly programming, which is not really applicable at the beginner level... :)
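For contrast, here is the kind of detail that belongs at that later, assembly-adjacent stage. A small sketch (in Python, with a made-up helper name) of how two's complement represents a negative number in 8 bits:

```python
# Two's complement: to negate an 8-bit number, invert the bits and add 1.
def twos_complement_8bit(n):
    """Return the 8-bit two's-complement bit pattern of n, as an int."""
    return n & 0xFF          # masking to 8 bits yields the stored pattern

print(format(twos_complement_8bit(5), "08b"))   # 00000101
print(format(twos_complement_8bit(-5), "08b"))  # 11111011

# Check against the invert-and-add-1 rule:
print(((~5) + 1) & 0xFF == twos_complement_8bit(-5))  # True
```

This is exactly the sort of machine-level representation detail a beginner can happily postpone.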
However, I naturally agree that knowing "how things work" is a big advantage, but we're talking beginner-level, here... You can't really expect someone to start as a beginner, and end up as Bjarne Stroustrup, in _one_ book (then we're more talking stacks of books, and many years of study and practice).
This latter part was mostly in response to what Cleo wrote in his posting, and the quite advanced things he brought up there.
When "beginners" start asking how "PRINT" is implemented, and how they can do it, themselves, as well, then I'd say they aren't really "beginners", anymore... :)
Then we're talking advanced stuff like hardware architecture, assembly language, processors, I/O circuits, interrupts, and the like. Very interesting and rewarding to learn and play with, but hardly beginner stuff.
I'm afraid that if you start talking about that in a beginner book, you'll lose them real fast!
Beginners (and more advanced programmers, as well) will likely be more interested in what each feature does, and what you might use it for, rather than how it works, underneath. For the latter, you have another language: Assembly. :)
A simplified model of how a computer works can, however, be useful, and help put things in context.
> There needs to be a bridge between a language and the machine. If the student doesn't understand how the language gets the computer to do something, they must take it on faith. That makes a lot of people uncomfortable.
Well, like I said above, a simplified model may convey this, and if they want further details, they're of course free to study that. However, I doubt that this will be high on the list for someone just starting out programming, not even knowing how a computer works in the least...
> Just to bring home my point, I once helped a friend of mine learn a programming language. This guy was used to working with hardware, so he knew that it was the machine that actually did the work. So after a while, he stopped me and said something to this effect, "All this stuff is nice and all. But let's say I have no OS and no library. How do I get the machine to do something? I can call all my own custom functions and do arithmetic all day long, but if I can't show the results on the screen or get user input, no programming language is going to be of any use. There has to be a way to tell the machine directly to do these things. I don't necessarily care to do them myself. I just want to know how it's done because right now, it's just a bunch of operations with no substance."
Uhm, you usually don't need to drop down to assembly language and the hardware level just to do I/O... I'm afraid I didn't really get your point, but it might be a good one, so could you elaborate?
> And this was from a guy who knew how the machine worked. After explaining the memory map, he thought languages were childish tools. That it should be a LOT more advanced. This was 1986 and not only has the situation not improved, but the computing industry has now deteriorated beyond recognition.
This probably followed from the above, so without understanding the point there, this didn't make a lot of sense to me, either (but might, with clarification of the above).
> I am guessing C# is not mentioned because C# is (effectively) Windows only and thus irrelevant for most of those that are interested in languages.
I cannot imagine a beginner who won't be using Windows for learning programming concepts. I cannot find a single word in the ECMA/ISO C# specification (about 500 pages long) that makes C# Windows-only. And by the way, how exactly is Windows irrelevant to people interested in learning to program?
> For this particular case, I think Christopher is overlooking dynamic OO languages (introduce Ruby, Python or possibly Smalltalk), and that the use of C++ to introduce pointers is a mistake. Use C or assembler instead.
Python, Ruby and the like have a very strong case, but my problem with them is their distance from the underlying metal and a more or less 2D abstraction level. A language which can carry a student from the lower, explicit, and verbose level up to the most abstract, productive, and terse level is a better medium for growth than a language at a fixed abstraction level. So a language with this 3D abstraction space is, in my opinion, best for academic purposes. Ten years ago my choice would have been C++; now it's C#.
/* I am guessing C# is not mentioned because C# is (effectively) Windows only and thus irrelevant for most of those that are interested in languages.
[H]ow exactly is windows irrelevant to people interested in learning to program ? */
I wanted to learn programming largely because I had heard of Linux and thought open source was a cool idea. Lots of other people are in that boat. Not all of them, but lots. I think that was the gist of the original comment.
A lot of people use their computers as well-defined tools, and do exactly what (Software Company) expects them to do, whether that's surfing the web, writing emails, or making church newsletters. Others try to get their computers to do backflips, push the envelope, and want to learn how the computer does X specifically so that they can get the computer to do Y. That second group is much more likely to buy this book. That second group is also more likely to try running something like Linux or even BeOS.
/* I cannot find a single word in the ECMA/ISO C# specification (about 500 pages long) that makes C# windows only. */
That's true. And Mono and dotGNU are trying to get a non-Windows version out there. However, the original comment was that .NET is effectively Windows-only, and that seems about right. There's no requirement that Visual Basic be Windows-only, but it's effectively the same thing.
/* A language which can carry a student from the lower, explicit and verbose level up to the most abstract, productive, and terse level is a better medium for growth than a language at a fixed abstraction level. */
Agreed; however, I think the students are best served by starting at the top level, and then working down to the more verbose, explicit level.
/* Absolutely. But I think there will always be people whose temperament makes the below upward approach best for them. Its a highly controversial topic anyway. */
That's no fun. We don't have any reason to argue anymore.
I once signed on to an open source project that never went anywhere ( http://embassy.sourceforge.net/about.html , but consider http://www.alice.org/ ). The idea was to teach programming through making it possible to program virtual battle-bots. The original idea was to start the newbie programmers out in assembly code, and then move them up to C when they had enough points, then C++, then Java, then improved libraries, etc.
When I first heard about it I didn't even consider whether starting out in assembly code was a good idea. After a little while things kind of dawned on me. Who wants to write "Hello World" in assembly code ( http://www2.latech.edu/~acm/helloworld/asm.html )? That learning curve's too steep for most people.
As I mentioned at the start of this discussion, I liked the idea of the book. But, IMO, it should not be a book for total beginners. It would be more useful as some kind of programming paradigms and techniques "encyclopedia". When in college, we are, hopefully, exposed to several programming paradigms. But, more often than not, people don't fully comprehend the usefulness of each paradigm. They simply choose imperative/OO and forget about the rest. A book such as this one that Diggins is proposing could open the eyes of new programmers to effective techniques of many languages and paradigms. The only problem is how you will make them buy the book. "Learn Foo in ## days" will continue to be the choice of people learning about programming.
Because I did it for years at the university level.
To teach programming, you must make the students write programs. Every language has a bunch of baggage that comes along with it like build semantics, file layouts, program invocation conventions, etc.
A typical beginning programmer will be lucky to come to grips with a single environment in a semester. You can teach one language. Pick one. Based on your outline, the students will spend about 10% of their time absorbing the concept you intend to teach and the other 90% trying to get the latest environment configured on her machine - and you'll spend all your time as an instructor doing tech support.
No sane instructor would choose to teach from such a text.
Now if you want to write a survey-of-languages book, fine - that's an upper level class and switching languages is kind of the point. But I think the outline you posted is just insane for an introductory text.
Terje Slettebø: My main point was just that there's a disconnect between programming languages and how a language accomplishes some of its tasks. In the case of my friend, he was an (amateur) hardware guy, so he knew how to make hardware do stuff by creating his own circuit boards and such. To actually get the computer to do stuff, he needed to learn a language. But this didn't answer his question of how it gets the computer to do something. Yes, how to implement your own PRINT function is advanced. But there's no reason why some concept of it can't be explained even to a beginner. All too often, I see people give up because it just never "clicks" how the computer actually does things.
Anything that has to do with concrete things on the computer, you have to use a built-in function. We all understand this today. But for a beginner, it puts up quite a wall. If a built-in function isn't available, they will have no recourse. And I see this ALL the time especially once they learn a beginner language because it usually has limited links to the OS.
So then I see them trying to figure this out with what they've learned. They know a language. They should be able to figure this out. This is what goes through their minds. But of course they can't. The language doesn't provide it. So they're at a loss and they leave. This happens at the beginner stage. After they learn a few commands, the search for knowledge grows exponentially. When they hit that wall, they hit it at full force and they usually never return.
Hope that explains it better. I'm not saying we should give them the tools to hit the hardware. I'm saying there should be an overview of what makes a computer tick and an explanation of why we don't actually hit the hardware. This was a common thread in the '80s, but the explanation has since disappeared. I think the explanation should be revived with proper details on the machine.
BTW, thanks on the comments. I agreed with the BASIC examples. I had a feeling of nostalgia when I read it. The other stuff, I never meant to go that deep (low-level), so I agree there too.
> Terje Slettebø: My main point was just that there's a disconnect between programming languages and how this language accomplishes some of its tasks. In the case of my friend, he was an (amateur) hardware guy, so he knew how to make hardware do stuff by creating his own circuit boards and stuff.
Ah, in that context it makes a lot more sense. However, most newcomers to computers don't know the low level stuff, and would thus have a hard time mapping what they see on the top level, to that.
The idea is to find a good mapping from what you want to teach or learn, to what the person is familiar with or understands, and in the case of a hardware guy, showing the hardware connection makes complete sense.
> To actually get the computer to do stuff, he needed to learn a language. But this didn't answer his question of how it gets the computer to do something.

> Yes, how to implement your own PRINT function is advanced. But there's no reason why some concept of it can't be explained even to a beginner.

> All too often, I see people give up because it just never "clicks" how the computer actually does things.
Yeah, or they make grossly inefficient programs, as they have no clue about how it actually works...
> Anything that has to do with concrete things on the computer, you have to use a built-in function. We all understand this today. But for a beginner, it puts up quite a wall. If a built-in function isn't available, they will have no recourse. And I see this ALL the time, especially once they learn a beginner language, because it usually has limited links to the OS.
Well, I was lucky in that regard, as BBC Basic (on the BBC computer) was one of my first Basics (and where I learned most of it), and besides giving you full access to the OS, it has a built-in assembler. :)
> So then I see them trying to figure this out with what they've learned. They know a language. They should be able to figure this out. This is what goes through their minds. But of course they can't. The language doesn't provide it. So they're at a loss and they leave.
Yet, it seems like high level languages with little or difficult access to low-level stuff, like Java, etc. are quite popular among beginners.
I know that I myself had no yearning to learn the low-level stuff at the beginning: learning and programming Basic was more than enough for me.
However, people are different, and come from different backgrounds. Even though I had done electronics before the first home computers came, learning a high-level language first gave a gentle introduction to computers and what they can do.
By the time you're sophisticated enough to want to go beyond that, you're usually also aware of the possibilities you may have (such as assembly, or in the other direction, towards higher abstractions, such as C++ or Java). At least, that's my own experience, but others may have a different one.
> This happens at the beginner stage. After they learn a few commands, the search for knowledge grows exponentially. When they hit that wall, they hit it at full force and they usually never return.
Why? If they really want to learn "how it works", what's stopping them?
> Hope that explains it better. I'm not saying we should give them the tools to hit the hardware. I'm saying there should be an overview of what makes a computer tick and an explanation of why we don't actually hit the hardware.
Agreed. That's what I meant by a simplified model of the computer. If they want to go into the details, they can.
> This was a common thread in the '80s but the explanation has since disappeared.
Oh. Hm, maybe my experience isn't particularly relevant, then. :) I.e. things are done differently, these days.
This is, coincidentally, much of the reason I recently ordered an Iyonix ARM-based computer running RISC OS, which I'm very familiar with: for one thing, I've grown tired of all the limitations and difficulties of getting down to the fundamentals in operating systems like Windows.
Apart from that, other reasons are that the OS is in ROM, so it boots in a couple of seconds... This means you can turn it on, and immediately try something out, or just play with it, having full access to graphics and sound, using Basic or whatever.
Just like the 80's. :) Only this time, it's with a 600 MHz Xscale RISC processor, etc. :)
> BTW, thanks on the comments. I agreed with the BASIC examples. I had a feeling of nostalgia when I read it.
Ditto when I read your posting. :) That's what inspired it.
I don't think we're high level at all. I think we're probably at the highest level of low-level programming languages. That's why I find it ironic that people who program in low-level languages that directly control things like flow of execution (loops, functions, threads, locking, conditionals, etc.), memory, and input and output devices (files, mouse, I/O, sockets, screen, etc.) have no concept of how these things work. It's really mind-boggling.
Even if they don't want to know how the low-level stuff works, maybe they should if they use any current low-level language such as Java, C++, any of the Lisp variants, or the PHP, Perl, Ruby, and Pythons of today.
The most ironic thing is that a programming language is supposed to control how the computer does things, yet we want to hide that away.
There's been talk of this before. There's nothing really to get new programmers into the field. Not any REAL programming. And I use that term loosely. What new programmer will be able to write an OS? Who will be able to really write something that requires a breakthrough? Who will understand enough of the computing world to really make a difference. We were all beginners at one point. The worst thing we can do is not adequately explain to them the link between a programming language and the computer.
It is still possible to create things with a few people in a basement. It's just that we have too many layers. That's the only difference and why we see it less.
> I don't think we're high level at all. I think we're probably at the highest level of low level programming languages.
Naturally, "high" and "low" are relative terms. C was once considered a "high-level language", and compared to assembly, it is, but not really compared to the higher level languages.
> That's why I find it ironic that people who program in low level languages that directly control stuff like execution point (loops, functions, threads, locking, conditionals, etc), memory, input and output devices (files, mouse, I/O, sockets, screen, etc.), have no concept of how these things work.
You mean at the hardware/assembly level? Well, they don't really need to, either, do they? Moreover, that they may not understand how it works "under the covers" surely shows it to be a higher-level abstraction than the hardware, and therefore useful in itself.
> Even if they don't want to know how low level stuff works, maybe they should if they use any current low level language such as Java, C++ or any of the Lisp variants or the php, perl, ruby and python's of today.
Oh, "low level"... Well, maybe they are "low level" compared to something you're thinking of, but I've yet to get a clear grasp of what that is, so it's a little hard to assess this opinion.
In any case, what level of abstraction you work on depends on the context, so any of these levels are useful, depending on the domain. "High level" is not "better" than "low level", or vice versa; they're just different. They each have their strong sides.
I think this is a very important point, as we tend to get low-level "zealots" and ditto high-level ones, and few voices of reason saying that these complement each other...
> The most ironic thing is that a programming language is supposed to control how the computer does things, yet we want to hide that away.
The way I understand it, a programming language is a way to state what you want to get done. In the case of a logic programming language (like Prolog), this is done by stating facts and relationships; by asking questions of the system, the system itself tries to come to a conclusion based on what it knows. Thus, there's no notion of "controlling how the computer does things" (although you might give it hints to optimise the search for a solution, but that's optional).
From this understanding of programming languages, it makes perfect sense to hide away irrelevant details not having anything to do with the task to be done. I see no irony about it.
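A toy illustration of that fact-and-query style, sketched in Python rather than Prolog (the `parent`/`grandparent` relations and the names in them are invented for this example):

```python
# Facts: who is a parent of whom.
parent = {("tom", "bob"), ("bob", "ann")}

# A derived relationship, stated as a rule rather than a procedure:
# grandparent(X, Z) holds if parent(X, Y) and parent(Y, Z) for some Y.
def grandparent(x, z):
    return any((x, y) in parent and (y, z) in parent
               for (_, y) in parent)

# "Asking a question of the system":
print(grandparent("tom", "ann"))   # True
print(grandparent("bob", "tom"))   # False
```

The caller states what relationship must hold; how the answer is searched for is exactly the kind of detail that gets hidden away.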
> There's been talk of this before. There's nothing really to get new programmers into the field. Not any REAL programming. And I use that term loosely. What new programmer will be able to write an OS? Who will be able to really write something that requires a breakthrough?
These are definitely good points: I think it's important to know both the high- and the low-level of programming, for these and other reasons (such as efficiency).
I've heard that there's been a bit of a backlash from the industry to the educational institutions, and some going "back" from teaching Java to C++, as companies complain that people who have only learnt Java aren't really "programmers".
> It is still possible to create things with a few people in a basement. It's just that we have too many layers. That's the only difference and why we see it less.
I think it also depends on the system. I've used Windows for many years, and haven't been very tempted to "go deep" with it, but a computer system I've owned earlier (Acorn Archimedes/RiscOS) was something that really invited playing and experimenting. I even co-authored an assembler for it, and knew the OS and hardware pretty well.
I've found that you get one kind of "rush" from doing low-level programming, where you're in complete control, and know exactly what is happening, down to the individual bits, and another from being able to use high-level abstractions.
Again, they complement each other. :)
P.S. There's a reply to you in the "My Most Important C++ Aha! Moments...Ever"-thread, related to this, that I hope you'll reply to, as well.
I agree with other posters that multiple languages are not suitable for an introductory course. Often, simply setting up a programming environment is a challenge - even for someone who is experienced. That's time wasted. Even if you have university machines pre-configured, most people want to use their own PC. Learning the syntax and libraries for multiple languages is also probably too much.
I would definitely pick a cross-platform language, like Java or one of the scripting languages.
If the goal of the course is to make real programmers out of them, then I would teach C++. C++ supports many (probably too many) paradigms. It exposes you to low level details, but it also provides ample tools for abstracting those details away.
It's a good way to learn. Make them shoot themselves in the foot manually twiddling with memory, then show them ways to abstract away the twiddling. If you just try to teach them abstraction, too many of them won't believe you that it is important. Why do I care about creating a Square that subclasses Shape and is composed of Points? Can't I just define four variables and be done with it? I have to do that anyway because I need values to pass to the constructor...
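To make the grumble concrete, here is a minimal sketch of the kind of Shape/Square/Point design the hypothetical student is questioning (in Python, though the thread is discussing C++; the class and method names are made up for this example). The payoff only shows up when code can handle many shapes without caring which is which:

```python
class Point:
    """A location: the 'four variables' the student wants to avoid wrapping."""
    def __init__(self, x, y):
        self.x, self.y = x, y

class Shape:
    """Abstract base: every shape promises an area() method."""
    def area(self):
        raise NotImplementedError

class Square(Shape):
    def __init__(self, corner, side):
        self.corner = corner      # a Point: where the square sits
        self.side = side

    def area(self):
        return self.side * self.side

# The abstraction pays off here: this loop works for ANY Shape subclass,
# present or future, without modification.
shapes = [Square(Point(0, 0), 2), Square(Point(1, 1), 3)]
print(sum(s.area() for s in shapes))   # 13
```

With only one square in hand, "just define four variables" really is simpler, which is exactly why the motivation is hard to teach up front.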
Oh, and give them a compiler on a CD or website that they can install on their computer. As in any computer - Linux, Mac, Windows, Solaris, etc - and without requiring admin rights.
So here's my suggested outline:
1. Programming Basics
2. OO w/o polymorphism
3. Pointers and Memory Addressing
4. Basic data structures
5. Basic templates and STL
6. Algorithms on basic data structures
7. Polymorphism
8. Generic programming w/templates
9. Smart Pointers and Garbage Collectors
10. Advanced data structures and polymorphism
11. Algorithms on advanced data structures with generic programming using templates
12. Python
Students will appreciate the fact that your book will get them through 2 years worth of programming classes. Until they hit chapter twelve at which time they will hate you for making them learn all that unnecessary gunk. Until they work with people who don't know all that "unnecessary" gunk and want to kill them.
I have enough trouble when schools think it's a good idea to change programming languages for every class. Changing it in one book is a lot worse.
Think about this before you write your book: what are you trying to teach, and who is your audience? My first programming teacher told me, “I'm going to teach you to program, not any programming language.” He used Pascal, but I was happy to find I transitioned into C quickly and, with a little help, into C++.
Your original concept is not bad if you are writing a book on the strengths and weaknesses of programming languages. Many colleges give classes on the differences between languages, but you have to understand programming first.
If you want to teach the bare-bones basics that anyone can understand, then pick a simple language (Java is too strict, and C++ too advanced) and drop about 3/4 of your topics.
If you want to teach MS Visual C++, Borland C++ Builder, or g++ then title the book accordingly and teach the parts of the product that are unique.
If you want to teach advanced programming structure and concepts from the ground up then do that. (That is what I gathered was the goal from your original list)
You should put the language in the title like "Advanced Programming from the Ground up! with C++".
I do recommend C++. It's about as generic a programming language as you can get. Keep compiler specifics out of the book. Pick a generic cross-platform compiler, give a brief explanation of where to get it and how to compile code, and don't mention it again. If you want to write the code in something easier to work with, that is fine; just make sure the code that makes it to the page compiles in other compilers. This will ensure the largest reader base, because you keep the politics of OSes out of it.
Being a programmer these days does not mean what it used to. Web masters, application developers, IT professionals, DBAs, and software engineers can all be grouped under programmers. They have very different uses for programming, and most don't care about the low-level details that you want to cover. Using C++ (and putting the name in the title) will help people understand who the book is written for. Readers who will get the most out of the book will not have trouble with C++, and it's easy to acquire a compiler that runs on the hardware they have. They will have no trouble going off after reading your book and writing their own code, if you wrote it well.
Good luck with the book. I'm always interested in seeing how new books try and explain the more advanced abstract concepts to new programmers.