Re: Designing a Programming Textbook
Posted: Sep 23, 2006 2:35 PM
I wrote a set of tutorials for my brother once. Although he wasn't an absolute beginner, he was missing many concepts, gaps that plenty of working programmers share. I've seen many people who graduated from university, got their degrees, and still had no concept of how a computer actually worked.
First, you need a language that is super simple. BASIC is still good because it's line based. That's all you need to explain simple concepts.
Then you need to explain numbers and how computers operate on them. They need to know how bits work and why things are 8, 16, and 32 bits wide. They need to know how bits combine to form numbers. They need to know that negative and positive are just a matter of interpretation: in two's complement, the same bit pattern can be read as a negative or a positive number depending on whether it is treated as signed or unsigned. They need to know boolean operations, shifting, comparisons, and so on.
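The signed/unsigned point can be demonstrated in a few lines. Here's a minimal Python sketch (the function names are my own, just for illustration) showing that one 8-bit pattern yields two different numbers depending on interpretation, plus shifting and boolean operations on bits:

```python
def as_unsigned(bits, width=8):
    """Interpret a bit pattern as an unsigned integer."""
    return bits & ((1 << width) - 1)

def as_signed(bits, width=8):
    """Interpret the SAME bit pattern as a two's-complement integer."""
    value = bits & ((1 << width) - 1)
    if value & (1 << (width - 1)):   # top bit set means negative
        value -= 1 << width
    return value

pattern = 0b11111111             # all eight bits set
print(as_unsigned(pattern))      # 255 when read as unsigned
print(as_signed(pattern))        # -1 when read as signed

# Bits combine to form numbers: shifting left by one doubles the value,
# and AND/OR/XOR operate on each bit position independently.
print(0b0011 << 2)               # 12 (binary 1100)
print(0b1100 & 0b1010)           # 8  (binary 1000)
```

The point of the exercise is that nothing in the machine changes between the two readings; only the interpretation does.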
But the main thing for beginners is to explain how this stuff translates into doing something productive. A LOT of beginners get bored or annoyed because they don't understand the leap of faith most programmers make when it comes to programming. Sure, you have a print command. But how does it do this? How can I write my own print command? How can I display graphics? How can I read the keyboard? The point isn't that you want them to hit the hardware. The point is that you need to tell them how a programming language goes from just doing arithmetic to actually reading and writing to devices (without API calls). Using functions and all the arithmetic in the world isn't going to tell them how this is accomplished. I see many people who say we should be programming at a higher level, but that's no excuse for not teaching beginners how the computer works.
They need to understand memory and how writing to memory can do several things. Writing to ordinary memory just stores data there. But other memory can be attached to physical devices. For example, each pixel on your screen can be mapped to a certain range of bytes; writing to those bytes changes the color of a pixel on the screen. On a PC there is also a separate address space used only for devices, the I/O port space, through which you can reach the mouse, keyboard, hard disk and everything else; the OS takes care of this for you. Then you can go on to explain which functions in the standard library of the particular language you're dealing with do these things for you. Heck, a C64 emulator and using POKE and PEEK on memory addresses would explain this rather quickly: POKE addresses $0400-$07E7 to show text on screen, $D800-$DBE7 for colors (4 bits per character, setting its foreground color), and $D020 and $D021 for the border and background colors.
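To make the POKE/PEEK idea concrete without an emulator, here's a toy Python simulation. This is a sketch, not real hardware access: "memory" is just a dictionary, and the only real detail borrowed from the C64 is the screen layout ($0400 base, 40x25 character cells, one byte per cell, screen code 8 for 'H'):

```python
SCREEN_BASE = 0x0400   # C64 screen memory starts here
COLS, ROWS = 40, 25    # 40x25 character cells, one byte each

memory = {}            # simulated address space (NOT real hardware)

def poke(address, value):
    """Write a byte to an address, like BASIC's POKE."""
    memory[address] = value & 0xFF

def peek(address):
    """Read a byte back, like BASIC's PEEK (unwritten memory reads as 0)."""
    return memory.get(address, 0)

def put_char(col, row, screen_code):
    # The cell at (col, row) lives at base + row*40 + col: the "magic"
    # of text output is nothing more than this address arithmetic.
    poke(SCREEN_BASE + row * COLS + col, screen_code)

put_char(0, 0, 8)        # screen code 8 is 'H' in the C64 character set
print(peek(0x0400))      # 8
```

In a real C64 (or emulator) the video chip reads that byte and draws the glyph; here the dictionary just holds it, but the student sees exactly what the language is doing: computing an address and storing a value there.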
There needs to be a bridge between a language and the machine. If the student doesn't understand how the language gets the computer to do something, they must take it on faith. That makes a lot of people uncomfortable.
I still know people who think it's the language that makes the computer do things. They don't understand that it is the HARDWARE that makes these things possible.
After that, you can move on to other concepts.
Just to bring home my point, I once helped a friend of mine learn a programming language. This guy was used to working with hardware, so he knew that it was the machine that actually did the work. So after a while, he stopped me and said something to this effect: "All this stuff is nice and all. But let's say I have no OS and no library. How do I get the machine to do something? I can call all my own custom functions and do arithmetic all day long, but if I can't show the results on the screen or get user input, no programming language is going to be of any use. There has to be a way to tell the machine directly to do these things. I don't necessarily care to do them myself. I just want to know how it's done because right now, it's just a bunch of operations with no substance."
And this was from a guy who knew how the machine worked. After I explained the memory map, he thought languages were childish tools, and that they should be a LOT more advanced. This was 1986, and not only has the situation not improved, the computing industry has since deteriorated beyond recognition.