Making Great Programmers - Why BASIC is Still Relevant
Ian Larsen - drblast [at] sourceforge.net
When anyone asks if their children should learn BASIC, there is always a vocal group that says "NO! It teaches all the wrong things! Learn a modern language like (insert favorite language here)." But this is nonsense.
Just as spoken languages are tools we use to communicate ideas to other people, computer languages communicate ideas to a computer. What's really important is knowing what you want the computer to do, what idea to communicate. The language you use to express that is largely irrelevant, as long as it works for your purposes. For the purposes of writing simple programs and games, BASIC works just fine, and has for years.
But as for the argument that learning BASIC will hinder further progress in learning about computers, why would that be true? It's not true for spoken languages. Nobody ever said, "I wish I never learned English, because I had to un-learn so many bad habits to learn Japanese." People who learn many spoken languages say it gets easier the more you learn, because you can relate them to each other and recognize patterns. It's the same for computer languages, probably even more so, since computer languages are much more similar to each other than spoken languages are.
The problem is, "What programming language should my child learn?" is not the right question to be asking. The ultimate goal is not to teach students "how to program." The goal should be to teach them how computers work. Once you understand how computers work, you get the "how to program" part almost for free. At that point, it's just a matter of syntax.
So how do you make someone understand these ideas that make computers work? It's an interesting question. The answer would depend on your audience, how much time you had, and probably a thousand other factors. Currently there are three distinct methods for teaching computer science.
The first is the traditional approach, which teaches students how to program in the latest industry-standard language. My college used this approach. Before I enrolled, students learned C; I learned C++, and now they teach Java. I struggled, like everyone else, with classes and pointers. I don't know what the current students struggle with, but I'm sure it's not pointers, because they don't exist in Java. I'm not sure when the current students learn pointers, or if they're expected to learn that later on their own, like we were expected to learn how linkers worked, what the compiler was actually doing when we hit the green button, and so on. Maybe they never learn what pointers are.
This brings up a dilemma. Either pointers are obsolete, or they're being glossed over in the current computer science curricula. If they're obsolete, my college did a disservice to me by teaching me a faddish programming construct like pointers in CompSci 101. But if they're still valuable, then the current students are getting the short end of the stick by not learning them early.
The traditional approach will always have this drawback. The higher the level at which they teach, the more stuff they have to leave out, by necessity. Now, a good program will cover a range of topics, including how the hardware works. But if the range continues to grow, something's going to get left out.
The second approach is the MIT approach. MIT uses an excellent book, Structure and Interpretation of Computer Programs, which teaches the mathematical model of how computer languages and algorithms are structured. It's very theoretical and complete. Any student taking that course will gain a complete understanding of how computer programming works, and be able to implement the entire process from source code to machine-code generation. It's interesting to note that the computer language they use to teach it (Scheme) has its roots in Lisp (which is over 40 years old) and will probably never change. This is an indication that what they're teaching is universally valuable. It's not just this year's industry consensus.
So why not just use the MIT approach? Well, consider MIT's audience. I'd wager that nobody in the MIT computer science curriculum is taking CompSci 101 and writing their first computer program. These people have probably lived and breathed computers all of their lives, and want to learn something that's difficult to learn on their own. They want deep understanding, and probably already have all the motivation they need.
Now consider the audience we'd like to teach: young children. The approach MIT takes is going to fail with them. It's abstract, mathematically challenging, and you don't get to draw pictures on the screen.
There's a third method, and while no college uses it, I'm fairly certain that my generation would find it familiar: the bottom-up approach. In the bottom-up approach, you'd start with a machine that has a small subset of functionality. I had a Commodore 64, which had a BASIC on it that was just a step above assembly language. You would write your first program and run it:

    10 PRINT "HELLO"
    RUN

And then you'd add more. You'd use a GOTO statement to print that over and over:

    10 PRINT "HELLO"
    20 GOTO 10
    RUN

You'd see exactly how the computer jumped around in memory, doing things in the order you specified.
What's the point? Well, one of the most necessary things when teaching computer science is to de-mystify the machinery. High-level languages like Java or Python don't do this. What's needed is a language that's directly analogous to how computers actually work, like assembly language.
Once you learn assembly language, nothing else is ever too difficult. Is that because assembly language is so much harder than everything else? Well, not really. Compared to Java and its libraries, assembly language is tiny -- learning it all is simple. It's because when you understand assembly, you understand computers. You know that "functions" are nothing more than a jump with some stack manipulation to make it look like you're not using GOTO. You know that "objects" are nothing more than a collection of function pointers and data with compiler-enforced rules about how it all can be accessed.
So should children learn assembly language first? Certainly not. It's way too difficult to do fun things that a child would want to do, like write games, in assembly. They'd get bored and go do something else. Ninety percent of teaching children is keeping them interested long enough for them to absorb information. But if assembly language were easier and more fun, wouldn't it be an ideal language to teach children?
The truth is, the BASIC most of us grew up with was hopelessly low-level. At the same time, it was simple enough for a seven-year-old child to understand. Why is that? Well, it's because computers are inherently stupid, simple machines. There's no magic in there, despite what the Intel ads say. Making the transition from Commodore BASIC to Commodore assembly language was simple, because Commodore BASIC was so limited, just like assembly. They even had the same control constructs, like GOTO.
That's where BASIC comes in. It's as close to the computer hardware as you can get and still do fun things. It's the highest level low-level language there is.
So don't listen to the traditional approach folks. To teach someone how to program, he first has to understand how computers work. BASIC will teach him that easily, bottom-up, and he'll stay interested. Then, when he's got that down, he'll easily be able to learn more the MIT way.