The latest issue of Dr. Dobb’s Journal arrived today. Frankly, I’m not a fan of this magazine but they keep offering me a free subscription and I keep hoping I’ll get something useful out of it. The problem with Dr. Dobb’s is that it’s just not relevant to me as a programmer. With its heavy focus on Java, C++, .Net and similar languages, I find it useful for little more than taking up room in my recycling bin. With the latest issue, though, there’s hope. The cover story is the luridly titled “Ruby On Rails - Java’s Successor?”
First, let me make clear that I’m biased. Like many in my field, I latch onto an idea and I tend to filter new ideas through the old one. Admittedly, this means I don’t always see things accurately and I freely admit that my bias could be wrong, but it’s there and I seem to be stuck with it. You see, I’m biased in favor of economics. I went to college to be an economist and came out a programmer. Go figure.
How does economics influence my decisions? Well, in the case of programming languages, I am convinced that “dynamic” and type-inferring languages are going to crush “static” languages (yes, I’m quite aware that there’s a huge amount of ambiguity in those terms; hopefully those familiar with the debates will cut me some slack). Why do I believe this? Let’s have a history lesson.
Even though I’m in my thirties, I used to be a mainframe programmer. I had lots of fun (cough) working on programs written before I was born. I was in a pretty standard environment where our COBOL programs would be called by JCL, a primitive scripting language which forced the programmer to worry about such trivial details as how many tracks, cylinders or blocks on a disk to allocate to a program, how much extra space to allow if the allocated space was exceeded, the record length of the file (newlines did not typically delimit records) and so on. Further, some of this information gets duplicated in your COBOL program, so it has to be kept in sync with the JCL. The idea of just opening a file and using it is pretty foreign to COBOL.
The difficulty of working with files (ahem, “datasets”) in COBOL is not just some quirk of the language. There are many things in COBOL which are tough to do. While I’m not a fan of the “For Dummies” series, I just happen to have a copy of the book COBOL for Dummies. The last chapter is entitled “Ten Tasks That Are Really Hard To Do in COBOL” and then proceeds, amusingly enough, to list nine tasks:
- Determining the Actual Size of a Record
- Arranging Data into Columns
- Extracting Part of a Text String
- Combining Text Strings
- Writing Comma-Delimited Text
- Reading Comma-Delimited Text
- Converting Between Upper- and Lowercase
- Finding a Square Root
- Generating Random Numbers
Can you believe that? Those are hard to do in COBOL! However, there’s a logical reason for that. Back when COBOL was first introduced, computers were very expensive relative to programmers. Programmers would carefully desk-check their programs to avoid bugs. They would go through them line by line looking for problems. None of this “run the damned thing and see if it breaks” tomfoolery. Computers were so expensive that it was important that as much of the work be shifted to the programmer as possible. As a result, COBOL didn’t do a lot of work to “just open the file and use it”. It didn’t offer a lot of built-in string processing. Difficult math wasn’t available. You usually read records directly into the variables you needed, you did some very simple processing and wrote the new data back out. That would be an entire, simple program but it would take a long time to write compared to today’s languages.
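For contrast, every one of those “hard” tasks is a one-liner in a modern dynamic language. Here’s a quick sketch in Ruby (the sample record is made up, of course):

```ruby
record = "Alice,42,Boston"          # a made-up sample record

record.length                       # determining the actual size of a record: 15
record[0, 5]                        # extracting part of a text string: "Alice"
"Smith, " + "Alice"                 # combining text strings
fields = record.split(",")          # reading comma-delimited text: ["Alice", "42", "Boston"]
fields.join(",")                    # writing comma-delimited text
record.upcase                       # converting between upper- and lowercase
format("%-10s %5s", *fields[0, 2])  # arranging data into columns
Math.sqrt(2)                        # finding a square root
rand(100)                           # generating a random number from 0 to 99
```

Each of those is a single expression; in COBOL, most of them take a paragraph of code or a vendor extension.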
With the languages most of us were familiar with in the 80s and early 90s, programmer productivity rose enormously. C, C++ and Java, for example, were all much faster languages to write in than COBOL. Because computers had become so much cheaper and the languages made programmers more productive, more interesting software could be built. However, do you remember the brouhaha over Java’s automatic memory management? People claimed that it wouldn’t work: if you didn’t manage memory manually, your software wouldn’t be as efficient. The supporters argued that if programmers didn’t have to worry about memory management, they’d be less likely to have memory leaks and would be more productive. As it turns out, the supporters were right, and most newer programming languages offer some form of automatic memory management.
Even though a lot of folks were dubious about the merits of Java’s memory management and JVM architecture, and about how “simple” the language seemed compared to C and C++, Java took off. Now, perversely, Java programmers often join the ranks of C and C++ programmers in sneering at “dynamic” programming languages such as Python, PHP, Ruby and Perl. These languages are often dismissed as mere “scripting” languages, but more and more programmers are starting to appreciate their power. That power was summed up by noted Perl guru Randal Schwartz in his reply to a Java enthusiast (a student, I believe) who asked him how he dealt with Perl’s lack of “strong” typing: “I just smile and move my program into production before the Java programmer has his first compile.”
These languages are not quirks. They’re a natural continuation of the economic forces which have shifted the productivity burden from the programmer to the computer. Programming languages 40 years from now will likely have less in common with today’s languages than today’s languages have in common with COBOL. The dynamic languages make programmers so much more productive that even conservative business types are forced to sit up and take notice. That’s why I love Ruby on Rails, despite never having used it.
David Heinemeier Hansson, love him or hate him, has created a killer app which is converting even diehard Java enthusiasts to dynamic languages. There’s a reason why Amazon, LiveJournal and Slashdot rely so heavily on Perl. There’s a reason why Yahoo! decided to start using PHP. There’s a reason why Rails is written in Ruby and not Java. I think we’ve finally hit the turning point where the economic forces at work are too great to ignore. Of course, Java will be around for a long time to come — COBOL is still widely used, for example — but it’s simply math. The faster your programmers can turn out good applications, the more money you save (and can therefore earn).