Saturday, August 2, 2014

Processing information vs. Cutting and Pasting, II

If I could do college over again, I’d major in computer science. I realized this during senior year when I took a class in artificial intelligence and learned LISP, the first programming language that really resonated with me (how I reveled in recursion), and then (in my final semester) a computation theory class in which I wrote rules for symbolic languages via formal grammars, and “virtual” programs (i.e., on actual paper rather than as software) via Turing Machines:
[image: Turing Machine exercises worked out by hand on paper]
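For anyone who hasn’t had the pleasure, here’s a small example of my own (a toy, not anything from those courses) of the kind of recursion I reveled in: a function defined in terms of smaller versions of the same problem.

    ;; Count the atoms in an arbitrarily nested list.
    (defun count-atoms (tree)
      (cond ((null tree) 0)                    ; an empty list contributes nothing
            ((atom tree) 1)                    ; a bare atom counts as one
            (t (+ (count-atoms (car tree))     ; otherwise recurse on the head...
                  (count-atoms (cdr tree)))))) ; ...and on the rest of the list

    ;; (count-atoms '(a (b (c d)) e))  =>  5

Four lines of logic, and it handles any depth of nesting you care to throw at it.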
The funny thing was that I took these classes for enjoyment, while the computer science majors (pretty much everyone else in the room) took them because they had to. Most of them preferred other aspects of computer science that I didn’t find so inspiring: databases, systems, and other real-world applications.

But if I were doing college today, I almost certainly wouldn’t major in computer science. That’s because real-world applications have edged out theory, and even basic programming techniques like recursion are, or so I’ve heard, becoming dying arts. I first wrote about this phenomenon six (!) years ago. But I had some new revelations more recently about just how bad things have gotten, when I started looking for someone to help me port my GrammarTrainer software to a tablet-based platform. Surely the place to find someone was a computer science or digital media program, where students learn coding systems at the frontiers of today’s computational technologies.

But time after time these young people let me down. It turned out that they knew how to write code for iPads, and they knew how to use all sorts of high-level programming packages, but they couldn’t figure out how to get those packages to interface with my not-so-high-level code. I had set things up so that they didn’t have to deal with my code directly at all: their code simply had to take, as input, a rather small set of possible outputs from mine. Even so, they seemed to lack the flexibility in programming, and the rigor in logic, to write code that, however high-level it needed to be (i.e., written in Xcode), simply wasn’t that complex from a logical standpoint. Basic Boolean formulations and basic decoding techniques seemed to elude even a bright computer science major from a top engineering school. What are kids learning these days, I kept wondering.
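To give a sense of what I mean by “not that complex from a logical standpoint”: all the front end had to do was map a small, fixed set of result codes onto screen actions, something like the sketch below. (I’ve written it in LISP just to show the logic; the codes and action names are invented for illustration and are not GrammarTrainer’s actual output format.)

    ;; Hypothetical: map a result code (plus one flag) onto a screen action.
    (defun handle-result (code final-question-p)
      (cond ((and (string= code "RIGHT") final-question-p) :show-lesson-complete)
            ((string= code "RIGHT")                        :advance-to-next-question)
            ((string= code "PARTIAL")                      :highlight-errors)
            ((string= code "WRONG")                        :show-correction)
            (t                                             :redisplay-question)))

    ;; (handle-result "PARTIAL" nil)  =>  :HIGHLIGHT-ERRORS

A handful of Boolean conditions; that is the whole puzzle.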

Sometimes one finds help in unexpected places. The person who has turned out to be up to the GrammarTrainer conversion tasks—very much so—is someone from my generation; someone who got in-depth training in actual programming, including all that fun theoretical stuff. Many of you know her as FedUpMom. She’s great! She can do anything!

Kids today may be learning about fancy graphical interfaces and how to plug one off-the-shelf program into another. But, in the end, no amount of cutting and pasting from one high-level package to another can match the flexibility of a Turing Machine. As I learned almost 30 years ago, no set of logical operations is too elaborate for those primitive Turing Machines—or for those students who spend time playing around with them.
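To make that concrete, here is a toy Turing Machine simulator (generic textbook machinery, not anything from my old coursework): a tape, a head, a current state, and a table of rules. Given the right rule table, this primitive setup can carry out any set of logical operations you like.

    ;; Run RULES on TAPE (a list of symbols), starting in START-STATE,
    ;; until the machine reaches the state HALT or runs out of rules.
    ;; RULES maps (state read-symbol) to (write-symbol move next-state).
    (defun run-turing-machine (rules tape start-state &key (blank '_) (max-steps 1000))
      (let ((cells (make-hash-table))   ; the tape, keyed by position
            (pos 0)
            (state start-state))
        (loop for sym in tape for i from 0 do (setf (gethash i cells) sym))
        (loop repeat max-steps
              until (eq state 'halt)
              do (destructuring-bind (new-sym move next-state)
                     (or (gethash (list state (gethash pos cells blank)) rules)
                         (return))      ; no matching rule: stop
                   (setf (gethash pos cells) new-sym
                         pos (+ pos (ecase move (right 1) (left -1) (stay 0)))
                         state next-state)))
        ;; Read the tape back off, skipping blanks.
        (loop for i from (loop for k being the hash-keys of cells minimize k)
                to (loop for k being the hash-keys of cells maximize k)
              for sym = (gethash i cells blank)
              unless (eq sym blank) collect sym)))

    ;; A two-rule machine that appends one more 1 to a block of 1s (unary "add one"):
    ;; scan right over the 1s, then write a 1 on the first blank and halt.
    (defparameter *successor-rules*
      (let ((table (make-hash-table :test #'equal)))
        (setf (gethash '(scan 1) table) '(1 right scan)
              (gethash '(scan _) table) '(1 stay halt))
        table))

    ;; (run-turing-machine *successor-rules* '(1 1 1) 'scan)  =>  (1 1 1 1)

Two rules buy you the simplest arithmetic operation there is; a bigger rule table buys you anything that can be computed at all.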

3 comments:

ChemProf said...

My husband is a programmer, and he keeps saying that age discrimination kicks in at about five years older than he is (he's now in his early 40s). But that cutoff keeps moving along with him, because the young folks aren't as computer literate as Gen X -- too much high-level stuff, not enough time in the guts of the machine.

Anonymous said...

High school kids all think they know so much about "technology." If I ask one if they know what a DOS prompt is, I get blank stares. When my own son decided to teach me HTML, we were five minutes into it when I exclaimed, "Oh my gosh, it's nothing but Wordstar!" "Wordstar?" said my son, "What's that?"

FedUpMom said...

Wow! I'm blushing! Thanks!

I wonder if some of the problem with young programmers is related to fuzzy math and its contempt for algorithms. Programs, of course, are all about algorithms, and clear algorithm design is essential.

What I see in the efforts of some young programmers who have worked on GrammarTrainer is almost a guess-and-check approach; that is, they just keep throwing code at it until the program sort of seems to work. The result is a program totally cluttered up with, for instance, five different variables all holding the same basic information; three different chunks of code for inputting data files, only one of which (the most ridiculous one, reading data from the internet) is used; and large hunks of what I call "zombie code", that is, code that is never actually referenced by the program.

I wonder if our crummy math teaching is resulting in young people who don't understand logic and algorithms.

From a programmer's point of view, the demands of the program are actually quite simple (read some input, get responses from the user, respond to the responses), and the program design should be simple and clear too.
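Just to illustrate the shape I have in mind (a made-up toy, not GrammarTrainer's actual design):

    ;; Present each question, read answers until one is accepted, give feedback.
    (defun run-lesson (questions)
      (dolist (q questions)
        (destructuring-bind (prompt correct-answer) q
          (loop
            (format t "~&~a~%> " prompt)
            (let ((answer (string-trim " " (read-line))))
              (when (string-equal answer correct-answer)
                (format t "Right!~%")
                (return))
              (format t "Not quite -- try again.~%"))))))

    ;; (run-lesson '(("Type the plural of 'child':" "children")
    ;;               ("Type the past tense of 'go':" "went")))

Everything else is elaboration on that loop, and the elaboration should stay just as readable.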