Getting on with It
It's 7:15 AM and I'm on the RIPTA #60 inbound from Little Compton to Kennedy Plaza in Providence. My laptop is perched on my knees, knocking noisily against the seat in front of me as the bus bounces along the country roads. I find it easier to type than to scribble notes that I probably wouldn't be able to decipher later anyway. This will be the last entry in this journal, or at least in this volume of it: 32 entries over some 71 days. It has been an interesting journey, but I have some other places I want to go, and I don't expect it would be much fun for you, the reader, to follow me as I stumble and lurch about trying to make sense of some of the research ideas that have alternately enthralled and plagued me over the last few weeks.
The other passengers on the bus seem a little more quiet and apprehensive than usual. It's one year to the day since the terrorist attacks on the Pentagon and the World Trade Center. We were told then by the press and the pundits that we'd all remember exactly where we were and what we were doing when we heard of the attacks. I remember the perfect late summer day and the news that turned everything dark, the cloudless blue sky now suspicious and foreboding, the knowledge of what must have occurred just moments before as the two planes that crashed into the twin towers had passed overhead on their way from Boston to New York. Perception and memory are so difficult to unravel; perhaps my fellow passengers and I are simply following a script, striving to meet expectations. We've been programmed to be apprehensive just as we've been programmed to remember where we were and what we were doing on the morning of September 11th. Or perhaps I'm the only apprehensive one and I'm reading my attitudes into the behavior of my fellow passengers. It's easy for me to get sidetracked thinking about the tangled threads of computation and communication that write our individual and collective histories.
The main reason I'm going into Brown today is to participate in a seminar course that a colleague, Roger Blumberg, is running this semester. Roger and I collaborated on the design of an exercise meant to get students thinking about machine intelligence and the ethical and moral issues that are likely to arise when human and non-human intelligent machines inhabit the same space and vie for the same scarce resources. The September 6th entry of a few days back was my attempt to prepare for today's class. The students in Roger's class have read the exercise and are expected to come to class today prepared to discuss the issues. Both Roger and I, in looking over the RoboPet exercise, as we call it, realize that it contains many deep and difficult questions. I worry that it will be too complicated, losing the students in a maze of interlinked issues and ideas. But then I stop myself, realizing that the students will find their own way, choosing from among the many themes those that resonate most strongly with their interests and backgrounds.
Roger is an expert in the use of technology in the classroom. Which isn't to say that he's a zealous advocate or a "technology evangelist" as some advocates of computers in the classroom like to characterize themselves. He has a healthy skepticism regarding the potential advantages of sprinkling computers throughout our schools. Not that he doesn't think computers are useful and have their place in schools, but simply that they won't solve the problems we're faced with in getting students to learn. Notice that I said "getting students to learn" and not "teaching students". I've been "teaching" for over sixteen years now and I still don't know how to get students to learn. Sometimes it just happens. Sometimes I like to think that something I did, a project, a lecture or a new approach to getting the students involved, was responsible for making things come together. The only approach that I can depend on succeeding better than chance is working closely with students, lavishing time and attention, getting inside of their heads and steering them toward ideas and projects that they find motivating and rewarding. Roger is an excellent guide (I'm avoiding the use of the word "teacher") not because he knows any magic teaching method or has any technological advantage but because he is good at getting inside the heads of his students and because he engages them on their own terms and is generous with his time and energy.
The Brown faculty is blessed with many insightful, engaging and generous guides and mentors. The problem, however, is that despite the relatively high density of such people here at Brown, there are far too few to satisfy the demand for personal attention. This is one reason why educators look to technology to amplify, distribute and (blasphemy!) automate and clone the skills of our best educators and academic guides. By now you can guess that I believe this is, at least in principle, possible. But I share Roger's skepticism that current technology offers a solution to our present problems.
I'm partial to the idea that we learn inductively by doing. I liked Neil Gershenfeld's depiction in "When Things Start to Think" [Henry Holt & Company, Inc., 1999], of how students working in the MIT AI lab learned by choosing a project that they were passionately interested in and then backfilling their knowledge when they ran up against a problem. It reminded me of descriptions of how teaching and learning occurred at the Dewey School. John Dewey (1859-1952) was a philosopher and educational reformer who, while chairman of the Philosophy Department at the University of Chicago, started an elementary school or "laboratory" as he liked to call it. The curriculum of the school was based on his educational philosophy that knowledge is a by-product of activity; you do things, you learn from doing and then you consolidate the useful things you've learned so they'll be available the next time you do something. In Dewey's school, making lunch was an education in itself; the chemistry and physics of how ingredients combine in the process of cooking; the mathematics involved in weighing ingredients, adapting recipes and calculating cooking times; the biology involved in the production, digestion and elimination of food.
But the Dewey School worked in large part due to the dedication and inspiration of its staff and Dewey himself. When it opened in 1896 the school had sixteen students and two teachers; by 1902 it had 140 students, 23 teachers and 10 graduate student assistants [Menand, 2001]. Beleaguered elementary school teachers today would be jealous of such student-to-teacher ratios. Is it possible that the web will provide a nourishing and supportive environment in which students can learn without the hard-to-come-by individual support that lucky students are afforded in our most elite institutions? I suspect the answer is "No!" but I have to qualify my answer somewhat.
If you're interested in computer technology, the availability of relatively inexpensive computers, the software and educational by-products of the open-source movement, and the accessibility of high-quality on-line tutorials and course materials provide a veritable cornucopia of learning resources for anyone interested in computing. I've benefited hugely from these opportunities and given freely of my time and energy in providing much of my output free and on-line. But it's difficult distinguishing the good on-line resources from the bad, the useful from the misleading or downright wrong; it helps to have a guide, and I've had my share of helpful mentors over the years. But today's on-line resources are not so different from what was available in the public libraries when I was growing up. I liked to hang around libraries; it was common for me to teeter home on my bicycle with a backpack full of heavy science books. I preferred learning on my own to learning in school, and I became good at it. But as Richard Felder and many other educators have noted, there are very different types of learners. There are students who like examples and concrete detail and students who prefer generalizations and abstract principles. There are students who learn best by reading text or listening to lectures and those who need diagrams, pictures and demonstrations to make things clear. There are students who like to get their hands dirty working on projects and conducting experiments and those who like to reflect on what they've heard, read about or seen others do.
I certainly don't want to be discouraging, but learning is hard. Until the day comes when scientists figure out how to pour knowledge into our heads by hooking us up to a neural shunt and downloading the wisdom of the ages, most learning will happen the old-fashioned way, one step at a time, with the onus on the learner and the over-extended teacher/guide doing whatever can be done to facilitate and motivate. For those of you who like being exposed to the nitty-gritty details of real programs and real syntax and at the same time the high-level models and general principles that explain the functioning of those programs, I hope you've found this journal enjoyable and possibly even inspiring. At the very least I hope I've communicated some inkling of my enthusiasm for computer science and inspired you to learn more on your own.
I'm now in my office and I just took the opportunity to look up some of the details concerning the early years of the Dewey School. I'm at a bit of a loss for what to do next, both specifically about this journal and more generally about my research. There is a certain comforting rhythm that you get into working on a project like this journal. It provided a focus and an agenda for my waking hours. I worked on many other projects over the last 71 days, but this journal was a reliable constant that I could depend on and throw my energies into. Research is a very different sort of endeavor. It's characterized by uncertainty and fraught with the potential for frustration. Scientists can spend their careers trying to answer a question only to have someone else, quite possibly a still-wet-behind-the-ears upstart, find the answer after mere weeks of seeking. They can find that theories which stood the test of many years collapse under the weight of new findings. They can find that the questions they sought to answer are irrelevant or ill posed. The carrot is that each morning they get to face the unknown, pose the question, suggest the hypothesis, run the experiment, analyze the findings. They get to spend their days and nights inventing the future. What could be more fun, more challenging, more compelling than that?
As for this journal, I'm of two minds. I could just leave it as a web site in its present, somewhat disheveled state. The advantage of this alternative is that it wouldn't require any more work on my part and it would likely reach some of the audience that inspired me to write it in the first place. Alternatively, I could seek out a publisher and the services of a good editor to clean it up and put it in a form that would make it more accessible and possibly reach a wider audience. This would require more time on my part, time that I would otherwise spend on research. Also, I have to admit that I'm not looking forward to editorial scrutiny. Having written a couple of books and over a hundred scientific papers, I'm no stranger to criticism and editorial feedback, but that doesn't mean I go out of my way to subject myself to it. Besides, I've heard scary stories.
I've heard that the editors who manage the authors of popular science and technology books tell their charges that each equation will cost them ten percent of their expected readership. I assume that this means the first equation costs you ten percent of whatever your largest potential readership might be, the second costs you ten percent of whatever remains, and so on. That's pretty scary, for as we know from the August 2nd entry, this prediction describes a negative growth process with exponential decay. After ten equations, you've reduced your readership to 0.9^10 = 0.3486784401000001 ... oops, just lost another ten percent. And if I had (only) 100 equations (including the next one) then I'd reduce my readership to 0.9^100 = 2.656139888758754e-05 (that's 0.00002656139888758754 in decimal notation) of its original potential, so in this case I'd need more than 1.0/2.656139888758754e-05 = 37648.61949599018 readers in my initial pool of potential readers to ensure having even one stalwart left after the 100th equation. This book is full of code and has a healthy sprinkling of equations to boot, so either the editors are wrong in this case or there's no point in actually publishing this journal. I suppose I can hope that code isn't quite as repugnant to readers as equations seem to be.
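In the spirit of expressing ideas in runnable code, the arithmetic above can be checked in a few lines. This is just an illustrative sketch in Python (a language not otherwise used in this journal); the function name `remaining_readers` is my own invention, not anything from the text.

```python
import math

# The "each equation costs ten percent of your readers" rule is an
# exponential decay: after n equations, a fraction 0.9**n of the
# original audience remains.

def remaining_readers(n_equations, retention=0.9):
    """Fraction of the initial readership left after n equations."""
    return retention ** n_equations

print(remaining_readers(10))    # about 0.3487, roughly a third remain
print(remaining_readers(100))   # about 2.66e-05

# Smallest initial pool that still leaves at least one stalwart
# reader after the 100th equation:
print(math.ceil(1.0 / remaining_readers(100)))  # 37649
```

Running this reproduces the numbers above, floating-point fuzz and all; the ceiling of 1.0/0.9^100 confirms that a pool of 37,649 readers would be needed to keep a single one past the hundredth equation.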
I like expressing ideas in code. Computer code is, or at least can be, clinically precise, and you can take the code and run it on a computer. This latter property is, I think, the most exciting, the most revolutionary aspect of computers and computing technology. There's nothing wrong with natural language for expressing ideas precisely, but communicating precisely with natural language usually takes more effort and discipline on the part of both reader and author than either has the patience to endure. And, of course, you can be just as sloppy with code as with prose, perhaps even more so if it's particularly inscrutable code, because no one will even take the time to read it and provide feedback.
My own code is not a model of precision, clarity or much of anything else positive. My style of writing code, like my style of writing prose, is somewhat eclectic, discursive and occasionally bottom-up in the extreme. In order to write an algorithm or library that is clear, well-structured, top-down and of some general utility, I have to write it two or three times at the very least and throw away the intermediate versions. I wish I could say that reading my code is like reading prose, but it isn't. Nevertheless, I think that there is value in reading the code of any reasonably disciplined programmer. The value is in learning to translate the syntactic and stylistic conventions of one person into the syntactic and stylistic preferences of another, using as a semantic basis the various models of computation that provide meaning for computer programs.
I tried to avoid attaching too much importance to computational models in this journal, but I did mention them with some fanfare in passing: the substitution model, the register machine model (also known as the von Neumann architecture), the relational model for databases, the client-server model. I'm now convinced that most students pick up these models by osmosis but don't learn their names or articulate the standard terminology until they have reason to, say in borrowing from the mathematics of theoretical computer science and learning to prove properties of computational problems and algorithms for solving them.
And what is the value of scanning reams of incomprehensible syntax trying to make some sense out of the whole? I counseled, somewhat disingenuously, "squint" at the unfamiliar, scan for the verbs and nouns and don't worry about the analogs of the "uh"'s, "umm"'s, "really"'s, and "like man"'s that punctuate everyday speech. Every paren and semicolon in a computer program conveys some meaning, but some of the syntax doesn't convey enough information to bother puzzling over. Computer scientists get used to extracting the gist from unfamiliar syntax. Hardly a day goes by without some piece of arcane syntax whizzing past as I'm invoking, installing, compiling and otherwise talking with machines. And sometimes I have to go back and figure out what the mysterious character sequences meant, but thankfully not often.
I guess that part of the reason for inflicting so much code on the unsuspecting reader is to demonstrate that syntax need not be the insurmountable barrier to understanding computers that it can become for some people. Just as learning a second language is often best accomplished by immersing yourself in the everyday discourse of a country in which the language is spoken, so learning to talk with computers is best accomplished by jumping in and talking with computers and computer programmers in their language. This is why I like interactive programming environments (sometimes inappropriately identified with so-called "interpreters") such as those typically available for Scheme, Common Lisp, Prolog, Mathematica, various database languages, scripting languages such as those available under Unix shells, and potentially any language, though less commonly used for C, C++ and Java. Talk with your computer. Make the computer adopt your language conventions by learning how to devise new special-purpose languages and building compilers and interpreters to make sense of those languages.
You have the advantage right now. You're smarter, more adaptable and more knowledgeable than any computer now in existence. There's no part of the computer on your desktop, or for that matter no part of the biggest, fastest computer on the internet, that you can't understand in all of its fascinating detail. You can write programs that make it easier to communicate with and share experiences, information and projects with your friends anywhere in the world. You can build new, virtual worlds to inhabit and invent new forms of commerce, new forums for political discourse, and new ways of experiencing the physical world in which we now live. You can make the digital prostheses of the future happen now. In short, you can conjure up the spirits in the machine and make them do your bidding. And perhaps you'll be the first person on your block to build a computer program that's smarter than you are.
It's 5:22PM. I had to run to catch the 5:10PM bus and now we're stuck in traffic on I-195 heading east. All day long with the continued warnings to be alert to possible signs of terrorism, I've been a little more sensitive to police sirens. But I'm pretty confident that the ambulance and police vehicles zipping past in the breakdown lane with lights flashing and sirens wailing are hurrying to investigate an accident up ahead that's caused traffic to grind to a halt.
There's bright sun and blue sky overhead interspersed with high scattered clouds and strong winds that have been gusting through the city streets all day long. Someone in front of me is saying that Gustav, the first hurricane of the 2002 hurricane season, is responsible for the winds that are knocking down trees and power lines throughout southeastern New England. I'm looking forward to seeing the whitecaps flashing in the late afternoon sun on Narragansett Bay as the bus heads across the Mount Hope Bridge on its way to Little Compton. Roger's class today went well and I enjoyed talking with the students about human and machine intelligence. It was fun seeing them get excited about the issues, testing their own preconceptions, trying on new ideas, and finding holes in their own and their classmates' arguments. You can't help thinking that this, talking with one another about who we are and how we relate to one another, is one of the most important aspects of being human, or, dare I say, being a machine with suitably advanced software. My laptop is warning me that its battery is low. I've been carrying it with me all day and haven't taken the time to plug it in for a recharge. My computer has more sense than I do when it comes to knowing when to call it a day.