Nietzsche in the Maine Woods


We're crossing the Confederation Bridge from Prince Edward Island to New Brunswick. Unfortunately, the barriers along the sides of the bridge are too high to see over; otherwise I'd be looking at the view. All I can see is a ribbon of asphalt stretching ahead with a four-foot concrete barrier on either side. Once we crested the highest point on the bridge, the view got a little better, but the atmosphere was hazy, no doubt due to several days of unusually hot (for P.E.I.) weather.

We spent four nights on P.E.I. and I gladly would have stayed longer, but we felt we should be getting back. It took us three days to drive up and I reckon it will take three days to drive back. After spending the first two nights at Shaws Hotel, we took a tour of Kings County in the northeastern part of the island and spent one night in Souris, a little town once known as the Tuna Capital of the World. They still process tuna there but not, I assume, in sufficient quantity to qualify as the world capital. We had a delicious grilled halibut dinner in the Blue Fin Restaurant; when I asked why there was no tuna on the menu, Steven, our waiter, looked surprised by my question and then told me that no one around here could afford to eat tuna and that all of it was packed and shipped by air directly to Japan.

The beaches in Kings County were pristine. We stopped at the Singing Sands Beach, a provincial park, on the southern side of the island and then, on the way back to Shaws, we spent half a day at the P.E.I. National Park in Greenwich. The national park in Greenwich had only recently opened and featured a splendid walk through grass-covered fields, woods that smelled of balsam, and a bog replete with cattails and surrounded by dunes. The path through the bog was along a winding wooden walkway that floated on hidden pontoons and afforded great views of the surrounding landscape. The path led to a beach that extended in either direction for miles and was occupied by only a few people. We walked a mile or so and had the beach to ourselves.

I haven't written a journal entry since last Thursday, but in the evenings I've been doing some recreational hacking, developing code to automate the construction and maintenance of a journal web site. When considering writing software to perform a particular task, a programmer has to ask whether it would be easier to perform the task manually or to write software that performs the task automatically. This question is always tricky to answer, as it is hard to quantify the time and effort required both for performing the task manually and for writing the requisite code. Starting programmers are notoriously bad at estimating the effort involved in writing software. I decided to automate the creation of a journal web site for two reasons: First, I figured I could write the code more quickly than I could produce all of the necessary web pages by hand. Second, I thought it might be fun, in the sense that solving a crossword puzzle can be fun. Hence, I spent several evenings engaged in recreational hacking, sitting on the porches and in the parlors of the various hotels and inns we stayed at on the island.

My task was that of building a web site for the journal, which I've now decided to refer to as "Diary of a Computer Scientist" or "DoCS" (alluding to "documentation" and the fact that Jennet, my assistant when I was department chair, used to refer to me and every past chair as "doc" and so we, current and past chairs, are, collectively, "docs"). Constructing the web site would require cleaning up all the journal entries and adding HTML formatting tags so the pages display nicely in a browser. I also wanted to enhance the content in a number of ways: I wanted to add a table-of-contents page and a bibliography page, and to add HTML code to support navigation within the web site. For each entry page, I wanted navigation buttons to get to the previous and next journal entries and to jump to the table-of-contents and bibliography pages. I also wanted a subject index containing a list of terms in alphabetical order along with pointers to the journal entry pages in which the terms appear.

I knew that I would have to scan each entry page to check for spelling and grammar, so I figured I'd use the opportunity to add hypertext links to supplementary web pages and HTML tags to identify terms for the subject index. My intent was to automate the things that were relatively easy to handle and that might require revisions, upgrades and enhancements. Things that were difficult to automate, such as identifying appropriate items for the index, I'd just have to do by hand but, with a little foresight, only once. I decided on a simple format and navigation scheme for each page; since I was going to automate building the pages, I could change this format at any time and use the same software to rebuild the web site. Indeed, I wanted to be able to make changes - improve the format or navigation, add supplementary links or identify additional terms for the subject index - at any time and rebuild only the relevant parts of the web site with a single command.

For identifying items to appear in the subject index, I used an existing HTML tag, "A" for "Anchor", and attribute, "NAME", assigning the attribute a value of the form "index-term" where term is the index entry as it will appear in the subject index with hyphens replaced by spaces, e.g., "index-operating-systems" will display in the subject index as "operating systems". Adding these annotations will be a bit tedious, but once done I can easily automate the construction of the subject index web page.
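The code I wrote for this was in PLT Scheme, but the anchor convention itself is easy to illustrate. Here is a rough Python sketch (function and variable names are my own, hypothetical choices) of how a page's index annotations can be harvested and converted to their display form:

```python
import re

# Anchors of the form <a name="index-term"> mark subject-index entries;
# hyphens in "term" are displayed as spaces in the index.
INDEX_ANCHOR = re.compile(r'<a\s+name="index-([a-z0-9-]+)"', re.IGNORECASE)

def index_terms(html):
    """Return the display form of every index term annotated in a page."""
    return [m.group(1).replace("-", " ") for m in INDEX_ANCHOR.finditer(html)]

page = 'Some text about <a name="index-operating-systems"></a>operating systems.'
print(index_terms(page))  # -> ['operating systems']
```

Running this over every entry page, and recording which page each match came from, yields exactly the term-to-pages mapping the subject index needs.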

I started coding in Perl but soon grew weary of this. I use Perl for very small programs - no more than a few dozen lines - and my repertoire of Perl functions is limited. I'm comfortable using Perl for searching and modifying strings but this project would involve a good deal more. I happened to have PLT Scheme on my laptop as well as the very useful Help Desk that comes with DrScheme. The DrScheme Help Desk made it easy for me to read the documentation on all Scheme functions and libraries that might prove useful.

I didn't have a carefully-thought-out plan for writing this software but I've done similar sorts of things in the past so I knew pretty much how to proceed. The first thing I did was write code to scan the file system containing all of the journal entries and compile a database containing a record for each journal entry. The record for a given entry contains its path in the filesystem relative to the root of the journal subtree, the paths for the previous and next entries, title information gleaned by scanning the first couple of lines of the entry and other sundry information such as the last modification date to assist in automatic updates. I also wrote code to scan all of the files to extract information on items identified for the subject index.

Once I had the procedures for compiling all of the information needed to construct the web site, I wrote procedures to generate all of the relevant pages and update those pages if I made any changes. These procedures contain the HTML formatting for all of the web pages used in the site; if I want to change the formatting, I only have to modify these procedures to change the "look and feel" of the entire web site. It took me four evenings, a couple of hours each evening, to write and debug all of the code, and another evening to clean up and document the code so I'll be able to understand what I wrote if I have to come back and change it at a later date. If you're curious, you can look at my code, but I make no claim for its style or adherence to good programming practice.
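To make the "look and feel" point concrete, here is a hypothetical Python sketch (not my Scheme code; the names are invented) of a generation procedure that holds all of the formatting in one place, so that changing the page template means editing this one procedure and rebuilding:

```python
def render_entry(record, body):
    """Wrap an entry's body in the site's standard template and navigation."""
    # Navigation links come from the prev/next fields in the entry's record.
    nav = " | ".join(
        f'<a href="{record[key]}">{label}</a>'
        for key, label in [("prev", "Previous"), ("next", "Next")]
        if record[key] is not None
    )
    return (f"<html><head><title>{record['title']}</title></head>"
            f"<body><p>{nav}</p>{body}<p>{nav}</p></body></html>")
```

Incremental rebuilds then amount to comparing each record's stored modification date against the file's current one and regenerating only the pages that changed, plus any neighbors whose navigation links are affected.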

By the way, there are all sorts of existing tools that could have made my task much easier. There is an increasing number of on-line diaries and so-called web logs (or "blogs") maintained by web log authors (or "bloggers"). There are also open-source tools for implementing blogs and automating the process of building web sites, such as Zope and WML. But in this case I was interested in recreational hacking, and sometimes it takes less time to build your own tools than it does to understand someone else's.

We're just about to Saint John, New Brunswick, and planning to drive on to Bangor, Maine, before we quit for the day. We're listening to the last tape of the books-on-tape version of Tom Wolfe's "Hooking Up" [Wolfe, 2000] and he's rattling on about how neuroscience is exorcising the "ghost in the machine", the mysterious "mind's I" that lurks in so many early accounts of mind. Wolfe is recounting an interesting anecdote about how scientists developed a quick and painless method for measuring intelligence relying on electroencephalography; the method provides an estimate of a person's IQ that is within half of one standard deviation of the estimate obtained by the Wechsler adult IQ test. It seems that no one was interested in the practical application of their method in part, Tom Wolfe contends, because its success underscored the fact, becoming increasingly clear from the scientific study of the brain, that intelligence, like so many other attributes of cognition, is merely a matter of how we are wired.

Wolfe also mentioned Friedrich Nietzsche, which piqued my interest as I'd mentioned Nietzsche in the August 1, 2002 entry talking about web spiders. (That reference to Nietzsche gave me a nice test of how my cross-referencing scheme is working.) Wolfe mentions Nietzsche's famous (1882) statement "God is dead" and Nietzsche's prediction that, without God as a basis for guilt, there would be a "total eclipse of all values" leading to conflicts of unprecedented brutality and scope in the twentieth century. World Wars I and II were certainly conflicts of terrible violence and broad scope, but it's not clear to me that humans in the twentieth century were any more or less lacking in values than their counterparts in any earlier century of the millennium. What is clear is that we now have weapons of unprecedented power capable of destruction on a scale unimaginable in Nietzsche's time and, hence, lapses in good sense and morality can have disastrous consequences.

Wolfe goes on to suggest that a statement analogous to "God is dead" but concerning the existence of the soul may be issued by some neuroscientist or science popularizer in the next decade or so, once the mysteries of the brain are sufficiently solved. Wolfe predicts that humans may be similarly set adrift (bereft not only of their cherished values but of their vaunted free will) upon learning that the human mind is nothing more than a biological computer whose capabilities are largely predetermined by our genes. Heady stuff to accompany a scenic drive through the Maine woods - we're currently just over the Canadian border and heading along Route 9 toward Bangor.