Friday, April 29, 2011

Don’t Look Now—Psychopathic Tendencies in (Educational) Technology

Imagine a perfectly rational, logical person, one with a perfect memory, but one who has no emotions.

Scary, huh? Sort of a psychopath? Yet we have spent countless hours and vast sums over the past century or so developing a machine that embodies exactly this fantasy, and many of us now spend hours each day interacting with these devices—perfect logic, perfect memory, and no emotions. In fact… don’t turn around… you’re probably reading this on one right now.

Real psychopaths have a will of their own and present obvious danger to those around them. Machines represent the will of their creators and users, and so computers—fortunately—act only as psychopaths in the movies (2001: A Space Odyssey) or in the hands of actual psychopaths. (Whether or not a device with latent psychopathic tendencies magnifies psychopathic tendencies in each of us would be an interesting area of study.)

Further, as Joseph Weizenbaum (Computer Power and Human Reason) and others have pointed out, tools are not merely embodiments of instrumental reason. (Instrumental reason, you could say, arises when we generalize the values of technology to all values.) Tools embody the values of their creators, and they become part of the human world in which they are used; once created, they become part of the way we picture the world and our role in it.

Consequently, as Weizenbaum also points out, all technology is educational—one function of education is cultural transmission, and technology is intimately bound up with any culture.

Once upon a time, a human being grabbed a rock and used it as a hammer. She may have put it down, looked around, and thought, “Gee, I never realized how many hammers are lying around here.” Technology begins—and ends—not with devices external to us, but in our own minds and in our perceptions of the world.

Further, in the long run, the devices we create may replace an older version of the world with a new version in which the values of the device may be mistaken for reality itself. A simple instance will suffice. Experience used to be seamless and whole, one with the rhythms of nature—sunrise, sunset, the cycle of the moon, the seasons. Eventually, European monks decided to build a mechanical bell ringer to remind them of the time to pray. This precursor of our modern clocks (from the French cloche, bell) had no face and simply established a mechanical rhythm, roughly correlated with the times during the day and night when a bell should ring to call the monks to prayer.

Over the past 800 years, we have so internalized and enhanced the mechanism of the clock that most of us would agree that clocks “measure time.” A moment’s reflection will show that they do no such thing. What do they do? Like a metronome, they establish a mechanical rhythm. That’s all. Any interpretation of time with regard to the rhythm of the device belongs to us, although we have largely forgotten this.

Clocks, then, despite their obvious advantages, have also served to mechanize, standardize, and fragment our experience of time.

(Similarly, the moveable type printing press mechanized, standardized, and fragmented our experience of texts.)

Leaping ahead, we may say that computers, because of their astonishing malleability (what do you want the computer to do? We’ll program it to do that), don’t simply introduce psychopathic rationalism into one portion of reality, such as our experience of time or of a text; they co-opt, rationalize, and standardize experience itself. The experience of the computer is the illusion of experience.

Now, Jill or Johnny, I want you to go upstairs and don’t come down until you’ve spent at least one hour doing your homework on your psychopath.
