Tuesday, December 03, 2002

The New York Times has an article on the Dutch film Necrocam.


Toward the end of "This Is Our Youth," Kenneth Lonergan's play about disaffected New Yorkers set in 1982, the characters learn of an acquaintance's death. The news spooks the motor-mouthed Dennis into pondering the benefits of religion when confronting the afterlife. "How much better would it be," he asks, "to think you're gonna be somewhere, you know? Instead of absolutely nowhere. Like gone, forever."

Fast forward to 2001, when the Internet has given the youths in "Necrocam," a 50-minute film made for Dutch television, a less conventional way to cope with death's mysteries. Christine, a teenager with cancer, tells her friends that upon her death she wants a digital camera with an Internet connection installed in her coffin. Images of her decaying remains will then be transmitted to a Web page for all to see, making her virtually immortal.


The Times sees this as symptomatic of how all-pervasive computers and the internet have become for the younger


The movie's accomplishment is to capture the way technology, including the Internet, has permeated contemporary culture. This is our youth's daily existence. The film's young people communicate through online messages, play computer games and record their pledge with a video camera instead of a quill dipped in blood. For them technology is an extension of life. So it is only logical that cyberspace would play a role in death.


generation. I can only agree with this.

I was at high school in the 1980s, an undergraduate from 1987-1991, and a graduate student from 1992-1997. My high school got its first computers in 1982, and this was the first time I had seen a personal computer. As I was studying computing science at university, I had access to a Unix machine connected to the internet from 1988. In all cases, I was discovering something new. In high school, the teachers and I were still discovering what computers were. When I was an undergraduate, I wasn't taught anything about the internet: I simply discovered, by playing around, that Usenet was there and that I could send e-mail anywhere in the world. When I was a graduate student, my contemporaries in the non-scientific world were just discovering e-mail, and some Arts students, even rather brilliant ones, did not use computers, almost as a matter of pride.

I have used computers since I was 12, and the internet since I was 18, but in my generation there was something novel, and geeky, and antisocial about this. Socially, the fact that I used computers made me unusual, and was an isolating factor. Now, however, the reverse is true. The generation ten years younger than me have been using the internet at least since they started high school. Even the arts students are immersed in it, and cannot imagine a world without it. An English graduate student is going to use it all the time: for communication, for research, to feed her Buffy habit when she is out of the country, and for everything else. The idea that one would not is inconceivable. I know a lot of the history of computers and the net, because to some extent I lived through it, and in any event I find researching these things fascinating. For the younger generation, however, it is just there. It's part of life. It's the opposite of an isolating factor. This is quite different to how it was for me.

Some more thoughts. When I was a graduate student in mathematics and physics a decade ago, it was not possible to be a graduate student in those fields and not find computers used pervasively in your work. You might not have directly required them for your research (although more and more fields do, and mine certainly did), but computers were all-pervasive in the field, and you used them for communicating with other researchers, writing and submitting papers, and just as part of life. There were, and undoubtedly still are, however, (often very senior) scientists who were still active but resisted this trend. In their day, computers were not widely used, and they did their work by old-fashioned methods. We would generally regard these people as a bunch of old fogeys. Old scientists did not necessarily belong to this group (computers were invented in the 1940s and 1950s, and some Cambridge mathematicians were among the first people to use them), but some did.

Other fields (e.g. biology) have been transformed in exactly the same way, and inevitably so, as what a researcher with a computer can do is different from what a researcher without a computer can do. (Oddly, I think this has much to do with using the computer for computation in most cases, too. The areas where computation can be useful are not always appreciated, but give researchers tools with great computational power, and they will figure it out.)

However, ten years ago, the humanities were still largely exempt. I have contemporaries who did Ph.D.s in things like English and History who actively resisted using computers, and insisted that they were not relevant to their work and were not useful. (One person I am thinking of was a particularly brilliant researcher who is now a faculty member at a very prestigious university.) These people now probably do use computers and the net, but I suspect that they still don't really enjoy it.

As I mentioned, if you meet someone now who is a graduate student in English, or who is about to start a Ph.D. in English (as I did the other day), you find that the computer skills go very deep. The gap between the old fogeys and the younger generation is going to become apparent here, too. This is interesting, and the humanities are going to be changed by it.

Michael's pet foible of the day. I loathe the expression "Information Technology", or "IT". Fundamentally, computers are for computation, and that is where their biggest impact lies. Management of information is secondary. (I will explain why this is so some other time. Promise.)
