Monday 23 April 2007

walk through walls, like a philosopher

One of the great things about being an academic is that you interact with weird people. No offense, weirdos--you perform a great service. You tell me things I would never know without you. One of you, for example, told me that the Israel Defense Forces reads the postmodern philosophy of Deleuze. This from Eyal Weizman at www.frieze.com (sorry, I don't have a better citation). An Israeli friend of mine suggests taking this story with a grain of salt, so please do so. Nevertheless, I find it interesting enough to warrant repeating.

In its way, I suppose this should not be unexpected--philosophy and warfare have always influenced one another. The trouble is, when I'm in the world of philosophers, I tend to view warfare at a distance: abstract and simple. Philosophy enters, if at all, at a moral or ethical level.

But Deleuze (reconditioned) is all about tactics: rethinking the battlefield itself. The idea is simple enough. Examine your preconceptions about battle. These will probably be very similar to the preconceptions your enemy has. Now invert those preconceptions--in a world of booby-traps, doors become the one place you don't pass through. Instead, you walk through walls. In a world of deadly crossfire, alleys aren't passages, they're barriers. Former barriers (buildings, civilian dwellings, ceilings, and walls) become passageways.

What better way to surprise a Clausewitzian than by being Foucauldian?

That's not to say all philosophy is equally useful. Derrida, apparently, is a bit too opaque even for the IDF. Can't say I'm surprised.

Saturday 21 April 2007

graduate school is great exercise

There is an exercise placebo effect. From Psychological Science via Overcoming Bias.
Although actual behavior did not change, 4 weeks after [a group was told their on-the-job routine satisfied the Surgeon General's exercise advisory, they] perceived themselves to be getting significantly more exercise than before. As a result, compared with the control group, they showed a decrease in weight, blood pressure, body fat, waist-to-hip ratio, and body mass index. These results support the hypothesis that exercise affects health in part or in whole via the placebo effect.
I'm sure I need not remind everyone that reading my blog satisfies the Surgeon General's exercise advisory. Keep reading and watch those pounds fall away!

Friday 20 April 2007

art without a frame

Here is another dated post, a Washington Post piece by Gene Weingarten. It is significant not so much for the 'experiment' as for Weingarten's thoughtful analysis.

The setup: what happens when one of the world's greatest musicians plays one of the world's greatest instruments at one of the world's greatest, uh, metro stops?

Mark Leithauser, senior curator at the National Gallery, puts the project in perspective:
"Let's say I took one of our more abstract masterpieces, say an Ellsworth Kelly, and removed it from its frame, marched it down the 52 steps that people walk up to get to the National Gallery, past the giant columns, and brought it into a restaurant. It's a $5 million painting. And it's one of those restaurants where there are pieces of original art for sale, by some industrious kids from the Corcoran School, and I hang that Kelly on the wall with a price tag of $150. No one is going to notice it. An art curator might look up and say: 'Hey, that looks a little like an Ellsworth Kelly. Please pass the salt.'"
Context matters. Most reports would stop here. Not Weingarten. He brings in Kant.

Paul Guyer of the University of Pennsylvania, one of America's most prominent Kantian scholars, says... if Kant had been at the Metro watching Joshua Bell play to a thousand unimpressed passersby, "He would have inferred about them... absolutely nothing."

Even this is not the end of the story--to say that art needs a frame, and that a person on the street can't be blamed for missing it, isn't enough. The most interesting parts of the story are the interviews with the individuals who did notice the art. Check out the article--there are accompanying videos.

neuroscience and experience

From Will Wilkinson (a month ago):
One afternoon recently, Paul [Churchland] says, he was home making dinner when [spouse] Pat burst in the door, having come straight from a frustrating faculty meeting. “She said, ‘Paul, don’t speak to me, my serotonin levels have hit bottom, my brain is awash in glucocorticoids, my blood vessels are full of adrenaline, and if it weren’t for my endogenous opiates I’d have driven my car into a tree on the way home. My dopamine levels need lifting. Pour me a Chardonnay, and I’ll be down in a minute.’”
Wilkinson looks upon Pat's statement as a victory of some sort. Her capacity to diagnose the chemical cause of her emotional symptoms, and moreover to prescribe a treatment, is quite a trick (apparently, the Churchlands talk like this all the time), but it is not the sort of victory Wilkinson wants it to be. He thinks that Pat's highly attuned scientific sense of herself--her scientific objectivity, let's say--is what keeps her from walking in and shouting at Paul for making a mess in the kitchen. Sorry, but one has little to do with the other. Naming chemicals doesn't prevent transference. Moreover, any self-observant person is capable of saying, "Paul, don't speak to me, I'm in a bad mood. I need to relax. Pour me a Chardonnay, and I'll be down in a minute." Leave out the neuroscience, and we still get the same result.

None of which is to say I disagree with the self-diagnosis Pat gives--surely she's right. Nor am I insisting there's "more to the experience" (however phenomenologically unsatisfying 'glucocorticoid' may be). Despite my stubborn deterministic materialism, I am simply not convinced science is always helpful. There's a reason we have everyday language, and it's to describe everyday experience. Only for people like Paul and Pat, who know the phenomenological content of 'serotonin', is such a descriptor appropriate. The rest of us already have a word for that: tired.

reflexive integrity

Brian Cantwell Smith has a simple four-stage model of how language changes during a transition between a “prior” era and a “successor” era.
  1. Conservative. The language of the prior era is used uncritically, its epistemology and ontology taken for granted. Much everyday work can be accomplished here.
  2. Reactionary. The language of the prior era is still in use, but its objects are denied. There is an equivocation on the denial, for it is not yet clear if a theory is wrong, or if the criteria are wrong, or if the words themselves have lost their meaning.
  3. Liberal. It has become clear that the conceptual framework is flawed. The prior language referred to something, but it's now hard to say what.
  4. Radical. A wholly new language is formed. Goto step 1.
It is easy to be a reactionary, says Brian, but reactionaries almost always fail the basic test of reflexive integrity. Because their language is inherited from the prior era, it is almost never appropriate to their task. Verificationism, for example, fails the test of reflexive integrity, because the tenet that "claims should be accepted only if they are verifiable" is itself not verifiable.

But how do you get from 2 to 3? Or 3 to 4? Those steps seem quite mysterious, perhaps magical. Thar lie dragons! And paradigm shifts! And lions and tigers and bears (oh, my)!

[yes, it is clearly that time of year when I have essays due, for I am spending all of my time writing on my blog]

inoculation

Nunberg again:
There is a widely repeated claim to the effect that a daily issue of the New York Times contains more information than the average seventeenth-century Englishman came across in a lifetime. Now whatever writers have in mind when they make such claims (not a great deal, you suspect), it’s clear that they are not talking simply about the sum of individual propositions that are communicated from one agent to another.
Nunberg supposes that we can gather a great deal of what is meant by that term, “information,” from uncritical uses like this, or the claim that “the amount of information is doubling every 15 years.” He is undoubtedly correct, but more interesting is the attractiveness of such claims—the uncritical ones about seventeenth-century Englishmen and information growth rates. They belong to that class of believably absurd claims, memes, mental viruses which infect their hosts (Nunberg in 1995, me upon reading it in early April, you now, perhaps). Do these paragraphs constitute a new strain of this nasty bug? Perhaps a less virulent variety? Or more virulent, now clothed as it is in haughty self-reference?

the future is now

Geoffrey Nunberg wrote in “Farewell to the Information Age” that “nothing betrays the spirit of an age so precisely as the way it represents the future.” Two mistakes futurists make, he continues, are to extend some current innovation to its logical end or to unintentionally naturalize some assumptions about culture. His example is a picture in a 1950 issue of Popular Mechanics. An apron-clad woman sprays down oddly-shaped plastic furniture with a garden hose—in her living room. Plastics (yes, Mr. McGuire’s “one word” to Benjamin in The Graduate) is an instance of the former mistake, while the unwitting naturalization of “woman’s work” in the image is an instance of the latter. Nunberg hopes to avoid both mistakes in his own futurism, an essay on the future of “information” in the electronic era. For the most part, he is masterful—and surprisingly prescient. He wrote in 1995, and many of his predictions have already come to pass.

Nunberg argues that information is dead. Information requires print, because it is through the physical and social institution of print that information came into being. The constrained, valuable column inch of the printed page helped to create the material character of information—that uniform, morselized, quantifiable entity. The social history of its development, and the institutions that grew up to support it, contributed to and codified its semantic qualities: objectivity and autonomy. The example Nunberg uses to illustrate his point is, ironically, non-print. Imagine stepping into a rental car for the first time, turning on the radio and hearing that the Red Sox lost to Toronto. “You accept what you hear without interrogating it,” says Nunberg, “in virtue of the form of language that expresses it and the kind of document that presents it.”

In the age of the Internet, Nunberg is certain, these strictures will break down. Nunberg foresees words sprawling across undelimited web pages, no longer tightly constrained in expensive column-inches. He anticipates the growing concern with sources, the changing meaning of what it is to be an author, or journalist, or photographer, or professional. Strikingly, for 1995, he foresees discussion-based forms of collaboration. A linguist, Nunberg wonders at the new, twisted meanings of words when applied in this new context. Would a Derridean speak of the hyperlink as actualizing intertextuality to the point of eradicating all boundaries between texts? If so, would she realize the anachronism inherent in the very idea of intertextuality? In a realm where there can be “intertextuality without transgression,” intertextuality is emptied of its old meaning.

Perhaps because he is so worried about avoiding the two mistakes of the futurist, Nunberg falls into the trap of the presentist—he misidentifies information, that very entity he hopes to define. His example of hearing a score on the radio betrays his mistake. For who among us trusts any printed source without interrogation? Perhaps we trust the sports scores and the weather (insofar as we trust any weather report). If so, then information is far more limited than Nunberg makes it out to be. Let me repeat what Nunberg says of the radio: “you accept what you hear without interrogating it... in virtue of the form of language that expresses it and the kind of document that presents it.” True enough—interpretation of sports scores and weather reports is largely about formatting, not about authorship. Besides these, though, what else do we trust?

Are we now so embedded in the interrogative electronic era that we have now altered how we read print sources? Am I simply a jaded academic, cynically picking fights over the supposed “objectivity” of the media? Or is Nunberg just plain wrong?

the most fun you can have in the shower... almost

Science is indeed stranger than fiction. From Seed.

observations on the changing climate

I grew up in Maine. My hometown has two thousand people, many square miles of farmland, and many more of forest. In my youth I wandered for hours in the woods behind my house without ever seeing another person—though there were signs. Mossy old stone walls, rutted tracks overgrown with birch saplings, the occasional rusted bit of barbed wire with a tree growing around it. The people there mostly work in offices now, but their childhoods were like mine. They have a deep connection with the land, and an understanding of how things change over the years.

I live in Toronto now. The juxtaposition is jarring. I probably walk past two thousand people on my way to class each day. Neighborhoods stretch for miles, office buildings scrape the sky. Torontonians are much more easily convinced that human beings are causing climate change. Why? Because human beings make up so much of their environment. Buildings come down and new ones go up. People see the crush of humanity, the grime of city streets. They see the lush green of the parks trampled to mud. In my hometown, there’s a lot of dirt, but none of it is grime. In my hometown, people see the power of nature, and those among us who still farm rely on it for their livelihoods. Nature is still majestic in my hometown. The sublime lives. The eighteenth century is present in the attitudes of the people. In Toronto, the eighteenth century is present in the stone buildings now dwarfed by steel towers.

No wonder people in cities come around to liberalism, humanism, and belief in human-caused climate change more easily than folks who work the land for their very lives.

city life

As I fly over Toronto, I sometimes wonder how much of it—this city—is devoted to its own maintenance. Feeding the people here, repairing the infrastructure, removing waste and supplying replacements for broken items. What else goes on here in Toronto?

Steve Johnson has written books exploiting the trope of the city as an organism. But new research suggests this is a false metaphor. When organisms get larger, they use less energy per unit of mass and move more slowly. On the other hand, “as cities get larger they create more wealth and they are more innovative at a faster rate.” In other words, as a city grows, its energy requirements grow more slowly than its size, while its wealth grows more quickly (a toy calculation below makes the contrast concrete). Bigger cities are 'smarter' cities. That's not to say things are perfect:
Large cities generate considerable wealth, they are home to many high paying jobs and are seen as engines of innovation. But cities also generate pollution, crime and poor social structures that lead to the urban blight that plagues their very existence.
This research suggests that more resources should go toward solving pollution, crime, and social problems—not toward suburbanization. I agree; suburbs are a great evil. I suppose that if cities handled pollution, crime, and social problems better, fewer people would flee to the burbs.
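
To make the contrast concrete, here is a minimal sketch of the power-law scaling this kind of research describes: a quantity Y grows with population N as Y = Y0 * N**beta, with an exponent below 1 for energy and infrastructure and above 1 for wealth and innovation. The particular exponents (0.85 and 1.15) and the population figures are illustrative assumptions of mine, roughly the values reported in the urban-scaling literature, not numbers taken from the article itself.

    # Toy urban-scaling sketch: Y = Y0 * N**beta.
    # The exponents are illustrative assumptions, not figures from the
    # research discussed above.
    BETA_ENERGY = 0.85   # sublinear: doubling the city less than doubles energy use
    BETA_WEALTH = 1.15   # superlinear: doubling the city more than doubles wealth

    def scaled_quantity(population, y0, beta):
        """Return Y = y0 * population**beta."""
        return y0 * population ** beta

    for pop in (100_000, 200_000, 400_000):
        energy = scaled_quantity(pop, y0=1.0, beta=BETA_ENERGY)
        wealth = scaled_quantity(pop, y0=1.0, beta=BETA_WEALTH)
        print(f"pop {pop:>7,}: energy per person {energy / pop:.3f}, "
              f"wealth per person {wealth / pop:.3f}")

Run it and the per-person energy index falls with each doubling while the per-person wealth index rises, which is the sense in which bigger cities are 'smarter'.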

But I sure wouldn't trade in my childhood of running around in the woods of Maine.