What is the deal with institutional clocks? In the classroom I'm working in this semester, the wall clock has been wrong, differently, every day so far...but always by roughly exact-hour intervals (in other words, it might say 5:11 when it's actually 10:11). The two nearest clocks in the hallway are also incorrect - different both from one another and from the clock in my room, but likewise wrong by exact hours.
This is something I've noticed about institutional clocks, in schools, universities, and the like, for years...and it baffles me. In the rest of the world, setting a clock is no big deal: it runs accurately for a good long time, and sure, once in a while someone might forget which way to move it when Daylight Saving Time comes or goes, and so for a while it might be two hours out of whack - but that's easily corrected. What doesn't happen is random, arbitrary shifting of the time such clocks tell.
There seems to be some centralized (or at least distant) control for institutional clocks...since, as I'm sure you've all observed, when they're wrong, sometimes they suddenly start progressing rapidly forward or backward. Presumably someone somewhere is causing this to happen. But that still doesn't explain why such clocks are so prone to going wrong...or why when they do so, they're so often wrong by exact-hour intervals.
One more thing: these are the same kind of clocks whose minute hands creep backwards for a second before advancing...a phenomenon well known to clock-watching test-takers, in whom it causes momentary heart attacks - the sudden conviction that they will never get out of that room.
too much typing—since 2003
Showing posts with label thinky. Show all posts
2.03.2009
12.15.2008
most bizarre house-style typographical preference evar...
So I'm reading an article from The New Yorker on Elvis Costello's talk show (which, since I have neither cable nor terrestrial TV, was a surprise to me), and I discover the following rather peculiar typographical sequence in a reference to '60s singer Billy Stewart: "R. & B. singer."
What? Those periods and spaces are just weird, especially in conjunction with the otherwise abbreviation-friendly ampersand ("R. and B." would be exceedingly odd, but somehow more internally consistent).
And I thought The New York Times' infamous "Mr." or "Mrs." fetish was odd (deployed most memorably in a late-seventies article on Meat Loaf, referring to him, absurdly and comically, as "Mr. Loaf").
10.31.2008
the other side
An essay by John Jeremiah Sullivan in the latest issue of Harper's describes a recently reissued, extremely rare country blues recording called "Last Kind Words Blues" by Geeshie Wiley. At one point Sullivan cites some lines from the song, which he hears as "The Mississippi River, you know it's deep and wide, / I can stand right here, / See my baby from the other side." In the context of the song, whose narrator seems to be speaking with her dead father, the peculiarity of the syntax takes on a spooky literality: the narrator seems half embodied, this side of the river, and half ghost, viewing her "baby" from the other side.
While Sullivan acknowledges the possibility that such a reading pays overly literal attention to the lyrics (Sullivan quotes John Fahey, who - after asserting that the lyrics of these old songs didn't mean much - spends several hours trying to determine what the words are), and while he analyzes obsessive rare-record collecting to some degree, he only hints at one possible reason such collectors might be so compellingly haunted to track down these records. And that is the way that, much like the narrator of the Geeshie Wiley song, sound recordings can open a sort of virtual portal to another time and place.
The art of audio recording, conventionally conceived, tends to stress "fidelity" or "presence," the illusion that the performance you're hearing sounds as if it's taking place right there in your room - but even the highest-fi recordings generally sacrifice sonic literalism for musical purposes. On rock recordings, for example, no one bats an eye when an acoustic rhythm guitar is the same volume as, or louder than, a hugely distorted electric lead guitar, even though that's possible only with differential amplification. (This amplification already departs from the "realism" of the actual loudness of an acoustic guitar, and therefore calls into question whether there's any such thing as the "actual" loudness of an electric guitar. Of course one might argue that rock is the first fully postmodern music, conceived entirely in the era of mechanical reproduction of sound, which therefore never depended upon a hypothetical "real" performance based in real time and space.) I remember reading a fascinating article called "A Bedroom Community" by Franklin Bruno (from the 1997 issue of Badaboom Gramophone: it does not appear to be available online), in which he analyzed the effect of the creation of sonic spaces in recorded music: far too little attention is paid to the way essentially spatial effects like reverb or echo affect listeners' perception of the imaginary space of the hypothetical performance, or the way even clearly patchwork recordings, dubbed part upon part, sometimes strive to create an aura of performed authenticity through the inclusion of count-ins, stray amp noise, or even mistakes in playing or singing (despite the obvious fact that if the song's made up of multiple overdubs, such "errors" could easily have been removed in a subsequent take).
An interesting thing happens when the "fidelity" of a recording is less than ideal. First, there's the question of how that infidelity (secret noises offstage) came to be: through the unavailability of quality recording equipment or technique, through intentionally eschewing such equipment or technique ("low-fi"), or through aging and degradation of the recorded artifact itself (such as an old, scratchy 45). Clearly these things signify differently: the recording that's poor of necessity might convey the artist's persistence, rawness, amateur intensity, etc.; the "low-fi" recording might seek to convey those same things or, more smartly, undercut the assumptions that low fidelity just does mean persistence, rawness, or intensity; and the degradation of the artifact is often thought of as irrelevant to the "real" recording.
This last assumption isn't correct, though. First, although we might try to hear "through" noise and poor signals (and the more experienced we are as listeners, the likelier we'll be capable of pulling off this listening performance), I don't think we can fully escape the crackle of the vinyl or the attenuation of dynamic and frequency range. This is particularly the case with older recordings (whether by "older" we mean scratchy LPs we've owned since our teens, or recordings that are mastered from ancient 78s): I'd argue that the older the recording, the more we tend to hear those sonic accretions as part of the musical experience. It would seem strange indeed if, somehow, a Robert Johnson song were suddenly available in wide-screen, massively compressed, state-of-the-not-necessarily-high-fidelity-art 2008 sound.
One of the peculiarities of recorded sound is the way it can substitute itself for real-time performance even when such a real-time performance was a necessity. (Bruno quotes Stephin Merritt, who refers to the "false realism" of recordings that seek to convey the illusion of a unified sonic space, even though there was never any such space of performance.) My Robert Johnson example is unfair in that way: rather than imagine a Robert Johnson recording that sounds like a modern recording, imagine what being in the room with Robert Johnson playing "Dead Shrimp Blues" might have sounded like. Even though it was 1936, presumably our ears, and the air in the room, would have been every bit as capable of conducting sound waves without the imposition of crackling static as they are now when our friend or roommate belabors an acoustic guitar in the same room with us. Would that 1936 performance sound "right," the unmediated Robert Johnson, with no microphone, no recording equipment's stamp of sonic character, only his voice and guitar in the naked air? I think most people would like to say it would...yet since we know such a thing is impossible outside of science fiction, I wonder if we'd respond the same way to such a recording (if it were possible). I suspect that, just as some people respond to what they perceive as the "warmth" of vinyl, and most respond more positively to the exact same recording if it's louder than another, our history of contact with old, degraded sonic artifacts would mean that ultimately, we'd prefer those over the hypothetical, ultra-cleaned-up recording. No doubt such arguments would be mounted in terms of "authenticity," with claims that our hypothetical sonic clean-up device somehow distorts or cheapens the original sound (as if we'd know what the original sound was). Even though sound behaved in 1936 the same way it does in 2008, the context of a performance's larger sonic world is not the same now as it would be in the world of 1936. 
(A clichéd citation...but this idea is converging on one reason Borges' verbatim new Quixote differed from the original: though the object is identical, its context is not - and since no object is apprehensible without context, essentially the object is not identical in any real experience.) The removal of the patina of age, which might seem superfluous to the object itself, results in a significant loss.
Specifically, what would be lost is such a recording's ability to open a psychic time-portal in the listener: the sense that while I'm here in 2008, and the Mississippi of time is deep and wide, I can see myself listening there, on the 1936 side. I think this effect is one reason some people are so strongly attached to old recordings: the sound of the time is right there, magnetically arrayed, etched into shellac or vinyl grooves, encoded in cryptic digits of shiny, aluminum-coated plastic that can reflect another world.
The paradox is, that patina wasn't there from the start (except insofar as 1936 recording equipment was less capable of "fidelity" than contemporary equipment). It itself is an artifact of the fact that many such old recordings exist solely on a mere handful of timeworn old 78 RPM records, and it's the patina of such records that we hear. Which is to say, if a record collector or serious fan fetishizes such sound, it's because they hear themselves in the sonic reflection: the years of their and others' collecting, the inevitable wearing down of such records no matter how carefully preserved and packaged, their own obsessions emerging into the air, curatorial and remembered.
9.09.2008
"Since the beginning of time, mankind has yearned to blot out the sun."
Thus spake Montgomery Burns (if not in exactly those words), and if the humor of that remark comes from the absurdity of Burns's assumptions...well, the assumption itself may be absurd, but that he assumes it's shared does reflect a certain reality: one of the major levers grasped by anyone seeking to manipulate the public (advertisers, politicians, etc.) is the potent resistance many people have to believing that others actually and genuinely have opinions different from their own.
You can see this in the culture wars underway on behalf of the Great American Middle Class (everyone's middle-class in America! but only some of them are white, which remains unspoken), kindly engineered by the sort of folks who lose count of how many houses they own, in which a sinister significance is attributed to one's preference in vegetables, heated beverages, or ostentatious display of nationalistic bric-a-brac. Not only are such odd preferences different, which is just weird, but those who profess to enjoy the latte that dare not speak its name are suspected of doing so merely to mark out their difference and their superiority from the riff-raff, and to toady to some imaginary overclass ("the elitists") whose mission seems to be to denigrate the tastes and preferences of "real Americans" in favor of the trivial, the ephemeral, the fashionable, and - worst of all - the French. (Don't try to tell me "latte" is from the Italian - it sounds French, so it is French. And what's the difference anyway? They all wear fancy-ass shoes, and they lose wars like the Phillies lose baseball games.)
This is why it's such a seemingly short leap from trivial questions of taste to the more serious matter of insincerity, fifth-columnism, and hidden agendas: if you're willing to pretend to like bizarre vegetables and elaborate coffees, there's no telling what else in your life is a pretense, a sucking-up to expectations...so all the flag pins, nice suits, and patriotic speeches mean nothing, because you'll say anything to curry favor.
And god help you if you prefer to deliberate, analyze, plan, or do anything other than kick ass and react from the gut. Real Americans - which is to say, real men - don't waste time cogitating and hawing and hemming, they get out there and play some rapid-fire chin music. You don't see John Wayne heading the debate club, do you? My favorite story in this regard involves a former co-worker of Rose's who, after Rose argued such-and-such, citing various facts, statistics, etc., replied that Rose's opinions weren't as valuable - because Rose had relied upon what she knew, and on ideas, whereas the co-worker felt what she did from the heart. Or, in the other common phrase, she operated from a gut feeling.
You know, you can't really figure out what your guts are up to from an upright position. I think the optimal view probably comes from bending over backwards and sticking one's head straight up the poop chute.
Another advantage to that position: you won't be bothered by that demon sun.
7.10.2008
dark stars cluster
In paranoia news, some fear that a Swiss experiment attempting to duplicate conditions obtaining picoseconds after the Big Bang will create a series of black holes that will swallow the earth. (A series of Swiss white holes would, however, merely be an enormous cheese - which, in line with its opposite coloration, would be swallowed by the earth.)
Since as far as I can tell the winning lottery ticket is likelier to materialize in midair in my bedroom, held between the teeth of a nude Scarlett Johansson, also materializing in midair in my bedroom, I'm able to worry about this in the abstract...and stumbled across a curious philosophical issue. If the universe as we know it dematerialized utterly, would we even be aware of it, in any sense? Barring supernatural beliefs, I find myself wondering whether time wouldn't also cease, and whether it would "feel" to us merely as if we were in a perpetual present (which wouldn't feel perpetual, being only a present, it being only time passing that's recognizable as time) - rather like being on hold to the cable company.
Slightly more seriously, it got me to thinking that what makes disasters disasters isn't only what happens to those killed in the disaster - it's those deaths' impact on survivors. But an instantaneous, universal disaster would leave no survivors, no tremorous onset, no "omigod" moment of impending death - and no aftermath. It would seem to be an existential disaster only.
And that led me to recognize, yet again, the depth of our interdependence, that we really aren't whole persons except in the society of others. In his most recent book I Am A Strange Loop, Douglas Hofstadter addresses the nature of consciousness by arguing (this is an extremely reductive summary) that consciousness and intelligence emerged initially from self-recognition and then (and crucially) from the recognition that others are self-perceivers like we are. More than that: if "identity" means anything, each person's is inextricably bound up with those of others: who we are includes, is formed by, and continues to inform our consciousness of other beings, our mental maps of them, the mirror sites in our minds of their consciousness. Our language, our knowledge, our habits, our folkways, our likes, dislikes, joys, and fears, all are part of multiple continua with all the others we've met: most closely, our parents, families, partners, children, and friends, but in more scattered but legible circles radiating to encompass coworkers, political, social, and religious leaders, and those whose consciousness is transmitted (however also transmuted and refracted through whatever aesthetic lens) through their writings, artwork, or music.
And all that was on my mind as I recently collected a bunch of stray Brian Eno songs, including a song he did in collaboration with German duo Cluster, called "The Belldog." The title phrase (as Eno explains in a sadly out-of-print book called More Dark Than Shark, which compiles critical essays on Eno's work with "portraits" of each of Eno's lyrics by a young Russell Mills and comments from Eno himself) comes from an encounter with a New York street musician, playing an "out-of-tune upright piano on wheels" (!). The musician sang, over and over again, the words "the belldog, where are you?" Eno writes that he vaguely thought the belldog must be some sort of "herald," since bells (a later strong interest of Eno's) tend "to summon attention."
Despite Eno's disavowal of the importance of lyrics in his music, as a lyricist his work is first-rate. He avoids cliché, his lines are clean and unforced, and his imagery is fresh and evocative. While he claims that his lyrics often evolve from improvised sound, similar to the Surrealist practice of "psychic automatism," Eno - unlike many Surrealists, who were reluctant to edit its results - typically then worked with these near-nonce syllables to sculpt some sort of meaning from them, often guided by the mood of the music. While occasionally he'd develop something like a narrative, more often his words were content to evoke, to suggest, to weave a loose web of imagery, leaving plenty of space for the listener's consciousness to collaborate. "The Belldog" is one of Eno's more worked-through lyrics - Eno discarded several drafts - and while there's no clearly set story here, there is a narrative of sorts: from day to night, from control to uncontrol, from isolation to incorporation. Reductively, the main character begins by disregarding surroundings ("in the dark sheds / that the seasons ignore") and attempting to exert control over his immediate environment ("I held the levers that guided the signals"), then gradually begins to "lose control" and individuation ("at last I am part of the machinery"), to the extent that the last verse is in third person, with the narrator no longer present. "The world makes its circle through the sky," "the light disappears," and the belldog...well, where is the belldog? What is the belldog?
I find it interesting that whether these transitions are positive or negative is left wholly open-ended, and that what the narrator becomes one with is not "nature" or "the universe" but "the machinery." That machinery seems to have something to do with communication, or lack thereof (the "signals" referred to above are guided "to the radio," but "the words I receive" are "random code, broken fragments from before"), but there's no hint that communication ultimately does take place. Only that, perhaps to the narrator, it no longer matters, as the light disappears, and the narrator's no longer present. (Maybe this can be the soundtrack to our impending disaster.)
Musically, the song is concerned (as is much of Eno's work) with flattening out aspects of music that are typically hierarchical. In terms of time, we have a rapidly pulsating sixteenth-note sequenced bassline...but rather than being propulsive, it ultimately seems static, even meditative. In contrast there are many long notes, often indeterminate in terms of beginning and end, and frequently shifting in pitch as well. The song is content to let its introduction play out for some minutes before the vocal enters (we cannot immediately slot the piece into "vocal" or "instrumental"), and harmonically it oscillates between what might be two chords (G-flat major and E-flat minor), except that both chords are harmonized somewhat unusually. Frequently the G-flat is a G-flat 6/9, with added A-flat and E-flat, and the E-flat minor adds a 7th, while the bass oscillates for both chords between the root and the fourth degree of its respective scale, implying harmonic movement that never arrives. (Now that I think of it, it seems likely this was written on the piano, using primarily the black keys: those keys' pentatonic scale recurs throughout.)
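That black-keys hunch is easy to verify arithmetically. Here's a minimal sketch (my own spellings of the chord tones, not anything from Eno): the pitch content of both chords, as described above, sits entirely within the five black keys' pentatonic collection.

```python
# The five black keys of the piano, spelled flat-wise: a Gb-major pentatonic collection.
black_keys = {"Gb", "Ab", "Bb", "Db", "Eb"}

# Gb 6/9: the Gb major triad (Gb, Bb, Db) plus the added 6th (Eb) and 9th (Ab).
gb_six_nine = {"Gb", "Bb", "Db", "Eb", "Ab"}

# Eb minor 7: Eb, Gb, Bb, Db.
eb_minor_seven = {"Eb", "Gb", "Bb", "Db"}

# Both chords lie entirely on the black keys...
assert gb_six_nine <= black_keys
assert eb_minor_seven <= black_keys

# ...and together they exhaust the collection.
assert gb_six_nine | eb_minor_seven == black_keys
```

(Which makes sense: E-flat minor is the relative minor of G-flat major, so the two chords draw on one and the same pentatonic set - one reason the oscillation feels static rather than like harmonic movement.)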
As a sort of companion piece to "The Belldog," consider "St. Elmo's Fire." (Amusingly, although I hadn't noticed earlier, this one's in C major...that is to say, as far away harmonically from G-flat major as you can get, and entirely on the white keys of the piano...) Lyrically, this is as close as Eno gets to narrative: he and a companion ("Brown Eyes"), having exhausted themselves, find respite "in the blue August moon, in the cool August moon." And there - in a "desert" (which I take, for some reason, in an older sense merely of "isolated area") - they experience the mysterious phenomenon of St. Elmo's Fire - in what Russell Mills describes, aptly, as an "electrifying couplet": "And we saw St. Elmo's Fire / spitting ions in the ether." The numinosity of that phenomenon would be less striking in the absence of Robert Fripp's fantastic guitar solo: he was asked, I recall reading somewhere, to imitate the effect of electrical sparks arcing unpredictably from point to point. (I stole the idea - and a phrase or two from the backing piano figure - in my entirely in-camera laptop synth solo on "Victorian Photographs.") Here again Eno embeds some intriguing oppositions: in the first verse, "Brown Eyes" and the narrator "had walked and...scrambled / through the moors and through the briars"; in the second, their mode of travel is less clear: "Over the nights and through the fires / we went surging down the wires / through the towns and on the highways / through the storms in all their thundering." In contrast to all that activity and travel is, first, the cool, blue August moon, and second, the desert (with bones bleached white, contrasting conceptually with the dark nights, storms, and vegetation...although Eno stumbles in one of his rare lyrical gaffes, forcing a rhyme with "ether" by way of the comically superfluous "bones...white as teeth, sir") - and finally, the narrators at rest - where they can experience this strange phenomenon (which does, as it happens, spit ions).
The songs begin with similarly restless activities, and while "The Belldog" evaporates its narrator, "St. Elmo's Fire" presents its characters with beguiling meteorological phenomena. You could argue that the characters wouldn't have been in the desert to experience it if not for all their surging and scrambling - or you could suggest that if they'd stayed right where they were, they would have experienced it anyway. Certainly, the moon would be as cool and as blue no matter where they were. But in contrast (yet a sort of concord) with the narrator in "The Belldog," these characters seem highly present in their moment - or the moment highly present in them. It's the latter reading that brings the situation closer to that of "The Belldog."
Aside from Fripp's miraculous guitar, the song, for me, nearly exactly evokes the feeling of a late summer evening, the heat of the day finally dissipating into a cooler stillness, but still a sort of captive energy to the air (which, I suppose, in the song finds release in St. Elmo's fire).
Brian Eno with Cluster "The Belldog" (After the Heat, 1978)
Brian Eno "St. Elmo's Fire" (Another Green World, 1975)
Since as far as I can tell the winning lottery ticket is likelier to materialize in midair in my bedroom, held between the teeth of a nude Scarlett Johansson, also materializing in midair in my bedroom, I'm able to worry about this in the abstract...and stumbled across a curious philosophical issue. If the universe as we know it dematerialized utterly, would we even be aware of it, in any sense? Barring supernatural beliefs, I find myself wondering whether time wouldn't also cease, and whether it would "feel" to us merely as if we were in a perpetual present (which wouldn't feel perpetual, being only a present, it being only time passing that's recognizable as time) - rather like being on hold to the cable company.
Slightly more seriously, it got me to thinking that what makes disasters disasters isn't only what happens to those killed in the disaster - it's those deaths' impact on survivors. But an instantaneous, universal disaster would leave no survivors, no tremorous onset, no "omigod" moment of impending death - and no aftermath. It would seem to be an existential disaster only.
And that led me to recognize, yet again, the depth of our interdependence, that we really aren't whole persons except in the society of others. In his most recent book I Am A Strange Loop, Douglas Hofstadter addresses the nature of consciousness by arguing (this is an extremely reductive summary) that consciousness and intelligence emerged initially from self-recognition and then (and crucially) from the recognition that others are self-perceivers like we are. More than that: if "identity" means anything, each person's is inextricably bound up with those of others: who we are includes, is formed by, and continues to inform our consciousness of other beings, our mental maps of them, the mirror sites in our minds of their consciousness. Our language, our knowledge, our habits, our folkways, our likes, dislikes, joys, and fears, all are part of multiple continua with all the others we've met: most closely, our parents, families, partners, children, and friends, but in more scattered but legible circles radiating to encompass coworkers, political, social, and religious leaders, and those whose consciousness is transmitted (however also transmuted and refracted through whatever aesthetic lens) through their writings, artwork, or music.
And all that was on my mind as I recently collected a bunch of stray Brian Eno songs, including a song he did in collaboration with German duo Cluster, called "The Belldog." The title phrase (as Eno explains in a sadly out-of-print book called More Dark Than Shark, which compiles critical essays on Eno's work with "portraits" of each of Eno's lyrics by a young Russell Mills and comments from Eno himself) comes from an encounter with a New York street musician, playing an "out-of-tune upright piano on wheels" (!). The musician sang, over and over again, the words "the belldog, where are you?" Eno writes that he vaguely thought the belldog must be some sort of "herald," since bells (a later strong interest of Eno's) tend "to summon attention."
Despite Eno's disavowal of the importance of lyrics in his music, as a lyricist his work is first-rate. He avoids cliché, his lines are clean and unforced, and his imagery is fresh and evocative. While he claims that his lyrics often evolve from improvised sound, similar to the Surrealist practice of "psychic automatism," Eno - by contrast with many Surrealists' reluctance to edit its results - typically then worked with these near-nonce syllables to sculpt some sort of meaning from them, often guided by the mood of the music. While occasionally he'd develop something like a narrative, more often his words were content to evoke, to suggest, to weave a loose web of imagery, leaving plenty of space for the listener's consciousness to collaborate. "The Belldog" is one of Eno's more worked-through lyrics - Eno discarded several drafts - and while there's no clearly set story here, there is a narrative of sorts: from day to night, from control to uncontrol, from isolation to incorporation. Reductively, the main character begins by disregarding surroundings ("in the dark sheds / that the seasons ignore") and attempting to exert control over his immediate environment ("I held the levers that guided the signals"), then gradually begins to "lose control" and individuation ("at last I am part of the machinery"), to the extent that the last verse is in third person, with the narrator no longer present. "The world makes its circle through the sky," "the light disappears," and the belldog...well, where is the belldog? What is the belldog?
I find it interesting that whether these transitions are positive or negative is left wholly open-ended, and that what the narrator becomes one with is not "nature" or "the universe" but "the machinery." That machinery seems to have something to do with communication, or lack thereof (the "signals" referred to above are guided "to the radio," but "the words I receive" are "random code, broken fragments from before"), but there's no hint that communication ultimately does take place. Only that, perhaps to the narrator, it no longer matters, as the light disappears, and the narrator's no longer present. (Maybe this can be the soundtrack to our impending disaster.)
Musically, the song is concerned (as is much of Eno's work) with flattening out aspects of music that are typically hierarchical. In terms of time, we have a rapidly pulsating sixteenth-note sequenced bassline...but rather than being propulsive, it ultimately seems static, even meditative. In contrast there are many long notes, often indeterminate in terms of beginning and end, and frequently shifting in pitch as well. The song is content to let its introduction play out for some minutes before the vocal enters (we cannot immediately slot the piece into "vocal" or "instrumental"), and harmonically it oscillates between what might be two chords (G-flat major and E-flat minor), except that both chords are harmonized somewhat unusually. Frequently the G-flat is a G-flat 6/9, with added A-flat and E-flat, and the E-flat minor adds a 7th, while the bass oscillates for both chords between the root and the fourth degree of its respective scale, implying harmonic movement that never arrives. (Now that I think of it, it seems likely this was written on the piano, using primarily the black keys: those keys' pentatonic scale recurs throughout.)
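The black-keys hunch is easy to check numerically. Here's a toy sketch in Python (using the standard pitch-class convention, C = 0, so the five black keys are 1, 3, 6, 8, 10; the chord spellings are my reading of the harmonization described above, not anything confirmed by Eno):

```python
# The five black keys of the piano, as pitch classes (C = 0):
# Db=1, Eb=3, Gb=6, Ab=8, Bb=10 -- i.e., the Gb major pentatonic scale.
BLACK_KEYS = {1, 3, 6, 8, 10}

# The two chords as harmonized in "The Belldog":
gb_six_nine = {6, 10, 1, 3, 8}  # Gb, Bb, Db + added 6th (Eb) and 9th (Ab)
eb_minor7 = {3, 6, 10, 1}       # Eb, Gb, Bb + 7th (Db)

# Both chords sit entirely within the black-key pentatonic:
print(gb_six_nine <= BLACK_KEYS)  # True
print(eb_minor7 <= BLACK_KEYS)    # True
```

In fact the 6/9 chord uses all five black keys at once, which is about as strong a hint as you could ask for that the piece was composed with the hands resting on them.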
As a sort of companion piece to "The Belldog," consider "St. Elmo's Fire." (Amusingly, although I hadn't noticed earlier, this one's in C major...that is to say, as far away harmonically from G-flat major as you can get, and entirely on the white keys of the piano...) Lyrically, this is as close as Eno gets to narrative: he and a companion ("Brown Eyes"), having exhausted themselves, find respite "in the blue August moon, in the cool August moon." And there - in a "desert" (which I take, for some reason, in an older sense merely of "isolated area") - they experience the mysterious phenomenon of St. Elmo's Fire - in what Russell Mills describes, aptly, as an "electrifying couplet": "And we saw St. Elmo's Fire / spitting ions in the ether." The numinosity of that phenomenon would be less striking in the absence of Robert Fripp's fantastic guitar solo: he was asked, I recall reading somewhere, to imitate the effect of electrical sparks arcing unpredictably from point to point. (I stole the idea - and a phrase or two from the backing piano figure - in my entirely in-camera laptop synth solo on "Victorian Photographs.") Here again Eno embeds some intriguing oppositions: in the first verse, "Brown Eyes" and the narrator "had walked and...scrambled / through the moors and through the briars"; in the second, their mode of travel is less clear: "Over the nights and through the fires / we went surging down the wires / through the towns and on the highways / through the storms in all their thundering." In contrast to all that activity and travel is, first, the cool, blue August moon, and second, the desert (with bones bleached white, contrasting conceptually with the dark nights, storms, and vegetation...although Eno stumbles in one of his rare lyrical gaffes, forcing a rhyme with "ether" via the comically superfluous "bones...white as teeth, sir") - and finally, the narrators at rest - where they can experience this strange phenomenon (which does, as it happens, spit ions).
The songs begin with similarly restless activities, and while "The Belldog" evaporates its narrator, "St. Elmo's Fire" presents its characters with beguiling meteorological phenomena. You could argue that the characters wouldn't have been in the desert to experience it if not for all their surging and scrambling - or you could suggest that if they'd stayed right where they were, they would have experienced it anyway. Certainly, the moon would be as cool and as blue no matter where they were. But in contrast (yet a sort of concord) with the narrator in "The Belldog," these characters seem highly present in their moment - or the moment highly present in them. It's the latter reading that brings the situation closer to that of "The Belldog."
Aside from Fripp's miraculous guitar, the song, for me, nearly exactly evokes the feeling of a late summer evening, the heat of the day finally dissipating into a cooler stillness, but still a sort of captive energy to the air (which, I suppose, in the song finds release in St. Elmo's fire).
Brian Eno with Cluster "The Belldog" (After the Heat, 1978)
Brian Eno "St. Elmo's Fire" (Another Green World, 1975)
6.18.2008
a little dap'll do ya
I was getting the oil changed and tires rotated on our Mazda today, and since I'd forgotten to bring along my own reading material, I was paging through an issue of Newsweek to pass the time. Although I can't find a reference online (update - I found it: the writer's name is Devin Gordon, and here's a link), a little item from a recent issue struck me as a perfect illustration of the way some Americans are incredibly small-minded, and why Barack Obama has an uphill battle not necessarily against conscious prejudice as such but against the sort of racism that's enculturated so deeply it's barely recognizable as a thought.
The item that occasioned this thought was a short comment about all the publicity attending Barack and Michelle Obama's fist bump prior to his nomination speech. The writer said that with all that publicity, they'll never be able to do that again, since it would seem contrived, stagey, etc.
And I thought, really? Michelle Robinson Obama grew up in a primarily black neighborhood in Chicago, and Barack Obama lived and worked in similar neighborhoods for much of his life. The fist bump or "dap" is an established and normal aspect of black American culture (and not just black American culture), so it's entirely possible that the famous (or infamous, if you're a Fox News imbecile) dap was completely spontaneous.
But here's the thing: If Mr. White Guy from a Good Home in a Good Suburb is nominated for his party's presidential ticket, and he hugs his wife, or shakes hands with a close friend or advisor, no one makes anything of it. Certainly no one suggests that, you know, since he shook hands at that moment, he won't be able to do it again, because that would imply that he's just trying to repeat the moment of hand-shaking that generated all that publicity.
Because, of course, a white guy shaking another white guy's hand would generate no publicity at all. It's considered utterly normal, totally unremarkable, not worthy of any press at all. And that, obviously, is because shaking hands is so integral a part of white male business and political culture (and not just white culture) that it's about as worthy of notice as the fact that a politician was wearing a suit and tie.
I doubt that our Newsweek columnist is a bigot - yet his arch little observation about the showbizzy nature of politics illustrates the way racism can persist, in the form of "othering" those whose actions, speech, or thoughts seem alien to us. (I was about to remove that "us" - since it repeats the same thing I'm complaining about...but I'm leaving it, because my point is not that this middle-class white guy is better than everyone else, but that even when we're trying to not act like a jerk sometimes these little things poke through.)
The problem Obama faces is that for parochial Americans, black culture just is not "American culture." Increasingly, in fact, "American culture" is only what's comfortable and familiar to some sort of vaguely middle- and working-class white suburban cohort - too much education, any sort of unusual tastes (latte, arugula, ad nauseam), and you're no longer a "real" American. And Obama confounds these folks further both by being black and by being a Harvard Law grad: for such increasingly narrowcast "Americans," that notion simply does not compute.
The dangers of such parochialism are not just hypothetical, nor are they only an anomaly faced by this year's Democratic presidential nominee. Law professor Jacqueline Stevens points out that panic over immigration effectively votes Latino/as off Citizenship Island, even if they are citizens: although the law says that the Immigration and Customs Enforcement Agency (ICE) has no jurisdiction over US citizens, Stevens outlines numerous cases in which citizens were caught up in ICE sweeps and imprisoned, sometimes for years, until they could prove citizenship. So much for "innocent until proven guilty."
Stevens also notes a disturbing and little-noted aspect of America's grotesquely high rates of imprisonment: a 2006 Justice Department report says that "about 40 percent of the incarcerated population has 'symptoms of mania,'" and nearly one-quarter of inmates suffer from psychotic disorders. Those with untreated mental illness are far likelier to make poor decisions when confronted with law enforcement, and of course their illness, poverty, and (often) racial or ethnic identity limit their employment options. It is unsurprising that many such people end up committing crimes.
Fortunately, no one takes seriously Fox News' suggestion that the Obamas' dap was a "terrorist handshake." Other people who don't fit in that sort of definition of "American" are not so fortunate.
5.30.2008
helpful hints for would-be criminals
From today's Milwaukee Journal-Sentinel:
1. If you decide to leave your parked car blasting music - in the wake of loads of publicity about the city's new noise ordinance - it's probably better to do it somewhere other than directly across the street from a police station.
2. Also, leaving a gun stuffed between the seats of said car will increase the officers' interest in your vehicle.
3. Leaving drugs in such a vehicle is too complicated. You should instead gift-wrap them and deliver them, with a nice card bearing your name, directly to the police station. This saves the police the minor effort they had to expend of asking around to see who owned the car with the drugs, the gun, and the loud music blasting.
5.21.2008
the divine right of cowboys
We tend to think (and many Westerners tend to think of themselves this way) of cowboys as the quintessential American individualist: dependent upon no one, self-reliant utterly...this is the image (regarded both positively and negatively) that makes "cowboy" such a resonant symbol. But insofar as modern-day ranchers are cowboys (and of course, they are that, among other things), they put your Reagan-fantasy "welfare queens" to shame with their reliance on government funding. Christopher Ketcham (in an article in the June 2008 issue of Harper's) notes that not only do most major ranchers use federal lands (national parks and forests and the like) for grazing, the cost of such grazing permits is only about a twelfth of the market rate for foraging on private lands and costs US taxpayers at least $120 million annually, with additional hidden costs driving that figure up as high as $1 billion per year. On behalf of such ranchers (who include poor folks like Ted Turner and Paris Hilton's grandpa), Ketcham notes, the US government "clears forests; plants grass; builds roads, cattle guards, and fences; diverts streams; blows up beaver dams; 'improves' habitats; monitors the health of stock; excises predators, including 80,000 coyotes; and poisons, traps, or shoots more than 30,000 prairie dogs and beavers...each year."
Worse yet, this welfare (and half of Montana's cattle, for example, are owned by only 10 percent of the state's cattle ranchers) goes to support an activity that is hugely destructive to the Western habitat. Cattle, it turns out, are grossly less suited to the environment, and more destructive of it, than were buffalo. Had the US not driven out the original buffalo herds in favor of beef cattle (and - no small factor this - as an effective strategy to clear Native Americans from the land), and had we instead relied upon buffalo rather than cattle, the deforestation and desertification, the loss of topsoil and even species, would have been mitigated if not eliminated. To take one example, buffalo hooves are sharper than cattle's hooves, which means that their passage separates and oxygenates the soil, whereas flat cattle hooves pound it down and deplete it.
So: not only are cattle ranchers welfare-dependent guzzlers at the state teat, their practices are extremely harmful...not only to the land in general but ultimately to their own livelihood.
(Introducing: vaguely associated musical selections...)
Mission of Burma "All World Cowboy Romance" (Signals, Calls, and Marches 1981)
Scott Walker "Tilt" (Tilt 1995)
5.17.2008
why the metric system never caught on in the US

When I was a kid, I remember a big push (from my math teachers in particular) to get everyone to become familiar with the metric system. I remember that baseball stadiums (even those not in Canada) suddenly sprouted metric measurements on their outfield fences. Unfortunately, when those signs said things like "106.68 m," they reinforced the notion that the metric system required a nerdy precision, what with carrying things out to a couple-few decimal places. This was completely unnecessary, since in most cases we do not need an exact measurement. I doubt that your typical outfield fence that says "350 ft." is exactly 350 feet - it may be 350 feet give or take a few inches, and given that the fences are rarely curved in such a manner as to be equidistant from home plate, at one end of the "350 ft." sign the fence might be 349 feet 6 inches from home plate and at the other end maybe 350 feet 4 inches.
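That over-precision is just an artifact of converting a round number of feet. A quick sketch (Python; the foot is defined as exactly 0.3048 m) shows where the fussy sign value comes from:

```python
# "106.68 m" is nothing but 350 ft converted exactly (1 ft = 0.3048 m)
feet = 350
meters = feet * 0.3048

print(round(meters, 2))  # 106.68 -- the nerdily precise sign value
print(round(meters))     # 107 -- a round metric figure, just as honest
```

In other words, those decimal places say nothing real about the fence; they only echo the roundness of the original imperial figure.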
What should have been emphasized wasn't conversion but familiarity. We really don't need to know most measurements to the centimeter, and it would have been far more effective to know that a meter is a little longer than a yard, a kilometer's a little less than two-thirds of a mile, etc. - but beyond that, we become familiar with these units in reference to what they measure. We're all used to 2-liter beverage containers, for example (for some reason, that measurement took hold). And if we got used to how fast 80 kph felt, or how long it took to drive 125 km on the freeway, who cares about conversion?
3.31.2008
my rights versus yours
I opened my review of the Stephen Malkmus & the Jicks/John Vanderslice concert last week by jokingly noting that the boilerplate legalese on the back of the ticket apparently compelled me to cede any rights I might have had to talk about the concert.
You know what's funny? Apparently, that's true...or at least, the folks who run the Pabst Theater website feel they have rights to such descriptions. My former Milk Magazine colleague Don at Timedoor was surprised (pleasantly, I should note) to find that his review of the show was reproduced at the Pabst Theater's site. Curiously, this happened without anyone from the Pabst contacting him to ask whether it was okay for them to reproduce his review (not just link to it). Now, to me, it's kosher to link to anything on the web - after all, it's published and available to anyone who enters the URL on their machine - but reproducing entire chunks of text from someone else's site, without permission, seems a bit dubious. I'm not utterly bent out of shape by this - arguably, just as a music review site can expect its reviews to be quoted at other sites, including that of the artist if the review's positive or particularly insightful, a concert review might be considered in the same category.
Still, if it had been my review, I would have appreciated the Pabst at least asking. Curiously, while some sites, such as Flickr, allow users to use Creative Commons licenses for the photos, neither Blogger nor Timedoor's hosting entity appear to offer such options for their bloggers' work. My Flickr photos, for example, are displayed under an Attribution-Noncommercial-Share Alike license, which means anyone can use them so long as they credit me, aren't using them commercially, and attach similar conditions to such derivative usage. Under those terms, I don't expect anyone to ask me for such use - I've already given permission in advance.
So I don't know whether the Pabst simply assumed such a license (it shouldn't), whether it figured quoting was okay so long as credit and linking were present, or whether it was asserting its proprietary rights over "descriptions of the Event."
(Incidentally: I'm partially visible in one of the photos on display at the Pabst site. See the tall guy with the beige cap in the center of the image, facing the other way? And see the woman in the leather coat with the knit bag and black wool cap to his right? That's me, behind her.)
2.27.2008
geeknits being picked
It's always sorta bothered me in science fiction movies and TV shows that some sort of matter transporter device always manages to readily distinguish between the matter that belongs to the person and the matter that just happens to be nearby. Oh - and miraculously, recognizes clothing and things the person's holding as transport-marked, also.
Seems to me that even if you imagine that you could somehow code for biological entities such that a transporter would recognize the physical entity as a unit and not, you know, the floor beneath the transporter pad, etc., there'd be no particular way to recognize clothing, or the communication device the person's holding, etc.
In fact, it strikes me as weirdly analogous to the problem of spam-recognition: while various algorithms do a reasonable job of recognizing much spam as spam, it's still not as accurate as human eyeballs in spotting obvious spam (though it's much, much quicker and less prone to succumbing to sheer tedium).
I think it'd be much more believable if a transporter device were less physics-based and more a matter of extreme mental discipline, the sort of thing that takes years of intensive training for the transportee to focus on exactly and only what needs to be transported - sort of a yogic discipline scenario.
Of course, I read very little SF these days...so for all I know, a million stories have taken up exactly this concept.
1.15.2008
harbinger
Leaving the house and heading north on the freeway earlier this evening, I noticed that, for some reason, lights were on at Miller Park - and I realized that even though it's the deepest depth of winter, baseball season is less than three months away. I'm not at all a major sports fan...but I do like baseball, and it occurred to me, seeing the lights illuminating the arching roof of Miller Park on the horizon, that I really ought to get to a few more Brewers games this season.
To hell with winter - bring on the springtime!
1.05.2008
possible misapprehensions
I wonder if it's viable to claim any legitimacy at all for a genre of writing that intentionally goes off half-cocked. Via Franklin Bruno's Nervous Unto Thirst, I read a review he wrote of Julian Dodd's Works of Music: An Essay in Ontology, an examination of the ontology of music that, according to Bruno, asks two key questions: "the categorical question (what sort of entity is a work of music?) and the individuation question (how are works identified and distinguished?)."
These questions are, of course, much harder to answer adequately than one might think. I haven't, however, read the review carefully; but in reading through it (much as one might look through a windowpane), I found myself asking some questions about the questions Dodd raises. For example: Bruno notes that "Dodd's 'sonicism' (2) holds that 'work-identity consists in acoustic indistinguishability' (8)," contrasted with contextualism, under which "various properties not readily described in acoustic terms are also essential to work-identity." This seems to me similar to the old literary-critical argument between the New Criticism, which argued against any sort of extra-textual considerations and asked that we read solely, and closely, the work itself, and the numerous later (as well as earlier) schools of literary interpretation, which argued for the importance in interpretation of, variously, contexts historical, biographical, literary, sociological, psychological, and so on.
The example and question I'm thinking of initially raises the secondary question of what counts as a "work" not only under these considerations but also in terms of wholeness and integrity: what constitutes the boundaries of a musical event? Is a single movement of a symphony a "work"? Is a single song on a through-segued concept album? More directly to my silly little example to come: is a two-bar sample? Because there, context is clearly relevant. I'm thinking of MF DOOM's "Tick, Tick..." specifically - which samples the queasy string figure at the end of the Beatles' "Glass Onion." MF DOOM further distorts the source by altering pitch and tempo (further queasifying it, for that matter), but even assuming a particular two-bar segment in the identical tempo and key as that of the Beatles' original (that is, which possess "acoustic indistinguishability" relative to one another), clearly the acoustic context makes a huge difference in the effect of that two-bar segment. Now, granted, this isn't a case of a "propert[y] not readily described in acoustic terms"; but it's pretty easy to imagine examples which clearly would fulfill that criterion quite readily. Let's imagine a public-domain recording by a marching band of "The Star-Spangled Banner." In one context, it appears on a 45-rpm record in a flag-festooned package called "The Soul of a Great Nation," given away free to young men and women at an armed services recruiting center. In the second context, it is presented as the b-side of a single whose a-side is an incendiary piece of anti-nationalistic punk rock. Yeah, I'm weighting the example ridiculously - but equally obviously, the self-same acoustic phenomenon (tentatively, "work") is going to have a very different meaning in each context. How reasonable is it even to call the two acoustically-identical recordings "the same"? (Let's ask the renowned musicologist Jorge Luis Borges...)
Please note that I'm using a dense and technically written review of an even more dense and technically written text, a text I have not read, as a jumping-off point: I'd be very surprised if I'm saying anything at all interesting about Dodd's work. But I do think it's interesting to ask, regardless of that context (a context I can't claim to be able to work within), a different question, however tangentially related: what do we mean when we say that a piece of music is the "same" as another piece of music?
Carrying on with my attempt to sculpt a cube from someone else's sphere, I note that Bruno paraphrases Dodd's answer to "the categorical question" by stating that "works are sound-event-types, the tokens or occurrences of which are their performances," and that "an ontology of music [must] account for works'...audibility (works are the sort of things that can be heard [11])." This last raises an interesting question: is a work of music not a work of music until it is heard? If a fully fleshed-out orchestral score exists, but the musical performance that would render that score into an acoustical event has not yet occurred, does that mean the piece is not yet a "work of music"? I note that we're in some curiously murky terminological waters: it's fairly common for musicians to refer to scores as "the music," or in some senses to argue or imply that the score is, in some sense, music: the score is referred to by the title of the piece; or at any rate, it's hard to find a clear way to distinguish the work that comes into being only when performed or recorded from the representation or abstraction of that work which is the score. If a manuscript positively verified as being in Beethoven's hand were discovered, a manuscript which was a complete score to a tenth symphony, it would be very odd indeed to claim that this wasn't a work of music until it was performed, even in a very formal usage of the term in question - particularly since many musicians can hear, in their heads, a virtual performance as they peruse a score.
The questions get even more vexing when modern, studio-based popular music is considered. Take a remixed version of a particular song, constructed from the exact same recorded tracks as its initially released version, only with those tracks re-equalized, re-balanced, remixed, etc. Many non-professional ears would be incapable of hearing the difference between the two recordings; it seems odd to argue that the question of whether a "work" is distinct depends on who's hearing it, or on the readings of a carefully calibrated oscilloscope. Further: at what point does the work become a work, given the second criterion of "audibility," given in turn that the whole song doesn't exist until all the tracks are mastered in their final form? How many works of music are we talking about here?
Again: Dodd might well say I've utterly and completely misconstrued his point (entirely possible), but nonetheless I think the questions I'm asking are interesting, and (again) more difficult to answer than they seem. Consider: if, as I imply, it's absurd to talk about two near-identical mixes of the same track as two different works of music (one version, say, brings up the treble on the piano track, raises the volume of the bass drum slightly, and digitally reduces some tape hiss on the original recording; and the whole mix is compressed and increased in volume), it seems equally absurd to say they're the same work in other senses. Certainly, more experienced or subtler ears would hear the differences, and collectors would consider the two different enough that someone who lacked one or the other version would not be considered to have the artist's complete discography.
What constitutes a "work of music" - even to the apparently simple questions of when is it music, or what makes it different from some other work of music - is, then, by no means a simple question.
Which, of course, is why philosophers can write books on questions that most people would imagine are simple and obvious. Very little, it seems, is truly simple or obvious.
12.03.2007
the 21st century breathing down his neck
Here's a thoughtful, complex response to Morrissey's latest insertion of his foot into his mouth. While I don't think Morrissey is a thoroughgoing racist, his comments are naive at best, and they reflect a curiously static notion of cultural identity (and, as Momus and several commenters point out, a rather ironic one, given Morrissey's most recent places of residence). England is, in fact, the bastard depository of any number of streams of European and, lately, non-European blood: it's the second-last place on earth anyone should be babbling about any sort of static culture. (Guess where the last is.) Momus's confession that, essentially, he understands how Morrissey might feel - but from the outside looking in at cultures he admires - is useful also for undercutting the strand of idealism and false identification that lurks underneath the most mainstream notions of "multiculturalism": too often, the "multi" in the phrase is reified into a static notion of cuisines, costumes, and musics, without recognizing that it's just as "authentic" for a South African musician to respond to and incorporate hip-hop into his music as it is to incorporate older, native musical traditions (which, often, are nearly museum pieces at the local level - which makes them, ironically, far more alien-seeming than the hip-hop the musician hears every day).
This whole discussion bears some relation to the Frere-Jones foofaraw about race in music - in that I think one thing he does is focus on certain surface-level signifiers of cultural influence while ignoring deeper structures. For example: it's almost impossible to find any sort of rock or pop recording from the last fifty years whose rhythmic basis does not bear a deep influence from African-derived rhythmic models. Both at the level of the ubiquitous accented offbeat and, far more subtly, the swung or uneven subdivisions of the beat below the pulse level (either the quarter- or eighth-note), African influence is apparent even in places where most listeners would not expect it...such as in the drumming on a Toby Keith record, say.
Of course, if neither the listeners nor the musicians recognize the influence as being African-derived without thinking about it, is it still legible as a cultural influence? Or has it been totally assimilated into another culture? Or, is it more accurate to say that the whole question rests upon static notions of what's in and what's out of a culture: as if there's a "European" tradition forever and ever, upon which an "African" tradition forever and ever can work its influence, leading either to a new hybrid (and presumed dynamic) cultural work-in-progress or to a new, hypostasized static culture?
Ultimately - to return to our large-quiffed friend - any imagination of a one-time, static culture is just that: imagination. The tendency is for whatever prevalent model-just-gone to lose its placement in time and become, in the imagination, what had always been. (For example: the Leave it to Beaver-style nuclear family...which, in its particular formation, was actually relatively new at the time of that show's broadcast, replacing more extended family formations among both urban and rural Americans.)
11.25.2007
John Lee Super-Understander
In an article in yesterday's New York Times, Paul Davies claims that science is hypocritical in decrying faith, because (he says) it too is based on a different sort of faith. His argument, though, seems both disingenuous with its science (Davies is a physicist, so he certainly would know this) and rather slippery in defining its key terms.
Davies claims that "science has its own faith-based belief system" because it "proceeds on the assumption that nature is ordered in a rational and intelligible way." He continues: "You couldn't be a scientist if you thought the universe was a meaningless jumble of odds and ends haphazardly juxtaposed. When physicists probe to a deeper level of subatomic structure, or astronomers extend the reach of their instruments, they expect to encounter additional elegant mathematical order."
The problem here is a confusion of "expectation" with "faith." The reason no scientist is likely to think that "the universe was a meaningless jumble" is that, so far as science can determine based on its observations to date, it isn't - and that any such "meaningless jumble" would be, by the very fact that so many patterns and forces have been established, a local exception, probably ultimately explainable by a higher-order organization (science has done quite a lot to explain the seemingly random: see chaos theory). That's not "faith" - that's reason. There is no rational reason to expect a large-scale exception to what has already been plentifully established.
Davies goes on to argue that "the idea that [scientific] laws exist reasonlessly is deeply anti-rational. After all, the very essence of a scientific explanation of some phenomenon is that the world is ordered logically and that there are reasons things are as they are." Reason is a tool. It is a mode of understanding phenomena. It is not an originating principle. The world may be ordered logically, but that does not mean a logical being ordered it (one implication of Davies' ideas); nor does it mean there are "reasons things are as they are." Davies here switches the meaning of the word "reason" from "a mode of thinking" to "a cause" without noting the switch. The two meanings are not equivalent.
Davies makes a similar move concerning the notion of "laws." He writes, "a God's-eye view might reveal a vast patchwork quilt of universes, each with its own distinctive set of bylaws. In this 'multiverse,' life will arise only in those patches with bio-friendly bylaws, so it is no surprise that we find ourselves in a Goldilocks universe — one that is just right for life. We have selected it by our very existence." Uh, no: we haven't "selected" it - we exist because it exists, with conditions ripe for our existence. That's an odd definition of "selecting." The multiverse theory, however, addresses what some people perceive as a problem: the fact that had certain variables in the early moments of the universe been infinitesimally different, life as we know it would not have evolved. What are the odds, these people say, that all those factors would have come together at random? There must be a higher, ordering power, they argue - by which they typically mean God.
That's always seemed like an odd question to me. Let's say you program a calculator to come up with a random number between 1 and 1,000,000. The calculator comes up with the number 50,392. You fiddle around with that number for a bit, enumerating its properties, its multiples and divisors and so forth...and then you exclaim, why look, this number has all these particular properties! How likely is it that a random process could have coughed up just these properties? Now imagine (pretend it's an animated cartoon illustrating mathematical principles) that those various properties of 50,392 are presented as living beings. They're all running around exclaiming how amazing it is that they exist. But it's not extraordinary at all - or rather, that the number is 50,392 is exactly as extraordinary as if it had been 672,911, 15, 4,887, or 999,999: a million to one, each with its own set of properties. Even if there's only ever been one universe, obviously it's the one we have. You can call it 50,392 if you like. The multiverse theory makes this a little more comprehensible (in some ways): perhaps all those other universes once existed, exist in dimensions inaccessible to us, etc. Once again: we're here because this particular universe is conducive to our existence. For all we know, billions of other universes exist, have existed, or will exist which aren't.
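The arithmetic in my calculator example can be sketched in a few lines (the specific numbers are, of course, arbitrary): whatever number the draw produces, that particular outcome had exactly a one-in-a-million chance, and the catalogue of properties we enumerate afterward doesn't change that.

```python
# Illustrating the calculator thought experiment: any specific outcome of a
# uniform draw from 1..1,000,000 has the same probability, no matter how
# "special" its properties look once we enumerate them after the fact.
N = 1_000_000
p_specific = 1 / N  # identical for 50,392 and for 999,999

def divisors(n):
    """Enumerate a number's divisors - a 'property' computed only after the draw."""
    return [d for d in range(1, n + 1) if n % d == 0]

# Each outcome has its own list of properties, but the draw that produced it
# was no more (or less) likely than any other.
for n in (50392, 999999):
    print(n, "has", len(divisors(n)), "divisors; probability of drawing it:", p_specific)
```

The cartoon beings marveling at their own existence are, in effect, computing `divisors` after the draw and mistaking the result for evidence about the draw itself.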
Davies argues that the multiverse theory doesn't really explain much, since, he says, "there has to be a physical mechanism to make all those universes and bestow bylaws on them. This process will require its own laws, or meta-laws. Where do they come from? The problem has simply been shifted up a level from the laws of the universe to the meta-laws of the multiverse." Well, yes: but he's moved from the question of "why are we here?" (emphasis on "here") to "why are there 'heres'?" Those aren't the same question. Davies is essentially complaining that science has not answered, and likely cannot answer, every question. But science doesn't claim to be able to do that, so far as I understand it: science claims to provide answers - the best, contingent ones possible given the current state of data - to those questions that we understand sufficiently to provide workable theories (i.e., about which we can make predictable statements). When Davies asks "where do they come from?" he's implicitly suggesting that it's absurd not to imagine some Ultimate Source. He seems to be smuggling God, in an unmarked package, through the back door. At least I suspect this is how many readers will understand him (even though elsewhere he describes himself as an agnostic).
What I've never understood about this approach to religion is the belief of its adherents that it answers anything at all. "Where did all those laws and principles come from? They must have come from God!" Okay...so where did God come from? "God doesn't 'come from' anywhere: God is infinite, universal, and eternal, without beginning or end." In place of science's "I don't know," we have an "answer" that isn't an answer at all: it's a semantic gesture, in which "God" is defined as "that which evades all logical questions about origin or causality, being defined as beyond all that." But you might as well just shrug your shoulders and say I don't know for all the insight such a definitional gesture gets you. It's merely answering the question with a name, a name delineating attributes that deflect rather than answer the question.
Back to Davies. Confirming our suspicion as to the contents of that unmarked package, he writes that "the very notion of physical law is a theological one in the first place, a fact that makes many scientists squirm. Isaac Newton first got the idea of absolute, universal, perfect, immutable laws from the Christian doctrine that God created the world and ordered it in a rational way. Christians envisage God as upholding the natural order from beyond the universe, while physicists think of their laws as inhabiting an abstract transcendent realm of perfect mathematical relationships."
First, I'm not sure that Christians typically believe that God created the universe "in a rational way" - I'm not even sure what that would mean. I suspect Davies means that Christians see the order in the universe as evidence of an order-giver. But in this passage, Davies is confusing "law" in the juridical sense with "law" as it's used in science. There's really only a coincidental similarity between the sort of "universal, perfect, immutable laws" some Christians might assert and scientific laws. The religious laws are more like juridical laws, in that they're envisioned more or less as God saying, it shall be so in this manner, and so it is. Gravity does what it does, electrons do their thing, because God ordered them so. But a scientific "law" is merely a description, derived from many repeated observations, of the way something is. It does not assert a lawgiver or anything else beyond the observable universe.
Davies claims that "science has its own faith-based belief system" because it "proceeds on the assumption that nature is ordered in a rational and intelligible way." He continues: "You couldn't be a scientist if you thought the universe was a meaningless jumble of odds and ends haphazardly juxtaposed. When physicists probe to a deeper level of subatomic structure, or astronomers extend the reach of their instruments, they expect to encounter additional elegant mathematical order."
The problem here is a confusion of "expectation" with "faith." The reason no scientist is likely to think that "the universe was a meaningless jumble" is that, so far as science can determine based on its observations to date, it isn't - and that any such "meaningless jumble" would be, by the very fact that so many patterns and forces have been established, a local exception, probably ultimately explainable by a higher-order organization (science has done quite a lot to explain the seemingly random: see chaos theory). That's not "faith" - that's reason. There is no rational reason to expect a large-scale exception to what has already been plentifully established.
Davies goes on to argue that "the idea that [scientific] laws exist reasonlessly is deeply anti-rational. After all, the very essence of a scientific explanation of some phenomenon is that the world is ordered logically and that there are reasons things are as they are." Reason is a tool. It is a mode of understanding phenomena. It is not an originating principle. The world may be ordered logically, but that does not mean a logical being ordered it (one implication of Davies' ideas); nor does it mean there are "reasons things are as they are." Davies here switches the meaning of the word "reason" from "a mode of thinking" to "a cause" without noting the switch. The two meanings are not equivalent.
Davies makes a similar move concerning the notion of "laws." He writes, "a God's-eye view might reveal a vast patchwork quilt of universes, each with its own distinctive set of bylaws. In this 'multiverse,' life will arise only in those patches with bio-friendly bylaws, so it is no surprise that we find ourselves in a Goldilocks universe — one that is just right for life. We have selected it by our very existence." Uh, no: we haven't "selected" it - we exist because it exists, with conditions ripe for our existence. That's an odd definition of "selecting." The multiverse theory, however, addresses what some people perceive as a problem: the fact that had certain variables in the early moments of the universe been infinitesimally different, life as we know it would not have evolved. What are the odds, these people say, that all those factors would have come together at random? There must be a higher, ordering power, they argue - by which they typically mean God.
That's always seemed like an odd question to me. Let's say you program a calculator to come up with a random number between 1 and 1,000,000. The calculator comes up with the number 50,392. You fiddle around with that number for a bit, enumerating its properties, its multiples and divisors and so forth...and then you exclaim, why look, this number has all these particular properties! How likely is it that a random process could have coughed up just these properties? Now imagine (pretend it's an animated cartoon illustrating mathematical principles) that those various properties of 50,392 are presented as living beings. They're all running around exclaiming how amazing it is that they exist. But it's not extraordinary at all - or rather, that the number is 50,392 is exactly as extraordinary as if it had been 672,911, 15, 4,887, or 999,999: a million to one, each with its own set of properties. Even if there's only ever been one universe, obviously it's the one we have. You can call it 50,392 if you like. The multiverse theory makes this a little more comprehensible (in some ways): perhaps all those other universes once existed, exist in dimensions inaccessible to us, etc. Once again: we're here because this particular universe is conducive to our existence. For all we know, billions of other universes exist, have existed, or will exist which aren't.
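The point can be made concrete in a few lines of Python (a sketch of my own, not anything Davies discusses): whatever number a uniform draw produces, the probability of having drawn *that* number was exactly one in a million, and the "special properties" we enumerate afterward do nothing to change that.

```python
import random

random.seed(0)  # any seed; fixed here only so the example is reproducible

# Draw a "universe" at random from a million equally likely possibilities.
n = random.randint(1, 1_000_000)

# The probability of this exact outcome -- identical whether it turned out
# to be 50,392, 672,911, or 999,999.
probability = 1 / 1_000_000

# After-the-fact "special properties" of the draw (here, its small divisors).
# Enumerating them doesn't make the draw retroactively any less likely.
divisors = [d for d in range(1, 101) if n % d == 0]

print(n, probability, divisors)
```

Every possible outcome would have supported the same astonished commentary about its own properties, which is exactly why the astonishment proves nothing.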
Davies argues that the multiverse theory doesn't really explain much, since, he says, "there has to be a physical mechanism to make all those universes and bestow bylaws on them. This process will require its own laws, or meta-laws. Where do they come from? The problem has simply been shifted up a level from the laws of the universe to the meta-laws of the multiverse." Well, yes: but he's moved from the question of "why are we here?" (emphasis on "here") to "why are there 'heres'?" Those aren't the same question. Davies is essentially complaining that science has not answered, and likely cannot answer, every question. But science doesn't claim to be able to do that, so far as I understand it: science claims to provide answers - the best, contingent ones possible given the current state of data - to those questions that we understand sufficiently to provide workable theories (i.e., about which we can make testable predictions). When Davies asks "where do they come from?" he's implicitly suggesting that it's absurd not to imagine some Ultimate Source. He seems to be smuggling God, in an unmarked package, through the back door. At least I suspect this is how many readers will understand him (even though elsewhere he describes himself as an agnostic).
What I've never understood about this approach to religion is the belief of its adherents that it answers anything at all. "Where did all those laws and principles come from? They must have come from God!" Okay...so where did God come from? "God doesn't 'come from' anywhere: God is infinite, universal, and eternal, without beginning or end." In place of science's "I don't know," we have an "answer" that isn't an answer at all: it's a semantic gesture, in which "God" is defined as "that which evades all logical questions about origin or causality, being defined as beyond all that." But you might as well just shrug your shoulders and say I don't know for all the insight such a definitional gesture gets you. It's merely answering the question with a name, a name delineating attributes that deflect rather than answer the question.
Back to Davies. Confirming our suspicion as to the contents of that unmarked package, he writes that "the very notion of physical law is a theological one in the first place, a fact that makes many scientists squirm. Isaac Newton first got the idea of absolute, universal, perfect, immutable laws from the Christian doctrine that God created the world and ordered it in a rational way. Christians envisage God as upholding the natural order from beyond the universe, while physicists think of their laws as inhabiting an abstract transcendent realm of perfect mathematical relationships."
First, I'm not sure that Christians typically believe that God created the universe "in a rational way" - I'm not even sure what that would mean. I suspect Davies means that Christians see the order in the universe as evidence of an order-giver. But in this passage, Davies is confusing "law" in the juridical sense with "law" as it's used in science. There's really only a coincidental similarity between the sort of "universal, perfect, immutable laws" some Christians might assert and scientific laws. The religious laws are more like juridical laws, in that they're envisioned more or less as God saying, it shall be so in this manner, and so it is. Gravity does what it does, electrons do their thing, because God ordered them so. But a scientific "law" is merely a description, derived from many repeated observations, of the way something is. It does not assert a lawgiver or anything else beyond the observable universe.
Some more linguistic sleight of hand: while Christians might indeed envision God as existing "beyond" the universe, the "transcendent realm" of mathematics is not "beyond" the universe in the same sense. Again: mathematical laws describe the universe. That doesn't mean they exist beyond or outside it. As well say that grammarians are positing something eternal and "beyond the universe" when they describe the grammar of a text: after all, the text itself says nothing about nouns, verbs, or dependent clauses.
I'm not so sure why it's such a huge problem to accept that we humans, living here in this physical universe, simply can't understand certain concepts: never mind "eternity," even understanding "a trillion years" is well beyond most people's capacity to imagine. Which is easier: to understand a universe that has no outside, has no beyond...or to understand the notion of an "outside" that's somehow not part of the universe? I'd say neither notion is very comprehensible within our logic. So we posit a Super-Understander, one who can make sense of all this, one who can grasp trillions of years and trillions of light-years - and we give it a name, so we can thereby embody it and bring it down to earth. That's understandable, and not what I have a problem with where religion is concerned (although it'd be nice if it were understood as happening that way). What I can't understand is the peculiar - and rather astonishingly hubristic - notion that we can take this inconceivable and infinite Super-Understander - whose very existence in our minds comes from a need to at least have something understand and comprehend what we cannot - and fold it between the pages of a book, put it on a leash, and use it to police our bounds, all the while imagining we know what this Super-Understander we call God wants us to do, how He wants us to worship it (a peculiarly human attribute for an omniscient being), and that we know exactly which little rituals and localized lists of sins He endorses. (There may be eminently human, and humane, reasons to do the things religions typically enjoin us to do - but that's a separate issue from whether God wants us to do them.) Does it never occur to religionists how arrogant they are - if the God they believe in exists, with all the attributes they imagine Him with - to pretend to be able to know His mind?
That is a far higher irrationality than science shrugging its shoulders and saying, you know, we just don't know yet. At least that answer is honest, and appropriately humble.
11.24.2007
here is your throat back, thanks for the loan
If someone had told me, around the time of the first few Sonic Youth albums, that one day they'd perform quite possibly the best and most comfortably fitting cover on a collection of Bob Dylan songs, I don't think I would have believed them. But in fact, Sonic Youth's cover of the obscure (i.e., massively bootlegged but unreleased until now) Dylan song "I'm Not There" is a highlight of the soundtrack to the film of the same name. (I haven't seen the film yet, but I intend to.)
The world evoked by Dylan's music of the period this song comes from (1967, recorded at the same time as the songs later released as The Basement Tapes) is shadowy and carnivalesque: images of drifters, gypsies, and other dubious characters recur - the sleeve art for The Basement Tapes gives you a good idea. Dylan's music at this time tended to feature at least two guitars (electric and acoustic), both piano and organ, and bass and drums, along with Dylan's vocals and harmonica. The interplay among the instruments, and the circus atmosphere evoked by the garish organ and Dylan's wheezy, neon-blue harmonica, complemented that imagery so effectively that musicians still evoke it in their arrangements if their lyrics are particularly Dylanesque. Dylan's version of "I'm Not There" doesn't feature his harmonica, but the calliope-like organ wheeling away steadily in the background against the somewhat tentative mix of strummed guitar and loping bass (clearly an early take, since there are several clams in that bass part) suggests shady proceedings just outside the door or around the corner.
Sonic Youth's version, by contrast, has a slightly more threatening pulse in its rhythm, the heavy drum strokes seeming to stalk the beat. In the background, replacing that carnival organ, there's the ghostly, corroded, metallic code of distorted electric guitar. That sound evokes archetypal Sonic Youth of the late '80s and early '90s, when their music was steeped in the mise-en-scène of Blade Runner, William Gibson, and Philip K. Dick, an imaginary realm of iffy reality, multiple yet sketchy identity, infinite mobility beyond the physical, the guitar seeming far less embodied than Dylan's organ.
I think it works, though, because there are similarities between the old, weird America of Dylan's mid-sixties work - the underworld whose surrealistic menace seemed indebted both to Nathanael West and to the Marx Brothers, or an inversion of Whitman's exuberant and muscular democracy of the body - and the imagined worlds of Gibson and Dick. Those writers also seem concerned with limning the coded imaginary linking their everyday, underground characters with those from the powerful elite, and despite the way a wired interconnectivity extends characters' reach outside the bounds of their bodies, those bodies tend to reassert themselves, or be reasserted, in both writers' often pungently physical descriptions of sex and violence - a directness powerfully present in the music of Sonic Youth as well.
Though Sonic Youth is more aggressive in staking out such claims for itself and its listeners, both musicians explore a rootlessness, an absence and lack of home, in their music. I'd say it's no coincidence that the two recent film treatments of Dylan and his influence have been titled No Direction Home (Scorsese's 2005 work) and I'm Not There. And Dylan's role-playing - so at odds with the early '60s folk community's emphasis on authenticity and genuineness, and later at odds with mainstream rock's fetishization of the authentic - seems almost postmodern in advance, a public figure glorying in being who he's not: an Okie, a holy clown, a mystic recluse, and so on...all the while enjoying what would seem to be transparent attempts at "putting on" the press (most notoriously in Don't Look Back's "science student" scene). Sonic Youth don't overtly role-play in the same way (sometimes I think they'd be better if they did: when they're too sincere they approach insufferability), but the world of their music - both lyrically and sonically - certainly deconstructs the notion of a single, stable identity.
Sonic Youth's done a lot of cover songs over the years - but it's curious that my two favorites might be this one and "Superstar" (the Carpenters' song, apparently about a stalker, whose inherent creepiness Sonic Youth's version brings well forward). A curious side note: Todd Haynes, who directed I'm Not There, also directed Superstar: The Karen Carpenter Story (that's the one that uses Barbie dolls...).
Sonic Youth "I'm Not There" (I'm Not There soundtrack, 2007)
Bob Dylan and The Band "I'm Not There" (I'm Not There soundtrack, 2007)
These are DivShare downloads, so right-clicking won't work...
9.11.2007
three thousand flags
Walking on campus today, I noticed that a group had placed a large number of flags across a grassy area on the mall between the Union and the library. (Here's a link to an image from the campus newspaper.) I assumed (correctly, as it turned out) these flags were meant to commemorate the victims of 9/11 - but I found myself thinking that in some ways, the flag was an odd symbol to choose. Looking over a list of victims, I noticed that about fifty victims were not from America (the non-Americans were primarily Mexican, British, and Canadian, but several other nations were represented).
This, of course, accords with the symbology of the attack, whose two targets, the Pentagon and the World Trade Center, seemed chosen to represent military and economic power - a multinationalist economic power embodied in the WTC's very name. One could argue, in fact, that it's the very internationalist, non-parochial potential of trade (leaving aside for now arguments about favoring the mobility across national lines of capital over people) that opposes it to the fundamentalism of Al Qaeda and the Taliban, since fundamentalism rests upon a pre-sorted identity, an exclusion of the disfavored, a defining and utterly delimited naming of "truth," to which the post-national flexibility exemplified by the presence of citizens of so many nations in a single place would be anathema. For any fundamentalism, truth is fixed, received once in a revelation regarded as literally divine. There's no question of interpretation, modification, or a loyal opposition: you're either with us, or with the infidels.
If the flags mourning these victims of many nations might represent America at its best, as essentially the first post-national nation ("post-national" because its ideals and its citizens are not native, nor imagined as exclusively and specifically so; instead, there's no national tongue, no national blood, only shared ideals which are explicitly universalist, not limited to a chosen few people), then the victims are appropriately memorialized. Certainly no one asked them for their citizenship or passport before they were untimely obliterated.
In this light it's sadly ironic to see where the post-9/11 world has come - since America at its worst seeks by force to extend a monolithic, rigid set of ideals, ideals of economics, of control, of ideology and "democracy from above." In this reading the sad irony is that the international victims, who in life might have proudly proclaimed their British, Mexican, or other citizenship, find themselves posthumously ruled American by fiat, by decree.
But as I've written before (in rather a different context), there's only one nation, and it both extols and extinguishes those better ideals. And so one of the sadder aspects of the US post-9/11 is the way borders have dissolved utterly for capital (flowing one way) and jobs (the other), while rigidifying for humans, both at the physical borders and in the psychological borders within the nation, separating people on the basis of name, color, or employment. (Legally, by the way, anyone earning money in this nation, citizen or not, licitly employed or not, is compelled to pay taxes on that income - regardless of whether they have any voice in the nation's affairs. Taxation without representation, anyone?) Capital moves across the face of the nations, leaving its borders waste and without form, but frowns on the dark faces chained to its machinery.
I'd like to end with some rousing peroration about those three thousand flags, about the need to make them reflect our ideals rather than a will to power, but I'm feeling rudderless and powerless at the moment, as lies and distortion wash over us in a surge of propaganda, and our supposed loyal opposition is so loyal its opposition looks more like an exact reflection. So, sorry.
7.17.2007
being young and stupid isn't just for the young any more
A few days ago, our local paper printed an article noting the increase in deaths and serious injuries from motorcycle accidents, noting as well that the victims of such accidents have been, more and more frequently, older and not wearing helmets. One would have thought that as you grow older, you realize that you are not, in fact, immortal (as you thought in your teens and early twenties), and that if you're going to do something inherently risky like propel yourself on top of two wheels, an engine, and a gas tank at sixty miles an hour down a freeway amidst multi-ton behemoths, wearing a helmet might be a useful thing to do, providing at least some insurance that in the event of an accident, your brains will not end up a red and black smear on the roadway.
A curious aspect of this article: even though it focuses on motorcycle helmets as a factor in deaths and brain injuries (and even though helmets have been shown to reduce the incidence of death and injury in accidents), it never once mentioned that Wisconsin had several times tried to pass mandatory helmet laws. I e-mailed the author of the article, who said that she had limited space and that material had been edited out (fair enough) - but it still seems a curious omission.
Perhaps it has something to do with the same bunch of people - regular people as well as lobbyists - who ensure that a Google search for "motorcycle helmet law" yields, first, a slew of pages ranting about the jackbooted thugs' attempt to limit bikers' freedom to crush their skulls on America's highways. And probably the same bunch of people sending abusive letters to the author for even mentioning the concept that helmets might save lives (as she also noted in her reply to me).
When I get in my car (which at least provides some sort of protection), I am legally required to strap on a seatbelt. Yet motorcyclists are legally allowed on the roadways wearing pretty much nothing more than what they might wear to the beach. Such freedom: freedom to subject other motorists to your grisly death, freedom to cast your parents, lovers, or children into mourning. I wonder what Angeline Schreiber, the young woman quoted in the article whose parents were killed in a (helmetless) motorcycle accident, thinks of their exercising such "freedom."
5.05.2007
paper and iron
One of the articles I've had my students read this past semester concerns itself with the effect of grading on education and concludes that, ultimately, grading itself degrades the quality of education. Grading does so (to oversimplify) by substituting desire for the grade for desire for learning, in other words, by substituting an extrinsic motivation for an intrinsic one. Rather than learning because one values learning, knowledge, or skills, all of those things are seen as secondary to something else, something external: in the case of the classroom, grades. Of course, those grades are not really their own goal either: they are seen as keys to another extrinsic motivator, which is, of course, success (i.e., money, and the things you can do with it).
One argument typically raised by students against the premise of this article is that without the motivation of grades, students will realize they can slack off and not be penalized. Of course, that's true only if you define "penalized" in terms of receiving a short-term negative: if you actually do value the learning and knowledge and skills, slacking off quite clearly penalizes you, in that you will not learn. But this criticism does have an element of truth, since our whole conception of effort tends to be keyed to the notion of reward. So it's most likely true that, at first, eliminating grades (in favor of, say, comments intending to help the student improve the quality of work) might lead to some students putting in less effort, simply because they do not see any immediate penalty.
But the tension here between avoidance of short-term penalty and awareness of longer-term reward is not just a part of education; it's a huge part - and problem - of capitalism generally. So long as you get paid, so long as you get the job, so long as you maintain the prestige necessary to ensure your continued income, the quality of the work itself simply does not matter. The usual argument is that people will recognize lower-quality work, favor higher-quality work, and penalize slackers and just-enoughers accordingly. But in fact, most people don't care, don't know, or can't afford to exercise such discernment, and so the good-enough will always outshine the quality.
What's worse, though, is the way extrinsic motivation is concomitant with an excessive emphasis on competition compared to cooperation (some readers will recognize that I'm continuing to follow the ideas of Alfie Kohn, the writer linked above). Competition is valorized as the necessary fuel for the motor of economic achievement, the essential ingredient, the lack of which (in the form of absent "incentive") is supposed to doom attempts to interfere with the natural workings of the market (such as by a government), since people without the bracing slap in the face of competition will simply accept what they get and not work any harder. But as I suggest, competition also breeds its own species of disincentive, since the need to appeal to a mass audience (even if a demographically pinpointed mass) tends to work against a desire for improvement in quality, since that quality is irrelevant to larger numbers of that audience.
Furthermore, I think an excessive emphasis on competition is corrosive and generates cynicism, since it poisons the idea that anyone could really be out for anything but his or her own self-interest and casts a dubious eye upon claims to the contrary (see my post above on global warming and scientists). This cynicism is real, though, and has real effects, in that most people resent being misperceived, and after a while give up attempts to beat their heads against that wall of cynicism, and simply give in: yeah sure I'm only in it for the money - and those folks who say they aren't are just liars. Furthermore, if such people actually do have money, they probably didn't earn it: a corrosive resentment follows, both from personal frustration and as a consequence of the inability to theorize non-economic motivations within that economic system.
Status and prestige are concomitant with money and success; lacking those two elements, people are forced back upon other devices to assert their status and position. The wealthy do not generally involve themselves in drive-by shootings, because they don't need to. Deprive a person of all the means of achieving respect and status, except for brute force, and don't be surprised if brute force is what the person uses.
(One irony on the whole emphasis on competition: it downplays the fact that cooperation is equally essential, even within a capitalist system. The chief economic actor these days is, in fact, the corporation: that is, a cooperative endeavor that exists because people realize it's more effective and more efficient to cooperate to earn money than for every individual to do so. Corporations themselves go on and on about "teamwork" (even when that's just a code word for "do as I say"), and even the elite tend to educate their children in classrooms, not solely with individual tutors.)
4.15.2007
id(o)ling
Unfortunately I didn't have my camera on hand to document the precise wording of his sign, but yesterday I saw a lone protestor, outside a Riverwest grocery store (for Milwaukeeans - since there really is only one Riverwest grocery store - I've just ID'd the location exactly...), holding an enormous sign that said something like Stop Voting for American Idol and Start Voting to Bring Our American Heroes Home: End the War.
Anyone who's read more than two or three entries of this blog knows I agree with the gist of this guy's message...but there are just so many things wrong with his method of presenting it - and unfortunately, these things map pretty clearly to problems progressives in general have had gaining political traction.
First - and the problem that leads to all the other problems - our protestor seems to act under the impression that he's speaking for a small minority against an overwhelming force opposed to him. American Idol is, of course, one of the most popular shows on TV; if the juxtaposition of that with "American heroes" makes any sense at all, it would be to suggest his assumption that more people are concerned with the TV show than with the war in Iraq. First problem: the two have nothing to do with one another. It is entirely possible for someone to be obsessed with American Idol and opposed to the war - even actively opposed to the war. In fact (and this is where the implied comparison completely falls apart), polls these days typically show that two-thirds to three-quarters of Americans think the war is a bad idea. Unless the audience for American Idol is drawn entirely from the pro-war minority, it's likely that most fans of American Idol are also opposed to the war.
In other words, he's assuming (in this case, against all evidence) that most people don't care about the war, or support it if they do, or are apathetic, and instead would rather watch TV. There's an implicit elitism here, too: "we finer sorts know what's truly important - and it isn't whoever Paula Abdul babbles and coos over." Even if everything above were true - the minority, the concerned elite - it's hardly an effective tactic to try to win people over by, essentially, insulting them. "Hey you, ya big slob: try using your brain for once, idiot!" That works, right?
I'll pass over the tortured attempt to make "American" relevant in each half of the slogan, and the odd use of "vote" (while there are referenda on the war, they're not binding - and this was a week after the elections). It's not the populace that needs convincing here - it's elected officials. If the sign had said something like "Tell your congressional representative you want to end the war now!" - that would have made much more sense, and probably been much more effective, not only in communicating to people who saw the guy's sign, but also (assuming people followed up) in letting representatives know which way the wind's blowing. (The Democrats, mostly, as usual seem in mortal terror of the Fox Squeaky Wheel Faction: the minority of rabid wingers who invariably seem more motivated to jump when their side says jump than the Democrats' own tepid voters. Perhaps that tepidity has something to do with the fearfulness of those representatives?)
And of course, I'm sure the guy felt he was earning some sort of moral brownie points for standing out in the 45-degree cold, all by himself, stoically holding up his hand-painted sign. Perhaps he should - but it's too bad the approaches his sign demonstrates are such a turnoff to so many. Hectoring defensively is rarely a good idea.
(PS: Tell your congressional representative you want to end the war now!)