. . . 2002-04-15 |
Mark your calendar, but not so heavily that you can't see the dates
Ray Davis will be lecturing on "Portentously Vague Quotation: An Explanatory Subtitle; or, After Such Colon What Knowledge?" at 848 Divisadero on Friday, April 26, 2002, as "the slow part" of a frenzied evening organized by Adam Tobin, Jenny Bitner, and Jay Schwartz, and headlining Camille Roy, Evidence of the King, Sam Tsitrin, Matthew Davignon, and expresso gamelan fury.
It's supposed to start at 8 PM PST and I don't know when it'll all end, but I, for one, will be pleased to take your prolonged booing and hissing into account.
. . . 2002-04-16 |
Neuraesthetics: Negative Correlations
I distrust taxonomy partly because I've noticed my own misuse of categories and partly because I've noticed misuse by others. We're all editorial cartoonists at heart: the first thing we want to do with any generalization is to treat it as an instance, or better yet a personification. Classifications are played as if they were the same sort of tokens as the objects classified, and what started as descriptive analysis becomes a competitive comparison against an enticingly novel and coherent ideal.
Similar confusions arise from the "statistical significances" of contemporary medical and psychological research. Narratives aren't built of percentages, much less of scatterplots. Following our narrative impulse, we'll reify a positive correlation from a controlled experiment into an exemplary (but imaginary) real-life case, adding a dollop of assumed causality for dramatic emphasis.
Providing fine backup of my prejudices and a fine example of how a non-scientist like myself will caricature complex research given half a chance, a couple of recent studies ("Psychophysics and Physiology of Figural Perception in Humans", "Visual Categorization and Object Representation in Monkeys and Humans") have looked into how closely various models of classification match what humans (and rhesus monkeys) actually do when they categorize. The models tested were:

- the prototype model, in which a category is represented by a single averaged, central example;
- the exemplar model, in which a category is represented by the particular instances one remembers;
- the boundary model, in which a category is defined by which side of a dividing line in feature space an item falls;
- the cue-validity model, in which membership is judged by weighting features according to how reliably they predict the category.
And so it turned out that the exemplar and boundary models came closest to actual behavior, for us and the rhesus monkeys both.
The researchers kept tight control over instances and implied categories, making it less likely that their four-variable diagrams would be shanghaied into pre-existing classifications such as "Looks more like my boyfriend's family than mine" or "Probably edible." All the researchers sought, of course, was a cleaner description of how the mind works.
But my half-assed misappropriation would apply their results to how to work with our minds:
Categories that are constructed and maintained using an exemplar or boundary model are more likely to be useful and less likely to be misunderstood than categories that require a prototype or cue-validity model. Because we're probably going to insist on using the exemplar or boundary model regardless.
"Race" and "sex" being obvious examples: a pre-existing classification based on boundary conditions fuels research whose results -- valid only (if at all) as descriptive scatterplot with plenty of outliers -- are then misapplied to intensify our focus on those boundary conditions.
. . . 2002-04-17 |
It's the BEST cup of COFFEE I've had in a LONG time.
The most underpraised comic strip of the 1980s and early 1990s was "Brenda Starr." Artist Ramona Fradon adapted the scratchy legacy of the strip's creator, Dale Messick, into a unique and surprisingly flexible look of thrift-shop fantasy -- like a Poverty Row serial version of MGM Technicolor soap -- entirely appropriate to the scripts of Linda Sutter and, later, Mary Schmich, which fused the conventions of romance, adventure, and gag-a-day sit-com comics into a high-camp (and glacially paced) original. In 1996, June Brigman took over as artist. I still haven't cottoned to her super-clean Barbie-ish style, and her increasing reliance on recycled panel art casts a defeatist tone over the whole enterprise -- read a week or two at one sitting and it starts to seem like "Red Meat." And Schmich's storyline of last fall and winter suffered from a wooziness that seemed part too-old-hat (meeting the lovable homeless people) and part too-current-events (a biological warfare twist unfortunately coincided with the anthrax scare). But the current storyline is a return to full hoot, dragging Starr back to her glamour girl roots (with a fashion editor whose critical eye reminds me as much of Dale Messick as of Helen Gurley Brown or Diana Vreeland) and rewarding her (and our) dogged persistence by finally dishing her a full night alone with one of those dangerous hunks: "Why don't you come back after work? We'll TALK some more." Hurrah for Production Code codewords!
From the Fradon years
. . . 2002-04-18 |
It was thought, therefore it was
Trio leaves no room for doubt. Whoever says: "The man has gone to town", must indicate in the form of the verb whether or not he saw the man going to town. If the speaker was not an eyewitness, he also needs to indicate whether he has understood this to be the case or whether he has indirect evidence.

A language that forbids the indefinite passive and the absent expounder has been much desired, and it's to be regretted that competitive pressures ensure its continued absence. [via Simcoe, whose prolonged sporadicness has much enfeebled the fighting spirit of the American people]
. . . 2002-04-26 |
Awfully sorry about the extended absence. Those of you who'd like to abuse me in person (satisfaction guaranteed!) are reminded that I'll be stammering and gesturing wildly tonight at 848 Divisadero, between Fulton and McAllister, in world-famous San Francisco, California.
Those of you who'd just like to see Camille Roy should come, too.
Show up by 8 PM (world-famous Pacific Standard Time) for the full effect.
. . . 2002-04-30 |
Movie Comment: The Phantom of Liberty
I've never understood why so many of this movie's viewers complain about the shallow absurdity of the fate it apportions its mass murderer. On first sight and every sight since, it's sent chills up my spine and a gurgling chortle down my throat; it seems the pinnacle pivot in a science fictional narrative constructed as thoroughly as possible of pivots.
The tidy bespectacled protagonist of the episode has tidily and distantly, like a perfectly manicured deity, dispatched about a half-dozen utterly random humans. We see him in his skyscraper with his long-range rifle; more tellingly we see several scenes of people walking normally through the city's streets and then individuals among them silently (due to the long range of the rifle), inexplicably plummeting to the ground like so many god-despised sparrows. Have they fainted? Are they pulling some kind of scam? No; it's simply that they used to live and now they don't.
And that's not even abnormal, is it? Just sort of a shock to those around them. Like a slaughterhouse or a legal proceeding is shocking (to nonprofessionals) without being in the least abnormal.
The protagonist is quite rightly found guilty of premeditated murder, and condemned to death. He shrugs. He's congratulated. He's given a cigarette. (In another episode of the movie, that episode's protagonist is told of his terminal cancer by a friend who's also his doctor and who, to offset the diagnostic blow, offers him a cigarette, and is quite rightly struck down by a punch in the face.) He leaves the courtroom, to walk normally through the city's streets.
Is this really so hard to follow?
UPDATE 2018-11-10: Axiom-of-Cinema David Cairns's defense of another narrative thread got me going all over again:
Similarly, the kidnapping subplot doesn't feel completely arbitrary to me -- wonderfully silly, but not arbitrary. Adults often do ignore the exhaustingly alien inner lives of children in favor of making them story points in the adults' drama: "Please don't interrupt Mommy when she's telling the doctor about your ADD." It's also a nice dream reversal of the old "kid witnesses a crime but no one will listen" story. In this case a kid DOESN'T witness a crime and no one believes her.
Every man is an island, but he can make day trips when the ferry is running
Those who are curious but who for whatever reason couldn't make it to Friday's performance can be assured that it went, actually, surprisingly well. (Or at least the reception did -- for obvious reasons, I can't speak as to the quality of the performance, although a Chronicle columnist acclaimed it as "AMUSING!") Given the unhipness of the notion (that same columnist, I'm told, expressed misgivings about "attending a lecture on a Friday night"), the audience was astonishingly supportive and friendly, and I glowed like a fat old white guy after a sauna and rubdown even before I started spending my portion of the door's take on booze.
I may not quite be ready to make the Cory Doctorow career leap to professional speaking (I'll need to cover more than one night's bar tab for that), but I had a lovely time as spectator, listener, and participant, and I thank the organizers. You thank them too, OK, when you see them?
. . . 2002-05-02 |
Neuraesthetics: Foundations
You may have already heard this news -- it was all over Herodotus -- but when the ancient Egyptians went to mummify somebody, the first thing they did was take out the brain.
They had these narrow metal rods, and some had a hook on the end, like a nut pick, and some had more like a corkscrew, and supposedly they pushed them up the nose and, you know, pulled. (I'm kind of surprised that it hasn't become standard non-Western medicine out here in California: direct stimulation of your frontal lobes, smart massage....)
Well, some ancient Egyptians, anyway. And also some researchers at the U. Maryland Med School, who got permission to try this out on some poor 70-year-old Baltimorean. According to them, it was actually much too frustrating to try to drag the brain out in chunks, so -- you can hear the exasperated sighs echo through the millennia -- they finally just shoved the pick up there and whisked it around to get sort of a slurry, and then held the old guy upside down....
Especially after that level of detail, an afterlife wandering around with an empty skull probably doesn't sound so attractive. To me, it brings back those Wizard of Oz nightmares -- you know, the ones with zombie Ray Bolger? "Brain...! Give me... brain...!"
But to the ancient Egyptians it made sense. They were obsessed with fighting death. Death meant turning into rotten stinking gooey stuff. Therefore rotten stinking gooey stuff was the enemy: Their physicians specialized in enemas and emetics, and those who could afford the time devoted three days of each month to purgation.
If we assume that the soul is immortal, the soul can't be in stuff that decays. And the brain is too icky and gooey to embalm, so it must be useless to the soul.
Of course, the ancient Egyptians were right that the brain is an obstacle to immortality. Sadly, that's because the brain is what defines mortality. They were right that we rot from the head down. They were just wrong about there being an alternative.
But their technique was embalming, so that's what they went by. It's a common enough fallacy -- a subspecies of what experimental psychologists tend to call "priming." We apply (shape, mask, filter) what we've encountered (whether we realize it or not) to our actions, and we apply our actions (whether we realize them or not) to what we perceive.
. . . 2002-05-03 |
Neuraesthetics: Foundations, cont.
It's inherent to the mind's workings that we'll always be blinded and bound by our own techniques.
Another example of this -- which actually bugs me even more than ancient Egyptians taking brains out of mummies -- is when essayists or philosophers or cognitive scientists or divorced alcoholic libertarians or other dealers in argumentative prose express confusion (whether positive or negative) before the very existence of art, or fantasy, or even of emotion -- you know, mushy gooey stuff. Sometimes they're weirdly condescending, sometimes they're weirdly idealizing, and sometimes, in the great tradition of such dichotomies, they seem able to, at a glance, categorize each example as either a transcendent mystery (e.g., "Mozart," "love") or a noxious byproduct (e.g., "popular music," "lust").
(Should you need a few quick examples, there are some bumping in the blimp-filled breeze at Edge.org -- "Why do people like music?", "Why do we tell stories?", "Why do we decorate?" -- a shallow-answers-to-snappy-questions formula that lets me revisit, through the miracle of the world-wide web, the stunned awful feeling I had as a child after I tackled the Great Books Syntopicon....)
These dedicated thinkers somehow don't notice, even when they're trying to redirect their attention, what they must already know very well from intellectual history and developmental psychology both: that their technique is blatantly dependent on what seems mysteriously useless to the technique.
Cognition doesn't exist without effort, and so emotional affect is essential to getting cognition done. Just listen to their raised or swallowed, cracked or purring voices: you'll seldom find anyone more patently overwhelmed by pleasure or anger or resentment than a "rationalist," which is one reason we rationalists so often lose debates with comfortably dogmatic morons.
Similarly, "purely observational" empiricism or logic could only produce a sedately muffled version of blooming buzzing confusion -- would only be, in fact, meditation. Interviews, memoirs, and psych lab experiments all indicate that scientists and mathematicians, whether students or professionals, start their work by looking for patterns. Which they then try to represent using the rules of their chosen games (some of the rules being more obviously arbitrary than others). And they know they're done with their new piece when they've managed to find satisfying patterns in the results. It's not that truth is beauty so much as that truth-seekers are driven by aesthetic motives. ("It's easier to admit that there's a difference between boring and false than that there's a difference between interesting and true.")
Studies in experimental psychology indicate that deductive logic (as opposed to strictly empirical reasoning) is impossible without the ability to explicitly engage in fantasy: one has to be able to pretend to believe what one doesn't really believe in order to work out the rules of "if this, then that" reasoning. The standard Piaget research on developmental psychology says that most children are unable to fully handle logical problems until they're twelve or so. But even two-year-olds can work out syllogisms if they're told that it's make-believe.
Rationality itself doesn't just pop out of our foreheads solitary and fully armed: it's the child of rhetoric. Only through the process of argument and comparison and mutual conviction do people ever (if ever) come to agree that mathematics and logic are those rhetorical techniques and descriptive tools that have turned out to be inarguable. (Which is why they can seem magical or divine to extremely argumentative people like the ancient Greeks: It's omnipotence! ...at arguing!)
An argument is a sequence of statements that makes a whole; it has a plot, with a beginning, a middle, and an end. And so rhetoric is, in turn, dependent on having learned the techniques of narrative: "It was his story against mine, but I told my story better."
As for narrative.... We have perception and we have memory: things change. To deal with that, we need to incorporate change itself into a new more stable concept. (When young children tell stories, they usually take a very direct route from stability back to initial stability: there's a setup, then there's some misfortune, then some action is taken, and the status quo is restored. There's little to no mention of motivation, but heavy reliance on visual description and on physically mimicking the action, with plenty of reassurances that "this is just a story." Story = Change - Change.)
And then to be able to communicate, we need to learn to turn the new concept into a publicly acceptable artifact. "Cat" can be taught by pointing to cats, but notions like past tense and causality can only be taught and expressed with narrative.
It seems clear enough that the aesthetic impulse -- the impulse to differentiate objects by messing around with them and to create new objects and then mess around with them -- is a starting point for most of what we define as mind. (Descartes creates thoughts and therefore creates a creator of those thoughts.)
So there's no mystery as to why people make music and make narrative. People are artifact-makers who experience the dimension of time. And music and narrative are how you make artifacts with a temporal dimension.
Rational argument? That's just gravy. Mmmm... delicious gravy....
Further reading
I delivered something much like the above, except with even less grammar and even more whooping and hollering, as the first part of last week's lecture. Oddly, since then on the web there's been an outbreak of commentary regarding the extent to which narrative and rationality play Poitier-and-Curtis. Well, by "outbreak" I guess I just mean two links I hadn't seen before, but that's still more zeitgeist than I'm accustomed to.
. . . 2002-05-05 |
More scoop on La Starr: From exotic Chicago, land of Mary Schmich, our esteemed colleague Stumpshaker has supplied new insight into fashion editor Ms. Cozmo's reaction to Brenda's nom-de-mode: "Intelligentsia? Sounds about as sexy as a bag of coffee."
I just found out yesterday (thereby missing the first week or two of archive) that Brooke McEldowney, whose "9 Chickweed Lane" we've complimented before, has begun a fantasy-comedy-adventure strip, "Pibgorn." The narrative pace is still a bit wobbly, and I'd appreciate it if some 733t gremlin would strip all gradient tools from his graphics software, but hey! this is the first new attempt at a syndicated fantasy adventure that I can remember since Pinkwater & Auth's "Norb"! And how the man loves to draw!
. . . 2002-05-07 |
Cold Feet, Bedroom Eyes
Beauty, since you so much desire
To know the place of Cupids fire:
About you somewhere doth it rest,
Yet never harbour'd in your brest,
Nor gout-like in your heele or toe;
What foole would seeke Loves flame so low?
But a little higher, but a little higher,
But a little higher --
But a little higher --
There, there, o there! lyes Cupids fire.

Thinke not, when Cupid most you scorne,
Men judge that you of Ice were borne;
For, though you cast love at your heele,
His fury yet sometime you feele;
And where-abouts if you would know,
I tell you still, not in your toe:
But a little higher, but a little higher,
But a little higher --
But a little higher --
There, there, o there! lyes Cupids fire.
- Thomas Campion, 1617
Then downe my pray'rs made way
To those most comely parts
That make her flye or stay,
As they affect deserts:
But her angry feete, thus mov'd
Fled with all the parts I lov'd.

Yet fled they not so fast
As her enraged minde:
Still did I after haste,
Still was I left behinde,
Till I found 'twas to no end
With a Spirit to contend.
Twentieth-century American pop musicians often commuted between gospel and secular genres (which themselves mandated varying degrees of directness), and sometimes carried specific songs with them. But pretty much all the earlier transformations I'm familiar with were cross-author, and often conceived of as parodies, with the Earl of Rochester's memorably horrific (and Bataille-trumping) Scroope-over one of the most vicious.
Exceptionally, Campion was here burlesquing one of his own. From the first Booke of Ayres, published in 1601:
Mistris, since you so much desire
To know the place of Cupids fire,
In your faire shrine that flame doth rest,
Yet never harbourd in your brest;
It bides not in your lips so sweete,
Nor where the rose and lillies meete,
But a little higher, but a little higher:
There, there, O there lies Cupids fire.

Even in those starrie pearcing eyes,
There Cupids sacred fire lyes;
Those eyes I strive not to enjoy,
For they have power to destroy;
Nor woe I for a smile, or kisse,
So meanely triumph's not my blisse;
But a little higher, but a little higher,
I climbe to crowne my chast desire.
(The fine editor of my Campion, Walter R. Davis, oddly refers to the later version as "slightly revised.")