pseudopodium
. . . Narratology

. . .

In its final scene, 1943's Old Acquaintance brilliantly explains away its own trashiness by ascribing the story's authorship to the movie's talentless hack writer rather than to its artistic one.

. . .

Our story begins four words ago -- which means we're already in the past tense and I'm tardy again. At that time....

+ + +

The appeal of writing fiction over the usual biter bit is the fresh air: out of the echo lab. The external comes as a relief when it comes. Interest enriched by empathy. Could say similar things about sex: self-pleasure as a lens that focuses other-pleasure.

. . .

It's got its good paragraphs, but E. E. Cummings's allegorical reading of Krazy Kat -- with Kat as democracy caught between Mouse-anarchy and Pupp-fascism -- has always rubbed me the wrong way.

For starters, Cummings refers to Krazy as "she" throughout, whereas the strip used "he" much more often. (Bowing to public pressure, Herriman experimented with unequivocal she-ness once, but decided it just didn't suit that dear kat.) Following a natural train of thought, Ignatz's rage could be better described as homophobic than as anarchistic: he hates Krazy not because Krazy is a symbol of authority, or repression, or respectability, or even stability, but because Krazy is eccentric, flamboyant, unaggressive, affectionate, and a little kwee.

For the main course, any historically-dependent reading misses Herriman's achievement: a complete universe grown from one necessarily inexplicable but endlessly fecund triangle. Jonathan Lethem came closer to the mark in his story, "Five Fucks," where the triangle is a mysteriously universal solvent; even Lethem took the easier way out, though, in making the triangle violently entropic rather than pleasurably generative.

As Herriman demonstrated in later strips ("A mouse without a brick? How futile."), Coconino's reality depends on support from each point of the triangle; as he demonstrated throughout the strip's three decades, the triangle supports an infinite unfolding of reality. Lacking that central mystery, other comics, no matter how minimalist or how beautifully drawn, seem artificial and puffy by comparison.

. . .

The prole thrillers which came closest to the jokey splatter of neo-noir weren't from Mickey Spillane but from Chester Himes. I recently read the second volume of Himes's autobiography, which, amidst hundreds of pages of complaints about girlfriends, royalties, and his Volkswagen, revealed just how liberating Himes found the hard-boiled genre after a dozen years of working the Richard-Wright-defined mainstream:

I would sit in my room and become hysterical thinking about the wild, incredible story I was writing. But it was only for the French, I thought, and they would believe anything about Americans, black or white, if it was bad enough. And I thought I was writing realism. It never occurred to me that I was writing absurdity. Realism and absurdity are so similar in the lives of American blacks one can not tell the difference.

.... I was writing some strange shit. Some time before, I didn't know when, my mind had rejected all reality as I had known it and I had begun to see the world as a cesspool of buffoonery. Even the violence was funny.... If I could just get the handle to joke. And I had got the handle, by some miracle.

I didn't really know what it was like to be a citizen of Harlem; I had never worked there, raised children there, been hungry, sick or poor there. I had been as much of a tourist as a white man from downtown changing his luck. The only thing that kept me from being a black racist was I loved black people, felt sorry for them, which meant I was sorry for myself. The Harlem of my books was never meant to be real; I never called it real; I just wanted to take it away from the white man if only in my books.

-- pp. 109, 126, My Life of Absurdity by Chester Himes

. . .

Elements of Film Style:
"Critics are inclined to belittle them and call them cheap. But they don't seem to sense the idea that life is made up largely of melodrama. The most grotesque situations rise every day in life.... And yet when these true to life situations are transferred to the screen, they are sometimes laughed down because they are 'melodrama.'

"If this is true then all life is a joke and while some humorists hold to this idea, I am not one those who believe it so."

-- Frank Borzage as quoted by Peter Milne in Motion Picture Directing, 1922
Little Man, What Now?

Those of us who have attended fiction workshops may recognize this as the flip side of the common warning against overly dramatic plot points whose only defense is "But that's how it really happened!" Some such warning is needed, as those of us who have read manuscripts in fiction workshops can testify, but, when overapplied, it leads to the numbly unmoving body of cliché called "literature" by its practitioners and "MFA crap" by everyone else.

And then we end up relying on the unguiltily mendacious genre of the memoir to get our melodrama fix.

Not a pretty sight. Not compared to a Borzage movie, anyway.

Our memories and self-images are formed of stories. And so it's inevitable that we're particularly drawn to the most obviously story-like (i.e., melodramatic) incidents that crop up in our "real life," and that we strive to make the incidents that seem important to us more story-like.

But when we put ourselves to the job of story-telling rather than the job of real life, we're operating in a different context. In real life, it's excitingly unusual for story-like forms to appear. In story-telling, it's expected; you don't get extra credit for producing a story that does nothing but sound like a story -- that's the bare minimum that you promised when entering the fray.

Borzage (along with most of the other narrative artists I love) shows by example that melodrama is not a guarantee of success, to be clung to; nor a guarantee of failure, to be shunned. Melodrama is an added responsibility, to be taken on and dealt with, to be rewarded and punished. Melodrama executed with courage, wit, observation, and beauty will always carry more weight than work that avoids "grotesque situations."

And it'll also always run the risk of being laughed down.

. . .

Goodbye to All That

A long essay on a difficult writer: Laura True-Teller and Other Fairy Tales.

. . .

Free and direct discourse

Krazy's diary

Was writing, considered as external memory storage, truly a revolutionary leap in cognitive evolution?

It was an advance in shopping list technology, sure. But, considered as very long-term external memory storage, writing relies on the kindness of strangers almost as much as that other external memory storage, oral culture, does. Look at how few "immortal masterworks" since the invention of writing have survived to reach us. Whether kept in the noggin or kept on parchment or kept busily transferring from one mechanically-interpreted-medium-of-the-decade to the next, words' persistence and accessibility are almost completely dependent on interested individuals. Parchment just has an edge as far as dumb luck goes.

Similarly, the contractual use of writing as external evidence of intent wasn't a revolutionary leap in social development. Forgeries can be made and denounced; libel is only slightly easier than slander; witnesses' depositions are just as unreliable as their oral testimony....

But writing's use as external object is another matter, and not one that gets mentioned much in the cognitive science texts.

Person-to-person, we use language to express and to manipulate. To have one's words be understood is an ambition that's hard to even describe without the assumption of distance. It's not the noisy-channel-between-transmitter-and-receiver described by information theory. It's a channel between transmitter and object, followed by a completely different group of channels between object and receivers, channels whose "success" can't be measured by eliminating the middleman and totting up the error rate because the middleman is the point. I'm not standing behind my words to guarantee them; I'm standing there because you're not supposed to see me. I'm no longer the "message source"; I've handed that status over to an inanimate object, and that object can't be queried as to the success of the transmission.

Signed Ignatz
We empty the bottle and stick a note in it. We toss the brick over the wall hoping for a kat. The most novel aspect of writing is its status as artifact, its separability from the inchoate author, our signature no more important than any other indexable aspect.

. . .

Condensed Time

Since movies began, they've been swapping techniques with dreams. I think that's because they share a structural problem: how to maintain different rates for elapsed time and for narrative time -- expressing years in an hour, an hour in minutes -- in a medium where the narrative is directly experienced rather than related.

Rather than come to grips with this problem, the filmmakers of contemporary Hollywood tend to simply give up, appending more and more running time to avoid the question of condensation, and saddling clumsy voice-over narration onto the broken back of the "direct experience":

Hi-ho, Sliver! and away!
In another way, all narrative art, including written narrative, condenses time: creator time vanishes into the much shorter audience time. A novel may take a month or fifteen years to write, but almost always takes less time to read.

And with movies the time compression is even more extreme, especially if we start talking about people-hours....
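A back-of-the-envelope sketch of the kind of ratio I mean, with entirely invented round numbers (nothing here is real production data):

    # Invented figures, just to show the shape of the arithmetic.
    novel_writing_hours = 15 * 365 * 4        # fifteen years at four hours a day
    novel_reading_hours = 12                  # a long weekend's read
    film_crew_hours = 200 * 2000              # two hundred people, a year of workdays each
    film_viewing_hours = 2

    print(novel_writing_hours / novel_reading_hours)   # roughly 1,800 to 1 for the novel
    print(film_crew_hours / film_viewing_hours)        # 200,000 to 1 for the film, in people-hours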

Now, although there's always the possibility that I'm falling into the food-in-a-tube fallacy, it seems to me that this compression -- story as time-compactor machine -- is key to the pleasure taken in the curiously strong arts of narrative. As evidence, when there's little or no such compression -- as with the semi-automatic writing of Gertrude Stein or Lionel Fanthorpe, or the semi-automatic early moviemaking of Andy Warhol -- the results, fine though they are, seem more lyric than narrative.

We must think further on this, if we can do so without falling asleep....

Although a narrative work's creation takes more time than any single incident of its consumption, a certain type of audience (mine) may revisit it so often that audience-time eventually sums up bigger than creator-time. I know for certain that I've accumulated more days reading The Glass Key than Dashiell Hammett took to write it.

My type of audience includes most of the critics in the world, and we aren't shy about flattering ourselves (e.g., Barthes's "Those who fail to re-read are obliged to read the same story everywhere," undoubtedly referring to Joseph Campbell). But there's something distinctly unelevated about surrounding ourselves with these papered and videoed units of time, like so many Everlasting Gobstoppers, and I don't believe we escape the market through our repetitions any more than a kid with The Lion King T-shirts, action figures, picture books, and computer games fights the power by insisting on watching the original work again.

Instead our re-reading and re-viewing gives us the chance to treat time itself as a commodity -- something to collect, to hoard, to revel in -- becoming misers of time, diving and wallowing in our libraries for all the world like Scrooge McDuck....

Scrooge McDuck

. . .

Proving again that it's the teller that makes the story, one of my favorite storytellers, Martha Soukup, is telling a story I didn't think I had the slightest interest in. Thus I favorably notice Salon twice in one week. Is this what mellowing feels like?

. . .

(part of our Sexual Degradation Special)

Like many another author setting out on a masterpiece, John Collier must have begun His Monkey Wife with the worst of intentions: to plan a romance novel whose virtuous heroine is a chimpanzee betrays a less than honorable attitude toward romance novels and virtuous heroines. In Collier's typical folderols of feckless poets and rich bullies, the female human plays the luscious main dish or the Acme beartrap but never the protagonist. And his novel, like his short stories, foregrounds a comically exaggerated ideology of misogynous sexism and Anglophilic colonialism.

But rather than a Triumph of Arch, it's Collier's only really moving work. One of the wonders of narrative is that a story, when well-written enough (and His Monkey Wife is very well written), can be so much wiser than the storyteller. Once immersed in the point of view of long-suffering Emily, we're unlikely to be able to hold her chimpdom clearly in sight except as the primal cause of her suffering.

What results is not so much a travesty of romance as one of its purest examples, complicated but essentially unbesmirched by the deadpan perversity of the humor. Our focus shifts between the extremes of expressed sincerity and implied sarcasm until the two views dissolve into a wavering, headache-inducing, but very impressive illusion of depth. By the time sex is dragged in by a prehensile foot, we are, like Mr. Fatigay, more than ready to succumb.

I think Emily Watson for the movie role, don't you?

Tarzan and his mate
Bestiality has never seemed particularly profound in Real Life, but, since Robert Musil's quiet Veronika was first tempted by her Saint Bernard, it's been a sure-fire booster of moral complexity in Fiction.

Sex can work heavy-duty alchemical action on even the shallowest of animal fables, as proved by the only good thing ever written by hack libertarian and Welsh-supremacist Dafydd ab Hugh, "The Coon Rolled Down and Ruptured His Larinks, A Squeezed Novel by Mr. Skunk."

Again we find the ambition-performance ratio unexpectedly reversed. In ab Hugh's story, zero-sum economics applies to intelligence: as one part of society gains IQ, another part accordingly dumbs down, which is why democracy can't work. If he'd illustrated his postulate with, say, American ethnic groups, he might have had some difficulty selling his story to a genre magazine. And so he uses the slightly less controversial hierarchy of species.

Which is how he ended up with something more sellable and richer and stranger than he could possibly have imagined. No matter how fleabit and fanatic, cute fuzzy hungry animals can't help but gain our sympathy; a taboo against "love in the streets" can't help but predispose us to cheer on an affaire de coeur between underboy and underdog, no matter how disgusting.

So, even though the story (mercifully) doesn't work as propaganda for ab Hugh's political position, his viciousness does manage to keep this Incredible Journey from falling into Disneyesque propaganda of another sort. Thus muddling doth make heroes of us all.

. . .

Neuraesthetics: Hypnotic Narratology

The influence of Ernest R. Hilgard's Divided Consciousness: Multiple Controls in Human Thought and Action, first published in 1977, was somewhat hobbled by Hilgard's up-front completely speculative application of his brilliant hypnosis research to his not-very-rigorous understanding of multiple personality disorder. That kind of "these conditions are obviously completely different but their descriptions have some words in common and therefore they must actually be exactly the same" move is best left to us popularizers; in Hilgard's book, it's just a distraction from the stuff Hilgard really knows about.

That stuff started in a classroom demonstration when Hilgard hypnotized a blind guy and told him he would be completely deaf until a hand was placed on his shoulder. The students had their usual sadistic fun trying to make little Tommy jump with handclaps, gunshots, and Keith Moon imitations, and then some Psych-for-Poets throwback asked about subconscious memories (a red herring, but good bait). OK, if you're curious, said Hilgard to the student, and then to the blind deaf guy, "If there's some part of you that hears me, I'd like your right index finger to raise."

It did.

And the blind guy said "Please restore my hearing so you can tell me what you did. I felt my finger rise and it wasn't a twitch."

Post-hand-on-shoulder, Hilgard asked the blind guy what he remembered.

"Everything was quiet for a while. It was a little boring just sitting here so I busied myself with a statistical problem. I was still doing that when I felt my finger lift."
Hilgard hypnotized the guy again and told him, "I can be in touch with that part of you which made your finger rise and it can answer me when I put my hand on your arm."

"It" did. "It" answered, for example, every question about what kind of noises had assaulted the deafened blind guy, and reported the lifting finger command too.

Then Hilgard asked the hypnotized "subject" what "he" thought was happening.

"You said something about placing your hand on my arm and some part of me would talk to you. Did I talk?"
... to be continued ...

. . .

Neuraesthetics: Hypnotic Narratology, cont.

Lacking both a theoretical foundation and any profitability for pharmaceutical companies, hypnosis has never been a particularly easy sell to scholars. Ernest Hilgard's Stanford lab was a great legitimizer, and it was justified largely by hypnosis's usefulness as a pain reliever (although only as a last resort -- there are those pharmaceutical companies to consider, after all).

How do you measure pain without damaging the experimental subjects? (After all, there are only so many political prisoners in the world.) With a bucket of ice water, that's how: a freezing hand has the useful quality of ramping up its agony fairly predictably over time. Once you get yourself a time-coded diary of reported pain levels ("bad," "awful," "fucking awful," and so on) and reported discomfort levels ("I'd rather be doing something else," "I can't stand it any more," "I'll kill you, I swear I'll kill you" and so on), all you have to do is put the subject through it again after hypnosis and measure how the new reports differ.
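If it helps to see the bookkeeping, here's a minimal sketch in Python -- the time points, ratings, and labels are all invented for illustration, not Hilgard's actual data:

    # Hypothetical time-coded diaries: seconds in the ice water -> reported pain, 0-10.
    baseline   = {5: 2, 10: 4, 15: 6, 20: 8, 25: 9}   # before hypnosis
    hypnotized = {5: 1, 10: 1, 15: 2, 20: 2, 25: 3}   # same subject, after suggestion

    # The "measurement" is just how the second diary differs from the first.
    for t in sorted(baseline):
        print(f"{t:2d}s: baseline {baseline[t]}, hypnotized {hypnotized[t]}, "
              f"reduction {baseline[t] - hypnotized[t]}")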

While the speaking "hypnotized subject" was told not to feel pain, the "part that knew about the hypnotic suggestion" was given control of the subject's non-freezing hand and told to report its feelings in writing. And while the "subject" then verbally reported low pain and discomfort levels, the "outside the performance" hand reported pretty much the same pain levels as before and discomfort levels midway between the unhypnotized state and the hypnotized state.

So some aspect of the mind could accurately perceive what had been hidden from the unifying verbal consciousness that we usually metonymize as "The Mind." Moreover, that "aware" aspect had privileged access even to perceptions that the explicitly "conscious" aspect was allowed to monitor. In the ice water tests, the hypnotized subjects may not feel pain qua pain, but they can still feel a variable rate of throbbing in their hand. When the "aware" aspect and the "conscious" aspect had to make their reports by taking turns using the same medium, the "aware" reporter described slow continuous change in the throbbing while the "conscious" reporter described sudden jumps correlating to the time taken up by the "hidden" reports.

Hilgard says that it was only later that he realized how these experiments duplicated some results from the ooh-spooky! days of hypnosis research, when there was similarly trendy interest in automatic writing: an arm was hypnotically anesthetized and then "the arm" was given a pencil and paper and permission to write; when the anesthetized arm was pricked with a pin, "the subject" felt nothing but "the arm" was vehement in its complaints. (Unfortunately, automatic writing research continues to languish in the spiritualist ghetto from which Hilgard and company partially rescued hypnotism.)

... to be continued ...

. . .

Neuraesthetics: Hypnotic Narratology

Continuing our summary of Ernest R. Hilgard's out-of-print Divided Consciousness....

Hilgard called the aware-but-not-included-in-unified-consciousness portion of the communicating subject "the hidden observer," partly because of its privileged access to sensation and partly because it denied being the boss: when a hypnotized "subject" thinks he has an unmovable arm, the "observer" knows that the arm is actually just stiff but still doesn't feel itself as stiffening the muscles of the arm.

If the "hidden observer" had been asked what was controlling the arm, the answer would presumably have been the hypnotist, because, by definition, that's the story that's agreed to during the hypnotic state. According to Hilgard, after all other suggested attributes were successfully argued away, the final explanation of the hypnotized as to how they know they're hypnotized is "I know I'll do what you tell me."

But the hypnotist's assignments still give a lot of leeway, and hypnotized subjects aren't puppets: they come up with their own back story fantasies to explain the suggestion and their own strategies for performance. Which is why Hilgard describes hypnotism not as control so much as goal-setting. What the successfully hypnotized report about their experience of hypnotic suggestion isn't a feeling of subjection to the suggestion but a complete lack of interest in not following the suggestion; e.g., "I didn't blink because it just didn't seem worth the effort to blink."

And, Hilgard guesses, it's also why, when he looked for some personality trait that highly hypnotizable subjects have in common (because we're only talking about a subclass of the most hypnotizable subjects here; less hypnotizable people don't enter into this -- or into most of the other interesting clinical results, for that matter), the only correlation he found was "imaginativeness." His best hypnotic subjects were already comfortable with contorting themselves into somewhat arbitrary new goals and rules; in fact, they sought them out: they included pleasure readers, sensualists, and adventurers, used to vivid fantasies and compartmentalized emotions.

Thus they also included a number of what Hilgard describes as "excellent storytellers," one of whom was asked, under hypnosis, to produce a story with "you and some friends in front of a cave." What spilled out was fifteen minutes of smoothly related, vividly detailed narrative which started by exploring the cavern's chambers and then moved into a Lost World adventure.

Afterward, the storyteller explained his apparent effortlessness: "In hypnosis, once I create the pattern, I don't have to take any more initiative; the story just unfolds. I knew ahead of time that there would be another room inside the cavern, but I didn't know what it would look like until I walked through and was describing it. In the waking state, storytelling seems more fabricated. I don't see the things that I describe in the way I actually see them in hypnosis."

And, backing up this report, while still in the hypnotic state, a "hidden observer" had claimed to be handling the distracting structural work of planning the move to the Lost World and monitoring the story's length, thus letting "the subject" concentrate on a description of the passively viewed surface fantasy.

... Next: What does it all mean, Doctor? ...

. . .

Concluding the Neuraesthetics: Hypnotic Narratology saga... Bookworm or Genius?

Sometimes all it takes to leap to a conclusion is the choice of an article. Whereas calling a communicating-but-unavailable-to-consciousness mental process "a hidden observer" would have brought out its transient nature, calling it "the hidden observer" made it sound like an invincible Masked Avenger, predisposing Hilgard to treat it as more persistent than his evidence would support.

Once on that train of thought, multiple personality disorder may have seemed like a natural stop, but you'd think a clinical hypnotist would be more reluctant to draw attention to a "hypnotist = trauma," "suggestion = repressed trauma," and "hypnosis = serious mental illness" equation.

Anyway, there's no need to board that train. What makes MPD socially and emotionally problematic isn't its modularity per se but its assignment of personalities to the modules. What makes the term "modularity" problematic isn't the notion that the mind is multiply streamed, but the notion that the streams can all be divvied up neatly into things called modules. And what makes Hilgard's hypnosis research interesting isn't how it maps dysfunction but the insight it offers into function.

(Walter Jon Williams's Aristoi takes a similar wrong turn: the novel assumes that modularity makes for efficient thinking, but it takes a trauma-and-MPD route rather than the practice-and-hypnosis route.)

Post-Hilgard hypnosis researchers have been at pains to point out that "hypnosis entails social interaction as well as alterations in conscious awareness"; what they forget and what Hilgard's research underscores is the extent to which conscious awareness is also a matter of social interaction.

One of the noisiest "paradoxes" of the cognitive sciences is that the mind handles tasks faster than consciousness possibly can. But as a paradox, it shows the same kind of naivete as artsy types blathering about quantum theory. Philosophers got over that one in the nineteenth century (and before the nineteenth century, they handled it with stuff like the humours and astrology): we're just talking about the fact that consciousness, by definition, has to pretend to be the boss even when it's perfectly obviously not. Maybe Nietzsche overstated the case when he described consciousness as only a kind of surface froth on the driving waves (Nietzsche exaggerating for effect? What are the odds?), but he was clearly right that the conscious self's usefulness and power get overestimated to justify concepts of legal and religious responsibility.

The corresponding problem is, most folks who latch onto the non-unified-self drop into one of two camps: a New Agey hippyish irresponsibility groovin' on its drives, man, or a Calvinistic morose fatalism where the lucky ones happen to be born naturally more unified (and then fool themselves that it's their super-thick undisturbed froth of will power that's doing the trick) and the rest of us are born permanent losers.

Hilgard's work points toward a less melodramatically binary state of affairs. Rather than a sharp contrast between the self-deceived "self" and the uncontrollable mind-flood, it indicates a constantly shifting array of simultaneous processes, capable of handing off tasks and even of taking over communication. It's not so much that "consciousness" is inherently modular as that modularity is a useful mental technique, with the narrating "consciousness" a specific case in point.

(Man, it's hard to figure out where to put the scare quotes with this stuff. At least I'm leaving sous rature out of the toolbox....)

A narrating consciousness doesn't exclude the possibility of other modules, nor is it invalidated by their existence. When we're all at our best, it's more a matter of efficient mutual support, like in a WPA poster, or like in Hilgard's storyteller story.

When I read it, I thought of Samuel R. Delany, in The Motion of Light in Water:

"... but now what pressed me to put words on paper -- what made me open my notebook and pick up my ball-point -- were comparatively large, if vague, blocks of language that came.... It was as if the whole writing process had finally secreted another, verbal layer. These 'language blocks' were not, certainly, lengths of finished prose, all words in place. But now, as well as the vague images and ideas that formed the prewritten story, I would also envision equally vague sentences or paragraphs, sometimes as much as a page and a half of them -- which was when I knew it was time to write."
The skeptical reaction to Hilgard's work was that since hypnotism research depends on introspection by the lab animals, the research is untrustworthy. In particular, since we know that distraction reduces the affect of pain for less hypnotizable subjects, and since hypnosis reduces the affect of pain for highly hypnotizable subjects, how do we know, first, that the supposedly hypnotizable subjects aren't lying, and second, that the supposedly hypnotizable subjects aren't just distracted drama queens?

So someone compared how well distracted subjects in pain did on a vocabulary test with how well hypnotized subjects in pain did on a vocabulary test. The distracted subjects seemed... distracted. The hypnotized subjects did just as well as if they weren't in pain at all.

The researchers expressed surprise, because surely the limited store of "cognitive energy" (I'm picturing a sort of green glowing fluid) would be even more depleted by ignoring pain and forgetting that you're ignoring the pain than it would be by just taking the pain straight.

As if we have more energy when we aren't doing anything! No, as sensibly evolved organisms, we're more likely to produce energy when there's a reason to do so: when we have an achievable goal and we're achieving it efficiently. Thus the appeal of hypnotism to the hypnotizable, and thus the dismay that Hilgard reports after he revealed her "hidden observer" to a hypnotized woman:

"There's an unspoken agreement that the hidden observer is supposed to stay hidden and not come out. He broke his promise, he's not abiding by the rules..."

. . .

Strange, that in a work of amusement, this severe fidelity to real life should be exacted by any one, who, by taking up such a work, sufficiently shows that he is not unwilling to drop real life, and turn, for a time, to something different. Yes, it is, indeed, strange that any one should clamor for the thing he is weary of; that any one, who, for any cause, finds real life dull, should yet demand of him who is to divert his attention from it, that he should be true to that dullness.

There is another class, and with this class we side, who sit down to a work of amusement tolerantly as they sit at a play, and with much the same expectations and feelings. They look that fancy shall evoke scenes different from those of the same old crowd round the custom-house counter, and same old dishes on the boarding-house table, with characters unlike those of the same old acquaintances they meet in the same old way every day in the same old street. And as, in real life, the proprieties will not allow people to act out themselves with that unreserve permitted to the stage; so, in books of fiction, they look not only for more entertainment, but, at bottom, even for more reality, than real life itself can show. Thus, though they want novelty, they want nature, too; but nature unfettered, exhilarated, in effect transformed. In this way of thinking, the people in a fiction, like the people in a play, must dress as nobody exactly dresses, talk as nobody exactly talks, act as nobody exactly acts. It is with fiction as with religion: it should present another world, and yet one to which we feel the tie.

If, then, something is to be pardoned to well-meant endeavor, surely a little is to be allowed to that writer who, in all his scenes, does but seek to minister to what, as he understands it, is the implied wish of the more indulgent lovers of entertainment, before whom harlequin can never appear in a coat too parti-colored, or cut capers too fantastic.

-- Herman Melville, The Confidence-Man: His Masquerade

    Pierre, in POLA X

+ + +
Movie Comment: POLA X

Canons is the crrrrraziest people! I mean, I love Melville, but what could be nuttier than assigning a book like Moby Dick to a bunch of kids?

Beats me, but doing a big film adaptation of Pierre, or, The Ambiguities has got to come close.

And POLA X is a pretty close adaptation, given that the story's been bumped forward 150 years. Leos Carax even improved the original by explaining the dark sister as a refugee from the Balkans, which takes care of Melvillean mysteries like her lack of education, her fear of authority, and why in the world a false marriage would be more useful than a firmly stated fraternity. And should Herman Melville have developed a time machine, and travelled into the present day, he would almost certainly watch the Carax version, perhaps on a DVD, would he not? And then it seems clear that the incandescent metal coil of competition would drive deep into his heart, and heat and stir his blood, turning him into a lava lamp of nineteenth century American fiction -- is that not also true? And so it would follow that upon returning to his own time, Melville would modify his novel to make Isabel an escaped slave, which would match Carax's explanations point for point and up the ante by explaining the mysterious weightiness of the paternal sin and Pierre's resultingly mysterious compulsion to atone. And then Carax, in despair, would fold.

Which would be just as well, because the movie doesn't work.

As long as I'm rewriting history, would there have been any way to make it work? First, a true film adaptation of Pierre would have to be about a spoiled kid squandering all of his fortune and then some on making a film, a film upon which he would be desperately staking the fate of himself and all his loved ones, a film which would ultimately not be accepted by any festivals, which would, at best, go straight to video. Next, the film itself -- the film which told the story of this sad indie director -- would have to be equally utterly disastrous for the career of its maker, a contemptuous and self-loathing disaster much bigger than, for example, Les Amants du Pont-Neuf, a disaster on the level of The Lady from Shanghai or Marnie. But then also the look of the film must be fevered and murky rather than slick and glamorous.... Oh, perhaps if George Kuchar had married Geena Davis, we'd be approaching the necessary conditions -- but what are the odds? Slim; very slim.

. . .

Errata

If a critic performs any useful function, it must be to spell things out, and I fear I may've been negligent yesterday in not explaining just how it can be pleasant and reassuring to contemplate mass death on the stumbling heels of love's failure, hate's failure, reason's failure, and passion's failure.

It's reassuring because it means that considerations of success and failure don't have to enter into our decisions to privilege love or hate or reason or passion. Since we're equally likely to lose or to lose regardless, we can decide to decide on some basis other than winning and losing. Whereas most stories seem to want to get us all in a muddle on that point, which after a while either makes us a little confused or a little suspicious of stories, which either way is a tiresome strain.

And it's pleasant (in the much quoted words of Homer Simpson) because I don't know them.

. . .

Neuraesthetics: Foundations, cont.

Take an object.
Do something to it.
Do something else to it.

- Jasper Johns (via Juliet Clark)

It's inherent to the mind's workings that we'll always be blinded and bound by our own techniques.

Another example of this -- which actually bugs me even more than ancient Egyptians taking brains out of mummies -- is when essayists or philosophers or cognitive scientists or divorced alcoholic libertarians or other dealers in argumentative prose express confusion (whether positive or negative) before the very existence of art, or fantasy, or even of emotion -- you know, mushy gooey stuff. Sometimes they're weirdly condescending, sometimes they're weirdly idealizing, and sometimes, in the great tradition of such dichotomies, they seem able to, at a glance, categorize each example as either a transcendent mystery (e.g., "Mozart," "love") or a noxious byproduct (e.g., "popular music," "lust").

  (Should you need a few quick examples, there are some bumping in the blimp-filled breeze at Edge.org -- "Why do people like music?", "Why do we tell stories?", "Why do we decorate?" -- a shallow-answers-to-snappy-questions formula that lets me revisit, through the miracle of the world-wide web, the stunned awful feeling I had as a child after I tackled the Great Books Syntopicon....)

These dedicated thinkers somehow don't notice, even when they're trying to redirect their attention, what they must already know very well from intellectual history and developmental psychology both: that their technique is blatantly dependent on what seems mysteriously useless to the technique.

Cognition doesn't exist without effort, and so emotional affect is essential to getting cognition done. Just listen to their raised or swallowed, cracked or purring voices: you'll seldom find anyone more patently overwhelmed by pleasure or anger or resentment than a "rationalist," which is one reason we rationalists so often lose debates with comfortably dogmatic morons.

Similarly, "purely observational" empiricism or logic could only produce a sedately muffled version of blooming buzzing confusion -- would only be, in fact, meditation. Interviews, memoirs, and psych lab experiments all indicate that scientists and mathematicians, whether students or professionals, start their work by looking for patterns. Which they then try to represent using the rules of their chosen games (some of the rules being more obviously arbitrary than others). And they know they're done with their new piece when they've managed to find satisfying patterns in the results. It's not that truth is beauty so much as that truth-seekers are driven by aesthetic motives. ("It's easier to admit that there's a difference between boring and false than that there's a difference between interesting and true.")

Studies in experimental psychology indicate that deductive logic (as opposed to strictly empirical reasoning) is impossible without the ability to explicitly engage in fantasy: one has to be able to pretend belief in what one doesn't really believe to be able to work out the rules of "if this, then that" reasoning. The standard Piaget research on developmental psychology says that most children are unable to fully handle logical problems until they're twelve or so. But even two-year-olds can work out syllogisms if they're told that it's make-believe.

Rationality itself doesn't just pop out of our foreheads solitary and fully armed: it's the child of rhetoric. Only through the process of argument and comparison and mutual conviction do people ever (if ever) come to agree that mathematics and logic are those rhetorical techniques and descriptive tools that have turned out to be inarguable. (Which is why they can seem magical or divine to extremely argumentative people like the ancient Greeks: It's omnipotence! ...at arguing!)

An argument is a sequence of statements that makes a whole; it has a plot, with a beginning, a middle, and an end. And so rhetoric is, in turn, dependent on having learned the techniques of narrative: "It was his story against mine, but I told my story better."

As for narrative.... We have perception and we have memory: things change. To deal with that, we need to incorporate change itself into a new more stable concept. (When young children tell stories, they usually take a very direct route from stability back to initial stability: there's a setup, then there's some misfortune, then some action is taken, and the status quo is restored. There's little to no mention of motivation, but heavy reliance on visual description and on physically mimicking the action, with plenty of reassurances that "this is just a story." Story = Change - Change.)

And then to be able to communicate, we need to learn to turn the new concept into a publicly acceptable artifact. "Cat" can be taught by pointing to cats, but notions like past tense and causality can only be taught and expressed with narrative.

It seems clear enough that the aesthetic impulse -- the impulse to differentiate objects by messing around with them and to create new objects and then mess around with them -- is a starting point for most of what we define as mind. (Descartes creates thoughts and therefore creates a creator of those thoughts.)

So there's no mystery as to why people make music and make narrative. People are artifact-makers who experience the dimension of time. And music and narrative are how you make artifacts with a temporal dimension.

Rational argument? That's just gravy. Mmmm... delicious gravy....

Further reading

I delivered something much like the above, except with even less grammar and even more whooping and hollering, as the first part of last week's lecture. Oddly, since then on the web there's been an outbreak of commentary regarding the extent to which narrative and rationality play Poitier-and-Curtis.

Well, by "outbreak" I guess I just mean two links I hadn't seen before, but that's still more zeitgeist than I'm accustomed to.

. . .

"Go on, our glory, go; know better fates."

The last time I saw Marc Laidlaw was when he worked at the law office and I worked at -- jeez, was it Aeneid? ("It's the name of a Greek god," explained the publicist in the neighboring cubicle.) No, it was earlier, because it was around the same time he said, "weird writers form mutual admiration societies in which we can sit around and admire each other's handicaps." During that lunch, though, Laidlaw seemed more interested in the games company he'd covered for Wired and the game he was novelizing.

Yesterday, I caught up with what he's been doing for the past five years:

He spotted a promising path and took it.

Which just sounds like good sense, or a chapter from Everything I Need to Know I Learned in First-Person Shooters. But good sense is a rare thing and few its acolytes.

 
Q: Do you miss your days as a novelist?
I don't miss the lifestyle that accompanied being a novelist, which can be described as: Work all day as a bored and frustrated administrative assistant or word processor or legal secretary, then come home and try to summon some creativity with the last dregs of my energy, late at night. I miss writing novels, and I'm planning to do more of them. But I think I'd rather be happy all day and not write novels than suffer all day and hope I can funnel my misery into a book.

An art form alive enough to make a living at, rather than one reliant on the sacrifices and squabbles of role-playing volunteers; too new, wet, squirming, and squalling to attract serious critical notice.... Aside from any practical considerations, such a path must hold a special glow for any historian of comics, or pop music, or movies, or pulp genres, such as fiction.

Myself, I'm too much a natural-born critic to join him on it -- as I've had frequent occasion to admit, we only join a party after it ends -- but I'm enough of a historian to appreciate the glow. And I'm heartened by what Laidlaw wants to bring over from his last art form: "mood, character, dramatic rhythm and pacing" -- and by what he doesn't: "For me, the game design process is already granular enough. I don't want to make plot one of those elements."

'Cause it's not like narrative is a precious waif to be coddled on its sickbed: narrative is something we can't escape. And whenever I've been promised ambitious storytelling in hypertext or interactive multimedia or dynamic websites, whether by a heartwarming NPRish family chronicler or by a pin-cushioned anorexic art-school outlaw, it's always been something cohered only by triteness, like some self-congratulatory version of Stars on 45.

 
One of the reasons I moved into game design was because I wanted some new tools to play with, and new problems to think about. Sometimes the tools are useful for telling stories; sometimes they're useful for building traps or puzzles or exciting combat sequences. When I can figure out how to do all those things at the same time, it's most gratifying.

I am actually wary of games that promise a compelling story. I figure it's a warning that I might have to look even harder for the fun.

Although two or three years old, this is the cheeriest news I've heard about anyone for a while, so I thought I'd pass it along.

. . .

Stop Me If You've Heard This One

The narrative drive is a concept that invokes psychology, but not one that I personally recognize from that field. If accorded the status of a drive, narrative in this sense of joining elements together to create coherence is much more (or perhaps much deeper) than the parsimony principle of cognitive linguistics. Is it eros, the death wish, some combination? I'm not certain, but the way Osman puts the concept out there in this poem makes me want to mull it over in more depth than I have done before.
-- Ron Silliman

I don't think we'll get far explaining the narrative drive by the death wish or eros, since the narrative drive is prior to either and behind any attempt at explaining anything, including the narrative drive.

Eros offers moments of timelessness to the willing, but virtually all social settings of eros are shaped by narrative concerns. And the "death wish" is a direct effect of (or metaphor for) the narrative drive's centrality to consciousness: our desire for extension and our desire for conclusion are yoked into conflict like testy oxen, a disagreeable tension typically resolved in the sequelitis of the world's religions. Like other aspects of Freudian psychology, "death wish" is less biology than case-studied narratology and more useful in literary criticism than in treating mental illness.

Knowledge of the existence of the electromagnetic spectrum impels us to drag all into the visible. There's a similar urge to transform into narrative those facets of existence that, clearly and invisibly and dangerously, lie outside narrative -- by doing a close reading of a lyric poem, for example.

. . .

More Math

Fiction / Narrative  =  "Any problem can be solved with another layer of indirection" / Programming
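For the right-hand side of that proportion, a toy illustration of the programmers' aphorism -- my own sketch, with made-up names, not anyone's canonical example: the problem isn't so much solved as handed off to a middleman the caller never has to see.

    # "Any problem can be solved with another layer of indirection":
    # callers stop caring where the text actually comes from.
    def read_config(path):
        with open(path) as f:
            return f.read()

    def make_config_source(path=None, fallback="defaults"):
        # The extra layer: a thing that stands between the caller and the file.
        def source():
            try:
                return read_config(path) if path else fallback
            except OSError:
                return fallback
        return source

    config = make_config_source("settings.txt")
    print(config())   # the indirection absorbs the missing-file problem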

. . .

From an interview with Richard Butner:

I once had a creative writing teacher tell me that he didn't understand why authors used science fiction or magical realism to tell a story or impart a theme. Why do you think we do, when good old realism might do the trick?

"I sit with a philosopher in a garden; he says again and again 'I know that that is a tree,' pointing to a tree that is near us. A second man comes by and hears this, and I tell him: 'This fellow isn't insane: we're only doing philosophy.'"
--Wittgenstein, On Certainty

. . .

Movie Mop-Up: Holes

Despite my adherence to movie-is-a-movie book-is-a-book orthodoxy, what a pleasure, after suffering through a long run of incoherent film-schooled star-indulgent crap, to encounter a script so devoted to its source novel.

Oh, the staging of the script had its discords, starting with the obtrusive music. The cast was charming, but I couldn't help but feel sorry for the overgrown hulk somewhere who'd been denied his big break when apish Stanley Yelnats was assigned to a more conventional willowy teenager. And although the desert made convincing desert, standard-issue F/X exaggerated the gruelling trial-by-mountain into Schwarzenegger fantasyland.

But Louis Sachar's transplanted machinery carried on, doing its job: the low contrivance of melodrama built up and extended, gear by chute by trip-board by flywheel, until it became the high artifice of comedy. It's a practical, if currently neglected, aspect of information theory that, while a little complexity creates suspense, increased complexity either collapses into noise or crystallizes into laughter.

Our anxiety and our relief, being pure products of storytelling technique, float free, ready to attach to whatever sentiment we find close at hand. In a screwball comedy, we associate them with romance, which is why screwball comedies are traditional first-date films and the Three Stooges aren't. Holes, on the other hand, induced in me a strong, and more than slightly disconcerting, upflux of patriotism, and I left the theater in as flag-waving a mood as I've felt in some time.

My reaction isn't easy to explain. It's true that Sachar's elaborate multi-generational farce pivots on important aspects of American history, but lynchings, anti-immigrant prejudice, land barons, and chain gangs make weak propaganda. Maybe there's a bit of Stockholm Syndrome here: America caused the story's anxiety, and so I associated America with the story's relief. After all, I'd be at least as hard-pressed to find positive aspects of sexual love in His Girl Friday or Bringing Up Baby.

Maybe by interlocking our national horrors with the comic survival of individuals, the movie hit at the heart of the particular sort of patriotism I call my own: a love of what Americans have managed to achieve despite all the crap they've gone through; a hope that sheer mobility is enough to release children from the chains and curses of their parents; a fractured fairy tale of chance recombination leading to something better than hostility unto the final generation.

At the very least, it might be worth trying out as a replacement for It's a Mad, Mad, Mad, Mad World on the Fourth of July.

. . .

Movie Mop-Up: Seabiscuit

Before I became a contented critical cow, back when I was trying to write fiction, I was fascinated by synecdochical technique, and wasted some time trying to devise a story told entirely by implication, with only the set-ups visible, with every punchline delivered offstage.

What I found (and you're probably way ahead of me) is that exclusive reliance on synecdoche restricts the author to the thoroughly familiar. A reader can only be trusted to complete a cliché.

In Seabiscuit, I saw my old experiment retried and my old results verified. In place of a traditional exposition, it started with a long series of abbreviated gestures toward foregone conclusions: a certain swell of music, a certain tone of lighting, a certain placement of stars, and you understood that a fortune's about to be made, that disaster's about to strike, that these two people will get married.... Then on to the next Life Incident. The poor schmucks were following "Show, don't tell" to the letter, not realizing that such empty stuff was meant to be told and gotten out of the way to make room for the real movie.

Another Life Incident followed, and another, and I eventually realized that there was never going to be a real movie. These expensive skills, props, and costumes were going to continue to be devoted to showing only what we were trained to think we already knew. Nothing was going to be allowed on film that hadn't already been handled and worn to sepia.

Clearly, the filmmakers had pitched this as not just a horse picture but a real human story with important life lessons, and then, true to their word, had ignored the horse picture in favor of self-help homilies, thus teaching us the important life lesson that it's not always a good idea to stay true to our word.

What kept me in my seat, in that tepid bath, past that point? Perversity, I guess. Just how many little white lies would the filmmakers' consciences take on for the sake of their craft? Ah, the near-psychotically stoic Red Pollard was actually a doe-eyed crybaby who in moments of triumph pumped his arm and shouted "Yes!", just as you and I and Duff-Man. Ah, the Great Depression was ended by a new spirit of optimism rather than a change in economic policies: a very timely insight. Ah, races are dull affairs, to be clipped to incoherence; the only cinematic sport is boxing, and the only real way to film it is Raging Bull's. Ah, all underdogs, even underhorses, are cute and small. (As with the similar miscasting of Stanley Yelnats, I mournfully pictured a spavined butt-ugly horse denied its only chance at stardom.)

I would not love you so, Pirates of the Caribbean, loved I not Seabiscuit less. Just for not having a surfer dude show up or a voiceover explain that the sea symbolized freedom to a weary nation, I loved you.

. . .

Karen Joy Fowler's Sister Noon: A Note on Method

"As a corollary, then, historical inference can only take us back to the furthest-past extension of the principles that now govern the world. A time when 'everything was different' is in principle not reconstructable, i.e. not available to history."
- On Explaining Language Change, Roger Lass
"That's not where I want to be."
- The Ramones
"If it's not all about me, she might have said, why does everyone watch everything I do? Lucky she didn't. Who would complain of this to Mrs. Pleasant, about whom the whispers never hushed?"
Sister Noon - Karen Joy Fowler
"On the one hand, in effect, one must want the greatest good for the friend -- hence once wants him to become a god. But one cannot want that, one cannot want what would then be wanted, for at least three reasons...."
Politics of Friendship - Jacques Derrida
"Here discretion lies not in the simple refusal to put forward confidences (how vulgar this would be, even to think of it), but it is the interval, the pure interval that, from me to this other who is a friend, measures all that is between us.... It is true that at a certain moment this discretion becomes the fissure of death."
- Friendship, Maurice Blanchot

Most good fiction set in the past achieves its brief rapprochement between history and story by avoiding any names that might rouse mutual interest. But let an old beau be brought up and the holiday is ruined: "Well, if you'd only listened to Aaron Burr —" "Aaron Burr! Aaron Burr! Always she throws Aaron Burr in my face!"

Name-dropping historical fiction, whether researched-sincere or postmodern-bratty, may sell well, but it withers quickly. Even Flaubert couldn't lift John the Baptist to the same level as Frédéric Moreau, and a Michener or Vidal, or worse yet a James Tully or Sarah Booth Conroy, seems irredeemably presumptuous. History originally comes from story -- the rushed and slanted newspaper report, the misremembered self-serving memoir -- and if I'm going to give up the illusion of certainty, I might as well just return to those primary sources. Their half-truths will most likely provide more surprises than a contemporary fiction writer's could.

You could easily argue with that opinion. Me, I just hold it. And it was with no small confusion that, in the dazzle of my first reading of Sister Noon, I looked down and found it still there in my unattended hands, a commute-worn hat from which a table-filling bouquet had been produced with a show of perfect ease.

Well, one of the critic's tasks is to figure out how the trick's done. It doesn't begin to make the magic easy, but it's what we do. And after a few years of slow-mo-ing through Fowler's performance, I think I might've done it.

Rather than confusing gossip and slander with knowledge, Sister Noon eyes that confusion's source and the hunger that feeds it. Its hero isn't the ascertained celebrity, but the half-reluctant, half-fascinated hanger-on. Its plot isn't a schematic rise to power and fall into disgrace, but a journey into the sucking bog of schemas and back out again.

With "poor, fanciful, inconsequential little Lizzie," we learn how one's unattended, unkempt life becomes structured into narrative by a brush with celebrity, or a dream of celebrity, or a memory of celebrity. Suddenly that's what people know about you, that's what people think about you, that's what they want to hear, that's what you want to tell them. You find yourself with a story, even if it's a pale distorted reflection of someone else's story, even if that story itself distorts the celebrity's own unattended, unkempt life.

As docudrama's smugness resembles the lower forms of biographer or journalist, Fowler's fond respect resembles The Quest for Corvo. What's offensive about those other ginks is their wilful, even spiteful, ignorance. The finer stuff of Fowler and Symons gracefully incorporates its own limitations. Symons begins his biography not with an unpromising birth but with the author's curiosity, and ends it not with an overdue death but with the author's satisfaction. In Sister Noon's first chapter, the protagonist is brought into the circle of the most infamous name of her time and place; in its last, they definitively separate, and, satisfyingly, that's the only thing made definitive.

Fowler's choice of protagonist neatly solves another generic problem as well, that being how to convey the alienness of another time or culture with the techniques of realistic fiction. If reader identification takes for granted a shared notion of what's natural, how can what's "natural" become an issue? Of course, this is also a foundational problem for science fiction, and in both genres a frequent solution is to make the novel's protagonist a first-time visitor to the novel's setting. Fowler instead leverages the insight that alienation from one's mundane surroundings is a familiar shared experience (albeit not one that's necessarily taken for granted). Lizzie Hayes exhibits the same dully baffled irritability towards spiritualism, white slavery, and the Doom Sealers that I feel towards Burning Man, multi-player shooters, or the Great Anthrax Scare. We all occasionally find ourselves stranded on Mars or in a suburb of Carthage.

The ambiguous and disputatious sources of history aren't different in kind from those of the present. To resolve them is to falsify not just "what really happened" but also "what really happens."

And we readers, gossipers, hangers-on make up part of "what really happens," unattended though we might be even by ourselves. A close look at an entrepreneurial multi-millionaire may confirm our unconfessed contentment in the tweedy middle class. Long-standing acquaintance with a successful author may reduce the shame-facedness with which we prefer self-publishing. On the sinister hand, our growing identification with a target of scandal may weaken our own restraints: having seen the worst, the fears that hemmed us in seem tawdry things, low-grade cotton rotted and easily torn.

There's a reason the fiction was put into this historical fiction. Lizzie Hayes isn't merely a passive conductor, capacitor, or resistor of the social current. When she has reached this realization -- or rather more actively has realized it, in the most humane and engaged way imaginable -- the tension between perfectly known fiction and permanently unknowable history is released, and the characters are set spinning out of the name star's orbit, from the documented fantastic to the unlimited mundane, cycling around once, four years later, to be glimpsed in the novel's first paragraph, and then (re-)lost to view.

Not that Lizzie Hayes would genuinely vanish, much as she might like to. Duties, if not heavens, forbid. She and her new-found (or rather more actively new-founded) family drive away, quite as material as they ever were, into what would appear to be a most distinctive narrative of their own.

But that's another story.

. . .

A comment on "Some Versions of Mock-Pastoral, Part I"

2) It turns out there are fairies, men and women can't be friends, one big speech can save the nation and win your love, a guy in tights will protect us from a guy in tights, Republican retards do make the best philosophers, love does conquer all, Charlie Kaufman must throw in the car chase, chivalry is not dead, Jesus loves screwball nuns, the highest corn stalk successfully blinds the rogue elephant, the right team always wins the big game, Ed Wood did produce a hit....

Yes, we knew it already; we tell it that way ourselves. Fools are almost always vindicated. Even before the pressures of the producer's pitch and the ad campaign, it's so much the fastest way home. Narrative closure is where any fiction, no matter how ostentatiously naturalistic in other ways, stops mimicking reality: where we more-or-less forcefully, more-or-less idiotically, shut our eyes to make the world go away. Intelligence complicates plots; only stupidity can end them.

That fraudulence stings less when there's more to it. Buster Keaton didn't just play a holy fool: he played a holy fool who performed phenomenally graceful and painful stunts.

Even so the formula does get tiresome. (They save the world and next month the world needs saving! They die tragically and then they come back! Oh, fermez la porte, s'il vous plait!) And so does living with the results of all this fool-flattery.

One way to de-simplify is to display those results: Don Quixote riding the bomb that'll trigger the Doomsday Device, hee-yah!; or Troilus and Cressida, even: A Trojan Ending. That's intellectually respectable, but there's something unlovably smug and stand-offish about it, with a whiff of titillation-and-punishment hypocrisy. Despite its manifest inferiority as a script, I find the ending of Elektra: Assassin far more satisfying than that of Watchmen (and the happy endings of The Sentimental Education and The Temptation of Saint Anthony far more satisfying than Madame Bovary's catastrophe).

Another way is to allow the dolts their rightness while still rewarding the clever, who, after all, only need to get stupid while negotiating the finale. Most of my favorite romantic comedies might be described that way; see also the aplomb with which Bill Murray redirects his cynicism from gullible students to the gods themselves in Ghostbusters.

Responses

Other possibilities include stepping in to remove the sophisticate before the third act, either by train (Metropolitan) or suicide (Primary Colors).

Postscript: See also the finale of Prince Prigio.

. . .

[OE. framian to be helpful or profitable, to make progress]

Critics and teachers try to explain the frame tale as if it was somehow for the reader's comfort.

In fact, readers are just as happy without it. When they retell, they extract the "real" story's plot and discard the shell -- their own frame as reteller is sufficient, thanks. All a movie adaptation typically needs, if anything, is a title sequence of flipped creamy heavy-stock pages. On the best seller's enticing cover, "As Told To" is kept in small print if it's in print at all.

No, a frame benefits the builder. Construction starts more smoothly with explicit boundaries set and with the burden of justification deferred.

Experiment yourself. Make up a story aloud. Then try starting it with "The other day this guy at work told me". Or pretend it's a folk tale or a translation. See how much easier that was?

Try singing straight out:

"A woman's a two-face: a worrisome thing who'll leave me to sing the blues in the night."
Feels kinda stupid, don't it?

Responses

Blackface, like any dangerous modality, requires more art than straight delivery. Arlen's ethnic superiority tickling the ivories right alongside his gleaming cuff links. "America The Beautiful" versus "This Land Is Your Land". I heard Janis Joplin sing "Go Down Moses" one time, very early on. It was electrifying precisely to the degree it was untheatric. Cross-modality but genuine grief and hope. Arlen's just cooning around.

Comparisons are odious. But if you gotta assign points, my understanding was that Harold Arlen wrote the tune and Johnny Mercer wrote the words (and sang it with, you're right, not a lot of oomph).

Mercer was a clever guy, but my own favorite mainstream 1940s pop blackface-without-makeup singer-songwriter is Hoagy Carmichael, who at his worst borders Mick Jagger territory. Hard to resist Hoagy, though that affected accent sometimes makes me want to try, and though I guess Fats Waller managed it.

There was an animated cartoon, a buzzard, he was flying along and singing: "Ah'm a bringin home a baby bumble bee, ba doop ba doop, ba doop-a-doop a-doop."
I can't hear "Blues In The Night" without thinking of it.

The "Arkansas Traveller" lyric you're reaching for goes, as I remember:

I'm bringin' home a baby bumblebee.
Won't my mama be so proud of me?

The name of the buzzard was (depending on whether you talk to Mama, Bugs Bunny, or Bob Clampett) Killer, Beaky, or the Snerd Bird. I don't think of him when I hear "Blues in the Night," but I do think of him an awful lot.

UPDATE: My readers are a superior (or at least select) bunch, and the initial anonymous responder tones down with great grace:

My apologies to Howard Arlen and his heirs and afficionados. I saw this thing on PBS? Where Al Jolson was trying to justify his "Mammy" schtick? Then the screen started doing this low-light-level throb, I started getting sleepy...

Well said. Just try to imagine what PBS would make of any of us, and imagine the conclusions viewers would draw.... (N.B.: I am much taller in person.)

. . .

Do you think that you could make it with Frankenstein?

A question at the end of one of Jeff VanderMeer's recent posts has been nagging at me -- "Do writers of experimental fiction need to prove they can tell a good story before they start experimenting?"
- Mumpsimus

Conclusions elude us. It could be there are none to be drawn without distortion.

  1. The laws are ambiguous and the judges are prejudiced.

    Matthew Cheney and I both seek out the tang of the unexpected problem; we welcome obstacle. And so, faced with increased experimentation, we're likely to tilt our camera eye to make a narrative of progress where others may tilt a decline. Whenever Joyce published, he lost a former supporter. Gardner Dozois, among others, regrets the "squandered promise" of Samuel R. Delany's maturity. And I'm sure there are some who wish M. John Harrison had never put Viriconium through its literary retcon.

  2. Outside a historical context, terms like "craft", "good story", and "experimental" are little more than Whiggish fertilizer.

    Nothing I've read in the past few years can compare with the experimentation of Tom Jones or Wuthering Heights, but we don't see Mark Amerika giving them props. Me, I don't think Beckett ever again wrote anything as brain-droppingly new as Watt; I think of his last thirty years as laying down a very good groove and think of John Barth's later career as safe shtick. Make Barth as hard to find as Barbara Comyns or Bob Brown and I'll reconsider.

  3. It's chancy to generalize about particularities.

    Was Orlando more or less experimental than To the Lighthouse? How about positioned between To the Lighthouse and The Waves?

    Flaubert started out with wildly uncontrolled blurts of fantasy. Were those stabs in the murk less or more experimental than Madame Bovary? Was Salammbo less experimental than The Temptation of St. Anthony? Bouvard & Pécuchet?

  4. What seems blandly normal or tediously artificial to the reader may have been a coltish celebration of new skills for the author.

    If Melville chafed against the limitations of the autobiographical sea story while writing Typee, it doesn't show. The sincerity of Modernist poets' juvenilia is hardly its besetting problem.

  5. In conscious transitions from "expected" to "peculiar", what matters may be the sheepskin, not the education.

    That is, the trigger is being granted permission to experiment, either from the publishing industry or oneself. If you write to make a living, there may not be much of a distinction. The Glass Key wouldn't have been Hammett's first publication, if only because he couldn't have afforded it.

    The most startling such transformation I've personally witnessed was at Clarion 1993, when a workshop member who'd slaved over unconvincing Analog filler realized that such an apprenticeship wasn't required, and suddenly began producing beautifully polished and balanced works of ambiguous speculation. (Like most good artists, he seems to have eventually decided that artmaking wasn't worth the effort, but that doesn't dim the thrill of witness.)

Responses

Discussion continues at Mumpsimus, at Reading Experience (with a clumsy, verbose response of my own), and at Splinters.

And does Dan Green's hospitality know no limits?— still more at the Reading Experience.

Update: Dan weeded and discarded his initial post in 2006. Here was my comment at the time:

I'm prone to note resemblances, which is fine, but then rhetoric sometimes tempts me to go too far. So I might talk about a "tradition" of presumptuous lyric, and in that jumble together some unaristocratic Tudors, some Restoration satirists, Keats, the Objectivists, the New York School, and Language poets. I suppose somewhat the same impulse determines Oxonian anthologies and encourages such after-the-fact categories as film noir, nationalist canons across the world, and women's writing.

In your brief overview of "experimental writing," there's a temporal gap between "Tristram Shandy" and James Joyce's career. Do any books fit in there? I ask partly because I think I'd like them, and partly because explicit experimentation *as a tradition* would seem to require a firmly established norm, and I'm not sure when the particular narrative conventions being fought became firmly established, or how long it took before insurgent tactics became narrative conventions in their own right.

I also wonder about the conceptual gap between a single book and a career. "Tristram Shandy" stays just as wonderful but becomes slightly less startling positioned between the "Sermons of Mr. Yorick" and "A Sentimental Journey"; Sterne-as-career becomes slightly less startling positioned between the polyphonic digressions of sixteenth and seventeenth century English fiction and the sentimental, didactic, and political novels of the late eighteenth and early nineteenth century. Even before an "oppositional" tactic becomes group property, it may be a personal habit. Is a writer who attempts something drastically new in each new publication only as "experimentalist" (to use Steve Mitchelmore's word) as a writer who challenges narrative convention the same way every time? (I'm not denigrating the latter, by the way; I believe in the power of the groove.)

Conversely, early Joyceans proved that it was easy to miss the formal ambitions of "Dubliners" and "Portrait" without "Ulysses" and "Finnegans Wake" to foment suspicion. One might read "Moby Dick" as a (failed) conventional narrative, but can one say the same of "The Confidence Man"? 150 years after "Madame Bovary", we might take it as conventional, but I believe Kenner is right to draw Joyce's artistic ambitions directly from Flaubert: "A Simple Story" to "Dubliners", "Sentimental Education" to "Portrait", "Temptation of St. Anthony" to the later episodes of "Ulysses", "Bouvard and Pecuchet" to Leopold Bloom -- and, on a different trail, to Beckett's "Mercier and Camier".

And there's that final gap between the isolated heroic figures of the modern canon and a contemporary American school of writers who share some publishers, make livings in academia, and swap blurbs, bridged by the pulp-sprung and compulsive Burroughs.

Well, I'm afraid all this gap-minding sounds both more detached and more combative than my feelings justify. You yourself call it a "pragmatic" distinction. I suppose my uneasiness truly comes down to worrying just what use our pragmatisms get put to. Provisional categorization can work as a portal of discovery. (Jerome McGann's championing William Morris as the first Modernist is a delightful example of what can be done with hindsight genre.) But windows require walls, and human beings do seem to love their wall-building. Once we have our categories up, it may be hard to see around them. If I'm not mistaken, a similar uneasiness stirred your "Don't Change" entry of September 22.

I suppose I sound as if I'm trying to eradicate distinctions, when what I'd like is to make them finer.

. . .

Salomé, What She Watched

(Written for The Valve)

Fenitschka and Deviations by Lou Andreas-Salomé, tr. Dorothee Einstein Krahn
The Human Family (Menschenkinder) by Lou Andreas-Salomé, tr. Raleigh Whitinger
Looking Back by Lou Andreas-Salomé, ed. Ernst Pfeiffer, tr. Breon Mitchell

Two-and-a-half stories into Menschenkinder (timidly Englished as "The Human Family") and I'm pleasantly surprised by their oblique viewpoints, the suggestive opacity of their sweeping gestures. By eight-and-a-half, my cracked fingernails are pawing the door while I whimper for air, air....

The last book to dose me like this was No. 111 2.7.93-10.20.96 by Kenneth Goldsmith, three years' worth of noticed utterances ("found texts" understates its inclusiveness), sorted alphabetically and by number of syllables. Against the author's advice, I read it front to back. (Not at one sitting, but still.)

For all I remember, two-thirds of the way through someone in Goldsmith's circle discovered true love and a revitalizing formula for social progressivism. If so, the next two hundred pages of advertising, trash-talk, and D. H. Lawrence warhorse scribbled them away. Goldsmith's big white volume flattens all layers of a life that seems not to have been unduly dull, solitary, or settled into solid shallowness as far as the mechanically-aided eye can reach. No there there, or anywhere else either; no under; no outside. Nothing but an unbreakable but by no means scuff-free surface. The discursive universe as the wrong side of a jigsaw puzzle.

I wouldn't imply any aesthetic affinity between Lou Andreas-Salomé and Kenneth Goldsmith. But the horror conveyed by both is an emergent formal property whereby the self-traced boundaries of a free-range spirit are established as crushingly limited.

Twelve stories by Andreas-Salomé have been translated into English. All were originally published in 1898 and 1899 and probably written in the same two-year burst. About half the stories have a male point-of-view; about half a female; some split down the middle. Although some include long letters or soliloquies, only one is in the first person. Elements and settings and character types and plotlines appear and re-appear -- trains, hospitals, mountain walks, hotels; doctors, artists; older men, slightly less older men; seductions, spellbindings, disillusionments, untrustworthy re-affirmations -- in never exactly replicated configurations, with just enough variation to convince us that a solution won't be found.

The puzzle is constant: There's a singularly intelligent and beautiful woman. (The traits are inseparable in these stories.) And all human value is placed in slavish idealization of the (almost always) gender-defined Other. Whether it's a case of male worshipping female, female worshipping male, or (rarer, dismissable) female worshipping female, such idealization is shown as irresistible but unmaintainable, thrashing between the fetishized parties —"I must sacrifice all for you!" "No, I must sacrifice all for you!"— and usually snapped by a sexual outburst.

(I confess that two of the twelve stories do offer "solutions", but both are so absurdly inept that the effect's more revolting than reassuring. According to one, a woman [or Woman] finds fulfillment only in childbirth; transparently the appeal of the theorized child is its strictly theoretical state as inseparable Other. Otherwise, the stories show far less interest in children or mothers than in fathers. Mothers aren't bright, or ambitious, or heroic. At most, they're embarrassing. And one such mother embarrassingly points out the egotism of the second "solution" offered: wait until the imperfect Other is safely dead, produce an idealized portrait, and rest content in mutual [but not consensual] redemption.)

As an exercise in spiritual discipline, I'd wanted to avoid gossip while reading Andreas-Salomé's fiction. But these exercises in objective solipsism are so clearly trying to work something out that my resolve crumbled, and I found, in the autobiographical essays she wrote more than thirty years later:

In the dark of night I didn't just tell God what had happened to me that day—I also told him entire stories, in a spirit of generosity, without being asked. These stories had a special point. They were born of the necessity to provide God with the entire world which paralleled our secret one, since my special relationship to him seemed to divert my attention from the real world, rather than making me feel more at home in it. So it was no accident that I chose the material for my stories from my daily encounters with people, animals, or objects. The fairy-tale side of life hardly needed to be emphasized—the fact that God was my audience provided adequately for that. My sole concern was to present a convincing picture of reality. Of course I could hardly tell God something he didn't already know, yet it was precisely this that ensured the factual nature of the story I was telling, which was why I would begin each story, with no small degree of self-satisfaction, with the phrase:

as you know

[After losing faith in God] I continued to tell my stories before I fell asleep. As before, I took them from simple sources, encounters and events in my daily life, although they had suffered a decisive reversal as well, since the listener was gone. No matter how hard I tried to embellish them, to guide their destiny along a better path, they too disappeared among the shadows. [...] For that matter, was I even sure that they were true, since I had ceased to receive them and pass them on with the confident words "as you know"? They became a cause of unconfessed anxiety for me. It was as if I were thrusting them, unprotected, into the uncertainties of the very life from which I had drawn them as impressions in the first place. I recall a nightmare—one which was often retold to me—which occurred during an attack of the measles, when I was in a high fever. In it I saw a multitude of characters from my stories whom I had abandoned without food or shelter. No one else could tell them apart, there was no way to bring them home from wherever they were in their perplexing journey, to return them to that protective custody in which I imagined them all securely resting—all of them, in their thousandfold individuality, constantly remultiplying until there was not a single speck of the world which had not found its way home to God. It was probably this notion which also caused me to relate quite different external impressions to one another. [...] It was as if they belonged together from the first. This remained the case even when the sum total of such impressions gradually began to overload my memory, so that I began to use threads, or knots, or catchwords to orient myself within the ever more densely woven tapestry. (Perhaps something of this habit carried over into later life when I began to write short stories; they were temporary aids in getting at something which was after all a much larger coherent whole, something which could not be expressed in them, so that they remained at best makeshift.)

And later:

[...] nothing can affect the significance of any thing, neither murder, nor destruction, unless it be to fail to show this final reverence to the weight of its existence, which it shares with us, for, at the same time, it is us. In saying this I've let slip the word in which one may well be inclined to see the spiritual residue of my early relationship to God. For it is true that throughout my life no desire has been more instinctive in me than that of showing reverence—as if all further relationships to persons or things could come only after this initial act.

It's easy enough to guess why such a person would have felt attracted to Freudian methods.

To return to her fiction, for those who'd prefer not to commit themselves, one Menschenkinder story is online. The books' most representative highlights might be "Maidens' Roundelay" (with a full double cycle of other-idealization and self-disillusion) and "Fenitschka" (which begins with near date-rape and ends years later in an ambiguously liberating act of forced voyeurism).

Having suffered the effects of full committal, I'm inclined to favor the two least representative stories. "On Their Way" is a black comedy of criss-crossed class incomprehension in which a young couple fail at romantic suicide but succeed at idiotic boyslaughter. "At One, Again, with Nature" stares aghast at the iciest of Andreas-Salomé's girl geniuses. Inventing California-style boutique organic produce, mocking country cousin and sugar daddy, romping with colts, kicking poor pregnant servants out in disgust, and anticipating the final solution of Ethan Edwards, Irene von Geyern escorts us out of the sequence into a harsh and welcome winter's wind.

These two don't solve the problem of Andreas-Salomé, but they do solve the problem of Story: an Other given the small mercy of The End.

Responses

peli grietzer asks:

How come all these large scale radical textual experiments operating by a linguistic rather than representational principal (No. 11...., Sunset Debris, etc.) end up being lauded for their sense of suffocation, melancholy and quiet hysteria?

I also like them for this very reason, it's just that it seems like all technically referential works guided by a non-mimetic logic end up being prized for the same emotional effect, that doesn't seem to have much to do with the actual specific non-mimetic logic they operate by.

I've noticed a similar trend among reviewers. (It may be just the default establishment mood in which to take any odd and encompassing work: the earliest defenders of James Joyce similarly treated him as a conduit of Waste-Land-ish moping.) But, for me, one of the meta-interesting things about radical textual experimenters, as with twelve-tone composers or free jazz musicians or three-chord garage bands, is that they don't all sound alike. Trying to articulate how that magic's managed may be among the most amusing challenges available to contemporary critics. Can we do any better than "voice"?

For the record, I wanna say that all of Silliman's work (including Sunset Debris) leaves me pretty cheerful, and the same goes for Gertrude Stein and Jackson Mac Low. On the other hand, the carefully crafted movies of Jean Eustache distill the bitterness of human limits into something finer than either Goldsmith (intentionally) or Andreas-Salomé (unintentionally) do by "accident".

For that matter, Goldsmith himself credits the development of his technique (and this message) to the influence of Andy Warhol, whose movies and fine art don't really affect me that way although maybe the Factory novel, a, would if I could stand reading it.

peli responds:

What I was really reminded of by your description of "No. 11.... "is the experience of watching season 2 of, let's say, Buffy when you're already a veteran of all seasons + Angel. Know what I mean? Knowing the resolving of the big point of narrative interest which just took place is going to be trivial from the perspective of five seasons later, not by a grand artistic architecture utilizing this trivialization, but just by everything moving on to different narrative interests that negate earlier ones (Oz and Willow being great great greatest love, later Willow and Tara being far more great greater love).

The obvious analogy with life actually devalues the poignancy of this, I think : in art we expect climaxes not to be retconned away meaninglessly, so it hurts more.

. . .

Liar, liar, pants on ice

Starting from friend G. T., most of my favorite unreliable narrators are free indirect discoursers -- well, I suppose that's almost the defining lesson of free indirect discourse, for readers, anyway, that one's access to narrative truth is tainted by person.

What Villette's Lucy Snowe teaches by first-person example is that one's access to personal truth is tainted by narrative. Intent on rectitude but powerless in worldly terms, Snowe should, like the Duchess of Newcastle, be able at least to discover freedom in words. Instead, cold feet bleeding at a crux, she prevaricates, taking a three-decker plunge into both non serviam self-damnation and redemptive acceptance -- Lear's end retold in a whispered snarl.

"First Chill then Stupor then the letting go —" Best say nothing; next best, speak truthfully and next? It depends how glibly we reached the question mark. As fond as I am of Beckett, the more encumbered Brontë carries more weight. In sweeping the bullshit away, he clarified the problem but changed it, too, to something with a solution clean enough to be called formula. The latter-day avant-lite, let-go from the get-go, with white lies conveniently stacked in their grocer's freezer, give no second thought to fudging the story for the sake of a gag.

Responses

Peli Grietzer restates (

Wait, by unreliable narrators in free indirect discourse, do you mean when the indirect quoter is unreliable, or when the indirect quotee is unreliable?

) a question first raised by Bouvard & Pécuchet and first answered by Ulysses: "Yes".

Although, as Peli pointed out in later email, there have been other answerers since:

Actually Ulysses is small change here: The absolutely unquestionably utterly most insane thing in the history of free indirect discourse is on The Man Without Qualities, page 97, first two sentences of second paragraph (the 1996 Random House edition). In fact, I think it might be the craziest mind-fuck in the history of literature, Sterne and Nabokov and Grillet and Borges not even managing to give it a fight.

A friendly anonymous proofreader reminds me:

Snowe

Of course, properly we should be speaking of her under a different name altogether....

. . .

The Lie of the Last Minstrel

Factual Fictions: The Origins of the English Novel
by Lennard J. Davis, 1983 (2nd ed. 1996)

Both Tom Jones's hero and genre were mysterious bastards. Unlike the hero, the genre's parentage remained open to question, and, in '83, Davis ambitiously aimed to prune classical romances (and even the mock-heroic anti-romance) from its family tree.

In place of that noble lineage, he proposed a three-act structure:

  1. Set-Up: The literate public happily consumes crime-and-punishment ballads and monstrous-birth broadsheets which claim without scruples to be both true and improved, wondrously new yet mostly recycled.
  2. Crisis: Economic competition, diversified political power, and new libel laws forcefully direct the attention of writers and readers towards previously unproblematic distinctions like timely/timeless and provable/interesting....
  3. Resolution: ... which reconfigure more stably in (verifiable yet evanescent) journalism and (undeniable yet false) realism.

In his own storytelling, Davis sometimes stumbled -- most painfully, he blew the punchline -- and I wished he'd included a chapter on "secret histories", whose length, legal issues, and formatting (memoirs, correspondence, oddly well-informed third-person narrators) all seem to make them at least as germane as ballads. Most of all, without broad quantitative analysis to back them up, such ventures can always be suspected of cherry-picking the evidence.

But I'm an irresponsibly speculative collagist myself, and these cherries are delicious. I already understood how framing narratives relieve pressure, how they establish both authenticity and deniability: "I don't know, but I been told." But I hadn't realized how often pre-fictional writers had felt the need for such relief. Not having read a life of Daniel Defoe, I hadn't known how brazenly he forged even his own letters. And, speaking of letters, I hadn't read Samuel Richardson's flip-flops on the question of his real-world sources.

The sheer number of examples convinces us that something was shifting uncomfortably, tangled in the sheets of the zeitgeist. How else explain, across decades and forms and class boundaries, this increasingly vexed compulsion to face the old question head on, like a custard pie?

And by the end of the book, we still haven't found fully satisfying answers; the process continues. Recently and orally, for example, our impulse to simultaneously avow and disavow narrative discovered a felicitous formula in the adverbial interjections "like" and "all like".

We haven't even fully agreed to accept the terms of the problem. Remember those quaint easy-going characters in Lennard Davis's Act I? Believe it or not, living fossils of unperplexed truthiness roamed the Lost World of rural America during our lifetimes! My own grandmother sought out no journalism and no novels; she read only True Confessions and watched only her "stories" -- that is, soap operas, "just like real life" they were -- another quotidian reconfiguration.

* * *

All novelists descend from Epimenides.

Well, OK, if you want to get technical about it, so do novel readers ("All Cretans know my belief is false"), and so does everyone else.

That's the problem with getting technical. (Or, Why I Am Not an Analytic Philosopher, Again.)

But what about memory retrieval??
In contrast to common past-future activity in the left hippocampus, the right hippocampus was differentially recruited by future event construction. This finding is notable, not only because others report right hippocampal activity to be common to both past and future events (Okuda et al., 2003) but also because it is surprising that future events engage a structure more than the very task it is thought to be crucial for: retrieval of past autobiographical events....
It does seem strange that no regions were more active for memory than for imagination. So memory doesn't differ from fiction? At the very least, it didn't result in greater brain activity than fiction, not in this particular study (an important point).
There was no evidence of any regions engaged uniquely by past events, not only in the PFC but across the entire brain. This outcome was unexpected in light of previous results (Okuda et al., 2003). Moreover, regions mediating retrieval processes (e.g., cue-specification, Fletcher et al., 1998) such right ventrolateral PFC (e.g., BA 47) should be engaged by a pure retrieval task (i.e., past events) more than a generation task (i.e., future events). More surprising was the finding that right BA47 showed more activity for future than past events, and that past events did not engage this region significantly more than control tasks.
- The Neurocritic, citing
Addis DR, Wong AT, Schacter DL. (2007)

(I should admit, even though that re-citation honestly conveys what's on my mind -- I happened to read it while writing this, and so there it is -- it doesn't honestly convey what I consider a strong argument. Like The Neurocritic, I'm skeptical about the functional neuroimaging fad; it seems too much like listening to a heart pound and deducing that's where emotion comes from. Reaching just a bit farther, then, from my keyboard to my bookshelf....)

For researchers in the cognitive sciences, a narrative works like a narrative, whether fictional or not:

... with respect to the cognitive activities of readers, the experience of narratives is largely unaffected by their announced correspondence with reality. [...] This is exactly why readers need not learn any new "rules" (in Searle's sense) to experience language in narrative worlds: the informatives are well formed, and readers can treat them as such.
- Richard J. Gerrig, Experiencing Narrative Worlds

According to Davis, modern mainstream genres partly result from legal changes which forced propositionally ambiguous narratives to face courtroom standards of truth. I didn't find his evidence completely convincing, but there's something that felt right about his tale.

A narrative is not a proposition. When narrative is brought into a courtroom, interrogation attempts to smash it into propositional pieces.

But any hapless intellectual who's made a genuine effort to avoid perjury can testify how well that works. We don't normally judge narratives: we participate in them, even if only as what Gerrig calls (following H. H. Clark) a side-participant. If we restricted ourselves to "deciding to tell a lie" or "trying to tell the truth," there wouldn't be much discourse left. Depending on personal taste, you may consider that a worthwhile outcome; nevertheless, you have to admit it's not the outcome we have.

We've been bred in the meat to notice the Recognizable and the Wondrous. The True and the False are cultural afterthoughts: easily shaken off by some, a maddening itch for others, hard to pin down, and a pleasure to lay aside:

At the tone, it will not be midnight. In today's weather, it was not raining.

Responses

January 2009: Since I haven't found anyplace better to note it, I'll note here that the best academic book I read in 2008 (unless Victor Klemperer's The Language of the Third Reich counts) was Reading Fictions, 1660-1740: Deception in English Literary and Political Culture, by Kate Loveman, whose metanarrative convincingly allows for (and relies on) pre-"novel" hoaxes and satires while not erasing generic distinctions.

. . .

Nothing Personal, 6

Pop music makes a horrifically misleading comparison point. English song and poetry in English have diverged too much since Campion's day, and as much as I love the lyrics of Chuck Berry, Lord Melody, Smokey Robinson, Tom Verlaine, the Coup, Mos Def, and Slug, none would fit a little magazine or chapbook.

A closer demotic relative of contemporary lyric is stand-up comedy, with its definitional dictions, its canonical revolutionaries, and its School of Quietude.... They're different forms with different capabilities, but there are examples from both who'd fit either.

Responses

Ask Ron Silliman about the Russian edition yoking him with Louis Zukofsky and Woody Allen.
"a something along these veins .."
And while we have for decades been told that the lyrics of Jim Morrison, Bob Dylan, Patti Smith et al (rarely however is this said about Slug) can be read without music as standalone poems, the same is never claimed for the work of Richard Pryor or Eddie Izzard.

Ah, but Steven Wright...?

Peli writes:

First thing they teach you in Narratology school is in the face of a literary theory scrutinize the selection of evidence. I counter with: Brian Eno, John Cale, 'Berlin' David Bowie, Early and mid Beck, Destroyer, Jonathan Richman, Pavement, MF Doom, Li'l Wayne, Ghostface Killah, and a bunch of Israeli stuff. Not that their lyrics are in anyway better or more interesting than your batch, just that they're all both a) very good, b) modern poetry is a somewhat relevant frame of reference to their work either historically or theoretically or both.

. . .

A universal narratology of narrating

This helpful comics-construction chart from Sydney Padua also describes the process behind most fiction, most screenplays, my own essays, and some poems and songs.

Left out of Padua's graphic (but described in its walltext) is the extent to which all nodes are fed by STUMBLE-UPON. And of course DON'T KNOW HOW IT ENDS is generally resolved by the power trio of HUBRIS, FATIGUE, & DISGUST.

Responses

wow

. . .

If on a springtime's blog a blatherer...

I've been thinking about two types of metafiction, or at least metafictional moments: the type we're all too familiar with in recent years, where the metafiction is the point, and the (what to call it?) target fiction is in its service, and another more common, more exhilarating type (as I have come to think), where metafictional moments are actually in service of the story itself....
- balaustion

As Balaustion's examples suggest, there is a history, a lifespan, to apparently unmediated narrative or lyric. Thackeray and Trollope notoriously lack that goal, Byron (and then Pushkin) contested its triumph, and by the time we reach Bouvard & Pécuchet and Huysmans it's devouring itself. The perplexing disruptions of Ulysses simmered down into a signature sauce for Beckett and O'Brien, and then desiccated into spice jars for postmodern fabulism and swingin'-sixties movies. If Nabokov is a chess problem and Perec is a jigsaw puzzle, John Barth and Robert Coover are search-a-word.

Even more specifically, the desire for unmediated narrative is linked to genre -- Mark Twain and William Dean Howells were contemporaries, after all -- and therefore self-congratulatory metafictionality is also linked to genre. When, back in 1976 or so, I sought goods fresher than those provisioned by the oxymoronic experimental mainstream, I found them labeled as science fiction or fantasy. And they included a generally more relaxed use of metafictionality. Not Dick, of course; Dick is Barth haloed by sweat-drops. But Disch and Russ in the 1970s, and then in the 1980s and so on M. John Harrison and Fowler and Emshwiller and Womack and so on.

What I really wanted to blather about, though, was a rare third type of metafiction, neither the recircling of an already-overworked puzzle, nor the matter-of-fact surfacing of one discursive mode in a cove of splishy-splashy discourse, but instead doing something -- an emotionally engaged and affectively effective metafictionality. I likely first encountered that possibility in Warner Bros. cartoons and Hans Christian Andersen. But a lot of Updike passed under the bridge before I reached Delany's Dhalgren: a unique three-decker in which every tool of realistic fiction attempts to portray structuralism from within. It's like Zola as Fabulist, or Sergei Bondarchuk's seven-hour adaptation of an original story by Frank Tashlin. And about fifteen years later, Crowley's Engine Summer delivered a similarly visceral charge by embodying romantic loss in a closed roman.

Responses

Josh Lukin differs:

Honestly, I think the sweaty Barth is Gaiman. Dick is, I dunno, Philip Rieff with a Crawdaddy subscription? Tough one.

And I think Gaiman is Mary-and-Charles-Lamb-going-to-a-Police-concert, so go figure.

. . .

Realism : An Anthology

An appeal to an artwork's realism, its roots in reality, is an appeal not to its accuracy at registering facts but to the depth of its claim upon us. The claim is not, 'this is the real world', but rather, 'this is your world'.
- Josh Kortbein, josh blog

Career tip: flatter your readers by telling them they're "made of stories".

Some days I wake up sick to death of language.

As for fiction.

99.999999% of the "conversation" is rhetoric so bad you don't know whether to choke or laugh.

You look around in despair for some state that doesn't include the use of language.

"Made of stories." Bland, meaningless crap.

Noncommunicative actions, impossible to turn into language & thus not subject to constant mild but slimy abuse. Where are they?

- M. John Harrison, Twitter

“Oh, I’ve said, ‘You can't describe it. You'd have to be there.’ But that’s my first wife telling her mother-in-law about the time we went to Persia. And that isn’t what I mean.”

Kid smiled back and wished he hadn’t.

It isn’t his moon I distrust so much, he thought, as it is that first wife in Persia.

- Samuel R. Delany, Dhalgren

Responses

That last can do double duty as our review of Gravity (2013).

. . .

How Musil Can Change Your Life!

Mixed feelings are more productive in fiction than in conversation. Even writers with definite or self-definitive prejudices will induce muddle in pursuit of a story. (And then, reversing the process, their biographers become disillusioned by the bigoted troll.) Those whose second thoughts resecond, rethird, refourth and so on to Reichian volume and density may be lured into the hunt merely by the blessed prospect of something captured.

You'd better bag the game, though. Otherwise all you've achieved is another unsatisfying conversation.

 

Copyright to contributed work and quoted correspondence remains with the original authors.
Public domain work remains in the public domain.
All other material: Copyright 2015 Ray Davis.