. . . M. John Harrison

. . .

Doug Asherman queries: "Here's something probably no one cares about: is there a conspiracy to hide all the good music from us? And the good books? Are people in the arts laughing at us for buying their crap? And if so, what can we do about it?"


Disclaimer: Ray Davis is a paid contributor to The Reader's Guide to Contemporary Authors.
Despite my familiarity with Salon, I was surprised by the Reader's Guide. It's like stooping to pick up what looks like an ordinary mediocre valise only to find that it's been packed full of mediocre lead bricks. According to the official mission statement, "We decided to let our contributors' enthusiasm and curiosity be our guides." These guides would seem to share one enthusiasm (the New York Times Book Review), one curiosity (what's on the front page of the New York Times Book Review?), and one path (to the trade paperback table at Borders).

I'm hardly a specialist in late-twentieth-century fiction, but if I'd wanted to provide a service to readers I would've found room for Joanna Russ, Patricia Highsmith, Chester Himes, Carol Emshwiller, Jack Womack, Bob Gluck, Barbara Comyns, Thomas Disch, Alexander Trocchi, Wendy Walker, M. John Harrison, Pat Califia, John Crowley, would've asked someone to tell me more about Patrick O'Brian and Dorothy Dunnett....

In fact the book includes only two authors that really matter to me: Karen Joy Fowler (given a short entry) and Samuel R. Delany (a short entry). Broadening my selection to writers I merely respect would add Elmore Leonard (a short entry) and Toni Morrison (a long entry, because she publishes in the mainstream genre) to the overlap.

And it's not because I haven't heard of the book's choices, which are beautifully (if unintentionally) parodied by John Updike's inset guide to "Timeless Novels about Loving," apparently repurposed from TV Guide:
"Madame Bovary by Gustave Flaubert - A young bourgeois wife seeks spiritual and sexual fulfillment away from the marital bed and runs grievously into debt."

"The Scarlet Letter by Nathaniel Hawthorne - Among the Puritan pioneers of Boston, a promising clergyman falls afoul of a dark-haired protofeminist and her wizardly older husband."

Yeah, my list is idiosyncratic. An idiosyncratic list is what you'd expect to get from a serious pleasure reader. Whereas the Guide is what you'd expect to get from a serious follower of publicity.

The depersonalized author selection ensembles beautifully with the reviewers' khaki prose. Dumbed down for ease of swallowing, it's the house style of virtually every free weekly paper: "authority" and "irreverence" and "wittiness" depilated of information or personality or humor. Journalistic blatherers used to flatter their readership by pretending it was full of good taste and morality while somehow simultaneously dumb as a stump. Nowadays the readership's still dumb as a stump yet somehow full of detached ironic wit.

Thus it makes sense that the only writer given full stylistic rein is "Pagliacci Dave" Eggers, who ends his celebrity roast of Kurt Vonnegut:

"So. Vonnegut is good. If you like books, and like to read them even if they are easy to read and frequently funny, you will like the work of Kurt Vonnegut, a writer. Also: He has a moustache."
Robert Benchley, move over! ... Or roll over. Do something, man! ... Guess he's dead.
Even John Clute, one of the most eccentric stylists ever to write a review, is sanded down to transparency here. Which is extra sad, since his argument against the tyranny of the "mainstream literary" genre lances so precisely the core of this book. Restricting yourself to mainstream fiction in the late twentieth century is like restricting yourself to heroic tragedy after 1650. The mainstream's just not where good writing is being done. Unless you crave watery flavorless writing.
How did this awful thing happen? One clue may be found in the opening of the entry for Angela Carter:
"Carter enjoyed little renown during her life, but after her death...."
"Little renown"? Say what? Angela Carter? The only way I can make sense of this is to translate it as "I wasn't assigned Carter in high school and I didn't read the New York Review of Books back then, but after I graduated...." An editor should've caught the gaffe, but since an editor wrote it, it was probably immune.
Someone on the staff did make time, though, to reduce my already undistinguished sentence:
"An early devotee of poststructuralist and feminist theory, Delany in 1979 began Return to Nevèrÿon, an archeological fantasy series that was to occupy him on and off through the 1980s."
to complete nonsense by deleting "in 1979." Maybe someone in the publishing business could clear this up for me: Are copyeditors paid more for adding lots of mistakes to a manuscript, like programmers whose productivity is measured by lines-of-code? Or do they fuck things up out of sheer gotta-leave-my-mark egotism, like programmers who work nights and weekends?
Another clue: from what I can gather, my topic was originally assigned to a friend of the editors who decided only when the deadline was nigh that it was too hard to finish the research. These aren't enthusiasts trying to communicate their enthusiasms: they're a clique trying to act like grown-ups.

From the mission statement:
"We encouraged our contributors to think of you, the reader, as an intelligent, interested friend or relative who'd just asked, 'So tell me about John Updike. What are his books like?'"
I find it unlikely that your auntie would be asking you such a thing unless she was checking how well you'd learned your lessons.
What we got here is not just smugness, but downright noisy celebration of shared limited knowledge. It's like that guy in the museum painstakingly explaining every perfectly obvious thing to his wife. It's like my geeky friends parroting TV shows. It's what they teach you to do in school. And it's what makes a successful journalism career. Critics are they who seek to enjoy, without incurring the Immense Debtorship for, a thought thunk.

See Also: Anyone who enjoys this crap should probably seek out the opinions of more journalists. Those intrigued by my crackpot theories can find them expressed more calmly in a 1998 response to Jonathan Lethem.

Ray Davis will be appearing with other contributors to The Reader's Guide to Contemporary Authors at A Clean Well-Lighted Place for Books in San Francisco, Thursday, September 14, 7:30pm.

. . .


David Auerbach writes (or rather wrote, eleven days ago -- I gotta improve my turnaround time!)

Your treatment of free will as being subordinate to the predictability issue is justified (there are articles out there maintaining that chaos theory proves the existence of free will), but I think there's some cultural significance to the free will issue that you've overlooked. Free will is mostly used in ethical and political contexts. You say that regardless of your free choice, you'll be held responsible for hitting the lamppost, but I think that's only 2/3 true. Given the 3 canonical reasons for the sentence awaiting you:

  1. Let's use him as an example so people stop hitting lampposts as much.
  2. Let's make sure he never hits lampposts again.
  3. Let's give him what he deserves for hitting that lamppost.
--the first is almost never used as justification except in cases of capital punishment, where it's generally acknowledged as fallacious anyway, the second is more common for petty crime than real crime, leaving the third as the dominant rationale in the justice system today. Which would be fine, except that desert really does rely on some notion of autonomous action, separable from environmental factors, in order not to fall apart. (There's a lot of hand-wringing that can go on here over deserts being assigned within/without a being, but it's all bean-counting.) Agency survives determinism, since it was you what hit the lamppost, but a sentence that doesn't fall into the category of determent doesn't.

(I know I'm taking a Sartre-like position that inconsistency is the worst of all possible sins, but hey, that was always true in the rarefied world of philosophy.)

So, if you follow determinism, people can be assigned responsibility for actions without having any moral desert for what follows from them, and I've never seen a convincing argument linking the two. But introduce free will and the world is suddenly a much fairer place. And it's not just coincidence that

(t1) Paul Allen deserves to have 50 billion dollars.
sounds a lot better than
(t2) Paul Allen should have 50 billion dollars.
People like Robert Nozick have always been careful to couch their moral pronouncements in the first form rather than the second, and with good reason. But it's only with the presumption of some sort of free will that the statements have any meaningful difference.

It's been a few years, but I recall that Rawls uses the same desert principles to defend his social justice system, and I've never understood why, because he doesn't seem to need them. ("The poor deserve to have a decent standard of living" vs. "The poor should have a decent standard of living.", e.g.) Maybe it makes his arguments more palatable to Confucianists.

So my main point of departure from you is that I think there is a very definite use for the concept of free will beyond religion, unfortunately. The best that can be said is that free will is a far more established concept for neocons to pin their hopes on than, say, substantive due process or strict constructionism. You're probably right on the irrelevancy of the concept in classical civilizations, but that's a question for Alasdair MacIntyre to answer.

Work calls, but free will is a nice distraction from matters of importance. (I note the triumphalist tone in that piece clashing nicely with utter despondency in the privacy entry.)

I actually don't hear much about the world being a fair place, so I'll skip that debate.

Only in special circumstances (which I'll get to in a bit) does the notion of "desert" rely on the assumption of free will in the deserver. Instead, "desert" relies on the existence (tacit or explicit) of some disher of deserts, and moreover assumes that the disher has free will. It's meaningless to discuss "what reward or punishment is deserved" if there's no possibility of a rewarder or punisher. Without such an agency, all we have is "what is" or (if you're feeling ambitious) "what is caused."

It's easy for me to say that "Carol Emshwiller's books deserve front page coverage by the New York Times Book Review, the Times Literary Supplement, and the New York Review of Books" not because I assume that Emshwiller's books have free will, but because it's easy for me to picture the fat-cat top-hatted stogie-chomping editors of those organs free-wilfully derelicting their duties, the cads.

On the other hand, I'm more likely to say that "I wish M. John Harrison's books were bestsellers" or "People should really be doing more to make M. John Harrison's books bestsellers" than to say that "M. John Harrison's books deserve to be bestsellers" (and even less that "M. John Harrison deserves to write bestsellers"), because the birth of a bestseller is far too confusing a process for my pretty little blond head to handle.

And what ho! here's monotheism again! A more general notion of desert (or, as the professors call it, justice) assumes (again, often tacitly) a more general notion of judge: a universal, omniscient, omnipotent, fair, righteous, and free agency. When we say, "I deserve to be happy," that's who we're appealing to; when we say "Those 6000 people didn't deserve to die" (or, conversely, "Those 6000 people deserved to die") that's who we're implicitly arguing with (or explicitly agreeing with). Similarly, when we (or more likely they) say that "Paul Allen deserves fifty billion dollars," they're imaginatively placing themselves in the role of that wise and benevolent deity called the Free (ha ha ha ha!) Market.

Consider instead the human agents of the employee benefits organization CIGNA who decided that Wilson H. Taylor deserved a $1,200,000 salary and a $3,500,000 bonus. It's hard to believe that they cared about Taylor's freedom of will any more than Microsoft cares about its programmers' freedom of will. On the contrary, I imagine that any business would prefer that its employees be as deterministic as possible. What they instead assume is their own right to hand such a large sum of money over to an individual -- and, particularly, to an individual who so resembles themselves.

The apparent justice of such judgments depends on a shared context; that is, ideally on a jury of one's peers (as opposed to a jury of the peerage -- e.g., Bush before the Republican Supreme Court as opposed to you-or-me before the Republican Supreme Court, or Steve Ballmer in front of Microsoft's board as opposed to you-or-me in front of Microsoft's board). When I bomb a building, whether I'm punished or praised depends on what context I share with judge and jury; when I do my job, whether I'm given a million-dollar bonus or laid off depends on the same.

Which finally brings us to the barely visible sliver of human existence in which the notion of "justice" and the notion of "free will" overlap. "Free will" (or "determinism") is experienced more often in introspective recollection than in action -- "It's my own fault" or "I didn't really have a choice" -- and punitive judgment is, very slightly and only after the more essential matter of deciding whether a law has been broken or a party has been injured, a matter of applying that introspective experience to someone else's past conduct. On a jury or on the bench, we assume both an unusual freedom and an unusual weightiness in our decision making, and the closer the miscreant came to our own (extremely rare) state of knowledge and power, the more culpable we consider them. *

Therein inheres the wit of T. P. Uschanov's swinging the deterministic spotlight onto the hidden-but-necessary P.O.V. courtroom characters of judge and jury.

*       Which is putting it awfully idealistically, of course: many judges and juries couldn't care less about that aspect of "justice," and those who could find it much easier to apply these strict standards externally than internally: that is, we blandly assume that the (extremely rare) mindset that we're in at the moment is the same as held in whatever external situation we're considering, and then blandly forget that mindset while going about our quotidian affairs. Thus the honestly self-righteous indignation displayed by those in power when they're declared miscreants, or by daytime talk show audience members who find themselves treated roughly on the talk show stage.

. . .

Make the voices stop

At a similar literary salon about, oh, maybe seven or eight years ago, the favorite first lines game was played, and I quickly realized that I didn't have any.

Partly that's because so much of my favorite writing occurred before the late-twentieth-century vogue for hooky opening sentences; partly it's because I dislike that vogue, which will seem as eccentrically simplistic to future readers as an earlier era's focus on moments of moral sublimity seems to us. Grabbing the reader by her arm and yanking seems a rude way to initiate a conversation, and when I remember particularly enthralling beginnings, I remember their structural effects rather than the wording of sentence one: the early and peculiar disappearance of Madame Bovary's initial narrator, for example. (Back at that salon, the only opening line I could recall right off was a condensed version of the Bovary gambit, as played by Beckett in Mercier and Camier: "The journey of Mercier and Camier is one I can tell, if I will, for I was with them all the time." The rest of the book being I-less.)

On the other extremity, I'm a sucker for endings that snap close with a satisfying click, and I recall (and re-read) a good many last lines, with special fondness for those whose persistent startle ripples backward through the entire work, restructuring it retrospectively into something far richer than one had even dared to hope for as one kicked joyfully up surfacewards holding one's perfectly timed-to-the-last-page breath.

(Oddly, few of the examples I'm about to offer really count as "spoilers": to understand their defiance of expectations, one must have developed those expectations in the first place. The truly itchy can feel free to request story-wrecking explanations from me.)

Such an ending is more likely to speed the traveller on with a slamming of the door than with a gentle swinging to, treating readerly expectations so aggressively that they could almost be called rebuttals to their own books. (Ulysses is one such rebuff after another.) Closure is, after and above all, a refusal of further story.

Trouble no quiet, kind heart; leave sunny imaginations hope. Let it be theirs to conceive the delight of joy born again fresh out of great terror, the rapture of rescue from peril, the wondrous reprieve from dread, the fruition of return. Let them picture union and a happy succeeding life.

Madame Beck prospered all the days of her life; so did Père Silas; Madame Walravens fulfilled her ninetieth year before she died. Farewell.

"That was the happiest time we ever had," said Frédéric.

"Yes, perhaps you're right. That was the happiest time we ever had," said Deslauriers.

I remember them all with such happiness.
I should certainly never again, on the spot, quite hang together, even though it wasn't really that I hadn't three times her method. What I too fatally lacked was her tone.
"That may be," Nora said, "but it's all pretty unsatisfactory."
He bent to pick it up, then stopped. Don't touch it, he thought, don't touch it.
[The first remains the most chill-inducing and daringly experimental ending I've ever read, as befits Charlotte Brontë's Villette, the pinnacle, in English literature, of characterization through narrative voice: The plot is resolved in the imperative! or, more accurately, via the narrator's very use of the imperative! Aided by the unemphasized selectivity of her seemingly conventional last paragraph wrap-up! (I think we can agree that exclamation marks are called for here, given the tightrope-acrobat precision of the performance.)

The second concludes Flaubert's most brilliant closing movement: that of the infinitely self-undermining Sentimental Education -- whose influence can be clearly seen in my third entry, from M. John Harrison's The Course of the Heart, and perhaps also in Mavis Gallant's "The Moslem Wife" (as cited in Eclogues).

Next, and speaking of characterization through narrative voice, the befuddled detective of Henry James's The Sacred Fount finally manages to reach a conclusion. Fifth is Dashiell Hammett's last word on the murder mystery genre (or perhaps on fiction in general) in The Thin Man, and lastly Patricia Highsmith's The Cry of the Owl abruptly becomes non-Highsmithian -- and freezes.]

Some end with a flourished signature:

And by what I have written in this document you will see, won't you, that I have obeyed her?
And the twelfth stroke of midnight sounded; the twelfth stroke of midnight, Thursday, the eleventh of October, Nineteen Hundred and Twenty-eight.
Having seen this time what I needed to see, I started writing; and in time wrote all that you have read.
"You and Capablanca," I said.
[Janet Frame's Faces in the Water throws mental health into our eyes like vitriol; Virginia Woolf's Orlando shoots its arrows of desire right through the temporal barrier; Jack Womack's Going, Going, Gone goes home; Raymond Chandler's The High Window gives everyone a fucking break.]
Some with a gleeful or furious or heartbreaking -- but perfectly definite -- denial of closure:

That is said nowadays by the most modern of the physicists. If that is true, then that is how it is with Pooch and with Carmen and with all the others.
I'd always felt the future held wonderful things for me. I'd never quite caught up with it, but quite soon I would. I felt sure I hadn't long to wait.
Something further may follow of this Masquerade.
Ever after. I promise. Now close your eyes.
[Carol Emshwiller's Carmen Dog; Barbara Comyns's Mr Fox; Herman Melville's The Confidence Man; and the devastating final sentence of John Crowley's Engine Summer, whose subject (in several senses) might be said to be the tragicomedy of incompletion.]
And some are simply, disturbingly or delightfully, accomplished:

"Nothing, Mamma. I was just thinking."

And, drawing a deep breath, he considered the faint whiff of scent that rose from his mother's corseted waist.

He ran this way and that, low down in his throat crying, and she grinning and crying with him; crying in shorter and shorter spaces, moving head to head, until she gave up, lying out, her hands beside her, her face turned and weeping; and the dog too gave up then, and lay down, his eyes bloodshot, his head flat along her knees.
The beautiful weather was compared with the Great Disappointment of '44, when Christ failed once again to appear to the Millerites.
[Robert Musil's Young Törless enters sentimental grad school; Djuna Barnes's Nightwood pays tribute to Aphrodite; Karen Joy Fowler's Sister Noon lights out for the hills.]

2015-06-21 : Guy Lionel Slingsby kindly directed my attention to this trimmer and more Twitter-friendly approach.

. . .

The Launderer's Hand

Continuing the discussion:

As has been pointed out many times before, "genre" is not a simple compound, or even a clear formula, and its assorted aspects of publishing, writing, and reading are only loosely interdependent. Some writing, it's true, affirms generic coherency, snug and compact in a neatly labeled bundle. But much of what I'm drawn to seems badly wrapped, corners rubbing against frays and duct tape.

It always comes marked, however. No matter how much writer or reader idealizes invention from whole cloth, there'll be some natural discoloring, someone to see a pattern, and someone to apply a dye. Even the launderer's hand grows red with wringing.

To drop the metaphors:

  1. My favorite writing is sui generis.
  2. It was (and is) all published (more or less antagonistically) within a generic context.
  3. Assuming that one particular genre has special access to the sui generis greatly reduces the chance of actually finding it.

Which is why, as I wrote earlier, plowing cover-to-cover through some 19th century volumes of Blackwood's or Harper's, or High-Modernist-era New York Times book reviews or High-Hollywood-era movie reviews, would be salutary for most English and creative writing majors. Someone who refused to look at smut would have missed Lolita (fittingly, Nabokov himself first received Ulysses as an exemplar of smuttiness); someone who refused to look at sea stories (or flop gothics) would have missed Melville; someone who refused to look at cornpone humor would have missed Twain; and so on. And someone who refused to read academically canonized writing would miss all the same books now. For we who love to be astonished, it's worth attempting to read Hammett's and Thompson's (or Fitzgerald's and Faulkner's) prose the same way whether behind pulp covers or a Library of America dustjacket.

To take a limit case, there are (and have been) an astonishing number of readers who treat everything written by women as its own genre, resulting in a comedy of re-interpretation when misattributions are corrected and as the purported "genre" is denigrated or celebrated.

All this from publishers and readers. For a writer, genre may be considered a conversational context, with one's social circle not necessarily restricted to one's neighbors, or even to the living. Since the literary mainstream's "discovery" of Patricia Highsmith began, I've seen a number of bemused references to the influence of Henry James, but this isn't an unusual phenomenon. The work itself is always more (or less, if truly "generic" work) than whatever genre it's in.

Carol Emshwiller, John Crowley, Karen Joy Fowler, Jack Womack, and Kelly Link write the sui generis they write and publish in whatever genre welcomes (or allows) them. But a contemporary may find it useful to learn that they all began publishing within the context of the science fiction genre, whether they themselves started as genre readers or not. And although I seek out Dalkey Archive and Sun & Moon Press spines in the bookstores, I enjoy knowing that the past decade of The Magazine of Fantasy & Science Fiction has shown more lively variety than any university-sponsored or trust-funded fiction journal.


Lucius Shepard also

God, yes. There are, oh, let's not start feeling guilty about not mentioning M. John Harrison, there are lots more. And then all the great writers who are publishing mysteries, thrillers, romances, Y/As, and including, sure, the literary mainstream and the poetry presses, but all of them, now ignored or long forgotten or even deservedly noticed, should get more than just a for instance, and I just meant for instance.

A welcome update, fourteen years later, from Josh Lukin:

Well you know my fave bemused reference to the influence of Henry James . . . although I'm sure Baldwin's Jamesianity too incurred some bemusement (to say the least) in his day, of the "'Notes of a Native Son'? What does this guy think he is?" sort (I don't at the moment have the wherewithal to turn to my Marcus Klein and Maxwell Geismar and Irving Howe and see if that was among their beefs).

I've been reading some James stories and am struck by their reliance on Ideas (pace T.S. Eliot). And I mean Ideas in the way SF writers mean Ideas: premises that one can quickly pitch to an editor (or to a writer, if one is that kind of editor; I did have a pair of cats named Horace* and Campbell). It's an unoriginal insight that post-Chekhovian litfic doesn't make for good log lines the way that older stuff does; but I wonder whether pitchability has an economic origin or not: did Maupassant**, whom James might have gotten it from, write for magazine editors?

*Horace is still with us, but he doesn't like me to read the New York Edition. He will plop himself on it or gently close it or try to eat "Daisy Miller." I had to get hold of the first book editions of the stories so he'd leave me alone.

**Did anybody else pan their influences as interestingly as James? Not Wilde, not Nabokov, not Alan Moore . . . the list isn't as long as Harold Bloom led me to believe . . .

. . .

Do you think that you could make it with Frankenstein?

A question at the end of one of Jeff VanderMeer's recent posts has been nagging at me -- "Do writers of experimental fiction need to prove they can tell a good story before they start experimenting?"
- Mumpsimus

Conclusions elude us. It could be there are none to be drawn without distortion.

  1. The laws are ambiguous and the judges are prejudiced.

Matthew Cheney and I both seek out the tang of the unexpected problem; we welcome obstacle. And so, faced with increased experimentation, we're likely to tilt our camera eye to make a narrative of progress where others may tilt a decline. Whenever Joyce published, he lost a former supporter. Gardner Dozois, among others, regrets the "squandered promise" of Samuel R. Delany's maturity. And I'm sure there are some who wish M. John Harrison had never put Viriconium through its literary retcon.

  2. Outside a historical context, terms like "craft", "good story", and "experimental" are little more than Whiggish fertilizer.

Nothing I've read in the past few years can compare with the experimentation of Tom Jones or Wuthering Heights, but we don't see Mark Amerika giving them props. Me, I don't think Beckett ever again wrote anything as brain-droppingly new as Watt; I think of his last thirty years as laying down a very good groove and think of John Barth's later career as safe shtick. Make Barth as hard to find as Barbara Comyns or Bob Brown and I'll reconsider.

  3. It's chancy to generalize about particularities.

    Was Orlando more or less experimental than To the Lighthouse? How about positioned between To the Lighthouse and The Waves?

    Flaubert started out with wildly uncontrolled blurts of fantasy. Were those stabs in the murk less or more experimental than Madame Bovary? Was Salammbo less experimental than The Temptation of St. Anthony? Bouvard & Pécuchet?

  4. What seems blandly normal or tediously artificial to the reader may have been a coltish celebration of new skills for the author.

    If Melville chafed against the limitations of the autobiographical sea story while writing Typee, it doesn't show. The sincerity of Modernist poets' juvenilia is hardly its besetting problem.

  5. In conscious transitions from "expected" to "peculiar", what matters may be the sheepskin, not the education.

    That is, the trigger is being granted permission to experiment, either from the publishing industry or oneself. If you write to make a living, there may not be much of a distinction. The Glass Key wouldn't have been Hammett's first publication, if only because he couldn't have afforded it.

    The most startling such transformation I've personally witnessed was at Clarion 1993, when a workshop member who'd slaved over unconvincing Analog filler realized that such an apprenticeship wasn't required, and suddenly began producing beautifully polished and balanced works of ambiguous speculation. (Like most good artists, he seems to have eventually decided that artmaking wasn't worth the effort, but that doesn't dim the thrill of witness.)


Discussion continues at Mumpsimus, at Reading Experience (with a clumsy, verbose response of my own), and at Splinters.

And does Dan Green's hospitality know no limits? Still more at the Reading Experience.

Update: Dan weeded and discarded his initial post in 2006. Here was my comment at the time:

I'm prone to note resemblances, which is fine, but then rhetoric sometimes tempts me to go too far. So I might talk about a "tradition" of presumptuous lyric, and in that jumble together some unaristocratic Tudors, some Restoration satirists, Keats, the Objectivists, the New York School, and Language poets. I suppose somewhat the same impulse determines Oxonian anthologies and encourages such after-the-fact categories as film noir, nationalist canons across the world, and women's writing.

In your brief overview of "experimental writing," there's a temporal gap between "Tristram Shandy" and James Joyce's career. Do any books fit in there? I ask partly because I think I'd like them, and partly because explicit experimentation *as a tradition* would seem to require a firmly established norm, and I'm not sure when the particular narrative conventions being fought became firmly established, or how long it took before insurgent tactics became narrative conventions in their own right.

I also wonder about the conceptual gap between a single book and a career. "Tristram Shandy" stays just as wonderful but becomes slightly less startling positioned between the "Sermons of Mr. Yorick" and "A Sentimental Journey"; Sterne-as-career becomes slightly less startling positioned between the polyphonic digressions of sixteenth and seventeenth century English fiction and the sentimental, didactic, and political novels of the late eighteenth and early nineteenth century. Even before an "oppositional" tactic becomes group property, it may be a personal habit. Is a writer who attempts something drastically new in each new publication only as "experimentalist" (to use Steve Mitchelmore's word) as a writer who challenges narrative convention the same way every time? (I'm not denigrating the latter, by the way; I believe in the power of the groove.)

Conversely, early Joyceans proved that it was easy to miss the formal ambitions of "Dubliners" and "Portrait" without "Ulysses" and "Finnegans Wake" to foment suspicion. One might read "Moby Dick" as a (failed) conventional narrative, but can one say the same of "The Confidence Man"? 150 years after "Madame Bovary", we might take it as conventional, but I believe Kenner is right to draw Joyce's artistic ambitions directly from Flaubert: "A Simple Heart" to "Dubliners", "Sentimental Education" to "Portrait", "Temptation of St. Anthony" to the later episodes of "Ulysses", "Bouvard and Pecuchet" to Leopold Bloom -- and, on a different trail, to Beckett's "Mercier and Camier".

And there's that final gap between the isolated heroic figures of the modern canon and a contemporary American school of writers who share some publishers, make livings in academia, and swap blurbs, bridged by the pulp-sprung and compulsive Burroughs.

Well, I'm afraid all this gap-minding sounds both more detached and more combative than my feelings justify. You yourself call it a "pragmatic" distinction. I suppose my uneasiness truly comes down to worrying just what use our pragmatisms get put to. Provisional categorization can work as a portal of discovery. (Jerome McGann's championing of William Morris as the first Modernist is a delightful example of what can be done with hindsight genre.) But windows require walls, and human beings do seem to love their wall-building. Once we have our categories up, it may be hard to see around them. If I'm not mistaken, a similar uneasiness stirred your "Don't Change" entry of September 22.

I suppose I sound as if I'm trying to eradicate distinctions, when what I'd like is to make them finer.

. . .

Heathcliff, Come Home

(Written for The Valve)

I suppose many readers of The Valve eventually get around to The Yale Journal of Criticism on their own, but if un-lit blogs can point you to the New York Times front page, it must be OK for me to point you to "Petted Things" by Ivan Kreilkamp, starring the Brontë sisters as animal rights pioneers.

Kreilkamp's essay pleasingly draws from history, the authors themselves, and recent Derrida in the service of (to me) a novel, amusing, and evocative association of realism with anthropomorphism. The critic even shows good reason for having treated "the Brontës" as a group rather than as individual novelists.

Potential Disney adaptors of Jane Eyre should especially note the story of Clumsy, A Dog:

"Tell how he grows ugly in growing up;... Madam's disgust for him; the rebuffs he suffers.... Clumsy, for that is what she calls him now, banished to the yard; his degradation; detail his privations, the change in food and company."

Everyone else should especially note that Carol Emshwiller's Carmen Dog carries far more entertainment value than its equivalent in Lucas-movie-and-junk-food.

+ + +

Afterthought: The Brontës as potential writers of noble-dog stories reminds me of one of my own favorite alternate-literary-history scenarios: What if, rather than giving up their shared fantasy worlds, the Brontë sisters had successfully brought their mature styles and concerns into Gondal and Angria, weirdly anticipating Joanna Russ's Alyx, M. John Harrison's ret-conning of Viriconium, Samuel R. Delany's Nevèrÿon...?

Pointless, I know, but at least it's a break from imagining the rest of Emma.

. . .

If on a springtime's blog a blatherer...

I've been thinking about two types of metafiction, or at least metafictional moments: the type we're all too familiar with in recent years, where the metafiction is the point, and the (what to call it?) target fiction is in its service, and another more common, more exhilarating type (as I have come to think), where metafictional moments are actually in service of the story itself....
- balaustion

As Balaustion's examples suggest, there is a history, a lifespan, to apparently unmediated narrative or lyric. Thackeray and Trollope notoriously lack that goal, Byron (and then Pushkin) contested its triumph, and by the time we reach Bouvard & Pécuchet and Huysmans it's devouring itself. The perplexing disruptions of Ulysses simmered down into a signature sauce for Beckett and O'Brien, and then desiccated into spice jars for postmodern fabulism and swingin'-sixties movies. If Nabokov is a chess problem and Perec is a jigsaw puzzle, John Barth and Robert Coover are search-a-word.

Even more specifically, the desire for unmediated narrative is linked to genre (Mark Twain and William Dean Howells were contemporaries, after all) and therefore self-congratulatory metafictionality is also linked to genre. When, back in 1976 or so, I sought goods fresher than those provisioned by the oxymoronic experimental mainstream, I found them labeled as science fiction or fantasy. And they included a generally more relaxed use of metafictionality. Not Dick, of course; Dick is Barth haloed by sweat-drops. But Disch and Russ in the 1970s, and then in the 1980s and so on M. John Harrison and Fowler and Emshwiller and Womack and so on.

What I really wanted to blather about, though, was a rare third type of metafiction, neither the recircling of an already-overworked puzzle, nor the matter-of-fact surfacing of one discursive mode in a cove of splishy-splashy discourse, but instead an emotionally engaged and affectively effective metafictionality. I likely first encountered that possibility in Warner Bros. cartoons and Hans Christian Andersen. But a lot of Updike passed under the bridge before I reached Delany's Dhalgren: a unique three-decker in which every tool of realistic fiction attempts to portray structuralism from within. It's like Zola as Fabulist, or Sergei Bondarchuk's seven-hour adaptation of an original story by Frank Tashlin. And about fifteen years later, Crowley's Engine Summer delivered a similarly visceral charge by embodying romantic loss in a closed roman.


Josh Lukin differs:

Honestly, I think the sweaty Barth is Gaiman. Dick is, I dunno, Philip Rieff with a Crawdaddy subscription? Tough one.

And I think Gaiman is Mary-and-Charles-Lamb-going-to-a-Police-concert, so go figure.

. . .

Realism : An Anthology

An appeal to an artwork's realism, its roots in reality, is an appeal not to its accuracy at registering facts but to the depth of its claim upon us. The claim is not, 'this is the real world', but rather, 'this is your world'.
- Josh Kortbein, josh blog

Career tip: flatter your readers by telling them they're "made of stories".

Some days I wake up sick to death of language.

As for fiction.

99.999999% of the "conversation" is rhetoric so bad you don't know whether to choke or laugh.

You look around in despair for some state that doesn't include the use of language.

"Made of stories." Bland, meaningless crap.

Noncommunicative actions, impossible to turn into language & thus not subject to constant mild but slimy abuse. Where are they?

- M. John Harrison, Twitter

“Oh, I’ve said, ‘You can't describe it. You'd have to be there.’ But that’s my first wife telling her mother-in-law about the time we went to Persia. And that isn’t what I mean.”

Kid smiled back and wished he hadn’t.

It isn’t his moon I distrust so much, he thought, as it is that first wife in Persia.

- Samuel R. Delany, Dhalgren


That last can do double duty as our review of Gravity (2013).

. . .

M. John Harrison's "Getting Out of There" and back again

A year later and I keep re-reading this. Well, it's a pretty little thing innit? Gorgeous cover. Glossy paper. Fourteen pages of mouth-tested prose. Title chimes with an Alan Halsey. Proofreading! (You don't really notice the otherwise omnipresent din of typos till you enter an eerily silent space like this one.)

Maybe because I finished The Wine-Dark Sea right before the chapbook arrived, it reminds me more of Robert Aickman than any other Harrison story has reminded me of Robert Aickman. The soppingly grounded Englishness of it. Its protagonist of a certain age and dislocation and curiously libido-free urge to couple. Most of all its pacing: a determined no-nonsense but no-particular-tourist-destination-in-mind tramp into what critics call "dread" (the unaccountable corporate flight of nesting colonies of terns and gulls), not minding the gaps at all, or at least making only token efforts to fill them. This particular gap's as good as a nod to an Aickman influence:

‘That yellow lichen on the roofs down there,’ Hampson said, ‘I wonder what it is?’

She laughed.

‘I thought you were a local,’ she said.

Like all the best influences, Aickman's-on-Harrison was retroactive: verification rather than emulation. They'd independently developed an architecture of negative space.

Harrison, at least, consciously recognized and worked it. Here he explains how he wrote the story which first drew the comparison:

The way I started out, I asked myself a question: How would you write a horror story and take all the horror out of it? How would you write a ghost story and remove almost everything? a couple of sentences, a pair of sentences that would do the trick... "The Ice Monkey" was my first attempt at that. I wrote it as a normal horror story in which it's quite evident what had happened. And then I spent two or three weeks just removing sentence after sentence that directed the reader towards the normal ending, until finally you're left only two sentences in four thousand words which give you the clue as to what might or might not have happened. [...] Yeah, scraped it out. To see what would happen. I wanted to see where it would fall over. [...] After you've been doing it for twenty years, you put fewer of them in. When I started I had to throw out whereas now I know what not to put in.

But they differ in the thoroughness of their erasures. Aickman rarely sealed his unsettling build-ups without a deflationary appearance by crap F/X. Harrison's more likely to scrape away even that much comfort. His multi-volume fantasy series isn't threaded by hero's quest or cod-Gondal dynastic charts but by ways of not-knowing a city. Although some of us have always been creeped by roses, his dark occult novel horrifies mostly through absence. His macho sporting-life naturalism lacks self-pity, rivalry, the thrill of victory, or even the thrill of disillusionment.

Here are the dreadfully recurring associations of "Getting Out of There":

That last develops into the most blatantly anti-realistic aspect of the story, and it's hardly a Famous Monster of Filmland.

I took it (because we have to take things somehow) as a temporally-displaced dream-version of social media. I don't know how you would take it; I'm pretty sure the bait wouldn't attract huge buzz from the buzzers of record. Very important novelists like Eggers and Amis (and Sinclair Lewis and Upton Sinclair) are newsworthy because they're journalists; they're terrified of burying the lede. Whereas Harrison knows the lede's there to enrich the soil. Insofar as "Getting Out of There"'s bit of fantasy was ripped from today's headlines, it was then collaged into plaster-of-paris, and then painted over.

And then discarded for a vacuum-welded clampdown of the unutterably mundane. The twirly-shiny bit played misdirection in a sleight-of-hand maneuver which models our sleight-of-hand transfer from post-youth to pre-senescence. After which, as the poet sang, "You may ask yourself, well, how did I get here?" It wasn't where we were watching.

There's nothing jokey or puzzly about the gaps that bind this free indirect discourse. They're mimetic: deliberate sacrifices of discursive freedom for the respite of further indirection. A temporary but renewable respite. Renewable to a point.


tl;dr It's a horror story whose ultimate brain-melting horror is a happy ending.

Speaking of Harrison, "The Killing Bottle" is a fine fannish-vocational-scholarly analysis of his style.

. . .

Genre note

Although auteurs like M. John Harrison will always fit old clips into new montages, the all-out fixup novel served as loyal attendant to the commercial market for short stories and novellas and did not survive its patron.

While fiction magazines withered, academia doubled down on publish-or-perish. Journal and book lists exploded, culture took its course, and for several decades humanities' new-book-shelves have been as loaded with fixups as a 1950s paperback rack.

Of course, not all the tactics of their original home were carted over. Lacking the pretense of organic character-focused narrative, no fixing-up scholar need attempt the reconstructive surgery of Dashiell Hammett or Raymond Chandler. Isaac Asimov's Foundation is the model: elucidation and proof of millennia-spanning psychohistory through chapters on Theocritus's Idyll 15, Eliza Haywood's Distress'd Orphan, and Grand Theft Auto V's soundtrack. As the man says, "ideal for tales of epic sweep through time and space."

. . .

Trolling the Neuralnet of Things

Partisan of Things by Francis Ponge,
translated by Joshua Corey & Jean-Luc Garneau

At first, it's a luxurious sink into surface, like wading into a cozy cushion until the upholstery reknits overhead.

And then slight dizziness from the echoes of reflections, as, for instance, from "FIRE"

Once the methodically contaminated masses have collapsed the escaping gasses light a path for a solitary rabble of butterflies.

across pages to "BUTTERFLY"

A flying match of uncontagious flame. Besides, he arrives too late to do more than note that the flowers have already opened. Never mind: like a lamplighter, he checks each lantern's supply of oil. He drops atop the flowers the withered rag it carries, avenging at last his long caterpillar humiliation at the feet of their stems.

Therein a glance into the unsolvable labyrinth of Ideas in Things, as the poet sang, or, to prosify the poet, the strange entanglement of subject-minds with subject-matters, particles-of-the-observer refracting right through the most solid of apperceived bodies. Thereby a handful of handcarved woodchips off the spiritual block supporting M. John Harrison's Kefahuchi Tract trilogy, minus the colorless flavorless icing of super-science.

. . .

Prescribed Burns

Parietal Games: Critical Writings by & on M. John Harrison,
ed. Mark Bould & Michelle Reid

No admirer of artisanal butchery should be without the young-loud-and-snotty pieces M. John Harrison published between 1969 and 1975. I was most impressed by his doomed berserker whirls against the incoming tide of Tolkien's fantasy ("By Tennyson Out of Disney")

... in any rural pub you can meet Samwise Gamgee’s “Gaffer” swearing and spitting unpleasantly into the fire; and I once worked in a Warwickshire hunting stable with an amiable rustic character who beat up his dog so often it wet itself every time he went near it.

and against science-fiction's marching-morons-of-MENSA ("Filling Us Up"), which left a few nicks in my own carcass as well:

This is how thinking is done in sf: conversationally. Inevitable, then, that it should fall down all the holes that conversation is heir to: side-tracking, argument from the wrong side of the analogy, rhetoric as a substitute for logic, the accidental modification of premises (or even subject matter). It is not rigorous. Its vocabulary consists almost wholly of terms like “granted” and “posit”, “given” and “for the sake of argument”; its grammar is punctuative, the oratorical “right?” and “agreed?” used as fish-glue to cement unrelated items; its impromptu syntax reflects its impromptu reasoning; it is a muck of colloquialisms and jargon words used outside their proper fields. [...]

In lieu of actual thought, Rackham and Coney offer brash, colloquial pontification, achieved through disembodied mouthpieces; Del Rey senses that “science” has something to do with careful reasoning, but embraces opinion instead; Maine bases his entire extrapolative argument on nothing more than a value-judgement, effectively bypassing the mouthpiece and presenting his cant direct.

Thought and prose cannot be considered as discrete states: the one modifies the other, to infinity. None of the above writers can make a precise, sensible prose, only a vague uncommunicative babble. Meanwhile, the IDEA! bulbs flash stroboscopically, and with each little explosion science fiction reels back, bemused by its own ability to think of things. With each brief illumination of the irresistible notion, the sense of its own importance grows.

Back at John Rackham’s table they’ve got the drinks in against closing time. The amateur sociologists and historians and technocrats are wiping foam off their lips. The pause that refreshes is over, and fragments of the eternal unformed rodomontade are drifting across the bar on a warm front of cigar smoke:

“We say - and we can prove... like the key principle in cybernation...”
“The energy of a finger movement on a switch can control millions of horsepower.”
“That is simply the logical extension of your postulate.”
“To a certain degree, everyone lives in a fantasy world...”
“You ivory tower boys can always make a good case.”

Who can complain? This is the style of the Seventies. The editorial toad has escaped from the centre pages; comment has eaten the news; punditry swallows both. The majority reveals itself as a broil of minorities, each convinced of its own indispensability and itself comprised of as many minorities as it has adherents. We speak, eventually, in private languages. Fiction isn’t art, is it?

Another great First for science fiction.

Which answers David Auerbach's unvoiced question. What draws big-capital big-bluster libertarian types to science fiction? The fatuous sound of men convincing themselves they're the smartest guys in the room.

After that initial blast of room clearing was done, after 1975, what lives of "M. John Harrison" is his fiction (and to some extent his interviews, although not the one included here). The genre overviews he provisioned in 1979 and 1980 are disengaged, distracted, throwing handfuls of ill-sorted proper names like pebbles against a window. From 1990 on, Harrison's byline occasionally appears on reliably professional man-of-letters book reviews in professional man-of-letters venues: perfectly fine; not where the action is. But the first third of this volume is where the action started.


Josh Lukin checks his watch:

Wait, Tiptree's sharp-faced man isn't the voice of the Seventies? Maybe he's the voice of the Eighties, then.


Copyright to contributed work and quoted correspondence remains with the original authors.
Public domain work remains in the public domain.
All other material: Copyright 2015 Ray Davis.