|. . . Analytic Philosopher|
|. . . 2006-02-16|
When someone comes at you on the street with a knife you just yell, you don't shake your head and say, "Forget it, I led the debate team at Mineola Tech."
|. . . 2007-02-13|
Theoretically speaking, I agree: Good works count more than good faith. After all, dead men don't speak sincerely — in fact, judging from what mediums tell us, they're terrible liars — and authors, by definition, are dead.
Practically speaking, I agree, too. Anyone who can be amusing or thought-provoking or typo-correcting is jake (or johnemerson) by me.
But. (And this is a very skinny but.) I do (and will continue to) take an active interlocutor's lack of good faith, or an active interlocutor's obvious assumption of bad faith in myself, as freeing me from any ethical obligation to respond to the S.O.B.
First, so long as they're active, they're not dead. Therefore they're not authors. Therefore a critic has nothing to say about them.
Second, a devil's advocate serves no purpose unless we're in a debate club or trial. What I do participate in — conversation or its simulation — only comes to an end when we're willing to shut up and think about things for a while. Someone who arbitrarily chops and shuffles words in pursuit of the sneerable can extend his "Now your turn" game long after my "Seeking mutual understanding" game's played out.
Third, people drastically overestimate their ability to maintain detachment or insincerity. Rationalists are (to quote Lee Marvin rather than myself) really quite... emotional. Most self-pronounced tricksters turn out to be a bellicose drunk under a lampshade droning about alimony. This is the sort of job that should be done whole-assed or not at all.
Fourth, humanity is fallen, and so there's a limit to how good our good works get. Even Heidegger didn't always write perfectly clearly. Past a certain point — as instanced when I have my Valve Face on — trying to block all possibility of misunderstanding or misstatement reduces us to incoherent trivia. (Or: Why I Am Not an Analytic Philosopher, Again.) This doesn't mean you have to listen respectfully to a neo-Nazi; just it's nice not to have to waste time wondering why what the guy is saying happens to make him sound so much like a neo-Nazi. If you're an overbearing bore, I'd certainly appreciate your letting me know that before we strike up a conversation, and I'm sure you appreciate the same from me.
Fifth, didn't I liberate myself from all this "ethical obligation" crap once we decided to devote ourselves to the pure play of signifiers what don't signify? OK, then, my chosen signifiers are "Oh dear, look at the time, I really must be going."
Finally, what do I hear in Daniel Johnston's best songs? Conviction. The kitty's fed and something's at stake. "What say let's make this game interesting?"
By the way, this shouldn't need saying, but you never know, so I'd better say it: Joseph Kugelmass, despite not wanting extra credit for it, seems like someone speaking in good faith. As the poet sang, "Keep punching Joe."
Happy Valventine's Day to you too!
Kugelmass wins this hand.
|. . . 2007-03-16|
I'd rather be right than President. Disappointingly, that entails admitting that presidents matter much more than right people.
|. . . 2007-03-18|
Factual Fictions: The Origins of the English Novel
by Lennard J. Davis, 1983 (2nd ed. 1996)
Both Tom Jones's hero and genre were mysterious bastards. Unlike the hero, the genre's parentage remained open to question, and, in '83, Davis ambitiously aimed to prune classical romances (and even the mock-heroic anti-romance) from its family tree.
In place of that noble lineage, he proposed a three-act structure:
In his own storytelling, Davis sometimes stumbled — most painfully, he blew the punchline — and I wished he'd included a chapter on "secret histories", whose length, legal issues, and formatting (memoirs, correspondence, oddly well-informed third-person narrators) all seem to make them at least as germane as ballads. Most of all, without broad quantitative analysis to back them up, such ventures can always be suspected of cherry-picking the evidence.
But I'm an irresponsibly speculative collagist myself, and these cherries are delicious. I already understood how framing narratives relieve pressure, how they establish both authenticity and deniability: "I don't know, but I been told." But I hadn't realized how often pre-fictional writers had felt the need for such relief. Not having read a life of Daniel Defoe, I hadn't known how brazenly he forged even his own letters. And, speaking of letters, I hadn't read Samuel Richardson's flip-flops on the question of his real-world sources.
The sheer number of examples convinces us that something was shifting uncomfortably, tangled in the sheets of the zeitgeist. How else explain, across decades and forms and class boundaries, this increasingly vexed compulsion to face the old question head on, like a custard pie?
And by the end of the book, we still haven't found fully satisfying answers; the process continues. Recently and orally, for example, our impulse to simultaneously avow and disavow narrative discovered a felicitous formula in the adverbial interjections "like" and "all like".
We haven't even fully agreed to accept the terms of the problem. Remember those quaint easy-going characters in Lennard Davis's Act I? Believe it or not, living fossils of unperplexed truthiness roamed the Lost World of rural America during our lifetimes! My own grandmother sought out no journalism and no novels; she read only True Confessions and watched only her "stories" — that is, soap operas, "just like real life" they were, another quotidian reconfiguration.
* * *
All novelists descend from Epimenides.
Well, OK, if you want to get technical about it, so do novel readers ("All Cretans know my belief is false"), and so does everyone else.
That's the problem with getting technical. (Or, Why I Am Not an Analytic Philosopher, Again.)
But what about memory retrieval?

In contrast to common past-future activity in the left hippocampus, the right hippocampus was differentially recruited by future event construction. This finding is notable, not only because others report right hippocampal activity to be common to both past and future events (Okuda et al., 2003) but also because it is surprising that future events engage a structure more than the very task it is thought to be crucial for: retrieval of past autobiographical events....

It does seem strange that no regions were more active for memory than for imagination. So memory doesn't differ from fiction? At the very least, it didn't result in greater brain activity than fiction, not in this particular study (an important point).

There was no evidence of any regions engaged uniquely by past events, not only in the PFC but across the entire brain. This outcome was unexpected in light of previous results (Okuda et al., 2003). Moreover, regions mediating retrieval processes (e.g., cue-specification, Fletcher et al., 1998) such as right ventrolateral PFC (e.g., BA 47) should be engaged by a pure retrieval task (i.e., past events) more than a generation task (i.e., future events). More surprising was the finding that right BA 47 showed more activity for future than past events, and that past events did not engage this region significantly more than control tasks.
(I should admit, even though that re-citation honestly conveys what's on my mind — I happened to read it while writing this, and so there it is — it doesn't honestly convey what I consider a strong argument. Like The Neurocritic, I'm skeptical about the functional neuroimaging fad; it seems too much like listening to a heart pound and deducing that's where emotion comes from. Reaching just a bit farther, then — from my keyboard to my bookshelf....)
For researchers in the cognitive sciences, a narrative works like a narrative, whether fictional or not:
... with respect to the cognitive activities of readers, the experience of narratives is largely unaffected by their announced correspondence with reality. [...] This is exactly why readers need not learn any new "rules" (in Searle's sense) to experience language in narrative worlds: the informatives are well formed, and readers can treat them as such.
- Richard J. Gerrig, Experiencing Narrative Worlds
According to Davis, modern mainstream genres partly result from legal changes which forced propositionally ambiguous narratives to face courtroom standards of truth. I didn't find his evidence completely convincing, but there's something that felt right about his tale.
A narrative is not a proposition. When narrative is brought into a courtroom, interrogation attempts to smash it into propositional pieces.
But any hapless intellectual who's made a genuine effort to avoid perjury can testify how well that works. We don't normally judge narratives: we participate in them, even if only as what Gerrig calls (following H. H. Clark) a side-participant. If we restricted ourselves to "deciding to tell a lie" or "trying to tell the truth," there wouldn't be much discourse left. Depending on personal taste, you may consider that a worthwhile outcome; nevertheless, you have to admit it's not the outcome we have.
We've been bred in the meat to notice the Recognizable and the Wondrous. The True and the False are cultural afterthoughts: easily shaken off by some, a maddening itch for others, hard to pin down, and a pleasure to lay aside:
At the tone, it will not be midnight. In today's weather, it was not raining.
January 2009: Since I haven't found anyplace better to note it, I'll note here that the best academic book I read in 2008 (unless Victor Klemperer's The Language of the Third Reich counts) was Reading Fictions, 1660-1740: Deception in English Literary and Political Culture, by Kate Loveman, whose metanarrative convincingly allows for (and relies on) pre-"novel" hoaxes and satires while not erasing generic distinctions.
|. . . 2007-11-12|
I find it surprising that you are so sweepingly dismissive of philosophy, as a discipline, frankly. Wittgenstein, Austin, Searle, Dennett, Putnam, Kripke, Davidson, lord knows I can rattle on if you get me started [...] it's all crap, or arid twiddling, you assume? You are, of course, entitled to your opinion. I'm not offended, or anything, but I'm a bit surprised. It's a fairly unusual attitude for someone to take, unless they are either 1) John Emerson; 2) strongly committed to continental philosophy, from which perspective all the analytic stuff looks crap; 3) opposed to interdisciplinarity, per se.
- John Holbo, in a comment thread
I have sometimes characterized the opposition between German-French philosophizing and English-American philosophizing by speaking of opposite myths of reading, remarking that the former thinks of itself as beginning by having read everything essential (Heidegger seems a clear case here) while the latter thinks of itself as beginning by having essentially read nothing (Wittgenstein seems a case here). [...] our ability to speak to one another as human beings should neither be faked nor be postponed by uncontested metaphysics, and [...] since the overcoming of the split within philosophy, and that between philosophy and what Hegel calls unphilosophy, is not to be anticipated, what we have to say to one another must be said in the meantime.
- Stanley Cavell, "In the Meantime"
I should acknowledge that John's question wasn't addressed to me. Also, that I'm no philosopher. I begin by having read a little, which makes me an essayist — or, professionally speaking, an office worker who essays. I'm going to appropriate John's question, though, because some of the little I've read is philosophy and because essaying an answer may comb out some tangles.
Restricting myself to your menu of choices, John, I pick column 2, with a side of clarification: Although that menu may indicate a snob avoiding an unfashionable ingredient, it's as likely the chef developed an allergy and was forced to seek new dishes. I wasn't drawn to the colorful chokeberry shrubs of "continental tradition" (and then the interdisciplinary slap-and-tickle of the cognitive sciences) until after turning away from "philosophy, as a discipline." Before that turn, I was perfectly content to take Bertrand Russell's word on such quaint but perfidious nonsense.
In fact I came close to being an analytic philosopher — or rather, given that I'd end up working in an office no matter what, being someone with a degree from an analytic philosophy department. On matriculation I wanted coursework which would prod my interest in abstract analysis, having made the (warranted) assumption that my literary interests needed no such prodding. The most obviously abstractly-analytical majors available to me were mathematics-from-anywhere or anglophilic Bryn Mawr's logic-heavy philosophy degree. As one might expect from a teenage hick, my eventual choice of math was based on surface impressions. The shabby mournfulness of Bryn Mawr's department head discouraged me, and, given access for the first time to disciplinary journals, I found an "ordinary language" denatured of everything that made language worth the study. In contrast, the Merz-like opacity of math journals seemed to promise an indefinitely extending vista of potentially humiliating peaks.
Having veered from Bryn Mawr's mainstream major, my detour into Haverford's eclectic, political, and theologically-engaged philosophy department was purely a matter of convenience — one which, as conveniences sometimes do, forever corrupted. I left off the high path of truth: abstract logic fit abstractions best; natural language brought all of (human) nature with it. As I wrote in email a few years ago, it seemed to me the tradition took a wrong turn by concentrating on certainty to the exclusion of that other philosophical problem: community.
* * *
I'd guess, though, that besides expressing curiosity your query's meant to tweak the answerer's conscience.
At any rate, it successfully tweaked mine. To paraphrase Hopsy Pike, a boy of eighteen is practically an idiot anyway; continuing to restrict one's options to what attracted him would be absurd.
I don't mean I'll finally obtain that Ph. B., any more than I ever became a continental completist. No, I just think my inner jiminy might be assuaged if I gathered some personal canon from the twentieth-century Anglo-American academic tradition.
Cavell, instantly simpatico, will likely be included, but one's not much of a canon. By hearsay Donald Davidson seemed a good risk, and recently a very kind and myriadminded friend lent me his immaculate copy of Subjective, Intersubjective, Objective.
Davidson's voice was likable, and I was glad to see him acknowledge that language is social. But I was sorry he needed to labor so to get to that point. And then as the same point was wheeled about and brought to the joust again and again, it began to dull and the old melancholy came upon me once more. Could these wannabe phantoms ever face the horrible truth that we're made of meat?
With perseverance I might have broken through that shallow reaction, but I didn't want to risk breaking the spine of my friend's book to do it. I put it aside.
And then, John, you tweaked my conscience again:
If you just want a reference to post-Wittgensteinian analytic philosophers who think language is a collective phenomenon and who are generally not solipsists, that's easy: post-Wittgensteinian analytic philosophy as a whole.
Because, of course, my shallow reaction to the Davidson sample might well be expressed as "My god, they're all still such solipsists."
* * *
I remember one other "Farewell to all that" in my intellectual life. At age eight, I gave up superhero comic books.
The rejection was well-timed. I'd experienced Ditko and Kirby at their best; I'd seen the Silver Surfer swoop through "how did he draw that?" backgrounds I didn't realize were collaged. After '67, it would've been downhill.
But eventually, in adulthood, I guilt-tripped back again.
With iffy results, I'm afraid. I greatly admire Alan Moore's ingenuity, but that's the extent of his impact. Jay Stephens's and Mike Allred's nostalgic takes are fun, but I preferred Sin and Grafik Muzik. Honestly, the DC / Marvel / Likewise product I look at most often is Elektra: Assassin, and I look at it exactly as I look at Will Elder.
No matter how justly administered, repeated conscience tweaking is likely to call forth a defensive reaction. And so, John, my bruised ignorance mutters that Moore showed far less callousness than Davidson regarding the existential status of swamp-duplicates — Davidson talks as if the poor creature's not even in the room with us! — and wonders if AAA philosophers' attention to collective phenomena might not parallel attempts to bring "maturity" to superhero comics:
"We've got gay superheroes being beaten to death! We've got female superheroes getting raped! We've got Thor visiting post-Katrina New Orleans! How can you say we're not mature?"
Because immaturity is built into the genre's structure.
Similarly, whatever it is I'm interpreting as microcultural folly might be the communally-built structure of academic philosophy, and leaving that behind would mean leaving the discipline — as, I understand, Cavell's sometimes thought to have left?
Well, Davidson I'll return to. In the meantime, I bought an immaculate Mind and World of my own to try out. After all, any generic boundaries feel arbitrary at first, and, fanboy or not, I still own some superhero comic books....
1) Wilfrid Sellars 2) Grant Morrison [the set is "practitioners who turn the faults of their framing genre into merits by seriously thinking about why they embrace them, allowing this understanding to shape their practice"]
John Holbo sends a helpful response:
Quick read before I get on the bus. That comment you quote is a bit unfortunate because, in context, I wasn't actually complaining about Bill not studying philosophy as a discipline. I was objecting to his claim that there was nothing interesting about post-Wittgensteinian Anglo-American philosophy. It has nothing to say about language or mind or any of the other topics that interest Bill. It isn't even worth giving an eclectic look in, to borrow from, in an interdisciplinary spirit. Bill is an interdisciplinarian who makes a point of steering around the philosophy department - not even giving a look-in - when it comes to language, intentionality and mind. I find that combination of attitudes perverse. So rather than saying 'opposed to the discipline' - hell, I'M opposed to analytic philosophy as a discipline (how not?) - I should have typed: 'convinced that it is a giant lump of crap that does not even contain a few 14k bits of goldishness'. Bill and I were arguing about whether there might not be bright spots in post war Anglo-American philosophy. I said yes. He said he assumed not. (He assumes it must all just be solipsism, ergo not helpful.)
Another point. "Could these wannabe phantoms ever face the horrible truth that we're made of meat?" I think it's a wrong reading of various fussy, repetitive approaches to materialism and mind to assume that people are shuffling their feet because they are FEARFUL of letting go of, maybe, the ghost in the machine. Rather, they are caught up in various scholastic debates and are hunched down, porcupine-wise. They are anticipating numerous attacks, serious and foolish, pettifogging and precise. In Davidson's case it's always this dance with Quine and empiricism. (I could write you a song.) But shying away from the very idea that we're made of meat isn't it, spiritually speaking. This lot are fearless enough, at least where positions in philosophy of mind are concerned. They're just fussy. (Not that waddling along like a porcupine is any great shakes, probably. But it isn't exactly a fear reaction. It's the embodiment of an intellectual strategy.)
Is that a porcupine or a hedgehog, then.
Reckon it depends on whether you're American or Anglo.
I wish this was the conclusion of a review of The Gay Science, but it's just the conclusion of a review of In Kant's Wake: Philosophy in the Twentieth Century:
In the 100-year struggle for a philosophical place in the sun, analytic philosophy simply won out — by the end of the twentieth century it was the dominant and normal style of philosophy pursued in the most prestigious departments of philosophy at the richest and most celebrated universities in the most economically and politically powerful countries in the world. [However] In Kant's Wake shows that there are some serious unresolved issues about the history of twentieth-century philosophy that every serious contemporary philosopher should be seriously interested in.
Always a pleasure to hear from Josh Lukin, here responding to Peli's comment:
Yeh, that's what's interesting about Morrison, for those of us who believe he succeeds at what he sets out to do: his self-reflexive attitude toward trotting out the Nietzsche and the Shelley and the Shakespeare to justify some old costumed claptrap. My clumsy undergraduate piece about that, "Childish Things: Guilt and Nostalgia in the Work of Grant Morrison," showed up in Comics Journal #176 and is cited here with more respect than it deserves.
Looking at comics with a maturity/immaturity axis in mind is great at explaining why Miller's Eighties work is more successful than Watchmen; but it has its limits, not least of which being that we've been down this road before in the superhero stories of Sturgeon, in PKD's (and H. Bruce Franklin's) critique of Heinlein, in Superduperman [find your own damn explanatory link, Ray [anyone who needs an explanatory link to Superduperman probably stopped reading me a long time ago. - RD]], etc. Like David Fiore, I find the Carlyle/Emerson axis (which, come to think of it, has its parallels in Heinlein vs. Sturgeon) to be more fruitful: are we talking fascist superhero stories or Enlightenment superhero stories and, if the former, does the aesthetic appeal of the fascist sublime outweigh the ethical horror?
Copyright to contributed work and quoted correspondence remains with the original authors.
Public domain work remains in the public domain.
All other material: Copyright 2015 Ray Davis.