|. . . 2005-09-20|
A reader writes, referring to something or other I wrote or quoted at the end of 2001:
never a outbreak
never a outbreak
never a break
never a break
And Peli Grietzer points out, regarding portions of My Funny Valentine:
Wow, that's, like, pretty fucking harsh. Really, really pretty fucking harsh. Really pretty damn ha-- ok, that can go on for a while. Do you really stand behind it? Cause really, is pretty fucking harsh.
I should have made clearer that my prettiest fucking harsh words related more to poetry blurbs and reviews than the poetry itself.
In a follow-up message, Peli pointed out an odd disconnect that's occurred in the last few decades between T. S. Eliot's new disapproval from the hoity toity and his continued popularity among adventuresome proles.
To which I had to admit that my own reaction against Eliot (like the negative reactions of his contemporaries) had depended on prior massive approval from (and assumption of) authority. But when Christopher Hitchens pisses on him for personality flaws...? Staggering. (And I don't just mean Hitchens.)
I officially recant. Eliot's OK. Peli says in Israel his "elder statesman figure" is just seen as an "irrelevant appendix, like David Bowie after '81." And hey, I'm no Bowie fan, but I was bopping to "Young Americans" and "Rebel Rebel" just the other day.
We deeply regret any inconvenience.
Many thanks to Morris Jackson for pointing out that my fingers went astray after typing "hoi".
Chris Hitchens certainly shows a certain Ezra Pound-like devotion to his chosen Duce these days, eh? -- RQH
|. . . 2005-10-07|
Variations on a theme by Amardeep Singh
I have always liked Andersen's fairy tale of the Steadfast Tin Soldier. Fundamentally, it is the symbol of my life.
- Thomas Mann to Agnes Meyer
At that moment one of the little boys picked up the soldier and tossed him right into the stove, giving no explanation at all. The troll in the box was most certainly to blame.
The tin soldier stood there, brightly lit, and felt a terrible heat, but whether it was from the actual fire or from love, he didn't know. The paint had worn right off him, but whether this happened on his journey or from sorrow, no one could say.
Every day you see his army march down the street,
In Singh's account, a feminist critic of Toy Story would be pleased that a girl owns toys. A less sanguinely imagined feminist would also note the toys' rigid gender segregation, with girls relegated to support and nagging while character development, plot points, and boffos go to the boys. Another viewer might be nettled by the contrast between a story which merged handmade family toys with imported plastics and a production which contributed to the replacement of hand-drawn original characters with celebrity-voiced 3-D models. Or by the movie's recycling in more concentrated form an earlier era's conformist fantasies, newly trademarking someone else's nostalgia to push "like momma used to buy" security. And leave us let aside those misguided children who for some reason lack access to such lovably life-fulfilling objects....
I believe these reactions to the Toy Story movies are possible since, alongside cheerier reactions, I felt them all myself. And, as with Amardeep's reactions, I think they all suggest stories about criticism. He's struck (or stuck) a rich vein here — as Hans Christian Andersen did when he first made the fairy tale a vehicle for meta-fiction.
* * *
"The Steadfast Tin Soldier" isn't an example of Andersen's meta-fictions. (I've made a long list of them and I just checked: "The Steadfast Tin Soldier" isn't on it.) But as the ur-text of Toy Story 1 and 2, it might have something to offer meta-criticism. Let's see!
This particular tin soldier — "the one who turned out to be remarkable" — is disabled — a birth defect left him only one leg — and immobile. While the other toys gain autonomy and "play" (that is, squabble, jostle, chafe, bully, whine, and put on airs), the tin soldier stays resolutely toylike, moved only by outside forces.
But his immobility has nothing to do with his disability; on the contrary, it's his claim to mastery: No matter what threatens him, no matter who attracts him, no matter how it might benefit him to bend or speak up, he remains "steadfast", silent, at attention — until the end, of course, when we find what stuff he's made of.
The troll-in-the-snuff-box curses the soldier for the fixity of his male gaze, its object an immobile paper ballerina en pointe. Misled by his unvaried point of view, he believes her also one-legged, and therefore a suitable match. He learns his mistake only a moment before one of the children decides to put away childish things with a vengeance.
* * *
I don't know how other folks take the "station" in "PlayStation". I'm a Navy brat, so I assume it refers to a tour of duty -- something you're assigned to live through, pleasant or not.
For me, not; maturing seemed a continuous trading up. (Until I got to backaches and ear hair, anyway.)
But then my version of maturity — like yours — is a bit peculiar.
* * *
Advertising supports and depends on reader identification. This story is your story; this story is brought to you by this product; this product produces your story.
Our story, ours right here, is a story of salvation-through-consumption. No matter how we put it to ourselves, literary readers' status as consumers seems clear enough to publishers and copyright hoarders. What makes us niche consumers is our attachment to kid's stuff — stuff we refuse to throw away despite its blatant obsolescence.
For most non-academics, including a number of English majors I've met, all literature is children's literature. Prepubescents get Gulliver's Travels, adolescents get Moby Dick, and college freshmen might be served an indigestible bit of Henry James. Once normal people have a job, they never again bother with such things until they have children of their own. Even if they patiently crate, uncrate, and re-shelve their T. S. Eliot and Emily Dickinson volumes over the decades, they won't place Amazon orders for A Hundreth Sundrie Flowres or Best American Poetry 2004.
(Which is why "fair use" nowadays tends to get narrowly defined as educational use. No normal adult would want access to a 1930s novel or magazine or song or movie for its own sake.)
In such a world, disputes between proponents of "realistic" and "experimental" fiction seem as absurd as a Federation-outfitted Trekkie snubbing a Dark Shadows fan for his fangs. Grown-ups know the real battles are between the Red Sox and the Yankees or the Christians and Satan, and know the only stories worth reading are True-Life Adventures of themselves. To the vast majority of Americans, all of us here are only marginally distinguishable from the arrested development cases depicted by Chris Ware or Barry Malzberg.
I carry some of their skepticism. It was bred into me, like my bad teeth and whiskey craving. I wince at a poem demanding that this war be stopped right now!, or at a blurb like "You can't spell 'Marxist' without Matrix", or at the ALSC Forum's complaint that community college composition classes stint the Homeric epic, and it's the same wince I made at Ware's "Keeping Occupied" column:
A lonely youth in eastern Nebraska came up with the idea of drawing circuit chips and machine parts on squares of paper and affixing them to his skin with celluloid tape. Hidden beneath his socks and shirt sleeves, these surprising superhuman additions would be just the things he needed to gain respect and awe while changing clothes amongst his peers before gym class.
- Acme Novelty Library. Winter, 1994-1995. Number Four, Volume Three.
|. . . 2005-10-25|
The Transition to Language,
ed. Alison Wray, Oxford, 2002
If DNA analysis has secured the there-that's-settled end of the evolutionary biology spectrum, language origins lie in the ultra-speculative. As a species marker and, frankly, for personal reasons, language holds irresistible interest; unfortunately, spoken language doesn't leave a fossil record, and neither does the soft tissue that emits it. In her introduction, Alison Wray, while making no bones about the obstacles faced by the ethical researcher, suggests we use them as an excuse for a game of Twister.
Advanced Twister. Forget about stationary targets; the few points of consensus among Wray's contributors are negative ones:
The most solid lesson to take away from the book is a sense of possibility. Such as:
I think it was David Hume who defined man as the only animal that shoots Coca-Cola out its nose if you tell it a joke while it's drinking. In all other mammals, the larynx is set high in the throat to block off nasal passages for simultaneous nose-breathing and mouth-swallowing.
The same holds for newborns, which is why they can suckle without pausing for breath. In about three months, our larynx starts moving down our throat and we begin our life of burps and choking. About ten years after that's finished, boys' voices break as their larynxes lower a bit more.
Aside from the comic potential, what we gain from all this is a lot of volume, a freer tongue, and a much wider range of vowel sounds.
Got that? Good, because it's wrong! In the year 2000 Fitch realized that dogs and cats sometimes manage to produce sounds above a whimper. Embarrassingly, living anatomy's more flexible than dead anatomy. When a barking or howling dog lifts its head, its larynx is pulled about as far down its throat as an adult human's, thus allowing that dynamic range the neighbors know so well.
However, humans are unique in having a permanently lower larynx.
Almost. As it turns out, at puberty the males of some species of deer permanently drop their larynxes and start producing intimidating roars as needed.
Why would evolution optimize us for speech before we became dependent on speech? To generalize from the example of deer and teenage boys, maybe the larynx lowered to make men sound bigger and more threatening?
That would explain why chicks dig lead singers. It fails to explain why chicks particularly dig tenors, or why chicks can talk. As has happened before in science, I fear someone's been taking this "mankind" thing a bit too literally. Mercifully, Fitch goes on to point out that in bird species where both sexes are territorial, both sexes develop loud calls, and so there may be a place for the female voice after all.
On a similar note....
We understand how a vocabulary can be built up gradually. But how can syntax?
Having, like the other contributors, rejected genetic programming as an option, Okanoya thinks syntax began as a system of meaning-free sexual display before being repurposed: grammar as melody. The bulk of his article is devoted to the male Bengalese finch, each of whom hones an individualized song sequence over time, listening to its own progress rather than relying on pure instinct or pure mimicry — sorta like how a cooing babbling infant gradually invents Japanese, right? Right?
Drifting further from shore, Okanoya speculates that "singing a complex song may require (1) higher testosterone levels, (2) a greater cognitive load, and (3) more brain space." And a bit further: "Since the ability to dance and sing is an honest indicator of the performer's sexual proficiency, and singing is more effective than dancing for broadcasting...." And as we wave goodbye: "... the semantics of a display message would be ritualistic and not tied into the immediate temporal environment and, hence, more honest than the news-bearing communication that dominates language today."
As an aesthete, I'm charmed. As a skinny whiney guy with a big nose, I'm relieved to learn that Woody Allen really was the sexiest man in the world. And yet why does Mrs. Bush exhibit more coherent syntax than Mr. Bush? Does she really have more testosterone?
Perhaps we could broaden the notion of "sexual display" a bit. In a communal species, wouldn't popularity boost one's chance at survival and reproduction regardless of one's sex?
At any rate, Okanoya's flock of brain-lesioned songbirds should win him a Narbonic Mad Science Fellowship.
A straightforward "everyone said that only human beings can do this but actually monkeys can do it too" piece. In this case, monkeys can learn how to enter a seven-digit PIN on a cash machine which changes all the positions of the buttons every time they use it, except it's a banana-pellet machine and photographs instead of digits. An ominous aside: "It is doubtful, however, that the performance described in this study reflects the upper limit of a monkey's serial capacity."
Towards the end, Terrace refers to recent research on language kind-of acquisition among bonobos. Unlike the common chimps on whom we've wasted so many National Geographic specials, bonobo chimps can learn some ASL and English tokens purely by observing how humans use them. Still, there's no evidence that their use reflects anything more than hope of reward. When it comes to utterly profitless verbiage, humanity still holds the edge!
The candidate's a waffling policy wonk. Unelectable.
Language has words and grammar; communication has expressions. Wray focuses on units of expression which we never consciously break down into units of language, claiming that "a striking proportion" of formulas, idioms, cliches, and Monty Python recitations are manipulative or group-defining signals rather than informative messages.
What we call "communication" among non-human species consists pretty exclusively of such signals, and so it is puzzling that human language doesn't deal with them more directly and efficiently. Wray's solution to the puzzle supposes a protolanguage that was all message, no words: "layoffameeyakarazy", "voulayvoocooshayavekmwasusswar", and so forth.
As any walk through a school cafeteria will remind us, the expressivity available to holistic formulaic language is pretty limited, which (says Wray) is why hominids stayed stuck in a technological rut for a million years. Meanwhile, analytic language developed slowly and erratically as a more or less dispensable, but very useful, supplement to holistic utterances.
Until it, um, became all we had and we were forced to cobble together holistic messages in our current peculiar way.
Thump. On the holistic side, there are tourists' phrasebooks, aphasics who can memorize (but not create) texts, and pundits who quote and name-drop in lieu of comprehension. But it seems problematic to claim that language derives from the holistic. On the contrary, Wray's evidence indicates that, although the need is there, language does a pretty poor job of meeting it.
A prole in a poke. Despite the title and the opening citation from Marx & Engels, Knight's worried about how materialism might have blocked the development of language.
Human children become more linguistically skilled when treated pleasantly by their parents, but other great apes don't show much affection towards their offspring. Similarly, there wouldn't be much reason to learn language in a culture where everyone lied all the time, but a gorilla's most altruistic and cooperative signals tend to be the exclamations it can't repress. Homo nonrepublicanis is the only ape to evolve sincerity.
What caused this awful mishap? Well, Chris Knight has this theory that all of human culture all over the world began when women's genetic material realized that they'd have a better chance to win the Great Game if men couldn't tell when they were menstruating, since the men's genetic material would be inclined to seek out more reproductively active genetic collaborators at such times. Ding-dong, Red Ochre calling! — and the rest is history.
As for language? Hey, didn't you read the part about this explaining "all of human culture"? Isn't language part of human culture? Q.E.D.
[Inclusive as Alison Wray strove to be, some hurt feelings were bound to occur, and as far as preposterous anthropological mythmaking goes, Eric Gans may beat Knight. For one thing, Gans's story would be easier to get on the cover of a science fiction pulp. For another, it emphasizes the inhibitory aspect of non-mimetic representation.
For a third, it deals with a central riddle of language evolution (as opposed to the evolution of language). Some linguistic changes seem reliably unidirectional. For example, highly inflected languages are harder to learn than subject-verb-object ordered languages; when cross-cultural contact (or cultural catastrophe) occurs, languages downgrade inflection in favor of word order; and there are no known examples of an order-based language evolving more reliance on inflection.
So where did those inflected languages come from? An even more inflected, difficult, and unwieldy language? That doesn't sound like a very practical invention.
Gans has a simple fix: Language wasn't meant to be practical. Luther and Tyndale shouldn't have gotten so exercised over Greek New Testaments and Latin Masses; incomprehension's the original sacred point.
Not that I believe any of this. I just think it's cool. Jock-a-mo fee-nah-nay.]
All of this suggests (or at least doesn't disprove) that "language" could've evolved gesturally long before it became vocal. Once audible intentional vocalizing was biologically possible, there'd be good reasons to switch: yelling would cover a wider distance; semantic tokens would be more stable; it would allow conversation during tool manufacture and use.
And as proven by infants and tourists, it's possible to add vocalization gradually to gesturally based communication, avoiding that awkward "everything at once or nothing at all" scenario.
Thumbs up, as they say.
A glum warning against reading too much into much-too-selected evidence. In this case, the too much is complex planning that would require language's help, and the much-too-selected are so-called "hand-axes" which might, from raw statistical evidence, be accidental by-products rather than intentional products of an industry.
Bickerton wants to get back to the real reason for human communication: better food and plenty of it. As a student of menu French and Italian, I'm in no position to argue.
Actually, he's pretty mild-mannered about it. Elsewhere, he's guessed that syntax is rooted in reciprocal altruism. But since there's no evidence that hominids dealt with any more social complexity than other primates, he doesn't believe social conditions alone could've triggered a change as drastic as predicated language.
The conditions which did radically distinguish our ancestors from their primate relatives were environmental. Instead of living large in the forest, hominids roamed savannahs full of predators and fellow scavengers, and did so successfully enough to expand out of Africa. Also, unlike the socially-focused great apes, humans are capable of observing and drawing conclusions from their surroundings. (To put it in contemporary terms: Driving = environmental interaction; road rage = social interaction.) Any ability to observe and then to reference would be of immediate use to a foraging and scavenging species. Predication might develop from a toddler-like combination of noun and gesture ("Mammoth thisaway"), and lies would be easily detected and relatively profitless.
It's unlikely that human language's primal goal was to accurately communicate an arbitrary two-digit number. At that level of abstraction, about all computer simulations can do is disprove allegations that something's impossible. So, ignoring the metaphors, this paper shows it's possible to improve communication of two-digit numbers across generations of weighted networks without benefit of Prometheus.
Bringing the metaphors back in, they report that smaller populations and an initially restricted but growing number of inputs are helpful when establishing a stable "language", and point out that "because many sensory capabilities are not available at birth, the child learns its initial categorizations in what is effectively a simplified perceptual environment." (We'll come back to this in a bit.)
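Not having the paper's actual weighted networks at hand, here's a purely illustrative toy of the iterated-learning idea in Python -- my own crude codebook learner swapped in for their networks, so take it as a sketch of the principle rather than their model. Each "generation" sees only a limited sample of its teacher's number-to-signal pairs; compositional (digit-by-digit) structure tends to squeeze through the bottleneck while arbitrary one-off labels wash out.

```python
import random

LETTERS = "abcdefgh"

def holistic_lexicon():
    # generation zero: an arbitrary two-letter signal for each number
    return {n: "".join(random.choice(LETTERS) for _ in range(2))
            for n in range(100)}

def learn(observed):
    # the learner induces a per-digit codebook from what it saw,
    # keeps observed pairs verbatim, and fills in unseen numbers
    # compositionally (tens letter + units letter)
    tens, units = {}, {}
    for n, sig in observed.items():
        tens.setdefault(n // 10, sig[0])
        units.setdefault(n % 10, sig[1])
    lexicon = dict(observed)
    for n in range(100):
        if n not in lexicon:
            lexicon[n] = (tens.get(n // 10, random.choice(LETTERS))
                          + units.get(n % 10, random.choice(LETTERS)))
    return lexicon

def compositionality(lexicon):
    # average agreement on the first letter within each tens-row
    score = 0.0
    for t in range(10):
        row = [lexicon[10 * t + u][0] for u in range(10)]
        score += max(row.count(c) for c in set(row)) / 10
    return score / 10

def iterate(generations=25, bottleneck=30):
    # transmit the lexicon through a chain of learners, each of whom
    # observes only `bottleneck` of the 100 number-signal pairs
    lexicon = holistic_lexicon()
    for _ in range(generations):
        sample = dict(random.sample(sorted(lexicon.items()), bottleneck))
        lexicon = learn(sample)
    return lexicon
```

Run it a few times and the initially random lexicon drifts toward a regular digit-by-digit code, with no Prometheus in sight -- which is about all such simulations can prove, as noted above.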
More computer simulation; worse anthropomorphizing. Sponsorship by the Sony Corporation might have something to do with that, and with inviting the public to interfere through a web page and at various museums. The number of breakdowns introduced by this complexity is left vague, but the project seems to have earned a Lupin Madblood Award for Ludicrously Counterproductive Publicity Stunts.
Too bad, because it's a great idea. Instead of modeling perception and language evolution separately, the project combines the two with gesture in an "I Spy" guessing game. Two weighted network simulations have access to visual data through a local video camera, have a way to "point" at particular objects (by panning and zooming), and can exchange messages and corrections to each other. The researchers monitor.
With the usual caveat about how far analogies should be carried, some of the results are enjoyably suggestive. A global view isn't needed to establish a shared vocabulary. Communication can be successful even with slightly varying interpretations and near synonyms. Again, it helps if the initial groups are fairly small, and if the complexity of the inputs increases over time.
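For flavor, the bare bones of that negotiation can be faked in a few lines. What follows is the minimal "naming game" of the agent-based-simulation literature, not Sony's actual grounded setup -- no cameras, no pointing, objects reduced to integers -- but it shows the headline result: agents coin and swap names until, with no global view at all, consensus emerges.

```python
import random
from collections import Counter

SYLLABLES = "ptkaiu"

def invent_name():
    # coin a random four-letter word
    return "".join(random.choice(SYLLABLES) for _ in range(4))

def naming_game(n_agents=10, n_objects=5, rounds=5000):
    # each agent keeps a set of candidate names per object
    agents = [{o: set() for o in range(n_objects)} for _ in range(n_agents)]
    for _ in range(rounds):
        speaker, hearer = random.sample(agents, 2)
        obj = random.randrange(n_objects)
        if not speaker[obj]:
            speaker[obj].add(invent_name())   # nothing to say: coin a word
        name = random.choice(sorted(speaker[obj]))
        if name in hearer[obj]:
            # success: both drop their synonyms and keep the winner
            speaker[obj] = {name}
            hearer[obj] = {name}
        else:
            hearer[obj].add(name)             # failure: hearer takes note
    return agents

def consensus(agents, obj):
    # share of agents whose candidate set leads with the commonest name
    names = [min(a[obj]) for a in agents if a[obj]]
    return Counter(names).most_common(1)[0][1] / len(agents)
```

Small populations converge fast; crank up the number of agents or objects and the chatter of competing synonyms lasts a good deal longer, which matches the "start small" moral above.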
Between Creole formation, sign languages, and computer simulations, we now have a few examples of language evolution to look at. Could it be that grammar isn't genetically programmed? Could it be that social conditions play a part in the development of syntactical language!?
Well — yes. But Ragir's attack on genetic programming is kind of a MacGuffin anyway — a good excuse to cover some interesting ground.
Ragir compares nine sign languages, and, where possible, their histories and the circumstances of their users-and-originators. That (limited) evidence shows it's possible for a context-dependent quasi-pidgin to persist for some generations. Grammaticalization and anti-semantic streamlining of illustrative gestures seem to happen gradually rather than catastrophically. They're introduced by children rather than adults, and only when peer contact is encouraged. The emergence of syntax is socially sensitive.
Returning to her MacGuffin, Ragir proposes:
that we consider 'language-readiness' as a function of an enlarged brain and a prolonged learning-sensitive period rather than a language-specific bioprogram. In other words, as soon as human memory and processing reached a still unknown minimum capacity, indigenous languages formed in every hominine community over a historic rather than an evolutionary timescale. As a result of species-wide delays in developmental timing, a language-ready brain was probably ubiquitous in Homo at least as early as half a million years ago. [...] As for what triggered the increase in brain size that supports language-readiness...
Here's where we come back to that thing I said we'd come back to. A human newborn is in pretty bad shape compared to the newborns of a lot of other species, and stays in pretty bad shape for a pretty long time. As Nature vs. Nurture combatants seem unable to get through their now-hardened skulls, this lets human infants and children undergo more physical — and specifically neurological — transformation while immersed in a social context.
Although that can be entertaining, maintenance is an issue. And in a savannah environment, dependent on wandering and surrounded by predators, maternity or paternity leaves would be hard to procure. It's nice that our plasticity encourages language, but what would've encouraged our plasticity?
Definitionally, hominids are featherless bipeds. But, as some readers will vividly recall, bipedalism raises a difficult structural engineering problem: If you're going to walk on two legs, there's a limit to how wide your hips can get; narrow hips limit what you can give birth to. Mother Nature's endearingly half-assed solution was to make what we give birth to more compressible.
And since the kids were going to be useless anyway, they might as well be smart.
An attack on the all-or-nothing idea of syntax which so exercised Chapter 3. While we're growing up, syntax development isn't catastrophic, and Burling says it's even more gradual than it looks. Infants comprehend some syntactic clues long before they can reproduce them. And command of syntactical rules continues to grow long after children are reading and writing recognizable sentences. (Hell, sometimes I'm still faking it.) So why think it had to be all-or-nothing species wide?
Another oppositional piece. Did language develop purely from primate calls? Or purely as a representation of our own mental activity as sum fule say? Or purely as an excuse for alliteration?
I threw in that last choice myself, but you see the problem. The options aren't exclusive, and introspection doesn't yield universally applicable results. For example, it may be true that "devices such as phonology and much of morphology" "make no contribution to reasoning" as experienced by Pinker and Bloom and Hurford, but they surely do to mine.
Still, not a bad resource when you're bored by the usual arguments against Sapir-Whorf: If our thinking was determined by language, we'd all be completely batshit.
So, have you heard that Universal Grammar might not be genetically programmed?
Although the impact of their dissent's weakened by its placement, Christiansen and Ellefson do well with the set-up:
Whereas Danish and Hindi needed less than 5,000 years to evolve from a common hypothesized proto-Indo-European ancestor into very different languages, it took our remote ancestors approximately 100,000-200,000 years to evolve from the archaic form of Homo sapiens into the anatomically modern form, sometimes termed Homo sapiens sapiens. Consequently, it seems more plausible that the languages of the world have been closely tailored through linguistic adaptation to fit human learning, rather than the other way around. The fact that children are so successful at language learning is therefore best explained as a product of natural selection of linguistic structures, and not as the adaptation of biological structures, such as UG.
Their eclectic research is held together by one common ingredient: learning an "artificial language" with no semantics outside its visual symbols. This reduces "language" to the ability to pick up and remember an arbitrary rule behind sequences. Admittedly, that's not much of what language does, but it includes some of what we call grammar.
Strengthening the association, in a clinical study, agrammatic aphasics did no better than chance in absorbing the rules behind the sequences. And brain-imaging studies have found similar reactions to grammatical errors, game rule violations, and unexpected chords in music.
Next, Ellefson and Christiansen look at a couple of common grammatical tendencies: putting topic words at the beginning or end rather than the middle of a phrase, for example, or structuring long sequences of clauses in orderly clumps. In both cases, we've picked patterns that reduce the cognitive load. Artificial grammars which followed these rules were learned more easily than ones which didn't, both by human subjects and by computer simulations.
Newmeyer begins by agreeing with the general consensus that you can't tell much about a culture from its language. That doesn't mean there are no major differences between languages, though. Or that there weren't even more drastic differences between prehistoric languages and the languages we know. Or that we really know anything about the prehistoric cultures themselves. Or when language started. Or how often. Or the physical capabilities of the speakers.
In fact, we have no facts. We're fucked.
Heine and Kuteva soldier on, trying to boil down a fairly reliable set of rules for language change and deduce backwards from them. Each visible grammatical element in turn is shown (in the examples they choose) to be derivable from some earlier concrete noun or action verb. The basic principle should be familiar from ethno-etymologically crazed types like Ezra Pound: the full weighty penny of meaning slowly worn down by calloused palms into a featureless devalued token....
Fitting the general tone of the book, though, they close with a warning that their approach is based on vocabulary rather than syntax, and so, even assuming one-way movement away from "a language" consisting only of markers for physical entities and events, we still can't say much about how they might have been put together.
And so, in conclusion, say anything.
Copyright to contributed work and quoted correspondence remains with the original authors.
Public domain work remains in the public domain.
All other material: Copyright 2005 Ray Davis.