Gordon Tullock, RIP

By Kindred Winecoff

He was not my favorite economist, but there is no question that he had a strong mind that was consistently capable of locating puzzles which had escaped the attention of others. My favorite, perhaps, is his observation that, given how much is at stake, it is very surprising that there is so little money in politics. Spending even $1 billion on a presidential campaign is very little compared to the amount of influence a president has over a $15 trillion economy. (The most up-to-date explanation for this is that spending on politics is mostly a consumption good, not rent-seeking.) On another occasion Tullock argued that if we really wanted to improve automobile safety we should replace all airbags with an 8-inch ice pick that would ram into the driver’s chest in a crash. I know I’d drive more slowly and carefully under such conditions.

The fact that he died on Election Day is appropriate, or perhaps ironic. Tullock was an outspoken critic of voting on instrumental grounds — voting incurs costs while the probability of affecting the outcome is minuscule, so the act of voting generates negative utility in expectation — and he extended the logic to revolutions. He had many interesting ideas, although whether they amount to a consistent philosophy or politics is debatable.
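A standard way to formalize this "calculus of voting" (the notation here is the textbook convention, not necessarily Tullock's own) is that a citizen should vote only when

$$ p \cdot B - C > 0, $$

where $B$ is the value the voter places on her preferred candidate winning, $C$ is the cost of voting, and $p$ is the probability that her single ballot decides the election. With $p$ on the order of one in many millions, the expression is negative for any plausible $B$.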

14 Reasons Susan Sontag Invented Buzzfeed!

By Seth Studer

If you’re looking for a progenitor of our list-infested social media, you could do worse than return to one of the most prominent and self-conscious public intellectuals of the last half century. The Los Angeles Review of Books just published an excellent article by Jeremy Schmidt and Jacquelyn Ardam on Susan Sontag’s private hard drives, the contents of which have recently been analyzed and archived by UCLA. Nude photos have yet to circulate through shadowy digital networks (probably because Sontag herself made them readily available – Google Image, if you like), and most of the hard drives’ content is pretty mundane. But is that going to stop humanists from drawing broad socio-cultural conclusions from it?

Is the Pope Catholic?

Did Susan Sontag shop at Sephora?

Sontag, whose work is too accessible and whose analyses are too wide-ranging for serious theory-heads, has enjoyed a renaissance since her death, not as a critic but as an historical figure. She’s one of those authors now, like Marshall McLuhan or Norman Mailer: a one-time cultural institution become a primary text. A period marker. You don’t take them seriously, but you take the fact of them seriously.

Sontag was also notable for her liberal use of lists in her essays.

“The archive,” meanwhile, has been an obsession in the humanities since Foucault arrived on these shores in the eighties, but in the new millennium, this obsession has turned far more empirical, more attuned to materiality, minutiae, ephemera, and marginalia. The frequently invoked but still inchoate field of “digital humanities” was founded in part to describe the work of digitizing all this…stuff. Hard drives are making this work all the more interesting, because they arrive in the archive pre-digitized. Schmidt and Ardam write:

All archival labor negotiates the twin responsibilities of preservation and access. The UCLA archivists hope to provide researchers with an opportunity to encounter the old-school, non-digital portion of the Sontag collection in something close to its original order and form, but while processing that collection they remove paper clips (problem: rust) and rubber bands (problems: degradation, stickiness, stains) from Sontag’s stacks of papers, and add triangular plastic clips, manila folders, storage boxes, and metadata. They know that “original order” is something of a fantasy: in archival theory, that phrase generally signifies the state of the collection at the moment of donation, but that state itself is often open to interpretation.

Microsoft Word docs, emails, jpegs, and MP3s add a whole slew of new decisions to this delicate balancing act. The archivist must wrangle these sorts of files into usable formats by addressing problems of outdated hardware and software, proliferating versions of documents, and the ease with which such files change and update on their own. A key tool in the War on Flux sounds a bit like a comic-book villain: Deep Freeze. Through a combination of hardware and software interventions, the Deep Freeze program preserves (at the binary level of 0’s and 1’s) a particular “desired configuration” in order to maintain the authenticity and preservation of data.

Coincidentally, I spent much of this morning delving into my own hard drive, which contains documents from five previous hard drives, stored in folders titled “Old Stuff” which themselves contain more folders from older hard drives, also titled “Old Stuff.” The “stuff” is poorly organized: drafts of dissertation chapters, half-written essays, photos, untold numbers of .jpgs from the Internet that, for reasons usually obscure now, prompted me to click “Save Image As….” Apparently Sontag’s hard drives were much the same. But Deep Freeze managed to edit the chaos down to a single IBM laptop, available for perusal by scholars and Sontag junkies. Schmidt and Ardam reflect on the end product:

Sontag is — serendipitously, it seems — an ideal subject for exploring the new horizon of the born-digital archive, for the tension between preservation and flux that the electronic archive renders visible is anticipated in Sontag’s own writing. Any Sontag lover knows that the author was an inveterate list-maker. Her journals…are filled with lists, her best-known essay, “Notes on ‘Camp’” (1964), takes the form of a list, and now we know that her computer was filled with lists as well: of movies to see, chores to do, books to re-read. In 1967, the young Sontag explains what she calls her “compulsion to make lists” in her diary. She writes that by making lists, “I perceive value, I confer value, I create value, I even create — or guarantee — existence.”

As reviewers are fond of noting, the list emerges from Sontag’s diaries as the author’s signature form. … The result of her “compulsion” not just to inventory but to reduce the world to a collection of scrutable parts, the list, Sontag’s archive makes clear, is always unstable, always ready to be added to or subtracted from. The list is a form of flux.

The lists that populate Sontag’s digital archive range from the short to the wonderfully massive. In one, Sontag — always the connoisseur — lists not her favorite drinks, but the “best” ones. The best dry white wines, the best tequilas. (She includes a note that Patrón is pronounced “with a long o.”) More tantalizing is a folder labeled “Word Hoard,” which contains three long lists of single words with occasional annotations. “Adjectives” is 162 pages, “Nouns” is 54 pages, and “Verbs” is 31 pages. Here, Sontag would seem to be a connoisseur of language. But are these words to use in her writing? Words not to use? Fun words? Bad words? New words? What do “rufous,” “rubbery,” “ineluctable,” “horny,” “hoydenish,” and “zany” have in common, other than that they populate her 162-page list of adjectives? … [T]he Sontag laptop is filled with lists of movies in the form of similar but not identical documents with labels such as “150 Films,” “200 Films,” and “250 Films.” The titles are not quite accurate. “150 Films” contains only 110 entries, while “250 Films” is a list of 209. It appears that Sontag added to, deleted from, rearranged, and saved these lists under different titles over the course of a decade.

“Faced with multiple copies of similar lists,” continue Schmidt and Ardam, “we’re tempted to read meaning into their differences: why does Sontag keep changing the place of Godard’s Passion? How should we read the mitosis of ‘250 Films’ into subcategories (films by nationality, films of ‘moral transformation’)? We know that Sontag was a cinephile; what if anything do these ever-proliferating Word documents tell us about her that we didn’t already know?” The last question hits a nerve for both academic humanists and the culture at large (Sontag’s dual audiences).

Through much of the past 15 years, literary scholarship could feel like stamp collecting. For a while, the field of Victorian literary studies resembled the tinkering, amateurish, bric-a-brac style of Victorian culture itself, a new bit of allegedly consequential ephemera in every issue of every journal. Pre-digitized archives offer a new twist on this material. Schmidt and Ardam: “The born-digital archive asks us to interpret not smudges and cross-outs but many, many copies of almost-the-same-thing.” This type of scholarship provides a strong empirical base for broader claims (the kind Sontag favored), but the base threatens to support only a single, towering column, ornate but structurally superfluous. Even good humanist scholarship – the gold standard in my own field remains Mark McGurl’s 2009 The Program Era – can begin to feel like an Apollonian gasket: it contains elaborate intellectual gyrations but never quite extends beyond its own circle. (This did not happen in Victorian studies, by the way; as usual, they remain at the methodological cutting edge of literary studies, pioneering cross-disciplinary approaches to reading, reviving and revising the best of old theories.) My least favorite sentence in any literary study is the one in which the author disclaims generalizability and discourages attaching any broader significance or application to the study. This is one reason why literary theory courses not only offer no stable definition of “literature” (as the E.O. Wilsons of the world would have us do), they frequently fail to introduce students to the many tentative or working definitions from the long history of literary criticism. (We should at least offer our students a list!)

In short, when faced with the question, “What do we do with all this…stuff?” or “What’s the point of all this?”, literary scholars all too often have little to say. It’s not a lack of consensus; it’s an actual lack of answers. Increasingly, and encouragingly, one hears that a broader application of the empiricist tendency is the next horizon in literary studies. (How such an application will fit into the increasingly narrow scope of the American university is an altogether different and more vexing problem.)

Sontag’s obsession with lists resonates more directly with the culture at large. The Onion’s spin-off site ClickHole is the apotheosis of post-Facebook Internet culture. Its genius is not for parody but for distillation. The authors at ClickHole strip the substance of clickbait – attention-grabbing headlines, taxonomic quizzes, and endless lists – down to the bone of its essential logic. This logic is twofold. All effective clickbait relies on the narcissism of the reader to bait the hook and on banal summaries of basic truths once the catch is secure. The structure of “8 Ways Your Life Is Like Harry Potter” would differ little from “8 Ways Your Life Isn’t Like Harry Potter.” A list, like a personality quiz, is especially effective as clickbait because it condenses a complex but recognizable reality into an index of accessible particularities. “Sontag’s lists are both summary and sprawl,” write Schmidt and Ardam, and much the same could be said of the lists endlessly churned out by Buzzfeed, which constitute both a structure of knowledge and a style of knowing to which Sontag herself made significant contributions. Her best writing offered the content of scholarly discourse in a structure and style that not only eschewed the conventions of academic prose, but encouraged reading practices in which readers actively organize, index, and codify their experience – or even their identity – vis-à-vis whatever the topic may be. Such is the power of lists. This power precedes Sontag, of course. But she was a master practitioner and aware of the list’s potential in the new century, when reading practices would become increasingly democratic and participatory (and accrue all the pitfalls and dangers of democracy and participation). If you don’t think Buzzfeed is aware of that, you aren’t giving them enough credit.

Defend Principles, Not People

By Kindred Winecoff

Steve Salaita resigned from a permanent position at Virginia Tech to accept an appointment at the University of Illinois. Then, this week, his offer from Illinois was rescinded. Neither the university nor Salaita has divulged the reason for this reversal, but Inside Higher Ed has speculated:

The sources familiar with the university’s decision say that concern grew over the tone of his comments on Twitter about Israel’s policies in Gaza. While many academics at Illinois and elsewhere are deeply critical of Israel, Salaita’s tweets have struck some as crossing a line into uncivil behavior.

For instance, there is this tweet: “At this point, if Netanyahu appeared on TV with a necklace made from the teeth of Palestinian children, would anybody be surprised? #Gaza.” Or this one: “By eagerly conflating Jewishness and Israel, Zionists are partly responsible when people say antisemitic shit in response to Israeli terror.” Or this one: “Zionists, take responsibility: if your dream of an ethnocratic Israel is worth the murder of children, just fucking own it already.”

Salaita’s appointment to Illinois was thus apparently canceled because of the opinions he expressed concerning a current event. There are questions about whether this constitutes a violation of academic freedom — in the article linked above a professor at U of I suggests that protection is limited to academic work — but I am on record defending academics in similar cases from institutional reprisals. In other words, I think Salaita’s appointment should have gone through. The fact that I disagree vehemently with some of his expressed views, particularly the allegations of genocide and statements that Israel has earned any anti-Semitism it experiences, is immaterial. If the university was concerned with Salaita’s opinions, which were not a secret before this week, then they should not have given him the offer in the first place. Given that they did, and he gave up his previous (tenured) employment to accept this offer in good faith, it absolutely should be honored.

But I wonder on what principle folks like Corey Robin can object. Robin strongly protested the appointment of General David Petraeus to a temporary teaching position at his own university. While the content of Robin’s protests was primarily about the fiscal cost of hiring Petraeus, given Robin’s advocacy efforts (and the fact that he uttered not one word of disappointment when Paul Krugman was later given a permanent position at a higher salary for less work than Petraeus), it is difficult to believe that ideology wasn’t a part of it. Robin is also a visible proponent of the BDS movement and supported the American Studies Association’s proposed boycott of all Israeli academic institutions. It appears, in other words, that Robin believes in politicizing academic hiring and promotion decisions except when he does not, and that there is a perfect correlation between his political attitudes and his attitudes on such decisions.

This is the abrogation of principle. If the university is to be politicized, and Salaita seems to believe it should not be, then it is disingenuous to express outrage at a political outcome. The fact that university administrations appear to be reflexively pro-Israel is why actions such as those taken by the ASA are counter-productive, and why a defense of principles is so important.

Against Neil deGrasse Tyson: a Longer Polemic

By Seth Studer

In her recent Atlantic review of two new books on atheism, Emma Green brilliantly demarcates what is missing from the now decade-long insurgency of anti-ideological atheism. I use the term “anti-ideological atheism” instead of “neo-atheism” or “new atheism” or the obnoxious, self-applied moniker “noes” because opposition to ideology – to ideational constructions – is one of the major recurring threads among these varied atheist identities (a frightening mixture of elitism and populism is another). Green illustrates this point when she notes the incongruity between Peter Watson’s new history of post-Enlightenment atheism, Age of Atheists, and the kind of atheism most vocally espoused in the 21st century. The central figure in Watson’s study, Friedrich Nietzsche, is almost never cited by Richard Dawkins or Samuel Harris or Neil deGrasse Tyson. Nor, for that matter, are Nietzsche’s atheistic precursors or his atheistic descendants…all diverse in thought, all of whom would have been essential reading for any atheist prior to, well, now.

The most famous atheist, the one whose most famous quote – GOD IS DEAD – you scrawled with a Sharpie on the inside door of your junior high locker, is almost persona non grata among our most prominent living atheists. His near-contemporary, Charles Darwin (hardly anyone’s idea of a model atheist), is the belle of the bellicose non-believer’s ball.

Green also notes that the other famous 19th century atheist – Karl Marx, whose account of religious belief vis-à-vis human consciousness is still convincing, at least more so than Nietzsche’s – goes likewise uncited by our popular atheists. The reason may be simple: invocations of Marx don’t score popularity points anymore, and the business of anti-ideological atheism is nothing if not a business.

But there is, I believe, a larger reason for the absence of Nietzsche, Marx, and almost all other important atheists from today’s anti-ideological atheism. As fellow Jilter Graham Peterson recently said to me, these popular atheists need a dose of humanities: liberal inquiry and a sense that truth is hard, not dispensable in easy little bits like Pez candies. I would expand on that: they need a more dynamic discursivity, they need more contentiousness, they need more classical, humanist-style debate. They need the kind of thinking that frequently accompanies or produces ideology.

But of course, most of them don’t want that. They resist Nietzsche’s ideological critiques. They resist Marx, who, despite his inherent materialism, is more systematically ideological than, say, Darwin. Sigmund Freud (who dedicated an entire tract to atheism and who is central to its 20th century development) is never mentioned, along with a host of other names.

And they do not invite new critiques – except, apparently, from Young Earth Creationists.

The title of Green’s review is pitch perfect: “The Intellectual Snobbery of Conspicuous Atheism: Beyond the argument that faith in God is irrational—and therefore illegitimate.” Contrary to what Richard Dawkins and others might claim, atheists are not a persecuted minority in the West (any group consisting mostly of white men is always eager to squeeze and contort its way into “persecuted minority” status, even as persecuted minorities struggle to push out). Anti-ideological atheism is declared conspicuously, as a badge of honor and a sign of intellect. Green quotes Adam Gopnik, who introduces the nauseating term “noes”:

What the noes, whatever their numbers, really have now … is a monopoly on legitimate forms of knowledge about the natural world. They have this monopoly for the same reason that computer manufacturers have an edge over crystal-ball makers: The advantages of having an actual explanation of things and processes are self-evident.

In this respect, the “noes” have “an actual explanation of things” in greater abundance than did Nietzsche or Marx or (especially) the atheists of antiquity. In this respect, the atheists of yore and religious believers have more in common with each other than with the “noes” of today.

In my last post, I shared my thoughts about the meteoric rise of Neil deGrasse Tyson (do meteors rise? I’m sure deGrasse Tyson would have something to say about that bit of rhetorical infactitude). It may seem unfair to pick on deGrasse Tyson when, in reality, I’m bemoaning a phenomenon that began back when George W. Bush used vaguely messianic-Methodist language to frame the invasion of Iraq, an event that, whatever you think of its initial rationalizations, was poorly executed, quickly turned to shit, and set the “War on Terror” back at least a decade. In/around 2004, Richard Dawkins (who is still the author of the best popular overview of natural history ever written) realized that conditions existed for a profitable career shift.

Widespread discontent with politico-religious language in the United States – where right-wing militarists decried the brand of fundamentalist Islam that obliterated lower Manhattan and anti-war leftists decried the (pacifist-by-comparison) brand of fundamentalist Christianity that influenced U.S. policy – coincided with fear of religious extremism in Europe, where the vexed term “Islamophobia” retained some usefulness: legitimate anxieties about theocratic terrorism (e.g., violent anti-Western responses to the deliberately provocative Mohammad cartoons and then the public slaughter of Theo van Gogh) mingled with old-fashioned European xenophobia, which was never a perfect analogue to American xenophobia. And between the U.S. and Europe lies England, where political and public responses to Islamic terrorism less often involved blustery American gun-slinging or shrill continental nativism but rather stern appeals to “common sense.” Since the collapse of British colonialism, intellectuals in England are less apt to use the term civilization than are their cousins across the Channel or their cousins across the Pond (where the term has been historically deployed by culture warriors, à la Allan Bloom, in order to give anti-colonial leftists the willies).

The term civilized, on the other hand, is still relevant in English public discourse: not with regard to other societies, but to English society. The concept of civilized discourse (or civilised, if you will) doesn’t seem to carry the same ideological freight as civilization. But when Dawkins mocks post-positivist socio-humanist* analyses of, say, indigenous Amazonian cultures that explain natural phenomena (e.g., how the jaguar got its spots) with traditional tales, his arguments carry the epistemological heft of a suburban Thatcherite scanning his daughter’s contemporary philosophy textbook, throwing his hands in the air, and exclaiming “Oh come on!” In other words, Dawkins belongs to the long line of British “common sense” thinkers. Born in Kenya, raised in Africa, and a fan of Kipling, Dawkins has been criticized for possessing a colonial bent to his thought.

And there’s something to be said for common sense, even common sense colonialism; George Orwell, of all people, joined Rudyard Kipling (one of the most misunderstood writers in the English canon) in defending British colonialism in India on the reasonable (if depressing) grounds that, had the English let India be, the Russians would have colonized the subcontinent. This hardly excuses British crimes against India and its people, but even a cursory overview of Russian colonial atrocities forces one to sigh a very troubled and uncomfortable sigh of – what, relief? – that the British Raj was the guilty party.

Richard Dawkins

But common sense is not fact, much less knowledge, and Dawkins has made a career of playing fast and loose with these concepts. In Unweaving the Rainbow (1998), Dawkins defended science not against the pious but against the epistemological excesses of cultural studies. In one chapter, he wrote that an Amazonian tribesman who is convinced that airplanes are fueled by magic (Dawkins’ examples often play off colonial tropes) and the socio-humanist (usually an American cultural studies professor or graduate student in English whose dress and hygiene are dubious and who writes with incomprehensible jargon) who respects the Amazonian’s conviction are both reprehensible, especially the professor, who is an enabler: he could give the ignorant native a cursory lesson in physics, but instead paints a scholarly veneer over so much tribal mumbo-jumbo. Why not explain the real source of wonder and disabuse the native of his false notions: that beautiful physics can explain how people fly!

Despite its best efforts, Unweaving the Rainbow was Dawkins’ first foray into the “Debbie Downer” genre of popular science writing. This genre pits the explanatory power of “scientific knowledge” (more about that term in a moment) against religion, superstition, homeopathy, most of Western philosophy, and pretty much any knowledge acquired by non-quantitative methods or left unverified by quantitative ones.

The “Debbie Downer” genre can be useful, especially when turned on the practice of science itself: Dawkins and his allies have successfully debunked the dogmatism that led Stephen Jay Gould’s career astray. The atrocities of Nazi and Soviet science were exposed and explained with both rigorous science and common sense. The genre can also be used  to wildly miss the point of things. I have friends who are ardent Calvinists or ex-Calvinists, who are incapable of reading Paul’s epistles without a Calvinist interpretation. They read Paul, but all they see is Calvinism. Likewise with fundamentalists and anti-ideological atheists who read Genesis but only see cosmology. Yet Paul was not a Calvinist, and Genesis is not cosmology. In some sense, the same principle applies to deGrasse Tyson and Gravity. Is this a question of knowing too much or thinking too little?  

In Unweaving the Rainbow, Dawkins confronts the charge that science takes all the fun and beauty out of the world just by, y’know, ‘splainin’ it. Somewhat comically, the book’s title literalizes an instance of poetic language, a practice common among Dawkins’ bêtes noires: religious fundamentalists. John Keats’ playful exasperation that “charms fly/ at the touch of cold philosophy” and that the natural sciences (still embryonic in Keats’ time) “unweave the rainbow,” reducing it to “the dull catalogue of common things,” is a beautifully articulated representation of a well-worn human experience, one that requires appreciation more than rebuttal. But for Dawkins, the poem demands rebuttal, and not a rebuttal that distinguishes between the uses and functions of poetic language. Unweaving the Rainbow is a treatise insisting that, dammit, science makes the world more beautiful, not less.

And Dawkins is correct. After reading his marvelous Ancestor’s Tale, I felt a profound kinship with every toad I encountered on the sidewalk and every grasshopper that attached itself to my arm, six cousinly feet twisting my skin uncomfortably. Between Unweaving the Rainbow and Ancestor’s Tale, Dawkins wrote A Devil’s Chaplain, a haphazardly organized collection of Debbie Downer essays that is probably best understood as the direct ancestor of Dawkins’ most successful book, The God Delusion. The book represented a specific cultural moment, described above, when everyone was eager to read why God sucked. I don’t need to rehearse the narrative or the players (something about four horsemen, cognitive, an obnoxious and inappropriate use of the prefix “neo”). Even The God Delusion’s harshest critics praised Dawkins for capturing the zeitgeist in a bottle. But the most prominent and widely-cited negative review, by Marxist literary theorist Terry Eagleton, did not. Eagleton captured Dawkins, his personality and his project, to near perfection in the London Review of Books:

[Dawkins’ views] are not just the views of an enraged atheist. They are the opinions of a readily identifiable kind of English middle-class liberal rationalist. Reading Dawkins, who occasionally writes as though ‘Thou still unravish’d bride of quietness’ is a mighty funny way to describe a Grecian urn, one can be reasonably certain that he would not be Europe’s greatest enthusiast for Foucault, psychoanalysis, agitprop, Dadaism, anarchism or separatist feminism. All of these phenomena, one imagines, would be as distasteful to his brisk, bloodless rationality as the virgin birth. Yet one can of course be an atheist and a fervent fan of them all. His God-hating, then, is by no means simply the view of a scientist admirably cleansed of prejudice. It belongs to a specific cultural context. One would not expect to muster many votes for either anarchism or the virgin birth in North Oxford. (I should point out that I use the term North Oxford in an ideological rather than geographical sense. Dawkins may be relieved to know that I don’t actually know where he lives.)

Terry Eagleton

Eagleton’s Marxist ad hominem is amusing: he reduces Dawkins’ own self-proclaimed materialism to his class. Dawkins is a very, very identifiable type. I’m not sure whether Eagleton knew, when he quoted Keats, that Dawkins had written a book whose title misread – or at least misappropriated – the most flowery of Romantic poets.

Eagleton’s more substantial complaint – that there are many kinds of atheists, not all of whom derive their views from a fetishized notion of the natural sciences’ explanatory powers – was echoed in many other reviews. It was even the basis for a two-part episode of South Park.

Another common complaint: The God Delusion engaged with religious faith very narrowly, responding to only the most extreme fundamentalist interpretations of scripture and dogma. Dawkins hadn’t boned up on his Tillich. He’s a scientist stumbling clumsily through the humanities, unaware that his most basic criticisms of faith have been taken seriously by religious people since the Middle Ages. Again, Eagleton:

What, one wonders, are Dawkins’s views on the epistemological differences between Aquinas and Duns Scotus? Has he read Eriugena on subjectivity, Rahner on grace or Moltmann on hope? Has he even heard of them? Or does he imagine like a bumptious young barrister that you can defeat the opposition while being complacently ignorant of its toughest case? … As far as theology goes, Dawkins has an enormous amount in common with Ian Paisley and American TV evangelists. Both parties agree pretty much on what religion is; it’s just that Dawkins rejects it while Oral Roberts and his unctuous tribe grow fat on it.

More troubling than his exclusion of Eriugena and de facto collusion with Oral Roberts is his exclusion of so many other atheists. The God Delusion was published before Christopher Hitchens’ God is Not Great, a very bad book that nevertheless engaged with atheism per se, drawing from an intellectual history that extended from Lucretius to Spinoza and Thomas Paine (a list Hitchens never tired of reciting on cable news shows, grinning slyly at the thought of pot-bellied viewers on their sofas, scratching their heads: I think I’ve heard of that Payne guy, but who in the sam hill is Lew Crishus?).

If Dawkins was a scientist posing as a humanist – or, more correctly, a scientist trying to sell ideology as scientific fact – then Hitchens was a humanist posing as someone with a basic understanding of science. In reality, Hitchens knew the Bible, had spent his career admiring religious thinkers and religious poets. Near the end of the Hitchens v. Douglas Wilson documentary Collision, Hitchens recalls a conversation with Dawkins, during which Hitchens declared that, if given the power to wipe religious belief off the face of the earth, he wouldn’t do it. “Why not?!” shrieked Dawkins – Hitchens, repeating the anecdote to Wilson, does a killer imitation of Dawkins’ spine-tingling shriek. Hitchens has no answer for Dawkins. He simply can’t conceive of a world without at least one religious believer.

More on point, however, is the following passage from Eagleton’s review:

Dawkins considers that all faith is blind faith, and that Christian and Muslim children are brought up to believe unquestioningly. Not even the dim-witted clerics who knocked me about at grammar school thought that. For mainstream Christianity, reason, argument and honest doubt have always played an integral role in belief. (Where, given that he invites us at one point to question everything, is Dawkins’s own critique of science, objectivity, liberalism, atheism and the like?) Reason, to be sure, doesn’t go all the way down for believers, but it doesn’t for most sensitive, civilised non-religious types either. Even Richard Dawkins lives more by faith than by reason. We hold many beliefs that have no unimpeachably rational justification, but are nonetheless reasonable to entertain. Only positivists think that ‘rational’ means ‘scientific’. Dawkins rejects the surely reasonable case that science and religion are not in competition on the grounds that this insulates religion from rational inquiry. But this is a mistake: to claim that science and religion pose different questions to the world is not to suggest that if the bones of Jesus were discovered in Palestine, the pope should get himself down to the dole queue as fast as possible. It is rather to claim that while faith, rather like love, must involve factual knowledge, it is not reducible to it. For my claim to love you to be coherent, I must be able to explain what it is about you that justifies it; but my bank manager might agree with my dewy-eyed description of you without being in love with you himself.

Dawkins would no doubt balk at the notion that he take Eagleton’s advice and “critique” science. Science is self-critiquing, after all! Science is reasonable by its very structure. Science and reason are near synonyms in the anti-ideological atheist lexicon.

This, for me, is the most troubling aspect of Dawkins and deGrasse Tyson’s trendy, anti-ideological atheism.

Let us consider once more the subtitle of Emma Green’s Atlantic review: “Beyond the argument that faith in God is irrational—and therefore illegitimate.” Both Green and Eagleton observe what is perhaps the most troubling aspect of popular, anti-ideological atheism: it conflates terms like “reason,” “rationality,” “fact,” “science,” and “knowledge.” In fact, I believe Eagleton goes too far when he asserts that “only positivists think that ‘rational’ means ‘scientific.'” Many positivists can make the distinction. (Eagleton’s reflexive assertion to the contrary is merely a product of decades spent defending post-positivist thought to his fellow Marxists.)

The popularizers of anti-ideological atheism play very fast and loose with a specific set of words: “science,” “reason,” “(ir)rationality,”  “knowledge,” “fact,” “truth,” and “information.” It is absolutely necessary to distinguish between these words. In many contexts, it is not “irrational” to object to scientifically produced knowledge, especially if you’re objecting to the implementation of that knowledge.

If I were a public intellectual with a large platform – that is, if I were Neil deGrasse Tyson – I’d go on a speaking tour. The tour’s only goal would be the definition of some basic terms, as they ought to be used by laypersons (obviously specialists will have slightly different definitions, and that’s okay). Information is data we glean from the world through our senses and technologies. Science is a method that uses information to test ideas and produce knowledge. Ideas are organized assumptions about the world. Ideas that are verifiable using scientific methods become knowledge. Reason is a system of organizing knowledge, which allows knowledge to be used for all sorts of great things: to determine a set of ethics, to decide the best shape of government, to demarcate reasonably accurate beliefs about the world, to guide us through daily decisions, etc. Rationality is reason with a French accent.

Facts are stubborn but undeniable things, some of them unveiled by the scientific method and others revealed through our senses/technologies, which help us glean information and confirm knowledge produced by the scientific method. Truth is the ontological status of reality, which makes it a very tricky thing to define and understand, and is therefore probably best passed over in silence…at least in casual conversations or book tours. True is an elastic adjective that allows us to describe the proximity of knowledge, ideas, and impressions to reality, as we understand it via science, knowledge, reason, and facts.

These definitions are not perfect, and I’m sure you and my fellow Jilters have problems with some/all of them. But I think they’re suitable for casual use. At the very least, they admit distinctions between concepts.

Anti-ideological atheists misuse these concepts for rhetorical purposes, and they encourage the public’s tendency to conflate them.

This is wrong.

When Neil deGrasse Tyson insists that “evolution is a fact,” he’s playing with rhetoric to make a political point. For too long, Creationists have conflated the scientific and popular definitions of the word “theory,” transmuting well-established and verifiable knowledge about life into speculation: as though Darwin’s theory of speciation were no more reliable than a hopeful suitor’s theory of “why she isn’t returning my phone calls.”

But in both scientific and common English, theory is not an antonym of fact (sorry, Creationists) and a theory cannot be a fact (as deGrasse Tyson well knows). A theory is established by facts. Richard Dawkins, Samuel Harris, Daniel Dennett, Neil deGrasse Tyson, and Bill Nye have had countless opportunities to make these simple distinctions to the public; Christopher Hitchens possessed both the knowledge and rhetorical precision to explain the distinctions. But distinctions don’t pack much punch. Politically and ideologically, it’s better to affirm that “evolution is a fact,” just like gravity, and not allow the Creationists to keep slithering through their own linguistic sophistry. And just as explaining a joke drains its humor, debunking a slick sophistry invariably drains your authority. Better to bludgeon than to slice. And as anyone who has seen the ads or watched the first two episodes of his Cosmos knows, deGrasse Tyson is happy to bludgeon.


*By “socio-humanist,” I refer to scholars in the humanities (I use “humanist” as the humanities equivalent of “scientist”) and certain branches of the social sciences; I’m not referring to the broader category of post-Enlightenment “secular humanism,” within which Dawkins might count himself.

The Worst President of the 20th Century: Part Four

By Seth Studer

This is part of an ongoing series about 20th century American presidents, what they did, and how we think and talk about them. Each essay can be read on its own, but if you wish to see the others, click here.

Hegel, busily inventing the 20th century while students patiently await their own intellectual germination.

1. Presidential History: from the Academy to the Public

On December 15, 1996, JFK hagiographer Arthur Schlesinger Jr. used the pages of the New York Times Magazine to work through some daddy issues.

Some background:

Arthur’s father, Harvard professor Arthur Schlesinger Sr., was one of several historians who in the 1920s and ‘30s helped develop, defend, and establish what would become the default methodology in U.S. history departments. These historians – influenced by post-Marxian social theory in Europe and the rapid development of sociology over the previous half-century – argued that “bottom-up” analyses of history yielded better knowledge than the then-still-traditional “top-down” approach.

Further background:

Imagine you are a student of history at the University of Iowa in 1910. You could expect an education with two foci: verifiable facts and historical narratives. These foci generated two activities: the verification of facts using primary sources and the construction of narratives using those facts.

This method of study was executed with one underlying assumption: history does not repeat itself. And because the past will not return, intensive research into the diets of 17th century French peasants is less useful than a study of King Louis XIV’s domestic policies. Louis was the main actor, after all, the one who made things happen. Seventeenth-century French peasants were minor actors by comparison; their impact (even if you consolidate them) was minimal. If you want to understand the past, start at the source.

Much like the model of the American university itself, this traditional historical methodology originated in 19th century Germany. This method had been practiced to varying degrees by amateur historians since the Renaissance. But it was formalized by professional historians in Germany (esp. Leopold von Ranke and Friedrich Karl von Savigny) in reaction against Hegelian philosophy, which explained, totalized, and subordinated history to transhistorical systems. Hegel’s work successfully pollinated a thousand significant ideas and academic disciplines. And while every university in Europe and America had a few token Hegelian historians, the fact-and-narrative method dominated. In Anglo-American historical scholarship, “facts” were generally details surrounding big events caused by big actors (important men, but occasionally large populations if they could be viewed as “top-down” actors).

By the time Schlesinger the Elder arrived at Harvard (in 1924), new approaches to historical knowledge were percolating. They had already developed in other disciplines over the previous half-century. These new approaches would use economic pressures, material conditions, and theories of society to write history. Although this shift in focus seemed radical to older historians, these new methods shared with their predecessors an aversion to abstract models and systems (Hegelian or otherwise), a rejection of the notion that “history repeats itself” (or even that it can rhyme without awkwardly forcing it), a commitment to facts, and a deference to empiricism as an epistemological base.

These new methods did not quell interest in the great persons of history, but historians tempered their emphasis on Sun Kings and Great Events. Meanwhile, the general public remained entranced by the glow of George Washington and the Civil War. Both remained invested in narratives. And narratives tend to reveal patterns; it is difficult to inoculate yourself against this. On the eve of World War II, Schlesinger Sr. published an article that proposed a cyclical interpretation of U.S. political history. He argued that trends in U.S. federal policies followed a pendulum-like pattern. Although this hardly amounted to a large Hegelian system, Schlesinger Sr. had stepped outside the mainstream.

Except that he didn’t. By 1948, Schlesinger Sr. had established a reputation outside academe, and was asked by Life magazine to poll his colleagues and rank the presidents of the United States. I do not know whether Schlesinger Sr. paused to reflect on the inadvisability of such an endeavor, on how many of his own best practices he’d be violating. A list ranking the presidents would only tell us what a handful of historians in 1948 thought; it would tell us nothing about the presidents. Further, such a list would only encourage the public’s inflated view of the presidency.

Nevertheless, Schlesinger agreed. He asked his participants to assign each president a degree of greatness, ranging from “great” and “near great” to “below average” and “failure.” Each category was assigned a value, and the number of votes each president received in each category determined their place on the list. The results weren’t surprising: Washington and Lincoln at the top, Buchanan at the bottom. The list and its accompanying article in Life were so popular that Schlesinger Sr. was invited to repeat the experiment in 1962. On both occasions, Schlesinger Sr. surveyed nearly 100 historians.
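To make the arithmetic concrete, here is a minimal sketch of the kind of weighted scoring such a survey implies. The category point values and the sample tally below are my own assumptions for illustration; the article does not report Schlesinger’s exact weights, and the negative “Failure” scores on the published list suggest his scheme differed in its details.

```python
# Illustrative only: point values are assumed (Great=4 ... Failure=0),
# not taken from Schlesinger's actual survey.
CATEGORY_POINTS = {
    "Great": 4,
    "Near Great": 3,
    "Average": 2,
    "Below Average": 1,
    "Failure": 0,
}

def presidential_gpa(votes):
    """Average the category ratings a president received across respondents.

    `votes` maps a category name to the number of respondents who chose it,
    e.g. {"Great": 25, "Near Great": 4, "Average": 1}.
    """
    total_votes = sum(votes.values())
    total_points = sum(CATEGORY_POINTS[cat] * n for cat, n in votes.items())
    return total_points / total_votes

# Hypothetical tally, not real survey data: 30 "Great" votes, 2 "Near Great".
print(round(presidential_gpa({"Great": 30, "Near Great": 2}), 2))  # -> 3.94
```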

Schlesinger Sr.’s progeny, Arthur Jr., began his foray into the family business at the family’s company in 1940. In 1954, Harvard promoted Arthur Schlesinger Jr. to full professor sans PhD, largely on the merits of his popular, Pulitzer Prize-winning study of Jacksonian democracy (also: he was Art Sr.’s boy).

"The Historian as Participant": Arthur Schlesinger Jr.
“The Historian as Participant”: Arthur Schlesinger Jr.

Schlesinger Jr. behaved like anyone who’d just received the world’s biggest Free Meals for Life ticket, at the world’s greatest university: he gave it up for the volatile world of politics and a chance to fill an elusive, ephemeral, and newly emerging role in American society: the public intellectual.

Long active in Democratic politics, Jr. hit the mother lode in 1960 when he joined the campaign and administration of John Fitzgerald Kennedy. He served as one of several “house intellectuals”: young men hired by Jack and Bobby to lounge around the White House and write memoranda to be set aside as doodling paper for the president (an incurable doodler). These men also provided a requisite level of eggheadedness – they lent an intellectual veneer – to offset the Kennedy glamour (Jack) and thuggishness (Bob). They, like the White House furniture, had a good and honorable purpose.

After Kennedy’s assassination, Schlesinger Jr. wrote A Thousand Days, an instantly popular, hagiographic “insider’s view” of the JFK administration. From then on, Schlesinger Jr.’s reputation as a scholarly but thoroughly public historian-intellectual was unimpeachable. He confidently spoke about administrations into which he had not enjoyed an “insider’s view” (e.g., Nixon’s). He taught at the City University of New York, but he no longer moved with the currents of historical scholarship. And frankly, between celebrity and serious scholarship, which would you choose?

But Schlesinger Jr. did little to popularize the best practices of his ostensible métier. And so in 1996, when the New York Times Magazine solicited a list ranking the presidents, Schlesinger Jr. must have thought about his father. Although Sr.’s scholarship was far more rigorous than Jr.’s, both men had turned away from hard scholarship to satisfy a very basic desire, one that the overwhelming majority of their countrymen felt, one that (in 1996) a new, radical brand of hardcore historicists had spent nearly two decades combating: the desire for easy access to history, to one’s own national history.

Of course Schlesinger Jr. would oblige.

The 1996 survey resembled the ’48 and ’62 surveys. Identical format. The pool of participants was decidedly smaller (Schlesinger Jr. surveyed twenty-nine professional historians, two politicians, and Doris Kearns Goodwin). Confirming a thesis later developed by Meena Bose et al. (whose study examined hundreds of similar surveys), long-dead presidents fared better than more recent presidents, who tended to fall in the “average” category or lower.

You may see the full list here. If you remove all but the 20th century presidents from Schlesinger Jr.’s list, here are the rankings (accompanied by their presidential GPA):

  1. Franklin D. Roosevelt (“Great,” 3.97)
  2. Teddy Roosevelt (“Near Great” 3.31)
  3. Woodrow Wilson (“Near Great” 3.21)
  4. Harry Truman (“Near Great” 3.10)
  5. Dwight Eisenhower (“High Avg.” 2.34)
  6. John F. Kennedy (“High Avg.” 2.29)
  7. Lyndon B. Johnson (“High Avg.” 2.21)
  8. Bill Clinton (“Avg.” 1.58)
  9. William Howard Taft (“Avg.” 1.52)
  10. George H.W. Bush (“Avg.” 1.45)
  11. Ronald Reagan (“Avg.” 1.42)
  12. Jimmy Carter (“Avg.” 1.37)
  13. Gerald Ford (“Avg.” 1)
  14. Calvin Coolidge (“Below Avg.” 0.88)
  15. Herbert Hoover (“Failure” -9)
  16. Richard Nixon (“Failure” -21)
  17. Warren G. Harding (“Failure” -48)

Schlesinger’s list was widely publicized (a Schlesinger-authored analysis of the list appeared in Political Science Quarterly). Political commentators in the mass media, which had contained less mass back when Schlesinger Sr. was publishing his lists, used it to render judgments on still-active presidents, politicians, and legislation. One of these judgments had very interesting consequences, something Schlesinger Jr. could not have anticipated or desired.

2. Presidential History: from the Public to Mythology

Will Bunch, a left-wing journalist, has exhaustively and credibly documented the cottage industry of Reagan hagiography that emerged in the mid-1990s. At that time, conservative Republicans were confounded that Bill Clinton’s popularity persisted despite the fact that he wasn’t a conservative Republican. In 1992, the year of Clinton’s election, a Gallup poll fixed Ronald Reagan’s favorability ratings around 44%. Jimmy Carter’s, meanwhile, were around 63%. In Bunch’s account, 1996 was the tipping point for Republican panic over Reagan’s legacy, and the Schlesinger Jr. rankings were a major factor in that tipping point. Reagan was deemed “Average.” He scored 1.42 out of 4, below both H.W. Bush and Clinton (who hadn’t even finished his first term when the survey was published). This revelation – that history professors at prestigious universities don’t much care for Ronald Reagan – should have elicited a reaction comparable to news of the pope’s sectarian affiliation. But tax guy Grover Norquist, writes Bunch, was sufficiently alarmed and motivated by the rankings to take action. The following spring, Norquist founded the Reagan Legacy Project: a division of his influential Americans for Tax Reform group that would be devoted solely to hyping Reagan.

A summary of the near deification of Reagan among conservatives, and the increased admiration for Reagan among everyone else, during the late 1990s and 2000s is unnecessary. This mood peaked somewhere between Reagan’s 2004 funeral and the 2008 primaries, when the Republican candidates fought desperately to out-Reagan each other. Reagan was a certifiably mythic figure.

But myth-making can be an ugly business. Because in mythology, there’s a frustrating tendency to be killed by one’s offspring.

Now that Reagan has an iPhone, they can follow each other on Vine.

By 2012, something had changed. Republicans, conservative pundits, and your father-in-law were suddenly hearing Reagan’s policies and statements turned against them. Reagan was being quoted by people who looked like Rachel Maddow and argued over minutiae on Daily Kos forums. In February 2011, the centennial of Reagan’s birth, countless media outlets ran profiles of “the real Reagan” or “Reagan the liberal.” Time featured a cover that showed Obama and the Gipper chumming it up against a white background (like two guys in a Mac ad). Among the Slate and Salon class, Reagan’s liberal streak was already an article of faith.

Reagan had backfired.

Like Barry Goldwater before him (and Robert Taft before Goldwater), Reagan no longer seemed so conservative…not because American conservatives had shifted so far to the right but because “conservative” means different things in different eras.

Something else had changed: the distance between the 2012 primaries and Ronald Reagan’s last day in office was roughly the distance between Hitler’s suicide and the assassination of Martin Luther King Jr. The distance between 2012 and the day Reagan became president was the distance between Hitler’s suicide and Reagan’s first presidential campaign against Gerald Ford. Enough time had passed – and enough documents were being declassified – for serious historians to begin assessing Reagan’s presidency with sobriety and distance. Two Princeton historians, Sean Wilentz and Daniel T. Rodgers, published books on the Reagan era in 2009 and 2011. Wilentz’s accessible Age of Reagan was favorably reviewed by George Will in Time magazine. Rodgers’ Age of Fracture was an ambitious attempt to synthesize American culture during the end of the Cold War. Age of Fracture was published 30 years after Reagan’s first inauguration.

The 1980s felt remote.

In 1999, three years after Schlesinger Jr.’s list, Time magazine published a list ranking the presidents of the 20th century as part of their “End of the Century” coverage (they had polled nine journalists and historians, including Schlesinger Jr.). Time published its list with anonymous comments from the participants:

  1. Franklin D. Roosevelt: “Indisputably the century’s greatest”
  2. Theodore Roosevelt: “The great prophet of affirmative government”
  3. Woodrow Wilson: “A great visionary who presided over major domestic advances”
  4. Harry S Truman: “A decent human being with homespun virtues”
  5. Dwight D. Eisenhower: “No hint of scandal either. The good old days”
  6. Ronald Reagan: “Jury still out”
  7. Lyndon B. Johnson: “America would have found a way to give blacks the vote without him, but don’t ask me how”
  8. John F. Kennedy: “Might be first-tier if he had lived longer”
  9. George Bush: “A skilled and decent administrator”
  10. Bill Clinton: “Jury out here too–maybe literally!”
  11. William Howard Taft: “Achieved nothing good with excellent situation left him by T.R.”
  12. (tie) Calvin Coolidge: “Left little historical legacy”; “Could have been greater if faced with challenges”
  13. (tie) Gerald Ford: “Returned nation to normality”
  14. Jimmy Carter: “Should have been a preacher”
  15. Richard Nixon: “The most difficult President to assess”; “Uniquely a failure among American Presidents”
  16. Warren G. Harding: “Whatever personal shortcomings, presided over a period of economic growth”
  17. Herbert Hoover: “Victim of bad luck”

Presidential rankings are interesting because, when you compare them across time, they reveal the fluctuations of American cultural identity and how history is incorporated into that cultural identity. When Schlesinger Sr.’s second poll was published in 1962, half the interest it generated was the addition of a few new presidents, and the other half was the (relatively few) changes between the ’48 and ’62 lists. People wanted to know how the 19th century had changed between 1948 and 1962.

One might think that public rankings (of which there are many) would reveal more about cultural attitudes toward former presidents, because…well, they’re the public. But although the public has a monopoly on the culture, the public alone is not necessarily the best gauge of the culture’s self-conception. Rankings generated by historians arguably tell us more about cultural changes, because historians (a) possess more comprehensive knowledge of U.S. history and (b) try (or imagine themselves) to be thoughtful and rigorous in their assessments. For this reason, the changes between their rankings – though smaller and less dramatic than changes on public rankings – are arguably more charged with cultural meaning: these are the bits of culture that filtered through even the historians’ ostensible sense of fairness vis-à-vis the past. What appears to be the thinnest vein proves to be the richest mine.

Imagine the Time list today. Lyndon B. Johnson’s ranking on it is slightly higher than in the 1996 Schlesinger Jr. list. But in 2014, two Robert Caro volumes and one Affordable Care Act later, I believe he might rank higher still. In Lee Daniels’ recent film, self-consciously ABOUT AMERICAN IDENTITY!!!, Johnson is portrayed more favorably than Kennedy. The film ends with a succession of voices speaking hope, civil rights, and the black experience. Johnson’s is the last white voice we hear. Steven Spielberg and Tony Kushner’s Lincoln reflects admiringly on the qualities that Lincoln and Johnson shared.

Similarly, Wilson’s visionary internationalism might have seemed appealing in the heady, post-historical days of the 1990s. But after nearly eight years of disastrous neo-conservative internationalism (just as visionary as Wilson’s), when Samuel P. Huntington joined Francis Fukuyama on State Department shelves, Wilson’s foreign policy idealism is less attractive. Eisenhower, whose stock has been rising, might take Wilson’s place on the list.

In 2014, Truman and his accomplishments feel distant, while Nixon’s appear towering (the towers that link our iPhones, made in China). Nixon was already undergoing massive rehabilitation in 1999, but many of the historians/journalists on the Time panel lived through the ‘60s and ‘70s. I’ve found it’s difficult to talk reasonably about Nixon with a person whose political life began in the 1960s. Today, though, Nixon would surely be among the top ten. Truman would, too, but lower down.

No serious Carter rehabilitation has taken place. Bill Clinton, meanwhile, received almost instant rehabilitation, and would rank higher today. The narrative of the 1990s – a period of massive economic growth and peace, all of it ruined by the next guy – would prove irresistible to Time‘s panel.

Reagan? Reagan might stay in place, just outside the Top Five. Presidents would move around him, up and down, but he would stay fixed, smiling and content. Since 1999, as documents are declassified and trickle down, historians like Sean Wilentz have confirmed half of what Reagan’s worshippers believe, and dispelled quite a lot of what Reagan’s despisers believe. So it’s a wash.

Herbert Hoover is still a victim of bad luck.

3. Presidential History: from Mythology to Fable; or, Slouching toward Rushmore

Responsible historians would balk at these lists (even if we’re analyzing the historians rather than the presidents – the samples are too small). And rightly so. They encourage the wrong kind of thinking. When I speak to students about presidents, I encourage them to ignore any success/failure paradigm and to treat the presidential name – Lincoln, Roosevelt, Clinton – not as a person’s name but as a metonym for a collective: a complex of policies and individuals that must be judged one by one. That approach is more productive, and that collective more interesting, than one man’s career and biography.

And yet we want our Great Men. We want them not merely as they existed in the past, but projected forward into our present: speaking to our problems, condemning our enemies, confirming our prejudices, blessing our decisions.

We want to know the presidents.

But if you want to actually engage with the past, you’ll first find only dead silence. The anti-Hegelians resisted abstracted or cyclical history because they believed that the past speaks only to and about itself. At its most extreme, this view reduces history to delicate facts that crumble with the slightest extrapolation. The practice of history, the ability to make claims about the past, is practically impossible. The past is a sealed tomb. Most historians today are more pragmatic, borrowing methods and principles from the social sciences. They borrow these methods because the tools are strong and the excavation of historical knowledge is incredibly hard. You cannot apply even the recent past forward without great rigor and painstaking precision. To utilize historical knowledge properly, you must rely on slivers of specificity or sturdily engineered abstractions (usually constructed with the help of others). And you cannot allow specificity and abstraction to cross-contaminate. Everything you claim must be qualified and controlled.

This does not make for engaging or accessible presidential biographies.

Driving home last week, I endured a Minnesota Public Radio host lapping up the latest historical musings of journalist Simon Winchester. His new book purports to introduce “the men who united the states,” men who are – fortunately for Winchester – “explorers, inventors, eccentrics, and mavericks.” The book is actually about oft-overlooked figures who show up at critical moments in U.S. history. A noble enough subject. But Winchester gives us fables. When asked, regarding Minnesota (paraphrase), “Who settled this area? Who made Minnesota a place? And why did they come here?”, he does not mention the fur trade and the decline of French colonialism in North America and the War of 1812 and federal incentive programs and politico-economic refugees from central Europe. No, Winchester driveled on about people “seeking adventure,” stir-crazy Easterners who wanted to live on the outskirts of civilization. They had an itch, and building a nation on prairie wilderness was the cure! He actually quoted Willa Cather’s famous musings on the subject: the plains are “not a country at all, but the material out of which countries are made.” This passage beautifully represents a Virginian’s first impressions of Nebraska, written from Cather’s 20th century vantage, not the actual motivations of the actual miserable masses who traded New York and cholera for the Great Plains and scarlet fever. But for Winchester, the retrospect (Cather) preceded the event (the pioneers).

The contours of the present are determined by the material past. The fact of the material past is undeniable. We know that it exists, but it is hard to see. History shows us the shadows of the past, sometimes with surprising clarity, but it is easily corrupted. Fabulism is inherent in practically every publicly accessible account of American history. Perhaps this fabulism cannot be eradicated; it can only be pruned and minimized.

Regrettably, the overwhelming majority of those Americans who actually bother to think about history prefer that fabulism flourish. They want to learn from the past. They want a greatest president. They want a worst president. They want to make the past present. But the past belongs to the dead, who are mute and can be understood only by the conditions and corpses they leave behind. We take their words out of context the moment we speak them. We construct fables. We want the Angel of History to fly facing forward, like the bald eagle.

Next time: I violate everything I’ve written here and rank the 20th century presidents! 

Better Read than Dead: Writing Workshops, Film Schools, and the Cold War

Iowa Writers’ Workshop

I.

Eric Bennett, a professor of English at Providence College, wrote a long, meandering, but fascinating article in this week’s Chronicle of Higher Education. The article offers a history of the University of Iowa’s Writers’ Workshop alongside a truncated overview of mid-century political realignments, plus a few digressions (a writer after my own heart). By the end, the article has become a polemic against the perceived bias in MFA programs against fiction whose scope widens beyond the concrete and the personal to include ideological, philosophical, and global vistas. But the polemic feels compensatory, extra weight to balance a tenuous but fascinating observation about Cold War propaganda.

Bennett’s article ought to be read rather than summarized, but I’ll offer some highlights. He opens with a jolt:

Did the CIA fund creative writing in America?

The answer is not entirely satisfying; to add some heft, Bennett swerves across multiple topics (including a piece of his own biography) toward a conclusion in which he admits the fragility of the connections he’s making. “You probably can see where this is going,” he writes:

One can easily trace the genealogy from the critical writings of Trilling and Ransom at the beginning of the Cold War to creative-writing handbooks and methods then and since. The discipline of creative writing was effectively born in the 1950s. Imperial prosperity gave rise to it, postwar anxieties shaped it. “Science,” Ransom argued in The World’s Body, “gratifies a rational or practical impulse and exhibits the minimum of perception. Art gratifies a perceptual impulse and exhibits the minimum of reason.” In The Liberal Imagination, Trilling celebrated Hemingway and Faulkner for being “intensely at work upon the recalcitrant stuff of life.” Life was recalcitrant because it resisted our best efforts to reduce it to intellectual abstractions, to ideas, to ideologies.

He says it better in the next paragraph:

From Trilling, Ransom, and Arendt to Engle and Stegner, and from them to Conroy, Almond, Koch, and Burroway, the path is not long. And yet that path was erased quickly. Raymond Carver, trained by writers steeped in anti-Communist formulations, probably didn’t realize that his short stories were doing ideological combat with a dead Soviet dictator.

Iowa’s Workshop has enjoyed its reputation as “the Death Star of MFA programs” (to quote poet Jorie Graham) since before there was a Death Star. In recent years, scholars of American literature have turned to the long-unexamined institution of the creative writing MFA and the workshop model that Iowa innovated. Every graduate student working in contemporary American literature must reckon with Mark McGurl’s The Program Era, a literary-sociological study that posits MFA programs as the principal force determining the structures, forms, themes, and direction of postwar fiction. The Program Era elicited a rare response from literary scholars: near-universal praise and admiration. McGurl had struck a massive gold vein, one that would sustain years of groundbreaking scholarship, and in the most obvious place.

The MFA program and the workshop model predate the GI Bill, but they began to flourish after the postwar infusion of veterans into the public university system. The very concept of “creative writing” as a discipline was itself a practical response to mid-century geopolitics. Novelists and poets had always supplemented their income with teaching. But by the mid-20th century, fiction writers increasingly turned from journalism and screenwriting to academia to fund their metier. With the rise of totalitarian governments throughout Europe, many continental writers sought refuge within the American university system.

A dilemma now faced American literature departments, which were already struggling to reconcile their roots in 19th century German scholasticism with American trade-oriented pragmatism. What to do with all these writers? Most were given literature courses to teach, but they typically lacked training or interest in literary theory. Should these writers be expected to produce scholarship? Or should they just…write?

Vladimir Nabokov on the cover of the noted anti-Communist pamphlet “Time,” looking unenthusiastic about his teaching career, a year before he was called an elephant. The banner confidently declares American superiority.

Two Cold War refugees, the linguist Roman Jakobson and the novelist Vladimir Nabokov, famously butted heads on this issue at Harvard. Nabokov, a popular lecturer but terrible scholar of Russian literature, was proposed for a chair in literature, largely on the basis of his literary accomplishments. Irritated, Jakobson mused, “Shall we appoint elephants to teach zoology?” Jakobson’s quip inspired the title of D.G. Myers’ history of creative writing, The Elephants Teach. That’s precisely what happened, except that a partition between the Jakobsons and the Nabokovs produced, on the graduate level, the humanities’ equivalent of separate theory and practice tracks.

Bennett’s contributions to the study and history of the MFA program examine the intersection of creative writing and the Cold War (his forthcoming book, Workshops of Empire, will be published by University of Iowa Press). This intersection may seem incongruous, until one contextualizes the MFA program, and its influence on cultural production, within the geopolitical functions of the postwar American university. Bennett argues that the MFA program tends to promote concreteness, specificity, and real life in fiction and to discourage (or virtually ban) broad, ideologically-driven fictions. He links these tendencies to anti-Communist anxieties about totalitarian systems and Marxist ideology, anxieties which (he argues) shaped the MFA program’s development and agenda throughout the 1940s, ’50s, and ’60s.

During this period, creative cultural output was viewed by the U.S. government as a legitimate and productive tool in the fight against Communism. Since its inception in 1947, the CIA funded traveling exhibitions of modern American painters: Pollock, de Kooning, Rothko, et al., most of whom were ardent leftists. The CIA’s art program was a covert extension of an earlier State Department program that was forcefully terminated. The State Department had been pilloried with objections from hayseed congressmen (and a hayseed president) that Abstract Expressionism was subversive trash, not worth funding. But the CIA understood its value: Communists in America and Europe would visit the exhibitions and witness how Socialist Realism, the dominant aesthetic in the Eastern Bloc, paled in comparison to the West’s avant-garde.

Any Soviet artists who visited these exhibitions saw forms, styles, innovations – openness – that had been forbidden in Russia since the death of Lenin.


Sergei Eisenstein

II.

Soviet propagandists in the 1950s and ’60s had little difficulty convincing their people of the West’s decadence. The self-evident and vast disparity between Western prosperity (particularly in the United States) and life in the Eastern Bloc did that work for them. As middle class swimming pools sprang up like cacti across southern California and Dairy Queens appeared in every town, there was little to dress up or exaggerate.

Convincing the Average Ivan that such decadence was undesirable proved a greater (ultimately insurmountable) challenge. After Stalin’s death and the Sino-Soviet split, public discontent evolved from an occasional and easily remedied headache to a chronic migraine for the leaders of the Soviet Union. Official lies about “our prosperity” and “Western decline” were not as durable in the Eastern bloc as they’d been – and would be – in other totalitarian regimes. Information leaked through the Iron Curtain, and citizens could compare and contrast.

To convince its people – and to convince the millions of sympathetic ears listening beyond the Iron Curtain – of capitalism’s inherent corruption, the Soviet Union documented Western colonial and post-colonial atrocities and racial apartheid in the American South. “They might have better cars than we do, but they murder whole villages and hang black men from trees!” This reality-based propaganda would eventually pressure ardent Cold Warriors in the U.S. government to lend much-needed support to the Civil Rights movement. In the late 1950s, most congressmen didn’t care if blacks in Mississippi could safely participate in society. They did, however, care about the spread of Communism in all these young nations newly emancipated from European colonialism. If allowing blacks in Mississippi back into civil society could help keep the dominos standing, so be it!

The Soviet Union also inflated its own successes, hoping nationalism might be a satisfying alternative to a Cadillac (I wonder if they could actually see Lenin rolling in his tomb). Party solidarity, especially on matters of liberalization and foreign policy, was exaggerated. Weapons stockpiles were wildly exaggerated. Most famously, dummy missiles were paraded before cameras broadcasting straight into Ronald and Nancy’s living room while he ate his nightly TV dinner. Here, even the West was duped. The American intelligence community would endure a bevy of congressional hearings between 1989 and 1991, wherein red-faced one-time Cold Warriors demanded to know how, how, our intelligence had so wildly overestimated the size of the Soviet arsenal, the health of the Soviet economy, and the strength of Soviet institutions. Hype was a successful export.

But in the propaganda war, the Soviet Union failed in an arena where even fascists had enjoyed some success: cultural exports.

In its infancy, the U.S.S.R. experienced an extraordinary flowering of the visual arts, particularly cinema (a medium Lenin preferred). Propagandists were recruited from Russia’s thriving avant-garde theater community and from Lev Kuleshov’s film school, where the degree and density of talent remains almost without precedent (the early years of Walt Disney’s animation studio or the informal confederacy of filmmakers associated with the French New Wave come to mind). Lenin asked for Communist propaganda, and the artists delivered – in part because they were given freedom over their own work. Abstraction, surrealism, and experimentation were permitted, if not always admired by the Party (especially Trotsky). The Kuleshov school’s chief innovations, Soviet montage and theories of editing, effectively expanded cinematic grammar. Filmmakers throughout the world could convey meaning with greater efficiency, power, and range. These innovations were permanent. They remain embedded in cinema and television. They are so ubiquitous that you forget their lineage.

The montagists helped sell Leninism to illiterate Russian peasants, but their broader impact was almost exclusively on cinematic form, not ideology. Eisenstein and others theorized that the two were inseparable, that Marxism was embedded in their dialectical approach to cinematic expression. The theory would go untested. When Stalin assumed power, he declared Socialist Realism the aesthetic and ethos for all Soviet art. The great Kuleshov school filmmakers either fled Russia, self-censored, or were imprisoned.

Socialist Realism produced some interesting architecture, bold sculpture, and a few decent paintings, but in general it celebrated Russia’s dismal past and more dismal present with either straight representationalism, frightening bravado, or a sentimentality that would make Steven Spielberg sick. After Stalin’s death, aesthetic restrictions loosened and Soviet filmmakers began to enjoy a little freedom. But even the best state-approved Soviet filmmakers were overshadowed by their navel-gazing brethren, filmmakers like Andrei Tarkovsky who appealed to Western taste and defied governmental standards. Tarkovsky was not the cultural ambassador Brezhnev would have picked; he never tempted a Westerner to jump eastward over the Berlin Wall. But the petulant streak that allowed Tarkovsky to defy his government also prompted him to piss all over his acclaim in the West (“the cinema,” he told a confused audience after winning the Telluride Medal, “she is a whore”). The majority of Russian filmmakers, however, lacked Tarkovsky’s cojones. And the majority of Soviet films produced between Stalin’s death and glasnost did receive Party approval. But even these films made the U.S.S.R. look like shit.

During the Cold War, nearly all communist propaganda that reached a wide Western audience was produced by Communists in the West. The most effective anti-Communist propaganda to infiltrate the Soviet Union – the conditions and quality of life of the Soviet people – was produced by the Soviet government. Ultimately, Cold War propaganda amounted to two vast spheres of humanity talking to themselves.

Those internal conversations included aesthetics and cultural products we consume regularly today. The conversations produced at least two creative technologies that persist and flourish today: literary minimalism and cinematic montage. They are so pervasive that we barely notice them. Their former political and ideological dimensions may have been “erased,” to borrow Bennett’s phrase. They are certainly innocent where gulags and napalm are concerned. But their parentage is compelling. The cut of a Transformers sequel or the delicacy of an Alice Munro story are just as much relics of the Cold War as missile silos in South Dakota or a toppled statue of Lenin in Kiev.

The Open Letter Opposing Legislative Meddling in University Politics

By Kindred Winecoff

I speak for none of the Jilted other than myself, but I wanted to pass along this open letter, addressed to various legislative bodies in the U.S. that are trying to politicize universities in what I think is an unproductive way. Or, to quote from the letter:

Academics and commentators—including Crooked Timber bloggers—disagree over the American Studies Association’s decision to endorse an academic boycott of Israel. There should be far less disagreement over two bills recently proposed in New York’s and Maryland’s state legislatures. These bills prohibit colleges and universities from using state monies to fund faculty membership in—or travel to—academic organizations that boycott the institutions of another country. Designed to punish the ASA for taking the stance it has, these bills threaten the ability of scholars and scholarly associations to say controversial things in public debate. Because they sanction some speech on the basis of the content of that speech, they run afoul of the US First Amendment.

Read the whole thing and consider signing it if you agree. I did, and so has an impressive list of academics, media figures, and concerned citizens from all over the political spectrum. Here is why.

I disagree completely with the ASA’s boycott, as does one of the writers of the letter, and I argued with Corey Robin (the other writer, who supports the ASA) over this topic on Twitter. As it happens, the university that employs me withdrew its institutional membership in the ASA over this question; while I probably would not have gone so far, I appreciate the reasons it did so and am not offended by them. I disagree with the BDS movement on both philosophical and pragmatic grounds.

But I oppose cynical legislative meddling into institutions of higher education even more. Universities and colleges are quite good at self-policing while remaining inclusive and moderate; we don’t need politicians picking and choosing which groups are above board and which are not. Moreover, the ASA’s “boycott” was so toothless as to be inconsequential; holding education funding hostage for petty politicking could obviously be quite consequential.


The ASA’s Boycott Lacks Seriousness

By Kindred Winecoff

The Executive Committee of the American Association of Universities has issued a statement condemning the American Studies Association’s boycott of all Israeli academic institutions. The AAU’s decision makes sense, and I support it. Claiming that all academics at all Israeli institutions bear responsibility for all actions taken by the government of Israel — whatever you think of those actions — is absurd. Playing fast-and-loose with academic freedom is more than regrettable in an environment where such liberties are under increasing threat at the margin.

I find it bemusing that someone like Corey Robin would disagree, given his own institution’s recent employment of General Petraeus. Robin protested that decision, vehemently, but given that his side was unable to prevent Petraeus from teaching at CUNY, I doubt he would appreciate being banned from conferences, publications, or other academic symposia because his institution hired the leader of a war many believe to have been unjust and illegal. The American Association of University Professors (sensibly) opposes blanket boycotts as a matter of principle for just this kind of reason. In this case the Palestinian government agrees. Solidarity should not just be in the mind, and one can support Palestinian self-determination (and oppose the expansion of settlements in the West Bank) without playing games of guilt by association.

Tyler Cowen argues the positive case — would the world be better if the boycotters’ demands were met? — but I think that’s the wrong way of looking at it. This is pure mood affiliation via cheap talk. If it would actually have any real-world impact, I doubt most of these folks would support such a boycott, for precisely the reasons Cowen gives. And if they did, we would easily be able to identify their moral and scientific unseriousness.


UPDATE: I took a closer look at the text of the ASA’s website and one of the things I wrote above is misleading if not outright wrong. Specifically, individual Israeli academics are not being boycotted; only institutions. In practice this might be a distinction without a difference… but maybe not. In any case, here is the full statement from the ASA. The relevant part:

Our resolution understands boycott as limited to a refusal on the part of the Association in its official capacities to enter into formal collaborations with Israeli academic institutions, or with scholars who are expressly serving as representatives or ambassadors of those institutions, or on behalf of the Israeli government, until Israel ceases to violate human rights and international law.

The resolution does not apply to individual Israeli scholars engaged in ordinary forms of academic exchange, including conference presentations, public lectures at campuses, or collaboration on research and publication. The Council also recognizes that individual members will act according to their convictions on these complex matters.