14 Reasons Susan Sontag Invented Buzzfeed!

By Seth Studer

If you’re looking for a progenitor of our list-infested social media, you could do worse than return to one of the most prominent and self-conscious public intellectuals of the last half century. The Los Angeles Review of Books just published an excellent article by Jeremy Schmidt and Jacquelyn Ardam on Susan Sontag’s private hard drives, the contents of which have recently been analyzed and archived by UCLA. Nude photos have yet to circulate through shadowy digital networks (probably because Sontag herself made them readily available – Google Image, if you like), and most of the hard drives’ content is pretty mundane. But is that going to stop humanists from drawing broad socio-cultural conclusions from it?

Is the Pope Catholic?

Did Susan Sontag shop at Sephora?

Sontag, whose work is too accessible and whose analyses are too wide-ranging for serious theory-heads, has enjoyed a renaissance since her death, not as a critic but as an historical figure. She’s one of those authors now, like Marshall McLuhan or Norman Mailer, a one-time cultural institution become primary text. A period marker. You don’t take them seriously, but you take the fact of them seriously.

Sontag was also notable for her liberal use of lists in her essays.

“The archive,” meanwhile, has been an obsession in the humanities since Foucault arrived on these shores in the eighties, but in the new millennium, this obsession has turned far more empirical, more attuned to materiality, minutiae, ephemera, and marginalia. The frequently invoked but still inchoate field of “digital humanities” was founded in part to describe the work of digitizing all this…stuff. Hard drives are making this work all the more interesting, because they arrive in the archive pre-digitized. Schmidt and Ardam write:

All archival labor negotiates the twin responsibilities of preservation and access. The UCLA archivists hope to provide researchers with an opportunity to encounter the old-school, non-digital portion of the Sontag collection in something close to its original order and form, but while processing that collection they remove paper clips (problem: rust) and rubber bands (problems: degradation, stickiness, stains) from Sontag’s stacks of papers, and add triangular plastic clips, manila folders, storage boxes, and metadata. They know that “original order” is something of a fantasy: in archival theory, that phrase generally signifies the state of the collection at the moment of donation, but that state itself is often open to interpretation.

Microsoft Word docs, emails, jpegs, and MP3s add a whole slew of new decisions to this delicate balancing act. The archivist must wrangle these sorts of files into usable formats by addressing problems of outdated hardware and software, proliferating versions of documents, and the ease with which such files change and update on their own. A key tool in the War on Flux sounds a bit like a comic-book villain: Deep Freeze. Through a combination of hardware and software interventions, the Deep Freeze program preserves (at the binary level of 0’s and 1’s) a particular “desired configuration” in order to maintain the authenticity and preservation of data.

Coincidentally, I spent much of this morning delving into my own hard drive, which contains documents from five previous hard drives, stored in folders titled “Old Stuff” which themselves contain more folders from older hard drives, also titled “Old Stuff.” The “stuff” is poorly organized: drafts of dissertation chapters, half-written essays, photos, untold numbers of .jpgs from the Internet that, for reasons usually obscure now, prompted me to click “Save Image As….” Apparently Sontag’s hard drives were much the same. But Deep Freeze managed to edit the chaos down to a single IBM laptop, available for perusal by scholars and Sontag junkies. Schmidt and Ardam reflect on the end product:

Sontag is — serendipitously, it seems — an ideal subject for exploring the new horizon of the born-digital archive, for the tension between preservation and flux that the electronic archive renders visible is anticipated in Sontag’s own writing. Any Sontag lover knows that the author was an inveterate list-maker. Her journals…are filled with lists, her best-known essay, “Notes on ‘Camp’” (1964), takes the form of a list, and now we know that her computer was filled with lists as well: of movies to see, chores to do, books to re-read. In 1967, the young Sontag explains what she calls her “compulsion to make lists” in her diary. She writes that by making lists, “I perceive value, I confer value, I create value, I even create — or guarantee — existence.”

As reviewers are fond of noting, the list emerges from Sontag’s diaries as the author’s signature form. … The result of her “compulsion” not just to inventory but to reduce the world to a collection of scrutable parts, the list, Sontag’s archive makes clear, is always unstable, always ready to be added to or subtracted from. The list is a form of flux.

The lists that populate Sontag’s digital archive range from the short to the wonderfully massive. In one, Sontag — always the connoisseur — lists not her favorite drinks, but the “best” ones. The best dry white wines, the best tequilas. (She includes a note that Patrón is pronounced “with a long o.”) More tantalizing is a folder labeled “Word Hoard,” which contains three long lists of single words with occasional annotations. “Adjectives” is 162 pages, “Nouns” is 54 pages, and “Verbs” is 31 pages. Here, Sontag would seem to be a connoisseur of language. But are these words to use in her writing? Words not to use? Fun words? Bad words? New words? What do “rufous,” “rubbery,” “ineluctable,” “horny,” “hoydenish,” and “zany” have in common, other than that they populate her 162-page list of adjectives? … [T]he Sontag laptop is filled with lists of movies in the form of similar but not identical documents with labels such as “150 Films,” “200 Films,” and “250 Films.” The titles are not quite accurate. “150 Films” contains only 110 entries, while “250 Films” is a list of 209. It appears that Sontag added to, deleted from, rearranged, and saved these lists under different titles over the course of a decade.

“Faced with multiple copies of similar lists,” continue Schmidt and Ardam, “we’re tempted to read meaning into their differences: why does Sontag keep changing the place of Godard’s Passion? How should we read the mitosis of ‘250 Films’ into subcategories (films by nationality, films of ‘moral transformation’)? We know that Sontag was a cinephile; what if anything do these ever-proliferating Word documents tell us about her that we didn’t already know?” The last question hits a nerve for both academic humanists and the culture at large (Sontag’s dual audiences).

Through much of the past 15 years, literary scholarship could feel like stamp collecting. For a while, the field of Victorian literary studies resembled the tinkering, amateurish, bric-a-brac style of Victorian culture itself, a new bit of allegedly consequential ephemera in every issue of every journal. Pre-digitized archives offer a new twist on this material. Schmidt and Ardam: “The born-digital archive asks us to interpret not smudges and cross-outs but many, many copies of almost-the-same-thing.” This type of scholarship provides a strong empirical base for broader claims (the kind Sontag favored), but the base threatens to support only a single, towering column, ornate but structurally superfluous. Even good humanist scholarship – the gold standard in my own field remains Mark McGurl’s 2009 The Program Era – can begin to feel like an Apollonian gasket: it contains elaborate intellectual gyrations but never quite extends beyond its own circle. (This did not happen in Victorian studies, by the way; as usual, they remain at the methodological cutting edge of literary studies, pioneering cross-disciplinary approaches to reading, reviving and revising the best of old theories.) My least favorite sentence in any literary study is the one in which the author disclaims generalizability and discourages attaching any broader significance or application to the study. This is one reason why literary theory courses not only offer no stable definition of “literature” (as the E.O. Wilsons of the world would have us do), they frequently fail to introduce students to the many tentative or working definitions from the long history of literary criticism. (We should at least offer our students a list!)

In short, when faced with the question, “What do we do with all this…stuff?” or “What’s the point of all this?”, literary scholars all too often have little to say. It’s not a lack of consensus; it’s an actual lack of answers. Increasingly, and encouragingly, one hears that a broader application of the empiricist tendency is the next horizon in literary studies. (How such an application will fit into the increasingly narrow scope of the American university is an altogether different and more vexing problem.)

Sontag’s obsession with lists resonates more directly with the culture at large. The Onion’s spin-off site ClickHole is the apotheosis of post-Facebook Internet culture. Its genius is not for parody but for distillation. The authors at ClickHole strip the substance of clickbait – attention-grabbing headlines, taxonomic quizzes, and endless lists – to the bone of its essential logic. This logic is twofold. All effective clickbait relies on the narcissism of the reader to bait the hook and on banal summaries of basic truths once the catch is secure. The structure of “8 Ways Your Life Is Like Harry Potter” would differ little from “8 Ways Your Life Isn’t Like Harry Potter.” A list, like a personality quiz, is especially effective as clickbait because it condenses a complex but recognizable reality into an index of accessible particularities. “Sontag’s lists are both summary and sprawl,” write Schmidt and Ardam, and much the same could be said of the lists endlessly churned out by Buzzfeed, which constitute both a structure of knowledge and a style of knowing to which Sontag herself made significant contributions. Her best writing offered the content of scholarly discourse in a structure and style that not only eschewed the conventions of academic prose, but encouraged reading practices in which readers actively organize, index, and codify their experience – or even their identity – vis-à-vis whatever the topic may be. Such is the power of lists. This power precedes Sontag, of course. But she was a master practitioner, aware of the list’s potential in the new century, when reading practices would become increasingly democratic and participatory (and accrue all the pitfalls and dangers of democracy and participation). If you don’t think Buzzfeed is aware of that, you aren’t giving them enough credit.

The Wizard of Oz Is an Anti-Finance Manifesto

By Kindred Winecoff

Somewhat apropos of my previous post is the following anecdote, which I’ve read a number of times and have always forgotten. I’m pasting it here for posterity’s sake. It is from Daniel Little’s review of David Graeber’s Debt: The First 5,000 Years:

There are many startling facts and descriptions that Graeber produces as he tells his story of the development of the ideologies of money, credit, and debt.  One of the most interesting to me has to do with The Wonderful Wizard of Oz.

L. Frank Baum’s book The Wonderful Wizard of Oz, which appeared in 1900, is widely recognized to be a parable for the Populist campaign of William Jennings Bryan, who twice ran for president on the Free Silver platform — vowing to replace the gold standard with a bimetallic system that would allow the free creation of silver money alongside gold. … According to the Populist reading, the Wicked Witches of the East and West represent the East and West Coast bankers (promoters of and benefactors from the tight money supply), the Scarecrow represented the farmers (who didn’t have the brains to avoid the debt trap), the Tin Woodsman was the industrial proletariat (who didn’t have the heart to act in solidarity with the farmers), the Cowardly Lion represented the political class (who didn’t have the courage to intervene). … “Oz” is of course the standard abbreviation for “ounce.” (52)

The symbolism of the “yellow brick road” needs no elaboration.

UPDATE: As was pointed out by Thomas in the comments, this was discussed long ago in the Journal of Political Economy.

Better Read than Dead: Writing Workshops, Film Schools, and the Cold War

Iowa Writers’ Workshop

 

I.

Eric Bennett, a professor of English at Providence College, wrote a long, meandering, but fascinating article in this week’s Chronicle of Higher Education. The article offers a history of the University of Iowa’s Writers’ Workshop alongside a truncated overview of mid-century political realignments, plus a few digressions (a writer after my own heart). By the end, the article has become a polemic against the perceived bias in MFA programs against fiction whose scope widens beyond the concrete and the personal to include ideological, philosophical, and global vistas. But the polemic feels compensatory, extra weight to balance a tenuous but fascinating observation about Cold War propaganda.

Bennett’s article ought to be read rather than summarized, but I’ll offer some highlights. He opens with a jolt:

Did the CIA fund creative writing in America?

The answer is not entirely satisfying; to add some heft, Bennett swerves across multiple topics (including a piece of his own biography) toward a conclusion in which he admits the fragility of the connections he’s making. “You probably can see where this is going,” he writes:

One can easily trace the genealogy from the critical writings of Trilling and Ransom at the beginning of the Cold War to creative-writing handbooks and methods then and since. The discipline of creative writing was effectively born in the 1950s. Imperial prosperity gave rise to it, postwar anxieties shaped it. “Science,” Ransom argued in The World’s Body, “gratifies a rational or practical impulse and exhibits the minimum of perception. Art gratifies a perceptual impulse and exhibits the minimum of reason.” In The Liberal Imagination, Trilling celebrated Hemingway and Faulkner for being “intensely at work upon the recalcitrant stuff of life.” Life was recalcitrant because it resisted our best efforts to reduce it to intellectual abstractions, to ideas, to ideologies.

He says it better in the next paragraph:

From Trilling, Ransom, and Arendt to Engle and Stegner, and from them to Conroy, Almond, Koch, and Burroway, the path is not long. And yet that path was erased quickly. Raymond Carver, trained by writers steeped in anti-Communist formulations, probably didn’t realize that his short stories were doing ideological combat with a dead Soviet dictator.

Iowa’s Workshop has enjoyed its reputation as “the Death Star of MFA programs” (to quote poet Jorie Graham) since before there was a Death Star. In recent years, scholars of American literature have turned to the long-unexamined institution of the creative writing MFA and the workshop model that Iowa innovated. Every graduate student working in contemporary American literature must reckon with Mark McGurl’s The Program Era, a literary-sociological study that posits MFA programs as the principal force determining the structures, forms, themes, and direction of postwar fiction. The Program Era elicited a rare response from literary scholars: near-universal praise and admiration. McGurl had struck a massive gold vein, one that would sustain years of groundbreaking scholarship, and in the most obvious place.

The MFA program and the workshop model predate the GI Bill, but they began to flourish after the postwar infusion of veterans into the public university system. The very concept of “creative writing” as a discipline was itself a practical response to mid-century geopolitics. Novelists and poets had always supplemented their income with teaching. But by the mid-20th century, fiction writers increasingly turned from journalism and screenwriting to academia to fund their metier. With the rise of totalitarian governments throughout Europe, many continental writers sought refuge within the American university system.

A dilemma now faced American literature departments, which were already struggling to reconcile their roots in 19th century German scholasticism with American trade-oriented pragmatism. What to do with all these writers? Most were given literature courses to teach, but they typically lacked training or interest in literary theory. Should these writers be expected to produce scholarship? Or should they just…write?

Vladimir Nabokov on the cover of the noted anti-Communist pamphlet “Time,” looking unenthusiastic about his teaching career, a year before he was called an elephant. The banner confidently declares American superiority.

Two Cold War refugees, the linguist Roman Jakobson and the novelist Vladimir Nabokov, famously butted heads on this issue at Harvard. Nabokov, a popular lecturer but terrible scholar of Russian literature, was promoted to a chair in literature, largely on the basis of his literary accomplishments. Irritated, Jakobson mused, “Shall we appoint elephants to teach zoology?” Jakobson’s quip inspired the title of D.G. Myers’ history of creative writing, The Elephants Teach. That’s precisely what happened, except that a partition between the Jakobsons and the Nabokovs produced, on the graduate level, the humanities’ equivalent of separate theory and practice tracks.

Bennett’s contributions to the study and history of the MFA program examine the intersection of creative writing and the Cold War (his forthcoming book, Workshops of Empire, will be published by University of Iowa Press). This intersection may seem incongruous, until one contextualizes the MFA program, and its influence on cultural production, within the geopolitical functions of the postwar American university. Bennett argues that the MFA program tends to promote concreteness, specificity, and real life in fiction and to discourage (or virtually ban) broad, ideologically-driven fictions. He links these tendencies to anti-Communist anxieties about totalitarian systems and Marxist ideology, anxieties which (he argues) shaped the MFA program’s development and agenda throughout the 1940s, ’50s, and ’60s.

During this period, creative cultural output was viewed by the U.S. government as a legitimate and productive tool in the fight against Communism. Since its inception in 1947, the CIA funded traveling exhibitions of modern American painters: Pollock, de Kooning, Rothko, et al., most of whom were ardent leftists. The CIA’s art program was a covert extension of an earlier State Department program that was forcefully terminated. The State Department had been pilloried with objections from hayseed congressmen (and a hayseed president) that Abstract Expressionism was subversive trash, not worth funding. But the CIA understood its value: Communists in America and Europe would visit the exhibitions and witness how Socialist Realism, the dominant aesthetic in the Eastern Bloc, paled in comparison to the West’s avant-garde.

Any Soviet artists who visited these exhibitions saw forms, styles, innovations – openness – that had been forbidden in Russia since the death of Lenin.

 

Sergei Eisenstein

II.

Soviet propagandists in the 1950s and ’60s had little difficulty convincing their people of the West’s decadence. The self-evident and vast disparity between Western prosperity (particularly in the United States) and life in the Eastern Bloc did that work for them. As middle class swimming pools sprang up like cacti across southern California and Dairy Queens appeared in every town, there was little to dress up or exaggerate.

Convincing the Average Ivan that such decadence was undesirable proved a greater (ultimately insurmountable) challenge. After Stalin’s death and the Sino-Soviet split, public discontent evolved from an occasional and easily remedied headache to a chronic migraine for the leaders of the Soviet Union. Official lies about “our prosperity” and “Western decline” were not as durable in the Eastern bloc as they’d been – and would be – in other totalitarian regimes. Information leaked through the Iron Curtain, and citizens could compare and contrast.

To convince its people – and to convince the millions of sympathetic ears listening beyond the Iron Curtain – of capitalism’s inherent corruption, the Soviet Union documented Western colonial and post-colonial atrocities and racial apartheid in the American South. “They might have better cars than we do, but they murder whole villages and hang black men from trees!” This reality-based propaganda would eventually pressure ardent Cold Warriors in the U.S. government to lend much-needed support to the Civil Rights movement. In the late 1950s, most congressmen didn’t care if blacks in Mississippi could safely participate in society. They did, however, care about the spread of Communism in all these young nations newly emancipated from European colonialism. If allowing blacks in Mississippi back into civil society could help keep the dominos standing, so be it!

The Soviet Union also inflated its own successes, hoping nationalism might be a satisfying alternative to a Cadillac (I wonder if they could actually see Lenin rolling in his tomb). Party solidarity, especially on matters of liberalization and foreign policy, was exaggerated. Weapons stockpiles were wildly exaggerated. Most famously, dummy missiles were paraded before cameras broadcasting straight into Ronald and Nancy’s living room while he ate his nightly TV dinner. Here, even the West was duped. The American intelligence community would endure a bevy of congressional hearings between 1989 and 1991, wherein red-faced one-time Cold Warriors demanded to know how, how, our intelligence had so wildly overestimated the size of the Soviet arsenal, the health of the Soviet economy, and the strength of Soviet institutions. Hype was a successful export.

But in the propaganda war, the Soviet Union failed in an arena where even fascists had enjoyed some success: cultural exports.

In its infancy, the U.S.S.R. experienced an extraordinary flowering of the visual arts, particularly cinema (a medium Lenin preferred). Propagandists were recruited from Russia’s thriving avant-garde theater community and from Lev Kuleshov’s film school, where the degree and density of talent is still almost without precedent (the early years of Walt Disney’s animation studio or the informal confederacy of filmmakers associated with the French New Wave come to mind). Lenin asked for Communist propaganda, and the artists delivered – in part because they were given freedom over their own work. Abstraction, surrealism, and experimentation were permitted, if not always admired by the Party (especially Trotsky). The Kuleshov school’s chief innovations, Soviet montage and theories of editing, effectively expanded cinematic grammar. Filmmakers throughout the world could convey meaning with greater efficiency, power, and range. These innovations were permanent. They remain embedded in cinema and television. They are so ubiquitous that you forget their lineage.

The montagists helped sell Leninism to illiterate Russian peasants, but their broader impact was almost exclusively on cinematic form, not ideology. Eisenstein and others theorized that the two were inseparable, that Marxism was embedded in their dialectical approach to cinematic expression. The theory would go untested. When Stalin assumed power, he declared Socialist Realism the aesthetic and ethos for all Soviet art. The great Kuleshov school filmmakers either fled Russia, self-censored, or were imprisoned.

Socialist Realism produced some interesting architecture, bold sculpture, and a few decent paintings, but in general it celebrated Russia’s dismal past and more dismal present with either straight representationalism, frightening bravado, or a sentimentality that would make Steven Spielberg sick. After Stalin’s death, aesthetic restrictions loosened and Soviet filmmakers began to enjoy a little freedom. But even the best state-approved Soviet filmmakers were overshadowed by their navel-gazing brethren, filmmakers like Andrei Tarkovsky who appealed to Western taste and defied governmental standards. Tarkovsky was not the cultural ambassador Brezhnev would have picked; he never tempted a Westerner to jump eastward over the Berlin Wall. The petulant streak that allowed Tarkovsky to defy his government also prompted him to piss all over his acclaim in the West (“the cinema,” he told a confused audience after winning the Telluride Medal, “she is a whore”). The majority of Russian filmmakers, however, lacked Tarkovsky’s cojones. And the majority of Soviet films produced between Stalin’s death and glasnost did receive Party approval. But even these films made the U.S.S.R. look like shit.

During the Cold War, nearly all communist propaganda that reached a wide Western audience was produced by Communists in the West. The most effective anti-Communist propaganda to infiltrate the Soviet Union – the conditions and quality of life of the Soviet people – was produced by the Soviet government. Ultimately, Cold War propaganda amounted to two vast spheres of humanity talking to themselves.

Those internal conversations included aesthetics and cultural products we consume regularly today. The conversations produced at least two creative technologies that persist and flourish today: literary minimalism and cinematic montage. They are so pervasive that we barely notice them. Their former political and ideological dimensions may have been “erased,” to borrow Bennett’s phrase. They are certainly innocent where gulags and napalm are concerned. But their parentage is compelling. The cut of a Transformers sequel or the delicacy of an Alice Munro story are just as much relics of the Cold War as missile silos in South Dakota or a toppled statue of Lenin in Kiev.

What-Iffings of Futures Past: Some Reflections on Counterfactual Fiction

By Seth Studer

Historians don’t really like to deal with counterfactuals, with what might have been. They want to talk about history. How the hell do you know…what might have been? Who knows? Well, I know certain things.

– Robert McNamara, The Fog of War 

1. Different, but the Same

In college, I frequently attempted to defuse the awkwardness of a first date by describing counterfactual U.S. elections. Attempting to impress my date, I didn’t confine myself to the obvious reversals (Nixon in 1960, Gore in 2000). I described how Dewey in 1948, Humphrey in 1968, or Bush in 1992 could have happened, detailing both the electoral math and the historical consequences.

I did not go on many second dates.

What if I had suppressed my urge to share these speculations over coffee at the Java House? What if we’d gone to Starbucks instead? What if I had waited until the fourth or fifth date before retrieving those counterfactual electoral maps from my backpack and explaining how a 1948 Dewey victory would have rendered the modern Republican party unrecognizable? Would I be married to someone else today? Would I be single and living in Angola (or East Sudan, a nation that exists in this scenario)?

Such speculation is pointless, because under no conditions would I have suppressed my enthusiasm for alternate history. Not because my enthusiasm is irrepressible, but because any alternative scenario is impossible. What was is, inalterably: this is the first principle of historical analysis. The fact of the past always takes precedence. Conditions for alterity never were. There is no plausible alternate history.

Our grammar disagrees with the best practices of historical scholarship. The subjunctive mood allows for speculation backward and then forward. In philosophy, counterfactual theory is a thing, although it rarely addresses Confederate victories or missing Hitlers[1]. Literary scholars take counterfactual fiction seriously, but questions of how these fictions work – or how the cultural, political, and even grammatical constructions of counterfactuality work – supersede questions of if they work.

And that’s good, because counterfactual histories never work. Unless you narrow your parameters to banal observations (e.g., “if George H.W. Bush died in 1990, Dan Quayle would have become president”), you cannot arrange historical events in any pattern other than their own. Too many factors were/are in play. The past is inert. So what matters in good counterfactual fiction is not a rigorously executed chain of historical causality, but a balance of plausibility and novelty that favors the latter (e.g., “if George H.W. Bush died in 1990, Dan Quayle would have become president, the Soviet Union would have remained intact, and the United States would have been annexed by Canada”).

The best counterfactual fiction adheres to two principles: first, alter as little as possible for the greatest impact. E.g., assassinating Reagan in 1981 changes a lot with a single bullet. Second, make the consequences of your alteration plausible. E.g., kill Reagan, and you might stop neoliberal economic reforms and forestall the end of the Cold War. On its face, that’s a plausible, high-stakes counterfactual. But if American neoliberalism and Soviet collapse were inevitable even without Reagan, the scenario packs less of a punch.

Consequently, alternate history relies on great men and big events, not broad historical forces. An exhausting plurality of counterfactual novels fiddles with World War II (“what if the Nazis won?!”) or the American Civil War (“what if the South won?!”). After that, counterfactualists find easy work riffing on 1492, assassinations, colonial maps, and close elections. Altogether, these subjects surely account for 80% of counterfactual fiction[2]. Because all alternate histories are inherently implausible, a counterfactualist’s goal is not plausibility. His goal is to blunt his scenario’s inherent implausibility. The best counterfactual speculation adheres to the inflexibility of history. No matter how many players you shift in the foreground, the background – economic, social, and material forces – remains. Alternate history should be different, but the same.

Compare alternate history’s most prolific practitioner, Harry Turtledove, to Philip Roth’s counterfactual novel, The Plot Against America (2004). Turtledove’s interventions in Byzantine decline, Euro-American First Contact, the American Civil War, and WWII have massive ramifications, wholly altering the course of history. Special attention is given to great men: generals, explorers, heads of state. Sometimes aliens get involved. Roth’s novel, meanwhile, posits an improbable Charles Lindbergh presidency that realigns U.S.-Nazi relations. The narrative structure is memoir: a fictional Roth, situated in the present, remembers his Jewish childhood in Newark. He was a minor figure with only a civilian’s access to FDR, Lindbergh, and Hitler. By the novel’s end, the world has not substantially changed. History is not irrecoverably altered by Roth’s (fairly significant) alteration; it simply takes a detour around and back to its natural course.

My favorite counterfactual scenario reverses the Vietnam War, a major event in American cultural memory. Consider the following: sometime after Mao assumes power in China, the U.S. intelligence community decides that Ho Chi Minh should be “Asia’s Tito,” an independent Communist leader who helps contain China and the Soviet Union (the U.S. briefly flirted with this policy, which adds the requisite dash of plausibility). This policy hastens French withdrawal and unifies Vietnam. Fifty-eight thousand Americans and millions of Vietnamese do not die. The 1960s are less tumultuous. But Europe and Asia will still recover from WWII by 1970; financialization will occur in all developed economies; the Soviet Union will either weaken or liberalize; economic and demographic realities will force China open; competition will push the U.S. toward China. Secularization, decolonization, civil rights, perestroika, Islamic extremism, and the rise of the BRICs are all inevitable. The dates and details are different, but averting the Vietnam War – an event that affected millions – ultimately changes very little for the billions now living. A seemingly large event is, in the long run, pretty small.

2. Different, Not the Same

Last month, British novelist D.J. Taylor listed his ten favorite counterfactual novels for the Guardian. Half of Taylor’s picks tamper with WWII and adjacent events. These novels, including Roth’s Plot Against America, represent the best of a bad sub-genre. Taylor himself offers a fun twist on the WWII changaroo in his new novel, The Windsor Faction (unread by me). Playing to our recent (and weird) obsession with interwar England, Taylor keeps the rascally Edward VIII on the throne (sadly for fans of Masterpiece and that movie with Colin Firth, Wallis Simpson is apparently dead: this is how Taylor sidesteps abdication)[3]. Edward VIII foils his government’s attempts to undermine Hitler, because…well, once you stray too far from the premise of a counterfactual novel, things tend to get stupid. Unless you get very, very far from the premise, which is what the best novel on Taylor’s list – Kingsley Amis’s The Alteration – does.

Here’s The Alteration in a sentence (spoiler alert): a prepubescent English boy is recruited to be a castrato roughly 450 years after Martin Luther became Pope Germanius I, an event that blunted and contained the Protestant Reformation. The title’s “alteration” refers to both counterfactuality and castration. Most of the novel involves the castrato plot, set in 1976; Amis’s historical alterations and their consequences are background material.

The Alteration is not the sexiest or most exciting counterfactual novel you’ll read, but it may be (ahem) the ballsiest. Not content to reverse world wars, Amis essentially reverses Western civilization[4]. He wants an alternate history that is actually different.

So ask yourself: what is the most important event of the past thousand years (in the West, anyway)? The correct answer is surely the convergence of five or so mega-events, all of which occurred within a 100-year period: Euro-American First Contact, the scientific revolution, the proliferation of print culture, the rise of merchant capitalism, and European colonial hegemony. Smack dab in the middle of these mega-events is the Protestant Reformation, which is directly implicated in or bolstered by all of them. You needn’t agree with Max Weber to know that Protestantism was a prime vector for European modernity. And compared to capitalism, print culture, or First Contact, Luther’s break with Rome is a reasonably simple event to reverse.

But the Reformation is the very definition of an inevitable event. Would-be Reformers had always existed. Luther simply appeared at the right time. Amis knows this. How does he get away with his massive alteration? First, unlike most counterfactualists, he distances his narrative from his counterfactual premise. Amis is more interested in his alternate 1976, where England is Catholic and castrati still sing, than in an alternate Diet of Worms. Luther became Pope a long time ago, under conditions that are plausible only in a blur.

Second, Amis is not especially worried about plausibility. The novel is full of humor, absurdities, and inside jokes. In an odd moment, Amis’s protagonist reads a counterfactual version of Philip K. Dick’s Man in the High Castle, in which Pope Germanius I remained Martin Luther the Reformer[5]. Dick’s real-life Man in the High Castle (1962) is the gold standard (hey, what if we’d never ended the gold standard??) of WWII revision. Like Roth and Amis, Dick circumvents the problem of plausibility through distance: he does not observe counterfactual events up close or as they occur. The Alteration‘s fictional Dick is one of numerous devices Amis uses to suspend plausibility, distracting the reader with irony and levity.

Finally, Amis allows for historical inevitability. A small reformation still occurs and anti-Catholic radicals (called Schismatics) still find refuge in North America. Without a capital-R Reformation, global capitalism and modern technology are inhibited. But they appear, albeit in atavistic form. European colonialism is severely stunted, but there are still colonies. Amis retains enough history to authenticate his massive alteration. In most plausible alternate histories, little things change but big things remain the same. The South wins the war, but the slave economy is still doomed. In Amis’s novel, little things persist but the big thing – the trajectory of Western civilization sans Martin Luther – is totally altered.

With a few deft literary maneuvers, Amis imagines vast political, economic, and cultural realignments simply by altering a few 16th century events, all while allowing for history’s inflexibility. In this respect, The Alteration is arguably the most successful counterfactual novel ever written. The result, however, is an unsatisfying novel to most fans of the genre: people like me, who obsess over “what if” scenarios and purchase terrible short story anthologies edited by Harry Turtledove. We don’t want irony or levity or distance. We don’t want alternate histories that acknowledge the impossibility of alternate history. We want options. We want great men: kings and generals and presidents. In other words, we want men and women, great or small, to have a singular impact. We want our votes to count. We want to save Kennedy, elect Gore, kill Hitler, prevent the Vietnam War. We don’t want to be alone, anonymous and without agency, faceless amid the currents of history.

Amis succeeds only because he neuters these desires. In the final pages of The Alteration, the castration that Amis’s heroes have fought so hard to prevent occurs. It occurs in a bizarre and wholly unexpected context, but it occurs nonetheless. It was inevitable. There can be no different ending.

—–

[1] These rigorous counterfactual inquiries, while interesting, are too abstract (“if x then not y”) and too narrow (“if president dies, vice president succeeds him”) to describe what would’ve happened if JFK survived and why that would’ve been awesome and/or terrible.

[2] Too few counterfactual novels tinker with World War I. British neutrality is a fascinating and plausible scenario.

[3] Cultural historians will one day study the Obama-era fascination with Edwardian English aristocracy and Wallis Simpson alongside the Reagan era’s Australia fixation and the fact that everyone stopped playing Guitar Hero the moment George W. Bush stopped being president.

[4] Many counterfactualist authors have attempted similarly seismic shifts, with mixed results. Turtledove’s attempt is literally seismic, fulfilling Barry Goldwater’s dream of detaching the eastern seaboard from the rest of North America.

[5] This is a device Dick actually uses in Man in the High Castle: characters read a counterfactual novel-within-the-novel whose plot aligns with actual history. Amis playfully uses Dick as a nexus of counterfactual paradoxes.

This Is Your Brain on Books

By Seth Studer

Earlier this week, OnFiction published the results of a recent study on the biological effects of reading fiction. Researchers at Emory University used MRI scanners to track the brain activity of nineteen participants, all reading the same novel (Robert Harris’s historical thriller Pompeii). The researchers focused on “functional connectivity,” the degree to which activity in one region of the brain prompts or correlates with activity in another region. Basically, your brain’s ability to talk to itself. Participants’ brains were scanned while reading, immediately after reading, and five days after completing the novel. OnFiction described the results:

[The researchers] identified a number of brain networks that got stronger (i.e., the different regions became more closely associated) during the reading period. They also uncovered a network of brain regions that became more robust over the reading period but also appeared to persist after the participants had finished reading. Although this network got weaker over time after the reading period, the correlations between the brain regions in this network were still stronger than those observed prior to the reading period.

Conclusion? Surprise, reading makes you smarter! Or, reading helps your brain make neurological connections more briskly. Those non-adjacent neurons that light up while you’re reading Starship Troopers are potentially responsible for language and comprehension skills (kinda seems obvious, right?), but the researchers aren’t sure yet: the brain remains too dense and mysterious to definitively map. So some of those neurons might be responsible for something totally unrelated to language but related to fiction-processing. Which, for literary scholars, would be awesome to learn about.

Either way: when you read, your brain lights up.
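If “functional connectivity” sounds abstract, here is a minimal sketch of the basic idea, in Python. This is my own toy illustration, not the Emory team’s actual pipeline: the simulated signals, region count, and shared “driver” are all made up for demonstration. The point is simply that connectivity, in its plainest form, is the correlation between activity time series recorded from different brain regions.

```python
# Toy illustration of "functional connectivity": pairwise correlation
# between activity time series from different brain regions.
# Simulated data only -- not the Emory study's actual analysis.
import numpy as np

rng = np.random.default_rng(0)
n_timepoints, n_regions = 200, 4

# Fake BOLD-like signals; regions 0 and 1 share a common driver,
# so their correlation (their "connectivity") should come out stronger.
driver = rng.normal(size=n_timepoints)
signals = rng.normal(size=(n_timepoints, n_regions))
signals[:, 0] += driver
signals[:, 1] += driver

# Functional connectivity, in its simplest form: the matrix of
# pairwise Pearson correlations between regional time series.
connectivity = np.corrcoef(signals, rowvar=False)
print(np.round(connectivity, 2))
```

Stronger connectivity after a week of reading, in this framing, just means those pairwise correlations crept upward and stayed up.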

The Emory study focuses on neurological responses to a single novel. But earlier this month, OnFiction reported another study that seemed to demonstrate a measurable difference between “literary fiction” and pulp: a difference many literary scholars spent thirty or more years dismissing. Two psychologists at the New School for Social Research gave readers randomly assigned texts – some “highbrow,” others “lowbrow,” others nonfiction – and afterward measured the readers’ ability to empathize with others (aka “Theory of Mind”). Participants who read a highbrow text were consistently more empathetic than participants who read the lowbrow text.

In other words, if you need a ruthless hitman, don’t hire the one reading Anna Karenina.

The results of this study were published in Science and discussed on NPR’s All Things Considered. You can hear the audio clip or read the transcript here (I recommend listening to the audio, to experience the full effect of the Danielle Steel/Louise Erdrich pairing).

Gregory Berns, team leader of the first study, is a neuroscientist who has used neurological approaches to economics and sociology. Now he has his eyes on literary analysis. But lit scholars are traditionally wary of theories and methods that appear too positivist, empirical, or quantitative. (Celebrity scientists who condescend and prescribe cures for the humanities without really understanding what humanists actually do aren’t helping.) Much of this wariness comes from decades of disciplinary isolation: C.P. Snow’s “two cultures.” Some of it comes from the academic turf wars and ideological disputes of the 1980s. In the late ’90s, something like Franco Moretti’s amazing Literary Lab would’ve had to be developed slowly and with care, so as not to cause too much of a ruckus. Add a dash of quantitative reasoning in one article, use a database in another, publish a groundbreaking polemic, ensure that you already have tenure and academic fame, and now you’re ready to be semi-empirical without overwhelming backlash!

Of course, so much has changed since the early 2000s. The so-called “Digital Humanities” (a term that seems to mean everything and nothing) has made statistics ‘n’ stuff more palatable to humanists, and the pioneering work of scholars like Nicholas Dames has made science less scary. Today, you can’t go to a literature conference without a panel on cognitive science and another on economic theory. The “two cultures” are intermingling, beginning with the social sciences, which overlap with humanist concerns more explicitly than, say, physics does. But the studies featured on OnFiction this week should not be dismissed. They aren’t perfect, but their methodologies offer rigorous and robust approaches to literary experience.

Nazism, Bigamy, and the Problem of Paul de Man

By Seth Studer

It’s time to beat up on Paul de Man again.

And yes, he probably deserves it.

In Monday’s Chronicle of Higher Education, Tom Bartlett revealed the juicy details of Evelyn Barish’s new biography The Double Life of Paul de Man (due out in March 2014). Barish suggests that de Man emigrated from Belgium in 1947 to escape embezzlement charges. He was eventually convicted in absentia of stealing one million Belgian francs (roughly US$300k today) from his own publishing house. Barish also discovered that de Man never held an undergraduate degree, and that in his interactions with friends, family, and colleagues, he was sometimes a total dick.

This in addition to what we already knew: de Man was a deadbeat dad, a temporary bigamist, and the author of several blatantly anti-Semitic articles for a pro-Nazi newspaper during the German occupation of Belgium. The articles were disclosed in 1987, three years after de Man’s death. English professors across the nation responded with horror (or schadenfreude) because, throughout the 1960s and ’70s, Paul de Man led the vanguard that introduced deconstructionist theory into American universities[1]. He was a big deal.

Except he wasn’t.

Unlike his friend and fellow deconstructionist, the Franco-Jewish philosopher Jacques Derrida, de Man focused narrowly on literary language. He argued that literary texts, through their own internal tensions and oppositions, effectively read themselves. (Your copy of Moby Dick is reading itself, even as it sits dusty on your shelf!) Derrida, meanwhile, wrote about everything from semiotics and political philosophy to his pet cat. Derrida’s writing was difficult, but often in a fun way – weird, cheeky, playful.

Also: if you’re a layperson, you’ve probably heard of Derrida. His obit appeared in the New York Times[2]. He was one of several influential French thinkers who emerged alongside 1960s anti-de Gaullist radicalism. You know them by their surnames: Barthes, Foucault, Lacan, Derrida. De Man was the Belgium to their France; he is virtually unknown outside literature departments. But he, more than any other figure, set the hermeneutic agenda for U.S. literature departments in the ’70s and early ’80s.

The news that de Man had authored Nazi propaganda could not have emerged at a worse time for his students (by then major scholars in their own right) or for deconstruction in general[3]. By 1987, cultural studies and politico-ethical concerns were pushing deconstruction out of the humanities. Deconstruction was too apolitical, too textocentric. This was a sideshow in the Culture Wars: as many professors adopted radical politics, ardent deconstructionists appeared reactionary and insular. Meanwhile, deconstruction’s apparent nihilism was being attacked by positivists, scientists, traditionalist lit scholars, and even social conservatives outside the academy. The de Man-Nazi revelation offered proof of what many already suspected: that deconstruction was nefariously closed-off, vapid, repressive, even quasi-totalitarian. By the ’90s, deconstruction had lost its cachet.

The problem is that Paul de Man was so good.

Derrida was unfairly dismissed as an emperor without clothes, but he also reveled in appearing to waltz through the kingdom naked. For a certain type of student (e.g., me), de Man was much more satisfying. De Man explained heady concepts without Derridean playfulness. He wrote heavy, dense, substantive prose. He reads like a serious scholar applying a theory rather than performing or practicing it. My favorite of his essays is “The Rhetoric of Temporality,” an account of how representations of time are the basis of literary language. He describes how well-known devices – allegory, symbolism, irony – interact with time. He slowly develops an argument that slippage occurs between allegory and symbolism in Romantic poetry, despite the Romantics’ best effort to keep them separate. On this premise, he introduces two “modes” of representing time in literature: “allegory” (which partially includes symbolism) and “irony.”

Toward the end of the essay, de Man writes:

The dialectical play between the two modes [allegory and irony]…make up what is called literary history.

Deconstructionist jargon like “play” aside, de Man’s declaration is downright old-fashioned. Here is an account of literary history premised on literary analysis. When I read this in graduate school, it felt ballsy and refreshing. No hedging, no contextualizing, no whining, no kidding around, just straight-up confidence in his own system: “this is literary history.” I was floored.

So as deconstructionists went, de Man was a straight shooter, on the page if not in his life (perhaps he viewed his two wives as “two modes of dialectical play”). Unlike Derrida or even Barthes, de Man wasn’t messing with me, wasn’t trying to fool or trick me. Even if he believed (along with his intellectual kin) that “everything was a text,” he generally confined himself to literary or rhetorical analyses. I continue to find him useful, which I can’t say about most of his contemporaries. De Man’s work represented deconstruction at its best.

But try as I may, I can’t help but detect a bit of the Nazi in it all: the exegetical totality, the confusion (or manipulation) of text and meaning, the all-encompassing instability. And yeah, the biography.

It matters little whether a good physicist was a Nazi, because Nazism probably didn’t contaminate his work. You can kill the Nazi physicist or hire the Nazi physicist, but the physics itself will contain no traces of Nazism. This is slightly less true of a Nazi biologist, who may have covertly adopted Nazi theories of race. For a philosopher, however, the possibility of cross-contamination is so great as to warrant quarantine. Indignant defenders of de Man who separate his scholarship from his anti-Semitic writings are denying this obvious reality. (Derrida’s defense of de Man was better than most because it allowed for cross-contamination[4].)

De Man was a crook and a cheat and a Nazi collaborator. For most literary scholars today, de Man is interesting but irrelevant: deconstruction happened thirty years ago. It had a good run and probably outlasted its expiration date. Meanwhile, those who, like me, find de Man’s insights useful can argue that his political beliefs are functionally irrelevant to his scholarly work. A Chinese wall exists between the Nazism and “The Rhetoric of Temporality.” To reject or to deny? Neither option is good, and Paul de Man isn’t going anywhere, as Barish’s biography proves.

Literary scholars don’t sever Barthes or Foucault from their social, historical, and ideological roots. De Man should be no exception. It’s naive to believe that, before de Man, the humanities weren’t already poisoned by the ugliest ideologies, but it’s impossible to ignore his collaboration with Nazism. So what would it mean to accept both the scholarship and the potential evil attached to it? To not only refuse to let ourselves off the hook, but to actively get on the hook? De Man offered a compelling and useful explanation of literary language, and he also used the written word to collaborate with Nazis. Does deconstruction have Nazi roots? I don’t trust anyone who says “no” reflexively.

C’mon, let’s not be dismissive or defensive or squeamish! Let’s not be afraid of a little blood on our hands!

———

[1] You might think you know what “deconstruction” is, and you’re probably wrong. But you’re also probably correct, more or less. From a literary standpoint, deconstruction holds that a poem (or whatever) consists of oppositions that differ and defer to each other in a process Derrida called “play.” This play both creates and subverts the meaning of the poem (or whatever). For de Man, this meant that a poem (or whatever) is self-interpreting.

[2] Derrida’s obituary was a minor literary event in humanities programs. I’ve seen it assigned on English syllabi, as an instance of productive misreading or something.

[3] My favorite student of de Man is the late Barbara Johnson, who applied his theories of literary language with intelligence and clarity to topics ranging from Melville’s Billy Budd to the rhetoric of abortion. Her 1994 book The Wake of Deconstruction describes the de Man scandal.

[4] Derrida, who as a Jewish child was persecuted by the Vichy French government, defended his friend in typical Derridean fashion: he tweaked the anti-Semitic language and found differing oppositions. The full defense is not available online as far as I can tell, but its substance can be gleaned from Jon Wiener’s intelligent, and disapproving, analysis.