Would You Rather Be Rich in the Past or ‘Comfortable’ Today?

By Kindred Winecoff

Scott Sumner:

In a recent post I suggested that one could argue that the entire increase in per capita income over the past 50 years was pure inflation (and hence that real GDP per capita didn’t rise at all.) But also that one could equally well argue that there has been no inflation over the past 50 years. The official government figures show real GDP/person rising slightly more than 150% since 1964, whereas the PCE deflator is up about 6-fold. …

Here’s one thought experiment. Get a department store catalog from today, and compare it to a catalog from 1964. (I recently saw Don Boudreaux do something similar at a conference.) Almost any millennial would rather shop out of the modern catalog, even with the same nominal amount of money to spend. Of course that’s just goods; there is also services, which have risen much faster in price. OK, so ask a millennial whether they’d rather live today on $100,000/year, or back in 1964 with the same nominal income. Recall the rotary phones and bulky cameras. The cars that rusted out frequently. Cars that you couldn’t count on to start on a cold morning. I recall getting cavities filled in 1964, without Novocaine. Not fun. No internet. Crappy TVs, where you have to constantly move the rabbit ears on top to get a decent picture. Lame black and white sitcoms, with 3 channels to choose from. Shorter life expectancy, even for the affluent. No Thai restaurants, sushi places or Starbucks. It’s steak and potatoes. Now against all that is the fact that someone making $100,000/year in 1964 was pretty rich, so your social standing was much higher than that income today. So it’s a close call, maybe living standards have risen for people making $100,000/year, maybe not. Zero inflation in the past 50 years may not be right, but it’s a reasonable estimate for a millennial, grounded in utility theory. In which period does $100,000 buy more happiness? We don’t know.

I think if we really don’t know the answer to this question, it’s only because happiness is subjective. To me it’s obvious that a $100,000/year salary is worth more today than it used to be. For one thing, in 1964 tax rates in basically every Western economy were absurdly high, so that $100,000 would really be somewhere between $10,000 and $30,000 after taxes. George Harrison wasn’t exaggerating; how would you like to live in a country where your best artists and creators were forced into (or simply chose) tax exile?

But let’s leave that aside for now. In 1964 a $100,000 salary would make you an elite, but your real income would actually be much smaller than that because of all of the 2014 goods you could not purchase at any price. Sumner runs many of them down, but the point is that $100,000 is still enough to live quite well in this country — even in the expensive cities — and the range of choice has exploded, with many of the modern choices now coming at very low cost.
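As a rough illustration, here is the nominal-versus-real arithmetic in a few lines of Python. This is a minimal sketch using Sumner's approximate figures quoted above (a PCE deflator up about 6-fold since 1964), not official data:

```python
# Back-of-the-envelope comparison using Sumner's rough figures, which are
# approximations: the PCE deflator is up about 6x since 1964.
PCE_DEFLATOR_RATIO = 6.0   # approximate 2014 price level / 1964 price level

nominal_income = 100_000

# The same nominal income restated in 2014 prices:
in_2014_prices = nominal_income * PCE_DEFLATOR_RATIO
print(f"$100,000 in 1964 bought roughly what ${in_2014_prices:,.0f} buys in 2014")

# And the reverse: what $100,000 today corresponds to in 1964 prices.
in_1964_prices = nominal_income / PCE_DEFLATOR_RATIO
print(f"$100,000 in 2014 corresponds to about ${in_1964_prices:,.0f} in 1964")
```

The whole dispute, of course, is over whether that 6x deflator means anything at all when so many 2014 goods could not be bought in 1964 at any price.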

Let’s not forget that politics was quite different in 1964 as well: segregation persisted, the Cold War was raging, and even in the U.S. the “elite” were defined as much by their pedigree as income. We weren’t far removed from McCarthy, and were in the midst of a succession of assassinations of American political leaders and overt revolutionary threats in many Western societies. No birth control, no abortion, few rights for women and homosexuals in general. Being an elite in that world would likely feel very uncomfortable, and of course this blog (and essentially all media I consume) wouldn’t exist. So for me 2014 is the obvious choice.

Tyler Cowen has a more interesting question:

But here’s the catch: would you rather have net nominal 20k today or in 1964? I would opt for 1964, where you would be quite prosperous and could track the career of Miles Davis and hear the Horowitz comeback concert at Carnegie Hall. (To push along the scale a bit, $5 nominal in 1964 is clearly worth much more than $5 today nominal. Back then you might eat the world’s best piece of fish for that much.)

I’m still not sure. $20k/year back then wouldn’t be enough to make you very well off, and the marginal cost of culture consumption today has sunk almost to zero. Was Miles Davis really so much better than anyone working today? For everyone in the world who does not live in NYC, is it better to be able to watch his concerts on YouTube now, and on demand, than not to have seen them at all? Lenny Bruce was still active in 1964 but almost no one ever saw him (for both technological and political reasons). I might still take the $20k today, and I’ve lived on less than that for my entire adult life until last year, so this is an informed choice. But I agree that it’s a much more difficult decision.

It is an interesting question, mostly because it reveals what people value most. It’s a mutation of the “veil of ignorance”. So what would you choose?


14 Reasons Susan Sontag Invented Buzzfeed!

By Seth Studer

If you’re looking for a progenitor of our list-infested social media, you could do worse than return to one of the most prominent and self-conscious public intellectuals of the last half century. The Los Angeles Review of Books just published an excellent article by Jeremy Schmidt and Jacquelyn Ardam on Susan Sontag’s private hard drives, the contents of which have recently been analyzed and archived by UCLA. Nude photos have yet to circulate through shadowy digital networks (probably because Sontag herself made them readily available – Google Image, if you like), and most of the hard drives’ content is pretty mundane. But is that going to stop humanists from drawing broad socio-cultural conclusions from it?

Is the Pope Catholic?

Did Susan Sontag shop at Sephora?

Sontag, whose work is too accessible and whose analyses are too wide-ranging for serious theory-heads, has enjoyed a renaissance since her death, not as a critic but as an historical figure. She’s one of those authors now, like Marshall McLuhan or Norman Mailer, a one-time cultural institution become primary text. A period marker. You don’t take them seriously, but you take the fact of them seriously.

Sontag was also notable for her liberal use of lists in her essays.

“The archive,” meanwhile, has been an obsession in the humanities since Foucault arrived on these shores in the eighties, but in the new millennium, this obsession has turned far more empirical, more attuned to materiality, minutiae, ephemera, and marginalia. The frequently invoked but still inchoate field of “digital humanities” was founded in part to describe the work of digitizing all this…stuff. Hard drives are making this work all the more interesting, because they arrive in the archive pre-digitized. Schmidt and Ardam write:

All archival labor negotiates the twin responsibilities of preservation and access. The UCLA archivists hope to provide researchers with an opportunity to encounter the old-school, non-digital portion of the Sontag collection in something close to its original order and form, but while processing that collection they remove paper clips (problem: rust) and rubber bands (problems: degradation, stickiness, stains) from Sontag’s stacks of papers, and add triangular plastic clips, manila folders, storage boxes, and metadata. They know that “original order” is something of a fantasy: in archival theory, that phrase generally signifies the state of the collection at the moment of donation, but that state itself is often open to interpretation.

Microsoft Word docs, emails, jpegs, and MP3s add a whole slew of new decisions to this delicate balancing act. The archivist must wrangle these sorts of files into usable formats by addressing problems of outdated hardware and software, proliferating versions of documents, and the ease with which such files change and update on their own. A key tool in the War on Flux sounds a bit like a comic-book villain: Deep Freeze. Through a combination of hardware and software interventions, the Deep Freeze program preserves (at the binary level of 0’s and 1’s) a particular “desired configuration” in order to maintain the authenticity and preservation of data.

Coincidentally, I spent much of this morning delving into my own hard drive, which contains documents from five previous hard drives, stored in folders titled “Old Stuff” which themselves contain more folders from older hard drives, also titled “Old Stuff.” The “stuff” is poorly organized: drafts of dissertation chapters, half-written essays, photos, untold numbers of .jpgs from the Internet that, for reasons usually obscure now, prompted me to click “Save Image As….” Apparently Sontag’s hard drives were much the same. But Deep Freeze managed to edit the chaos down to a single IBM laptop, available for perusal by scholars and Sontag junkies. Schmidt and Ardam reflect on the end product:

Sontag is — serendipitously, it seems — an ideal subject for exploring the new horizon of the born-digital archive, for the tension between preservation and flux that the electronic archive renders visible is anticipated in Sontag’s own writing. Any Sontag lover knows that the author was an inveterate list-maker. Her journals…are filled with lists, her best-known essay, “Notes on ‘Camp’” (1964), takes the form of a list, and now we know that her computer was filled with lists as well: of movies to see, chores to do, books to re-read. In 1967, the young Sontag explains what she calls her “compulsion to make lists” in her diary. She writes that by making lists, “I perceive value, I confer value, I create value, I even create — or guarantee — existence.”

As reviewers are fond of noting, the list emerges from Sontag’s diaries as the author’s signature form. … The result of her “compulsion” not just to inventory but to reduce the world to a collection of scrutable parts, the list, Sontag’s archive makes clear, is always unstable, always ready to be added to or subtracted from. The list is a form of flux.

The lists that populate Sontag’s digital archive range from the short to the wonderfully massive. In one, Sontag — always the connoisseur — lists not her favorite drinks, but the “best” ones. The best dry white wines, the best tequilas. (She includes a note that Patrón is pronounced “with a long o.”) More tantalizing is a folder labeled “Word Hoard,” which contains three long lists of single words with occasional annotations. “Adjectives” is 162 pages, “Nouns” is 54 pages, and “Verbs” is 31 pages. Here, Sontag would seem to be a connoisseur of language. But are these words to use in her writing? Words not to use? Fun words? Bad words? New words? What do “rufous,” “rubbery,” “ineluctable,” “horny,” “hoydenish,” and “zany” have in common, other than that they populate her 162-page list of adjectives? … [T]he Sontag laptop is filled with lists of movies in the form of similar but not identical documents with labels such as “150 Films,” “200 Films,” and “250 Films.” The titles are not quite accurate. “150 Films” contains only 110 entries, while “250 Films” is a list of 209. It appears that Sontag added to, deleted from, rearranged, and saved these lists under different titles over the course of a decade.

“Faced with multiple copies of similar lists,” continue Schmidt and Ardam, “we’re tempted to read meaning into their differences: why does Sontag keep changing the place of Godard’s Passion? How should we read the mitosis of ‘250 Films’ into subcategories (films by nationality, films of ‘moral transformation’)? We know that Sontag was a cinephile; what if anything do these ever-proliferating Word documents tell us about her that we didn’t already know?” The last question hits a nerve for both academic humanists and the culture at large (Sontag’s dual audiences).

Through much of the past 15 years, literary scholarship could feel like stamp collecting. For a while, the field of Victorian literary studies resembled the tinkering, amateurish, bric-a-brac style of Victorian culture itself, a new bit of allegedly consequential ephemera in every issue of every journal. Pre-digitized archives offer a new twist on this material. Schmidt and Ardam: “The born-digital archive asks us to interpret not smudges and cross-outs but many, many copies of almost-the-same-thing.” This type of scholarship provides a strong empirical base for broader claims (the kind Sontag favored), but the base threatens to support only a single, towering column, ornate but structurally superfluous. Even good humanist scholarship – the gold standard in my own field remains Mark McGurl’s 2009 The Program Era – can begin to feel like an Apollonian gasket: it contains elaborate intellectual gyrations but never quite extends beyond its own circle. (This did not happen in Victorian studies, by the way; as usual, they remain at the methodological cutting edge of literary studies, pioneering cross-disciplinary approaches to reading, reviving and revising the best of old theories.) My least favorite sentence in any literary study is the one in which the author disclaims generalizability and discourages attaching any broader significance or application to the study. This is one reason why literary theory courses not only offer no stable definition of “literature” (as the E.O. Wilsons of the world would have us do) but also frequently fail to introduce students to the many tentative or working definitions from the long history of literary criticism. (We should at least offer our students a list!)

In short, when faced with the question, “What do we do with all this…stuff?” or “What’s the point of all this?”, literary scholars all too often have little to say. It’s not a lack of consensus; it’s an actual lack of answers. Increasingly, and encouragingly, one hears that a broader application of the empiricist tendency is the next horizon in literary studies. (How such an application will fit into the increasingly narrow scope of the American university is an altogether different and more vexing problem.)

Sontag’s obsession with lists resonates more directly with the culture at large. The Onion’s spin-off site ClickHole is the apotheosis of post-Facebook Internet culture. Its genius is not for parody but for distillation. The authors at ClickHole strip the substance of clickbait – attention-grabbing headlines, taxonomic quizzes, and endless lists – to the bone of its essential logic. This logic is twofold. All effective clickbait relies on the narcissism of the reader to bait the hook and on banal summaries of basic truths once the catch is secure. The structure of “8 Ways Your Life Is Like Harry Potter” would differ little from “8 Ways Your Life Isn’t Like Harry Potter.” A list, like a personality quiz, is especially effective as clickbait because it condenses a complex but recognizable reality into an index of accessible particularities. “Sontag’s lists are both summary and sprawl,” write Schmidt and Ardam, and much the same could be said of the lists endlessly churned out by Buzzfeed, which constitute both a structure of knowledge and a style of knowing to which Sontag herself made significant contributions. Her best writing offered the content of scholarly discourse in a structure and style that not only eschewed the conventions of academic prose, but encouraged reading practices in which readers actively organize, index, and codify their experience – or even their identity – vis a vis whatever the topic may be. Such is the power of lists. This power precedes Sontag, of course. But she was a master practitioner and aware of the list’s potential in the new century, when reading practices would become increasingly democratic and participatory (and accrue all the pitfalls and dangers of democracy and participation). If you don’t think Buzzfeed is aware of that, you aren’t giving them enough credit.

Building a Better Middlebrow: the Case of Ken Burns’s “The Roosevelts,” Pt. 1

By Seth Studer

Ken Burns

Preface: No spoilers, please…

I am not yet finished watching Ken Burns’s fourteen-hour saga The Roosevelts: An Intimate History. Nevertheless, I can already reflect on what Burns’s latest contribution tells us about the much-touted “Golden Age of Television.” An historical documentary on PBS spanning fourteen hours, most of it made up of black-and-white archival footage and Baby Boomer talking heads (e.g., George Will, Doris Kearns Goodwin, and one or two real historians), is being sold to the American public as “intimate.” And the series is intimate; Burns’s focus almost never turns from Roosevelts Teddy, Franklin, or Eleanor. The Roosevelts is easily his most intimate portrayal of a Great American (or, in this case, a Great American Family), and it reflects his growth as a filmmaker over the last half-decade, beginning with The National Parks: America’s Best Idea (a hot mess, to be sure, but a beautiful hot mess) and Prohibition (a tight little policy pic – his best film). In many ways, The Roosevelts is a return to the Burns I knew and hated in The Civil War and Jazz. But he’s returned wiser, sharper. His obnoxious Great Man, Big Battles gloss on the byzantine complexities of American social and political history has never, ever looked so good or contained so much substance. We can learn a lot from Burns’s most recent hybrid success-failure. Specifically, how to build a better middlebrow within American mass culture: a middlebrow it deserves and, I think, a middlebrow it wants.

The Roosevelts’ final episode aired last Saturday, but I’m not worried about catching up. Since the middle of last week, PBS has posted the following message to my Facebook feed at least twelve times: “Remember: you can binge watch the ENTIRE series – until Sept 28th – on your local PBS station’s website or Roku.” Today, the most consistent and interesting purveyor of American middlebrow culture is AMC. Mad Men, Breaking Bad, The Walking Dead: the pretensions of HBO with half of the budget and twice the accessibility. And AMC uses the exact same language to sell me Mad Men that PBS is using to sell me The Roosevelts.

"Binge all over me," says Betty Draper.
“Binge all over me,” says Betty Draper.

Much like Netflix, which has built a business model premised on its customers’ desire to “binge” on original content (we all finish House of Cards and Orange is the New Black knowing full well it will be an entire year before we get new episodes), AMC is encouraging its audiences to consume its products in the manner of a frat boy seeking to increase his blood alcohol content as quickly as possible, or in the manner of a psychologically distressed person for whom food is a dangerous psycho-physiological outlet. Given the well-established link between consumption, consumerism, and sex (“INDULGE” is the word they coupled with Christina Hendricks’s Joan Harris), no one is really surprised by AMC’s ad campaign. But when the same tactics are applied to a 14-hour documentary about Eleanor Roosevelt, the time has come to ask some interesting questions.

Part One: Ken Burns – not a Historian, but he plays one on TV!

Throughout the 1980s, Ken Burns directed small documentaries on topics ranging from the Shakers to Huey Long and the Statue of Liberty. In 1990, he earned national fame for his seventh documentary, The Civil War, a nearly twelve-hour film about the Conflagration Between the States that, amazingly, managed to say very little about the causes – social, political, and cultural – of the war itself. A viewer could watch all 690 minutes of Ken Burns’s Civil War and learn nothing about the Civil War. Besides the battles, of course. Burns spends as much time on the Battle of Chattanooga (the third most important battle fought in Tennessee, the second or third least important state in the Confederacy) as he spends on the policy battles that raged between Lincoln, his advisors, and the Congress; or the internal divisions and resentments within the Confederacy itself, which did as much to weaken their cause as the Union juggernaut. Slavery is discussed, obviously, but as a fact and not a consequence of U.S. policy; the impact of its demise on U.S. politics is minimized. Every single black character is voiced by Morgan Freeman, who gravely intones the words of Frederick Douglass and then hams it up, Stepin Fetchit-style, when reading the words of perfectly literate enslaved (or merely working class) black men.

If Burns’s later films would suffer from an overemphasis on personalities, his Civil War underplays them in favor of events. Lincoln’s political acumen; Grant and Sherman’s brutal tactical genius; the stubborn dignity of black leaders who, receiving emancipation, refused to prostrate themselves before Northern whites; the grace with which many Confederate leaders, Lee among them, accepted defeat; all of these Great Persons are overshadowed by Great Battles, so that viewers in every media market from Picacho Pass to Pennsylvania could look out across their amber waves of telephone wire and pavement and intone, “It happened here.”

Among the talking heads, the thickly accented Shelby Foote utterly consumes Burns’s Civil War. He appears at least ten times more frequently than any other historian or author. Foote is a documentarian’s dream: folksy, charismatic, intellectual, and a born storyteller. But Foote is also kind of an idiot. When he volunteers to name “two authentic geniuses” produced by a war that gave America seven presidents, he identifies Abraham Lincoln (one of the great statesmen of the nineteenth century, along with Benjamin Disraeli and Otto von Bismarck) and Nathan Bedford Forrest (a lieutenant general in the Confederate army and founder of the first iteration of the terroristic Ku Klux Klan). This declaration apparently once placed Foote in hot water with a Southern relative, who grimly intoned, “We never thought much of Mr. Lincoln down here.” Foote chuckles in response to his own anecdote. Southerners have strange feelings about that war, he observes.

Ya think?

Foote is not a Confederate partisan. He is simply a Civil War buff. But a buff is the most dangerous kind of historian. I am a Nixon/Watergate buff, which is why I am reluctant to make major claims about the man or the event. Foote has made a career buffing up the Civil War, giving it sheen but no shine, clearing away dirt but revealing nothing. Burns is in awe of Foote, whose volumes on the Civil War constitute the kind of history most popular with “buffs”: battles, more battles, personalities on the field, more battles, blood, guts, glory. We remember the names of colonels and privates but none of the congressmen. We learn more about Forrest than we learn about William Seward, Charles Sumner, Thaddeus Stevens, Alexander Stephens, or Judah P. Benjamin.

Here’s a tip: any middlebrow history of the American Civil War that does not begin – begin – with transatlantic trade, not merely of slaves but of all goods, is lying to you. Period.

Between The Civil War and The Roosevelts, Ken Burns’s style underwent significant improvement. He produced two “event” histories of Baseball and Jazz, widely praised except by hardcore fans of baseball and jazz, alongside shorter treatments of subjects we portray on banknotes and passports: Lewis and Clark, Thomas Jefferson, Mark Twain, and westward expansion. In 2007, he attempted to catch the White Whale of all American historical narratives, World War II, but took such a circumspect route – no straightforward, consensus-minded historical narrative; firsthand accounts from veterans; a “bottom-up” approach to major events – that he confused most of his viewers, who came expecting “the Burns treatment” (letters from Eisenhower, Tom Hanks as Patton, David McCullough’s eyebrows, etc.). The War was a failure.

By the end of The War, Burns seemed finally to grasp his own unique strengths and limitations. On the one hand, he could spew middlebrow schlock about the United States of America better than anyone. On the other hand, he had a tendency to attach himself to Great White Dudes (Shelby Foote, Thomas Jefferson) and no capacity to represent the subtle movements upon which history progresses. Why not, then, spew schlock and attach himself to lesser-known, more interesting Great White Dudes? And why not cast these Dudes in a story less obvious than, say, THE CIVIL WAR or THE WEST?

What followed were the best documentaries Ken Burns has yet made.

In my next post: The National Parks, Prohibition, The Roosevelts, and Burns in TV’s “Golden Age”

Yes, We Should Shame and Punish Racists

By Kindred Winecoff

Contra Graham below, I do not think Donald Sterling should’ve been treated lightly. My disagreement with him stems from his assertion that “the goal” is to convince racists to not be racists. I do not think that is the appropriate aspiration in all cases, including this one.

It clearly is not the NBA’s goal. The NBA’s goal is to disassociate from racists. This is not only wise in an immediate instrumental sense — the players were going to strike if Sterling was not suspended, putting the playoffs in jeopardy — but in a broader relational sense. It is not sensible for me to join the Westboro Baptist Church in order to patiently sit with them and try to persuade them to change their minds. It is nonsensical for at least two reasons. The first is that voluntary association is a legitimating act. The second is that it is extremely unlikely to be successful.

Reams of research suggest that political views seldom change. When they do it is usually as the result of new exposure: having to directly view someone else’s plight for the first time. Donald Sterling’s problem is not his under-exposure to African-American athletes and their supporters. “Raising awareness” of the problem with his behavior is not likely to help him change his mind. In such circumstances patient explanation and “debunking” frequently exacerbate the problem.

Moreover, Sterling has shown no interest at all in having an open discussion. To my knowledge he has not publicly acknowledged this situation, nor has he apologized. His wife — co-owner of the team and also a racist — is suing Sterling’s former mistress for leaking the tape. These people are trying to use their enormous fortune to materially harm the lives of those who object to their execrable behavior. I’m sorry but there’s no way to reason with people in an environment such as this.

So persuasion is usually wasted effort and may worsen the problem. There is another goal: punishment. Clearly the NBA wishes to punish Sterling materially as well as to disassociate from him and shame him, and this is appropriate. I do not think it matters that this was a private conversation, as the purpose of the conversation was for Sterling to instruct someone over whom he had a great deal of influence to actively discriminate against others. Sterling’s talk was not idle, in other words, and he had many, many priors. Punishment for its own sake is appropriate in cases where people have harmed others.

While I admire Graham’s eagerness to raise the status of deliberation, and I share his skepticism of the motivations behind society’s tendency to veer from one Two Minutes Hate to another, I think in this case (and cases like it) his patience is unwise. If nothing else it legitimizes beliefs that ought not be legitimized by treating them as worth consideration.

Against Neil deGrasse Tyson: a Longer Polemic

By Seth Studer

In her recent Atlantic review of two new books on atheism, Emma Green brilliantly demarcates what is missing from the now decade-long insurgency of anti-ideological atheism. I use the term “anti-ideological atheism” instead of “neo-atheism” or “new atheism” or the obnoxious, self-applied moniker “noes” because opposition to ideology – to ideational constructions – is one of the major recurring threads among these varied atheist identities (a frightening mixture of elitism and populism is another). Green illustrates this point when she notes the incongruity between Peter Watson’s new history of post-Enlightenment atheism, Age of Atheists, and the kind of atheism most vocally espoused in the 21st century. The central figure in Watson’s study, Friedrich Nietzsche, is almost never cited by Richard Dawkins or Samuel Harris or Neil deGrasse Tyson. Nor, for that matter, are Nietzsche’s atheistic precursors or his atheistic descendants…all diverse in thought, all of whom would have been essential reading for any atheist prior to, well, now.

The most famous atheist, the one whose most famous quote – GOD IS DEAD – you scrawled with a Sharpie on the inside door of your junior high locker, is almost persona non grata among our most prominent living atheists. His near-contemporary, Charles Darwin (hardly anyone’s idea of a model atheist), is the belle of the bellicose non-believer’s ball.

Green also notes that the other famous 19th century atheist – Karl Marx, whose account of religious belief vis a vis human consciousness is still convincing, at least more so than Nietzsche’s – likewise goes uncited by our popular atheists. The reason may be simple: invocations of Marx don’t score popularity points anymore, and the business of anti-ideological atheism is nothing if not a business.

But there is, I believe, a larger reason for the absence of Nietzsche, Marx, and almost all other important atheists from today’s anti-ideological atheism. As fellow Jilter Graham Peterson recently said to me, these popular atheists need a dose of humanities: liberal inquiry and a sense that truth is hard, not dispensable in easy little bits like Pez candies. I would expand on that: they need a more dynamic discursivity, they need more contentiousness, they need more classical, humanist-style debate. They need the kind of thinking that frequently accompanies or produces ideology.

But of course, most of them don’t want that. They resist Nietzsche’s ideological critiques. They resist Marx who, despite his inherent materialism, is more systematically ideological than, say, Darwin. Sigmund Freud (who dedicated an entire tract to atheism and who is central to its 20th century development) is never mentioned, along with a host of other names.

And they do not invite new critiques – except, apparently, from Young Earth Creationists.

The title of Green’s review is pitch perfect: “The Intellectual Snobbery of Conspicuous Atheism: Beyond the argument that faith in God is irrational—and therefore illegitimate.” Contrary to what Richard Dawkins and others might claim, atheists are not a persecuted minority in the West (any group consisting mostly of white men is always eager to squeeze and contort their way into “persecuted minority” status, even as persecuted minorities struggle to push out). Anti-ideological atheism is declared conspicuously, a badge of honor and a sign of intellect. Green quotes Adam Gopnik, who introduces the nauseating term “noes,”

What the noes, whatever their numbers, really have now … is a monopoly on legitimate forms of knowledge about the natural world. They have this monopoly for the same reason that computer manufacturers have an edge over crystal-ball makers: The advantages of having an actual explanation of things and processes are self-evident.

In this respect, the “noes” have “an actual explanation of things” in greater abundance than did Nietzsche or Marx or (especially) the atheists of antiquity; by the same token, the atheists of yore and religious believers have more in common with each other than with the “noes” of today.

In my last post, I shared my thoughts about the meteoric rise of Neil deGrasse Tyson (do meteors rise? I’m sure deGrasse Tyson would have something to say about that bit of rhetorical infactitude). It may seem unfair to pick on deGrasse Tyson when, in reality, I’m bemoaning a phenomenon that began back when George W. Bush used vaguely messianic-Methodist language to frame the invasion of Iraq, an event that, whatever you think of its initial rationalizations, was poorly executed, quickly turned to shit, and set the “War on Terror” back at least a decade. In/around 2004, Richard Dawkins (who is still the author of the best popular overview of natural history ever written) realized that conditions existed for a profitable career shift.

Widespread discontent with politico-religious language in the United States – where right-wing militarists decried the brand of fundamentalist Islam that obliterated lower Manhattan and anti-war leftists decried the (pacifist-by-comparison) brand of fundamentalist Christianity that influenced U.S. policy – coincided with fear of religious extremism in Europe, where the vexed term “Islamophobia” retained some usefulness: legitimate anxieties about theocratic terrorism (e.g., violent anti-Western responses to the deliberately provocative Mohammad cartoons and then the public slaughter of Theo van Gogh) mingled with old-fashioned European xenophobia, which was never a perfect analogue to American xenophobia. And between the U.S. and Europe lies England, where political and public responses to Islamic terrorism involved less of the blustery American gun-slinging or shrill continental nativism and more of the stern appeal to “common sense.” Since the collapse of British colonialism, intellectuals in England have been less apt to use the term civilization than their cousins across the Channel or their cousins across the Pond (where the term has been historically deployed by culture warriors, a la Allan Bloom, in order to give anti-colonial leftists the willies).

The term civilized, on the other hand, is still relevant in English public discourse: not with regard to other societies, but to English society. The concept of civilized discourse (or civilised, if you will) doesn’t seem to carry the same ideological freight as civilization. But when Dawkins mocks post-positivist socio-humanist* analyses of, say, indigenous Amazonian cultures who explain natural phenomena (e.g., how the jaguar got its spots) with traditional tales, his arguments carry the epistemological heft of a suburban Thatcherite scanning his daughter’s contemporary philosophy textbook, throwing his hands in the air, and exclaiming “Oh come on!” In other words, Dawkins belongs to the long line of British “common sense” thinkers. Born in Kenya, raised in Africa, and a fan of Kipling, Dawkins has been criticized for possessing a colonial bent to his thought.

And there’s something to be said for common sense, even common sense colonialism; George Orwell, of all people, joined Rudyard Kipling (one of the most misunderstood writers in the English canon) in defending British colonialism in India on the reasonable (if depressing) grounds that, had the English let India be, the Russians would have colonized the subcontinent. This hardly excuses British crimes against India and its people, but even a cursory overview of Russian colonial atrocities forces one to sigh a very troubled and uncomfortable sigh of – what, relief? – that the British Raj was the guilty party.

Richard Dawkins

But common sense is not fact, much less knowledge, and Dawkins has made a career of playing fast and loose with these concepts. In Unweaving the Rainbow (1998), Dawkins defended science not against the pious but against the epistemological excesses of cultural studies. In one chapter, he wrote that an Amazonian tribesman who is convinced that airplanes are fueled by magic (Dawkins’ examples often play off colonial tropes) and the socio-humanist (usually an American cultural studies professor or graduate student in English whose dress and hygiene are dubious and who writes with incomprehensible jargon) who respects the Amazonian’s conviction are both reprehensible, especially the professor, who is an enabler: he could give the ignorant native a cursory lesson in physics, but instead paints a scholarly veneer over so much tribal mumbo-jumbo. Why not explain the real source of wonder and disabuse the native of his false notions: that beautiful physics can explain how people fly!

Despite its best efforts, Unweaving the Rainbow was Dawkins’ first foray into the “Debbie Downer” genre of popular science writing. This genre pits the explanatory power of “scientific knowledge” (more about that term in a moment) against religion, superstition, homeopathy, most of Western philosophy, and pretty much any knowledge acquired by non-quantitative methods or left unverified by quantitative ones.

The “Debbie Downer” genre can be useful, especially when turned on the practice of science itself: Dawkins and his allies have successfully debunked the dogmatism that led Stephen Jay Gould’s career astray. The atrocities of Nazi and Soviet science were exposed and explained with both rigorous science and common sense. The genre can also be used  to wildly miss the point of things. I have friends who are ardent Calvinists or ex-Calvinists, who are incapable of reading Paul’s epistles without a Calvinist interpretation. They read Paul, but all they see is Calvinism. Likewise with fundamentalists and anti-ideological atheists who read Genesis but only see cosmology. Yet Paul was not a Calvinist, and Genesis is not cosmology. In some sense, the same principle applies to deGrasse Tyson and Gravity. Is this a question of knowing too much or thinking too little?  

In Unweaving the Rainbow, Dawkins confronts the charge that science takes all the fun and beauty out of the world just by, y’know, ‘splainin’ it. Somewhat comically, the book’s title literalizes an instance of poetic language, a practice common among Dawkins’ bêtes noires: religious fundamentalists. John Keats’ playful exasperation that “charms fly/ at the touch of cold philosophy” and that the natural sciences (still embryonic in Keats’ time) “unweave the rainbow,” reducing it to “the dull catalogue of common things,” is a beautifully articulated representation of a well-worn human experience, one that requires appreciation more than rebuttal. But for Dawkins, the poem demands rebuttal, and not a rebuttal that distinguishes between the uses and functions of poetic language. Unweaving the Rainbow is a treatise insisting that, dammit, science makes the world more beautiful, not the other way round.

And Dawkins is correct. After reading his marvelous Ancestor’s Tale, I felt a profound kinship with every toad I encountered on the sidewalk and every grasshopper that attached itself to my arm, six cousinly feet twisting my skin uncomfortably. Between Unweaving the Rainbow and Ancestor’s Tale, Dawkins wrote A Devil’s Chaplain, a haphazardly organized collection of Debbie Downer essays that is probably best understood as the direct ancestor of Dawkins’ most successful book, The God Delusion. The book represented a specific cultural moment, described above, when everyone was eager to read why God sucked. I don’t need to rehearse the narrative or the players (something about four horsemen, cognitive, an obnoxious and inappropriate use of the prefix “neo”). Even The God Delusion‘s harshest critics praised Dawkins for capturing the zeitgeist in a bottle. But the most prominent and widely-cited negative review, by Marxist literary theorist Terry Eagleton, did not. Eagleton captured Dawkins, his personality and his project, to near perfection in the London Review of Books:

[Dawkins’ views] are not just the views of an enraged atheist. They are the opinions of a readily identifiable kind of English middle-class liberal rationalist. Reading Dawkins, who occasionally writes as though ‘Thou still unravish’d bride of quietness’ is a mighty funny way to describe a Grecian urn, one can be reasonably certain that he would not be Europe’s greatest enthusiast for Foucault, psychoanalysis, agitprop, Dadaism, anarchism or separatist feminism. All of these phenomena, one imagines, would be as distasteful to his brisk, bloodless rationality as the virgin birth. Yet one can of course be an atheist and a fervent fan of them all. His God-hating, then, is by no means simply the view of a scientist admirably cleansed of prejudice. It belongs to a specific cultural context. One would not expect to muster many votes for either anarchism or the virgin birth in North Oxford. (I should point out that I use the term North Oxford in an ideological rather than geographical sense. Dawkins may be relieved to know that I don’t actually know where he lives.)

Terry Eagleton

Eagleton’s Marxist ad hominem is amusing: he reduces Dawkins’ own self-proclaimed materialism to his class. Dawkins is a very, very identifiable type. I’m not sure whether Eagleton knew, when he quoted Keats, that Dawkins had written a book whose title misread – or at least misappropriated – the most flowery of Romantic poets.

Eagleton’s more substantial complaint – that there are many kinds of atheists, not all of whom derive their views from a fetishized notion of the natural sciences’ explanatory powers – was echoed in many other reviews. It was even the basis for a two-part episode of South Park.

Another common complaint: The God Delusion engaged with religious faith very narrowly, responding to only the most extreme fundamentalist interpretations of scripture and dogma. Dawkins hadn’t boned up on his Tillich. He’s a scientist stumbling clumsily through the humanities, unaware that his most basic criticisms of faith have been taken seriously by religious people since the Middle Ages. Again, Eagleton:

What, one wonders, are Dawkins’s views on the epistemological differences between Aquinas and Duns Scotus? Has he read Eriugena on subjectivity, Rahner on grace or Moltmann on hope? Has he even heard of them? Or does he imagine like a bumptious young barrister that you can defeat the opposition while being complacently ignorant of its toughest case? … As far as theology goes, Dawkins has an enormous amount in common with Ian Paisley and American TV evangelists. Both parties agree pretty much on what religion is; it’s just that Dawkins rejects it while Oral Roberts and his unctuous tribe grow fat on it.

More troubling than his exclusion of Eriugena and de facto collusion with Oral Roberts is his exclusion of so many other atheists. The God Delusion was published before Christopher Hitchens’ God is Not Great, a very bad book that nevertheless engaged with atheism per se, drawing from an intellectual history that extended from Lucretius to Spinoza and Thomas Paine (a list Hitchens never tired of reciting on cable news shows, grinning slyly at the thought of pot-bellied viewers on their sofas, scratching their heads: I think I’ve heard of that Payne guy, but who in the Sam Hill is Lew Crishus?).

If Dawkins was a scientist posing as a humanist – or, more correctly, a scientist trying to sell ideology as scientific fact – then Hitchens was a humanist posing as someone with a basic understanding of science. In reality, Hitchens knew the Bible, had spent his career admiring religious thinkers and religious poets. Near the end of the Hitchens v. Douglas Wilson documentary Collision, Hitchens recalls a conversation with Dawkins, during which Hitchens declared that, if given the power to wipe religious belief off the face of the earth, he wouldn’t do it. “Why not?!” shrieked Dawkins – Hitchens, repeating the anecdote to Wilson, does a killer imitation of Dawkins’ spine-tingling shriek. Hitchens has no answer for Dawkins. He simply can’t conceive of a world without at least one religious believer.

More on point, however, is the following passage from Eagleton’s review:

Dawkins considers that all faith is blind faith, and that Christian and Muslim children are brought up to believe unquestioningly. Not even the dim-witted clerics who knocked me about at grammar school thought that. For mainstream Christianity, reason, argument and honest doubt have always played an integral role in belief. (Where, given that he invites us at one point to question everything, is Dawkins’s own critique of science, objectivity, liberalism, atheism and the like?) Reason, to be sure, doesn’t go all the way down for believers, but it doesn’t for most sensitive, civilised non-religious types either. Even Richard Dawkins lives more by faith than by reason. We hold many beliefs that have no unimpeachably rational justification, but are nonetheless reasonable to entertain. Only positivists think that ‘rational’ means ‘scientific’. Dawkins rejects the surely reasonable case that science and religion are not in competition on the grounds that this insulates religion from rational inquiry. But this is a mistake: to claim that science and religion pose different questions to the world is not to suggest that if the bones of Jesus were discovered in Palestine, the pope should get himself down to the dole queue as fast as possible. It is rather to claim that while faith, rather like love, must involve factual knowledge, it is not reducible to it. For my claim to love you to be coherent, I must be able to explain what it is about you that justifies it; but my bank manager might agree with my dewy-eyed description of you without being in love with you himself.

Dawkins would no doubt balk at the notion that he take Eagleton’s advice and “critique” science. Science is self-critiquing, after all! Science is reasonable by its very structure. Science and reason are near synonyms in the anti-ideological atheist lexicon.

This, for me, is the most troubling aspect of Dawkins and deGrasse Tyson’s trendy, anti-ideological atheism.

Let us consider once more the subtitle of Emma Green’s Atlantic review: “Beyond the argument that faith in God is irrational—and therefore illegitimate.” Both Green and Eagleton observe what is perhaps the most troubling aspect of popular, anti-ideological atheism: it conflates terms like “reason,” “rationality,” “fact,” “science,” and “knowledge.” In fact, I believe Eagleton goes too far when he asserts that “only positivists think that ‘rational’ means ‘scientific.’” Many positivists can make the distinction. (Eagleton’s reflexive assertion to the contrary is merely a product of decades spent defending post-positivist thought to his fellow Marxists.)

The popularizers of anti-ideological atheism play very fast and loose with a specific set of words: “science,” “reason,” “(ir)rationality,”  “knowledge,” “fact,” “truth,” and “information.” It is absolutely necessary to distinguish between these words. In many contexts, it is not “irrational” to object to scientifically produced knowledge, especially if you’re objecting to the implementation of that knowledge.

If I were a public intellectual with a large platform – that is, if I were Neil deGrasse Tyson – I’d go on a speaking tour. The tour’s only goal would be the definition of some basic terms, as they ought to be used by laypersons (obviously specialists will have slightly different definitions, and that’s okay). Information is data we glean from the world through our senses and technologies. Science is a method that uses information to test ideas and produce knowledge. Ideas are organized assumptions about the world. Ideas that are verifiable using scientific methods become knowledge. Reason is a system of organizing knowledge, which allows knowledge to be used for all sorts of great things: to determine a set of ethics, to decide the best shape of government, to demarcate reasonably accurate beliefs about the world, to guide us through daily decisions, etc. Rationality is reason with a French accent.

Facts are stubborn but undeniable things, some of them unveiled by the scientific method and others revealed through our senses/technologies, which help us glean information and confirm knowledge produced by the scientific method. Truth is the ontological status of reality, which makes it a very tricky thing to define and understand, and is therefore probably best passed over in silence…at least in casual conversations or book tours. True is an elastic adjective that allows us to describe the proximity of knowledge, ideas, and impressions to reality, as we understand it via science, knowledge, reason, and facts.

These definitions are not perfect, and I’m sure you and my fellow Jilters have problems with some/all of them. But I think they’re suitable for casual use. At the very least, they admit distinctions between concepts.

Anti-ideological atheists misuse these concepts for rhetorical purposes, and they encourage the public’s tendency to conflate them.

This is wrong.

When Neil deGrasse Tyson insists that “evolution is a fact,” he’s playing with rhetoric to make a political point. For too long, Creationists have conflated the scientific and popular definitions of the word “theory,” transmuting well-established and verifiable knowledge about life into speculation: Darwin’s theory of speciation was as reliable as a hopeful suitor’s theory of “why she isn’t returning my phone calls.”

But in both scientific and common English, theory is not an antonym of fact (sorry Creationists) and a theory cannot be a fact (as deGrasse Tyson well knows). A theory is established by facts. Richard Dawkins, Samuel Harris, Daniel Dennett, Neil deGrasse Tyson, and Bill Nye have had countless opportunities to make these simple distinctions to the public; Christopher Hitchens possessed both the knowledge and rhetorical precision to explain the distinctions. But distinctions don’t pack much punch. Politically and ideologically, it’s better to affirm that “evolution is a fact,” just like gravity, and not allow the Creationists to keep slithering through their own linguistic sophistry. And just as explaining a joke drains its humor, debunking a slick sophistry invariably drains your authority. Better to bludgeon than to slice. And as anyone who has seen the ads or watched the first two episodes of his Cosmos knows, deGrasse Tyson is happy to bludgeon.


*By “socio-humanist,” I refer to scholars in the humanities (I use “humanist” as the humanities equivalent of “scientist”) and certain branches of the social sciences; I’m not referring to the broader category of post-Enlightenment “secular humanism,” within which Dawkins might count himself.

What The Fox Doesn’t Know?

By Adam Elkus

There’s an emerging dustup between FiveThirtyEight’s Nate Silver and The New Republic’s Leon Wieseltier. Since much of this is already boiling down to a re-hash of C.P. Snow’s “Two Cultures” debate, I’ll try to look at each side’s argument and then observe some strengths and flaws. TL;DR — both are talking past each other, one has some big flaws, and the other is missing the point.

First, Silver. Reading through Silver’s blog, I see two sorts of arguments being made from the philosophy of science, which Silver (perhaps in the interest of readability) doesn’t fully explain — prediction and falsification. Silver sees the primary problem with journalism as a gap between the collection and organization of information on the one hand and explanation and generalization on the other. Silver’s idea of how to fix this gap is strongly bound up in the idea of producing knowledge that is both falsifiable and has good out-of-sample predictive qualities.

For example, they cite three factors they say were responsible for Mitt Romney’s decline in the polls in early mid-September: the comparatively inferior Republican convention, Romney’s response to the attacks in Benghazi, Libya, and Romney’s gaffe-filled trip to London. In fact, only one of these events had any real effect on the polls: the conventions, which often swing polls in one direction or another. (This does not require any advanced analysis — it’s obvious by looking at the polls immediately before and after each event.) Explanation is more difficult than description, especially if one demands some understanding of causality. …. …But while individual facts are rigorously scrutinized and checked for accuracy in traditional newsrooms, attempts to infer causality sometimes are not, even when they are eminently falsifiable.

Explanation is about why particular things occur, and these explanations should ideally be falsifiable. Notice that Silver does not necessarily say that all explanations must be falsifiable. If he did, this would rule out large swaths of the hard sciences that rely on notions that are not directly falsifiable. He would also rule out the utility of heuristic understandings of phenomena where good data does not exist, or where the results of statistical meta-analysis are inconclusive and contradictory. Still, Silver seems to privilege explanations that are falsifiable and — as I will later detail — to gloss over some of the enormous problems with the conception of science that he mentions as a model for his site.

He later goes on to make a covering-law-esque argument that particular explanations should be evaluated for how well they scale, with the aim of finding useful general truths. He equates explanation and causality with the classical model of an explanandum (the thing to be explained) and a set of premises that explain it. Silver says that a generalization must be tested by how well it predicts out of sample, and equates this to falsification in the absence of laboratory experiments. However, while Silver may have a point about prediction, there are some distinct nuances to how falsification has been considered in the philosophy of science.

The problem with Silver’s argument is that he glosses over just how hard it is to actually get rid of a theory. If you believe Imre Lakatos, then the hard core of a research program is itself unfalsifiable. If you subscribe to a coherentist view in the philosophy of science, you may believe (with Duhem and Quine) that a theory is not one thing but a web, and that one has to defeat both the core of the theory and its outlying components. You may, as per Feyerabend, not believe that we can rise to a general model of science at all, holding instead that domain-specific principles rule. And this is to say nothing of the vast array of historical and sociological work on the ways in which science is actually practiced, which to some extent has some uncomfortable aspects in common with Silver’s critique of punditry as being driven by strong ideological priors.

Now, if we focus solely on the aspect of predictive accuracy, Silver seems to be on stronger ground. Given that it is so hard to really falsify a theory, and that it is also easy to rescue a theory from failed predictions, Milton Friedman made a much-maligned argument that theory itself is inherently tautological and that what matters is whether or not the theory accounts for things that haven’t been observed yet:

The ultimate goal of a positive science is the development of a “theory” or “hypothesis” that yields valid and meaningful (i.e., not truistic) predictions about phenomena not yet observed. Such a theory is, in general, a complex intermixture of two elements. In part, it is a “language” designed to promote “systematic and organized methods of reasoning.” In part, it is a body of substantive hypotheses designed to abstract essential features of complex reality. Viewed as a language, theory has no substantive content; it is a set of tautologies. Its function is to serve as a filing system for organizing empirical material and facilitating our understanding of it; and the criteria by which it is to be judged are those appropriate to a filing system. Are the categories clearly and precisely defined? Are they exhaustive? Do we know where to file each individual item, or is there considerable ambiguity? Is the system of headings and subheadings so designed that we can quickly find an item we want, or must we hunt from place to place? Are the items we shall want to consider jointly filed together? Does the filing system avoid elaborate cross-references?

Friedman in many ways bypasses the problem of falsification by noting that a theory’s internal consistency is not necessarily important because consistency can easily lapse into tautology:

A hypothesis is important if it “explains” much by little, that is, if it abstracts the common and crucial elements from the mass of complex and detailed circumstances surrounding the phenomena to be explained and permits valid predictions on the basis of them alone. To be important, therefore, a hypothesis must be descriptively false in its assumptions; it takes account of, and accounts for, none of the many other attendant circumstances, since its very success shows them to be irrelevant for the phenomena to be explained. To put this point less paradoxically, the relevant question to ask about the “assumptions” of a theory is not whether they are descriptively “realistic,” for they never are, but whether they are sufficiently good approximations for the purpose in hand. And this question can be answered only by seeing whether the theory works, which means whether it yields sufficiently accurate predictions. The two supposedly independent tests thus reduce to one test.

For Friedman the issue is whether valid predictions follow from the minimal components of a theory that approximate the phenomenon of interest. This actually cuts against the Tetlock-like argument that Silver makes about the ideologically strong priors held by pundits: a pundit could believe any number of things that seem patently ridiculous, so long as those beliefs permit valid predictions. Silver might agree, and argue (as he has) that pundits should be open to revising their beliefs in light of failed predictions, updating their priors in a Bayesian fashion. While I would agree that this would be a Good Thing, it also shows Silver’s lack of understanding of the nature of punditry.

When Silver talks about strong priors and ideological beliefs, he is in some ways paraphrasing Noah Smith’s now-infamous explanation of “derp” as an unusually strong Bayesian prior, one so strong that new evidence barely moves the posterior. Silver and Smith are arguing that even math-averse pundits have implicit models of how the world works, and that those models ought to be evaluated for predictive accuracy. It is true that all pundits who make normative arguments about complicated social questions have implicit models of the world, and so make implicit predictions about the future. But this is really secondary to the purpose of punditry. Pundits do not see things in terms of probability, Bayesian or frequentist. The basic column has the following format: “X is the present state of the world, Y is wrong/right in it, Z should be done/not done.” X is the area most amenable to Silver-like data analysis, but as we move from X down to Z, using scientific arguments becomes more and more problematic. The relationship between science and religion, for example, is still not something we have gotten a good handle on despite centuries of debate. Moreover, in most public policy debates data will bound the range of acceptable options but not do much more than that.
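
To make the “derp” idea concrete, here is a minimal sketch in Python (my own illustration, not anything from Silver or Smith) of how a sufficiently concentrated prior resists updating. Under a standard Beta-Binomial conjugate model, the same twenty observations move a diffuse prior substantially while an extremely confident prior barely budges; all of the numbers and names below are assumptions chosen for illustration.

```python
# Illustrative sketch of "derp": a prior so strong that evidence barely
# moves the posterior. Beta-Binomial conjugate updating; the numbers are
# made up for illustration.

def beta_posterior_mean(prior_a: float, prior_b: float,
                        successes: int, failures: int) -> float:
    """Posterior mean of a Beta(prior_a, prior_b) belief about a
    success probability, after observing the given outcomes."""
    return (prior_a + successes) / (prior_a + prior_b + successes + failures)

# Evidence: the belief predicts "successes," but we observe only 5 in 20 trials.
successes, failures = 5, 15

# A diffuse prior, Beta(2, 2), updates toward the data: posterior mean ~0.29.
open_minded = beta_posterior_mean(2, 2, successes, failures)

# A "derpy" prior, Beta(1900, 100), is equivalent to having already imagined
# 2,000 confirming observations; the posterior mean stays near 0.94 regardless.
entrenched = beta_posterior_mean(1900, 100, successes, failures)

print(f"open-minded posterior mean: {open_minded:.2f}")   # 0.29
print(f"entrenched posterior mean:  {entrenched:.2f}")    # 0.94
```

The point of the sketch is simply that “updating in a Bayesian fashion” only constrains a pundit whose prior is not already overwhelming; with a strong enough prior, the arithmetic guarantees that no plausible run of evidence will change the stated belief.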

Wieseltier’s argument, on the other hand, is a farrago of nonsense. Whereas Silver’s argument is problematic simply because it fails to grapple with some of the complexities of science and opinion, Wieseltier seems more interested in rhetoric than anything else:

He dignifies only facts. He honors only investigative journalism, explanatory journalism, and data journalism. He does not take a side, except the side of no side. He does not recognize the calling of, or grasp the need for, public reason; or rather, he cannot conceive of public reason except as an exercise in statistical analysis and data visualization. He is the hedgehog who knows only one big thing. And his thing may not be as big as he thinks it is. Since an open society stands or falls on the quality of its citizens’ opinions, the refinement of their opinions, and more generally of the process of opinion-formation, is a primary activity of its intellectuals and its journalists. In such an enterprise, the insistence upon a solid evidentiary foundation for judgments—the combating of ignorance, which is another spectacular influence of the new technology—is obviously important. Just as obviously, this evidentiary foundation may include quantitative measurements; but only if such measurements are appropriate to the particular subject about which a particular judgment is being made. The assumption that it is appropriate to all subjects and all judgments—this auctoritas ex numero—is not at all obvious. Many of the issues that we debate are not issues of fact but issues of value. There is no numerical answer to the question of whether men should be allowed to marry men, and the question of whether the government should help the weak, and the question of whether we should intervene against genocide. And so the intimidation by quantification practiced by Silver and the other data mullahs must be resisted. Up with the facts! Down with the cult of facts!

First, the question is posed wrongly as a matter of measurement and fact. The specific criticism that Silver makes of punditry is that pundits do not revise their beliefs after events cast doubt on those beliefs’ ability to predict future events. Say that John Mearsheimer, in making a normative policy argument for realist policies, claims that the international system has certain rules and that those rules will lead to certain outcomes. It is fair for Philip Schrodt to highlight the failure of the system to behave in the way Mearsheimer says it will, and to argue that this should have implications for whether we rely on his theory. Silver’s error is in the assumption that beliefs are predictions, as opposed to the sensible observation that strong beliefs will usually have predictive implications. Certainly numbers cannot decide the issue of whether men should marry men, but if arguments against same-sex marriage warn that more liberal attitudes towards homosexuality will lead to the decline of marriage, it is fair for Silver to see whether this belief accounts for the variation in marriage and divorce rates. It is precisely the fact that internally consistent beliefs can be tautological, as Friedman observes, that makes prediction useful.

Second, nowhere does Silver say that data ought to decide normative issues. The strongest statement Silver makes about this in his manifesto is ironically counter to the image TNR casts of him as a quant expressing a view from nowhere: Silver argues that scientific objectivity is distinct from journalistic objectivity in that it should make statements about whether certain arguments can be factually sustained. This is not an argument that empiricism should be the final arbiter, but that it ought to say what truths investigation can discern about the rightness and wrongness of an argument. And it is not so different from journalistic objectivity, as Silver argues: a good journalist doesn’t just represent all sides of an issue; they give the reader information about which ones are problematic. I am not sure, again, how he can square the circle between the two notions. It is one thing to scientifically evaluate competing hypotheses, another to scientifically evaluate competing normative beliefs that do not really take the form of hypothesis or theory (even if they have implicit hypotheses and theories embedded within them).

Wieseltier gives away his real problem with Silver when he notes this:

The intellectual predispositions that Silver ridicules as “priors” are nothing more than beliefs. What is so sinister about beliefs? He should be a little more wary of scorning them, even in degraded form: without beliefs we are nothing but data, himself included, and we deserve to be considered not only from the standpoint of our manipulability. I am sorry that he finds George Will and Paul Krugman repetitious, but should they revise their beliefs so as not to bore him? Repetition is one of the essential instruments of persuasion, and persuasion is one of the essential activities of a democracy. I do not expect Silver to relinquish his positivism—a prior if ever there was one—because I find it tedious.

It would be one thing if punditry consisted of abstract deduction. But it does not. Punditry is about persuasion. Pundits do not make logical arguments from first principles or write mathematical proofs. Nor do they employ the techniques of formal logic found in mathematics and philosophy, write rigorous definitions, or build their arguments from prior deductions the way mathematicians build on previously proved results. Instead, Wieseltier is making a strong argument that “persuasion is one of the essential activities of a democracy”; hence Will and Krugman should be free to repeat their beliefs for dramatic effect, in the hope of persuading others that they are right. This contradicts Wieseltier’s earlier appeals to reason, logic, and deduction. If Wieseltier wants to mount a reason-based defense of the humanities, which I do find persuasive, he cannot have it both ways. Public reason and persuasion are not the same thing; taken to one extreme, persuasion becomes sophistry.

Sophistry, however, is what Wieseltier has been selling for a very long time. In arguing for his policy positions, particularly on the Iraq War, Wieseltier’s columns at TNR present no deductively rigorous argument on the question of intervention and America’s place in the world. Instead they are extended fits of moral posturing, in which he constantly exhorts the reader to a titanic struggle against evil. Instead of logical and rigorous arguments about whether a particular stance on Ukraine follows from a particular train of logic, Wieseltier’s world is an emotionally charged trip into glory, courage, and justice, where every struggle is always Munich and every politician an inferior shadow of a Churchillian figure exhorting the populace into total mobilization. Wieseltier, in other words, is engaging in a particularly sophistic form of persuasion that aims to convince us, through rhetoric and repetition, that we ought to embrace a position of total mobilization. Indeed, Matt Yglesias (who has an undergraduate degree in philosophy) got it right when he flagged a somewhat muddled take on Kant by Wieseltier: Wieseltier is TNR’s book reviews editor. He is a literary scholar, not a philosopher. I certainly know I have not lived up to the standards I am holding Wieseltier to in my own writing, but I have at least become acutely aware that there is something wrong with the kind of argumentative style I sometimes fall into. Wieseltier, however, conflates public reason with emotive rhetoric.

I must admit that I have my own doubts about Silver’s new enterprise, and, like a Bayesian, I have a prior belief that I will adjust as the “data” come in. I do not feel entirely comfortable with the arguments he makes, and I am skeptical that data without mechanisms or heuristic understanding will really deliver the insights the site promises. That being said, Silver strikes me as a very smart person who has thought deeply about the problems with modern journalism, and I feel somewhat confident that he will be an evolutionary improvement over the existing model. Wieseltier, however, is the very symbol of the kind of pundit that makes even the most hyperbolic Silver critiques seem understandable. I will take data enthusiasm over Wieseltier’s “persuasion” any day of the week. Nor do I think that Silver will crowd out “public reason.” Indeed, the popularity of Nassim Nicholas Taleb, a quant turned philosopher, seems to indicate otherwise. Someone like Taleb, who grounds arguments in the style of a mathematician or philosopher rather than a statistician (and who, unlike Wieseltier, has a body of technical work that can be philosophically evaluated), will be the first to check a Silver-like data journalist who overreaches. We need both empiricists and rigorous deductive analysts, and ideally combinations of the two.

Against Neil deGrasse Tyson: A Three-Minute Polemic

A literal “skeptic trump card,” for the armchair sociologist who prefers personalities over boring old physics textbooks.

Normally I put a lot of thought (or at least a lot of words) into my Jilt articles, careful to say things that I’ll still feel passionately about five minutes after posting. But a Neil deGrasse Tyson quote – the latest of dozens – just floated through my Facebook feed, and it broke a levee of feeling. Here are some thoughts I’ll throw haphazardly like mustard seeds onto infertile soil, thoughts I may regret posting within five minutes – but not three:

Neil deGrasse Tyson has spent the last decade slouching toward cultural ubiquity, a seemingly nice guy who twenty years ago would’ve competed with lanky Bill Nye for the title “Science Guy” (and yes, I think that’s a real thing in our culture: scientists who spend more time in public relations meetings than in the lab are all vying for the title of “Science Guy”). But in 2014, he inhabits a cultural ecosystem where Richard Dawkins is someone my mom has heard of. DeGrasse Tyson inhabits a world in which Christopher Hitchens, approaching 60 and noticing the inevitable dulling of his faculties, turned to popular atheism as an easy and reliable source of moolah. This is a world in which a cursory knowledge of the natural sciences and a declaration of disbelief in the desert deity of Abraham is enough to certify oneself “intellectual” or “enlightened,” all with the blessing of a few high-profile public figures.

Dawkins was once a great explainer of Darwinian biology, but he quit that gig years ago. Nye was a children’s TV host who explained basic scientific knowledge in clear language and who now debates Young Earth Creationists (i.e., the people whom other Creationists make fun of). And deGrasse Tyson was once a fan and acquaintance of Carl Sagan, and now hosts a television show that is (so far) preoccupied with religion and earthbound institutions – as far from the spirit of Sagan as The Big Bang Theory is from Star Trek.

To be fair, deGrasse Tyson seems like a nice enough guy. I heard him interviewed by Terry Gross a few weeks ago, and he explained that his new show Cosmos (produced by Family Guy creator, professional misogynist, and world’s-most-irritating-atheist Seth MacFarlane) was an attempt to recapture the spirit of John F. Kennedy’s sweeping pro-science rhetoric. That rhetoric, said deGrasse Tyson, is what inspired him and millions of his peers to enter scientific fields. Today’s generation won’t be inspired by the prospect of creating an airplane that is more fuel-efficient than their parents’, he continued. They needed something to really inspire them.

Nevermind that JFK was half-hearted in his commitment to the space program or that its impetus had little to do with scientific discovery (everyone knows that). Nevermind that innovative, fuel-efficient technologies make money, and money is pretty damned inspiring. Nevermind that deGrasse Tyson is attempting to ape ’60s pro-science optimism using Cosmos, a vehicle of late ’70s inward-looking trippiness that doesn’t inspire action so much as awe. Sagan was chill. DeGrasse Tyson is visibly uptight. Sagan’s Cosmos was subtitled A Personal Journey; MacFarlane and deGrasse Tyson have revised that to A Spacetime Odyssey, aiming, I guess, for shades of “Thus Spake Zarathustra,” Stanley Kubrick, and Nietzsche. But their show’s tone isn’t ’60s or ’70s: it’s pure 2014, the Year of the Dead Horse (DISCLAIMER – I do not believe in astrology I believe in science I was only making a pun I believe in science I do not actually believe in astrology – DISCLAIMER). In this case, the horse is the vacuousness of religious faith. And despite all the blood and pulp, nobody seems to be tired of it yet.

So twenty minutes ago, deGrasse Tyson slides across my Facebook feed, the latest in a long chain of images mocked up by fans (or, in this case, Mother Jones) that marry images of deGrasse Tyson looking cool or authoritative (or, in this case, just standing) with a quote that only barely masks his utter contempt for those who would, say, explore the religious sphere of human existence or deny funding to NASA:

When [scientists] do know something, there are reasons why we know it, and if you don’t understand that, you deny it only at your peril, especially when the result may affect the stability of our future.

This sounds like a threat. I know he’s addressing climate change denial as much as Creationism or regular Mass attendance, so the “stability of our future” is probably intended to register beyond “If the religious crazies take over, we’re all going to die!”

Problem is, that’s the only song these public “Science Guys” have been singing since Richard Dawkins discovered there was money in it. And I am so, so sick of it.

Science denial is a meaningless phenomenon. Outrage about science denial is phony. Period.

Basic scientific knowledge has never been widely understood – not fully. The average anti-Creationist probably couldn’t explain Darwin’s theory of evolution by natural selection without getting much of it horribly wrong. And scientific inquiry has never been widely valued in itself. Everyone knows scientific inquiry is not funded unless there are economic or (less commonly) geopolitical reasons for doing so. And everyone knows that practical and economically viable scientific research will be funded no matter what.

I never tire of reminding people that “science” is, in itself, not an actual thing. Science is a method, a process. And I love science, for many of the same reasons deGrasse Tyson wants me to: I was too young for Cosmos, but I grew up with Nova, Nature, and, yes, Bill Nye the Science Guy (remember when he had Soundgarden on?). I loved science before I loved the humanities. And although I’m a humanist, I still believe that the scientific method produces the most valuable knowledge we have about our world and, increasingly, each other.

But the scientific community, left on its own, is just a bunch of guys with no money and no voice producing knowledge that nobody pays attention to. To hear Dawkins and deGrasse Tyson and Nye tell it, science is simultaneously totally in charge and under constant attack (their rhetoric in this regard resembles the rhetoric of Evangelical Christians and Stalinists). But science is not in charge. In the 19th century, scientists were guys who either sought patronage or relied on independent means to fund beetle collections and jungle expeditions. And without their practical socio-economic applications, most scientific work wouldn’t get done.

But deGrasse Tyson isn’t interested in the practical applications. He said as much on Fresh Air. Practical applications are boring. And when the skeptical consumer of pro-science PR asks, “Why should I care?”, deGrasse Tyson responds in one of two ways. Either he relies on rhetoric and poetry, not the nuts and grit of real scientific work, because the big stuff – theoretical astrophysics, for instance – is much sexier, especially when you dumb it down…or he goes shrill, warning that if we don’t take science seriously – if we don’t trust them and believe what they say – bad things will happen. This shrill tone occasionally cracks into insouciance: “Doesn’t matter what you believe,” says the Science Guy. “We’re correct whether you believe us or not.”

I hate both approaches, especially the latter. Both approaches discourage critical inquiry, upon which the scientific method relies. While their colleagues do actual, original, difficult research in universities on the dimes of taxpayers and various boards of trustees, Science Guys globe-trot on book tours, stroking the egos of the faithful and epistemologically bullying everyone else. And I wouldn’t mind as much if the enlightened faithful actually understood or cared about the boring work of science any more than the drooling masses. But one only need survey Western civilization for five minutes to know that the overwhelming majority of everybody – including Dawkins/deGrasse/Nye’s audience – doesn’t care about real, hard, boring science.

And so this is my message to the Science Guys:

The Catholic Church ignored science for centuries without destabilizing shit. There were wars, then there were periods of peace, then there were wars. There was ignorance, but there was also some knowledge. But there was no “peril” in ignoring Copernicus. And it wasn’t Galileo who created post-Enlightenment stability in Europe. That was Protestants. More specifically, that was German princes who embraced Protestantism and capitalism. These societies created the conditions in which the natural sciences flourished – not the other way around. Don’t pretend that we need you more than you need us (in most cases, literally U.S. – the U.S. government and its economic allies). It’s our teat you’re sucking on – so keep on sucking, and smile while you’re doing it. 

I realize that 99.99% of professional natural scientists understand that science is a process and that scientific knowledge is a target for continual inquiry. And to be fair, deGrasse Tyson offers an acceptable, if unnecessarily vague, definition of “the scientific method” early in Cosmos. But the definition takes 30 seconds to recite, while he spends half of the episode lambasting 16th-century Christianity for persecuting a man who, he later admits, wasn’t actually using the scientific method and was just lucky to have guessed that planets existed. This only further encourages regular people to continue invoking the word “science” the way deGrasse Tyson does: it’s a mantra, a mystical trump card that ends all debate. “This is SCIENCE,” end of debate. Such a mindset is decidedly anti-scientific, but these celebrity scientists who moonlight as armchair sociologists are enablers, virtually none of whom have earned their public authority through scientific inquiry.

(Hey, here’s an equation written by an English Ph.D. candidate: Neil deGrasse Tyson – [Jon Stewart + Seth MacFarlane] = NOBODY. 100% tested and verifiable. What does that tell you about the power of “science”?)

Do I trust scientists more than I trust religious fanatics? Yes, obviously. But I still trust the first 2,000 names in the Boston phone book more than I trust either scientists or spiritualists. Even in an educated city like Boston, people won’t nitpick over the astrophysical details of George Clooney movies, and they’ll still probably wind up setting aside a few dollars for the Large Hadron Collider.