Is Failure An Option? (I)

By Adam Elkus

Clay Shirky has a must-read post on the Obamacare website fiasco. Though I approve of his overall message, I am not sure I fully agree with Shirky’s diagnosis that Obamacare is “doomed.” Co-blogger W.K. Winecoff noted on Twitter that despite the shambolic rollout, there is still a nontrivial chance that the Affordable Care Act could ultimately persist if its key provisions survive long enough to force societal adaptation to a new equilibrium. To indulge the political science cliche, path dependence matters. And there has been remarkable variation in the rollout so far. As a foreign policy and national security blogger I’m not going to speculate any further — this isn’t really the point of the post.

I’ve compiled some notes over time about the subtext of strategy theory relevant to the themes Shirky discusses here, and there is too much material for one post. Hence I’m going to break it up into three posts that touch on the theme of the difference between optimal and politically realistic design and implementation of strategy.

Why am I interested in the ACA, after all? Well, Shirky’s post is really not about Obamacare. Instead, it’s about how not to think about sociotechnical systems. And as Ezra Klein noted, the problem here is not just a website but the way the technical infrastructure intersects with the ACA’s core design choices. Shirky diagnoses the faulty rollout as yet another case of “waterfall” technology development methods:

The preferred method for implementing large technology projects in Washington is to write the plans up front, break them into increasingly detailed specifications, then build what the specifications call for. It’s often called the waterfall method, because on a timeline the project cascades from planning, at the top left of the chart, down to implementation, on the bottom right.

Like all organizational models, the waterfall method is mainly a theory of collaboration. By putting the most serious planning at the beginning, with subsequent work derived from the plan, the waterfall method amounts to a pledge by all parties not to learn anything while doing the actual work. Instead, the waterfall approach insists that the participants will understand best how things should work before accumulating any real-world experience, and that planners will always know more than workers.
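
Shirky’s contrast is easy to render in miniature. Here is a minimal sketch in Python, purely illustrative: the function names and feature list are hypothetical stand-ins, not a model of how the actual exchange was built.

    def waterfall(spec):
        """Write the whole plan up front, then build exactly what it says."""
        plan = list(spec)  # all serious planning happens here, before any work
        # Workers execute the plan as written; nothing learned mid-build can change it.
        return ["built: " + feature for feature in plan]  # feedback arrives once, at launch

    def agile(spec):
        """Ship one increment at a time; each release can reshape the remaining plan."""
        backlog, shipped = list(spec), []
        while backlog:
            shipped.append("built: " + backlog.pop(0))
            # In a real process, user feedback would reorder or rewrite the backlog here;
            # sorting by name length is a crude stand-in for feedback-driven reprioritization.
            backlog.sort(key=len)
        return shipped

    print(waterfall(["enrollment", "subsidy calculator", "insurer hand-off"]))
    print(agile(["enrollment", "subsidy calculator", "insurer hand-off"]))

The point of the toy is structural: the first function has no line where learning can happen, while the second has one on every pass through the loop.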

The waterfall vs. agile design methodology dispute is old hat in technology circles. So why did the tech-savvy Obama administration ignore best practices? Shirky explains that the White House may have seen a staged or progressively scaled rollout as a boon to those seeking to repeal the bill. Hence “failure was not an option” and the White House opted for a high risk/high reward plan that minimized opportunities to painlessly patch small failures. Facing intractable political opposition and a mixed public verdict on the ACA, the WH feared that iterative improvement might also unintentionally raise the risk of Obamacare being halted altogether. So it rolled the domestic political equivalent of war’s “iron dice” and implicitly decided to risk everything. Once the decision was made to go forward, it was all or nothing and there was no turning back.

To a technologist like Shirky, this kind of logic is self-evidently ridiculous. He rails at length against Beltway perceptions of technology, refusing to build on his own implicit recognition that politicians can sometimes make informed decisions on a calculus different from technological best practices. Indeed, as I will argue in this post, Shirky’s belief that agile design is self-evidently superior mirrors some contradictions in strategic theory.

Strategic theorists often talk about the need to ensure both a proper and well-formulated allocation of ends, ways, and means that also is flexible enough to compensate for fog and friction in war and the enemy’s ability to frustrate even the most well-planned theories of victory. Because so much in war is based on subjective assessment of the situation with incomplete information, a strategist is implicitly Bayesian in many ways. Indeed, Colin S. Gray casts strategy as an iterative process of dialogue and negotiation, a constant and often frustrating set of repeated trips across the “strategy bridge” between violence and political purpose. This does not exactly map onto Shirky’s conception of agile development, but it shares many similarities.

The paradox is that the “drums and trumpets” folk theory of strategy that audiences eat up is a story of endless military equivalents of how Shirky describes Obamacare. The Great Men of History, at least in the eyes of those who consume popular military history, are those who embark on high-risk endeavors with rigid plans and timetables and little capacity for iterative learning and improvement. Cortes famously scuttled his ships – he and his men would either conquer Mexico or die trying. The mythological Churchill was a man possessed of superhuman will who fought virtually alone against Hitler despite the total defeat of British landpower in Europe and refused a separate peace. The wildly popular “historical” movie 300‘s interpretation of the Spartans heavily relies on the phrase “with your shield, or on it” as its motivating credo. You cannot get more “failure is not an option” than that.

It’s easy to mock all of this as so much macho posturing and militarism. But what if “failure is not an option” is actually not a credo but a forcing mechanism that a decisionmaker may invoke under certain circumstances? Indeed, perhaps its recurrence can be explained by the persistence of the following environmental conditions:

  1. The decisionmaker has a complex and risky design they want to execute. It may be controversial, or at the very minimum involve a substantial degree of uncertainty and steep risks.
  2. Agile development makes the decisionmaker vulnerable to some kind of internal or external problem. Perhaps, like the Bush administration’s marginalization of the CIA and State Department during Iraq War planning, they fear being undermined from within. Like von Schlieffen croaking out last-minute instructions about keeping the right wing strong in the Western front invasion plan, they fear the purity of the design being compromised in devastating ways. Or, as Shirky claims re: Obamacare, fear of powerful external opposition plays a role.
  3. The stakes are high. Risk is perceived to be extreme and perhaps existential, but reward is seen as worth it. The deterrence theorist Scott Sagan argues that Imperial Japan was willing to risk devastation because the consequences of not fighting would be a subtraction of “Imperial” from Imperial Japan.
  4. In order to be successful, the design must be executed rapidly and decisively in order to surmount the substantial obstacles to success. Deviations from the timetable cannot be risked, even though the scale of the plan seemingly warrants caution.
  5. In order to overcome the risks implied by environmental condition 1, the design must be tightly controlled, supervised, well-formulated, and rigid in character.
  6. In order to ensure that the venture succeeds, there must be a perceived Rubicon that, once crossed, functions as a point of no return. Those involved must either succeed together or fail together and suffer the consequences.

“Failure is not an option” is thus a simple algorithm for optimizing a complex and risky venture in the conditions specified above. It consists of three instructions, one of which (as I will note later) is also an algorithm in its own right; a toy code sketch follows the list:

  1. Limit the ability of the design to evolve over time as much as possible; allow only tactical adjustments.
  2. Implement the design with maximum force and velocity, accepting little room for error as a consequence.
  3. Guarantee automatic consequences for failure. Either the venture succeeds or a great penalty is dispensed. Make defection from the venture impossible.
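
Read as code, the three instructions amount to a one-shot gamble with the exits welded shut. Below is a deliberately crude Python sketch; the probabilities, payoffs, and function names are invented for illustration and encode nothing more than the list above, alongside an iterative counterpart for contrast.

    import random

    def failure_is_not_an_option(p_success, payoff, penalty):
        """One roll of the 'iron dice': a frozen design executed all at once."""
        # Instruction 1: the design is frozen; no mid-course strategic revision is modeled.
        # Instruction 2: execution is a single, maximum-velocity attempt with no error margin.
        # Instruction 3: failure triggers the penalty automatically; defection is impossible.
        return payoff if random.random() < p_success else -penalty

    def iterative_venture(p_stage, stage_payoff, stage_cost, stages):
        """The agile counterpart: small stages, with the option to stop after a failure."""
        total = 0.0
        for _ in range(stages):
            if random.random() < p_stage:
                total += stage_payoff
            else:
                total -= stage_cost  # a small, recoverable failure
                break                # learn, stop, and live to revise the design
        return total

The design choice the list encodes lives in the third comment: by wiring the penalty to failure and removing the exit, the decisionmaker buys commitment from subordinates at the price of any ability to painlessly patch small failures.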

In part two of this post, I’ll look at the differences and similarities between “skin in the game” and “failure is not an option.”

The stuff feminist reality is made of: Michelle Obama, feminism and the raced meaning of motherhood

By Amanda Grigg

Politico is jumping on the “lean in” bandwagon and upping the ante by dragging Michelle Obama along for the ride with their Friday cover story, “Leaning Out: How Michelle Obama became a feminist nightmare.”

I guess it depends on what you mean by “feminist nightmare.” I’m a feminist and my most terrifying recurring nightmare is that I’m back in undergrad during finals week and I realize I haven’t been to class all semester. I usually have it when I’m under a deadline so…try to figure that one out Freud.

Anyway, the author Michelle Cottle (the other Michelle) suggests that feminists are disappointed that Michelle Obama has focused on being “mom in chief” rather than wading into more significant (and controversial) policy debates. Cottle highlights an earlier critique by The Root writer Keli Goff and criticism from Linda Hirshman (of American Prospect “Homeward Bound” fame) and suggests that Michelle Obama’s policy-avoidance might be particularly unnecessary following Obama’s re-election – there’s no need to worry about an active first lady turning off voters (the Hillary Clinton factor).

The New Republic and Slate both featured articles defending Michelle Obama, as well as tweets from prominent feminists, which suggests that the “disappointment” is not widespread. And Cottle includes quotes from defenders in her piece as well; largely they’re “choice” feminists arguing that any choice a woman willingly makes represents a win for feminism (a debate that could launch a million posts, but suffice it to say, there’s more to feminism than that).

In my opinion, the best parts of the article are those where Cottle quotes black feminist writers because they do a much better job of illustrating the pretty classic dilemma Michelle Obama faces as a prominent black woman. First up is Rebecca Walker (author and daughter of Alice Walker) who says:

I wouldn’t necessarily say Michelle Obama had to kowtow to some demand that she become a June Cleaver type. I would say she understands the need to help people understand a model that they may not have been familiar with, and to help them learn how to trust something that they may not have been able to in the past.

Rebecca Walker is drawing our attention to the fact that Michelle Obama being seen as anything near a June Cleaver type is something new and doesn’t have quite the same meaning as if a white first lady were seen in this way. Why? Because femininity and motherhood have had very different meanings for black women than they have for white women. For more we can turn to feminist scholar and all around badass Patricia Hill Collins:

Two elements of the traditional family ideal are especially problematic for African-American women. First, the assumed split between the “public” sphere of paid employment and the “private” sphere of unpaid family responsibilities has never worked for U.S. Black women.

During slavery, black women worked in what was allegedly the “public” sphere of Southern agriculture, but did so without wages and without any familial privacy. Since the end of slavery, and for a whole host of reasons (including continuing inequality and discrimination leading to lower wages among blacks, which in turn required women to contribute to the household income), Black women have been far more likely to work outside of the home. Because Black men have traditionally been denied a family wage, this was generally not part of an effort to establish themselves as equal to men but to secure sufficient income for their families. Collins continues:

Second, the public/private binary separating the family households from the paid labor market is fundamental in explaining U.S. gender ideology. If one assumes that real men work and real women take care of families, then African-Americans suffer from deficient ideas concerning gender. In particular, Black women become less “feminine,” because they work outside the home, work for pay and thus compete with them, and their work takes them away from their children.

Michelle Obama on the cover of Parenting magazine

Historically, white middle- and upper-income women have been considered inherently good mothers who are deserving of having more children, while poor women and minority women are characterized as unfit mothers, unworthy of or too irresponsible to have more children.[1] This ideology has often manifested itself in state policies that encourage motherhood among well-situated white women and discourage it among poor women and women of color. For example, in 1970, black women were sterilized at twice the rate of white women, and throughout the decade predominantly black recipients of public assistance reported that welfare agency workers had threatened to cut off their benefits if they did not agree to undergo state-funded sterilization.[2] In Welfare’s End, Gwendolyn Mink argues that the race-based valuation of motherhood is evident in the difference in policy design between Survivors’ benefits and welfare programs like AFDC and TANF. Predominantly white Survivors’ benefits are more generous and less stigmatized than Temporary Assistance for Needy Families, and they support mothers who choose to stay at home to care for their children. TANF benefits are not only stigmatized and increasingly limited, they also refuse to support poor/black motherhood by demanding that mothers work outside of the home. Mink suggests that these policies send a clear message to poor, single, and often black mothers that their care is not valued.[3]

So you could argue that by presenting such an admirable (and well-liked) model of black motherhood, Michelle Obama is challenging the historic devaluation of black caregiving and raced assumptions about motherhood and family.

On the other hand, Patricia Hill Collins suggests that rather than trying to explain why Black women deviate from historically white standards of femininity, or trying to meet those standards (which today are pretty much only met on Modern Family/in the public imagination), women should challenge “the very constructs of work and family themselves.” So there’s definitely grounds to be critical of Michelle Obama’s choice to change the meaning of motherhood via replication rather than by rejecting the current model of motherhood entirely.

Cottle also quotes The Root writer Keli Goff’s earlier article listing 5 things she would like to hear Michelle “preach” in her husband’s second term, which included her stance on reproductive rights:

Michelle Obama is also on the record as supporting reproductive rights in recent years as Planned Parenthood has been under attack, but she has waded into the issue only tepidly. With African-American and poor women more likely to have unplanned pregnancies and out-of-wedlock births and to raise families in poverty—not to mention the high AIDS rates among black Americans—her voice could go a long way toward making a difference on issues of reproductive and sexual health.

When I read this I immediately thought of Zoltan Hajnal’s Changing White Attitudes Toward Black Political Leadership. Haven’t read it? Well, you should, but just this once I’ll summarize. Hajnal studies white attitudes towards black political leadership and finds that:

Once black officials have the opportunity to prove that black leadership generally does not harm white interests, uncertainty should fade, whites’ views of blacks and black leadership should improve, and more whites should be willing to consider voting for black candidates.

Initially this seems encouraging. But black communities often elect black leaders with the specific hope that they will make significant changes to the status quo, changes that will almost inevitably “harm white interests” insofar as whites have benefited from racial inequality. Hajnal argues that this shouldn’t be the case, particularly if whites continue to become more sympathetic to racial injustices. But either way it suggests that black leaders must strike a careful balance between advocating for racial justice and confirming whites’ fears of, and thus their resistance to, black leadership.

In the case of Michelle Obama, this likely means that speaking out about women’s issues, let alone black women’s issues, would result not just in the kind of backlash that Hillary Clinton saw, but could also confirm white fears of and dislike of black leadership. If that sounds paranoid, spend some time checking out the Google hits for “Obama race war” and “Obama class warfare.” Or don’t, and just trust me (the internet is a terrible place). This presents a real dilemma. If black leaders are elected in part because constituents hope that they will change the racial status quo, but will not be reelected (or will negatively affect views of black leadership generally) if they change the status quo, they’re in a real bind.

So when we consider what black motherhood has meant, and what a black feminist first lady would likely mean, it’s not surprising, and certainly not a feminist nightmare, that Michelle Obama has chosen the path of incremental change.

Edit 11/26, a quick addition: Notably the two big issues Michelle Obama has focused on seem to allow her to address problems that are particularly pressing to the black community without invoking race. Childhood obesity and barriers to higher ed for low-income students certainly hit racial minorities harder than whites, but they don’t strike anyone as particularly radical or inherently “raced” issues.


[1] This is in part because black women have historically defied the norms that define motherhood in opposition to wage-work and the public sphere. See Patricia Hill Collins, Black Feminist Thought, and Dorothy Roberts, “Racism and Patriarchy in the Meaning of Motherhood,” American University Journal of Gender & Law 1 (1993): 1-38.

[2] Stephen Trombley, The Right to Reproduce (London: Weidenfeld and Nicolson, 1988), 177.  

[3] Gwendolyn Mink: “At the crossroads of race, morality, and poverty, welfare law codifies disdain for poor single mothers as mothers” (121). Welfare law sends a message to poor single mothers that their care is not valued.

The Worst President of the 20th Century: Part Three

By Seth Studer

November 22, 1963

There was a very brief period in my life when Oliver Stone’s JFK was one of the scariest films I’d ever seen. A distorted portrayal of the only trial in the assassination of John F. Kennedy, the film is tonally bombastic from start to finish, but my relative youth blunted the effect (teenagers like cloying earnestness and loud things) and allowed me, for a brief shining moment, to experience the film as Oliver Stone intended: an assassination, confusion, then a lull, then a slow simmer that heats to a boil when Jim Garrison (Kevin Costner) meets Mr. X (Donald Sutherland). Prior to that scene, the film is a thesis disguised as a thriller. After that scene, the film becomes a thesis disguised as a courtroom drama. You can understand why, when I watch JFK today (and yeah, I still do), I stop the DVD when Donald Sutherland signs off: he leans over and tells Garrison, “I just hope you catch a break.”

For me, that’s the last line of the film.

If I kept watching, I’d see Kevin Costner standing over Kennedy’s grave while (I kid you not) a black family kneels reverently in the background, and then I’d spend the next twenty minutes cleaning the vomit off my couch and carpet.

But up until Mr. X, holy cow, JFK can be a really fun and exciting movie, if you get yourself in the right spirit. Everyone agrees that the “conspiracy montages” are what make the film so great. Here’s how they work: a character delivers a five, ten, sometimes fifteen minute explanation of some theory of the assassination, occasionally interrupted by skeptical questions from other characters who are immediately satisfied with the answers, while Stone, in a rapid montage, cuts between the lecture (which would be unwatchable on its own) and shots of events, sometimes multiple versions of the same event, and unrelated images – some of which are a little creepy (there’s a recurring skeleton).

The film’s second-best conspiracy montage (after Mr. X’s) is the working lunch at Antoine’s, where Garrison and his aides first realize that the scope of their investigation likely exceeds their parish. Toward the end of the montage, they discuss Lee Harvey Oswald’s biography and begin, for the first time, to put Oliver Stone’s narrative together:

The tone and content of these conspiracy montages get creepier and creepier as Garrison and his aides acquire more knowledge, closing the gap between themselves and the truth. Stone goes out of his way to make the Cubans and right-wing militants look ghoulish. (Many of the villains are gay, and Stone’s portrayal of homosexuality is absurdly problematic.) But these men are only demi-goblins amid a much larger Walpurgis of horrors, the center of which is not a fringe Cuban operation but the all-powerful military-industrial complex.

This is when Mr. X intervenes. He arranges a secret meeting with Garrison in the most public and out-in-the-open part of Washington D.C., where he tells Garrison, “You’re on the right track.” X reveals himself as a former military intelligence officer (under the supervision of a man he calls “General Y”) who helped oversee “black-ops” in the 1940s and ’50s: helping Nazis escape Europe, orchestrating the overthrow of Mossadegh in Iran, rescuing the Dalai Lama from Red China, that kind of thing. Once they got involved in Cuba, says X, things started to go wrong.

In 1962, Mr. X was working at the Pentagon, and was assigned an unexpected trip to Antarctica. While he was gone, Kennedy was killed. X continues:

Why was Kennedy killed? Stone uses X to reveal his (Stone’s) ultimate thesis: John F. Kennedy planned a complete withdrawal of U.S. troops from Vietnam by 1965. This was a step too far for the military-industrial complex. The only way to stop Kennedy from pulling out the troops was to kill him. Lyndon Johnson agreed and signed on. Well, Stone doesn’t state that directly; he lets the montage do the talking.

In this scene, X explains the consequences of a Vietnam withdrawal for Kennedy’s enemies (NSFW):

Despite the absurdities of this scenario, it’s a blast to watch. But the closer you get to the center of the conspiracy, the vaguer it becomes. The weirdos on the fringe are characters with dialogue. They’re played by Joe Pesci and Tommy Lee Jones, real people. They’re tangible. But once you move beyond them, well…you don’t really move beyond them.

JFK was one of my first exposures to the lacuna-thriller: films that don’t quite satisfy you, films with characters who are defined and driven by their need to know what is probably unknowable. Movies like  All the President’s Men, Jacob’s Ladder, The Vanishing, The Conversation, Don’t Look Now, Zodiac, Upstream Color, and even Zero Dark Thirty. Not all these films involve conspiracies and a few result in an actual solution to the mystery. But they all strike a similar register.

I saw The Vanishing when I was 17. The whole film terrified me, not just the ending but also the beginning, the innocuous bits, the picnic at the petrol station. A man’s life is disrupted by a single, inexplicable event. He is overwhelmed by the need to understand the event, to make an inexplicable few minutes explicable. But if he succeeds, his whole identity – which has gradually been absorbed by his obsession with knowing – would disintegrate. He lives to know what happened, but his life is sustained by his ignorance. The Vanishing handles this tension brilliantly.

Detective John Munch, conspiracy junky extraordinaire.

This tension is a staple of our pop culture. From The Crying of Lot 49 to The Parallax View to Twin Peaks, it’s the gift that keeps on giving. The questions never get answered: when is it better not to know? If you’re paranoid, how can you know when they’re actually after you? How do you know when you’ve reached the conspiracy’s end? The attitudes change over time: in the ’60s, cynicism bolstered belief in conspiracy. In the ’90s, the cynics were the ones who didn’t believe (re: Gillian Anderson in The X Files). One of the most persistent figures on mainstream television is Detective John Munch (played by Richard Belzer). Munch’s antiquated obsession with ’60s-style conspiracy is his defining quality. It’s also the quality audiences find most endearing in him. Munch, who first appeared on the television series Homicide and just recently retired from Law & Order: SVU, has made guest appearances on eight other shows: more than any other actor-character in television history. Munch isn’t an icon, but he has more connections than anyone else in the vast Tommy Westphall Universe. If anything, he seems to be part of a conspiracy – the man who keeps showing up. His presence and popularity demonstrate the endurance of the paranoia, the obsession, and the conspiracy as aspects of our cultural identity.

From his first appearance, Munch was a throwback to a specific type of conspiracy nut, earlier ideas about conspiracies and how they work. Large institutions still conspire and commit crimes; but they are also increasingly unwieldy, chaotic, and prone to leaking. In the ’60s, political paranoids imagined puppet masters operating behind the curtains. Today, we imagine the offices of the NSA: wide open and filled with hundreds of employees committing crime after crime after crime, sometimes unwittingly, and often not very well.

Fifty years after the fact, the assassination is a touchstone of the paranoid style in American life, a symbol of the American affection for conspiracy. In 2013, the Kennedy assassination registers only on certain frequencies, none of them political. It’s not a tragedy anymore; it’s an essential part of our pop culture, like Superman or alien invaders. I mean, seriously: look at this. So today, I don’t feel nostalgia for John F. Kennedy. I feel nostalgia for John Munch.

Postscript

If you’re interested in a video about the actual assassination, one that attempts to answer a few questions honestly, this one is good:

The Worst President of the 20th Century: Part Two

By Seth Studer

1. “Terrible Headaches”

On or about November 10, 1997, the Kennedy legacy changed. Seymour Hersh’s The Dark Side of Camelot appeared on Barnes & Noble table displays across the fruited plains. The sexual promiscuity that added a little heat to the Kennedy aura – you could imagine him picking up girls, laughing with Frank Sinatra in a bar, all in a black-and-white photograph – was suddenly an inventory of sleazy details.

Worst of all, if you believed Hersh and his sources, Jack Kennedy was kind of a dick.

Before that, Jack was cool. In 1996, a plurality of Americans reelected a president even though they disapproved of his sexual predilections. In the public mind, I think Clinton’s primary sin was not that he committed adultery but that he had joyless sex with the wrong kind of women and, despite his own glowing opinion of himself, he didn’t seem very self-assured about any of it. He actually winced when you asked him about sex. He didn’t do it right. He didn’t do it with style.

Not like Jack did.

This was the 1990s.

Jack was an athlete; women were like sport. Clinton had an appetite; women were like McDonald’s. Jack was fit. Jack was from the Northeast. Clinton was from Arkansas. (American film and literature have always portrayed Southern sexuality as somehow…off.)

Ever since ’92, Clinton had evoked and welcomed comparisons to the previous fortysomething president. Democrats still proudly invoked John F. Kennedy’s memory. What’s more, a generation of Republicans who didn’t cringe at the Kennedy name had arrived. Dan Quayle had confidently compared his record to JFK’s. Senator Lloyd Bentsen’s famous response – “Senator, you’re no Jack Kennedy” – is still hailed as a high point of the Dukakis/Bentsen campaign. In 2013, Bentsen sounds like a supercilious ass: an elder senator who isn’t going to listen to some kid pretend he knows something about Jack Kennedy. “I knew Jack Kennedy,” he says, which only reminded people that Bentsen had been in Congress for forty years. His Texas growl may have evoked memories of Lyndon Johnson: not an association you wanted in 1988 (Johnson’s rehabilitation would come later). Meanwhile, the insanely competent but uncharismatic Michael Dukakis, the man who would be the next president from Massachusetts, didn’t do much to secure the Democrats’ monopoly on Kennedy nostalgia.

Three years later, Oliver Stone’s JFK briefly mainstreamed absurd conspiracies about Kennedy’s assassination. The film actually inspired new legislation: the President John F. Kennedy Assassination Records Collection Act of 1992 was signed into law by former CIA director George H.W. Bush (back before he earned the “H.W.”). But the most significant thing about JFK was that it’s not actually about JFK. Aside from the wonderful “Mr. X” scene, Kennedy’s policies are barely discussed. The film followed the oddballs and patsies who stand at the edges of the assassination. The film was about the events following the event. Kennedy wasn’t even the objet petit a (that’s the grassy knoll, or whatever’s behind it). Kennedy the man and Kennedy the president had ascended into symbolism, and were just as pliable.

By the mid-90s, Republicans were citing Kennedy as a proto-Reagan. He cut taxes, funded the military, preached personal responsibility (“ask not”): if JFK were alive today, he’d be a Republican! Democrats, meanwhile, persisted in giving Kennedy credit for the few good things Lyndon Johnson had done and blamed Johnson for all the bad things Kennedy did.

1996 was maybe the last good year for old-fashioned flannel-jacketed Kennedy nostalgia. A huge swath of the adult population hadn’t even been born when Kennedy was elected, while many aging baby boomers had little more than a strong, adolescent impression of his presidency. These were ideal conditions for distortion, false memories, and mythologizing. JFK was an avatar in the public imagination, like Marilyn Monroe (whose movies nobody watched) or Elvis Presley (whose music nobody listened to). Americans could compare the avatar to their current president and find the current president wanting. They could compare the glamor and grace of Kennedy’s era to the present, and find the present wanting (a favorite American pastime).

Then The Dark Side of Camelot fell into their laps, and suddenly Jack was sleazy. The conversation inevitably started with the women, and that’s okay: the women are perhaps the most substantive aspect of Kennedy’s presidency. Inga Arvad and Marilyn Monroe – who could blame him? All the mob stuff was kinda sexy. Everyone had heard that Jack slept with Marlene Dietrich after Marlene Dietrich slept with Jack’s dad – that was kind of weird. But y’know, these are things celebrities do. And JFK was the first celebrity president. Wouldn’t you sleep with as many women as possible if you were the most famous, powerful man in the world?

Except that after Hersh, Kennedy wasn’t merely canoodling with glamorous women on a giant, white, oval bed, the presidential seal hanging overhead. After Hersh, Kennedy was a misogynistic lech. He slept with secretaries and interns and journalists, women who were not in any real position to say “no.” He used the White House as a harem for his friends and his brothers: they need only show up and a woman would be procured for them. The 1960 campaign was a template for the forthcoming Beatles and Stones tours: women everywhere, always available. Jack even spent the night before his inauguration with one of his steadier girlfriends. And during the campaign, he told a girl that he would divorce Jackie if Nixon won the election (here’s something fun: imagine an inversion of that scene where it’s Nixon instead of Kennedy). Would Jackie consent to a divorce? Well, they certainly weren’t happy. The primary complaint in the Kennedy marriage was Jack’s promiscuity. Once in office, he promised to sleep with other women only when Jackie was not at the White House. Consequently, Jackie spent much of the glamorous Kennedy years outside the White House.

But a misogynistic lech can be an effective president, right? Sure. Except that after Hersh, Jack’s fame, power, and a dehumanizing attitude toward women were no longer adequate excuses for his sex life. After Hersh, the whole question of “excuses” was supplanted by the need for explanations. It’s not that Kennedy was immoral. It’s that something seemed seriously askew in Kennedy’s judgment, maybe even his brain. JFK kind of had a problem.

Kennedy took bizarre risks in order to have sexual intercourse. He had divisions of the Secret Service coordinating his liaisons. Financial resources were funneled into scouting, securing, and serving women to the president. Secret Service agents were diverted from regular duties to plot elaborate mazes through which women were brought to Kennedy. Most of the affairs were one night stands, and his security complained that the number of women coming and going stretched their ability to keep the White House secure. Kennedy shared a woman with mafioso Sam Giancana. She was a dual-mistress and a courier, moving cash from the White House to Giancana to fund the United States’ ongoing operation of failing to kill Castro. (That’s another thing: unlike his immediate predecessors, Kennedy took a direct hand in murder and assassination plots, seemingly unconcerned about plausible deniability. Even Nixon kept a guy or two between himself and the plans.)

When Kennedy traveled abroad, he slept with women who were barely vetted by the Secret Service. He inadvertently slept with former members of the Communist Party (foreign and domestic). He even got the chance to sleep with two actual, bona fide, real-life Communist spies (fortunately for him, their espionage duties were all confined to a middle-tier U.S. ally: England). The ghost of his old friend, Senator Joe McCarthy, must have been otherwise occupied. (Maybe McCarthy’s ghost gave Jack a pass, what with Jack having helped fund the psychotic anti-Communist’s reelection.)

Kennedy’s staff may have tolerated the contortions required to bring women to the president because they couldn’t tolerate his short periods of celibacy. One Secret Service agent described the sexless days when Jackie was at the White House: “[Kennedy] just had headaches. You really saw him droop because he wasn’t getting laid. He was like a rooster getting hit with a water hose.”

Harold Macmillan – Kennedy asked this man whether going a few days without sex gave him headaches.

Kennedy told many people about his headaches: friends, enemies, members of the press. “If I don’t have a woman for three days,” he told Harold Macmillan, the Prime Minister of the United Kingdom, “I get terrible headaches.” (“I wonder how it is with you, Harold?” he asked one of the most distinguished PMs of the 20th century.) This behavior was not libidinous, it was compulsive. Something was…off. Even if we accept only the mildest accounts, John F. Kennedy fit the textbook definition of a terminal sex addict.

Most of what I’ve described emerged from Hersh’s research, which was criticized as lax (there was a minor scandal over one document). The rest emerged in the years since The Dark Side of Camelot was published. The book’s reviewers adopted one of three disingenuous moods: righteous indignation (at Hersh), smug cynicism (very ’90s), or, if you were The New York Times, both. But for the Barnes & Noblers who heard about the sleaze in Newsweek and decided to read it for themselves, the book was a snore. Hersh’s style was dry, straightforward, reporterly. Most readers felt like teenagers who had rented and watched all of I Am Curious for the naughty bits: tired, confused, let down.

In the two years after The Dark Side of Camelot, Democrats defended Bill Clinton’s White House dalliances with everything in their arsenal, including Kennedy. But thanks to Hersh, appeals to Jack were losing their power. Kennedy now offended the sensibilities of too many segments of the body politic, left and right. Hillary Rodham Clinton (all three names) was first lady, the most powerful first lady since Eleanor Roosevelt. Newt Gingrich was Speaker of the House of Representatives, the most visible Speaker since…I don’t know, Henry Clay? Between second (and third) wave feminists and conservative Evangelicals, there wasn’t much space for an ass-slapping frat boy with a charming smile. Kennedy’s Rat Pack womanizing was no longer cute.

And the real shocks were yet to come.

2. 1961

By the 40th anniversary of the assassination, Kennedy’s legacy was undergoing further revision. The impact of Hersh’s work is acknowledged in the opening line of Robert Dallek’s December 2002 Atlantic cover story:

Recent assessments of Kennedy’s presidency have tended to raise “questions of character”—to view his Administration in the context of his sometimes wayward personal behavior. Such assessments are incomplete. Newly uncovered medical records reveal that the scope and intensity of his physical suffering were beyond what we had previously imagined. What Kennedy endured—and what he hid from the public—both complicates and enlarges our understanding of his character.

Dallek’s article revealed what even Hersh hadn’t discovered (or disclosed). The fact that Kennedy had Addison’s disease was not news. He admitted that during his lifetime. The incredible, byzantine, all-encompassing system that Kennedy and his staff designed to hide the severity of his ailment from the public – that was news. The fact that Kennedy was on massive doses of painkillers throughout his entire presidency, that he required amphetamines, that he may have been addicted to something other than sex – that was news.

Imagine not discovering the full scope of the Watergate cover-up until, say, 2011. The pay-offs, the perjury, all the cloak-and-dagger shit. Now multiply all that by powers of a hundred. This is the impact the general public should have experienced (and Kennedy aficionados certainly did) when they learned that JFK spent long periods of his presidency as a kind of half-cogent prop held together by straps and metal braces underneath his clothes. That he was in constant and agonizing pain. That he was perpetually at death’s doorstep, that he’d taken up permanent residence on death’s doorstep as a young boy. That he had little chance of living into his 50s (hence the urgency to run for president in 1960). That a metal brace on his back held him perfectly upright for Lee Harvey Oswald’s third and fatal bullet.

Hersh had actually disclosed that last horrifying fact in 1997, and had wrongly attributed the brace to a nasty pool-side fall that occurred a few days earlier when the president was, you guessed it, having sex. The pool-side sex/fall did happen, but the brace was there to counteract Addison’s effect on the president’s back.

A weak, dying man in a 42-year-old’s body that functions like a 72-year-old’s body (except for the groin) certainly cuts a sympathetic figure. But even here, Kennedy found ways to behave recklessly.

Kennedy liked his doctors the way he liked his women: numerous and a little dangerous. The White House hired several medical doctors, including the Hollywood physician Dr. Max Jacobson, whose cocktail of drugs – cortisone and amphetamines, plus other stuff now and again – impaired Kennedy’s physical and cognitive functions. The problem wasn’t the drugs, it was the dosage. All of Kennedy’s physicians prescribed smaller doses of these and similar drugs, most of which would affect the president’s mood and mental clarity a bit (some have argued that certain drugs were responsible for his unquenchable libido). His physicians tried to keep these side effects manageable, but the dosages they prescribed did not fully alleviate Kennedy’s pain. Only Jacobson’s came close. So when his medical team warned that Jacobson’s treatment would too severely diminish the cogency of the president of the United States, Kennedy responded accordingly: he built a Chinese wall between his White House physicians and Jacobson.

This Chinese wall helped erect the Berlin Wall, in its own small way. Kennedy snuck Dr. Jacobson along to Vienna for the June 1961 summit with Nikita Khrushchev. He kept Jacobson’s presence a secret from his other physicians, and – whether out of nerves or genuine pain – asked for an unusually strong dose of painkillers before meeting with the shoe-banging Soviet leader. Within hours, Kennedy appeared haggard and sick. He was about to meet the leader of the Communist world, and he was stoned.

The fact that Kennedy’s foreign policy was in tatters didn’t help. The new president had struggled to unite his strong-willed defense advisors and could not formulate a sane, cohesive policy toward the Soviet Union and its satellites: something his predecessor had accomplished delicately but deftly. Khrushchev and Eisenhower had been moving toward a kind of detente, albeit at a glacial pace. As Khrushchev consolidated power in the mid-1950s, China began testing the limits of its dependence on Russia. This pushed the U.S.S.R. ever so slightly toward the possibility of cordial relations with the United States. In retrospect, it’s clear that – despite numerous conflicts, setbacks, and both sides’ hardline policies – Eisenhower and Khrushchev had made small but significant progress toward a mutually acceptable draw in the Cold War. In fact, this slight slackening in U.S.-Soviet relations helped Kennedy become president: he campaigned loudly on a platform of rearmament and aggression toward Cuba and the Soviet Union. With Barry Goldwater and others, he accused Eisenhower of being soft on Communism. And Eisenhower, in turn, expressed alarm over the aggressive yet casual way the senator from Massachusetts spoke about nuclear weapons.

Kennedy’s consistently hot rhetoric (the “ask not” speech is more warlike than you remember) put Khrushchev on the defensive when he arrived in Vienna. A year earlier, under Eisenhower, some kind of agreement on Berlin seemed possible. But the Soviets were threatened by Kennedy’s rhetoric, confused by his actions, and now buoyed by his visible physical weakness (which counts for a lot in Russia, apparently). Kennedy was slow to speak, slow to respond. Khrushchev ambushed him.

After Vienna, Soviet resolve had increased. Kennedy appeared distracted, easy to manipulate. Meanwhile, Kennedy developed a nasty infection, exacerbated by his Addison’s, which increased his pain and decreased his mobility (some records intimate that the president nearly died on June 22). Dr. Jacobson amped up the drugs, further decreasing Kennedy’s ability to work. The vibe emanating from the White House was ambivalence, confusion, vulnerability. The tough, hawkish persona Kennedy had spent years cultivating was crumbling. Khrushchev felt increasingly comfortable: he threatened to seize West Berlin, and then in August he quarantined East Germany (and the rest of the Eastern bloc) by closing all roads to the West and building a wall around West Berlin, holding millions on the Communist side hostage.

In their haste to prevent East Germans from fleeing, they forgot to Reagan-proof the wall – an oversight they would regret in 28 years.

Khrushchev never intended to seize West Berlin; he understood that such a move would result in war, perhaps a nuclear exchange. But after seeing the intoxicated president, he thought to himself (in Russian), “Eh, why not?” The bluff would make the quarantine look like a concession; the Soviets could pretend to be the level-headed ones for once.

Dr. Jacobson did not build the Berlin Wall. Plenty of sober minds helped draft Kennedy’s disastrous foreign policy in 1961. Broader geopolitical and economic factors narrowed the options for both Kennedy and Khrushchev at Vienna – neither man was ever in complete control of his government. But insofar as Kennedy had power over certain outcomes, he either bungled or misused it. He had difficulty managing his defense and diplomatic teams, a mish-mash of military brass and technocrats. Everyone in the White House knew that Kennedy was often disoriented by his medication. The president’s lucidity must have been impaired at several important junctures: meetings on Cuba, Turkey, Laos, Vietnam.

Kennedy was not the first or the last president to make foreign policy decisions while intoxicated. But no other president had been so frequently intoxicated while responding to such potentially apocalyptic events.

Apart from the drugs, Kennedy was naturally ambivalent. He loved the appearance of risk but hated the actual, y’know, risk. He committed to major policies half-heartedly. He applied pressure to Indochina, tampered with Cuba, and placed nuclear missiles in Turkey, but always reneged when serious risk seemed imminent. Meanwhile, he began sending olive branches to Khrushchev through back channels and backed off Germany. These oscillations confused Soviet leadership. This man had no idea how to wage a Cold War. So in the summer of 1962, Khrushchev, feeling sufficiently confident in his position vis-à-vis Kennedy, installed nuclear missiles at the United States’ doorstep.

Then the world almost ended.

By all accounts, Kennedy was cogent throughout the missile crisis. A few months earlier, he had finally caved to pressure from his physicians and fired Dr. Jacobson. His staff recorded a notable change in the president: he wasn’t stoned.

The missile crisis jolted the president. After 1962, his foreign policy oscillations were less extreme, except in one low-stakes arena: that sliver of Southeast Asia far away from the rest of the Cold War. Kennedy had already begun to apply his Cuban tactics to South Vietnam: covert operations, low-risk disruptions, sex-for-espionage, even a few troops here and there (he called them “military advisors”). He could meddle in Vietnam without enraging Khrushchev. And with no overarching Vietnam policy, he could experiment with hundreds of little policies (policies were kind of like women). He could make commitments with governments and then, if he changed his mind, break them. If they threw a fit, no problem: small, pro-American governments in the Third World – like women – were easy to kick out. He could fight Communism with few consequences. He could have action without risk. What could possibly go wrong?

3. Playboy

Vienna might have gone poorly, and the Berlin Wall might have been built, even if Kennedy had never met Dr. Jacobson. The doctor and his dope were just two small actors in a much, much larger drama. And we can forgive a sick man for seeking radical treatment to relieve his pain…unless he’s president of the United States. With Dr. Jacobson, Kennedy willfully and recklessly made himself vulnerable, as he had with so many women, any one of whom (as far as the White House knew) could have been Khrushchev’s niece. His worst decisions mirrored the worst aspects of his personality. He was reckless. Selfish. Careless. Unwilling to fully commit. These words describe his marriage as well as the Bay of Pigs and Vietnam.

Kennedy relished the illusion of action, and for most of his life, people were paid to pretend the illusion was real. “Never expect any appreciation from my boys,” Joe Kennedy told Tip O’Neill in 1953. “These kids have had so much done for them by other people that they just assume it’s coming.” If Kennedy wanted something, he got it; more dangerously, if he wanted to be something, he became it. His father’s influence and sense of entitlement guaranteed that just enough people would play along to make it real for Jack. Consequently, Jack jumped into positions he hadn’t prepared for and just lazily played the part. Jack’s acting career was more varied than Ronald Reagan’s. He played good student. He played celebrated author. He played at winning a Pulitzer. He played underdog candidate. He played congressman. He played senator. He played president. He played a healthy, vigorous young man. And he always got away with it. Only the presidency required some real effort, some real acting, before he got the part (one of the most subtle actors of the 20th century was competing with him for the role). But Kennedy always knew to expect a deus ex machina or, in this case, a daley tex machina. Daddy delivered.

It’s difficult to fault Kennedy for being spoiled. But compare Kennedy to Nelson Rockefeller, his closest counterpart, to see how singularly Kennedy’s hyper-privileged upbringing deformed his character.

Like Kennedy, Rockefeller was a son of ridiculous wealth who fully expected to become president one day. Like Kennedy, he was spoiled, petulant, and licentious. Granted, Rockefeller didn’t possess Kennedy’s bitterly competitive edge or his taste for corruption, but he also didn’t have Joe for a father. Sure, Jack and Nelson’s advantages were handed down differently: what Jack got from his father’s willingness to cheat, steal, bribe, fight, and rig nearly anything to get his boys elected, Nelson got from his name, an intangible key that opened every door in the free world. Nelson had more advantages, probably; but then he wasn’t the one who became president.

Still, Jack and Nelson are worth looking at side-by-side. Rockefeller possessed all the repulsive qualities of a privileged son, and he grew sour with age. But he possessed warmth and sincerity, too. He slept around and committed adultery, but he also had the capacity to fall in love with a woman and settle down (and probably lost the Republican nomination for it). Above all, he possessed a set of convictions that were noble and inflexible, and he was willing to sacrifice political capital in the service of these convictions. His political career stalled because he wouldn’t accept the Republican party’s willingness to trade integrity for nihilism on Civil Rights in exchange for the votes of the old Confederacy.

Kennedy’s virtues were much sparser: he was undeniably charismatic. He was naturally funny, charming, and quick-witted. He was genuinely kind to his friends and allies (he could be ruthless with anyone he didn’t like; unlike Bobby, however, ruthlessness was not his sole attribute). As for conviction: Kennedy’s admirers point to his Civil Rights record, and it’s true he proposed a Civil Rights bill and (characteristically) took some half-risks in the process. It’s even possible he would have risked losing one or two states in 1964 for Civil Rights.

But I really doubt it.

From the beginning, Kennedy was terrified of Civil Rights issues. He was terrified of Southern politics. He didn’t want any of it near the White House; or, he didn’t want it near him. Whatever progressive moves he made on Civil Rights were the result of intense pressure from within his party and his administration. But in general, he took a tepid stance – if he planned to sincerely fight for the Civil Rights bill he proposed, à la LBJ, he didn’t let on. Pro-segregationists from within his party were powerful, and the president’s team didn’t want blacks to cost him a second term. The issues at stake were toxic. Whatever gestures Kennedy made toward the Civil Rights movement were just that: gestures.

Sincere political conviction frightened Kennedy more than marital fidelity. He spent his entire political career avoiding it.

Judged solely on his personality, John Fitzgerald Kennedy possessed more unattractive qualities than any U.S. president since Andrew Jackson. And like Jackson, his persona overwhelmed his presidency. In that respect, the crude, licentious, perpetually doped frat boy is merely an inversion of the witty, energetic, charismatic Camelot Kennedy: both are a spectacle, designed to make politics more exciting to people at the dentist’s office.

As a result, critics sometimes dismiss Kennedy on the basis of spectacle (he’s just a cultural figure, a celebrity, a sex symbol, a martyr, etc.). But that only diminishes his many dubious accomplishments. In a mere thirty-four months (or “a thousand days,” if you want to get all goose-bumpy), Kennedy oversaw a frightening reversal of Cold War strategy; equivocated on or bungled the most vital issues; dramatically escalated the government’s reliance on thugs, mercenaries, and gangsters (LBJ called it “a damned Murder Incorporated”); and treated nuclear war as a penis-measuring contest, recklessly taunting the Soviet Union more than any other Cold War president (yes, including Reagan).

All this in addition to the lies, drugs, and hedonism that receive more press, the slime that coats his more substantive failures. With John F. Kennedy, personal flaws and political failures emanate from the same sewer.

In the next installment: November 22, 1963

Anthropology and The Evolution of Mean Girls

By Amanda Grigg

Disclaimer: This post features references to the greatest film of our generation, Mean Girls. If you haven’t seen it what are you doing with your life go watch it right now. If you have, get in loser, we’re going blogging.

The fanciest British journal ever, Philosophical Transactions of the Royal Society, published a special issue this fall on female aggression, and its conclusions have been making their way across the web. Some of the scholarship applies science to the “mean girl” phenomenon, so of course journalists are all aflutter to see who can cover the findings in the most annoying way possible. Contenders include a LiveScience post titled “Mean Girls: Women Evolved to be Catty?” and The New York Times coverage.

Most of the coverage focuses on a single study from the special issue, conducted by Tracy Vaillancourt and Aanchal Sharma. To learn more about how women react to “rivals” the researchers placed two undergraduate women in a room together, ostensibly as part of a study on female friendship. Then they sent in another young woman wearing either khakis and a crew-neck shirt (Cady pre-Mean Girlification) or a short skirt, knee-high boots and a low-cut top (regulation hottie).

And of course, the researchers chose this model not because she fits a very particular cultural model of sexual attractiveness but because she “embodied qualities considered attractive from an evolutionary perspective,” meaning a “low waist-to-hip ratio, clear skin, large breasts.” It doesn’t hurt that she’s white, tall, blonde and has perfect teeth. Or maybe cavemen were also particular about the hair color and orthodontia of their mates.

As researchers expected, reactions after the young woman left varied depending on the woman’s clothes. The conservative outfit elicited little response. The “sexy” ensemble summoned their mean girl wrath:

They stared at her, looked her up and down, rolled their eyes and sometimes showed outright anger. One asked her in disgust, “What the [expletive] is that?”

…One student suggested that she dressed that way in order to have sex with a professor. Another said that her breasts “were about to pop out.”

To explain this, author John Tierney turns to evolutionary forces. On the evolutionary incentives to be indirectly aggressive:

“women were not passive trophies for victorious males. They had their own incentives to compete with one another for more desirable partners and more resources for their children. And now that most people live in monogamous societies, most women face the same odds as men. In fact, they face tougher odds in some places, like the many college campuses with more women than men.”

The piece seems to assume that evolution and primal mating calculi are the driving forces behind the forms female aggression takes, and at whom it is directed. Because science. To which I say, ugh.

Cady: 1, Evolutionary explanations for female aggression: 0

Of course Mean Girls protagonist Cady Heron, being the daughter of anthropologists, understands the role of culture in shaping female aggression. Throughout the film she notes the way things would be handled “in the animal world” but reminds herself, and the audience, that “this was girl world.” When Queen Bee Regina dangles her boyfriend (and Cady’s crush) Aaron in front of Cady to taunt her, Cady fantasizes about violently attacking her rival. But, because “this is girl world,” she tells Aaron that his hair does in fact look sexy pushed back and continues to quietly plot (indirectly aggress) her revenge.

While I would be fine basing all of my repudiations of The Grey Lady on the wisdom of Tina Fey’s Mean Girls, we can also turn to alternative coverage of the story. From io9:

The problem with talking about humans, of course, is that we are not wild animals. As Stockley and Campbell are careful to point out, humans have been so influenced by culture that it’s very hard to tell if a lack of overt aggression among women is an evolutionary or cultural artifact. Because so many women are culturally trained to tamp down their aggressive urges, it’s impossible to call their behavior “natural.”

…or did they?

For its coverage, The Atlantic spoke with Agustin Fuentes, chair of the department of anthropology at Notre Dame, whose comments are summarized here:

though this and other studies show how important physical appearance is to the way women respond to each other, there’s too much cultural baggage at play to say it all comes from our primate ancestors. The short-skirt-boots combo, for example, is already a “meaning-laden image,”

As Fuentes suggests, how women identify “competition,” and thus whom they direct aggression toward, is fundamentally shaped by culture – cavewomen certainly didn’t wear knee-high boots.

Though the researchers’ plant has the exact same “evolutionarily attractive” physical features in either outfit, she only elicits aggression in the short skirt, which suggests that it’s not primal mating urges at work (or at least not just those urges). The outfit incites “indirect aggression” because it carries all sorts of cultural meanings, which women have been socialized to recognize and criticize for reasons beyond competition for mates.

The NYT piece also fails to note the similarities between male and female aggression. According to Fuentes, girls and boys engage in equal amounts of direct aggression until adolescence, at which point it becomes socially unacceptable for girls to do so. And according to David Buss in The Atlantic, studies have suggested that adult men also engage in indirect aggression, especially once they reach the age at which it becomes socially unacceptable for them to engage in direct aggression.

Buss has found that men “bitch” about their rivals, too—they just tend to insult their lack of money or status, the things women traditionally have valued in mates, rather than their physical appearance.

Overlooking the use of the word “bitch” to describe something you’re trying to argue is gender neutral…it’s notable that men insult rivals for lack of money and status. Of course you could argue that cavewomen wanted mates with lots of buffalo meat in the bank (I’m pretty sure that’s accurate anthropologically), but it seems absurd to try to explain this without acknowledging the social and economic context – particularly that women in recent history have relied largely on men for financial support and, equally troubling, that men have been judged primarily by their economic and professional accomplishments. It’s just as absurd to try to explain indirect aggression between women without at least considering the cultural context.

I’ll conclude with two quick insights from feminist theory. I don’t think either fully explains these reactions, and they certainly don’t justify woman-on-woman hate, but they do suggest that there is more to these interactions than biology. First, we might look to Sandra Bartky and Foucault to understand how these responses are part of the process by which the cultural ideal of femininity is constructed. Insofar as that ideal demands the perfect balance of modesty and sexuality (walking the Madonna/whore line, which this woman seemingly does not achieve), these responses serve to “discipline” the woman, encouraging her to fall in line. Second, à la Ariel Levy’s Female Chauvinist Pigs, we might consider that women are viewing the sexy plant not just as an abstract threat to their primal urge to defend mates, but as a physical manifestation of the constant pressure women are under to be thin, blonde, beautiful, and above all sexy. “What the [expletive] was that,” indeed.


Don’t Become What You Study

By Graham Peterson

As I understand it, psychology students get rather seriously advised against using their textbooks and courses to self-diagnose, and against using their own heads to derive general theories of mind. I think that’s incredibly sound advice, and I think we could use more of it in the rest of the human sciences.

We can always use a stern reminder in social science to resist the temptation to derive theory from intuitive introspection.  Even Milton Friedman admitted in his Essays on Positive Economics that social science is qualitatively different from the hard sciences — the social scientist himself is a potential research subject, and carries with him a folio of “empirical experiences.”  It’s the cheapest data around, and because demand curves slope downward, we consume the most of it.  That’s the problem with trying to construct theories of what’s out there based on what’s in here.

But there is a double danger in applying to ourselves our theories of what’s out there, in order to reshape and reform what’s in here.  That’s how you end up stuck in a self-referential loop, and merely extending the process to a circle of your colleagues is even worse, because your degree of smiling idiocy starts rising exponentially as a function of the number of colleagues who have the same baseless reasons for belief.

I reflected (you see! I’m doing it right now!) on this a bit last year. After studying economics long enough, I began to use economic reasoning to justify my own actions ex post (after the fact). Finding myself with inevitably human regrets and guilt about choices I’d made, I would reason, “well, the benefits outweighed the costs at the margin, so that’s why I did it and I shouldn’t feel bad about it.” Now, that’s not economic reasoning at all. Rational calculations take place ex ante (before the fact) — they’re not a device to make yourself feel better at 10 a.m. about having a shot at last call the night before.

To be sure, it’s worthwhile to give yourself a break and recognize that a lot of your own decisions were relatively prudential.  It’s worthwhile to say, “hey, I was really stressed and the ice cream felt good so fuck you, conscience.”  But that’s not economic reasoning.

What’s struck me in my new surroundings is the degree to which I see sociologists fitting themselves to sociological theory. I find it incredibly exhausting. Everyone gets fitted to respective social roles based on class, skin color, breadth of vocabulary, gender, and so forth, and the party starts. Sociology students openly and unselfconsciously reduce one another to simplistic stereotypes and go hunting for double standards, discrepancies, and oppressions in one another’s behavior. That’s when sociology becomes circular identity politics — when everyone partakes in this religious cleansing ritual — ostensibly meant to “raise consciousness.”

It must seem incredibly obvious to people who have studied sociology for a very long time that everyone is indeed a latent racist, sexist, nationalist, and so forth, after systematically categorizing their friends, family, and coworkers as such daily. The sociology student has assumed from the start that people are making such deliberations subconsciously (err, institutionally), and prides herself on the enlightenment she’s accomplished by making such thinking explicit and checking her privilege. But nay, it could just be that by sheer force of her own priors and a determination to fulfill them, she has merely seen in herself and the world around her a chapter of her sociology reader.

Just like I was when studying economics. This is the danger of studying the human sciences — we become the theories we construct — and we impose them on the world around us, oftentimes to our own detriment and that of others.

Psychologists have a particularly cynical view of human nature: that the majority of it originates in pathology and fear. They get expressly warned against turning that cynicism on themselves. But note that all social sciences have a disturbingly cynical theory of humanity. Homo economicus is a selfish bastard; Homo socialis is a pathological discriminator; Homo anthropolus speaks in 60-word sentences; and Homo politicus is a power-hungry exploiter. We are all well advised to stop dead our temptation to reform ourselves and our own emotions, in an attempt to undo whatever cynical pathology we spend the rest of our day diagnosing the world with.

Social science does not exist as a device for the neurotic gratification of the religiously lost, and it becomes more and more of a religion, and less and less of a science, in direct proportion to the degree we encourage one another and allow ourselves to paste our theories to our own foreheads, swear by them, and unselfconsciously become them.

*Shouts go out to my man Rich Wallace, currently studying sociology, who prompted me with, “I just want to rock Jordans while I’m debating intersections of race, class, and gender, is that too much to ask?”

The Worst President of the 20th Century: Part One

By Seth Studer

Foreword 

This week marks the 50th anniversary of John F. Kennedy’s assassination. Over the next few weeks, I will write a series of posts reflecting on JFK, his life, his legacy, the office of the president, American history, cultural memory, and myself. The thread uniting these posts is a single thesis: John F. Kennedy was the worst president of the 20th century. Some posts will veer away from Kennedy, others will deal exclusively with Kennedy, but all will, in their own way, approach and grapple with the question of what it means to be “the worst president.”

A Tale of Two Libraries

Boston is a second home to me. I’m sure thousands of ex-grad students who’ve attended any of the dozens of universities along the Charles River feel the same way. The feeling runs a bit deeper for me, I think. I met my wife in Boston. She and her entire family hail from Boston, Dorchester, and the patchwork of suburbs that hug the harbor. Half of my family is there now, and I’ll never stop returning. I’ve tried to write about Boston before and failed. Boston is a place about which I’ve ceased to have easy or definite opinions. For me, that’s a pretty good definition of home.

My first home is eastern Iowa, the stretch between Waterloo and Iowa City. As a kid, the Herbert Hoover Presidential Library and Museum, located west of the University of Iowa in West Branch (pop. 2,322), was a frequent field trip. I probably visited Hoover’s museum ten times before I finished high school. The library sits near Hoover’s childhood home; the grounds feature renovations and reconstructions of his birthplace, his childhood barn, and the Quaker meetinghouse where he worshiped.

Hoover didn’t stay in Iowa long. His museum is filled with unintentionally hilarious mannequins of the cosmopolitan Hoovers traveling the globe: mining in Australia, feeding children in war-torn Europe, standing astride a cannon during an anti-colonial uprising in China. (Mrs. Hoover, a woman of grit, does the striding.) Visiting the museum, I never reflected on the millions of acres of grain surrounding me, grain that would eventually leave Iowa and find its way to every continent. For me, the Hoover museum was a more immediate and tangible (if somewhat tacky) link between Iowa and the world.

Future first lady Lou Hoover posing by a cannon during the Boxer Rebellion. The Hoover Library recreates this problematic scene, a fixture of my childhood.

I first visited the John F. Kennedy Library and Museum in South Boston when my parents were in town. Like most urban dwellers, I didn’t take advantage of local attractions unless people were visiting. And ever since I moved to Boston, I had been eager to visit Kennedy’s library, in part because of Hoover’s. The Hoover museum is a carefully curated response to preordained hostility: people hate Hoover. Hoover caused the Great Depression (right?). He did nothing while people lost their savings, their jobs, their homes (right?). The curators offer a humble but firm apologia, something to do with foreign aid and Russia and being a wise old man in Manhattan. Even if you think Hoover was a bad president, says the library, Hoover was still a pretty good guy. Fine. I knew that. Now I wanted to see a library dedicated to a beloved president, someone who managed to squeeze lots of exciting shit (including loads of sex and near-apocalyptic disasters) into three short years! What must his library be like?!

By the end of my day at the JFK library, I had really learned something: Hoover’s library is kind of awesome.

Hoover’s was the first of the now thirteen presidential libraries affiliated with the National Archives and Records Administration (NARA). As far as I know, it is the only presidential museum whose curators operate under the assumption that their president was a failure. It would be hard to do otherwise. Everyone hated Hoover, even Iowans [1]. He’s the only president to have a symbol of nomadic poverty named after him. When critics wish to accuse a sitting president of total incompetency, they frequently invoke Hoover (sometimes idiotically). In its end-of-the-century coverage, Time magazine ranked Hoover as the worst president of the 20th century.

And who could object?

Herbert Hoover’s childhood meetinghouse. Only two U.S. presidents have been Quakers: Hoover and Nixon. Takeaway: if you’re a Quaker, don’t become president.

Even if you take a sympathetic view of the man and his presidency – Hoover was an unparalleled philanthropist! he was a great statesman! he opposed Wall Street corruption throughout the 1920s! he began implementing New Deal-style reforms as early as 1929! the Great Depression was mostly Harding and Coolidge’s fault! any incumbent (even a composite of Lincoln, FDR, Reagan, Albert Einstein, Steve Jobs, and Harriet Tubman) would have lost in 1932! – even granting all that, you have to admit, Hoover was the worst.

Gore Vidal once suggested that ex-presidents be given the emeritus title “Librarian” (an ironic title, Vidal added, when you consider Americans’ general indifference to reading). And if, for reasons perverse and unknown to me, you wanted to look inside the brains of a president’s most ardent supporters, to see what they see, you could do worse than visit a presidential library. Let’s begin with the real estate: Hoover’s library is situated at the end of a long driveway, an easy-to-miss turn off an old highway. The library is tucked away among trees. As anyone anywhere will tell you, Iowa is not a heavily forested state. I’m sure someone decided that trees would add shade and beauty to the grounds, but they only make the library seem deliberately hidden. The flat ’60s-style architecture, the “living history” vibe of the outdoor exhibits, the mannequins, the retro arcade-style buttons you push to watch videos produced in 1985: everything about the library feels antiquated.

The Kennedy library, by contrast, towers confidently at the edge of a peninsula it shares with the University of Massachusetts-Boston, a great white slab jutting out over the harbor, the bay, the ocean (no need to imagine a link between Boston and the rest of the world).

Herbert Hoover Library and Museum

Only a jerk would point out the absurdity of the cones and pyramids; the spheres that abruptly give way to sharp angles; the awkward slabs of concrete juxtaposed with walls of glass; the wide, empty stairs; and the hollow square detailing (rendered, again, in concrete); all of which reek of, well, 1979: the year Kennedy’s library was dedicated, the year before his baby brother’s final unsuccessful shot at the big oval, the year before a Hollywood actor matching Kennedy’s charisma would claim his office (properly mandated, and able to survive two full terms). If Hoover’s library feels antiquated, Kennedy’s is merely dated.

Which is worse?

The late ’70s and early ’80s might have been the height of Kennedy nostalgia, when the Kennedy era was becoming historical memory: the moment of malaise, a time when swinging and sex weren’t fun anymore, when assassinations lost their shock value. A yucky time, more Teddy than Jack. This moment, not ’61 or ’63, is fossilized just beneath the surface of what seems, at first glance, a fantastically beautiful building, the John F. Kennedy Library and Museum.

Once you’re inside, Kennedy’s museum is curated to reaffirm the love you obviously already have for the man. I expected as much going in. JFK was a popular guy, and I assumed the museum would reflect that. No need for too much dirt, the lewd details: the question of how many women he slept with (Vidal, not the least reliable source, claims it was close to 5,000); the question of how high he was (moderately, but much of the time); the question of his isolationist daddy’s machinations; of Cook County ballot boxes; of dead voters in Texas (what, would you want Nixon to have won?).

John F. Kennedy Library and Museum

But I was immediately alarmed by the Kennedy library’s total and unapologetic adulation of its man. Presidential library museums are inherently propagandistic, I get that. Even the Hoover museum highlights two accomplishments for every terrible decision. What I didn’t expect was a slobbery, sometimes defensive, borderline kitschy shrine to a complicated administration. I didn’t expect a brazen and unapologetic whitewash of a man who has already been washed whiter than any modern president, a man who has practically been conferred sainthood among American Catholics (I’ve seen haloed JFK portraits on mantels in Dorchester, Dubuque, Denver).

The bulk of the Kennedy library consists of gifts that he and Jackie received from across the (recently decolonizing) world, a virtual bazaar of exotic treasures from steamy climes, all displayed against gaudy late ’70s colors and tones. Jackie’s inauguration day pillbox hat is a major attraction (I admit, it’s cool to see). But whenever raw politics pops up, the museum tips the scales toward the guy who already won. In one strange exhibit, a reconstructed newsroom, ostensibly broadcasting the 1960 election, shows Kennedy leading Nixon in California, Nixon’s home state (which Nixon won in 1960 because, duh). Historically literate visitors will understand that the exhibit represents a single, frozen moment in the evening of November 8, 1960, when Kennedy was ahead in many states. For everyone else, the exhibit implies a final Kennedy victory that is much wider than the historical tally.

The “realtime” scoreboard doesn’t reflect the razor-thin margin of Kennedy’s victory.

The library gives much attention to elections and speeches and pageantry, less to policy. It offers George Wallace (whom Kennedy confronted with words, not actions) more space than Vietnam (decidedly fewer words, way more action). The library determined that papers from Robert F. Kennedy’s law school career and the Bay of Pigs deserve roughly equal time. The Cuban missile crisis is summed up in a single blurry, fragmented, incoherent documentary (that, in retrospect, probably captures with relative accuracy Kennedy’s own narcotized experience of those thirteen days).

After some genuinely moving footage of JFK in Ireland, you pass through a bizarre, unlit hallway: the assassination. “Is that it?” my dad asked, not merely of the assassination but of the entire museum. If the curators deliberately designed such a strange and anticlimactic ending to convey the anticlimax that Lee Harvey wrought, well, mission sorta accomplished. Instead of leaving with the horrific drama of Dealey Plaza – the abrupt end of a potentially great presidency – you leave feeling unsatisfied, confused, like you missed something. As you exit, you walk through an enormous glass room with a gorgeous view of the harbor. Look up, and you realize you’re standing under a comically large American flag. You’re unsure of what it’s supposed to mean.

Of course, a library is much more than a museum. I enjoy the Kennedy Library Forums and appreciate their availability online, even if the audience was unnecessarily cold to LBJ biographer Robert Caro, and unbearably snobby toward his subject, after Caro uttered a few mild criticisms of JFK. In 2011, Christopher Hitchens confirmed my suspicion that, of the presidential libraries, Kennedy’s “is…renowned among presidential and other scholars as the most obstructive and politicized of the lot.”[2]

Flag at the end of the Kennedy Library Museum

But my interest in the Hoover and Kennedy libraries runs much deeper than politics or history or scholarship. The libraries belong to the places I’m from; they represent those places; they exude the attitudes and qualities I associate with those places, or that I project onto them from myself. Iowa: understated, apologetic, ashamed, even when you’re blameless. Boston: overstated, defensive, sour, even when you’re winning.

And Kennedy is always winning. Even the scandals – sex with celebrities! stoned into a stupor at cabinet meetings! erotic waterboarding in the bathtub! all while moving nuclear missiles around the planet as if he were planning the final turns of a high-stakes, to-scale game of Risk! – somehow that all adds to his mystique!

And on top of that, on top of all of that, he gets shot in the head at the height of his popularity. And his killer, who was never quite exactly totally witnessed killing the president, is shot dead before any confession or trial, rendering the circumstances of the assassination technically indeterminable and therefore interesting to everyone forever. Bad for the nation, absolutely. Horrific and traumatizing for the president’s family, unquestionably. But good for his legacy? Do I even have to answer that?

What more could a dead president ask for?

One thing, apparently: a library that insists nary a single bad word was ever spoken about John Fitzgerald Kennedy. A museum that ignores his illnesses, his distracting sexual appetites, his insecurities, his shortcomings, even the possibility that he ever made a non-adorable mistake. A museum that refuses to acknowledge the obvious: that for decades, the assassination clouded our ability to assess John F. Kennedy’s presidency.

Postscript

November 22, 1963, was one of the most painful and frightening moments in modern U.S. history. The murder of any head of state is terrifying; assassinations can destabilize entire nations. In the United States, Kennedy’s death symbolically inaugurated an era of major cultural change that was compounded by social unrest, political realignments, painful economic adjustments, failed presidencies, and all manner of violence. We call this period “the Sixties,” but it arguably lasted well into the early 1980s [3].

Is Kennedy responsible for these changes? Absolutely not. Did his assassination provoke them? No. Could he have tempered the mood, prevented some of the violence? I don’t think so.

So why do I believe John F. Kennedy was the worst president of the 20th century? Because of the sex? The corruption? The deceit? The drugs? The cockiness? The recklessness? The policies? The posturing with Khrushchev? The mystique? Yes to all of the above, and more. I’ll try to dispense with the personality issues in my next post and then move on to more “substantive” critiques. But there’s substance in the seedy details; no other president on so many occasions endangered the welfare of the United States for a quickie.

In 2013, John F. Kennedy is no longer a deific figure. I don’t get any points for swinging at his legacy, nor can I label myself a clear-eyed contrarian for dismissing him. I will, however, attempt to heighten the discourse surrounding JFK from adulation (its former state) and nuance (its current state) to a shrill pitch (my state). The vulgar consensus (i.e., whatever Doris Kearns Goodwin is saying these days) is that John F. Kennedy was a charismatic, complicated, and ultimately flawed president [4]. Some would add that JFK could have been a great president if he had lived. In the face of this hardening and well-supported consensus, I will argue that JFK was charismatic, complicated, flawed, and also the worst president of the 20th century, because I want to raise the stakes of the debate and because I believe it.

In the next installment: Seymour Hersh, Sex, the ’90s, Sex, What Happens in Vienna Stays in Vienna, Sex

***

[1] Hoover lost Iowa and forty-one other states in the 1932 election. The rural Midwest had not benefited from the roaring 1920s, which boosted major urban centers. After a post-war boom in 1918, agribusiness contracted throughout the ‘20s. The Great Depression obliterated Iowa’s already weak economy. By 1932, the year of Hoover’s reelection campaign, five percent of all Iowa farms fell into foreclosure. Des Moines declared a moratorium on land seizures. All this before the Dust Bowl struck.

[2] Even the notorious Nixon library, once owned and operated by Nixon’s own foundation, handed its archives and facilities over to the NARA in 2007, making itself respectable to visitors and scholars alike.

[3] The cover of Time‘s April 1981 issue asks, in response to Reagan’s near-assassination, “Can it never be stopped?” Reagan’s brush with death seemed part of an historical continuity: a plot to shoot President Nixon morphed into the shooting of presidential candidate George Wallace. President Ford came face-to-face with not one but two would-be shooters during his short tenure. But after Reagan, thirty years passed before another national politician was shot (Arizona Rep. Gabrielle Giffords in 2011).

[4] We’re going through quite a Lyndon Johnson revival these days, what with the Caro biographies and Lee Daniels’s The Butler. LBJ is the most favorably represented president in that film – he also gets the last line.