Good men and good bosses

The Me Too movement was always going to get to this point. From its beginning it was two things at once: an attempt to reckon with the exposure of a singularly evil individual in a position of power in a major industry, and an attempt to publicly reconceptualize a lot of common experiences in women's lives as unacceptable and deserving of protest. The problem was that the vast majority of men responsible for inflicting quotidian experiences of exploitation, discomfort, and violation on women are not Weinsteins. Almost no one is a Weinstein. There are horror movie villains who aren't Weinsteins. "Canceling" monsters was never going to be enough to actually lance the abscess festering beneath the skin.

And so now, with the case of Aziz Ansari, the tension between these two impulses — catching bad guys and reforming everyday life — has reached a breaking point. Many critics have seized on the ordinariness of what Ansari stands accused of doing in order to discredit the movement. Every man has done something like this, they say. It's wrong to conflate this kind of thing with the actions of a Weinstein or a Cosby.

The uncomfortable truth is that they're right. Every man has done something like this. Aggressive, coercive, disrespectful sexual behavior on a date? One viral tweet estimated that 75% of adult men have acted similarly at some point, and that is probably an underestimate. Ansari's character Tom Haverford does things like this constantly on Parks and Rec, to affable chuckling. With Ansari, in other words, we have finally reached a point where we have to move from insisting that "this isn't normal" to insisting that there is a problem with what is normal, that we need, collectively, to do better than normal.

We have to come to terms with the fact that individual values or personality traits can only do so much in the face of structural incentives to be a certain type of man. We can't just purge the bad ones. Parenting, media creation, friendship, workplace structure, and so on will all have to change, so that men consistently face consequences when their actions venture onto the spectrum that runs from Ansari at one end to Weinstein at the other. This includes but goes well beyond "education." Ansari is plenty educated about these matters. What is needed is a non-pathologizing explanation of why he would act the way he did anyway, an explanation consistent with the fact that millions of similarly enlightened men do similar things on a daily basis.

An analogy might make this seem like a less daunting task. Labor relations is often reduced, even on the left, to a matter of the qualities of individual employers, in the same way that there is a persistent tendency in this current moment to reduce gender relations to a matter of the qualities of individual men. This has come up recently with respect to various living-wage campaigns in the U.S. and Canada. "If you can't pay staff a $15/hr minimum wage AND benefits, you shouldn't be in business," one tweeter argued about Tim Hortons. People have similarly expressed bewilderment about the fact that Vox Media, a liberal company, has been reluctant to recognize its employees' unionization efforts. "It is not the responsibility of your employees to subsidize your shitty business," another person recently summed up.

But that quite literally is the function of labor under capitalism. Businesses make money by earning more from selling the stuff that people make for them than they pay out to those people. If they paid their employees what they were actually worth to them, they would have no profits, would not be able to expand and diversify or spend money on advertising, and would likely swiftly go out of business. What separates a kind boss from a cruel boss is not the fact of labor exploitation but the enthusiasm with which they pursue it. That is why labor organizations like unions and regulatory laws like the minimum wage are important: they provide an external constraint on the free rein that the logic of capitalism assigns to employers, because trusting in "good" bosses to spontaneously act with integrity is a recipe for getting burned.

The incentives for men to exploit their sexual partners, especially women, are less material or economic and more a matter of culture and ideology. But the logic of masculinity is similarly such that external constraint, ultimately leading to wholesale alteration, is necessary above and beyond the good will of individuals in power. It’s worth noting here that the two structures are, in practical fact, profoundly intertwined. I was struck by the class markers peppered throughout the Ansari article: his “exclusive” address, his pushiness about fine wine, his demand that he “call her a ride,” and so on. More broadly, the locus of Me Too activism has often been the workplace: it is not just men who have been its targets but male bosses. True “accountability” for men, the robust and consequential kind, will require an equalization of economic power. (Though such a change is obviously not sufficient: I’m drawing an analogy, not an equivalency.)

Marx and Engels once famously railed against the “misconception that induces you to transform into eternal laws of nature and of reason, the social forms springing from your present mode of production and form of property.” As ex-Google employee James Damore reminded us recently, gendered exploitation too is so normalized and so ubiquitous that it can come to seem like a law of nature. These seem to be the two poles that we are caught between: male abuse as an evolutionary necessity and male abuse as an aberration of pathological monsters. We need now more than ever to seek out the excluded middle, where we might find the possibility of collective social transformation — a better normal.


Favorite movies of 2017

This was a really incredible year for film. There were so many great releases, and after seeing many of the top movies in the last several weeks I want to commit my thoughts to “paper,” more for my own sake than anything. This collection of films was extraordinarily rich, not just entertaining but provocative, engaged, and complex.


1. Get Out (dir. Jordan Peele)

Film is a uniquely public medium. Consumption of other forms of art — music, novels, etc. — has become an increasingly private activity, not just literally solitary in practice, but disconnected from any kind of common conversation. There are no must-read books or must-listen albums any more, if there ever were: people just have their pick from whatever some personalized algorithm selects from an overwhelming stream of new content. But movies, made collectively by large groups of people in the first place, are still by and large (Netflix aside) meant to be seen outside the house, a shared experience with a group of strangers. And there are few enough major films released every year that to some extent people are still seeing the same set of new movies, enabling public conversations of unique participatory scope and depth, and the introduction of new characters, stories, and images to something like a collective imagination.

And Get Out was a uniquely public film. To me it was obviously the movie of 2017. No other non-franchise film was a public event in the same way that Get Out was. I can’t remember another movie that was so important to see in theaters, and that was the subject of so much discussion, analysis, and yes, memeing. It wasn’t just that it was the right film at the right time, although it was. It was that Jordan Peele crafted the film to reward nearly endless rewatchings, promising new discoveries every time, and to be enriched by the experience of watching as a member of an audience. Get Out is as close to perfect as a film can be: flawlessly paced, moving, entertaining, terrifying, challenging, and even, at moments, strikingly beautiful, more or less all in equal measure. I’ve seen it three times and each time I became more impressed with its artistry. To paraphrase Roger Ebert’s assessment of Fargo: films like Get Out are the reason I love the movies.

2. Three Billboards Outside Ebbing, Missouri (dir. Martin McDonagh)

Speaking of Fargo. I've been following the backlash to this movie with some bewilderment. I certainly don't begrudge anyone their hatred of it. Bland agreement is boring; vigorous argument about movies is great. And I would probably hate the movie I've read about too. I just truly feel like I saw a different film than many of its critics did. Not just interpretively: I am at a loss to explain how people have gotten from the film that I thought I saw to some of the basic summaries of its plot or dramatic arc that I have read. The two explanations that I have come up with are: (1) there have been so many films made, often by Quentin Tarantino, that are similar in lots of superficial respects to this one that some viewers were primed to expect a certain, far inferior movie, and could not adjust when the movie departed in key ways from that mold; (2) I obliviously looked past a set of glaring flaws because I wanted to like this movie going in, given my preexisting admiration for everyone involved in its making.

I’d like to think the two are about equally likely. Maybe the latter is more plausible. I really don’t know. It’s difficult to say much more without getting into spoiler-y plot details. So I’ll just say that I’m glad that even this movie’s most ardent detractors have generally acknowledged the brilliance of Frances McDormand’s performance. I think it will go down as one of the best in movie history.

3/4. Lady Bird and The Florida Project (dir. Greta Gerwig/Sean Baker)

Both of these movies are gorgeous, intimate, tender studies of family, growing up, motherhood, daughterhood, class, disappointment, and hope. They are blessed with spectacular ensembles, a rich visual sense of place, and a deft ability to produce that rarest of emotional reactions, the laugh-sob. I don’t have a single bad thing to say about either of them. They didn’t stay with me in quite the same way as nos. 1 and 2, but if you said either of them was the best movie of the year I couldn’t give a counterargument.

5/6/7. The Shape of Water; Call Me By Your Name; Phantom Thread (dir. Guillermo Del Toro/Luca Guadagnino/Paul Thomas Anderson)

Here are three singular, beautiful romance movies that, at the end of the day, I just didn't love quite as much as the previous four films. I think all of them suffer from telling rather than showing that their central characters are in love, and I think that the development each of them seeks to give to the traditional romance formula is delivered somewhat clumsily. The creature in The Shape of Water is a little too animalistic and devoid of personality. I am apparently in a tiny minority in finding Armie Hammer absolutely terrible in Call Me By Your Name. The final moments of Phantom Thread feel inexcusably abrupt for such a deliberately paced film. But these are small quibbles. All were a joy to watch, featured at least one stunning performance, and were chock-full of indelible images. If Jonny Greenwood weren't in a famous band he'd win Best Score at the Oscars for Phantom Thread, and it's a travesty that he won't.

8/9. Star Wars: The Last Jedi and Blade Runner 2049 (dir. Rian Johnson/Denis Villeneuve)

Sci-fi nerds like me got to enjoy two of the best franchise films in recent memory this year. I was very scared that both of these movies would be bad, but they both exceeded my expectations by leaps and bounds. That’s a pretty uncontroversial assessment of Blade Runner 2049, which cemented Denis Villeneuve (who did Arrival, my second-favorite movie of last year) as one of the most exciting “genre” directors working in Hollywood today. The Last Jedi was probably the second-most polarizing movie of the year, after Three Billboards. I thought it was brilliant, balancing fan service and thematic and narrative innovation better than its already-great predecessor. It was timely without being heavy-handed; it had, for my money, the most visual flair of any film in the franchise; and, most importantly, it had Mark Hamill.

10. Dunkirk (dir. Christopher Nolan)

Dunkirk was one of my favorite movie-going experiences of the year; I saw it in 70mm on its opening weekend. It has been justly praised for its transportive, immersive quality, and Harry Styles gives the best pop star movie performance since Justin Timberlake in The Social Network. It is a masterpiece of film and sound editing. It's just that after I saw it I didn't think about it once for about six months, until awards season commenced in force a few weeks ago. It is formally remarkable: if I were drawing up a film school curriculum it would be at the top of the list (or probably still behind Get Out). But it doesn't really say anything, and to that extent it suffers by comparison: this was not a year to skimp on thematic complexity.

Honorable Mention: Norman: The Moderate Rise and Tragic Fall of a New York Fixer (dir. Joseph Cedar) 

This movie made no money and got no attention because it was a small-scale drama about Jewishness released in late April with only one big name attached (Richard Gere, who's shockingly good). But I saw it! And I liked it a lot! In a year that saw Three Billboards, a new season of Fargo, and Suburbicon (which the Coens actually co-wrote), it channeled the spirit of the Coen brothers as well as anything else in 2017. That's high praise.

Notable haven’t-seens: The Post; The Disaster ArtistI, Tonya. 

The latest neo-Nazi PR campaign

One evening probably two weeks ago I was walking in my neighborhood and found a sticker plastered on the back of a stop sign that said in plain text, “It’s okay to be white.” I was startled because of Cambridge, MA’s liberal reputation (“The People’s Republic of Cambridge”), but not particularly surprised. Steve Bannon and Jared Kushner went to Harvard. Of course there are white supremacist “trolls” here. I scraped the sticker off with my keys and went about my business.

I now regret not taking a photo, because since then similar stickers or posters have appeared around the country, typically on college campuses, including at my alma mater, Northwestern, this past weekend. It's apparently some sort of organized campaign launched on 4chan, an important "alt-right" meeting place, designed to "expose" the intelligentsia's hatred of white people to a national audience and thus garner the support of "normal," unwitting, comparatively offline whites.

It’s a masterful strategy and to judge by my Facebook feed — so remember, there’s a great deal of selection bias against the alt-right here — it’s been modestly effective. Isn’t it okay to be white? So at the admitted risk of giving any more attention to an insidious neo-Nazi propaganda effort, I want to spell out what’s wrong with the logic behind these stickers.

The most obvious point to make is the same argument that has mercifully taken the wind out of the sails of “all lives matter” over the last year or so: it is just not the case that calling attention to the devaluation of some lives is the same as bemoaning the fact that other lives are valued. Use whatever analogy you want to make this point. Some people like to imagine someone feeling neglected when the fire department comes to extinguish a blaze at their neighbor’s house. All Houses Matter.

But while that argument is all well and good as far as it goes, there's a more important point that I think people have been too willing to retreat from in the face of campaigns like this: there actually is a problem with whiteness itself. The proper analogy is not two separate houses, one of which just happens to be on fire. It is more like this: one group of people forced another to build them a house, then put them in another, much worse house, then lit it on fire, and, when the fire department came, violently insisted that there was in fact no real fire. The philosopher Charles Mills argues that we should say "racial inequality" less and "racial exploitation" more, to drive home the point that the creation of "race" as a system inherently advantages the white race at the expense of the others.

That’s not just to say that “race” or “whiteness” are “social constructs.” That’s not enough. Baseball is a social construct. It is, needless to say, okay to play baseball. It is to say that “whiteness” is made through white supremacy, by the system that Mills calls the Racial Contract. In certain left-liberal circles “whiteness” has come to mean something like “annoyingness,” as if its constitutive features were overconfidence and Vampire Weekend. But the heart of whiteness is more like this:

[chart]

This is the uncomfortable reality that I and every other white person have to grapple with: it is okay for us to be, as human beings who draw breath, but there is, in fact, a problem of sorts with our being white, because being white means reaping profound unearned material advantage from the system of white supremacy — regardless of whether our ancestors owned slaves or whether, like me, our grandparents or great-grandparents would not necessarily have even been classified as white when they came to the U.S.

This is true whether we like it or not. If we like it, we can join the 4chan Nazi brigade, protecting our advantage by fabricating victimization. But if we don’t like it, the solution is not to feebly attempt to opt out, like Rachel Dolezal. It is to dismantle the system at its roots.

Return of the Repressed

In June of this year, the conservative pundit and think-tanker Avik Roy took to Twitter to defend the Affordable Care Act repeal-and-"replace" plan then on its way to the Senate floor. "I'm very open to thoughtful critiques of the Senate bill from the left," he wrote. "'MILLIONS WILL DIE' is not it."

In late August, after the first of several hurricanes that would leave countless people displaced, injured, and, yes, dead across the Gulf, a spokesperson for the new fossil-fuel-industry-controlled EPA blasted climate scientists for connecting the storm's devastation to climate change, repudiating those who were "engaging in attempts to politicize an ongoing tragedy."

And now after the deadliest mass shooting in modern American history the chorus has started up again. “After Las Vegas, can we take a moment to agree not to politicize tragedy?” Glenn Beck asked. Everyone from Patton Oswalt to Hillary Clinton has already faced the wrath of the reflexively anti-political. Discomfort with “divisiveness” has once more trumped the value of real human lives.

Speaking of Trump, this is all important context for the hallucinatory reality show into which we have been collectively plunged since his election. His nihilistic world of self-contained reflectivity, cycling eternally between television, Twitter, and private country clubs, is just the Platonic form of the more ordinary and longer-running impulse among many Americans to drive a wedge between politics and reality. Politics is just a kind of word-game where people arbitrarily select values and then compete with each other to score pointless victories that affect the lives of the participants and no one else. To insist that lawmaking, governing institutions, democratic action, and so forth might actually matter in the real world, beneath the aether of the 24-hour-news-cycle, is to break the rules of the game. In its demand that people confront disempowerment in all its horrifying reality, that we come face-to-face with the human lives around us that have been crushed by structures sustained by our collective action and inaction, it is nothing if not impolite.

But all the polite apathy we can muster can’t make that reality go away for good. Our actions have shaped the natural and social world that we now have no choice but to live in. We are responsible not only for the past that Karl Marx famously described as “weighing like a nightmare upon the brains of the living,” but also the present that will weigh similarly upon future generations. There isn’t a tweet in the world — even from the President — that can insulate our lives forever from the consequences of our politics. “History is what hurts,” as the literary critic Fredric Jameson once put it. And every time we ignore or naturalize or excuse the hurt, we only ensure that it will return again, with greater vengeance.

“Thoughts and prayers,” not politics or protest, has been the refrain today, especially from the Christian right wing, but the Gospel of Luke told it differently, in a passage at the beginning of the Passion sequence. “Some of the Pharisees in the crowd said to Him, ‘Teacher, rebuke Your disciples!’ ‘I tell you,’ He answered, ‘if they remain silent, the very stones will cry out.'”

The anti-single payer blitz

Within a few days of its announcement, Bernie Sanders’ new Medicare-for-All bill, endorsed by a dozen other Democratic senators, has come under fire in the pages of newspapers across the country, from Philadelphia to Chicago to D.C. and elsewhere. The jury is in: it sounds nice, but it is a pie-in-the-sky delusion that is just technically not possible. There are a few things that are odd about this consensus.

The first thing, of course, is the peculiarity of declaring something impossible that in fact exists in reality, throughout the Global North, in countries that are less wealthy than the United States. And not only does it exist: countries with single payer or at least a public option outperform the US healthcare system on practically every metric imaginable.

[chart]

The fact that a multiplicity of other governments currently shoulder the vast majority of citizens’ health care costs while actually spending less on health care than the United States and saving more lives to boot is perhaps why, as Matt Bruenig observes, these op-eds have tended to actually be political arguments in technical clothes. Take off the mask of expertise and you find unsubstantiated assertions about what voters will or won’t stomach.

As Bruenig also notes, the current existence of Medicare-for-some is also troubling for declarations of prima facie impossibility. Clearly the United States government is capable of running a health insurance system; clearly people are capable of giving up their private insurance for government-provided health care; we know that this is true because it’s literally already happening. And the inconvenient truth for the anti-single payer pundits is that it’s incredibly popular, which of course is why Sanders is calling his plan Medicare for All in the first place. (One also thinks of the Affordable Care Act, which many pundits also declared that people would hate but has instead proved remarkably difficult for even a unified GOP government to pry from the hands of its recipients.)

Ah, but taxes will go up, and Americans hate taxes. For one thing, only the wealthy are likely to see their tax increase exceed what they're currently spending on health care, which means this argument really relies on the conviction that most Americans would rather have less money in the bank than shift any of their spending from corporations to the government. But even if one accepts that assertion, it's worth pondering why Americans are unenthusiastic about the federal income tax:

[chart]

So the typical working American only sees the benefits of a tiny fraction of their annual federal tax payment in the short term. Of course, they'll reap the benefits of Social Security and Medicare eventually; they already benefited from education spending; they do continue to benefit from spending on transportation, housing, and environmental protection; but in general, when most people send off their taxes every year they won't see a direct, personal return from the majority of that payment for another several decades. Now imagine what Americans' attitudes towards taxation might look like if instead they reaped an obvious, undeniable benefit only months later, during a life-threatening illness for themselves or a family member.

The other cost argument notes that potential savings from single payer will be offset by people going to the doctor more. This argument is interesting because it implicitly concedes that there are many millions of Americans who currently put off necessary doctors' visits because they cannot afford to go. It is designed to conjure up racist images of mooching welfare queens, but instead just drives home the point that the ACA is still a tremendous moral failure that continues to condemn millions of people to die unnecessary deaths because of their socioeconomic status. A compassionate society should not wring its hands about too many people going to the doctor.

Furthermore, although I am straying out of my expertise here a bit, it seems to me like this argument massively overestimates the elasticity of demand for health care. I am a bit of a hypochondriac with access to a free university health center and I still rarely go to the doctor more than once a year. I would hazard that if there’s anyone who’d go to the doctor more often than they “need to,” it would probably be the wealthy — that is, precisely the people who would shoulder the heaviest (i.e. any) extra financial burden.

Speaking of expertise, this is what I find most fascinating about this whole dispute. It illustrates a remarkable and under-appreciated shift in the politics of expertise and “technocracy” over the last several decades.

This is obviously oversimplifying more than a bit, but I see a periodization of the politics of expertise (especially in American social science, which is what I study) that goes something like this.

  1. In the early twentieth century, scientific experts were generally also practically engaged activists, advocates, community organizers, and so on. The distinction between social work and social science was far less rigid. The idea was to document problems that could be addressed, and help to formulate solutions to those problems that movements could fight for. These early experts of course occasionally held wrong or even repugnant views, but when they did those views were typically widely shared by the social movements with which they were working.
  2. Beginning in the late 1920s and early 1930s, and accelerating with the start of the Cold War, many experts moved first into large foundations and then into the government and military. This kind of expertise was still about extremely ambitious social problem solving, but it tended to be newly unaccountable and even secret. The severing of the relationship with people "on the ground" led to a kind of overambition that was often disastrous: technocrats pursued agendas that people didn't want and sometimes didn't even know about. But this shift wasn't totalizing: many scientists were at the forefront of new movements around disarmament and environmental protection, and new social movements from the New Left to ACT UP developed their own kind of "counter-expertise" to aid their causes, in a way that occasionally reverberated into academic disciplines like sociology that subsequently took on a more critical bent.
  3. Since the end of the Cold War, the politics of expertise has regrouped around shoring up the putative End of History. The unaccountable people who will address social problems out of the public eye are no longer the experts themselves but entrepreneurs and capitalists who can harness the power of the market to make incremental change. The role of experts now is to insulate market forces from democratic efforts at reform by explaining the limits to the agendas of social movements: for the first time, expertise is primarily about what is not possible. The rise of poststructuralism among the academic left, dismissing notions of expertise, progress, and even change tout court, has unwittingly collaborated with this shift and with its ideology that a better world isn't possible.

To me, the rash of anti-single payer op-eds illustrates the extent to which this last mode of expertise has become taken for granted. Being in touch with “reality” means throwing a pail of cold water on any attempt to guarantee everyone health care, even when that means handwaving, mystical assertions about whatever it may be about America that explains why it cannot do what its peer nations have already done. “Seriousness” has become synonymous with slavish devotion to the status quo — which helps to explain why America’s op-ed pages have been so strangely devoid of columns in support of a policy that the majority of one of our two major parties supports.

Of course a transition to single-payer will have difficulties. The role of experts should be to figure out how to solve those difficulties, not to throw their hands up and insist that they are insurmountable.

What Democracy Looks Like

[photo: rally]

I.

Like millions of Americans, I have had an emotional morning, waking up to find that the GOP's latest Affordable Care Act repeal plot — one that seemed certain to bear some kind of fruit just a few days ago — had been defeated at the eleventh hour. I'm still trying to sort out what it all means.

Many people don’t realize that the core of neoliberalism — the dominant ideology on the right for the last 40 or 50 years — is not really an obsession with economic efficiency, per se, but with a loathing of democracy. The first meeting of what would later become the Mont Pèlerin Society, the organizational and intellectual bedrock of the worldwide neoliberal movement, was actually called the Colloque Walter Lippmann, after the 20th-century American intellectual who wrote little about economics but much about, as the title of his famous 1922 book put it, Public Opinion. Lippmann’s idea was that “the public” and, by corollary, “public opinion” were chimeras, concocted by the political elite to mask the fact that ordinary people were actually too irrational, too unintelligent, too easily mislead, to form any kind of collective purpose that might bestow democratic legitimacy on governmental action.

Lippmann’s conviction is something of a cipher to unlock many of the right-wing developments of the second half of the twentieth century. James Buchanan and the “public choice” economists developed more formalized economic models that treated putative democracies and the collective action they facilitate as, in fact, only arenas for self-interested jockeying by individual power players who were unresponsive to anything that could be called “the public.” Steeped in evolutionary psychology and studies of resource management by small communities, other neoliberal intellectuals, like Elinor and Victor Ostrom, insisted that democracy and collective action could only have meaning on scales small enough to allow reciprocal, face-to-face personal relationships between all players: our modern “democratic” nation-states could never have such legitimacy; and therefore dreams of social problem-solving at the national scale was at best a mistake and at worst a cover for “special interests.” Many others, similarly working at a strange and novel interface between economics, political science, and (evolutionary and social) psychology, explained why ingrained biases made the average person unable to perceive clearly the merits of free markets and inclined them “emotionally” towards socialism, and postulated a brand of IQ determinism that attempted to scientifically debunk the political (and often racial) egalitarianism at the heart of democracy.

Meanwhile, behind the scenes, right-wing business elites, in cooperation with many of these same intellectuals, waged a highly practical war on the functioning of real-world democracies around the globe. The story of the collaboration of Milton Friedman and his "Chicago Boys," among other neoliberal economists, with Augusto Pinochet's reign of terror in Chile is comparatively well-known. But in the U.S. and the U.K., and increasingly elsewhere in Europe, millionaires and billionaires (most famously, in the U.S., Charles and David Koch) have over the last several decades bankrolled the establishment of an elaborate think-tank infrastructure to propagate their ideas, spent unprecedented amounts of money to buy candidate loyalty and eventually elections themselves, and developed and implemented arcane but potent legislative restrictions on the right to vote and to organize.

If you spend a lot of time with these ideas, even if you refuse to follow them to their free-market and anti-democratic policy conclusions, they can start to make a twisted, depressing kind of sense. After all, our democracy is so dysfunctional. It frequently does work for the benefit of elites, who are all too good at manipulating ordinary citizens for their purposes. In fact, the neoliberal project seems, in this light, like a disturbing proof of concept of their core idea.

And yet.

And yet, with the Kochs’ guy, Mike Pence, waiting in the Senate chamber ready to cast the tie-breaking vote to finally achieve victory in a seven-year-long, day-and-night political struggle against the popular crowning achievement of a popular president, the plan didn’t work. Somehow, after months of activists facing arrest and physical injury to defend their basic well-being, citizens around the country calling Senators and Representatives day and night, polling numbers proving again and again the unpopularity of the GOP effort among the American populace, somehow, the millions of people who rely on the ACA for access to healthcare bought some more time. Not today.

This is what I can’t stop thinking today: They were wrong. Collective action is real, meaningful, and effective. Despite everything, the public still has a voice, can still exert its will, and can still throw up roadblocks when politicians try to literally bleed the country dry to further enrich the ruling class. To adapt a beloved protest chant, this is what democracy looks like: a stunning reminder that no matter how hard the right works to actualize its vision of government as a game played by individual self-interested rulers and nothing more, the will of the people cannot be extinguished entirely.

II.

This is not what democracy looks like:

Democracy is, above all, the conviction that no one should be forced to submit to the arbitrary power of someone else above them, whether that's a seventeenth-century monarch, a present-day employer (the monarch of the twenty-first-century workplace), or, yes, a politician like John McCain — holding the fate of millions of Americans in his hand, dangling it tantalizingly before signaling his decision, a reminder of the ultimate power he wields and could choose to wield in any way he pleases.

Today in New York a couple committed suicide because they couldn’t see an end to the “financial spiral” in which they were caught — including, reportedly, health care bills that they simply couldn’t pay. Last night’s decision saved many lives. But it did not save theirs. And it didn’t save the lives of the estimated 28 million people that the ACA will still leave uninsured in 2026.

That is, if there still is an ACA by then. This morning Freedom Caucus chair Mark Meadows promised to have a "perfect" bill ready in two weeks to try to push through both the House and the Senate. The final bill that was defeated last night was, it's worth emphasizing, a true abomination: scraped together days or hours before the final vote, with an eye towards major revisions to come during the subsequent reconciliation process. And yet somehow 49 Republicans still voted for it. Who knows if McCain — or even the two "hard no" GOP Senators, Lisa Murkowski and Susan Collins — will still have the integrity to reject a bill with a more polished, respectable veneer?

So democracy won last night, but tyranny continues: the tyranny of economic deprivation, of a society that refuses to use its unfathomable wealth to guarantee the basic needs of all of its members, of a party — led by a man who is the incarnation of arbitrary power — determined to continue to subject millions of Americans to lingering suspense, uncertainty, and despair. Moving forward we will need the same outpouring of popular energy from engaged citizen-activists that we’ve seen fighting repeal the last several months — now redoubled, expanded, and sustained — to break this ugly stalemate, to win the non-reformist reforms that can humanize and democratize our nation bit by painstaking bit.

Baseball Night in America

Every five days during the baseball season I watch Clayton Kershaw pitch, and before he takes the mound each time I am convinced it will be a perfect game. It has never happened. The goal of a pitcher is to get batters out. According to a statistic called Walks & Hits per Inning Pitched (WHIP), Kershaw has kept batters off the bases more consistently than any other pitcher in history, except for a guy named Addie Joss, who was born in 1880. He is about as good at pitching as it is possible for a person to be. But in every single one of his 283 career starts he has failed at least once at his basic task.
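
(For anyone unfamiliar with the statistic, WHIP is exactly what its name says, computed as

WHIP = (BB + H) / IP

— walks plus hits allowed, divided by innings pitched, so lower is better. A pitcher who allows one walk and six hits across seven innings, for instance, finishes the night with a WHIP of (1 + 6) / 7 = 1.00.)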

One time he got very close. On June 18, 2014, he retired the first 18 Colorado Rockies he faced, before he got the 19th batter to hit a soft ground ball at the shortstop, Hanley Ramirez. Ramirez fielded the ball, and threw it about a yard past the first baseman, allowing the runner to take second base. Kershaw proceeded to get the final 10 batters out. That one fielding error proved to be all that stood between him and the 24th perfect game in history. That’s baseball for you.

Baseball encourages these sorts of reflections more than any other sport. Reflection is built deeply into its rhythms on all timescales. There’s the (in)famously relaxed pace of individual games, which critics mock, commissioners try in vain to accelerate, and which fans feel lends the game a feeling at once tense and contemplative that is without parallel in sports. We are also presently at the most reflective moment of the season, the annual post-All Star Game ritual of deciding which teams have a legitimate chance of making the playoffs, and therefore which teams will become “buyers” or “sellers” at the trade deadline. The first half is an optimistic burst of enthusiasm set off by an Opening Day saturated with fantasies of infinite possibility and perfect parity; now it is time to take stock.

And as a phenomenal recent longform article by Peter Dreier and Robert Elias in Jacobin emphasizes, the history of the game since its late-nineteenth-century origins is a history of a more critical kind of reflection, pursued by the courageous players, managers, union attorneys, journalists, and others who have fought to reform and reimagine a game they loved that did not always love them back — and occasionally punished them dearly for their transgressions against the status quo. In this sense, baseball, often called the most conservative American sport, is in fact almost exactly as conservative as the nation that produced it, which is to say, it’s complicated: defined both by its tradition of iron-fisted reaction and its tradition of idealism, reform, and revolution; by the patterns of exclusion and exploitation present in its structure at its genesis and the progress extracted by those activists at an excruciatingly patient pace, almost as slow as that of the game itself.

This is why I could never quite sign on to Jon Bois’ preseason declaration that “there is no future of baseball.” By this he meant not that the sport was facing its imminent demise but that, with the last of the game’s famous championship curses broken by the Chicago Cubs last fall, the game had achieved a sort of end of history, a stationary state of the kind that John Stuart Mill thought he could see around the corner in 1848, two years after the first officially recorded baseball game in U.S. history. I do get where Bois is coming from. His piece is as good a summary as any of the strange kind of melancholy that I and some other Cubs fans I’ve talked to felt in the aftermath of the World Series win. But I still don’t think it’s quite right.

How could it be? Baseball is the sport that taught me, when I was so young that I didn’t even have any real conception of sexuality, that there was nothing worse for a man to be than gay. How could baseball be “finished” when the Cardinals still invite outspoken homophobes like Lance Berkman to something called “Christian Night;” when there remain no publicly out major leaguers; when its little leagues across the country still teach the same lessons I was taught when I was a kid? How could it be finished when so many of the same reformers that Dreier and Elias write about are still rigorously excised from official histories of the game; when owners still rip off the public for stadium funds and still inflict punishing living conditions upon their minor league players; when the league still refuses to treat domestic violence within its ranks with the seriousness it deserves; when 70 years after Jackie Robinson’s rookie season the game remains overwhelmingly white in its demographics and even in, as Mary Craig observed last week, the language its media uses to describe players of different races?

To put it another way, how could a sport so intensely bound up with American identity that the early 2000s saw a Congressional investigation held to protect its integrity from steroids ever be finished when America itself is so painfully far from finished, still wrestling with the same demons it has bequeathed to its national pastime?

Bois only sees half of the picture. He understands the fixation on perfection, on symmetry, on closure: three strikes, three outs, nine innings, nine positions; the only sport in which talk of a “perfect game” is even coherent. From this structural perspective, baseball is indeed “finished,” but it has been finished for a long time, perhaps even forever. There are no enhancements to be made to improve its austere beauty and intricate self-containment.

But the mirror image of this Platonism is baseball’s acute sense of the textures of history. It’s that sense that drove my dad to wake me up well after my bedtime to watch the last several outs of Randy Johnson’s perfect game, and to bring me to Wrigley Field to make sure I got to see my favorite player, Greg Maddux, in the flesh before his retirement. It’s a sense that encompasses the legendary championship droughts, yes, but that runs much deeper, flowing ultimately from the inevitable discrepancies between the game’s on-paper fleshless perfection and the overwhelming imperfection of the game as played and managed by human beings.

History — its ceaseless flow of victories and disappointments, its sense of collective memory and collective hope — is baseball’s answer to the cruel paradox at the heart of the game: that the perfect ideal will elude even its best players. Mike Trout will head straight back to the dugout almost seven times in ten. Clayton Kershaw will do absolutely everything right and Hanley Ramirez will still blow it for him. Josh Gibson will pile up more home runs than anyone in baseball history, patiently waiting in the Negro Leagues for a chance at proper pay and proper recognition, only to be passed over, when the opportunity to integrate the major leagues finally arrives at the end of his career, in favor of a younger player named Jackie Robinson.

And in spite of all of that, the game keeps moving. The worst players in the lineup come up to bat just as often as the superstars. The box scores pile up day after day. The disgruntled, excluded, and mistreated make demands on the sport’s establishment that may never be actualized in their careers, if ever, and a new generation of fans decides to fall in love with a game that holds out to them the near-certainty of betrayal.

There’s always next year.


Economic democracy and the history of liberalism

A swollen title, I know, but one demanded by the broad and ambitious article I want to respond to. Elizabeth Anderson has an article in Vox about democratizing the workplace that is excellent political philosophy but flawed intellectual history. That’s fine, because she’s not a professional historian and the history issue is (mostly) peripheral to her argument, but it is worth addressing.

Anderson’s article is a welcome reminder that the democratic socialist left need not abandon the liberal tradition wholesale (her chair is named after John Dewey, after all!). It’s also an illustration of the usefulness of “democracy” as a way to frame the radical changes the left seeks for modern economic institutions. But when it comes to the question of why present-day “classical liberals” have failed to draw the conclusions that Anderson would like us to draw from the work of their intellectual forbearers, I think Anderson comes up a little short.

She makes two broad arguments. Together the story goes something like this: (1) libertarians and neoliberals have neglected the Industrial Revolution that occurred after the time of Adam Smith and Thomas Paine, and have therefore ignored that privately-owned firms are now massive structures of wage slavery instead of the small personal farms or trade shops that Smith and Paine saw as the vehicle for free-market emancipation; and (2) because they have become obsessed with economic notions of efficiency, they have lost the political vision necessary to update the insights of Smith and Paine for the modern era and advocate for workplace democratization.

I think this is wrong. First, it is just flat-out inaccurate to say that today’s libertarians and neoliberals have underestimated the impact of the Industrial Revolution. On the contrary, they generally think it was among the greatest events in human history. Hayek edited a book glorifying it; Ayn Rand wrote a book excoriating (primarily environmentalist) critics of its ecological and social impacts; and a variety of well-known libertarians will every once in a while get together to try to figure out how the miracle happened. And they don’t just like it — they see it as a game-changing break with the past as well; people from Steven Pinker to Julian Simon have cited it as the reason why economic growth is possible at all, contra earlier thinkers like Malthus.

The substance of this discussion varies, but the common denominator is a vision of industrial capitalism as driven by what the early 20th century economist Joseph Schumpeter (a figure whose importance for today’s right is massively underestimated) called “creative destruction.” In general, most libertarians and neoliberals are not in thrall to the obsession with efficiency and equilibrium that is often ascribed to them. In fact, they love the Industrial Revolution so much because they think it has freed us from precisely the kind of steady state where what goes up must come down and costs and benefits balance out; that’s the kind of Malthusian thinking our capitalist ingenuity has allowed us to move past. On the contrary, it is the messy (today we’d say “disruptive”), often quite inefficient process of market-based trial-and-error that fuels knowledge growth, innovation, and wealth.

This is the first reason libertarians hate the idea of worker-owned firms: workers would never allow their business to fail for the greater good of the market economy! That wouldn’t just be bad for the economy, but it’d also be bad for the souls of workers. Yes, Hayek conceded, in a market economy, “life and health, beauty and virtue, honor and peace of mind, can often be preserved only at a considerable material cost,” but this is actually a good thing, because it forces us to consider when we’d be willing to sacrifice materially for the sake of those values. If democratic workplaces guaranteed us “peace of mind” without us having to suffer for it in advance, moral corruption would surely be the consequence.

The fact is that the argument in favor of entrepreneurial dictatorship has nothing to do with efficiency, and everything to do with a bite-the-bullet inegalitarian political-moral view of societal progress, where noble risk-taking entrepreneurs make the sacrifices — tolerate the failures — that are necessary to generate wealth in our dynamic economy. Those sacrifices would simply never be made were the hoi polloi given the ability to prioritize their sustained wellbeing at the union ballot box. This actually oughtn’t be surprising: because Anderson is right that control of the workplace is a political question, we should expect to find that the libertarian/neoliberal answer to that question is undergirded by a political vision as well.

The more bloodthirsty version of this vision is Ayn Rand-style social Darwinism, replete with talk of parasites and John Galt, but more common these days is actually a kinder, gentler patrician styling that sees non-entrepreneurs as noble savages instead of economic leeches. To all of the leftists who never bother to read them, and so reprimand them for thinking that everyone is perfectly rational, they say: exactly! The vast majority of people aren't rational at all. That's why no central planner can foresee what they'd want, and we need to rely on free markets, captained by cognitively superior entrepreneurs, to accumulate information on their fundamentally a-rational "preferences." They can organize the affairs of their household and perhaps even a small-scale, relatively homogeneous community well enough (in the last few decades, this is often argued to be because evolution has trained us to be altruistic in these kinds of situations), when the government leaves them alone. But modern-day polities and economic firms are very large, which confuses the poor folk, and they start to think they can understand the big picture — and the good instincts which help them run their private lives free from government turn into bad instincts towards socialism. (Some of them are even so confused by this temptation towards bird's-eye thinking as to think they have a thing called a mind instead of a similar collection of small semi-coordinated local parts! The silly devils.)

So the question of “scale” is resolved very differently, and actually in a more sinister fashion, than Anderson imagines. They agree that scale changes everything — but their conclusion is that the massive scale of modernity is precisely why not only economic democracy but also democratic political action in general on a governmental or otherwise societal scale is profoundly mistaken. Hayek again:

“Agreement about a common purpose between a group of known people is clearly an idea that cannot be applied to a large society which includes people who do not know one another. The modern society and the modern economy have grown up through the recognition that this idea — which was fundamental to life in a small group — a face-to-face society, is simply inapplicable to large groups.”

And here’s Victor Ostrom warning of the dangers of the fatal modern cocktail of egalitarianism and large-scale societies:

“The larger the society and the more diverse the country, the greater the propensity for error… Individuals assuming themselves to be like all the rest, no longer look upon themselves as fallible creatures subject to limited comprehension, but as omniscient observers addressing themselves to problems in the society as a whole.”

One common error on the left, among people of a more "communitarian" persuasion, is to tell exactly the same story about modernity and, rather than embrace the libertarians' despotic modernism, come away pining for the small-scale societies of days past (what Marx called "primitive communism"). But of course, because, absent a Mr. Burns, a Post-Electric Play-esque apocalypse, that's not fodder for a robust present-day political program, it tends to breed scholastic quietism. Anderson doesn't commit this error — on the contrary, her article provides a number of concrete action items for the left — but I still detect a faint whiff of it in the vaguely nostalgic tone with which she recalls the classical liberals. Such a tone is hardly merited.

As scholars like Nancy Fraser have reminded us, the early bourgeois “private sphere” household that was the site of Smithian “self-employment” was marked by profound gender hierarchy and inequality. And Anderson doesn’t mention colonialism, which expanded the scale of corporations long before the Industrial Revolution and provided the foundation of European political economy in the days of the early classical liberals. She acknowledges slavery, at least. But by valorizing Thomas Paine (generally though not without controversy considered an abolitionist) while never mentioning his friend Thomas Jefferson, she conveniently elides the reality that Paine-Jefferson American classical liberalism could just as readily be deployed in defense of Jeffersonian agrarianism, built on the exploitation of slave labor, as in (often privately expressed) opposition to the “peculiar institution.”

The expansion of the rights of women, the abolition of slavery, and the end of (most) (formal) colonial occupations all therefore had ramifications for economic justice, yes, but it is simply not the case that they were the consequence of the overdue consistent application of classical liberal political philosophy to previously insulated economic realms. As Charles Mills has forcefully argued, “inconsistency” is almost never actually an adequate explanation for the classical liberals’ many failings, and reclaiming liberal insights for radicalism requires a more sweeping reformulation.

In Anderson’s case, I think the problem lies about halfway through the article, when she writes:

Americans are used to complaining about how government regulation restricts our freedom. So we should recognize that such complaints apply, with at least as much force, to private governments of the workplace.

But I don’t think this is right at all — and this is where the history begins to impinge upon the political philosophy. Previous expansions of economic democracy — the examples cited above but also bans on child labor, the introduction of the weekend, collective bargaining, etc. — have almost always been enforced, when not originally compelled, by strong governmental action, following democratic mass movements. In other words, economic democracy requires the repudiation of the anti-government bromides of the classical liberals, and the insistence of their neoliberal successors on the illegitimacy or incoherence of the idea of collective action for a common purpose in modern democracies.

You can’t have it both ways. Either democracies can do the things Anderson advocates — ban noncompete clauses, support unions and unionization, restrict the ability of employers to fire workers, and so on — or you can accept the hostility to modern democratic governance characteristic of the libertarian tradition. The former — which is clearly Anderson’s core commitment — sees political and economic democracy as actually part of the same cloth. But the latter — which is where some of her rhetoric and historical argumentation goes — treats the two as related, maybe branching from the same trunk, but now no longer in interfolded contact. That’s the mistake that a robust understanding of the history of liberalism, and especially recent neoliberalism, can correct.

Caricaturing libertarianism

This is partly a note for myself, because my tweets auto-delete and I want to retain some thoughts that I just put up there for the future. I was responding to Noah Smith, who was excited that the Niskanen Center is telling people that markets are (necessary but) not sufficient for “real liberty.”

As I said: “This is the problem with the typical left caricature of libertarianism: it can be disarmed so easily. But critique is still essential! Libertarians KNOW that humans aren’t ultra-rational utility monsters and that extra-market institutions matter. And they have made those tenets the bedrock of a hyper-anti-democratic worldview. Far-leftists will continue in their smugness, regurgitating critiques that often were developed by neoliberals themselves. And center-leftists will continue saying “Good!!” and marveling at the appearance of a kinder, gentler libertarianism. And the right will continue to wage war on democracy, secure in their knowledge of human irrationality and the non-inevitability of markets.”

And then I put up a picture of one of my favorite Hayek quotes:

“The vast majority of people (I do not exaggerate) no longer believe in the market. It is a crucial question for the future preservation of civilization and one which must be faced before the arguments of socialism return us to a primitive morality. We must again suppress those innate feelings which have welled up in us once we ceased to learn the taut discipline of the market.”

But it would perhaps have been more apropos to cite a representative of the “new institutional economics” (the Ostroms, Ronald Coase, Oliver Williamson, etc.), to whom the Niskanen article in question is quite close in spirit. They are typically less obsessed with “The Market” than Hayek but, in a more subtle fashion, just as anti-democratic, determined to erode the legitimacy of the modern state as a vehicle for collective action to address public problems. Here’s Vincent Ostrom, for instance:

“The larger the society and the more diverse the country, the greater the propensity for error… Individuals, assuming themselves to be like all the rest, no longer look upon themselves as fallible creatures subject to limited comprehension, but as omniscient observers addressing themselves to problems in the society as a whole. Government then becomes an omnicompetent, universal problem-solver capable of responding to all of the problems of the society as a whole… Justice is conceived as social justice implying equal shares in social outcomes rather than equal standing in access to the games of life.”

This is also the theme of Nancy MacLean’s great new book about James Buchanan. (Peter Boettke, one of Buchanan’s key successors at George Mason and the current president of the Mont Pèlerin Society, is a major popularizer of the Ostroms’ work.)

The Niskanen Center article that Smith links, while seeming to be relatively laudatory of democracy, gives away the game about halfway through:

To resist the reification of the state is to depart from most mainstream political-theory accounts of democracy, even many that are supposed to be highly pragmatic. Most democratic theorists cannot help regarding the democratic state (or, in more populist, American form, simply “democracy”) as the common, conscious entity that “must” speak for the moral purpose of the whole, and that must, allegedly, be the final arbiter of disputes among other institutions. There is no such must, and no such entity.

There you have it: the modern democratic state, existing like all institutions as a “conventional,” convenient, “pragmatic” complement to the market, has no special democratically-invested authority to act in furtherance of collective projects, to possess public things, and so on. What at first seems to be a level-headed departure from “market fundamentalism,” fostering appreciation for “non-market institutions,” is in fact a profoundly radical “anti-mystification” attack on the typical meaning of “democracy” for most people nowadays. “Liberalism understood in the more realist, Hayekian way is the opposite of populism,” as the Niskanen article puts it.

The Niskanen Center makes its living trading on its image among outsiders as a group of “the good libertarians.” They support a carbon tax! Many of their key figures were ousted from the Cato Institute by the Kochs! They admit that the welfare state has its uses! But look deeper and you find an organization still deeply embedded in the right-wing think tank ecology (or Russian-nesting-doll structure, as Philip Mirowski might put it) — oh, and one that concedes to massive popular support for a carbon tax only in exchange for gutting other environmental regulations. In this sense, they’re the think-tank parallel of the “New Prophets of Capital” that sociologist Nicole Aschoff has identified, providing a slick veneer of humanitarianism that conceals the unabated metastasis of the neoliberal order beneath.

But it’s also important to recognize that some of the intellectual framework that furnishes this humanitarian veneer isn’t just deceptive or disingenuous but actually constitutive of a perhaps less prominent, but no less sinister, thread of neoliberal thought (and organization) over the last fifty years or so.

Positive and negative free speech rights

The topic of free speech on college campuses has recently mutated from a convenient way for intellectually exhausted centrist and right-leaning pundits to fill column inches to a live debate within the left. Driven especially by criticisms from Freddie deBoer and a few figures associated with Jacobin, the online left-o-verse has been abuzz of late with disputes over the merit of “de-platforming” and other tactics commonly taken as emblematic of the contemporary student left’s disregard of the value of free speech. I actually think that this development is fairly healthy and has brought some important issues to the forefront, but I think that this Nouvelle Vague of “pro-free speech” leftist writers concedes far too much to the right and has yet to articulate a coherent vision of how students can integrate a commitment to both free speech and substantive leftist goals. These flaws are intimately related, but more on that anon.

DeBoer is both the most strident and most rhetorically gifted member of this group, so he makes an illustrative example for understanding the position I’m talking about. He has three main arguments. The first is that, to paraphrase Baroness Thatcher, There Is No Alternative; that is, (1) there is nowadays simply no feasible or coherent way to avoid the liberal framework of free speech rights without falling into an intellectual morass or quietism of both smug and despairing varieties. The second is that (2) unless the left can get their more embarrassingly authoritarian comrades under control, the right will use the slow-burn optics disaster to run roughshod over public education. And the third, which I can’t find a good link for right now, is (3) that leftist students are so powerless that whatever deviations from a norm of free-speech absolutism they push for will be used to punish them in turn.

I’m somewhat sympathetic with all of these arguments individually, but together they paint an incoherent image of the state of student activists: they are (1) so used to powerlessness that they resort to implausible intellectual masturbation in lieu of real action; (2) so powerful (and overzealous) on campus that the GOP can paint a compelling picture of the animals running the zoo at universities nationwide; (3) simultaneously powerful enough to extract material changes in what university administrations are and aren’t capable of doing, but also powerless enough that they will in turn immediately face the wrath of now-almighty administrators as soon as they obtain these same successes.

(Another common incoherence of (2) and (3) is the notion that students are perched in an odd position of razor-thin precariousness, where any noisemaking will provoke punishment from powerful conservatives, either in the form of defunding and privatization or of sanctions for particular activists, but where toeing the free-speech-absolutist line will also be sufficient to postpone the hammer indefinitely.)

Here’s my competing story: whether or not public universities are defunded and privatized has almost nothing to do with what student activists do; it has been a core goal of the right since before many of today’s students’ parents were born, and the GOP will stop at nothing to accomplish it, no matter how respectable or absurd the actions of students seem to outsiders. The only way to actually stop this agenda is to build real political power on the left in legislatures and to articulate and defend a robust vision of education as a public good — and a positive right — that ought to remain outside the purview of market forces. Every leftist student currently protesting racist speakers could throw in the towel and spend every night for the next four years volunteering at soup kitchens, and absent those two developments, the current march towards the dismantling of public higher education in the US would continue unabated.

This is the most important free speech issue of our time. And conceding the center-right’s framing of “college free speech” makes it impossible to recognize it as such. 

DeBoer’s repeated expressions of exasperation and incredulity when encountering observations of the hypocrisy of “free speech advocates” suggest that he truly cannot fathom the idea of a person who thinks free speech and academic freedom are very important values for the left but who also thinks the current hegemonic conception of those principles is fundamentally, even dangerously, deficient. The fact that the political spectrum among the most prominent free speech rabble-rousers — Jonathan Haidt, Conor Friedersdorf, Jonathan Chait, Steven Pinker, the entire American Enterprise Institute, etc. — ranges essentially from Tony Blair to Ronald Reagan does not trouble deBoer one iota. The only reason someone could possibly be concerned about that sort of thing, he insinuates, is a base drive to place partisanship (who’s on your “team”) over principle.

This means that he doesn’t realize that the most common form in which “free speech absolutism” appears in the mainstream press is as a corollary of a broader commitment to the ideology of the “marketplace of ideas,” the very same ideology which justifies treating education as a commodity that should be subjected to private market competition in the first place. Here, for instance, is the right-wing Foundation for Economic Education invoking Jonathan Haidt’s advocacy for “political diversity” in an argument against the institution of tenure. Here is Friedersdorf extolling the virtues of homeschooling vis-à-vis public school, and here he is arguing for school vouchers as his preferred method of reparative racial justice. The neoliberals looking to use the putative college war on free speech as an excuse to enact their agenda on American higher education are not just far-off GOP lawmakers but also deBoer’s fellow “free speech absolutist” writers.

So finding an alternative to their conception of “rights” is not just possible but absolutely urgent. Luckily, such an alternative is ready-made: it’s the same alternative that the left has drawn on since the rise to hegemony of classical liberalism hundreds of years ago. Against the classical liberal insistence on the existence only of “negative rights” (think concepts that begin with “freedom from…”), the left has traditionally defended the existence of “positive rights” (typically “the right to…”), and has often insisted that positive rights ought to take precedence over negative rights when the two come into conflict: the idea, for instance, that a measure of economic coercion (taxation, say) is acceptable to guarantee access to healthcare for all.

So for this leftist tradition, exemplified in the twentieth century by figures like Dewey and Habermas, the right to free speech is not just the freedom to say what one wants at any particular time, unencumbered by active restraint, but a freedom to learn, to reflect, and to use one’s capacity for critical thinking to contribute to political discourse and, ultimately, to concrete collective political projects, in cooperation and solidarity with others. From this conception of free speech, the urgency of defending institutions like tenure, collective bargaining for university employees, and public funding for higher education flows quite naturally. In fact, these defenses move to the center of our conception of what the fight for free speech on campuses actually means, and the problem of student activism starts to seem more like a politically expedient distraction.

It also, for what it’s worth, becomes clear when and for what reason suspensions of the more conventional liberal right to free speech become acceptable: in defense of this positive right of universal access to democratic deliberation and political action. Speech used to intimidate and harass students whose access to education and all it entails is in jeopardy can be subject to reasonable restraints (though as with all rights conflicts the precise practical solution cannot be spelled out entirely a priori). So, to take a real-world example, attempting to prevent Milo from outing undocumented students on campus is entirely defensible under this framework.

Indeed, collective student action more generally — protests, student writing, etc., aimed at changing the status quo — starts to seem closer to embodying the spirit of a robust understanding of academic freedom than imperiling it. “Democratic means and the attainment of democratic ends are one and inseparable,” as Dewey put it. While the neoliberal idea of the marketplace of ideas advocates an imposed discursive free-for-all as a convenient means of preventing collective bodies from ever actually doing anything, the competing leftist, positive-rights idea insists on the importance of knowledge creation and earnest communication as the groundwork of further democratic action.

This distinction is absolutely crucial (it’s central to my new article with Naomi Oreskes, for instance) but it gets occluded every time deBoer sneers at attempts to argue that the locus of the free speech fight ought to shift away from student deplatforming. If deBoer and his allies really do want to defend the right of people to work together for a better world — and I don’t doubt that they do — they should just do that, instead of insisting that we can slide ass-backwards into that same position if only we first make our peace with the libertarians.