
A conservative YouTube star just lost his income stream for homophobic slurs

YouTube isn’t ready to deal with political controversy.

June 6, 2019

YouTube just announced that it was demonetizing conservative commentator Steven Crowder’s YouTube channel for breaching the terms of its Partner Program. Tarleton Gillespie is a principal researcher at Microsoft Research, an adjunct associate professor in the department of communication at Cornell University and the author of an important recent book, “Custodians of the Internet” (Yale University Press, 2018), on how firms like YouTube moderate content. I asked him what was going on.

Henry Farrell: Ordinary understandings of language would suggest that calling someone a “lispy queer” is homophobic harassment. Why did YouTube seem to disagree?

Tarleton Gillespie: It’s difficult to guess what the conversation sounds like inside YouTube. On its face, I find Crowder’s videos homophobic, and Carlos Maza, who was on the receiving end of these slurs, thinks so, too. Crowder certainly does not. But of course, it doesn’t matter what I think, and in some ways it doesn’t matter what Maza or Crowder thinks, either.

Currently, it’s YouTube that gets to decide which videos to remove and which to keep up. Its right to make this decision itself, on our behalf, is protected by U.S. law and has become the norm for most social media: Platforms set guidelines for what speech and behavior they allow on their site, and reserve the right to remove content or suspend users who violate those guidelines.

That said, YouTube’s decisions are not without consequence. The user base is increasingly frustrated with how YouTube polices the site — or fails to. Some feel it too often overlooks harassment like this, even boosting its visibility through its recommendation algorithms. Others say YouTube is silencing speech that should be protected, especially conservative perspectives like Crowder’s. Still others think the rules themselves are unclear and inconsistently applied. In the face of this frustration, YouTube (and other platforms like it) falls back on making broad categorical decisions: not whether these particular slurs are wrong or cruel, but whether they are “harassment” under the company’s terms. Are they “hate speech”? Are they “bullying and abuse”? In the absence of justice, it opts for consistency.

Of course, it can’t achieve consistency, either. What would it even mean to be consistent, across thousands of videos, saying slightly different things, in different ways, in different contexts? So it tends to fall back on standard, familiar notions. And homophobic slurs still don’t carry the cultural weight of racial ones. Not long ago it was acceptable to call someone a [homophobic slur] — as Crowder does in the T-shirt he’s wearing in some of the offending videos — and have it pass as a joke, maybe in poor taste, but not harassment or hate. YouTube here has failed to appreciate the cultural shift in how we think about homophobia.

Why have YouTube and other platforms found themselves in the situation of having to moderate the speech of their users?

Well, we can say they brought it on themselves. Platforms wish they could be the open Web done right: places where we all join in, say what we want, find entertainment and community — and they get some ad dollars in the process. What they quickly found was that building community requires dealing with social tensions, and that providing people a way to be heard means they will use that tool to their own ends, whether noble or cruel.

On the other hand, what’s the alternative? It’s easy to criticize platform moderation by condemning intervention of any kind. But a site with no rules whatsoever quickly fills with spam, porn, brutality and contraband. Hosting and delivering information means, inevitably, making choices about how and why to do so — through the design itself and the policies imposed. It is only a question of what the rules should be, how they are enforced and — I think this is important — who should make them.

After announcing that it would not remove Crowder’s videos, YouTube then announced that his channel would be demonetized until he could “address the issues with his channel.” What does that mean?

YouTube has long had a Partner Program that gives users with sizable audiences a share of the revenue from the ads paired with their videos. This can be lucrative for YouTubers with large followings. YouTube also reserves the right to “demonetize” either a single video or an entire channel if the content is not “advertiser friendly” — a second layer of content guidelines, above the ones that already apply to everyone. This means the revenue stream from a popular video can suddenly dry up at YouTube’s discretion.

Having his entire channel demonetized could be a significant financial blow to Crowder, though he also sells merchandise and may have other sources of revenue as well. I don’t love that YouTube’s revenue sharing only applies to YouTubers with a big audience; there’s no reason every user couldn’t split ad revenue for their videos, however minuscule. But given that YouTube has chosen to reward audience size, it does create a useful distinction: It holds users with big audiences to a higher standard, and bigger audiences should mean higher standards.

YouTube may do this for its own financial reasons, of course: It wants this content to be palatable to advertisers. As we’ve learned, harassing speech has additional effects beyond just being hurtful. Crowder can say he’s just debating, or being un-PC, that it’s his right to speak as he chooses. But his audience takes these slurs as a signal to harass Maza directly: leaving cruel comments on his videos, harassing him on other platforms, emailing him with death threats. Crowder knows this, certainly. But he gets to shrug and pretend his words are the only issue. Having an audience comes with responsibility; YouTube’s revenue-sharing rules, however self-interested, impose this additional responsibility almost by accident.

So, yes, YouTube absolutely does treat its stars differently. It’s possible that, in a single policy decision like this one, star power counts for something, although YouTube said it didn’t. But I think it’s more about how YouTube is built, particularly around its distribution of money, where star power matters.

How much do political considerations (e.g., YouTube’s relative sensitivities to criticism from liberals and criticism from conservatives) play a role?

There’s certainly been an effort by the right to challenge the content-moderation efforts of YouTube and other social media platforms. Charges of bias against conservatives, based usually on a handful of examples and very little analysis, have even reached Congress and the Trump administration, which last week set up a site for lodging complaints of social media censorship. And progressives are making noise about regulating Big Tech for derelictions of public duty.

Crowder is now lodging the same complaints about conservative bias. It’s a clever tactic; it’s hard to imagine how the risk of looking biased wouldn’t creep into YouTube’s decision-making process. But I think it’s a categorical error to always focus on individual bias without looking at the big picture. Sure, YouTube’s content moderation team may hesitate about a decision that could look biased.

But the bigger issue is, how can a platform make fair decisions on a landscape of language that is itself political, politicized and shifting? Conservatives have spent the last few years suggesting that social media companies are liberal and biased; but they’ve spent the last few decades waging a protracted battle about political correctness, liberal bias in the news, campus speech and identity politics, painting all those concerned about rising hate and harassment as “snowflakes,” hurt by mere words. These tactics have reshaped the U.S. political landscape around speech. So it’s not just that YouTube has to decide whether this conservative blowhard is harassing or merely criticizing a liberal commentator at a progressive media site; it has to decide what even counts as harassment in the shifting sands of partisan language politics.

What recommendations does your book have for dealing with these kinds of clashes and controversies?

In “Custodians of the Internet,” I try to show what an enormously difficult task content moderation is, how it extends questions we have long asked about media — and to wonder whether it’s acceptable that these decisions should be made by the platforms. It’s time for platforms to radically rethink their commitments and their impact on public life. They should certainly close the gap between their stated rules against harassment and their actual enforcement of them. And they should admit that hateful speech stirs up swarms of harassers, by holding instigators as accountable as their followers. But they should also admit that all content moderation decisions are political and should not be exclusively in their hands. Publicly contested values are worked out by being publicly contested.

You could argue that this means Crowder should get to say what he wants, Maza what he wants, then we all gnash our teeth and beat our chests. But the alternative would be for platforms to make space for that debate to occur, publicly and among their users, and then enforce the conclusions those users reach. It should matter what Maza thinks, what Crowder thinks, what I think and what all of us think — more so than what YouTube thinks.