On Wednesday, YouTube changed its content policies and began blocking more of what it considers “borderline” hate speech. These new restrictions resulted in takedowns of non-defamatory accounts — like history and journalism channels — and a backlash from free speech advocates.
YouTube’s actions came after Vox reporter Carlos Maza asked the company to ban conservative commentator Steven Crowder for harassment in a viral Twitter thread last week, citing videos in which Crowder described Maza, who is a gay Cuban American, with a variety of racial and sexual slurs.
On Tuesday, YouTube told Maza that Crowder did not violate its community standards and policies regarding speech. But on Wednesday, YouTube pivoted.
YouTube demonetized Crowder’s channel but did not ban him outright. The company also published a blog post, titled “Our ongoing work to tackle hate,” explaining its reasoning and newly tightened standards.
(Maza responded by saying demonetization isn’t enough, even though Crowder believes it will be devastating to his business. YouTube originally told Gizmodo that Crowder didn’t violate YouTube’s policies because he mocked Maza within the context of ideological debate and didn’t encourage his followers to harass and bully Maza, even though they did.)
“Today, we’re taking another step in our hate speech policy by specifically prohibiting videos alleging that a group is superior in order to justify discrimination, segregation or exclusion based on qualities like age, gender, race, caste, religion, sexual orientation or veteran status,” the blog post says. “This would include, for example, videos that promote or glorify Nazi ideology, which is inherently discriminatory.”
YouTube’s blog post discusses how the Alphabet-owned video platform will crack down more aggressively on “borderline” content “to reduce the spread of content that comes right up to the line.”
Many free speech advocates on the right and the left see this as a dangerous shift for YouTube, since tech companies often don’t apply their speech and discrimination policies fairly or transparently. Free speech advocates also worry about the slippery slope of tech companies deciding which ideas are allowed on the internet and which are not.
“This is ridiculous. YouTube is not the Star Chamber — stop playing God & silencing those voices you disagree with. This will not end well,” Sen. Ted Cruz (R-Texas) tweeted.
Glenn Greenwald, the progressive reporter and free speech advocate who led reporting on U.K. and U.S. government surveillance following the Edward Snowden leaks, also criticized YouTube’s new direction. “Apparently, creating and implementing vague, arbitrary censorship standards on the fly in response to mob demands and then purging people en masse end up suppressing and punishing many voices that censorship advocates like,” he tweeted on Wednesday. “Who could have guessed this would happen?”
According to a June 2018 report from The Verge, YouTube restricts and demonetizes LGBT creators as well as anti-LGBT creators. The Electronic Frontier Foundation (EFF) has found that tech platforms are notoriously inconsistent about how they enforce content policies, and all too often, they prioritize accounts and creators that drive clicks — on both sides of the political spectrum.
“Commercial content moderation practices negatively affect all kinds of people, especially people who already face marginalization,” the EFF says on its website. “We’ve seen everything from Black women flagged for sharing their experiences of racism to sex educators whose content is deemed too risqué.”
The EFF sees tech companies’ inconsistent content policy enforcement as a threat to free speech and free expression on the internet, which is why on May 20, the digital rights group launched TOSsed Out, a platform to track and publicize the ways tech companies “unevenly enforce” content moderation policies with “little to no transparency.”
Independent journalist Ford Fischer, for example, said YouTube demonetized his entire channel, which documents different kinds of activism and extremism, as a result of Wednesday’s policy update.
“YouTube says ‘our team of policy specialists carefully looked over the videos you’ve uploaded to your channel News2Share. We found that a significant portion of your channel is not in line with our YouTube Partner Program policies.’ As you’ll see in the next tweet, that’s wrong,” Fischer tweeted Wednesday. “YouTube sent me only two specific videos that they’ve taken down. 1st is a video of @JasonRCharter and other #Antifa activists confronting a Holocaust denier. While it’s true that the Holocaust denier says Holocaust-denier-stuff, this is raw vid documenting him being shut down. The only other one flagged was raw video of a speech given by Mike Peinovich ‘Enoch.’ While unpleasant, this documentation is essential research for history.”
Tech platforms like Twitter, YouTube and others sometimes mistakenly pull down accounts for violating content policies, but when the owners of the accounts alert them to the error, they restore the accounts. But all too often, tech companies permanently remove or demonetize benign accounts as well as defamatory ones — sometimes at the direction of government officials.
The EFF argues that if tech platforms were transparent about, and held accountable for, the posts and accounts they take down, fewer mistakes would occur and there would be less censorship. The EFF supports the Santa Clara Principles, which urge companies to:
• Publish the numbers of posts removed and accounts permanently or temporarily suspended due to violations of their content guidelines;
• Provide clear notice to all users about what types of content are prohibited, and clear notice to each affected user about the reason for the removal of their content or the suspension of their account;
• Enable users to engage in a meaningful and timely appeals process for any content removals or account suspensions.
“People rely on Internet platforms to share experiences and build communities, and not everyone has good alternatives to speak out or stay in touch when a tech company censors or bans them,” the EFF said. “Rules need to be clear, processes need to be transparent, and appeals need to be accessible.”