How YouTube Pays for Hateful, Misogynistic and Racist Speech

Misogyny, racism, and targeted harassment… Some of this content is generously monetized by YouTube, despite its moderation rules. The analytics firm Bot Sentinel conducted a study of the video-sharing platform, which can be compared to a social network because of the sharing spaces it offers.

Christopher Bouzy, founder of Bot Sentinel, explained in an exclusive interview with Rolling Stone that he discovered a pattern in which hateful, misogynistic, racist and harassing speech (mostly aimed at prominent, identifiable women) is not moderated. The firm focused on 29 YouTube channels identified for toxic content and nearly 24 others suspected of copyright infringement.

Channels earning $42,000 per month

Bot Sentinel found at least two dozen YouTube channels in gross violation of the platform’s moderation policy. They continue to publish content and get paid for it. Meghan Markle, the American actress who became a member of the British royal family by marrying Prince Harry, Duke of Sussex, as well as actress Amber Heard, Johnny Depp’s ex-wife, are among their favorite targets. Both women have already spoken out about the devastating, long-term effects this harassment has had on their mental and emotional health, Rolling Stone recalls.

The analytics firm says the 5 channels that aired hate speech against Meghan Markle received a total of $42,000 per month from the platform, with all of their videos carrying ads. YouTube, of course, takes a cut of that revenue, the study notes. Yet in theory the platform prohibits insults against famous or recognizable people, attacks based on someone’s appearance, and attacks on a category of people (because of their ethnic origin, because they are victims of domestic violence, etc.).

But the YouTube channels investigated by Bot Sentinel have been posting entirely unverified, slanderous content. Others use deceptive methods, such as preview images or “thumbnails” that do not match the content of the video, to bypass moderation.

Meghan Markle and Amber Heard, main targets

At least 29 YouTube channels monetize content that insults, defames or even threatens Meghan Markle. 22 other channels have posted at least 30,000 videos, 80% of which are anti-Meghan Markle. These channels exist solely to broadcast content designed to stoke anti-Markle sentiment: 94% of all their content consists of negative or slanderous statements about her, Bot Sentinel notes. The three main channels mention Meghan Markle’s name at least 15,000 times in videos that have amassed 76 million views.

Another YouTuber consistently posts content about the Duchess of Sussex that violates YouTube’s rules. In a live video, he said she deserved to be “choked to death,” according to Bot Sentinel. Capitalizing on the public’s fascination with the lawsuit between Johnny Depp and Amber Heard, the creator produced 128 videos with offensive comments about Amber Heard’s body or slandering her testimony during the trial. Across those videos, her name is mentioned more than 6,000 times.

Bot Sentinel also details copyright infringement by YouTube channels that specialize in royal coverage or position themselves as anti-Markle. At least 22 channels have published thousands of videos that consist of press articles being read aloud. Most of these 34,000 videos, which have amassed over 441 million views, do not identify the articles or news outlets from which the content originated.

Does YouTube moderation even exist?

Christopher Bouzy says he repeatedly reported the videos to YouTube, and even to Google, without success. Yet less than an hour after Rolling Stone requested comment on several specific videos, YouTube removed at least two of them, explaining that they did not comply with its harassment policy.

For Christopher Bouzy, YouTube is responsible: “A lot of these channel managers wouldn’t do what they do if YouTube didn’t reward them. And let’s be clear here, they are rewarded. When you allow these people to monetize this type of content, you, the company that pays them, are contributing to the harassment.”

A YouTube representative, who had not seen the report, told the US outlet that the platform takes this kind of rule violation seriously. The representative said YouTube is committed to applying its moderation policy equally to all content creators, and encourages users to report content that does not follow the platform’s rules.

What about moderation on social networks?

According to Christopher Bouzy, this study highlights YouTube’s shortcomings in moderation. These videos, which amplify racist or misogynistic content, reinforce a certain vision of the world in the people who watch them, he explains to Rolling Stone. And, in his view, the lack of moderation of this content can lead to an increase in hateful and racist speech directed at other groups, such as journalists or even ordinary citizens.

“I think a big platform like YouTube, with billions of views, has its responsibilities,” Christopher Bouzy adds. The issue of moderation also applies to Twitter, Facebook, Instagram and even TikTok. It matters all the more because young people are increasingly turning to social networks for news.

A report published by NewsGuard warns of misinformation circulating on TikTok: nearly 20% of the content surfaced on trending topics contains false or misleading information. Yet the video-sharing app is used as a search engine by young people, who get much of their information from it.
