YouTube will ban content that targets individuals or groups with conspiracy theories such as QAnon, which has been linked to real-world violence. In doing so, Google, YouTube's owner, follows Facebook and Twitter in taking action against far-fetched and dangerous conspiracy theories.

Tens of thousands of videos and hundreds of channels related to the QAnon conspiracy theory have already been removed from YouTube under its existing terms of service, particularly when they contained threats of violence or denied major acts of violence, according to the AP news agency.

“All of this work has played a key role in limiting the spread of harmful conspiracy theories, but there is even more we can do to address certain conspiracy theories that are used to justify real-world violence, such as QAnon,” the company said in a blog post today.

QAnon is a far-fetched conspiracy theory built around the claim that US President Donald Trump is waging a secret war against an international ring of devil-worshipping child molesters to which many of the president's political opponents supposedly belong.

YouTube also forbids content that targets people or groups with the so-called Pizzagate conspiracy theory, a predecessor of QAnon. A month after the 2016 presidential election, that theory prompted an armed man to enter a restaurant in Washington, D.C., to “investigate” whether a child molestation ring was operating there, firing a rifle inside. He was sentenced to prison the following year.

The main platform for spreading the theory

Sophie Bjork-James, an anthropologist at Vanderbilt University who studies QAnon, tells the AP that YouTube has been the main forum where the theory has spread over the past three years.

“Without the platform, Q would probably still be a little-known conspiracy theory. For years, YouTube gave this radical group an international audience,” she says.

QAnon has become increasingly prominent in the United States and other countries in recent months, and a number of Republican candidates in the November election have expressed support for the theory. Some experts question whether Google is acting too late.

Facebook announced last week that it intends to ban groups that openly support QAnon. Pages, groups, and Instagram accounts will be taken down if they share QAnon content, even if violence is not explicitly encouraged. Twitter took action against QAnon this summer, suspending thousands of accounts linked to the theory and blocking the sharing of QAnon-related URLs. The platform also stopped recommending QAnon tweets.
