
YouTube Joins Facebook And Twitter And Takes Action Against Conspiracy Theory Group QAnon | Voice Of America

OAKLAND, USA – YouTube is following the lead of Twitter and Facebook, saying it will take steps to limit QAnon and other groups that promote unsubstantiated conspiracy theories that could lead to real-world violence.

The Google-owned video platform said Thursday that it will now ban material that targets a person or group with conspiracy theories used to justify acts of violence. An example would be videos that threaten or harass someone by suggesting they are part of a conspiracy, as QAnon does. QAnon depicts President Donald Trump as a secret warrior battling an alleged child-trafficking ring run by celebrities and government officials said to be part of a “deep state.”

Pizzagate, another conspiracy theory that circulated on the internet and was essentially the seed of QAnon, would also fall into the category banned by YouTube. Its promoters claimed that children were being harmed at a Washington, D.C., pizzeria. A man who believed the alleged conspiracy entered the restaurant in December 2016 and fired an assault rifle. He was sentenced to prison in 2017.

YouTube is the third major platform to announce policies aimed at suppressing QAnon, a collection of conspiracy theories that all three platforms helped spread.

Twitter banned 7,000 QAnon accounts and imposed limitations on another 150,000. The company accused many followers of the conspiracy theory of violating its rules on targeted harassment and spreading false information. Twitter announced its crackdown on QAnon in July. Although it did not ban all of the group’s followers from the platform, it did ban thousands of accounts associated with QAnon content and blocked web pages associated with the group so they could not be promoted from Twitter. Twitter also said it would stop recommending tweets associated with QAnon.

Facebook, meanwhile, announced last week that it was banning groups that openly endorse QAnon.
Facebook said it would remove Facebook pages, groups and Instagram accounts for representing QAnon, even if they did not promote violence. The social network said it will consider a variety of factors when deciding which groups meet the criteria to be banned, including the group’s name, its bio or “about” information, and discussions within its Facebook page or group or its account on Instagram, which is owned by Facebook.

Facebook’s move came two months after it announced a less aggressive crackdown, saying it would stop promoting the group and its followers. That effort faltered due to inconsistent enforcement.

YouTube said it had already removed tens of thousands of QAnon videos and hundreds of channels under its existing policies, particularly those that explicitly threaten violence or deny that major violent events occurred. “All of this work has been essential in reducing the scope of harmful conspiracies, but there is more we can do to confront certain conspiracy theories that are used to justify actual acts of violence, such as QAnon,” the company said Thursday.

Experts say the move shows YouTube is taking threats tied to violent conspiracy theories seriously and recognizes the importance of limiting their spread. But with QAnon increasingly breaking into American politics and life, they wonder whether it is too late.

“While this is a major change, for nearly three years YouTube was an important place for QAnon to advance,” said Sophie Bjork-James, an anthropologist at Vanderbilt University who has studied QAnon. “Without the platform, QAnon might have remained an obscure conspiracy. For years, YouTube provided the radical group with an international audience.”

