QAnon is facing a wave of bans by tech companies. These are the platforms that have and haven’t responded to the conspiracy theory.


The QAnon conspiracy theory was born on the internet, and while it has spread to real-life rallies in the US and abroad, it continues to thrive online. And yet social-media platforms, where the conspiracy theory gains power and radicalizes people in the US and abroad, have generally been slow to ban it.

QAnon is a baseless far-right conspiracy theory that claims President Donald Trump is fighting a deep-state cabal of elite figures involved in human trafficking. Its followers — estimated to number in the millions — have reportedly been linked to several alleged crimes, including killings and attempted kidnappings. In 2019, a bulletin from the FBI field office in Phoenix warned that the movement could become a domestic terrorism threat.

Many platforms, including Facebook, are finally taking steps to combat the spread of QAnon-related misinformation ahead of the November election. 

Here’s how major tech companies are handling the spread of the QAnon conspiracy theory online.

Facebook said it is cracking down on QAnon across its apps.

On October 6, Facebook announced it would remove all pages, groups, and Instagram accounts that promoted QAnon.

The ban, which the company said would be enacted gradually, comes after the platform previously announced over the summer that it had removed 790 QAnon Facebook groups. 

Extremism researchers are tracking how the new ban will play out, as the movement has spread rapidly on Facebook and on Instagram, where many are using “Save the Children” rhetoric to further propagate the movement’s misguided focus on human trafficking conspiracy theories. 


Facebook has been criticized for its slowness in acting against QAnon. 

Twitter announced a moderation plan on QAnon in July.

In July, Twitter announced it would begin cracking down on QAnon content by suspending accounts that were “engaged in violations of our multi-account policy, coordinating abuse around individual victims, or are attempting to evade a previous suspension.” 

The platform said it would also stop recommending QAnon-related accounts and trends and block URLs associated with QAnon from being shared on Twitter. 

Critics have said the platform was slow to act on the movement and hasn’t moderated the community enough. On October 3, The Washington Post reported that there were still 93,000 active Twitter accounts referencing QAnon in their profiles, citing research from Advance Democracy, a nonpartisan research group.

YouTube announced a crackdown on QAnon after significant pressure, but stopped short of an explicit ban.

YouTube, where videos spreading the conspiracy theory have thrived and gained millions of views, said on Thursday that it would prohibit QAnon content that threatens violence against a group or individual.

The move is part of an addition to the company’s policies against hate and harassment that is focused on “conspiracy theories that have been used to justify real-world violence.” 

“We will begin enforcing this updated policy today, and will ramp up in the weeks to come,” the company said on Thursday. “Due to the evolving nature and shifting tactics of groups promoting these conspiracy theories, we’ll …

Source: Business Insider – Politics
