By John P. Mello Jr.
May 5, 2020 10:14 AM PT
Far-right groups and individuals in the United States are exploiting the COVID-19 pandemic to promote disinformation, hate, extremism and authoritarianism, according to a think tank report.
“COVID-19 has been seized by far-right groups as an opportunity to call for extreme violence,” states the report, the second in a series on the information ecosystem around the coronavirus, released by the London-based Institute for Strategic Dialogue.
“This includes mobilization by white supremacist communities as well as the increased prevalence of memes which semi-ironically promote insurrectional violence across a range of social media platforms,” it continues.
The report is based on a combination of natural language processing, network analysis and ethnographic online research.
COVID-19 is becoming an increasingly important topic among far-right groups and individuals, the ISD researchers found. For example, mentions of “corona-chan” — a slang term popular with the extreme right — are increasing significantly across mainstream and fringe social media platforms.
Between February and March, “corona-chan” was used 13,000 times on 4chan, an imageboard website that attracts far-right types, according to the report. Interactions with corona-chan-related posts on Reddit jumped 375 percent over the same period.
Meanwhile, Facebook saw a 1,920 percent increase in interactions using the term during March, while “corona-chan” mentions on Instagram climbed 21.5 percent.
Fear and Uncertainty
Uncertainty and fear can breed disinformation, observed Vincent Raynauld, an assistant professor in the department of communication studies at Emerson College in Boston.
“There are a lot of polls out there showing that people in the United States are afraid. On top of that, we’re getting different messaging from different governments,” he told TechNewsWorld.
“So based on the information environment, people are very uncertain, very confused and very scared,” Raynauld continued.
“That’s an ideal environment for people to infuse disinformation because people are looking for easy solutions to try to understand what the crisis is about,” he said. “If there were a clear timeline on how we’re going to get out of this crisis, I don’t think people would be as likely to be buying into these explanations, but because there’s so much uncertainty people are buying into them.”
QAnon conspiracy theorists are using the pandemic to increase their reach online, the ISD report also notes. Theories they’re peddling include that the pandemic is being orchestrated to manipulate U.S. politics, that COVID-19 is a bioweapon, that there is a hidden cure for the virus, and that the crisis is being exploited to implement martial law.
“Whenever you have a situation where people are unhappy with the government having a heavy hand, and people have enough time on their hands to sit and delve into things on the Internet and follow issues down various rabbit holes, you can get distorted messages that incite anger and frustration,” said Karen North, director of the Annenberg Program on Online Communities at the University of Southern California in Los Angeles.
“Conspiracy theorists are always talking about how various groups — especially government groups — are manipulating the world in order to control people,” she told TechNewsWorld.
“Now we have a worldwide event that gives them plenty of opportunity to distort what actually happened into a narrative that makes that event look like it was done intentionally and for sinister purposes,” North explained.
“Most of us are saying, ‘We are all in this together because this terrible virus has spread across the world, and we should unite against the virus and support each other,’” she continued. “But for some people, they’re uniting against what they perceive as forces against them, or forces trying to change their lives, such as a heavy-handed government.”
Not Your Boomer’s Boogaloo
Far-right communities have started talking about COVID-19 as an accelerant for a second civil war, also known as “boogaloo,” the ISD report notes. From Feb. 1 to March 28, more than 200,000 posts on social media contained the word “boogaloo.” The most popular hashtag within those posts was “#coronachan.”
Twenty-six percent of the references to “boogaloo” on 4chan relate to the coronavirus, the researchers found.
“While some of these calls appear to be ironic, others should be recognized as legitimate security threats,” the report warns. “This trend has already manifested into real-world violence, with one alleged white supremacist terrorist dying after shootouts with the FBI.”
Antisemitic speech and ideas are being adapted to incorporate the coronavirus, it notes. “Old antisemitic tropes of ‘blood libel’ relating to false claims of ritualistic sacrifice are being fused with a wide range of conspiracy theories which are emerging around COVID-19.”
Although social media outlets like Facebook, Twitter and YouTube have made efforts to reduce disinformation on their platforms, they appear to be facing an uphill battle.
“Social media platforms are in this constant fight to catch up with the evolution of fake news, but with the virus and so many people online, I think the level of fake news has increased significantly,” Raynauld said.
“Not only has the volume increased, but the potency, as well,” he added.
With the pandemic raging, social media platforms that to a large degree depend on humans to police their content are finding themselves in a difficult position, noted Marc Faddoul, research scientist at University of California, Berkeley, in a recent article for the Brookings Institution.
Platforms are fighting to contain an epidemic of misinformation, with user traffic hitting all-time records, he pointed out.
“To make up for the absence of human reviewers, platforms largely handed off the role of moderating content to algorithmic systems. As a result, machines currently have more agency over the regulation of our public discourse than ever before,” Faddoul wrote.
“So far, algorithmic systems have proven they can bring precious support in scaling up the enforcement of certain guidelines. However, they also reveal fundamental limits in their ability to capture nuances and contextual specificities,” he noted.
“Our dependence on humans to moderate content will last. By shifting to a more participatory and decentralized model, big tech platforms could make their content moderation pipeline more resilient and adaptable,” Faddoul suggested. “Moderating public forums for half of the planet is simply too big of a challenge for platforms to take on on their own — even with AI.”