Social media companies “must take more responsibility” to counter fake news and conspiracy theories promoted on their networks or risk helping to fuel extremist violence, says a counterterrorism expert.
Christina Nemr, a former advisor with the Bureau of Counterterrorism at the U.S. Department of State, warns that “toxic” disinformation and deliberately manipulated content spread by extremist and hate groups have become a growing threat.
“The battlefield is in the hands of the private sector,” she told an online seminar held by the Center on Terrorism at John Jay College.
“Social media companies must take more responsibility.”
Nemr, now director of Park Advisors, a consulting firm, said it has become too easy for extremist views to go viral on the internet.
“Conspiracy theories arise when there’s a vacuum or gap in information,” Nemr said.
“Once you believe one, you will believe others.”
Some recent examples include the notorious “pizzagate” theory peddled in 2016 by Alex Jones, the InfoWars host, who claimed that Hillary Clinton was sexually abusing children in satanic rituals in the basement of a Washington, D.C., pizza restaurant. That post, retweeted widely, prompted a North Carolina man to drive a few hundred miles north, bring his rifle into the restaurant, and open fire.
No one was injured in that attack, but three years later another conspiracy theory peddled in cyberspace incited a 28-year-old Australian to enter a mosque in Christchurch, New Zealand, and gun down 51 people.
Currently, conspiracy theories about COVID-19 and the Black Lives Matter protests, promoted by QAnon, a shadowy online conspiracy movement, have become a source of worry during this year’s election campaign.
One QAnon follower is accused of murdering a Mafia boss in New York last year, and another, arrested in April, was accused of threatening to kill Democratic presidential nominee Joseph R. Biden Jr. The Federal Bureau of Investigation has warned that QAnon poses a potential domestic terror threat, The New York Times reported.
Hoaxes, falsified content and conspiracy theories disseminated by violent extremists can be effective in undermining a population’s confidence in their government, said Nemr, author of “Weapons of Mass Distraction: Foreign State-Sponsored Disinformation in the Digital Age.”
In that report, published in March 2019, Nemr wrote:
The messages conveyed through disinformation range from biased half-truths to conspiracy theories to outright lies. The intent is to manipulate popular opinion to sway policy or inhibit action by creating division and blurring the truth among the target population.
She cited one study showing that even after learning a story was false, one-third of those surveyed said they still shared it on social media because it fit their worldview.
“Facts don’t matter, it’s emotions that are important,” she told the seminar.
The speed with which the stories can be spread is stunning.
Nemr’s report revealed that “on average, a false story reaches 1,500 people six times more quickly than a factual story.”
It’s critical that disinformation be tackled, she said, and social media companies should take the lead, since most of the toxic content lives on their platforms. But the tech giants are reluctant to do so for a number of reasons.
The first, Nemr said, is that social media companies “don’t want to be the arbiter of truth.”
The second is that they’re essentially businesses focused on revenue, and viral stories that skirt the boundaries of disinformation generate a lot of revenue.
The third is privacy: it is the big tech companies, not governments, that hold all the data.
A fourth: social media executives can too easily be swayed by public opinion.
Nemr said that when ISIS was posting videos and recruitment or propaganda material, social media companies were swift to remove it, proving they can act quickly when they want to.
But when the source of the dangerous information is not as obviously dangerous, the social media companies “play the free speech card,” she said.
One of the obstacles is that salacious and shocking material gets more reads online than staid, carefully researched stories, thus generating more revenue.
Even worse, some of the social media algorithms, such as the one recommending the next thing someone should watch on YouTube, actually push consumers “down the road to more extremist videos,” she said.
Governments don’t have a great track record of tackling disinformation, Nemr said. They can’t respond quickly enough and legislation gets tangled.
The exceptions are Finland and Thailand, she explained, which have long records of dealing with a large, powerful neighbor that aggressively deploys disinformation: Russia in Finland’s case, China in Thailand’s.
Germany made progress by passing a law that requires social media companies to remove hate speech within 24 hours or face fines.
But defining “hate speech” is not always easy.
“Working with the private sector is imperative,” Nemr said.
Disinformation amplifies conspiracy theories, such as the recent ones about the origin of COVID-19.
One example: the story that Bill Gates created the pandemic so that he could then push vaccinations containing tracking chips on the U.S. population.
Fake news and cyber hoaxes have already been identified as a major threat to the U.S. election campaign. Although both China and Iran have been named as likely sources for efforts to undermine U.S. elections in cyberspace, Russia remains the top player.
According to U.S. intelligence assessments published in 2017, Russian President Vladimir Putin ordered an influence campaign that combined covert cyber operations (hacking, troll farms, and bots) with overt actions (dissemination of disinformation by Russian-backed media) “in an effort to undermine public trust in the electoral process and influence perceptions of the candidates.”
Nemr’s full report, “Weapons of Mass Distraction,” co-written with William Gangware, can be read here.
Nancy Bilyeau is deputy editor of The Crime Report.