The U.K. Riots Were Fomented Online. Will Social Media Companies Act?
Standing in front of a lectern on Thursday, his voice at times taut with anger, Britain’s prime minister announced a crackdown on what he called the “gangs of thugs” who instigated violent unrest in several towns this week.
But the question of how to confront one of the key accelerants — a flood of online misinformation about a deadly stabbing attack — remained largely unanswered.
Prime Minister Keir Starmer called out online companies directly, after false information about the identity of the 17-year-old suspected in the attack spread rapidly on their platforms, no matter how many times police and government officials pushed back against the claims.
Three girls died after the attacker rampaged through a dance class in Southport, northwest England, on Monday. Of the eight children injured, five remain in the hospital, along with their teacher, who had tried to protect them.
Immediately after the attack, false claims began circulating about the perpetrator, including that he was an asylum seeker from Syria. In fact, he was born in Cardiff, Wales, and had lived in Britain all his life. According to the BBC and The Times of London, his parents are from Rwanda.
The misinformation was amplified by far-right agitators with large online followings, many of whom used platforms such as Telegram and X to call on people to protest. Clashes followed in several U.K. towns: more than 50 police officers were injured in Southport, and more than 100 people were arrested in London.
Officials fear more violence in the days ahead. The viral falsehoods were so prevalent that a judge took the unusual step of lifting the restriction that normally bars naming underage suspects, identifying the alleged attacker as Axel Rudakubana.
“Let me also say to large social media companies and those who run them: Violent disorder, clearly whipped up online, that is also a crime, it’s happening on your premises, and the law must be upheld everywhere,” Mr. Starmer said in his televised speech, though he did not name any company or executive specifically.
“We will take all necessary action to keep our streets safe,” he added.
The attack in Southport has become a case study in how online misinformation can lead to real-world violence. But governments, including Britain’s, have long struggled to find an effective way to respond. Policing the internet is legally murky terrain for most democracies, where individual rights and free speech protections are balanced against the desire to block harmful material.
Last year, Britain adopted a law called the Online Safety Act that requires social media companies to introduce new protections for child safety, while also forcing the firms to prevent and rapidly remove illegal content like terrorism propaganda and revenge pornography.
But the law is less clear about how companies must treat misinformation and incendiary, xenophobic language. Instead, the law gives the British agency Ofcom, which oversees television and other traditional media formats, more authority to regulate online platforms. Thus far, the agency has not taken much action to tackle the issue.
Jacob Davey, a director of policy and research at the Institute for Strategic Dialogue, a group that tracks online far-right extremism, said many social media platforms had internal policies prohibiting hate speech and other illicit content, but that enforcement was spotty. Other companies, like X, now owned by Elon Musk, and Telegram, moderate content less aggressively.
“Given the confrontational tone set by some companies, it will be challenging to hold them accountable for harmful but legal content if they decide they don’t want to enforce against it,” Mr. Davey said.
The European Union’s Digital Services Act requires the largest social media companies to maintain robust content moderation teams and policies. Using those powers, regulators in Brussels are investigating X and have threatened to fine the company, in part over its content moderation practices.
In the United States, where free speech protections are more robust than in Europe, the government has few options to force companies to take down content.
X could not be reached for comment, though Mr. Musk replied “insane” to a video on X of Mr. Starmer’s remarks. Meta, owner of Facebook and Instagram, did not respond to a message seeking comment.
Telegram said that calls to violence are “explicitly forbidden” on its platform and that it was developing a tool that would allow fact-checkers within a country to add verified information to posts that are being viewed by users in that territory.
British policymakers said the country must address false information spread by the far right on social media.
“I see it almost every single day — straight-up lies about these situations designed to cause violence, to incite racial hatred, to incite people to violence,” Jonathan Brash, a member of Parliament from Hartlepool, an area where there were violent clashes with the police, said Thursday on BBC Radio 4. “There is so much misinformation and it’s being spread quite deliberately to stoke tension in communities.”
Al Baker, the managing director of Prose Intelligence, a British company that provides services for monitoring Telegram, said the online discourse was a reflection of wider societal challenges.
“It’s important not to go too far and say the internet is the cause,” Mr. Baker said. “The internet and social media are an accelerant that intensifies existing problems we have as a society.”