Less than three hours after the stabbing attack on Monday that led to the deaths of three children, an AI-generated image was shared on X by an account called Europe Invasion. It depicted bearded men in traditional Muslim dress outside the Houses of Parliament, one waving a knife, behind a crying child in a union jack T-shirt.
The tweet, which has since been viewed 900,000 times, was captioned: “We must protect our children!” and shared by one of the most potent accounts for misinformation about the Southport stabbings.
AI technology has been used in other ways: an anti-immigration Facebook group illustrated a call to attend a rally in Middlesbrough with a generated image of a large crowd at the town’s cenotaph.
Platforms such as Suno – which uses AI to generate music complete with vocals and instruments – have been used to create songs combining references to Southport with xenophobic content. Titles include “Southport Saga”, featuring an AI female voice singing lyrics such as “hunt them down somehow”.
Experts have warned that new tools and ways of organising have seen Britain’s fractured far right exploit the Southport attack to unify and rejuvenate its presence on the streets.
In a surge of activity not seen for years, more than 10 protests are being promoted across social media platforms such as X, TikTok and Facebook in the aftermath of violent disorder up and down the country.
Death threats against the UK prime minister, incitement to attack government property and extreme antisemitism were also among the comments on the Telegram channel of one extreme-right outfit this week.
Amid fears of violence spreading, a leading counter-extremism thinktank warned there was a risk that the far right could achieve a mobilisation not seen since the English Defence League (EDL) burst on to the streets in the 2010s.
A new dimension has come with the advent of easily accessible AI tools, which extremists have been using to create material ranging from inflammatory images to songs.
Andrew Rogoyski, a director at the Institute for People-Centred AI at the University of Surrey, said advances – with image-generating tools now widely available online – mean “anyone can create anything”.
He added: “The ability for anyone to be able to create powerful imagery using generative AI is of tremendous concern. The onus then switches to providers of such AI models to strengthen the built-in guardrails within the model to make it harder to create such imagery.”
Joe Mulhall, the director of research at the campaign group Hope Not Hate, said the use of AI-generated material was nascent but reflected the growing overlap and collaboration online between a range of individuals and groups.
While far-right organisations such as Britain First and Patriotic Alternative remain at the forefront of mobilising and agitating, a range of individuals not affiliated to any particular group are just as important.
“These are made up of thousands of individuals who offer micro-donations of time and sometimes money to collaborate towards common political goals, completely outside traditional organisational structures,” Mulhall said. “These movements lack formal leaders but rather have figureheads, often drawn from an increasing selection of far-right social media ‘influencers’.”
The hashtag #enoughisenough has been used by some rightwing influencers to promote the protests, according to Joe Ondrak, a senior analyst at Logically, a UK company that monitors disinformation.
“What’s key is how that phrase and hashtag has been used with prior anti-migrant activism,” he said.
The use of bots was also highlighted by analysts. Tech Against Terrorism, an initiative launched by a branch of the UN, cited a TikTok account that began posting content only after the Southport attack on Monday.
“All posts were related to Southport, with most calling for protests near the attack site on 30 July. Despite having no previous content, the posts relating to Southport amassed over 57,000 cumulative views on TikTok alone within hours,” said a spokesperson. “This suggested that bot networks were actively promoting this material.”
A central role is being played by a constellation of individuals and groups around Tommy Robinson, the far-right activist who fled abroad earlier this week before a court hearing. Others include Laurence Fox, the actor turned rightwing activist who has been sharing misinformation in recent days, and conspiracy theory websites such as the Unity News Network (UNN).
On the channel that UNN operates on Telegram – a largely unmoderated messaging platform – commentators rejoiced in violence seen outside Downing Street on Wednesday. “I hope they burn it to the ground,” said one. Another called for the hanging of Keir Starmer, the prime minister, saying: “Starmer needs the Mussalini [sic] treatment.”
Among those spotted on the ground during riots in Southport were activists from Patriotic Alternative – considered one of the fastest-growing far-right groups in recent times. Other groups, including those who have split over their positions on conflicts such as the war in Ukraine, or Israel, have also sought to engage.
Dr Tim Squirrell, the director of communications at the Institute for Strategic Dialogue counter-extremism thinktank, said that the far right had been seeking to mobilise on the street in the past year, including on Armistice Day and at film screenings by Robinson.
“This is a febrile environment and is only exacerbated by the health of the online information environment which is the worst it’s been in recent years,” he said.
“Robinson remains one of the more effective organisers in the UK far right. However, we have also seen the rise of accounts, large and small, which curate news stories that appeal to anti-migrant and anti-Muslim sensibilities, and are unconcerned by spreading information that is unverified.
“There is a risk that this moment could be exploited to attempt to create street mobilisation that more closely resembles the 2010s.”