Researchers at the University of Bath have identified signals in social media posts that can predict when someone posting on far-right forums is likely to go on to commit a terrorist act.
Posts relating specifically to logistics, operational planning (including knowledge about weapons and evading law enforcement) and violent action marked out individuals who would go on to perpetrate terror offences. These signals were evident up to four years before criminal action.
In the first study of its kind, published in Personality and Social Psychology Bulletin, the team compared posts of convicted far-right terrorists with posts from people holding far-right extremist views who had not gone on to commit violence offline. The majority of offenders were convicted in the United States (75%), with 20% convicted in the United Kingdom and the remaining 5% in Australia, Canada, and New Zealand.
The discussion of far-right ideology and the expression of hateful views actually decreased the probability that a user would mobilise to action.
Online Signals of Extremist Mobilization is published as senior US and UK police officers warn, in an interview with the BBC, that an increasing number of those turning to terrorism are driven by 'a fascination for violence, rather than ideological fanaticism'.
"Our research shows that we can identify people on social media who go on to commit extremist action by picking up on posts that are about acquiring know-how and developing the capability to commit terrorism," said Dr Olivia Brown, Associate Professor in Digital Futures at the University's School of Management and Deputy Director of the Bath Institute for Digital Security and Behaviour.
"This method can help to identify people who are genuinely dangerous and likely to cause physical harm, as opposed to those who are likely to confine their extremism to radical views and hate speech online."
The researchers spent a year compiling a unique database of over 200,000 social media posts from 2011 to 2019. These posts were from 26 individuals convicted of terrorism-related offences (mostly in the US and some in the UK) and 48 people who had shared extremist content on the far-right forums Iron March, Gab, and Discord but had not been convicted.
"Unfortunately, the sheer volume of extremist content online means that identifying people most likely to cause harm is like finding a needle in a haystack," said Dr Brown.
"We have pinpointed signals of risk to make the haystack smaller and the needle bigger, which can be used to prioritise monitoring resources on a smaller pool of people who we think are more likely to act.
"Of course, ideological content will still be a major concern to security services, but this is an additional technological tool, alongside existing resources, to differentiate between individuals who are likely to engage in terrorist action and those who are not."
Dr Brown is seeking funding to apply the methods of this research to the January 6 Capitol Building riots in the USA, to understand more about the mechanisms of mobilisation.
She is also working with law enforcement to look at social media posts in the context of online forums - examining group interactions and creating a tool to analyse risk within social networks.
The research was funded by the Centre for Research and Evidence on Security Threats (CREST).
Online Signals of Extremist Mobilization is published in Personality and Social Psychology Bulletin.