“We all use our mobile phones pretty much every hour of every day,” states Dr Olivia Brown, Associate Professor in the School’s Information, Decisions and Operations Division.
“The internet is completely ubiquitous in everything that we do, which means it’s now becoming more and more ubiquitous in everyday harms – whether that’s terrorism or whether it's slightly less extreme harm in terms of violent protests or hate speech.”
As such, Liv believes it’s more important than ever before to look at online data – not just in the context of preventing harm, but also in understanding how harm originates and spreads through social networking sites.
Liv’s research focuses on examining how people communicate online and what it can tell us about ‘mobilisation’, the process by which people move from online discussion to real-world action. Some of her recent work, published in summer 2024, compared social media posts from a sample of convicted right-wing extremists with posts from non-offenders who interacted in the same online spaces.
She and her fellow researchers found a number of signals that a user was moving towards committing offline violence. Surprisingly, the ideological content of users’ posts was less indicative of mobilisation than post length and the amount of punctuation used – longer posts and more frequent punctuation (such as exclamation marks and question marks) were more common among the convicted sample’s posts.
Clearest signs

The most telling signal that someone would take offline action, however, was explicit talk of operations and logistics. Liv explains: “It suggests that people are developing the efficacy and ability to actually engage in the action. So we would suggest a focus on that, [as part of] a holistic approach.”
She continues: “What has been the posting activity of that person over time? Do we have any information about their offline behaviour? Are there any risk factors in their day-to-day presentation? Have they violently offended before? Those sorts of things should be combined with these indicators that we find in online data.”
Such markers – as well as Liv’s wider understanding of the online far-right ecosystem – could be useful for intelligence agencies in identifying those most at risk of carrying out real-world offences. She is, however, emphatic that caution is key.
“We would never want to claim that our work could single out somebody as being a potential terrorist, for example,” she says. “I think there's always a really careful balance that has to be done in terms of how data is used and whether and where it's used appropriately.”
The paper garnered widespread attention, and Liv has worked with UK police and government agencies to share her insights. She was also interviewed by numerous media outlets about the role of social media in the far-right protests and riots that erupted across several UK cities in July and August 2024.
“Why is it that we see an incident like [the mass stabbing in] Southport happening and then we see this enormous increase in hateful rhetoric on social media?” she muses.
She identifies the situation around X – formerly known as Twitter – as a perfect storm.
While all platforms will have their darker, more harmful corners (“those actors will always exist; people will always look to leverage platforms to meet their ends”), the changes made to X’s algorithms and the reinstatement of several far-right figures on the platform have fuelled an echo chamber of increasing toxicity.
Her future research plans include carrying out a large-scale survey and focus groups to investigate public sentiment around the use of artificial intelligence in policing. How might people feel about forces using AI to counter, monitor and potentially even remove posts spreading disinformation around inflammatory current events, for example?
“We know that the idea of social media being monitored raises a lot of strong feelings for people.”
Digital developments

Liv is also Deputy Director of the newly launched Bath Institute for Digital Security and Behaviour. The Institute is an interdisciplinary research group, encompassing experts from across the University in management, psychology, computer science and social sciences – including Co-Director Professor Adam Joinson and others from the School of Management.
It aims to work with industry, government and academia to respond to the evolving security risks to – and from – digital technologies, and to create a safer digital society for the UK.
“My role within the Institute is very much about research and development, and how we can support that through our activities,” Liv explains.
Engagement with practice has been a passion of Liv’s throughout her career, and she is excited about the chance to build further connections through her work with the Institute.
“I think it’s going to provide a platform to really increase the awareness of the work that we're doing,” she concludes. “I'm really excited about having the opportunity to make Bath a university that comes to people's minds straight away when they think about digital security and behaviour.”