Any story about terrorism usually begins with a chapter on radicalisation – on how individuals stumble upon something – a video, a picture, even a made-up story – that reshapes their worldview.
Here in Singapore, with the explosion of the Internet and the proliferation of smartphones, the means of radicalisation have become more diffuse and challenging to detect.
Fourteen of the 16 individuals issued terrorism-related orders over the past two years had been self-radicalised online, according to a recent report by the Internal Security Department.
Online platforms – including social media, communication apps and games – have been identified as key vectors for self-radicalisation and domestic terrorism threats in Singapore, providing spaces for recruitment and propaganda.
This is worrying, given that about 4.96 million people in Singapore use social media – around 85 per cent of the total population.
Prime Minister Lee Hsien Loong emphasised the continuing threat of radicalisation in his commentary last weekend. His call for vigilance should echo strongly in the online spaces that dominate Singaporeans' daily lives.
Rabbit Holes of Radicalisation
Video-streaming platforms have received significant attention in recent scholarship as a previously under-studied but highly effective space for radicalisation. YouTube, the most visible video-hosting platform, illustrates the challenges inherent in practising vigilance.
The platform is seen as a powerful potential vector for self-radicalisation, with 2 billion monthly active users and a recommendation algorithm that industry watchers say prioritises high traffic and user engagement.
This makes it easy for general audiences to be recommended controversial, divisive and misleading content, the platform's critics claim. They point to an acknowledgement by YouTube's Chief Product Officer in 2018 that more than 70 per cent of watch time is spent on videos the algorithm recommends.
Critics also highlight that the New Zealand government found that the 2019 Christchurch shooter had been at least partially radicalised on YouTube, and that YouTube was a known recruitment venue for terrorist groups like Islamic State.
Indeed, YouTubers who espouse their philosophies on social and political issues in professionally crafted long-form videos have become significant generators of radical ideology.
Many of these content creators capitalise on outrage at politically contentious issues – like unemployment or immigration policy – to galvanise radical sentiment and stoke popular unrest. Their rhetoric is often manipulative, playing fast and loose with the truth.
This video format has become a hallmark of politically extreme commentators and conspiracists, and one that scores highly on the YouTube algorithm's engagement metrics. These creators often form communities, consciously directing viewers to ideologically similar content in ways that train YouTube's algorithm to continue recommending such videos to consumers.