It’s All Fun and Games, Until It’s Not: The Risks
of Gamified Violent Extremism and How We Can Tackle It
by Petra Regeni, Research and Project Officer, Terrorism and Conflict group, RUSI Europe
11 November 2022
From Bratislava last month to Buffalo in mid-May, we continue to reckon with the emulation of video games in real-life violent terrorist attacks – highlighting, once again, the dangers of violent extremist actors co-opting gaming platforms and culture.
Bringing games into reality can be fun – who doesn’t love go-karting like in Mario Kart? But, as a string of gamified terror attacks has shown, some aspects of games are best left online. As with any online space, the growing popularity of video games heightens the risk of their being co-opted by malign actors. There is evidence that gaming culture – and not simply games themselves – is increasingly exploited by violent extremists to radicalise and recruit new members, sensationalising terrorism while desensitising users to violence. We have seen the emulation of games like Call of Duty and Halo in real-life violent attacks, and the use of memes popular among gaming communities by extremists in order to appear relatable to target audiences.
On 12 October, Slovakia was struck by a violent anti-LGBTQ attack that left two dead. The assailant, a 19-year-old Slovak teen, claimed to be inspired by past terrorists including the recent Buffalo shooter, was radicalised online, and spent the hours after his attack posting on Twitter and 4chan forums. The attack, although not gamified in reality, was inspired by previous gamified terror attacks and was subsequently gamified in the dark corners of far-/alt-right platforms, most notably 4chan, 8chan and other -chans.
So, what exactly is ‘gamification’? How does it fit into the landscape of (violent) extremism? And what are some key implications for preventing and countering violent extremism (P/CVE) efforts in the public and private sector?
With the expansion of technology in our everyday lives, distinguishing between ‘online’ and ‘offline’ radicalisation is virtually impossible. This makes ‘winning the screens of people’ as important as ‘winning their hearts and minds’ for terrorists and states alike. Online gaming communities are a combination of ‘screens’, ‘minds’ and ‘hearts’ that can be won; as such, gaming platforms and communities are increasingly being leveraged by violent extremists and terrorists. Gamification is one example of this. It is, in brief, ‘the use of game design elements in non-game contexts’. For terrorism, this means introducing game design elements into real-life actions and violent attacks, including game-like rankings, scores and badges for certain acts, and creating missions and leaderboards.
Gamification itself, however, is backed by behavioural science. It is employed by apps such as Duolingo and NikeFuel to solidify users’ engagement and commitment to the brand. A similar strategy has been adopted among extremists to desensitise target audiences by incorporating familiar aspects from alt-/far-right forums. This ranges from using coded slang and tropes, some originating on the alt-right and others appropriated from popular culture – such as ‘based’, ‘libtard’, ‘snowflake’, ‘femoid’ and ‘soy boy’ – to their infamous lexicon of memes. The convergence of the two spaces is apparent in the idolisation of previous far-right attacks and attackers, at times in the form of memes, as well as terrorists’ manifestos that are so filled with alt-/far-right references that they are likely unintelligible to outsiders.
Any reader who found half of the above paragraph confusing would not be alone; this is part of a wider phenomenon that research is now looking to understand. A recent paper on the Gamification of (Violent) Extremism, published by the RAN Policy Support programme, unpacks some of the many intricate ways in which extremism has been gamified. It examines the approaches used by assailants, reactions online and from policymakers, and the unconventional modus operandi – namely the livestreaming of violent acts – that may inspire subsequent imitations of gamified attacks.
In March 2019, the distressing reality of gamified terrorism struck Christchurch, New Zealand. The attack served – and still serves today – as a source of inspiration for future assailants employing similar elements of gamification, including the Poway Synagogue shooting; the attacks in El Paso, Texas, and Bærum, Norway; and the attack in Halle, Germany, all between March and October 2019. In all these cases, the attackers livestreamed, or attempted to livestream, their ‘first-person shooter game’ emulation attack on social media and so-called gaming(-adjacent) platforms including Discord and Twitch. Beyond such copycat attacks, online extremist discourse is increasingly gamified as assailants are assigned scores based on the ‘successful execution’ of attacks.
This twisted gamified discourse is very much present around the 12 October 2022 shooting in Bratislava. Mere hours after the attack, 4chan was rife with gamified chatter, including conversations with the attacker, Juraj Krajčík. In one thread, some users spoke disparagingly – ‘Did this dude really only kill 2 people? What a failure’, ‘Did he at least have the decency to livestream it?’ – while others, though far fewer, spoke with praise: ‘that’s the Slovakian high score’. The disturbing discourse highlights many users’ (at least those on 4chan forums) lack of awareness, or outright ignorance, regarding the real-life implications of attacks. Instead, for some, the attack was reduced to gaming rhetoric confined to the online world.
Research has also evidenced the idolisation of these attackers on online forums by far-right extremist and mainstream audiences alike. The ‘canonisation’ of terrorists like Anders Breivik as ‘Knight Justiciar’, Brenton Tarrant as ‘Saint Tarrant’, and the Kenosha assailant, Kyle Rittenhouse, as ‘Saint Kyle’, seeps into various online spheres – from 4chan, 8chan and other message boards to streaming platforms. Following this pattern, the Slovak shooter has already been named ‘Saint Krajčík’ across some -chan spheres.
It should be noted that gamification is merely one manifestation of the nexus of gaming and extremism. Others include the production of bespoke video games and modifications of existing games to incorporate and/or replicate attacks like Christchurch. This points to a growing ability of extremists to misuse online gaming and capitalise on the rapidly expanding capacities of technology to transcend an attack itself.
Despite a handful of gamified attacks in recent history, knowledge regarding the phenomenon remains nascent. Evidence-based research is urgently needed to understand the impacts of gamified extremism on online radicalisation, risk factors for engaging in real-life violence, and how it impacts audiences differently across the world and between demographics. Nonetheless, some media outlets and policymakers are already shaping public discourse and perspectives around these issues.
Gamified terrorism can make for flashy headlines, but it can also contribute to distorting the reality of gaming and giving attackers added notoriety. Certain reports vilify the gaming industry, even placing partial responsibility for terror acts on gaming companies. Furthermore, media stories sensationalising gamified terror, or publishing videos and manifestos, give a larger voice to hateful ideologies and greater notoriety to terrorists. In doing so, the media has the potential to fuel the stigmatisation of gaming communities – made up of nearly three billion people worldwide – and to amplify the ideologies of terrorists, which could play a role in inspiring subsequent attackers.
At the same time, policymakers are also hurtling toward new measures and solutions. This is understandable and comes with well-placed intentions. However, acting without evidence-based guidance is unlikely to generate effective policies that tackle the roots of the problem.
Attempts to use existing policies and legislative measures on gaming and gaming(-adjacent) platforms – such as content moderation, de-platforming, regulating algorithmic amplification, and rules governing end-to-end encrypted messaging – cannot tackle the root causes and symptoms of gamified terror on their own. Such measures only scratch the surface and apply ‘band-aid’ solutions; they may hinder the consumption of extremist content, but they cannot ‘address the wide range of factors that motivate individuals to act on extremist beliefs and attitudes’.
Moreover, most legislation has been designed to address extremism on social media rather than on video gaming or gaming(-adjacent) platforms – the latter including forum, thread and messaging sites designed for the gaming community. One reason is technical: audio-visual content requires advanced capabilities to be hashed rapidly and reliably – a hash being a unique digital fingerprint, in numerical form, of known terrorist and violent extremist content in its raw state – so that tech companies can better detect, track and prevent terrorist content online. As such, livestreaming platforms like Twitch and DLive face greater challenges in detecting illicit content in video than primarily text- and photo-based social media platforms.
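The hash-matching idea described above can be sketched in a few lines. This is a deliberately simplified illustration: industry hash-sharing systems use perceptual hashes (such as PDQ for images), which tolerate re-encoding and minor edits, whereas the cryptographic hash used here only matches byte-identical copies – which is precisely why video and livestreams are harder to police.

```python
import hashlib

def fingerprint(content: bytes) -> str:
    """Return a hex digest serving as the content's 'digital fingerprint'.

    Simplification: a SHA-256 digest stands in for the perceptual hashes
    real moderation systems use.
    """
    return hashlib.sha256(content).hexdigest()

# Mock database of fingerprints of known violent extremist content.
known_hashes = {fingerprint(b"known-extremist-clip")}

def is_known_content(upload: bytes) -> bool:
    """Flag an upload whose fingerprint matches the known-content database."""
    return fingerprint(upload) in known_hashes

print(is_known_content(b"known-extremist-clip"))  # exact copy -> True
print(is_known_content(b"slightly altered clip"))  # any byte change -> False
```

Note the limitation the sketch makes visible: any re-encoding of a video changes its bytes and defeats an exact-match fingerprint, which is one reason livestreamed attack footage is so hard to suppress once it circulates in modified forms.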
De-platforming can create other challenges too. Extremist actors have found ways to auto-create new profiles almost instantaneously if they are de-platformed – and it’s not rocket science to do this manually either. It is often unclear whether a post qualifies as extremist or not, placing it in a grey zone: is it just an angry kid ranting, or hate speech or extremist rhetoric? While de-platforming may diminish the spread of propaganda, it can push people on the fringes of radicalisation and those already radicalised – the very people we want to monitor and reach – towards more toxic and unregulated places. In parallel, gaming spaces become mere tools for extremists to radicalise and recruit new followers and funnel them off-platform to fringe spaces.
For a long time, public and private sectors have spurned cooperation, preferring to run solo rather than race together. It takes the inclusion of all actors – from policymakers, governments and researchers to practitioners, tech companies and Trust and Safety teams on gaming sites – to comprehensively combat online harms, especially ones leading to offline violence. One key aspect in achieving these aims, which may sound redundant but remains cardinal, is the ‘overwhelming need for more methodologically rigorous (empirical) research’.
Some examples of platforms where this research is happening are the recently formed Extremism and Gaming Research Network (EGRN) and the RAN Policy Support programme. RUSI Europe is a member of both networks, supporting the European Commission and EU member states in better understanding and tackling extremism and radicalisation.
As we continue to combat emerging threats like the gamification of terrorism and (violent) extremism, we need to bring all actors to the table. All parties want the same thing: to prevent the gamification of terrorism and combat the misuse of gaming and gaming(-adjacent) platforms. After all, the fun shared by gamers around the world should be the only thing radical about games.