By Mokareoluwa A.
At a Tops Friendly Market store on a quiet Saturday morning in Buffalo, New York, on the 14th of May 2022, the world once again saw a repugnant, destructive display of the lethality that so often and so harrowingly accompanies alt-right ideology. Black people were targeted, hunted, and mowed down in a racially motivated attack — ten were left dead.
We must not make the mistake of thinking that this act of cruelty was unfounded; the Buffalo shooter live-streamed his entire racist rampage on the internet and left behind a ‘manifesto’ riddled with racist, antisemitic, and extremist drivel. Payton Gendron, the gunman, is yet another person radicalized online, yet another whose perception of the world was warped by the Alt-Right Pipeline, with devastating effects. Any attempt to curb the thorny, weed-like growth of the alt-right must, then, involve an attempt to understand the spreading phenomenon of online radicalization.
The term ‘Alt-Right Pipeline’ denotes a process in which an individual’s beliefs are incrementally changed through consistent exposure to, and immersion in, extremist media and discourse on the internet, culminating in a complete ideological shift towards those worldviews and, potentially, real-life acts of terror. (Not all extremist media is alt-right; the term refers to the alt-right alone.) It might begin with an introduction to certain ‘conservative’ principles, but it does not end there. As many people know, sites such as YouTube recommend videos related to what users have just watched, over and over again; despite the near-infinite inventory YouTube boasts, users’ pages are algorithmically personalized and geared towards the content they already consume. They are thus steered towards specific content, and in some cases towards extremist content.
This shift is a symptom of the basic human desire to explore interests and uncover ever more ‘truths’. The effect is that viewers are progressively led “down a rabbit hole of extremism” (Zeynep Tufekci, 2018), their thought and worldview almost completely reshaped by the pervasive nature of online media. Echo chambers form as such discourse leaves YouTube and finds reinforcement on sites such as Reddit and Discord; users are inundated, and inundate their peers, with extremism until it becomes the new normal and the old normal is rejected.
The Pipeline is such that jokes and memes normalize their toxic subject matter: the more they are seen and shared, hiding under thinly veiled irony, the less outrageous they become. Once they feel normal, the user acclimatizes to a certain level of extremism, grows numb to that degree of hatred, and pushes deeper into thought more dangerous and deadly than anything previously encountered. As this process carries on, users reach a mental state in which they can dehumanize anyone who does not sympathize with their rhetoric, whether Black people, Jewish people, feminists, Marxists, or others, and think of them, or vocally demean them, as lesser beings undeserving of rights or personhood. To them, only the ‘red-pilled’ are human.
We have seen such individuals carry out many acts of violence in recent times: the Christchurch mosque shootings, the 2018 massacre of Jewish worshippers at a Pittsburgh synagogue, and others. Yet there is no officially recognized term for individuals who participate in such incendiary discourse and online behavior before engaging in physical acts of violence. The people carrying out these attacks belong to no recognized groups; they are individuals operating independently, picking and killing their own targets. It can therefore appear as though they crawl out of the woodwork with no prior signs and commit unspeakable acts of terror. That perception is flawed; they have always been there, lurking along the darkly bordered lines of social media, perverted by those who came before them. Nurtured and festering within the toxic environments of far-right internet groups, such people might be considered destined to realize their hatred in the form of raising guns and swords against those whom they have been conditioned to hate. The depths of the internet are manufacturing monsters, and they are closer to the surface than we might all like to believe; the tendrils sweeping about the mouth of the pipeline affect us all, who may fail to look twice at slurs pushed across our touchscreens or memes shared at the expense of already-marginalized groups.
If these acts of violence that threaten our very way of life are to be stopped, and if the perennial loss of life is to meet an end, work must be done to disrupt the plots of these volatile individuals and to stop people from being drawn into the pipeline altogether.