The success and influence of social media platforms lie in algorithms designed to appeal to the specific interests and needs of their users. We have accepted the idea that the collection of our data and information is not overly concerning; rather, it is a useful resource for serving us more of the content we love. Social media algorithms are built around one core objective: to promote content that will maximise user engagement. However, it has become noticeable that social media posts increasingly tap into our need for stimulus, our anxieties, and our negative, primal emotions, so that our disdain and hate ironically keep us clicking and engaging with posts, generating profit for these platforms.
You see, under the rhythms of capitalism, enforcing dominant and oppressive forces is essential to its flourishing. Social media essentially weaponises these rhythms: it not only condones but often promotes harmful behaviours and ideas that fortify hegemonic discourses, tapping into the vulnerability of young minds and, in turn, maximising user engagement and profit.
This becomes highly evident in the recent social dilemmas surrounding Andrew Tate’s prevalent presence on social media. Even though his content radically violates social media guidelines, platforms appear to do little to limit his spread or ban his accounts. Instead, they have propelled him into the mainstream, allowing clips of him to proliferate and actively promoting them to young users. Videos of him expressing extreme misogynistic, hyper-masculine and regressive ideas have been viewed 11.6 billion times on TikTok alone.
Many of these videos violate community guidelines, so why aren’t they being taken down? Because views generate profit for TikTok, which continues pushing them onto the feeds of young men, regardless of the real-world consequences that online misogyny can have.
We know what happens when violent internet misogyny goes unchecked; we saw it with Hunter Moore, who popularised revenge porn in 2012. We see it with incel forums, which have been churning out mass shooters at an alarming rate.
These videos not only encourage the engagement of young boys and, shockingly, adult men; they also feed off the interventions of female viewers who respond to his pernicious content, further inflating his position on social media platforms and maximising views and engagement.
The algorithm’s influence also becomes apparent in campaigns and content that emphasise socially prescribed gender roles, western beauty standards and internalised misogyny, trapping girls in a vicious cycle that tells them their worth lies in what others think about their bodies. Although these messages may not be explicit, they are transferred invisibly and subconsciously internalised. Such harmful narratives, which reward extreme and dangerous ideologies, are boosted and amplified by algorithms that give their content greater visibility and engagement, creating a fertile breeding ground for serving companies’ capitalist interests while reinforcing oppressive hegemonies.
Ultimately, social media platforms covertly popularise controversial and problematic ideals in their content to perniciously gain further engagement and interaction from users, which in turn translates to profit, regardless of the harmful influence and consequences on individuals' beliefs, attitudes and perspectives.
We need to hold social media platforms accountable for giving the loudest microphones to the most dangerous people.
ABOUT THE AUTHOR, GIGI:
With a keen interest in Property Economics and Business Law, I am in my first year of university after completing my HSC at school in Sydney last year.
I wanted to participate in the Youth Committee to contribute a contemporary perspective on safe and effective social media engagement for young people.