Social Media Algorithms and Their Threat to Children

Jackson Nielsen

We live in a digital age where potentially harmful content is only a few finger swipes away. This is true for anyone, of any age, who has access to the internet. To make matters worse, once a user views dangerous content, similar content is then served to them on a silver platter, further damaging the experience for young, developing minds. For this reason, the CEOs of the world's most powerful platforms, such as TikTok, Instagram, and Facebook, need to change how their algorithms function in order to create a safer online environment for our youth.

These social media platforms are ultimately businesses that seek to grow their user numbers while simultaneously increasing the amount of time each user spends on the application. To that end, they implement features designed to be addictive and keep the user engaged longer. If a child is constantly scrolling through the content served on their feed, the threat of the algorithm surfacing harmful content only increases. The searches these kids make may be innocent and purely for self-help purposes, such as "how to feel better about myself" or "inspirational quotes for when I'm sad," but the algorithms may twist these searches and begin displaying content glorifying eating disorders or videos promoting self-harm. Meta, which owns Instagram, studied the kinds of experiences kids were having on its application and found that "thirty-two percent of teen girls said that when they felt bad about their bodies, Instagram made them feel worse."

Furthermore, not only do these algorithms suggest harmful content to children, they can also lead online predators to their victims. In the same way that someone who likes cute animals will be shown an increasing number of cute animal videos, an online predator who searches for minors will continue to be shown content featuring underage kids, because the algorithm knows this is what they want to see. While explicit content involving children is illegal, it still exists, even on popular social media platforms. Such content may be flagged and placed behind a viewer discretion warning, but it was brought to the attention of Meta CEO Mark Zuckerberg that users could still view the explicit content with a single press of the "view content anyway" button. This is clearly morally wrong; exploiting children and showing this content to the world is a detriment both to our society and to the future of these social media platforms.

It should be clear that these social media platforms were created with good intentions: to provide a space where loved ones can connect, innocent pictures can be shared, and news can circulate to the masses. Unfortunately, these applications have evolved into a potentially harmful environment where children can view whatever they please unmonitored and online predators have their victims at their fingertips. The CEOs of these apps need to alter the way their algorithms function, a change that would lead to the protection of children and the decline of their exploitation.