Written by PTC | Published November 14, 2023
Forty-one states and the District of Columbia are now suing Meta Platforms, Inc., for contributing to the youth mental health crisis by knowingly and deliberately designing addictive features into its Facebook and Instagram social media platforms.
A 2021 investigation by the Wall Street Journal turned up documents showing that Facebook had formed a team to study preteens and the business opportunities that targeting that demographic might present. One internal document stated, “Why do we care about tweens? They are a valuable but untapped audience.”
Facebook and other social media platforms become addictive because they provide a “social validation feedback loop.” Every time the user gets a notification that someone has liked or commented on one of their posts, they get a little hit of dopamine – the chemical in the brain most closely associated with feeling good. Social media companies ensure that kids get a regular supply of dopamine hits by using algorithms to increase engagement and keep kids locked on social media for hours at a time.
Social media algorithms are designed to curate and personalize content for users based on their preferences, behaviors, and interactions. While the exact workings of algorithms can vary between platforms, there are some common principles and factors that influence how social media algorithms operate:
User Engagement: Likes, Comments, and Shares
Algorithms often prioritize content that has higher engagement, such as posts with more likes, comments, and shares. Unfortunately for teens on social media, the more extreme the content is, the more engagement it is likely to generate – even if the engagement is largely negative. This increases the likelihood that children engaging with a topic will be fed more and more extreme content related to that topic.
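The engagement-first logic described above can be sketched in a few lines of code. This is a simplified illustration, not Meta's actual ranking formula (which is proprietary); the weights and post fields here are assumptions chosen only to show the principle. Note how a divisive post with few likes but heavy comments and shares outranks a well-liked but quieter one.

```python
# Illustrative sketch of engagement-weighted feed ranking.
# The weights below are hypothetical assumptions, not any platform's real formula.

def engagement_score(post):
    """Score a post by weighted engagement counts (assumed weights)."""
    return (1.0 * post["likes"]
            + 3.0 * post["comments"]   # comments weighted higher: they signal stronger reactions
            + 5.0 * post["shares"])    # shares spread content furthest, so weighted highest

def rank_feed(posts):
    """Order posts so the most-engaged-with content appears first."""
    return sorted(posts, key=engagement_score, reverse=True)

posts = [
    {"id": "a", "likes": 120, "comments": 4, "shares": 1},   # popular but calm post
    {"id": "b", "likes": 30, "comments": 40, "shares": 25},  # divisive post: fewer likes, heavy comments/shares
]
print([p["id"] for p in rank_feed(posts)])  # → ['b', 'a']
```

Even in this toy version, the system never asks whether the comments are supportive or hostile – all engagement counts the same, which is how largely negative reactions can still push content to the top of a teen's feed.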
Relevance: User Preferences & Content Type
Algorithms consider a user's past behavior, interactions, and preferences to predict what content they are likely to engage with. This includes the types of posts a user has interacted with in the past. Algorithms may prioritize certain types of content based on a user's historical engagement. For example, if a user frequently engages with videos, the algorithm may show them more video content. News headlines in recent years have demonstrated how this type of algorithm can feed depression and suicidal ideation in teens, with tragic results.
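The "show more of what the user already engages with" behavior described above can be sketched as follows. This is a minimal illustration under assumed data shapes (a list of past interaction types and posts tagged by type), not how any real platform implements personalization.

```python
# Illustrative sketch of preference-based personalization.
# Data shapes (interaction history, post "type" tags) are assumptions for illustration.
from collections import Counter

def preferred_type(history):
    """Infer the content type a user has engaged with most often."""
    return Counter(history).most_common(1)[0][0]

def personalize(posts, history):
    """Surface posts matching the user's most-engaged content type first."""
    fav = preferred_type(history)
    # False sorts before True, so matching posts move to the front
    return sorted(posts, key=lambda p: p["type"] != fav)

history = ["video", "photo", "video", "video"]  # user mostly watches videos
posts = [{"id": 1, "type": "photo"}, {"id": 2, "type": "video"}]
print([p["id"] for p in personalize(posts, history)])  # → [2, 1]
```

The feedback loop is visible even here: every interaction with a topic makes that topic more likely to appear next, so a teen who lingers on harmful content can be served more of it automatically.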
Relationships: Friends, Family & Close Connections
Content from users' friends, family, and close connections may be prioritized. Social media platforms aim to strengthen the sense of community and connection among users, but this can also backfire by feeding teens’ fear of missing out, leaving them feeling excluded from social gatherings and activities their friends are engaging in, or encouraging them to compare their lives with the curated content they see from friends and peers online.
Before you allow your teen to join any social media platform, it is important to sit down and talk with them about how these algorithms can distort their perception of reality and how harmful they can really be.