9 Design Interactions Responsible For The Social Media Mental Health Crisis

by Circ Cular

What is it specifically about social media that increases anxiety, depression and suicide amongst teens? How can we map these negative feelings to the interactions users have on social platforms? Is it possible to quantify the level of toxicity in these platforms? This article aims to dissect the toxicity of how we interact online using the latest scientific research and my altruheuristics framework.

We’ve been told often that social media can be detrimental to our mental health. As a designer who advocates for mental health, I went on a journey to try and understand if we could dissect the specific design interactions that contribute towards this epidemic so that we can design more humane and ethical products.

I couldn’t find any information so my own research led me to conduct the first ever deep dive into the mapping of mental illness to everyday social media interactions. Here’s what I learned.

Firstly, the benefits.

Social media isn’t all bad. There are some advantages, including:

  • Access to other people’s health experiences and expert health information
  • Emotional support and community building
  • Self-expression and self-identity
  • Making, maintaining and building upon relationships

How does mental illness in social media manifest? Isn’t it just a modern day version of newspapers?

No. To understand why the toxicity of social media is much more pervasive, we need to understand how the stress and dopamine response works.

Firstly, the stress response. It’s natural, and a really good thing, for our stress response to activate when we come face to face with a lion on the savannah. Adrenaline courses through our veins, the sympathetic nervous system is in full drive, our heart rate is elevated and sweat is released through our glands as our body prepares for fight or flight by rushing as much blood as possible away from our digestive tract and towards our muscles.

This stress response happens on a smaller scale when we read a newspaper, maybe a few times a day. The problem isn’t the occasional activation of the response, however; it’s the chronic activation, and the 24/7 availability, that does the damage over time through mass consumption and constant updates from social media.

When we chronically activate our stress response through the consumption of outrageous news, it suppresses our immune system over time, leaving us vulnerable to catching some nasty diseases.

The same thing applies with dopamine. When we chronically stimulate our dopamine pathways through the consumption of social media, comparing ourselves to others, playing games or watching porn, it deceptively lowers our baseline over time, leaving us vulnerable to a depletion of pleasure and resulting in increased pain, anxiety and depression in the long term. This molecule follows Newton’s third law of physics: for every action there is an equal and opposite reaction.

If happiness is determined by expectations, then two pillars of our society — mass media and the advertising industry — may unwittingly be depleting the globe’s reservoirs of contentment. If you were an eighteen-year-old youth in a small village 5,000 years ago you’d probably think you were good-looking because there were only fifty other men in your village and most of them were either old, scarred and wrinkled, or still little kids. But if you are a teenager today you are a lot more likely to feel inadequate. Even if the other guys at school are an ugly lot, you don’t measure yourself against them but against the movie stars, athletes and supermodels you see all day on television, Facebook and giant billboards [1].

Which platforms are the biggest culprits of fostering mental illness?

Ranked by their net impact on young people’s health and wellbeing, the most toxic social platforms are Instagram, TikTok, Snapchat, Facebook and Twitter.

The vast majority of scientific research on the correlation between social media and mental illness points to a specific design component, ironically named the feed. The most popular social platforms in use today that contain the notorious feed component include:

  • Twitter and its copycats:
    – BlueSky
    – Mastodon
    – Truth Social
    – Threads
  • Facebook
  • Medium
  • Instagram
  • YouTube
  • LinkedIn
  • TikTok
  • Tinder
  • SnapChat
  • Reddit

Feeding the beast.

Social platforms are all designed around the same anatomy, even if the styling differs. The social feed contains cards, algorithmically placed in either a grid or list layout, with pagination. On both mobile and web, a card can contain short-form text, images, GIFs or videos. A profile page is also included, showcasing social status through a follower count.
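
To make that shared anatomy concrete, here is a minimal sketch of the feed structure in TypeScript. The field names are illustrative only, not any platform’s real schema.

```typescript
// A hypothetical model of the common feed anatomy described above.
// Names are illustrative, not any platform's actual data model.
type MediaKind = "text" | "image" | "gif" | "video";

interface Profile {
  handle: string;
  bio: string;
  followerCount: number;  // the status signal dissected later
  followingCount: number;
}

interface Card {
  id: string;
  author: Profile;
  kind: MediaKind;
  body: string;           // short-form text or a media URL
  reactions: number;      // likes, favorites, claps, matches
  createdAt: Date;
}

interface FeedPage {
  layout: "list" | "grid";
  cards: Card[];          // algorithmically ordered, rarely chronological
  nextCursor?: string;    // the hook that makes "infinite" pagination possible
}
```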

Why do social platforms insist on using the feed?

To ensure that the user stays on the platform, they have to engage with the product. If they get bored, they leave as there is no shortage of unoriginal copycat competitors to go to. The feed is where all the seemingly cool people hang out to spread the latest gossip. It’s all a status game.

The ultimate goal for any company is to generate revenue by any means necessary. The best predictor for ensuring that money keeps rolling in is the customer lifetime value metric. In simple terms, the longer the user is on the platform, the more likely they are to spend money or hand over more and more data to be sold to advertisers.
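
As a rough illustration, the metric can be sketched like this; both the simplified formula and the numbers are made up for the example, not any platform’s actual model.

```typescript
// Back-of-the-envelope customer lifetime value: the longer a user stays
// engaged, the more the business expects to earn from them.
// All numbers below are invented for illustration.
function customerLifetimeValue(
  adRevenuePerHour: number, // revenue earned while the user is on the platform
  hoursPerMonth: number,    // the engagement the feed is optimised to grow
  monthsRetained: number    // expected lifetime before the user churns
): number {
  return adRevenuePerHour * hoursPerMonth * monthsRetained;
}

// Doubling time-on-platform doubles the number the business optimises for.
console.log(customerLifetimeValue(0.1, 20, 36)); // 72
console.log(customerLifetimeValue(0.1, 40, 36)); // 144
```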

How does the feed deliver engagement?

Through the algorithmic exploitation of variable rewards along the dopamine pathways, a technique inspired by casinos. Gambling addiction works like this: the lever is pulled, the die is rolled, the card is revealed. All of these actions create a sense of excitement from the anticipation of a reward.

The molecule of more works like this. The goal is to feel pleasure through the consumption of something novel, as it’s hardwired into our DNA. The peak in pleasure does not arrive through the attainment of a reward, but from the anticipation. In other words, dopamine levels reach their peak not when we consume good content, but when we anticipate that good content will arrive in the near future. We keep scrolling and refreshing the screen. The feed is a mix of content that can be either good or bad, so it’s a goldmine of instantly gratifying anticipation and rewards just waiting to be plucked from the hat.
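
The loop can be sketched as a toy simulation; the 30% “hit rate” is an arbitrary illustrative value, not a measured figure.

```typescript
// A toy model of the variable-reward loop: every refresh is a lever pull
// with an unpredictable payoff, and the unpredictability is the hook.
function refreshFeed(hitRate = 0.3): "rewarding post" | "filler" {
  return Math.random() < hitRate ? "rewarding post" : "filler";
}

let pulls = 0;
let rewards = 0;
while (rewards < 5) {   // we keep scrolling until we feel we've had "enough"
  pulls++;
  if (refreshFeed() === "rewarding post") rewards++;
}
console.log(`${pulls} refreshes to land ${rewards} rewarding posts`);
```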

Novelty that keeps us engaged comes in many potent forms. The never ending delivery of outrageously polarizing news, social validation through reactions, and envy through the comparison of wealth and status.

This manipulative interaction primes us towards instant gratification and works against our capacity for delaying it. The delay in reward is where the real sense of satisfaction comes from.

Who are the most vulnerable?

Using data based on major depression from the USA since the inception of social media, those who tend to be affected are:

  • Generation Z most of all, as they are the first generation to have grown up with social media through adolescence, while their brains are still developing and most impressionable.
  • Millennials are also affected but on a lesser scale as we grew up before the inception of social media and lived through the golden age [2].
  • Females are more affected than males as they spend more time on social media.
  • Social platforms are intentionally terrible at verifying age so users can even be younger than 13. Facebook whistleblower Frances Haugen confirms that companies like Instagram are aware of this and have done nothing about it. The younger the user, the easier it is to get them hooked due to their developing brains.

How does social media feed mental illness?

There are 16 effects caused by the interaction design of the social feed, based on the research you can find in the Notes section at the end of this article. Each effect is grouped using an altruheuristic created to expose unethical interactions. More information about my altruheuristics framework can be found here. Bear in mind that all of these effects can eventually lead to anxiety and depression.

  1. Anxiety & Depression
    – General anxiety & depression leading to cognitive distortions, fear, bias exploitation and self-harm [3].
  2. Comparison
    – Comparison of self-image leading to low self-esteem.
    – Curated highlight reels.
    – Unhealthy levels of perfectionism with self.
    – Body image obsession leading to eating disorders & plastic surgery [4].
    – Polarization and filter bubbles.
    – Self identity modification.
  3. Exclusion
    – Fear of missing out, being excluded and feelings of loneliness [5].
    – Bullying and trolling [6].
  4. Dopamine Regulation
    – Sleep deprivation from overstimulation [7].
    – Addiction to novelty seeking.
    – Context switching leading to a lack of sustained focus.
  5. Status Play
    – Social status exploitation leading to emotional dysregulation.
  6. Truth
    – Reality distortion from algorithmic bias.
    – Misinformation and disinformation.
    – Divisiveness through the use of bots.

Dissecting the illusive social feed

The social feed is made of many parts and they each play a role towards the 16 mental illness effects outlined earlier. It’s not just the illusive design components that feed the illness however. Like a puppet, black box algorithms are the strings that control how the design components are presented and how the audience interacts with them.

1. Profile card

Altruheuristics

  1. Anxiety & Depression
    – General anxiety & depression leading to cognitive distortions, fear, bias exploitation and self-harm.
  2. Comparison
    – Comparison of self-image leading to low self-esteem.
    – Curated highlight reels.
    – Unhealthy levels of perfectionism with self.
    – Body image obsession leading to eating disorders & plastic surgery.
    – Polarization and filter bubbles.
    – Self identity modification.
  3. Status Play
    – Social status exploitation leading to emotional dysregulation.

The profile card is always signaling status through the follower count and bio. First of all, the follower/following concept breaks the mapping principle of interaction design, as we don’t follow people in real life the way a sheep follows the herd.

The count is used to signal trust and authority. It exploits our bias towards authority, so the larger the count, the more authority the account is perceived to have. It is a bias because a larger count does not necessarily mean that the authority is warranted.

Like the media, social platforms such as Twitter encourage provocative content through their engagement algorithm as they know it’s what gets the attention. The result is users being coerced into posting provocative content to receive engagement. Provocative content attracts provocative followers which provides the provocative account with validation and the retention cycle continues. This is how mob mentality and cancel culture arise.

A low count can bias us towards reading low status into an account, yet the account could simply be new, niche, inactive or refusing to play by the rules of engagement. Conversely, nothing signals ‘pretentious high status’ more than an account with thousands of followers and zero following, as no attempt at reciprocation is being made.

The comparison between our count and theirs can elicit negative emotions, lowering our status if our count is smaller and elevating it if our count is larger. The larger the count, however, the more engagement, likes, favorites, reposts and shares they receive, so their content is pushed to the top of the feed, meaning that most tweets come from accounts with a larger following than ours. This results in most comparisons being negative.
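
A simplified sketch of why this happens: if the feed ranks purely on engagement, and engagement roughly scales with audience size, the large accounts crowd out everyone else at the top. The figures are invented.

```typescript
// Ranking purely by engagement pushes large accounts to the top,
// so most of what we see invites an upward comparison.
interface RankedPost { author: string; followers: number; engagements: number; }

const candidates: RankedPost[] = [
  { author: "a friend",    followers: 300,       engagements: 12 },
  { author: "a celebrity", followers: 2_000_000, engagements: 45_000 },
  { author: "a brand",     followers: 500_000,   engagements: 8_000 },
];

const feedOrder = [...candidates].sort((a, b) => b.engagements - a.engagements);
console.log(feedOrder.map(p => p.author)); // [ 'a celebrity', 'a brand', 'a friend' ]
```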

This count, however, is illusive and does not entirely convey trust or authority. There are a few ways to artificially bolster the count to fake the signal of high status, trust and authority.

There are plenty of third-party services where a single payment lets follower farms boost the clout. The conspicuous accounts that use these services have large follower counts yet little engagement with their content.

There are those who partake in the ‘I’ll follow you if you follow me’ game. The subtle nature of this interaction eventually leads to someone unfollowing after some time. On the topic of reciprocation, there can be real-world consequences to unfollowing, unmatching or defriending someone, especially when it’s a close friend or relative.

Bots are prevalent throughout social platforms and can significantly boost the status of an account as well as wreak havoc on democracy. These fake users are reported to make up 19% of all interactions online, although some of them could be real users who are inactive or lurkers. Either way, not being able to tell the difference between a bot and an inactive user paints a misleading picture.

A quick summary of some of Musk’s follower data for those who haven’t read the more detailed report: Just over 42 percent of Musk’s more than 153 million followers have 0 followers. More than 40 percent have zero tweets posted on their account. Around 40 percent of Musk’s followers also follow less than 10 users. This points to a few things. Many of these accounts can be fake accounts or bots. Also, many of these accounts can simply belong to inactive users or people who set up an account and rarely if ever return [8].

When many of your followers are inactive and your content receives low engagement, it can lead to many cognitive distortions. Some of us become preoccupied with the growth of our count. The loss of a follower can lead to jumping to conclusions; because we attach our self worth to the count, we wonder what we did or question our value, when the real reason may simply have been the removal of a bot.

One death is a tragedy, a million deaths a statistic. — attributed to Joseph Stalin

The distortion of our reality through the use of bots, the manipulation involved in gaining followers, and the obsession, driven by the molecule of more, with accumulating more and more followers all prevent us from developing the compassion to view others as human. This illusive design component hides the humanity behind a number, enabling apathy by making us see others as a mere statistic. We should not compare our count to counts that are infested with bots, nor attach our self worth to it, as we fail to see past the million deaths.

2. Messages, comments and replies

Altruheuristics

  1. Exclusion
    – Fear of missing out, being excluded and feelings of loneliness.
    – Bullying and trolling.

An honorable mention. All social apps have a messaging or comments component where any user, or a set of users with the right privileges, can send a message. This is somewhat distinct from the feed, yet it is a goldmine for bullying and spam, especially on platforms that allow pseudonyms.

Social platforms like Reddit allow users to take on a new online identity other than their own. The online disinhibition effect enables actions and real-world consequences to be detached from the self. Like Harry Potter’s invisibility cloak, it raises the question: what would you do if you could be invisible? Similar to the follower count, reciprocating with those whom we cannot identify leads to a deficit in empathy and a rise in pathological behavior. Mere words read online can be interpreted as malicious as they lack expression, personality, tone and emotion.

The rank order and visibility of comments and replies can be affected by the voting system and prioritized or deprioritized based on verification without the user’s knowledge. Having your opinions deprioritized for not belonging to the exclusive verified club unsurprisingly leads to exclusion.
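
A hedged sketch of that kind of opaque re-ranking: votes set the base order, but an undisclosed multiplier for verified authors can quietly bury everyone else. The weighting is invented, not any platform’s documented behaviour.

```typescript
// Votes are visible; the verification boost is not. Weights are invented.
interface ThreadComment { author: string; verified: boolean; votes: number; }

function visibilityScore(c: ThreadComment): number {
  const verificationBoost = c.verified ? 2.0 : 1.0; // hidden from the user
  return c.votes * verificationBoost;
}

const thread: ThreadComment[] = [
  { author: "verified_brand", verified: true,  votes: 10 },
  { author: "regular_user",   verified: false, votes: 15 },
];

thread.sort((a, b) => visibilityScore(b) - visibilityScore(a));
console.log(thread[0].author); // "verified_brand", despite having fewer votes
```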

3. Upvoting

Altruheuristics

  1. Truth
    – Reality distortion from algorithmic bias.
    – Misinformation and disinformation.
    – Divisiveness through the use of bots.

The upvote and downvote design component has been a staple since the inception of the forum, where online communities gather and post engaging content. The system tries to mimic a democracy, where it appears that everyone has a fair chance at gaining visibility, yet it falls foul of misuse at the expense of those who seek genuine interactions.

The voting system can be rigged for those looking to seek social validation leading to feelings of social rejection and a distorted reality.

Voting platforms such as Reddit and ProductHunt have feeds that display the most trending or popular posts by default. Most of the content you see has been algorithmically cherry-picked from the survivors with high votes, giving the illusive bias that all posts can reach that standard, along with a feeling of social rejection. The reality is that most posts get little to no traction if you sort them by the ‘new’ filter. There are a few reasons why.
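
One well-known shape for this kind of “trending” ranking, similar in spirit to Hacker News’ published formula, shows how the default sort surfaces only the survivors. The constants here are illustrative, not any platform’s actual values.

```typescript
// Trending sorts reward recent, heavily upvoted posts and bury the rest.
// Constants are illustrative, similar in spirit to published "hot" formulas.
function hotScore(votes: number, ageHours: number): number {
  return (votes - 1) / Math.pow(ageHours + 2, 1.5);
}

console.log(hotScore(500, 3));  // a viral survivor: ~44.6
console.log(hotScore(4, 3));    // a typical new post: ~0.27
console.log(hotScore(500, 48)); // yesterday's hit, already sinking: ~1.41
```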

Illusive votes can be rigged through herd mentality, where the original poster gets others to vote on their content, pushing it up the queue and onto the front page. Content can also be downvoted for a myriad of trivial reasons unrelated to quality.

Timezones play an important role too. If the majority of users are from the USA, posting content when they are asleep won’t help with visibility. Servers can also reset at 12am, giving posts made just after the reset an unfair advantage and leaving those posted just before it with a limited window of visibility.

The original poster may also be competing with others with large followings, which can overshadow their own content. Moderators of these forums can delete posts at will if rules are violated. Something as trivial as a duplicate post can lead to content disappearing without reason or notice.

The voting system is not a democracy and can be rigged resulting in cognitive distortions through illusive design not visible to the user. A lack of transparency leads to a lack of trust.

4. Pagination

Altruheuristics

  1. Anxiety & Depression
    – General anxiety & depression leading to cognitive distortions, fear, bias exploitation and self-harm.
  2. Dopamine Regulation
    – Sleep deprivation from overstimulation.
    – Addiction to novelty seeking.
    – Context switching leading to a lack of sustained focus.

TikTok auto-paginates when a video is finished, YouTube does the same using a countdown with the addition of loading videos infinitely when scrolling. Tinder requires you to paginate through swiping. There is no end to the vortex of overstimulation.

Social platforms use a recommender system to suggest what to consume next, further increasing engagement. Data points for recommending content include your previous history, your friends’ history or what’s trending, with the help of AI.
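
A toy version of such a recommender might look like the sketch below; the signals match those just listed, but the weights are invented and real systems are far more opaque.

```typescript
// Score candidates on history similarity, friend activity and trend momentum.
// Weights are invented; the point is that the selection is automated.
interface Candidate {
  id: string;
  similarityToHistory: number; // 0..1, resemblance to what you already watch
  friendsWhoEngaged: number;   // how many of your contacts interacted with it
  trendingScore: number;       // 0..1, platform-wide momentum
}

function recommendNext(candidates: Candidate[]): Candidate {
  const score = (c: Candidate) =>
    0.6 * c.similarityToHistory +
    0.25 * Math.min(c.friendsWhoEngaged / 10, 1) +
    0.15 * c.trendingScore;
  return candidates.reduce((best, c) => (score(c) > score(best) ? c : best));
}

const next = recommendNext([
  { id: "calm-documentary", similarityToHistory: 0.2, friendsWhoEngaged: 1, trendingScore: 0.1 },
  { id: "outrage-clip",     similarityToHistory: 0.9, friendsWhoEngaged: 6, trendingScore: 0.8 },
]);
console.log(next.id); // "outrage-clip"
```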

These design interactions provide a never-ending chase for pleasure inevitably leading to a depletion of dopamine and elevated levels of anxiety and depression.

The constant switching of context between short videos, short image posts, short tweets and short GIFs fragments our attention, leading to more resistance when we need to sustain focus for long periods of time. Context switching with short-form content has led to many people being misdiagnosed with ADHD [9].

A limit must be set to offset the damage caused by illusive design.

5. Reactions

Altruheuristics

  1. Dopamine Regulation
    – Sleep deprivation from overstimulation.
    – Addiction to novelty seeking.
    – Context switching leading to a lack of sustained focus.
  2. Status Play
    – Social status exploitation leading to emotional dysregulation.

No other social component has been as transformative in how we interact online as the Facebook Like button. Our social status is elevated when we get that hit of dopamine from a like, reaction, favorite, clap or match.

The addition of a like provides a strong sense of social validation. The absence of a like however can lead to many cognitive distortions and feelings of social rejection and emotional dysregulation.

This illusive design component has the same effects as the voting system, as it can also be rigged, yet it has the full backing of the black box algorithm. Your reach could be limited due to deprioritization in favor of verified users or brands, timezone, soft shadow-banning or because Elon said so. Reactions can also be rigged by bots.

We tend to attach our self worth to these artificially designed illusive components. It is human nature to seek validation but not through interactions that are managed by a psychopathic algorithm.

6. Notifications

Altruheuristics

  1. Exclusion
    – Fear of missing out, being excluded and feelings of loneliness.
    – Bullying and trolling.
  2. Dopamine Regulation
    – Sleep deprivation from overstimulation.
    – Addiction to novelty seeking.
    – Context switching leading to a lack of sustained focus.

In-app notifications work similarly to reactions where activities mostly related to social validation such as being followed, reactions or replies are intermittently reinforced using a red dot.

It’s a double-edged sword: a notification bubble produces a dopamine hit, whereas the absence of the bubble can elicit the illusion of social rejection and distorted thoughts.

The molecule of more needs more hits, so refreshing the page, the equivalent of rolling the dice, pulling the lever or spinning the roulette wheel, creates peaks in anticipation that lead to repeated refreshing at ever higher rates. The absence of social validation from the bubble can lead to jumping to conclusions and catastrophizing; unbeknownst to the user, however, there could be many valid reasons for its absence unrelated to social rejection.
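
Intermittent reinforcement can also be engineered on the delivery side, as in the sketch below: events are held back and released at unpredictable intervals so each red dot lands as a fresh hit. The timings are arbitrary and this is not any platform’s documented behaviour.

```typescript
// Hold notifications back, then release them after a random delay.
// Timings are arbitrary; the unpredictability is the point.
function scheduleRelease(queued: string[]): { payload: string[]; delayMs: number } {
  const delayMs = (5 + Math.random() * 55) * 60_000; // 5 to 60 minutes, at random
  return { payload: queued.splice(0, queued.length), delayMs };
}

const pending = ["new follower", "3 likes on your post"];
const batch = scheduleRelease(pending);
console.log(`release ${batch.payload.length} notifications in ~${Math.round(batch.delayMs / 60_000)} min`);
```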

Intermittently dispersed notification rewards play directly into the hands of attention retention via the customer lifetime value metric. This form of social validation is illusive, and we can offset the damage by slowly limiting our exposure.

7. Short-form Content Cards

Altruheuristics

  1. Anxiety & Depression
    – General anxiety & depression leading to cognitive distortions, fear, bias exploitation and self-harm.
  2. Comparison
    – Comparison of self-image leading to low self-esteem.
    – Curated highlight reels.
    – Unhealthy levels of perfectionism with self.
    – Body image obsession leading to eating disorders & plastic surgery.
    – Polarization and filter bubbles.
    – Self identity modification.
  3. Exclusion
    – Fear of missing out, being excluded and feelings of loneliness.
    – Bullying and trolling.
  4. Dopamine Regulation
    – Sleep deprivation from overstimulation.
    – Addiction to novelty seeking.
    – Context switching leading to a lack of sustained focus.
  5. Status Play
    – Social status exploitation leading to emotional dysregulation.
  6. Truth
    – Reality distortion from algorithmic bias.
    – Misinformation and disinformation.
    – Divisiveness through the use of bots.

Never-ending short-form content, in the form of text, images or videos, combined with the ability to paginate automatically promotes impulsive context switching, where our focus is shifted from one outrageous dopamine hit to the next. The addictive nature of this habit makes it difficult to sustain the deep work required to experience meaning.

Short-form content such as tweets or TikTok videos enables cognitive distortions through the elimination of context, allowing users to jump to conclusions. The constant reinforcement of distorted thoughts can lead to elevated levels of anxiety and a very distorted view of the world. It’s also a goldmine for the molecule of more.

There’s a formula for designing YouTube thumbnails that get clicks, and it consists of displaying a surprised face. The element of surprise plays to our primitive need for novelty as it triggers a release of dopamine.

The limit of characters, size or duration coerces the user into crafting clickbait content in such a way that only the highlights are displayed. As users become acclimated to the outrage, more outrage is required to retain attention. Like the breaking news media headlines, there’s only so much space for text. So why not choose the most triggering words to grab attention?

Take stories of violence as an example. Headlines are quick to portray killers as cold and calculated, yet fail to explain the mental health issues lurking behind the motivation. It’s too much detail to fit into a headline, and it’s not as triggering because it doesn’t elicit curiosity through attention-grabbing apathy. The user is kept in a state of learned helplessness with a distorted worldview. The constant bombardment of breaking news exploits our availability bias, making us feel as if violence is happening everywhere, all the time.

Limiting our exposure can help us mitigate the damage.

8. AI Filtered Cards

Altruheuristics

  1. Anxiety & Depression
    – General anxiety & depression leading to cognitive distortions, fear, bias exploitation and self-harm.
  2. Comparison
    – Comparison of self-image leading to low self-esteem.
    – Curated highlight reels.
    – Unhealthy levels of perfectionism with self.
    – Body image obsession leading to eating disorders & plastic surgery.
    – Polarization and filter bubbles.
    – Self identity modification.
  3. Status Play
    – Social status exploitation leading to emotional dysregulation.
  4. Truth
    – Reality distortion from algorithmic bias.
    – Misinformation and disinformation.
    – Divisiveness through the use of bots.

Many social platforms allow beauty filters to be applied to videos and images without disclaimers, enhancing the face or body to extreme levels. Spots and wrinkles can be removed, eyes can be bigger and noses can be smaller, exaggerating the already unrealistic beauty standards set by celebrities and influencers [10].

Upward comparison is the thief of joy: it lowers our perceived social status, leading to self-consciousness, anxiety and eventually depression. Similar to the highlight reel, the feed portrays a filtered worldview, and we compare these highlights to our raw, unfiltered appearance.

These filters affect females more than males, setting unrealistic standards and distorting reality, damaging self-esteem over time. Filters are nothing more than an unregulated social experiment on those who are the most vulnerable. Comparison using these features is a primary cause of anxiety, depression and suicide. The only cure is to abstain.

9. Algorithmic Cards

Altruheuristics

  1. Anxiety & Depression
    – General anxiety & depression leading to cognitive distortions, fear, bias exploitation and self-harm.
  2. Comparison
    – Comparison of self-image leading to low self-esteem.
    – Curated highlight reels.
    – Unhealthy levels of perfectionism with self.
    – Body image obsession leading to eating disorders & plastic surgery.
    – Polarization and filter bubbles.
    – Self identity modification.
  3. Exclusion
    – Fear of missing out, being excluded and feelings of loneliness.
    – Bullying and trolling.
  4. Dopamine Regulation
    – Sleep deprivation from overstimulation.
    – Addiction to novelty seeking.
    – Context switching leading to a lack of sustained focus.
  5. Status Play
    – Social status exploitation leading to emotional dysregulation.
  6. Truth
    – Reality distortion from algorithmic bias.
    – Misinformation and disinformation.
    – Divisiveness through the use of bots.

The most illusive form of design is the design that you can’t see. The feed algorithm is invisible and it’s where most of the damage is concentrated. The feed was originally ordered chronologically but the level of engagement couldn’t compete with the AI-powered ranking system.
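
The switch described above, in miniature: the same posts ordered two ways. Chronological order is transparent; engagement order is whatever a hidden predicted-engagement score says it is. The data below is invented.

```typescript
// The same three posts, sorted chronologically and by predicted engagement.
interface FeedItem { id: string; postedMinutesAgo: number; predictedEngagement: number; }

const items: FeedItem[] = [
  { id: "friend's update",   postedMinutesAgo: 10,  predictedEngagement: 0.2 },
  { id: "outrage thread",    postedMinutesAgo: 300, predictedEngagement: 0.9 },
  { id: "influencer selfie", postedMinutesAgo: 90,  predictedEngagement: 0.7 },
];

const chronological = [...items].sort((a, b) => a.postedMinutesAgo - b.postedMinutesAgo);
const engagementRanked = [...items].sort((a, b) => b.predictedEngagement - a.predictedEngagement);

console.log(chronological.map(i => i.id));    // friend's update first
console.log(engagementRanked.map(i => i.id)); // outrage thread first
```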

The algorithm can cause a myriad of cognitive distortions and can modify your reality for the worse.

Like a puppet, black box algorithms are the strings that control how the design components are presented and how the audience interacts with them.

The algorithm feeds your addiction and learns your habits through clicks, watch time and channel subscriptions, and it keeps you hooked by playing on your confirmation bias, showing you what you want to see through echo chambers or filter bubbles. Whether it’s YouTube recommendations, Google SEO page ranking or the next TikTok video, the algorithm can be gamed through keyword bidding and hashtags.

The algorithm ignores the content that fails to grab engagement and pushes the survivors to the top of the queue. Overnight millionaires, supermodels, fitness models and bodybuilders are pushed into the limelight, while the top-1% genetics, steroid abuse, hours spent in the makeup room and production companies working behind the scenes are kept out of sight, giving the illusion of effortless instant gratification from these false gods.

Bots create engagement through provocation and the algorithm can’t discern between real and fake news. We shouldn’t need to place our wellbeing into the hands of a psychopath that cares not for our values and boundaries.

The algorithm can show you your friends’ highlights, making you think that you’re missing out and leaving you with feelings of social rejection.

If your opinion goes against the beliefs of the tyrannical administrators, you can find yourself either soft or hard shadow-banned without your awareness. The metric that powers the algorithm after all is customer lifetime value and not the customer wellbeing value.

Unfortunately, e-commerce companies lobbied successfully to have the age of “internet adulthood” set instead at 13. Now, more than two decades later, today’s 13-year-olds are not doing well. Federal law is outdated and inadequate. The age should be raised. More power should be given to parents, less to companies. — Jonathan Haidt

Suggested Design Solutions

All is not doom and gloom, however. Solutions proposed by scientists to help conquer the illusive design of the social feed are outlined here. The majority of social platforms have implemented few, if any, of these solutions.

  1. The introduction of a pop-up heavy usage warning on social media (a minimal sketch follows this list)
  2. Social media platforms to highlight when photos of people have been digitally manipulated
  3. Social media platforms to identify users who could be suffering from mental health problems by their posts and other data, and discreetly signpost to support
  4. Don’t promote unrealistic beauty standards
  5. Encourage positive self-affirmations
  6. Acknowledge and address mental health risks
  7. Congress should pass legislation compelling Facebook, Instagram, and all other social-media platforms to allow academic researchers access to their data. One such bill is the Platform Accountability and Transparency Act, proposed by the Stanford University researcher Nate Persily.
  8. Congress should toughen the 1998 Children’s Online Privacy Protection Act. An early version of the legislation proposed 16 as the age at which children should legally be allowed to give away their data and their privacy. Unfortunately, e-commerce companies lobbied successfully to have the age of “internet adulthood” set instead at 13. Now, more than two decades later, today’s 13-year-olds are not doing well. Federal law is outdated and inadequate. The age should be raised. More power should be given to parents, less to companies.
  9. Elevate user safety, health, and wellbeing in the culture and leadership of technology companies.
  10. Assess and address risks to users at the front end of product development.
  11. Continually measure the impact of products on user health and wellbeing and share data with the public.
  12. Recognize that the impact of platforms and products can vary from user to user, and proactively ensure that products designed for adults are also safe for children and adolescents.
  13. Allow users to provide informative data about their online experience to independent researchers.
  14. Directly provide researchers with data to enable understanding of (a) subgroups of users most at risk of harm and (b) algorithmic design and operation.
  15. Partner with researchers and experts to analyze the mental health impacts of new products and features in advance of rollout. Regularly publish findings.
  16. Allow a broad range of researchers to access data and previous research instead of providing data access to a privileged few.
  17. Take a holistic approach to designing online spaces hospitable to young people.
  18. Limit children’s exposure to harmful online content.
  19. Give users opportunities to control their online activity, including by opting out of content they may find harmful.
  20. Develop products that actively safeguard and promote mental health and wellbeing.
  21. Promote equitable access to technology that supports the wellbeing of children and youth.
  22. Co-create a code of ethics. First, visual modifications must be clearly labelled, so that consumers can easily distinguish real features from those that have been augmented. And second, as with risks associated with any sort of product, brands should explicitly specify the potentially harmful effects of their products on users’ psychological wellbeing.
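
As a minimal sketch of the first suggestion above, a heavy-usage warning could be as simple as tracking session time against a daily threshold. The two-hour limit echoes the heavy-use threshold mentioned in the Notes; the wording and API here are hypothetical.

```typescript
// Surface a gentle warning once daily usage passes a threshold.
// The two-hour figure mirrors the "heavy use" threshold cited in the Notes.
const DAILY_LIMIT_MS = 2 * 60 * 60 * 1000;

function heavyUsageWarning(todayUsageMs: number): string | null {
  return todayUsageMs >= DAILY_LIMIT_MS
    ? "You've been scrolling for over two hours today. Time for a break?"
    : null;
}

console.log(heavyUsageWarning(2.5 * 60 * 60 * 1000)); // warning shown
console.log(heavyUsageWarning(30 * 60 * 1000));       // null, no prompt
```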

My Solution

I’ve developed the altruheuristics framework, which can be used alongside the Heuristic Evaluation Workbook by NNGroup. Design stakeholders working with low UX maturity can integrate this framework, which uses biology and CBT to expose unethical design patterns where existing principles can’t. This framework is 80% more effective than dark patterns in exposing unethical design. Unethical interactions can be evaluated after wireframes, user flows, prototypes or mockups. The earlier the better, so get the free guide here.
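
As a sketch of how a finding from such an evaluation might be recorded alongside a standard heuristic review, the shape below is hypothetical; the category names mirror the six groupings used in this article, and the severity scale follows the familiar 0-4 convention.

```typescript
// A hypothetical record for logging an altruheuristic finding.
type Altruheuristic =
  | "Anxiety & Depression" | "Comparison" | "Exclusion"
  | "Dopamine Regulation" | "Status Play" | "Truth";

interface Finding {
  component: string;            // e.g. "profile card", "notification badge"
  heuristic: Altruheuristic;    // which grouping the issue falls under
  severity: 0 | 1 | 2 | 3 | 4;  // 0 = not a problem, 4 = catastrophic for wellbeing
  note: string;
}

const exampleFinding: Finding = {
  component: "follower count",
  heuristic: "Status Play",
  severity: 3,
  note: "Count invites upward comparison and ties self-worth to a number.",
};
```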

Takeaways

It’s clear from the lack of collaboration with researchers that social platforms are hiding the true nature of their impact. Regulations are being introduced to slow the damage, although much of it may already have been done. Lobbyists have set the internet adulthood age to 13, which allows social platforms to prey on those who are most vulnerable, and this needs to be reversed. A lot of the research uses a broad definition of social media, making it difficult to isolate the problem areas and risking false conclusions. All of these illusive components combined in a feed give rise to chronic overstimulation.

The whole is greater than the sum of its parts  —  Aristotle.

One surprising takeaway is that there wasn’t any mention of dopamine depletion in the research that I reviewed. The depletion of dopamine does result in a lack of pleasure, leading to an increase in pain. Using these products at first feels great, but the danger lies at the end of the scale where acute pleasure becomes chronic, as there is no balance. The consequences of constant overstimulation are anxiety and depression, so I’ll assume that it’s a given.

On the surface level, it appears as if the design is what is responsible for the illness. However, upon looking at the evidence that I have presented, every illusive design interaction mentioned is influenced by an invisible algorithm built either on the backend (business logic) or frontend (bots). The design is what bridges the gap between the two.

A human lacking in empathy is labeled a psychopath and algorithms lack empathy. When the user experience is controlled by an algorithm incapable of experiencing empathy, we end up with apathy driving the user experience and not compassion.

We designers design for empathy, yet we have no power as decision makers. The design field should be outraged that its craft is being abused, but it’s choosing to stay silent. The world leaders in research-based user experience, NNGroup and the W3C, don’t even include information about the dangers of social media. It takes a tremendous amount of courage to find your voice when you have no power.

Scientists are having a hard time with causality, meaning they can’t tell whether mental illness is caused by social media or whether those who are mentally ill use social media more. It seems to be a self-reinforcing cycle; however, here are a few points that lean towards social media being the culprit.

  • Sometimes comparing ourselves with others, short-term stress, and short dopamine bursts can be good. Chronic exposure to these effects however can be detrimental over time.
  • Those who abstain from or significantly reduce their exposure to social media show a large improvement in their wellbeing.
  • Social platforms are intentionally designed for chronic stimulation, just as casinos remove windows and use bright lights and colors as part of the experience.

Thanks so much for reading. If you’ve enjoyed this article and feel that you’ve learned something of value, please share it so that others can learn from it too.

I’m working on a humane social platform that’s backed by ethical science designed to help create deeper value-driven connections. Visit circlo.com and sign up to our newsletter to get early access and more articles like this.

I’m making it my mission to fix the social media mental health epidemic caused by illusive design. As a designer, I feel it’s a moral obligation to design ethical products that do not exploit young minds.

Notes

The evidence presented here isn’t exhaustive, but it illustrates the common patterns I found when mapping the social feed to mental illness. I’ve provided references and extracts of important notes from numerous research papers, reports, essays and books, grouped by resource.

  1. The first is that social media presents “curated” versions of lives, and girls may be more adversely affected than boys by the gap between appearance and reality. Many have observed that for girls, more than for boys, social life revolves around inclusion and exclusion. Social media vastly increases the frequency with which teenagers see people they know having fun and doing things together, including things to which they themselves were not invited. While this can increase FOMO (fear of missing out), which affects both boys and girls, scrolling through hundreds of such photos, girls may be more pained than boys by what Georgetown University linguistics professor Deborah Tannen calls “FOBLO” — fear of being left out. When a girl sees images of her friends doing something she was invited to do but couldn’t attend (missed out), it produces a different psychological effect than when she is intentionally not invited (left out). And as Twenge reports, “Girls use social media more often, giving them additional opportunities to feel excluded and lonely when they see their friends or classmates getting together without them.” The number of teens of all ages who feel left out, whether boys or girls, is at an all-time high, according to Twenge, but the increase has been larger for girls. From 2010 to 2015, the percentage of teen boys who said they often felt left out increased from 21 to 27. For girls, the percentage jumped from 27 to 40. Another consequence of social media curation is that girls are bombarded with images of girls and women whose beauty is artificially enhanced, making girls ever more insecure about their own appearance. It’s not just fashion models whose images are altered nowadays; platforms such as Snapchat and Instagram provide “filters” that girls use to enhance the selfies they pose for and edit, so even their friends now seem to be more beautiful. These filters make noses smaller, lips bigger, and skin smoother. This has led to a new phenomenon: some young women now want plastic surgery to make themselves look like they do in their enhanced selfies [11].
  2. Given the breadth of correlational research linking social media use to worse well-being, we undertook an experimental study to investigate the potential causal role that social media plays in this relationship. METHODS: After a week of baseline monitoring, 143 undergraduates at the University of Pennsylvania were randomly assigned to either limit Facebook, Instagram and Snapchat use to 10 minutes, per platform, per day, or to use social media as usual for three weeks. RESULTS: The limited use group showed significant reductions in loneliness and depression over three weeks compared to the control group. Both groups showed significant decreases in anxiety and fear of missing out over baseline, suggesting a benefit of increased self-monitoring [12].
  3. Few people realize it, but a chilling 19 percent of interactions on social media are already between humans and bots, not humans and humans. Studies based on statistical modeling of social media networks have found that these bots only need to represent 5 to 10 percent of the participants in a discussion to manipulate public opinion in their favor, making their view the dominant one, held by more than two-thirds of all participants. When those on the powerful fringe — the Mrs. Salts of the world — enforce a position that doesn’t reflect reality, join together with general ignorance, or harness the silent support of all those waiting around to see which way the wind will blow, they can rapidly solidify into a distorted, hurricane-strength social force. Wielding the influence of a majority with the true support of a meager few, the resulting collective illusion harnesses crowd power to entrap us in a dangerous spiral of silence [13].
  4. Research suggests that young people who are heavy users of social media — spending more than two hours per day on social networking sites such as Facebook, Twitter or Instagram — are more likely to report poor mental health, including psychological distress (symptoms of anxiety and depression). Seeing friends constantly on holiday or enjoying nights out can make young people feel like they are missing out while others enjoy life. These feelings can promote a ‘compare and despair’ attitude in young people. Individuals may view heavily photoshopped, edited or staged photographs and videos and compare them to their seemingly mundane lives. The findings of a small study, commissioned by Anxiety UK, supported this idea and found evidence of social media feeding anxiety and increasing feelings of inadequacy. The unrealistic expectations set by social media may leave young people with feelings of self-consciousness, low self-esteem and the pursuit of perfectionism which can manifest as anxiety disorders. Use of social media, particularly operating more than one social media account simultaneously, has also been shown to be linked with symptoms of social anxiety.
  5. The sharing of photos and videos on social media means that young people are experiencing a practically endless stream of others’ experiences that can potentially fuel feelings that they are missing out on life — whilst others enjoy theirs — and that has been described as a ‘highlight reel’ of friends’ lives. FoMO has been robustly linked to higher levels of social media engagement, meaning that the more an individual uses social media, the more likely they are to experience FoMO [14].
  6. AR overlays are often used to alter a consumer’s appearance. This may seem harmless enough, but physical appearance is a key component of identity and as such it can have a substantial impact on psychological well-being. Studies have shown that virtually modifying appearance can provoke anxiety, body dysmorphia, and sometimes even motivate people to seek cosmetic surgery.
  7. On the other hand, for the participants who were already happy with their appearances, seeing their faces with realistic modifications made them feel less certain about their natural looks, shaking their typical self-confidence. In a follow-up survey, we found that when the AR filter increased the gap between how participants wanted to look and how they felt they actually looked, it reduced their self-compassion and tolerance for their own physical flaws [15].
  8. The crisis is not a result of changes in the willingness of young people to self-diagnose, nor in the willingness of clinicians to expand terms or over-diagnose. We know this because the same trends occurred, at the same time, and in roughly the same magnitudes, in behavioral manifestations of depression and anxiety, including hospital admissions for self-harm, and completed suicides… how sharp and sudden the increase has been for hospital admissions for teen girls who had intentionally harmed themselves, mostly by cutting themselves.
  9. The correlation of childhood lead exposure and adult IQ is r = .11, which is enough to justify a national campaign to remove lead from water supplies. These correlations are smaller than the links between mood disorders and social media use for girls. Gotz et al. note that such putatively “small” effects can have a very large impact on public health when we are examining “effects that accumulate over time and at scale”, such as millions of teens spending 20 hours per week, every week for many years, trying to perfect their Instagram profiles while scrolling through the even-more perfect profiles of other teens.
  10. Teens often say that they enjoy social media while they are using it — which is something heroin users are likely to say too. The more important question is whether the teens themselves think that social media is, overall, good for their mental health. The answer is consistently “no.” Facebook’s own internal research, brought out by Frances Haugen in the Wall Street Journal, concluded that “Teens blame Instagram for increases in the rate of anxiety and depression … This reaction was unprompted and consistent across all groups.” [16]
  11. In particular, YouTube was highlighted as a “rabbit hole”, with the seemingly infinite stream of content enabling people to watch one video after another without making a conscious decision to do so. YouTube’s Marketing Director, Rich Waterworth, confirmed to us that around 70% of the time people spend on YouTube is spent watching videos that have been ‘recommended’ to them by the platform’s algorithms, rather than content they have actively searched for.
  12. The effects of engagement metrics sit alongside the fact that platforms such as Snapchat and Instagram use augmented reality technologies to provide users with image-enhancing filters that can be applied to photos before they are shared. There are concerns that ‘beautifying’ filters can have a potentially negative impact on self-esteem, and Instagram was ranked worst by young people surveyed by the Royal Society of Public Health for its effect on body image. These filters allow users to make their lips appear fuller, hips rounder or waist narrower in order to conform to others’ pre-conceived ideas of physical beauty. It is also argued that this can lead to body dysmorphic disorder, with cosmetic surgeons reporting that patients are increasingly bringing ‘filtered’ pictures of themselves to consultations, despite such images being unobtainable using surgical procedures. Vishal Shah told us that the company takes the issue of body dysmorphia “seriously”.
  13. In the digital economy, data, design and monetization are inextricably tied. The 5Rights Foundation observed in written evidence that the design strategies of online platforms “are based on the science of persuasive and behavioral design, and nudge users to prolong their engagement or harvest more of their data.
  14. Evidence demonstrates how some games, as well as social media platforms, use psychologically powerful design principles similar to those used in the gambling industry. Dr Mark Griffiths’s written evidence highlights how the structural characteristics of video games — the features that induce someone to start or keep playing — resemble those employed to keep people using gambling products, such as “high event frequencies, near misses, variable ratio reinforcement schedules, and use of light, colour, and sound effects.” Match-three puzzle games such as King’s Candy Crush Saga demonstrate reward mechanics in action: for example, the game rewards players with incentives, such as pop-up motivational slogans or free ‘spins’ that offer another random chance to win ‘boosters’ to enhance game-play, at random intervals.
  15. Similarly, design mechanics that encourage people to stay on, or return to, social media platforms include pop-up notifications delivered at random intervals; a lack of “stopping cues” to prevent people from reflecting on how long they have been using an application; and deliberately structuring menus or pages to nudge people into making choices that the platform favors.
  16. A slot machine handles addictive qualities by playing to a specific kind of pattern in a human mind. It offers a reward when a person pulls a lever. There is a delay, which is a variable — it might be quick or long. The reward might be big or small. It is the randomness that creates the addiction [17].
  17. Many researchers argue that digital technologies can expose children to bullying, contribute to obesity and eating disorders, trade off with sleep, encourage children to negatively compare themselves to others, and lead to depression, anxiety, and self-harm. Several studies have linked time spent on social media to mental health challenges such as anxiety and depression [18].
  18. Instagram has grown in popularity among young adults and adolescents and is currently the second-favorite social network in the world. Research on its relationship to mental well-being is still relatively small and has yielded contradictory results. This study explores the relationship between time spent on Instagram and depressive symptoms, self-esteem, and disordered eating attitudes in a nonclinical sample of female Instagram users aged 18–35 years. In addition, it explores the mediating role of social comparison. A total of 1172 subjects completed a one-time-only online survey. Three different mediation analyses were performed to test the hypotheses that social comparison on Instagram mediates the association time spent on Instagram with depressive symptoms, self-esteem, and disordered eating attitudes. All three models showed that the relationship between intensity of Instagram use and the respective mental health indicator is completely mediated by the tendency for social comparison on Instagram.
  19. Researchers inside Instagram, which is owned by Facebook, have been studying for years how its photo-sharing app affects millions of young users. Repeatedly, the company found that Instagram is harmful for a sizable percentage of them, most notably teenage girls, more so than other social-media platforms. In public, Facebook has consistently played down the app’s negative effects, including in comments to Congress, and hasn’t made its research public or available to academics or lawmakers who have asked for it. In response, Facebook says the negative effects aren’t widespread, that the mental-health research is valuable and that some of the harmful aspects aren’t easy to address [19].
  20. Body image is an issue for many young people, both male and female, but particularly females in their teens and early twenties. As many as nine in 10 teenage girls say they are unhappy with their body. There are 10 million new photographs uploaded to Facebook alone every hour, providing an almost endless potential for young women to be drawn into appearance-based comparisons whilst online. Studies have shown that when young girls and women in their teens and early twenties view Facebook for only a short period of time, body image concerns are higher compared to non-users. One study also demonstrated girls expressing a heightened desire to change their appearance such as face, hair and/or skin after spending time on Facebook. Others have suggested social media is behind a rise in younger generations opting to have cosmetic surgery to look better in photos, which has implications for physical health through unnecessary invasive surgery. Around 70% of 18–24 years olds would consider having a cosmetic surgical procedure [20].
  21. Many quantitative studies have supported the association between social media use and poorer mental health, with less known about adolescents’ perspectives on social media’s impact on their mental health and wellbeing. This narrative literature review aimed to explore their perspectives, focusing on adolescents aged between 13 and 17. It reviewed qualitative studies published between January 2014 and December 2020, retrieved from four databases: APA Psychinfo, Web of Science, PubMed and Google Scholar. The literature search obtained 24 research papers. Five main themes were identified: 1) Self-expression and validation, 2) Appearance comparison and body ideals, 3) Pressure to stay connected, 4) Social engagement and peer support and 5) Exposure to bullying and harmful content. This review has highlighted how social media use can contribute to poor mental health — through validation-seeking practices, fear of judgement, body comparison, addiction and cyberbullying. It also demonstrates social media’s positive impact on adolescent wellbeing — through connection, support and discussion forums for those with similar diagnoses. Future research should consider adolescent views on improvements to social media, studying younger participants, and the impact of COVID-19 on social media use and its associated mental health implications [21].
  22. The rise in popularity of instant messaging apps such as Snapchat and WhatsApp can also become a problem as they act as rapid vehicles for circulating bullying messages and spreading images [22].
  23. Cyberbullying can take many forms including the posting of negative comments on pictures and directed abuse via private messages. Almost all social networking sites have a clear anti-bullying stance. However, a national survey conducted by Bullying UK found that 91% of young people who reported cyber bullying said that no action was taken [23].
  24. Where teens and young adults are constantly contactable, and deal with online peer pressure [24].

References

  1. Harari, Y. (2011). Sapiens: A Brief History of Humankind. [online] Vintage, p498. Available at: https://www.goodreads.com/book/show/23692271-sapiens [Accessed 09 Jan. 2022].
  2. Google Docs. (n.d.). Adolescent mood disorders since 2010. [online] p6. Available at: https://docs.google.com/document/d/1diMvsMeRphUH7E6D1d_J7R6WbDdgnzFHDHPx9HXzR5o/edit?usp=sharing [Accessed 13 Sep. 2023].
  3. StatusOfMind — Social media and young people’s mental health and wellbeing. (n.d.). [online] p8, https://www.rsph.org.uk/. Royal Society for Public Health. Available at: https://www.rsph.org.uk/static/uploaded/d125b27c-0b62-41c5-a2c0155a8887cd01.pdf.
  4. Royal Society for Public Health, p10.
  5. Royal Society for Public Health, p12.
  6. Royal Society for Public Health, p11.
  7. Royal Society for Public Health, p10.
  8. Binder, M. (2023). Elon Musk’s army of inactive followers paints a bleak picture of X as a whole. [online] Mashable. Available at: https://mashable.com/article/elon-musk-inactive-followers-whole-x-platform [Accessed 13 Sep. 2023].
  9. Huberman, A. (n.d.). ‎Huberman Lab: Maximizing Productivity, Physical & Mental Health with Daily Tools on Apple Podcasts. [online] Apple Podcasts. Available at: https://podcasts.apple.com/gb/podcast/huberman-lab/id1545953110?i=1000528586045 [Accessed 10 Aug. 2023].
  10. Ryan-Mosley, T. (2021). Beauty Filters Are Changing the Way Young Girls See Themselves. [online] MIT Technology Review. Available at: https://www.technologyreview.com/2021/04/02/1021635/beauty-filters-young-girls-augmented-reality-social-media/.
  11. Haidt, J. (2018). The Coddling of the American Mind: How Good Intentions and Bad Ideas Are Setting Up a Generation for Failure. [online] Penguin Books, p154. Available at: https://www.goodreads.com/en/book/show/36556202 [Accessed 17 May 2023].
  12. Google Docs. (n.d.). Social Media and Mental Health. [online] Available at: https://docs.google.com/document/d/1w-HOfseF2wF9YIpXwUUtP65-olnkPyWcgF5BiAtBEy0/edit?usp=sharing [Accessed 13 Sep. 2023].
  13. Rose, T. (2022). Collective Illusions: Conformity, Complicity, and the Science of Why We Make Bad Decisions. [online] Hachette Go, p68. Available at: https://www.goodreads.com/en/book/show/58340695 [Accessed 17 May 2023].
  14. Royal Society for Public Health, p8.
  15. Javornik, A., Marder, B., Pizzetti, M. and Warlop, L. (2021). Research: How AR Filters Impact People’s Self-Image. Harvard Business Review. [online] 22 Dec. Available at: https://hbr.org/2021/12/research-how-ar-filters-impact-peoples-self-image.
  16. Haidt, J. (2022). Teen Mental Health Is Plummeting, and Social Media is a Major Contributing Cause. [online] United States Senate Committee on the Judiciary. Available at: https://www.judiciary.senate.gov/imo/media/doc/Haidt Testimony.pdf [Accessed 13 Sep. 2023].
  17. Immersive and addictive technologies Fifteenth Report of Session 2017–19 Report, together with formal minutes relating to the report. (2019). Available at: https://publications.parliament.uk/pa/cm201719/cmselect/cmcumeds/1846/1846.pdf.
  18. Murthy, V. (n.d.). Protecting Youth Mental Health. [online] U.S. Department of Health and Human Services. Available at: https://www.hhs.gov/sites/default/files/surgeon-general-youth-mental-health-advisory.pdf [Accessed 13 Sep. 2023].
  19. Social Media and Mental Health, p103.
  20. Royal Society for Public Health, p10.
  21. Social Media and Mental Health, p197.
  22. Royal Society for Public Health, p11.
  23. Royal Society for Public Health, p12.
  24. Royal Society for Public Health, p8.