As of 2021, there are over 3.78 billion social media users worldwide, with each person averaging 145 minutes of social media use per day. And in those hours spent online, we’re beginning to see the harmful impact on mental health: loneliness, anxiety, fear of missing out, social comparison, and depression.
Social media has undoubtedly integrated itself into society, but the question remains of how to negotiate our relationship with it. Nina Vasan, clinical assistant professor of psychiatry at Stanford and founder and executive director of Brainstorm: The Stanford Lab for Mental Health Innovation, and Sara Johansen, resident psychiatrist at Stanford and director of clinical innovation at Stanford Brainstorm, explored possible answers to that question during a Stanford Institute for Human-Centered AI seminar, outlining the impact of social media on mental health, the psychological underpinnings of social media addiction, and opportunities to mitigate risk and promote wellbeing. Dr. Vasan and Dr. Johansen have worked with platforms such as Pinterest and TikTok to design and implement more empathic user experiences.
What makes social media so addictive?
Variably rewarding users with stimuli (likes, notifications, comments, etc.) keeps them engaged with content. When a user’s photo receives a “like,” the same dopamine pathways involved in motivation, reward, and addiction are activated. What keeps us hooked on social media isn’t just the “pleasure rush of the like,” says Johansen, “it’s the intermittent absence of the like that keeps us engaged.”
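The engagement mechanism Johansen describes is essentially a variable-ratio reward schedule: the payoff arrives unpredictably, so the checking behavior persists. The sketch below illustrates that idea only; the reward probability, session length, and function names are illustrative assumptions, not figures or designs from the talk.

```python
import random

def check_feed(reward_probability: float = 0.3) -> bool:
    """Simulate one check of the feed: a 'like' arrives only sometimes."""
    return random.random() < reward_probability

def simulate_session(num_checks: int = 20, reward_probability: float = 0.3) -> None:
    """Print when rewards arrive; the unpredictable gaps are what sustain checking."""
    for check in range(1, num_checks + 1):
        if check_feed(reward_probability):
            print(f"check {check:2d}: new like!   (reward)")
        else:
            print(f"check {check:2d}: nothing yet (keep scrolling)")

if __name__ == "__main__":
    random.seed(42)  # reproducible demo
    simulate_session()
```

Because the reward is intermittent rather than guaranteed, no single empty check signals that checking should stop, which is the pattern the psychiatrists point to as the hook.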
When does it become harmful?
Trapping users in endless scrolling loops can also fuel social comparison. When presented with the curated feeds of other people, we are vulnerable to "frequent and extreme upward social comparison," which can lead to a number of negative side effects, including erosion of self-esteem, depressed mood, and decreased life satisfaction. Some people try to cope with eroded self-esteem by attacking other people's sense of self, which can escalate into cyber-bullying.
Additionally, with advances in face tracking, facial recognition, and facial augmentation using AI, image-based apps have created questionable filters including ones designed to make a user appear more slender, which could contribute to distortions in body image. These platforms also offer “easy access to a community of people who promote and encourage disordered eating behavior,” says Vasan.
What are we doing now?
To moderate the vitriol of cyber-bullying, many companies have turned to AI to classify comments with negative sentiment, filtering them out or prompting commenters to pause and reconsider before posting.
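A hedged sketch of that comment-nudge pattern, under stated assumptions: a toxicity score gates whether a comment is published, met with a "pause and reconsider" prompt, or filtered. The keyword heuristic, thresholds, and action names here are stand-ins for illustration; real platforms use trained models and their own policies.

```python
FLAGGED_TERMS = {"hate", "stupid", "loser", "ugly"}  # illustrative list only

def toxicity_score(comment: str) -> float:
    """Stand-in for a trained sentiment/toxicity model: fraction of flagged words."""
    words = comment.lower().split()
    if not words:
        return 0.0
    return sum(word.strip(".,!?") in FLAGGED_TERMS for word in words) / len(words)

def moderate_comment(comment: str, nudge_threshold: float = 0.1,
                     block_threshold: float = 0.4) -> str:
    """Return the action a platform might take before a comment is published."""
    score = toxicity_score(comment)
    if score >= block_threshold:
        return "filter"   # hide the comment or hold it for human review
    if score >= nudge_threshold:
        return "nudge"    # "Are you sure you want to post this?"
    return "publish"

if __name__ == "__main__":
    for text in ["Great photo!", "You are such a loser", "stupid ugly hate post"]:
        print(f"{moderate_comment(text):7s} <- {text!r}")
```

The interesting design choice is the middle tier: rather than silently deleting borderline comments, the nudge gives the commenter a chance to reconsider, which is the intervention described above.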
Social media platforms are also working to ban communities that post harmful content. Many apps, such as TikTok and Pinterest, present hotline information and support resources in response to search queries related to self-harm, suicide, depression, and eating disorders. Moderation remains a complicated task, notes Vasan, as users find new ways to evade search filters.
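The search-intercept pattern described here can be sketched minimally as follows. The topic list, resource text, and substring matching are illustrative assumptions, not any platform's actual keyword lists or hotlines.

```python
from typing import Optional

# Illustrative topic -> resource mapping; not any platform's actual list.
SENSITIVE_TOPICS = {
    "self-harm": "If you are struggling, support is available. Reach out to a crisis line.",
    "suicide": "You are not alone. Please contact a suicide prevention hotline.",
    "eating disorder": "Help with disordered eating is available from trained specialists.",
}

def support_resource_for(query: str) -> Optional[str]:
    """Return a support message if the query touches a sensitive topic, else None."""
    normalized = query.lower()
    for topic, resource in SENSITIVE_TOPICS.items():
        if topic in normalized:
            return resource
    return None

def handle_search(query: str) -> str:
    """Show the resource card before falling through to ordinary search results."""
    resource = support_resource_for(query)
    if resource is not None:
        return f"[resource card] {resource}"
    return f"[results] showing ordinary results for {query!r}"

if __name__ == "__main__":
    print(handle_search("healthy dinner recipes"))
    print(handle_search("self-harm coping"))
```

Exact-match filtering like this is also why evasion is easy: misspellings and coded terms slip past a fixed keyword list, which is the cat-and-mouse problem Vasan notes.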
The psychiatrists don’t conclude that people must abstain completely from online platforms. For many of us, social media can be a rewarding experience that connects us with people all around the world. Instead of approaching screen time through the “displacement hypothesis,” which suggests the negative impact of technology is directly related to exposure, they recommend the “Goldilocks” hypothesis, which identifies moderate use as optimal for wellbeing.
On social media platforms, most risk mitigation methods focus on non-maleficence, the principle of doing no harm. Vasan and Johansen suggest that we should also consider beneficence, the principle of doing good. For example, Brainstorm's recent work with Pinterest led to Pinterest Compassionate Search, which offers free therapeutic exercises on the platform in response to depression-related search terms.
What’s next?
Both psychiatrists emphasized the need for more social media-specific research, with greater granularity at the level of individual apps rather than smartphone use as a whole.
They also recommend that app makers consider more than the most simplistic business incentives. As we shift from "minimizing harm to promoting wellbeing," Johansen says, it is important to realize that the friction associated with making apps less addictive "is going to come at a loss of some growth." In the end, it comes down to choosing that option because "it's the ethical thing to do, because we have a responsibility to help these young minds develop in a healthy way."
Drs. Vasan and Johansen receive compensation as consultants for TikTok. Dr. Vasan has also received compensation as a consultant for Pinterest and has consulted, without compensation, for Instagram.