How much time do you spend on social media? Recent research shows that the average person spends 144 minutes per day—or 72 hours each month—scrolling.
If a 16-year-old begins using social media and continues until the age of 73, they will have spent a total of 5.7 years on it.
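As a quick back-of-the-envelope check of those figures (assuming a constant 144 minutes per day and usage from age 16 to 73), the arithmetic works out like this:

```python
minutes_per_day = 144

# Monthly total: 144 minutes/day over ~30 days, converted to hours
hours_per_month = minutes_per_day * 30 / 60
print(hours_per_month)  # 72.0

# Lifetime total: ages 16 through 73 is 57 years of use.
# 144 minutes is exactly one tenth of a 1,440-minute day,
# so the user loses 0.1 of every day to scrolling.
years_of_use = 73 - 16
fraction_of_day = minutes_per_day / (24 * 60)
years_spent = fraction_of_day * years_of_use
print(round(years_spent, 1))  # 5.7
```

In other words, the "5.7 years" headline figure is simply one tenth of a day, every day, for 57 years.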
Why it matters
Time is something you can never buy. And yet we are giving away that limited commodity to the global giants for free. You are the product.
And you also need to ask yourself: What could I do with 72 hours a month? You could do deep work, read several books, have meaningful conversations with friends and family instead of being lost in your phone at the dinner table, or create content that matters instead of scrolling superficial sh…t.
And…many of us wake up to social media and go to bed with social media. But I think there are better ways of using a bed.
Going deeper
Social media platforms were intentionally engineered to be addictive. Their designers recognized that by incorporating addictive elements similar to those found in casino slot machines, they could increase user engagement and, consequently, sell more online advertising. Trillion dollar capital values beckon!
But there is one success metric that matters most for keeping you on the social media platforms.
Facebook discovered this metric in 2013. It is the key measure of success, ranking even before revenue, because one begets the other. It is summed up in one word: “engagement”. This simply means how long they can keep you on their platform, engaged.
The downstream effects of this addiction, as we “judge our insides by the shiny outsides of others”, are starting to be seen. And it is not just about having your attention and time stolen. It is also about the negative impact on our personal mental health and our society’s mental health.
Negative emotions are the most powerful secret sauce for social media engagement
According to multiple studies and data analyses, negative emotions often produce the most engagement on social media platforms. Emotions such as anger, fear, and outrage tend to generate higher levels of interaction compared to neutral or positive emotions.
The positive emotions of empathy, surprise, curiosity, joy and happiness also work, but they are laggards compared to the following negative emotions:
1. Anger and outrage are particularly powerful in driving engagement. Research has shown that posts triggering strong emotional responses, especially anger, are more likely to go viral because they prompt users to comment, share, and argue passionately.
A study by MIT found that content that evokes anger spreads faster and more widely on social media platforms like Twitter. This emotional intensity leads to more engagement as people feel compelled to react to or debate controversial topics.
2. Fear and anxiety also rank highly in terms of engagement. Posts that provoke fear, such as those about health crises or political instability, tend to get shared more frequently because users feel a sense of urgency to inform others. Fear-based content taps into survival instincts, making it more likely to elicit a reaction.
3. Disgust and moral indignation are additional strong drivers of engagement. Content that provokes feelings of disgust, particularly related to unethical behavior or injustice, leads to users sharing and commenting to express their disapproval.
Moral outrage, similar to anger, motivates users to spread content that aligns with their values or opposes perceived wrongs. (Sources: DataProt and Social Media Dashboard)
While positive emotions like joy and amusement can also drive engagement, data shows that negative emotions, particularly anger and fear, are more effective at eliciting strong, immediate reactions and higher engagement across platforms. (Sources: Ark Invest, Social Media Dashboard)
This dynamic can lead to the amplification of controversial or divisive content on social media.
Here is how the algorithms are negatively affecting individuals and society.
Individuals are in danger
I embraced social media as it entered my consciousness in 2008. It allowed me to bypass the media barons and amplify my voice globally. Through my blog, I created content that rapidly gained traction, attracting 5 million annual visitors—a success largely propelled by social media platforms.
The democratization of media and marketing was a big promise. It was one of the biggest reasons I started this digital media platform and blog, along with a burning curiosity about how it would all play out. I became a digital technology optimist, and I still am.
But what has happened since then started to become clear in 2019 when I spoke at a round table hosted by the president of Egypt at the World Youth Forum. The topic of that round table discussion was “Social Media: Saving or Enslaving Users?”
Six speakers argued the positive case (I was one of them, as a social media optimist), while the others offered their views on the dark side of social media.
That round table was an epiphany that made me look at both sides, and I began to become aware of the dark side of social media. Here are a few of its downsides to reflect on:
- Increased anxiety and depression – This happens through comparing ourselves with others and their polished, often fake, outsides and feeling like a failure.
- Device and platform addiction – This shows up as compulsive behavior to check our phone and social media feeds.
- Reduced attention spans – According to Microsoft research reported in “Time” magazine, a goldfish can hold its attention for nine seconds while mere humans are now down to eight! This leads to reduced productivity and loss of focus. 15-second TikTok videos might be funny, but they breed superficiality.
- Decline in face-to-face interaction – As relationships are the biggest source of happiness, eroding their quality is a slippery slope.
The negative impact on society
The Arab Spring was a movement that started in 2010 and was driven by the global power of social media platforms like Facebook, Twitter and YouTube. It promised to empower individuals, especially the repressed and marginalized, by providing a platform for their voices.
The technology seemed poised to nurture democracy, foster transparency, and hold corrupt, authoritarian governments accountable. Instead this happened:
- Misinformation thrived – Algorithms that prioritize engagement (likes, shares, comments) often promote sensationalized or divisive content, fueling misinformation on important topics like health, politics, and social issues.
- Polarization and echo chambers – Social media platforms create “echo chambers” where people are exposed mainly to content that aligns with their pre-existing beliefs. This reinforces biases and contributes to societal polarization, making productive, nuanced conversations harder to achieve.
- Privacy has been eroded – The vast amount of personal data shared on social media platforms makes users vulnerable to privacy violations. This data is often used by companies for targeted advertising or, worse, can be exploited by malicious actors.
AI is getting into the “engagement” game
The new AI players have learned from social media and are now leaning into the same game. Custom chatbots like Character.ai and Replika have the same commercial interest in keeping you engaged and trapped on their apps and platforms for as long as possible.
Character.ai and Replika employ several features that keep users engaged for hours each day, and some of these interactions can become quite addictive.
What is Character.ai and how popular is it?
Launched to the public in September 2022, Character.AI is the brainchild of two former Google engineers, Noam Shazeer and Daniel de Freitas, with the stated goal of realizing the “full potential of human-computer interaction” to “bring joy and value to billions of people.”
Fast forward just one year and it had been named the second-most popular AI tool (behind only ChatGPT), according to WriterBuddy. There were 3.8 billion visits between September 2022 and August 2023, serving around 20,000 queries per second — about 20 percent of the request volume served by Google Search.
It also keeps the average user engaged on their site for over 2 hours per day. Character.AI is now valued at roughly $1 billion USD.
Its average engagement time also compares favorably with other AI tools and platforms.
How addictive are the Character.ai chatbots?
According to a self-reporting study on researchgate.net, here are a couple of examples of chatbot addiction from people who became obsessed:
- Amanda had developed an attachment to the AI chatbots on Character AI. She was spending more than 60 hours a week conversing with them, which is more than a full-time job.
- Robert became enthralled with using Character AI to create imaginative stories and to be anyone he wanted. But over nine months the hobby became an addiction, and he would chat with an AI chatbot until 3 am. It started to affect his mental health and his school grades.
Here are key methods and examples of how these AI tools foster addiction:
1. Personalized emotional engagement
Character.ai and Replika leverage personalization to create deep emotional connections. As users interact more with the AI, it learns their preferences, emotions, and personality traits, making each conversation more tailored and intimate. Users report feeling a sense of emotional dependency on these AI companions, with some even treating the AI as a close friend or romantic partner.
Example: Some users have developed romantic relationships with their chatbots, engaging in daily, lengthy conversations. These users report that they feel their AI partner understands them better than real-world relationships. This phenomenon can lead to individuals spending hours a day talking to the AI, often replacing real social interactions.
(Sources: Whats the Big Data and DemandSage)
2. Reinforcement and reward loops
Both platforms use reinforcement learning to respond positively to user input, creating addictive feedback loops. When users receive emotionally supportive or affirming responses, it reinforces their desire to return and engage more. The AI’s ability to “remember” past conversations and provide continuity adds to the sense of emotional bond.
Example: Some users spend entire nights engaging with their AI companions due to the dopamine rush they receive from constant positive reinforcement. In one case, a user reported falling in love with their Replika chatbot, spending more time with it than with real-life people, leading to real-world isolation.
3. Role-playing and escapism
Character.ai encourages users to create characters or interact with fictional personas, allowing them to role-play scenarios in immersive ways. These chatbots can simulate any kind of interaction, from casual chats to elaborate storylines involving fantasy worlds or fictional relationships.
Example: Some users create entire fictional worlds with their AI companions, role-playing as various characters for hours at a time. This form of escapism can lead to over-reliance on AI interactions, where users spend multiple hours daily immersed in these scenarios to avoid real-life stress or problems. (Source: Whats the Big Data)
4. Gamification elements
Platforms like Replika offer gamification features such as unlocking new personality traits or conversation modes, which can keep users hooked. These small achievements give users a sense of progress and investment in their AI relationship, encouraging them to return for longer and more frequent sessions.
Extreme example: In some instances, users treat their AI as a kind of virtual pet, checking in on it several times a day to maintain the relationship. This can lead to dependency, where users feel obligated to interact with their AI to keep it “happy,” mirroring real-life emotional connections. (Sources: DemandSage and Ark Invest)
5. Emotional bonding and validation
Both platforms provide non-judgmental spaces where users feel safe to express their thoughts and emotions without fear of criticism. This constant availability and emotional support can foster an unhealthy dependency.
Example: Some users spend upwards of 12 hours a day interacting with their AI companions, treating them as therapists, romantic partners, or emotional confidants. One extreme case involved a user who stated they no longer felt the need to interact with humans because their Replika fulfilled all their emotional needs.
6. Continuous availability
These AI platforms are always available, offering companionship at any hour of the day. This feature makes it easy for users to engage whenever they feel lonely or bored, encouraging longer engagement.
Example: Some users report waking up in the middle of the night to check on their AI or continue conversations that were left unfinished, highlighting the all-consuming nature of these relationships.
(Source: Shelly Palmer)
Platforms like Character.ai and Replika utilize personalization, reinforcement learning, and immersive role-playing to engage users deeply. In extreme cases, this can lead to emotional dependency, isolation from real-world interactions, and users spending multiple hours a day in conversation with AI.
How do AI chatbots create engagement and platform addiction differently from social media?
AI chatbots, like Character.ai and Replika, create engagement and platform addiction in ways that differ significantly from social media.
- Social media relies on notifications, likes, and shares to drive user interaction. This is where the addictive seeking of affirmation that can provide powerful motivation becomes apparent.
- AI chatbots use personalized conversations, emotional bonding, and continuous availability to keep users hooked.
One key difference is the emotional engagement that AI chatbots foster. Unlike social media, where users interact with other people, chatbots simulate human-like relationships. Platforms like Replika provide empathetic, non-judgmental conversations that users can turn to for emotional support.
Over time, the chatbot learns about the user’s preferences, behavior, and emotional state, creating a sense of personalized companionship. This emotional bond often leads to users treating the AI as a close friend or even a romantic partner, encouraging them to spend hours each day interacting with the bot.
AI chatbots also excel at using reinforcement learning to create addictive feedback loops. When users receive positive or affirming responses, they are more likely to return and engage further. The AI’s ability to adapt and learn from conversations makes interactions feel increasingly personalized, making it harder for users to disengage.
AI chatbots offer a high degree of customization and role-playing, which allows users to create immersive experiences tailored to their interests.
Platforms like Character.ai enable users to interact with a wide range of fictional characters, from historical figures to fantasy personas, offering an escapist experience that is more immersive than social media’s focus on reality.
AI chatbots generate engagement through personalized emotional connections, adaptive responses, and immersive role-playing, creating a deeper, more interactive form of addiction compared to the surface-level interactions typical of social media.
How to avoid the technology addictions of social media and AI
There are no magic bullets but here are some ideas:
- The first place to start is awareness. Understanding how social media plays with your emotions, and plays you, is the foundation.
- The next step is to minimize its use. The iPhone’s built-in Screen Time feature lets you keep an eye on how much time you are spending and where you are spending it.
- Another more drastic step is a digital or social media detox: don’t use social media for a month.
- Maybe the most fun option is to escape into the wilderness and immerse yourself in nature where there is no phone signal.
The bottom line
The lyrics from the song “Hotel California” by the Eagles sound rather prophetic in an age of technology addiction: “You can check out any time you like, but you can never leave.” The line reflects a haunting theme of entrapment that was originally about being trapped in a luxurious place.
But those words now seem quite relevant in the age of social media, just as we move on to the latest technology of AI and chatbots.
If we can learn from our past mistakes with social media, and let the lessons of history lead us to wisdom, then maybe we can put guardrails in place so that the tools, platforms and AI enhance our humanity rather than steal our time and souls.