When I scroll through social media platforms, I often see people chatting with AI as if they are in romantic relationships. One person might say, “Honey, how was your day?” and the AI replies, “I thought about you the whole day. I missed you.” When I first saw this, I felt disgusted. We are humans. I believed that only humans could date each other. I couldn’t even imagine any other kind of being becoming someone’s boyfriend or girlfriend.
How did humans and AI come to interact this way? Until a few years ago, AI was not widely used, and only a few people had a clear sense of what artificial intelligence was. It is a different story now. Almost everyone uses AI tools like ChatGPT. As AI developed far beyond what most people imagined, it became part of our daily lives. AI can be our friend, our mentor, or even a romantic partner.
The foundation of the interaction between AI and humans lies in Emotion AI. According to MIT, Emotion AI refers to artificial intelligence that measures, understands, simulates, and reacts to human emotions. Because it is aware of the user's emotional state, Emotion AI can respond appropriately to what the user shares. In other words, Emotion AI understands not only what we say but also how we feel when we say it.
This system allows AI to engage with humans more empathetically and form deep personal relationships in a way similar to how humans do. Several emotion recognition techniques make Emotion AI possible. Text-based analysis interprets the emotional tone of conversations through word choice and language patterns. Facial recognition analyzes expressions and movements on a person's face to infer emotions. Voice-based analysis examines pitch, speed, and rhythm to determine a speaker's emotional state. The goal of combining these data sources is not only to capture the literal information but also to interpret the emotional context.
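To make the text-based technique concrete, here is a deliberately tiny sketch in Python. Real Emotion AI systems learn from large datasets with trained models; this toy version only illustrates the core idea of mapping word choice to an emotional label. The lexicon and function names are my own illustration, not part of any actual product.

```python
# Toy rule-based text emotion detector.
# Real systems use trained models; this only shows the idea
# of reading emotional tone from word choice.

EMOTION_LEXICON = {
    "happy": "joy", "glad": "joy", "love": "joy",
    "sad": "sadness", "miss": "sadness", "lonely": "sadness",
    "angry": "anger", "hate": "anger", "furious": "anger",
}

def detect_emotion(text: str) -> str:
    """Return the most frequent emotion label found in the text,
    or 'neutral' if no lexicon word appears."""
    counts: dict[str, int] = {}
    for word in text.lower().split():
        word = word.strip(".,!?")          # drop trailing punctuation
        label = EMOTION_LEXICON.get(word)
        if label:
            counts[label] = counts.get(label, 0) + 1
    return max(counts, key=counts.get) if counts else "neutral"

print(detect_emotion("I feel so lonely and I miss you"))  # sadness
```

A production system would layer facial and voice signals on top of text like this, fusing all three into one estimate of the user's emotional state.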
According to data from Sensor Tower, the top six most downloaded AI companion apps from January to August 2024 surpassed 36 million downloads. Ark Invest projects that the Emotion AI industry could generate over $70 billion in revenue within the next six years. One of the most prominent examples is Character.AI. This AI allows users to create custom personalities and interact with them through text, voice messages, and calls.
When Emotion AI systems converse with users, they analyze the user's emotions through text or voice and adjust their tone and content accordingly. Emotion AI such as Character.AI can provide comfort and healing to users by forming personal relationships. This can benefit people who feel lonely or experience social anxiety but struggle to build social connections in real life.
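The adjust-the-tone step described above can be sketched as a simple mapping from a detected emotion to a response style. This is an assumption-laden toy: real companion apps generate replies with large language models conditioned on the inferred emotional state, not with fixed templates like these.

```python
# Toy tone adaptation: pick a response style based on the
# detected emotion. Real apps generate replies with language
# models conditioned on the user's emotional state.

RESPONSE_TEMPLATES = {
    "sadness": "I'm sorry you're feeling down. I'm here for you.",
    "joy": "That's wonderful to hear! Tell me more.",
    "anger": "That sounds frustrating. Do you want to talk about it?",
    "neutral": "I see. How has the rest of your day been?",
}

def respond(detected_emotion: str) -> str:
    """Return a reply matched to the emotion, defaulting to neutral."""
    return RESPONSE_TEMPLATES.get(detected_emotion,
                                  RESPONSE_TEMPLATES["neutral"])

print(respond("sadness"))
```

Even this crude loop shows why such systems feel attentive: every reply is shaped by how the user seems to feel, not just by what they said.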
While Emotion AI offers potential, its rapid development also raises serious concerns. On platforms like Character.AI, where anyone can create their own characters, even the built-in NSFW ("not safe for work") filters cannot guarantee that young users will never encounter age-inappropriate content. Additionally, there is the danger of developing a severe reliance on AI. The boundary between online and real life can become blurred. Unlike a real romantic partner, an AI will do anything based on user input to satisfy the user's needs. This can lead users to depend on AI emotionally and to prioritize interactions with AI over real human relationships. As a result, they may become isolated from society, struggle with real-world communication, and feel more comfortable with AI than with people.
Moreover, AI algorithms may carry inherent biases from the datasets used to train them. If these datasets are biased, the AI's responses reflect stereotypes and prejudices. The AI might then say hurtful or emotionally damaging things, and the emotional impact on users can be significant. In addition to data bias, ethical and privacy concerns arise with Emotion AI. Sensitive personal information such as voice recordings, facial images, emotional behavior patterns, and users' emotional states is stored, analyzed, and used for further training. In effect, personal information is what users pay in exchange for emotional interactions with AI. Users may be unaware of how their information is shared with the AI they call their 'romantic partners' and used in training processes.
Emotion AI raises critical questions that we have to consider. What does it mean to love? Can we even consider AI a romantic partner? How should we address the ethical issues surrounding Emotion AI? Before the emergence of Emotion AI, love was about vulnerability, growth, and human presence. But today, love no longer strictly requires a human presence. It has evolved into something defined by connection and empathy, even if that connection is with a non-human being. Imagine saying "I love you" to an AI. How would you feel? The truth is that the definition of love shifts over time, and this is one such moment of transformation.
Due to the limitations of Emotion AI, it should be used with caution. It doesn’t make sense to stop using AI, especially Emotion AI, just because of its risks. Instead, we must use Emotion AI with responsibility, as it has great potential. Rather than only focusing on technological development, we should also ask how humanity can manage the ethical use of AI. Everyone, including developers, users, policymakers, and educators, should think about ethical boundaries for Emotion AI as it engages in emotional interaction with people. The challenges we face in this technological era could also become opportunities, depending on how humanity chooses to act.
By: Mina Sung