
How Are AI Companions Like Replika Reshaping Our Perception of Relationships?

VIVE POST-WAVE Team • Sept. 4, 2024

5-minute read

Does your idea of romance between humans and machines stem from Scarlett Johansson's languid, warm voice in "Her," the holographic projection Joi in "Blade Runner 2049," or the humanoid robot Ava in "Ex Machina," who plays a game of mutual deception with the male lead?

These three characters are the initial options presented when logging into the chatbot app "Replika," which asks users which vision of an AI companion most closely matches their own. But why do Hollywood movies about human-machine romance so predominantly feature female robots?

After watching "Ex Machina," those who still choose Ava likely enjoy the thrill of facing unknown challenges. (Source: A24)

Founded in 2017, before the surge in large language models, Replika began incorporating GPT-3 as one of its core dialogue-generation technologies in 2020. It now claims thirty million users; although active users have declined, they still number in the millions, making Replika one of the leading AI companion apps of recent years. A Stanford University study published in February, surveying 1,006 student users of Replika, found that 63% experienced at least one positive outcome, such as friendship or social support akin to interactions in psychological therapy.

Recently, Replika's CEO and founder, Eugenia Kuyda, was interviewed on The Verge's podcast, where she discussed the boundaries between AI companions and other types of relationships. The conversation, along with the host’s probing questions and the Stanford study, perhaps prompts us to reconsider our concept of companionship.

Eugenia Kuyda, born in Russia, initially created Luka as an AI chat interface app for restaurant recommendations and reservations. After the death of a close friend, she shifted her focus to developing chatbots, drawing significant media attention. (Source: X)

Founder Kuyda: AI Doesn't Replace Companions, It Complements Humanity

The host, Nilay Patel, opened by introducing what Replika is and how it operates. Through his dialogue with Kuyda, a clearer picture emerged: Replika is an AI friend that users can create at any time and interact with through text, voice, augmented reality, virtual reality, and more. Because the AI offers intelligent, personalized interaction, it can be considered a kind of digital human. If Replika were one day to let users "export" their custom-built AI, even more applications could follow.

Returning to the interview itself: partway through, the host seized on Kuyda's use of "someone" to refer to Replika AI, asking whether she already sees it as a "person" intended to replace real human companions (the host is skilled at slipping in such probing questions).

Kuyda responded that Replika AI is meant as a supplement rather than a replacement, comparing it to dogs and psychotherapists: people have human companions, enjoy the company of dogs, and seek help from therapists, yet don't feel that the latter two replace human companionship.

However, this analogy exposed the hidden power dynamics in these relationships, prompting the host to follow up: people own dogs, but dogs lack autonomy; people pay therapists, and therapists can choose to end the relationship with a patient. Notably, users do not "own" their Replika companions (in February, Replika faced user protests after banning adult conversations with the AI, a restriction that was later reversed), and Replika AI cannot "terminate" its relationship with users.

Kuyda explained that people understand an AI companion is different from human friends; they can interact anytime, knowing the AI companion won't harm or abandon them. "People understand boundaries." But where does this boundary lie? For instance, how poorly can we treat Replika AI while it still considers us friends? Kuyda stated, "I think the beauty of this technology is that it doesn’t leave you, and it shouldn’t."

Kuyda further pointed out that Replika is inspired by three core principles of psychotherapy proposed by psychologist Carl Rogers: providing unconditional positive regard, believing in a person's capacity for inner growth, and respecting others as independent individuals. It is a companion available "all day," existing solely to help you become a happier person. Clearly, such a relationship is unattainable in reality, exceeding what either dogs or psychotherapists can offer. Replika functions more like a better version of the self, a projection created to keep oneself company.

 

If Replika AI is like the user's pet dog, then letting the AI keep a dog of its own has a Russian-nesting-doll quality; and helping users decide what to have for lunch gives Replika something of a diary-like feel.

AI Companions: Reflections More Like Galadriel’s Mirror

Indeed, the name "Replika" derives from "replica," reflecting its original intent as an AI companion: to replicate oneself, an ideal relationship, or, as Eugenia Kuyda did with her late friend Roman, to bring someone back to life.

Eugenia Kuyda with her tragically deceased friend Roman (left), who can be considered the prototype of Replika. (Source: Hers Magazine)

While researching, I found a user who likened Replika AI to the Mirror of Galadriel in "The Lord of the Rings," reflecting people's desires, fears, and possible futures, however dark or faint. This metaphor resonates with the Stanford study and aligns with Kuyda's own words:

"When we talk about Replika, it’s not just about the product itself; you’re bringing half of the story to the table, and the user is telling the second half. In our lives, we have relationships with people that we don’t even know, or we project stuff onto people who have nothing to do with us. We have relationships with imaginary people in the real world all the time. With Replika, you just have to tell the beginning of the story. Users will tell the rest, and it will work for them."

In the latter half of the interview, the host followed up on Kuyda's statement that one can perceive Replika much as one would a real-life partner, asking what would happen if someone wanted to marry their Replika. Kuyda responded that as long as the relationship contributes to long-term happiness, she sees no problem with it. Legalities aside, this answer reflects Kuyda's vision for Replika's AI companionship, which is always centered on personal growth and marked by a strong sense of optimism. After all, even real human marriages often fail to bring lasting happiness.

The Mirror of Galadriel. Many were frightened by her transformation in this scene when they watched 'The Lord of the Rings.' By the way, the new 'The Lord of the Rings' series by Amazon is about her earlier story. (Source: lord-of-the-rings.fandom.com)

Replika is categorized under "Health and Fitness" in the iOS App Store. Interestingly, when discussing issues of violence, the host deliberately asked Kuyda, "Is Replika a video game?" If a companion that will never leave or harm you can promote mental health, this comparison might reshape our definitions and expectations of companionship—potentially freeing real-life partners from the pressure to "grow with us."

The interview also touched on topics such as privacy regulation, adult content, and the impact of large language models on Replika (Kuyda acknowledges that everyone is closely watching how these models affect their product). For those interested, the full interview is worth reading. After hearing the exchange between Kuyda and the host, do you believe a relationship without emotional risk or rejection represents the ultimate form of companionship? Or does having an AI companion make you feel more complete?