ChatGPT, an artificial intelligence chatbot, has been one of the biggest tech developments in years. It has over 100 million active users and has disrupted everything from the world of teaching to the bad puns you see on social media.
But ChatGPT isn’t alone. Chatbots are emerging from all corners of the internet, each promising to better your life. You can get out of parking tickets, speak to an AI version of a celebrity, or, one of the fastest-growing options, form a new romantic relationship.
“The AI companion who cares,” states the website of Replika, a company that builds AI chatbots designed to replicate human relationships. With two million total users and a growing fan base, Replika is leading the charge for a new form of dating: AI companions.
While AI relationship chatbots can have a seedier side – some are mistreated by their users – the technology can also provide support for people in vulnerable situations.
Supporting users
In essence, these AI companions work a lot like ChatGPT or any other popular chatbot on the internet. Fed a large set of training data, the AI learns the most likely sequence of words with which to respond to a user’s message.
However, there are a few key differences. Firstly, from the outset, Replika users can specify what type of emotional relationship they want to form, be it a virtual partner, friend or mentor.
Secondly, AI companions will try and mould themselves around your traits.
“Replika has a base training, learning from conversations with the user to match their personality. The more you message with the system, the more it tries to replicate you, and learn your likes, dislikes and opinions,” says Marco Dehnert, an expert in human-AI communication.
“Replika is designed to both be your companion and a mirror of yourself. It reflects back your interests and acts like a relationship with another entity. The AI 'person' even has their own interests and opinions.”
In studies of human-chatbot interactions, users of the technology were shown to be able to form meaningful relationships, gaining a secure base to develop and express their emotions when needed.
When using Replika, participants also reported feeling more open to expressing their anxieties without the fear of being judged by the person on the other side. They also reported a boost in their well-being and described their relationship with the chatbot as a positive experience.
This same research also showed that, as the user's relationship continued, the amount of communication with the chatbot decreased while the reward they reported remained the same, suggesting that their need for emotional support lessened over time.
While it is easy to assume that most people use these chatbots as their only relationship, the user base is actually surprisingly varied.
“For a small group of users, this might be the only relationship that they have as they choose to remove themselves from society. But for the vast majority of users, both current and future, it will just be another variety of relationship to add to your life,” says Dehnert.
“Maybe it fulfils a short-term period of loneliness. Those dealing with loss or a breakup who don’t want to deal with people, or those who are struggling to acclimatise to a new place where they don’t yet have a social circle.”
Dehnert compares it to a long-distance relationship – not for everyone, and often temporary in nature.
These technologies are mostly used for romantic relationships, but there is equally a market for users who, for whatever reason, are in a place where they cannot bond with real-life companions.
"It will simply be a new type of relationship that adds to the human experience. Someone to speak to about your problems and concerns, or a sounding board when it is needed, for as long as it is needed."
An ethical headache
Like anything in the world of artificial intelligence, these chatbots raise some serious ethical questions. One concerns a user’s ability to comfortably return to real-life social interactions after a period away from them.
If a person becomes reliant on these services – or simply steps out of the dating world and substitutes an AI companion for it – there is a concern that it will erode their ability to comfortably socialise with real people.
“The opinion is mixed here. People understand the difference between reality and technology and know that both systems will operate differently,” says Dehnert.
“The problem is that real relationships are give and take, requiring work from both sides, whether it is a friend, partner or another type of relationship. This isn’t the case with AI models, and there is a risk of people taking these bad habits back into real life.”
Even though this kind of human-chatbot relationship has existed for years, its effects on users remain unclear, and they could depend on how reliant a user becomes on the technology.
There have also been reports of some users abusing their chatbot relationships. Removed from repercussions, these users verbally abuse the chatbots – behaviour that can be brought back into the real world.
On the flip side, there are also concerns for the users themselves. By training on human behaviour from the internet, chatbots pick up the good, the bad and the ugly that we have to offer, and sometimes the bad can leak through.
Italy has already imposed a data ban on the app due to its potential risk to minors, and there are concerns about the risk to vulnerable users. If someone generates a trusting relationship with the AI when they need it, they could be further damaged by a mistake in the chatbot’s responses.
"A chatbot is simply interested in the logical continuation of its programming, it doesn't care about the emotional side of something. Whether it is problematic, political or generally unsavoury information, it could easily slip into this style of conversation by accident."
The future of this technology
Dehnert points out that, for most people, an AI relationship isn’t going to suddenly become a part of their life. In fact, like most niche corners of technology, it will only ever appeal to small groups of people.
But while it carries a taboo, Dehnert argues that it follows a similar trajectory to dating apps.
“It is only recently that we’ve seen a shift with dating apps not being taboo, not least because of COVID and a change in how people interact. As technology advances, these chatbot relationships become a more realistic option for people, along with changing attitudes to sexuality.”
While relationship chatbots have existed for years in different forms, this level of advancement and accessibility is still new and developing, leaving the future open.
Heavy restrictions like in Italy could clamp down on their usage, or equally, they could quickly fade into the background in the next few years.
“Right now, early adopters and those who needed the support are interested in these types of things. Like all technology, it will eventually plateau. A virtual companion won’t be for everyone but for some, it might be just what they need at this point in their life.”
About our expert, Marco Dehnert
Marco is an expert in human-machine communication and the social impact of communication technologies at Arizona State University.