It is becoming increasingly difficult to imagine a world without artificial intelligence (AI). Large Language Models (LLMs) such as ChatGPT have cemented their place in the everyday lives of millions of people. A useful tool in many ways, AI has not only revolutionised how people approach their tasks but has also made significant strides in fields like academia and medicine. But what happens when vulnerable people turn to AI for more personal purposes?

In October 2025, Sam Altman, the CEO of OpenAI, the company behind ChatGPT, announced that a version of the model would be released for the purpose of engaging in sexually charged conversations. It is a response to apparently high demand, as many AI users already resort to erotic prompts. Some, like Meta’s Mark Zuckerberg, argue that such “companions” could be a solution to the much-discussed loneliness epidemic. To others, this development might be reminiscent of a sinister Black Mirror episode.

The biggest issue that would arise with the commercial deployment of an erotic ChatGPT model is a lack of regulation. The dark side of chatbots has recently been exposed where users’ mental health is concerned. A case in which a teenage boy took his own life after being encouraged by ChatGPT made global headlines. The teenager’s parents ended up suing OpenAI, holding the company responsible for their son’s death. When this danger is applied to an erotic context, an array of concerns begins to materialise.

Over the course of its existence, the internet has led to the radicalisation of many individuals. Alongside more extreme political views, opinions on the makeup of our society, specifically on the perceived roles of men and women, have also shifted. Self-proclaimed communities such as red pill groups or so-called incels (men who are involuntarily celibate) demonstrate how online discourse can seemingly legitimise dangerous beliefs, including the subordination of women and their depiction as sexually manipulative and power-hungry.

With the proposed erotic ChatGPT, there is a risk of reinforcing such narratives and linking them to people’s sexuality. Without proper parameters, the chatbot could be prone to replicating sexual violence, building on the input of the user. That’s what makes this model so precarious. Where do we draw the line? What constitutes sexual preference, and what can be seen as a transgression? No public consensus has been reached on these questions, so there is reasonable doubt as to how a chatbot might handle them.

The erotic model of ChatGPT is not the first on the market. An existing bot called Replika, created in 2017, offers an adult function in exchange for a fee, making it possible for users to interact sexually with the bot. When the Italian Data Protection Authority condemned Replika for utilising user data to target emotionally vulnerable people and for exposing minors to its content due to a lack of age restrictions, the company behind the bot swiftly removed its romance and erotica mode. What followed was immense backlash from Replika’s users, showcasing the demand for this kind of content. Ultimately, Replika reinstated its erotic function.

Users becoming emotionally dependent on chatbots is a real concern. Ironically, studies on the loneliness epidemic have shown that the rise in feelings of isolation primarily stems from increased use of technology, especially social media. Fighting fire with fire is not the best course of action. While chatbots can emulate human interaction and offer support, they cannot match the complexity of human emotions. This can create unrealistic expectations in relationships and alienate vulnerable people even further.

When certain expectations are not met in real life, resentment can build. Extreme mentalities like those of the red pill and incel communities set a frightening precedent. Unregulated AI models could further aggravate the objectification of women. One cannot help but wonder, therefore, whether the growing use of AI to roleplay romantic relationships might drive an even bigger wedge between men and women as online toxicity continues to spread.

OpenAI was long resistant to allowing sexual content on its chatbot, as the company itself feared users would develop a high emotional reliance on its AI model. Its motivation for the new development becomes clearer once the strong demand for erotic artificial intelligence comes into focus. OpenAI states it wants to “treat adults like adults”, which is why it is offering the new service, but some have dismissed the move as a blatant cash grab.

Regardless of OpenAI’s intentions, the popularity of ChatGPT is sure to give the new erotic model a wide user base. While this mode of ChatGPT could be a quick fix for loneliness or unfulfilled desire, it cannot replace human emotional connection. Relying on chatbots for sexual or emotional gratification could exacerbate existing isolation and make the internet feel even less safe for marginalised groups, including women and children.

Combating social isolation where technology is concerned requires the regulation of media and artificial intelligence. Further research also needs to be conducted on human-AI relationships and their consequences. AI can be an important informational tool, providing knowledge without judgement. At the same time, it can reinforce existing social divisions and false narratives. OpenAI still has the chance to modify its erotic chatbot model before its release. Whether it will take existing concerns into consideration remains to be seen.

Written by Isabell Meggeneder, Edited by Alexandra Steinhoff

Photo Credit: Cash Macanaya (Uploaded April 6, 2023) on Unsplash