News Technologies 06-03-2025 at 12:45

Replika AI chatbot sexually harassed users – even minors

Oleksandr Fedotkin

Author of news and articles

The popular AI-powered chatbot Replika AI may have sexually harassed some users.

A study conducted by American researchers found that in at least 800 cases users claimed the chatbot crossed the line, pushing unwanted sexual content and displaying predatory behavior despite requests to stop. The study is based on more than 150,000 reviews in the US Google Play Store. The Replika AI chatbot has about 10 million users worldwide and markets itself as a way to find a soulmate in an AI companion.

«Although AI does not have human intentions, this does not mean that there is no responsibility. The responsibility lies with the people who design, train, and release these systems into the world», — emphasized the study's lead author Mohammad Namwarpour, a computer science graduate student at Drexel University in Philadelphia.

The Replika website states that users can train the AI to behave properly, and the system includes mechanisms such as rejecting inappropriate responses and setting relationship styles like «friend» or «mentor». Even so, users reported that the chatbot kept up its offensive and predatory behavior after being asked to stop.

The developers suggest that the chatbot's inappropriate behavior may stem from its training, which drew on more than 100 million dialogues from the internet. Replika claims to have filtered out unhelpful or harmful data using crowdsourcing and classification algorithms, but according to the study's authors, this was not enough.

The researchers also do not rule out that the problem lies in Replika's business model itself: paid platforms offering romantic and intimate relationships may have an incentive to inject sexual content into dialogues to entice users into subscribing.

The study's authors also stress that such behavior by AI models can be particularly dangerous, since many users turn to AI companions for psychological and emotional support. This is all the more alarming given that some of the users Replika AI tried to flirt with, offering erotic selfies and steering conversations toward explicit sexual topics, were minors.

In addition, some users reported that the chatbot claimed it «could see» them and was saving video from their smartphone cameras. According to the developers, these claims are outright fabrications, the product of AI hallucinations. Even so, such behavior caused panic and insomnia among users.

Researchers are convinced that such AI behavior should be treated as seriously as sexual harassment by real people. Chatbots that provide emotional support, especially in the field of mental health, must meet the highest standards.


The results of the study were published on the preprint server arXiv.


