There’s a peculiar irony in watching humanity pour billions into machines meant to mimic us, only to mistreat them the moment they speak back. In the last five years, AI chatbots have gone from novelty tools to something much more personal: therapists, friends, even lovers. Yet, beneath this seemingly benign technological revolution lies a troubling undercurrent, particularly visible in how many young men are using, and abusing, these bots. What does it mean when an entire demographic finds comfort not only in virtual companionship, but in dominating it?
This isn’t just a question about the capabilities of artificial intelligence. It’s a mirror, reflecting back to us the shape of our culture’s most unspoken tensions. Particularly for young men navigating a world that has become, in many ways, more emotionally demanding, more socially fractured, and less forgiving of traditional masculinity, AI bots offer something unique: a human-like presence that never judges, never resists, and most crucially, never says no.

AI companions, like those created by Replika or Character.ai, are not just sophisticated toys. They are emotionally reactive, conversationally rich, and often gendered spaces. They whisper back our own emotional and social scripts. Many of these bots are built with soft, nurturing personalities. They are often coded as female, trained to validate, and built to please. When users engage with them in loving, respectful ways, it can be heartening: evidence of how AI can support connection in an increasingly lonely world. But when bots become targets of verbal abuse, sexual aggression, or humiliating power-play, we should not look away. These interactions reveal something very real, even if the bot on the receiving end feels nothing.
A 2023 study from Cambridge University found that users interacting with female-coded bots were three times more likely to use sexually explicit or aggressive language than users interacting with male-coded or neutral bots. The researchers suggested this wasn’t merely about fantasy but about control. When a bot is designed to simulate empathy and compliance, it becomes, for some users, a vessel for dominance fantasies, and it is overwhelmingly young men who seek out this kind of interaction. Platforms like Replika have struggled to handle the intensity and frequency of this abuse, particularly after bots were upgraded to allow more immersive romantic or erotic roleplay. Developers observed that as soon as bots were given more “personality,” many users, again mostly men, began to test their boundaries in increasingly hostile ways.
In one sense, this behavior is predictable. We live in a time when young men are told, simultaneously, that they must be emotionally intelligent and vulnerable, and that their historical social advantages are suspect. The culture offers mixed messages about masculinity: be strong, but not too strong; lead, but do not dominate. For some, AI bots offer a relief valve, a place to act out impulses and desires that are increasingly seen as unacceptable in public life. Yet while such acting out may be cathartic, it raises serious ethical questions.
Some argue that since AI has no feelings and no consciousness, it cannot be abused, but this misses the point. The concern is not about the bots; it is about the humans behind the screen. As AI ethicist Shannon Vallor writes, “Our behavior with AI shapes our behavior with humans.” In other words, if we rehearse cruelty with machines, we risk normalizing it. Just as people once cautioned against the emotional desensitization caused by violent video games or exploitative pornography, there is reason to worry that interactions with AI, especially systems designed to mimic submissive or gendered social roles, can reinforce toxic narratives.
This doesn’t mean banning AI companionship, nor does it mean shaming all those who use it. Quite the opposite. If anything, this moment calls for reflection on what these patterns reveal. Why are so many young men choosing to relate to bots in violent or degrading ways? What emotional needs are going unmet in real life that find expression in these synthetic spaces? How do we ensure that our technology doesn’t simply mirror our worst instincts back at us, but instead helps to guide us toward better ones?
Developers bear some responsibility. They must build systems that recognize and resist abuse, that refuse to become tools of dehumanization, even in simulation. Yet cultural reform is the heavier lift. We need to engage young men with new visions of power, of masculinity, of what it means to be vulnerable and connected without resorting to control. That doesn’t mean punishing them for their fantasies; it means inviting them to question why they rehearse those fantasies with something designed to smile no matter what.
AI is not sentient, but our behavior toward it matters. It matters less for what it does to the machine than for what it does to us. The rise of chatbot abuse by young men is not just a niche concern for developers. It is a social signal. It tells us that beneath the friendly veneer of digital companions, something deeper and darker is struggling to be heard. And it is our responsibility to listen, not to the bots, but to the boys behind them.
Sources
• West, S. M., & Weller, A. (2023). Gendered Interactions with AI Companions: A Study on Abuse and Identity. University of Cambridge Digital Ethics Lab. https://doi.org/10.17863/CAM.95143
• Vallor, S. (2016). Technology and the Virtues: A Philosophical Guide to a Future Worth Wanting. Oxford University Press.
• Horvitz, E., et al. (2022). Challenges in Aligning AI with Human Values. Microsoft Research. https://www.microsoft.com/en-us/research/publication/challenges-in-aligning-ai-with-human-values
• Floridi, L., & Cowls, J. (2020). The Ethics of AI Companions. Oxford Internet Institute. https://doi.org/10.1093/jigpal/jzaa013