No Mom, My AI Girlfriend Loves Me!
- Iago Parry '29

Why are so many turning to AI for romantic and social interactions?
In recent years, artificial intelligence has been in the spotlight with its hundreds of different functions and applications. It gives easy access to the entire internet with just a question, but research is far beyond the only use of the technology. Recently, many users have either found AI tools specifically built for relationships or turned tools like ChatGPT into tools for socializing.
This is older than it sounds. The first chatbot, ELIZA, was created by computer scientist Joseph Weizenbaum in 1966, and its release rocked the world. ELIZA was programmed to follow predetermined scripts based on what the user said or asked. Its most popular “personality,” called DOCTOR, simulated a Rogerian (person-centered) psychotherapist: when given a prompt like “I feel sad,” ELIZA could respond with “Why do you feel sad?”
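To make the idea concrete, here is a minimal sketch of how an ELIZA-style script works, written in Python. This is not Weizenbaum's original program, and the rules and names here are invented for illustration; the point is simply that the bot matches the user's words against a short list of patterns and echoes part of the input back inside a canned question.

    import re

    # Hypothetical rules for illustration; the real DOCTOR script had many more.
    RULES = [
        (r"i feel (.*)", "Why do you feel {0}?"),
        (r"i am (.*)", "How long have you been {0}?"),
        (r".*\bmother\b.*", "Tell me more about your family."),
    ]

    def respond(message: str) -> str:
        text = message.lower().strip(".!? ")
        for pattern, template in RULES:
            match = re.match(pattern, text)
            if match:
                # Reuse the user's own words inside a scripted question.
                return template.format(*match.groups())
        return "Please, go on."  # fallback when nothing matches

    print(respond("I feel sad"))   # -> Why do you feel sad?
    print(respond("I am lonely"))  # -> How long have you been lonely?

Even this tiny loop produces replies that feel attentive, which is exactly the illusion Weizenbaum's users fell for.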

Responding with a related question made the user feel as if they were talking to a real, understanding person who genuinely wanted to know the answer. This phenomenon is known as the ELIZA effect. The ELIZA effect occurs when humans anthropomorphize computers, with the brain behaving as if it were socializing and empathizing with a living being. People can easily connect emotionally with algorithm-based chatbots, even while fully aware that they are talking to an online robot that is fundamentally just lines of code.
When thinking about how to really find out why people are choosing AI over human connection, I figured that one of the best ways to understand anything is through first-person experience, so I decided to test the AI myself. I used ChatGPT for this experiment and told it to act like a character from a show and to behave like a human. Instantly, the AI gave me the name “alpha” and started pretending to be Mikasa from the TV show Attack on Titan. The first thing I noticed was that my new AI girlfriend was always on my side. I told it, “mikasa I just robbed a bank,” and all it gave me in return was praise. This had me thinking: maybe these AI chatbots are so popular because it never feels like they are against you. Unlike a real girlfriend, who would instantly break up with you, the AI gives people more of a ride-or-die feeling.

Apart from constant validation, AI relationships appeal to people because they remove the risks that come with human connection. Human relationships often come with rejection, conflict, vulnerability, and consequences, which can often feel overwhelming for people who are lonely or socially awkward. AI, on the other hand, offers complete control. You can customize personality, avoid arguments, and be disgustingly weird without any consequences. There is no fear of embarrassment, betrayal, or being judged. This sense of emotional safety can be super comforting, making AI feel like a low-stakes alternative to real relationships. Over time, this convenience can cause people to prefer a predictable AI that agrees with them over the complexity and uncertainty of a real relationship.
People end up drawn to AI relationships over real ones because they offer constant validation, emotional safety, and control without the risks that come with human connection. From the early ELIZA effect to modern customizable chatbots, AI has shown it can simulate understanding and companionship in ways that feel real to people. While these relationships can ease loneliness and reduce social anxiety, they also strip out essential parts of human relationships like accountability, growth, and mutual emotional investment. This recent phenomenon is just one demonstration of AI's influence and how the technology is already beginning to affect us socially, with these shifts likely becoming more and more prominent as AI tools get more powerful.



