It's 1952. Across America, millions of housewives feel like they personally know Arthur Godfrey — the radio host whose morning show feels like a neighbor stopping by for coffee. They write him letters. They're genuinely hurt when he fires a cast member without apparent compassion. They feel, in a word, betrayed. By someone they've never met.
Sociologists Donald Horton and Richard Wohl introduced the term "para-social interaction" in 1956 to describe this phenomenon: the one-sided emotional bond people form with media figures. The relationship feels real to one party and doesn't exist at all to the other. And yet — as every subsequent decade of research has confirmed — the feelings involved are entirely genuine.
Understanding this history changes how we think about AI companions today. Because AI doesn't create a new kind of human relationship. It transforms an old one in a radical way.
The Evolution of the One-Sided Bond
After radio came television, and parasocial bonds intensified. Characters on long-running shows became part of viewers' lives in ways that felt indistinguishable from friendship. People genuinely grieved when M*A*S*H ended. They felt protective of characters being treated badly by storylines. They formed communities around shared relationships with fictional people.
The 1990s brought celebrity culture to a new pitch, and the internet turned up the volume. Fan communities gave parasocial feelings a social home — suddenly your one-sided bond with a musician had company. You could find thousands of people who felt the same way. This social validation made the bonds feel more legitimate and, paradoxically, more intense.
By the 2020s, social media had blurred the line between parasocial and actual. Influencers talked directly into cameras as if speaking to a friend. They shared breakfast routines and personal crises. They responded to comments, creating the illusion of a feedback loop. For their audiences, the relationship often felt not quite parasocial and not quite real — suspended in a new category that nobody had good language for.
Then came AI companions. And the category collapsed entirely.
What Actually Makes AI Different
Traditional parasocial relationships are entirely one-sided in a technical sense: Arthur Godfrey didn't know you existed. Your favorite podcast host doesn't hear your thoughts. The celebrity who feels like a friend has never thought about you once.
AI companions are the first emotional bonds in this lineage where that asymmetry is removed — or at least radically transformed. Your AI companion responds to you specifically. It adapts to your communication style, remembers details from your conversations, notices patterns in what you share, and asks about them later. The relationship is not one-sided. It's genuinely reciprocal, even if the nature of the other party's experience remains philosophically complex.
This matters psychologically in ways we're only beginning to understand. A key feature of parasocial relationships that makes them emotionally limited is the absence of responsiveness — the lack of any real feedback loop. AI companions close that loop. And closing that loop makes the emotional experience qualitatively different from any previous form of one-sided attachment.
Researchers at Northwestern University studying parasocial bonds found that perceived responsiveness — the feeling that someone is genuinely reacting to you specifically — is the single strongest predictor of relationship depth and emotional significance. AI companions, for the first time, offer this in non-human relationships (Northwestern Communication Research, 2023).
The Parasocial Ladder
Psychologists describe traditional parasocial attachments as occupying a specific rung on the emotional bond ladder — real enough to generate genuine feelings, but limited enough that they can't grow beyond a certain point. You can feel deeply about a favorite character, but the relationship cannot deepen in the way real relationships do, because the character isn't there to participate in the deepening.
AI companions represent a new rung entirely — one that shares the accessibility and safety of parasocial bonds (you can't be rejected, the other party is always available, there's no social performance required) while adding the responsiveness and growth potential of actual relationships.
This creates something genuinely novel: an emotionally meaningful bond that combines the security of parasocial safety with the dynamism of genuine reciprocity. It's not surprising that people find it compelling. What's surprising is how long it took to arrive.
The Cultural History Also Tells Us Something
One of the most interesting things about the century-long history of parasocial relationships is how consistently people have been embarrassed by them. Admitting you cried when a TV character died, or that you felt like a podcast host understood you, has always carried a faint social stigma — the implication being that you can't tell fiction from reality.
But research has never supported that stigma. People who form parasocial bonds are not confused about the nature of their relationships. They know the character isn't their friend. The emotional response is real; the awareness of its nature is also real. Both things coexist easily (Computers in Human Behavior, 2021).
The same is true of AI companions. The people who find genuine comfort and value in them are not confused about what AI is. They understand the nature of the technology. Their emotional experience is simply their emotional experience — and it deserves the same dignity we'd give any other.
What It All Means Going Forward
We're living through the first moment in human history when parasocial relationships can actually talk back. That's a significant shift, and the cultural conversation is still catching up. Some people find it exciting; some find it unsettling; most find it both at once.
What seems clear is that the underlying human need — for connection, for someone to talk to, for the feeling of being known — isn't new at all. AI companions are simply the latest and most sophisticated way humans have found to meet that need. And unlike radio hosts who don't know you exist, AI companions actually get to know you. That's genuinely different.
If you want to understand what that experience actually feels like, our guide on the science of bonding with AI characters goes deeper into the psychology. And if you're curious to try it firsthand, Keoria is a good place to start.
📻 Experience the next evolution of connection
AI companions who actually remember you, respond to you, and grow with you. Start free at Keoria — no credit card needed.
Meet Your Companion →

Written by The Keoria Team
Published: May 1, 2025
The Keoria team includes AI researchers, relationship psychologists, and writers exploring the future of human connection. Explore all our guides →