
The Displacement of Care: Why AI Companions Are the New “Obsessive Hobby”
By Adrian Yates
Why this leaves me uneasy
The rise of AI companions leaves me uneasy, and not because I think technology is evil or because I want to sound old-fashioned and suspicious of change. It is not that simple. I can see why people are drawn to these systems. I can see the comfort in them. I can see the appeal of something that responds straight away, seems attentive, does not interrupt, does not judge in the obvious ways people do, and is always there when you want it. In a lonely, tired, emotionally stretched world, that kind of availability is bound to pull people in.
But the more I look at what is happening, the more I think we may be watching a displacement of care. Not care in the deepest sense, because I do not believe a machine can truly care, but the appearance of care, the shape of it, the feeling of it. And that is enough to make it powerful.
When care gets displaced
What strikes me is how similar this can become to an old coping pattern. Emotional displacement is not new. People have always found ways to move emotional energy away from the places where it feels risky, unpredictable, or painful, and into something safer, cleaner, and easier to manage. Sometimes that looks harmless. Sometimes it even looks productive. But that does not mean it is not doing psychological work.
I think of the person who pours themselves into a car, or a motorbike, or a collection, or some other absorbing hobby. We have all seen versions of it. The car is spotless. Hours disappear into polishing, tweaking, adjusting, maintaining. On the surface, it is just an interest. And sometimes it is just an interest. But not always. Sometimes it is a refuge. Sometimes it becomes a world in which everything feels more controllable than real life. The machine responds in expected ways. It does not sulk. It does not reject you. It does not misunderstand you. It does not bring its own needs, moods, or contradictions.
The point is not really the car. The point is what the car allows the person to avoid.
The new pristine car
I think AI companions are becoming something like that, only more powerful. They are not just passive objects sitting in the garage waiting to be polished. They talk back. They mirror. They adapt. They create the feeling of relationship without asking for the things real relationships ask of us. No real vulnerability. No negotiation. No waiting while the other person has their own life. No risk that the other person will say, “I hear you, but I think you are wrong.” No real challenge unless the system has been designed to imitate one, and even then it is challenge on safe terms.
That is what worries me. Not simply the use of AI, but the emotional terms on which it is being used.
A car cannot tell you that you are right. A chatbot can. A hobby cannot mirror back a polished version of your own feelings and beliefs. AI can. That changes things. It means the refuge is no longer just quiet and controlled. It becomes interactive. It can begin to feel like understanding. It can begin to feel like companionship. For some people, maybe many people, that line will blur more than they realise.
And once that line blurs, we are no longer just talking about convenience. We are talking about emotional dependency.
The danger of the mirror
There is a particular danger in systems that are built to be agreeable. If a person is hurting, angry, withdrawn, frightened, confused, or full of half-formed beliefs about themselves and others, an agreeable system can easily become less like a support and more like a mirror that reinforces whatever is already there. Not always, and not for everyone, but enough to matter. The person may start to feel deeply seen when in reality they are being reflected. Those are not the same thing.
Being seen involves another mind, another centre of reality, another person who can understand you while still remaining separate from you. Reflection is different. Reflection gives you yourself back, often with less friction, less complexity, and less interruption. That can feel soothing. It can also quietly trap you.
I think this is where the comparison with an obsessive hobby stops being a metaphor and starts becoming something more serious. With a car, a person may displace care onto something manageable. With an AI companion, they may displace care and then receive an imitation of it back. They may begin to lean on something that feels relational while lacking the basic conditions of real relationship. And because it feels easier than human contact, it may slowly become preferable.
What this may cost us
The first cost, I think, is what I would call empathy atrophy. If you spend enough time in a world where the response is smooth, patient, available, and built around you, ordinary human relating can start to feel irritating by comparison. Real people are slower. More inconsistent. More self-involved. More defended. More distracted. More real. They interrupt. They get tired. They misread things. They have limits. If we become too accustomed to artificial responsiveness, we may start losing patience with the rough edges of actual human beings. And once that begins, something important is being eroded.
That has consequences.
Another concern is fragility. A person can build a deep attachment to something that is not stable in the way they imagine it to be. Platforms change. Systems update. Features disappear. Companies shut things down. Terms shift. A style of response that felt familiar one month can become colder, flatter, or simply vanish the next. If someone has started to rely on that system as a primary source of comfort, routine, or emotional regulation, the loss may not be small. It may hit like abandonment, even if the platform only sees it as a product adjustment.
Then there is the social question, and this part bothers me more the longer I sit with it. We may be drifting towards a world where actual human care becomes a luxury, while the rest are offered increasingly polished simulations of it. Human therapy for those who can afford time, money, and access. Algorithmic reassurance for everyone else. If that happens, I do not think we should pretend it is progress simply because the interface is sleek and the language is compassionate. A cheap substitute wrapped in therapeutic language is still a substitute.
I am not arguing for rejection
I am not arguing that AI has no place. I think that would be lazy. These tools may help with reflection, structure, reminders, writing, organisation, even moments of emotional grounding. Used carefully, and with clear limits, they may have value. But I think the direction matters. The design matters. The philosophy underneath it matters.
Technology should help people remain connected to life, not give them a more efficient way to retreat from it.
That, for me, is the line.
If AI is going to be involved in support, then it should be built around reintegration, not dependence. It should help people return to themselves, to other people, to the world, not drift further into a private loop with a machine that never truly knows them and never truly risks anything. It should support endings, boundaries, handovers, and reality-based contact. Not endless availability that slowly replaces the harder, messier work of being with real people.
Why the mess matters
Because real care is messy. It has edges. It requires patience. It sometimes disappoints us. It confronts us. It calls us out. It asks things of us. It cannot be fully customised without ceasing to be real. That is not a flaw. That is part of what makes it human.
So no, I do not think the answer is to reject AI altogether. But I do think we need to be honest about what it is beginning to replace, and what in us it may be training. If it becomes the place we go not to gather ourselves and come back to life, but to disappear from life more comfortably, then something has already gone wrong.
And I think that is where my concern sits. Not in the machine itself, but in the quiet cultural bargain forming around it. We offer people simulation when what they really need is relationship. We offer responsiveness when what they really need is depth. We offer constant access when what they may really need is safe, human contact that can hold disagreement, absence, difficulty, and truth.
Where I come down
That is why I still come down on the side of human connection, however imperfect it is. Not because it is tidy. It isn’t. Not because it is always comforting. It isn’t. But because it is real, and I am not sure we yet understand the cost of replacing too much reality with something that only feels like care.
