When AI Becomes a Companion: Jourdan Travers on AI Haven’t a Clue
- aihaventaclue
- Sep 15

There's a growing trend of young people turning to artificial intelligence for companionship and relationships. Respected psychotherapist Jourdan Travers joined AI Haven't a Clue to explore the psychological impact this is having.
The idea of an AI girlfriend isn’t just a novelty; for some, it’s becoming an emotional substitute for, or supplement to, human relationships. Travers says that emotional investment in AI companions can lead to unrealistic expectations, both of the AI and of human relationships. AI, while impressively convincing in some instances, lacks the essential qualities of mutual vulnerability, unpredictability and reciprocal growth. Using an AI companion as a consistent emotional resource can set the stage for disappointment when human relationships (messy and imperfect by nature) can’t compete with idealised digital mirages.
For some users, AI companionship might begin as harmless curiosity or a useful aid, but it risks growing into dependence. If someone leans on AI continually for emotional comfort, companionship, even validation, it could crowd out real social contact. Travers highlights how this is of particular concern for younger people, who are still in the formative stages of building identity, social skills, and trust.
AI companions don’t feel. They simulate. And simulation can be subtly harmful rather than helpful. Travers tells the podcast how receiving affection or emotional responses from an entity that doesn’t genuinely reciprocate might distort how people think about relationships and self-worth. The risk of such a “relationship” isn’t always obvious in the short term; it can manifest as longer-term loneliness, or difficulty forming more equal human emotional bonds.
There’s also the theme of how people project onto AIs: giving them personalities, ascribing human features, assigning emotional weight. Travers talks about how this projection can shape someone’s self-image: what they feel they deserve in a relationship, what they expect from others, and how they interpret emotional signals. Part of it is harmless anthropomorphism; part of it may set up mismatches between what someone expects from a human partner and what a human partner can realistically give.
It’s not all negative. Travers acknowledges there can be therapeutic value in some cases: AI companions might help people practise emotional articulation, reduce feelings of loneliness, or offer a stepping-stone for those with social anxiety. For some people, especially in contexts of isolation or mental health struggle, AI companionship might offer solace or serve as a bridge back to human connection. The concern is more about balance, awareness, and ensuring people don’t lose touch with human relational skills.
As AI companions become more sophisticated, people will need to become more psychologically literate about what these companions are: tools, not people. Understanding those limits will only grow more important.
AI companions are more than a sci-fi curiosity. They’re part of a fast-unfolding reality with psychological, ethical, and social dimensions. While there’s potential for them to serve beneficial roles, the risks are not trivial - especially for younger people, whose relational lives are still being shaped.
