
Cleopa Kinyua Njiru, a counselling psychologist, during an interview in Upper Hill, Nairobi, on July 2, 2025.
Cleopa Kinyua Njiru, a counselling psychologist based in Upper Hill with over 30 years of experience, warns of the psychological and emotional risks of using artificial intelligence (AI) in romantic and interpersonal relationships. While AI tools like ChatGPT offer convenience, he argues, they lack the basic qualities needed to nurture human relationships.
“AI is fundamentally different from a human person,” states Mr Kinyua. “It responds to what you tell it, but it can't read what is going on in you.” For individuals suffering from emotional difficulties such as anxiety, obsessive-compulsive behaviour, or low self-worth, AI can provide temporary solace, but usually at the cost of further psychological harm.
In Kinyua's view, the most alarming effect of AI on relationships is its potential to replace genuine human contact. “When couples use AI to communicate, for example, to send love letters or apologies, the message is artificial,” he states. “It's not what the heart wants to say. It's just what you prompted.”
This disruption goes beyond surface-level conversation. Over time, reliance on AI weakens emotional bonds. “The natural flow of connection is lost. The relationship becomes distorted because it's no longer rooted in real, vulnerable human exchange,” Mr Kinyua explains.
He compares AI-generated communication to using a middleman to express your feelings. “You’re not building intimacy with your partner, you’re outsourcing it. And that’s a problem.”
Mr Kinyua sees a growing risk of mental dependence on AI, especially among the emotionally vulnerable. “AI is used as a crutch,” he says. “People begin to rely on it not just for words, but for emotional support. That can alienate them from real-life relationships.”
This reliance can develop into behavioural addiction, much like excessive alcohol or pornography use. “At first, AI will seem helpful or pleasurable,” states Mr Kinyua. “But the brain adapts. You get hungry for more responses, more stimulation, more activity. This is tolerance. The original pleasure wears off, and you must escalate your use to feel as good.”
This cycle can lead to emotional and social isolation, as people begin to seek out AI interactions over human contact simply because they are more predictable, safer, and more rewarding in the short term.
One of the issues Mr Kinyua cites is AI's inability to diagnose or treat mental illness. “A therapist doesn't merely listen to what you say; we observe non-verbal cues, patterns, and mood,” he says. “AI cannot diagnose obsessive behaviour, depression, or anxiety. In fact, it might reinforce them.”

“For instance, someone with OCD may ask AI questions driven by their obsessions and then be presented with information that heightens the cycle. Similarly, anxious people can be escalated by AI's response style. It mirrors your input; it does not challenge it or redirect you therapeutically.”
For all its sophistication, he underscores, AI is merely a tool. “It's like a phone or a pen,” he says. “It can assist in learning or technical work, but it cannot replace human intelligence, empathy, or relational sensitivity.”
He notes that even professionals such as therapists, doctors, and engineers use AI only to assist with tasks they already understand deeply. “You need the foundation first. Otherwise, you’re blindly trusting a machine that doesn’t know you.”
When asked if, or when, AI might replace romantic or platonic relationships, Mr Kinyua does not hesitate: “Romantic connection is a mutual exchange: eye contact, tone of voice, shared smiles. AI can't do that. What it gives us is not companionship; it's simulation.”
He warns that users risk not just disappointment, but lasting psychological damage. “You can lose your natural ability to connect. And before you know it, you might already be addicted.”
His prescription? “Be human,” Mr Kinyua urges. “Go to therapy, learn conflict resolution face-to-face. Use AI as a tool, not a substitute.” He recommends early education, especially for children, on using AI responsibly and avoiding emotional dependence.
“The danger,” he concludes, “isn't in the tool but in assuming it can replace what only humans can do.”