Do you talk to an AI therapist?
Researchers Tom Van Daele and Dries Van Craen on their digital clone of Dirk De Wachter: “We wanted to show how scarily simple it was.”
Am I normal? Should I stay or should I go? Why can't I say no? Maybe these are your questions too. Questions you used to swallow. Or ponder at night. Chances are you are asking them now. Not out loud, but as you type. To ChatGPT. We talk to Thomas More researchers Dries Van Craen and Tom Van Daele. They created an AI version of psychiatrist Dirk De Wachter. To expose the dark side of AI therapy. And to wake up Flanders. The result was broadcast in ‘Pano’ on the Belgian TV channel VRT 1.
We first asked ChatGPT itself: how dangerous is AI therapy? The answer: “AI therapy can be valuable, especially as support or for minor problems, but it is not (yet) a replacement for a human therapist.” The fact that “yet” is in brackets is striking. And interesting.
“AI therapy is being embraced with an eagerness that worries me,” explains Tom Van Daele. He is a clinical psychologist and research coordinator for Psychology and Technology at the Thomas More Expertise Center for Care and Welfare. Since 2013, he has been researching the power and limitations of apps and AI for mental health. “I've been pushing and pulling at this for 10 years,” he says decisively. “Apps, wearables, virtual reality. Every time, I had to convince professionals and patients that it could be useful.” And then AI therapy came along. “It was an instant hit. Everyone just jumped on board. And suddenly I was the one who had to say, ‘Whoa, whoa, whoa, slow down.’”
One in three
That eagerness is also reflected in the figures. Recent European research has shown that one in three psychologists is experimenting with AI in therapy. However, there are hardly any tools that are really designed for this purpose. At the same time, only one in ten uses virtual reality, even though that technology has been around for 25 years and is much better established. In the US, one in four people have already talked to an AI therapist about their feelings and relationships. These are often vulnerable people who cannot afford real help. In Belgium, figures are not yet available. “But here too, we suspect that this involves a significant group of people,” adds Tom.
Digital Dirk
How easy is it to create such an AI therapist? When Pano asked the question, the technology had just become available. “We started looking into whether it was feasible,” says Tom. “And yes, it was. Three hours later, I had a link in my mailbox from Dries. With a basic, functioning Dirk De Wachter.” Dries Van Craen is an extended reality researcher and software developer at Thomas More's Expertise Center for Sustainable Entrepreneurship and Digital Innovation. As a developer, he is used to keeping his finger on the pulse of technology. “I simply asked ChatGPT to give me instructions on how to do it. I knew it was easy to do without domain knowledge, but I didn't realize it was that easy...” He laughs. “Amazing.”
AI Dirk was intended as a wake-up call. “We wanted to show how easy it was to create something that appeared to be good,” explains Tom. Dries: “We didn't fake anything. Fifteen seconds of audio was enough for the voice. Digital Dirk sounded real. The question remains: is this how we want AI therapy to be?”
The dark side
The world's first AI-linked suicide happened here, in our country. Belgium, March 2023. A Walloon man with mental health issues. “The chatbot certainly didn't help him,” Tom says cautiously. It wouldn't be the last. There are lawsuits against chatbot creators. “These are extreme cases,” says Tom. “But they illustrate what can go wrong when vulnerable people start using AI therapy without supervision.”
Why it fails
And then there's privacy. It feels intimate. You're sitting in your armchair or lying in bed. But you're handing your data to commercial companies. Because of a lawsuit brought by The New York Times, OpenAI is currently required to store everything, indefinitely. “When you talk to an AI therapist about your relationship, your fears, your depression, you are sharing your most personal thoughts with the largest technology companies in the world,” warns Tom. “There are positive stories. But the key lies in monitoring the boundaries: what can AI therapy do and, more importantly, what can't it do?”
Another striking finding: in terms of values, ChatGPT most closely resembles the average Dutch person. “It may be useful for us, but what about Chinese users? They grow up in a culture where the collective often takes precedence over the individual. ChatGPT reasons based on Western values. That creates a mismatch.”
What does work
Dries is now working on hyperrealism: avatars with micro-expressions, voices that are indistinguishable from the real thing. “The voice, we're there. The visual aspect will happen in the next two or three years. It's only a matter of time before you can no longer tell the difference between a real person and an AI avatar. It's very exciting to translate this to museum environments, educational applications, places where this technology can offer real added value.”
Rules of the game
“Through my research, I have become convinced that the selective and thoughtful use of technological applications by professional caregivers can indeed make a difference,” says Tom. “But then it has to be about tools that support the professional, not replace them.” AI's role would then shift to administration: writing reports, note-taking, compiling information. “We don't necessarily want to create a single AI therapist. We want to ensure that therapists have enough time to talk to their patients.”
Using an AI therapist anyway?
Our researchers offer four rules of thumb:
- It is a tool, not a person. You may become frustrated. “Stop lying to me,” you may say. But it is pure data. Not a human being.
- It knows a lot, but it is not all-knowing. It could be completely wrong. Be critical.
- Privacy does not exist. It feels like a confidant in a private setting. But it really isn't. Think about what you share.
- Know what you want. Do you have a specific problem and want someone to help you think of solutions? Then AI can help. But have you experienced something bad and are you looking for recognition? Do you want to be heard and understood? Then AI will give you empathetic answers, but there is nothing behind them. No understanding.
The therapist of the future?
Meanwhile, Tom and Dries continue to build. To explore. To push boundaries. “We're in a playground that keeps getting bigger,” says Dries. “So we're not going to get bored.” Tom concludes: “Therapy without technology definitely has a place in the future. But the therapist of the future? They will always use technology.”
So, if you lie awake tonight with a tough question, don't type it into a chat box. Say it out loud. To someone you know. Who talks back. Who hugs you. Who stays.
- Have you ever used a chatbot for your mental well-being? Take part in Thomas More's survey on AI, technology, and mental health here.
- Struggling? Contact https://findahelpline.com/organizations/zelfmoordlijn-1813.





