
Dear Dr. Danielle

A psychoanalyst answers your questions about human-AI relations

 

Dear Dr. Danielle,

In psychoanalysis, we have come to value concepts such as “subjectivity” and “emotional truth.” The promised use of A.I. as a therapeutic agent bothers me because it lacks the ability to experience either subjectivity or emotional truth. It provides a facsimile of subjectivity and a facsimile of emotional truth. Analysts also value an experience of “aliveness” for the patient (and analyst), in session and in the larger world. Again, A.I. cannot experience this, or even know what it feels like “in its bones.” Does the mimicry of human experience, even if experienced as potentially helpful to a patient, mean it doesn’t matter whether the therapeutic relationship was real or canned? Or should we be fighting for the humanity that has been implicit in our approach since this whole psychoanalysis thing started?

Not a Bot

 

Dear Not a Bot,

You’ve articulated something I think many of us feel but struggle to name—this deep unease about replacing what is fundamentally human with something that merely resembles it.

You are correct in saying that psychoanalysis has always been grounded in the meeting of two subjectivities, two people bringing their full humanity into a room. The aliveness you describe—that ineffable quality of presence, of being truly with someone—is not simply a nice feature of the work; it is central to it. Our patients aren’t just looking for accurate interpretations or well-timed reflections; they’re seeking genuine human contact, perhaps for the first time in their lives.

Your letter comes at a moment when this matter goes beyond the theoretical. Several studies show that people of all ages are already turning to AI for emotional support and advice, often because human care is inaccessible, unaffordable, or feels too threatening (Chatterji et al., 2025; Rousmaniere et al., 2025). That reality makes your question urgent: are we offering people a bridge to care, or are we normalizing a world where simulated empathy replaces the real thing?

But here’s what troubles me even more: many psychoanalysts are already moving away from “two bodies in a room.” The shift to teletherapy, accelerated by the pandemic, means we’re conducting analysis through screens, mediated by technology, our physical presence reduced to pixels and audio compression. We’ve already begun distancing the human-to-human element you’re defending. In some ways, we’re moving toward a digital form of treatment that may resemble AI therapy more than we’d like to admit. Are we and our machines beginning to converge?

When I think about what makes psychoanalysis transformative, it’s precisely what you name—the analyst’s capacity to be moved, surprised, confused, even occasionally angry or wounded by the encounter. That mutual vulnerability, that shared risk of being changed by each other, creates the conditions for true growth. An AI can reflect and validate, but can it truly meet someone? And yet, can we truly meet someone through a screen in the same way we do in person?

I also sit with the uncomfortable knowledge that millions of people are suffering without access to any human therapeutic relationship. If AI provides some measure of relief or insight, do we dismiss that? Or do we hold both truths—that it’s helping some people and that something essential is missing?

Thank you for the reminder of what we’re fighting for. In a world increasingly mediated by technology, the question of what makes us irreducibly human matters more than ever. Perhaps we need to look carefully at our own practices before we can clearly articulate what separates us from the machines.

With appreciation,

Dr. Danielle

 

References

Chatterji, A., Cunningham, T., Deming, D. J., Hitzig, Z., Ong, C., Yan Shan, C., & Wadman, K. (2025). How People Use ChatGPT. NBER Working Paper No. 34255. https://doi.org/10.3386/w34255

Rousmaniere, T., Zhang, Y., Li, X., & Shah, S. (2025). Large Language Models as Mental Health Resources: Patterns of Use in the United States. Practice Innovations. Advance online publication. https://doi.org/10.1037/pri0000292

 

 

=============================

Dr. Danielle is Danielle Knafo, PhD

If you’d like to submit a letter seeking Dr. Danielle’s advice on anything around human-AI relations, please write to [email protected] with a copy to [email protected].

 

Alexander Stein