Psychoanalysis and Psychotherapy Research or the Flight of the Dodo
Robert M. Galatzer-Levy
Robert M. Galatzer-Levy M.S., M.D., is a clinical professor of psychiatry and behavioral neurosciences at the University of Chicago and a faculty member of the Chicago Institute for Psychoanalysis.
A clinical psychoanalyst and a psychodynamic psychotherapy researcher walk into a bar. They don’t notice each other.
The joke is painfully close to reality, which is a pity. Psychoanalysts have much to learn from psychotherapy researchers but we largely don’t take advantage of this resource. Day-to-day decisions about treatment or the timing of an intervention could be informed and improved by using psychotherapy research, but this rarely happens because the two fields have grown far apart.
The relationship of psychotherapy research and psychoanalysis has been unfriendly for a long time. Those few individuals who have spanned the fields often find themselves marginal in both.
The problem is partly historical. Systematic psychotherapy research originated in attacks on “unscientific” psychoanalysts, while psychoanalysts, beginning with Freud, claimed that research methods comparing groups of people of one type with groups of another (e.g., patients treated with one form of therapy to patients treated with another) inevitably fail to capture complexities that can be approximated only in extended case studies.
Equally important, the sociology of knowledge in the two fields is very different. Most analysts come from disciplines that demand students learn supposed facts based on teachers’ authority. Whether it is the physician’s “the hip bone is connected to the thigh bone” or the psychologist’s reinforcement schedules, in the healing and mental health professions the major focus of pre-analytic training is transmission of a body of knowledge accepted on teachers’ say-so. This approach continues in psychoanalytic institutes, where the discussion of evidence for claims is often absent and frequently unwelcome. In contrast, psychotherapy researchers, by and large, come from a world where questions of how one knows things are foregrounded and assertions of fact are made cautiously, with less attention to the facts themselves than to the bases on which they are asserted.
This leads to the most obvious difference between psychotherapy researchers’ work and that of psychoanalysts. Psychotherapy research publications are full of statistics. Statistics are almost completely absent from traditional psychoanalytic publications. The power of statistics comes from giving quantitative descriptions of phenomena and quantitative estimates of how likely a proposition is to be valid. In contrast, most psychoanalytic writing is judged on its narrative persuasiveness and usefulness to practicing analysts. Good arguments have been made for both kinds of truth. But conversations across the gulf are hard.
Let’s go back to the bar. A mutual friend of the analyst and the researcher happens in and suggests they sit together. After the usual pleasantries both start whining about work. The researcher says, “Money for psychotherapy research has dried up.” The analyst says, “I know how you feel. No one comes looking for analysis anymore. It’s such a pain! Potential patients keep telling me they’re getting ‘scientifically proven’ CBT or medication. Frankly I’m not feeling great about offering a treatment with no scientific basis, whatever that is.”
The psychotherapy researcher grins, “Well if you’d just kept up on my field you’d know what to say. You’d have a better idea about what treatments work for various problems and what interventions seem to be most effective.” The analyst, trying to be tactful, doesn’t mention he has no idea how to “keep up” with the researcher and that his eyes glaze over whenever the R word (research) is mentioned.
“Yeah,” the researcher says, “in 2017 we have really good data about how well PDT, that’s short for psychodynamic psychotherapy, works to relieve symptoms, and the news is good. Jonathan Shedler’s 2010 paper, which brings together the published research on the question, shows that PDT is effective, and at least as effective as many evidence-based therapies. You can tell your patients, and yourself, that they have out-of-date information. Things aren’t quite as rosy as I’m painting them. In the first place, there is very little empirical research on psychoanalysis proper. But worse, the Dodo isn’t dead.”
“I knew it,” says the analyst, who tended to become morose when he drank. “The part about analysis doesn’t worry me that much—most of what I do is, what did you call it, PDT. But this Dodo thing sounds ominous.”
“It is, sort of. You remember the caucus race from Alice in Wonderland, in which the Dodo announces, ‘Everybody has won, and all must have prizes.’ That’s pretty much how it is with psychotherapy, and in fact with mental health treatments generally. Once you get to the level where patients are getting reasonable treatment, the differences between the effectiveness of treatments become small compared to their main effects. The efficacy research doesn’t show PDT is the best treatment for everything, but it does put the lie to the idea that there are a bunch of scientifically proven treatments out there that are winners while poor PDT has been left far behind.”
Noting that the analyst seems to be staring into his beer, the researcher says, “You know, as long as I’ve got you here I should let you know there is a lot more to psychotherapy research than efficacy studies.”
The analyst, trying to be polite, says, “I keep hearing we need efficacy studies to prove analysis and PDT work. Isn’t that what it’s all about?”
“Not really. I understand that for political and insurance purposes you guys would like data that show your treatment works, but there is a whole lot more to psychotherapy research, and some of it could reshape the way you work. For example, there is the whole alliance idea. In the Paleolithic era of psychotherapy research, the 1960s and ’70s, researchers showed the alliance between therapist and patient was of primary importance in determining whether therapy worked and whether the patient stayed in treatment. It mattered more, for example, than the particular techniques the therapist used. But that wasn’t the end of the story. The next obvious questions were what interferes with the alliance and how can it be repaired if it is damaged? Here, there were findings that really could help in your bread-and-butter work.”
“I hate to say this, but that’s not really news; we were always taught to interpret the negative transference promptly, although I must admit that didn’t always work—half the time the therapy just blew up anyway.”
“Exactly. What a long series of studies showed was that interpreting the transference, in the sense of telling the patient that negative feelings about the analyst came from early, unconscious sources, tended to make things worse and came across more as excuses than useful insight. Instead, acknowledging the problem and trying to understand it in the present, including the therapist’s contribution, often led to the establishment of a stronger alliance. What was more, the experience of working through the problem with the therapist had a positive effect on therapeutic outcome. You would have to admit that would be helpful information for a working analyst.”
“It certainly would. Interrupted treatments are the bane of my existence, and I often feel helpless when I see things going down the tubes. I know I should ‘work on the alliance’ or deal with the negative transference, but frankly I’ve always wondered what that really means. By the time my patients tell me for the twelfth time, ‘I know what you’re going to say—this reminds me of my depressed mother’—I get the message that transference interpretations aren’t doing the trick, but the work you’re describing points to a different approach. Do you have any more tricks up your sleeve?” The analyst, who usually didn’t talk this way, had had one too many beers, and the whole situation was beginning to feel a bit too friendly.
“Yeah, I have a bunch,” says the researcher, who was feeling pretty good himself. “Here’s one: what works in one situation may not work in others. As Peter Fonagy puts it, the trick is to figure out ‘What Works for Whom.’ (It’s so British to get the grammar right.) There are a bunch of results I wish therapists knew. For example, remember how you were always taught psychoanalysis is for people who aren’t action prone? A wonderful study of children treated at the Anna Freud Centre showed that while neurotic kids benefited from analysis, the biggest therapeutic gains from intensive therapy were seen in the action-prone youngsters.”
Seeing the analyst has a happy smile on his face but not sure if it is the beer or the talk, the researcher, who had been dying to have a chance to tell one of these guys about his work, goes on, “And then there is the work on panic by Barbara Milrod and her co-workers. They showed rigorously that their form of PDT works but that when panic states are part of a complex psychological picture, using PDT makes more of a difference for patients than when the panic states seem to be isolated phenomena. And, of course, there was the late great Sidney Blatt who showed that depressed patients with introjective personality organization are more responsive to psychoanalysis while anaclitic patients are more responsive to more supportive psychotherapy. Choosing the right treatment for the patient really makes a difference.”
The analyst, who is beginning to feel groggy and a bit overwhelmed says, “I’m sorry but I have to get up in the morning to attend a session on proper analytic techniques. But hey, they never taught us this stuff at the institute. Where can I learn more?”
The researcher, who isn’t accustomed to quite this much beer, is beginning to mumble something about “latent growth mixture modeling,” but the mutual friend says, “You can find most of this and a lot more in the ‘Open door review of outcome and process studies in psychoanalysis’ on the IPA website https://www.ipa.world/ipa/IPA_Docs/Open%20Door%20Review%20III.pdf.”
Whipping out his iPhone, the analyst manages to find the review and laughs, “Four hundred eleven pages—do you really think …”
“It’s not meant to be read straight through, although it’s a surprisingly good read. But if you want something short, take a look at Shedler’s ‘The Efficacy of Psychodynamic Psychotherapy’ (The American Psychologist, February–March 2010; https://www.apa.org/pubs/journals/releases/amp-65-2-98.pdf). It’s fair to leave copies strewn about your waiting room for the science-minded potential patient. If you want a nice summary of a huge amount of research, take a look at J. Barber et al.’s ‘Research on Dynamic Therapies’ in M. Lambert’s Bergin and Garfield’s Handbook of Psychotherapy and Behavior Change. If you get scared by the statistics, just do what my yoga instructor recommends when teaching flying pigeon: ‘You can just skip over this part and practice easy seat or some other calming pose for a while.’”