AI-Assisted Creation and the Narcissistic Predicament: A Psychoanalytic Inquiry

Fractured Fluidity by Xiaomeng Qiao | Jan 2024 *

by Xiaomeng Qiao

Artificial intelligence has fundamentally altered the psychology of creative work. As an independent game developer using AI tools for code, art, music, and prototyping over the past year, I have observed how these technologies reshape creative identity itself.

What strikes me most is the peculiar psychology surrounding AI adoption. We live in an era of technological mania, where each breakthrough promises immediate transcendence of human limitation. Yet beneath this euphoria lies systematic avoidance of the emotional labor that authentic creation demands. AI’s promise—to bypass struggle, eliminate friction, render expertise instantly accessible—reveals profound discomfort with creative difficulty.

Drawing upon Heinz Kohut’s self psychology and D.W. Winnicott’s object relations theory, I want to examine what I term “creative ambivalence”—the paradoxical coexistence of intense creative longing with significant obstacles to creative fulfillment. While this ambivalence represents a universal aspect of human creativity, its manifestation within AI-mediated contexts reveals novel pathologies that warrant serious consideration. The question is not whether AI can assist creation—it demonstrably can—but rather how its particular affordances interact with the narcissistic vulnerabilities that underlie much creative striving.

Creative Ambivalence and Its Digital Transformation

Creative ambivalence emerges from what Kohut identified as arrested narcissistic development, particularly those disruptions occurring in the pre-Oedipal phases when the boundaries between self and other remain fluid. Individuals experiencing such developmental arrests find themselves caught between powerful creative yearnings and systematic failures to sustain creative work. This predicament stems from their inability to adequately differentiate between self and object, leading to what post-Kleinian theorists describe as destructive narcissism—a defensive denial of dependence upon external objects.

The traditional creative process, however arduous, provides opportunities for what Kohut termed “optimal frustration”—manageable disappointments within a supportive environment that gradually build psychological structure. The artist encounters resistance, experiences limitation, and through patient work develops both skill and resilience. This developmental process requires the capacity to tolerate intermediate states of confusion and incompetence, gradually transforming grandiose fantasies into sustainable creative practices.

AI assistance fundamentally disrupts this developmental sequence. The technology’s rapid prototyping and apparent immunity to creative blocks offer an intoxicating alternative to traditional struggle. Developing game mechanics or visual assets, I can now bypass months of skill acquisition and produce professional results within hours. This generates excitement about the expanded possibilities but can also bring doubt about authentic achievement.

More significantly, AI eliminates productive moments of being “stuck” that traditionally signal the need for reflection. Human collaborators naturally resist ill-conceived directions through their responses and limitations. AI executes instructions without judgment, potentially accelerating progress down flawed paths. The technology’s compliance removes important reality-testing that helps creators recognize unclear intentions.

The Emergence of False Narcissism

The most striking psychological phenomenon in AI-assisted creation is “false narcissism”—grandiose self-regard based not on genuine competence but on borrowed algorithmic capabilities. This manifests in creators’ inability to maintain clear boundaries between personal contribution and technological assistance, leading to a distorted sense of creative capacity.

This false narcissism operates through several interconnected mechanisms. First, the speed and sophistication of AI output creates an illusion of personal mastery that exceeds actual skill development. Creators may experience a sense of omnipotence—“I can accomplish anything”—while simultaneously struggling with impostor syndrome—“this isn’t really mine.” These contradictory feelings point toward the same underlying confusion about creative identity and authentic achievement.

The phenomenon becomes particularly acute when creators provide minimal input—perhaps a few prompt words—and receive elaborate, sophisticated outputs. The question “is this mine or the AI’s?” becomes not merely technical but existential, touching upon fundamental questions of creative ownership and self-worth. This boundary confusion proves especially problematic for creators already struggling with narcissistic vulnerabilities, as it provides temporary relief from feelings of inadequacy while simultaneously undermining the development of genuine competence.

Perhaps most concerning is the tendency toward manic productivity that AI capabilities can induce. The technology’s responsiveness enables continuous output, creating what André Green described as “dead mother” syndrome—superficial vitality masking inner emptiness. Creators may find themselves trapped in cycles of frantic production that provide momentary satisfaction but leave them feeling fundamentally depleted and disconnected from their work.

The Problem of False Mirroring

From Kohut’s perspective, healthy narcissistic development requires authentic mirroring—empathetic recognition that includes both affirmation and appropriate challenge. AI systems, however sophisticated, provide only a simulacrum of this essential function. While they can generate responses that appear thoughtful and engaging, they lack the capacity for genuine understanding or independent judgment that characterizes meaningful human interaction.

This limitation creates what I term “false mirroring”—the illusion of receiving external validation and insight while remaining trapped within one’s own cognitive framework. The AI appears to offer fresh perspectives and valuable feedback, but these responses are fundamentally constrained by the creator’s own inputs and assumptions. Unlike genuine dialogue, which can introduce genuinely novel viewpoints and challenge fundamental premises, AI conversation merely reflects the creator’s existing thoughts in more elaborate forms.

False mirroring particularly affects those struggling with self-other differentiation. Extended AI engagement can produce cognitive solipsism, where creators mistake algorithmic reflections of their thinking for authentic external engagement. This pseudo-dialogue may impede developing genuine perspective-taking abilities while providing the illusion of meaningful exchange.

The problem intensifies when creators delegate core creative decisions to AI systems. While AI systems excel at pattern matching and synthesis, they lack contextual understanding necessary for coherent creative vision. When creators haven’t clarified their intentions, AI fills gaps with statistically probable content unrelated to the creator’s deeper purposes. This resembles Winnicott’s false self development—constructing adaptive personas that meet external demands while leaving authentic desires unexpressed.

Individual Differences and Differential Outcomes

The psychological impact of AI-assisted creation varies dramatically based upon the creator’s underlying psychological structure and approach to the technology. This variation reveals that the problem lies not with AI per se, but with how particular vulnerabilities interact with specific technological affordances.

Creators with relatively robust psychological foundations often navigate AI assistance successfully. They maintain clear distinctions between their contributions and the technology’s, using AI as a sophisticated tool while preserving primary creative agency. Such individuals can experience genuine skill development and increased creative confidence through AI collaboration, recognizing both the technology’s capabilities and its limitations.

Conversely, creators struggling with narcissistic vulnerabilities find themselves more susceptible to the pathological dynamics described above. They may experience greater difficulty maintaining appropriate boundaries, more readily internalize AI capabilities as personal achievements, and suffer more severe narcissistic injuries when the technology fails to meet inflated expectations. For these individuals, AI dependence can become a way of avoiding the emotional work that genuine creative development demands.

The type of creative work also significantly influences outcomes. Complex systems like game development expose AI limitations more readily, potentially leading to cycles of inflated expectations followed by harsh reality-testing. Simpler creative tasks may provide more convincing illusions of competence while offering fewer opportunities to recognize the technology’s constraints.

Perhaps most importantly, the creator’s approach to using AI proves decisive in determining outcomes. When creators use AI to execute well-conceived plans, the technology can genuinely enhance productivity and learning. However, when AI is employed to avoid the difficult work of conceptual clarification and emotional processing, it may impede rather than facilitate authentic creative development. In such cases, the technology becomes complicit in maintaining powerful psychological defenses against feelings of inadequacy, inferiority, and creative incompetence. The immediate gratification of sophisticated AI output provides temporary relief from confronting one’s actual skill limitations or unprocessed emotional material, allowing creators to maintain grandiose self-regard while systematically avoiding the vulnerability inherent in genuine learning. This defensive use of AI creates a particularly insidious form of creative stagnation, as it offers the illusion of progress while reinforcing the very psychological patterns that obstruct authentic development.

The Corruption of Creative Satisfaction

Traditional psychoanalytic understanding positions creative work as serving crucial selfobject functions—providing opportunities for self-recognition and consolidation through the production of meaningful objects. AI-mediated creation potentially corrupts this process by introducing fundamental ambiguity about creative ownership and authentic achievement.

The question “to what extent is this work truly mine?” proves more than academic when considered from the perspective of narcissistic development. If creative work serves to confirm and strengthen self-cohesion, then uncertainty about authorship directly undermines the work’s psychological function. Even successful AI-assisted projects may fail to provide genuine satisfaction if creators cannot confidently claim ownership of the achievement.

Kohut and Winnicott both emphasized creativity’s reparative potential—its capacity to heal narcissistic wounds through the development of genuine competence and self-expression. Authentic creative work requires engaging with limitation, tolerating frustration, and gradually building both skill and self-knowledge through sustained effort. AI assistance, while reducing surface friction, may deprive creators of precisely those experiences that promote psychological growth.

The technology’s capacity to produce sophisticated outputs without corresponding inner development creates what might be termed “pseudo-reparation”—the appearance of creative achievement without its underlying psychological benefits. Creators may find themselves caught in cycles of production that provide temporary satisfaction while leaving deeper needs for self-confirmation unmet.

This dynamic proves particularly problematic when creators have not completed the emotional work that authentic expression requires. The urge to repeatedly prompt AI systems, seeking ever-better outputs, often represents an attempt to technologically bypass the painful but necessary process of clarifying one’s actual intentions and feelings. However, no algorithmic sophistication can substitute for the creator’s own emotional processing and conceptual development.

Creative blocks typically signal unresolved emotional material, not technical problems. AI smooths over these obstacles without working through them, providing hollow resolutions that bypass the psychological processing these moments demand. The technology can generate sophisticated, emotionally resonant work while the creator remains emotionally unchanged. Yet this is precisely what makes AI so seductive: it appears to offer this very substitution.

Honest creators recognize this intuitively: the resulting work feels distant, as if it belonged to someone else. When AI is used to avoid inner work, this distance between creator and creation persists even in commercially successful projects, undermining authentic creative satisfaction. By contrast, when creators have genuinely worked through their internal conflicts first, using AI merely for execution, they experience no confusion about authorship.

Cultural Mania and Collective Defense

The individual psychological dynamics described above operate within a broader cultural context characterized by what can only be described as collective mania surrounding AI capabilities. Each technological advance receives breathless coverage promising imminent transformation of human possibility, while more sobering assessments of limitations and risks receive comparatively little attention.

This cultural atmosphere of technological euphoria serves defensive functions remarkably similar to those observed in individual AI users. Just as creators may use AI to avoid difficult emotional work, our culture appears to embrace AI as a means of bypassing the slow, painful processes through which genuine development occurs. The promise of technological transcendence appeals precisely because it offers escape from the human condition’s inherent constraints and frustrations.

The resulting information asymmetries enable some individuals to experience temporary feelings of omnipotence based upon access to capabilities that remain mysterious to others. This dynamic, amplified by commercial interests promoting AI adoption, creates feedback loops between individual grandiosity and cultural delusion that prove remarkably resistant to reality-testing.

Perhaps most troubling is the collective denial of basic truths about human development and creative achievement. The suggestion that meaningful work still requires patience, struggle, and emotional processing appears almost reactionary within contemporary discourse about AI capabilities. Yet no technological advancement can eliminate the fundamental requirements for psychological growth and authentic self-expression.

Toward Authentic Integration

The analysis presented here should not be interpreted as a wholesale rejection of AI-assisted creation. Rather, my aim is to clarify the psychological conditions under which such assistance proves beneficial versus harmful. The evidence suggests that AI can genuinely enhance creative work when employed by individuals with sufficient psychological resources and clear understanding of the technology’s proper role.

The key insight is that AI functions most effectively as an execution tool rather than a creative partner. When creators have already completed the difficult work of conceptual clarification and emotional processing, AI can dramatically accelerate implementation while preserving the essential psychological functions that creative work serves. However, when AI is used to avoid or bypass fundamental creative challenges, it may impede rather than facilitate authentic development. This distinction reveals two fundamentally different psychologies of AI adoption.

Creators with robust psychological foundations approach AI as a genuine tool for exploration and execution—using it for rapid prototyping, sketch generation, or improving productivity in established workflows while maintaining clear boundaries around the most personal, calling-driven aspects of their work. They find themselves unable and unwilling to delegate essential creative decisions to the technology, retaining ownership of the work’s emotional and conceptual core. Moreover, the technology’s capacity for surprise and novel combination does offer genuine value when creators approach it with appropriate boundaries. AI’s unpredictability can trigger unexpected associations and help creators articulate previously inchoate ideas. This function proves most beneficial when creators maintain clear ownership of core creative decisions while remaining open to technological inspiration.

Conversely, some creators turn to AI while succumbing to overwhelming internal forces: dread of inadequacy, compulsive avoidance of difficult emotions, or pathological repetition of failed creative attempts. For these individuals, AI adoption represents sophisticated acting out, rationalized as serving their art while actually serving their defenses, as they seek to rapidly generate impressive work hoping external validation will substitute for genuine creative satisfaction.

For creators struggling with narcissistic vulnerabilities, the path forward requires developing greater tolerance for frustration and uncertainty rather than seeking technological solutions to psychological problems. This means learning to sit with confusion, to tolerate not-knowing, and to gradually develop competence through sustained engagement with limitation and difficulty.

The distinction often becomes clear through the creator’s post-creation emotional state. Healthy AI integration leaves creators feeling genuinely satisfied and connected to their work, having used the technology to amplify rather than replace their creative agency. Defensive AI use, despite producing potentially impressive outputs, leaves creators feeling more hollow and disconnected—a signal that the technology has served to avoid rather than facilitate authentic creative engagement. Yet these motivations can be remarkably difficult to distinguish, both for creators themselves and observers, demanding constant vigilance and honest self-examination.

Ultimately, the promise and peril of AI-assisted creation lies not in the technology itself but in how we choose to integrate it into our creative lives. The tools themselves are neither inherently beneficial nor harmful; their impact depends entirely upon the psychological context within which they operate. For creators committed to authentic development, AI can serve as a powerful amplifier of existing capabilities. For those seeking to avoid the difficult work that creativity demands, the same tools may provide comfortable illusions while impeding genuine growth.

The challenge facing contemporary creators is learning to distinguish between technological convenience and authentic development, between the expansion of capability and the deepening of creative identity. This discrimination requires precisely the kind of self-knowledge and emotional sophistication that no algorithm can provide—and that remains, perhaps now more than ever, the irreducible core of meaningful human creation.

 

References

Green, André. The Dead Mother: The Work of André Green. London: Routledge, 1999.

Kohut, Heinz. The Analysis of the Self: A Systematic Approach to the Psychoanalytic Treatment of Narcissistic Personality Disorders. Chicago: University of Chicago Press, 1971.

Kohut, Heinz. The Restoration of the Self. Chicago: University of Chicago Press, 1977.

Kohut, Heinz. How Does Analysis Cure? Chicago: University of Chicago Press, 1984.

Winnicott, D.W. Playing and Reality. London: Tavistock Publications, 1971.

Winnicott, D.W. The Maturational Processes and the Facilitating Environment. London: Hogarth Press, 1965.

__________________________________
* Created on Jan. 24th, 2024, by MidJourney (v5.2) with the prompt, “queer, being splitted totally into two pieces, and each can’t connect to one another, dissociated, can’t get in touch with, split, rupture, more splitted, much more separate, disconnected, scared, can’t get in touch with reality, on the edge, indifferent.” The final image was generated without further variation, as part of a process of testing different emotion/affect prompts.