
SCHIZOCOMPUTATIONAL CAPITALISM

Liviu Poenaru, PhD; Oct. 4, 2025


Abstract


Background 
Building on both clinical psychopathology and critical theory, schizocomputational capitalism describes the current digital-economic order that mirrors and produces the fragmentation characteristic of schizophrenia. Classical psychoanalytic conceptions of the “split self” (Freud, Lacan) and Deleuze & Guattari’s notion of “capitalist schizophrenia” provide the groundwork for analyzing how algorithmic capitalism transforms human cognition, affect, and social bonds. Contemporary societies are immersed in computational infrastructures that not only reflect mental disintegration but actively reproduce it through data extraction, emotional manipulation, and symbolic overload.


Goal 
The study aims to conceptualize schizocomputational capitalism as both a metaphor and a systemic condition, demonstrating how algorithmic and affective processes destabilize subjectivity and democratic life. The goal is to show that the political economy of digital capitalism is psychopathogenic: it generates cognitive fragmentation, emotional dysregulation, and social disintegration as inherent outcomes of its operation.


Method 
The analysis combines psychoanalytic theory, political economy, and computational social science. It synthesizes literature from critical media studies, empirical research on algorithmic affect modulation (e.g., Facebook’s emotional-contagion experiment), and sociological data on polarization and misinformation. Three analytical axes are developed: (1) the contradiction economy, which commodifies moral and ideological opposites; (2) affective and algorithmic colonization, which captures and manipulates emotional life; and (3) symbolic overload, which saturates meaning through continuous semiotic noise and simulation.


Results 
The findings reveal a coherent pattern: digital capitalism monetizes dissonance and confusion. Algorithmic systems exploit negativity bias and emotional contagion to sustain engagement, transforming users into fragmented affective feeds. Computational infrastructures blur the distinction between authentic and induced emotion, while hyper-simulated information environments collapse the difference between truth and falsehood. These dynamics collectively erode the “sovereign self” and the shared symbolic frameworks necessary for democratic deliberation.


Interpretation and Perspectives 
Schizocomputational capitalism thus appears as a pathogenic regime of sense: a system that profits from psychic disorder while undermining collective rationality. Strengthening democratic life will require innovations in regulation, education, and digital architecture—but also recognition that artificial intelligence and corporate power may already outpace human interpretive agency. Whether reform is still possible or a civilizational rupture is inevitable remains an open question. Yet diagnosing these schizoid traits is a necessary first step toward re-imagining a healthier synthesis of technology, psyche, and polis.

 

Introduction

 

In clinical psychopathology, schizophrenia is a severe mental disorder characterized by profound disruptions in cognition, emotion, and sense of self. The DSM-5 (Diagnostic and Statistical Manual of Mental Disorders, 5th ed.) defines schizophrenia by symptoms such as persistent delusions, hallucinations, disorganized speech and behavior, and blunted affect, with significant impairment in social or occupational functioning. Etymologically meaning “split mind,” schizophrenia was so named by psychiatrist Eugen Bleuler to capture a fragmentation of mental functions—thoughts and perceptions that no longer cohere into a unified reality. Clinically, individuals with schizophrenia experience a breakdown in the ability to distinguish internal fantasies from external reality, often manifesting as a shattered continuity of self and a collapse of consensus about the real (American Psychiatric Association [APA], 2013). 


In Freudian psychoanalysis, this disorder was initially understood as a form of narcissistic withdrawal. Freud observed that patients with “dementia praecox” (an early term for schizophrenia) divert their libidinal energy away from external objects and relations, turning it inward onto the ego (Freud, 1914). In his essay On Narcissism, Freud noted the “withdrawal of the libido from external objects” in schizophrenia and the resulting megalomania and detachment from reality. The schizophrenic, in this Freudian view, regresses to a narcissistic state, making genuine interpersonal connections and therapeutic transference extraordinarily difficult. 

Lacanian psychoanalysis further refines the understanding of psychosis (of which schizophrenia is a paradigm case) as a structural failure in the symbolic order. According to Jacques Lacan, schizophrenia results from the “foreclosure” of the Name-of-the-Father – the primordial signifier that anchors the subject in shared social reality.

 

In Lacan’s theory, the Name-of-the-Father represents the internalized law and language of society; if this signifier is excluded from the psychic structure, the individual cannot stably enter the symbolic realm of meaning. The outcome is a collapse of the social bond and a cascade of unsymbolized experiences – hallucinations, delusional interpretations, and a fragmented sense of self (Lacan, 1957/1977). Thus, across clinical and psychoanalytic dimensions, schizophrenia epitomizes a fragmentation of subjectivity: thoughts, affects, and identity cease to integrate into a coherent narrative, leaving the subject inundated by disjointed voices and perceptions that resist organization.


Critical theorists have appropriated the term “schizophrenia” beyond its clinical meaning to diagnose pathologies of contemporary culture and economy. Notably, Gilles Deleuze and Félix Guattari (1983) reconceptualized schizophrenia as both metaphor and method in their work Anti-Oedipus: Capitalism and Schizophrenia. They distinguish the medical condition from a broader “schizophrenic process”: a liberatory breaking of entrenched structures and meanings. For Deleuze and Guattari, capitalism itself exhibits a schizoid tendency; it perpetually destabilizes and reconfigures social relations in the pursuit of profit. Capitalism is described as an unceasing process of deterritorialization – it dissolves traditional bonds, values, and certainties – followed by immediate reterritorialization onto the axioms of the market and private interest. 


In Deleuze and Guattari’s analysis, the decoding of flows and the deterritorialization of the socius thus constitute the most characteristic and most important tendency of capitalism, constantly pushing toward a “genuinely schizophrenic limit” of complete social disintegration. However, capitalism “schizophrenizes” only to capture and exploit the resulting fragments: every liberated desire or dissenting impulse is rapidly absorbed back into commodity circulation and surveillance. Deleuze and Guattari’s provocative formulation of “capitalist schizophrenia” is not a clinical claim but a sociopolitical diagnosis: an analogy between the fragmented, delusional consciousness of the psychotic individual and the frenetic, contradictory dynamics of late capitalism. This critical-theoretical lens suggests that modern power operates not by imposing a single normative order, but by modulating chaos—embracing contradictions, proliferating images and affects, and capitalizing on disorientation.


Integrating these insights, the present work develops the concept of “schizocomputational capitalism.” This term denotes the current configuration of digital or algorithmic capitalism as both metaphorically schizophrenic and as a structural producer of schizophrenia-like effects in society. On one hand, “schizocomputational” is metaphorical: it highlights how today’s socio-economic order mirrors the fragmentation of the self associated with schizophrenia. Ubiquitous computing networks and social media platforms act like an externalized nervous system, fraying attention and splintering identity into data-defined pieces. On the other hand, the term also indicates a concrete structural condition: contemporary capitalism, driven by computational technologies, actively engenders states of cognitive and affective disorganization for profit. It is schizo- in that it thrives on the production of division, contradiction, and confusion, and -computational in that it uses algorithmic processing of massive data to achieve this at scale. 


Under schizocomputational capitalism, the fragmentation of subjectivity is not just a psychopathological metaphor but an everyday lived reality: individuals experience their sense of self and community undermined by the ceaseless flux of online information flows. Algorithmic modulation of affect becomes a routine instrument of economic and political power, as machine-learning algorithms curate what we feel and believe in order to maximize engagement. As we shall explore, this system entails systematic manipulation of belief – truth and falsehood circulating interchangeably – resulting in a populace that oscillates between extremes of credulity and cynicism. 


Ultimately, these processes have grave implications for democracy. The destabilization of democratic life is a defining outcome of schizocomputational capitalism: a polity that is affectively polarized, epistemologically fragmented, and unable to sustain the shared reality necessary for reasoned public discourse. This essay analyzes these dynamics through three interrelated lenses: the contradiction economy, affective and algorithmic colonization, and symbolic overload. Together, these analyses will demonstrate how contemporary capitalism, turbo-charged by digital computation, operates through a schizoid logic – one that simultaneously fragments and exploits human experience – and why this poses a profound challenge to subjective coherence and democratic society.


The Contradiction Economy


One hallmark of schizocomputational capitalism is an economy of contradictions, wherein opposing values and emotions are not resolved or balanced, but instead simultaneously intensified and monetized. Digital platforms and social networks create a structural double bind: they herald themselves as arenas of personal expression, community, and virtue while their underlying algorithms thrive on conflict, extremism, and vice. For example, a social media platform might publicly celebrate ideals of tolerance, well-being, and authentic connection, yet its engagement-driven algorithms preferentially amplify outrage, envy, and fear if those emotions yield higher clicks and longer screen time. Users thus receive mixed injunctions at every turn. On the surface, they encounter constant exhortations to be positive, compassionate, and socially conscious (e.g. viral charitable challenges, corporate virtue signaling). Simultaneously, the same platform’s recommendation systems aggressively push polarizing news, sensational conspiracies, or provocative content that generates hate, anger, and anxiety. 

 

The result is a kind of dissonance at the core of the system: as I have argued previously (Poenaru, 2023), social media produce and maintain a permanent double bind, a sort of structural dissonance. On one side, they present themselves as new temples of virtue, peace, tolerance, creativity, and individual expression; on the other, their algorithmic architectures exploit the most archaic drives of desire and rivalry, fueling comparison, anger, fear, and greed. In this way, the positive and negative poles of human affect are both harnessed as raw material. The economy of this process lies in rendering every polarity profitable: virtue and vice, attraction and repulsion, affirmation and negation all become grist for the attention mill.


Contemporary capitalism no longer seeks to smooth over or resolve social contradictions – instead, it commodifies conflict. Each dichotomy or cultural rift becomes another opportunity for engagement, data collection, and targeted advertising. As Deleuze and Guattari presciently noted, capitalism is uniquely capable of integrating dissidence and opposition back into its operational logic. A striking illustration is how online platforms handle moral and political polarization. Outrage and division, which in earlier eras might have been seen as signs of social dysfunction or crises to be managed, are in the digital attention economy desirable assets. Research in psychology has long established that negative stimuli and events command greater attention and have stronger effects on memory and behavior than neutral or positive ones (the “negativity bias”). Indeed, Baumeister et al. (2001) famously summarized this principle in their meta-analytic finding that “bad is stronger than good” in human cognition. Social media algorithms implicitly exploit this asymmetry: posts that induce anger or fear (for example, inflammatory political misinformation or sensationalized news) tend to generate more engagement and prolonged user attention than posts that induce contentment or agreement. As a consequence, the systems favor divisive content. 
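The asymmetry just described can be made concrete with a deliberately toy ranking sketch. Every name and number below is an illustrative assumption, not any platform's actual code: a scorer that merely weights predicted "angry" engagement more heavily ends up promoting divisive content, even though nothing in the code represents divisiveness as such.

```python
# Toy model of an engagement-maximizing feed ranker. The weights and
# probabilities are invented for illustration; no real platform's
# ranking function is being reproduced here.
from dataclasses import dataclass

@dataclass
class Post:
    title: str
    prob_click_if_angry: float    # hypothetical model outputs
    prob_click_if_neutral: float

def engagement_score(post: Post, negativity_weight: float = 2.0) -> float:
    # Negative affect gets a higher weight as a stand-in for the
    # "negativity bias": bad is stronger than good at holding attention.
    return (negativity_weight * post.prob_click_if_angry
            + post.prob_click_if_neutral)

feed = [
    Post("Calm policy explainer", prob_click_if_angry=0.1, prob_click_if_neutral=0.4),
    Post("Outrage-bait headline", prob_click_if_angry=0.5, prob_click_if_neutral=0.1),
]
ranked = sorted(feed, key=engagement_score, reverse=True)
print([p.title for p in ranked])  # the outrage post ranks first
```

The point of the sketch is that "favoring divisive content" requires no explicit intent anywhere in the pipeline; it falls out of optimizing a single engagement metric that inherits the human asymmetry between negative and positive stimuli.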


The contradiction economy ensures that every outrage becomes a data point, every social schism a revenue stream. Far from attempting to bridge differences or promote consensus, platform algorithms often exacerbate ideological extremes because polarization itself is profitable. The more the world is cleaved – split, in the psychoanalytic sense of clivage – into us-versus-them mentalities, the more intensely users interact, share, and produce monetizable data exhaust. In Shoshana Zuboff’s terms, these platforms operate as part of a new surveillance capitalism that makes money by tracking, predicting, and shaping human behavior at scale (Zuboff, 2019). From Zuboff’s analysis, we glean that extracting profit from human experience has few moral guardrails; whatever keeps users scrolling and clicking is the business model. If outrage and conspiracy theories keep people engaged, they will be algorithmically amplified, regardless of the truth or the social harm.


The commodification of contradiction can be seen in how every polarity becomes a market segment. Causes and their counter-causes are equally monetized. Environmental sustainability is promoted in brand marketing, while advertisements for gas-guzzling luxury SUVs appear in the next feed scroll; an anti-establishment meme flourishes on the same video platform that hosts corporate PR campaigns. Rather than resolve the tension between, say, ecological consciousness and consumerist desire, digital capitalism simply sells both, often to the same individuals in different moments.


The phenomenon of corporations profiting from “woke” social justice messaging while simultaneously lobbying against labor or environmental reforms exemplifies this two-faced logic. It can be seen as a form of “biface capitalism” – Janus-like, with one visage of humanistic values and another of ruthless exploitation. Under schizocomputational capitalism, sincerity and cynicism cohabit the same structures. This systemic duplicity leads to psychological strain on the individual level: users internalize contradictory imperatives (to care and to consume, to empathize and to outperform others, to express individuality and to seek constant validation). The subject becomes a ‘feed’ itself: an assemblage of contradictory positions… no longer a stable identity, but a succession of affective states dictated by the attention economy. In other words, the person in a contradiction economy oscillates rapidly between moral outrage and narcissistic display, between collective virtue and competitive ego – mirroring the splitting and identity diffusion that, metaphorically, recall schizophrenia’s disruption of a unitary self. The “contradiction economy” thus refers to a mode of capitalism that feeds on conflict and, in doing so, reproduces in society the very disorientation and divided psyche that define the schizoid condition.


Affective and Algorithmic Colonization


A second defining feature of schizocomputational capitalism is the colonization of subjectivity by computational processes, particularly through the modulation of human affects (emotions, desires, and attention) via algorithms. If industrial capitalism colonized the body (harnessing physical labor and regimenting time) and late consumer capitalism colonized social life and desire (through advertising and mass media), the current era of ubiquitous computing extends colonization into the intimate recesses of the mind – our moods, attentional rhythms, and subconscious impulses. Digital platforms are not passive channels of content; they are active conditioning systems (Poenaru, 2023). Machine learning algorithms analyze every click, pause, “like,” and search query to infer psychological vulnerabilities and preferences, then dynamically adjust the feed to maximize engagement. In essence, the algorithm becomes a real-time behavioral psychologist, continuously experimenting on the user. This process can be understood as an algorithmic orchestration of the collective psyche. 


Social media algorithms today do not merely produce visibility; they organize the collective psyche. They orchestrate the market of emotions by adjusting content in real time to each person’s fragilities, beliefs, and desires. This is a capitalism that colonizes subjectivity—no longer only our bodies or labor, but our affects, perception, and imagination. This view captures the core of affective colonization: the intimate mechanisms of feeling and thought become targets of economic exploitation, administered by computational agents.


Empirical evidence for this algorithmic modulation of affect has emerged in various studies. A controversial example is the Facebook “emotional contagion” experiment, in which researchers manipulated the algorithmic curation of nearly 700,000 users’ news feeds to be slightly more positive or more negative in tone (Kramer, Guillory, & Hancock, 2014). The study found that users exposed to fewer positive posts subsequently produced fewer positive expressions themselves (and vice versa for negative posts), demonstrating that emotional states can be transferred and engineered via algorithmic control of content. Although Facebook users were unaware of the experiment, their moods were measurably shifted by the platform’s hidden interventions.

 

This exemplifies how affect becomes a programmable variable in the attention economy. Similar algorithmic steering occurs in recommendation systems on YouTube, TikTok, or Instagram, which learn to serve each user an optimally captivating mix of content – often by leveraging emotional triggers like curiosity, lust, or indignation. The goal is not truth or enlightenment, but to maximally capture attention in order to sell advertisements or gather data. As Zuboff (2019) argues, the tech corporations of surveillance capitalism seek to predict and shape human behavior as a means of guaranteeing revenue streams; to do so, they must instrumentalize human experience, including our feelings and choices, as raw material for data analytics. The result is a pervasive form of behavioral conditioning. 


Much like a slot machine is designed to hook a gambler through variable rewards, social apps algorithmically alternate pleasure (e.g. satisfying content, social affirmation) with provocation (e.g. enraging news, social comparison anxiety) to keep users oscillating between emotional peaks and valleys. This intermittent reinforcement taps into what Freud called “archaic drives” – deep-seated instincts for pleasure seeking, aggression, and social attachment – but harnesses them in an automated, data-driven way. The user’s affective life becomes the terrain of a computational feedback loop: our joy, boredom, desire, or outrage is continuously sampled, fed into algorithms, and returned to us in the form of stimuli calibrated to provoke further emotion, in an endless cycle.
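The slot-machine analogy can be stated as a minimal simulation. This is an illustrative sketch, not a model of any real app: the probability, seed, and session length are assumptions chosen only to show the variable-ratio structure.

```python
# Toy sketch of intermittent (variable-ratio) reinforcement: each feed
# refresh is a "pull" that pays off at unpredictable moments. All numbers
# are illustrative assumptions.
import random

def scroll_session(pulls: int, reward_prob: float = 0.3, seed: int = 0) -> int:
    """Simulate a scrolling session: each refresh yields a gratifying
    post (affirmation, novelty, an outrage hit) with fixed probability,
    but the *timing* of payoffs is unpredictable."""
    rng = random.Random(seed)
    return sum(1 for _ in range(pulls) if rng.random() < reward_prob)

print(scroll_session(100))
```

The operant-conditioning point lives in the schedule, not the payout: a variable-ratio schedule like this one is the classic finding for behavior that is most resistant to extinction, which is why the paragraph above compares the feed to a slot machine rather than to a vending machine.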


The term “colonization” is apt because this process increasingly encroaches on autonomy and interiority. Just as colonial powers once infiltrated foreign lands to extract value, today’s algorithms infiltrate the internal landscape of attention and emotion. Philosopher Byung-Chul Han has described this as a new stage of capitalist control he calls psychopolitics, where power operates not by open coercion but by shaping the psychic drives of individuals who believe themselves free (Han, 2017). Individuals willingly submit to digital platforms because they offer convenience or social connection, yet in doing so they subject themselves to what Franco “Bifo” Berardi terms the semiotic stimulation of cognitive capitalism – a constant bombardment of signs and signals that can induce stress, anxiety, or even neurochemical changes (Berardi, 2018). Indeed, Berardi suggests that the exploitation of our cognitive and affective capacities in the digital economy has led to a rise in psychopathologies such as attention disorders, depression, and panic, as human neural plasticity is pushed to adapt to an overload of stimuli and the demands of relentless connectivity. In a sense, the mind itself becomes a direct site of profit extraction, with algorithms as the new managers of the “means of production” of behavior.


Affective and algorithmic colonization clearly blurs the line between authentic and induced emotions. When your stream of feelings is significantly shaped by an algorithm’s dark patterns (e.g. auto-playing the next video chosen to stir a strong reaction, or sending a push notification at a moment of likely boredom or loneliness), to what extent do those feelings remain “yours”? The metaphor of schizophrenia becomes salient here in a nuanced way: the schizophrenic individual often cannot tell which voices are internally generated versus externally imposed, or what is real versus hallucinated. Analogously, under schizocomputational capitalism, one may struggle to discern genuine beliefs from those subtly inculcated by algorithmic suggestion, or spontaneous moods from those nudged by digital feedback loops. The self becomes an unstable assemblage of influences, not fully aware of how it is being orchestrated. 


The erosion of the sovereign self has political ramifications. A citizen whose fears or convictions can be algorithmically swayed is vulnerable to new forms of manipulation – from micro-targeted advertisements that capitalize on emotional fragilities, to propaganda campaigns that algorithmically amplify outrage or ethnic hatred for political ends. In recent years, we have seen how computational propaganda and “fake news” exploit the architecture of social media to engender false beliefs and collective emotions (like moral panic) that can sway democratic processes (Benkler, Faris, & Roberts, 2018). In short, affective algorithmic colonization creates a populace that is both emotionally hyper-aroused and easily reprogrammed, an ideal subject for consumerist and authoritarian interests alike. It realizes a scenario akin to what Deleuze called the “control society,” where constant modulation replaces the fixed discipline of earlier eras – power is exercised through continuous tracking and adjustment of our inner states, keeping us in a tight grip without our explicit awareness. Through this lens, schizocomputational capitalism emerges as an order in which the unconscious itself has been annexed – a machinic unconscious (to use Félix Guattari’s term) where desire and anxiety are no longer solely organic or personal, but part of an algorithmic marketplace of affects.

 

Symbolic Overload


If the previous sections dealt with contradictions and affect, the third dimension of schizocomputational capitalism is semiotic and symbolic: a state of informational and ideological overload that overwhelms meaning-making capacities. We live in an environment of hyper-simulation, in which images, signs, and messages circulate in massive quantity and at lightning speed, largely decoupled from material reality or consistent truth. Jean Baudrillard famously argued that postmodern society is defined by simulacra – copies without originals – where signs refer only to other signs in a self-enclosed loop, inducing a condition of hyperreality in which the distinction between real and representation breaks down (Baudrillard, 1994). Schizocomputational capitalism accelerates this condition: it floods daily life with an incoherent barrage of symbolic content, producing what we might call a semiotic schizophrenia or symbolic overload. In practical terms, this means that an individual’s stream of experience (especially via digital media) is a jumble of disparate and often contradictory signifiers: news headlines, advertisements, memes, personal photos, global crisis updates, celebrity gossip, political propaganda, and so on, all jostling for attention in the same flattened space of the screen. Context collapses; everything seems equally urgent, equally real, or equally absurd. The psyche is thus bombarded by flows of signification it cannot fully assimilate or synthesize into a coherent worldview.


In recent years, the rise of Computational Social Science (CSS) has extended this logic of simulation from the realm of representation to that of prediction and control. Using vast datasets and agent-based modeling, corporations, governments, and research institutions now simulate social dynamics ranging from collective uprisings to financial investment behaviors, testing countless virtual scenarios to optimize outcomes in the real world. These models do not merely describe society; they actively participate in its reconfiguration by feeding back into political and economic decision-making. In this sense, the hyperreality Baudrillard foresaw has acquired a computational infrastructure: simulations no longer mirror the social, they produce it. Populations and information flows are subtly guided through algorithmic experimentation, creating a recursive feedback loop between prediction and behavior—a world in which the social body becomes both the subject and object of continuous simulation.
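The recursive loop described here, in which prediction feeds back into the behavior it predicts, can be caricatured in a few lines of agent-based simulation. This is a toy model under stated assumptions (arbitrary drift rate, feed tilt, and population size), not a real computational social science pipeline.

```python
# Toy agent-based sketch of a prediction-control feedback loop: a "platform"
# measures the currently dominant pole of opinion, over-serves it, and the
# population drifts toward what it is shown. All parameters are illustrative.
import random

def simulate(steps: int = 50, n_agents: int = 100, seed: int = 1) -> float:
    rng = random.Random(seed)
    opinions = [rng.uniform(-1, 1) for _ in range(n_agents)]  # -1..1 spectrum
    for _ in range(steps):
        mean = sum(opinions) / n_agents           # "prediction" step
        pole = 1.0 if mean >= 0 else -1.0
        target = mean + 0.5 * pole                # feed tilted past the mean
        opinions = [max(-1.0, min(1.0, o + 0.1 * (target - o)))
                    for o in opinions]            # "control" step
    return sum(opinions) / n_agents

print(round(simulate(), 2))
```

Whatever weak signal the initial sample happens to contain is amplified into near-consensus at one pole: the model does not describe the population so much as produce it, which is the sense in which simulation acquires a computational infrastructure here.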


Returning to our argument, a key feature of symbolic overload is the coexistence of moral and ideological opposites whose very simultaneity dissolves the tension that once separated them. On a given social media timeline, one might see a post about humanitarian aid for disaster victims immediately followed by a xenophobic meme, then an advertisement for luxury fashion, then a call to action on climate change. The simultaneity and rapid succession of these messages create a cognitive dissonance that can be disorienting. As noted earlier, capitalism now leverages both sides of every dichotomy; here we see that dynamic in the symbolic realm: signs of virtue and signs of vice commingle without a stable hierarchy of value. The image becomes commodity, and the commodity becomes image in a total exchange of signs. For instance, the symbol of “justice” or “revolution” can be immediately appropriated to sell merchandise or to brand a corporate campaign (e.g. a cola company aligning its advertising with a protest movement). Conversely, extremely violent or profane imagery is repackaged as just another piece of consumable content in the news feed. 


This collapse of differentiation is itself “schizophrenic” in the sense used by cultural critic Fredric Jameson when he described postmodern consciousness: lacking a stable narrative or interpretive anchor, the subject experiences time as a series of disjointed presents and language as a stream of floating signifiers, leading to a kind of perpetual presentness and depthlessness (Jameson, 1990). In schizocomputational capitalism, the depthlessness is evident in how serious political discourse, entertainment, advertisement, and personal communication all share the same platforms and stylistic idioms. Everything tends to be flattened into infotainment. A tragic war video might be algorithmically sandwiched between a dance challenge and an influencer’s makeup tutorial, each item granted a similar visual weight and duration on a scrolling feed. Such an arrangement erodes the context that grants meaning; significance becomes slippery. The human mind, confronted with this glut of unsynthesized information and the imperative to react instantly (like, share, comment, move on), may defensively resort to either apathy or extreme credulity – sometimes flipping between both. This vacillation, too, mirrors the instability of schizophrenic thought: at times an inability to ascribe stable meaning to symbols (leading to withdrawal or apathy), and at other times an over-ascription of idiosyncratic meaning (paranoia, conspiracy thinking).


A fundamental consequence of symbolic overload is the manipulation and erosion of belief. In a media ecosystem where true and false information circulate indiscriminately, beliefs become unmoored from factual basis and instead are shaped by repetition, emotional resonance, and algorithmic reinforcement. The simultaneous circulation of “the true and the false, the good and the evil, the human and its simulation” is not a poetic exaggeration but a literal description of many people’s online reality. For example, during a public health crisis, one may encounter accurate scientific guidance and baseless conspiracy theories in the same Twitter thread, each with passionate adherents. The information landscape itself is schizoid: fragmented and clashing realities co-exist without resolution. 


Researchers have noted how recommendation algorithms on platforms like YouTube or Facebook can create radicalization spirals – by incrementally suggesting more extreme content, they can lead users from relatively moderate interests into bizarre or extreme belief systems (Tufekci, 2018; Ribeiro et al., 2020). These processes show how algorithmic curation can destabilize consensus reality, effectively slicing the public’s shared symbolic framework into isolated bubbles of meaning. When large portions of society no longer agree on basic facts or trust in common institutions, democratic deliberation breaks down. 


Siva Vaidhyanathan (2018) argues that Facebook’s architecture, by prioritizing emotionally engaging content over verified truth, has eroded the foundations of an informed citizenry and thus “undermines democracy.” In democratic theory, a functioning public sphere depends on the symbolic order – shared reference points, a baseline of mutual reality – that allows different groups to debate and reason together. Schizocomputational capitalism, through symbolic overload, actively undercuts this by saturating the public sphere with noise, spectacle, and disinformation, making rational consensus exceedingly difficult.


The destabilization of democratic life under these conditions cannot be overstated. It is not just that people are being deceived by particular pieces of fake news; rather, the entire sense-making environment tilts toward distrust, fragmentation, and cynical spectacle. Sensing the chaos and contradiction, citizens may lose faith in the very possibility of truth or collective agency. Some respond by doubling down on fabricated certainties (e.g. QAnon adherents finding patterns in the noise, much like a delusional system in psychosis imposes a private order on chaos), while others succumb to nihilism or political withdrawal. In either case, the result is an erosion of the common world that democracy requires. 


Hannah Arendt (1973) warned that a constant flood of lies and absurdities can make people unable to believe anything, thereby paving the way for authoritarianism – a phenomenon visibly exploited by demagogues who thrive in the polluted infosphere. We witness a populace emotionally overtaxed, cognitively disoriented, yet continuously provoked – a populace that is, in effect, politically “schizophrenic” in its oscillation between extremes of credulity and disbelief, collective fervor and fragmented isolation. The symbolic overload thus does not remain confined to the realm of media; it penetrates into the social fabric, corroding the trust and solidarity that hold democratic communities together.

​

Conclusion

​

Through the lenses of the contradiction economy, affective algorithmic colonization, and symbolic overload, we have traced how contemporary computational capitalism both mirrors and induces key elements of schizophrenic psychopathology – fragmentation, disorientation, and collapse of stable meaning. The notion of schizocomputational capitalism serves as more than a metaphor; it denotes a structural reality in which the economic imperatives of digital networks require and reproduce divided subjects and incoherent collectives. In this regime, human attention and emotion are the new frontiers of accumulation, and exploiting them entails systematically pushing individuals toward psychological extremes: our empathetic and aggressive tendencies are simultaneously inflamed (the better to capture engagement), our sense of self is stretched thin across inconsistent roles and stimuli, and our grasp on reality is continually tested by the blurring of fact and fiction. The result is a kind of social schizophrenia – not in the clinical sense of diagnosed illness, but in the diffuse sense of a society that behaves as if it were schizophrenic, rife with hallucinated conspiracies, abrupt swings of mood, and a profound loss of continuity in narrative and identity.

​

Importantly, invoking “schizo” in this context is not to pathologize individuals but to critique a pathogenic system. Schizocomputational capitalism is an analytical lens that reveals how the political economy itself can be understood as psychopathogenic – generating mental distress and disintegration as byproducts of its normal functioning. This aligns with a growing body of critical media and political theory concerned with the psycho-political effects of digital capitalism (Han, 2017; Fisher, 2009). 

​

We see that the fragmentation of subjectivity under constant connectivity is not a mere side-effect but is instrumental to current power structures: a citizen who is anxious, distracted, and doubting is easier to influence and harder to mobilize for collective action. The algorithmic modulation of affect ensures that populations are kept in a state of hyper-arousal or numbed passivity, oscillating between the two in cycles that preclude sustained critical attention. The manipulation of belief and the flooding of the symbolic environment with contradictory signals undermine the possibility of a shared reality, which is the foundation of any democratic polity’s capacity to deliberate and act together. In sum, the structural condition of schizocomputational capitalism tends toward anti-democratic outcomes: it splinters publics into echo chambers, amplifies tribal passions over common reason, and corrodes the trust in information and institutions necessary for collective decision-making. It is an operating system for society that, by design, destabilizes the very notion of an informed, coherent demos.

​

By naming schizocomputational capitalism, we acknowledge that the current malaise is not a random mental health epidemic nor an inevitable result of technology, but a contingent socio-economic formation – one that can be analyzed, critiqued, and potentially reformed. Just as Deleuze and Guattari proposed schizoanalysis as a radical alternative to Oedipal psychoanalysis, aimed at liberating desire from repressive structures, we might imagine new analytic and political practices to reassemble the fractured subject and detoxify the informational commons.

​

Strengthening democratic life in the face of this challenge will demand not only innovations in media regulation, digital architecture, and public education but also a sober reckoning with the asymmetry between human intelligence and the ever-advancing artificial systems that now outpace it. How can reflective agency compete with infrastructures designed to anticipate and redirect thought itself? The corporations that command these machinic architectures operate with planetary reach and near-total opacity, rendering reform efforts perpetually one step behind. It is therefore fair to ask whether the window for meaningful redress has already closed — whether a true paradigm shift might require nothing less than a civilizational rupture. 

​

And yet, even in this grim prospect, a form of responsibility persists. Schizocomputational capitalism confronts us with a mirror of our present: a socio-technical mirror that reflects not only fragmentation but complicity. Facing that reflection means attempting, perhaps against all odds, the difficult task of cognitive mapping that Fredric Jameson (1990) urged — tracing the invisible systems that govern us so that we might still recover a measure of collective sanity, or at least understand what has been lost. If collapse is indeed the precondition for renewal, then the question is whether we can learn to rebuild before the dust fully settles.

​

References

​

American Psychiatric Association. (2013). Diagnostic and Statistical Manual of Mental Disorders (5th ed.). Washington, DC: APA.

Arendt, H. (1973). The origins of totalitarianism. Harcourt, Brace, Jovanovich.

Baudrillard, J. (1994). Simulacra and Simulation (S. F. Glaser, Trans.). Ann Arbor: University of Michigan Press. (Original work published 1981)

Baumeister, R. F., Bratslavsky, E., Finkenauer, C., & Vohs, K. D. (2001). Bad is stronger than good. Review of General Psychology, 5(4), 323–370.

Benkler, Y., Faris, R., & Roberts, H. (2018). Network propaganda: Manipulation, disinformation, and radicalization in American politics. Oxford University Press.

Berardi, F. (2009). The Soul at Work: From Alienation to Autonomy. Semiotext(e).

Berardi, F. (2018). Neuroplasticity: Beyond adaptation toward morphogenesis. In W. Neidich (Ed.), The Psychopathologies of Cognitive Capitalism, Part Three (pp. 175–186). Berlin: Archive Books.

Dean, J. (2010). Blog theory: Feedback and capture in the circuits of drive. Polity Press.

Deleuze, G., & Guattari, F. (1983). Anti-Oedipus: Capitalism and Schizophrenia. University of Minnesota Press. (Original work published 1972).

Fisher, M. (2009). Capitalist realism: Is there no alternative? Zero Books.

Freud, S. (1914/1957). On narcissism: An introduction. In J. Strachey (Ed. & Trans.), The Standard Edition of the Complete Psychological Works of Sigmund Freud (Vol. 14, pp. 67–102). London: Hogarth Press. (Original work published 1914)

Guattari, F. (1996). Chaosmosis: An Ethico-Aesthetic Paradigm. Indiana University Press. (Original work published 1992)

Han, B.-C. (2017). Psychopolitics: Neoliberalism and new technologies of power. Verso.

Vaidhyanathan, S. (2018). Antisocial media: How Facebook disconnects us and undermines democracy. Oxford University Press.

Illouz, E. (2007). Cold intimacies: The making of emotional capitalism. Polity Press.

Jameson, F. (1990). Postmodernism, or, the cultural logic of late capitalism. Duke University Press.

Kramer, A. D., Guillory, J. E., & Hancock, J. T. (2014). Experimental evidence of massive-scale emotional contagion through social networks. Proceedings of the National Academy of Sciences, 111(24), 8788–8790.

Lacan, J. (1977). On a question prior to any possible treatment of psychosis. In Écrits: A Selection (A. Sheridan, Trans., pp. 179–225). New York: Norton. (Original work published 1957)

Poenaru, L. (2023). Inconscient économique. Paris: L’Harmattan.

Ribeiro, M. H., Ottoni, R., West, R., Almeida, V. A. F., & Meira Jr., W. (2020). Auditing radicalization pathways on YouTube. Proceedings of the 2020 Conference on Fairness, Accountability, and Transparency (FAT* '20), 131–141. ACM. https://doi.org/10.1145/3351095.3372879

Tufekci, Z. (2018, March 10). YouTube, the great radicalizer [Opinion]. The New York Times. https://www.nytimes.com/2018/03/10/opinion/sunday/youtube-politics-radical.html

Zuboff, S. (2019). The Age of Surveillance Capitalism. New York: Public Affairs.

​
