
ALGORITHMIZED CARE: HOW ATTACHMENT WAS CAPTURED, DISTORTED, MEASURED, AND MONETIZED


Liviu Poenaru, Dec. 15, 2025


Algorithmic attachment names a quiet mutation of attachment itself. Classical attachment theory described how early bonds organize safety, proximity-seeking, and self-worth through relationships with living others (Bowlby, 1988). Today, those same regulatory circuits are being continuously solicited by non-human systems. Platforms do not simply distribute content; they distribute reassurance, rejection, anticipation, and absence. The feed becomes a relational field, and the algorithm takes the position of an always-there, always-withholding presence — sometimes soothing, sometimes punishing, never accountable. What was once interpersonal becomes infrastructural.

This is not an artificial replacement of attachment, but a parasitic one. Algorithms exploit natural attachment mechanisms rather than inventing new ones. Variable reinforcement mimics inconsistent caregiving; notifications simulate proximity signals; visibility metrics translate love into quantity. Empirical studies on short-video and social-media platforms show how these design features intensify reward sensitivity, emotional dependency, and vulnerability to distress, particularly among individuals with higher attachment insecurity (Montag et al., 2021). The system does not “care,” yet it trains users to care intensely. Attachment anxiety is cultivated through unpredictability; avoidant defenses are rewarded through emotional flattening and ironic distance.

Here lies the inversion: what appears as connection is often a form of controlled exposure. Algorithmic attachment produces a chronic orientation toward evaluation — toward being seen, ranked, validated, or ignored. Longitudinal experience-sampling evidence shows that heavier Facebook use predicts declines in subjective well-being over time, even when users believe they are strengthening social bonds (Kross et al., 2013). Social recognition anxiety becomes normalized, even adaptive, under conditions of continuous comparison. The subject learns to scan the environment not for meaning, but for signals of relevance.

Clinically, algorithmic attachment reshapes the inner economy of self-regulation. Distress is no longer processed relationally but offloaded onto screens; reassurance is outsourced to metrics; absence becomes intolerable. The body learns a new rhythm of expectation and disappointment, a neuroplastic loop where attention, reward, and self-esteem are algorithmically synchronized (Montag et al., 2021). What looks like addiction is often attachment without an other — bonding without mutual recognition, intimacy without care.

The mental forecast is clear: unless these dynamics are named, they will continue to masquerade as personal weakness or individual pathology. Algorithmic attachment is not a failure of resilience; it is a political economy of affect. To think critically about mental health today requires asking not only who we are attached to, but what is shaping the conditions of attachment itself — and who benefits from keeping those bonds unstable, measurable, and endlessly exploitable (Bowlby, 1988; Kross et al., 2013).

REFERENCES

Bowlby, J. (1988). A secure base: Parent-child attachment and healthy human development. Basic Books.

Kross, E., Verduyn, P., Demiralp, E., Park, J., Lee, D. S., Lin, N., Shablack, H., Jonides, J., & Ybarra, O. (2013). Facebook use predicts declines in subjective well-being in young adults. PLOS ONE, 8(8), e69841. https://doi.org/10.1371/journal.pone.0069841

Montag, C., Yang, H., & Elhai, J. D. (2021). On the psychology of TikTok use: A first glimpse from empirical findings. Frontiers in Public Health, 9, 641673. https://doi.org/10.3389/fpubh.2021.641673
