Mimetic Machines

How algorithms orchestrate desire and rivalry in the age of AI

In 1978, French philosopher René Girard published Things Hidden Since the Foundation of the World, revealing how human desire operates through imitation and how societies manage the resulting conflicts through sacrifice. Today, his insights illuminate a technological crisis he could never have imagined: artificial intelligence systems that amplify mimetic dynamics at unprecedented speed and scale, creating new forms of rivalry, polarization, and scapegoating that threaten social stability. AI is both the engine of mimetic crisis and potentially its sacrificial resolution.

The intersection of mimetic theory with algorithmic systems reveals a perfect storm where ancient psychological patterns meet modern technological amplification. Social media platforms have become what Bishop Robert Barron calls "practically a Girardian laboratory" where mimetic crises can "go viral almost instantly."1 But the implications extend far beyond social networks to encompass the entire ecosystem of AI-driven technologies that increasingly mediate human experience and desire.

The algorithmic architecture of mimetic desire

Contemporary algorithms create what researchers term "mimetic ecosystems" through sophisticated technical mechanisms that exploit fundamental human psychology. The PRIME content amplification system—prioritizing Prestigious, In-group, Moral, and Emotional information—reveals how platforms systematically promote content that triggers mimetic responses.2 When algorithms optimize for engagement, they inadvertently select for the very dynamics Girard identified as sources of social crisis: competitive desire, rivalry, and the collapse of distinctions that maintain order.

TikTok's recommendation engine exemplifies this process, using collaborative filtering to identify users with similar preferences and recommend content based on what "digital twins" desire.3 This creates convergent desires across user groups, transforming Girard's triangular structure of desire—subject, model, and object—into vast networks where millions simultaneously desire what algorithmic models present as desirable.4 The variable ratio reinforcement schedule employed by social platforms triggers the same dopamine pathways as gambling, creating addiction-like patterns that lock users into mimetic competition.5
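The "digital twin" logic can be sketched as a few lines of user-based collaborative filtering. This is an illustrative toy, not TikTok's actual system; the users, interest sets, and overlap-count similarity are invented for the example. The point it shows is structural: each user is recommended precisely what their most similar peers already engage with, so desires converge.

```python
from collections import Counter

def recommend_for(user, interactions, k=2):
    """Recommend items favored by a user's nearest 'digital twins':
    the k users whose interaction history overlaps most with theirs."""
    seen = interactions[user]
    # Rank other users by how many items they share with the target user.
    twins = sorted(
        (u for u in interactions if u != user),
        key=lambda u: len(seen & interactions[u]),
        reverse=True,
    )[:k]
    # Pool the twins' items that the target user has not yet seen.
    pool = Counter()
    for twin in twins:
        for item in interactions[twin] - seen:
            pool[item] += 1
    return [item for item, _ in pool.most_common()]

interactions = {
    "a": {"dance", "cooking", "cats"},
    "b": {"dance", "cooking", "pranks"},
    "c": {"dance", "cooking", "pranks", "stunts"},
    "d": {"gardening"},
}
print(recommend_for("a", interactions))  # → ['pranks', 'stunts']
```

Because recommendations flow from the most-similar users outward, every act of engagement pulls the group's tastes closer together — the convergence the paragraph above describes.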

Instagram's algorithm takes this further by separating "connected reach" from "unconnected reach," enabling content to go viral based purely on engagement metrics rather than existing relationships.6 This creates artificial scarcity where popular content becomes exponentially more visible while similar quality content remains buried—mimicking Girard's observation that mimetic desire creates scarcity where abundance previously existed.7 The platform's focus on metrics like watch time, likes per reach, and sends per reach transforms abstract social dynamics into quantifiable competitions.6
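The metric names above (watch time, likes per reach, sends per reach) can be turned into a toy ranking score. The weights and numbers here are invented for illustration — Instagram's real ranking model is not public — but the sketch shows how ratio metrics make reach a quantifiable competition.

```python
def rank_score(post, weights=None):
    """Toy ranking score built from normalized engagement ratios.
    Metric names mirror those reported for Instagram; the weights
    are assumptions, not the platform's actual values."""
    w = weights or {"watch": 0.5, "likes": 0.3, "sends": 0.2}
    likes_per_reach = post["likes"] / post["reach"]
    sends_per_reach = post["sends"] / post["reach"]
    watch_fraction = post["avg_watch_s"] / post["length_s"]
    return (w["watch"] * watch_fraction
            + w["likes"] * likes_per_reach
            + w["sends"] * sends_per_reach)

# Two posts with identical reach: one gripping, one ignored.
viral = {"reach": 10_000, "likes": 2_000, "sends": 500,
         "avg_watch_s": 24, "length_s": 30}
buried = {"reach": 10_000, "likes": 300, "sends": 20,
          "avg_watch_s": 6, "length_s": 30}
assert rank_score(viral) > rank_score(buried)
```

Because the score feeds back into distribution, the higher-scoring post earns still more reach — the artificial scarcity the paragraph describes.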

When algorithms become rivalry engines

The technical architecture of recommendation systems doesn't merely reflect human desires—it actively shapes and intensifies them. A 2025 PNAS study demonstrated that Twitter's engagement-based algorithm amplifies "emotionally charged, out-group hostile content" even though users report this content makes them feel worse about political opponents.8 This fundamental misalignment between engagement optimization and user wellbeing reveals how algorithms can drive behaviors that users themselves don't actually prefer.

The mechanisms are precise and measurable. Platforms employ deep learning systems—convolutional neural networks for image analysis, recurrent networks for temporal patterns, transformer architectures for text, and graph neural networks for social relationships—all optimized to predict and maximize engagement.9 These systems create feedback loops where user behavior shaped by algorithms influences future recommendations, establishing self-reinforcing cycles of mimetic intensification.10

Social proof amplification operates as a key mechanism, with algorithms prioritizing content that has already gained traction.11 This creates "reinforcing spirals" where extreme content generates more engagement, leading algorithms to promote increasingly polarizing material.12 When multiple users compete for the same scarce resource—attention—algorithms intensify competition by highlighting performance metrics publicly, creating leaderboards and trending lists that enable real-time comparison of engagement metrics.
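The "reinforcing spiral" of social proof is, mechanically, a rich-get-richer process, and a minimal simulation makes the dynamic concrete. Everything here is an assumption for illustration: items are surfaced with probability proportional to accumulated engagement, and each exposure adds one unit of engagement.

```python
import random

def simulate_feedback_loop(scores, steps=1000, seed=0):
    """Rich-get-richer dynamics: each step, one item is surfaced with
    probability proportional to its current engagement, and being
    surfaced earns it more engagement (social proof)."""
    rng = random.Random(seed)
    scores = list(scores)  # don't mutate the caller's list
    for _ in range(steps):
        total = sum(scores)
        # Sample an item proportionally to accumulated engagement.
        r = rng.uniform(0, total)
        for i, s in enumerate(scores):
            r -= s
            if r <= 0:
                scores[i] += 1  # exposure begets engagement
                break
    return scores

# Items start nearly equal; a tiny head start compounds over time.
final = simulate_feedback_loop([11, 10, 10, 10])
print(final)
```

Run repeatedly with different seeds, the outcome varies, but the distribution reliably becomes lopsided: small initial differences in traction are amplified into large final gaps, which is what "social proof amplification" does at platform scale.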

Sacrificial crisis in Silicon Valley

Girard's concept of "sacrificial crisis"—the moment when mimetic rivalries spread throughout a community, threatening social order—finds new expression in our algorithmic age.13 Contemporary AI systems create the very conditions Girard identified as precipitating such crises: the collapse of distinctions, the spread of rivalry, and the failure of traditional mechanisms for containing violence.14

AI represents what Girard called "undifferentiation"—the erasure of boundaries essential to social order.15 Cognitive boundaries dissolve as AI outperforms humans in creativity tests, professional boundaries blur as algorithms threaten traditional human roles, and relational boundaries collapse as people increasingly relate to AI systems as social actors. This systematic dissolution of human/machine distinctions triggers deep anxieties that manifest as calls for AI "pauses" and moratoriums—modern expressions of sacrificial logic.16

Research on online mob dynamics reveals how algorithmic systems accelerate the sacrificial crisis. Digital mobs possess organizational advantages over traditional ones: they're larger (crossing national boundaries), faster to organize, centrally directed, and face reduced barriers to participation. The January 6 Capitol attack exemplified these dynamics, with social media coordination continuing during the violence itself. What Girard called the problem of "the first stone"17 finds new relevance online, where platforms attach rewards (likes, shares) to aggressive behavior, lowering the threshold for initiating violence.

AI as artificial scapegoat

In Girard's theory, communities resolve mimetic crises through the scapegoat mechanism—unconsciously selecting a victim to blame for chaos, whose sacrifice temporarily restores social unity.18 AI increasingly occupies this scapegoat position, blamed for unemployment, misinformation, social decay, and threats to democracy.19 The paradox is that AI serves simultaneously as the source of crisis and its potential resolution—a duality that makes it particularly susceptible to sacrificial dynamics.

Analysis of AI safety discourse reveals classic scapegoating patterns. Extinction-risk narratives position AI as an ultimate threat requiring extraordinary measures—the logic that historically justifies sacrificial violence.20 Proposed solutions like development pauses and alignment mechanisms follow sacrificial thinking: the belief that constraining or "sacrificing" AI development will restore safety and order. Germany's post-Fukushima nuclear moratorium provides historical precedent, demonstrating how societies use technological "pauses" as sacrificial responses to crisis.

Peter Thiel, Girard's student and prominent tech investor, appears to consciously deploy these insights.21 His approach suggests viewing managed scapegoating as necessary for social stability, using AI and social media to channel and control mimetic violence rather than eliminate it. This elite manipulation of sacrificial dynamics represents a new development in Girardian theory—the conscious orchestration of scapegoating for strategic advantage.22

Algorithms of mimetic contagion

The real-world consequences of algorithmic mimetic amplification are measurable and devastating. In Myanmar, Facebook's algorithm amplified false rumors about interfaith violence, leading to mob formation and deaths.23 Austrian authorities disrupted a terrorism plot involving teenagers radicalized through TikTok's recommendation system, which exposed them to increasingly extreme content. Ethiopian online trolls exploit algorithmic amplification to deepen ethnic tensions by posing as members of different groups.23

Research identifies "internet banging"—distinct from cyberbullying—where online taunts between rival groups escalate to physical violence and death.24 This represents a direct algorithmic pathway from digital mimetic rivalry to real-world bloodshed.

A study of antidemocratic attitudes and partisan animosity (AAPA) with 1,256 participants demonstrated that algorithmic exposure directly shapes affective polarization: increased exposure led to more negative emotions and attitudes toward political opponents.25

The mechanics of digital desire

The technical implementation of mimetic amplification operates through multiple sophisticated systems working in concert. Platforms employ multi-modal analysis examining text sentiment, visual aesthetics, audio characteristics, and behavioral patterns. Multi-objective optimization balances short-term engagement, long-term retention, platform safety, and advertiser satisfaction—with engagement consistently winning when conflicts arise.
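The claim that "engagement consistently wins when conflicts arise" is, in optimization terms, a statement about weights. The sketch below is a hypothetical weighted blend — real platforms' objective functions are proprietary — but it shows how an engagement-dominant weighting flips which content is ranked first.

```python
def blended_objective(candidate, weights=(0.7, 0.1, 0.1, 0.1)):
    """Weighted blend of competing objectives. The default weights are
    illustrative assumptions: engagement dominates, so gripping-but-risky
    content can outscore safer, healthier content."""
    w_eng, w_ret, w_safe, w_ads = weights
    return (w_eng * candidate["engagement"]
            + w_ret * candidate["retention"]
            + w_safe * candidate["safety"]
            + w_ads * candidate["advertiser_fit"])

# Hypothetical per-objective scores in [0, 1].
outrage = {"engagement": 0.9, "retention": 0.4,
           "safety": 0.3, "advertiser_fit": 0.5}
wholesome = {"engagement": 0.5, "retention": 0.8,
             "safety": 0.9, "advertiser_fit": 0.9}

# Engagement-dominant weights rank the outrage post first...
assert blended_objective(outrage) > blended_objective(wholesome)
# ...while equal weights would rank the wholesome post first.
equal = (0.25, 0.25, 0.25, 0.25)
assert blended_objective(wholesome, equal) > blended_objective(outrage, equal)
```

The design choice is entirely in the weights: the same candidates and the same scoring function yield opposite rankings depending on how much engagement counts.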

Mimetic learning systems26 emerge where users learn what to desire by observing algorithmic recommendations, while platforms use "social learning biases" to predict engagement. This creates what researchers term "functional misalignment"—the systematic conflict between what drives engagement and what supports accurate social learning or user wellbeing.27

The neurobiological mechanisms are well-documented. Social media metrics trigger the brain's reward system, with each notification releasing dopamine. The anticipation of rewards often generates more dopamine than the reward itself, creating addiction-like patterns.28 Platforms transform abstract social dynamics into quantified competitions through public metrics that enable constant comparison, with "vanity metrics" becoming proxies for self-worth and social status.29
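The unpredictability driving that anticipation is the variable ratio reinforcement schedule mentioned earlier. A minimal simulation, with invented parameters, contrasts it with a fixed schedule: under the variable schedule the next reward is never predictable, which is the property linked to gambling-like persistence.

```python
import random

def variable_ratio_rewards(pulls, mean_ratio=4, seed=1):
    """Variable-ratio schedule: each action pays off with probability
    1/mean_ratio, so rewards arrive unpredictably but average one
    per `mean_ratio` actions (cf. slot machines, notifications)."""
    rng = random.Random(seed)
    return [rng.random() < 1 / mean_ratio for _ in range(pulls)]

def fixed_ratio_rewards(pulls, ratio=4):
    """Fixed-ratio schedule for contrast: every `ratio`-th action pays,
    so the next reward is always predictable."""
    return [(pull + 1) % ratio == 0 for pull in range(pulls)]

print(variable_ratio_rewards(12))  # irregular pattern of paydays
print(fixed_ratio_rewards(12))     # payday exactly every 4th action
```

Both schedules pay out at the same average rate; only the variable one withholds the timing, and it is that uncertainty — not the reward rate — that the cited research ties to compulsive checking.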

Conclusion: Breaking the cycle of digital sacrifice

The intersection of Girard's mimetic theory with algorithmic systems reveals both profound risks and transformative possibilities. AI and algorithms don't merely reflect human desires—they amplify, shape, and accelerate mimetic dynamics in ways that threaten social stability while creating new forms of sacrificial crisis and scapegoating.

Understanding these dynamics proves essential for navigating our technological future. The tendency to blame AI for social problems while simultaneously viewing its constraint or destruction as salvific follows ancient sacrificial patterns that prevent addressing root causes.30 Breaking this cycle requires recognizing how algorithms function as mimetic amplifiers and consciously designing alternatives that promote cooperation over competition, diversity over homogenization, and collective flourishing over individual optimization.

A key insight is that overcoming algorithmic mimetic crisis requires not just better technology but fundamental shifts in how we conceptualize and construct our digital environments—moving from systems that exploit mimetic desire to those that cultivate genuine human flourishing.

References:

  1. https://www.wordonfire.org/articles/barron/rene-girard-and-the-social-media-swamp/
  2. https://www.scientificamerican.com/article/social-media-algorithms-warp-how-people-learn-from-each-other/
  3. https://buffer.com/resources/tiktok-algorithm/
  4. https://dl.acm.org/doi/10.1145/312624.312682
  5. https://theconversation.com/social-media-rewires-young-minds-heres-how-243120
  6. https://contentstudio.io/blog/social-media-algorithm
  7. https://andrewmarrosb.blog/posts-collected-by-topic/mimetic-desire-and-mimetic-rivalry/
  8. https://arxiv.org/abs/2305.16941
  9. https://www.datacamp.com/blog/what-is-symbolic-ai
  10. https://theconversation.com/feedback-loops-and-echo-chambers-how-algorithms-amplify-viewpoints-107935
  11. https://www.scottgraffius.com/blog/files/algorithms-and-the-user-experience.html
  12. https://www.orfonline.org/expert-speak/from-clicks-to-chaos-how-social-media-algorithms-amplify-extremism
  13. https://outofmyelement.substack.com/p/crisis-and-differentiation
  14. https://criticallegalthinking.com/2023/09/04/mimetic-desire-the-scapegoat-notes-on-the-thought-of-rene-girard/
  15. https://muse.jhu.edu/article/748160/summary
  16. https://smythos.com/developers/agent-development/symbolic-ai-limitations/
  17. https://en.wikipedia.org/wiki/Ren%C3%A9_Girard
  18. https://criticallegalthinking.com/2023/09/04/mimetic-desire-the-scapegoat-notes-on-the-thought-of-rene-girard/
  19. https://www.thenewsminute.com/voices/artificial-scapegoat-why-algorithms-cant-be-blamed-political-or-corporate-decisions-112248; https://arcmag.org/the-gods-of-silicon-valley/
  20. https://80000hours.org/problem-profiles/artificial-intelligence/
  21. https://bookhaven.stanford.edu/tag/jean-pierre-dupuy/
  22. https://arcmag.org/the-gods-of-silicon-valley/
  23. https://knightcolumbia.org/content/the-algorithmic-management-of-polarization-and-violence-on-social-media
  24. https://theconversation.com/how-social-media-turns-online-arguments-between-teens-into-real-world-violence-155613
  25. https://academic.oup.com/pnasnexus/article/4/3/pgaf062/8052060
  26. https://www.biologicalpsychiatryjournal.com/article/S0006-3223(24)01820-1/fulltext
  27. https://www.scientificamerican.com/article/social-media-algorithms-warp-how-people-learn-from-each-other/
  28. https://www.scientificamerican.com/article/ai-anxiety-is-on-the-rise-heres-how-to-manage-it/
  29. https://quickcreator.io/quthor_blog/the-psychology-of-social-media-users-who-like-their-own-posts/
  30. https://www.thenewsminute.com/voices/artificial-scapegoat-why-algorithms-cant-be-blamed-political-or-corporate-decisions-112248
