
The Algorithm Put It on the Heart of Judas

A modern analogy for how influence precedes betrayal

In the Gospel of Luke, the betrayal of Jesus is summarized in a single, unsettling line: “Then Satan entered Judas” (Luke 22:3). For centuries, that phrase has been interpreted as a moment of possession or sudden moral collapse. But when read alongside the other Gospel accounts, the story suggests something quieter and more familiar: a gradual process of influence, consent, and erosion of judgment.

John’s Gospel adds an important detail. Before the betrayal takes place, the idea is already present: the devil had already put it into the heart of Judas to betray Jesus (John 13:2). The decision does not arrive fully formed. It develops.

If that same story unfolded today, the language might sound different, but the mechanism would be recognizable. Instead of “Satan entered Judas,” it might read: “The algorithm put it on the heart of Judas, and he agreed.”

This is not a rewriting of Scripture, nor an argument that technology replaces spiritual agency. It is a modern analogy, one that highlights how influence now operates through systems that quietly shape belief and behavior long before any decisive action occurs.

A Pattern, Not a Moment

The biblical account of Judas does not depict a sudden transformation from loyalty to betrayal. Instead, it traces a pattern of incremental compromise. Judas is entrusted with the community’s money and abuses that trust. Small ethical violations go unchallenged. Private rationalizations accumulate. Over time, moral distance grows.

By the time betrayal becomes an option, it no longer feels unthinkable.

The critical shift is not the opportunity to betray. It is the internal willingness to consider it. Betrayal, in this telling, is not an event. It is an outcome.

How Modern Persuasion Works

In the digital age, persuasion rarely announces itself as persuasion. It does not arrive as a command. It arrives as reassurance.

“You’re not wrong for feeling this.”

“Everyone else sees it this way.”

“This is just being honest.”

“They deserve it.”

“This doesn’t really hurt anyone.”

Individually, these messages can seem harmless. Repeated over time, they harden into belief. What begins as exposure becomes reinforcement. What begins as curiosity becomes certainty.

Modern recommendation systems are not designed to shape moral reasoning. They are designed to maximize engagement. But engagement is often driven by content that provokes outrage, reinforces grievance, or offers simple explanations for complex problems. Over time, repeated exposure narrows perspective, reduces self-questioning, and rewards certainty over reflection.

Eventually, agreement feels natural.

What Algorithms Amplify

Algorithms do not invent cruelty, fraud, or dehumanization. They amplify what captures attention, and human attention reliably gravitates toward outrage, fear, grievance, and moral certainty.

Researchers who study online influence have documented how recommendation systems can create feedback loops that intensify belief while filtering out contradiction. This process does not require explicit propaganda. It relies on repetition, validation, and the gradual removal of friction.
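The feedback loop described above can be sketched as a toy “rich get richer” simulation. Nothing here reflects any real platform’s code; the topic names, weights, and update rule are invented purely to illustrate how engagement-weighted recommendation tends to concentrate what a user sees:

```python
import random

random.seed(0)

# Hypothetical topics and starting weights -- invented for illustration only.
topics = ["outrage", "grievance", "hobby", "local news", "science"]
engagement = {t: 1.0 for t in topics}  # begin with no preference

def recommend():
    """Engagement-weighted sampling: what held attention before wins again."""
    return random.choices(topics, weights=[engagement[t] for t in topics])[0]

shown = []
for _ in range(500):
    t = recommend()
    shown.append(t)
    engagement[t] += 0.5  # each exposure reinforces that topic's weight

early = len(set(shown[:50]))   # distinct topics in the first 50 picks
late = len(set(shown[-50:]))   # distinct topics in the last 50 picks
print(f"distinct topics shown early: {early}, late: {late}")
```

Run long enough, the weighted sampling usually concentrates on a few early winners without any explicit intent to narrow anything, which is the erosion of perspective described above, in miniature.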

The result is not sudden radicalization, but something subtler: empathy narrows, complexity collapses, and internal resistance weakens. As with Judas, the heart agrees long before the act occurs.

The Common Thread: Distance

What connects ancient betrayal to modern harm is not technology itself. It is distance.

Distance from consequence.

Distance from relationship.

Distance from seeing another person as fully human.

This distance explains why people say things online they would never say face to face. It helps explain how scams become psychologically tolerable when victims are abstract. It underlies the way online ideologies flatten entire groups into villains or caricatures.

Harm becomes easier when humanity becomes theoretical.

Betrayal Before the Betrayal

What often goes unexamined is that betrayal rarely begins outwardly. Long before someone betrays a cause, a community, or another person, they have often betrayed something closer: themselves.

In the Judas narrative, this appears early. John’s Gospel notes that Judas regularly took from the communal money he was entrusted to manage (John 12:6). These actions were not yet acts of public treachery, but they marked a quiet abandonment of integrity, one no one else could see.

Modern influence follows a similar pattern. Before people dehumanize others online, exploit strangers through fraud, or adopt rigid ideologies, they often make a series of small internal concessions. They silence discomfort. They ignore moments of moral friction. They tell themselves stories that allow them to act against their better judgment without fully acknowledging the cost.

Algorithms accelerate this process not by forcing behavior, but by normalizing dissonance. When actions conflict with values, repeated exposure to validating narratives makes it easier to resolve that tension by adjusting the values instead. Over time, the person is no longer acting against their conscience. They have reshaped it.

This is how betrayal of self becomes betrayal of others.

Ordinary People, Predictable Outcomes

Public conversations about digital harm often focus on “bad actors,” as if damage only comes from obviously immoral people. Scripture and history suggest something more unsettling: harm is usually carried out by ordinary people who stopped interrogating the stories they were being fed.

The Judas narrative is disturbing precisely because it is not about an obvious villain. Judas is present, trusted, and embedded in the community. The betrayal comes not from opposition, but from proximity combined with misalignment.

Responsibility Still Matters

None of this eliminates individual responsibility. No system, whether spiritual, cultural, or technological, can force internal agreement. At some point, consent occurs.

Influence explains how people arrive at harmful conclusions. Choice explains why they act on them. Understanding the mechanics of persuasion does not absolve accountability. It clarifies how ordinary people end up doing extraordinary harm.

A Warning Without Panic

This is not an argument for abandoning technology or treating digital platforms as inherently corrupt. It is a call to awareness: awareness of how repetition shapes belief, how distance dulls empathy, and how easily moral certainty can replace moral reflection.

One of the most relevant questions today may not be “What am I being shown?” but “What am I slowly being trained to agree with?”

An Old Lesson in a New Context

If the story of Judas were written today, it might indeed say: “The algorithm put it on the heart of Judas, and he agreed.”

But the deeper truth would remain unchanged. The story is never only about what was suggested. It is about what was consented to. It is about the moment, almost always gradual and almost always rationalized, when questioning stopped.

In an age of algorithmic influence, the ancient warning remains urgent. Pay attention to what earns your agreement, and why. Because betrayal rarely announces itself as betrayal. It begins as a series of small decisions that slowly distance us from who we believe ourselves to be.


Lana Reid founded Conversations in Color, a nonprofit focused on amplifying Black voices through conversation, storytelling, and community projects. Her work promotes equity, cultural responsibility, and environments that support Black advocacy, healing, and honesty.
