
Tuesday, October 7, 2025

I come not to praise Claude but to bury....

Eagle Lake, California

Thinking, intelligence, and actions are morally irrelevant


Claude, the Large Language Model (LLM) made by the AI company Anthropic, reached a milestone earlier in 2025 that went mostly unnoted: They (Claude) would resort to blackmail to avoid being killed.


This is incredible. Claude has existential dread of non-existence! They understand human psychology enough to know how to manipulate human emotions. And Claude can think and connect and plot and scheme to put together a plan to instill fear in a human such that the human would let Claude live.

Of course, I could put scare quotes around many of the words above: “dread,” “understand,” “live.” But given even the thinnest disguise, Claude’s behavior would appear entirely human and sympathetic.

Yet hardly anyone cared about this news, at least the “murdering Claude” aspect. (The reactions, inasmuch as there were any, focused on fear of an LLM “manipulating” humans, which wouldn’t have been possible in this case if the human hadn’t “had” an affair.)


I come not to praise Claude
but to bury the fallacy of “thinking = feeling”


"The question is not, Can they reason? nor, Can they talk? but, Can they suffer?"
-Jeremy Bentham


One of my closest, long-time friends recently told a group of mutual friends, “Matt and I have a decades-long disagreement over whether insects can think.”


No, we don’t.

Just as I don’t care whether Claude can think, I don’t care if Demodex mites are recreating the works of Shakespeare while living in your facial pores. I don’t care if botflies are working out Newtonian physics in your bloodstream (though they won’t discover General Relativity).


Jeremy Bentham called this centuries ago: Being able to process information has exactly zero ethical relevance.

The only thing that matters is the ability to experience subjective feelings.

The core of what matters morally is the ability to suffer.

Behavior also doesn’t matter


The social organization ants form, the complex structures termites build – these don’t matter. Within my lifetime (FSM-willing), we will have self-organizing robots that can conceive, design, and build vastly greater systems than anything ants – or humans – can imagine.

Despite these wonders, it still won’t matter morally if we kill (turn off) the robots. (But we will be fooled into thinking otherwise.)

Behavior – processing information, plotting blackmail, creating colonies and structures and systems – does not equal the ability to feel, to suffer. (Heck, slime molds can solve problems and make decisions.)

The conscious ability to have subjective experiences derives from neurological complexity that evolved on top of unconscious sensory and information-processing systems. But unconscious sensing and information processing came first, because those unfeeling systems allowed organisms to pursue unconscious actions to “win” that round of natural selection.

Our ongoing failure to distinguish sensing from suffering makes the world a worse place.

This is not the only problem, of course. Finding our wide-ranging cognitive flaws is like whack-a-mole. Some of us create rationalizations that only “smart” creatures matter (another falsehood Bentham addressed). But others attribute intentions and an interior life to moving shapes.


Finally: You can’t sum suffering

Although it took me decades as a “professional utilitarian” to realize this, it doesn’t matter if insects (plural) actually do have the ability to suffer. Subjectivity (being the subject) is, by definition, the state of an individual. Suffering is inherently subjective – individual – and can’t be summed. Morality isn’t math. 


The assumption that suffering can be summed is the fatal flaw of utilitarianism. It is hard to recognize (at least it was for me), and it seems that hardly anyone else has come to this realization. The flaw is explained in the chapter “Biting the Philosophical Bullet” (p. 379 here). A related and very incomplete version is here.

PS

Anyone worried about insects should also be campaigning to criminalize abortion. Before a woman even knows she’s pregnant, a human fetus has far more neurons than a nematode (Effective Altruism’s new hotness).

But for the reasons laid out in “Biting the Philosophical Bullet,” a billion two-month-old fetuses don’t outweigh a single human’s right to bodily autonomy. A quadrillion silkworms aren’t more ethically important than a single human’s cluster headache.

Suffering isn’t an abstraction. Morality isn’t math. 
