Friday, June 3, 2022

More on Effective Altruism, Longtermism, and Big Numbers

I have repeatedly tried to point out the flaws in EA's approach to the world (e.g., (Kinda) Against "EA" / Utilitarianism. Definitely Against Big Numbers and My expected value is bigger than yours). But Brian points me to the definitive rebuttal by D0TheMath:


Excerpts:

Scale: If we think there is only a 1% chance of panpsychism being true (the lowest possible estimate on prediction websites such as Metaculus, so highly conservative), then this still amounts to at least 10^78 electrons impacted in expectation.

Neglectedness: Basically nobody thinks about electrons, except chemists, physicists, and computer engineers. And they only think about what electrons can do for them, not what they can do for the electrons. This amounts to a moral travesty far larger than factory farms.

Tractability: It is tremendously easy to affect electrons, as shown by recent advances in computer technology, based solely on the manipulation of electrons inside wires.

Electrons are suicidal

If electrons can only sense the charges of their neighbors, know that positrons are positively charged, and know that if they make contact with a positron the pair will immediately annihilate (all reasonable assumptions by any metric), then the only reason an electron would travel in the opposite direction of an electric field is in the hope that it ends up colliding with a positron, thereby ending its existence.

This means every moment of an electron’s existence is pain, and multiplying this pain across an expected 10^78 electrons produces astronomical levels of expected suffering.

Since pain is worse than pleasure is good, and it seems highly unlikely electrons would run towards certain death if their lives were pleasurable, this possibility dominates the moral calculus in this scenario.


Epistemic status: Certain
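
For concreteness, here is the big-number arithmetic the excerpts lean on (1% of all electrons, then 10^78 units of pain), as a minimal Python sketch. The 1% probability and the 10^78 figure come from the quoted post; the ~10^80 total-electron count and the per-electron disutility of -1 are illustrative assumptions, not figures from the post.

    # Sketch of the expected-value arithmetic in the excerpts.
    # The 1% probability and 10^78 come from the quoted post; the
    # ~1e80 electron count and the per-electron disutility are assumptions.

    p_panpsychism = 0.01            # "1% chance of panpsychism being true"
    n_electrons = 1e80              # rough estimate for the observable universe (assumed)
    disutility_per_electron = -1.0  # arbitrary units of pain per electron (assumed)

    electrons_in_expectation = p_panpsychism * n_electrons          # ~1e78
    expected_suffering = electrons_in_expectation * disutility_per_electron

    print(f"electrons impacted in expectation: {electrons_in_expectation:.0e}")  # 1e+78
    print(f"expected suffering:                {expected_suffering:.0e}")        # -1e+78

Whatever per-electron disutility you assume, a factor of 10^78 swamps everything else in the calculation, which is how this possibility ends up "dominating the moral calculus."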




