Friday, January 26, 2024

Michael Lewis' "Going Infinite"

Tears for Fears - Secret World 



So I finished Going Infinite and my envy of Michael Lewis is even greater. (My gawd, he could write the pants off a bumblebee.**) I identified with a fair bit of SBF's childhood, although not his mind. (I also saw a few similarities between how SBF's parents raised him and how we parented.)

After reading it, I stand by my SBF analysis even more (and by the chapter here, "My Expected Value Is Greater Than Yours"*).

SBF's story is even crazier than I knew - it makes Elizabeth Holmes and Theranos seem like nothing. This still stands out as Lewis' greatest insight:

One day some historian of effective altruism will marvel at how easily it transformed itself. It turned its back on living people without bloodshed or even, really, much shouting. You might think that people who had sacrificed fame and fortune to save poor children in Africa would rebel at the idea of moving on from poor children in Africa to future children in another galaxy. They didn’t, not really—which tells you something about the role of ordinary human feeling in the movement. It didn’t matter. What mattered was the math. Effective altruism never got its emotional charge from the places that charged ordinary philanthropy. It was always fueled by a cool lust for the most logical way to lead a good life.

* Vox's Dylan Matthews summed it up nine years ago:

The common response I got to this was, "Yes, sure, but even if there's a very, very, very small likelihood of us decreasing AI risk, that still trumps global poverty, because infinitesimally increasing the odds that 10^52 people in the future exist saves way more lives than poverty reduction ever could."

The problem is that you could use this logic to defend just about anything. Imagine that a wizard showed up and said, "Humans are about to go extinct unless you give me $10 to cast a magical spell." Even if you only think there's a, say, 0.00000000000000001 percent chance that he's right, you should still, under this reasoning, give him the $10, because the expected value is that you're saving 10^32 lives. Bostrom calls this scenario "Pascal's Mugging," and it's a huge problem for anyone trying to defend efforts to reduce human risk of extinction to the exclusion of anything else.  
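The arithmetic behind the mugging is a one-line multiplication, which is the whole problem. Here's a minimal sketch in Python, using the figures from the quote (a 0.00000000000000001 percent chance, i.e. 1e-19, and the 10^52 future people mentioned earlier - the straight product actually comes out to 10^33, a notch above Matthews' 10^32, but the absurdity is identical):

    # Pascal's Mugging, reduced to its expected-value arithmetic.
    # Figures are the illustrative ones from the quoted scenario,
    # not an endorsement of the reasoning.
    p_wizard_right = 1e-19   # 0.00000000000000001 percent, as a probability
    future_lives = 1e52      # the far-future population figure in the quote
    cost_dollars = 10

    expected_lives_saved = p_wizard_right * future_lives
    print(f"Expected lives saved for ${cost_dollars}: {expected_lives_saved:.0e}")
    # Prints 1e+33. Any finite cost is dwarfed, which is exactly why
    # this logic can be used to "defend just about anything."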


** Given that I can hardly read two pages of my own latest book without cringing at something that needs to be edited, I did feel a perverse delight in the huge typo on p. 64. Sorry.
