Thursday, July 15, 2021

My expected value is bigger than yours (from Vox)

Reading through some pages on the Open Philanthropy Project's site, I re-discovered this Vox article: I spent a weekend at Google talking with nerds about charity. I came away … worried. It reminds me of a number of posts on this blog (e.g., Against Big Numbers) and makes some other good points. A few excerpts:

The common response I got to this was, "Yes, sure, but even if there's a very, very, very small likelihood of us decreasing AI risk, that still trumps global poverty, because infinitesimally increasing the odds that 10^52 people in the future exist saves way more lives than poverty reduction ever could."

The problem is that you could use this logic to defend just about anything. Imagine that a wizard showed up and said, "Humans are about to go extinct unless you give me $10 to cast a magical spell." Even if you only think there's a, say, 0.00000000000000001 percent chance that he's right, you should still, under this reasoning, give him the $10, because the expected value is that you're saving 10^32 lives. Bostrom calls this scenario "Pascal's Mugging," and it's a huge problem for anyone trying to defend efforts to reduce human risk of extinction to the exclusion of anything else. 
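The expected-value arithmetic driving the mugging can be checked directly. Below is a minimal sketch using the figures quoted above, with the "0.00000000000000001 percent" chance read as a probability of 1e-19; the exact zero count (and hence the resulting order of magnitude) is an assumption on my part.

```python
# Sketch of the expected-value arithmetic behind "Pascal's Mugging".
# Figures are taken from the quote above: a 0.00000000000000001 percent
# chance the wizard is right (read here as a probability of 1e-19) and
# 10^52 future lives at stake. The zero count is an assumption.

def expected_lives_saved(p_wizard_right: float, lives_at_stake: float) -> float:
    """Expected number of lives saved by paying the wizard $10."""
    return p_wizard_right * lives_at_stake

ev = expected_lives_saved(1e-19, 1e52)
print(f"Expected lives saved for a $10 spell: {ev:.3g}")
```

With these readings the product comes out around 10^33 rather than the quote's 10^32, but the point is indifferent to an order of magnitude either way: multiply a tiny-enough probability by a big-enough payoff and the product dwarfs any ordinary intervention, which is exactly why the reasoning proves too much.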

[U]ltimately you have to stop being meta ... if you take meta-charity too far, you get a movement that's really good at expanding itself but not necessarily good at actually helping people [or other animals -ed].
