What's long-term about "longtermism"?
I still disagree with many of the fundamental premises, but it is worth a quick skim. E.g.:
Suppose right now there’s a 0.001 percent chance that climate change could generate a catastrophic feedback mechanism that leads to human extinction, and doing a Thanos snap and killing half of everyone reduces that to 0.0001 percent. A certain kind of longtermist logic says you should do the snap, which I think most people would find odd.
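To make that logic concrete, here is a rough sketch of the expected-value arithmetic behind the conclusion the quote describes. The two extinction probabilities (0.001 percent and 0.0001 percent) come from the quote; the current-population figure and the 10^16 potential future people are hypothetical placeholders of the kind longtermist estimates tend to use, not numbers from the post.

```python
# Sketch of the naive total-expected-lives calculus implied by the quote.
# Extinction probabilities are from the quote; population figures below are
# assumed placeholders, not figures from the post.

current_population = 8e9            # assumed rough present-day population
potential_future_people = 1e16      # assumed stand-in for longtermist estimates

p_extinction_baseline = 0.001 / 100     # 0.001% chance, from the quote
p_extinction_after_snap = 0.0001 / 100  # 0.0001% chance, from the quote

# Expected future lives lost to extinction under each choice.
loss_baseline = p_extinction_baseline * potential_future_people
loss_after_snap = p_extinction_after_snap * potential_future_people

# Immediate cost of the snap: half of everyone alive today.
snap_cost = current_population / 2

# Under this calculus, the snap "wins" whenever the reduction in expected
# future losses exceeds the immediate deaths.
net_benefit_of_snap = (loss_baseline - loss_after_snap) - snap_cost
print(f"Net expected lives saved by the snap: {net_benefit_of_snap:.3e}")
# With these placeholder numbers: the reduction is ~9e10 expected future
# lives versus 4e9 immediate deaths, so the calculus endorses the snap --
# the conclusion the quote calls odd.
```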