About the author

I am the author, co-author, secondary-author, ghost-author, and non-author of articles, speeches, book chapters, and even entire books! The most recent can be found at LosingMyReligions.net. Currently, I am President of One Step for Animals; previously, I was shitcanned from so many nonprofits that we can’t list them all here. Before my unfortunate encounter with activism, I was an aerospace engineer who wanted to work for NASA (to impress Carl Sagan). My hobbies include photography, almost dying, and {REDACTED}. I live in Tucson with my soulmate and reluctant editor Anne, along with the occasional snake and scorpion.

Sunday, January 23, 2022

What I want from long-termists

I continually rip on longtermists on this blog (e.g., 1, 2). There's one main thing I'd like from that community:

A recognition of sign-uncertainty (aka cluelessness). That is, we don't know whether our actions aimed at the long-term future will have a positive or negative impact. There are plenty of examples, but one involves work on AI. It is entirely possible that by trying to rein in / slow down the development of AI in the United States (e.g., to force researchers to stop and try to address the alignment problem), an unfettered AI from China could be first and pre-empt every other attempt.

I don't buy the "AI is a threat to humanity" / "AI will be our god" claims. But if you did believe that, it just seems really difficult to feel very confident that your actions would actually increase the probability of a good outcome.

Also, maybe a regular and overt admission of opportunity costs; e.g., that writing an endless series of million-word essays about a million years from now means you are actively choosing not to help people who are suffering terribly right now.
