Discussion about this post

Glenn:

My similar take, which I arrived at recently, is that it's obviously best if we build God, but that dying in the attempt is still better than the status quo, because wild animal suffering makes global welfare net negative. If I were a Yudkowskyite and thought superintelligence would wipe out all life no matter what, I'd think we should go full bore on developing it. The only reason to pump the brakes is if you think there's some possibility that superintelligence could give us an extremely positive future, or at least a more positive one under careful development than under reckless development.

