Episode Details
“Optimistic Assumptions, Longterm Planning, and ‘Cope’” by Raemon
Description
Eliezer Yudkowsky periodically complains about people coming up with questionable plans with questionable assumptions to deal with AI, and then either:
- Saying "well, if this assumption doesn't hold, we're doomed, so we might as well assume it's true."
- Worse: coming up with cope-y reasons to assume that the assumption isn't even questionable at all. It's just a pretty reasonable worldview.
Some people complain about Eliezer being a doomy Negative Nancy who's overly pessimistic.
I had an interesting experience a few months ago when I ran some beta-tests of my Planmaking and Surprise Anticipation workshop, that I think are illustrative.
i. Slipping into a more Convenient World
I have an exercise where I give people [...]
---
Outline:
(00:59) i. Slipping into a more Convenient World
(04:26) ii. Finding traction in the wrong direction.
(06:47) Takeaways
---
First published:
July 17th, 2024
Source:
https://www.lesswrong.com/posts/8ZR3xsWb6TdvmL8kx/optimistic-assumptions-longterm-planning-and-cope
---
Narrated by TYPE III AUDIO.