Cancelling the Apocalypse – Avoiding Civilisational Collapse

Cancelling the Apocalypse – Why do civilizations collapse? And is our own civilization ripe for collapse?
2:19 Are we on the road to civilizational collapse?
11:01 Why we need global governance, not global government
18:20 The double-edged sword of civilizational complexity
24:11 The apocalyptic scenario Bob fears most
42:01 Luke: Not every problem has a…

Can we build AI without losing control over it? – Sam Harris

Sam Harris (author of The Moral Landscape and host of the Waking Up podcast) discusses the need for AI safety – while it is fun to think about, we seem unable to “marshal an appropriate emotional response” to improvements in AI and automation and to the prospect of dangerous AI – it’s a failure of intuition to respond…

Heavy-Tailed Distributions: What Lurks Beyond Our Intuitions?

Understanding heavy-tailed distributions is important for assessing likelihoods and impact scales when thinking about possible disasters – especially relevant to xRisk and Global Catastrophic Risk analysis. How likely is civilization to be devastated by a large-scale disaster, or even to go extinct? Anders discusses how heavy-tailed distributions account for more than our intuitions tell us…
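
A minimal numerical sketch of the point (not from the talk itself; the distributions, Pareto tail index, and thresholds below are illustrative assumptions): given equal means, a heavy-tailed Pareto distribution assigns orders of magnitude more probability to extreme outcomes than a normal distribution does.

```python
import numpy as np

# Compare how often a light-tailed (normal) and a heavy-tailed (Pareto)
# distribution exceed large thresholds. The tail index, mean, and
# thresholds are arbitrary illustrative choices.
rng = np.random.default_rng(seed=42)
n = 1_000_000

alpha = 2.0                                       # Pareto tail index
pareto = rng.pareto(alpha, n) + 1                 # classical Pareto on [1, inf), mean = 2
normal = rng.normal(loc=2.0, scale=1.0, size=n)   # same mean, light tails

for k in (5, 10, 50):
    print(f"P(X > {k:>2}):  normal ≈ {(normal > k).mean():.1e}   "
          f"Pareto ≈ {(pareto > k).mean():.1e}")
```

With a tail index of 2, the Pareto sample still exceeds 50 roughly four times in ten thousand draws (its exact tail is P(X > k) = k^(-2)), while the matched normal essentially never does – the intuition-defying gap Anders describes.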

Nick Bostrom: Why Focus on Existential Risk related to Machine Intelligence?

One can think of existential risk as a subcategory of global catastrophic risk – while GCRs are very bad, civilization has the potential to recover from a global catastrophic disaster. An existential risk is one from which there is no chance of recovery. An example of the sort of disaster that fits the…