Cancelling the Apocalypse – Avoiding Civilisational Collapse

Cancelling the Apocalypse – Why do civilizations collapse? And is our own civilization ripe for collapse? 2:19 Are we on the road to civilizational collapse? 11:01 Why we need global governance, not global government 18:20 The double-edged sword of civilizational complexity 24:11 The apocalyptic scenario Bob fears most 42:01 Luke: Not every problem has a…

The Great Filter, a possible explanation for the Fermi Paradox – interview with Robin Hanson

I grew up wondering about the nature of alien life – what it might look like, what aliens might do, and whether we will discover any anytime soon. Yet aside from a number of conspiracy theories, and conjecture about Tabby’s Star, so far we have not discovered any signs of life out there in the cosmos….

Can we build AI without losing control over it? – Sam Harris

Sam Harris (author of The Moral Landscape and host of the Waking Up podcast) discusses the need for AI safety – while fun to think about, we are unable to “marshal an appropriate emotional response” to improvements in AI and automation and to the prospect of dangerous AI – it’s a failure of intuition to respond…

Can We Improve the Science of Solving Global Coordination Problems? Anders Sandberg

Anders Sandberg discusses solving coordination problems, including game theory: the prisoner’s dilemma (and its iterated form), the tit-for-tat strategy, and reciprocal altruism. He then discusses politics and why he considers himself a ‘heretical libertarian’, contrasts the benefits and risks of centralized planning vs distributed trial & error, and links this in…
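The iterated prisoner’s dilemma and the tit-for-tat strategy mentioned above can be sketched in a few lines of Python. This is a toy illustration using the standard payoff values (3/5/1/0), not code from the interview:

```python
def tit_for_tat(history_self, history_opp):
    # Cooperate on the first round, then mirror the opponent's last move.
    return history_opp[-1] if history_opp else "C"

def always_defect(history_self, history_opp):
    # A simple baseline strategy to play against.
    return "D"

# Standard prisoner's dilemma payoffs for (row, column) players:
# temptation 5 > reward 3 > punishment 1 > sucker's payoff 0.
PAYOFF = {("C", "C"): (3, 3), ("C", "D"): (0, 5),
          ("D", "C"): (5, 0), ("D", "D"): (1, 1)}

def play(strategy_a, strategy_b, rounds=10):
    """Run an iterated game and return total scores for both players."""
    history_a, history_b = [], []
    score_a = score_b = 0
    for _ in range(rounds):
        move_a = strategy_a(history_a, history_b)
        move_b = strategy_b(history_b, history_a)
        pay_a, pay_b = PAYOFF[(move_a, move_b)]
        score_a += pay_a
        score_b += pay_b
        history_a.append(move_a)
        history_b.append(move_b)
    return score_a, score_b
```

Two tit-for-tat players cooperate throughout and both do well; against a pure defector, tit-for-tat loses only the first round and then defends itself – the property that made it famous in Axelrod’s tournaments.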

Heavy-Tailed Distributions: What Lurks Beyond Our Intuitions?

Understanding heavy-tailed distributions is important for assessing the likelihood and impact of possible disasters – especially relevant to x-risk and Global Catastrophic Risk analysis. How likely is civilization to be devastated by a large-scale disaster, or even to go extinct? Anders discusses how heavy-tailed distributions account for more than our intuitions tell us….
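A rough numerical illustration of the point (not from the talk): compare a light-tailed Gaussian sample with a heavy-tailed Pareto (power-law) sample. In the heavy-tailed case, a single worst event can account for a noticeable share of the total – exactly what intuitions trained on bell curves miss. The parameter choices here are arbitrary assumptions for the sketch:

```python
import random

def pareto_sample(alpha, n, rng):
    # Pareto draws via inverse-transform sampling: x = 1 / U^(1/alpha),
    # with U in (0, 1]. Smaller alpha means a heavier tail.
    return [1.0 / (1.0 - rng.random()) ** (1.0 / alpha) for _ in range(n)]

def gaussian_sample(n, rng):
    # Light-tailed comparison sample, mean 1, standard deviation 1.
    return [rng.gauss(1.0, 1.0) for _ in range(n)]

rng = random.Random(0)  # fixed seed for reproducibility
n = 100_000
heavy = pareto_sample(1.5, n, rng)  # alpha = 1.5: finite mean, infinite variance
light = gaussian_sample(n, rng)

# Share of the total accounted for by the single largest draw:
print("heavy-tailed worst-event share:", max(heavy) / sum(heavy))
print("light-tailed worst-event share:", max(light) / sum(light))
```

With the Gaussian, the largest of 100,000 draws is a tiny fraction of the sum; with the Pareto sample it is orders of magnitude larger – the statistical signature of domains where one catastrophe can dominate all others combined.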

Nick Bostrom: Why Focus on Existential Risk related to Machine Intelligence?

One can think of existential risk as a subcategory of global catastrophic risk – while GCRs are very bad, civilization has the potential to recover from a global catastrophic disaster. An existential risk is one from which there is no chance of recovery. An example of the sort of disaster that fits the…