The Looming Shadow of the Intelligence Explosion: A Deep Dive with James Barrat

For Future Day 2025, James Barrat discussed his upcoming book ‘The Intelligence Explosion: When AI Beats Humans at Everything’ (Amazon link, Google Books link), expected to be released June 10, as well as another upcoming book on AI weapons (see video below). In a thought-provoking discussion, James Barrat, author of…

AI Go Foom Sometime Soon?

Intelligence is powerful – since natural selection stumbled on intelligence, organisms have developed increasingly complex modes of survival and reproduction. What evolution didn’t know was that its dominant position in shaping life on Earth would be superseded by intelligence. Intelligence can make sense of evolution by natural selection; it can make sense of the environments…

Can we build AI without losing control over it? – Sam Harris

Sam Harris (author of The Moral Landscape and host of the Waking Up podcast) discusses the need for AI Safety – while fun to think about, we are unable to “marshal an appropriate emotional response” to improvements in AI and automation and the prospect of dangerous AI – it’s a failure of intuition to respond…

Anders Sandberg – The Technological Singularity

Anders gives a short tutorial on the Singularity – clearing up confusion and highlighting important aspects of the Technological Singularity and related ideas, such as accelerating change, horizons of predictability, self-improving artificial intelligence, and the intelligence explosion. Tutorial Video: Points covered in the tutorial: The Mathematical Singularity; The Technological Singularity: A Horizon of Predictability; Confusion…

Sam Harris on AI Implications – The Rubin Report

A transcription of Sam Harris’ discussion of the implications of Strong AI during a recent appearance on the Rubin Report. Sam contrasts narrow AI with strong AI, discusses AI Safety, the possibility of rapid AI self-improvement, the idea that AI superintelligence may seem alien to us, and he also brings up the idea that it is important…

Is there a Meaningful Future for Non-Optimal Moral Agents?

In an interview last year, I had a discussion with John Danaher on the Hedonistic Imperative & Superintelligence – a concern he has with HI is that it denies or de-emphasises some kind of moral agency – in moral theory there is a distinction between moral agents (being a responsible actor able to make moral…

The Singularity & Prediction – Can there be an Intelligence Explosion? – Interview with Marcus Hutter

Can there be an Intelligence Explosion? Can Intelligence Explode? The technological singularity refers to a hypothetical scenario in which technological advances virtually explode. The most popular scenario is the creation of super-intelligent algorithms that recursively create ever higher intelligences. What could it mean for intelligence to explode? We need to provide a more careful treatment of…

Can Intelligence Explode? – Marcus Hutter at Singularity Summit Australia 2012

Abstract: The technological singularity refers to a hypothetical scenario in which technological advances virtually explode. The most popular scenario is the creation of super-intelligent algorithms that recursively create ever higher intelligences. After a short introduction to this intriguing potential future, I will elaborate on what it could mean for intelligence to explode. In this course,…

Vernor Vinge on the Technological Singularity

What is the Singularity? Vernor Vinge speaks about technological change, offloading cognition from minds into the environment, and the potential of Strong Artificial Intelligence. Vernor Vinge coined and popularised the term “Technological Singularity” in his 1993 essay “The Coming Technological Singularity”, in which he argues that the creation of superhuman artificial intelligence will mark the…