Open-Ended vs. Closed-Minded Conceptions of Superintelligence

This talk is part of the ‘Stepping Into the Future’ conference. Abstract: Superintelligence, the next phase beyond today’s narrow AI and tomorrow’s AGI, almost intrinsically evades our attempts at detailed comprehension. Yet very different perspectives on superintelligence exist today, and they have concrete influence on thinking about matters ranging from AGI architectures to technology regulation. One paradigm…

Ethics, Qualia Research & AI Safety with Mike Johnson

What’s the relationship between valence research and AI ethics? Hedonic valence is a measure of the quality of our felt sense of experience: the intrinsic goodness (positive valence) or averseness (negative valence) of an event, object, or situation. It is an important aspect of conscious experience, always present in our waking lives. If we seek to…

Can we build AI without losing control over it? – Sam Harris

Sam Harris (author of The Moral Landscape and host of the Waking Up podcast) discusses the need for AI Safety. While fun to think about, we are unable to “marshal an appropriate emotional response” to improvements in AI and automation and to the prospect of dangerous AI – it’s a failure of intuition to respond…

Anders Sandberg – The Technological Singularity

Anders gives a short tutorial on the Singularity – clearing up confusion and highlighting important aspects of the Technological Singularity and related ideas, such as accelerating change, horizons of predictability, self-improving artificial intelligence, and the intelligence explosion. Points covered in the tutorial video: the Mathematical Singularity; the Technological Singularity as a horizon of predictability; confusion…

Singularity Skepticism or Advocacy – to what extent is it warranted?

Why are some people so skeptical of the possibility of Super-intelligent Machines, while others take it quite seriously? Hugo de Garis addresses both ‘Singularity Skepticism’ and advocacy – reasons for believing machine intelligence is not only possible but quite probable! The Singularity will likely be an unprecedentedly huge issue that we will need to face…

The long-term future of AI (and what we can do about it) : Daniel Dewey at TEDxVienna

This has been one of my favourite short talks on AI impacts – simple, clear, and straight to the point. Recommended as an introduction to the ideas referred to in the title. I couldn’t find the audio of this talk at TED, so it has been added to archive.org. Daniel Dewey is a research…

Michio Kaku – A History of a Time to Come

Science, Technology & the Future interviews Dr. Michio Kaku on Artificial Intelligence and the Singularity, Biotech, and Nanotechnology. What is driving this revolution? How does your background in Theoretical Physics shape your view on the future of the mind? Intelligence enhancement, an internet of the mind – brain-net, like a hive…

Michio Kaku – The Future of the Mind – Intelligence Enhancement & the Singularity

Scifuture interview with popular scientist Michio Kaku on the Scientific Quest to Understand, Enhance & Empower the Mind! The audio of this interview is found here. Dr. Michio Kaku advocates thinking about some of the radical Transhumanist ideas we all know and love – here he speaks on the frontiers of Neuroscience, Intelligence Enhancement, the…

The Singularity & Prediction – Can there be an Intelligence Explosion? – Interview with Marcus Hutter

Can there be an intelligence explosion? Can intelligence explode? The technological singularity refers to a hypothetical scenario in which technological advances virtually explode. The most popular scenario is the creation of super-intelligent algorithms that recursively create ever higher intelligences. What could it mean for intelligence to explode? We need to provide a more careful treatment of…

Can Intelligence Explode? – Marcus Hutter at Singularity Summit Australia 2012

Abstract: The technological singularity refers to a hypothetical scenario in which technological advances virtually explode. The most popular scenario is the creation of super-intelligent algorithms that recursively create ever higher intelligences. After a short introduction to this intriguing potential future, I will elaborate on what it could mean for intelligence to explode. In the course…