## Heavy-Tailed Distributions: What Lurks Beyond Our Intuitions?

Understanding heavy-tailed distributions is important when assessing the likelihood and scale of possible disasters – especially relevant to existential risk and global catastrophic risk analysis. How likely is civilization to be devastated by a large-scale disaster, or even to go extinct?

In this video, Anders Sandberg discusses (with the aid of a whiteboard) how heavy-tailed distributions account for more than our intuitions tell us.

Considering large-scale disasters may be far more important than we intuit.

### Transcript of dialog

So typically when people talk about probability they think about a nice probability distribution like the bell curve, or Gaussian curve. This means that it’s most likely that you get something close to zero, and then less and less likely that you get very positive or very negative values – and this is a rather nice-looking curve.

However, many things in the world turn out to have much nastier probability distributions. A lot of disasters, for example, have a power law distribution. So if this is the size of a disaster and this is the probability, they fall off like this. This doesn’t look very dangerous at first. Most disasters are fairly small: there’s a high probability of something close to zero and a low probability of something large. But it turns out that the probability of getting a really large one can become quite big.

So suppose this one has alpha equal to 1 – that means the chance of getting a disaster of size 10 is proportional to 1 in 10, a disaster 10 times as large has just a tenth of that probability, and one 10 times as large as that big disaster has, again, a tenth of that.

That means that we have quite a lot of probability of getting very, very large disasters. In the Gaussian case, getting something that is very far out here is exceedingly unlikely, but in the case of power laws you can actually expect to see some very, very large outbreaks.
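The scaling described here can be checked numerically. This is a minimal sketch, assuming a Pareto power law with minimum event size 1 and alpha = 1, compared against a standard Gaussian; the function names are my own, not from the video.

```python
import math

def pareto_sf(x, alpha=1.0, xm=1.0):
    """Survival function P(X > x) for a Pareto power law with minimum size xm."""
    return (xm / x) ** alpha

def normal_sf(x, mu=0.0, sigma=1.0):
    """Survival function P(X > x) for a Gaussian, via the complementary error function."""
    return 0.5 * math.erfc((x - mu) / (sigma * math.sqrt(2)))

# Each tenfold increase in disaster size only divides the power-law
# probability by ten, while the Gaussian tail collapses almost instantly.
for size in (10, 100, 1000):
    print(size, pareto_sf(size), normal_sf(size))
```

With alpha = 1, an event of size 1000 still has probability 1 in 1000 under the power law, while under the Gaussian a "size 10" event (10 standard deviations out) is already astronomically unlikely.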

So if you think about the times that various disasters happen – they happen irregularly, and occasionally one is through the roof, and then another one. You can’t of course tell when they will happen – that’s random. And you can’t really tell how big they are going to be, except that they’re going to be distributed in this way.

The real problem is that when something is bigger than any threshold that you imagine, well, it’s not just going to be a little bit taller – it’s going to be a whole lot taller.

So if we’re going to see a war, for example, even as large as the Second World War, we shouldn’t expect it to kill just a million more people. We could expect it to kill tens or most likely hundreds of millions, or even a billion people more – which is a rather scary prospect.

So the problem here is that disasters seem to have these heavy tails. A heavy tail, in probability slang, means that the probability mass over here – the chance that something very large happens – falls off very slowly. And this is of course a big problem, because we tend to think in terms of normal distributions.
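The difference between the two regimes shows up clearly in simulation. This is a hedged sketch, not from the video: it draws samples from a Gaussian and from a Pareto power law (via inverse-transform sampling) and compares how far the largest observation sits above a typical one.

```python
import random

random.seed(0)

def pareto_sample(alpha=1.0, xm=1.0):
    # Inverse-transform sampling: if U ~ Uniform(0, 1),
    # then xm / U**(1/alpha) follows a Pareto power law.
    return xm / random.random() ** (1.0 / alpha)

n = 100_000
normal = sorted(abs(random.gauss(0, 1)) for _ in range(n))
heavy = sorted(pareto_sample() for _ in range(n))

# Ratio of the largest observation to the median observation:
# modest for the Gaussian, enormous for the power law.
print("normal max/median:", normal[-1] / normal[n // 2])
print("pareto max/median:", heavy[-1] / heavy[n // 2])
```

For the Gaussian, the biggest of 100,000 draws is only a handful of times larger than the median; for the heavy-tailed sample, the biggest event dwarfs the typical one by orders of magnitude – the "bigger than any threshold you imagine" effect.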

Normal distributions are nice. We say they’re normal because a lot of the things in our everyday life are distributed like this. The height of people, for example – very rarely do we meet somebody who’s a kilometer tall. However, when we meet people and think about how much they’re making, or how much money they have – well, Bill Gates. He is far, far richer than just ten times you and me; he’s actually way out here.

So when we get to the land of these fat, heavy tails, both the richest people (if we’re talking about wealth) and the worst dangers (if we’re talking about disasters) tend to be much bigger than we can normally imagine.

And the problem is, of course, that our intuitions are all shaped by what’s going on here in the normal realm. We have experience of what has happened so far in our lives, and once we venture out here and talk about very big events, our intuitions suddenly become very bad. We make mistakes. We don’t really understand the consequences, cognitive biases take over, and this can of course completely mess up our planning.

So we invest far too little in handling the really big disasters and we’re far too uninterested in going for the big wins in technology and science.

We should pay more attention to probability theory (especially heavy-tailed distributions) in order to discover and avoid disasters that lurk beyond our intuitions.

Also see –
– Anders Sandberg: The Survival Curve of Our Species: Handling Global Catastrophic and Existential Risks

Anders Sandberg on Wikipedia: https://en.wikipedia.org/wiki/Anders_Sandberg

Many thanks for watching!
