Dr. Brian Klaas explains how humanity has gone from the “local instability and global stability” of hunter-gatherer times to a world where we now experience local order but global chaos. This shift, shaped by complex systems and non-linear dynamics, shows how tiny changes can trigger massive disruptions—and why clinging to outdated models leaves us exposed to the unpredictable.
Timestamps:
0:00: Modern volatility
1:20: Complex systems theory
6:06: The sandpile model
6:47: Basins of attraction
7:49: Black swan events
Transcript:
Modern volatility
Modern humans experience a different world and a different dynamic of existence than anyone who has ever come before us. And the reason is that we've inverted the dynamics of how our lives unfold. The people who came before us, the hunter-gatherers, lived in a world defined by what I call local instability, but global stability. Their day-to-day lives in their local environment were unpredictable, but the wider world around them changed very slowly from generation to generation. Now we have flipped that world. We experience local stability, but global instability. We have extreme regularity in our daily lives. We can order products online and expect exactly when they're going to arrive. We can go to Starbucks anywhere in the world and it's going to taste roughly the same. But our world is changing faster than it ever has before.
The consequence is that when things do go wrong, the ripple effects are much more profound and much more immediate. And this is where global instability becomes really dangerous. We've actually engineered a volatile world where Starbucks is completely unchanging from year to year, but democracies are collapsing and rivers are drying up.
Complex systems theory
So when we try to understand why things happen in social systems, the old way of thinking often involved what are called linear dynamics. In other words, a small cause has a small effect, a big cause has a big effect, and the relationship between them is linear. Now, the way the world actually works is non-linear, which means that sometimes a very small change can produce a very big effect. Complex systems theory understands this and tries to incorporate it into its models. It connects the dots. It requires non-linear thinking. It requires modeling a world in which a tiny shift or a little fluctuation that seems like noise can radically shift the way the future unfolds.
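To make the non-linearity concrete, here is a minimal Python sketch using the logistic map, a standard textbook toy model rather than anything Klaas cites: two trajectories that differ only in the eighth decimal place of their starting point end up in completely different places.

```python
# The logistic map: x_next = r * x * (1 - x). In the chaotic regime
# (r = 4), two trajectories that start a hair apart diverge completely.

def logistic_trajectory(x0, r=4.0, steps=50):
    """Iterate the logistic map from initial state x0."""
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1 - xs[-1]))
    return xs

a = logistic_trajectory(0.20000000)
b = logistic_trajectory(0.20000001)  # perturbed in the eighth decimal place

for n in (0, 10, 25, 50):
    print(f"step {n:2d}: |a - b| = {abs(a[n] - b[n]):.2e}")
# The gap starts around 1e-8 and grows to order 1: a noise-sized
# difference in the starting state produces a radically different future.
```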
In the history of trying to understand social change, there have been a few key assumptions that were incorrect but were sometimes useful for making sense of a very complex world. One of the first assumptions we start with is the idea that when you look at why something happens, there's going to be a clear-cut cause behind it. So, for example, think about the atomic bomb being dropped on Japan. If all sorts of things had been slightly different, this would not have played out the same way. If the Battle of Midway had not happened, or if Albert Einstein had not been born, or if uranium deposits had not been forged by geological forces, the event would not have unfolded exactly as it did on August 6th, 1945. All of these things had to exist just as they were. And yet our models don't do well with a vast number of causes producing a single effect.
The second assumption, which complex systems theory challenges and tells us is wrong, is that if you just understand the components of a system, you will therefore understand the entire system. The reason that's incorrect is that complex systems are different from what are called complicated systems. It sounds like a very small nuance, but it's a very important one. A Swiss watch, for example, is complicated but not complex. It's complicated because it has a huge number of interlocking parts, all these little things that need to work for the watch to perform its function. But it doesn't adapt when something breaks. If part of the watch breaks down, the watch just stops working.
A complex system is different: it's adaptive. We can see this in our own lives, because if you're in traffic, for example, and somebody pumps the brakes, it's not like everybody just keeps going and slams into them. They pump the brakes too, and that response ripples through the entire system. This adaptation within a complex system means that you can't just understand the constituent parts; you have to understand exactly how they interact with each other. The third big assumption, and this is central to social science and a lot of past understandings of change, is that if something was a pattern of cause and effect in the past, it will also be a pattern of cause and effect in the present or the future. The philosopher David Hume identified this problem, the problem of induction, centuries ago.
Now, the problem is that the world is changing more rapidly than ever before, which means the patterns of the past are less likely than ever to apply to our current moment or to the future. For example, there's an analysis of authoritarian regimes, how they work and why they're so stable, which says that Middle Eastern regimes are extremely resilient and extremely stable. And then, about a year or two after this book was published, the Arab Spring happens and all these regimes collapse in the span of about three months. Now, the immediate reaction is to think the theory was wrong. But a second way of thinking about it is that the world actually changed between the point when the book was written and the point when the regimes collapsed.
The theory is now incorrect, not because it was wrong in the past, but simply because the world has shifted. This matters because so much of modern life runs on models that assume that past patterns are predictive of future ones. AI is like this too: it learns from past patterns and then tries to develop machine learning-driven models that can help us navigate uncertainty in the future. But if the underlying world has shifted, then that analysis is not just going to be wrong, it's going to be dangerous.
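A toy sketch of that failure mode, with entirely invented numbers: a trivial "model" that learns the average of past observations keeps confidently predicting that average after the underlying world has shifted.

```python
import random

# A "model" that learns the average of past observations and assumes the
# pattern holds. All numbers, including the shift, are invented.

random.seed(0)
past = [random.gauss(10, 1) for _ in range(500)]    # the world the model saw
model = sum(past) / len(past)                       # the learned "pattern"

future = [random.gauss(25, 1) for _ in range(500)]  # the world quietly shifts
errors = [abs(model - obs) for obs in future]
print(f"learned value: {model:.1f}")
print(f"mean prediction error after the shift: {sum(errors) / len(errors):.1f}")
# The model isn't slightly noisy; it is confidently and consistently wrong,
# because the process that generated the past no longer exists.
```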
And the problem is that we can't really test this in advance. Complex systems involve diverse parts; not everything is uniform. The parts interact with each other and, crucially, they adapt to each other. Human society is obviously a complex system. We're not all the same, we're not uniform. We are a giant complex system of 8 billion interacting individuals. Now, there are a few other ideas within complex systems theory that I think are particularly useful for understanding how our lives unfold between order and chaos. We're in the middle there, and that's where complexity thrives.
The sandpile model
So the first one is called the sandpile model, the classic illustration of what is called self-organized criticality. You have a grain of sand, you add another grain, and another, and eventually you build up a pile. Now, at some point, the pile of sand gets so tall and so unstable that it sits on what physicists sometimes call the edge of chaos. And at this point, a single grain of sand can cause the entire pile to collapse. That collapse can be produced by something so tiny and yet have such huge effects, which is the non-linear dynamics of the system in action. This is something I think we have engineered in modern society because of what are called basins of attraction.
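Here is a minimal Python sketch of the idea, loosely following the Bak-Tang-Wiesenfeld model that this account of self-organized criticality comes from; the grid size and drop count are arbitrary illustrative choices.

```python
import random

# A toy Bak-Tang-Wiesenfeld sandpile. Any cell holding 4 or more grains
# topples, shedding one grain to each of its four neighbors; grains that
# leave the grid fall off the edge.

N, THRESHOLD = 20, 4
grid = [[0] * N for _ in range(N)]

def drop_grain():
    """Drop one grain at a random site, then topple until every site is
    stable again. Returns the avalanche size (number of topplings)."""
    i, j = random.randrange(N), random.randrange(N)
    grid[i][j] += 1
    avalanche, unstable = 0, [(i, j)]
    while unstable:
        x, y = unstable.pop()
        while grid[x][y] >= THRESHOLD:
            grid[x][y] -= 4  # topple: shed four grains to the neighbors
            avalanche += 1
            for nx, ny in ((x - 1, y), (x + 1, y), (x, y - 1), (x, y + 1)):
                if 0 <= nx < N and 0 <= ny < N:
                    grid[nx][ny] += 1
                    unstable.append((nx, ny))
    return avalanche

sizes = [drop_grain() for _ in range(50_000)]
print("largest avalanche:", max(sizes))
print("share of drops causing no avalanche:", sizes.count(0) / len(sizes))
# Most grains change nothing; now and then one identical grain levels a
# huge region of the pile. Same cause, wildly different effect.
```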
Basins of attraction
A basin of attraction is the set of states from which a system tends to evolve toward the same stable behavior over time. A great way of thinking about this is traffic on the highway. If you are driving down the highway, there is a speed limit. The speed limit anchors a basin of attraction for that system. Not everyone is going to drive 60 miles an hour when the speed limit is 60 miles an hour, but most people will drive somewhere near it. Additionally, the cars are going to be roughly evenly spaced. Of course, there might be a jerk who's tailgating, there might be someone way far behind you. But for the most part, an order emerges from a lot of interconnected individuals driving cars, producing a relatively predictable system. And that's supposed to be good for us, because efficiency is the main driver of so much of our modern social systems. But this means that the basin of attraction for our world is on the edge of chaos, at the absolute limit of the sandpile, so that when a grain of sand falls, it's more likely to cause an avalanche.
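Here is a minimal sketch of the highway example, with an invented adjustment rate and noise level: every simulated driver nudges their speed toward the limit, and wildly different starting speeds converge to the same narrow band.

```python
import random

# Each driver nudges their speed toward the 60 mph limit, plus some
# personal noise. RATE and NOISE are invented illustrative parameters.

LIMIT, RATE, NOISE = 60.0, 0.3, 2.0

def final_speed(v0, steps=40):
    """Evolve one driver's speed; the fixed point at LIMIT is the attractor."""
    v = v0
    for _ in range(steps):
        v += RATE * (LIMIT - v) + random.gauss(0, NOISE)
    return v

for v0 in (20, 55, 95):
    finals = [final_speed(v0) for _ in range(1000)]
    print(f"start {v0:>2} mph -> mean final speed {sum(finals) / len(finals):.1f} mph")
# Very different starting speeds all get pulled into the same narrow band
# around 60 mph. The set of starting states that end up there is the
# basin of attraction.
```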
Black swan events
A black swan is a term coined by Nassim Nicholas Taleb, and it refers to a rare, highly consequential event that was unpredictable, something you couldn't actually foresee. One of the things that produces an early warning sign in natural systems is something called critical slowing down, a young branch of science that I think will be developed further in the coming years and will become an important early warning system for black swan events. When critical slowing down happens, fluctuations stop settling back to equilibrium. They start to become really erratic, and researchers can measure how long it takes a system to snap back to normal. That early warning system can potentially provide us with a red flag that things are a little bit unstable.
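Here is a minimal sketch of the idea, not Klaas's own method: a system that relaxes back to equilibrium at a rate k, where k shrinks as a tipping point approaches, so the same small shock takes longer and longer to die out.

```python
# A system that relaxes back to equilibrium (x = 0) at rate k, where k
# shrinks as a tipping point at k = 0 approaches. The dynamics, the shock
# size, and the k values are all invented for illustration.

def recovery_steps(k, shock=1.0, tolerance=0.05, dt=0.1):
    """Kick the system by `shock` and count steps until it returns to
    within `tolerance` of equilibrium."""
    x, steps = shock, 0
    while abs(x) > tolerance:
        x += -k * x * dt  # simple relaxation: dx/dt = -k * x
        steps += 1
    return steps

for k in (1.0, 0.5, 0.2, 0.05):  # resilience eroding toward the tipping point
    print(f"recovery rate k = {k:<4} -> {recovery_steps(k):4d} steps to snap back")
# As k falls toward zero, the same small shock takes longer and longer to
# die out. That lengthening recovery time is the measurable red flag.
```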
This is where I think we make a profound error in understanding social disasters. We tend to say, "Oh, well, that was just a black swan, and now we've gone back to the normal way the world works." The problem with that is that you've misunderstood the origin of those black swans. You've developed a sandpile so tall that the avalanche is inevitable. It's part of the system; it will absolutely happen at some point. And so the lesson for us, I think, is to prioritize efficiency and optimization slightly less and build a slightly smaller, more resilient sandpile, so that when the randomness of life, the flukes, the noise, does happen, we don't end up with disaster. When we think about our daily lives and navigate our world, we tend to experience what I call the mirage of regularity. It's very easy to be seduced by it, because things unfold in highly predictable ways, and because of this we start to think, "Okay, the world is controllable." This is, however, a mirage.
And I think this is where the sandpile model and some of the other ideas from complex systems theory are so important for correcting that mirage and making us understand that it is just an illusion. Throughout the 21st century, the mirage of regularity has let us delude ourselves into thinking we can make reliable forecasts. But time and again, those forecasts have been invalidated by black swan events. Just imagine going back and reading the economic forecasts, written in late 2019, of what the world in 2020 would look like. Those forecasts were so unbelievably wrong. And one of the things worth thinking about is that we could never have imagined some of these dynamics until they happened to us. There are things we call radical uncertainty, where we simply cannot comprehend the future in any possible way because the idea doesn't even occur to us.
We can't tame the world, and we can't predict it accurately. So we have to separate the questions we have to answer from the questions we don't. If we parcel out the parts of the world that we genuinely have to navigate through this uncertainty, we'll make fools of ourselves slightly less often, and that will be a good thing for our societies.