This is an acronym I came across a lot in my studies. VUCA stands for volatile, uncertain, complex, and ambiguous. It’s used to describe the world we live in and is often cited as the number one reason why we need innovation. It’s also a useful reminder for any individual or organization to avoid overconfidence.

The world is volatile. It moves unpredictably and never really trends in one direction for long. Human-made systems in particular move faster and fluctuate more as we progress. When our thinking and actions couldn’t keep up, we invented computers to do some of the thinking and acting for us. The result was more interconnectedness and faster systems, which often makes mistakes more devastating.

Uncertainty is a given and always was. Even with thousands of years of accumulated knowledge, we still don’t know shit about how much of the world really works. It’s shocking how many of the things we think we know are based on assumptions. Often useful ones, for sure, but still, assumptions that might fail at some point. Uncertainty means unpredictability. We don’t like that very much. We don’t like not knowing what’s coming, so we typically give in to the illusion that the past is an indicator of the future. Sometimes this holds true, frequently it does not, and the mistake costs us dearly.

Complexity is another reason for the unpredictability of the world. There are so many interacting and interdependent factors and variables at play. Even if we knew how each of these relates to the others, it wouldn’t help us understand the whole system. Complex means that understanding the parts doesn’t mean you understand the system.

Finally, the world is ambiguous. Even though we occasionally think we understand things, there is always a surprise around the corner. Many questions outside of physics (and even there, as quantum physics might suggest) have ambiguous answers that don’t fit with our boolean way of judging things.

The essence of VUCA is that you don’t know what’s coming, and you can’t figure it out. Many have tried and are still trying, which is fine as long as you don’t put your survival at risk. When it comes to surviving in the long run, there is no success in prediction, only in preparedness and adaptation.

Nassim Taleb says that you can’t measure risk because you can’t predict the future. You don’t know what’s going to happen or how bad it’s going to be. That’s because of VUCA. Instead, one should measure fragility — how vulnerable a system is to shocks. If a system doesn’t like volatility, it’s fragile. If it isn’t harmed by volatility, it’s robust. And finally, if it even improves because of volatility, it’s antifragile — the opposite end of the fragility spectrum.
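One way to make this spectrum concrete is Taleb’s convexity framing: a fragile system responds concavely to shocks (swings hurt more than they help), a robust one responds linearly, and an antifragile one responds convexly (swings help more than they hurt). Here’s a toy sketch of that idea — the three payoff functions are my own illustrative choices, not anything from the original text:

```python
import random

random.seed(0)

# Three stylized responses to a shock x (hypothetical, for illustration):
def fragile(x):      # concave: volatility only hurts
    return -x * x

def robust(x):       # linear: volatility averages out
    return x

def antifragile(x):  # convex: volatility only helps
    return x * x

# Simulate many random shocks around a calm baseline of zero.
shocks = [random.gauss(0, 1) for _ in range(100_000)]

for name, f in [("fragile", fragile), ("robust", robust),
                ("antifragile", antifragile)]:
    avg_outcome = sum(f(s) for s in shocks) / len(shocks)
    print(f"{name:12s} average outcome under volatility: {avg_outcome:+.2f}")
```

The fragile system ends up worse off on average than it would be in a calm world, the robust one is roughly unchanged, and the antifragile one comes out ahead — volatility itself is what separates them, without anyone predicting a single shock.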

There should always be two types of efforts going on. First, let’s continue our quest to figure out how the world works, despite its VUCA properties. We have the power to know more and understand better, and many things that were once in VUCA land no longer are (some types of natural disasters, for example). Our predictive powers grow, and with them our agency.

At the same time, we need to accept that there will probably always be more we don’t know. We must stay humble. Don’t trust your ability to understand the world when it comes to risk management. Instead, the other effort should go into building robust or even antifragile systems that survive despite a VUCA world.