August 10, 2020 · 9 min

When we say the world is VUCA we fall short

The COVID crisis is helping many people become aware of how complex and unpredictable the world we live in is.

Santi Garcia

The COVID crisis shows that we live in a hyper-connected world where small events in a remote place can end up causing catastrophes of planetary dimensions. It is the idea that the American mathematician and meteorologist Edward Lorenz tried to convey in the 1960s with the metaphor of the ‘butterfly effect’: the futures of two worlds that initially differ only in whether a butterfly flaps its wings in one of them can end up being very different, because of the chain reactions that this small difference sets off. It is also the object of chaos theory, of which Lorenz was a pioneer: the behavior of nonlinear dynamical systems so sensitive to small variations in their initial conditions that it is practically impossible for us to predict their future, even though they would be predictable if we were capable of knowing all their initial conditions perfectly.
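To make that sensitivity to initial conditions tangible, here is a minimal numerical sketch (not from the original article) using the logistic map, a textbook chaotic system, rather than Lorenz's own weather equations. Two trajectories that start a billionth apart soon stop resembling each other, even though the rule that generates them is fully deterministic.

```python
# Minimal sketch (illustrative, not from the article): sensitivity to initial
# conditions in the logistic map, a classic chaotic system.

def logistic_map(x, r=4.0):
    """One step of the logistic map x -> r * x * (1 - x); chaotic for r = 4."""
    return r * x * (1 - x)

x_a, x_b = 0.400000000, 0.400000001  # two initial conditions, 1e-9 apart

for step in range(1, 41):
    x_a, x_b = logistic_map(x_a), logistic_map(x_b)
    if step % 10 == 0:
        print(f"step {step:2d}: A = {x_a:.6f}, B = {x_b:.6f}, "
              f"gap = {abs(x_a - x_b):.6f}")

# After a few dozen iterations the two trajectories bear no resemblance to
# each other, although the system is fully deterministic.
```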

Small events can have enormous consequences that are very difficult to anticipate. This is also the idea underlying the black swan theory that Nassim Nicholas Taleb developed in his book The Black Swan. For Taleb, a black swan is an event that:

  • Comes as a surprise, since nothing indicated that it could happen; 
  • Has an extreme impact (Taleb considers that many significant scientific discoveries, facts, and achievements in history, such as the Internet or the First World War, can be classified as “black swans”);
  • Once it has happened, tends to be rationalized in hindsight, as if we could have foreseen it.

Some analysts consider the coronavirus pandemic to be one of those black swans. However, other authors, including Taleb himself, do not see it that way. For Taleb, the COVID pandemic is not a black swan but a white swan: we knew it could happen (the World Health Organization itself had warned of this risk), and we could have anticipated its consequences for the world economy, given the degree of connectivity of the world in which we live. Also borrowing metaphors from the animal kingdom, other analysts refer to the COVID crisis as a gray rhino (an expression Michele Wucker used to describe the Greek crisis of 2012) or as a black elephant (a term Adam Sweidan used in 2014 for “elephant in the room” environmental risks that display some black swan attributes). In any case, whether it is a swan (black or white), an elephant, or a rhino, the point is that the coronavirus, a particle just 0.1 micrometers in diameter, has turned our world upside down.

This idea that we live in an increasingly complex and difficult-to-predict world is also implicit in the acronym V.U.C.A. that we have been using for years. Coined in military circles in the last years of the Cold War to describe the context of the new conflicts arising at the time, V.U.C.A. is nowadays commonly used in business circles to express how volatile, uncertain, complex, and ambiguous the business environment is. However, some experts think that when we say we live in a V.U.C.A. world, we fall short.

In recent decades the complexity of the world has increased to the point that it borders on chaos. Some even say we are experiencing a “change of state,” as when water turns into steam. More and more, we face situations in which it is no longer that we do not know what we ignore (unknown unknowns); it is that we are not capable of discovering it (unknowable unknowns). These are situations whose consequences are not merely uncertain, but practically impossible to predict.

This is the thesis that Jamais Cascio, a member of the Institute for the Future and one of Foreign Policy magazine’s Top 100 Global Thinkers, has been defending for years, at least since 2018. Cascio argues that we need to replace, or at least complement, the conceptual framework provided by the V.U.C.A. model (which, for the reasons above, he considers insufficient) with a new one that helps us make sense of, accept, and guide our actions in a world that is more than V.U.C.A. In Cascio’s words, a framework that allows us “to illustrate the scale of the disruptions, the chaos, underway, and enable consideration of what kinds of responses would be useful.”

Cascio summarizes this alternative framework with another acronym: B.A.N.I., a term made up of the initials of the English words Brittle, Anxious, Non-linear, and Incomprehensible.

A Brittle World

Many of the social, economic, and technological systems we interact with are more fragile than they appear. They are “brittle” systems whose performance collapses the moment an input exceeds a certain level or environmental conditions move outside certain parameters, instead of degrading gradually. The most dangerous thing is that these systems look solid, which makes us feel confident, so when they fail they catch us off guard. To use a humble analogy, it is what happens to the chocolate coating of an ice cream on a hot day: a first crack opens and, within seconds, we find ourselves juggling to keep the rest of the chocolate layer from crumbling and the melted interior from spilling onto our hands.

Additionally, many of the systems we interact with are susceptible to catastrophic cascading failures, a risk that grows as these systems become increasingly interconnected. The problem is that, in the pursuit of efficiency, we are building fragile systems. Yet our priority should be to build resilient systems or, going one step further, as Taleb proposes, to design ‘anti-fragile’ systems that not only recover quickly from the adverse situations they face but emerge stronger from them. How can we do that? Some of the things we can do to build less brittle, more anti-fragile systems are: allowing redundancies, building systems in layers to avoid single points of failure, resisting our tendency to suppress randomness, taking many small risks while avoiding the big risks that would finish us if they went wrong, keeping options open, respecting the ‘old’ that has managed to exist for a long time, and playing long term rather than short term.

An Anxious World

The world we live in is also a world that generates anxiety. We face new situations we are not familiar with, situations that surprise and disorient us. There are many things, perhaps too many, that we feel we cannot control. Everything seems potentially disastrous. Whatever we decide, our decisions will never be risk-free. Often we do not dare to make decisions, as if the only one who is never wrong were the one who never decides; at the same time, we feel that by not deciding we may be missing opportunities. We are also concerned that what we decide today could limit our room to maneuver if we need to change course in the future.

To all of the above, we must add the constant bombardment of news, much of it negative. We know how difficult it is, amid this avalanche of information, to distinguish truth from fake news, and yet we remain hooked on the media and social networks. We are distressed by the dystopian futures some draw for us and the conspiracy theories others raise, and at times we feel there is no way out. The very idea that we live in a V.U.C.A. world, even though most of us have come to accept it, generates anxiety. Understandably, cases of stress and depression, and even the number of suicides, are increasing. Empathy, mindfulness, disconnection, and trust are some of the resources that can help us in this scenario.

A Non-linear World

The majority of natural, economic, and social systems are non-linear. This means that introducing the same input into one of these systems does not always produce the same output, and that variations in the size or intensity of inputs can lead to changes in output that are far from proportional. Besides, the consequences of the actions we take on these systems often arrive with a long delay, which makes it even harder to identify cause-and-effect relations. Non-linear systems also have memory, a property that in physics is called “hysteresis”: their history influences their future behavior, to the point that two identical systems may behave very differently in the future if their pasts differ.
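As a minimal illustration (not from the original article) of that “memory,” consider a toy thermostat with a hysteresis band: faced with exactly the same temperature reading, it behaves differently depending on the path by which it arrived there.

```python
# Illustrative sketch (not from the article): hysteresis, i.e. a system whose
# response to the same input depends on its history.

class Thermostat:
    """Turns the heater on below `low` and off above `high`; in between it
    keeps whatever state it was already in -- that state is its memory."""

    def __init__(self, low=19.0, high=21.0):
        self.low, self.high = low, high
        self.heating = False

    def update(self, temperature):
        if temperature < self.low:
            self.heating = True
        elif temperature > self.high:
            self.heating = False
        return self.heating

t = Thermostat()

# The same reading (20 degrees) yields different behavior depending on history.
print(t.update(18.0), t.update(20.0))  # warming up from 18: True, True
print(t.update(22.0), t.update(20.0))  # cooling down from 22: False, False
```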

This apparent disconnect and disproportion between causes and effects contributes to the feeling that we live in a world where things happen randomly, when in reality they do not. What happens is that many of the systems we try to understand, or whose behavior we want to influence, are too complex for the technology available today. That is why, for example, we still cannot make accurate long-term weather forecasts. Consequently, the starting point should be to assume that, in nature, in the economy, and in society, the linear is the exception and the non-linear is the norm, and to have the humility to accept that there are many things that, at least for the time being, we humans are not able to understand.

An Incomprehensible World

As humans, we always seek explanations for the situations and phenomena we observe around us. The problem is that our conclusions are often too simplistic and can lead us to make the wrong decisions. This risk is even higher in a world that becomes more complex as its elements are increasingly interconnected, and the feedback loops between them multiply. We have more data than ever, but having more data is not always helpful in interpreting the situations we face. Information overload often creates more confusion and makes it difficult for us to separate signal from noise.

The widespread use of artificial intelligence solutions, whose potential never ceases to amaze us, also contributes to the feeling that the environment around us is incomprehensible. Algorithms seem to produce valuable inputs for decision making, but we often do not know how they reach their conclusions. We therefore run the risk that they are contaminated by the biases of the people who designed them, or by the biases present in the data used to train them. The incomprehensibility of the environment is also what leads companies to use A/B tests to make decisions: they observe how two solutions perform in the market and choose the one that gives better results, without having to understand in detail why that solution works better than the other.
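As a hypothetical sketch (not from the original article) of that logic, the snippet below compares two invented sets of results and simply ships whichever variant converts better, with no causal explanation required; all figures are made up for illustration.

```python
# Hypothetical A/B comparison (invented figures, for illustration only):
# choose the better-performing variant without explaining *why* it wins.

from math import sqrt

def conversion_rate(conversions, visitors):
    return conversions / visitors

def z_score(conv_a, n_a, conv_b, n_b):
    """Two-proportion z-test: gap between the rates relative to chance."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se

# Invented results of showing two versions of a page to similar audiences.
visitors_a, conversions_a = 5000, 400   # variant A: 8.0 %
visitors_b, conversions_b = 5000, 465   # variant B: 9.3 %

z = z_score(conversions_a, visitors_a, conversions_b, visitors_b)
rate_a = conversion_rate(conversions_a, visitors_a)
rate_b = conversion_rate(conversions_b, visitors_b)
winner = "B" if rate_b > rate_a else "A"

print(f"A converts at {rate_a:.1%}, B at {rate_b:.1%} (z = {z:.2f})")
print(f"Ship variant {winner} -- no causal explanation required.")
```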

It is, in short, a scenario in which we ask algorithms to explain how they reach their conclusions, but also a world in which we need to dust off a quality we had set aside in the belief that it was no longer needed in a high-tech, quantified work context: intuition.
