Surprisingly large parts of our world are simple, in the sense that nature’s predictability lets us design complex systems on top of that ‘simplicity’. As I mentioned in my earlier blog post ‘Complexity 101’ (a gentle introduction), scientists armed with little more than chalk and a blackboard were able to discover the theory of evolution in biology and most of the laws of physics, including gravity, relativity and even quantum mechanics. The equations that describe these theories may look complex, but they exploit the underlying simplicity and predictability of the world to describe it very accurately. That gives us, as the users of these equations, more control, and with it the ability to build complex technologies that exploit nature’s underlying simplicity. These theories and equations have been used to make astonishing predictions about nature that were proved correct again and again by hundreds of subsequent experiments and observations, precisely because the environments they describe are simple and predictable underneath. Engineers, like NASA’s Apollo engineers, then exploit this underlying simplicity to create spectacular successes, like the Apollo missions to the moon.
Newton is the real star pilot for NASA
So first up, let’s reconsider the NASA engineers who built the rockets that went to the moon. The mathematics seems complex to us, but to the NASA engineers this complex maths describes, in a sense, a ‘simple’ situation. Using the principles of Newtonian physics, they can calculate in advance where the rocket will end up, as long as they have the basic information to put into the equations. Once the rocket is on its way and has fired the astronauts’ module towards the moon, it is Isaac Newton who is really ‘in the driving seat’: the laws of Newtonian physics predictably describe where the lunar module will end up.
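To make the ‘Newton in the driving seat’ idea concrete, here is a toy calculation, not real trajectory code (a real lunar transfer would integrate the gravity of both Earth and Moon). It is a minimal sketch, assuming the simplest ballistic case of constant gravity and no air resistance; the function name and numbers are purely illustrative. The point is that in a predictable environment, Newton’s laws give us the answer before launch.

```python
# Toy illustration: in a predictable Newtonian environment, the outcome
# can be computed entirely in advance from the initial conditions.
import math

G = 9.81  # gravitational acceleration near Earth's surface, m/s^2

def landing_distance(speed, angle_deg):
    """Horizontal range of a projectile launched from the ground.

    Closed-form consequence of Newton's laws:
        range = v^2 * sin(2 * theta) / g
    No mid-flight information gathering is needed: the environment is
    predictable and the initial conditions are known.
    """
    theta = math.radians(angle_deg)
    return speed ** 2 * math.sin(2 * theta) / G

# A 45-degree launch at 100 m/s lands about 1019.4 m away.
print(round(landing_distance(100, 45), 1))  # → 1019.4
```

This is exactly the ‘rational planning’ situation: both conditions below hold, so the whole flight can be planned before anything moves.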
What we see, again and again, in science and engineering is that the predictability of an environment allows complex projects and plans to be created to negotiate even very challenging, complex problems, such as going to the moon or building a dam. Even when the costs of failure are high, the underlying predictability of nature allows seemingly dangerous and difficult problems to be tackled and successfully solved. We can therefore summarise this type of rational complexity, which builds on predictability, as follows:
Planning is rational, and can be complex, as long as the following conditions are in place:
Conditions for rational planning
The environment is sufficiently predictable
The necessary information about the environment can be supplied in advance, so the predictions that the plan is based on can actually be made
These two points might seem rather obvious, but they are worth emphasising because plans are still frequently made in unpredictable environments, or in environments where we lack sufficient information. In such cases one or both of the conditions above fail, and we can question whether planning is rational at all.
So we may ask the question: what sort of complexity, if any, is rational when the conditions for rational planning are not in place? I believe the answer is ‘agile complexity’. I will be arguing that this type of complexity is probably what we see in nature, and increasingly in business too, where complex software development projects are very often built according to the principles of ‘agile’ development rather than being heavily planned in advance. I will be talking about agile complexity a lot because, while the conditions for rational planning are well known, I think ‘rational agility’ is not nearly as well understood.
To see what rational agility might mean in practice, let’s look at some real-world complexity that occurs in unpredictable environments. The agile approach to mid-sized software development was developed by a loose-knit group of experienced software engineers. They came together in response to the failure of so many of the heavily planned IT projects they had been involved with: projects where planning was not that rational. These quite rebellious software guys remind me a little of the Chinese Taoist philosophers, since Taoism also developed in times of great uncertainty in ancient China. Admittedly, the programmers were probably more devoted to coffee drinking than to meditation. What both the Taoists and the agile software developers share, however, is an approach that rejects the urge to plan and predict everything about the project in advance. Instead, they favour seeking new information on a continual basis as even complex projects develop. This is fundamentally because their environments, both ancient China and modern 21st-century business, are inherently unpredictable.
Build and adjust, build and adjust
The agile software developers deal with unpredictability by taking a step forward in some direction each week, while continually gathering new, useful information as they go. To incorporate unpredicted or surprising new information, they adjust what they have already built as efficiently as possible, so that progress stays roughly predictable whatever the new information turns out to be. This approach is rational, I argue, when it is combined with scepticism about one’s own starting knowledge. If you know that there is unpredictability in both the environment and the goal, then agile complexity can actually be a more cautious and effective approach than planning too far ahead.
Agile software development works by allowing the software to be developed alongside the information gathering process rather than only after the information has been gathered.
Conditions for rational agility:
Any changes to what has already been built, made in response to new information, are not too expensive
Communication channels for checking what is needed now are frequent, inexpensive and reasonably reliable
There is a general understanding of the project, so that the new information obtained can be put to best use
This agile form of rational complexity is still not well understood mathematically, and making it precise is one of my long-term research goals: I want to formulate exactly when agility is rational. Yet rational agility is probably everywhere in our lives, once you know where to look, because it is a natural solution to unpredictability in the environment. When we organise a trip to the zoo, we may not only plan and try to predict, but also arrange our ‘plans’ ‘defensively’ by choosing a route that gives us options (a plan B) in case we suddenly learn, when we are already half-way there, that it is going to rain. This preparedness to react to unexpected information probably has a mathematical underpinning, but we do not yet understand that sort of mathematics in any detail. This is not because it is particularly difficult, but because rational agility is something we are only just getting good at in engineering and science. I believe the defensive aspect of agility also relates to, or is an aspect of, ‘resilience’ or ‘anti-fragility’, which has been discussed by people like Nassim Taleb (author of ‘The Black Swan’). Taleb talks about it in terms of building a more resilient financial sector that could survive economic crashes and unpredictability more effectively.
So, to summarise, we have explored two different kinds of rational complexity: ‘rational planning’ and now ‘rational agility’. In the next two blog articles I will discuss how the children’s game Twenty Questions can give further insight into what sort of complexity is rational in a given situation. I will use the different strategies of Twenty Questions to show how this also relates to the optimism or pessimism we may feel about achieving our goals. Is the glass half empty or half full? The best response, I reckon, is that it depends on whether we think the glass is simple or complex!
The website of the original agile manifesto: http://agilemanifesto.org/
Taleb, Nassim Nicholas. The Black Swan: The Impact of the Highly Improbable. Random House, 2007.