Muddle Through – to manage to do something although you are not organized and do not know how to do it. Cambridge Dictionary[1]
For approximately the last five years I have been putting thoughts together for a new book project. The tentative working title for the book is “Muddle”. I started thinking about “Muddle”, or more accurately “Muddle Through” while writing my book on complexity, “It’s Not Complicated: The Art and Science of Complexity in Business” which was published by University of Toronto Press.
Muddle brings together concepts from two areas of management that do not get nearly enough attention: complexity science and risk management. In addition to these two disciplines, my conceptualization of muddle also draws on behavioral economics as well as sociological economics. Behavioral economics studies the quirks and irrationalities we exhibit as individuals, while sociological economics studies the quirks and irrationalities we exhibit collectively as groups. It is important to note that the quirks we exhibit as a group are most definitely not a simple scaling up of our individual quirks.
Muddle also deals with the critical difference between the Knightian definitions of risk and uncertainty. This has major implications as we “muddle through” our new work reality of big data and artificial intelligence changing the role of the human in the workplace.
We are certainly going through a period of great uncertainty. Whether or not we want to admit it, we are experiencing a period of muddling through. Perhaps an even more disturbing thought is that the “leave it to the experts” experts are also for the most part muddling through.
“Reports that say that something hasn’t happened are always interesting to me, because as we know, there are known knowns; there are things we know we know. We also know there are known unknowns; that is to say we know there are some things we do not know. But there are also unknown unknowns—the ones we don’t know we don’t know. And if one looks throughout the history of our country and other free countries, it is the latter category that tend to be the difficult ones.” Donald Rumsfeld
Donald Rumsfeld was widely ridiculed when he made the above statement. His "known knowns, known unknowns, and unknown unknowns" may have made rich fodder for comedians, but perhaps it is time we admit that he was onto a very important concept.
In this day and age of digital assistants, an exploding internet of searchable knowledge, and expert pundits, we expect, and dare I say, demand, that someone knows something (and does something). The reality however is quite different. Most real experts will admit that the more they learn, and thus the more they know, the more they realize that they don’t know.
At times like this it is critical to look at the work of Philip Tetlock, whose work is highlighted in his very readable book (co-authored with Dan Gardner), “Superforecasting: The Art and Science of Prediction”. Tetlock has made a career of tracking the forecasting ability of a wide variety of experts. His results are surprising; experts are not very good at forecasting. In fact, they are terrible. One especially surprising, and even troubling result is that the more confident and the more professional the expert is judged to be, the worse their forecasts turn out to be!
We cannot simply rely on experts in even simple times, much less complicated and complex times such as now. Asking Alexa what to do is not going to work, except perhaps for telling us which stores are currently stocked with toilet paper. Sadly, hoarding toilet paper is not going to be the ultimate solution to our current situation.
Collectively we need to appreciate the wisdom of Rumsfeld's statement and realize, regardless of our opinions of his politics, that his "unknown unknowns" is a concept we need to accept and embrace at the present time. However, we also need to appreciate that he commented on "known knowns" and "known unknowns" as well. This is where a knowledge of systems thinking, and of the difference between complicated and complex, comes into play.
Complicated things work by the rules of physics or mathematics. They are certainties and are completely reproducible; if you do the same actions, you get the same outcomes. Complex issues, however, are based on people interacting and adapting. Complex issues are not reproducible. They exhibit emergence, tipping points, feedback loops, multiple scales and levels, and many other properties that lead to "unknown unknowns".[2]
Complicated systems are easy to manage. You apply the rules or laws governing the system. Complex systems however require a radically different approach. Trying to “solve” a complex problem like it is a complicated one almost always creates negative unintended consequences. It is my fear that in trying to “solve” the COVID-19 problem, we are creating even bigger problems with even longer lasting negative consequences.
"The problem with the world is that the intelligent people are full of doubts, while the stupid ones are full of confidence." Charles Bukowski
In my book "It's Not Complicated", I suggest three simple guidelines for managing complexity. The first is to recognize what type of problem you are dealing with: is it complicated, or is it complex? The second guideline is to think "manage, not solve". The third is to implement a "try, learn, adapt" approach.
It is the height of folly and arrogance to state that you have a solution to a complex problem. No one has a solution to a complex problem; hence the need to "manage, not solve". However, I believe that in hindsight the COVID-19 crisis is likely to be seen as a mismanaged complex problem, compounded by a crisis of social media hysteria and "solution screaming".
Complicated style decisions are being made, and defended, based on incomplete and fuzzy data. Our search for “the” solution is massively counterproductive. Furthermore, the increasing polarization of society – another complex social media effect in my opinion – means that when new data and new ideas do come forth, they struggle for an audience due to the entrenchment of existing views. It is almost as if we as a society never learned Bayesian analysis, or one of the key lessons pointed out in Superforecasting, namely to “update your beliefs”.
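The Bayesian idea of "updating your beliefs" can be made concrete with a small sketch. The function name and the numbers below are purely illustrative (my own assumptions, not from any source above); the point is that Bayes' rule gives a disciplined way to revise a prior belief when new evidence arrives, rather than entrenching in an existing view.

```python
def bayes_update(prior, likelihood_h, likelihood_not_h):
    """Return the posterior P(H|E) given a prior P(H) and the
    likelihoods P(E|H) and P(E|not H), via Bayes' rule."""
    # Total probability of seeing the evidence E at all
    evidence = likelihood_h * prior + likelihood_not_h * (1 - prior)
    # Bayes' rule: P(H|E) = P(E|H) * P(H) / P(E)
    return likelihood_h * prior / evidence

# Illustrative example: start 70% confident in a hypothesis, then observe
# evidence that is twice as likely under the alternative (0.60 vs 0.30).
# A rational belief should weaken, not harden.
posterior = bayes_update(prior=0.70, likelihood_h=0.30, likelihood_not_h=0.60)
print(round(posterior, 3))  # → 0.538
```

The belief drops from 70% to roughly 54% — not to zero, and not stubbornly held at 70%. That measured revision, repeated as each new piece of data arrives, is exactly the habit that Superforecasting describes and that polarized debate discourages.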
“In times of change, learners inherit the earth, while the learned find themselves beautifully equipped to deal with a world that no longer exists.” Eric Hoffer
The COVID-19 crisis is likely to change a lot of things. It already has. We need to change and learn as well. We need to start thinking more critically. We need to start realizing that social media, like all tools, can be great if used properly and harmful if used carelessly. We need to appreciate the difference between risk and uncertainty as models and AI are increasingly implemented in complex situations they are totally unsuited for. We need to learn and appreciate the difference between what we know, what we think we know, and what we don't know. We have to come to grips with, and even celebrate, the fact that our world is part complicated and many parts complex. We have to continue learning. In short, we need to muddle through.
Dr. Rick Nason, PhD, CFA
[1] https://dictionary.cambridge.org/dictionary/english/muddle-through
[2] For an excellent short summary of the properties of complex systems see: https://www.cecan.ac.uk/sites/default/files/2018-06/The%20Visual%20Communication%20of%20Complexity%20-%20May2018%20-%20EcoLabs.pdf