Wednesday 13 March 2013

Book of the month: The Logic of Failure by Dietrich Dörner

Let's go and blame somebody for this...
Dietrich Dörner's book, translated from the German "Logik des Misslingens", starts off by showing us how difficult complex systems are to manage, using computer simulations of an African tribe and an English town as examples. Dörner then goes on to describe what makes some systems hard to handle: complexity, internal dynamics, intransparency, and incorrect or incomplete understanding. The book then veers a little off its logical trajectory to discuss Setting Goals, Information and Models, Time Sequences and Planning. He finishes off with a chapter entitled "So Now What Do We Do?". (If you just want to find out "How not to fail", skip to the bottom of this post.)

Good things about this book include the discussion of the Chernobyl disaster and its appreciation that the operators were an expert team, not "stupid" people making mistakes. Dörner also made me realise that there are positive goals (something I want to make happen) and negative goals (something I want not to happen). In general, positive goals are better because they make planning easier. Dörner also suggests ways of dealing with multiple problems, including finding the central problem(s), finding the most urgent or important problem(s), and delegating problem(s).

He also clarified the concept of "repair service" behaviour for me. This is when we don't spend the time to find the central or most important/urgent problems and instead go out to find a problem, any problem. We solve this problem and then go on to find the next problem. "Repair service" behaviour may be better than doing nothing, but it means that the most important problems are overlooked.

An indicator variable in a cage
Dörner helped me with another couple of definitions: critical variables (which influence many other things in a system) and indicator variables (which are influenced by many variables but do not affect much themselves). In a coal mine a critical variable may be the compressive strength of the tunnel supports, while an indicator variable would be the canary.

Dörner also uses the concept of "ballistic decisions". These are "fire and forget" decisions which follow a given trajectory on an unchanging course. The alternative is "rocket decisions", whose trajectory is followed and altered as new information is gathered. Bad planners make a lot of ballistic decisions which they never follow up on to see if they were the correct ones.

In his final chapter "So Now What Do We Do?", Dörner explains the causes of mistakes:

  1. Slowness of thinking (not because we are dim-witted but because we are human)
  2. The ability to process only a small amount of information at a time
  3. The tendency to protect our sense of competence
  4. The limited capacity of our memory
  5. The tendency to focus only on immediately pressing problems



A couple of things that could be improved: on p.20 Dörner has a graph showing a good and a bad participant in the English town computer simulation. The starting points for the two participants are the same, but their starting satisfaction scores are different; it is unclear whether this is just a printing error or has some other cause. Additionally, a graph on p.126 refers to a "heavy black line" which is not to be found.
Dörner also spends 9 pages exploring the HIV epidemic and the statistics surrounding it, which is not really what we need in a book on failure.

This book is let down a bit by Dörner's conclusion: "There is only one thing that does in fact matter, and that is the development of our common sense." It may be that this phrase has not translated well, but there are enough books and articles out there to show us that "common sense" is very frequently nonsensical. I would have liked to see this better explained, and perhaps a different choice of words used.

Overall, a very good introduction to some of the theory behind complex systems, with some good tips on how to stop ourselves being overwhelmed by them.


So, how not to fail? According to Dörner, the following mark out good participants:
1) They make more decisions and more decisions per goal
2) They act "more complexly" (i.e. they appreciate that a complex system exists and therefore their actions need to be complex)
3) They generate hypotheses (bad participants generate truths) and admit ignorance
4) They ask "Why?"
5) They don't become distracted too easily but also don't become obsessed with something
6) They think ahead
7) They break complex problems or goals into intermediate problems or goals
8) They get the level of detail right, not too rough but not too fine
9) They plan. Planning is good, too much planning is bad, and sometimes you've just got to get stuck in (he refers to Napoleon's "We engage (the enemy) and then we see" and talks about the military strategist Moltke, but doesn't mention one of his best quotes: "No battle plan survives contact with the enemy")
10) They reflect on their own thinking and decisions.
