|Let's go and blame somebody for this...|
Good things about this book include the discussion of the Chernobyl disaster and its appreciation that the operators were an expert team, not "stupid" people making mistakes. Dörner also made me realise that there are positive goals (something I want to make happen) and negative goals (something I want to prevent). In general, positive goals are better because they make planning easier. Dörner also suggests ways of dealing with multiple problems, including: finding the central problem(s), finding the most urgent or important problem(s), and delegating problems.
He also clarified the concept of "repair service" behaviour for me. This is when we don't take the time to find the central or most important/urgent problems and instead go out and find a problem, any problem. We solve it and then go on to find the next one. "Repair service" behaviour may be better than doing nothing, but it means that the most important problems are overlooked.
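The contrast between "repair service" behaviour and problem triage maps neatly onto a programming analogy (mine, not the book's): fixing problems in whatever order we stumble on them versus always working on the most important outstanding one. A minimal sketch, with made-up problems and importance scores:

```python
import heapq

# Hypothetical illustration (not from the book): "repair service" behaviour
# fixes problems in discovery order; triage fixes the most important first.
problems = [
    ("leaky tap", 2),            # (description, importance: higher = worse)
    ("reactor overheating", 9),
    ("flickering light", 1),
    ("budget overrun", 5),
]

# Repair-service behaviour: tackle problems in the order we notice them.
repair_service_order = [name for name, _ in problems]

# Triage: use a max-heap (negated scores) to always pick the worst problem.
heap = [(-importance, name) for name, importance in problems]
heapq.heapify(heap)
triage_order = [heapq.heappop(heap)[1] for _ in range(len(heap))]

print(repair_service_order)  # discovery order: the leaky tap comes first
print(triage_order)          # triage order: the overheating reactor comes first
```

Under repair-service behaviour the leaky tap gets fixed while the reactor overheats; triage inverts that, which is the whole of Dörner's point about finding the central problem before acting.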
[Image: "An indicator variable in a cage"]
Dörner also uses the concept of "ballistic decisions". These are "fire and forget" decisions which follow a given trajectory on an unchanging course. The alternative is the "rocket decision", whose trajectory is monitored and altered as new information is gathered. Bad planners make a lot of ballistic decisions and never follow up to see whether they were the correct ones.
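Dörner's distinction corresponds to open-loop versus closed-loop control. A minimal sketch (my analogy, with invented numbers, not code from the book): both planners try to steer a value toward a target while an unmodelled drift keeps pushing it off course; the ballistic planner picks one correction up front, while the rocket planner re-measures after every step.

```python
TARGET = 100.0  # assumed goal value for this toy example
DRIFT = 0.9     # unmodelled disturbance: the state decays 10% each step

def ballistic(state: float, steps: int) -> float:
    """Open-loop: choose one correction at the start and never look again."""
    correction = (TARGET - state) / steps
    for _ in range(steps):
        state += correction
        state *= DRIFT  # drift happens, but the planner never observes it
    return state

def rocket(state: float, steps: int) -> float:
    """Closed-loop: re-measure after every step and adjust the remaining course."""
    for remaining in range(steps, 0, -1):
        correction = (TARGET - state) / remaining  # fresh information each step
        state += correction
        state *= DRIFT  # same drift, but the next iteration sees its effect
    return state

print(abs(TARGET - ballistic(0.0, 10)))  # large final error: never corrected
print(abs(TARGET - rocket(0.0, 10)))     # smaller error: corrected each step
```

The rocket planner still misses the target (the drift acts after its last correction), but its error is a fraction of the ballistic planner's, which is exactly the follow-up behaviour Dörner says bad planners skip.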
In his final chapter "So Now What Do We Do?", Dörner explains the causes of mistakes:
- Slowness of thinking (not because we are dim-witted but because we are human)
- An ability to process only a small amount of information at a time
- Tendency to protect our sense of competence
- Limited capacity of our memory
- Tendency to focus only on immediately pressing problems
Dörner also spends nine pages exploring the HIV epidemic and the statistics surrounding it, which is not really what we need in a book on failure.
This book is let down a bit by Dörner's conclusion: "There is only one thing that does in fact matter, and that is the development of our common sense." It may be that this phrase has not translated well, but there are enough books and articles out there to show us that "common sense" is very frequently nonsensical. I would have liked to see this better explained, and perhaps a different choice of words.
Overall, a very good introduction to some of the theory behind complicated systems, with some good tips on how to stop ourselves from being overwhelmed by them.
According to Dörner, the following mark out good participants:
1) They make more decisions and more decisions per goal
2) They act "more complexly" (i.e. they appreciate that a complex system exists and therefore their actions need to be complex)
3) They generate hypotheses (bad participants generate truths) and admit ignorance
4) They ask "Why?"
5) They don't become distracted too easily but also don't become obsessed with something
6) They think ahead
7) They break complex problems or goals into intermediate problems or goals
8) They get the level of detail right, not too rough but not too fine
9) They plan. Planning is good, too much planning is bad, and sometimes you've just got to get stuck in (he refers to Napoleon's "We engage (the enemy) and then we see" and talks about the military strategist Moltke, but doesn't mention one of his best quotes: "No battle plan survives contact with the enemy")
10) They reflect on their own thinking and decisions.