It's been quite a while since I've put up anything new here. Of course, as busy as the past month(s) have been, that wouldn't account for the delay as much as the lack of an interesting topic, and procrastination. In this post, I've decided to bore you with a review of an interesting book I read last week.
I just finished reading "Think Again: Why Good Leaders Make Bad Decisions and How to Keep It from Happening to You". It documents a lot of interesting work done by the authors Finkelstein, Whitehead and Campbell (you can get it here). Finkelstein has been an authority on strategy, leadership and the warning signs of corporate disasters; but I didn't know that when I picked this up from my neighbouring colleague at work.
The book has interesting material on how our brain processes information to arrive at decisions, and just so that none of it sounds too far-fetched, it offers ample examples to support each of its findings. Of course, the foundation of all of it is that we learn from experience how to handle situations, and this experience teaches us to make prudent decisions as we grow more experienced. However, as the authors show very convincingly, such experiences can also prejudice rather than enlighten, biasing us towards wrongful decisions that may seem perfectly correct at the time they are made.
They have analyzed a large collection of decisions made by influential people over the past few decades to arrive at benchmarks that separate good decisions from bad ones. Every decision carries an inherent risk, and no matter how good it may seem at the time, it might turn out badly in the future: some people may just be plain unlucky. The authors have done well not to include such decisions, but to limit their research to those that were flawed at the time they were made and could (rather, should) have been averted.
The findings identify four sources of error:
- Misleading Experiences: past experiences that seem similar to the current situation but actually are not, because we have underestimated a vital difference between the two scenarios
- Misleading Prejudgments: Experience has a strange way of teaching us to think less and less as we gain more of it. The course of action seems to become "obvious" to us without our thinking it over. This occurs due to heuristics that are developed and strengthened with age, and most often these heuristics or biases have strange ways of defying reason.
- Inappropriate Self-Interest: Everyone is selfish. Even those who claim they aren't are selfish for the appreciation that their perceived selflessness would generate. Of course there are exceptions, but one of the major factors blinding us from making objective decisions is an inappropriate self-interest. Sometimes, if thought out carefully, actions in this direction actually work against self-interest. Case in point: in a survey where a random collection of people were asked "If you were sued by someone else and they lost the case, should they pay your legal costs?", 85% answered yes. But when asked "If you sue someone and you lost the case, should you pay their legal costs?", only 44% answered yes. It is evident that self-interest clouded the objectivity of the decision being made.
- Inappropriate Attachments: emotional or other attachments have a way of clouding judgement with irrational thinking.
Of course there are many other factors that come into play, the most insightful of them being short-term returns, which our brain weighs more heavily than long-term returns, especially when it comes to matters involving money.
Three good cases in point which I quote from the book:
The Iraq decision and Tony Blair:
Prior to the Iraq decision, Tony Blair had three relevant but potentially misleading experiences: in the Balkans, in Sierra Leone, and in Afghanistan. In all three cases, military intervention or a credible threat of military intervention succeeded, or appeared to succeed, in resolving the situation. For example, Tony Blair personally persuaded a reluctant President Clinton to threaten the Serbian leader, Milosevic, with military invasion if he did not back down over Kosovo. The threat worked: Milosevic backed down and was then overthrown in an internal political coup. The policy of making a credible threat of military intervention worked. These earlier experiences encouraged Blair to support military intervention in Iraq. He was certainly more enthusiastic than the British foreign secretary or the chancellor of the exchequer, neither of whom had such personal involvement in these earlier events. Of course, these earlier experiences were not inherently misleading; they just turned out to be so because the situation in Iraq was different in some important ways. Most prominently, Iraq was a fractured society that had been held together only by brutal force.
The Bay of Pigs, Kennedy and Cuba:
One of the first decisions Kennedy made when he became president was to overthrow Fidel Castro's regime in Cuba. The plan, developed by the CIA, called for an invasion by Cuban exiles, with US military support including air cover and paratroopers to secure the approaches to the landing beaches. The forces were meant to join up with other Cubans opposed to Castro. Kennedy wanted the US involvement to be "deniable", so he insisted that the operation be undertaken entirely by Cubans and that the landing take place at night in an area with little opposition. The only option was the Bay of Pigs. The operation was launched in April 1961. It was a disaster. The plan had been leaked, and Castro's forces quickly closed in, preventing the invaders from leaving the beaches. Despite US airstrikes, all the rebels were captured or killed within three days. Kennedy was forced to negotiate for the release of the survivors, and it was a political setback. So what was wrong with the decision? The list is long. Kennedy's plan was based on his prejudgment that visible US involvement was unacceptable. His changes, made for political reasons and without the advice of generals, condemned the operation militarily. In addition, there was an element of self-interest in his decision to go ahead with the plan: while he had personal misgivings about the whole enterprise, Kennedy was under domestic political pressure to do something about Cuba and was accused by opponents of being weak.
The UK chemical company and Russia:
A large UK chemical company was considering a major investment in Russia. The chairman was concerned that the CEO and the regional management team had become overcommitted to the project. In this case, the chairman had not done a formal red flags analysis, but if he had, he might have been concerned about prejudgments. For example, he was concerned that local managers had presumed that the market was attractive. He was also concerned about attachments: the local managers had close and difficult-to-unwind relationships with local partners. The chairman also recognized that his own thinking could be biased. He had previous experience of losing money in Russia and had recently been briefed about deteriorating relations between Russia and the UK. What happened in the end, as a result of many meetings and exercises, was that the investment plan was wisely dropped.
The safeguards that the authors suggest against bad decisions are monitoring, group debate and discussion (there is minimal probability that a bias or prejudice held by one person will be shared by many others, unless it is worthy of influencing the decision process), data and experience (as against intuition and heuristics), and continuous governance.
It has been an enlightening book to read.