
Mental Models: necessary, but incomplete… December 10, 2009

Posted by Graham Pingree in Sway: Pull of Irrational Behavior.

One of my favorite quotes from Sway comes from psychologist Franz Epting, on how humans use mental models to quickly digest information: “We use diagnostic labels to organize and simplify. But any classification that you come up with has got to work by ignoring a lot of other things – with the hope that the things you are ignoring don’t make a difference.” It underscores the inherent compromise in using mental models to interpret different situations: we need these shortcuts to function effectively and make decisions in real time, but they are necessarily built on incomplete information. The challenge becomes knowing when to adjust these models, or augment them with new assumptions or relationships. Almost all of the irrational decisions the authors describe throughout the book are traceable to an incomplete model, and the authors point to several behavioral traits (Loss Aversion, Value Attribution, Diagnosis Bias) that classify the different ways in which we rely on hidden assumptions or faulty reasoning.


Sway: The Irresistible Pull of Irrational Behavior December 7, 2009

Posted by Kavita Vora in Sway: Pull of Irrational Behavior, [Books] Leadership & Change.

In Sway, brothers Ori and Rom Brafman investigate what makes intelligent people make irrational decisions.  Drawing on their combined backgrounds, a Ph.D. in psychology and an MBA, they offer a unique perspective on irrational behavior and recommendations on how we can “resist the sway”.

Anyone who has paid for cell phone minutes they won’t use just because they are afraid of going over, or felt a connection with someone simply because that person showed signs of interest, or started out helping a coworker with good intentions but, once rewarded for it, no longer wanted to continue without another reward, has experienced how we get swayed in our decision making.

Psychological forces that influence irrational behavior:

  • Aversion to loss
  • Commitment
  • The Chameleon Effect
  • Diagnosis Bias
  • Incentive vs. Altruism

Loss aversion can lead to costly mistakes, as in the Tenerife air disaster of 1977.  Captain Jacob Veldhuyzen van Zanten led KLM’s safety course programs and had a spotless record of on-time flights.  In analyzing this tragedy, it has become clear that the captain made drastically different decisions than he would have if not under the pressure of avoiding a perceived potential loss.  As the stakes kept getting higher and higher, van Zanten took greater risks to avoid those losses.  In retrospect, the potential gain vs. loss was not weighed objectively, and he lacked an effective feedback mechanism from his crew.

The Brafman brothers also discuss their theory of the swamp of commitment, in which past success makes us want to use the same strategy over and over rather than incorporate feedback that may indicate a new strategy is needed.  They offer examples from sports and politics showing that the higher the potential loss, the more optimistic we become in hoping everything will be okay, rather than reevaluating our strategy.

I thought this was a great read, full of stories that bring the principles to life. I highly recommend Sway. My major takeaways were to (1) second-guess your first impression to stay open-minded, (2) focus on the long term to avoid short-term loss panics, (3) reevaluate your incentive strategy so it doesn’t derail altruistic motivations, and (4) be actively aware of your assumptions and paradigms.

Reflections on Sway December 7, 2009

Posted by allenb120 in Sway: Pull of Irrational Behavior.

I really enjoyed Sway: The Irresistible Pull of Irrational Behavior by Ori Brafman and Rom Brafman.  It was easy to read and conveyed useful information for identifying psychological factors in decision making, along with possible remedies, though it seemed to rely too much on anecdotes.  The thesis is that psychological factors can cause people to take irrational actions, but that recognition and corrective action can overcome their effects.  The factors the authors discuss are loss aversion, commitment bias, value attribution, diagnosis bias, fairness, and peer pressure.

Loss aversion is the tendency to go to great lengths to avoid possible losses.  The authors note that people have much stronger reactions to losing $1,000 than to winning $1,000, and that loss aversion increases the more that is at stake.  One example discussed is buying rental car insurance.  The authors note that this behavior is irrational because a standard car insurance policy would cover rental cars as well; yet faced with the uncertain expense of a catastrophic car accident, many people buy rental car insurance anyway.  Another example is buying flat-rate phone service, even though most people would save money by paying a per-minute charge; faced with the possibility of an excessive phone bill, most people play it safe and buy the flat rate.  In both situations, people could have saved money by stepping back and asking whether the additional expense was justified.
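The flat-rate example is really just arithmetic plus a psychological weighting. Here is a minimal sketch of the idea, with all dollar amounts and usage numbers invented for illustration; the 2x loss weight is the rough ratio reported in Kahneman and Tversky's prospect theory work, not a figure from Sway itself:

```python
# Compare a flat-rate phone plan against a per-minute plan, then apply a
# loss-aversion weight so months that exceed the flat rate "hurt" extra.
# All numbers below are hypothetical, chosen only to illustrate the bias.

FLAT_RATE = 60.00       # fixed monthly bill (assumed)
PER_MINUTE = 0.10       # per-minute charge (assumed)
LOSS_AVERSION = 2.0     # losses weighted ~2x relative to gains (prospect theory)

def expected_per_minute_cost(usage_minutes):
    """Objective average monthly bill under the per-minute plan."""
    return sum(m * PER_MINUTE for m in usage_minutes) / len(usage_minutes)

def felt_cost(usage_minutes, reference=FLAT_RATE):
    """Loss-averse 'felt' average bill: overruns past the flat rate count double."""
    total = 0.0
    for m in usage_minutes:
        bill = m * PER_MINUTE
        if bill > reference:  # a "loss" relative to the flat-rate reference point
            total += reference + LOSS_AVERSION * (bill - reference)
        else:
            total += bill
    return total / len(usage_minutes)

# Six months of usage with one heavy month (assumed data).
usage = [300, 400, 350, 500, 1200, 320]

avg = expected_per_minute_cost(usage)   # ~$51.17: per-minute is cheaper on average
felt = felt_cost(usage)                 # ~$61.17: but the one overrun makes it *feel* worse

print(f"average per-minute bill: ${avg:.2f}")
print(f"felt (loss-averse) bill: ${felt:.2f}  vs flat rate ${FLAT_RATE:.2f}")
```

Under these made-up numbers the per-minute plan is objectively cheaper, yet once the single overrun month is weighted as a loss, the flat rate "feels" like the safer buy, which is the irrationality the authors describe.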

Another irrational behavior discussed is diagnosis bias: a blindness to all evidence that contradicts an initial assessment of a person or situation.  This bias makes first impressions, as well as brands, very important, and it can skew our judgment.  The authors cite a study of NBA players which found that a player’s position in the draft had a greater impact on his playing time, trade prospects, and length of career than points per minute or other performance measures.  We can counteract this bias by being open and observant and by withholding judgment until necessary.

In terms of the goals of our course, I think this book applies to the “observation” and “solution” parts of the Innovation Process outlined in the Beckman and Barry article: it helps the observer understand the personal biases of both the observer and the observed, and it helps frame better solutions by taking into account how psychological factors may influence the implementation and acceptance of new ideas.  As a general matter, I felt the lessons of the book fit into the Assumptions and Mental Models component of the course.

Sway: The Irresistible Pull of Irrational Behavior December 7, 2009

Posted by James Bender in Sway: Pull of Irrational Behavior, [Books] Leadership & Change.

Ori and Rom Brafman do a great job of captivating the reader with amazing anecdotal stories that drive home the primary message of the book: people who normally make rational, intelligent, and calculated decisions can and will neglect glaring evidence and make a decision that negatively alters their lives. The wide swath of tales covers enough ground that all readers should experience the proverbial frying pan to the side of the head. Perhaps that is why I liked the book. The Brafman brothers state that all people do stupid things. Then you, as a reader, give them a chapter or two to prove their point, thinking that you’d never be caught doing something so foolish that it would alter the course of your life. You are then engulfed by the storytelling. And then, bam, the frying pan hits you square in the face, because you realize that one of the stories directly describes some recent foolish action of your own.

This book is outstanding in its own right. My interest was piqued by the book’s first sentence, which describes one of the world’s most experienced and accomplished pilots. Now, allow me to diverge. Aviators are an interesting set. Perhaps there are more technically difficult jobs in the world, but the confidence required to be an aviator keeps just anyone from signing up. Aviators must be relentlessly perfect. Surgeons and aviators have to have a certain swagger in their professionalism because mistakes are final. While not everyone knows an aviator, most people know surgeons and are inspired or disgusted by their approach to life and their interactions with people. I feel the same can be said about aviators.

My nine years previous to b-school were spent flying the US Navy’s largest plane that lands on the carrier. Just to reiterate the idea of finality: when landing on the carrier, I had four feet of centerline deviation before I started dragging my wingtip through parked planes. Off centerline, not only would I cause literally hundreds of millions of dollars of damage, but my likelihood of surviving that event would be slim at best. Plus, the 150 flight deck workers would be subjected to 150-knot flying metal shards. So my plane would be lost, 15 or so other planes would suffer severe damage, and many people would likely not survive. It was a stressful life. But in my mind (and in reality), there just wasn’t a better carrier aviator. I was the best. I had to be. Each time I was on final approach to the carrier, attempting to safely crash my plane onto the deck and praying that my tailhook caught a wire, I had to have nerves of steel and the confidence to accomplish the task at hand. Raining? Didn’t matter. Snowing and couldn’t see? Fly perfect instruments. Windy, with the carrier deck pitching plus or minus 19 feet and listing 9 feet? A varsity day, but toughen up. So as you can clearly see, that confidence is required, because beyond open-heart surgery, there are few jobs in the world where you absolutely must be perfect, every time.

So I found it interesting that the Brafman brothers would start with the Tenerife disaster, in which 583 people were killed when two planes collided on the runway. Deteriorating weather, crew-rest pressures, and a failed cockpit social structure allowed the holes of the proverbial Swiss cheese to align and this horrible mishap to occur. All the safety frameworks in place did not catch the circumstances surrounding this event. Many studies were conducted to better understand what went wrong and how to avoid such a mistake again. The interesting piece for me is that my years of flight school incorporated these safety improvements so well that I did not even realize it was happening. Allow me to diverge again.

When you start flight school, you first fly with an instructor. That instructor has thousands of hours of flying, and you do what they say when they say it. If you cannot perform simple maneuvers, talk on the radio, and understand the emergency procedures, you attrite quickly. There is a knowledge base that you must acquire and prove you can perform. After about 200 hours, however, a student knows his way around the jet and can safely get off the ground, zip around at 300 knots, fly formation, and land in challenging weather environments. The confidence is building, because flying a jet is a difficult task. Yet this idea of crew resource management is ever present. There is always someone somewhere who has information that you need. It might be that ATC knows there is a distressed plane behind you that needs to land first, altering your fuel planning. It might be that a fire-detection system has failed. It might be that your co-pilot sees a very small plane 20 miles out that is CBDR (constant bearing, decreasing range). It is not that any one of these things will cause a disaster. But add these circumstances up without the primary decision maker (the aircraft commander) understanding what is happening, and the holes of the Swiss cheese align. Experience, and the confidence (read: arrogance) that surrounds it, will never be a substitute for situational awareness. For this reason, it is imperative that the cockpit be an information depository. As an aviator, you must constantly look for new bits of information: a rate-of-descent reading, an altitude, a radar blip of another plane, an ATC radio call describing the weather at your destination. Flight is dynamic, and constant, relevant, updated information is paramount. The most important thing to understand here is that things happen quickly and pilots get saturated. Hence, CRM.
Crew resource management creates a communication environment in which each crew member is incentivized to speak up with relevant information. Had the social structure in the cockpit of the KLM flight been different, the deadliest commercial aviation mishap in history would have been avoided. However, the copilot (an amazing pilot in his own right) did not speak up when his more senior captain took an action that seemed reasonable yet deserved questioning.

The interesting piece is that, 20 years after the Tenerife disaster, CRM has even reached Naval Aviation. As a matter of fact, CRM was such an established standard that I didn’t realize there was once a time when it did not exist. Even as a US Naval Aviation Safety School graduate, I did not understand that there was a generation gap between myself and an aviator trained 10 years earlier. A single flight with my commodore (15 years my senior) made it painfully clear to me. I signed for the aircraft as the aircraft commander, despite the fact that I was seriously outranked. I was the more experienced aviator in this particular aircraft, but he had thousands more hours overall. We had an oil-pressure problem, and I wanted to land quickly. He thought it was a false reading and that we should press the 30 minutes back to base so he could make his next meeting. My aircrewman and I both agreed that landing ASAP was the best plan. The commodore pressed his case. I listened to his reasoning and then to the aircrew’s thinking. Because I had signed for the plane, I had 51% of the vote. The commodore was visibly upset when I declared an emergency and landed at the nearest airfield. However, his mood changed very quickly when, directly after landing, our right engine failed. His desire to make his next meeting had clouded his ability to think clearly about the circumstances surrounding our flight. Because he outranked me by 15 years (a lifetime in the Navy), he expected me to fold to his wishes immediately. However, CRM demanded that I accumulate as much information about our situation as possible from all members of the crew. Had the commodore signed for the aircraft, he would simply have pointed the plane toward home base and suffered an engine failure en route. Because I listened to the reasoning of all crew members, I was able to build my situational awareness (unlike in the Tenerife disaster) and make a timely decision. It was standard protocol for me to employ CRM in this case.
However, the commodore was not interested in my thoughts on the matter. I am happy this generation gap existed.

For me, CRM was a normal, necessary part of aviation. Yes, I was the best aviator to fly my aircraft; that confidence did not escape me. However, unlike the commodore, I knew that I could be better with complete information. Older, pre-CRM pilots believed their experience was more valid than any other input in the aircraft. It seems criminal in hindsight, but in the commodore’s era, the confidence/arrogance required to be a good aviator was a Swiss-cheese-hole-aligning mechanism, and it was encouraged. His unwillingness to employ CRM in his cockpit set up a social structure that was difficult to overcome. Had I not signed for the plane that day, giving myself 51% of the vote as aircraft commander, I wonder how my world would be different.

How do we Sway Irrational Actions with Rational Frameworks? December 6, 2009

Posted by Alison Zander in Sway: Pull of Irrational Behavior.

Sway was an interesting book about leadership and change, the kind of book you might pick up at the airport.  It was full of fun stories, many of them grounded in research.  The examples ranged from huge life-or-death situations to small grocery-shopping decisions.  I think the book did a good job of showing the large impact irrational decisions can have while also giving relatable examples.  It was very interesting to hear about the irrational decisions that are made every day, and it highlighted that anyone can act irrationally: it doesn’t matter what level of the organization you are at or what level of expertise you have, we are all susceptible to making irrational choices.  I wish the book had done a better job of outlining how to overcome these irrational tendencies, but now that I am more aware of this type of behavior, I will be able to recognize it in the future.

I encourage people to experiment with calling it out when they see irrational behavior.  Many times we don’t realize when we ourselves are being irrational.  Awareness might help all of us be less irrational, or it may just end in conflict; it is tough to tell.  This book made me think about the times when I myself have acted irrationally.  As MBAs, we try to see the world in a rational manner and are taught to make choices within that framework.  I wonder whether, if we assumed people were swayed to be irrational, we would come up with different conclusions…