No Solutions Because We're Irrational
No matter what certain economists may tell you, you know that people aren't rational in the choices they make. That much is obvious. And yet we forget it every time we come up with a "rational" solution, only to find that it achieves a totally different and unexpected outcome, all because people aren't rational!
For example, day care centers in Israel expected parents to pick up their kids by 4 pm. Sometimes parents were late, but not by much. Why? Because they had to interact with the teacher, who had to stay late on their account, and so they felt apologetic and a bit guilty. An experiment was conducted in which some of the centers imposed a fine for late pickups. The surprising consequence?
“In day cares where the fine was
introduced, parents immediately started showing up late, with tardiness levels
eventually leveling out at about twice the pre-fine level. That is, introducing
a fine caused twice as many parents to show up late. What about the remaining
four day care centers that remained fine-free? Tardiness didn’t change at all.”
Why did that happen? Because once parents paid a fine, their guilt vanished and was replaced by a sense of "Now that I am paying for it, I can be late"!
If you are paid to say something, would you necessarily start believing it? A rational person would think, "No, not if you have any integrity." But here's what Dan Ariely found:
“Studies
show that people quickly start believing what they are saying, even when they
are paid to say it. This is cognitive dissonance: doctors reason
that if they are telling others about this drug, they must believe in it
themselves — and change their beliefs to match their words.”
Remember Goebbels’
point that repeating a lie often enough will convince people it is the truth?
It’s worse than that: you can lie to yourself too.
But wait, it gets even worse (and stranger). In his book Errornomics, Joseph Hallinan pointed out that the seemingly obvious solution in cases like doctors recommending medicines they are paid to promote is to force the doctor to declare his stake in making the recommendation. Once you know of his possible bias, you'd be able to compensate for it. Or so the thinking went. What actually happened?
“Disclosing a bias doesn’t cancel its
effects… (Once they disclosed their stake/bias), advisors apparently felt they
had a similar license to pursue their own self-interest (with even fewer
constraints than when they hadn’t disclosed their bias!).”
This is called the "Hey, I warned you" principle. Roughly put, it means: now that you've been warned, I don't have to feel bad about pursuing what's in my interest!
No wonder half the
problems never get fixed: it’s because we are irrational, people!