Yesterday I spouted off loosely about some ideas from behavioral economics (a mix of economics and psych), but I didn't mention where I'm picking them up. They are being gleaned from a course I'm auditing at Harvard, taught by Sendhil Mullainathan, who would probably cringe to know how I'm bastardizing (aka dumbing down) the theories of minds miles greater and more subtly nuanced than mine. Let's just hope he's not reading.
The particular concept I was writing about - that we tend to stick to our beliefs by spinning new information in a way that reinforces those beliefs - is related to a very cool concept that Sendhil was talking about a couple weeks ago. It's so cool, it's worth mucking up again from my memory.
Turns out our propensity to stick to our guns may not just be a self-serving, ego-stroking mechanism to make us feel good. It could be that it's just a by-product of a couple of logical flaws - or "biases," as behavioral economists term them - that we humans have.
Here's the bias in question: we like to see patterns. Or to be more precise, we like to see information we know about the big picture show up in the little picture.
Here's an example: If you were asked - out of all the families that had five kids in the US, what percentage had them in this pattern - BGGBG - vs - GGGGG - you'd probably say the first pattern was more frequent than the second (as does almost everyone who's asked this question). A professional statistician, however, would tell you that both patterns are equally probable - and they'd be right. The reason you (and I) would guess the BGGBG pattern is because, well, we know that there are about 50/50 girls to boys and so it's just really, really weird to see 100% girls in one family. That's because you and I make the error of thinking just because the odds of something happening in an entire population are 50/50, those same odds should exist in a small sample (they don't). If you really wanted to test this theory out, you could flip a coin a few thousand times and see that any string of five flips (say, HTTHT) is no more likely than another (like HHHHH).
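You don't even need a few thousand real coin flips - a quick simulation makes the point. Here's a minimal sketch (the 100,000-family sample size and the fixed random seed are my own choices for the demo): simulate lots of five-child families where each birth is an independent 50/50 draw, and count how often each specific pattern shows up. Both land near 1/32, the probability of any particular five-birth sequence.

```python
import random

random.seed(0)  # fixed seed so the demo is reproducible

N = 100_000  # number of simulated five-child families
counts = {"BGGBG": 0, "GGGGG": 0}

for _ in range(N):
    # each child is independently a boy or a girl with probability 1/2
    family = "".join(random.choice("BG") for _ in range(5))
    if family in counts:
        counts[family] += 1

# any *specific* sequence of five births has probability (1/2)**5 = 1/32
print(counts["BGGBG"] / N, counts["GGGGG"] / N, 1 / 32)
```

Both observed frequencies come out close to 0.03125. What feels more likely about BGGBG is that it belongs to the big category of "mixed" families, and there are many more mixed sequences than all-girl ones - but any one exact sequence is as rare as any other.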
One place this bias shows up is in something called "the gambler's fallacy." If I ask you to bet on the flip of a coin, it's 50/50 whether you'd bet heads or tails, right? But if we do a few rounds of coin tosses, and heads came up three times in a row, what would you bet the next toss would be? Tails, right? (Be honest.) You, like 99% of humanity, think "Well, I know that half of all coin tosses turn up heads and half tails, so if I've seen a lot of heads, a tails has to turn up soon, right?" Wrong. It's the same bias as above - we think the fact of 50% heads showing up should be seen in small samples as well as large samples. Again, it ain't the way it happens.
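We can check the "tails is due" intuition directly with another little simulation (again, the 200,000-flip run and the seed are just illustrative choices): flip a fair coin many times, find every run of three heads, and ask how often the *next* flip is heads. The coin has no memory, so the answer sits right around 50%.

```python
import random

random.seed(1)  # fixed seed for reproducibility

flips = [random.choice("HT") for _ in range(200_000)]

# after every run of three heads, record what the next flip was
after_hhh = [flips[i + 3] for i in range(len(flips) - 3)
             if flips[i:i + 3] == ["H", "H", "H"]]

# the coin doesn't "owe" us a tails: heads still comes up about half the time
print(after_hhh.count("H") / len(after_hhh))
```

The printed fraction hovers near 0.5 - three heads in a row tells you nothing about flip number four.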
But there's another bias gamblers have which seems to be the opposite of the gambler's fallacy. This theory has its own special name too: the "hot hand." Like its name suggests, the "hot hand" says that gamblers will often fall under the illusion that they are on a roll, having a lucky streak. If we were still playing our coin toss game and you kept seeing heads, you might begin to think - "okay, let's keep putting our money on heads." How can this be so? How can a gambler go from having a bias of thinking that the winds have got to shift to thinking their wave is going to ride out forever?
The two biases aren't as contradictory as they seem. In fact, they could be different manifestations of the same bias. The difference is only in the pattern that the gambler expects to see. If he's looking for the pattern of HTHTHT, he'll have the gambler's fallacy. If he thinks he's on a roll of HHHH..., he'll get a hot hand. The interesting question is when and how he'll make the switch from one to the other. Behavioral economists seem to believe that we'll stick with our expectations for one pattern a long time, long past the point when the evidence shows our expectations are way off. When the evidence builds up to a breaking point, though, we'll finally give up on our bias. Sadly, as the gambler's case suggests, we just jump to another set of expectations that are likely to be equally false.
That brings us back to yesterday's post. It's not just at the roulette table that we are this irrational. It's probably the case in our everyday beliefs. If we know something to be "true," we'll keep expecting (and looking for) evidence that supports our belief. It'll take a large heap of contradictory evidence (hurled at us successively at high speed) to unhook us from our beliefs. And when we finally snap out of it? Well, we just find a new belief - and pattern - to look for and, of course, continue to see.