Friday, July 23, 2010

another lesson in less is more?

The US and South Korea are gearing up to show North Korea some military muscle - and North Korea is gearing up to throw some version of a temper tantrum in response.

For now I'm giving the US State Department the benefit of the doubt that its planned military exercise will have some short- or long-term positive impact on North-South relations, but I'm having a hard time seeing it. I suspect, instead, that the US may be a victim of the "do something" bias - the belief that doing something is always better than doing nothing. But sometimes doing nothing is the better way to go.

in praise of stovepipes

The Washington Post's two-year investigation, which laid out the vast complexity and "overgrowth" of the US intelligence system, has generally been received with gasps of disbelief and sighs of despair. Certainly, at first glance one wonders how 1,271 government organizations and 1,931 private companies could possibly coordinate activities and efficiently share information. Surely they must be doing duplicative work. Even worse, if the agencies aren't all sharing information and letting each other put the pieces together, surely terrorists will slip through their fingers.

But are duplication and stovepiping really bad things? A couple of other familiar concepts suggest not: markets and groupthink.

In markets, when multiple companies are working to develop smartphones, no one complains about duplication of work. Of course not; we all understand the advantages of a market where firms compete to bring us the best product. Even with competition for the almighty dollar taken out of the picture, there are plenty of examples where we prefer to let many entities try out different solutions, knowing that some will work out better than others. Indeed, that's one of the advantages of having 50 states acting as laboratories for public policy.

But building a better smartphone or experimenting with public policy is not the same as piecing together information to stop potential acts of terrorism, one might respond. With smartphones and policy, there's no saying in advance which version is best, so trying out many kinds and seeing what sticks makes sense. But with terrorists, you're not looking for theoretical terrorists - there are actual ones out there that you need to find, and fast. True, but while you can't experiment with hypothetical terrorists, you can experiment with how best to find them. Finding a terrorist needle in the haystack is as much art as science; it may be to our advantage to have many groups of spooks putting their minds together in different ways - the more creative teams out there, the more likely one will find the bad guys before they find us.

Admittedly, the fact that so many agencies don't have access to each other's intelligence is more of a problem - but it could also be a good thing. Psychologists have long known that we have a number of biases that lead us to believe misinformation: "groupthink," "social proof," and "confirmation bias" all predispose us to believe what others believe (without deeply questioning it) and to accept only what fits our existing theories. Even more oddly, our instinct is simply to believe things - so once we've heard something, even if we think "that can't be true," part of our brain takes it as truth. All of this adds up to one of our great cognitive traps - "the inadvertent acceptance of the nearly correct," as Janet Metcalfe calls it. In other words, we assume more information equals more knowledge, but more information can paradoxically lead to overconfidence: even as our actual knowledge plateaus or declines, we believe we know more.