Saturday, September 25, 2010

why I'm an evolutionary psychologist

An unabashed confession: I am an evolutionary psychologist.

This is not to say I have a degree in psychology or evolution. But when it comes to understanding how we humans behave and why we do the things we do - in our families, at work, at play, in government and even at war (or especially at war) - I start with evolutionary psychology. Which is to say I ask, "What would a hunter-gatherer do?"

Many smart people, of course, prefer to start with the question "What does our culture dictate we do?", arguing that our behavior is mostly shaped by societal rules that aren't innately human. Many also say that our sophisticated cultures and brains make the question of how our former hunter-gatherer selves behaved irrelevant.

I don't disagree that culture matters and that our brains add considerable complexity and flexibility to our behavior, but for me the behaviors that evolved while we lived off of hunted game and foraged roots usually have the upper hand.

Two reasons are enough to convince me. First, when it comes to the meaty stuff, we humans act pretty identically across all continents and societies; we fall in love, get jealous, are ambitious, protect our children, etc. Sure there are differences around the edges (who it's acceptable to fall in love with, whether we simmer or explode when jealousy sparks), but our core behavior is remarkably universal. Since it's too big a coincidence that humans in all cultures "just happen" to be similar, the more likely explanation is that there's something in our DNA that makes us do the things we do.

The other reason you'll get farther asking how a hunter-gatherer would act has to do with the sheer amount of time we spent in the bush. If you consider that humans have been around for 2 million years (in our current form as Homo sapiens for 250,000 years), and that we've only been agricultural for about 10,000 years, industrial for 150 years and "post-industrial" for 50 years, our modern "culture" has - at best - been around for 2% of the time we humans have been on earth, and that's assuming our culture goes back to Sumer and we're only talking about Homo sapiens. If you prefer to look at the earliest humans and think modern culture started more around the time of the Enlightenment, then our modern selves account for only 0.025% of our species' existence. Imagine the length of a football field as the history of human life; we've been modern for about an inch.
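
For anyone who wants to check the arithmetic, here's a quick back-of-the-envelope sketch in Python. The year figures are my own rough assumptions - Sumer about 5,000 years ago, the "modern" era about 500 years - chosen to match the rounding above:

    # Rough timeline arithmetic; all year figures are rounded assumptions.
    HUMAN_YEARS = 2_000_000   # rough age of the human lineage
    SAPIENS_YEARS = 250_000   # anatomically modern Homo sapiens
    SUMER_YEARS = 5_000       # "culture" counted from Sumer
    MODERN_YEARS = 500        # "culture" counted from the Enlightenment

    print(SUMER_YEARS / SAPIENS_YEARS)    # 0.02    -> about 2%
    print(MODERN_YEARS / HUMAN_YEARS)     # 0.00025 -> 0.025%

    FIELD_INCHES = 100 * 36               # a football field is 3,600 inches long
    print(MODERN_YEARS / HUMAN_YEARS * FIELD_INCHES)  # 0.9 -> about an inch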

If you do think evolution has something to do with our behavior, then 99% of our evolution happened when we were living in small bands, gathering roots and wild fruit, and hunting small prey. The culture we see around us is a relative gloss. That's not to say we aren't still evolving - we are, but not at a pace fast enough to reverse what came 2 million years before.

The best example of how behavioral evolution can't keep up with the times is contraception. Pretty much up until the last 50 years, when people had sex there was a good chance a baby would follow. If you were a man who was partnered off with a woman and raising her children with her - or expecting to - you'd likely get pretty peeved, or enraged, if you found her in the arms of another man. This made good evolutionary sense; men who didn't care about who their women were lounging with would more likely end up raising other men's children. In evolutionary theory, if your DNA doesn't make it to the next generation you've lost. So the gene that says "sure, man, you can philander with my girlfriend" would likely die out while the "I will pummel you if you touch my woman" gene would proliferate. Thus infidelity leads to rage.

Fast forward to the late 20th century, however, and the equation "sex=babies" is no longer true. Condoms and the pill have made baby-making a rare byproduct of sex. And yet, men in the modern age still fly into rages - or merely fall out of love - if they find their girl with another guy. But that doesn't make any evolutionary sense anymore. The chump has almost no risk of raising the other guy's kid as his own; he's lost nothing - why should he care if his lady is having some fun on the side? But it does make evolutionary sense - if you keep in mind that "behavioral evolution" is still largely left over from our hunter-gatherer days. The pill can change society, but it can't change 2 million years of evolution.

Thursday, September 23, 2010

two respected news agencies - two realities

The New York Times was fairly confident about what was going on with Chinese exports of rare minerals to Japan:

"Sharply raising the stakes in a dispute over Japan’s detention of a Chinese fishing trawler captain, the Chinese government has blocked exports to Japan of a crucial category of minerals used in products like hybrid cars, wind turbines and guided missiles."

Its sources were unidentified "industry officials" and a minerals consultant in Australia. The Times admitted that the Chinese Commerce Ministry denied any ban, but that was easily explained: if the Chinese had an "official" ban, then Japan could run to the World Trade Organization and make a beef.

It all seemed perfectly plausible. China, after all, doesn't like being pushed around by other nations, least of all Japan, and it doesn't have the most stellar record of playing by WTO rules (if you're coming from an American perspective, of course).

But perhaps it was the Commerce Ministry's spokesman's quote on the subject that made the Reuters reporters wonder if something else was going on. "I don't know how the New York Times came up with this, but it's not true. There are no such measures," Chen Rongkai said. For a statement from a Chinese official, it does have a whiff of sincerity.

Reuters talked to some rare earth traders in China and Japan, none of whom had heard of the ban. One in Japan, however, had heard rumors.

Recapping: there's no "official" ban and no one directly affected by the ban knows that it exists. So is there a ban or not?

Reuters helps us out by explaining that there are quotas on exports of the rare minerals, but those were in place long before the fishing boat incident. Being that it's the end of the year, those quotas would naturally be tapping out. That might explain why customs officials are stopping exports of rare minerals - if, indeed, they are doing so, which is not clear. (Quotas on rare minerals that account for 93% of the world's supply may be a problem in their own right, but that's not the same thing as saying China is using quotas to retaliate against Japan.)

The mind-spinningly different accounts from the Times and Reuters could be explained in a couple of ways. The Reuters reporters naively accept the word of government ministries and don't dare to dig past a few questions to some Chinese and Japanese import-exporters. Or, the New York Times has an imagination. I'm inclined to believe the latter.

But, in the Times' defense, "imagination" is not a rare and fantastical thing; it's something we all do all of the time but are almost never aware of. Our social psychologist and behavioral economist friends call it a host of things: confirmation bias, attributional bias, representativeness heuristic, auto-association.

Here's how it works. First, we get bits of information: China and Japan are engaged in a tiff about a captured fishing boat; China's prime minister makes some blustering statements about possible retaliation; and at about that same time, crucial minerals stop being shipped to Japan (I'm assuming this last part is true to give the Times the benefit of the doubt). If you're a journalist covering China, this set of data will set off a familiar pattern in your brain: "Ah, yes, I've seen this before - China uses trade as a political bullying measure." Forget about whether there's any proof that the two events - prime minister bluster and trade drying up - are connected; in the mind, the link has already been primed. Then some "industry officials" (still unidentified) tell you that the Chinese customs agency has stopped exports to Japan (but not to other Asian countries). You talk to a consultant who confirms the story, and voila - case closed.

At this point, you're locked in. The pieces fit together so nicely that there's no sense in investigating further, say by calling some of the trade companies in Japan and China to see if they knew of the ban.

What doesn't happen - for Times journalists or for you and me, by and large - is that we stop and wonder if our past impressions are setting us up to believe a certain version of a story. If we did, we might muck around for evidence to disprove our conclusions. But it's such a good story... how could it possibly not be true?

Wednesday, September 22, 2010

get over your rational self

Yesterday I was overjoyed to find that the CIA and I were on the same page - at least when it comes to understanding the deceptive practices of the human mind.

My one and only reader of this blog (let's call him Harry) emailed to let me know that he had an issue with the CIA's quotes, however. In talking about the mind, Harry wrote, the CIA was making that old faulty distinction between "body and mind", as if the mind could operate, Vulcan-like, unaffected by our body's emotions and desires.

I'm so glad you brought that up, Harry! I don't think the CIA was falling prey to that error, but - yes, oh yes - the idea that we have a mind that can think independent of all those emotions swirling through our gut, heart and loins is, indeed, the number one deceptive practice of the human mind.

I probably don't have to convince anyone that humans are not always rational. (If you've spoken to one recently, that'll be evident.) But it may take a moment to digest the proposition that humans are never rational.

For those jumping up saying "define your terms, lady!", let's say that "rationality" is the ability to make a decision based on reason and logical thought alone. (Still pretty murky, I know, but stick with me.)

One of everybody's favorite stories on this topic is poor old Phineas Gage. A 19th-century railroad foreman, Gage was performing his duty of tamping down blasting powder one day when he accidentally sent a three-and-a-half-foot, one-inch-wide "tamping rod" through his skull, blasting out a small chunk of his brain. Miraculously, he not only survived but was up, walking and talking minutes after the accident (he later fell into a light coma for a couple of weeks before he made a full "recovery"). During the remainder of his life, he was appreciated primarily as a "freak," doing a few stints with Mr. Barnum. But as decades passed, psychologists became fascinated by reports of Gage's personality change. Before the accident he was seen as a level-headed, upstanding kind of guy; after, if you believe the reports, he was a trash-talking, irresponsible thug. His friends said he was "no longer Gage."

The rod, it seems, had shot through one of the parts of the brain that regulate emotions. Gage went through life with his "logical" mind intact; but with his emotional lobes gone kaplooey, he was no longer the sensible, rational guy he had been pre-blast.

Antonio Damasio, in his book Descartes' Error (the go-to tome on the mind-body fallacy, which was of course "Descartes' error"), talks about another poor brain-damaged fellow. Elliot was likewise missing his emotional faculties (e.g., gruesome pictures didn't seem to bother him) and was what one might imagine to be a Man of the Enlightenment - all logic and thinking. Yet no Spock was Elliot. He could explain social situations to you and spell out the consequences of actions just fine; the one thing he couldn't do was make a decision. Not that he was paralyzed in an angsty Hamlet way; he just didn't have any emotions to base a decision on.

This probably makes sense if you think about any big decision in your life. You might know you're a "go with your gut" guy, but if you're like many you may spend a lot of time mulling over the pros and cons of key choices (maybe you've even made a list?). No matter how detailed, thoughtful and logical the list, however, I'm betting that what finally tipped the scales was "an inner feeling." It really couldn't be any other way, if you think about it. Any decision is just going to have too many pluses and minuses to consider; how could you logically decide whether you should go to med school or study massage therapy when you have to factor in short term and long term financial considerations, quality of life, what your parents will say, what your boyfriend will say, etc.? Even if you could make a complete pro/con chart, how would you weigh each of the items? You'd have to decide how much those line items "mattered" to you, which is another way of saying you'd have to check in with your emotions.

So big decisions, yes, are emotion-wrought. But small decisions? Dean Shibata at the University of Washington was curious how much we depend on emotions even when deciding something as minor as fastening our seat belts. As scientists like to do nowadays, he hooked up an MRI to people's brains to see what parts would light up when asked about seat belts and when asked to do a little math. Unsurprisingly, the seat belt question lit up the lobe associated with emotions, which remained dormant during the math questions.

Some look at the evidence above (and charted in plenty of other studies, with and without MRIs) and conclude that rationality does exist, but that emotions are part of being rational. That's not a wrong way to see it, but it opens up a Russian dolls' worth of worms. If every decision rests on emotions as well as logic, what's to say one decision is "rational" and another "irrational"? You'd have to judge some emotions as rational and others not - but what would you use to make that judgment? Another set of emotions. You can see it doesn't really end.

If the thought that your powers of reasoning are forever shackled to your emotions bums you out, you are not alone. I fancy myself the queen of rationality, but I know that's a fanciful illusion. It's hard to see any other way, though. And all is not lost; even though we can't depend on logic and our analytical powers to lead us through life, they do help out a bunch. That is, until you run into the brain's other nasty deception...

Tuesday, September 21, 2010

cranium gazing CIA

I was planning to start a series of posts this morning on the topic of "how little we know our own minds," but a spate of Scrabble requests and other urgent matters kept me off task.

Then, by great fortune, I ran into this incredible CIA document that does my work for me! Just a few choice quotes from "The Psychology of Intelligence Analysis" to get you doubting your every thought:

"A basic finding of cognitive psychology is that people have no conscious experience of most of what happens in the human mind. Many functions associated with perception, memory, and information processing are conducted prior to and independently of any conscious direction. What appears spontaneously in consciousness is the result of thinking, not the process of thinking."

"Herbert Simon first advanced the concept of "bounded" or limited rationality. Because of limits in human mental capacity, he argued, the mind cannot cope directly with the complexity of the world. Rather, we construct a simplified mental model of reality and then work with this model. We behave rationally within the confines of our mental model, but this model is not always well adapted to the requirements of the real world."

"People construct their own version of "reality" on the basis of information provided by the senses, but this sensory input is mediated by complex mental processes that determine which information is attended to, how it is organized, and the meaning attributed to it. What people perceive, how readily they perceive it, and how they process this information after receiving it are all strongly influenced by past experience, education, cultural values, role requirements, and organizational norms, as well as by the specifics of the information received."

"This process may be visualized as perceiving the world through a lens or screen that channels and focuses and thereby may distort the images that are seen. To achieve the clearest possible image of China, for example, analysts need more than information on China. They also need to understand their own lenses through which this information passes. These lenses are known by many terms--mental models, mind-sets, biases, or analytical assumptions."

Couldn't have said it better myself, CIA! If we want to begin to understand anything out there in the world (our job, our mother, what the heck we should do about climate change), the first thing we need to get a handle on is the tricks of our brains.

Monday, September 20, 2010

two headed citizens


There are few things pundits and political strategists like more than a poll, whether it tells them which candidates are leading the race, how much the president's approval ratings have dropped or whether Americans are for or against building a mosque near Ground Zero.

But polls can be slippery things. For one, they give the illusion that those polled are of one mind - that people actually disapprove of the president's performance or that they believe a mosque has a right to be built in downtown New York.

Social psychologists (and their cousins, political psychologists and behavioral economists) will tell you that, in fact, people are usually quite ambivalent about their preferences and beliefs; that is, we are capable of holding two contradictory views in our heads at once (and often do). It's only when we're asked our opinion that one of those views comes to the fore. (A bit like Heisenberg's electrons.) And what makes one opinion bounce to prominence over the other depends on a slew of things: how the question is asked, what's been in the news lately, what your boyfriend said to you last night, whether you got a particularly good latte that morning, etc.

A couple of marketing profs give a good example of the first factor: it's all in how you frame the question. In a study that looked at the familiarity effect (which I'll leave for future posts), they asked people how fair and honest they thought Nike and Reebok were as firms. Depending upon how the question was put they got very different results, as you can see in the chart at top.

How'd that happen? The authors conjecture that people in general have stronger feelings about Nike than Reebok because Nike is a much more familiar brand - but those feelings are ambivalent. So when you ask which firm is more "fair and honest", you'll automatically light up the parts of people's brains that associate Nike with "fair and honest" and vice versa when you ask "unfair and dishonest."

This is a phenomenon known as "priming" and it happens to all of us all day. Rather than having brains with a set of static views, we're all a jumble of conflicting opinions and preferences. It's only when the pollster rolls around and asks us to commit one way or the other that we select the opinion whose neurons are firing more loudly. So are the pollsters getting our true opinion - or just the latest "primer" we ran across?

Sunday, September 19, 2010

when social science disappoints

The aim of this blog is to uncover studies from the social sciences that reveal potential failings in that great social experiment, US democracy.

But all too depressingly often I run into studies that end up revealing the failings of the social sciences themselves.

Case in point: A Chronicle Review article mentioned an incredible story about how easily duped we are by authority -

"In the early 1970s, a group of medical researchers decided to study an unusual question. How would a medical audience respond to a lecture that was completely devoid of content, yet delivered with authority by a convincing phony? To find out, the authors hired a distinguished-looking actor and gave him the name Dr. Myron L. Fox. They fabricated an impressive CV for Dr. Fox and billed him as an expert in mathematics and human behavior. Finally, they provided him with a fake lecture composed largely of impressive-sounding gibberish, and had him deliver the lecture wearing a white coat to three medical audiences under the title "Mathematical Game Theory as Applied to Physician Education." At the end of the lecture, the audience members filled out a questionnaire.

"The responses were overwhelmingly positive. The audience members described Dr. Fox as "extremely articulate" and "captivating." One said he delivered "a very dramatic presentation." After one lecture, 90 percent of the audience members said they had found the lecture by Dr. Fox "stimulating." Over all, almost every member of every audience loved Dr. Fox's lecture, despite the fact that, as the authors write, it was delivered by an actor "programmed to teach charismatically and nonsubstantively on a topic about which he knew nothing."'

Pretty stunning proof that we will blindly believe an "authority" even when he speaks utter nonsense, yes?

Well, maybe. Looking at the details of the study, a couple of near-gaping holes quickly stand out.

For one, the study broke the cardinal rule of scientific research; that is, it didn't have a "control group." In other words, the study looked only at how an audience judged a good lecturer delivering a nonsensical lecture; it did not also look at how a similar audience would judge a good lecturer delivering a sensible lecture. Had they done so, the experimenters might have found that the good feedback the phony lecture received wasn't so good when compared to the feedback for a non-gibberish lecture.

The other holes hint that that may be the case. First, the feedback forms asked mostly about the style of the presentation, not the content. For some reason the researchers didn't ask "Did the lecturer make a clear (or convincing) argument?" If they had, the experimenters might have found a big drop in positive feedback.

Second, the responders' free-form comments suggest that quite a few weren't buying the lecture. Most of the positive comments were, again, about the style of Dr. Fox. But when it comes to content, the responses show signs of bafflement: "Too intellectual a presentation. My orientation is more pragmatic. Did not carry it far enough. Lack of visual materials to relate it to psychiatry. Left out relevant examples. He misses the last few phrases which I believe would have tied together his ideas for me. Interesting, wish he dwelled more on background. Somewhat disorganized. Unorganized and ineffective." (There was one positive comment on content: "Good analysis of subject that has been personally studied before." I think we've all met this delusional guy before.)

But, you say, what can explain the good feedback this impostor got, if people did not actually believe he was legit? A likely explanation is that they were being polite or hiding their confusion. Think about it: how often have you sat through a talk that you didn't understand and then written on the feedback form either "I had no idea what he was talking about" or "That lecture was bullshit"? You probably either left the comment section blank or politely alluded to its inanity, saying something like "My orientation is more pragmatic."

Unfortunately, we won't ever have the true explanation because, deep sigh, this potentially brilliant study didn't ask the right questions in the right conditions.

Saturday, September 18, 2010

Down with justice!

A level-headed Anglican - and English professor at Wheaton College - found himself embroiled in an online sludge-fest about epistemological skepticism on a conservative Anglican website. After his hands started trembling so much that he could no longer type, he swore off online chat rooms and calmly contemplated why online discussions hurl us into hysteria.

His conclusion: we have a "hypertrophied sense of justice and an atrophied sense of humility and charity."

I won't advocate on behalf of humility and charity (though they are nice), but I do agree with the professor that our society is a little too hooked on justice. Not the justice of catching the criminal and locking him up; the justice that says there are "wrongs" that need to be "righted."

Now, justice has certainly played an important role in US history - the Declaration of Independence, ending slavery, the vote for women, the Civil Rights movement, to name a few notable examples. But that may be part of the problem; when we look back on history, many (if not most) of our proud moments were about "righting wrongs." Those are the nicest memories. Oppression ended; goodness and enlightenment won out over corrupt power and darkness. Somehow those other great moments - when Hamilton brokered a deal with the South in order to create a national bank, or when FDR ushered in Social Security - may be regarded as good things, but they just don't pull at the heart-strings the same way.

The problem comes when those same heart-strings try to mold every disagreement into a battle between "right" and "wrong". It would be nice if right and wrong were always as clear as "slavery v. no slavery," but unfortunately that's rarely the case. It's hard to imagine the annals of history glorifying the victory of "public option health care plans" or "a consumer credit protection agency"; important as those issues may be, we're not talking "taxation without representation" here. And yet, we oh so much want to make the sides of the debate just that stark.

Of course, sneaky political operatives are in part to blame; they know how to agitate the masses and will whip up a good-evil battle when they can. But they only can do so because our brains let them.

"Justice" is not a uniquely American passion. Humans come equipped with a sense of justice from birth. Social psychologists love studies that show how people will often sacrifice personal gain when "justice" is involved. The most famous study, the "ultimatum game", pits two people who are charged with sharing $10. One, the "giver," is given the 10 bucks and told to make an offer to the other on how much they will share. The "receiver" can either accept the offer, in which case the two walk off with their portion of the $10, or can reject the offer, which not only kills the deal but means both walk away with nothing. If you thought people only cared about money, you'd think it wouldn't matter how much the "giver" offered, the "receiver" would be stupid not to happy to accept it. Some money is better than no money. But, remarkably, "receivers" almost never accept less than $3. It seems, when people are offered only $1 or $2 out of the $10 pot, they'd rather go with nothing than to let the "giver" get away with almost all the cash. This innate sense of fairness holds true when the ultimatum game is played on any continent in any culture.

Behaviorists, of course, have their theories about why we're all natural Solomons. When we traveled in small groups it made sense not only to "play fair" and cooperate, but also to have a way to make sure others weren't taking advantage of you; instinctively punishing the guy who tried to get away with the biggest piece of bison was a good way to ensure he wouldn't do it again.

You've probably felt that instinct boil to the fore now and then. I can feel it brewing in a variety of situations: when a friend shows up 30 minutes late for a drink; when the French Parliament votes to ban burkas; when someone else doesn't give you credit for your contribution on a work project. It's all the same swirl of emotions that rides under the banner of "Wrong!"

In our day-to-day life, our sense of fairness may still help us out, but in politics it may cause more problems than it solves. It's not just that a heels-dug-in, right-versus-wrong attitude makes it hard for a nation to reach compromises on important issues. Going back to those clever politicos, our penchant for righteousness also makes it easy to rope us into sideline battles - Terri Schiavo, Ground Zero mosques - that distract us from the issues that matter.

Should we ditch "justice" then? Not entirely; it could still come in handy in places like Iran, Zimbabwe and Myanmar. And maybe someday we'll need it again in full force in the US; in the meantime I think we'll be better off putting "justice" to the side and concentrating on cooperation, mutual respect and patience. Once our hands stop trembling, maybe we can even get something done.

Thursday, September 2, 2010

the incentive spiral

There's a shortage of primary care physicians in America that's only bound to get worse.

As this Newsweek editorial points out, programs designed to reverse the worsening trend - such as tuition breaks for med students who focus on family medicine - have been shown to barely turn the wheel. The obvious solution - make it more profitable to be a family doctor - alas, seems out of reach. The problem is that health insurers look to Medicare to set their reimbursement rates, and the Medicare panel that sets those rates just happens to be stocked with specialists.

Newsweek doesn't see a politically feasible way out of this incentive spiral. To give more money to family doctors you'd either have to spend more money (not popular these days) or take money away from the specialists, which, Newsweek claims, would mean fewer patients getting the special procedures they need (never popular). But hold on - would people really be losing out on the care they need if there were fewer specialists? Not according to Maggie Mahar of the Century Foundation, who argues that what drives all the procedures specialists do is not the need of patients so much as the need of the specialists. In health care, as opposed to most markets, supply drives demand rather than the other way around. More money given to family doctors to sit around and talk through health issues with patients could also help nip health problems early on and avoid costly duplicative care. Any way you look at it, more money for primary care docs probably adds up to better care (or at least equally good care) and less money spent overall.

fungible facts

Met a lovely young man last night who was in town for a UN event on disabilities. Earnest and articulate, he explained how, unsurprisingly, the UN had a treaty (or protocol, or convention - whatever they call them) on disability rights and, even less surprisingly, the US had not officially signed on.

Okay, so we don't like being pinned down by a protocol, I said, but how's our track record on disability rights compared to other countries?

Not too good, he explained: "When you look at the employee disability cases that go to the courts, more than 90% are decided in favor of the employer."

Well, case closed. Apparently there are tons of people out there with disabilities who are not having their rights upheld.

But, I asked, couldn't you say the opposite? That since the courts are finding that most of the cases brought to them are without merit, clearly employers are by and large following the law? If you said, instead, that the courts were finding 90% of the cases in favor of the workers, then you'd have evidence that employers were disregarding their staff's disability rights.

Blank stare. I felt bad - he really was earnest and nice. So I back-pedaled: maybe it's not a matter of percentages, but total numbers, I suggested? If 90% of 100 cases brought to federal court were thrown out, then no biggie. But if 90% of 100,000 cases were tossed, then - yes - that would be something to be concerned about. He brightened up - and will now hopefully brush up on his facts and maybe even use them more carefully.