Thursday, December 16, 2010

more signs of a moderate America

Polls are fickle things. A few weeks ago, one was ringing the death knell for Obama's popularity - showing he had slipped below the fatal 40% approval bar. Pundits were ready to read into the low numbers, announcing that Obama's willingness to give in to Republicans was costing him the faith of Democrats while winning him no love from Independents and conservatives.

A new poll from the bi-partisan polling team of McInturff and Hart shows Obama's approval numbers at 45%. I'll try not to repeat the pundit mistake of reading too much into the tea leaves, but three observations come to mind.
  • First. Again, polls are fickle. They'll blip and swerve - even when you account for "margins of error." Reading anything into a single poll is just shy of senseless.
  • As the authors of the WSJ article point out, if you look at the trendline of Obama's approval ratings, they're pretty consistent for the last six months - and haven't changed that much since last year. This is somewhat remarkable given the state of the economy and ongoing joblessness.
  • What that says about Americans and Obama is an open question. Perhaps Americans are becoming more forgiving of leadership and less willing to heap blame on politicians just 'cause the economy sucks. But it could be that there's something about Obama's leadership which has people willing to give him the benefit of the doubt. Maybe his conciliatory, cross-the-aisle attitude is not as hateful to Americans as some have said. Who knows? Only time will tell how history rates our Compromiser in Chief.

Tuesday, December 14, 2010

Americans can compromise

Now we'll see if Congress can compromise too.

The Washington Post polled Americans on the four major provisions in the tax deal Obama negotiated with Republicans last week.

Asked about the provisions one at a time, only 11% of those polled approved of all four.

But when asked if they backed the entire package - with all four provisions bundled together - 69% gave it the thumbs up.

This, of course, is the nature of compromise. I'll take a bit of what I don't like in order to get some of what I do.

Politicians, however, have gotten into the habit of rejecting compromise as lily-livered flip-flopping. Hatching a deal with the opposition, according to some lawmakers, is akin to selling your soul.

Thank goodness Americans still recognize the sense in lily-livered compromise. Hopefully some of that common sense will rub off on their elected leaders.

Saturday, December 11, 2010

why are so many smart people just focused on making money?

Today someone asked that question on Facebook. I just had to answer...

Because we are social animals designed to seek power. Back in the hunter-gatherer day, power could mean the difference between survival and starvation - or between getting the ladies (and so offspring) and having your genes die out.

Today, with food stamps and birth-control, the power-survival equation no longer makes sense, but our human DNA hasn't caught on. We are still driven to accumulate power. For men, that usually means making money. Women don't have the power-drive as strongly as men (that's because a powerful male hunter could have dozens of kids, but even the most powerful woman gatherer could only knock out 10 children over a lifetime); even so, they'll also aim to get on top either by making money or marrying money.

But none of that explains why money is the power of choice - rather than political power, physical might or prestige in non-remunerative fields (academia, nonprofits, etc.) - which may be the question you're really asking. That's a trickier question, but it may be easily answered by the fact that money is the only common currency we all have. While we may seek to get ahead in our occupation (real estate, law, engineering, etc.), the field of competition is narrowed to the people in that field. In the competition for money, however, everyone's a player. Being the most popular game, it'll also have the highest stakes and the biggest winners.

Finally, I doubt that smart people are any more or less driven to make money. But if most people are money hungry, and you assume smart people are also smart at making money, it'll look like there are "so many smart people just focused on making money."

Friday, December 10, 2010

designed for deadlock

Another week, another round of bills killed in the Senate by the dread filibuster - this time taking down a Don't Ask Don't Tell repeal, the DREAM Act and a health bill for 9/11 rescue workers.

When our founding fathers designed our federal government, it was no accident that they created a system built for logjam. Wary of giving a central government too much muscle, they architected the famous "balance of powers", making sure that neither the president, the House, the Senate nor the courts could run amok. That caution is in large part why DC can't get anything done today; the writers of our Constitution only wanted us to act when there was broad-based consensus to do so.

But even our worry-wart founders might agree that the filibuster is taking caution a little too far.

The filibuster doesn't come from the Constitution; it was an early Senate practice that got formalized into a rule in 1917. Back in the 19th century, there was no way for senators to officially end floor debates (a rule for ending debate got cut in 1806, because it was thought it would never be needed). Starting in the 1830s, lawmakers who strongly disliked a bill would simply (though exhaustingly) debate the bill to death, allowing the Senate to move on to other matters only after relenting to the "filibustering" senator. In 1917 fed-up senators finally adopted a new rule allowing the chamber to end debate - but only if a supermajority agreed (two-thirds at first, lowered to today's 60% in 1975). But even the newly formalized "filibuster" isn't the one we're used to today; a few decades ago it was still used only for highly contested bills, whereas today no bill comes up for a vote unless it passes muster with 60 senators.

A 60% majority may not seem so anti-democratic, but when you add it to the fact that 2 senators come from every state - regardless of its size - you get a very funky democracy indeed. Out of curiosity, I added up the populations of the smallest 21 states to see exactly how funky.

In theory, senators representing 11% of the nation's population can block a bill from passing. In other words, for some bills to pass you might need backing from lawmakers representing 90% of the country.
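The arithmetic is easy to reproduce. Here's a minimal sketch in Python - the CSV name and column headers are my stand-ins for whatever census table you download:

    # Back-of-the-envelope: what share of the US population do the 21 smallest
    # states hold? Their 42 senators are more than the 41 needed to block cloture.
    # Assumes a CSV with "state" and "population" columns (e.g. from census.gov).
    import csv

    with open("state_populations.csv") as f:
        pops = sorted(int(row["population"]) for row in csv.DictReader(f))

    total = sum(pops)
    blocking_share = sum(pops[:21]) / total  # the 21 least populous states

    print(f"Blocking minority represents: {blocking_share:.1%} of the population")
    print(f"Share needed to pass a bill:  {1 - blocking_share:.1%}")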

I know "democracy" is a loose term, but even I'm not certain it includes a 90% majority.

Wednesday, December 8, 2010

the sunlight that works

It may be a statistical aberration, but October was the first month in which no US passenger sat on the tarmac for three hours or more waiting for lift-off - at least since the FAA started keeping track in 2008.

It was that year the government also threatened to fine airlines for holding travelers prisoner while waiting for take-off. Although the threat has never been carried out, it could be that the FAA's onlooking eyes are enough to incentivize airlines to clean up their act.

While "sunlight" laws are most commonly known for making government transparent, throwing sunlight on corporations has become popular lately as well. (Tracking airline delays is just one example - if you live in a city where restaurants have to display their health department grade, that's another.)

It may be that, when it comes to the private sector, sunlight is a more effective disinfectant. That makes market sense: if an airline has a shoddy on-time record you may choose to spend the extra couple of bucks to fly a more reliable company - but if our government is doing a below-par job, we have less leeway to choose another government (voting a party out of office or moving across the border are not easy options).

Tuesday, December 7, 2010

reality-based general

As the Daily Beast pointed out this morning, it's not all that common for politicians to come out and say "I don't know."

You can include in that group of confident "knowers" political pundits, experts, consultants and, more and more, the man on the street. We live in a time when people aren't only really damn confident about what's what today - but also about what the future will bring. Of course, people like Nassim Nicholas Taleb will tell you that reading the future is generally an utter waste of time. Philip Tetlock will add that the more "expert" you are in an area, the more your predictions are likely to be way off.

That's why it was refreshing to hear military expert (and future politician?) Petraeus say he didn't know if we'd be wrapped up in Afghanistan by 2014.


"I think - no commander ever is going to come out and say, 'I'm confident that we can do this.' I think that you say that you assess that this is - you believe this is, you know, a reasonable prospect and, knowing how important it is, that we have to do everything we can to increase the chances of that prospect," the top commander in Afghanistan told me. "But again, I don't think there are any sure things in this kind of endeavor. And I wouldn't be honest with you and with the viewers if I didn't convey that."

Thank you for your honesty, General. We could use more of that these days.

Saturday, December 4, 2010

room for reciprocation

Calls for bi-partisanship - thankfully - are coming into vogue.

But while Americans may be yearning more and more for greater cooperation across the aisles, the question of how to get the parties onto the bi-partisan bandwagon has no single answer.

Yesterday I mentioned a couple of promising solutions suggested by the new group No Labels (far from the first to recommend them), all of which would tweak the election process to deliver fewer extremists to Congress.

Today I want to try out some reforms that would nudge lawmakers toward collaboration once they've gotten into office.

Those ideas come - indirectly - from Robert Cialdini's "Influence," the 1984 marketing classic on the six "weapons of influence" - that is, how to get people to say "yes" without them knowing that's what you're trying to do (and even make them like you while you're doing it). It could be called a manifesto for manipulation, but as with all tools, the value lies in whether they're used for evil or for good.

The first weapon I propose to use for the good of collegiality on Capitol Hill is "reciprocation." Human beings are natural reciprocators. Studies suggest that our propensity to dole out and return favors is not just something we learn, but rather a behavior hard-coded in our DNA. Unlike most species, which usually just help out their immediate kin (i.e., those who share genes), humans are surprisingly altruistic toward people with no familial relation. Evolutionary biologists theorize that this altruistic - or more accurately "quid pro quo" - trait would have been beneficial to groups of humans in the early days; a small tribe that naturally helped each other out in hard times might be more apt to survive than tribes without a similar help-thy-neighbor instinct. (Humans are not naive enough to be pure altruists, though; we also have a strong instinct to punish "free riders" who don't live up to their half of the "quid pro quo" bargain.)

Cialdini charts out a few modern-day examples of this reciprocation instinct in action. Joe, the plant in one study, casually buys a participant a coke while he goes to get himself one at the vending machine. The unsuspecting participant thinks Joe is just another subject in the decoy study. Later, when the "study" is over, Joe asks the other guy if he wants to buy raffle tickets for a local charity. If Joe hadn't bought a coke, participants on average bought 25 cents' worth of tickets. If he did come back with a coke, the other guy would buy 50 cents' worth. Given that a bottle of coke back in the 70s was 10 cents, Joe made a good return on his "gift" investment.

Apparently we have a strong urge to return favors. Or more precisely, according to Cialdini, we have a strong urge to get rid of the uncomfortable feeling of being in someone else's debt.

Cialdini doesn't mention it himself, but at least one other great mind noticed that letting someone do you a kindness - putting yourself in their credit - also makes them more apt to help out again. As Ben Franklin famously noted, "he that has once done you a kindness will be more ready to do you another than he whom you yourself have obliged."

Either way, gifts have a way of producing more gifts.

One kind of favor - a "concession" - can be particularly effective at eliciting return favors. In negotiations, a "concession" is essentially a gift. If I'm splitting the responsibility of a project with a co-worker, I can start with a demand that he handle 70% of the work, which will likely make him balk: "no friggin' way." I can then backtrack, splitting up the work a little more fairly, say 55% and 45%. That concession can win me some good will and perhaps a concession in return, so I end up doing 40% of the work but getting 50% of the credit (which, of course, I would never do).

Remarkably, studies that set up quasi-real-life negotiations show that when there is a concession back-and-forth, people are happier with the result - even if they would have reached the exact same result without a concession. (The technique - which Cialdini calls "rejection then retreat" - can, however, backfire if the initial demands are seen as extreme and made in bad faith.)

With all the mutual benefits of giving and giving-in, you can imagine how they'd come in handy on Capitol Hill. It wouldn't be just during legislative negotiations that favors (in the form of concessions) would reap favors - and ultimately lead to compromise and bipartisan solutions. Even small favors outside the committee rooms - not of the palm-greasing but of the "could you pass the salt" variety - would naturally spur collegiality.

Of course, giving and giving-in face two obstacles in DC. The greater obstacle by far is the culture in which "giving-in" is a sign of weakness. We've bred a generation of partisans who think any compromise is akin to trampling on your principles, if not selling your soul. The other obstacle is a logistical one; according to Ron Brownstein, the combination of low airfares and high fundraising demands means that congressmembers no longer have any downtime in DC - that is, downtime they used to spend socializing with each other, often across party lines. With no off-the-clock face time, lawmakers have no opportunity to build up the good faith of the everyday quid pro quo humans are used to.

The question then becomes: how do you surmount those obstacles? How do you fight a culture of "stand your ground," and how do you encourage lawmakers to spend more time with their colleagues across the aisle?

I'm not sure, but one thing I know: changing a "culture" is near impossible. You can, however, lessen the effects of that culture by putting some teflon between it and lawmakers. This is certainly an unpopular position, but giving lawmakers a space - free from onlookers - where they can freely concede without being charged as "spineless flip-floppers" would be a start.

As I said, it's not a popular idea. America, of course, is heading in the direction of more transparency in DC discussions, not less. It would be impossible to backtrack and close the doors on committee meetings that were opened in the 1970s when the transparency movement took off. But we could stem the tide of greater openness in Capitol Hill chambers. That is, unless Americans can embrace "giving a little to get a little" - which I don't see happening soon - we need to give our lawmakers the closed-door room to reciprocate on their own.

a conservative case for civility

Brought to you by Commentary's Peter Wehner.

Friday, December 3, 2010

a new bi-partisan kid on the block

Talking to a colleague the other day I wondered aloud "Is it just because I'm hopeful and so am imagining it, or does America truly seem sick of hyper-partisanship?"

I don't have an answer to that question, but a little more evidence trickled in today that America is, indeed, fixing to move on from extreme partisan politics.

No Labels is now here to "bring together leading thinkers from the left, right, and all points in between" and "work to break down false divisions and lift up the common ground on which we can build solutions." It has a snazzy new website and a couple of ex-Clinton and ex-Bush administration leaders at its helm, so it clearly has the smarts and gravitas to back up its Pollyanna goals.

It's not entirely clear how they plan to turn the tide of hyper-partisanship, but they have some initial ideas: cut down on gerrymandering so districts are actually competitive; open up primaries to do the same; and change campaign finance rules so money is no longer the overwhelming driver it is today.

But even with these great ideas, their fundamental principle - that labels are bad - may be a faulty one. Sure, our political parties can seem like pretty sleazy and unprincipled groups, but party affiliation serves a purpose. Without parties, it becomes that much more difficult for citizens to figure out whom to vote for and to hold their elected officials accountable. Parties help define the issues, simplifying voters' decisions. While it's common (and justified) to complain that parties over-simplify the issues, the reverse prospect - of every candidate defining and explaining policy positions - could create a cacophony of ideas that overwhelms voters and makes it hard for legislators to align on any policy solutions.

No Labels doesn't seem to go as far as to want to abolish parties (which would be politically and logistically impossible to do), but I wonder if even putting the emphasis on the negative of labels hampers the ability of labels to do good. Perhaps we shouldn't be trying to impair the power of parties - but rather find ways to force them to use their power responsibly.

Monday, November 29, 2010

moody Americans

Last week, after I posted a graph suggesting that when unemployment rates go up presidential approval goes down, my understandably despondent friend Chris wondered if that meant Americans only like presidents who help them make more money to buy more stuff.

I don't think that's the case. People don't mind when their fortunes don't rise. They do, however, get pissy when they see their fortunes fall.

At least, that's the argument I recently ran across in Jared Diamond's Collapse: historically, neighboring nations don't go to war when times are simply bad, but they do take up arms when a period of economic growth is followed by economic decline. It's a theory, as another friend - Frauke - pointed out, that was made popular by James Davies in the 1950s:

"Revolutions are most likely to occur when a prolonged period of objective economic and social development is followed by a short period of sharp reversal. People then subjectively fear that ground gained with great effort will be quite lost; their mood becomes revolutionary."

No one's revolting or going to war in the US, but the mood of Americans - as expressed by presidential approval - probably follows similar ups and downs. Much has been made recently of increasing income inequality in the US and of the social tensions that may result. But I don't think growing gaps between the rich and not-rich are enough to get Americans glowering at each other - it's only when they're combined with seeing jobs and paychecks decline that we get ticked off.

Two places you can look for evidence are Brazil and India. Brazil had one of the world's worst records on income disparity over the past decade, as measured by the Gini index, and yet its president (and now his new successor) was wildly popular. India's Gini is more moderate, but as this $1 billion home in the middle of Mumbai's slums shows, the gap between rich and poor is starkly visible. As with their Brazilian cousins, the mood in India is high. Why? Because, like Brazilians, Indians know their collective wealth and prestige is on the upswing.
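(If you haven't met the Gini index: it boils a whole income distribution down to one number between 0, perfect equality, and 1, one person holding everything. A toy sketch of the standard mean-absolute-difference formula, with made-up incomes:)

    # Toy Gini coefficient: 0 = everyone earns the same, 1 = one person earns it all.
    # Computed as the mean absolute difference between all pairs of incomes,
    # normalized by twice the mean income. The income lists below are made up.
    def gini(incomes):
        n = len(incomes)
        mean = sum(incomes) / n
        pair_diffs = sum(abs(x - y) for x in incomes for y in incomes)
        return pair_diffs / (2 * n * n * mean)

    print(gini([10, 10, 10, 10]))  # 0.0  - perfect equality
    print(gini([1, 2, 4, 100]))    # ~0.7 - very unequal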

Americans are seeing a different picture; while long-term forecasts of US decline are probably premature, when short spells of spiking unemployment occur - while Wall Street paychecks appear to keep blooming - it's enough to make the working classes think the fall is imminent. If Davies is to be believed, that should make us disgruntled indeed.

Sunday, November 28, 2010

kool-aid denial

Andrew Ferguson over at Commentary is the latest victim of the Bugjuice-calling-the-kettle-KoolAid syndrome.

Critiquing journalists' tendency to only debunk research studies when their conclusions disagree with the journalist's world view, he - well - debunks a research study that disagrees with his world view.

FBI makes another terrorist

It's the second time this year the FBI has taken a young aspiring terrorist - and turned him into one.

Mohamed Osman Mohamud clearly had dreams of being a jihadist and bringing devastation onto Americans, but his attempts to sign on with a terrorist training camp went nowhere. No problem; the FBI stepped in and helped him make his fantasies a reality - even assisting him in building a (fake) bomb.

I'm not a legal scholar, but it seems there should be a difference between dreaming about a crime and actually perpetrating one. Many disaffected youth imagine pulling off criminal acts but don't go through with it - whether because they lack the means, the cojones or the will to ultimately do so. If the NYPD was in the business of making every juvenile's criminal fantasies come true we'd double the occupancy of Rikers. But the NYPD - I hope - is interested in making fewer criminals, not more.

Why is the FBI different? Clearly, the stakes are higher. Wanting to pull off a drug deal or store heist is not the same as aiming to murder hundreds of innocents at a shot. But I'm not sure that - considerable - distinction really makes a difference.

The media all report that Mohamed was dead set on his intention to bomb a gathering of Portland Christmas revelers and that not even the FBI agent's concern that he might kill children would deter him. What the media don't point out is that Mohamed was expressing his determination to someone he believed was a fellow terrorist, and perhaps even someone he looked to as a mentor. That person, of course, was an FBI agent acting the part. We don't know exactly what the FBI agent said to Mohamed, but as they say, actions speak louder than words - and this agent was helping Mohamed plan a terrorist attack.

Up until the moment the FBI called Mohamed to help him orchestrate an act of terror, Mohamed was on his own. But when the FBI moved in to act as Mohamed's accomplice, he stopped acting alone. Sure, as a human being, he always had the free will to continue or stop his plans - but would he have gone through with them without the support of the FBI? If I stopped to think about how many ideas and plans I've had - and even eagerly wanted to accomplish - but have done nothing with, I'd be writing a list all day. We can never be certain how likely it is that Mohamed would have bombed a public square in Portland had it not been for the FBI, but we can surely say the likelihood was less than 100%.

Again, without being a legal scholar, this surely must be the reasoning behind why entrapment is a no-no. It recognizes that we rarely act as lone wolves - but that our actions are almost always subject to encouragement and discouragement.

So why are we encouraging terrorism?

Update: Glenn Greenwald at Salon and Ted Conover at Slate also question the FBI's terrorist baiting practices.

Saturday, November 27, 2010

social research fun

Kevin Lewis at Boston.com sums up conclusions from five social research studies, including:
  • If you want to win in the NBA, you've got to reach out and touch your teammates.
  • One thing that'll make you argue your beliefs more adamantly: doubt.
  • To make it easier for your students to learn you may have to make it harder for them to learn.
  • Promoting safety can promote civility (at least on oil rigs).
  • Nietzsche was almost right: what doesn't kill you could make you happier.

partisanship for bipartisanship

Michael Barone shakes up conventional wisdom in the American Interest, arguing that it is not partisanship - but rather voter volatility - that makes compromise and cross-party coalitions impossible.

"As already suggested, the essence of most bipartisan compromises is that they contain provisions unpopular with constituencies of both parties and often provisions that are unpopular with a majority of voters. That’s why such measures tend to be passed by bipartisan coalitions of members with safe seats.

"In such an unsettled political environment [as the one we have today], it may be difficult—maybe impossible—to round up the votes needed for bipartisan legislation. Politicians will not be inclined to take on additional and avoidable risks. And that difficulty means that legislators in a position, whether because of expertise or committee membership, to cobble together such legislation may just conclude that it’s not worth the trouble.

"Absent large congressional majorities, therefore, it looks like we are stuck for a while—not only, or mainly, because of ideological polarization and party sorting, but because of electoral volatility. When you think about it, this suits the definition of irony. Why are voters so willing to “throw out the bums”? Because they think they can’t get much of anything done. Why can’t they get much of anything done? Because they’re afraid that bipartisan compromise will get them thrown out of office."

Two other possible ironies not mentioned by Barone: Since nothing creates stability like firmly partisan districts, perhaps more gerrymandering is needed for bipartisan legislation to happen? Even more counter-intuitive and depressing: if engaged, independent and open-minded voters are more likely to be volatile (than dyed-in-the-wool Republicans and Democrats), then maybe we should be advocating for greater voter apathy and blind-partisanship? Now, them's some grim thoughts.

our stories ourselves

John Bickle and Sean Keating have a nifty little article in the New Scientist packaging together the many wonders of the left brain. I'm a right-brainer, so it's hard to admit that the left-brain is at the core of our humanness; it makes the stories that make us.

You may be familiar with that inner voice in your head. According to Bickle and Keating, it's helping to place your daily encounters into the grand epic of your life. (Bickle captured our brains on MRI not only talking to ourselves - but also "listening".) Those narratives add up to our "identity" - the notion that you are a unique, unified, intentional being, rather than a sac of cells pinballing through life.

The authors also conjecture what might happen to our inner stories as digital media breaks up traditional narrative. Their answer: not much. Not even haiku texting can keep us from our sagas.

But the left brain is not all self-involved soap opera. A big piece of making up stories is making up causal relations ("I didn't get the promotion because I bungled the project" or "He left me because I'm a bad cook"). That instinct to look for cause and effect is also what makes us handy at spinning out theories to explain the world around us. Luckily, our left brain has the right brain to help it figure out which of those cockamamie theories are true.

Friday, November 26, 2010

still the economy...

In a Zogby poll this week, Obama's approval rating dipped below the 40% marker for the first time since taking office. With a new statistical bone in sight, the politerati pounced - and we got a new wave of predictions for Obama 2012 and advice on what policy stands and leadership initiatives he can take to win America back.

But wiser heads suggest that Barack's current disapproval ratings and 2012 prospects have, as always, more to do with the economy than health care, Afghanistan and Sarah Palin combined.

We've all heard "it's the economy, stupid" a zillion times, but I was curious to see just how much the economy ruled our collective opinions about our Leader in Chief. So I decided to spend my Thanksgiving charting it out, pulling down unemployment stats (thought to be the clearest economic indicator on Main Street) from the BLS and historical approval ratings from UCSB.
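If you want to replicate the exercise, the gist is only a few lines of Python. (A sketch, not my actual script - the file names and column labels below are stand-ins for whatever the BLS and UCSB downloads actually give you:)

    # Sketch: monthly unemployment (BLS) charted against presidential approval
    # (Gallup numbers archived at UCSB's American Presidency Project).
    import pandas as pd
    import matplotlib.pyplot as plt

    unemp = pd.read_csv("bls_unemployment.csv", parse_dates=["date"])
    appr = pd.read_csv("ucsb_approval.csv", parse_dates=["date"])

    fig, ax1 = plt.subplots(figsize=(12, 4))
    ax1.plot(unemp["date"], unemp["rate"], color="red")
    ax1.set_ylabel("unemployment rate (%)", color="red")
    ax2 = ax1.twinx()  # second y-axis so the two series share one timeline
    ax2.plot(appr["date"], appr["approval"], color="blue")
    ax2.set_ylabel("presidential approval (%)", color="blue")
    plt.title("Unemployment vs. presidential approval, 1948-2010")
    plt.show()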

Here is the harvest of my labor.


At first glance there doesn't seem to be an air-tight case for "good economy=presidential approval", but if you take a close look, the evidence is suggestive if not convincing.

First thing to notice: when presidents enter office, they usually do so on a wellspring of positive reviews (only Reagan and Clinton entered with less than 60% love). I also pointed out on the chart three non-economic events that caused extreme swings in approval. Two positive - the Gulf War and 9/11. One negative - Watergate. (Other blips - including the attempted assassination of Reagan, Iran-Contra and the second Gulf War - couldn't be squeezed in.)

But incoming presidential ardor and the rare non-economic factor aside, it does look like one of the worst things for a president's ratings is a rising unemployment rate.

The gray areas highlight times of soaring unemployment. In most cases, while the red of unemployment goes up, the blue of approval goes down. Eisenhower's ratings dipped during both times of rising unemployment. Nixon's first spate of unemployment also spelled disfavor; by the second surge in joblessness his ratings were so battered by Watergate they couldn't get much worse. Ford and Reagan entered office with unemployment spiking and so saw their approval plummet (Reagan got a temporary popularity reprieve after being shot by Hinckley). As for Bush I & II, not even the hugely popular Gulf War and the patriotic fervor following 9/11 could stave off the dive in popularity they'd soon suffer from joblessness.

The non-gray areas, conversely, show how good economic times can boost ratings, the biggest beneficiaries being Reagan and Clinton. The only presidents who buck the trend are LBJ, Carter and Bush II. LBJ and Bush II both had the albatross of drawn-out wars dragging them down, which may have outweighed rosy job markets. As for Carter, he had another economic problem - inflation - costing him love. (The one benefit of inflation, not coincidentally, is lower unemployment.)

So where does that leave Obama? Like Reagan, Barack came into office with unemployment on a steep incline. That combined with the normal inflation of popularity of incoming presidents meant that it was only to be expected Obama's popularity would take a beating once in office. A few have taken the Reagan comparison further, pointing out that once people start getting jobs again Obama will likely follow in Reagan's footsteps picking up popularity points as well. Given the record of our Oval Office leaders since the days of Ike, barring a Watergate or military quagmire, Obama should indeed be putting all his bets on more jobs. It could, indeed, still be the economy, stupid.

Wednesday, November 24, 2010

fetishizing freedom

We Americans inherited much of our love of liberty from our British forefathers. But according to Tristam Riley-Smith, since kicking our imperial parents out - in part over a disagreement over freedom - our conceptions of liberty have taken different paths.

Constitutional scholars talk about "positive" and "negative" rights; the first give you the ability to do things (vote, express yourself) while the second protect you from things being done unto you (being discriminated against). Similarly, Isaiah Berlin split "liberties" into those that allow you to do something and those that keep you from being constrained. The complication comes, of course, when someone's positive liberty (say, to serenade his sweetheart at midnight) butts up against others' negative liberty (to not have their sleep interfered with). The story of liberal societies is in many ways the story of how to balance the positive and negative.

Americans seem to have sided with protecting the negative; in the choice between government working to give its citizens more liberty or government just keeping out of our business, we'll take door number two.

The British, however, are more apt to recognize that in order to protect the liberties of some you sometimes need to limit the freedom of others. So, in order to exercise the liberty to breathe clean air, the freedom of factories to pollute has to be reined in. Or, to give people real freedom to participate in society, the government needs to provide public education - which is paid for by constraining the pocket-books of wealthier citizens.

That's not the kind of liberty Libertarians on our shores like to consider, but maybe Liberals should start co-opting the phrase for themselves. Along with "liberty to own guns" and "liberty to not pay taxes", how about "liberty to get an education" and "liberty to have health care?"

America in red and blue

I can never get enough of maps.



In this one, it's fun to see America flip its red-blue axis - a slow process that took 70 years and the Civil Rights Movement to play out.

And one more surprising find: America was more geographically polarized under Clinton than under Bush. Who knew?

Tuesday, November 23, 2010

paranoia and prudery go head-to-head

When one day we look back on how "the great privacy movement" began, it won't be accounts of Muslim homes being searched without warrants or gigabytes of data being swept up by the National Security Agency that sparked revolution - but rather images of underwear on TSA screens and videos of passengers being felt up by federal agents.

Americans finally seem to be growing wary of privacy incursions in the name of national security - and it is our prudery that's pushing us to take a stand.

Of the many forces that get humans into action, fear is the clear front-runner. You don't need the reminder of today's stampede in Cambodia to know that fear can get humans to do almost anything (including trample other humans to death). Fear of terrorism has gotten Americans to accept secret searches of our homes and warrantless searches of our online communications, not to mention bare feet at the airport.

Our sense of privacy and injustice has not been enough to counter that fear - but privacy concerning our private parts may have the edge.

It may be that this visceral reaction to protect our "junk" is not worth exploring - it is what it obviously is - but I couldn't help being reminded of Jonathan Haidt's "moral foundations theory." According to Haidt, morality comes naturally - and it comes in five flavors. Some of us are more instinctively offended by injustice, while others get their moral dander up when someone is disloyal. One of those moral dimensions - disgust - may be what's kicking in at airport security scans. (The other dimensions, btw, are "caring for others" and "respecting authority.")

At a TSA briefing for House staffers in DC, people were "averting their eyes" when it came time for the pat-down demonstration on a young female volunteer. No one has ever averted their eyes from a questionable FBI search of library records. That difference - whether an action is eye-avert-worthy or not - is not to be underestimated. While most Americans do not approve of the government tracking down their library records, they're not upset enough to make a fuss. Images of male agents feeling down other men (and female agents feeling up women) just may be the one thing that's worse than the minuscule chance that there's a terrorist on our plane.

It'll be interesting to see in the next couple of weeks whether our outrage over being touched is the thing that finally gets Americans to draw the line on privacy. Who knows. Fear of exploding planes still has a hold on us. But in the looming battle of paranoia versus prudery, I'm putting my money on the prudes.

Sunday, November 21, 2010

when reason does back-flips

The Independent reports today that, according to the Pope, you may not be sinning if you use a condom - but only if you're a prostitute... and gay.

Monday, November 15, 2010

Juan Williams and knowing your unknowable self

In September I introduced the idea (not to the world, but to this blog) that the notion that we can ever be rational is a foolish one. Not that reason doesn't play a role in how we make decisions; it certainly can. But at the end of the day, every decision we make - and everything we do - is mixed with at least a dash of emotion.

For most of us most of the time, that dash is more like a tanker-full. A common metaphor among social psychologists is to equate our emotions to a two-ton elephant, with our "rational" brain as its 100-pound driver, hopelessly trying to steer the mammoth thing in the direction we think wisest.

To natural emoters like me, it may be patently obvious that the elephant is in charge. No matter how much our reason counsels us not to be nervous or upset, we still find ourselves trembling as we stand up to give that board presentation or bump into our ex-lover at a cocktail party with his new girlfriend. For those of us who go through our days relatively unperturbed, however, it may be less evident that emotions run the show.

This is, of course, because not all emotions manifest themselves as crying jags or fits of rage; most are quiet operators that work below the radar. Indeed, some social psychologists define emotions as those forces that work at the unconscious level.

Your mind-body complex has an endless array of ways to slyly guide your behavior via emotion. I'll dig into many on this blog, but for starters let's look at stereotypes and prejudice.

In the wake of Rick Sanchez's and Juan Williams's firings for voicing questionably anti-Semitic and anti-Islamic comments, I was reminded of how little we like to consider ourselves bigots. Sure, other people are - but I don't let my actions towards others be shaded by racial or ethnic prejudice. Or do I?

A couple years ago I took a few spins on Harvard's "Implicit Association Test" site to discover that, yes, just like everyone else on this earth (maybe excepting the Dalai Lama) I have my hidden prejudices.

The social psychologists who design the IAT know that you can't find out if someone has a prejudice by asking them; outside of Aryan Nation rallies, most Americans like to believe they are color-blind and will report so. So researchers use a trick. They depend on the brain's way of associating words and concepts. If two words - say, cat and dog - are associated in our mind, they'll have a strong neural connection. When I say "cat", all the words associated with cat get activated, consciously or subconsciously - so if I next ask "what does 'god' spell backwards?" you'll be that much quicker to say "dog" than if I had asked the question out of the blue. (This is a phenomenon known as "priming," which marketers love to exploit.) To look for hidden prejudices, social psychologists home in on the extra milliseconds it takes you to make connections with and without priming.
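To make the trick concrete, here's a toy version of the measurement in Python. (This is just the bare idea - a latency gap between "congruent" and "incongruent" pairings - not the actual scoring algorithm behind Harvard's test, and the reaction times are invented:)

    # Toy IAT-style measure: if "congruent" pairings (e.g. thin+good / fat+bad)
    # are answered faster than "incongruent" ones (thin+bad / fat+good), the
    # latency gap hints at an implicit association. All times here are made up.
    from statistics import mean

    congruent_ms = [612, 587, 640, 598, 605]    # reaction times, congruent block
    incongruent_ms = [722, 698, 754, 710, 733]  # reaction times, incongruent block

    gap = mean(incongruent_ms) - mean(congruent_ms)
    print(f"average latency gap: {gap:.0f} ms")  # bigger gap = stronger implicit bias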

Suspecting I had a - maybe not so hidden - bias against overweight people, I just took the IAT's fat-ist test. Sure enough, I have a "moderate" prejudice against the overweight. (I have plenty of company; of those who have taken the same test, most have negative views of pudge.)

But what does that mean? Rationally, I know that there are many reasons why people may be overweight and - to my knowledge - there is no correlation between being overweight and being "bad" in any other way. In fact, anecdotally, I'd say that most overweight people I know are kind, intelligent and interesting people. Most importantly, it's no skin off my back if someone else carries a few extra pounds.

So, why in heck do I associate being fat with being "bad?" I have no idea (but could make a few guesses - in another post). The point is that in spite of priding myself on being supremely rational, my subconscious is supremely irrational - and whether I like it or not, it's running the show much of the time. The important thing is to be aware of that fact - so you can take precautions against it.

That's one of the reasons the Juan Williams story was so disappointing. During his stint on Fox, Williams admitted that he gets fearful when he sees Muslims at the airport, but he also pointed out that it's important to put that irrational fear aside when thinking about policy. What I heard was "I'm bigoted just like everybody else, but we need to get past our bigotry when we come together to talk about national issues." What the media - and NPR - heard was "I'm an unabashed bigot."

Too bad. Williams was helping us to be aware of our prejudices and asking us to know our unknowable selves. Instead of thanking him, we canned him.

Tuesday, November 9, 2010

why evolutionary psychologists get a bad rap

A few weeks ago I posted a proud confession that I am an "evolutionary psychologist" - or that I think like one, always asking the question when it comes to human behavior "what would a hunter gatherer do?"

The post was inspired by an article in the Wall Street Journal that is a good example of why you, dear reader, would be justified in claiming evolutionary psychology is a bunch of bullhooey.

Evolutionary psychology takes as its premise that the behavior of humans today can often be best understood in light of how that behavior probably helped our hunter-gatherer ancestors survive and get their genes into the next generation. So while it doesn't make much evolutionary sense why a toddler would scream for his mommy at daycare today (he's unlikely to get harmed and his mom is in an office building clear across town), if that same toddler was left in the bush without his pre-historic mom eons ago those screams would stand a good chance of keeping him around to see the next day.

The screaming toddler is an example of "well, duh" evolutionary insight. Matt Ridley (who generally is an exceptional thinker) gives a good "excuse me, huh?" evolutionary take in his WSJ opinion piece.

Ridley starts his article sensibly enough, explaining that while it may be unfashionable to say so, evolutionary theory can account for many of the differences between men and women. But rather than serve up some "no, duh" examples - e.g., women are more nurturing because they were likely the prime parenters, men are more aggressive because they had to defend their women, etc. - Ridley takes us instead to the golf course and the mall to clarify how evolution shaped our behavior.

Women are thought to have been the "gatherers" and men the "hunters" in the pre-historic division of labor, says Ridley. This is not a much disputed point. But for Ridley those past job roles explain why today men love golf and women love shopping.

"Without knowing it, golf-course designers are setting up a sort of idealized abstraction of the hunting ground, while shoe retailers are setting up a sort of ersatz echo of the gathering field," writes Ridley.

It's no wonder evolutionary psychology inspires eye rolls.

I was trying to think why Ridley's golf-hunter and mall-gatherer theory is so maddeningly inane. He is, after all, only noting a valid cliche about men and women. Although some women love golf, let's face it, guys more often go gaga over the sport. And if you can show me a straight man who will moon over Jimmy Choos for hours on end, I have a bridge to sell you.

But Ridley's golf-course-as-hunting-ground and Macy's-as-foraging-forest theories strike two blows against logic and common sense.

First, as I claimed in my earlier post, one of the most persuasive arguments for evolutionary psychology is the observation that so many of our behaviors are universal, found in every village and city world-wide. If a behavior - say, crying for mommy - is so prevalent, odds are it comes from something in our genetic makeup (which varies little across continents) rather than culture (which varies plenty). Golf, it is safe to say, is not one of these behaviors. (One might counter-argue that the only reason golf isn't played across the globe is that it's prohibitively expensive. That might be so, but it still leaves the golf-as-universal claim as conjecture.) What Ridley could have used as a strong example is "sports" in general. Without being an anthropologist, I'm going to go out on a limb and say men in all cultures play some kind of team sport. Trying to figure out what ancient instinct sports satisfy is a worthwhile pursuit; I'll even go out on a limb and say it has something to do with, yes, the hunt - as well as the occasional need to attack the neighboring tribe.

That leads to the second rule of sensible evolutionizing: when looking at a universal behavior, always be sure to ask "what are the other possible evolutionary explanations?" Working off the cuff here, I can come up with a few other scenarios to explain golf and shoe shopping.

Golf first: take a little competition (males fighting for status, which is really fighting for fertile chicks), add an exclusive and expensive activity (more social status), a green setting (a reassuring sign that food is available) and, okay, a little thrill of hitting a target far in the distance (hunting) - and you get golf.

As for shoe shopping, when Ridley tips his hat to "foraging", he's putting too much emphasis on the "shopping" and not enough on the "shoe." Replace shoe with "grocery" and he'll find women much less happy to browse for hours (and, of course, grocery shopping is more akin to our ancestral activities). Shoe shopping, as any girl will attest, is not as much about exploring as it is about primping and imagining how deliciously and fabulously delectable you are to the object of your desire (as true today as it was millennia ago). If you bring along your girlfriends, all the better - now you get to bond and gossip, two other behaviors that likely served to improve life for African Eves.

Don't be mistaken; I'm not trying to replace Ridley's evolutionary claptrap with my own claptrap. I'm just trying to point out that other (and, perhaps, more plausible) evolutionary explanations for golfing and shoe shopping are available than the ones he has to offer. Which explanations are correct is something we might never know. But that doesn't mean all evolutionary theorists are just making stuff up: the smart ones figure things out the old-fashioned way - coming up with a theory, devising a test that could disprove it, and knowing that each test the theory survives puts them one step closer to better understanding. There are plenty of examples; let's hope Mr. Ridley offers them next time around.

double speak du jour

I am initiating a new series of entries on this blog: "Double speak du jour: how politicians lose their tether with reality and/or the English language."


"Every president would like for us to appropriate all the money and send it to them and let them spend it in any way they want to."

I mean, really, who died and made the president the Chief Executive?

Friday, October 1, 2010

first thing we do, kill off the bully-calling



Ellen DeGeneres' video plea to take a stand against teenage bullying, in the wake of Tyler Clementi's suicide this week, looks like it's going viral.

Thank goodness. Ellen's popularity and likability hopefully will call more attention to the torment millions of teens experience each day. As Ellen says, Youtube puts the spotlight on high-tech bullying, but anyone who has gone to high school can attest that bullying was an "epidemic" long before we had webcams and Twitter.

But how do you stop bullying? A first step may be to stop using the term "bully".

I don't know about you, but when I hear the word, I can't help getting one of two images in my head: the towering athlete, fully equipped with jock entourage and glinting eyes, or his female equivalent, the cool and catty biatch, with her gum-smacking posse and half smirk.

But those images are misleading, because - of course - bullying is not a phenomenon reserved for the few and the cool. It's also not an exclusively teenage behavior. We all do it, at all stages of our lives.

Yes, even me. At the age of five, for example, I directed a boy admirer to slap - or otherwise inflict injury on - a girl, just because I found her annoying. (That was the last time I had such control over a man.) In 6th grade I participated in the temporary imprisonment of a friend, tying her to a chair and deserting her in a bathroom stall. By high school, like most girls, I became more subtle in my bullying behavior, limiting myself to occasional character assassination - which I've tried to stop as an adult, but still certainly lapse into from time to time.

Admittedly, none of those behaviors seem too horrible, but two points need making. One: I was always considered a really kind, considerate girl with kind, considerate friends. Two: that's exactly why bullying is so insidious. The individual acts never seem too terrible while they are being perpetrated - in all the examples above I probably thought I was either justified or "only having a little harmless fun" - but, from the victim's perspective, small instances of thoughtless cruelty can have a horrid cumulative effect.

Bullying is dangerous precisely because no one thinks they are a bully. Actors learn this early on: Iago didn't think he was a great villain, he just thought he was a victim getting back at the guy who screwed him over. Social psychologists also know our brains have a host of mechanisms to protect us from thinking we've done anything bad. For one, if we do something we know is a moral no-no, our inbuilt "attributional bias" helps us blame the circumstances rather than our - normally upstanding - character. We're also handy at re-adjusting our notions of right and wrong once we do something that crosses the ethical line, a phenomenon known as cognitive dissonance.

That's why we need to ditch the term "bully". Each time we use it, everyone will agree: "yes, we need to stop bullying." But almost no one will pause to think they are the bully that needs to be stopped.

That said, I don't have a good alternative. "Small instances of thoughtless cruelty that can hurt - or kill" is too unwieldy, but it begins to get at the point. Or maybe the way to go is to start a campaign "We're all Bullies; Get to Know Your Inner Mean-Girl and Tell Her it's Not Cool." (Okay, I'm not queen of catchy.)

Either way, a successful campaign needs to recognize that careless cruelty is all too human - not something others do, but something we all do every day.

Saturday, September 25, 2010

why I'm an evolutionary psychologist

An unabashed confession: I am an evolutionary psychologist.

This is not to say I have a degree in psychology or evolution. But when it comes to understanding how we humans behave and why we do the things we do - in our families, at work, at play, in government and even at war (or especially at war) - I start with evolutionary psychology. Which is to say I ask "What would a hunter gatherer do?"

Many smart people, of course, prefer to start with the question "What does our culture dictate we do?", arguing that our behavior is mostly shaped by societal rules that aren't innately human. Many also say that our sophisticated cultures and brains make the question of how our former hunter-gatherer selves behaved irrelevant.

I don't disagree that culture matters and that our brains add considerable complexity and flexibility to our behavior, but for me the behaviors that evolved while we lived off of hunted game and foraged roots usually have the upper hand.

Two reasons are enough to convince me. First, when it comes to the meaty stuff, we humans act pretty identically across all continents and societies; we fall in love, get jealous, are ambitious, protect our children, etc. Sure there are differences around the edges (who it's acceptable to fall in love with, whether we simmer or explode when jealousy sparks), but our core behavior is remarkably universal. Since it's too big a coincidence that humans in all cultures "just happen" to be similar, the more likely explanation is that there's something in our DNA that makes us do the things we do.

The other reason you'll get farther asking how a hunter-gatherer would act has to do with the sheer amount of time we spent in the bush. If you consider that humans have been around for 2 million years (in our current form as Homo sapiens for 250,000 years), and that we've been agricultural for only about 10,000 years, industrial for 150 years and "post-industrial" for 50 years, our modern "culture" has - at best - been around for 2% of the time we humans have been on earth, and that's assuming our culture goes back to Sumer and we're only talking about Homo sapiens. If you prefer to look at the earliest humans and think modern culture started more around the time of the Enlightenment, then our modern selves only account for 0.025% of our species' existence. Imagine the length of a football field as the history of human life; we've been modern for about an inch.
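The arithmetic, for anyone who wants to fiddle with the assumptions (the year counts below are this post's rough figures, not settled science):

    # How much of our run as a species does "modern culture" cover?
    species_years = 2_000_000    # earliest humans
    sapiens_years = 250_000      # Homo sapiens proper
    sumer_years = 5_000          # culture since Sumer
    modern_years = 500           # "modern" selves, dated generously

    print(f"Sumer as share of sapiens:  {sumer_years / sapiens_years:.1%}")   # 2.0%
    print(f"modern as share of species: {modern_years / species_years:.3%}")  # 0.025%

    # The football-field version: 100 yards = 3,600 inches.
    inches = 3600 * modern_years / species_years
    print(f"modern inches on the field: {inches:.1f}")                        # ~0.9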

If you do think evolution has something to do with our behavior, then 99% of our evolution happened when we were living in small bands, gathering roots and wild fruit, and hunting small prey. The culture we see around us is a relative gloss. That's not to say we aren't still evolving - we are, but not at a pace fast enough to reverse what came 2 million years before.

The best example of how behavioral evolution can't keep up with the times is contraception. Pretty much up until the last 50 years, when people had sex there was a good chance a baby would follow. If you were a man who was partnered off with a woman and raising her children with her - or expecting to - you'd likely get pretty peeved, or enraged, if you found her in the arms of another man. This made good evolutionary sense; men who didn't care about who their women were lounging with would more likely end up raising other men's children. In evolutionary theory, if your DNA doesn't make it to the next generation, you've lost. So the gene that says "sure, man, you can philander with my girlfriend" would likely die out, while the "I will pummel you if you touch my woman" gene would proliferate. Thus infidelity leads to rage.

Fast forward to the late 20th century, however, and the equation "sex=babies" is no longer true. Condoms and the pill have made baby-making a rare byproduct of sex. And yet, men in the modern age still fly into rages - or merely fall out of love - if they find their girl with another guy. But that doesn't make any evolutionary sense anymore. The chump has almost no risk of raising the other guy's kid as his own; he's lost nothing - why should he care if his lady is having some fun on the side? But it does make evolutionary sense - if you keep in mind that "behavioral evolution" is still largely left over from our hunter-gatherer days. The pill can change society, but it can't change 2 million years of evolution.

Thursday, September 23, 2010

two respected news agencies - two realities

The New York Times was fairly confident about what was going on with Chinese exports of rare minerals to Japan:

"Sharply raising the stakes in a dispute over Japan’s detention of a Chinese fishing trawler captain, the Chinese government has blocked exports to Japan of a crucial category of minerals used in products like hybrid cars, wind turbines and guided missiles."

Its sources were unidentified "industry officials" and a minerals consultant in Australia. The Times admitted that the Chinese Commerce Ministry denied any ban, but that was easily explained: if the Chinese had an "official" ban, then Japan could run to the World Trade Organization and make a beef.

It all seemed perfectly plausible. China, after all, doesn't like being pushed around by other nations, least of all Japan, and they don't have the most stellar record of playing by WTO rules (if you're coming from an American perspective, of course).

But perhaps it was the Commerce Ministry spokesman's quote on the subject that made the Reuters reporters wonder if something else was going on. "I don't know how the New York Times came up with this, but it's not true. There are no such measures," Chen Rongkai said. For a statement from a Chinese official, it does have a whiff of sincerity.

Reuters talked to some rare earth traders in China and Japan, none of whom had heard of the ban. One in Japan, however, had heard rumors.

Recapping: there's no "official" ban and no one directly affected by the ban knows that it exists. So is there a ban or not?

Reuters helps us out by explaining that there are quotas on exports of the rare minerals, but those were in place long before the fishing boat incident. Being that it's the end of the year, those quotas would naturally be tapped out. That might explain why customs officials are stopping exports of rare minerals - if, indeed, they are doing so, which is not clear. (Quotas on rare minerals that account for 93% of the world's supply may be a problem in their own right, but that's not the same thing as saying China is using quotas to retaliate against Japan.)

The mind-spinningly different accounts from the Times and Reuters could be explained in a couple of ways. Either the Reuters reporters naively accepted the word of government ministries and didn't dare dig past a few questions to some Chinese and Japanese import-exporters, or the New York Times has an imagination. I'm inclined to believe the latter.

But, in the Times' defense, "imagination" is not a rare and fantastical thing; it's something we all do all the time but are almost never aware of. Our social psychologist and behavioral economist friends call it a host of things: confirmation bias, attributional bias, the representativeness heuristic, auto-association.

Here's how it works. First, we get bits of information: China and Japan are engaged in a tiff over a captured fishing boat; China's prime minister makes some blustering statements about possible retaliation; and at about the same time, crucial minerals stop being shipped to Japan (I'm assuming this last part is true to give the Times the benefit of the doubt). If you're a journalist covering China, this set of data will set off a familiar pattern in your brain: "Ah, yes, I've seen this before - China uses trade as a political bullying measure." Forget about whether there's any proof that the two events - the prime minister's bluster and the trade drying up - are connected; in the mind, the link has already been primed. Then some "industry officials" (still unidentified) tell you that the Chinese customs agency has stopped exports to Japan (but not to other Asian countries). You talk to a consultant who confirms the story, and voila - case closed.

At this point, you're locked in. The pieces fit together so nicely, there's no sense in investigating further - say, by calling some of the trading companies in Japan and China to see if they knew of the ban.

What doesn't happen - for Times journalists or for you and me, by and large - is that we stop and wonder whether our past impressions are setting us up to believe a certain version of a story. If we did, we might muck around for evidence to disprove our conclusions. But it's such a good story... how could it possibly not be true?

Wednesday, September 22, 2010

get over your rational self

Yesterday I was overjoyed to find that the CIA and I were on the same page - at least when it comes to understanding the deceptive practices of the human mind.

My one and only reader of this blog (let's call him Harry) emailed to let me know, however, that he had an issue with the CIA's quotes. In talking about the mind, Harry wrote, the CIA was making that old faulty distinction between "body and mind," as if the mind could operate, Vulcan-like, unaffected by our body's emotions and desires.

I'm so glad you brought that up, Harry! I don't think the CIA was falling prey to that error, but - yes, oh yes - the idea that we have a mind that can think independently of all those emotions swirling through our gut, heart and loins is, indeed, the number one deceptive practice of the human mind.

I probably don't have to convince anyone that humans are not always rational. (If you've spoken to one recently, that'll be evident.) But it may take a moment to digest the proposition that humans are never rational.

For those jumping up saying "define your terms, lady!", let's say that "rationality" is the ability to make a decision based on reason and logical thought alone. (Still pretty murky, I know, but stick with me.)

One of everybody's favorite stories on this topic is poor old Phineas Gage. Gage, a 19th-century railroad foreman, was performing his duty of tamping down blasting powder one day when he accidentally sent a three-and-a-half-foot, one-inch-wide "tamping rod" through his skull, blasting out a small chunk of his brain. Miraculously, he not only survived but was up, walking and talking minutes after the accident (he later fell into a light coma for a couple of weeks before making a full "recovery"). During the remainder of his life, he was appreciated primarily as a "freak," doing a few stints with Mr. Barnum. But as the decades passed, psychologists became fascinated by reports of Gage's personality change. Before the accident he was seen as a level-headed, upstanding kind of guy; after, if you believe the reports, he was a trash-talking, irresponsible thug. His friends said he was "no longer Gage."

The rod, it seems, had shot through one of the parts of the brain that regulate emotions. Gage went through life with his "logical" mind intact; but with his emotional lobes gone kaplooey, he was no longer the sensible, rational guy he'd been pre-blast.

Antonio Damasio, in his book Descartes' Error (the go-to tome on the mind-body fallacy, which was of course "Descartes' error"), talks about another poor brain-damaged fellow. Elliot was likewise missing his emotional faculties (gruesome pictures, for example, didn't seem to bother him) and was what one might imagine a Man of the Enlightenment to be - all logic and thinking. Yet no Spock was Elliot. He could explain social situations to you and spell out the consequences of actions just fine; the one thing he couldn't do was make a decision. Not that he was paralyzed in an angsty, Hamlet way; he just didn't have any emotions to base a decision on.

This probably makes sense if you think about any big decision in your life. You might know you're a "go with your gut" guy, but if you're like many people, you spend a lot of time mulling over the pros and cons of key choices (maybe you've even made a list?). No matter how detailed, thoughtful and logical the list, however, I'm betting that what finally tipped the scales was "an inner feeling." It really couldn't be any other way, if you think about it. Any decision is going to have too many pluses and minuses to consider: how could you logically decide whether to go to med school or study massage therapy when you have to factor in short-term and long-term financial considerations, quality of life, what your parents will say, what your boyfriend will say, etc.? Even if you could make a complete pro/con chart, how would you weigh each of the items? You'd have to decide how much each of those line items "mattered" to you, which is another way of saying you'd have to check in with your emotions.

So big decisions, yes, are emotion-wrought. But small decisions? Dean Shibata at the University of Washington was curious how much we depend on emotions even when deciding something as minor as fastening our seat belts. As scientists like to do nowadays, he put people in an MRI scanner to see what parts of their brains would light up when they were asked about seat belts and when they were asked to do a little math. Unsurprisingly, the seat belt question lit up the lobe associated with emotions, which remained dormant during the math questions.

Some look at the evidence above (and charted in plenty of other studies, with and without MRIs) and conclude that rationality does exist, but that emotions are part of being rational. That's not a wrong way to see it, but it opens up a Russian doll's worth of worms. If every decision is based on both emotions and logic, what's to say one decision is "rational" and another "irrational"? You'd have to judge some emotions as rational and others as not - but what would you use to make that judgment? Another set of emotions. You can see it doesn't really end.

If the thought that your powers of reasoning are forever shackled to your emotions bums you out, you are not alone. I fancy myself the queen of rationality, but I know that's a fanciful illusion. It's hard to see it any other way, though. And all is not lost: even though we can't depend on logic and our analytical powers to lead us through life, they do help out a bunch. That is, until you run into the brain's other nasty deception...