
Learning to Think for Yourself

April 10, 2009


Little-known poet Walter D. Wintle wrote only one poem of note. The poem is best known by the title “The Man Who Thinks He Can.” It goes like this:

If you think you are beaten, you are;
If you think you dare not, you don’t.
If you’d like to win, but think you can’t,
It’s almost a cinch you won’t.
If you think you’ll lose, you’re lost,
For out in the world we find
Success begins with a fellow’s will;
It’s all in the state of mind.

If you think you’re outclassed, you are;
You’ve got to think high to rise.
You’ve got to be sure of yourself before
You can ever win a prize.
Life’s battles don’t always go
To the stronger or faster man;
But soon or late the man who wins
Is the one who thinks he can.

The poem was actually published in 1905 with the title “Thinking.” New York Times columnist Nicholas Kristof published an interesting op-ed piece about thinking, specifically about the often errant thinking of so-called experts [“Learning How to Think,” 26 March 2009]. Wintle was trying to motivate people to think positively. Kristof is trying to motivate people to think for themselves. Kristof begins his column with a compelling question: how could so many financial “experts” have been so wrong about the policies they followed, policies that eventually pushed the global economy over a cliff? I would have said it had more to do with greed than expertise, but that would spoil Kristof’s point. He writes:

“One explanation is that so-called experts turn out to be, in many situations, a stunningly poor source of expertise. There’s evidence that what matters in making a sound forecast or decision isn’t so much knowledge or experience as good judgment — or, to be more precise, the way a person’s mind works.”

We’ve all met “smart” people who appear to have no common sense. We are, for example, bemused by so-called “absent-minded professors.” But there is nothing amusing about experts with bad judgment, and Kristof argues that lots of so-called experts seem to lack good judgment. Before writing more on that subject, however, Kristof flips the coin and examines why people are nevertheless influenced by experts. He makes the point that not only do laymen get “buffaloed” by experts, but even knowledgeable people can be snowed by charisma and presentation.

“The best example of the awe that an ‘expert’ inspires is the ‘Dr. Fox effect.’ It’s named for a pioneering series of psychology experiments in which an actor was paid to give a meaningless presentation to professional educators. The actor was introduced as ‘Dr. Myron L. Fox’ (no such real person existed) and was described as an eminent authority on the application of mathematics to human behavior. He then delivered a lecture on ‘mathematical game theory as applied to physician education’ — except that by design it had no point and was completely devoid of substance. However, it was warmly delivered and full of jokes and interesting neologisms. Afterward, those in attendance were given questionnaires and asked to rate ‘Dr. Fox.’ They were mostly impressed. ‘Excellent presentation, enjoyed listening,’ wrote one. Another protested: ‘Too intellectual a presentation.'”

I’m sure there were thoughtful people in the audience who critiqued the presentation as pure BS. Kristof’s question is why everybody didn’t. I suspect that fear plays some role in our willingness to be influenced by experts. When faced with such situations, it would be natural to think, “What if I challenge their thinking and they turn out to be right and I turn out to be wrong? How will that make me look?” Kristof is also interested in knowing which experts seem to have the most influence. He reports that one study shows that the President (any president, not just President Obama) has little effect on public opinion following a television appearance. A president can move public opinion by less than a percentage point, while so-called experts “can move public opinion by more than 3 percentage points, because they seem to be reliable or impartial authorities.” I’m skeptical about the “reliable or impartial” part of that assessment. More and more people are selecting news sources based on their underlying philosophical bent. Republicans, for example, flock to Fox News to see so-called “fair and reliable” experts, most of whom are also ideologues whose primary purpose is to support conservative positions. Democrats prefer receiving their news from sources like the New York Times, which has a reputation for supporting more liberal positions — in a “fair and reliable” way, of course! The natural tendency for all of us is to look for people who think like we do and support positions we support. The problem with that strategy is that we develop a pack mentality and stop thinking for ourselves.

Partisan politics holds such sway in America that we sometimes forget that neither side gets everything right or everything wrong. When New York City Mayor Michael Bloomberg announced that he was not running for president, he also decried the lack of candor he saw in the presidential candidates who were running [“I’m Not Running for President, but …,” New York Times, 28 February 2008]. He wrote:

“You sometimes get the feeling that the candidates — smart, all of them — must know better. They must know we can’t fix our economy and create jobs by isolating America from global trade. They must know that we can’t fix our immigration problems with border security alone. They must know that we can’t fix our schools without holding teachers, principals and parents accountable for results. They must know that fighting global warming is not a costless challenge. And they must know that we can’t keep illegal guns out of the hands of criminals unless we crack down on the black market for them. The vast majority of Americans know that all of this is true, but — politics being what it is — the candidates seem afraid to level with them. … We need innovative ideas, bold action and courageous leadership. That’s not just empty rhetoric, and the idea that we have the ability to solve our toughest problems isn’t some pie-in-the-sky dream. … I believe that an independent approach to these issues is essential to governing our nation. … The changes needed in this country are straightforward enough, but there are always partisan reasons to take an easy way out. There are always special interests that will fight against any challenge to the status quo.”

Bloomberg’s last point is pertinent to the subject at hand. The experts to whom we turn for opinions and advice have a vested interest in keeping things as they are because they have risen to the top of their profession in the current system. That’s why I’m skeptical about their impartiality. Kristof is also concerned about the soundness of their advice, but he is even more interested in how accurately they forecast the way current events will affect the future. He writes:

“The expert on experts is Philip Tetlock, a professor at the University of California, Berkeley. His 2005 book, ‘Expert Political Judgment,’ is based on two decades of tracking some 82,000 predictions by 284 experts. The experts’ forecasts were tracked both on the subjects of their specialties and on subjects that they knew little about. The result? The predictions of experts were, on average, only a tiny bit better than random guesses — the equivalent of a chimpanzee throwing darts at a board. ‘It made virtually no difference whether participants had doctorates, whether they were economists, political scientists, journalists or historians, whether they had policy experience or access to classified information, or whether they had logged many or few years of experience,’ Mr. Tetlock wrote. Indeed, the only consistent predictor was fame — and it was an inverse relationship. The more famous experts did worse than unknown ones. That had to do with a fault in the media. Talent bookers for television shows and reporters tended to call up experts who provided strong, coherent points of view, who saw things in blacks and whites. People who shouted — like, yes, Jim Cramer!”

People dislike ambivalence and ambiguity. They prefer a world that can be understood in terms of black and white — this is good and that is bad. My colleague Tom Barnett has appeared as an expert on a number of talk shows, but not as often as he should because he’s an optimist with a nuanced view of the world. Tom has often made the point stressed by Professor Tetlock: people who present the world in black and white terms and avoid shades of gray get the most attention, make the most speeches, and sell the most books. But as the good professor concludes, they are also the ones most often wrong in their predictions. Another point that Tom makes is that it is easier to be a naysayer and a Cassandra than to be someone who is optimistic and offers solutions. Kristof notes that a person’s style of thinking, rather than expertise, is often what determines how good a predictor he or she is. He talks about two kinds of people, “hedgehogs” and “foxes” (a distinction made famous by Sir Isaiah Berlin). Berlin, however, borrowed the distinction from a far older source, the ancient Greek poet Archilochus, who opined, “The fox knows many things, but the hedgehog knows one big thing.” According to Tetlock, foxes do much better in their predictions than hedgehogs. Kristof writes:

“Hedgehogs tend to have a focused worldview, an ideological leaning, strong convictions; foxes are more cautious, more centrist, more likely to adjust their views, more pragmatic, more prone to self-doubt, more inclined to see complexity and nuance. And it turns out that while foxes don’t give great sound-bites, they are far more likely to get things right. This was the distinction that mattered most among the forecasters, not whether they had expertise. Over all, the foxes did significantly better, both in areas they knew well and in areas they didn’t.”

Kristof would like to see experts held accountable for their views. The experts, on the other hand, probably don’t see much fun in that. Kristof’s point is that until experts are held accountable, people will go on thinking that they are somehow receiving good advice and, like lemmings, will follow the experts over the cliff. Kristof concludes:

“The marketplace of ideas for now doesn’t clear out bad pundits and bad ideas partly because there’s no accountability. We trumpet our successes and ignore failures — or else attempt to explain that the failure doesn’t count because the situation changed or that we were basically right but the timing was off. For example, I boast about having warned in 2002 and 2003 that Iraq would be a violent mess after we invaded. But I tend to make excuses for my own incorrect forecast in early 2007 that the troop ‘surge’ would fail. So what about a system to evaluate us prognosticators? Professor Tetlock suggests that various foundations might try to create a ‘trans-ideological Consumer Reports for punditry,’ monitoring and evaluating the records of various experts and pundits as a public service. I agree: Hold us accountable!”

My recommendation is to use experts — especially the foxes — as stimulators for your own thinking, not as “fair and impartial” analysts. You’re just as likely to get it right as the experts are, and in the process you may become a fox yourself.
