Bias Hunter

You Are Irrational, I Am Not

29/10/2015

The past month or so I’ve been rereading Taleb’s Black Swan. I’m very much impressed by his ideas, and by the forceful, in-your-face way that he writes. It’s certainly no surprise that the book has captivated the minds of traders, businesspeople and other practitioners. The book is extremely good – good enough to recommend as a decision-making resource. Taleb identifies a cluster of biases (or more exactly, puts together research from other people to paint the picture), producing a sobering image of just how pervasive our neglect of Black Swans is in our society. And he’s a hilariously funny writer to boot.

But.

Unfortunately, Taleb – like everyone else – falls into the same trap we all do. He’s very adept at poking other people about their biases, but he completely misses some blind spots of his own. Now, this is not evident in the Black Swan itself – the book is very well conceptualized and a rare gem in its clarity about what it is as a book and what it isn’t. The problem only becomes apparent in the follow-up, the monstrous volume Antifragile. When reading that one a few years ago, I remember being appalled – no, even outraged – by Taleb’s lack of critical thought towards his own framework. Reading the book, one gets the feeling that the barbell strategy is everywhere, and explains everything from financial stability to nutrition to child education. For example, he says:
I am personally completely paranoid about certain risks, then very aggressive with others. The rules are: no smoking, no sugar (particularly fructose), no motorcycles, no bicycles in town [--]. Outside of these I can take all manner of professional and personal risks, particularly those in which there is no risk of terminal injury. (p. 278)
I don’t know about you, but I really find it hard to derive “no biking” from the barbell strategy.

Ok, back to seeking out irrationality. Taleb certainly does recognize that ideas can have both positive and negative effects. Regarding maths, at one point he says:
[Michael Atiyah] enumerated applications in which mathematics turned out to be useful for society and modern life [--]. Fine. But what about areas where mathematics led us to disaster (as in, say, economics or finance, where it blew up the system)? (p. 454)
My instant thought when reading the above paragraph was: “well, what about the areas where Taleb’s thinking totally blows us up?”

Now, the point is not to pick on Taleb personally. I really love his earlier writing. I’m just following his example and taking a good, personified example of a train of thought going off track. He did the same in the Black Swan, for example by picking on Merton as an example of designing models on wrong assumptions – and, more broadly, of models where mathematics steps outside reality. In my case, I’m using Taleb as an example of the ever-present danger of critiquing other people’s irrationality while forgetting to look out for your own.
Now, the fact that we are better at criticizing others than ourselves is not exactly new. After all, even the Bible (I would never have guessed I’d be referencing that on this blog!) says: “Why do you see the speck that is in your brother’s eye, but do not notice the log that is in your own eye?”
In fact, in an interview in 2011, Kahneman said something related:
I have been studying this for years and my intuitions are no better than they were. But I'm fairly good at recognising situations in which I, or somebody else, is likely to make a mistake - though I'm better when I think about other people than when I think about myself. My suggestion is that organisations are more likely than individuals to find this kind of thinking useful.
If I interpret this loosely, it seems to be saying the same thing as the Bible quote – just in reverse! Kahneman seems to think – and I definitely concur – that seeing your own mistakes is damn difficult, but seeing others’ blunders is easier. Hence, it makes sense for organizations to try to form a culture where it’s ok to say that someone has a flaw in their thinking – a culture that prevents you from explaining absolutely everything with your pet theory.
3 Comments
Matti Heino
4/11/2015 15:14:43

Hi,

"well, what about the areas where Taleb’s thinking totally blows us up?”

Have you developed this idea further? Any suggestions for what such areas could be?

Best,

Matti

Tommi
4/11/2015 18:05:36

Well, not off the top of my head. It's been a while since I read Antifragile, so I'd definitely have to look at it again. What Valtteri mentioned about the weight lifting was pretty hilarious, the same goes for his nutritional advice.

The point isn't that Taleb's thinking is wrong - we are all wrong occasionally, or even often - but that his metacognitive awareness of the possibility of being wrong seems really low. Of course, another possibility is that he is very aware, but just likes to present himself as a jerk in the books. I don't like either of these options very much...

Matti Heino
11/11/2015 18:54:27

Rationally speaking, the weight lifting and nutritional advice seem pretty inconsequential when discussing blow-ups :)

I wonder if it's legitimate to make estimates of one's metacognitive awareness based on a popular book that's meant to sell copies. Personally, I think it comes out more clearly in the discussions preceding and following the publishing. I guess one way of testing the weight of one's ideas is to be hyper-assertive: draw as large a crowd of dissenters as you can, and then see which ideas survive the battle. Not saying it's the best way, but it's one way.

If you're doing climate science, I don't see why you would *have* to address the possibility that everything you've done is messed up and the climate change deniers are right after all. You certainly *can* do it, but it's not a prerequisite for launching a book, I think.

I guess I'm trying to say that Antifragile is a poor proxy for making decisions about the writer's metacognitions :)

