Bias Hunter

Discussing Rationality

3/3/2015

I have a confession to make: I’m having a fight. Well, not a physical one, but an intellectual one, with John Kay’s book Obliquity. It seems to me that we have some differences in our views about rationality.

Kay writes that he used to run an economic consultancy business, and they would sell models to corporations. What he realized later on was that nobody was actually using the models for making decisions, but only for rationalizing them after they were made. So far so good – I can totally believe that happening. But now for the disagreeable part:
They have encouraged economists and other social scientists to begin the process of looking at what people actually do rather than imposing on them models of how economists think people should behave. One popular book adopts the title Predictably Irrational. But this title reflects the same mistake that my colleagues and I made when we privately disparaged our clients for their stupidity. If people are predictably irrational, perhaps they are not irrational at all: perhaps the fault lies not with the world, but with our concept of rationality.
- Obliquity, preface 
Ok, so I’ve got a few things to complain about. First of all, it’s obvious we disagree about rationality. Kay thinks that if you’re predictably irrational, then maybe the label of irrationality is misplaced. I think that predictable irrationality is a double-edged sword. The bad news is that predictability means the errors are systematic, not random – we’re irrational in many instances. But predictability also means we can look for remedies: if irrationality is not just random noise, we can search for cures. The second thing I seem to disagree about – based on this snippet – is the cause of irrationality. For Kay, it’s stupidity. For me, it’s a failure of our cognitive system.

Regarding Kay’s conception of rationality, my first response was whaaat?! Unfortunately, that’s really not a very good counterargument. So what’s the deal? In my view, rationality means maximizing your welfare or utility, looked at from a very long-term and immaterial perspective. This means that things like helping out a friend or giving money to charity are fine. Even gift-giving is fine, because you can place value on the act of trying to guess your friend’s preferences. After all, to me this seems to be a big part of gift-giving: when we get a gift that shows insight into our persona, we’re extremely satisfied.

Since Kay is referring explicitly to Dan Ariely’s Predictably Irrational, it makes sense to look at a few cases of (purported) irrationality that it portrays. Here are a few examples I found in there:

  1. We overvalue free products, choosing them even though a non-free option offers better value for money (chapter 3)
  2. We cannot predict, in a cold emotional state, what our preferences will be in a hot one (chapter 6)
  3. We value our possessions more highly than other people do, so we try to overcharge when selling them (chapter 8)
  4. Nice ambience, brand names etc. make things taste better, but we can’t recognize this as the cause (chapter 10)
  5. We used to perform surgery on osteoarthritis of the knee – later it turned out a sham surgery had the same effect

If Kay wants to say that these cases are alright, that this is perfectly rational behavior, then I don’t really know what one could say to that. With the exception of point 3, I think all cases are obvious irrationalities. The third point is a little more complex, since in some cases the endowment effect might be driven by strategic behavior, i.e. trying to get the maximum selling price. However, the effect also shows up in experiments where items are handed out to people at random, with a payout structure that ensures they should ask for their utility-maximizing selling price. But I digress. The point is that if Kay wants to say these examples are okay, then we have a serious disagreement. I firmly believe we’d be better off without these errors and biases. Of course, what we can do about them is a totally different problem – but it seems that Kay is arguing that they are in principle alright.

The second disagreement, as noted above, is about the causes of such behaviors. Kay says he and his colleagues chided their clients’ ‘stupidity’ for not using the models of rational behavior. Well, I think that most errors arise due to us using System 1 instead of System 2. Our resources are limited, and we’re more often than not paying inadequate attention to what is going on. This makes irrationality not a problem of stupidity, but a failure of our cognitive system. Ok, so intelligence is correlated with performance on some rational decision-making tasks, but for other tasks there is no correlation (Stanovich & West, 2000). It’s patently clear that intelligence alone will not save you from biases. And that’s why calling irrational people stupid is – for want of a more fitting word – stupid.

Ok, so not a strong start from my perspective for the book, but I’m definitely looking forward to what Kay has to say in later chapters. There’s still a tiny droplet of hope in me that he’s just written the preface unclearly, and he’s really advocating for better decisions. But there’s also the possibility that he’s just saying weird things. I guess we’ll find out soon enough.

Intelligence Doesn't Protect from Biases

16/9/2014

Perhaps one of the most common misconceptions about biases is that they only happen to dumb people. However, this is – to be clear – false. I think there are a few reasons why this misconception persists.

Firstly, a lot of bias experiments seem really obvious after the correct answer has been revealed. This plays directly into our hindsight bias – also aptly named the knew-it-all-along effect – in which the answer makes us think “oh, I wouldn’t have fallen into that trap”. Well, as the data shows, you most likely would have.

A second reason is that a popular everyday conception of intelligence implies roughly that “more intelligence = more good stuff”. Unfortunately, this simplistic rule fails to work here. Intelligence in scientific terms is cognitive ability, which is computational power. In terms of biases, lack of power is not the issue. The issue is that we don’t see or notice how to use the power in the right way. It’s like someone trying to hammer in nails by hitting them with the handle. Sure, we could say he needs a bigger hammer, but the reasonable solution is to turn the hammer around and use the right end.
[Image: Special Offer: The Hammer of Rationality]
Stanovich & Stanovich (2010, p. 220) summarize in their paper why intelligence does not help very much with rational thinking:

[--] while it is true that more intelligent individuals learn more things than less intelligent individuals, much knowledge (and many thinking dispositions) relevant to rationality are picked up rather late in life. Explicit teaching of this mindware is not uniform in the school curriculum at any level. That such principles are taught very inconsistently means that some intelligent people may fail to learn these important aspects of critical thinking. 

In their paper they also tabulate which biases or irrational dispositions have an association with intelligence, and which have not (Stanovich & Stanovich, 2010, p. 221):
[Table: biases and irrational dispositions, by association with intelligence (Stanovich & Stanovich, 2010, p. 221)]
Now I’m not going to go through the list in more detail; the point is just to show that there are tons of biases that have no relation to intelligence, and that for the other ones the association is still quite low (.20–.35). In practice, such a low correlation means that intelligence is not a dominant factor: dumb people can be rational and intelligent people can be irrational.
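To see just how weak a .20–.35 correlation is, here’s a quick back-of-the-envelope calculation (my own sketch, not from the paper): squaring the correlation coefficient gives the share of variance in rational-thinking performance that intelligence would explain.

```python
# Share of variance explained by intelligence, given the .20-.35
# correlations cited above. For a simple correlation, r-squared is
# the proportion of variance in one variable explained by the other.
for r in (0.20, 0.35):
    print(f"r = {r:.2f} -> explains {r * r:.1%} of the variance")
```

Even at the top of the range, intelligence accounts for only about 12% of the variance – the rest is down to something else entirely.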

Now, some might find the lack of association with intelligence a dystopian thought. If intelligence is of no use here, what can we do? To be absolutely clear, I’m not saying that we are doomed to suffer from these biases forever. Even though intelligence does not help, we can still help ourselves by being aware of the biases and learning better reasoning strategies. Most biases arise due to our System 1 heuristics getting out of hand. What we need in those situations is better mindware, complemented by slower and more thorough reasoning.

Thankfully, that can be learned.
