Two cognitive books: Kahneman vs. Todd

I just finished Thinking, Fast and Slow by Daniel Kahneman. It is long, with many chapters discussing many cognitive biases, which makes it hard to summarize in my usual "slog through and take notes" way. Lucky you! It was quite interesting, though. One of those books that makes you think twice about democracy, except, as Kahneman points out, there's no guarantee the experts are any better.

One theme is a set of contrasts: System 1 vs. System 2, where 1 is fast, automatic, parallel, associative, and heuristic, basically systems of perception, memory and recognition; 2 is slow, serial, more logical, conscious, able to direct the attention of 1, and lazy. Another is Econs vs. Humans, rational decision makers vs. real people. A third, near the end, is the experiencing self vs. the remembering self. E.g. we tend to remember a painful episode not by the total pain, at least as modeled by a simple integral, but by the average of the peak pain and the last pain.

(So you can subject people to 60 seconds of their hand being in cold water, vs. 60 seconds of their hand being in equally cold water plus 30 more seconds of slightly less cold water, and they will choose to repeat the second experience over the first, because they remember the lower pain at the end, even though objectively it seems an entirely worse experience.)
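The peak-end arithmetic is easy to make concrete. A toy calculation (the pain levels here are made-up numbers for illustration, not Kahneman's actual data):

```python
# Toy illustration of the peak-end rule. Pain levels are invented numbers.
def total_pain(samples):
    """'Objective' pain as a simple integral: sum of per-second pain levels."""
    return sum(samples)

def remembered_pain(samples):
    """Peak-end rule: the average of the peak pain and the final pain."""
    return (max(samples) + samples[-1]) / 2

trial_a = [8] * 60              # 60 seconds of cold water (pain level 8)
trial_b = [8] * 60 + [5] * 30   # same, plus 30 seconds of less cold water

print(total_pain(trial_a), remembered_pain(trial_a))   # 480 8.0
print(total_pain(trial_b), remembered_pain(trial_b))   # 630 6.5
```

Trial B is objectively worse (more total pain) but remembered as better, which is exactly the preference reversal in the experiment.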

Looking at http://en.wikipedia.org/wiki/List_of_biases_in_judgment_and_decision_making I'm reminded that major ones he talks about are anchoring, priming, framing effect, halo effect, base rate neglect, availability heuristic, endowment effect and loss-aversion, focusing effect (particularly by the remembering self, at the expense of the future experiencing self), impact bias, peak-end rule (what causes the cold water result).

Focusing: if you ask someone in Chicago how happy people in California are, the Chicagoan will think of the climate as a salient feature, focus on that, and expect Californians to be happier. In fact most Californians take the weather for granted, and aren't obviously any happier. Similarly people who don't know a paraplegic will expect one to be pretty unhappy after a year, whereas they tend to learn to cope and have close to normal levels of happiness.

If you ask people how happy they are, the current weather tends to have a big effect. Unless you first ask them what the weather is; then in considering their happiness, the weather is salient and they control for it.

Loss-aversion: given a chance to bet on a fair coin, heads they win $20, tails they lose $10, there's a tendency for humans to avoid the bet; losses are more painful, despite the nice expected value. If offered a chance to bet on the result of 100 coin tosses, most everyone would jump at that. Kahneman notes that this is too narrow minded: life is a whole series of small diverse bets with positive expected value, and it'd be very costly to systematically avoid them just because they look like diverse and unconnected bets, unlike 100 identical coin tosses. This feels relevant to me, who tends to be proudly risk-averse.
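The aggregation point can be checked with a little arithmetic: a single bet has expected value 0.5 × $20 − 0.5 × $10 = +$5, and the chance of ending behind falls off fast as bets accumulate. A sketch using the stakes from the example (the exact probabilities are just what the binomial distribution gives):

```python
from math import comb

def prob_net_loss(n, win=20, lose=10, p=0.5):
    """P(ending below zero after n independent fair-coin bets:
    +win on heads, -lose on tails), summed over the binomial distribution."""
    total = 0.0
    for k in range(n + 1):                      # k = number of winning tosses
        if k * win - (n - k) * lose < 0:        # net outcome is a loss
            total += comb(n, k) * p**k * (1 - p)**(n - k)
    return total

print(prob_net_loss(1))     # 0.5 -- a single bet loses half the time
print(prob_net_loss(100))   # well under 0.1% chance of a net loss
```

So the narrow frame sees a 50% chance of pain on each bet, while the broad frame sees a portfolio that almost never loses.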

Framing: ask corporate executives about a fair chance to double or halve their capital, and most will avoid it. Their CEO would love for them to all take such a bet, as he can see the aggregate benefit to the company. Narrow framing vs. broad framing, and this actually goes back to the previous example: narrow framing is "how do I feel about the potential small loss here", broad framing is "what life policy should I have to such bets in general?"

"Some people are more like their System 1, others are more like their System 2."

Keith Stanovich breaks up System 2 into an algorithmic mind -- slow thinking and demanding computation, IQ test performance, ability to switch tasks quickly and efficiently -- and a rational mind, or what Kahneman calls 'engaged', which is about reflectivity and resistance to biases, or the ability to recognize when biases are likely and thus to slow down and think more. Someone can be intelligent, yet highly subject to bias; I couldn't help thinking of Intelligence and Wisdom in D&D.

This book also had the thing I mentioned recently, where asking people to think like a trader changes their behavior (in particular, makes them less loss averse in an experiment), which prompted me to think "f-ck! who would have thought of that as a requirement for human-equal AI? Intelligence is Hard."

***

My subject said two books. I've just started the second one, which is Simple Heuristics That Make Us Smart, by Gerd Gigerenzer, Peter Todd, and others. You'd think that'd be a similar research program to Kahneman and Tversky's, but there's apparently a fair bit of discord. I'd heard of Todd and his heuristics program back at IU -- he was there and I took a class -- so I recognized him when Kahneman mentioned them briefly in a footnote, saying they focused more on statistical simulation, that their evidence for actual psychological use was limited and disputed, and that for all its flaws, there's no need for System 1 to be frugal; it's built to use vast quantities of information while still being fast. This in 2012.

The other book, written in 1999, mentioned Kahneman and Tversky almost right away, with frequent sniping about how they focus on biases and deviations from a supposed perfectly rational ideal, while ignoring the ecological adaptedness and accuracy of fast and frugal heuristics. I've read a few chapters, and it's been an interesting reflective exercise to watch my biases at work. I liked Kahneman's book and his "no need to be frugal" criticism seemed plausible, so I come in biased against this work. The tone seems pettier, so there's a halo effect -- I don't like that, so I'm disposed to not like the content. And IMO it's an uglier book, particularly in the font, so that's the halo effect again.

As for the actual content, the first part was about the recognition heuristic and their famous example. If you're asked to judge which of two cities is larger, and you don't know, but you recognize that you've heard of one of them, it's a good bet to say that one is larger. Strikingly, you can do better by "knowing less": Americans might have more pairs of cities they've heard of and thus are stumped by, while Germans are more likely to have just heard of the biggest US cities. And they had a computer model that did best when taught the first 23 of 83 German cities that Americans recognized, even with other cues to help decide between pairs of recognized cities. (Basically, those cues were less accurate than just recognition, when applicable.) The next chapter talks about how recognition did better in picking a stock portfolio for 6 months of 1999 than almost any other strategy; they do acknowledge some of the potential pitfalls there.
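The heuristic itself is almost trivially simple to state as a decision rule. A sketch with hypothetical data (the city populations and the "recognized" set are illustrative inventions, not the book's actual German-city dataset):

```python
import random
from itertools import combinations

def pick_larger(city_a, city_b, recognized):
    """Recognition heuristic: if exactly one of the two cities is recognized,
    bet that it is the larger one; otherwise guess at random."""
    a_known = city_a in recognized
    b_known = city_b in recognized
    if a_known != b_known:
        return city_a if a_known else city_b
    return random.choice([city_a, city_b])

# Hypothetical data: rough populations in millions, for illustration only.
populations = {"Berlin": 3.6, "Hamburg": 1.8, "Munich": 1.5,
               "Bielefeld": 0.33, "Hagen": 0.19}
# What a hypothetical American might recognize: only the biggest cities.
recognized = {"Berlin", "Hamburg", "Munich"}

# Score the heuristic over every pair of cities.
pairs = list(combinations(populations, 2))
correct = sum(pick_larger(a, b, recognized) ==
              (a if populations[a] > populations[b] else b)
              for a, b in pairs)
print(f"{correct}/{len(pairs)} pairs judged correctly")
```

The heuristic is only informative on mixed pairs (one recognized, one not); on the rest it can do no better than chance, which is why "knowing less" -- recognizing only the genuinely big cities -- can help.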

My response was that yes, the recognition heuristic seems plausible and sensible in that situation (the cities one), but how common is that in the real world? And in their emphasis on "fast and frugal", and desire for clear computational models, they dismiss some alternatives, like familiarity. They seem to say that's too vague to consider, or to form part of a research program, yet it seems obvious to me that if I recognize both cities but feel I've heard of one of them more, then I'll bet on that one and likely do well, and that familiarity -- number of associations, sense of prototypicalness, or just a vague sense of hearing of it a lot -- should not be out of bounds for a cognitive research program, even if it would take more work to evaluate and model.

I just realized that Kahneman talked a lot about substitution effects -- faced with a hard question, like "how happy are you with your life", System 1 substituting an easier question, like "how happy am I right now". And the recognition effect would be just that. Memory doesn't return the size of a city, but recognition (or familiarity) is something, and can be substituted in.

See the DW comments at http://mindstalk.dreamwidth.org/342642.html#comments

Comments

( 11 comments — Leave a comment )
houseboatonstyx
Nov. 24th, 2012 10:08 am (UTC)
(So you can subject people to 60 seconds of their hand being in cold water, vs. 60 seconds of their hand being in equally cold water plus 30 more seconds of slightly colder water, and they will choose to repeat the second experience over the first, because they remember the lower pain at the end, even though objectively it seems an entirely worse experience.)

Should that second 30 seconds be 'slightly LESS cold'?
mindstalk
Nov. 24th, 2012 11:54 am (UTC)
Yes, thank you.
fpb
Nov. 24th, 2012 10:41 am (UTC)
"If you're asked to judge which of two cities is larger, and you don't know, but you recognize that you've heard of one of them, it's a good bet to say that one is larger"

Lusaka is larger than Venice. Paterson, NJ, is larger than Athens, Greece. Astana is larger than Amsterdam. And even among better known cities, there is no such certainty: Birmingham, UK, is larger than San Francisco, and Naples is larger than Amsterdam. And there have by now got to be almost a dozen cities larger than New York City, even counting the tri-state area, none of which - not even Shanghai or Mexico City - you would think of first.
mindstalk
Nov. 24th, 2012 11:57 am (UTC)
The heuristic doesn't work well for pairs of cities across nations, no, especially with Third World megacities involved. For cities within a country it works fairly well. Not perfectly of course, but that's not the point.
fpb
Nov. 26th, 2012 09:22 pm (UTC)
I was trying to make the point that other issues are more important than size for recognition. Amsterdam is barely 700,000 people, but it has an imperial, if discreditable, past; and so does Venice (80,000 people). San Francisco is small, but it has been a cultural centre from of old; Birmingham UK is rather large, but inevitably provincial to London. Culture and history, more than size or even economic importance, dictate the rankings of cities (and other things) in people's minds. If I were to ask you to rank Italy's three largest cities - Milan, Rome, Naples - for size, I am almost sure you would get it wrong; the order is Naples, Rome, Milan. But the world gets to hear of Milan, the richest and the cultural capital, and of Rome, the political capital.
mindstalk
Nov. 27th, 2012 12:15 am (UTC)
The point isn't that it's some failsafe method for determining size; it's a heuristic, after all. The point is just that sometimes "have I heard of it" is surprisingly useful information by itself.
harimad
Nov. 24th, 2012 02:24 pm (UTC)
You might also enjoy The Psychology of Intelligence Analysis (free download from CIA's website). Heuer was a CIA employee writing for CIA analysts ergo the "intelligence" part of the title, but the book is broadly applicable to most (all?) forms of persuasion.
notthebuddha
Nov. 24th, 2012 04:24 pm (UTC)
Framing: ask corporate executives about a fair chance to double or halve their budget, and most will avoid it. Their CEO would love for them to all take such a bet

Er, why would the CEO love it? Doesn't it increase budgeted expenses by 50% on avg?
mindstalk
Nov. 24th, 2012 07:07 pm (UTC)
I guess I didn't phrase that well. 'Double' was meant to imply return; i.e. doubling their capital or losing half of it.
februaryfour
Nov. 24th, 2012 05:18 pm (UTC)
Randomly, do you know this shows up twice in LJ (though only once in DW)?
mindstalk
Nov. 24th, 2012 07:02 pm (UTC)
No, I hadn't. Dreamwidth thought the first crosspost failed, either it didn't or something timed out and made an edit post again. Well, thanks!
