The endowment effect is well studied: people assign a higher value to their own possessions than to identical things owned by someone else. That is to say: you wouldn't pay more than $30 for that couch on Craigslist, but when you list your own - same model, same condition - for some reason you ask (and somehow actually expect!) $75.
To many of us there are two central features of cognitive biases that make them so interesting. The first is that they're cognitive biases, not cognitive stupidities: we aren't scattered all over the map by limited brainpower, as bounded-cognition models would suggest; we consistently make mistakes in the same "direction". That consistency (even if it's consistent incorrectness) points to the second interesting aspect of cognitive biases, which is the extent to which they might actually be instrumentally rational shortcuts in disguise, either operating out of their original context or profitable only over the long run. That is, perhaps they were useful when we were hunter-gatherers. For example, hyperactive pattern detection (the type I error) is a terrible thing in the modern age, when we're watching for incoming nuclear warheads, but during the Pleistocene it couldn't get us into too much trouble. Sure, you might end up thinking the gods struck the mountain with lightning because your children cursed a lot that morning, along with a thousand other strange beliefs - but we couldn't do too much damage with our strange beliefs anyway. And if that one time you saw a shape in the grass that looked like a hyena, and it really was a hyena - well, you still came out ahead.
There's a lot in behavior and medicine like this that only makes sense in evolutionary context - fever, for instance. Fever is not something pathogens do to us; it's something our bodies actively choose to do to them. In the modern age it's hard to see how fever could be beneficial, especially when we remember that merely two centuries ago, without medicine, we could (and often did) die of fevers. But that reaction is a kind of base-rate fallacy. Fever is one way our body shakes off pathogens, and before modern medicine, that cut in your foot might have had a 70% chance of going septic and killing you. If the fever has only a 65% chance of killing you, you still come out ahead of any competitor who didn't have a fever response.
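The arithmetic behind that comparison is simple enough to sketch. The probabilities below are the illustrative numbers from the paragraph above, not real epidemiology:

```python
# Toy comparison of survival with and without a fever response,
# using the illustrative probabilities from the text (not real data).
p_death_sepsis = 0.70  # the wound goes septic and kills you, absent fever
p_death_fever = 0.65   # the fever response itself kills you

survival_without_fever = 1 - p_death_sepsis  # 0.30
survival_with_fever = 1 - p_death_fever      # 0.35

# Fever wins on average, even though fever itself is dangerous.
assert survival_with_fever > survival_without_fever
```

Even a 5-point edge in survival, compounded over generations, is enough for selection to favor the fever response.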
So how could a thoroughly judgment-clouding bias like the endowment effect ever be helpful? Finding a group of people who don't appear to have it might point us in the right direction. And indeed, Apicella et al. found that Hadza hunter-gatherers in contact with markets show the endowment effect just like the rest of us, but Hadza living out in the bush, away from markets, do not.
This is pretty amazing: a known bias differs even within the same group of people. So we've already learned one thing - the endowment effect is a learned behavior. Either we post-agricultural types are stupid, or we're getting something out of this bias. What's the benefit? And what exactly differs between the two groups that makes one of them adopt this strategy?
A simple model of markets assumes information symmetry between participants. In reality, specialization of labor means that the person approaching you to buy your possessions will almost always have better information than you, the seller being approached, because the buyer buys (cars, computers, art, etc.) all the time - and if they're initiating the transaction, they're even MORE likely to have better information. The endowment effect, then, may be a learned behavior whereby we value our own possessions more highly than the open market would justify, as a defense against information asymmetry.
Here's a concrete example. Imagine you're selling a car. If you seek out an offer and find another individual buyer, are you more comfortable that you're getting a fair deal from them than from a car dealer? Of course you are. Now imagine you're approached out of the blue by someone who buys cars for a living. Sure, you'd consider it, but only at a price high enough that you're sure you're not being swindled. (This isn't to suggest the endowment effect operates consciously, but you can see how, when we do calculate consciously, we behave exactly as the endowment effect would nudge us to.)
Consequently, you can think of the endowment effect as the cushion you need to come out even when someone with better information buys your possessions, especially someone who initiates the deal. And returning to the Hadza hunter-gatherers, we may have identified the relevant difference: hunter-gatherers are less specialized. If you're a hunter-gatherer, you know just about everybody you could buy from or trade with, and there's little special knowledge to create information asymmetry - i.e., everyone is equally smart about the relative value of gazelles and axe heads - so there's no cause to develop such a bias.
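The cushion logic can be sketched as a toy adverse-selection model. Everything here - the true value, the seller's noise level, the specific markup - is a hypothetical illustration, not from the original post: an informed buyer knows what the item is worth and only bites when the uninformed seller underprices it, so an endowment-style markup shrinks the seller's expected loss.

```python
import random

random.seed(0)

def seller_outcome(markup, n=100_000):
    """Average gain/loss per round for a seller with a noisy value estimate,
    facing an informed buyer who only buys when the ask is below true value.
    (Illustrative model; all parameters are made up.)"""
    total = 0.0
    for _ in range(n):
        true_value = 100.0
        estimate = true_value + random.gauss(0, 20)  # seller's noisy appraisal
        ask = estimate * (1 + markup)                # endowment-style markup
        if ask < true_value:       # informed buyer bites only on underpriced asks
            total += ask - true_value  # seller's loss on that deal
    return total / n

# With no markup the seller loses on average to the informed buyer;
# a hefty markup filters out most of the bad deals.
assert seller_outcome(0.0) < seller_outcome(0.75)
```

The markup doesn't make the seller smarter; it just rejects most of the transactions an informed counterparty would propose, which is exactly the "cushion" described above.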
This theory makes several testable claims.
1) The more information asymmetry between the possession's owner-seller and the buyer, the larger the endowment effect gap should be. You think your car is worth more when a car salesman runs up to you on the street offering to buy it than when you're selling to someone online with apparently equal knowledge of cars.
2) The endowment effect gap should be larger in markets with poorer information and less trust, and for goods that are more difficult to value - complex goods especially.
Not central to the theory, but relevant given that this appears to be learned behavior: children below some young age should not show the endowment effect - if the bias differs between humans, it's learned, and young kids haven't been swindled out of their possessions enough times. (And what is that age?) Non-hunter-gatherers who nonetheless live in small groups with little interchange with other populations should also lack the bias. And hunter-gatherers should develop it after coming into contact with markets.
There is also a transaction-cost argument, separate from this theory. Yes, you might have just gotten a good deal on your used Toyota, but now you have a Nissan, and there's a utility cost to learning how to operate a new car. If that's ALL the endowment effect is, then each individual's ability to learn new behaviors should completely predict the strength of their endowment effect, at least for complex goods like cars.
Monday, June 30, 2014
Saturday, June 21, 2014
Previously I posted that we shouldn't be surprised when media information we receive passively isn't useful to us. It costs money to transmit information that broadly, and it happens because the people on the other end believe it benefits them, usually financially. That doesn't correlate positively with the information being useful to the recipient, and it might even correlate negatively. Information you put in effort to find, in order to make a decision, is more likely to be useful to you. Robin Hanson approaches the same idea in his post Why Info Push Dominates.