Sunday, April 14, 2013

Preference Falsification and Cascades

Preference falsification is the masking of one's real preferences in order to conform to "received" preferences, that is, to social norms, often to avoid the cost of nonconformity, e.g. social ostracism or criminal punishment. One effect of the phenomenon is that each person who is falsifying - that is, who holds a preference different from the received one - does not know who else might also be falsifying. They could be surrounded by other people who also think that social/government/religious belief X is B.S., or they might be the only one; it's hard to know, since even raising questions about X might reveal their actual preference regarding X. When the fact of widespread preference falsification suddenly becomes obvious - everyone realizes that everyone else thinks X is B.S. - things can change very quickly, and this explanation has been advanced for "surprising" revolutions (Iran in 1979, the Eastern Bloc in 1989, etc.). This is called a preference cascade; its opposite is a spiral of silence.
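The cascade dynamic can be sketched with a toy threshold model (my gloss, in the spirit of Granovetter-style cascade models, not Kuran's formal treatment): suppose everyone privately thinks X is B.S., but each person only says so publicly once the visible fraction of dissenters exceeds a personal threshold. Then a single fearless dissenter can unravel near-universal falsification, while removing that one person leaves everyone silent.

```python
# Toy threshold model of a preference cascade (a sketch, not Kuran's model).
# Everyone privately disbelieves X; each agent dissents publicly only once
# the fraction of visible dissenters meets their personal threshold.

def cascade(thresholds):
    """Return the number of public dissenters at equilibrium."""
    n = len(thresholds)
    public = 0  # visible dissenters so far
    changed = True
    while changed:
        changed = False
        new_public = sum(1 for t in thresholds if t <= public / n)
        if new_public != public:
            public = new_public
            changed = True
    return public

n = 100
# Thresholds 0/n, 1/n, ..., (n-1)/n: one zero-threshold dissenter pulls in
# the next person, who pulls in the next - a full preference cascade.
print(cascade([i / n for i in range(n)]))        # 100: everyone dissents

# Remove the lone zero-threshold agent and nobody ever moves first:
# a spiral of silence, even though every private preference is identical.
print(cascade([(i + 1) / n for i in range(n)]))  # 0
```

The point of the contrast is that the two populations differ by a single person's threshold, yet one ends in open revolt and the other in total silence - which is why such cascades look "surprising" from outside.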

Another implication of the theory is that secret ballots, as we have in the United States, should occasionally lead to such surprising shifts: behind the curtain you can vote to legalize marijuana, or to take voting rights away from ethnic/gender group X, or whatever it is you're otherwise afraid to admit in front of your neighbors - and hey, what do you know, everyone else voted that way too. Yet on referenda in the U.S., such surprising preference cascades don't happen very much; I can't think of many examples. Yes, Washington and Colorado just legalized marijuana, but that wasn't a preference cascade, since people told the pollsters they were going to. Another way of saying this: if the polls were predictive of the actual voting in a secret ballot (as they almost always are in the U.S.), then there was no preference cascade. This may represent a problem with the theory, or reflect that in the U.S. we don't falsify our preferences very often, or that referenda involving highly falsified preferences are somehow kept off the ballot.

The strongest example, indeed one of the only examples, that I can think of in recent U.S. politics is the Bradley effect, where Tom Bradley was projected to win the California gubernatorial race based on poll results, but he lost. The revealed preference here? Tom Bradley was black, and it was thought that many voters falsified their racial preference to pollsters but not at the ballot. Note, however, that even in this case, the unmasked preference did not result in a cascade of Californians suddenly becoming open about refusing to vote for black candidates.

A second and dare I say humorous phenomenon is observed where the received preference is subject to central control. Think of a dictatorship that modifies its propaganda, or a steep status hierarchy in business or academia. This differs from the usual situation in that centrally-controlled received preferences depend on very few nodes in the network and are therefore much more likely to shift rapidly, whereas "normal" received preferences (i.e. social norms) are held, or at least claimed, by large numbers of people and are inert over time. The game then becomes changing stated preferences to keep up with the fast-changing centrally-controlled ones. For a concrete example: "Eastasia is the enemy! Wait, MiniTruth says Eurasia is the enemy? I misspoke just now; what I meant was that Eurasia is the enemy, and always has been!" A logical next step in this game is a kind of truce where everyone agrees not to point out each other's very recently endorsed and now obsolete received preferences.

Depending on the signal you want to send to the central node that has changed the preference, there are two ways to play this game. If the central node can be convinced that you actually believe the new preference, and cares whether you do, then it becomes very important to avoid looking like all you're doing is repeating the received preferences - since that gives away that you probably don't believe them. This is the case in academia or business. In my medical training, I've often heard an attending object to a resident's treatment plan for a patient, and the resident will change his or her mind - but rather than say "Okay, you're higher on the ladder than me so I'll do what you want", the resident will say, "Oh, uh, well, you're right, I hadn't thought of that, and now that I have, I realize that's a better idea; let's do it that way." You can't just be a yes-man; you have to look like you really believe it. (I'm always struck by how few residents are willing, even in private, to admit that most of the time the decision has everything to do with agreeing with your attending, and little to do with evidence.)

On the other hand, in an authority structure where you just want to signal to the central node that "I'll do it your way no matter what", or where they just don't care what you really think as long as you keep your mouth shut, being overt and clumsy about your publicly stated preference-shifting may actually be a good thing. (See above for a terrifying example where Saddam Hussein purged the government and ministers were suddenly leaping to their feet shouting their newfound and obviously motivated loyalty.) Then the central node knows you'll endorse whatever preference they tell you to - and there may even be some effect of dampening or confusing conflicting internal preferences through this constant shifting. This is also more likely to be the case where the stated preferences are more ethereal - i.e. religious or ideological claims - versus those which you expect to directly and practically affect decisions you're making on a daily or hourly basis, as in business or medicine.
