Wednesday, December 30, 2009

Automation and Unemployment

Since the industrial revolution began there has been speculation about the long-term effect of automation on employment. You can create plausible-sounding but completely opposing just-so stories: that automation will free humans from drudgery to focus on more important work or leisure, or that automation will destroy employment. The first argument is one made by the now silly-seeming futurists of the 1950s who predicted a 20-hour work-week for Americans.

That mid-century naivete works fine if all you want is to maintain 1955's level of productivity - but these decisions do not result from central planning. They result from individual humans who seek status, who are competitive, and who want more wealth if there's more wealth to be had. (Fortunately, this is the path that results in economic growth, rather than stagnation and class sedimentation.) Consequently, the real work-week actually grew slightly in the second half of the twentieth century.

The second argument - that the machines will put us all out of work - is a more pessimistic version of the same argument. In this scenario, the 20-hour work week exists not because that's all we have to do to maintain our current economic growth rate, but because there just isn't any more work to be done - the machines are doing it all.

There are examples in our past we can point to. Yes, there were labor shocks in the U.S. in the 1970s and 80s, due to automation and offshore competition; yes, the populations of many Midwestern American states are lower today than they were a century ago, because those economies are mostly dependent on agriculture, and one person with machines today can produce what many did a century before. But our population as a whole grew, and became richer in that time, even if in certain industries or geographic areas at certain times there were disruptions. No one in 2009 is complaining that the beaver pelt industry is moribund, because people learned other trades besides trapping. Likewise there are industries besides agriculture, and as economies expand, new industries appear, and with them new needs.

That's why it's hard to see why we'll suddenly all be out of work, unless we're talking about fully self-repairing AI, in which case we'll have other problems. By now there are many more examples we can point to of automation expanding work. What is most relevant for the future is "white collar automation". It's certain that the automation of physical labor that the world experienced over the last century and a half (and which it's still experiencing in many quarters) is not the end of the road, and that computers will start automating white collar tasks much more than they already have. (Forget about outsourcing to India.)

Have there already been "cognitive" tasks automated where we can gauge the impact on the industry in question? Yes. That computers and calculators are widely used does not seem to have put accountants out of business - except, of course, for those accountants who refused to learn how to use a computer - and now one accountant can produce what many did a century ago. The same is true for other white collar professions. The answer is that in the future, on some level, we will all be programmers. (Compared to our parents, aren't we already?) We'll use the much-feared automation as additional tools to increase our productivity. Just don't be that guy in the office who won't learn how to use Excel, and you'll be fine.

One challenge to the coming white collar automation blitz is the existence of administrative sinecures. The invisible hand has clearly not yet done away with them in the private sector, nor in government or academia. Since their existence flies in the face of economics, their persistence is probably best explained by universal irrational aspects of human psychology that will exist even in competitor organizations. Once the expert systems start coming on-line in earnest, will these positions survive? Will automation, over time, make them seem more and more suspicious and non-contributory? One down-side to the new automation is that sinecures are not quantized; it's likely that some percentage of your time is sinecurity, and the inefficiencies that allow it to persist will be erased, or at least exposed. Your little sick days to catch up on soaps? Your argument that you really need to audit that company in Miami? Your coworkers' simulations will catch it.

Alternatively, it's possible to imagine that while producers stay ahead of the automation blitz by learning to use it to produce, sinecurists will learn new strategies to justify why their positions are "more art than science" and can't be reduced to vulgar numbers by mere machines.

Monday, December 21, 2009

Robert Frank, Che Guevara and Chilean Miners

In The Motorcycle Diaries, Che Guevara and his roadtrip buddy come upon a couple huddling at a stony roadside in the nighttime Chilean desert. They're waiting for mining bosses to show up and hire them to work in the Chuquicamata mine. They're poor, and they're cold. Moved by their plight, Guevara and his friend give the couple their coats. (This was the period before Guevara was killing his own soldiers for not being enthusiastic enough, something it would be useful for the world's T-shirt wearers to remember.)

Of course we're led to assume (implicitly) that the Anaconda mining bosses (who are often unpleasant, to put it mildly) are villains. But it's hard to ignore that the mining company is what makes it possible for the couple to make this living, the unpleasantness of its bosses notwithstanding. I most emphatically don't mean to call this couple whiners, or to diminish the back-breaking grind of mining. But the salient point is that without the mining company and its bosses, the couple in question would go from making at least some living to making no living. Because humans in general have difficulty thinking about counterfactuals, and about the effect of status on their actions and happiness, this reality remains masked.

What does that mean? Think about this: imagine you have a chance to be the only person making $120,000 a year in a neighborhood where everyone else makes $100,000; or, you could be the only person making $180,000 a year in a neighborhood where everyone else makes $200,000. For simplicity, let's say you can change neighborhoods at will, so you're not permanently committing. No equivocating about
relative purchasing power - in the second case, you get $60,000 more per year of stuff, including a safer neighborhood. If humans were rational optimizers of external wealth, this would be a no-brainer for everybody - take the $180,000, and who cares that the neighbors think you're low-class, right? Wrong. This is a tough
call for most people, because relative status matters a great deal to human beings. This of course is the classic Frank conundrum.

How does this apply to the miners? When the couple is making a living, their needs are taken care of; they certainly aren't making $180,000, but they're at least making something, and in earning it they have to deal with unpleasant, often foreign bosses whose own status reminds the miners of their station in life. On the other hand, when the miners are unemployed, they don't have to deal with unpleasant bosses to obtain the non-money that they're not making. It's a difference between being made acutely aware of their social status and being under the control of people they don't like when they're making some money, versus not being hassled by non-bosses who are not giving them money they're not making.

(One obvious solution to diminish any reputation for being abusive imperialists is to enforce fairness in hiring and labor standards above and beyond the local norm. This works at home too, as American technology companies competing for talent in the 1980s began to realize. Of course, Anaconda was not forced by any such labor shortage to do so, which is where governments come into the picture. If you think free markets work, it's in your interest to take actions that make other people favor them too. Given their impact on Guevara, unpleasant bosses at Chuquicamata certainly didn't do capitalism any favors in South America.)

The default position of human societies is poverty, not wealth, and sometimes only foreigners have the capital to begin profitable operations. The unpleasantness of earning a living seems especially acute when the work is difficult and the labor market is imbalanced, and mineral extraction is probably the best example. But if there were no mine, these poor would have been poorer. There is a lesson for Americans in this too. Yes, you might not like your boss, and you might not like that a Japanese company just bought out your plant. Given the state of credit markets in the U.S., without that buyer the plant would have closed. And no one is making you work there. Remember - for simplicity, you can change neighborhoods at will.

Thursday, December 17, 2009

Don't Like the Nobel Peace Pick This Year? Start Your Own

Most of the Nobel categories are necessarily open-ended; the achievement that merits the prize is not defined in advance. No one can predict what will be useful in chemistry ten years from now, or medicine, or economics.

This isn't true in the peace category. I think we can all point to specific parts of the world where people are suffering, and to exactly what's going on there to make this so. That being the case, why aren't we setting concrete, measurable goals in advance, with prizes to motivate people? Why wait for future Matt Peterses and Greg Mortensons to have an epiphany; why not an epiphany catalyst fund?

A new Nobel - call it the X Prize of progress in human welfare - might work like this:

- Specific prizes would be set in advance, after lengthy consideration of rough cost, possibility, and time frame.

- Prize terms would be incrementally measurable solutions to local, concrete problems. There will be no debate over whether a winner or winners deserve(s) it, because the result will be measurable. No "help the children of Africa"; more like "80% literacy in Zambia by 2020", with staggered decreases for every percent lower than that. (Or, "twenty million for a five percent increase in literacy in Zambia by 2020, and five million more for every percent increase above that" - see the payout sketch after this list.)

- There will be standing conditions for all the prizes. You can't finance your operation leading up to the prize by stripping a place of natural resources; activities will be audited prior to awarding the prize to root out corruption. The prize committee will establish how it will measure the achievement and include it along with each new prize announcement.

- On the other hand, the award can be distributed however the winners want - including to stakeholders. If you can think of a way to get Somali warlords to establish a lasting government by paying them off in installments, fine.

- The prize committee would be happy to offer help to contenders who don't have the capital to begin the work (that's all of them) by linking them to whatever NGOs or multinational resources are already out there.

- The prize money would be private, and of course, bigger than the Nobel (I'm looking at you, Buffett and Gates and Soros). Start the endowment and announce a bunch of long-term (but still concrete) goals to allow it to compound.
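
To make the staggered payout concrete, here's a minimal sketch of the prize arithmetic, using the hypothetical Zambia numbers above (the function and its parameters are invented for illustration):

```python
def literacy_prize(final_rate, target=0.80, base_award=20_000_000,
                   step_award=5_000_000):
    """Staggered payout for a hypothetical literacy prize: base_award for
    hitting the target (e.g. 80% literacy in Zambia by 2020), plus
    step_award for every full percentage point above it, and step_award
    subtracted for every point below it, bottoming out at zero."""
    points_over = round((final_rate - target) * 100)
    return max(0, base_award + points_over * step_award)

print(literacy_prize(0.80))  # 20000000 - hit the target exactly
print(literacy_prize(0.85))  # 45000000 - five points over
print(literacy_prize(0.76))  # 0 - four points under wipes out the award
```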


Criticisms:

- Lots of people with money want to make the world a better place but don't want to go to Sudan to make it happen. This outsources the work.

- Target governments might complain to the traditional interfering villains (the US, NATO, EU) - let them. They'll have to come into the light of day in an international forum like the UN to complain that evil imperialists are forcing their people to have clean water, literacy and open elections, and the governments
hosting the teams competing can always say "we can't do anything; they're private citizens". (Note Iran's efforts against Twitter; embarrassing when you're reduced to fighting your adversaries' companies, as opposed to actually confronting your adversaries.)


Pie in the sky? There are lots of people who want to make the world better, and a lot of money. The pie-in-the-sky part is getting the endowment set up; writing the goals and taking applications would be a piece of cake. Encourage grad students. Start small and not overambitious: farm output in a single 10 km by 5 km valley? In a single village? Fine. Look at what has worked before and go do it.


Suggested prize targets to start out:
- Internet Censorship in China Ended by Year X

- Open Elections in China by Year X

- Literacy in [Developing Country] 80% by Year X

- Free Movement of People Between the Koreas by Year X

- Everybody in [fill in fourth-world country] Has Non-Cost-Limiting Access to Clean Water by X

- New Mexico Energy Entirely Green by X As Flagship to Rest of U.S.

- Iran Makes Official Announcement Abandoning Nuclear Weapons Ambitions by date X

- First Weapons Destroyed in Pakistan and India Nuclear Disarmament by X (note: a new study shows that global nuclear winter could occur from a Pakistan-India nuclear exchange alone)

- Israel and Palestine: No Troop-Driven Violence for Time Period X

- Iraq or Afghanistan (or Even just Anbar Province or Herat) Self-Governing with Minimal US Troop Presence (I daresay a private group outside the US government would be able to plan for this more effectively than has been done)

Sunday, December 6, 2009

Coastline and Wealth

Curve-fitting doesn't give strong R values; for per capita income (purchasing power parity) vs. absolute coastline length or coastline per unit area, no linear fit yields an R greater than 0.19.

However, average per capita income for the 138 countries I had both coastline and PCI data on was $13,562, vs. $9,040 for the landlocked countries I had PCI data on. No surprise.
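
A minimal sketch of that comparison, assuming a hypothetical countries.csv with columns country, coastline_km, and pci (blank fields meaning no data):

```python
import csv

coastal, landlocked = [], []
with open("countries.csv") as f:  # hypothetical file: country, coastline_km, pci
    for row in csv.DictReader(f):
        if not row["pci"] or not row["coastline_km"]:
            continue  # skip countries missing either figure
        pci, coast = float(row["pci"]), float(row["coastline_km"])
        (coastal if coast > 0 else landlocked).append(pci)

print(f"mean PCI, countries with a coastline: ${sum(coastal) / len(coastal):,.0f}")
print(f"mean PCI, landlocked countries:       ${sum(landlocked) / len(landlocked):,.0f}")
```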

Though Liechtenstein is not included in this analysis, it is one of two doubly-landlocked countries (the other being Uzbekistan); Liechtenstein is typically ranked first in per capita income rankings, so geography is not destiny.

Monday, November 30, 2009

Kay Sage and Yves Tanguy

Visited the Philly Art Museum yesterday. Besides their two Roberto Mattas, a couple of pieces caught my eye; Kay Sage and Yves Tanguy were married, and produced cool stuff. A nice one from Sage:

Saturday, November 14, 2009

Decoding Eco's Irony

The following from a Spiegel interview (hat tip to Marginal Revolution).

SPIEGEL: Why are these lists and accumulations so particularly important to you?

Eco: The people from the Louvre approached me and asked whether I'd like to curate an exhibition there, and they asked me to come up with a program of events. Just the idea of working in a museum was appealing to me. I was there alone recently, and I felt like a character in a Dan Brown novel [emphasis mine - MC]. It was both eerie and wonderful at the same time. I realized immediately that the exhibition would focus on lists. Why am I so interested in the subject? I can't really say. I like lists for the same reason other people like football or pedophilia. People have their preferences.


He's either being more humble than he should be, or profoundly ironic in a way I can't quite unpack.

Tuesday, November 10, 2009

Specialization and Gaming the System

All professions are conspiracies against the laity.
- George Bernard Shaw

One of the problems of specialization, and of living in a literate society of laws, is that specialists use their expertise to game the system at the expense of the majority. This is exacerbated because the most important commercial and legal events are big and therefore, for most people, infrequent, and because specialists form effective guilds to exclude nonspecialists (for example, in academia). Specialists conduct these transactions for a living, and because of Pareto-principle-type effects, 20% of the causes (read "merchants") are responsible for 80% of the effects.

Consequently in these transactions, the most important that we conduct in our lives, there is usually a large asymmetry in experience, skill, and confidence. If you're buying a house or condo, you could quite literally be hundreds of times less experienced than the real estate agent (or infinitely, if it's your first time); if you're buying from a developer, the seller is also in a better position. Same going to the dentist; same testifying in court about something; same taking your car to a new shop for a big repair. I've never bought a transmission before, I don't know what I'm doing, and I just want to get out of there with as little damage to my bank account as possible. Because the transactions are infrequent, the chance that I will have repeated encounters with these merchants is low, so from a game theory standpoint they have no reason not to cheat me.

Perhaps the worst such example is legislation. Few citizens of a democracy have the time, expertise, or inclination to read laws. Worse, today, no one even makes the effort. The model that modern democracy is based on is Athens, where you could show up to give your two cents, and law and knowledge weren't so specialized as to require two-thousand-page amendments to prior multi-thousand-page laws. That is no longer the world we live in. An oral culture was good enough for the Iroquois constitution but not for more complex Athens. Were oral cultures limited in specialization and growth because there is some fundamental upper limit to the social complexity an oral culture can sustain? In the same vein, is mere static writing no longer enough for twenty-first-century democracies like the U.S.?

There is also the problem of separation of gamers, and gamers' interests, from constituents. A large democracy invariably creates a class of legislators and lobbyists (the gamers) who write the laws. The larger the democracy, the smaller the body of decision-makers relative to the population, and the more the Pareto principle will magnify inefficiencies, bad decisions, and holes that in an Athens- or Andorra-sized democracy could have been identified and rectified more quickly. The concern is that the legislative class's interests diverge so much from the people they are supposed to represent that the system is no longer representative. (Added later: one possible solution from Conor Friedersdorf here.)

The term "broken" is thrown about in political discourse. If size (even if not complexity) differs between democracies, it may be interesting to compare the functioning of the legislature of a small democracy (Athens or Andorra) to that of a large one (the U.S. or Japan), though again what would be interesting would be to compare the functioning of legislatures, although one problem is I don't think we know how to devise a metric unit of measure for legislature function.

The accumulation of these impenetrable strata of laws that in theory bind us is a problem, because the current process produces a corpus that is necessarily rife with inconsistencies (especially regarding the use of precedents in the judicial branch). Therefore, a worthy goal is a "legislative programming language": new laws would have to compile successfully on top of the current constitution and set of laws in order to be adopted. It's not compiling? Too bad - root through preceding laws and find the inconsistency; fix it, or fix your new law. We would also no longer have to worry that modern Supreme Court jurisprudence amounts to legal sophistry - contortions harmonizing what's written in the Constitution with what most of us agree is a reasonable decision, centuries removed from the ideals of the men who first put pen to paper.

But what we're concerned with is not so much having a neat constitution with its i's dotted and t's crossed, but rather one that works predictably. That is the real reason it's so unnerving that legislators do not and in reality cannot read the bills they pass. Robin Hanson has suggested that we hold policy-makers accountable for their policies - that is, have every new law or policy contain its own success criteria, and then follow it up (which he terms futarchy). The penalty for lawmakers - and ultimately, even voters - who are wide of the mark would be decreasing influence (i.e., at 18 we all start out getting 100 votes, but if you vote for a bunch of referenda that don't do what they said they would do, in the next election you only have 50). In essence, it's a prediction market, where the bet is made with future influence - and status, if you're a politician. Hanson describes futarchy as distinct from democracy (in my opinion to be provocative) but I think it's entirely compatible with democracy. Most modern democracies have adjusted the franchise repeatedly in the past to improve themselves. Is there a reason we should stop now?
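
As a toy illustration of the vote-weight idea (the decay rule and numbers here are my own invention, not Hanson's actual proposal):

```python
def next_vote_weight(current_weight, outcomes):
    """Toy vote-decay rule. outcomes holds one boolean per referendum the
    voter backed: True if the law met its own stated success criteria.
    A perfect record keeps your weight; a completely failed slate halves
    it, matching the 100-votes-to-50 example above."""
    if not outcomes:
        return current_weight  # no track record, no adjustment
    hit_rate = sum(outcomes) / len(outcomes)
    return max(1, round(current_weight * (0.5 + 0.5 * hit_rate)))

print(next_vote_weight(100, [True] * 4))   # 100 - every backed law delivered
print(next_vote_weight(100, [False] * 4))  # 50 - none did; influence halved
```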

Sunday, October 11, 2009

World Values Survey Results

Have you seen the results of the World Values Survey?


The existence of this data makes it easier to determine the correlation of values, and the institutions dependent on them, with development and (most importantly) happiness. It's a bit of a coordination game, that is, a chicken-and-egg question: how exactly would a person in rural Zimbabwe go about obtaining self-expressive secular/rationalist values?

I do find it curious that in measures of quality of life, transparency, etc., the ranking organizations tend to be in Scandinavia, and the countries that typically do the best are in (drum roll) Scandinavia. Have you also noticed that in intra-US QOL measures, the Upper Midwest seems to do amazingly well? As I recall, the survey organization is in Madison. This is of course possible, and you could say that the better-educated, more democratic, and more concerned with human welfare a country is, the more likely it is to have such organizations. Assuming that self-expression and rationalism are the good ends of the two dimensions on the values survey, Northwest Europe wins out again - although the executive board of the WVS has members from the US, Germany, Turkey, Spain, and elsewhere.

Added later: I was looking at this chart again and specifically looking at the ex-communist boundary. That's another way of asking how much of an impact a communist government might have on a culture. I'm typically a hard sell when it comes to arguments that a government can change the underlying culture (or even work well with it - witness democracy in Iraq). It's hard to tell, because the ex-communist countries are geographically adjacent, although the fancy footwork boundary around China, Korea and Taiwan is interesting.

Friday, October 2, 2009

Critical Thinking and Rejection of Received Wisdom in Greek Myth

Perseus knows from the conventional wisdom that fighting Medusa is a fool's errand, that to look in her eyes is death; yet he can't help himself, and in spite of common sense, he sets off to kill her. Too clever for his own good, he takes a polished shield that shows her reflection. Not only does he succeed in his mad quest as a result of this ingenuity, afterward he runs around using her head as an ugly-laser.

King Leonidas of Sparta goes to the Oracle to ask what he should do. The Oracle tells him that either Sparta will fall, or he will die. Disgusted that a Greek oracle would suggest that fellow Greeks lie down in front of Xerxes, Leonidas leads the Spartans at Thermopylae. Whatever part of that story is real, we know that a) Leonidas really did lead the Spartans against the Persians at Thermopylae, and b) the story as it's been handed down contains Leonidas's rejection of the Oracle. Because of this, Western civilization exists as we now know it.

Odysseus was told that no man could resist the call of the Sirens. Like a smartass, he tells his men to plug their ears, and to bind him firmly to the mast of the ship so he can hear the Sirens without jumping overboard. He hears them, and once the ship has passed out of earshot he is none the worse for wear, though through his innovation he has experienced something supposedly fatal to mortals and defied the natural order.


Why Set Up Conventional Wisdom Only to Reject It?

One of the functions mythology accomplishes, in its more coherent moments, is to transmit mores. Most of the time this is clear. Listen to Jehovah; if, despite direct commands from the Almighty, you give in to temptation like Lot's wife did, you will be smitten. Clear enough - which is why some of the Greek myths start to seem very strange indeed. Why would Greek myths so often set up a value or more, and then show how moral or cosmogonic authority can be dismissed with critical thought and persistence?

Imagine that, like Perseus, Lot were celebrated for overturning the established order, and doing an end-run around God by giving his wife a mirror with which to look back at burning Sodom. The Old Testament would have a different flavor.


Is This Aspect of Greek Myths Unique Among Ancient Cultures?

Is it possible that the Greeks were unique in incorporating directives to critical thought in their myths so early? To do this concretely, you would be forced to invoke some established wisdom, and then show how a hero could succeed by applying a new solution to an old problem. I have often wondered whether the impact of Athens on the modern world was just a coincidence of history - that they happened to be fetishized by the later Roman Empire, which spread their work around Europe and the Middle East, and it could just as easily have been the Lydians or the Cyreneans. (I've had the same questions about whether grapes are really any more characterful as the basis of a fermented beverage than, say, apples.) If the Greeks were unique in so early celebrating as virtues critical thought and the rejection of static groupthink, this constitutes one argument for the uniqueness of the classical period and its contribution to world history.


Could Greek Myths Be a Palimpsest of Bronze Age and Classical Mores?

Another and not necessarily mutually exclusive explanation is that what we're seeing are bronze age myths with the standard structure of human fables, plus a layer added later, during the classical period.

In this view, maybe in the original "pre-rationalist" Mycenaean version, Perseus kills Medusa by luck or resists her power because he's Zeus's son, and the mirror trick was added later. Jason and the Argonauts encounter the Sirens. As royalty (authority), Jason is able to outplay the Sirens, and the sailor who jumps off his ship is saved not by his own cleverness, but by a god. For this reason it's worth investigating whether the passage of Odysseus past the Sirens was a late addition to the Odyssey.

It should be stressed that this kind of shift in values over time is often investigated in epic works that seem to have accreted narrative from temporally-separated contributors, Beowulf being one example. Another is the Mayan Popol Vuh, which features the Twins Hunahpu and Xbalanque defeating a whole pantheon of gods. Westerners don't typically draw a distinction between classical and post-classical Mayan periods, but the Popol Vuh is a K'iche' story from a post-classical culture in the Western highlands, separated by several centuries and more than a hundred miles from the much more famous earlier religious centers in the Peten lowlands. The interpretation is that the K'iche' were mocking the religion of their predecessors by incorporating its figures into their own myths and denigrating them.

It's worth noting that Beowulf and the Twins of the previous examples are "normal" exemplars of virtue who, like the heroes of the vast majority of human mythology, succeed by dint of their adherence to concrete ideals and their position within an authoritarian hierarchy (royal or divine heredity, or a mandate from a god or king). They aren't smartasses like Perseus and Odysseus.

Economic Rationality Index

It's been fashionable to point out that Homo economicus is not a fully rationally self-interested animal. Kahneman and Tversky were instrumental in waking us up to the reality that the human brain is not a universal well-rounded problem-solving machine - and of course, as a product of the accumulated exaptations and legacy systems and local optima and ad hoc functionalities of evolution, why would we expect it to be? Rather than serving as cause for hand-wringing, appreciating this fact lets us either do something to address and correct it, or at least to call it out and create hacks and workarounds. To me this is the promise of the Late Enlightenment.

I recently moved from the Bay Area to San Diego. Looking at gasbuddy.com, I noticed that the spread on gas prices in San Diego seemed greater. My experience of actually price-comparing gas stations has borne this out. There is greater variation over smaller areas in San Diego than there is in San Francisco. Sometimes there are ten cent differences at service stations across the street from each other. What's more amazing is that there are quite often cars filling up at the more expensive one. I've been tempted to walk up and ask people why. Clearly, this bothers me.

This suggests a way to compare the degree of rational self-interest between two geographic areas: take the posted price for 87 octane at all the gas stations within two predetermined square miles. Find the standard deviation for each square mile. Take repeated readings over some period of time and average them, if you're worried about changes in supply cost rippling through the market and driving up the SD out of instability rather than irrationality (a non-100%-efficient market is not the same thing as irrationality). Then look to see if the localities are consistently different. If there are consistent differences, next step: does it correlate with certain chains? Or with demographics of the area (education, income, income distribution)? Or some cultural intangible (i.e. San Diego is too relaxed for its own good)?

The square mile should be controlled so that there aren't geographic barriers (busy highways or streets, water, etc.) that would actually make consistent price differences more rational. The differences I've seen are ones between gas stations across the street or at opposite ends of the block from each other. Since the index would consist purely of posted prices as reported online, rather than sales, it assumes that people are actually buying gas at the more expensive stations. But unless there's some bizarre detail at work here, this is a fair assumption - service stations aren't going to set a price at which nobody buys gas. (The bizarre detail could be something like: the local area is dominated by a single chain that profits from products other than gasoline, so its prices are less sensitive to actual consumer behavior; or a frequent-flyer-style gas club not available in both cities. This is why I should actually ask people what they're thinking when they fill up at the more expensive station, because maybe they are actually thinking something.)
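
For concreteness, a sketch of the index computation, with made-up station prices standing in for the scraped readings:

```python
from statistics import mean, stdev

# Hypothetical repeated readings: (city, square mile) -> one list of posted
# 87-octane prices per day for the stations inside that square mile.
readings = {
    ("San Diego", "A1"):     [[3.05, 3.15, 2.99], [3.09, 3.19, 3.01]],
    ("San Francisco", "B4"): [[3.21, 3.23, 3.20], [3.25, 3.26, 3.24]],
}

def rationality_index(daily_prices):
    """Average the per-day standard deviations, so a supply shock rippling
    through the market on one day doesn't masquerade as irrationality."""
    return mean(stdev(day) for day in daily_prices if len(day) >= 2)

for (city, cell), days in readings.items():
    print(f"{city} {cell}: average price SD = ${rationality_index(days):.3f}")
```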

In fairness, I'm very sad to have left San Francisco, and very eager to jump on anything that paints the Bay Area or its residents as better than San Diego. But what I've observed anecdotally suggests strongly that San Diegans just can't be bothered to drive an extra two minutes to save three dollars. That is to say, it seems that San Diegans are truly less rationally self-interested than San Franciscans - and the index provides a way for me to make sure that isn't just my own confirmation bias operating against poor San Diego.

The next step is to actually do the calculation, which will take a while. If you run across this post and know of a similar index that's already established, please comment.

Sunday, September 20, 2009

Sports, Conquistadors and Status

I'm posting a comment I left at Liberal Order. The blogger shows some amazing bankruptcy statistics for retired NFL players (78%!) and then estimates the expected value of a career in baseball, from the perspective of a high school athlete - $86.

So why do people keep trying to get into sports? I think it's two things. First, it's partly the conquistador effect. An anomalous number of successful (read: more ruthless than average) conquistadors came from the Extremadura region of western Spain, Pizarro and Cortez among them. Extremadura is still poorer than the rest of the country, though beautiful in the spring. Going to the New World was a ticket out. Same for post-coal-and-steel Western Pennsylvania ("the cradle of quarterbacks") and the continuing disproportionate contribution of athletes to professional sports from inner cities and poor rural areas in general.

The second reason people want an $86 career is the Robert Frank reason: status. Adulation by people back home, by kids, and (most importantly) by the ladies has a concrete value to most people. Gentlemen: as a single man, would you consider switching to a job with a lower salary if it meant more access to females?

Friday, September 4, 2009

How to Test Theories of Average Attractiveness

Tyler Cowen at Marginal Revolution wrote a post on theories of attractive women that engendered surprisingly few spikes in blood pressure. My comment on the post is reproduced below.

I see what Tyler is getting at - crudely put, a level playing field between otherwise isolated demographics will encourage an efficient exchange of goods and services, which means that beautiful women of all social backgrounds will congregate more efficiently where rich and poor meet.

Now, assuming that beauty is at least partly a function of genetics, this still has implications for the attractiveness of specific ethnicities. (Yes, I realize this is a slippery and subjective slope so I will avoid concrete examples.) If you're a woman from a culture that has been urbanized for longer, chances are better that these disparate demographics have been able to arise and then mix in the "marketplaces" that Tyler is describing. Consequently, women from long-urbanized gene pools will have had more time, pressure and opportunity to produce more beautiful combinations.

Now think of the opposite case: if you look like Angelina Jolie but you happen to have been born in a hunter-gatherer village of 100, then sure, your one-in-a-million looks will get you the "best" guy (best hunter, OR chief's son, OR best-looking) out of that hundred. But the chance that you'll land the best on your continent (the male one-in-a-million, i.e. Brad Pitt) is low. So among hunter-gatherers (or more recently urbanized people) you would not expect the same selection for beauty as in a huge urban population center with admixture.

Note that this assumes a model of women valued by physical appearance and men valued by social status/wealth/power, a situation which the more enlightened among us will find distasteful but also recognize as having obtained for the vast majority of history (and probably all of prehistory). Consequently the same theory could also be used to argue for males having more of whatever it is that equates with wealth and power, if a) there is non-hereditary mobility within the urbanized society (i.e. the high priest's son can lose his spot if someone memorizes more potions than him), b) the selected-for traits are constant over time, and c) those characteristics are genetic. Those are thinner assumptions (especially the first two), but maybe this explains why attractiveness in males and females doesn't seem to go hand in hand.

To test this, we could start naming cultures that we would expect to have more attractive women, but the only way to test this idea is against some universal index of attractiveness (i.e. women from group X on average are more attractive to Zulu, Inuit, and Clevelanders). We do not have such data, although it wouldn't be that hard to obtain. Consequently, at this point we're just arguing based on personal biases and current taste.

Like Tyler, I too spend a lot of time thinking about location theory.

Monday, August 3, 2009

The Bad Stripe

I keep coming back to this region of the country: a strip running from West Virginia down the back (west side) of the Appalachians, through Kentucky and Tennessee, on through Arkansas to Oklahoma. I'm going to call it the Bad Stripe. Why? It's cloudier there, and people are less happy; and it's about the only part of the country that voted Republican at higher rates in the 2008 race than in 2004. This is not an endorsement of one party or indictment of the other; but I would say, and I think self-described cultural conservatives would agree with me, that cultural conservatives are likely to be "angrier" than non-cultural-conservatives. Therefore, people are probably angrier inside the more-Republican-in-2008 stripe than outside it.

The overlap is less exact here, but again the same Bad Stripe jumps out when looking at life expectancy. At first I thought that this map was just showing the Black Belt, but this graph is white male life expectancy:

It appears to be partly poverty: a per-capita-income map shows patchy low-income areas throughout that stripe.

(from HowStuffWorks)

But still: a) there are other poor parts of the country that are happier (southwest Texas) and b) many rural poor Western areas are both happier and healthier. The only thing that comes to mind is (say it with me) diet and exercise, the latter of which is much easier with the sun shining.

Wednesday, July 29, 2009

Sunlight and Happiness

Take an African animal and put it in Norway, or Seattle. Do you think it would be happy in that climate? But that's us. Growing up in Pennsylvania, I was familiar with the long summer days and long winter nights of temperate regions - but the week between Christmas and New Year, standing in Regent's Park in London during my first visit to the U.K., looking at the tired glow of the sun through the clouds barely shoulder-high on the horizon at 11 in the morning - I started to realize why humans are so eager to reduce their latitudes.

In the U.S., much of which is at sunnier latitudes, estimates of sufferers of clinical seasonal affective disorder or just winter blues run 10-15%. There are also claims that alcoholism correlates positively with latitude.

So I was not surprised when I looked at a sunshine map of the U.S., and saw a stripe of cloudiness along the Appalachians that matched (somewhat) with a portion of the unhappy stripe in a recent happiness survey:

The unhappy stretch extends from the west side of the Appalachians in Tennessee and Kentucky all the way through Arkansas and Oklahoma, though those last two aren't as gloomy. I often point out to whiners in the Pacific Northwest that they aren't any cloudier than the poor folks in West Virginia and central/western PA.

Oddly enough, I noticed earlier that the unhappy stripe correlated even better with voting Republican for President in the 2008 U.S. elections.

Tuesday, July 28, 2009

Institutions and Values Matter

This evening I was playing around with cost of living, quality of life, latitude and average temperature numbers. Basically I was investigating the idea that countries at very high or low latitudes have a worse quality-per-cost ratio than those in temperate regions; that is to say, sure, maybe Norway has a higher quality of life, but it costs a lot more to live there than in Spain, and do you get that much more? Norway's marginal environment forces a lot of infrastructure investment that Spain doesn't have to worry about quite as much.

I found only extremely weak connections, so I won't bother posting the data and graphs. But this isn't the first time I've looked for such connections. In fact whenever I've looked for connections between some economic or social indicator on one hand, and a non-human aspect of a country's real estate on the other (latitude, resources, climate) I either find no relationship, or a relationship that cannot be separated from the confounding facts of history, like the inheritance of certain values and institutions, particularly from Enlightenment Europe as it was colonizing the West. In fact, the first question on examining such relationships is whether we should try to separate the trend from history.

More and more I find myself siding with the development experts who state that it's the institutions of a country that matter more than anything else - not its resources or climate, temporarily wealthy petro-theocracies notwithstanding. One assumes that the values of the people have to support such institutions. As the U.S. is learning, you can't just drop a democracy onto a culture that has no history of open discourse and personal responsibility, even before the Hussein regime, and in fact I would argue that Indo-European cultures in general have at their root a value of parliamentary decision-making and openness that is rare elsewhere; why would the world's oldest parliament, the Althing, have appeared in Norse Iceland in the guts of the brutal Middle Ages?

Another way of emphasizing the importance of institutions, and underlying values, is in traditional economics terms: it's the labor and not the land that makes life better and generates wealth. This shouldn't be surprising; it tracks the development of technology, which continually increases the potential productivity of human beings and their power to shape their environment. The last school of economics that discounted the role of labor entirely was the physiocrats in the eighteenth century, and no one has been able to make that mistake since. This can also explain the (near) disappearance of slavery, from a purely cynical economic standpoint. Three thousand years ago, there wasn't a whole lot more you could contribute as a scribe or farmer than you could as a slave. By the nineteenth century, the institution had shrunk enough in importance that moral concerns could override whatever clout the related industries retained, first in England, and later in the U.S. In 2009, from an economic standpoint, the idea of forcibly feeding and housing a person so they can pick plants instead of voluntarily building a better microchip seems patently absurd. In 2109 it will seem more so.

The question remains then of how to measure the goodness of institutions without the tautology of just saying that whatever raises per capita income and gross national happiness must be good. Measuring values would be trickier still. Encouraging values that support good institutions and therefore the elimination of misery is the most important and difficult question of all.

Friday, July 24, 2009

Proposal: Adopt a Universal Phonetic Alphabet Based on Roman Characters

See my original post at halfbakery.com, which includes follow-up comments.

The benefits of literate people around the world being able to communicate, regardless of spoken language, are obvious. When building a writing system, there are two possible approaches.

1) Use symbols based on meaning. In such systems there are necessarily a lot of symbols - in the thousands. (Chinese uses this strategy.)

2) Use phonetic values (an alphabet or syllabary). English uses this system, as do many languages not written in the Roman alphabet; the number of symbols is often substantially fewer than 100.

I propose an alphabet, rather than a system of ideograms, and specifically a phonetic version of the Roman alphabet, because

a) well over half of humans live in countries where the Roman alphabet has official or co-official status (3.8 billion)

b) alphabets are easier to learn (if you are a first-language Chinese-speaker and need 3,000 characters to read a newspaper, how hard can it be to learn 30 more?)

The main benefit will be facilitation of second-language learning, rather than universal communication (any monographic-alphabet user learning Japanese or Chinese can attest to this). Because different languages of course have different sounds, an expanded Roman alphabet could be used (mimicking the International Phonetic Alphabet?). That would also be more fair, since everyone would have to learn a somewhat new system, even people who already read Roman characters.

Someone posted a proposal here that all languages be written in ideograms. Beyond the difficulty of teaching ideograms to alphabet-readers, it's very difficult to adopt these symbols between languages, given differences in word order and grammar. The best-known example, Japanese, uses a kludgey system of Chinese characters with home-grown syllabary characters scotch-taping them together within Japanese grammar; and the ideograms drift from their original meanings, defeating the purpose of adopting such a difficult system anyway.

The problem of implementation is first and foremost a political one: convincing the Chinese and Arabic-speaking governments to educate their citizenry to be at least bi-scriptural. But Turkey has done it, and without that change it's doubtful there would even be an argument today over whether Turkey could join the EU. [Added later: it turns out that there were serious proposals in Meiji Japan to Romanize Japanese; they would have beaten Ataturk to the punch. There are actually books from this era written in romaji Japanese. A study of why it didn't catch on would be informative to this proposal.]

A Proposal: Compile Constitutions in Programming Language So They're Consistent

This was originally posted at halfbakery.com, where you can see the follow-up comments.

Most democracies have largely secular, rational post-Enlightenment systems of government whose power to continue operating flows neither from gun barrels nor from arguments from authority. However, because of the advance of technology, the laws these governments pass (and the way they can operate) will continue running into situations that their founders couldn't possibly have anticipated.

Currently many of these problems are solved by court rulings, which establish precedents. These precedents accumulate until there are layers upon layers, not all of them consistent with each other. Laws passed by legislative bodies can also take the form of an inconsistent patchwork that fails to take into account what went before.

By writing a constitution in a logical programming language that generates new laws and automatically checks for internal conflicts, these inefficiencies and inconsistencies can be avoided. Governments would become much truer to the ideal of being made of laws, and not of men.
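
As a toy illustration of the "compile" step - nothing like a real legal formalism, and every rule below is invented - laws can be encoded as propositional implications, and a new law rejected if the corpus can then derive both a proposition and its negation:

```python
# Each law is a list of implications (premise_set, conclusion); "~" negates.
constitution = [
    (frozenset(), "free_speech"),                      # axiom
    (frozenset({"free_speech"}), "~prior_restraint"),  # consequence
]

def derive(rules):
    """Forward-chain to a fixed point; return every derivable proposition."""
    facts = set()
    changed = True
    while changed:
        changed = False
        for premise, conclusion in rules:
            if premise <= facts and conclusion not in facts:
                facts.add(conclusion)
                changed = True
    return facts

def compiles(corpus, new_law):
    """The new law 'compiles' only if no P is derivable alongside ~P."""
    facts = derive(corpus + new_law)
    return not any("~" + p in facts for p in facts if not p.startswith("~"))

censorship_act = [(frozenset(), "prior_restraint")]
print(compiles(constitution, censorship_act))  # False - fix the conflict first
```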

Saturday, July 18, 2009

Inflection Points in History: 1965 and 1990

It's tempting to try to find a point in time when an old zeitgeist fled and a new one took over. Anyone who does this in print must recognize that they're generalizing. After all, even in periods of real tumult, the zeitgeist is really just a constellation of attention-grabbing characteristics that mostly move independently of each other. Art history example: the Renaissance is widely considered to have become the Mannerist period by the time Michelangelo began work on the Last Judgment in 1537. But can we find a transitional work, point at emerging themes, and say: here, this is the inflection point? Doing this with culture is not quite as easy as in biology, where there must be a clear linear descent.

Defining an age and trying to find its joints is necessarily a sloppy business, but this doesn't stop us. John McWhorter is a linguist, formerly of UC Berkeley and now with the Manhattan Institute, who wrote Doing Our Own Thing: The Degradation of Language and Music and Why We Should, Like, Care. McWhorter is a lover of the English language and the pages of this book mostly take the form of an elegy for a formal style of rhetoric (or really, the existence of rhetoric as such) that has passed into history in the United States, evidenced by the lesser demands placed on modern music and public speech. Mass media provide sensible landmarks of public taste for these kinds of discussions because they're a shared experience. McWhorter expounds on why the shift might have occurred and repeatedly comes back to 1965 as the inflection point, going so far as to find a "transition species", a so-called cultural archaeopteryx, in a live performance by Sammy Davis Jr. that had one foot in the old, formal style and one in the new, structureless, self-indulgent informality. McWhorter argues that a host of values and attitudes shifted along with this sharply punctuated 1965 transition.

Thoughtful people interested in the cultural changes of their country (and where they fit into it) can't help but find these speculations engaging. Probably the most famous treatment of the shifting of attitudes is Strauss and Howe's Generations. They attempt to explain history in cyclic terms with 4 recurring generational types, each determined by the nurturing patterns of the previous generation.

Without addressing Strauss and Howe's generational types, I've often speculated that a more recent and perhaps less profound cultural transition took place around my own coming of age, and there's a link to McWhorter's 1965. The 1980s in the U.S. - when I passed from kindergarten to tenth grade - in retrospect seem an oddly conservative island, a repeat of the 50s sandwiched between the era of disco, drugs and Vietnam on one side and grunge and the early internet on the other. Why? The kids of the post-1965, post-formal generation weren't yet out on their own, independently interacting with the world and spreading those post-formal values. If you got married in 1963 and had your first kid in 1965, she would have started college in 1983, carrying forward her parents' pre-1965-transition values. On the other hand, if you met at Woodstock and had your first kid two years later, she would start college in 1989.

What Happened in 1990?

My inspiration to collapse my thoughts into this blog post was a post on Andrew Sullivan's blog showing a sharp positive change in public perception of gay people in 1990. On this specific topic, try watching a few "socially conscious" 1980s movies that wear their values on their sleeve; they're recent enough that you expect their values to be the same as yours, but they're not. (The same argument explains why I am annoyed by the characters' values as they relate to gender roles in Bronte and Austen novels in a way that I am not by, say, Chaucer.) But the pro-gay attitude shift is just a canary in the coal mine. In the early 90s, suddenly kids were growing their hair long en masse again and wearing lots of black and talking about conformity, there was loud angry music and grunge everywhere, and pot use skyrocketed. Coincidence? Or the coordinated coming of age of the first post-formal generation's kids?

There's no one archaeopteryx for the 1990 shift, which in any event wasn't quite as dramatic as 1965, but here are a few: 1989 had Batman, which celebrated "dark" (new to mainstream American film audiences); 1990's Dances with Wolves had the first naturalistic and positive treatment of Native Americans (imagine that in 1985!) and in 1991, Terminator 2 shows us the badly-behaved punk kid (the young John Connor) whose criminal sensibilities end up serving him well. Imagine a pubescent criminal as protagonist and hero on the mainstream big screen even in 1988. Musically, in rock, late 80s thrash (underground, no airtime and little MTV exposure) gave way fully to grunge by 1992 (on MTV you couldn't get away from it).

Is It a 25-Year Cycle?

If the pattern is real, then we're due in 2015 for another shift. But I have my doubts that it will remain cohesive. The use of mass media as milestones is becoming potentially problematic, because the way we consume media (and create it) has changed so much. On the other hand, it's the spread of values that shapes these shifts, and thanks to technology that process has never moved so quickly. Unfortunately I can't make a prediction because I don't have a sense of which values will carry over from the early 90s and dominate the new zeitgeist, just as it would have been difficult in 1989 to make the same kind of call. Check back in 2017; by then such a shift should be obvious.

[Added later: Razib Khan separately notes an inflection point also in 1990 for another sexual more, black-white dating.]

High Growth Rates: Nature Abhors a Vacuum

When confronted with China's recent brilliant growth rates, a cynic might say China had an unfair advantage: it had room to grow. That is, it's easy to grow your GDP at 6.46% annually since 1980 if you start out with a per capita income of US$305.46. Labor is cheap, you have no legacy infrastructure to deal with, and your exports are extremely competitive. Of course, this glosses over the fact that there are lots of countries with low PCI, and not all of them grow at such robust rates - but let's come back to that. A European once challenged me with the claim that the U.S. grew slightly faster than Europe not because of any decision we've made to embrace free markets, but because of our good land and wide open spaces, which are cheaper to develop. Because of population density rather than PCI, we have room to grow.

This interested me, so I pulled together some IMF and UN figures for 179 countries and territories; most growth rates are annualized since 1980. First let's look at population density's relationship to income growth, if any (source for population and area data here and here resp.) For viewability purposes, the scatter plot below excludes the 11 most densely populated countries/territories (Singapore, Hong Kong, Malta, Bangladesh, Bahrain, Maldives, Taiwan, Mauritius, Barbados, Korea, and Lebanon, all > 400 people/km^2).

The red circle contains 18 countries, all of which have had at least 10% annual growth since 1992: Armenia, Kazakhstan, Estonia, Latvia, Lithuania, Turkmenistan, Bosnia and Herzegovina, Equatorial Guinea, Russia, Azerbaijan, Belarus, Tajikistan, Cambodia, Ukraine, Slovakia, Moldova, Czech Republic, and Croatia (Bosnia-Herzegovina data since 1994, Equatorial Guinea data since 1980, Cambodia data since 1986). There's a trend there: of these 18 countries, fully 17 transitioned from closed communist economies in the last decade; 16 of 18 were Soviet satellites. The trend on display is the effect of markets, not low population density. Not the effect I was looking to call out, but interesting that it's so apparent here.

It's worth pointing out that, for the 12 countries that grew at greater than 15%, the average density was 45/km^2; for the rest that grew at less than 15%, the average population density was 218/km^2. The same statistics using 10% as the break point are 62/km^2 for 18 countries >10% growth and 222 for the rest. Breaking the other way, the 58 countries with greater than 120 people/km^2 density grew at 4.15%; the rest that have less than 120 people/km^2 grew at 4.34%. There does seem to be some effect. (These figures include the 11 densest countries taken out of the scatter plot.)
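
These break-point comparisons are simple to reproduce; a sketch, assuming a hypothetical countries.csv with columns country, density (people per km^2), and growth (annualized percent):

```python
import csv

def split_means(rows, split_on, report, cutoff):
    """Mean of one variable on each side of a break point in the other."""
    above = [r[report] for r in rows if r[split_on] > cutoff]
    below = [r[report] for r in rows if r[split_on] <= cutoff]
    return sum(above) / len(above), sum(below) / len(below)

with open("countries.csv") as f:  # hypothetical: country, density, growth
    rows = [{"density": float(r["density"]), "growth": float(r["growth"])}
            for r in csv.DictReader(f)]

hi, lo = split_means(rows, "growth", "density", 15.0)
print(f"mean density, >15% growers: {hi:.0f}/km^2; the rest: {lo:.0f}/km^2")
hi, lo = split_means(rows, "density", "growth", 120.0)
print(f"mean growth, >120/km^2: {hi:.2f}%; <=120/km^2: {lo:.2f}%")
```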

The picture for growth rate and PCI could fool you into thinking it's some sort of normal distribution, but it's not. PCI is taken from 1999 for all countries because it was the first year that IMF had data for all 179 countries.

Interestingly, the vast majority of high-PCI countries have a middle-road growth rate of around 5%. Low PCI countries are more widely distributed. The 18 countries with growth above 10% have average PCI of US$1,836.08; those with growth less than 10%, US$6,505.82. Then again, the 32 countries with negative growth rates clocked in at average PCI US$1,703.99, vs. positive growers at US$6,979.31. Breaking the other way, those with 1999 PCI below US$5,000 had average growth of 4.17% (131 countries), while those above US$5,000 PCI had average growth of 4.55%. Of course, again the low PCI, high growth countries were all ex-communist but one. Who are the low PCI low-growth countries (i.e. < $5,000 PCI and negative growth)? Georgia, Congo-Zaire, Ghana, Mongolia, Yemen, Niger, Sierra Leone, Burundi, Eritrea, Madagascar, Papua New Guinea, Nigeria, Ivory Coast, Gambia, Solomon Islands, Namibia, Zambia, Myanmar, Malawi, Paraguay, Syria, Togo, Uganda, Ethiopia, Suriname, Guyana, Rwanda, and Guinea. This is a grab-bag, but many of the countries were victims of civil wars (10 of 28) and a few resource-cursed ones.

There is a weak inverse correlation between growth and both population density and per capita income, although it is swamped by the signal from the post-communist states. The lesson here? Those states were left with strong institutions, which visibly benefit them (particularly in the case of the growth-PCI comparison). So perhaps it's true: China's low initial PCI and its strong institutions, as well as the U.S.'s open spaces, are both advantages to growth.

Friday, July 17, 2009

Mexico's Resource Curse: The United States

"¡Pobre Mexico! ¡Tan lejos de Dios y tan cerca de los Estados Unidos!"

– Porfirio Diaz


To a developing country, a long border with a wealthy, industrialized neighbor might seem like a blessing. But we have at least one pairing where this is anything but obvious: Mexico and the United States.

There are other cases where a national asset that seems on its face to be a big advantage turns out to be anything but; the famous example is the resource curse. You would think that a developing country fortunate enough to be sitting on mineral wealth would be able to use that wealth to its advantage – especially if it's oil. Nigeria is the textbook case.

These countries experience a vicious cycle of incredibly corrupt juntas uninterested in developing other industries or indeed doing anything except pocketing the proceeds from mineral extraction being conducted by foreign companies. In these institutionless post-colonial kleptocracies, the only options for the ambitious are to get out, or try to get in on the next coup. It's hard to believe that these governments could keep the figurative lights on for even one minute after the oil and diamonds stopped coming out of the ground. These are failed states with an allowance. They're Somalias waiting to happen.

What started me thinking about this was the recent arguments I've seen from several quarters that Mexico is increasingly showing alarming characteristics of a failed state. Combine this speculation with the interesting observation that the U.S.-Mexican border is easily the longest in the world between a developing and an industrialized country, and you may begin to wonder if there's a connection. If Mexico had started off with a decent-sized middle class and relatively transparent institutions, it might have joined in the ongoing growth that the Anglophone parts of the continent have enjoyed since the industrial revolution.

Among the generally agreed-upon characteristics of failed states are these: economic decline; loss of control of and security in their territory; deterioration of basic services; inability to make and execute decisions, in both the domestic and international arenas. I don't think Mexico is there yet. It has elections, the lights shine and the toilets flush, and it's a productive member of the international community. It's even tied for 72nd out of 180 in Transparency International's 2008 index - not great, but pretty good for a supposedly failing state. Still, the increasingly brazen, coordinated paramilitary attacks on the police are a bad sign. Mexico is, in fact, losing control of and security in large stretches of its territory. Where? The states closest to the U.S. To whom? Drug cartels. Coincidence?

The truth is that Mexico is resource-cursed, and the resource is drug-consuming Americans. To be more accurate, the resource is drugs and the market is the U.S., but the situation is in some ways worse than in Africa's resource-cursed states. Imagine that Nigeria had a long land border with the EU. Now imagine that oil is outlawed as a result of global-warming legislation. The oil would still flow north - but it would become contraband, and the trade would be an entirely criminal activity. Because the drug trade is internationally prohibited, the Mexican government (unlike Nigeria's with oil) can't openly profit from it - in Nigeria, even the graft and kickbacks are parasitized from an activity that is at least legal in and of itself. So the business becomes the domain of paramilitary drug cartels and the corrupt officials who allow them to flourish. It's worth pointing out that, although it doesn't border the United States, Colombia is the other perilously-close-to-failing state in Latin America, though it has improved in recent years. Still, it has had large tracts of its territory outside its control for years on end - and those tracts were controlled by organizations in the same trade as the paramilitary groups in Mexico, serving the same end market.

A reasonable objection is that there are income disparities across other borders in the world; surely Mexico and the U.S. aren't the only odd couple, yet paramilitary drug groups aren't forming elsewhere. I suspected there were reasons this didn't happen elsewhere, so I compiled a list of 279 two-country shared land borders and ordered it by absolute nominal per capita income difference. Out of 279, here are the top ten:


Rank  Country 1     Country 2      PCI 1   PCI 2    PCI Diff
   1  Austria       Liechtenstein  50,098  145,734    95,636
   2  Norway        Russia         95,062   11,807    83,255
   3  Switzerland   Liechtenstein  67,385  145,734    78,349
   4  Saudi Arabia  Qatar          19,345   93,204    73,859
   5  Finland       Norway         51,989   95,062    43,073
   6  Iraq          Kuwait          2,989   45,920    42,931
   7  Norway        Sweden         95,062   52,790    42,272
   8  Finland       Russia         51,989   11,807    40,182
   9  USA           Mexico         46,859   10,235    36,624
  10  UAE           Oman           54,607   18,988    35,619

Source: IMF World Economic Outlook Database 2009, except Liechtenstein from the CIA World Factbook, April 2009. The border ranking excludes exclaves (e.g., Ceuta, Kaliningrad).
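The ranking itself is a one-liner once the border list is compiled. A sketch, with only a few of the 279 pairs filled in for illustration (numbers from the table above):

    # Hand-compiled border pairs plus a nominal-PCI lookup, sorted by
    # absolute income difference. Only a few of the 279 pairs shown.
    borders = [("Austria", "Liechtenstein"), ("Norway", "Russia"),
               ("USA", "Mexico"), ("Finland", "Norway")]
    pci = {"Austria": 50098, "Liechtenstein": 145734, "Norway": 95062,
           "Russia": 11807, "USA": 46859, "Mexico": 10235, "Finland": 51989}

    ranked = sorted(borders, key=lambda p: abs(pci[p[0]] - pci[p[1]]),
                    reverse=True)
    for rank, (a, b) in enumerate(ranked, 1):
        print(f"{rank}. {a}-{b}: {abs(pci[a] - pci[b]):,}")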


It's immediately interesting that the U.S.-Mexico border pops up 9th out of 279, and that it's one of the longest on this top-ten list. Three qualifiers are in order. a) There is less incentive to engage in risky activities if basic needs are met. An Austrian might know his neighbors are wealthier, but it's different when your PCI is over US$50,000 rather than just over US$10,000; if you're comfortable, you're probably less likely to consider running drugs to Liechtenstein. b) Not all borders are as easy to cross (legally or otherwise) as the one between the U.S. and Mexico. Many countries don't allow the same freedom of movement (Russia, for instance), many don't have as well-developed a road system as the U.S. or Mexico, and even a border about as long as the U.S.-Mexican one (Finland-Russia, say) may be far less hospitable than the often-mild desert between the U.S. and Mexico. c) Many of the extremes of per capita income reported by the IMF would be flattened if a median were used instead of a mean; few subjects of the UAE come anywhere close to the per capita income listed here.

What's the solution? I don't anticipate that Americans' consumption of drugs will stop any time soon, nor Mexicans' willingness to supply them; after all, markets are markets. The part of the equation we can control is the choice we've made that forces the profits from the drug trade underground. That is to say, if the United States decriminalizes, Mexico's unique resource curse can at least benefit Mexicans and their institutions openly. Sounds like a pie-in-the-sky solution, right? Wrong. One country - Portugal, a civilized EU country no less - has already done exactly this, and "judging by every metric, decriminalization in Portugal has been a resounding success."

Thursday, July 16, 2009

Batting Averages for the U.S. Major Parties

On my political blog I had calculated some statistics about U.S. presidential elections, specifically about how the Electoral College affected outcomes for the two modern major parties.

I thought it would be interesting to do a quick calculation about time in office and number of terms, starting with 1860 (the first election that was a real Republican-Democrat contest).

Since then, counting the current term, 15 of 38 terms (39%) have been Democratic administrations and 23 of 38 (61%) Republican. Not counting the current term, the average time in office for a Republican is 4.84 years, vs. 6.98 years for a Democrat. Yes, FDR throws it off; but to bring the Democratic average down to the GOP's, you have to take out FDR, Truman, and all three two-term Democrats (Clinton, Wilson, and Grover "Mr. Non-Contiguous" Cleveland). To bring the GOP's average up to the Democrats', you have to throw out Garfield, Harding, Ford, Arthur, Johnson, Hayes, Harrison, Taft, Hoover, Bush I, Lincoln, and McKinley. Looking at the list, it struck me that the iconic JFK is the shortest-tenured Democrat since the Civil War, and the second shortest of either party.

From the start of Lincoln's term until today, the GOP has held the presidency for 1104 months, vs. 821 months for the Democrats. If you look at contiguous administrations (whether or not the same individual ran them), Republicans have an average streak of 2.875 terms, and Democrats 2. Take away the post-Civil War era and the GOP falls to 2.43; take away FDR and Truman and the Democrats fall to 1.5. One interesting follow-up would be to run the same data over the same period for Congressional districts: besides showing the trend of Democrats getting elected in areas that vote Republican for president, there's more data and therefore less noise.
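For anyone who wants to replicate the streak arithmetic, a minimal sketch; the party-per-term list below is abbreviated and illustrative, not the real 1861-2009 sequence:

    # Share of terms and average streak length from a party-per-term list.
    # 'terms' is made-up data; the real list runs 1861-2009.
    from itertools import groupby

    terms = ["R", "R", "D", "D", "R", "R", "R", "D", "R"]  # illustrative only

    def avg_streak(terms, party):
        # Average length of runs of consecutive terms held by one party
        runs = [len(list(g)) for p, g in groupby(terms) if p == party]
        return sum(runs) / len(runs)

    for party in ("R", "D"):
        print(party, f"- share of terms: {terms.count(party) / len(terms):.0%},",
              f"average streak: {avg_streak(terms, party):.3f}")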

The interesting thing about looking at the data over time is that it appears to converge. Streak length appears to moderate too, at the same point, right after Truman (administration names are provided below for reference, because the x-axis is irregular with respect to time). What's interesting about this is that the "brands" the two parties represent - i.e., the demographics they capture given their message and the political climate of the time - have changed radically over a century and a half. Will these graphs still look this way after another century?

[Charts: average time in office and streak length over successive administrations, labeled by administration name]


Wednesday, July 15, 2009

When You Hear a New Name or Concept, Do You Often Hear It More than Once?

Have you ever had the experience of hearing about a new person, or a new word or concept, for the first time - and then hearing it again within a relatively short period? I'm not talking about a concept or celebrity that is genuinely new, or even obviously enjoying a revival, but rather a concept that appears not to be making the rounds with any greater frequency and just happens to be new to you. This has happened to me frequently in my life, though of course I can never remember any examples. It seems trivial, but for the meme-minded, it cries out for an explanation. So when it happened to me again today, I said to myself: I have to document this on my blog.

And here's the concept that I was exposed to twice. Last night I watched Confederate States of America, a really clever alternative history movie that deserves to be better-known. It's produced as if you're watching a British documentary from a parallel universe about the Confederate nation that rose from the ashes of the American Civil War, complete with commercial breaks that feature ads for slave-tracking collars. Granted, sometimes the tone is tongue-in-cheek, but overall it's very well-done, and highly recommended.

In the film, a real nineteenth-century physician, Samuel Cartwright, is cited for his "discovery" of drapetomania, a mental disorder exhibited by slaves running away from their masters. Yes, to us it sounds like these escapees were of perfectly sound mind, but the point of studying the topic is that the Cartwrights of the world were and are committing a grave injustice to science and to humanity by giving veneers of scientific respectability to depraved institutions like slavery.

A mere 14 hours later, I was listening to NPR and caught a This American Life story called Pro Se, about people representing themselves in court, which touched on psychiatry. I almost crashed my car when they mentioned Cartwright and drapetomania.

I provide the full context to emphasize that there is no clear connection between the two exposures. That is, there doesn't seem to be an ongoing press campaign about drapetomania, and if there were, I'm impressed that they got me to rent the relevant movie several years after its release, and on the day before the NPR story to boot. If this were the first time such a double-exposure had happened to me, you could rightly accuse me of confirmation bias. But I'm probably better-read than the average bear, so it's not all that frequently that I hear terms in mainstream media outlets that jump out as unfamiliar. If I were ten years old and hearing unfamiliar terms all the time, you'd have more of a point. But this has happened to me often enough - over the course of my adult life, several times a year at least - that confirmation bias becomes a harder argument to make.

Here are the explanations people have offered so far:

1) It's random, but I'm human, so I notice it. If I worked out the numbers, I'd find that I hear x new terms per unit time, so there's a small but nonzero chance that, having heard a term for the first time, I'd hear it again within forty-eight hours. Statistically, if it works out that this should happen several times a year, then it's nothing special. (A back-of-envelope sketch of this follows the list.)

2) I've heard the term before but didn't notice it until now. I find this one difficult to believe. I've been hearing "drapetomania" on radio and TV occasionally for years, and only now do I pay heed and recognize it as a word I don't understand?

3) It really is nonrandom. This would be easier to believe if (for example) I'd read "drapetomania" in an email from a college friend in Charlotte yesterday, and then overheard someone in a restaurant talking about it today. Maybe my friend is using the word a lot, he knows other people in San Francisco and emailed them, and the meme spreads that way. Sometimes this explanation seems like a plausible candidate, but it's harder to apply to information sources that you consume nonpassively. If you hear the same term on the evening news on three different network affiliates, that's not so special. If you read it for the first time in a book published in 1919, and then see it in a movie released in 1983 and a newscast the same night, that's special.
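Here's the promised back-of-envelope sketch for explanation #1. Both rates are pure guesses (which is the point - the conclusion swings entirely on them):

    # Back-of-envelope for explanation #1. Both rates are guesses, not data.
    import math

    new_terms_per_year = 50        # guess: brand-new terms I encounter yearly
    recurrence_per_day = 1 / 365   # guess: each term recurs about once a year
    window_days = 2

    # Poisson: probability of at least one recurrence within the window
    p_repeat = 1 - math.exp(-recurrence_per_day * window_days)
    print(f"P(repeat within {window_days} days) = {p_repeat:.4f}")
    print(f"Expected double-exposures per year = {new_terms_per_year * p_repeat:.2f}")

With these made-up rates the expected count is roughly 0.3 per year - well short of "several times a year" - so whether #1 suffices depends entirely on the true values of x and the recurrence rate.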

Note that by posting this to my blog, if I hear "drapetomania" again in the next few days from friends, I won't know if I've influenced the spread. It's worth noting that when this has happened in the past I've sometimes heard the term three or more times in a short period, making the probability of #1 lower.

A more speculative version of #3 above is that there are strange macro-patterns operating in human behavior that we're not yet aware of, and trivial though it is, this is one of them. But without a guess at how the pattern operates that allows me to make falsifiable predictions, this is the same as no explanation at all.

[Original post, 15 July 2009]
[Added 21 December 2009 - the next time I was aware of such an instance was in the past week, when I read about lipograms on Wikipedia. Ten hours later I ran across a mention of lipograms in a James Fallows piece. This one seems more easily explained, since I was reading about lipograms because I had read something on Marginal Revolution, and bloggers and readers of East Coast literary and economic-conservative blogs are a pretty small subset of total meme hosts. Still, it puts a ceiling on the frequency of these events - if you count this one, 0.2 month^-1.]

[Additional examples: I ran across Rex Wockner for the first time in the last week of December 2009, and then again the same week: first because he mentioned Point Loma while I was searching for the name of a mountain you can see from there, and then in a casual mention on Andrew Sullivan's blog. The first week of January 2010, I twice ran across the term "phallos" (in reference to East Indian mythology), and twice ran across the term "wattle" in usages I was unfamiliar with, having nothing to do with hanging skin (twice within six hours: in Tihkal by Sasha Shulgin, and in the name of a winery).]

Is Wine Uniquely Nuanced Among Fermented Fruit Drinks?

Is there anything special about wine that gives it its depth? Or is it just the millennia of accumulated culture that make it seem special, and fermented apple juice would have been just as promising as a snob drink?

Historical arguments miss the point. Yes, things might have been different if the apple had first been cultivated further west in Georgia (along with the first vines) instead of in Kazakhstan. But if there's something special about fermented grape juice that makes it so neat-o, then how could this have mattered in the long run?

I submit that currently we have no good reason to think there's anything special or nuanced or detailed about wine, relative to other fruit beverages, that earned it its snob status today. That beer has no such place in the modern West has more to do with European history than with anything about the drinks themselves - it's all about the impact of the Romans, and then of that empire after them which did so much defining of dining, the French, versus the smaller and (until the last few centuries) more modest city-states of northern Europe, quietly drinking their beer. Yes, "a few centuries" is a long time, but prestige signals are a giant coordination game, and they change only very slowly. Gold is another good example.

I think the relative prestige values of beer and wine, and the convergence of those prestige values, further weaken the "wine is innately special" argument. In the last few decades, attitudes toward beer even in the legendarily philistine U.S. have begun to change. Beer used to be something you chugged after mowing the lawn and didn't think about much beyond that, but now it's a beverage that is properly the subject of adjudicated festivals. I'd like to give my craft-brewing countrymen some of the credit for that, along with improved technology that allows what are effectively the centuries-old craft brews of northern Europe's villages to be enjoyed outside northern Europe.

Now back to the question of nuance. It's an empirical fact that beer is chemically far more complex than wine - there are around 200 flavor compounds in many beers, versus maybe 20 in wine. It's apparently only in the last decade that beer has been systematically run through chromatography columns to see what's in there, though I have my doubts that this is really true, since chemists are not infrequently also beer enthusiasts. The point is that if it's nuance you're looking for, beer has it all. Beer is closing the prestige gap, but it's not quite there yet: there are festivals, but men still show off wine knowledge to their dates instead of beer knowledge. Given the relative chemical richness of the two drinks, though, it's impossible to argue that wine's appeal is a result of its innate character rather than its history.

Having said all this, I'm just as suckered as anybody by some of the interesting episodes in the history of winemaking. To some degree I have to be; they'd throw me out of California otherwise. And there really is a lot you can do with it: for a New World barbarian with the good sense to ignore convention, the possibilities are astounding, if you're innovative and willing to fail now and again, like some winemakers. Commercial winemaking in California really got off the ground with the European phylloxera epidemic in the second half of the nineteenth century, one of the few times a New World organism infected the Old (although there is some concern that Douglas firs, eminently well-suited to marine climates, are today becoming another New World-to-Old World invasive in Europe). The European wine industry almost collapsed, and was saved largely by grafting European vines onto resistant American rootstock (along with some crossing with American vines). It's oddly underappreciated that the only places the old pure-European strains survive on their own roots are the sandy soils of a few Mediterranean islands, although I've heard rumors that there are isolated purebred Spanish vines growing in monastery gardens in a few mountain valleys of Mexico. Another odd bit of American wine history: one of the earliest successful California wineries, in Fremont, was destroyed in the 1906 quake, and you can still see earthen mounds at the site, covering piles of never-removed debris.

See? That's just one corner of wine history in one state, and cognitive hedonists can't avoid thinking about all that (and enjoying thinking about it) while drinking wine. It becomes part of the experience. But crucially, there's still nothing in any of this that could not have happened in some form with fermented apple juice (or beer). Even if some brilliant genetic engineer were able to make up for three millennia of underdevelopment of apple-wine in one year and develop a fully nuanced 2009 Red Delicious, you still couldn't take your fiancée on a tour of the orchard and talk about its history, how the Count of Lyzanxia used to walk there when Magritte came to visit, et cetera. And that accumulated history is critical, because signaling taste and knowledge is where the real game is. Even assuming there is special nuance in wine, and you happen to be among the gifted few who can tell the difference - if the drinking experience is what matters, the pure sense-pleasure you get from the taste of the wine - why would you care whether others know how goddamn good you are?

The following doesn't address the original question of whether fermented grape juice is a better substrate for nuance, but it does support the signaling hypothesis. A study of members of the Stanford University wine club looked at brain activity in wine drinkers in response to two wines. Unbeknownst to these connoisseurs, they were actually given the same wine twice, but they were told it was two different wines, one $5 a bottle, the other $45. They reported greater pleasure on drinking the "forty-five-dollar" wine, and their brains lit up in a way consistent with greater pleasure. So they probably weren't even lying - an effective self-deception strategy, if what you're signaling with your preference for the more expensive wine is not the taste experience but your own erudition and willingness to conform to fancy-pants values. Then again, the emperor's new clothes are for signaling, not for warmth. Try it yourself: have a blinded wine-tasting party at your house, and you'll find out how inconsistent people's answers are - in a 12-bottle tasting, the same bottle from the same winery will be ranked 3rd and 11th. In the end, maybe there are wine-tasters who actually know what they're doing. What's clear is that most people who think they know what they're doing don't. One function left intact, even after we write all this off as hedonic wheel-spinning, is signaling, part of which can be accomplished through conspicuous consumption. Boy, those forty-five dollars go down smooth!

I like wine. I'm not attacking it, really. But so much is written about this one consumable that you can't help but wonder whether there really is more information in a glass of wine than in any other drink produced by natural processes and therefore possessed of a rich chemical composition. In the interest of full disclosure, I freely admit to being a much bigger fan of beer, specifically unfiltered beers (like Belgian ones) - enough so that I'm posting this preference on a blog long after it became cool (i.e., a useful signal) to declare it. I have a pretty severe sweet tooth, and I'm probably picking up the simple sugars in these beers. It's probably also why I prefer nigori sake, which is an awesome preference to have, and here's why: nigori is unfiltered, cheap, poor-man's sake; you can keep your $200-a-bottle bullshit sake to yourself in your no-foreigners bars in Kyoto. (Anti-meta-signal: I'm so refined that I don't need to signal, and I overtly reject your signaling value system. ZING!)

I apply the same dismissal to wine as to sake. I've come to the conclusion that intentionally refining one's palate is a form of masochism that any self-respecting hedonist should reject. Why would I deliberately make my palate more difficult to please? By developing your taste, you're making your marginal unit of pleasure more expensive. If you have a bad case of wine signal-itis and enjoy announcing to dining compatriots all the flaws you've found in the wine on the table in front of you, put it in perspective this way: if you're American, chances are your grandparents had wine on only a few special occasions in their lives, and it was almost certainly disgusting. They'd smack you silly, not only for complaining about a wine that's a little too fruity, but for making it harder on yourself to enjoy eating and drinking. That's why I'm intentionally letting what little refinement I've achieved go fallow, and I automatically order the cheapest table wine on the menu. Or I don't, and get a Coke.

Sunday, July 12, 2009

The Decline of Overtly Signaled Subcultures

"...entire subcultures could rise overnight, thrive
for a dozen weeks, and then vanish utterly." - William Gibson's Neuromancer

The web is full of essays with nostalgic hat tips to the optimistic naivete of 1950s science fiction, but I feel the same way about the naive pessimism of 70s and 80s science fiction - not only cyberpunk but the whole body of sf with the theme of "the future will be disjointed and schizophrenic and incomprehensible and alienating," beginning in the 70s and extending through the 80s, exemplified by works like Shockwave Rider. It's 2009, and guess what? Yes, things move a little quicker; we have a recession and some wars; and on the whole, not many people pine away for the simple 70s - a time when, if you wanted to learn about a company, you had to drive to the library and hope you could learn something from the two printed paragraphs you might find after digging through microfilm for three hours.

One prediction of the dark, scatterbrained future that seems to be exactly wrong is the fragmentation of subcultures. When was the last time you saw a kid with a metal shirt on? Compare that to ten years ago. I have a special sensitivity to this issue, because as a reformed (but not retired) metal fan, I've noticed the trend - and it seems to extend to every strongly-signaled subculture (in terms of clothing or speech or hair). I don't think that's a coincidence.

Youth subcultures are about establishing identity - not only in opposition to your parents, but to the rest of the people in your own generation. I never once wished there were more metal kids at my school; I didn't have much contact with the other kids who were into it, and I kind of liked being "special". In fact, at my first Metallica show I had the uncomfortable realization that I was no longer "the metal guy", because suddenly I was surrounded by 10,000 Klingon-looking guys who were indisputably more metal than me.

But what did I get out of the long hair and the scary facial hair and the T-shirts? They signaled my specialness, and (stupidly, as a teenaged male) made me feel good that here, finally, was a way I could make others react - by signaling that "I am a member of this clan of loudness and drinking and aggression." Contrary to one folk belief, most metalheads actually do like the music - it's not just about shocking people, and in fact the music is the only part of the deal that I retain today. But would it have been as much "fun" if I couldn't signal my membership in this subgroup of unknown values to strangers? Of course not.

You can apply these same arguments to any non-mainstream subculture of the 80s or 90s whose members behaved or dressed in such a way as to mark themselves; I think kids adopted the markers for the same reason in every case, and I think the signaling has faded in every case for the same reason. That reason is, what else, the internet.

Kids in 2009 don't expect to be able to shock guys my age or older by coloring their hair or putting something on a shirt. They know we have access to the same websites they do: drive by a high school one day and see a weird-looking kid wearing a T-shirt with some incomprehensible expression on it, and you go home, Google it, and say "Oh, that's all it means." Any subculture that would rise and flourish - or even survive - must be able to do so even after being catalogued and described and compared by the peers and families of the teenagers who would adopt it. References to taboo subjects are irreversibly weakened when you can start reading Wikipedia articles about them. That's why, if you're a teenager now, you know instinctively the futility of trying to establish an incomprehensible artistic and dress code: it'll be on YouTube tomorrow. And in particular, any subculture that tries to build a feeling of power in its members through the intimidation that its other-ness creates is doomed to failure ab initio. Know why? Snopes.com. No kid, your Satan shirt doesn't scare me, because there's never been a real sacrifice. Now go back to gym class and see if you can run a mile in under 10 minutes.

In a way I feel sorry for kids who are 15 right now who, had they been born 20 years earlier, would have been punks and metalheads and goths. But there are still pockets of America where one can view these endangered species. The last time I saw a group of teens in full I'm-not-mainstream regalia during business hours (i.e. T-shirts and black trenchcoats not right outside a concert) was in Cody, Wyoming in August 2008. Go there, would-be Goths and punks and metalheads! Be free!