Tuesday, February 26, 2013
Questioning Outrage
1) News flash: people asking for money on the street are sometimes lying. When you give a beggar money, part of the deal is lack of accountability. You don't know what s/he will spend it on; you don't know if s/he is actually a millionaire on a lark. Expecting that a stranger to whom you give money is telling the truth is stupid, and being outraged by this man's behavior is frankly bizarre. That said, his fake speech impediment gives him away. He talks like someone with a brain injury, but his grammatical lapses are out of place.
2) What is the penalty for making damaging accusations? It's certainly the case that more rapes go unreported than false rape accusations are made. But the case of a serial false accuser of rape who is now getting jail time is prompting people to ask questions: since accusations of rape, molestation, and the like clearly impose an irreversible cost on the accused, what is the cost to the accuser in cases where the accusation turns out to be false? Another bizarre legal inconsistency is that lying to the court under oath is punished, but a defendant who pleads not guilty and is then found guilty does not automatically earn an additional charge for that lie.
Sunday, February 24, 2013
Specialization: Or, The Rest of Us Have No Chance
One of the problems in complex societies is that specialization produces information asymmetries in the marketplace, and without reasonably efficient information exchange, markets don't work. Three thousand years ago, there just wasn't that much to know about fish or bronze, and even though the fishmonger might know more than anyone else about his product, the imbalance was far less than it is today. Today, someone's whole career (and a whole industry's R&D apparatus) is spent trying to get you to, for example, eat junk food, while you have to divide your attention among many things, so more often than not they'll win. The same applies to any industry or profession you might name: finance, cars, medicine, ad infinitum.
Junk food turns out to be an apt example; here is an excellent piece showing how the food industry conceives and sells it.
Labels:
capitalism,
specialization
Monday, February 18, 2013
Tool to Search State Legislative Activities
This is a massive step forward in forcing government transparency and letting us evaluate the performance of our legislators. Without good information, we have no idea what they're doing or how well they're doing it - which of course is how they prefer it. You can track specific legislators and bills. Visit OPENSTATES.ORG.
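For the curious, here is a minimal sketch of how you might pull bills matching a keyword from Open States programmatically. The endpoint, parameter names, and response fields below are assumptions modeled on a typical REST API, not a documented interface; check openstates.org's own API documentation (and register for a key) before relying on any of it.

```python
# Minimal sketch: search one state's bills for a keyword via Open States.
# The URL, parameters, and response fields are ASSUMPTIONS for illustration;
# consult the actual Open States API docs for the real interface.
import requests

API_KEY = "your-api-key-here"  # hypothetical; registration is required

def search_bills(state, query):
    """Return bills from one state whose title or text matches `query`."""
    resp = requests.get(
        "https://openstates.org/api/v1/bills/",   # assumed endpoint
        params={"state": state, "q": query, "apikey": API_KEY},
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()

if __name__ == "__main__":
    for bill in search_bills("ca", "transparency")[:5]:
        # field names are assumptions as well
        print(bill.get("bill_id"), "-", bill.get("title"))
```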
The Law-Giving Machines
This post also appears at Speculative Nonfiction. This article includes story spoilers.
There are now actual drones in our skies, both watchers and hunter-killers. But they're (so far) only semi-autonomous, and they're on missions to protect us legally and militarily, rather than being sent by fellow machines to exterminate us. Thought experiments in fiction about automatic law-giving devices have been much more interesting than apocalypse porn about bad AI.
Two short stories come to mind here, one of which has enjoyed recent attention: Robert Sheckley's Watchbird (1953) and Larry Niven's Cloak of Anarchy (1972). Both stories involve surveillance drones that have some degree of autonomy and can hurt or kill their targets. In Watchbird, the drones are police devices intended to kill murderers before they commit their crime; the drones are able to learn on the job, and once released, they expand their definition of murder and start protecting all living things and even some machines. In the end, another drone is released to kill the first drone species, but of course it soon expands its own definition of what it should kill. (Watchbird was adapted for film here.)
The drone in Cloak of Anarchy is the copseye. In this future world, there are "free parks" where anything is allowed except violence against another human being. The floating copseyes watch over the park, and if violence is imminent, a copseye stuns both the aggressor and the victim; both wake up later, calmed down and with a hangover. Then someone finds a way to short-circuit the copseyes, and within hours factions have formed inside the park and violence breaks out.
The stories present us with two different sets of concerns, based on the problems that occur. In Watchbird, the central concern is the autonomy of the devices. Their ability to learn is what allows the problem to grow, but the protagonist is preoccupied with the very fact of machines executing laws without intervening humans. On one hand this could almost seem like a reactionary position: one of the greatest inventions of modernity was nations of laws and not of men. Intervening humans with narrow self-interest executing those laws have always been the problem! (Hence this proposal for a legal programming language in which to write laws that then have to compile against previous laws.) But even without that quibble, his point is well-taken that when autonomous law-givers are able to immediately carry out their sentences and we can no longer modify them, they might become paperclip maximizers, in Less Wrong parlance: a moral rule which seems universalizable has consequences that humans could not foresee when implementing it in all-powerful enforcers which can no longer be called back. The protagonist has no problem with more efficient enforcement, but rather with the moral mutations allowed by the machines' autonomy.
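To make the "compile against previous laws" idea concrete, here is a toy sketch of my own (an illustration, not anything from the linked proposal): each law is a rule mapping a situation to a verdict, and a new law only "compiles" if it never contradicts an already-enacted law on a shared set of test situations. All names and data structures here are hypothetical.

```python
# Toy sketch of "laws that must compile against previous laws".
# Everything here is an invented illustration, not a real legal DSL.
from dataclasses import dataclass
from typing import Callable, Optional

Situation = dict          # e.g. {"action": "kill", "in_self_defense": True}
Verdict = Optional[bool]  # True = permitted, False = forbidden, None = silent

@dataclass
class Law:
    name: str
    rule: Callable[[Situation], Verdict]

def compiles(new_law: Law, statutes: list[Law], test_cases: list[Situation]) -> bool:
    """Reject the new law if it contradicts any existing law on any test case."""
    for case in test_cases:
        new_verdict = new_law.rule(case)
        for old in statutes:
            old_verdict = old.rule(case)
            if None not in (new_verdict, old_verdict) and new_verdict != old_verdict:
                print(f"{new_law.name} conflicts with {old.name} on {case}")
                return False
    return True

# Usage: a blanket "no killing" law fails to compile against an
# existing self-defense exception.
no_killing = Law("no_killing", lambda s: False if s.get("action") == "kill" else None)
self_defense = Law("self_defense",
                   lambda s: True if s.get("action") == "kill" and s.get("in_self_defense") else None)
cases = [{"action": "kill", "in_self_defense": True}]
print(compiles(no_killing, [self_defense], cases))  # False
```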
Perhaps naively, little mention is made of the interests of those authorizing and supporting the program. Still, Watchbird does peripherally make the point that technology allows concentrations of power in the hands of individuals in a way that distorts society. With sudden increments in enforcement power, some humans are able to apply laws with an all-pervasiveness and immediacy that had simply never been possible before. Even someone with good intentions and what would previously have counted as good values would suddenly find himself or herself in a position of dictatorial authority. It's not even that power corrupts (although it does); it's that this centralization is so unnatural as to be impossible to handle with a good outcome. The best example is this exchange with the protagonist, Gelsen, early in the watchbirds' bad behavior:
"One of the watchbirds went to work on a slaughterhouse man. Knocked him out."One man is suddenly in the uncomfortable position of morally disapproving of whole industries and forcing them to change. What's more, this previously reluctant man does not seem so reluctant now.
Gelsen thought about it for a moment. Yes, the watchbirds would do that. With their new learning circuits, they had probably defined the killing of animals as murder.
"Tell the packers to mechanize their slaughtering," Gelsen said. "I never liked that business myself."
"All right," Macintyre said. He pursed his lips, then shrugged his shoulders and left.
It bears mentioning that the conclusion of the story, where the watchbird-killers are now expanding their own prey definition, recapitulates one of the problems with the proposed Singularity solution of building anti-AI AIs: there could conceivably be a parallel to an autoimmune disease if humans eventually fell within the anti-AI AIs' definition of their targets.
At first glance the problem in Cloak of Anarchy is a curious one - that humans immediately revert to violent tribalism when the violence control mechanism is defunct - since Niven is elsewhere clearly sympathetic to libertarian concerns. The obvious interpretation of the story is the paternalistic one, that humans need authority to make them behave. But there's another interpretation, which is that the drones created the problem. That is to say, when we are coddled by perfect enforcement from drones, we lose the ability to exercise moral choices, as well as the ability to appreciate the consequences of poor choices. When it is physically impossible to harm another person, why learn restraint? Why worry about what happens when you pick a fight? When suddenly the daddies aren't around to break up the fights and bail everyone out, we shouldn't be surprised at what happens.
The watchbirds do exist today, although with less autonomy and more firepower. The changes are incremental; there won't be a red-carpet unveiling of AI even as profound as the release of the watchbirds (or copseyes). The changes will come in the areas where there's the most pressure to advance and the least opportunity for public awareness and understanding. It will be, and already is, the addition of subroutines allowing a drone to apply the laws of war to a kill it's about to make (instead of waiting for slow permission from a JAG in an office in St. Louis who might be in the bathroom). It's the growth of autonomous stock-trading algorithms. It will even be in advertising on porn sites.
Labels:
ai,
immune system,
politics,
science fiction,
singularity
Alternative Geography: New States Based on Equal Population
A previous alternate ("what-if") map of the U.S. here.
From Fake is the New Real. Yes, I know you can't see it, so view the original.
I grew up in the Susquehanna-Pocono-Philadelphia tri-state area, now identify Yerba Buena as home, and currently live in (gulp) Orange. Coastal Tule is quite pleasant. Or, here's the U.S. if every group of counties that tried to secede from their parent state actually succeeded. (I don't know how you fit 124 stars on the American flag.)
Plus, here's the U.S. on the Moon, by Boredboarder8 on Reddit's Map Porn. I think the Great Lakes wouldn't last so long.
A New Fairy Tale
In Hungeree, from the Rose Carnival in Cologne.
What I love is that it's insulting to everyone involved, although the symbolic scheme is admittedly a little heavy-handed. For clarity, the pig facing the camera is Italy.
"The enforcer class is primarily concerned with itself (see Dorner)"
Please read this great piece by Ian Welsh, the Logic of Surveillance, which contains the statement in the title of this post. What was most disturbing about the Dorner incident was how far out of proportion the police went to "get him", compared with how they would have responded to a threat against any other professional class. They really didn't make much of an effort to hide the fact that this was a personal vendetta, because Dorner had declared war on police. Certainly when an armed and trained individual declares his intention to kill human beings, you have to find him and stop him. That said, if Dorner had made threats against lawyers, or dentists, or scientists, there certainly would have been a response, but not the war footing we saw.
What's more, I highly doubt that the public would be as blithely accepting of the collateral damage we experienced in the form of a mother and a mailman being shot because they were driving vaguely similar vehicles.
The full quote from which I took the title is: "The enforcer class is also insular, primarily concerned with itself (see Dorner) and is paid in large part by practical immunity to many laws and a license to abuse ordinary people. Not being driven primarily by justice and a desire to serve the public, and with a code of honor which appears to largely center around self-protection and fraternity within the enforcer class, the reliability of the enforcers is in question: they are blunt tools and their fear for themselves makes them remarkably inefficient." It bears pointing out that the Blue Wall of Silence is particularly strong in SoCal cities. The LAPD is notorious for this from incidents in the 90s, and San Diego's current mayor may be in office largely owing to support from the police union in a close election, since his competitor promised more thorough oversight.
Sunday, February 10, 2013
The Good and Bad of One Dimensional Status
This New Yorker piece about Amy Bishop - the woman who shot faculty members in her department when she was denied tenure - has been getting some reaction from within academia. At its core this is the story of a woman with a personality disorder who clearly needed status - as we all do - but whose reaction to its denial was repeatedly pathological. There's an increasing sense across disciplines that overlapping status hierarchies result in a better quality of life for many of us, though the same fragmentation also provides an opportunity to game the whole system; perhaps even that is still a net positive.
In Bishop's case, she defined her status entirely within one hierarchy - academia. Just about everyone who reads the piece will be angry at her for one reason or another, and in my case it was her narrowness in defining her worth this way. If you're that fragile, make sure you're not invested in a single hierarchy that can be taken away from you!
I've been having the reverse of Bishop's experience, which brings a different problem. As a nontraditional student in medical school with a successful previous career, it's incredibly de-stressing to know that I had a great life before medicine, and that if something goes wrong and I don't finish the MD, I'll be fine. I hesitate to say that out loud around my superiors, because they're used to being the sole dispensers of status and probably aren't pleased to find someone who is harder to motivate with the implicit threat of withholding it. At the same time, it's demotivating - the younger students, whose identities are entirely wrapped up in medicine, can't imagine a life without a future MD, and when you're burning the midnight oil studying, I imagine that makes it easier to get through another chapter of nephrology.
As a general rule in life, it's a good idea to stay out of zero sum games, and status is one of the most common zero sum games. If you have to play it, play in multiple leagues simultaneously.