Tuesday, February 17, 2015

Thoughts from New Orleans on Fat Tuesday, 2015

-One of the most striking things about New Orleans during Mardi Gras, oddly enough, is the attitude of the police. (Okay, lest you be questioning whether red blood still flows through my veins, there are other striking things too, noted below, but this one was perhaps the most surprising). I've never seen police so chilled out in my whole life, entirely unconcerned by the debauchery around them. I spent a while watching them, trying to figure out whether this was

a) the fact that nothing surprises them anymore, having seen all this nonsense thousands of times,

b) part of a brilliantly devised ‘small footprint’ strategy whereby they let small infractions go and concentrate only on the big stuff, as the debauchery is important for the city and police antagonism will mostly make the situation worse, or

c) whether they were in fact wholly nonchalant about crime, and simply didn’t give a flying @#$%.

It's probably a little of all three, but I ended up putting more weight on the last option than I had initially. Part of this came from hearing various stories from locals, including seeing a cop in uniform light up a joint, someone trying to alert police to a man passed out on the side of the road and receiving a shrug as the official response, and of course the murder rate of 57 per 100,000, which would give the hypothetical Republic of New Orleans the second-highest murder rate of any country in the world.

-Related to strategy b) above, New Orleans really reveals the absurdity of open container and street drinking laws. Who would have thought that people can actually take a beer from a bar out into the street and society doesn't collapse around them? Instead, the focus is on more practical things, like having all drinks served in plastic cups to minimize the risk of broken glass. You'd think that this kind of sensible example would catch on around the western world, but only if you'd never seen the absurd moral panics that society gets attached to. Giving people a ticket for having a beer in public is contemptible and unworthy of a free society.

-Having a passing familiarity with some of the extant literature on the subject, there was actually less public nudity at Mardi Gras than I expected. Which is to say, there was some, but it certainly wasn't ubiquitous. Never underestimate the power of good editing to create a very unrepresentative sample. As well as being more clothed on average, the crowd was also older and blacker than the literature would suggest. That editing would hide the first fact is unsurprising; that it would hide the second is perhaps more so.

-In the annals of 'curious facts about male sexual preferences', the odd fascination with public nudity is definitely up there. This is put into sharp relief when Bourbon Street has multiple strip clubs which will show you highly attractive fully nude women at a moment's notice for not very much money. But instead, during Mardi Gras people seem far more interested in the possibility of a one-second flash from someone who isn't a stripper, and usually doesn't have a stripper's body. Never underestimate the appeal of the illicit, of seeing what is normally covered up, and above all the element of slight reluctance. Seeing someone get convinced by a crowd to flash appeals to the male brain in ways that a girl on stage willingly taking off her clothes never quite captures. Male sexual preference is odd indeed, especially when it comes to strip clubs.

-Mardi Gras attracts a large number of very earnest Christians out to try to save the souls of revelers. I find these people fascinating. Say what you will about their beliefs, it takes some serious cojones to stand in the middle of Bourbon Street carrying a huge cross and yelling about Jesus to the potentially antagonistic drunks all around you. Most of us never believe anything with that kind of sincerity (for better or worse).

-The fact that Mardi Gras is associated with the Catholic traditions around Lent is always hilarious to me. People seem to have taken the idea of penance and renunciation for Lent and transformed it into a mere time-series shift in debauchery, keeping the total amount constant or, more likely, increasing it. Even funnier, the tradition of increasing sordid behavior before Lent stuck around long after people stopped following the other half of the bargain and giving up pleasures. Substitution effects are tricky things.

-Bourbon street is another example like the Vegas strip of the unusually strong power of network effects. There is very little architecturally, visually or resource-wise to set apart Bourbon street from nearby streets. But one of them is packed when the others are nearly deserted. Truly, people like being around other people.

-I went to the Orpheuscapade Ball, which was awesome. I only found out about the various balls because one of the girls in our group had grown up in New Orleans, and knew that this was the thing to do (while the tourists all go to Bourbon Street). There were thousands of people in black tie, watching the floats go through the New Orleans convention center. I really enjoyed seeing the old Southern High Society. You never hear about them much – I kind of thought the Civil War had routed most of that old tradition, but it still lingers on. All you hear about the South is the rural white trash side, but never the rich upper class white side. Especially the Southern society girls. Smoking hot, rich, conservative – what's not to love?

-Related to the above, the ball had as its main musical act a guy who was apparently a big country star. I've been in this country more than a decade now, which is long enough to lull me into the sense that I've pretty much got the hang of the place. And then I'll hear a country music concert and get reminded how there's a huge side of America that I just about never see. To make matters even stranger, a lot of the country music crowd would probably vote in a more similar way to me (if I were inclined to vote, which I'm not) than the people I live around. Though if you broke it down issue by issue instead of shoe-boxing everyone into one of two parties, the overlap would certainly become smaller. While the crowd here was a long way from the standard rural Republican voting set, the enthusiasm of the crowd for a wholly alien musical genre was a bit of a reminder of the extent of the country that is essentially invisible when you live in big coastal cities.

Thursday, February 12, 2015

A good heuristic for a certain type of BS

One phrase that in practice means almost the exact opposite of what it claims is the expression 'scientifically proven'.

I have known a good number of scientists, both social and physical, and I've never once heard them use this expression non-ironically to describe either their own or anyone else's work. Mathematics proves things, by formal theorems. Science, on the other hand, provides evidence that supports some hypotheses and rejects others. But even when a null hypothesis is formally rejected, knowledge in the sciences is contingent. Your theory is making falsifiable predictions that are so far consistent with the data, but which might be overturned at any time.
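To make the distinction concrete, here's a minimal sketch of what formally rejecting a null hypothesis actually involves. The data are simulated, and the effect size, sample sizes and 5% threshold are purely illustrative assumptions, not taken from any real study:

```python
# Minimal sketch: a two-sample t-test on simulated data.
# All numbers (effect size, sample sizes, threshold) are illustrative assumptions.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
control = rng.normal(loc=0.0, scale=1.0, size=100)  # group where the null is true
treated = rng.normal(loc=0.3, scale=1.0, size=100)  # group with a small true effect

t_stat, p_value = stats.ttest_ind(treated, control)

# Rejecting the null at the 5% level says the data would be unlikely if there were
# no difference. It does not 'prove' the alternative: a new sample or a better
# model could overturn the conclusion tomorrow.
if p_value < 0.05:
    print(f"Null rejected: t = {t_stat:.2f}, p = {p_value:.4f}")
else:
    print(f"Failed to reject the null: t = {t_stat:.2f}, p = {p_value:.4f}")
```

Note what the output licenses you to say: 'the evidence is inconsistent with no effect, at this threshold', which is a far weaker and more honest claim than 'scientifically proven'.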

And even in places like economics, theory models, which do use formal mathematical proofs of particular results and thus might loosely justify talk of 'proof', almost never use the term when referencing the broad idea they're trying to advance. Economists will say 'I solve a model which shows how information asymmetry affects trading volume', not 'Information asymmetry is scientifically proven to decrease trading volume'. What has been solved is one particular model, but there are many other competing models that may be consistent with the data too. Nobody would dream of saying that science proved their theory result.

'Oh sure', you might say, 'we understand that there's a distinction among the finer points of philosophy of science. But in practice, saying science has proved something just means there's lots of evidence consistent with it. Why be such a purist?'

A good question, since you asked.

The reason my heuristic works, however, is that most people who perform actual science do understand the distinction, and are likely to use the right language. By contrast, people who like the phrase 'scientifically proven' are almost always sneaking in an appeal to authority in order to paper over either a) their lack of understanding of the complexity of the issue, or b) the annoyingly inconclusive evidence for the particular proposition that they think it would be politically desirable for more people to believe.

The claim in the above paragraph, of course, is a hypothesis. In the name of science, we should see whether the evidence supports the hypothesis or not.

To check, here are the top five results that come up when I type the phrase 'reject the null hypothesis' into Google News:

1. Do Teams Undervalue European Skaters in the Draft?
2. Hypothesis Testing in Finance: Concept & Examples
3. Culture war in the deep blue sea: Science’s contentious quest to understand whales and dolphins
4. WaPo Climate Fail on Missouri
5. Using a fund manager? You'd get the same results at a casino

So that may not sound stellar, but they're all somewhat related to formal evaluation of evidence for and against ideas in the social or physical sciences. Now compare it with what comes up for 'scientifically proven':

1. Scientifically proven herbal aphrodisiacs
2. Writing Exercises Scientifically Proven To Redirect Your Life
3. 10 scientifically proven ways love can heal!
4. Emojis Are Now Scientifically Proven To Help You Get Lucky
5. Ryan Gosling’s Face Has Been Scientifically Proven To Make Men More Supportive Of Feminism

In other words, worthless clickbait. Colour me shocked.

The results, while not subjected to formal statistical testing, directionally support the hypothesis that 'scientifically proven' is a brain-dead appeal to authority by lazy English majors who wish to unjustifiably associate their claims with the patina of scientific credibility.

Saturday, February 7, 2015

Legal institutions are sticky things, often stupidly so

I cannot for the life of me understand why courts still award alimony.

Not child support - that still mostly makes sense in principle, though in practice it has its own problems, like the fact that it can be spent on any number of things other than care for the children. There are also particularly revolting versions, like California's paternity arrangements, whereby a man who is duped into believing that someone else's child is his own has only two years from the birth of the child to challenge paternity; otherwise he's stuck paying child support forever, genetic testing be damned. And even if he files in time, the court may still decide it's not in the child's interests - the man's interests, having been the subject of a vicious con that is the male equivalent of rape, are of less importance.

Where was I? Oh yes, the basic principles of child support are reasonable.

But what in the name of all that is holy is the justification for alimony in this day and age? When you marry someone, apparently you are entitled to a certain standard of living from that person in perpetuity. Phrased this way, it is bonkers.

For the feminists on this blog, here's a story a friend of mine told me today. His brother-in-law was married to a woman, and they had a child. The woman was a lawyer, but decided she wanted to stop working. She wasn't actually involved any more than the man in raising the child - they had nannies to take care of the child. Instead, the woman just lived a life of leisure, and never returned to work. At some point she got bored, began an affair, and divorced the man. She claimed alimony, which she was awarded, based on the lifestyle she had before. She could still go back to her legal career now, obviously, but why would she? The man will be stuck paying alimony unless the woman decides to remarry. Of course, since there are now enormous financial disincentives against her remarrying, the smart money predicts she'll just move in with her new boyfriend and never remarry, so as to keep the cash flowing.

How on earth did we end up with such a bizarre arrangement? It seems obvious that nobody in their right mind would design this monstrosity today. But it's a holdover from the years long past when
a) women couldn't work outside the home, so couldn't support themselves short of remarrying,
b) divorces were only granted on grounds of fault, so if the man wanted to just pack up and leave, he would be slugged with alimony, but if the woman was having an affair and the man sought a divorce, bad luck for the woman, and
c) the social pressure on people to marry the subject of their affair after the divorce was large, hence 'alimony until remarriage' was a reasonable estimate of the length of financial hardship.

It's pretty clear that none of this holds any more. There are very limited grounds for alimony when a woman has given up several years of a career to raise the family's children. But once the children are of school age, it's hard to know why courts should subsidise permanent leisure. And between nannies and daycare, there are plenty of ways for both parents to go back to work within a length of time that won't be massively disruptive to a career, certainly for the one-point-something children that the average couple has.

There are good policy reasons to make sure that a non-working partner doesn't get totally left in the lurch, particularly when children are involved. But remember, even without alimony the non-working partner is most of the time going to get a significant fraction of the assets, so they're not going to be totally broke. And if there are still reasons to grant payments under a limited form of alimony, it seems that they should be something like unemployment benefits - payments for a limited period of time while the person finds a new occupation. Why one should get alimony indefinitely without working is beyond me. And if there are no children involved, it is hard to conceive of any justification for alimony at all. Get a damn job!

None of this will happen, of course. Feminists like alimony because they live in a Stalinist 'who, whom' universe, where extracting resources from beta male schlubs is an end in itself.

The only chance whatsoever for alimony reform is that as women's incomes start rising, the number of cases where lazy men are claiming alimony from their working ex-wives is on the rise. That might finally strike feminists as being unfair and deserving of reform, but just about nothing else will.

Speaking of which, in that story I told you, I did alter one minor detail. The main protagonist was actually my friend's sister-in-law. The lazy parent who stopped working, began an affair and successfully claimed alimony? That was the husband.

And you know what? The absurdity and injustice is exactly the same.

Wednesday, February 4, 2015

On Jordan

Jordan has been in the news recently. A Jordanian fighter pilot who had been captured by ISIS was burned alive on camera. In response, Jordan executed two prisoners whom ISIS had purportedly been talking about trading for the pilot.

Whatever you may think of this in terms of the rule of law (and let's face it, it doesn't exactly bode well for your likelihood of getting a fair trial if you commit terrorist acts in Jordan), this is definitely sound, if grim, foreign policy. Doing nothing in response to foreign provocations (such as, ooh, I don't know, Benghazi) looks weak and contemptible. Invading a country in response, however, seems more like throwing good money and lives after bad. This is an escalation, but not a big one, and a payment in kind. It sends two messages - one, that brutality will have consequences, and two, that we will not be squeamish about how we choose to retaliate. Both of which are pretty reasonable messages to send to ISIS in response to this kind of thing.

As far as Middle Eastern countries go, Jordan is pretty damn good. They've got a fairly stable government, which seems to be reasonably popular - at any rate, they managed to ride out the awful Arab Spring without the massive disruption of nearby countries, suggesting a certain level of popularity for the King. They made peace with Israel two decades ago (which was sensible) and gave up claims to the Palestinian territories (which, given they subsequently turned into basket case hellholes, was doubly sensible). Plus they have a totally hot Queen, and the neoreactionary in me is cheered by seeing a monarchy getting some good PR, even if for dubious reasons.

Look, Switzerland it ain't. I wouldn't want to try to run a newspaper there, nor find myself on the wrong side of their police force. But as you may have noticed, there aren't a whole lot of Switzerlands in the Middle East, certainly among the Arab Muslim countries.

Do you know the thing that recommends Jordan to me the most?

You don't hear much about Jordan.

And believe me, in that neighbourhood, that's a pretty damn good outcome. The same is true of Kuwait, incidentally. But in the case of Jordan, they happen to share borders with Syria, Iraq, the Palestinian territories, Saudi Arabia and Israel, with Egypt just across the Gulf of Aqaba. Be honest, can you imagine a list of countries you'd less want to be next door to, in terms of promoting stability in your own country? Three of those places are essentially failed states, for crying out loud.

You may not like the country at an absolute level, but short of zombie Lord Cromer coming back to run it, it seems pretty likely that the Jordanian government is about as good as you're going to get any time soon from a government in that region. The perfect ought not be the enemy of the good.

Wednesday, January 28, 2015

Brecher on Boko Haram

The number of people who write interesting, nuanced pieces on Islam is shockingly low. Either all Muslims are crazy, or all Muslims are peace-loving, but it's got to be one of the two. More importantly, to the extent that anyone resiles from this position, it's only as a wholly dishonest, token, ass-covering gesture that simply emphasises the main sentiment by comparison. In other words, you'll read either 'Of course, not all Muslims are crazy terrorists, but [implicitly they are nearly all terrorists]', or 'Of course, there are a tiny number of Muslim terrorist fanatics, but [implicitly they are nearly all peace-loving]'.

If you want to test whether a given piece has any nuance, you can check whether it makes any attempt to distinguish between different groups of Muslims. In other words, if there are 'a tiny number of terrorist fanatics', to me it seems mightily interesting as to how we might diagnose which ones are which. Bangladesh, for instance, produces relatively few jihadis. Neither does Turkey. Saudi Arabia and Yemen, however, produce quite a lot. But how often do you hear about that distinction? Or how we should set policy as a consequence?

Gary Brecher, however, is one of the best sources of actually informed, disinterested commentary on the subject. The standard problem, as he puts it, is thus:
A few days ago, a suicide bomber got on a luxury commuter bus in Northern Nigeria and blew himself up, along with 60 people who were heading home from work.
It didn’t get much publicity. African casualties rarely do, especially when there’s a depressing religious angle. The suicide bomber came from the Northern Nigerian Islamist group “Boko Haram.” The name is interesting: “Boko” comes from the English word “book,” as pronounced by the Hausa, the biggest northern ethnic group. “Haram” (“forbidden”) is an Arabic word, the Wahhabis’ favorite word of all. When people talk about “Northern Nigeria” they mean “Muslim Nigeria.” There are three big divisions in the country: The Muslim/Hausa North, the Christian/Igbo South, and the Yoruba West. (The Yoruba are the only big group that’s mixed, with Christians and Muslims). Boko Haram blew up those buses because the people on them were going to an Igbo/Christian neighborhood of Kano, a Muslim/Northern city.
That’s already more than most squeamish Westerners want to know. “Ah, it’s religious…” is about all they need to hear before settling back into their comfy stances. Conservatives figure it’s just one more proof that all Muslims are crazy. The left mumbles “Islamophobia” and tries to change the subject to Palestine. So from left to right on your radio dial, there’s not a lot of what my social-studies teacher called “hunger for knowledge.”
I challenge you to argue that he's wrong.

Here's his latest piece, describing the disingenuousness of people who are suddenly interested in Boko Haram as a way of distracting attention from Charlie Hebdo.

It's awesome. Read the whole thing.

Tuesday, January 27, 2015

The Imitation Game

I recently saw 'The Imitation Game', the new Benedict Cumberbatch movie about the role of Alan Turing in breaking the Enigma code in World War II. Some thoughts thereon:

The movie managed to get a fairly even-handed description of a lot of the important parts of Turing's life and work. It managed to hit on artificial intelligence, computers, the general problems of encryption and information leakage, and the question of how much to act on information from the cables so as to not reveal that you've broken the code.

That may seem like an easy thing, but it's actually surprisingly hard. Compare it with, say, 'A Beautiful Mind', which covered the life of John Nash. That movie barely touched on what was Nash's singular contribution, the Nash equilibrium, and when they did, they managed to completely screw it up. In the game of 'do we all compete for the hot girl at the bar and crowd each other out, or target the plain ones?', they depict the answer as 'everyone goes for the plain ones, thus we all get paired up'. That's not a damn Nash equilibrium! If everyone else is going for the plain ones, your best response is to go for the hot one. But apparently the concept of randomisation was a bridge too far. So by that dismal standard, The Imitation Game is practically a cryptography textbook.
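To make the point concrete, here's a minimal sketch of the bar game with made-up payoffs - the player count, the numbers and the payoff function are all illustrative assumptions, not anything from the film or from Nash's own work. It checks the 'everyone goes for the plain ones' profile and finds that any single player can profitably deviate:

```python
# Minimal sketch of the bar game with illustrative payoffs (all numbers assumed).
# A plain girl is always worth 3; the hot girl is worth 10, but only if you are
# the only player who approaches her, and 0 otherwise.
from itertools import product

HOT, PLAIN = "hot", "plain"
N_PLAYERS = 4

def payoff(my_choice, others):
    if my_choice == PLAIN:
        return 3
    return 10 if others.count(HOT) == 0 else 0

def is_nash(profile):
    """Nash equilibrium: no single player gains by unilaterally deviating."""
    for i, choice in enumerate(profile):
        others = list(profile[:i]) + list(profile[i + 1:])
        deviation = HOT if choice == PLAIN else PLAIN
        if payoff(deviation, others) > payoff(choice, others):
            return False
    return True

print(is_nash((PLAIN,) * N_PLAYERS))  # False: deviating to 'hot' pays 10 > 3

# The pure-strategy equilibria are exactly the profiles where one player goes
# for the hot girl and everyone else goes plain.
print([p for p in product([HOT, PLAIN], repeat=N_PLAYERS) if is_nash(p)])
```

The symmetric answer the scene was presumably groping for involves randomisation: each player mixes between the two choices so that nobody can gain by deviating, which is precisely the concept the film skipped.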

The movie also made me curious about the question of who gets credit for big accomplishments like breaking the Enigma code. Doubt not that Turing was brilliant and a huge part of it. But did you know that a lot of the early work that made it possible was done by several Polish cryptographers before the war? I will go so far as to wager quite confidently that none of my readers has ever had cause to use the word 'Poland' (or its derivations) and the word 'cryptography' (or its derivations) together in any sentence they have uttered or thought, ever. Even after the story gets popular, they are forgotten. And in terms of geniuses who will never, ever be remembered, the movie made me wonder about who designed the Enigma machine in the first place. Though it was eventually cracked, it is an outstanding piece of cryptography. You will never hear about the Germans responsible for its creation. Brilliant German scientists were only famous once they had been successfully rehabilitated - Wernher von Braun was a genius when he was designing rockets for the Nazis, but it was only possible to acknowledge this brilliance in hindsight once he'd also used it to land Americans on the moon.

I was rather impressed with the quite sensitive way that they tackled Turing's homosexuality. They resist what I imagine would have been a tendency in a lot of treatments of the story: to make homosexuality the central part of his character, and his whole raison d'etre. Given the extent of Turing's intellectual achievements, such as basically founding computer science as a discipline, a movie that simply made him a gay activist or martyr would have deeply missed the point about his life. But this is exactly the kind of depressing mistake Hollywood tends to make. This is particularly so, given the tragic treatment he received at the hands of the authorities in being convicted of indecency and chemically castrated, which probably contributed to his eventual suicide. Rather, they show quite sweetly the scenes of a lonely Turing at school developing a romantic friendship with another boy, but one that never has an actual physical aspect of sexuality in any form. This depiction actually meshed very well with the way Robert Graves describes such things in 'Goodbye to all that':
"G.H.Rendall, the then Headmaster at Charterhouse, is reported to have innocently said at a Headmasters' Conference: 'My boys are amorous, but seldom erotic.' ...
Yet I agree with Rendall's distinction between 'amorous' (by which he meant a sentimental falling in love with younger boys) and eroticism, or adolescent lust...

In English preparatory and public schools romance is necessarily homosexual. The opposite sex is despised and treated as something obscene. Many boys never recover from this perversion. For every one born homosexual, at least ten permanent pseudo-homosexuals are made by the public school system: nine of these ten as honourably chaste and sentimental as I was.
"The school consisted of about six hundred boys, whose chief interests were games [sports] and romantic friendships."
The extent to which they manage to capture this atmosphere, without simply transforming it into modern ideas of what being gay involves, was a pleasant surprise.

The other idea that I really enjoyed seeing depicted was the old sense of Britishness - restraint, propriety, a stiff upper lip to the point of being emotionally distant. You could see how the British were able to run a huge empire for so long, and win World War I. The most memorable scene in this regard was when Joan, the female cryptographer and one-time fiancée (Keira Knightley), sees Alan (Benedict Cumberbatch) after his conviction and chemical castration. Turing is close to a breakdown and starts crying. Joan gently encourages him to sit down, and rather than talk through his problems, suggests that they do a crossword puzzle. I found this scene oddly touching and heartbreaking. It is a hallmark of the lost Britain that you only see today in the elderly. As Theodore Dalrymple recounts:
No culture changes suddenly, and the elderly often retained the attitudes of their youth. I remember working for a short time in a general practice in a small country town where an old man called me to his house. I found him very weak from chronic blood loss, unable to rise from his bed, and asked him why he had not called me earlier. “I didn’t like to disturb you, Doctor,” he said. “I know you are a very busy man.”
From a rational point of view, this was absurd. What could I possibly need to do that was more important than attending to such an ill man? But I found his self-effacement deeply moving. It was not the product of a lack of self-esteem, that psychological notion used to justify rampant egotism; nor was it the result of having been downtrodden by a tyrannical government that accorded no worth to its citizens. It was instead an existential, almost religious, modesty, an awareness that he was far from being all-important.
I experienced other instances of this modesty. I used to pass the time of day with the husband of an elderly patient of mine who would accompany her to the hospital. One day, I found him so jaundiced that he was almost orange. At his age, it was overwhelmingly likely to mean one thing: inoperable cancer. He was dying. He knew it and I knew it; he knew that I knew it. I asked him how he was. “Not very well,” he said. “I’m very sorry to hear that,” I replied. “Well,” he said quietly, and with a slight smile, “we shall just have to do the best we can, won’t we?” Two weeks later, he was dead.
Do you, like me, feel a great sorrow when you think of what once was, and how far we have fallen?

Saturday, January 24, 2015

Thoughts of the Day

"Thus, posterity's jest. Pre-war Europeans would never have entertained for a moment the construction of mosques from Malmö to Marseilles. But post-war Holocaust guilt, and the revulsion against nationalism and the embrace of multiculturalism and mass immigration, enabled the Islamization of Europe. The principal beneficiaries of the Continent's penance for the great moral stain of the 20th century turned out to be the Muslims — with the Jews on the receiving end, yet again."
-Mark Steyn, with context at the link

"There are 2500 British war cemeteries in France and Belgium. The sophisticated observer of the rows of headstones will do well to suspect that very often the bodies below are buried in mass graves, with the headstones disposed in rows to convey the illusion that each soldier has his individual place...
Every war is ironic because every war is worse than expected. Every war constitutes an irony of situation because its means are so melodramatically disproportionate to its presumed ends. In the Great War eight million people were destroyed because two persons, the Archduke Francis Ferdinand and his Consort, had been shot. The Second World War offers even more preposterous ironies. Ostensibly begun to guarantee the sovereignty of Poland, that war managed to bring about Poland's bondage and humiliation. "
-Paul Fussell, "The Great War and Modern Memory"

Sunday, January 18, 2015

A conversation in two parts, lightly edited.

Part 1.

CC: I'm off to see that movie, Selma, tonight.

Shylock: Here's a prediction for you. I'll bet you that at absolutely no point in the movie do they ever mention that George Wallace was a Democrat.

CC: I'm sure they do. If they do, you have to go watch the movie.

Shylock: Betcha they don't. We'll see.

Part 2.

CC: So I watched Selma - as I so wisely predicted, they did mention party affiliations. Implicitly. So now you have to watch the movie like you promised.

Shylock: "Implicitly?" You mean they never say Wallace was a Democrat? Well colour me shocked.

CC: Well, there wasn't an explicit line in the movie where they said that George Wallace is a raging Democrat. But there were definitely a few scenes where he was talking very intimately with LBJ in the way that only party comrades do. It was totally obvious.


CC: Listen, it is totally possible that they said he was a Democrat and I missed it.

Shylock: "Dishonest biases of liberal filmmakers correctly predicted in advance by cynical reactionary, pundits astounded, full report at 11".

CC: If people who watch the film are so obtuse that they don't know that LBJ was a Dem, then I doubt explicitly stating anything was going to make a difference.

Shylock: Put it this way - can you think of any way they could have told the important facts of the story with any LESS emphasis on party affiliations than they actually did?

CC: Yes, I can think of many ways. Not emphasising LBJ and George Wallace as characters at all. Or revising history, and making them all Republicans.

Shylock: How's Birmingham, AL, doing these days? I'm guessing that doesn't get mentioned much either.

CC: That's beside the point. And I don't know - great food, strong family values? Doing as good as it ever was, I assume.

Shylock: Of course it is. We took away the racist institutions, and yet somehow now it looks like the third world. There are lots of possible explanations, but it's at least a puzzle, no?

CC: It's very hard for me to jump to the conclusion that Birmingham had become *a third world country* because of the enfranchisement of a minority. More likely has something to do with the decline of agrarian agriculture or whatever.

Shylock: If Birmingham were a country, its 2012 murder rate of 67 per 100,000 would make it literally the second highest in the world, behind Honduras. In 1951, the rate was 13 per 100,000. Probably agrarian agriculture. Or whatever.

Shylock: And personally, I'm in favour of universal disenfranchisement, but that's a separate issue.

Thursday, January 15, 2015

On the Charlie Hebdo killings

It's taken me a while to write about the Charlie Hebdo killings. It takes me a while to write anything anymore in this august journal, but it wasn't just that.

I felt genuinely stirred by one thing, first and foremost. The Charlie Hebdo staff had some pretty damn enormous stones. Drawing original Mohammed cartoons, under your own name, when the location of your office is publicly known, after you've already been firebombed once for doing so? That, my friend, is some serious commitment to thick liberty of speech. The ghost of John Stuart Mill is applauding the glorious dead of Charlie Hebdo. They paid the ultimate price to insist that the right to speak one's mind exists not only as a theoretical construct, but one that you can actually exercise. Behold, the roll of honour:
  • Cabu (Jean Cabut), 76, cartoonist
  • Charb (Stéphane Charbonnier), 47, cartoonist, columnist, and editor-in-chief of Charlie Hebdo.
  • Mustapha Ourrad, 60, copy editor.
  • Tignous (Bernard Verlhac), 57, cartoonist.
Alas, I fear we will not see their kind again soon.

The whole #JeSuisCharlie show of support was a mixed bag. I was at least heartened by the extent of explicit public solidarity, though I was inclined to agree with the various commentators who noted that there is a definite strain of false bravery by association in the hashtag, at least compared with the stupendous bravery of the actual Hebdo staff. But this is relatively minor.

One odd and yet somewhat positive result to come from this affair is that it finally, surprisingly dragged a number of US publications kicking and screaming into publishing some kind of depiction of Mohammed. They were for the most part unwilling in initial reporting to show any of the original cartoons that provoked the ire of the killers. They were certainly unwilling to print absolutely any of the Danish Mohammed cartoons a few years ago, to their great disgrace. 

But when the cover of the next edition of Charlie Hebdo was released, it seemed to finally shame some fraction of the American media into growing some balls, no matter how tiny and shriveled. Partly I suspect this was out of sympathy for their fellow journalists, partly because they perhaps sensed that they'd have enough of a justification and safety in numbers. Still, credit where diminutive credit is due, a surprising number at long, long last were willing to show something. According to the Daily Beast, the Washington Post ran photos of the cover, while USA Today and the Los Angeles Times put photos on their websites. Even the BBC, to my astonishment, put a picture up in one online story (which seems to be the 'trial balloon' option, since you can take it back down again if you suddenly get scared). But of course, cowardice continues to win the day at CNN, ABC, AP, The New York Times, and so on. If you're unwilling to even reprint a cover specifically related to the story, whose depiction of Mohammed is not only mild and inoffensive, but which even contains the words 'All is Forgiven' above it, you'd sure as hell better not claim that you, too, are Charlie.

You can bet your ass that even the current crop of the recently less craven won't run more Mohammed pictures again soon. But the current reversal was made possible by the fact that, for a short time, a good number of the usual suspects who would ordinarily trumpet how free speech shouldn't include the right to say anything that might hurt the feelings of (certain chosen) religious minorities were at least temporarily shamed into silence. As expected, it didn't last long. It never does.

From this point on, alas, the story had mainly disappointment for me. 

With the distance of a few days, what strikes me the most about it is the fact that the only approved, socially acceptable response is sadness, and a "show of support", whatever that means. (Of course, half the left can't even muster that, going only for the mealy-mouthed equivocation of "I support free speech, but...".)

But even take the #JeSuisCharlie people, whose solidarity I'm still glad to have. What exactly does it get you? You can have candlelit vigils in Paris. You can have hashtags. You can "show support", as an individual, and you can even assemble an impressive number of world leaders to do the same.

But then what?

What, exactly, does anyone plan to do in response? What, if any, policies or actions will change as a result?

The men who invaded the Charlie Hebdo offices were willing to trade their own lives, with very high expectation, to make sure that the people who drew and printed Mohammed cartoons were brutally and publicly killed. They were willing to die to send the message that if you create and distribute pictures of Mohammed under your own name, you will eventually be hunted down, even if you have police protection.

Will future such men be deterred by your hashtags? 

Will they be frightened by your "support"?

It is worth asking whether the killers succeeded in their purpose. Depressingly, I have to conclude that they did.

If you were a cartoonist, what would you learn from all this?

I'd learn, if I didn't already know it, that if I wrote a Mohammed cartoon, there's a strong chance I'd get killed. I might also learn that there's a reasonable chance I'd get a sympathetic hashtag going afterwards. How do you think that bargain strikes most cartoonists?

You don't have to guess to find out. Have a look through The Australian's gallery of cartoons drawn in the aftermath. I see a strong sentiment that the pen is mightier than the sword. I see a distinct lack of new drawings of Mohammed. 

I don't mean to single these guys out as cowards. They've just performed exactly the calculation that the terrorists wanted them to perform: if you draw a cartoon about Mohammed and publish it in such a way that we can identify you, you may be killed. Eli Valley drew about the dilemma quite poignantly here. It ends with the depressing conclusion: "The only context for me is this: call me a coward, but I want to continue to be alive." Not exactly stirring, is it? But then again, what have you done lately that's equivalently brave as what you're asking of him?

Mr Valley is absolutely right in his calculation of the stakes. Doubt not that this is deadly serious. Ask Molly Norris, a Seattle cartoonist in hiding since 2010 after death threats were made to her over her cartoons during 'Everyone Draw Mohammed' day. This is happening in America too. The only difference is that very few got printed the first time around, so there's fewer people to threaten.

Hence, the current implied scenario. Reprinting someone else's otherwise respectful depiction of Mohammed probably won't get you killed. Drawing your own anonymous Mohammed cartoon won't get you killed. Owning up to your public drawing quite possibly will.

Is there any serious doubt that of the people in the west who were, i) willing to publicly put their name to pictures of Mohammed, and ii) were set to run such cartoons in a major print publication, a large fraction were killed last week?

This is why the terrorists succeeded.

So let's take it as given that "support", while better than opposition, will not in fact diminish the chances of future attacks occurring, nor will it significantly reduce the likely deterrent that the current attacks provide against new people drawing pictures of Mohammed. On its own, support, in other words, won't achieve anything. We return to the question from before. What, then, does anyone propose to do?

There is a very good reason that sadness is the only socially acceptable response. Anger, by contrast, requires action. When people are angry, they might actually do something. Is there anything that current political opinion will actually allow to be done?

The terrorists who perpetrated the act are already dead, so aside from cathartic displays equivalent to hanging Mussolini's corpse, there is nothing to be done there.

And since we are loudly informed by all the great and the good that such attacks are representative of absolutely no wider sociological phenomenon but are merely the work of a tiny number of deranged madmen, apparently there's nothing to do directly to anyone else either.

So what if one's anger were turned towards the question of how we might ensure that this doesn't happen again, what might acceptable opinion consider?

Various Deus Ex Machina type answers get proposed. Better surveillance! Stop the flow of weapons to terrorist groups! Convince more Muslims to embrace free speech!

Very good. How, exactly, should this be accomplished?

The only one that might have any chance is the first. At least in America, we tried that. It was called The Patriot Act. While it is hard to judge its effectiveness, when the very name of your policy has effectively become shorthand for 'knee-jerk response to terrorism that permanently eroded important civil liberties', you may see why 'better surveillance' is not in fact an ideal policy response.

As for the second option, if anyone has the vaguest idea about what policy France might have implemented that would have succeeded in preventing the terrorists from having access to the weapons they had, I'm yet to hear it.

As for the third, nobody in any position of political power seems to have much of an idea how to get radical Muslims to love free speech other than 'be scrupulously nice to Muslims, insist that they're all peace-loving, don't discriminate against them, try not to offend them by depicting pictures of Mohammed...'

Give or take a few hiccups, it seems to me that this is the policy we've already been trying, no? This, in other words, is what brought us to the current position. Even if one were to think that we haven't done enough in this direction (like communism, true outreach has never been tried!), it surely seems worth at least considering the possibility that this policy actually does not work, and then what else one might do.

The West has collectively taken an enormous bet. It has bet that it can allow mass immigration from certain Muslim countries and successfully include such people into society in a way that doesn't compromise the West's own core values or result in permanent social conflict.

Maybe that bet is right. Every fibre of my being hopes that it is right. But Gnon cares little what you'd like to be true. It cares only about what is.

However, the West, and the left in particular, cannot back away from its bet, no matter how high the stakes, no matter what evidence piles up. Because something much bigger is at issue. To acknowledge the possibility that the policy of large scale immigration from certain countries might have been mistaken would be to contemplate the notion that radical egalitarianism is false; that, much as we may hope it to be true, people are not all the same, and cultural systems are not all equally valid.

This will never be given up by the left. Never, ever, ever. 

Muslim immigration was never the cause, it was only ever the symptom. The cause was always our iron belief in radical egalitarianism. 

And this is why, in the end, we come to the conclusion that we knew all along. 

What, exactly, will the West do in response to all this? 


It will do nothing at all.

Tuesday, December 2, 2014

What is Said, What is Unsaid

Political correctness, like all successful forms of social censorship, tends to go through two phases.

There is a loud phase, where average people are actively confronted about why they can't say those mean things any more. Sanctimonious and humourless scolds, buoyed with righteous indignation, delight in vocally complaining about the oppressiveness of some other innocuous set of jokes and observations. Most of the populace doesn't have much of a dog in the fight, so just goes with the default position of trying not to offend people, and accedes to the request. A smaller group of contrarian reactionaries fights a rearguard action of ridicule and stubborn insistence on the status quo, but usually knows it's a losing battle. Cthulhu swims left, after all, so you may as well get on the right side of history.

This phase is the part that everyone remembers.

But after that, there's a quiet phase of political correctness too. Once the kulaks have been beaten into obedience and the new list of proscribed words and ideas becomes the reigning orthodoxy, what's left behind is the silence of the things that used to be said but now aren't. It lingers a while, like a bell that's been struck, and then gradually fades to nothing. Once this happens, it's easy to forget that it was ever there. What goes unsaid long enough eventually goes unthought, as Mr Sailer put it.

And the only way to see what's gone is to look at the past, and see what used to be said but now isn't.

In the case of political correctness, since it's a relatively recent phenomenon, you don't even need to go that far in the past.

Movies are a great example of this. It's a useful exercise to consider which classic movies from the past couldn't get made today.

Some are kept around in the public consciousness as examples of how wicked we used to be - everyone knows you can't make Birth of a Nation any more, but very few people today would even want to. Other movies are partially excused because of their cinematic value, although it's well understood that nobody should get the wrong idea - Gone With the Wind, for instance (1939, 8.2 out of 10 on IMDB). People still like that movie, but nobody would imagine that the current script, with its copious references to 'darkies', would get through even the first read-through at a studio. But this was made a long time ago, so it should get some credit for its good intentions - it was progressive for refraining from using the word 'nigger' and including the word 'damn', both choices of which proved surprisingly far-sighted. So Gone With the Wind gets a partial pass, like a racist Grandma that people still find lovable as long as you don't get her on the subject of the Japanese or crime in America.

But interestingly enough, there are some modern examples too, that people still think of fondly, but couldn't get made today.

Rain Man, for instance (made in 1988, 8.0 on IMDB), will never ever have a sequel or a reboot. It is inconceivable that you could make it today. The entire premise of the movie is that Dustin Hoffman is an autistic savant, and Tom Cruise is his intolerant brother who needs to transport him across the country in order to get access to an inheritance. The premise of nearly every joke is Dustin Hoffman's odd and innocent behaviour, and Tom Cruise's aggressive, cynical and frustrated ways of dealing with it. (Sample quote, yelled at Dustin Hoffman's head: "You can't tell me that you're not in there somewhere!")

As it turns out, the portrayal of Dustin Hoffman's character is actually rather sympathetic - while a lot of the jokes involve the absurdity of his behaviour, at least part of it is about Tom Cruise being a complete insensitive dick about it all, so the broad message is certainly not that it's hilarious to make fun of autistic people. But that wouldn't stop the autism activists having a fit if it were made today. It wouldn't get greenlit, it wouldn't get seriously discussed, it wouldn't get through the first glance through of a script reader, and because everyone knows this, it wouldn't get written in the first place.

Or take Silence of the Lambs (made in 1991, 8.6 on IMDB). The offending premise here is a little bit more subtle - the serial killer Buffalo Bill, whom the protagonists are hunting, is a man who kills and dismembers women in part because he was frustrated at his inability to become transsexual. That is to say, he wanted to become a woman as a result of childhood abuse (because why else would you want to become a member of the opposite sex if your thinking wasn't deranged for some good reason), but he was denied a sex change operation due to said abusive circumstances. It's taken as a fairly straightforward premise of the movie that the desire to amputate one's genitals and attempt to become a member of the opposite sex was, prima facie, an indication of likely mental illness. Hence it's not surprising that he would wind up killing women as part of his sexual confusion and jealousy.

These days, it's a mark of bigotry to even raise questions about whether transsexuals should be allowed to use women's bathrooms, or compete in women's MMA tournaments.

If the movie were being rewritten today, Buffalo Bill would probably be a killer driven by misogyny, and his evil childhood influences would be rejection by women and too much reading of the manosphere. THAT will make you kill people! Transsexuals are just fine, in fact they're better than that, they're almost a protected group (or will be soon - trust me).

And these are just the changes that have happened in my lifetime.

The changes in the zeitgeist that happened before one's lifetime are far harder to see.

If you really want to see what they are, pick a few random primary sources from Moldbug, cited next to each relevant post.

If conservatism is the democracy of the dead, the only way to find out how they might vote if they could is to actually read what they've written.

Who knows what ideas are going entirely unthought in your head, not because you examined them and rejected them, but simply because you have never heard them at all.

Wednesday, November 19, 2014

The worst law in London

What does absurd government monomania in the face of technological irrelevance look like?

Back in the early years of the 20th century, before computers had become widespread, the word 'calculator' actually referred to people. They would perform large numbers of arithmetic calculations, essentially being a slow and kludgy version of a spreadsheet.

Let's suppose, hypothetically, that being a human calculator was a licensed and highly regulated profession in 1920. The government required you to study for years, and prove that you could do hundreds of long division calculations without making a mistake. A whole mystique grew up about 'doing the sums', the examination required to become a calculator. Only licensed calculators were permitted to perform arithmetic operations for more than half an hour a day in a commercial setting.

Then IBM popularises the computer, and Richard Mattessich invents the spreadsheet, and it becomes totally clear to absolutely everybody that 'doing the sums' is completely worthless as a skill set. Not only does keeping the current regulation raise costs by a lot, but it produces huge deadweight loss from all the people devoting years of their life to studying something that's now completely redundant.

What do you think the response of the government and the public would be once it became apparent that the new technology was cheap and easily available? Immediate repeal of the absurd current regime? Outcry and anger at the horrendous government-mandated inefficiency?

Ha! Not likely.

I suspect the old regime would trundle merrily along, and the New York Times would write philosophically-minded pieces extolling the virtues of it.

Because, dear reader, there actually exists regulation exactly this disgraceful - The Knowledge, the required examination for London taxi drivers.

The New York Times Magazine wrote a long piece describing just how much taxi drivers are required to memorise:
"You will need to know: all the streets; housing estates; parks and open spaces; government offices and departments; financial and commercial centres; diplomatic premises; town halls; registry offices; hospitals; places of worship; sports stadiums and leisure centres; airline offices; stations; hotels; clubs; theatres; cinemas; museums; art galleries; schools; colleges and universities; police stations and headquarters buildings; civil, criminal and coroner’s courts; prisons; and places of interest to tourists.
 Test-takers have been asked to name the whereabouts of flower stands, of laundromats, of commemorative plaques. One taxi driver told me that he was asked the location of a statue, just a foot tall, depicting two mice sharing a piece of cheese. It’s on the facade of a building in Philpot Lane, on the corner of Eastcheap, not far from London Bridge.
What, in the name of all that is holy, is the purpose of making it a legal requirement of driving a taxi that you can name the location of a foot-tall statue of two mice that exists somewhere in London?

In the first place, the demand for finding the location of a statue like this from your taxi driver is zero. A precisely estimated zero, as the statisticians say. The revenues side of the ledger is a donut. It is literally inconceivable that the location of this statue has been the subject of a legitimate question towards a London taxi driver in the history of the entire profession. The only benefit is rent-seeking and limiting the size of the taxi industry. So why not just make them memorise the Roman Emperors in chronological order, or the full text of War and Peace? It would serve just as much purpose.

Not only is there no value to your taxi driver knowing this, but if I type in 'statue of two mice in London' into Google, the first image lists the location as 'Philpot Lane'. (The only sites that come up, ironically, are ones referencing the damn test, suggesting just how pointless this knowledge is). The internet has made memorising this kind of trivia, for all possible sets of London trivia, irredeemably useless.

Everything a taxi driver needs to know has been replaced by a smartphone. Everything. Which is why every man and his dog can drive Uber around just fine.

So what threadbare arguments does the NYT offer when, three quarters of the way through the article, it finally gets around to discussing the question of whether this damn test is worth anything?
Taxi drivers counter such claims by pointing out that black cabs have triumphed in staged races against cars using GPS, or as the British call it, Sat-Nav. Cabbies contend that in dense and dynamic urban terrain like London’s, the brain of a cabby is a superior navigation tool — that Sat-Nav doesn’t know about the construction that has sprung up on Regent Street, and that a driver who is hailed in heavily-trafficked Piccadilly Circus doesn’t have time to enter an address and wait for his dashboard-mounted robot to tell him where to steer his car.
Okay, I'll bite. They beat them in staged races by... how much? One minute? Maybe two? Perhaps 60 or 70% of the time? And the value of this time-saving is what, exactly? How does it compare to the extra time the person waited trying to hail a cab because of the artificial limit on the number of taxis?

It seems that New York Times writers are not required to distinguish between statements like 'the revenue side of the income statement here has literally no items on it' and the statement 'this is a positive NPV project that should be invested in'. Apparently, disproving the first statement is sufficient to establish the truth of the second. Look, there's a benefit! Really! See, it shows it must be a good idea to do the project.

Perhaps sensing the unpersuasive ring of this argument to anyone who's ever ridden in an Uber and found it cost 40% of the price, we then get another tack:
Ultimately, the case to make for the Knowledge may not be practical-economic (the Knowledge works better than Sat-Nav), or moral-political (the little man must be protected against rapacious global capitalism), but philosophical, spiritual, sentimental: The Knowledge should be maintained because it is good for London’s soul, and for the souls of Londoners. 
Well, in that case!

But riddle me this - how, exactly, can I tell whether this egregious rent-seeking and artificial deadweight loss monopoly is good for London's soul? 
The Knowledge stands for, well, knowledge — for the Enlightenment ideal of encyclopedic learning, for the humanist notion that diligent intellectual endeavor is ennobling, an end in itself. 
'Enlightenment'. You keep using that word, I do not think it means what you think it means.

Learning is definitely good. Government-mandated learning, especially when used as part of banning the consensual commercial activity of many individuals, is a wholly separate matter.

Just ask someone from the Enlightenment, like John Stuart Mill:
But, without dwelling upon supposititious cases, there are, in our own day, gross usurpations upon the liberty of private life actually practised, and still greater ones threatened with some expectation of success, and opinions propounded which assert an unlimited right in the public not only to prohibit by law everything which it thinks wrong, but in order to get at what it thinks wrong, to prohibit any number of things which it admits to be innocent.
Like, for instance, driving a cab without studying for years to satisfy a ludicrous exam requirement. 

But it's not just the higher taxi fees and difficulty getting a cab at the wrong time of night that make up the real tragedy here. What's the human toll of making every potential taxi driver learn this kind of nonsense, regardless of whether they ultimately succeed?
McCabe had spent the last three years of his life thinking about London’s roads and landmarks, and how to navigate between them. In the process, he had logged more than 50,000 miles on motorbike and on foot, the equivalent of two circumnavigations of the Earth, nearly all within inner London’s dozen boroughs and the City of London financial district. 
 It was now 37 months since he’d paid the £525 enrollment fee to sign on for the test and appearances. “The closer you get, the wearier you are, and the worse you want it,” McCabe said. “You’re carrying all this baggage. Your stress. Worrying about your savings.” McCabe said that he’d spent in excess of £200,000 on the Knowledge, if you factored in his loss of earnings from not working. “I want to be out working again before my kids are at the age where someone will ask: ‘What does your daddy do?’ Right now, they know me as Daddy who drives a motorbike and is always looking at a map. They don’t know me from my past, when I had a business and guys working for me. You want your life back.”
This must be a strong case of the false consensus effect on my part, because reading this paragraph filled me with furious rage, yet the NYT writes about it as one of those quaint things they do in old Blighty.

In the end, McCabe gets his license, so it's all a happy story!

He does not, however, get the three years of his life and £200,000 back.

How on earth do the parasites who run the testing and administration of this abomination justify all this to themselves? How do they explain their role in this shameful waste of money and fleeting human years, the restrictions on free and informed commerce, the ongoing fleecing of consumers, and the massive, groaning, hulking, deadweight loss of this monstrous crime against economic sense and liberty?

They must be either extraordinarily intellectually incurious, morally bankrupt, or both.

As the Russians are fond of saying, how can you not be ashamed?

Friday, November 7, 2014

They're all IQ tests, you just didn't know it

Here's one to file under the category of 'things that may have been obvious to most well-adjusted people, but were at least a little bit surprising to me'.

Many people do not react particularly positively when you tell them what their IQ is, particularly when this information is unsolicited.

Not in the sense of 'I think you're an idiot', or 'you seem very clever'. Broad statements about intelligence, even uncomplimentary ones, are fairly easy to laugh off. If you think someone's a fool, that's just, like, your opinion, man.

What's harder to laugh off is when you put an actual number to their IQ.

Having done this a couple of times now, the first thing you realise is that people are usually surprised that you can do this at all. IQ is viewed as something mysterious, requiring an arcane set of particular tasks like pattern spotting in specially designed pictures, which only trained professionals can ascertain.

The reality is far simpler. Here's the basic cookbook:

1. Take a person's score on any sufficiently cognitively loaded task = X

2. Convert their score to a normalised score in the population (i.e. calculate how many standard deviations above or below the mean they are, turning their score into a draw from a standard normal distribution). Subtract off the mean score on the test, and divide by the standard deviation of scores on the test: Y = (X - E[X]) / σ(X)

3. Convert the standard normal to an IQ score by multiplying the standard normal by 15 and adding 100:
IQ = 100 + 15*Y

That's it.

Because that's all IQ really is - a normal distribution of intelligence with a mean of 100 and a standard deviation of 15.
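
For the concretely minded, the whole cookbook fits in a few lines of Python. This is a sketch only - you still need the test's population mean and standard deviation from somewhere, and the numbers in the example are placeholders of my own invention:

    def iq_from_test_score(score, test_mean, test_sd):
        # Step 2: how many standard deviations above or below the test's mean (a z-score)
        z = (score - test_mean) / test_sd
        # Step 3: rescale to the IQ convention of mean 100, standard deviation 15
        return 100 + 15 * z

    # e.g. a hypothetical test with mean 500 and SD 100:
    # iq_from_test_score(650, 500, 100) -> 122.5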

Okay, but how do you find out a person's score on a large-sample, sufficiently cognitively-loaded task?

Simple - ask them 'what did you get on the SAT?'. Most people will pretty happily tell you this, too.

The SAT pretty much fits all the criteria. It's cognitively demanding, participants were definitely trying their best, and we have tons of data on it. Distributional information is easy to come by - here, for instance. 

You can take their score and convert it to a standard normal as above - for the composite score, the mean is 1497 and the standard deviation is 322. Alternatively, you can use the percentile information they give you in the link above and convert that to a standard normal using the NORM.INV function in Excel. At least for the people I looked at, the answers only differed by a few IQ points anyway. On the one hand, the percentile route takes into account the possibly fat-tailed nature of the distribution, which is good. On the other hand, you're only getting percentiles rounded to a whole number of percent, which is lame. So it's probably a wash.
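
If you'd rather do the arithmetic in code than in Excel, here's a rough sketch in Python of both routes, using scipy's norm.ppf as a stand-in for NORM.INV (my choice of tool, not anything in the original calculation):

    from scipy.stats import norm

    SAT_MEAN, SAT_SD = 1497, 322  # composite-score figures quoted above

    def iq_from_sat_score(sat_score):
        # Route 1: raw composite score -> z-score -> IQ
        return 100 + 15 * (sat_score - SAT_MEAN) / SAT_SD

    def iq_from_sat_percentile(percentile):
        # Route 2: published percentile -> z-score via the inverse normal CDF
        return 100 + 15 * norm.ppf(percentile)

    # e.g. iq_from_sat_score(1900) comes out around 119; feeding in whatever
    # percentile the published tables give for that score should land nearby.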

And from there, you know someone's IQ.

Not only that, but this procedure can be used to answer a number of the classic objections to this kind of thing.

Q1: But I didn't study for it! If I studied, I'm sure I'd have done way better.

A1: Good point. Fortunately, we can estimate how big this effect might be. Researchers have formed estimates of how much test preparation boosts SAT scores after controlling for selection effects. For instance:
When researchers have estimated the effect of commercial test preparation programs on the SAT while taking the above factors into account, the effect of commercial test preparation has appeared relatively small. A comprehensive 1999 study by Don Powers and Don Rock published in the Journal of Educational Measurement estimated a coaching effect on the math section somewhere between 13 and 18 points, and an effect on the verbal section between 6 and 12 points. Powers and Rock concluded that the combined effect of coaching on the SAT I is between 21 and 34 points. Similarly, extensive metanalyses conducted by Betsy Jane Becker in 1990 and by Nan Laird in 1983 found that the typical effect of commercial preparatory courses on the SAT was in the range of 9-25 points on the verbal section, and 15-25 points on the math section. 
So you can optimistically add 50 points onto your score and recalculate. I suspect it will make less difference than you think. If you want a back-of-the-envelope calculation, 50 points is 50/322 = 0.16 standard deviations, or about 2.3 IQ points.
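
Or, spelled out in the same terms as the sketch above (same assumptions, same SAT standard deviation):

    SAT_SD = 322
    coaching_bump_points = 50                    # the optimistic end of the studies quoted
    bump_in_sds = coaching_bump_points / SAT_SD  # roughly 0.16 standard deviations
    bump_in_iq = 15 * bump_in_sds                # roughly 2.3 IQ points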

Q2: Not everyone in the population takes the SAT, as it's mainly college-bound students, who are considerably smarter than the rest of the population. Your calculations don't take this into account, because they're percentile ranks of SAT takers, not the general population. Surely this fact alone makes me much smarter, right?

A2: Well, sort of. If you're smart enough to think of this objection, paradoxically it probably doesn't make much difference in your case - it has more of an effect for people at the lower IQ end of the scale. The bigger point though, is that this bias is fairly easy to roughly quantify. According to the BLS, 65.9% of high school graduates went on to college. To make things simple, let's add a few assumptions (feel free to complicate them later, I doubt it will change things very much). First, let's assume that everyone who went on to college took the SAT. Second, let's assume that there's a rank ordering of intelligence between college and non-college - the non-SAT cohort is assumed to be uniformly dumber than the SAT cohort, so the dumbest SAT test taker is one place ahead of the smartest non-SAT taker.

So let's say that I'm in the 95th percentile of the SAT distribution. We can use the above fact to work out my percentile in the total population, given I'm assumed to have beaten 100% of the non-SAT population and 95% of the SAT population:
Pctile (true) = 0.341 + 0.95*0.659 = 0.967

And from there, we convert to standard normals and IQ. In this example, the 95th percentile is 1.645 standard deviations above the mean, giving an IQ of 125. The 96.7th percentile is 1.839 standard deviations above the mean, or an IQ of 128. A surprisingly small effect, no?

For someone who scored in the 40th percentile of the SAT, however, it moves them from 96 to 104. So still not huge. But the further you go down, the bigger it becomes. Effectively you're taking a weighted average of 100% and whatever your current percentile is, and that makes less difference when your current one is already close to 100.
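
Here's the whole adjustment as a Python sketch, under the simplifying assumptions above (the 65.9% college share and perfect rank ordering), again with scipy's norm.ppf standing in for NORM.INV:

    from scipy.stats import norm

    COLLEGE_SHARE = 0.659  # BLS figure quoted above; the other 0.341 are assumed not to sit the SAT

    def iq_adjusted_for_selection(sat_percentile):
        # Beating a fraction p of SAT takers means beating all non-takers plus p of the takers
        true_percentile = (1 - COLLEGE_SHARE) + sat_percentile * COLLEGE_SHARE
        return 100 + 15 * norm.ppf(true_percentile)

    # Reproduces the examples in the text (to rounding):
    # iq_adjusted_for_selection(0.95) -> about 128 (vs about 125 unadjusted)
    # iq_adjusted_for_selection(0.40) -> about 104 (vs about 96 unadjusted)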

Of course, the reality is that if someone is offering these objections after you've told them their IQ, chances are they're not really interested in finding out an unbiased estimate of their intelligence; they just want to feel smarter than the number you told them. Perhaps it's better not to offer the ripostes I describe.

Scratch that, perhaps it's better to not offer any unsolicited IQ estimates at all. 

Scratch that, it's almost assuredly better to not offer them. 

But it can be fun if you've judged your audience well and you, like me, occasionally enjoy poking people you know well, particularly if you're confident the person is smart enough that the number won't sound too insulting.

Of course, readers of this august periodical will be both a) entirely comfortable with seeing reality as it is, and thus nearly all pleased to get an unbiased estimate of their IQ, and b) whip-smart anyway, so the news could only be good regardless.

If that's not the case... well, let's just say that we can paraphrase Eliezer Yudkowsky's advice to 'shut up and multiply', in this context instead as rather 'multiply, but shut up about it'.

The strange thing is that even though people clearly are uncomfortable having their IQ thrown around, they're quite willing to tell you their SAT score, because everybody knows it's just a meaningless test that doesn't measure anything. Until you point out what you can measure with it. 

I strongly suspect that if SAT scores were given as IQ points, people would demand that the whole thing be scrapped. On the other hand, the people liable to get furious were probably not that intelligent anyway, adding further weight to the idea that there might be something to all this after all.

Sunday, November 2, 2014

On Being Sensible

There comes a point in one’s life where one surrenders to the lure of the practical, rather than the romantic. Bit by bit, the arguments for whimsical and aesthetic considerations make way for the fact that it is generally better to simply have one’s affairs in order. I suppose this is part of what maturity means – the extension of one’s planning horizon, so that the present value of sensible choices outweighs the desire to do things merely for the je ne sais quoi of seeing something new.

For those of us of a mostly sensible bent, the appeal of solid, practical decisions doesn’t need much extra boosting. But even among such as I, there is still romance, in the broad sense. It just shows up in unexpected places.

While I don’t know exactly when this shift towards sensibility occurred (or even if it had a particular turning point), I do know one of the marks of its arrival.

The clearest indicator, at least to me, is which seat one chooses on the aeroplane.

At some point, the desire to be able to easily get to and from the bathroom becomes the thing one values in this microcosm of life’s choices. Stepping over people is a pain, not being able to pee when one wants to is a pain, waking up people who fell asleep at inopportune times is definitely a pain, especially for the introverted. Life is just easier when you don’t have to worry about these things.

And yet, sometimes an overbooked flight forces you into a window seat, and you remember when you used to pick the window to watch the world beneath. You gaze out into the silvery moonlight, with wisps of clouds floating below you. Tiny patches of criss-crossing light mark the small towns far distant, defying the sea of darkness. The steady glow appears as lichen, growing in odd patterns along the grooves of a rock in an otherwise barren desert.

How many generations of your ancestors lived and died without seeing a sight so glorious?

How many would trade this for slightly more convenient bathroom access?

It is worth noting that this tradeoff does not need to be explained to small children. They instinctively get what’s amazing about watching the world below at takeoff and landing.

Particularly for those of us whose affairs are mostly in order, it is worth being occasionally reminded of the lesson.

Tuesday, October 21, 2014

The Various Ironies of Gough Whitlam

Former Australian Prime Minister Gough Whitlam died recently, at age 98. Predictable hagiographies followed, with cringe-inducing link titles like 'Gough Whitlam a Martyr and a Hero'. This causes right-thinking people to be torn between the polite and worthy tradition of not speaking ill of the recently-deceased, and a mildly grating feeling that the hagiographers write the narratives when this happens. Obituaries are hard to do well, that's for sure, and most don't even really try.

Say whatever else you will about Gough Whitlam, but he was a transformative Prime Minister. Unfortunately, the balance of this transformation was decidedly negative. To his credit, Whitlam enacted some truly good policies, most notably getting rid of the draft, and cutting tariffs. He also brought in some others that were probably inevitable, like no-fault divorce and recognition of China. He also had some disastrous ones. The Racial Discrimination Act was probably his most poisonous legacy, most recently in the news for being part of the trashing of free speech in the prosecution of Andrew Bolt. Getting rid of university fees almost certainly contributed to the permanent underfunding and subsequent underperformance of Australian universities to this day. He also cut off Rhodesia (leading it to the brilliant sunlit uplands it's in today), and rewarded the buffoonish Lionel Murphy for his bizarre raids on ASIO offices (which tarnished Australia's reputation as a serious state in intelligence matters) by appointing him to the High Court (where he was predictably and comically awful).

But the big irony of the Whitlam years involves the Liberal Party. They struggled so mightily to unseat him, including blocking the funding of government to provoke a constitutional crisis. Blocking supply, I might add, is something the Libs attract an oddly small amount of criticism for, given its role in the whole affair. Whitlam was famously dismissed by Governor-General John Kerr (who has been the bogeyman of the Labor party faithful ever since). Whitlam was then voted out by a huge margin in the ensuing election (a fact that Whitlam fans never seem to discuss very much, since it doesn't fit the narrative very well).

So the Liberals finally won their big victory over Whitlam! And what was their big reward?

Eight years of Malcolm Bloody Fraser, the most disappointing Liberal Prime Minister ever, and one of the worst overall (giving Gough a red hot go for that title).

If the election is between Fraser and Whitlam, honestly, why even bother? It's like the David Cameron v Gordon Brown election - as Simon and Garfunkel said, every way you look at it, you lose.

Thankfully, conservatives eventually had something to cheer for when Fraser was kicked out and Australia finally got some sensible and important economic reforms, coming from... Labor Prime Ministers Bob Hawke and Paul Keating! The former was excellent, the latter was pretty decent too (and superb as Hawke's treasurer). Ex-post, is there a single member of the Liberal Party today (excluding the braindead and the hyper-partisan) who, if sent back in time to 1983 but knowing what they do now, would actually vote for Fraser over Hawke?

And yet Whitlam is the 'hero and the martyr'. Hawke plays second fiddle in Labor Party folklore, despite being excellent in ways that were of mostly bipartisan benefit (floating the dollar, cutting inflation, and other instances of important micro-economic reform).

Yeah, I don't get it either.

Monday, October 20, 2014

More Gold

It always warms my heart when the mere title of an essay makes me laugh. Herein, the estimable Theodore Dalrymple, with 'Your Dad is Not Hitler'. His other recent essay, 'A Sense of Others' is also fantastic.

Honestly, if Taki's Magazine had Moldbug and Heartiste writing for it, one would scarcely need to go anywhere else.

Sunday, October 19, 2014

Yes, we are still on for the thing tonight, just like we said, god dammit.

Continuing my descent into old fogey-ness, I seem to have encountered another shift in the zeitgeist that marks off my age. The first one was the enormous increase in the number of text messages sent by the average teenager. But this was something one would mostly only see if one actually had a teenager around the house. Since this doesn't apply to me, I only find out about it in odd magazine articles.

But there is another trend that I have had cause to experience firsthand - the proliferation in confirmatory text messages over every social arrangement.

Up until recently, my general presumption was that things worked as follows:

-You and person X would agree to do activity Y at time Z.
-If one of you couldn't make it, you would inform the other ahead of time.
-Absent that, it is assumed that the arrangements stand and you both turn up at time Z.

You, like me, might presume that this is how things still work, yes?

You, like me, might end up being rather surprised.

These days, a lot of people, particularly young people, seem to have decided collectively that they're switching from an opt-out system of arrangements to an opt-in one. In other words, plans to do things in two days time are merely a suggestion, a vague agreement-in-principle. If you actually intend to follow through, you have to confirm this.

I found this out when I started getting messages asking if we were still on for what I considered agreed-upon plans. I used to respond with 'of course' or something like that, wondering vaguely why this was now the thing that people did, but dismissing it as evidence of their neediness or insecurity. Confirming seemed pointless, but not a big deal.

I remember complaining to a friend, and saying that it was refreshing to find people who didn't need this. I was meeting someone new for coffee that evening, and was glad that we hadn't done the obligatory text message dance, which seemed like a good sign. That is, until she didn't actually turn up. Apparently she had decided that not receiving a confirmation was an indication that things were canceled, so much so that she apparently hadn't bothered to message me to check.

To paraphrase Frank Costanza, as I rained abusive text messages on her, I realized there had to be another way. After my rage subsided, it became pretty clear that my attempts to fight a rearguard action against the culture were as doomed as the 50's protests against rock and roll. So I now suck it up and send confirmatory messages. Sometimes one still isn't enough - I've sent a confirmation the night before, only to get another query confirming things an hour before. Who are these people, and what on earth is wrong with them?

I think the reality is that people have become so flaky that this is actually the more efficient social arrangement. When enough people become sufficiently inconsiderate that they just cancel all the time at the last minute, confirmations are actually time-saving. They're only a net drain when the probability of last-minute cancellations is already low, at which point they're a nuisance. This was what I assumed was the case, but apparently not. The real shift will have arrived when cancelling is so common that it's not even considered that impolite. Once again, I'm pretty sure this is a generational thing.

If narcissism and self-centredness are the psychological traits of our age, then flakiness is merely the natural result. Everyone else's time is less valuable than mine (one reasons), so what difference does it make if I change plans on someone at the last minute? Actually, it's probably worse than that - the median reasoning (such as it is) is probably closer to 'I have something better on, or can't be bothered. Ergo, I won't go'. To that extent, expecting confirmatory text messages at least indicates an ability to escape from pure solipsism and anticipate everyone else's self-centredness too. Which, at the margin, I guess is a good thing, even if the need for such anticipation is ultimately depressing.

Plus I just hate sending zillions of text messages, which annoys me too. Why? Same underlying reason.

Monday, October 6, 2014

Crazy is not a hypothesis

One of the criticisms I sometimes hear of behavioral finance, mostly from the rational crowd, is that one is just showing that 'people are crazy' or 'people are stupid'. This is always said dismissively, as if such an observation were trivially true and thus unworthy of further study or elaboration.

The first indication that this is a vastly overblown criticism is given by the fact that, despite the claimed triviality and obviousness of people's stupidity and craziness, these traits don't seem to find their way into that many models - the agents in those models are all rational, you see.

Well, actually, it's a bit subtler than that. Stupid agents have actually been in models for quite a while now, most notably in models that include noise traders, trading on false beliefs or for wholly idiosyncratic reasons.

But agents who could be described as 'crazy' are harder to find - acting in completely counterproductive or irrational ways given a set of preferences and information. So why is that?

The reason, ultimately, is that 'crazy' is usually not a useful hypothesis. It's a blanket name given to a set of behaviors that falls outside of what could be considered rational behavior, or even partially rational (such as kludgy rules of thumb or naive reinforcement learning).

And the reason you know that crazy isn't a useful hypothesis is that it tells you very little about how someone will act, other than to specify what they won't do. How would you go about modeling the behavior of someone who was truly crazy? Maybe you could say they act at random (in which case things look like the noise traders that we labelled as stupid). But are you really sure that their behavior is random? How sure are you that it's not actually predictable in ways you haven't figured out? It seems pretty unlikely that there are large fractions of traders who are in bona fide need of institutionalisation in a sanatorium, if for no other reason than someone who was really bonkers would (hopefully) struggle to get a job at the JP Morgan trading desk or acquire enough millions of dollars to move financial markets.

The whole point of behavioral economics (and abnormal psychology before it) is to figure out how people are crazy. When someone is doing something you don't understand, you can either view it as mysterious and just say that they went mad, or you can try to figure out what's driving the behavior. But madness is an abdication of explanation.

Good psychiatry reduces the mystery of madness to specific pathologies - bipolar disorder, psychopathy, depression, autism, what have you. 'Madness' functions as the residual claimant, thankfully getting smaller each year.

Good behavioral finance ultimately strives at similar ends - maybe people are overconfident, maybe they use mental accounting, maybe they exhibit the disposition effect. These are things we can model. These things we can understand, and finally cleave the positive from the normative - if rational finance is a great description of what people should do but a lousy description of what they do do, then let's also try to figure out what people are actually doing, while still preaching the lessons we formulated from the rational models.

To say that behavioral finance is just 'people acting crazy' is somewhat like saying that all of economics can be reduced to the statement 'people respond to incentives'.  In a trivial sense, it may not be far from the truth. But that statement alone doesn't tell you very much about what to expect, as the whole science is understanding the how and the why of incentives in different situations - all the hard work is still to be done, in other words.

It's also worth remembering this in real life situations - when someone you know seems to be acting crazily, it's possible they have an unusual form of mental illness as yet unknown to you, but it's also possible that you simply have inadequate models of their preferences and decision-making processes. Usually, I'd bet on the latter.

Thursday, September 25, 2014

A thing I did not know until recently

The word 'se'nnight'. It's an archaic word for 'week', being a contraction of 'seven night(s)'. The most interesting thing is that it makes it immediately clear where 'fortnight' comes from, being a similar contraction of 'fourteen night(s)'. The more you know.

Via the inimitable Mark Steyn.

Tuesday, September 23, 2014

On the dissolving of political bands and the causes impelling separation

Well, Scottish independence has come and gone, thank God. The list of grievances being cited was pathetic enough to make even the complaints of the American colonists (already laughably overblown) seem like the accounts of survivors from North Korean prison camps.

But one thing this whole debacle really illustrated is the following: very few people these days think in a principled way about secession. When, if ever, do a group of people have a right to secede from a country? Do they even need legitimate grievances? How many of them need to agree, and by what margin?

This is certainly true in America. What are the two historical events that most people in this country agree on? Firstly, that the American revolution was a jolly good thing and entirely appropriate. And secondly, that the civil war was fortunately won by the North, whose cause was ultimately just (this is probably still somewhat disputed in the South today, but I think it's probably broadly agreed on overall).

Ponder, however, the surprising difficulty in reconciling those two positions in a principled manner. For some thoughts on the justification for the Confederacy, meet Raphael Semmes, a Captain of the Confederate States Navy. Have a read of how an actual member of the Confederacy justifies the South's position. It's all in the first couple of chapters of his book, 'Memoirs of Service Afloat', which Gutenberg has for free here.

If you're too lazy to read the original, his argument is quite simple. Firstly, he argues that the same rights that gave the states the ability to join the union gave them the right to leave - they were separate political entities capable of their own decisions, a status that predated the union. Second, he argues that the people of the North and the people of the South are fundamentally dissimilar in attitude and culture. And finally, that the North had been oppressing the South over the years, and the South simply wanted out.

Now, you may consider these arguments persuasive or unpersuasive. But before you decide, it is worth comparing them to the arguments that the American Colonists claimed as their justification for seceding from Britain. Semmes' argument, if you boil it down, essentially says that we claim the same right to secede from the Union as the thirteen colonies claimed as their right to secede from Britain.

Perhaps slavery is the trump card, the elimination of which (presuming for a moment that this was the sole rationale for the war from the Northern perspective, a far from obvious point) had such moral force that it overwhelmed all the other arguments. But without this logical Deus Ex Machina, it is quite challenging to come up with a consistent set of principles under which the colonies' independence was justified but the South's was not. It's not impossible, but it's not straightforward either. And when you're done with that, be sure to reconcile it with your thoughts on independence in Kosovo, Catalonia, Chechnya, the Kurds in Turkey, ISIS in northern Iraq and other modern examples.

Or put it this way - hypothetically, had the South agreed to abolish slavery, and then done so in a way that meant reinstating it was impossible, but afterwards still insisted on secession, would their cause have been justified then?

I really don't know what most Americans would say to that one.

I don't think Americans are alone in this unthinking attitude to the question.

You saw exactly this on display in the Scottish fiasco. Most political unions don't contain explicit descriptions of how they can be dissolved. This goes doubly so for countries like Britain, which don't have a formal constitution at all.

What this means is that it's entirely unclear when or which bits of it can break off. Scotland at least had the virtue of being a polity with its own history, own accent, own traditions and so forth. People know who 'The Scots' are, so you don't need to explain why they should be considered their own entity. But what if Glasgow decided that, notwithstanding the opinion of the rest of Scotland, they wanted to secede from the UK themselves? Could they do it? Population-wise, there are as many people living in Glasgow (596,000) as in Montenegro (625,000) or Luxembourg (549,000). And if Glasgow, what about Inverness (72,000)?

And not only that, but the lack of formality was on display by the method of deciding the question. A single referendum, with the Scots as the only people being consulted. Moreover, for a decision this momentous, you might assume that you need some kind of supermajority or something. But since we can't specify that kind of thing ahead of time, the default assumption is that a simple majority will do, one time. If 50.01% of Scots want to leave, then out they go. Bad luck for the remaining 49.99%. Bad luck for any Scots yet to come who might have preferred the union. I suspect that if Cameron had thought he might lose, he might have asked for a higher standard. But a) how would he justify that higher number, and b) if he did, would he then be bound by the outcome?

For a lot of major political decisions, the public never gets consulted at all. It's not clear if the British will get a vote on whether to stay in the EU. They did get a referendum in 1975 on whether to remain in the European Economic Community (which later became the EU), but you'd be a bold man to claim that signing up to the EEC meant a full knowledge of the leviathan that the EU would later become. In November 2012, support for leaving the EU was 56%. Under the one-time, one-vote rule, that could have been enough to get them out. One might say that holding this vote would force exclusion from the EU for future Brits, who might not be able to change their minds. Then again, one could equally say that the vote in 1975 forced inclusion on lots of modern Brits who now also can't change their mind.

I don't pretend there are easy answers to any of these questions. The libertarians would say every individual has the right to secede from any group, which is a consistent, if difficult to implement, position.

But what the whole Scotland thing has shown is that avoiding thinking about these kinds of questions doesn't make them go away. They're going to come up periodically, and you just get incoherent answers by not having any contingency plans.

Everyone goes into marriages thinking they'll last forever. And yet we still think it prudent to have divorce procedures well known in advance.

Since I'm mostly a fan of formalism, I think countries would benefit from the same arrangements.