Archive for the ‘rationality’ Category

OKTrends has an amusing post, but what I like about it is that it’s consilient with the process I defined here. My idea was that songs rated 5 might be good, but might also just be violently weird to the reviewer; by the same logic, the same must be true of the 1s. Assuming that my tastes aren’t the same as the reviewer’s, the information in the reviews was whether the music was mediocre or potentially interesting. The output is here.

The OKTrends people seem to have rediscovered the idea independently looking at dating profiles – it’s better to be ugly to some and beautiful to others than it is to be boringly acceptable to everybody.

Over at CT, a link to two polls – here and here. The killer finding is that the same sort of percentage of the US population, and the same sort of people, deny that Barack Obama is a US citizen *and* that the American and African continents were once part of the same landmass.

Specifically, an actual majority – well, a plurality – of Republicans disagree with plate tectonics. The crossbreaks are hilarious; the only groups of which this was true were Republicans and Southerners, but the most likely groups to get it right were blacks, Latinos, and Democrats, in that order, so those were almost certainly the same individuals.

So far, another stupid Americans story. But the differences between the groups can be summarised as follows:

  1. Less Republican-leaning groups had a small but real advantage in the percentage who answered Yes.
  2. More Republican-leaning groups had a considerable lead in the percentage who answered No.
  3. More Republican-leaning groups had many fewer Don’t Knows.

I think the most important point is number three. The Democratic- and fact-leaning groups, although significant numbers of their members got it wrong, were much more likely to say they weren’t sure than to choose No. Being unsure of the answer, they expressed doubt.

Republican-leaning groups weren’t just more likely not to know, but more likely to plump for an answer anyway. Fools, you could say, rushed in.

Now, I really don’t believe quite that many of them are ignorant of basic geology – I rather suspect that the question tripped a number of trigger words (Africa!) before getting to that point. Everyone thinks like that a lot of the time.

This is, of course, how operationalised post-modernism works – what matters is the theatre of action, jumping and yelling and trying to dominate the mental space, and all that determines which way you’re pointing is a small set of identity-defining talking points. Did you know Senator X is weak on fufferum? Did you know that?? But why aren’t you talking about his position on elefurt? That’s what I want to know! The base have probably internalised this to a considerable degree.

That does raise the possibility of getting things done by tailoring your message to fire their immune receptors. The classic example is adding the word “security” to whatever proposal you have. Similarly, the Decent Left project was based on giving a whole lot of ugly right-wing ideas the right biological markers to stimulate a certain kind of leftie.

So somebody reviewed 1,302 songs by the same number of bands, giving each one six words only.

But how to centrifuge this toxic dump? Clearly there was no possibility of scraping the page and wget-ing the lot; Sturgeon’s Law (90% of everything is shit) applies to music as it does to few other things. I thought of trying to express my tastes in a set of criteria that I might even implement in a Python script, but on reflection this seemed too much like work, and anyway, it didn’t really fit the aim. I wanted surprises, not confirmation.

Then I had an idea; what about applying some sort of statistical method? Yer man had given each song a rating between 1 and 5; as you know, Bob, if you ask people in a survey to rate something on a scale of 1 to 5, they will go for 3 far more often than you’d expect from a normal distribution, because it’s the safe choice. But presumably the ones he gave a top rating to must have something.

And there were basically two ways a song could get into the bottom rank; either it was objectively arrant shite, or else it was incompatible with the other guy’s tastes. Now, I have no idea what those are and no reason to assume they are anything like mine, so in fact, being one-starred could actually be a recommendation. Similarly, being top-rated could be either evidence of quality, or else just a matter of taste. And I had no reason to imagine either case was more likely. Further, the principle of management by exception was in my mind; the top and bottom 10% must be doing something right or wrong, so they’re the ones to look at.

So I decided to ignore all the 2s and 3s and most of the 4s, and then make a selection from the ones that remained, based on unreason and hunch, and at least once on the basis that they came from Leeds.
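If I were to automate it, the exception-management step would only be a few lines of Python. The sketch below assumes a hypothetical reviews.csv with band, song and rating columns, which is not how the actual selection was done:

```python
# A sketch of the "keep the extremes, bin the safe middle" filter described
# above; reviews.csv and its column names are hypothetical.
import csv
import random

with open("reviews.csv") as f:
    reviews = list(csv.DictReader(f))

# The 5s might be good; the 1s might just be violently at odds with the
# reviewer's taste. Either way, they are the interesting ones.
extremes = [r for r in reviews if int(r["rating"]) in (1, 5)]

# A final cut on no defensible basis whatsoever, much like the original.
shortlist = random.sample(extremes, k=min(31, len(extremes)))
for r in shortlist:
    print(f'{r["band"]} - {r["song"]} (rated {r["rating"]})')
```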

And? I’m grinning with delight at the results, a pile of 31 MP3s of which 30 are by people I’ve literally never heard of and at least 28 are utterly great. Here’s the really interesting bit, though: I can’t tell which ones were 1s and which were 5s. Well, there is at least one exception to that, but as a rule, no, it is far from obvious. And why are so many fronted by women? This isn’t something I’d noticed as a taste, although – horribly – I just remembered that my father owns a vast amount of vinyl by early 1970s hippy-chick singer-songwriters. Boxes and Nick Hornbyesque boxes of ’em. That’s hardly characteristic of the list I came up with, but it is scary. Perhaps it’s sampling bias – or maybe the quasi-automatic process got around my unconscious prejudices?

Here’s an interesting scientific paper about Palestinians and Israeli settlers. The experiments asked each group questions intended to judge how willing they were to compromise. Then, they asked the questions again, but threw in a side-offer, for example of economic aid or third-power security guarantees.

Interestingly, all the groups split into two identifiable types; some weren’t happy to compromise but thought they could do so, some rejected any compromise outright. The really significant result, however, is that the no-compromise group responded very badly to the offer of a side payment – it just made them angrier and more intransigent. Only a dramatic sacrifice of symbols by the other side would induce them to change – exactly, as it happens, the sort of thing the compromise group wouldn’t think of doing for fear of what the non-compromisers would say.

I wonder if it would be possible to re-analyse the results using Robert Altemeyer’s tests of social authoritarianism and dominance? It feels intuitively right; more formally, an intense concern with symbols and symbolic norms would seem to be very similar to the obsession with the preservation of hierarchical norms Altemeyer identified among his authoritarian subjects.

It also fits with a lot of the language of extreme conservatism through history; the idea of the corrupting nature of compromise and of democracy, especially of parliaments, and its opposite, the cult of the decision embodied in the leader, has been around since the counter-enlightenment.

This does, of course, point out a deep ambiguity – we admire principle but also reasonableness, which must mean the ability to ignore it.

Further question; remember Chris Lightfoot’s analysis of the Political Survey results? Chris selected the statements from a survey which maximised the variance in the population’s answers to them and used these to summarise the results on two axes. This is one of the axes:

  • Agree: Prisons are too soft on criminals
  • Agree: The UK should withdraw from the European Union
  • Disagree: Most immigrants are beneficial to the UK
  • Agree: Some crimes are so serious that the only proper punishment is the death penalty
  • Disagree: It’s more important to rehabilitate criminals than to punish them
  • Disagree: The government should give more aid to poor countries
  • Agree: National law should always override international agreements and European directives
  • Agree: Working people pay too much tax
  • Disagree: The cost of living in the UK should be allowed to rise in order to fight global warming
  • Agree: The government is mostly interested in helping itself, not ordinary people
The people surveyed broke by vote into two well-specified groups on this axis; one encompassed the Labour, Liberal, Welsh and Scottish Nationalist, RESPECT, and Green voters, the other the Conservatives, BNPers, ‘kippers and Veritas voters (if any measure of them can be considered statistically significant). Now, I would suggest that for a lot of the latter group, the last but one question isn’t really a stereotype-rationalist one about negotiating costs and risks but an identitarian one about not being a *refined shudder* greenie, which means that only the tax one can be considered as a question of compromise.
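For the curious, the first step of that kind of analysis – finding the statements that divide the population most sharply – is easy to sketch. The code below runs on invented data and is not Chris’s actual method, which went on to project the respondents onto two principal axes.

```python
# A toy version of variance-maximising statement selection, on invented data.
import numpy as np

rng = np.random.default_rng(42)
# 1,000 hypothetical respondents answering 30 statements on a five-point
# scale from -2 (strongly disagree) to +2 (strongly agree).
responses = rng.integers(-2, 3, size=(1000, 30))

variances = responses.var(axis=0)               # how divisive each statement is
most_divisive = np.argsort(variances)[::-1][:10]
print("Ten most divisive statements:", most_divisive)
```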

What have we here? Via Spencer Ackerman: David Wurmser, trying to sketch the wiring in his head on a really big piece of paper.

The spider chart was meant “to create a strategic picture, and that strategic picture is the foundation of policy change,” Wurmser said. “It helped you visualize, because if you saw, say, twenty relationships between X and Y, and twenty between Y and Z, then there’s at least a suspicion that Z and X are interacting through Y.” A map like that could bring insight, but there were perils in surmising too much.

Suppose X and Y were Dick Cheney and Colin Powell. Twice they served in senior posts under presidents named Bush. In the early 1990s, they worked at the same address and were spotted together on international flights. They communicated frequently, encrypting their secrets….

That’ll have been back when they still trusted him with the felt tip pens, I suppose. It reminds me a lot of this post from last August, regarding surrealism, rolling news, and TV anchor Glenn Beck’s “methodology”, which seems to have been identical to Wurmser’s.

The problem with this sort of semi-random links-and-ties analysis is twofold: not only is your brain predisposed by millions of years of evolution to impose patterns on raw data, which means you’re bound to find patterns if you look for them, but the spurious ones we inevitably perceive come from somewhere. Specifically, they come from our preconceptions, prejudices, and perhaps most of all, from the ones we don’t want to admit to. Just as you’d only dump a program’s entire logs when you were tracing a bug, you don’t free-associate in order to make plans.

So as well as generating lots of time-sucking, budgetivorous false positives, this kind of thinking actually tends to make us behave even more stupidly, because it strengthens all the least rational forces within us.

I really mean this, by the way, and I’d love to hear from anyone who has comments about its potential implementation.

“Sir” Ian Blair addresses the troops:

“Our approach will be one of humility. On 22nd July 2005, we confidently believed that our systems of command, of surveillance and of firearms intervention were among the best in the world. However, they failed in response to a previously unforeseen circumstance, suicide bombers on the run.”

Well, it’s an admission of sorts. But it wasn’t all that unforeseen, was it? The Met had two plans in place, a static one (Operation C) to deal with an attacker at a major event and a mobile one (Operation KRATOS) to deal with…a suicide bomber on the loose in public.

They just didn’t decide which one fit the circumstances, came up with a dog’s breakfast of a hybrid instead, and then carried it out so poorly it would have been a disaster whatever the plan.

We haven’t had any Metropolitan Police blogging for a while, which is a pity given that “Sir” Ian seems to be bent on making enemies of literally everyone on the force. I can’t help thinking that Tarique Ghaffur and Ali Dizaei are making this into a personal vendetta, but then, who the hell wouldn’t?

I liked this comment from Chris “Chris” Williams regarding Arthur C. Clarke:

What future? A better one than we’ve got: a worse one than we’d have had without him. Several million fanboys and girls grew up exposed to clear prose, opposition to nationalism, scepticism about organised religion, faith in technology, faith in humanity, and some great comedy.

“The guest of honour pressed a button (which wasn’t connected to anything). The chief engineer threw a switch (which was).” – or thereabouts. From Travel by Wire. All there at the start.

Which amused me; especially as the same post got linked by the Adam Smith Institute. Ha, I can’t imagine two technologies that got commercially deployed whose development had less to do with Teh Market than satellite communications and GSM. Even though there is fierce competition in both fields, a lot of it is down to the fact that the GSM founding engineers designed it in, working for ASI-tastic organisations like nationalised Nordic telcos and the European Commission.

Satellites, well…you do know Bell Labs (itself hardly the most Thatcherite operation, and one Reaganism killed off pretty sharpish) actually considered launching the first comsat on a Soviet rocket? Beyond mockery, what I’m driving at is that Clarke delivered a solid disrespect for ideology as well as religion and nationalism and Western arrogance – surely, the Indian-engineer archetype must have something to do with all his Dr Chandras, next to the IITs and the unintended consequences of IBM being kicked out of India in the 70s? (And what would the ASI make of *that*?)

The political landscapes he delivered were always nicely sceptical of state bureaucracies (2001: A Space Odyssey can be read as an attack on the security-bureaucratic complex) and also of big business. He missed the revival of small business, but then, who didn’t. And his major political flaw was that he was too optimistic about technocratic cooperation – he seemed to believe that politics stopped in low earth-orbit, and Space Station One is essentially the European Union at L-5. Just as you can’t have non-political bread, you certainly can’t have non-political spaceflight; but of all the political mistakes you could make, it’s a pretty minor one compared with some of the others on offer during his career.

From the 1930s to today, he could have variously believed in die-hard opposition to Indian autonomy, to say nothing of independence, that Stalin was an honourable gentleman, that what we really need is a strong leader to discipline the feminine masses, that white people were smarter than other people, that the US intelligence services were engaged in a conspiracy to downplay Soviet power and that therefore we need many more nuclear weapons, that burning the North Sea oil reserves in order to support sterling at an exchange rate high enough to flatten the export sector was a good idea, that the UN is a secret Zionist conspiracy to take your guns, that what we really need is a restored Caliphate, or that invading Iraq was wise. And this is far from an exhaustive list. Literally no other period of human history has offered a richer cornucopia of delusions; as George Orwell said, no ordinary man could be such a fool.

The Clarkean vision was that perhaps, we might be able to imbue reality with the inspiration and excitement various groups of us applied to the list of ideological manias above. Rather than pluricontinentalism or bimetallism or conservatism, we might consider the renal parasites of cephalopods, the neurological basis or otherwise of psychoanalysis, or viewing the surface of Venus in the infrared. Nothing is mere; so said Richard Feynman. It finally poses the question; is a sceptical utopia possible?

One of the many wonderful things about the Web is that its hypertext structure not only permits us to navigate it and to invoke external resources (scripts, graphics, etc.), but also lets us measure relevance and authority. Google’s killer insight was of course just this: to use links as votes for the relevance of a given document, and to do this recursively, so that the more authoritative the document, the more powerful its outbound links.
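The recursive bit is easy enough to show on a toy web of four pages. The sketch below is just “links as votes” run to convergence, nothing resembling Google’s production system:

```python
# Power-iteration sketch of "links as votes" on a hypothetical four-page web.
links = {0: [1, 2], 1: [2], 2: [0], 3: [2]}    # page -> pages it links to
n = len(links)
damping = 0.85

rank = [1.0 / n] * n
for _ in range(50):                            # iterate until the scores settle
    new = [(1 - damping) / n] * n
    for page, outlinks in links.items():
        for target in outlinks:
            new[target] += damping * rank[page] / len(outlinks)
    rank = new

# Page 2, which everyone links to, ends up most authoritative, so its own
# outbound link to page 0 carries correspondingly more weight.
print(rank)
```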

But there is a fundamental problem here; the introduction of the REL=”NOFOLLOW” attribute was meant to stop spammers manipulating this structure by autogenerating great numbers of links, but it is only a partial solution. After all, the fact that somebody considers a document unreliable, irrelevant, spammy, or just…repellent is useful information; but there is no way of capturing it. People around the “Semantic Web” have toyed with ideas like links that go backwards as well as forwards; I for one have never been able to understand this, and it sounds far too much like INTERCAL’s COME FROM… statement. (You thought GOTO was considered harmful; COME FROM… is the exact opposite.)

What I propose is that we introduce a negative hyperlink. A kind of informational veto. I’ve blogged about the Stupid Filter before, which attempts to gather enough stupidity from the Web that it can characterise stupid and use Bayesian filtering to get rid of it, as we do with spam. But I suspect that is a fundamentally limited, and also illiberal, approach; StupidFilter is indexing things like YouTube comments threads, which seems to guarantee that what it actually filters will be inarticulacy, or to put it another way, non-anglophones, the poor, the young, and the members of subcultures of all kinds. The really dangerous stupidity walks at noon and wears a suit, and its nonsense is floated in newspaper headlines and nicely formatted PowerPoint decks. StupidFilter would never filter Dick Cheney.

But a folksonomic approach to nonsense detection would not be bound to any one kind of stupidity or dishonesty, just as PageRank isn’t restricted to any one subject. Anyone could antilink any document for any reason, across subjects, languages and cultures. Antilinks would be as simple to capture programmatically as any other HTML markup; in Python, it would be little more than changing the attribute you search for in a BeautifulSoup call – a line or two of code. Even without changes to today’s Web browsers, a simple user script could flash a warning when one was encountered, or provide a read-out of the balance between positive and negative links to a page.
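Something along these lines, say – with the obvious caveat that rel="bullshit" is a made-up attribute value and the HTML snippet is invented:

```python
# A sketch of counting hypothetical antilinks on a page with BeautifulSoup.
from bs4 import BeautifulSoup

html = """
<p><a href="http://example.com/useful">a vote for</a>
   <a rel="bullshit" href="http://example.com/nonsense">a vote against</a></p>
"""

soup = BeautifulSoup(html, "html.parser")
all_links = soup.find_all("a", href=True)
antilinks = soup.find_all("a", rel="bullshit")   # the proposed negative links

positive = len(all_links) - len(antilinks)
negative = len(antilinks)
print(f"positive links: {positive}, antilinks: {negative}")
```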

Consider this post at qwghlm.co.uk; Chris is quite right to mock the Metropolitan Police’s efforts to encourage the public to report “unusual” things. After all, there is no countervailing force; if you collect enough noise, statistically speaking, you will eventually find a pattern. What you need is the refiner’s fire. Why is there no Debunk a Terror Alert hotline?

I am quite serious about this. Implementation could be as simple as a REL=”BULLSHIT” attribute. Now, how do you go about making a submission to the W3C?

How did a set of medical techniques and institutional styles with absolutely no therapeutic value survive for 2,500 years from ancient Greece to the early 20th century – even though the scientific knowledge required to demolish them had been available since the 1600s? This is the question David Wootton’s “Bad Medicine: Doctors Doing Harm since Hippocrates” sets out to answer.

Writing the history of failure is an interesting project; even more so when the activity concerned isn’t adversarial. Wootton’s main thesis is, in effect, that the germ theory was the all-decisive factor in the great breakout from disease in the mid-19th century, and that resistance to it or failure to grasp the consequences was the main reason why scientific medicine took so long to arrive.

Van Leeuwenhoek, the great pioneer of the microscope, was the first man to see a bacterium, in the 1670s in Delft; soon there was a wave of interest across Europe in the new technology of microscopy and the astonishing discovery of microorganisms. This gave rise to an intellectual ferment about the nature of, well, fermentation among other things; the centre of the debate was the notion of spontaneous generation. Although, with hindsight, the fact that microbes were already everywhere should have blown the gaff and made clear that the experiments that supposedly demonstrated it were actually examples of experimental error, there were still believers as late as the 1870s.

Whether germs were spontaneously generated or not seems a slightly odd preoccupation; surely it was more important that they were there? However, it was a major ideological roadblock to accepting that germs were responsible for wound infections; strangely, none of van Leeuwenhoek’s peers seems to have thought of turning their microscope on a patient. Even more strangely, microscopes went out of fashion; medicine simply ignored microscopy up to the 1830s. The opportunity was passed up. This sort of thing kept on happening; even John Locke, who called on van Leeuwenhoek in exile and looked down the microscope, wrote that it was absurd to imagine it had any clinical use.

What was going on was a major disjuncture between the value medicine placed on different forms of knowledge; practical, technical knowledge was undervalued, and canonical, scholastic knowledge overvalued. As Wootton points out, a hypothetical early-18th century pupil in Leiden could have observed an infected wound with a microscope; they could have experimented on an animal; they could have tried, as Schwann eventually did in 1837, to kill the germs with heat or salt or perhaps alcohol (Holland was where the gin came from, after all), and this would have given us antiseptic surgery a hundred and fifty years before Lister.

Something similar happened with infection control; Alexander Gordon had recognised, in the 1790s, that puerperal fever and erysipelas were the same disease and that they were spread through the hospital by doctors. He was even able to predict who would get it next, having worked out which staff members were carriers – including himself. But the only part of his research anyone was interested in was some suggestions about bloodletting, the canonical medical treatment pre-Lister; that was the kind of knowledge that was authorised. Oliver Wendell Holmes made the same discoveries fifty years later and collected a similar budget of abuse. Ignaz Semmelweis did too, but fatally missed the link with other diseases – the link that would have given us antiseptic surgery thirty years before Lister.

In the 17th century, it had been routine for ships to carry lemon juice as a precaution against scurvy, but although it actually worked the medical establishment was able to persuade the navy that they were wrong to use it for almost a hundred years. Again and again, bad knowledge actually triumphed over good; it kept doing so until its failure was both glaringly apparent and its replacement obvious.

A leitmotif in the book is the microscope; not only because of its role in microbiology, but I think also because it was a form of subversive technology. Unlike medical degrees and Galenic textbooks, anyone could possess a microscope; even the skills required to make them were not incredibly rare. The user was able to observe the new nature without anyone’s intermediation; a genuinely Protestant product. No wonder they were scared. Similarly, the beginning of statistics made it increasingly impossible to conceal the uselessness of medicine. John Snow could plot cholera cases on a map; so could the priest Henry Whitehead, who set out to conduct his own research in order to refute Snow but ended up convincing himself. Counting, like microscopy, was fatal to the closed system of knowledge.

None of this guarantees success; Snow had to convince William Farr, a top government official and a sort of David Kane figure who theorised that cholera was caused by living too close to sea level. Farr had identified a correlation between altitude and cases, and derived a formula; unfortunately it predicted that at sea level everyone would already be dead, but this didn’t stop him. He argued that people at sea level actually lived 13 feet above it because of buildings, and predicted that the race would degenerate unless the government forced everyone to build on higher ground.

It’s hard not to wonder what other scientific revolutions didn’t happen; it’s more profitable to wonder what our systems of knowledge are denying now. Wootton points out that medicine didn’t have to be converted; it would have been quite possible for the traditional doctors to stagger on, competing for patients with newly emerging Lister Institutes, perhaps perpetuating the divide between surgery and medicine. As late as the 1970s, he says, it was possible to find “Ionian” – i.e. Galenic – doctors practising in Iraq.

The climate change deniers are an obvious example, but I suspect they are well on the way out; I can’t help but suspect there are a lot of Galenic economists out there.

In a special note, by the way, Wootton does suggest that Daniel Davies’s manifesto may be flawed. Daniel argues that middle-class progressives’ schemes to nudge the poor this way and that are always and everywhere stupid, ineffective, and destructive of freedom. There is much to be said for this view; however, Wootton presents a strong case that a canonical example of such schemes – the health visitor and the Fabians’ keenness on telling the poor how to cook – may have saved many lives in Edwardian London. Specifically, although London had the Bazalgette sewerage system and clean water by then, the death rate from infantile diarrhoea was the same in houses with flush toilets and without; the explanation of the paradox was that children shat in the street, and then didn’t wash their little hands. The answer was apparently a good finger-wagging, and maybe a thrashing or six; by the 1930s the rate was effectively zero.

I’ve gradually become addicted to Overcoming Bias, and specifically Eliezer Yudkowsky’s contributions to it. And it struck me, reading the reports on the de Menezes trial, that a good dose of this blog could have done the Metropolitan Police a power of good.

Specifically, members of the police command staff recalled hearing a radio message first that de Menezes was definitely not the suspect, and then that he definitely was the suspect. Now, it’s very unlikely indeed that someone who was behaving rationally would go from certainty of X to certainty of Y without passing through stages of progressively greater doubt about X. It’s possible that you might encounter a situation when you had strong enough evidence to do the whole leap; it’s just very, very unlikely, and therefore you should be suspicious of any such suggestion. There’s a good reason for the cultural norm that you should be suspicious of sudden converts’ motives.

Similarly, it’s very hard to imagine a scenario where the police could have gone from being certain that he definitely wasn’t a suicide bomber – a prior of zero – to certain that he definitely was. Either they weren’t certain to begin with, in which case the officer in question shouldn’t have said so, or they weren’t certain when they changed their mind, in which case they doubly shouldn’t have said so.
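To put the zero-prior point a little more formally: under Bayes’ theorem, a prior of exactly zero can never be updated upwards, however strong the evidence, whereas even a small honest doubt can move a long way. The numbers below are invented purely for illustration.

```python
# Bayes' theorem for a binary hypothesis: a zero prior stays at zero no matter
# what the evidence says; a sliver of honest doubt can be updated a long way.
def posterior(prior, p_evidence_if_true, p_evidence_if_false):
    evidence = p_evidence_if_true * prior + p_evidence_if_false * (1 - prior)
    return p_evidence_if_true * prior / evidence

print(posterior(0.0, 0.99, 0.01))   # 0.0   -- "definitely not a bomber" cannot budge
print(posterior(0.1, 0.99, 0.01))   # ~0.92 -- doubt, updated by strong evidence
```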

In fact, the account of the police command-and-control of the operation that emerges is an appalling hellbroth of cognitive bias. The specialist firearms team was briefed in terms described as “inflammatory” about “firing a bullet into the brain of a suicide bomber”, immediately after having been issued with special 124-grain ammunition – had the briefing been calculated to embed the perception that whoever they ended up chasing was indeed a suicide bomber, it could hardly have done the job better. Under stress, people tend to exhibit perceptual rigidity, blocking out information, and target fixation.

Further, for reasons that still haven’t been made clear, this outfit didn’t reach the scene until five hours after the original call for them, so there was no time for them to be cross-briefed by the surveillance team. So lacking in orientation, and so cranked up with aggression and tension, were they that one of them chased the train driver into a tunnel waving a gun, under the impression he was another member of “the cell”. But the surveillance squad had followed only one man into the station. Where did this cell come from, other than crisis fever and ignorance?

This general farrago of stupidity was matched at headquarters, where the command centre was besieged by every other staff officer who could squeeze in to watch the fun; the hubbub was such that it was difficult to hear the radio traffic, and tempers can only have been wearing thin – another risk factor, as was the fact that everyone had been up all night. Worse, it seems there may have been two commanders: everyone remembers Cressida Dick, who until recently was named as having been the Gold Commander, but it appears that this title was also held for much of the operation by John McDowell. It is not clear whether they were co-equal (pretty bad) or whether there was a change of command in the middle of the crisis (even worse).

The overwhelming impression is that the Met is not serious about the super-duper war-on-terrorism role various chiefs, especially Sir Ian Blair, have been so keen to take on and expand. Its command arrangements here were desperately bad, to say nothing of Blair’s statements in the aftermath or the leak campaign against the victim. Then there is the Forest Gate incident, in which a man was shot “by accident”; the military call this a “negligent discharge” and treat it very seriously indeed, even if no one is hurt and nothing is damaged, but the Met essentially shrugged it off and contented itself with leaking to the NOTW that the victim was a paedophile.

It is not enough to blame the pilot; it is not enough to say it was an accident. The only conceivable answer to this would be along the lines of “If that’s your best…I don’t want to see your worst.” Accidents happen for reasons; reasons that are found in institutions. This is a sick institution.