Sometimes serendipity happens. I was trying to think of a way to link together a couple of sections of the Hyperpost book, when I found this old post from Scott Aaronson’s blog Shtetl-Optimized.
In it, Aaronson talks about how he’d noticed that there was a lot of overlap between Libertarians and proponents of the Many-Worlds Hypothesis in quantum physics, and had tried to figure out why:
Some connections are obvious: libertarianism and MWI are both grand philosophical theories that start from premises that almost all educated people accept (quantum mechanics in the one case, Econ 101 in the other), and claim to reach conclusions that most educated people reject, or are at least puzzled by (the existence of parallel universes / the desirability of eliminating fire departments)…
My own hypothesis has to do with bullet-dodgers versus bullet-swallowers. A bullet-dodger is a person who says things like:
“Sure, obviously if you pursued that particular line of reasoning to an extreme, then you’d get such-and-such an absurd-seeming conclusion. But that very fact suggests that other forces might come into play that we don’t understand yet or haven’t accounted for. So let’s just make a mental note of it and move on.”
Faced with exactly the same situation, a bullet-swallower will exclaim:
“The entire world should follow the line of reasoning to precisely this extreme, and this is the conclusion, and if a ‘consensus of educated opinion’ finds it disagreeable or absurd, then so much the worse for educated opinion! Those who accept this are intellectual heroes; those who don’t are cowards.”
I think he’s on to something, but I think there’s a second aspect, which is what happens when those ideas actually hit reality.
Because Libertarianism and the Many-Worlds Hypothesis have one big difference between them – one has immediate real-world consequences, and the other doesn’t. And that means that it is no longer a purely intellectual exercise.
Leaving aside whether the claims for Libertarianism (of the Ayn Rand type, which is what Aaronson is referring to) stack up logically, let us assume for a moment that one believes them to be correct. Should you then *act* as if they are correct? To take Aaronson’s example, should we privatise the fire service?
If you’re a libertarian, you believe the answer should be yes – that privatising the fire service would have the end result of fewer fires, and those fires being fought more cheaply. But what if you’re wrong? If you’re wrong, then the result would be people – potentially a lot of people – losing their homes.
So there’s a second level of calculation to be done here – how sure are you of your own reasoning ability and the information (your priors, in Bayesian terms) you use to come to your conclusions? *WHEN YOU FACTOR IN THE PROBABILITY OF YOU BEING WRONG* does the expected benefit if you’re right outweigh the expected loss if you’re wrong?
Now, on this blog I often fall into the ‘bullet-biter’ side of things *when talking about ideas with no immediate real-world consequences*, because it’s both intellectually right and more interesting. Take the Many-Worlds Hypothesis, for example. I consider it the most likely of the various explanations of quantum theory I’ve read, and would put my confidence in that judgement at about 80% – I’m a bullet-biter there, and proud of it.
And I’m a bullet-biter when it comes to certain forms of alternative medicine. I’m convinced from the experimental evidence, for example, that taking certain vitamin supplements in large doses will massively decrease the risk of cancer, and have stated that on this blog too. And again, I’d put my confidence in that at about 80% (I rarely put my confidence in *anything* much above that).
Now, the downside of taking vitamins is that there’s a cost of maybe a pound a day and – if you believe the very worst possible reports, which as far as I can see have no evidentiary basis, but if we’re assuming I’m wrong we’re assuming I’m wrong – a very small increased risk of kidney stones. The benefit, if I’m right, is not getting cancer. An 80% chance of ‘not getting cancer’ outweighs a 20% chance of a 1% increase in kidney stones, so it’s worth the pound a day to me to put my money where my mouth is and actually take the vitamins.
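That back-of-envelope reasoning can be sketched as a simple expected-value calculation. The 80% confidence figure is from the post itself; the payoff numbers are entirely made-up assumptions on an arbitrary utility scale, there purely to show the shape of the sum:

```python
# A rough sketch of the expected-value check described above. All figures
# are illustrative assumptions for this post, not data from any study.

def expected_value(p_right, benefit_if_right, cost_if_wrong):
    """Expected payoff of acting on a belief held with confidence p_right."""
    return p_right * benefit_if_right - (1 - p_right) * cost_if_wrong

# 80% confidence the megadoses help. On this assumed scale, avoiding cancer
# is worth vastly more than the downside (a pound a day plus a tiny assumed
# increase in kidney-stone risk) incurred in the 20% case where I'm wrong.
ev = expected_value(p_right=0.8, benefit_if_right=1000.0, cost_if_wrong=1.0)
print(ev > 0)  # acting on the belief pays off under these assumptions
```

The point of the sketch is that the decision turns not just on how confident you are, but on how the two payoffs compare once the probability of being wrong is factored in: crank `cost_if_wrong` up towards something catastrophic and the same 80% confidence no longer justifies acting.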
On the other hand, one can come up with a real-world test for the Many-Worlds Hypothesis. If it’s true then, were I to stand at ground zero of a nuclear weapons test, I should expect to live through it. There would be a googolplex or so universes where I’d die instantly, but I would not experience those, because I’d die too quickly. On the other hand, there’d be a one-in-a-googolplex chance of me surviving, which according to Many-Worlds means there’s a universe where I *would* survive. That would be the only one I’d experience, so from my own point of view I’d survive.
But even though I am persuaded by the Many-Worlds hypothesis, I’m not going to try that one out.
However, there are people out there who *would* do it, who would say “No, I’ll be fine! Drop the bomb!” – let’s call them bomb-testers.
And I think while being a bullet-biter can be a good thing, being a bomb-tester never is.
A bullet-biter might say “I’m convinced the Singularity is coming, but I’ll give some money to Greenpeace just in case” while the bomb-tester would say “I’m convinced the Singularity is coming, so I’m not going to support environmental protection measures, because we’ll be gods in twenty years anyway”.
A bullet-biter might say “I’m convinced the Bible is literally true, but I’m not going to hurt anyone who thinks differently”. A bomb-tester would say “I’m convinced the Bible is literally true, so I’ll persecute homosexuals”.
I think a lot of people – particularly in the ‘skeptic’ community – think of themselves as being bullet-biters when they’re actually bomb-testers. They’ve reached a logical conclusion, and are going to act on that and damn the consequences. This is why some people say Richard Dawkins and fundamentalist Christians are the same kind of person – not because their beliefs are equally unjustifiable, but because they are both certain enough of their own rightness that they’ll act on it even when the downside of that action looks to the rest of us far worse than whatever upside they believe in.
Which is not to say that “acting on one’s beliefs” is a bad thing. One reason I have more respect for Eliezer Yudkowsky (of Less Wrong) than for other Singularitarians is that he’s willing to act on his beliefs (even though I don’t find his arguments convincing, and think he has more than a little of a Messianic streak at times). But his actions *take into account the possibility he’s wrong* – he’s acting in a way that minimises expected harm. If he’s right and he doesn’t act, the world will end. If he’s wrong and he does act, then he wastes his time and looks a fool. Were I to find his general arguments convincing, I’d be doing the same.
If you find yourself defending an intellectual position that others don’t hold, then you’re quite possibly an ‘intellectual hero’. But if you find yourself acting on that position without considering what might happen if you’re wrong, then you’ll end up a real-world villain…
I was going to do a Batman post today, but I’ve got annoyed again, so you’ll have to wait.
Specifically, I got annoyed by this, something that’s been going round on the internet for a few days.
It calls itself The Periodic Table Of Irrational Nonsense, but it is itself nonsense at least as irrational as anything it attacks, and I’m *SICK* of this.
Before I go any further, let me make something clear: I am a scientific rationalist. I have had papers I co-authored published in more than one scientific discipline. I consider the scientific method the only reliable way of discovering knowledge that humanity has ever devised. I am also a sceptic and, I believe, a clear-headed thinker.
But *as* a scientific, clear-headed, rational thinker, I consider that list to be utter, unadulterated, concentrated irrationality.
Because I can see two possible ways that list was put together:
Possibility a – The author has himself researched all these categories, read all the relevant literature, looked at the arguments used by the most prominent advocates of those beliefs/hypotheses/ideas, checked their data, and somehow come to the conclusion that only those things that are attacked regularly by Ben Goldacre, Richard Dawkins and other prominent ‘skeptics’ are irrational nonsense; or
Possibility b – He has chosen a list that, within the group of people he wishes to associate with, is completely uncontroversial, a list endorsed by the alpha males of his group, without actually thinking rationally about any of it.
There is quite a bit of evidence for possibility b. (In the next two paragraphs I’m using examples that my friend Gavin brought up in a discussion with me. I’d probably have used these examples anyway, but credit where due):
“Faith healing” supposedly works through the placebo effect. Double-blind clinical trials rely on the supposed efficacy of the placebo effect to have any validity. Either the placebo effect is real, in which case faith healing works, or (as I consider the evidence to show) it isn’t, and double-blind trials should be on there too.
“Memetics”, on the other hand, is just a set of Just-So stories, a supposed ‘science’ with no explanatory power, which makes no falsifiable predictions that are not trivially true without it, and which insists on treating a metaphor as having objective reality. Judged purely rationally, it should be right there snugly next to Scientology. Yet it’s strangely missing…
Other things that are strangely missing from there, but are irrational as hell, include meta-analyses, libertarianism and supporting illegal wars, all of which many of the ‘rational’ ‘skeptics’ (always spelled the American way, even though the creator of this image is British) on that person’s blogroll support. Those would certainly be on any list of the ‘irrational’ I put together…
And on the other hand, several of the things on there simply shouldn’t be. The obvious one is ‘conspiracy theories’. *ALL* conspiracy theories? Even the ones we know demonstrably to be true (such as Brown agreeing to stand down in the Labour leadership so long as Blair stood down in his favour eventually as Prime Minister)? So *no* conspiracies ever happen? We should probably get rid of the laws against conspiracy then, I suppose…
I’ve looked into *some* of the things on that list for myself – the majority I haven’t. Of those I have, some appear to me to have some truth, some appear to me to be almost certainly false, and some are in a grey area. I suspect that that would be true for anyone who looked into them *without the bias of trying to fit into a pre-approved ‘skeptic’ mould*.
So I’ve made a decision – I’m not going to believe in the ‘rationality’ of anyone who isn’t prepared to defend at least one of the things on that list as being reasonable. I won’t fall out with anyone over it, but I’m going to assume that anything you say is justified not by reason but by appeal to authority. (Of course some of you already get a pass on this – like Debi, who is a rational, sceptical, scientist but also a Buddhist – Buddhism’s on the list).
So what’s your heresy? What, out of that list of thoughtcrimes, do you think has some merit?
In my case, it’s vitamin megadoses.
I take a minimum of six grams of vitamin C every day, rising to much more whenever I’m even slightly ill. I take many other supplements, too, at much higher than the RDA. These ‘megadoses’ have improved my physical and mental health enormously. Having read many books co-authored by my uncle Dr Steve Hickey (here are two of them, both of which I proofread. That’s my Amazon affiliate link, but you can buy them without that) and, more importantly, checked the original papers he cites, I have come to the conclusion that there is an *overwhelming* body of evidence in favour of the hypothesis that many vitamins can have health benefits at levels far beyond those in the RDA.
To take the most egregious example, in the mid-1970s Prof Linus Pauling – possibly the most important scientist of the 20th century, and the only person ever to win two unshared Nobel Prizes – and Dr Ewan Cameron tried giving vitamin C in very large doses to terminal cancer patients. These patients outlived their expected lifespan by significant periods (in some cases people expected to live hours or days stayed alive for years on this treatment).
The Mayo Clinic, a ‘prestigious’ medical centre, claimed to have ‘proved’ that Cameron and Pauling’s research was flawed – they tried to replicate the tests and failed, and their publication effectively meant that any investigation of vitamin C’s role in cancer was laughed at as ‘pseudo-science’ for more than twenty years.
Except that the Mayo Clinic used oral doses while Pauling and Cameron used intravenous doses. And that the Mayo Clinic cut their trial short. And that before the Mayo Clinic cut their trial short there had been no deaths from cancer, but the death rate went up as soon as the vitamin C was withdrawn. There were methodological errors in the Mayo paper that would get someone a fail in GCSE Biology, let alone in serious oncology research.
I could go into this much, much more, but I’m tired and too hot, so I’ll discuss it further in the comments. If you want me to discuss more of my reasoning here, first tell me: what’s *YOUR* heresy? And while you’re at it, what do you think *should* be on that list that isn’t (my big one would be ‘making lists of things and claiming those things are irrational nonsense, as if “irrational nonsense” was a property attached to them rather than an opinion’)?