Now, the Singularitarian worldview can be summed up, roughly, as “Real Soon Now, we’re going to enter a Golden Age which will last forever. This Golden Age will probably be brought about by companies like Google (with the help of geeks like me, and other people who can see how right I am!), so long as government doesn’t interfere with them, and is what the whole of humanity has been leading up to!”
Now, that’s a dangerous message in itself – you’ve got a mythical Golden Age in the future to look forward to, support for unrestrained corporatism (so long as the corporations are working towards this Golden Age, or can appear as if they are) and a group of people (geeks) singled out as being better and more important than everyone else. Add in a scapegoat group to blame if everything goes wrong (I suggest Microsoft, if anyone wants advice) and you’ve got the recipe for fascism right there.
Now, ever since John W Campbell there’s been a strong admixture of racism and boil-in-the-bag Nietzscheanism (Fans Are Slans!) in ‘geek culture’, along with a big chunk of groupthink and support for the big company over the individual. See, most recently, all the people having conniptions at the idea that some people weren’t going to go and see Scott Pilgrim on its opening weekend – these multi-billion-dollar film corporations need your support, people, or they might stop making middle-brow high-concept comic adaptations! – as well as the frankly disgusting attitudes comic fans take every time a creator actually tries to assert any of their rights. That kind of thing is why I resist being referred to as a geek.
But what’s more worrying is the Manifest Destiny aspect of this. Singularitarians (for the most part) believe this *has* to happen. Ray Kurzweil draws his straight lines, and they keep going on forever, so the Singularity *must* happen. Tipler is even firmer on this point – he argues that the Omega Point is a boundary condition on the wave function of the multiverse (meaning it must happen by logical necessity; if it didn’t, the universe would cease ever to have existed). The Singularity is inevitable.
Now, this kind of thinking is very popular among extremists of both left and right – come the Glorious Revolution all will be right, or the Invisible Hand of the market will fix everything. The attraction in both cases is that it allows the privileged not to feel bad about their privilege. If the Worldwide Dictatorship of the Proletariat *HAS* to happen, then there’s no point trying to make poor people’s lives any better now – in fact it might be a bad thing, because it’ll discourage the proletariat from realising their oppression and rising up. Best just buy a new TV rather than help the poor. And if you’re on the right, it’s even easier – you’ve got your money because that’s the most efficient possible allocation of those resources. Helping poor people would actually be *inefficient* and in the long run would hurt them! Best just buy a new TV…
This is the natural political result of *any* kind of predestination, and explains why, for example, it was so easy for Christopher Hitchens to switch from being a Trotskyist SWP member to being an adviser to the Bush White House (in fact a huge number of neocons had previously been on the hard left).
It also explains why the Singularity is so beloved of tech billionaires – they’ve become billionaires as a necessary step to the Golden Age, and there’s no need for them to give their money to the poor or anything like that, because the Singularity will raise *everyone* to their level! In fact by keeping their money, and investing in tech companies, they’re helping the poor far more than redistribution could! Of course, it helps that people like Kurzweil think the current set-up is just great – Kurzweil actually says, in his book, that he believes it will soon be possible for us to create machines that will literally make *any physical object you want* – program it to make a steak, or a perfect atom-level copy of the Mona Lisa, and it will. He thinks that it will be important to protect the intellectual property rights of those who write these programs!!!
But it’s also a very, very dangerous attitude.
Because insofar as Kurzweil’s lines going off to infinity – graphs of information processing over time – have any value at all, they’re also graphs of energy use (there is an essentially linear relationship between the two). And energy use is a problem.
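If you take that claimed linear relationship seriously, the arithmetic is easy to sketch. The numbers below are invented placeholders, not Kurzweil’s data or real measurements – the point is only the shape: a straight line on a log plot of computation implies a straight line on a log plot of energy use.

```python
# Toy extrapolation, NOT real figures: the starting compute, growth rate,
# and joules-per-operation are placeholders. If computation grows
# exponentially and energy per operation stays roughly constant (the
# claimed "essentially linear relationship"), then energy use grows
# exponentially too.

def projected_energy(ops_per_year, annual_growth, joules_per_op, years):
    """Yearly energy use implied by extrapolating compute growth."""
    return [ops_per_year * (1 + annual_growth) ** t * joules_per_op
            for t in range(years + 1)]

energy = projected_energy(ops_per_year=1e20,   # placeholder starting compute
                          annual_growth=0.5,   # placeholder growth rate
                          joules_per_op=1e-9,  # placeholder efficiency
                          years=30)

# At 50% annual growth, implied energy use doubles roughly every 1.7
# years, so the 30-year figure is about 200,000 times the starting one.
print(f"year 0:  {energy[0]:.2e} J")
print(f"year 30: {energy[-1]:.2e} J")
```

The doubling time is what matters here: nothing about drawing the line tells you where the energy to stay on it comes from.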
There are a whole host of environmental and economic disasters that look set to hit over the next century or so – from overpopulation leading to massive food shortages, to global warming, to peak oil, to antibiotic-resistant bacteria – and it is entirely possible that human civilisation as we know it will end in the next century. Even if you believe that any one of these is low-probability or soluble, the combination carries a pretty high risk.
But if you *know* – because you can draw a straight line – that all the world’s problems will be solved Real Soon Now – then you don’t need to do anything about these problems yourself, because it’ll all be fine.
Not only that, but you’re not going to support any efforts by anyone else to mitigate these risks, because it’s a waste of resources. You won’t vote for politicians who want to fix these problems, because you don’t believe that the problems are real.
(I am going to exempt Eliezer Yudkowsky of the Singularity Institute for Artificial Intelligence here. He sees the creation of a singularity of his favoured type as a way to avert existential risk to humanity, and has decided to attempt it himself because he regards doing something about it as a moral duty. He’s got an ego the size of the universe, some rather messianic beliefs about himself, and he hasn’t backed up his talk with any actual measurable action, but compared to the rest of these people he’s a model of sanity and clear-headedness, which is why I occasionally link to his group blog Less Wrong here.)
The Singularity may well happen at some point – the Singularitarians may be right and I may be wrong. But even if it doesn’t, they’re right when they say that life in a hundred years will be unimaginably different from how it is today. The question is whether it will be unimaginably better or unimaginably worse. And that is going to be decided by the actions of every person alive today, and the decisions they make. If we manage to find solutions to our problems, we may well end up with something like the Singularity, eventually, but *we need to work toward the solutions first*.
And for a bunch of rich, technically skilled people with access to the media, politicians and business leaders, to abdicate their responsibility to make those decisions and find those solutions, in favour of the worst kind of Panglossianism, is not only morally dubious but *dangerous* – in a very real sense they’re betting the earth that they’re right, and it’s not theirs to bet.