Sci-Ence! Justice Leak!

Geeks Dig Metaphors: The Politics Of The Singularity

Posted in Uncategorized by Andrew Hickey on August 30, 2010

(Following on from the introduction and the technical problems)

Now, the Singularitarian worldview can be summed up, roughly, as “Real Soon Now, we’re going to enter a Golden Age which will last forever. This Golden Age will probably be brought about by companies like Google (with the help of geeks like me, and other people who can see how right I am!), so long as government doesn’t interfere with them, and is what the whole of humanity has been leading up to!”

Now, that’s a dangerous message in itself – you’ve got a mythical Golden Age in the future to look forward to, support for unrestrained corporatism (so long as the corporations are working towards this Golden Age, or can appear as if they are) and a group of people (geeks) singled out as being better and more important than everyone else. Add in a scapegoat group to blame if everything goes wrong (I suggest Microsoft, if anyone’s wanting advice) and you’ve got all the ingredients of fascism right there.

Now, ever since John W. Campbell there’s been a strong admixture of racism and boil-in-the-bag Nietzscheanism (Fans Are Slans!) in ‘geek culture’, along with a big chunk of groupthink and support for the big company over the individual. See, most recently, all the people having conniptions at the idea that there were people who weren’t going to go and see Scott Pilgrim on its opening weekend – these multi-billion-dollar film corporations need your support, people, or they might stop making middle-brow high-concept comic adaptations! – as well as the frankly disgusting attitudes taken by comic fans every time a creator actually tries to assert any of their rights. That kind of thing is why I resist being referred to as a geek.

But what’s more worrying is the Manifest Destiny aspect of this. Singularitarians (for the most part) believe this *has* to happen. Ray Kurzweil draws his straight lines, and they keep going on forever, so the Singularity *must* happen. Tipler is even firmer on this point – he argues that the Omega Point is a boundary condition for the wave function of the multiverse (meaning it must happen by logical necessity, and if it didn’t, the universe would cease ever to have existed). The Singularity is inevitable.
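To make that style of reasoning concrete, here is a minimal sketch of the extrapolation being criticised – fit a straight line to historical data on a log scale, then let it run forever. The data points below are rough order-of-magnitude illustrations of my own, not Kurzweil’s actual dataset:

```python
import numpy as np

# Purely illustrative data points: year vs. transistors per chip
# (rough order-of-magnitude values, not Kurzweil's actual figures).
years = np.array([1971, 1980, 1990, 2000, 2010])
transistors = np.array([2.3e3, 3.0e4, 1.2e6, 4.2e7, 2.3e9])

# The "straight line": a linear fit in log space, i.e. assumed
# exponential growth.
slope, intercept = np.polyfit(years, np.log10(transistors), 1)

def extrapolate(year):
    """Project the fitted trend to an arbitrary future year."""
    return 10 ** (slope * year + intercept)

# And it "keeps going on forever": nothing in the fit itself knows
# about physical, economic, or ecological limits.
for year in (2030, 2050, 2100):
    print(f"{year}: ~{extrapolate(year):.1e} transistors per chip")
```

The line goes up because fitted lines do; the inevitability is an artefact of the method, not a fact about the world.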

Now, this kind of thinking is very popular among extremists of both left and right – come the Glorious Revolution all will be right / the Invisible Hand of the market will fix everything. The attraction in both cases is that it allows the privileged not to feel bad about their privilege. If the Worldwide Dictatorship of the Proletariat *HAS* to happen, then there’s no point trying to make poor people’s lives any better now – in fact it might be a bad thing, because it’ll discourage the proletariat from realising their oppression and rising up. Best just buy a new TV rather than help the poor. And if you’re on the right, it’s even easier – you’ve got your money because that’s the most efficient possible allocation of those resources. Helping poor people would actually be *inefficient* and in the long run would hurt them! Best just buy a new TV…

This is the natural political result of *any* kind of predestination, and explains why, for example, it was so easy for Christopher Hitchens to switch from being a Trotskyist SWP member to being an adviser to the Bush White House (in fact a huge number of neocons had previously been on the hard left).

It also explains why the Singularity is so beloved of tech billionaires – they’ve become billionaires as a necessary step to the Golden Age, and there’s no need for them to give their money to the poor or anything like that, because the Singularity will raise *everyone* to their level! In fact by keeping their money, and investing in tech companies, they’re helping the poor far more than redistribution could! Of course, it helps that people like Kurzweil think the current set-up is just great – Kurzweil actually says, in his book, that he believes it will soon be possible for us to create machines that will literally make *any physical object you want* – program one to make a steak, or a perfect atom-level copy of the Mona Lisa, and it will. He thinks it will be important to protect the intellectual property rights of those who write these programs!

But it’s also a very, very dangerous attitude.

Because insofar as Kurzweil’s lines going off to infinity, measuring information processing over time, have any value at all, they’re also graphs of energy use (there is essentially a linear relationship between the two). And energy use is a problem.
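As a back-of-the-envelope illustration of that link, take the Landauer limit – the genuine thermodynamic minimum energy for erasing one bit – as the cost per operation, and assume, purely hypothetically, that total computation grows exponentially the way the straight-line extrapolations imply. The starting figure and growth rate below are invented for illustration:

```python
import math

# Landauer limit: the thermodynamic minimum energy needed to erase
# one bit of information at temperature T (a real physical bound).
BOLTZMANN = 1.380649e-23   # joules per kelvin
T_ROOM = 300.0             # kelvin
ENERGY_PER_BIT = BOLTZMANN * T_ROOM * math.log(2)  # ~2.9e-21 J

# Hypothetical figures for illustration only: assume total bit
# operations per year grow exponentially, as the extrapolations imply.
ops_now = 1e30           # assumed current global bit-operations/year
growth_per_year = 1.5    # assumed annual growth factor

for years_ahead in (0, 20, 40, 60):
    ops = ops_now * growth_per_year ** years_ahead
    joules = ops * ENERGY_PER_BIT  # energy is linear in computation
    print(f"+{years_ahead:2d} years: {joules:.2e} J/year "
          f"(thermodynamic floor)")

# Exponential computation times a fixed (or bounded-below) cost per
# operation is exponential energy use: the linear factor never goes away.
```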

There are a whole host of environmental and economic disasters that look set to hit over the next century or so – from overpopulation leading to massive food shortages, to global warming, to peak oil, to antibiotic-resistant bacteria – and it is entirely possible that human civilisation as we know it will end in the next century. Even if you believe any one of these is low-probability or soluble, the combination carries a pretty high risk.

But if you *know* – because you can draw a straight line – that all the world’s problems will be solved Real Soon Now, then you don’t need to do anything about these problems yourself, because it’ll all be fine.

Not only that, but you’re not going to support any efforts by anyone else to mitigate these risks, because to you they’re a waste of resources. You won’t vote for politicians who want to fix these problems, because you don’t believe the problems are real.

(I am going to exempt Eliezer Yudkowsky of the Singularity Institute for Artificial Intelligence here. He sees the creation of a singularity of his favoured type as a way to avoid existential risk for humanity, and has decided to try to do this himself because he sees it as a moral duty to do something about it. He’s got an ego the size of the universe, some rather messianic beliefs about himself, and he hasn’t backed up his talk with any actual measurable action, but compared to the rest of these people he’s a model of sanity and clear-headedness, which is why I occasionally link to his group blog Less Wrong here.)

The Singularity may well happen at some point – the Singularitarians may be right and I may be wrong. But even if it doesn’t, they’re right when they say that life in a hundred years will be unimaginably different from how it is today. The question is whether it will be unimaginably better or unimaginably worse. And that is going to be decided by the actions of every person alive today, and the decisions they make. If we manage to find solutions to our problems, we may well end up with something like the Singularity, eventually, but *we need to work toward the solutions first*.

And for a bunch of rich, technically skilled people with access to the media, politicians and business leaders to abdicate their responsibility to make those decisions and find those solutions, in favour of the worst kind of Panglossianism, is not only morally dubious but *dangerous* – in a very real sense they’re betting the earth that they’re right, and it’s not theirs to bet.

4 Responses


  1. Kieran said, on August 31, 2010 at 12:08 am

    This strikes me as the nub, because the thing that distinguishes the singularity from traditional tech-utopias is the speed at which it occurs. And that lets you ignore the potentially disastrous implications of any of the attendant technologies.

    I don’t mean that every invention is bad, but none are inherently good either: technological progress opens up the potential space of futures, but getting onto the right one requires a lot of political and social action.

    The singularitarian answer is to skip to the point where the best future for anyone is the best future for everyone, but if the technology in question comes too slowly it might lead to some ugly mid-futures.

    You mentioned Eliezer Yudkowsky as an exception and I’d like to put in another good word for him, because he really gets that getting the technology there is only half the battle; you have to get it working in the right direction. He’s more worried about I Have No Mouth And I Must Scream than Dollhouse or Altered Carbon, but still.

    • Andrew Hickey said, on August 31, 2010 at 6:33 am

      “if the technology in question comes too slowly it might lead to some ugly mid-futures.”
      Absolutely – *especially* given the huge love many of these people have for big corporations. I don’t think my short story “The Singularity” is too far off what might happen if some of them got their way.

    • pillock said, on August 31, 2010 at 7:09 am

      Whaddaya mean what “might” happen?!?

  2. Gavin Burrows said, on August 31, 2010 at 5:15 pm

    Many years ago the anarchist songwriter Joe Hill coined the now-famous line “You’ll get pie in the sky when you die.”

    Anytime we’re told that here is not the place, that now is not the time, we’re being sold a whole load of hooey.

