Not A Review Of Neoreaction A Basilisk

A couple of weeks ago, Phil Sandifer sent me a review copy of his book Neoreaction a Basilisk, ostensibly about Eliezer Yudkowsky, Nick Land, and “Mencius Moldbug”.

(For those who don’t know, Yudkowsky is a blogger and fanfic author who basically founded an entire community based around his desire to create a machine God; Land is basically the Dave Sim of philosophy — someone who did good work, and then had some sort of severe breakdown which he incorporated into his later work, including some horrific political views; and Moldbug is just a dick on the Internet who spends ten million words saying “I don’t like black people and would like a hereditary dictatorship because I wrongly think people like me would be in charge.”)

I haven’t yet written a review, because, frankly, it’s review-proof. Anyone who’s read anything by Sandifer before knows exactly what “a book by Phil Sandifer about transhumanist ‘rationalists’ and neoreactionaries” is going to be like (except Eliezer Yudkowsky, who has shown the same understanding of the Streisand effect that he showed with Roko’s Basilisk, unwittingly promoting Sandifer’s book far more effectively than anything Sandifer himself has done). In other words, it’s only tangentially about the three men who make up its ostensible topic, and is more about Blake, Milton, and existential horror.

(That’s not a criticism, by the way — you could say much the same about my own books on Grant Morrison).

People who like Sandifer’s stuff will like it, people who find him annoying will find it annoying. I like Sandifer’s stuff myself, so I liked it, but chances are if you’re reading this you either know you’ll like it or know you won’t.

So, instead, just a few thoughts it brought up…

The abyss trolley problem. An infinite number of tracks, each equally inane, each therefore evil in the only true sense of the word – that is, meaningless but malignant. You have the lever.

Is the political, in the end, merely personal? Sandifer seems to think so, and certainly it seems largely true of Yudkowsky (I’m not as familiar with the other two as I am with him).

Sandifer seems to suggest — and he may be right — that political philosophies come from a confrontation with the abyss; the absolute knowledge that, as he puts it in the first sentence of his book, “we are fucked”. The knowledge of our own inevitable death, the anthropocene extinction, the great filter, Moloch, Mundum, Gnon, Cthulhu… these all, basically, stem from the same instinct as my own scream of horror. A recognition that the world is a place of filth and horror, in which everything good is eventually destroyed.

In this viewpoint, Marxism and Liberalism are both, in their ways, attempts to build a bridge across the abyss. Neither bridge has yet reached the other side, perhaps partly because the builders keep getting distracted by trying to sabotage each other’s bridge, but both are still trying.

Yudkowskian rationality is an attempt to build a bridge across the abyss by using only a hammer.

Moldbug is trying to pull as many people as possible into the abyss with him, because he likes it in there.

Nick Land thinks the best way to cross the abyss is to jump straight down head first – eventually you’ll reach the bottom and you’ll bounce back up on the other side.

And Sandifer thinks that there’s no point in even trying to cross the abyss, because “we’re all fucked”. His sympathies seem to lie most with Thomas Ligotti, who considers consciousness itself to be a mistake.

(Personally, my sympathies lie with Yudkowsky. I think he’s wrong, as a matter of simple fact, about the possibility of digital immortality and how it may be approached, but all of this really seems to me to come down to the problem that human consciousness is relatively recent software running on legacy meatware, and it would be much better were that software ported to a system that didn’t have such a tendency toward catastrophic failure. Again, the similarities with my An Incomprehensible Condition come to the fore. Nick Land’s Phyl-Undhu, with its fascination with the Tower of Babel, shares a lot of the same mental territory as that book.

Yudkowsky is the odd one out, really, in that he locates his golden age in the future, while Moldbug and Land locate it in the past (Land, possibly, ironically — he may not believe a golden age could ever happen). The myth of the Fall — a loss of innocence being brought by knowledge and consciousness — is also, when one looks at it from a different angle, the myth of Progress, with Prometheus bringing knowledge that will lift man up to be with the Gods. While Land and Moldbug are Reactionaries, Yudkowsky is, in this sense at least, Progressive. But both are still the same myth — it’s really just a question of whether or not you think ignorance really is bliss.)

This seems to suggest that the problem most political campaigners want to solve is really a problem in their own heads. Certainly my own campaigning has rather a lot to do with my exaggerated sense of scrupulosity — a belief that if I don’t fix everything in the world, everything that’s still wrong *is my fault*. This is, of course, an extraordinarily egotistical view (and one, again, that is shared by Yudkowsky — part of the reason I mock his thinking when he’s wrong is that his thinking is so often so close to my own, the narcissism of small differences), and it’s possibly healthier to, as Phil suggests, “assume we’re all fucked”.

It’s *certainly* better to do so than to try, as Land and Moldbug do, to make that fuckedness come true as quickly and thoroughly as possible.

Moldbug, in particular, seems a ridiculous figure — just someone who bloviates on the Internet about how the blacks and the women just *are* genetically inferior, because Carlyle — but he’s one funded by Peter Thiel, a billionaire who shares most of Moldbug’s views, who’s a delegate for Trump, and who recently funded Hulk Hogan’s destruction of Gawker Media (a company I’ll shed no tears for, but a sign of how Thiel uses his power). If we *are* fucked, it’s in large part because of the existence of people like Thiel — people brought up on the technolibertarian dream, who think that if the world could just be run by tech billionaires everything would be OK. (Moldbug argued, seriously, while Jobs was still alive, that Steve Jobs should be made dictator.)

It may well be that we are all fucked. Certainly my own wetware, with its autism, and its anxiety, and depression, and general disgust at the universe, predisposes me to believe so. I believe that my own Liberalism is probably futile, that people will keep living down to my worst expectations, that the world is full of Definite Wrongness. (And if you haven’t, read The Northern Caves, which I linked above with the word “Mundum”. Nostalgebraist says it’s the best response he could have written to Sandifer’s book, before the latter was even written).

But like Nostalgebraist’s Salby, I have to keep trying, not out of any expectation that it will work, or that I can make anything better, but because when the world is *this wrong* it’s that or suicide, the Ligotti option — and taking that would hurt people enough that it would add to the Definite Wrongness.

So fundamentally, I can’t agree — as in, I am not *capable* of agreeing — with Sandifer’s book, even as I agree with his assessments of Land and Moldbug (I’m more charitable towards Yudkowsky, who is very much the person I could have been had I been unfortunate enough to have been born a couple of socioeconomic levels more privileged than I was). But if you’re interested in these ideas, or want to see what I’m talking about (I freely admit that this post may be the least coherent thing I’ve ever written) you have thirty hours to back his Kickstarter.

Alternatively (or as well) you could back my Patreon, like the other kind people who have funded this post. You won’t get Sandifer’s book if you do that, but you’ll encourage me to write more things like this…


4 Responses to Not A Review Of Neoreaction A Basilisk

  1. FrF says:

    I’m glad that you find positive words to say about Eliezer Yudkowsky, Andrew. Yes, he didn’t think about the Streisand Effect when he distanced himself from Philip’s book, but I can understand that he was distraught. I’m also of the opinion that Phil’s too cavalier about Eliezer’s accomplishments by calling him a “crank”. Indeed, he doesn’t have academic credentials, but are the Sequences nothing? Or the Singularity Institute/MIRI? Or CFAR? Or starting Less Wrong and inspiring people to carry on without his day-to-day involvement? These are things that take hard work and patience. This lack of respect that I often see when Humanities people write about Yudkowsky goes beyond a mere (healthy) contra-stance to the idolatry that Eliezer himself sometimes cultivated. I love Philip’s “Last War”, this ultimate no-things-get-left-out book project and I am definitely going to buy it, but by dragging Eliezer Yudkowsky into the Neoreaction “mess” he frivolously let the horses of stream-of-consciousness writing get the best of him.

    • I’m afraid I disagree. I find the “Sequences” very enjoyable, but frankly I think calling Yudkowsky a crank is perfectly fair comment. MIRI/SIAI/Whatever it’s called this week is… well, “nothing” is frankly the *best* one could say of it. Whether the good it’s done by persuading Peter Thiel to spend money on it that he otherwise would have spent on his evil political ideas is outweighed by the harm it’s done by taking charitable donations that otherwise would have gone on interventions that actually worked is very, very debatable. What isn’t really debatable is that its single most notable achievement is a Harry Potter fanfic — a good Harry Potter fanfic, but still, not exactly the “single most important way that you could donate money” that Yudkowsky claims. It’s certainly done no real work of any importance.
      And LessWrong I consider actively harmful. As I was saying on Tumblr the other day, I was on LessWrong for quite a while, in a very low-key way. My period of time there basically went “These are people talking about interesting stuff. Admittedly they have a few odd beliefs like the cryonic thing, but interesting people.” “…apart from this virulent racist who keeps talking about IQ…” “…and all these people who keep talking about being ‘Pick-Up Artists’…” “my God, this place needs to be burned down and the earth salted!”
      And Sandifer didn’t “drag Yudkowsky into the neoreaction ‘mess'” — neoreaction is largely inspired by Yudkowsky’s ideas, the neoreactionary community was formed on LessWrong and LessWrong diaspora sites like Slate Star Codex, and Moldbug is funded by the same person who funds Yudkowsky. Michael Anissimov was employed by the Singularity Institute, and his blog, “More Right”, is explicitly inspired by LessWrong. Yes, Yudkowsky explicitly disowns their political views (and yes, Sandifer notes that in his book), but without his work they wouldn’t exist at all, and they are by most objective standards Yudkowsky’s biggest impact on the world. Trying to talk about the Neoreactionaries *without* talking about Yudkowsky would be like trying to talk about them without talking about Robin Hanson or Peter Thiel.
      And I’m saying this as someone who is at least fairly sympathetic to Yudkowsky. I like his writing. I suspect I’ve read everything he’s ever written, and I’ve paid for it when (as with Rationality: From AI To Zombies) that’s been an option — even his weird short story about the fetishist virgin enslaving the devil with perversion and game theory. I think the world is a more interesting place for having Yudkowsky in it, and I want him to continue writing.
      But I *also* think the evidence is overwhelming that he *is* a crank, and that other than producing a ton of blog posts and fanfic his main achievements in the real world have been to create a toxic community.

  2. Pingback: Get June Started Right with June Links | Gerry Canavan

  3. plok says:

    “Humanities people”?

    Lots of interesting people are cranks. Alan Moore is a crank; PKD was a crank. I myself, very likely, am a crank. It doesn’t have to mean a guy on a science forum talking about phlogiston and how gravity is really just magnetism and what-all! And if Singulatarians and digital-consciousness people can’t be considered cranks in pretty much the moment they take the field, then we might as well give up using the word altogether.

    But I’m glad you mentioned the Abyss, because I’ve been thinking about people’s reactions to it a lot lately. Fear of death is real, but for some people it’s the only real thing…and in lots of neo-materialist thinking I see this underlying wave of terror driving extreme “rationalism”. My favourite example isn’t actually from the “alt-rat” bunch or whatever they are called, but in more well-known kooks like (say) Sam Harris and Richard Dawkins. I admit that’s a little off-topic as far as choosing examples goes, but many who are uncomfortable with philosophy seem to need philosophy very desperately on a personal level — they’re grappling with the subject non-stop, but because they refuse to pick up the appropriate tools of reason they are suffering. Trying to use a hammer to screw in a nail! Maybe they even do manage to get it properly in the wood, but my God what a way to work.

    That said, I know only a very little about the Less Wrong etc. crowd…but what I’ve seen definitely makes me want to read Sandifer’s book. Why are they this way? Perfect visual acuity, yet seemingly without any depth perception. If I still did fortune-telling, I’d rank them only below “Hollywood actor” as probably the easiest paydays in the world…

    Except they do it too, don’t they? The charlatan thing…

    Sorry, that’s actually NOT a criticism, though I guess it sounds quite dismissive. I don’t mean it that way — not everyone can be a proper charlatan, and I sometimes think we should have a professional association so we can keep out the poor craftsmen. Charlatanry at its best is delicious formal rhetorical play, a parody of logic, and a riff on the appeal of ordered verbal propositions. Didn’t Yudkowsky once say he could talk someone into doing something obviously contrary to their self-interest, using no inducements? I suppose that was meant to demonstrate the devilish danger of a superintelligent AI, that can quote Scripture to its own purpose, but for me it was reminiscent of the typical problems one has to solve as a professional charlatan intent on earning his daily bread…

    Sorry again, it does sound dismissive! But what I mean is, it’s easy to fool people but I admire the skill and intelligence it takes to fool them “fair and square”, if you will. It’s a discipline.

    Oh dear, this is coming across all wrong…you can tell I’m rusty…
