[content warning: Neoreaction A Basilisk, Roko’s Basilisk, slurs]
[The most delightful part of NAB is all the really long reviews of it. In addition to my own humble contribution to the genre, I recommend Promethea’s, Rob’s, and psybersecurity’s.]

I.

A review of Neoreaction: A Basilisk by a rationalist is also a review of the discourse surrounding Neoreaction: A Basilisk.

Here is my review: Consider the fervent Twilight hater or the late lamented Anti-Shurtugal [cw: tvtropes]. They spend hours arguing about the text and writing detailed essays explicating their preferred interpretation. They engage in close readings about puzzling characterization, worldbuilding, or thematic questions. They write fanfiction and draw fanart. They have some ships they love, and other ships they despise with the red-hot passion of ten thousand fiery suns. They count down, excited, until the day the next book comes out, then devour it in one sitting so they can talk about it with all their friends. There is a word for this behavior. It’s called “being a fan.”

Similarly, if you and all your friends spend hours arguing about philosophy inspired by the works of Eliezer Yudkowsky, there is a word for this behavior. It’s called “being a rationalist.”

Now, you may protest, they cannot possibly be rationalists. But any criterion one might use to exclude them also excludes a bunch of thorough-going, obvious rationalists. For instance, being an AI risk skeptic, thinking HPMOR is not terribly good, and saying large numbers of rationalists are ableist cannot possibly mean that one is not a rationalist, because if that were true I wouldn’t be a rationalist. And it’s hard to think of any belief that would get more than, say, 40% agreement among rationalists without also bringing in a bunch of people who obviously aren’t rationalists (atheism, for example).

Social definitions also fail. Many rationalists do not attend in-person meetups. Anti-rationalists interact socially with rationalists online. Admittedly, it’s mostly through arguing, but rationalists mostly interact with each other through arguing anyway. And anti-rationalists do sometimes have pleasant interactions with rationalists– most notably nihilsupernum, whose taste in friends I shall never be able to understand as long as I live.

Indeed, the only rule I can think of for saying that they aren’t rationalists is that they tend to be somewhat unpleasant and widely disliked. However, if general unpleasantness and being widely disliked meant someone wasn’t a rationalist, then we would have a completely perfect defense against being accused of neoreactionary infestation, because everyone fucking hates Anissimov.

II.

I am unfortunately not familiar with Land or Moldbug, but I am familiar with Yudkowsky. Sandifer’s discussion of Yudkowsky is factually correct, but occasionally has moments of Just Not Getting It. For instance, Sandifer seems to think the usual rationalist philosophy of identity has something to do with many-worlds quantum mechanics (?). Of course, many-worlds was involved in Roko’s original formulation of the basilisk, but it has nothing to do with philosophy of identity. Similarly, Sandifer mocks the concept of ‘inferential distance’, which is particularly strange given that he’d just spent several pages walking the reader through who this Yudkowsky person was: that is, bridging exactly the sort of inferential distance the concept describes. And Sandifer thinks Eliezer believes that “deathist” means “a person who is not signed up for cryonics”, even though it just means “person who is pro-death”, and many, many people who are skeptical about cryonics are anti-death. (Hi!) Nevertheless, for someone who is not immersed in the philosophy, I felt he did quite well.

III.

Neoreaction A Basilisk has been criticized as a hit job. It’s really not. It’s really, really not.

Indeed, Neoreaction A Basilisk is perhaps characterized best by its obvious affection for the people it’s criticizing.

I don’t have a hell of a lot to say about the first five sections of Neoreaction A Basilisk, because I like them. Take some snark, add some clever intellectual games, sprinkle with a dash of Milton, a bit of Bryan Fuller’s Hannibal, and the tiniest hint of Fanon: that’s how you make something that makes me grin every other page.

Sections six and seven, however, just utterly do not work for me. This is not really Sandifer’s fault; it’s a product of who I am as a person. Due to a tragic deficiency in my education, my Blake is limited to ‘The Tyger’. As a borderline and autistic person, I find it hard to read the word ‘empathy’ without having a flinch reaction, because of how closely the word is associated with Simon Baron-Cohen telling me I don’t have any. And Sandifer and I are just working in fundamentally different strands of feminism. As soon as I read the word ‘femininity’, my inner radical feminist (she has combat boots and a buzzcut) starts stomping around shouting about the patriarchy and how ‘femininity’ is a nice word for women being men’s slaves.

Part of the problem with ‘empathy’ is that it really doesn’t fit his chosen thinkers well. Land, sure, Land finds empathy horrifying. Empathy never crossed Moldbug’s mind as a thing he ought to incorporate into his design: that’s a fair cop. But Eliezer is characterized as “a person who thinks a lot about empathy but isn’t any good at it.” You would hope this would be illustrated with examples of Eliezer either failing to model other people or acting in a callous way. Instead, it is illustrated with Eliezer… talking about empathy in a way Sandifer finds funny? I don’t understand why this is a failure of empathy? One can very well have empathy without being able to theorize about it. Indeed, one might very well argue that the failure of empathy here is Sandifer’s; he’s the one who has a hard time understanding someone else’s point of view on account of they talk funny.

At one point, Sandifer proposes that the people with empathy become their own post-apocalyptic tribe and shoot the people without empathy. Sandifer is, I hope, aware of the joke here, and I wouldn’t put it past him to do a sly self-pwning of his own ideology. Indeed, that’s probably the fundamental tension of Sandifer’s idea of empathy, and perhaps of Neoreaction A Basilisk as a whole.

Sandifer is talking about empathy as something scary, almost like Campbell’s Who Goes There? (filmed as The Thing)– as the invasion of one’s mind by a mind alien to one’s own. That’s the meaning of the strange affection Sandifer has for Yudkowsky, for Moldbug, and (especially) for Land– he empathizes with them and he wants to understand them. He’s doing the whole Ender’s Game thing: “In the moment when I truly understand my enemy, understand him well enough to defeat him, then in that very moment I also love him. I think it’s impossible to really understand somebody, what they want, what they believe, and not love them the way they love themselves. And then, in that very moment when I love them… I destroy them.”

There are no cheat codes here. There’s no way to get out of it. Empathizing is inherently the process of empathizing with that which is alien to you, whatever it is; therefore empathizing is inherently horrifying, inherently a threat. And while I do not know Sandifer as a person, it’s not unreasonable to suggest that neoreactionaries and race realists are exactly who is alien to a postmodernist Marxist. Imagine him showing up on Tumblr to all his cool leftist friends and being like “actually, I am reading this guy whose fans are like 90% white nationalists and I think he’s really interesting and insightful and I want to write a book about him?” God.

So he distances himself. He snarks. He follows up the idea of arguing with Yudkowsky, Moldbug, and Land in person with “(ew)”. He cultivates the ironic distance between himself and his subjects. He has this whole air of superiority to the disgusting racists plus one AI crackpot he’s decided to write a book about.

(The fact that Sandifer probably had an impulse to comment on my eighth-grade reading level after that Ender’s Game quote is exactly what I’m fucking talking about.)

But hatedom is just another kind of fandom. You don’t write a book about a couple of obscure Internet philosophers unless they fascinate you, unless you find them interesting, unless– not to put too fine a point on it– you like them. The lady doth protest too much, methinks.

IV.

I really, really want a revision of Neoreaction: A Basilisk where Sandifer engages with Brian Tomasik. I think it would say really interesting things about his empathy thesis, given that Brian’s whole thing is extending empathy to an absurd degree. If there is a thing that could conceivably be suffering in this world then Brian will empathize with it– whether it is a fly, a video game NPC, or an electron. (Talk about experiments with radical empathy.) And Thomas Ligotti would be really interesting to put in conversation with Tomasik, because Tomasik is genuinely in favor of destroying nature in order to end wild-animal suffering. While Tomasik is not in favor of destroying humanity, some of his friends are. Seriously, Sandifer! Please consider this for the sequel! I promise this is only a little bit motivated by the fact that I’m an even bigger Tomasik fan than I am a Yudkowsky fan.

V.

Sandifer begins: “let us assume that we are fucked,” and he continues with this assumption. He classifies responses to the fuckedness into three categories: denial; decelerationism, or the attempt to delay apocalypse, which is Yudkowsky’s praxis; and accelerationism, or the attempt to make apocalypse happen as fast as possible, which is what Land is up to.

Of course, Yudkowsky is not actually decelerationist. His project is about accepting the possibility of our doom and working to avoid it, which is clearly a different thing from accepting the inevitability of our doom and working to delay it. Like, surely Sandifer has snarked enough about how Yudkowsky wants to live forever in a computer to notice this. One of the most puzzling failures of Sandifer’s understanding of Yudkowsky is when he characterizes the following passage as “Ligottian in its bleakness”:

I visualize the past and future of humankind, the tens of billions of deaths over our history, the misery and fear, the search for answers, the trembling hands reaching upward out of so much blood, what we could become someday when we make the stars our cities, all that darkness and all that light—I know that I can never truly understand it, and I haven’t the words to say.

Ligotti wants to destroy all of humanity because consciousness is inherently evil. (Which, incidentally, is just boring as horror-philosophy. Humanity should be destroyed? Puh-leeze, some of my best friends are negative utilitarians.) This passage is, like, literally the opposite of that. Did you not notice the bit about the trembling hands reaching upwards and the stars being our cities and all that? This isn’t bleakness, this is Whig history. Every day, and in every way, the world is getting better and better.

So what this is about is effective altruism.

Effective altruism says: “wait a minute, why are you assuming that we’re fucked?”

Last year, the global poverty rate fell below 10% for the first time. (For comparison purposes, for most of history the global poverty rate was close to 100%.) Global life expectancy at birth has risen from 26 during the Iron Age to 67 today, more than doubling. 500 million people have died of smallpox, but not a single one more, ever again— and within a few years Jai will get to write a post like that about polio, and within his lifetime about malaria. The environmentalist movement keeps slowly, quietly winning, its victories celebrated only in the fact that the movement starts yelling about something else: you can breathe in Los Angeles; the ozone layer is repairing itself and will be fixed by the mid-21st century; acid rain levels have dropped 65% in the US. In the 1970s, marital rape was legal everywhere on the globe; today it is illegal in more countries than it isn’t.

Just about the only place where we’re not improving is existential risk. Sandifer is skeptical about artificial general intelligence, so presumably he’s also skeptical about other exotic existential risks (e.g. nanotech, the Dark Lords of the Matrix deciding to shut down our simulation). So you’ve mostly got the standard ones: nuclear war; asteroids; pandemic; runaway global warming. Sandifer says his best guess is “that millennials will probably live long enough to see the second Great Depression, which will blur inexorably with the full brunt of climate change to lead to a massive human dieback, if not quite an outright extinction.”

Climate change is expected to cause about five million deaths over twenty years, concentrated mostly among people in the developing world. That works out to roughly 250,000 deaths a year; for comparison, malaria causes nearly twice as many deaths every year. I am not a huge fan of malaria [citation needed]. However, if we’re not unutterably fucked because of the existence of malaria now, I very much doubt that we will be unutterably fucked because of the existence of climate change a decade from now.

And I don’t mean to be a douchebag here, but… the Great Depression? You mean the time period that’s better than 90% of human history?

Imagine talking to a medieval peasant. “You can talk to people miles and miles away,” you say. “You can hear music recorded by the world’s greatest musicians whenever you like. There’s about a 75% chance you don’t have to have any babies unless you want to. And the chance that you will die in childbirth is low, and getting lower.”

The peasant eyes you suspiciously. “What’s the catch?”

You hedge. “Well, you’re not quite as rich as you could have been if our civilization was more competent,” you say, “and your chance of dying of malnutrition or diarrhea is higher than it needs to be. I mean, it’s nowhere near as high as in your time period, but it definitely could have been lower.”

Such dystopia, much doom. Wow. I can imagine the medieval peasant shaking their head and being so glad that they avoided such a dread fate.

Sandifer characterizes Ligotti’s viewpoint on Milton as follows: “We might imagine, for instance, the swiftness with which it would dismantle the Miltonian position simply by blinking uncomprehendingly as soon as Milton begins to speak (and thus to sin) and asking “why are you doing that,” to which there is no possible response that Milton could ever give.”

This is, I think, the effective altruist response to Phil Sandifer as well. We blink uncomprehendingly as soon as he says “let us assume that we are fucked” and ask “why are you doing that,” to which there is no possible response that Sandifer could ever give. The world is a bad place, but it is getting better, and we are in the tremendously lucky position that each of us can play a vital role in improving it.

Perhaps I’m missing the whole point with all these ‘facts’ and ‘historical perspective.’ Perhaps it is not so much about the actual reality of whether or not we are fucked, but an attitude, a perspective from which we approach the world. Personally, I left grimdark edgelordery back in middle school, along with music snobbery and “Sarcasm Is The Body’s Natural Defense Against Stupidity” T-shirts, but you do you.

VI.

To return to what I started my review saying: a review of Neoreaction: A Basilisk by a rationalist is inherently a review of the discourse surrounding Neoreaction: A Basilisk.

Long ago, a person reviewed John Ringo’s book Ghost in a snarky yet kind of affectionate way:

The PALADIN OF SHADOWS series is arguably the most horrifying series of books I have ever read. It has a hero I can’t stand, politics so strong they’re comical, and sex scenes that are downright horrifying. And I cannot stop reading it. I am going to buy every single one, and if Ringo ever comes out with a spin-off featuring Katya as Cottontail the Bionic Whore, I will buy that too. Because dammit, there’s bad, and then there’s so bad you have to memorialize it for future generations.

John Ringo responded by saying the critiques were absolutely fair, his book was indeed total id-spew shit, and did they want to sell T-shirts with “OH JOHN RINGO NO” on them to raise money for a domestic violence shelter? The only way I saw anyone respond to this was “wow, Ghost is a shitty book, but John Ringo is an absolute class act.” The criticism of his book was a complete PR win.

I propose this as a model for rationalists. Be an absolute class act. Stop with the “I, a PR expert with a complete understanding of how human social dynamics work, have never heard of the Streisand Effect and thus think telling people not to read a book is a good way to get them to not read it” nonsense. Admit that the snark is funny. (Calling the Sequences “of a genuine intellectual heft comparable to Kant’s Critiques, assuming you don’t much care for Kant’s critiques” is genuinely hilarious and you know it.) Think that it is totally neat that we are being called Lovecraft protagonists. Buy the book. Tell your friends. Make memes out of the lines (I personally would advocate for “let us assume that we are fucked”). Make Neoreaction: A Basilisk one of the must-read rationalist books, next to Gödel, Escher, Bach, Thinking, Fast and Slow, and Worm. Make it one of those books that everyone is familiar with, like The Strategy of Conflict, even if they haven’t read it, just because everyone references it so much.

There are two possible outcomes here. First, if Sandifer is the person I hope he is, he will roll with it and grin and appreciate his new fanbase and accept his membership in the rationalist community. Second, if he is not, we look like people who can take a joke, he looks like a cantankerous douchebag, and– most crucially– it will annoy the shit out of him. Really, it’s win/win.