[content warning: Neoreaction A Basilisk, Roko’s Basilisk, slurs]
[The most delightful part of NAB is all the really long reviews of it. In addition to my own humble contribution to the genre, I recommend Promethea’s, Rob’s, and psybersecurity’s.]
I.
A review of Neoreaction: A Basilisk by a rationalist is also a review of the discourse surrounding Neoreaction: A Basilisk.
Here is my review: Consider the fervent Twilight hater or the late lamented Anti-Shurtugal [cw: tvtropes]. They spend hours arguing about the text and writing detailed essays explicating their preferred interpretation. They engage in close readings about puzzling characterization, worldbuilding, or thematic questions. They write fanfiction and draw fanart. They have some ships they love, and other ships they despise with the red-hot passion of ten thousand fiery suns. They count down, excited, until the day the next book comes out, then devour it in one sitting so they can talk about it with all their friends. There is a word for this behavior. It’s called “being a fan.”
Similarly, if you and all your friends spend hours arguing about philosophy inspired by the works of Eliezer Yudkowsky, there is a word for this behavior. It’s called “being a rationalist.”
Now, you may protest, they cannot possibly be rationalists. But any criterion one might use to exclude them also excludes a bunch of thorough-going, obvious rationalists. For instance, being an AI risk skeptic, thinking HPMOR is not terribly good, and saying large numbers of rationalists are ableist cannot possibly mean that one is not a rationalist, because if that was true I wouldn’t be a rationalist. It’s hard to think of any set of beliefs that would get more than, say, 40% agreement among rationalists that wouldn’t bring in a bunch of people who obviously aren’t rationalists (e.g. atheism).
Social definitions also fail. Many rationalists do not attend in-person meetups. Anti-rationalists interact socially with rationalists online. Admittedly, it’s mostly through arguing, but rationalists mostly interact with each other through arguing anyway. And anti-rationalists do sometimes have pleasant interactions with rationalists– most notably nihilsupernum, whose taste in friends I shall never be able to understand as long as I live.
Indeed, the only rule I can think of for saying that they aren’t rationalists is that they tend to be somewhat unpleasant and widely disliked. However, if general unpleasantness and being widely disliked meant someone wasn’t a rationalist, then we would have a completely perfect defense against being accused of neoreactionary infestation, because everyone fucking hates Anissimov.
II.
I am unfortunately not familiar with Land or Moldbug, but I am familiar with Yudkowsky. Sandifer’s discussion of Yudkowsky is factually correct, but occasionally has moments of Just Not Getting It. For instance, Sandifer seems to think the usual rationalist philosophy of identity has something to do with many-worlds quantum mechanics (?). Of course, many-worlds was involved in Roko’s original formulation of the basilisk, but it has nothing to do with philosophy of identity. Similarly, Sandifer mocks the concept of ‘inferential distance’, which is particularly strange given that he’d just spent several pages walking the reader through who this Yudkowsky person was. And Sandifer thinks Eliezer thinks that “deathist” means “a person who is not signed up for cryonics”, even though it just means “person who is pro-death”, and many many people who are skeptical about cryonics are anti-death. (Hi!) Nevertheless, for someone who is not immersed in the philosophy, I felt he did quite well.
III.
Neoreaction A Basilisk has been criticized as a hitjob. It’s really not. It’s really, really not.
Indeed, Neoreaction A Basilisk is perhaps characterized best by its obvious affection for the people it’s criticizing.
I don’t have a hell of a lot to say about the first five sections of Neoreaction A Basilisk, because I like them. Take some snark, add some clever intellectual games, sprinkle with a dash of Milton, a bit of Bryan Fuller Hannibal, and the tiniest hint of Fanon: that’s how you make something that makes me grin every other page.
Sections six and seven, however, just utterly do not work for me. This is not really Sandifer’s fault, it’s a product of who I am as a person. Due to a tragic deficiency in my education, my Blake is limited to the Tyger. As a borderline and autistic, I find it hard to read the word ‘empathy’ without having a flinch reaction because of how closely the word is associated with Simon Baron-Cohen telling me I don’t have any. And Sandifer and I are just working in fundamentally different strands of feminism. As soon as I read the word ‘femininity’, my inner radical feminist (she has combat boots and a buzzcut) starts stomping around shouting about the patriarchy and how ‘femininity’ is a nice word for women being men’s slaves.
Part of the problem with ‘empathy’ is that it really doesn’t fit his chosen thinkers well. Land, sure, Land finds empathy horrifying. Empathy never crossed Moldbug’s mind as a thing he ought to incorporate into his design: that’s a fair cop. But Eliezer is characterized as “a person who thinks a lot about empathy but isn’t any good at it.” You would hope this would be illustrated with examples of Eliezer either failing to model other people or acting in a callous way. Instead, it is illustrated with Eliezer… talking about empathy in a way Sandifer finds funny? I don’t understand why this is a failure of empathy? One can very well have empathy without being able to theorize about it. Indeed, one might very well argue that the failure of empathy here is Sandifer’s; he’s the one who has a hard time understanding someone else’s point of view on account of they talk funny.
At one point, Sandifer proposes that the people with empathy become their own post-apocalyptic tribe and shoot the people without empathy. Sandifer is, I hope, aware of the joke here, and I wouldn’t put it past him to do a sly self-pwning of his own ideology. Indeed, that’s probably the fundamental tension of Sandifer’s idea of empathy, and perhaps of Neoreaction A Basilisk as a whole.
Sandifer is talking about empathy as something scary, almost like Campbell’s The Thing– as the invasion of one’s mind by a mind alien to one’s own. That’s the meaning of the strange affection Sandifer has for Yudkowsky, for Moldbug, and (especially) for Land– he empathizes with them and he wants to understand them. He’s doing the whole Ender’s Game thing: “In the moment when I truly understand my enemy, understand him well enough to defeat him, then in that very moment I also love him. I think it’s impossible to really understand somebody, what they want, what they believe, and not love them the way they love themselves. And then, in that very moment when I love them…. I destroy them.”
There’s no cheat codes here. There’s no way to get out of it. Empathizing is inherently the process of empathizing with that which is alien to you, whatever it is; therefore empathizing is inherently horrifying, inherently a threat. And while I do not know Sandifer as a person, it’s not unreasonable to suggest that neoreactionaries and race realists are exactly who is alien to a postmodernist Marxist. Imagine him showing up on Tumblr to all his cool leftist friends and being like “actually, I am reading this guy whose fans are like 90% white nationalists and I think he’s really interesting and insightful and I want to write a book about him?” God.
So he distances himself. He snarks. He follows up the idea of arguing with Yudkowsky, Moldbug, and Land in person with “(ew)”. He cultivates the ironic distance between himself and his subjects. He has this whole air of superiority to the disgusting racists plus one AI crackpot he’s decided to write a book about.
(The fact that Sandifer probably had an impulse to comment on my eighth-grade reading level after that Ender’s Game quote is exactly what I’m fucking talking about.)
But hatedom is just another kind of fandom. You don’t write a book about a couple of obscure Internet philosophers unless they fascinate you, unless you find them interesting, unless– not to put too fine a point on it– you like them. The lady doth protest too much, methinks.
IV.
I really, really want a revision of Neoreaction: A Basilisk where Sandifer engages with Brian Tomasik. I think it would say really interesting things about his empathy thesis, given that Brian’s whole thing is extending empathy to an absurd degree. If there is a thing that could conceivably be suffering in this world then Brian will empathize with it– whether it is a fly, a video game NPC, or an electron. (Talk about experiments with radical empathy.) And Thomas Ligotti would be really interesting to put in conversation with Tomasik, because Tomasik is genuinely in favor of destroying nature in order to end wild-animal suffering. While Tomasik is not in favor of destroying humanity, some of his friends are. Seriously, Sandifer! Please consider this for the sequel! I promise this is only a little bit motivated by the fact that I’m an even bigger Tomasik fan than I am a Yudkowsky fan.
V.
Sandifer begins: “let us assume that we are fucked,” and he continues with this assumption. He classifies responses to the fuckedness in three categories: denial; decelerationism, or the attempt to delay apocalypse, which is Yudkowsky’s praxis; and accelerationism, or the attempt to make apocalypse happen as fast as possible, which is what Land is up to.
Of course, Yudkowsky is not actually decelerationist. His project is about accepting the possibility of our doom and working to avoid it, which is clearly a different thing from accepting the inevitability of our doom and working to delay it. Like, surely Sandifer has snarked enough about how Yudkowsky wants to live forever in a computer to notice this. One of the most puzzling failures of Sandifer’s understanding of Yudkowsky is when he characterizes the following passage as “Ligottian in its bleakness”:
I visualize the past and future of humankind, the tens of billions of deaths over our history, the misery and fear, the search for answers, the trembling hands reaching upward out of so much blood, what we could become someday when we make the stars our cities, all that darkness and all that light—I know that I can never truly understand it, and I haven’t the words to say.
Ligotti wants to destroy all of humanity because consciousness is inherently evil. (Which, incidentally, is just boring as horror-philosophy. Humanity should be destroyed? Puh-leeze, some of my best friends are negative utilitarians.) This passage is, like, literally the opposite of that. Did you not notice the bit about the trembling hands reaching upwards and the stars being our cities and all that? This isn’t bleakness, this is Whig history. Every day, and in every way, the world is getting better and better.
So what this is about is effective altruism.
Effective altruism says: “wait a minute, why are you assuming that we’re fucked?”
Last year, the global poverty rate fell below 10% for the first time. (For comparison purposes, for most of history, the global poverty rate was close to 100%.) The global life expectancy at birth has risen from 26 during the Iron Age to 67 today, more than doubling. 500 million have died of smallpox but not one single one more, ever again— and within a few years Jai will get to write a post like that about polio, and within his lifetime about malaria. The environmentalist movement keeps slowly, quietly, winning, its victories celebrated only in the fact that the movement starts yelling about something else: you can breathe in Los Angeles; the ozone layer is repairing itself and will be fixed by the mid-21st-century; acid rain levels have dropped 65% in the US. In the 1970s, marital rape was legal everywhere on the globe; today it is illegal in more countries than it isn’t.
Just about the only place where we’re not improving is existential risk. Sandifer is skeptical about artificial general intelligence, which implies he’s also skeptical about other exotic existential risks (e.g. nanotech, the Dark Lords of the Matrix deciding to shut down our simulation). So you’ve mostly got the standard ones: nuclear war; asteroids; pandemic; runaway global warming. Sandifer says his best guess is “that millennials will probably live long enough to see the second Great Depression, which will blur inexorably with the full brunt of climate change to lead to a massive human dieback, if not quite an outright extinction.”
Climate change is expected to cause about five million deaths over twenty years, concentrated mostly among people in the developing world; for comparison, malaria causes nearly twice as many deaths per year. I am not a huge fan of malaria [citation needed]. However, if we’re not unutterably fucked because of the existence of malaria now, I very much doubt that we will be unutterably fucked because of the existence of climate change a decade from now.
And I don’t mean to be a douchebag here, but… the Great Depression? You mean the time period that’s better than 90% of human history?
Imagine talking to a medieval peasant. “You can talk to people miles and miles away,” you say. “You can hear music recorded by the world’s greatest musicians whenever you like. There’s about a 75% chance you don’t have to have any babies unless you want to. And the chance that you will die in childbirth is low, and getting lower.”
The peasant eyes you suspiciously. “What’s the catch?”
You hedge. “Well, you’re not quite as rich as you could have been if our civilization was more competent,” you say, “and there’s an increased chance you’ll die of malnutrition or diarrhea. I mean, the chance is not as high as it is in your time period, but it definitely could have been lower than it was.”
Such dystopia, much doom. Wow. I can imagine the medieval peasant shaking their head and being so glad that they avoided such a dread fate.
Sandifer characterizes Ligotti’s viewpoint on Milton as follows: “We might imagine, for instance, the swiftness with which it would dismantle the Miltonian position simply by blinking uncomprehendingly as soon as Milton begins to speak (and thus to sin) and asking “why are you doing that,” to which there is no possible response that Milton could ever give.”
This is, I think, the effective altruist response to Phil Sandifer as well. We blink uncomprehendingly as soon as he says “let us assume that we are fucked” and ask “why are you doing that,” to which there is no possible response that Sandifer could ever give. The world is a bad place, but it is getting better, and we are in the tremendously lucky position that each of us can play a vital role in improving it.
Perhaps I’m missing the whole point with all these ‘facts’ and ‘historical perspective.’ Perhaps it is not so much about the actual reality of whether or not we are fucked, but an attitude, a perspective from which we approach the world. Personally, I left grimdark edgelordery back in middle school, along with music snobbery and “Sarcasm Is The Body’s Natural Defense Against Stupidity” T-shirts, but you do you.
VI.
To return to what I started my review saying: a review of Neoreaction: A Basilisk by a rationalist is inherently a review of the discourse surrounding Neoreaction: A Basilisk.
Long ago, a person reviewed John Ringo’s book Ghost in a snarky yet kind of affectionate way:
The PALADIN OF SHADOWS series is arguably the most horrifying series of books I have ever read. It has a hero I can’t stand, politics so strong they’re comical, and sex scenes that are downright horrifying. And I cannot stop reading it. I am going to buy every single one, and if Ringo ever comes out with a spin-off featuring Katya as Cottontail the Bionic Whore, I will buy that too. Because dammit, there’s bad, and then there’s so bad you have to memorialize it for future generations.
John Ringo responded by saying the critiques were absolutely fair, his book was indeed total id-spew shit, and did they want to sell T-shirts with “OH JOHN RINGO NO” on them to raise money for a domestic violence shelter? The only way I saw anyone respond to this was “wow, Ghost is a shitty book, but John Ringo is an absolute class act.” The criticism of his book was a complete PR win.
I propose this as a model for rationalists. Be an absolute class act. Stop with the “I, a PR expert with a complete understanding of how human social dynamics work, have never heard of the Streisand Effect and thus think telling people not to read a book is a good way to get them to not read it” nonsense. Admit that the snark is funny. (Calling the Sequences “of a genuine intellectual heft comparable to Kant’s Critiques, assuming you don’t much care for Kant’s critiques” is genuinely hilarious and you know it.) Think that it is totally neat that we are being called Lovecraft protagonists. Buy the book. Tell your friends. Make memes out of the lines (I personally would advocate for “let us assume that we are fucked”). Make Neoreaction: A Basilisk one of the must-read rationalist books, next to Gödel, Escher, Bach, Thinking, Fast and Slow, and Worm. Make it one of those books that everyone is familiar with, like Strategy of Conflict, even if they haven’t read it, just because everyone references it so much.
There can be one of two results here. First, if Sandifer is the person I hope he is, he will roll with it and grin and appreciate his new fanbase and welcome his membership in the rationalist community. Second, if he is not, we look like people who can take a joke, he looks like a cantankerous douchebag, and– most crucially– it will annoy the shit out of him. Really, it’s win/win.
Jacob Schmidt said:
Do you read Sandifer’s tumblr at all? He’s pretty much upfront about finding these 3 fascinating, and outright says that he kinda likes Yudkowsky. Dude’s not doing a lot of protesting.
Ghatanathoah said:
Is it really that common for people to find empathy an unpleasant and horrifying experience? When I try to empathize with someone who is really different from me, especially someone who is different in a rather nasty way (i.e. Ted Bundy) my usual emotional response is fascination. I feel the same emotions I feel when I read insight porn. This is generally followed by me patting myself on the back for being a fearless truthseeker.
I do occasionally get horrified at the idea that there are people out there who would kill me if they could, but the horror is generally due to their existence, rather than comprehension of their motives (for instance, I’m a little horrified thinking about Ligotti, but not at all horrified thinking about the movie version of Folken Fanel, because Ligotti exists and Folken doesn’t). Empathy actually makes the horror go down because I feel more in control once I understand them.
MugaSofer said:
People tend to like to think that they’re special, myself included. “There but for the grace of God go I” is not particularly reassuring unless you think you have an indefinite supply of the Grace of God.
Machine Interface said:
Expression of appraisal for the completely obscure Escaflowne reference.
1angelette said:
I’ve had my legs broken over the course of six months by a dozen needles fourteen millimeters in diameter, as well as found wounds taking up a tenth of my entire foot that bled through three inches of gauze, and I’ve never felt greater pain than developing the empathetic understanding that I had mercilessly exploited a person I professed to dearly love. So the empathy experience depends on the person, I guess.
Ann Onora Mynuz said:
I’m not good at making out the distinction between empathy and sympathy, but I could stand to have a lot less of the latter.
veronica d said:
I men Phil face-to-face once, at a mutual friend’s house. We had a really amazing conversation, where I concluded that he’s on a totally different wavelength from me, but it’s a wavelength I like. Anyway, he can be scathing. If you like the “always put up a (maybe false) front of niceness” style in your discourse, perhaps Phil ain’t your guy. Perhaps he is. We each get to choose.
I’m a ticked off old tranny, so basically he’s golden to me. Anyway, all that said, he gets Yudkowsky very wrong.
Okay, so I’ve never met EY. I don’t think I’ve ever actually interacted with him in any way. I doubt he knows I exist. But still, I think I “get him.” I dunno. Some of you here know him face-to-face, so set me straight if I’m wrong.
EY doesn’t lack empathy. He’s swimming in the shit. In fact, he’s a “sensitive soul,” in a way that Sandifer can’t quite get, cuz EY doesn’t look like the kinds of “sensitive souls” Sandifer is used to (Blake, Moore, etc.).
So it goes. The reason EY responds like he does to “sneer culture” is cuz it hurts him.
Empathy? The right to be invaded? Oh my!
Sandifer has a skin a thousand times thicker than Yudkowsky does. Is this good or bad? You decide.
(Personally I like both of them.)
I got zero respect for Moldbug. I really don’t care if someone drags him hard. Whatevs. Land — I don’t even know what to fucking think. I might read him someday, as brain porn or whatever. I dunno.
I have an inner “edgelord” who I will occasionally feed candy.
#####
I like Ligotti precisely cuz I thought EY’s “metaethics” were garbage. It’s as if, everyone steers this bogus course through nihilism never quite facing up to the fact that, nope, this really is utterly fucking meaningless.
Which fine. I’m happy anyway. I’m a happy fucking nihilist. Life has no meaning. We’re shambling meatsacks who kill each other a lot. I’m drifting through life trying to do something interesting before death takes me. How silly this is.
Thing is, I’m happy cuz brain chemicals. Ligotti’s miserable cuz brain chemicals. That’s it. How fucking lovely.
I recall watching “True Detective” and agreeing quite literally with everything the Ligotti-analog character said, except the part where I’m happy.
I’m not sure what to think about that. The fact is, there is nothing to think. We’re naked apes who evolved. This is it.
There is a sense that everyone gets to find some narrative that works for them, some way to work through this. As an approach, horror seems as honest as any other.
Given the choice, I’d rather see humanity thrive. There is no reason we should. There is no reason we should not.
#####
I’m kinda cynical about climate change, the utter unpredictability of climate systems and the horror of fundamental instability. I’m saddened by the rise of white identity politics in the west, the bitterness that manifests at “Trumpism,” along with the fact that some of my friends think it might be reasonable to murder cops, not as an “edgy” bit of performance (“ACAB!”), but in response to the actual murder of cops. (I don’t always like cops. But I don’t want them murdered).
Blah blah blah.
More blahs. A lot of blahs.
I think entropy always wins and shit is unstable and natural selection runs atop a mountain of death and suffering, and why should we expect large economic systems to do much better? Central planning does not work. Read that Seeing Like a State thing. Also learn about the “curse of dimensionality.” How does our “superintelligence” find these optimal points?
There is no grand design. There is no reason to suppose we can achieve a nice, happy optimal global economic system that is not a nightmare of war and exploitation.
More blahs.
Plus, there is only so much oil in the ground, only so much coal. We’ve done well for a few hundred years, running on the back of easy energy, but there are hard pressures coming.
Perhaps we’ll “innovate” our way out of the mess. Perhaps we will not. Watching the orgy of stupid that we all call the “tech bubble” along with the complete intellectual “trashfire” that now dominates one half of US politics (and this compared to our usual levels of stupid) — well, I’m kinda cynical.
I think we might be fucked. It’s a nice enough starting point.
#####
Of course, I don’t know. Neither do you.
veronica d said:
Ha! I met Phil. I didn’t “men” him — although now I’m trying to imagine that describes some utterly transgressive sexual act that would shock even Phil.
Ghatanathoah said:
There’s something Eliezer once wrote (I can’t remember where, unfortunately) that I think is fairly relevant to Ligotti-style nihilism. I think what he said (I’m paraphrasing, obviously) was that Ligotti-style nihilism isn’t the belief that life has no objective value. It’s the belief that life does have an objective value, and that that value is always zero or negative. Even if a person says they believe in no objective value, if they act like Ligotti does, what they really believe is that life has an objective value that is very low.
(He didn’t mention Ligotti by name, he was just talking about nihilism in general and Ligotti was the person it reminded me of)
So maybe the difference between you and Ligotti is that you really believe life has no objective value, whereas he says he does, but doesn’t really.
veronica d said:
@Ghatanathoah — I think it is more complicated than that. To my view, it starts with the Mind Projection stuff [1]. Specifically, there is a way our minds “color” the world with all sorts of meaning. Things such as our sense of attraction, versus disgust, our sense of fair, versus unfair, our capacity to immediately empathize with one person, but not the other — all of these things seem to “just happen.” In fact, I suspect they result from unconscious “system 1” type thinking.
I mean, not always. We can override this — sometimes, to some degree. But it doesn’t just turn off, even if we realize it is “in our head,” just as I don’t stop finding sexy things sexy just because I know that it is “in my mind.”
It is not that I think someone like Ligotti is “more correct,” not exactly. It’s that I think his message is needed. Typically, when the philosophically inclined struggle with this, they swerve away from horror. This is natural. Our system 1 is baked deep and we respond accordingly. In LW terminology, we get a principle such as, “It all adds up to normality” [2]. Which sure, I agree. On the other hand, saying, “Don’t worry, your system 1 will continue to involuntarily produce the illusion” — well, I’m not sure if that really wrestles the beast.
I’ve always wanted to really, truly wrestle the beast. But I also don’t want to go fucking insane.
So I leave my system 1 running. Why wouldn’t I? I mean, I don’t want to live as Ligotti does. I like to experience happiness. It is pleasant.
But there is nothing magic-special about my system 1. It is just, I’m lucky. I have a (somewhat) normal brain. But that’s just my brain. There are other brains.
The universe itself — it is empty of meaning. It will accept equally all of these narratives.
Honestly, I find that a little bit horrifying. But it feels honest to me.
[1] http://lesswrong.com/lw/oi/mind_projection_fallacy/
[2] http://lesswrong.com/lw/sk/changing_your_metaethics/
Ghatanathoah said:
>Honestly, I find that a little bit horrifying. But it feels honest to me.
The idea that you ought to find it horrifying is, in itself, one of those System 1 projections you’re talking about. I feel important when I contemplate it, because it reminds me that we are an essential part of Meaningfulness. Meaningfulness exists because we exist; we literally give Meaning to things.
veronica d said:
Also I don’t think it quite works to call Sandifer a “rationalist.” Myself, I certainly don’t quite accept the label, mostly because I find the central tendency of rationalism-space to be people I cannot abide. That said, I like the edges of the community. Thus I’m happy to call myself “rationalist adjacent.” It’s a nice enough term that seems to describe an actual collection of people.
But I don’t think Sandifer is quite even that, at least in the sense that rationalism is a meaningful “epicenter” (big metaphor alert!) of (at least part of) my social space. The same ain’t true for Sandifer. He’s no closer to (for example) Scott Alexander than he is to any other “notable” nerd blogger.
After all, EY forms only one-third of the rogues gallery he takes on in his book. It would be perverse to thus describe him as an NRx-er.
InferentialDistance said:
NEMESIS!
I thought the orgies meant we didn’t have to assume…
callmebrotherg said:
If we’re going to make memes out of this, then //let us assume that we are fucked// is a great tagline for bonobo rationalism.
InferentialDistance said:
As you wish.
callmebrotherg said:
I am feeling rather down tonight, but you made me smile. Thank you ever so much.
pillsy said:
I took the fucked assumption in Basilisk less as something unarguable, and more as what Sandifer thought the best starting point for engaging with EY, Moldbug, and Land in the rather odd-seeming way that he wanted to engage with them.
I agree with most of the rest of Ozy’s review, except for the EA stuff, which I’m pretty sure I’ll never agree with.
Susebron said:
Yeah, it seemed like it was more “here’s one way to look at this whole thing” rather than “this is the one true way to understand neoreaction.”
pillsy said:
I think “assume we’re not fucked” would lead you to writing something more like Scott Alexander’s FAQ. There’s nothing wrong with that, but it’s been done.
arbitrary_greay said:
It was interesting to realize that my kneejerk unpleasant reaction to NAB was basically what most other people feel when confronted with RPF. I found that this particular volume of RPF was uncomfortable to me, unlike my usual enthusiastic support of the category, because it purported to not be that, but philosophical/political analysis, and then pulled a few moral judgments out of the narrative written in the RPF.
There’s a glorious AU fic of NAB to be written where Eruditorum Press are the Holograms, and the NeoReactionalists are the Misfits, and Eliezer just might be Stormer. Or NRX are the Misfits, and Rationalists are the Stingers?
Would have much preferred the Eternal Golden Cuckball was written as it should be, as a threesome PWP. Right up there in the glorious annals of rule 34 RPF as with Dan Rather/Tom Brokaw/Peter Jennings.
davidmikesimon said:
@arbitrary_greay I have basically no idea what you’re talking about but I think I’m intrigued?
arbitrary_greay said:
@davidmikesimon: Not sure which paragraph you are referring to, so an elaboration on all three! 😀
RPF is real person fanfiction. For example, a fanfiction about your two favorite boyband members having a romance with each other, or also very commonly, the actors for your favorite TV show/film. Common anti-RPF arguments are like “stop treating real people like meat puppets,” “this is what tabloid culture does,” “stop removing your subjects’ agency,” “what if people wrote RPF about you,” etc.
I’m usually all for RPF. But the way NAB is written, it’s introspective character study fanfiction, interpreting the men through their professed beliefs, but pretending just to be an analysis of the beliefs. Evidently, death of the author doesn’t apply to philosophy, when it comes to NRx and Rationality. I’m all for a Lovecraft AU fanfiction about these guys, but not when it’s also purporting to be nonfiction. No one would, or should, ever cite an RPF as evidence for, say, Chris Evans (actor of Captain America) having some particular personality trait.
The second paragraph is referring to Jem and the Holograms.
Eternal Golden Cuckball (or something like that) was apparently going to be the original title for NAB. PWP is “porn without plot,” where the smut is the only thing that happens in the fic.
osberend said:
I’ve always understood PWP to stand for “Plot? What Plot?”
arbitrary_greay said:
@osberend:
There are plenty of PWP variants:
porn without plot
porn without purpose
plot what plot
(I’ve also seen “porn with plot,” but almost always spelled out and not condensed to PWP)
nancylebovitz said:
Is this Ligotti the horror writer?
Also, I had my empathy crank up a lot once (forgive me, rationalists– I actually think of it as my shields against other people’s auras getting lowered). It was tremendously painful and I hope it never happens again, or at least not unless I have much better resources for dealing with it. I have much more sympathy (if not empathy) for people who are bad at shielding themselves.
callmebrotherg said:
Indeed it is Ligotti the horror writer.
wallowinmaya said:
>I really, really want a revision of Neoreaction: A Basilisk where Sandifer engages with Brian Tomasik.
Yeah, that would certainly make for an interesting read 🙂
>And I don’t mean to be a douchebag here, but… the Great Depression? You mean the time period that’s better than 90% of human history?
Nice point 🙂
>Ligotti wants to destroy all of humanity because consciousness is inherently evil. (Which, incidentally, is just boring as horror-philosophy. Humanity should be destroyed? Puh-leeze, some of my best friends are negative utilitarians.)
😀
As an aside, some of my best friends are negative utilitarians as well and they don’t want to destroy humanity: they take cooperating with other value-systems extremely seriously. Which actually shouldn’t be surprising, because the arguments for doing so are pretty compelling. (See e.g. the “NU FAQ” (http://www.utilitarianism.com/nu/nufaq.html) and of course the numerous essays on the importance of cooperation & compromise by Tomasik himself (https://foundational-research.org/).)
Kenny said:
How can I get the book? I can’t find any links anywhere. Just lots of commentary.
callmebrotherg said:
It’s currently only available to Kickstarter backers. You can probably follow http://www.eruditorumpress.com/ to learn when it becomes generally available.