[I would like to thank Rob Nostalgebraist, dataandphilosophy, and Andrew Rettek for inspiring this post, which is primarily a synthesis of their various viewpoints– although of course not necessarily endorsed by any of the above.]
I think that the rationalist movement can be modeled as a spectrum between two different groups. Nostalgebraist calls them, perhaps unclearly, the Yudkowskians and the Tumblr Academic Freedom Club; because it amuses me, I’ll call them the Craft and the Community.
The basic belief of the Craft is that society is approaching a local maximum in a wide variety of fields; however, there are much better local maxima (or perhaps even a global maximum) over there. In order to reach the global maximum, you have to acquire a certain set of skills, so you don’t end up in the much worse trench between maxima. In this, they’re not dissimilar from (to pick a few random examples) mystics or leftists hungering for revolution. However, unlike those groups, the Craft believes that the appropriate skills to develop are those that fall broadly under the aegis of “rationality”, from Bayesianism to awareness of cognitive biases.
Conversely, the Community tends to be fairly skeptical about the Craft’s project. The Community is primarily interested in creating a social group with norms they find pleasant: statistical literacy; citing sources; civility; rewarding people for changing their minds; the not-geek not-autism thing; a high tolerance for really absurdly long web fiction. The most controversial such norm, of course, is refusing to shun people for having beliefs generally considered to be evil.
You could also talk about a middle position, where rationality is useful for, say, effective altruism and existential risk, but resoundingly useless in one’s day-to-day life. This seems to be the position outlined by Scott Alexander’s Extreme Rationality: It’s Not That Great. I would call this the Compromise, to keep up the C theme.
One important thing to note is that the Craft, the Community, and the Compromise all share a lot of similar beliefs. While of course getting rationalists to reach consensus is something like herding cats, typical rationalist philosophical positions include reductionism, materialism, moral non-realism, utilitarianism, anti-deathism and transhumanism. Rationalists across all three groups tend to have high opinions of the Sequences and Slate Star Codex and cite both in arguments; rationalist discourse norms were shaped by How To Actually Change Your Mind and 37 Ways Words Can Be Wrong, among others.
There are people who agree on few to no rationalist positions but still like going to our parties and reading our blog posts. I coined the term “rationalist-adjacent” for this group before I got the idea that the names of all subdivisions of the rationalist community should begin with the letter C.
Nostalgebraist wonders why the Craft and the Community continue to both identify as rationalists. I think this makes more sense than he thinks.
The Craft gets two primary benefits from the existence of the Community. First, social interaction. Most people see the Craft as crackpots. But the Community doesn’t instantly respond to “I work at MIRI” with “you’re going to prevent Skynet?” or “Are you sure this isn’t a cult?” The Craft is unlikely to recruit enough people to be able to have an entire small-c community on its own. Second, new ideas and members. The Community is selected for people who like thinking about stuff. The Community can come up with useful ideas for the Craft even if they don’t believe the Craft is right. Furthermore, a lot of people are going to be converted to the Craft’s side of things if they hang around the community for long enough, in the same way that you become a feminist if you spend a lot of time hanging around feminists, and convincing people who already subscribe to a lot of rationalist ideas is easier than convincing people who don’t.
In the Community’s case, we already have a norm against excluding people just because they believe things we think are stupid or evil, and that applies to the Craft as well as to neoreactionaries. The Craft follows the norms that the Community finds congenial, so there’s no particular reason to separate ourselves from them– particularly since we agree with them on a lot of our fundamental assumptions.
However, I would like to highlight that the Craft and the Community’s incentives are not necessarily aligned. In particular, the Craft has an incentive to get good PR, both from high-status people (who can be convinced that AI risk is important) and the general public (who can be convinced to adopt the Community’s ideals of rationality). Furthermore, the Craft has a much stronger interest in intellectual diversity than the Community does.
To talk about a specific example: a lot of Less Wrong references a lot of nerd culture, such as catgirls, anime, fanfiction, Harry Potter, My Little Pony, etc. From the Community’s perspective, this is awesome, because it means the community has shared cultural references. However, the vast majority of people, including intelligent and well-educated people, would either not understand those references or find them off-putting in a serious philosophical debate. It’s ill-advised to select strongly for a trait totally unrelated to any of the traits you want to optimize for, since you’d be ruling out smart, curious, hard-working people who happen to prefer James Joyce to Takeshi Obata. In the worst-case scenario, nerds have particular common failure modes (the Geek Cognitive Fallacies?) and selecting for nerds increases the likelihood the community would fall into those pitfalls.
Similarly, having neoreactionaries around selects for people who will not punish you for contemplating any idea, no matter how evil. Again, that’s one of the primary traits the Community is selecting for, so it’s cool with the Community. But the Craft, while it considers open-mindedness important, needs to select for other virtues, from a burning curiosity to a passion for scholarship. It must weigh carefully the benefits of the insight from this one fringe political group and the selection pressure it exercises on the rationalist community’s members, versus saying “okay, we can settle for ‘open-minded enough to want to build a Friendly AI because they read about it on the Internet’ and instead exert more selective pressure for people who have some other desirable trait.”
Lambert said:
I feel like this post is a ‘how did I reason effectively before knowing this?’ type enlightenment. The kind of realisation that one forgets that one did not have once.
blacktrance said:
I think this analysis underestimates the overlap between Craft’s and Community’s projects. Here’s one plausible narrative I like: “Many of the Craft’s object-level conclusions are correct, as are other common rationalist beliefs, but a significant number of them aren’t mainstream. What else are we missing by sticking to mainstream beliefs without questioning them? Usually there are some people who were saying the right thing when it was unpopular, so we should be inquisitive-minded and not punish people for saying unpopular things. Even if they’re wrong, they may be wrong in interesting ways and we would benefit from taking the effort to explain why. Moreover, rationality isn’t useless in day-to-day life, because it helps us form correct beliefs, teaches useful concepts that clarify one’s thoughts, and reject social norms that conflict with what’s instrumentally rational.”
ozymandias said:
You seem to me to be describing a Craft position, albeit one that’s in favor of a lot of things the Community likes.
multiheaded said:
I feel like I’m one of the few representatives of the grudgingly-tolerated evil beliefs from the *other* end here.
(I.e. some Maoism now and then, and generally sometimes advocating violence and politicizing/being angry over things that feel nice and neutral and not like grounds for conflict.)
multiheaded said:
(I’m honestly not generally optimizing for trolling, but trying for more clarity and getting to the root of disagreement, when I sometimes say things like, “The most rational/utilitarian/local-optimum-shifting thing would be to threaten that guy over there at gunpoint, and yes actually use unjustified, illegal physical force against him if he doesn’t behave as told or give you stuff.”)
Lambert said:
To risk moving the thread down a metalevel,
“He who fights monsters should see to it that he himself does not become a monster. And if you gaze for long into an abyss, the abyss gazes also into you.”
Jeffrey Austen Gandee said:
Exactly how sure are you about your priors, though? I have seen many of your posts, but I’m relatively new to SSC and ToT, so I don’t know exactly how you justify your Maoism, but I’ll assume you have a well-thought out, rational reason for those beliefs that would lead you to endorse violence.
The trouble is, many people with beliefs antithetical to yours also have well thought out reasons for believing them. Do you forgive them if they also endorse violence?
Even if I could imagine the most perfect utopia, one that could drastically reduce or even eliminate human suffering, I can’t imagine being so sure of myself that I would do violence or kill in order to bring it about.
argleblarglebarglebah said:
Jeffrey: Clearly violence has worked sometimes in the past. It’s certainly at least effective for getting the old regime out. Sometimes, historically, the new regime has been so clearly better that the violence was in retrospect worth it (such as in the case of the American Revolution). Sometimes not, though often even when the revolution doesn’t immediately make things better, it does make things better over the long term in a way that tolerating the old regime never would have.
So I don’t think there’s good reason to say that violent revolution is never justified, only that it’s very risky.
Doug S. said:
I don’t think there’s that good a case for the American Revolution, honestly, but that’s kind of off-topic. It paid off, I guess, but things could easily have gone horribly wrong and there’s a good chance that slavery in the American South would have ended sooner if it remained part of the British Empire.
Siggy said:
Being firmly in the rationalist-adjacent camp, I appreciate you mapping out the rationalist-sphere.
All I can really add is a personal account of being “rationalist-adjacent”. I basically “grew up” in the more mainstream skeptical community. I’ve long been aware of the rationalist community, but I’ve rarely paid them attention until recently. A lot of rationalist values strike me as very odd and negative, particularly transhumanism and not shunning people. I also feel like we have endless miscommunications, apparently because even though I have very similar methodological values, I don’t structure them in precisely the same way. And Yudkowski/Scott, I just don’t really care about them?
The pop nerd culture I also find irritating, but that’s just me and has little to do with skeptical/atheist communities. I’m used to not sharing cultural reference points with anyone anywhere.
slatestarcodex said:
Careful now. There are two kinds of “is divided into two parts”. There’s the kind where “The island of Ireland is divided into two parts, Northern Ireland and the Republic of Ireland”, and there’s the kind where “Leftists are divided into two groups, those who care more about social issues and those who care more about economic issues.”
I think any kind of craft/community distinction within rationalism is more like the second kind of division.
A lot of “community” Tumblr rationalists are polyamorous, but poly is like an archetypal example of the “everyone else is doing things wrong because of status quo bias, we can find a way that works better for us”. Likewise, a lot of people who would never dream of reforming the world economy still use modafinil, which is another example of “we think everyone else has missed this great opportunity which we will now seize because we have fewer weird taboos than they do.”
I’m even more ambiguous – I’m about as anti-life-hack as they come, up to the point where I refused an offer for a free spot at a CFAR camp a few years ago because it seemed like a waste of time (note: I now think this was a mistake, just based on value of information) but I still think that prediction markets might revolutionize everything and people are dumb for not trying them more.
I hope this does not become a way of splitting people into “those rationalists over there” and “us perfectly normal rationalists who don’t believe weird things.”
ozymandias said:
“Rationalist” signals affiliation with Eliezer Yudkowsky. People who don’t want to signal affiliation with Eliezer can go identify as “skeptic”, which is much more efficient than changing the word “rationalist” to stop signaling affiliation with Eliezer. If the Craft and the Community were to split apart, the Craft is almost certainly going to keep “rationalist”, so you need not fear us playing respectability politics with you.
And, yes, as I said in my post, it is clearly a spectrum, no one is pure Community or pure Craft, if you wanted to you could draw up a little Klein grid of whether you are in the Community or the Craft on various issues or in various contexts.
Also, I gave you a category, you are the Compromise, gosh.
Fossegrimen said:
“Rationalist” (possibly) signals affiliation with Eliezer if you are already a part of the LW community. For those of us who have come recently to LW but have considered rationality to be a virtue long before, it signals affiliation with Pythagoras and Eliezer is just a guy who’s good at explaining Bayes’ Theorem.
Since this post was qualified by “within the LW community”, I realise this is off topic, but the concept of rationality is a lot older than LW and I just wanted to point that out anyway. Personally, I self-identified as a rationalist (as per the Encyclopaedia Britannica definition) during the Carter administration.
ozymandias said:
The traditional meaning of “rationalist” and the LWian meaning of “rationalist” are sufficiently far apart that it is best to consider them homonyms.
Lambert said:
ESR considers the movement to be very similar to logical positivism & general semantics of the early and mid 20th century, respectively. Perhaps the people with the rationalist mindset of each generation gather and form a new movement.
fubarobfusco said:
In terms of epistemology, it seems to have a lot more to do with pragmatism (or Peirce’s pragmaticism) than with Logical Positivism by a long shot.
The overlap with g.s. is primarily the map/territory distinction. Bayesian rationalism surpasses g.s. in actually having a proper mathematical formalism for non-binary truth values; Korzybski intuited the need for one but doesn’t seem to have had the math to clearly specify it.
Susebron said:
Perhaps more of a spectrum than a division? That would better explain why people on both sides call themselves rationalists. Or maybe that was your whole point and I just misunderstood you.
Data And Philosophy said:
“I think that the rationalist movement can be modeled as a spectrum between two different groups.”
I think we can safely say that this was Ozy’s point, as that was Ozy’s first sentence in the post.
Susebron said:
Argh. That’ll teach me not to skim next time.
Matthew said:
I’m trying to place myself according to this typology, and finding it frustratingly difficult. This seems to me to suggest that it’s more map than territory.
On the Craft side, I generally think that thinking about epistemic and instrumental rationality is actually useful in my daily life. On the other hand, I’m not at all sold on some of the macro-level issues that the inner circle of Craft seem to think are no-brainers: I think that recursively improving AI is unlikely, and that is fortunate, because friendliness is not tractable. I’m not interested in cryonics, which is a subset of not having bought into certain parts of transhumanism. Effective altruism does, however, interest me.
On the Community side, I am really grateful that I have people to talk to about intellectually interesting things according to common norms of acceptable discourse. I’m mildly unhappy that I’m far from the physical places where most of these people congregate, and very agitated by the fact that it seems to be centered less and less around conventional blogs and more and more around a) Facebook (where I don’t get my veneer of plausible deniability) and b) twitter/tumblr (because I can’t participate during the work day and often feel like the conversation is passing me by).
However, HPMOR is the only fanfiction I care about, Fullmetal Alchemist is the only anime I’ve enjoyed as an adult, and I’m mystified by the fascination you otherwise adult-seeming people have with MLP:FIM and Steven Universe. I am a nerd — science fiction/fantasy and board games in particular, but I’m clearly not in quite the same nerd cluster. (This may partly be a generational thing.) Also not Poly, which seems to be something of a marker as well.
Matthew said:
Scott’s post appeared while I was in the middle of this, but yeah, what his first paragraph said.
Ann Onora Mynuz said:
> Fullmetal Alchemist is the only anime I’ve enjoyed as an adult
Which version? This is extremely important.
Matthew said:
I have not had a chance to see Brotherhood, but my expectation is that I would like both versions. You will have to explain the importance.
Matthew said:
If you want to stick with both geometric metaphors and the letter C, you could always go with [Rationalism-]Congruent.
transientpetersen said:
Or rationalism-compatible.
osberend said:
I think rationalist compatible might get more at the key point.
shemtealeaf said:
Just wanted to say thank you for the link to Scott’s post. Somehow I never ran across that one, but it’s an almost perfect description of how I feel about rationality.
LTP said:
I’m totally rationalist-adjacent, though you won’t see me at any of your parties or meet-up groups! Thanks for acknowledging us.
I think these are very interesting categorizations, and they make a lot of sense to me.
I think I’m primarily attracted to rationalist blogs for a few reasons:
1. Being civil is a big deal to me (though, OTOH, I’m not opposed to exclusion when necessary, but even then I prefer it to be of the quieter sort (e.g. quietly banning an account rather than shouting them down))
2. It’s one of the few non-religious spaces where philosophical issues are discussed outside of academia, at least in North American culture.
3. I appreciate that it’s a space with lots of culturally blue-tribe people who aren’t reflexively blue-tribe politically.
4. Scott and you, Ozy, are interesting reads.
So why am I not a rationalist? Well, one is that I’m not a fan of Yudkowski, both his views and his personality, and there seems to be a lot of Yudkowski hero worship in the community. For another, well, my opinions on “reductionism, materialism, moral non-realism, utilitarianism, anti-deathism and transhumanism” are: skeptical, skeptical, maybe but unlikely, hell no, meh and fun to think about but not of immediate concern. And I’m very skeptical about singularity-esque AI.
Matthew said:
Is misspelling Yudkowsky as Yudkowski some sort of intentional adjacency signal? Both Siggy and you have done it now.
LTP said:
No, that was completely unintentional, at least on my part. (It probably just means we have not spent enough time on LW to know the correct spelling.)
osberend said:
This also about sums me up, except for point 1, where I am almost the opposite: All views should be admissible into the discussion, but it’s okay if the discussion involves a bit of declaring one’s opponents to be scum, as long as one is honest and fair in doing so. Granted, the latter is fairly rare.
osberend said:
Also, I keep meaning to go to (publicly announced, open-invite) parties, but I’m always either in the wrong state (mostly Michigan) or too busy.
Sniffnoy said:
> In the worst-case scenario, nerds have particular common failure modes (the Geek Cognitive Fallacies?) and selecting for nerds increases the likelihood the community would fall into those pitfalls.
Well — to represent a point of view I don’t actually hold — Sark Julian would probably tell you that these nerd failure modes are actually inherent to rationalism…
Maxim Kovalev said:
So, LW consists of MIRI fans (Craft) and CFAR fans (Community)?
ozymandias said:
No, LW consists of MIRI/CFAR fans (Craft) and people who like the environment created by MIRI/CFAR fans (Community).
Sniffnoy said:
You could maybe make a “MIRI fan / CFAR fan” scale, though…
MugaSofer said:
>There are people who agree on few to no rationalist positions but still like going to our parties and reading our blog posts. I coined the term “rationalist-adjacent” for this group before I got the idea that the names of all subdivisions of the rationalist community should begin with the letter C.
“Cousins”.
osberend said:
Is the autism-spectrum parallel intentional?
Kaj Sotala said:
Interestingly, this division resonates with me, but at the same time I have difficulty placing myself within this spectrum. I do feel that society is stuck within a local maximum and that rationality skills could be really helpful in changing it, but on the other hand I also feel that individual rationality is kinda useless and what we need are social systems and communities and better social norms. But that also requires people to be educated about rationality skills, so it’s kind of believing in both at the same time.
Kaj Sotala said:
That not-geek-not-autist thing – it sounds kinda like it might correlate at least loosely with Sarah Constantin’s aesthetic of intricacy and Sister Y’s Unified Theory. Somehow the way you describe it makes me feel like these are aspects of the same thing, e.g. in some post you mentioned fascination with things like Catholicism or utilitarianism and Sarah mentions a fascination with intricate formal rule-systems like Catholicism or utilitarianism.
Sniffnoy said:
First link there doesn’t work.