I think a formative moment for any rationalist– our “Uncle Ben shot by the mugger” moment, if you will– is the moment you go “holy shit, everyone in the world is fucking insane.”
Your dad buys lottery tickets. Your best friend treats her colds with homeopathic remedies. Your sister thinks she can beat the stock market. The value of a life is something like twenty million dollars and you can save one for $3,500. The most common cause of condom failure is not wearing one. Ninety percent of people believe in God. You have– multiple times– started watching a movie and it turned out to be terrible and you kept watching it because “we’re already half an hour in” and at no point did it occur to you that this doesn’t make any goddamn sense.
I don’t think it’s an accident that a lot of rationalists are mentally ill. Those of us who are mentally ill learn early and painfully that your brain is constantly lying to you for no reason. I don’t think our brains lie to us more than neurotypicals’ brains do; but they lie more dramatically, about things society is not set up to accommodate, and so the lesson is drilled in.
Now, there are basically two ways you can respond to this.
First, you can say “holy shit, everyone in the world is fucking insane. Therefore, if I adopt the radical new policy of not being fucking insane, I can pick up these giant piles of utility everyone is leaving on the ground, and then I win.”
And this leads to: Polyamory. Modafinil use. Effective altruism. Pretty much any CFAR technique. Attempting to solve medicine single-handedly. Cryonics. SENS. MIRI. Half the things that make Eliezer haters go “Jesus Christ, what is he up to now.”
This is the strategy of discovering a hot new stock tip, investing all your money, winning big, and retiring to Maui.
Second, you can say “holy shit, everyone in the world is fucking insane. However, none of them seem to realize that they’re insane. By extension, I am probably insane. I should take careful steps to minimize the damage I do.”
And this leads to: The emphasis on applying meta-level moral rules that you would wish your enemies abided by. The Principle of Charity. The cults sequence. Trusting experts. The virtue of scholarship. Chesterton’s Fence. The outside view.
This is the strategy of discovering a hot new stock tip, realizing that most stock tips are bogus, and not going bankrupt.
I want to emphasize that these are not mutually exclusive. In fact, they’re a dialectic (…okay, look, this hammer I found is really neat and I want to find some place to use it). Trying to minimize the damage from your insanity is, in fact, a strategy for picking up some of that utility on the ground and then winning.
Nevertheless, there are definitely some people who are more on the becoming-sane side of things, and other people who are more on the insanity-harm-reduction side of things. Eliezer is way over on the “I can become sane!” side; Scott seems to be somewhere in the middle; I’m a grouchy asshole who wanders around waving my cane and saying “you kids are not going to outperform the people with actual degrees“; Topher is somehow even grouchier than I am, and opposes the Principle of Charity on the grounds that people are too insane to be able to ever do it. And I feel like a lot of conflicts in the rationalist community boil down to this conversation:
Becoming Sane Side: “Hey! Guys! I found out how to take over the world using only the power of my mind and a toothpick.”
Harm Reduction Side: “You can’t do that. Nobody’s done that before.”
Becoming Sane Side: “Of course they didn’t, they were completely irrational.”
Harm Reduction Side: “But they thought they were rational, too.”
Becoming Sane Side: “The difference is that I’m right.”
Harm Reduction Side: “They thought that, too!”
Becoming Sane Side: “So, what, I’m not supposed to try to outperform the people who literally spend their money on lottery tickets?”
Harm Reduction Side: “No, you can try to outperform them by not trying to take over the world.”
Both Sides, Simultaneously: “YOU ARE INSANE.”
But, you know, Eliezer wrote Ethical Injunctions, and Topher is a modafinil-using polyamorous effective altruist who’s signed up for cryonics. Whichever one we favor, we all have both impulses in our soul. And I hope that next time we notice a conversation happening along those lines we can at least come to an understanding about what we’re arguing about.
(I think the argument I am going to make is close to a position stated by jadagul on Tumblr a few months ago, but I can’t find it because Tumblr is terrible)
The fact that every member of a group is mad does not make the group itself mad. Or, maybe better:
Everyone is mad, yet the world (mostly, kinda) makes sense.
You present an example of this with the stock market, where investors do about as well as random chance most of the time, yet the system still works and is useful for creating value, funding innovation, and ultimately lifting people out of poverty. The system appears to work so well on its own that it is a somewhat defensible position to say it should be subject to very few regulations (even before talking about the problem of regulatory capture) for optimal results.
Apparently the craziness can be self-cancelling.
Despite all the stupidity we see in academia (every… single… day), science is moving forward. Some affirm it could go much faster. Many claims of local problems (funding, the glacial pace of change in focus in some areas) are valid (if not necessarily actionable), while wide-ranging criticism is… well, there’s the sequence on Science and Rationality for an example of that; I think we have beaten this horse enough.
Minor remark on the first way to respond: with your writings on mono/poly obligates, I would have expected self-hacking to be there rather than polyamory itself.
I agree… as a very monogamous person I am bummed to see that assumption 😦
What’s with all the hating on lottery tickets? A lottery ticket costs a dollar, and state lotteries usually support education. Spending a dollar on a fun little placebo of hope while simultaneously contributing to a scholarship fund doesn’t strike me as particularly crazy. It’s certainly less crazy than other, more expensive means of paying for a fantasy, and similar to spending a quarter on pinball.
Because lotteries are a waste of hope and your rationality is my business.
First link should lead here.
Ah, but I think Eliezer’s first point is silly. He’s equating dreams and ambitions (like founding a business) with fantasies (like having a sledge drawn by arctic foxes that you ride around your town). The two are different emotionally and use very different energies. I don’t think fantasies of effortless wealth are a “sink of emotional energy” any more than any other fantasy. And I have yet to see LW criticizing people for fantasies of, say, fighting zombies and winning, developing supernatural powers, or being sexually desired by their favorite celebrity. These are also things that will never, ever happen, but holding them as fantasies makes people happy and helps them unwind. If spending large amounts of time and money constructing an elaborate costume so you can spend a weekend at Comic-Con indulging in the fantasy that you’re a space pirate is okay (and I’ve yet to see anyone argue it isn’t), then why is it wrong to spend a dollar to indulge in the fantasy that you’ll become instantly and effortlessly rich?
You don’t have to spend the dollar to daydream about becoming effortlessly rich. I daydream about becoming effortlessly rich all the time. What your dollar buys is hope– a hope that only exists because you are sufficiently statistically illiterate that you think you have any chance of winning.
No one thinks dressing up as a space pirate will cause you to actually become a pirate.
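The statistical-illiteracy point is just an expected-value calculation. A minimal sketch, with assumed numbers (the jackpot size and odds below are hypothetical, loosely Powerball-scale; they aren’t from this discussion):

```python
# Expected value of a hypothetical $1 lottery ticket with one prize tier.
# Both numbers are assumptions for illustration, not real lottery figures.
TICKET_PRICE = 1.00
JACKPOT = 100_000_000           # assumed jackpot
P_WIN = 1 / 292_000_000         # assumed odds of winning

expected_value = JACKPOT * P_WIN - TICKET_PRICE
print(f"Expected value per ticket: ${expected_value:.2f}")
# The expectation is negative: each ticket loses money on average,
# so anything the buyer gets out of it has to be the hope itself.
```

Under these assumptions the ticket loses roughly sixty-six cents in expectation; changing the jackpot or odds moves the number but, for any real lottery, not its sign.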
Eh, I think there are people who find a fantasy more satisfying with props. I can see where it would just feel too *silly* to fantasize about winning a lottery without actually buying a ticket for the fantasy to be satisfying. You can fantasize about lots of hot women being attracted to you while wanking to free internet porn instead of paying to watch strippers, but presumably it feels less authentic.
I think there’s a failure to grasp the point that the lottery ticket buyers are probably judging that winning the lottery is substantially more likely than other fantastic scenarios (like marrying a wealthy heir, turning out to have a huge inheritance because one’s _real_ parents put one up for adoption, discovering sunken treasure, etc.).
One might wonder why slightly less fantastic scenarios like “start a rock band” “make a killing on the stock market” “become famous on youtube and milk it somehow” don’t occur to people. Surely they’re more satisfying?
I think most people don’t believe even in their wildest dreams that they would really _do_ anything like that. Sure they _could_, but as insiders they know they _won’t_.
Maybe rationalists have a more robust self image and can believe in the possibility tomorrow might be drastically different than today in a good way. Most of the people I know really well (smart, quirky misfits like me) do not share that belief.
$1 buys that little nudge to plausibility. It is now _possible_ to win. Versus the conviction other paths to wild riches aren’t possible _for that person_ as far as they’re concerned, after taking their lack of energy, stress, and issues into account.
And sometimes even a few seconds of thinking “problems I hate dealing with _could_ go away” might be the few seconds a person needs to take a break from the low grade grueling marathon which is “living” without a sense of satisfying purpose. (Seriously, I get the impression even most non depressed people are not particularly satisfied with whatever purposes they use to get through their days.) A few seconds to reallocate the tiresome weight of stress and dissatisfaction so they can go on more easily.
Also, it’s a very undemanding fantasy in terms of mental effort. All the important pieces are concrete and easy to imagine with stock footage. Other fantasies run into problems (“I don’t know how to scuba dive and I doubt I will ever learn” “I’m positive I wasn’t adopted” (ie, someone would rather be poor than think they couldn’t tell that their parents had adopted them and kept the secret successfully) “I can’t even get someone attractive at a social function to chat with me, I have no idea how I’d appeal to someone important” (again, someone would prefer to be poor than imagine there’s some strategy that they _might_ use to find a partner _but they’ve been incapable of using so far_).)
Even if the plausibility issue can be resolved, it involves _work_. Mental creative writing that most people are too tired to spare the energy for. (It also involves _initiative_ to a far greater degree than buying a lottery ticket. Never mind that we get a real rush out of buying _anything_ http://siderea.livejournal.com/1232320.html )
Why did I write this entry? For the fantasy that I might modify someone’s opinion in a way I find favorable. A fantasy that’ll bounce around in my head as a background process for a few minutes after finishing, and for a few seconds intermittently over the next few days. Because for me, spending 20 minutes writing this was emotionally easier than imagining I might learn to drum, form an awesome band, and make a ton of money.
I know this won’t happen 😛 But it’s now literally possible, since I wrote this entry. Whereas, as I feel right now, becoming a rockstar is impossible _for me_.
Ken: Your link seems irrelevant to your comment? Did you intend a different one?
Ah, OK, I guess Ken must have meant to link to this: http://siderea.livejournal.com/1232789.html
My policy is to fantasize about picking up a winning lottery ticket that someone dropped on the street. This does not significantly lower my odds of winning, but it saves me money.
I note that no one in the first group has yet achieved anything more impressive than creating dubiously useful employment for themselves, which was already a solved problem, and, from what I can gather from my social networks, some kind of San Francisco Bay Area commune, which was *definitely* already a solved problem; whereas people in the second group have successfully thought carefully about things that are difficult to think carefully about on *multiple* occasions, even if they do it imperfectly.
Effective altruism has saved thousands of lives this year alone. Also they aren’t groups, they are tendencies, and every rationalist has both of them even if some people favor one or the other.
Why is “doing impressive things” the standard for comparison? If this kind of reasoning leads to marginal improvements in many areas as well as helps avoid rare but significant failures (both of which I believe to be the case), there won’t be much you can point to and be able to say that they outperform the general population.
In what sense have the second group, and not the first, successfully thought carefully about etc.?
Great article, and dialectic approaches do seem way more effective than they ought to be. I wonder why.
Minor quibble: you can outperform someone with a degree, but not without doing the work. Which is to say, the degree is a certification of knowledge. It is not the knowledge itself.
(I never attended university, but I outperform your average person with a CompSci degree, including many with advanced degrees. But I did the work. I read the books. I read the papers. I did the problems. I spent years doing the hard stuff.)
That said, I agree with the broader point. There is this notion swimming around in rational-space that simply having some high-school level science plus Bayesian tools will magically let you outguess people who have spent thirty years studying a topic deeply. That idea is kinda silly.
As a guess, because the world is so complex that any concept short enough to fit into a thesis statement is going to be woefully oversimplified. By setting two contradictory concepts as anchors, one can respond with nuance to myriad situations by switching between concepts as appropriate. It does require a skilled user to pull off, however.
Harm-reduction seems at least moderately correlated with Community within LW, yes?
Sounds more like a hammer and sickle.
I have a favorite quote that I frequently recall. “Just because you function well in an insane world does NOT mean you are sane”
I definitely had a formative rationalist moment like that.
When I was a child, I had to go to church. This was because my mother was a professional organist, and sitting in a pew is a place to put a child when you’re working. I didn’t like it any more than any other child (no child likes sitting still for an hour while someone talks), and I wasn’t allowed to bring any books. Except the bible. I could bring that.
So I read the bible each week. Starting on page one and working my way through.
So… there was never a moment when I was religious. I started with a childlike agnosticism (everyone around me said stuff about religion, but I had no opinion because I was busy thinking about jets and dinosaurs and didn’t listen), and then I encountered the Old Testament. Which is pretty obviously made up. It starts with a bunch of fairy tales, then rereleases different versions of them, then it has a bunch of old myths, badly retold and often repeated differently, then it starts telling you a bunch of laws from a really awful but kind of interesting society. And the New Testament wasn’t better. The prophecies were silly, it was pretty clear that the stories were myths, and everything had the whole “magic is definitely real, but it only happens where you can’t see, so you’ll have to take my word for it” thing going on. Plus Paul was so obviously a fraud that it was kind of funny. So I filed this under “obviously myths” and moved on.
So why did everyone still go to church if it was obviously not true?
My world view at this age was a very “march of progressivism” kind of view. I looked around and saw a world where people were reasonably fair and equitable, but every generation you went back, things got worse. Like, in my generation, black and white kids could play together. But that was controversial for my parents. And further back it was unacceptable. And further back was slavery, etc. So it completely fit my world view, by the way, to find that ancient Israelite law was evil to the point of being comical. The past was terrible, and the arc of history was learning not to be terrible. That’s what I expected.
So I figured that church was a kind of holdover. A legacy thing people did for community, and for their grandparents, who maybe didn’t know any better because they were raised back in a time when everyone was ignorant and evil.
Now I don’t want to make it sound like I thought everyone over 60 was stupid and evil. I actually admired them, because they were raised in a stupid and evil world, and managed to mostly move past that. I figured that was probably really hard. And I didn’t expect them to be perfect. I saw them as a generation that figured out that it wasn’t doing things as it should, and raised a generation of kids that were better raised than they were, who repeated the process. I expected there to be some latent hostility or ignorance or traditionalism.
I know the above is not an accurate world view. But I was like, seven. And it’s a lot less wrong than it could have been.
So anyway, I figured that church had value even if you didn’t believe in it, and anyway it made grandparents happy if you went. And I figured that obviously no one educated or modern actually believed religion stuff, because the bible was everywhere, and all you needed to do to figure out it wasn’t true was open it and read for a bit. And then you’d be like, “ooooh, yeah, the eternal source of goodness and light probably didn’t order his followers to attack neighboring villages, murder everyone, and rape all the little girls; that seems like it would be out of character; I bet I’m just reading ancient myths.”
My “everyone is crazy” moment was realizing that everyone around me actually believed in this stuff. They never had the “well, obviously this isn’t real” moment because they didn’t even know those passages were there. I was more shocked to learn that someone could live their life surrounded by a single book and never read it than I was surprised that they believed. I had this moment where it clicked in my head that I, and the pastor, were very possibly the only ones in this several hundred person strong church who had read the Bible.
He was a good guy. I wish I had been able to talk to him about all this. But I was really young, and I knew what Christianity said about unbelievers. Which I apparently was, alone. I always had been, but it was different feeling that way alone, and realizing everyone actually believed that stuff about the saved and the unsaved.
So I kept very quiet.
By the time I was old enough to responsibly broach the issue, my circumstances were different, and the pastor had moved on. I never got the same intellectual vibe from the next one. So I never really had that conversation, though I’ve read plenty of apologists trying to deal with the obvious mythology of their holy text. I think I would have preferred my old pastor’s perspective. Apologists tend to be embattled and defensive, and, well, vicious. I suspect my pastor was more of an “I’ve achieved a separate peace” kind of guy.
Very interesting post. I’m not close enough to the rationalist community to be sure I fully understand the explanation of its divisions (which I wasn’t aware of).
You linked to the LW survey results, re mental illness. It would be helpful to know the base rates for each of the conditions (and I’m too lazy to look them up). What stuck out for me was the very high rate of depression (half of respondents including self-diagnosis, which I’m pretty sure is way above the population at large). Anxiety, at a third of respondents, is also very high, but I vaguely remember the base rate for that is very high too. ASD is also high—a quarter. [I am pleased that these fractions came out so neatly, because I have ASD.]
Anyway, I thought this was interesting because these three are typically correlated with nihilism (or so I’ve claimed elsewhere), and the other mental health conditions surveyed are not. (And those others have low prevalence among responders—although I have no idea what the base rates are, so this hypothesis might be nonsense.)
I think I know how to get oneself out of nihilism, which helps with depression and anxiety and ASD’s downsides (although it’s not necessarily a “cure”). Seeing these numbers makes it seem more urgent to write about that!
Seven percent, 1.5 percent, and one percent, respectively. We’re pretty fucking crazy around here. (We also have higher rates of every mental disorder other than schizophrenia.)
Thank you very much! So, my theory about why rationalists have these particular mental health issues would be disconfirmed if we were similarly above base rates for the other conditions (which would then be quite rare). I don’t suppose you have numbers handy? (If not, I’ll go looking myself. Thanks either way!)
If it turns out that it’s *specifically* depression, anxiety, and ASD that are elevated in rationalists, I think I’ve got an interesting story to tell…
Whoops, sorry, I wasn’t paying attention! You said “We also have higher rates of every mental disorder other than schizophrenia.” Back to the drawing board, maybe!
This seems like a useful way of framing the differences within the rationality (+adjacent) community. However it doesn’t resolve the question which is troubling me, namely whence all the hostility towards Eliezer often found on the “harm reduction side” (I am not talking about you here; you seem to be one of the nicest people in that camp). I can understand the disagreement, I cannot understand the anger. There seem to be a lot of things in the world that deserve much more anger than well-meaning people you consider overconfident about something.
You guys can get pretty angry at us too. (Eliezer has… not exactly been kind to skeptics in the past.)
I think it’s because we share a community, and because you are much less likely to become a rationalist if you can make peace with other people being insane.
My impression was that usually harm-reduction is proactively angry and becoming-sane is only reactively angry. But I am biased so my impression might be wrong.
Regarding making peace with insanity, this is a very interesting point. Maybe there is a lesson in rationality here:
In a mad world, your prior for other people being insane is pretty high. Therefore you might decide someone is insane while they are actually fairly rational but observed different evidence.
People can have certain intuitions for good reasons (the intuition was calibrated on lots of evidence), but it is hard to communicate them since you don’t explicitly remember the list of evidence supporting the intuition. Therefore converging beliefs require “Aumannian” communication where the intuition of the other person is accepted as valid evidence. However, Aumannian communication is impossible when you assume the other person is insane. As a result, convergence of beliefs becomes very slow. Add confirmation bias and you might get no convergence at all.
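The prior-for-insanity point can be sketched as a toy Bayes update (all numbers below are made up for illustration): even when a statement is twice as likely to come from a rational person, a high enough prior for insanity still leaves you judging the speaker probably insane.

```python
# Toy Bayesian update on "is this person insane?" — all numbers hypothetical.
prior_insane = 0.9              # high prior that a random person is insane
p_claim_given_insane = 0.3      # an odd-sounding claim, plausible from an insane person
p_claim_given_sane = 0.6        # twice as likely from a rational person with different evidence

# Bayes' rule: P(insane | claim)
posterior_insane = (p_claim_given_insane * prior_insane) / (
    p_claim_given_insane * prior_insane
    + p_claim_given_sane * (1 - prior_insane)
)
print(f"P(insane | claim) = {posterior_insane:.2f}")
# Even though the claim is evidence of sanity (likelihood ratio 2:1),
# the posterior stays well above 50%: the prior dominates.
```

So a listener behaving perfectly rationally, given a mad-world prior, can still write off a rational speaker, which is exactly why Aumannian convergence stalls here.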
Besides the obvious uncharitable status explanation: well-meaning, overconfident, wrong people can genuinely be very dangerous.
This is exactly what I don’t understand. What is so dangerous about Eliezer in the eyes of his critics?
Good post and interesting concept. I mostly agree.
But I reject your characterization of me as “middle of the road”. I think of myself as an outside view extremist (see eg my blog tagline). For example, http://lesswrong.com/lw/jq7/selfcongratulatory_rationalism/anbt is me defending the Outside View against Topher, even though you classify him as more Outside-View-ish than me.
The problem might be that you need a further split. Once people acknowledge that they are often insane, there are at least two non-exclusive possible strategies. First, you can decide to trust expert consensus. Second, you can become generally less confident and adjust your beliefs back toward the prior. The problem with the first is you have to use your own insane brain to figure out which experts are trustworthy (eg not psychoanalysts or archbishops); the problem with the second is that you have to use your own insane brain to figure out which prior to use (eg concept of antipredictions). In theory these two methods complement each other; in practice both sides tend to view the other as overly dogmatic and insufficiently humble.
Every rationalist does both things sometimes. Even if Topher favors the harm-reduction side of things more than you do, it isn’t that surprising to find that on one occasion you took a more harm-reduction side than he did. (I mean, Topher trusts experts more than you do, and yet only one of you is signed up for cryo.) Furthermore, the question of whether you should trust people who make reasonable arguments or people who have high rationality and IQ as shown through them making reasonable arguments seems (a) almost entirely unrelated to the concept I am pointing to and (b) rather like debating angels dancing on the head of a pin anyway.
I don’t think that’s a split that exists in reality. There are many, many courses of action one may choose to take as harm reduction, from peer review to binding oneself to solemn promises not to do things even if it seems like a really good idea to listening to criticism and taking it to heart to running experiments. I see no reason to favor two of them as “sides”, except that you happen to favor one of them and Topher happens to favor the other.
I think dialectics may be one of those infinitely expansive explainers with no predictive power. It seems like it would be pretty easy to describe any counterfactual rationalist community in terms of dialectics, even if (especially if) the description of that community was self-contradictory or otherwise internally bogus.
At best it might be a useful taxonomy or framework, but even then I’m not sure how you’d demonstrate utility.
I think treating it as an explainer is a wrong concept– it’s more of a tool. The way I think of it, it’s not “every issue has a thesis and an antithesis that you bring into synthesis”; it’s “when you are thinking about ideas that seem opposed, you will often get useful results by assuming they are, on a fundamental level, not opposed.” It’s not evidence. (Actually, this post doesn’t have any evidence.)
I’ve never liked walking out of a movie being used as an example of the Sunk Cost Fallacy. Complaining about bad movies is almost as fun as enjoying a good movie.
And maybe it’s just me, but pretty much any time I haven’t finished a movie I’ve found myself haunted by that fact, and eventually finish it later. I watched part of “The Brain that Wouldn’t Die” when I was ten and finished it when I was 25.
Likewise, though for a different reason – leaving a movie unfinished is hugely unsatisfying, so it has to be really bad to outweigh that. That’s why I avoid even starting movies that might be bad.
Nothing makes you lose your respect for degrees like ghost writing the term papers for a dozen different majors. The bar is not high.
It’s inappropriate to answer without specifics but the general case of ‘Becoming Sane’ raises alarm bells for me. Micro-optimizations and self-declared munchkinism strike me as either inefficient in terms of gains vs effort or exploiting those around you by defecting from some norm. An example: speeding.
One point of view holds that speed limits are set by uninformed committees to constrain the damage caused by below-average drivers. If you are a skilled driver, there are a variety of techniques you can safely use to decrease the amount of time you spend in traffic; these include swerving between lanes, following closely or merging into small gaps, and significantly exceeding the speed of traffic. Potential downsides are low-probability, particularly the chance of being stopped by police. But this is clearly an exploitative optimization that is only possible by engaging in behavior that others will not, and it directly erodes a system that enables people (with lesser skill) to reach their destinations safely.
As I’ve seen it stated elsewhere, the exceptions can be lumped under the header ‘best practices’. Actions like altering ambient light around your sleep schedule or journaling to track mood over time are great. The extent to which I’m being uncharitable to the Becoming Sane camp is probably related to the extent that they push best practices but that feels more like a Harm Reduction value.