In the comments of my last post on sacred values, I noticed a lot of people thinking that they should transition from using sacred values to consciously-held ethical injunctions, or that sacred values are utterly unnecessary as long as you can think “torture is wrong even when I hear a good argument that torture is right”, etc. I think that this is ridiculous. Sacred values are great! I am personally working on cultivating several myself.

Of course, sacred values get a bad rap, because a lot of people have stupid sacred values. The problem with “it is unthinkable not to make every effort to extend a person’s life, even for a few days!” is not that it’s a sacred value; it’s that “spend all available resources to extend a person’s life, even when you have a good argument that you shouldn’t” is a terrible ethical injunction. Such considerations do not apply to sensible ethical injunctions like “don’t torture people” or “don’t deceive yourself” or “if you are Ozy, a known agoraphobe, go outside when you have planned to go outside.”

(Notably, “how important is the thing” is actually unrelated to how good a sacred value is. A good sacred value is one where, if you find yourself with an argument that Thing is a good idea after all, it’s more likely that you’ve made a mistake than that Thing is actually a good idea. Some ethical injunctions– such as me going outside even when I’m rationalizing why I shouldn’t have to– are about pretty minor things.)

Imagine the famous case of the dragon in one’s garage. In most cases, your system 1 and system 2 are aligned: you believe the propositional statement “there is no dragon in my garage” (system 2), and when you visualize your garage there isn’t any dragon in it (system 1). Sometimes this situation might get out of whack: you might believe the statement “there is a dragon in my garage”, while imagining your garage to be empty; on the other hand, you might believe the statement “there is no dragon in my garage” while on a certain level expecting that there’s a fire-breathing lizard inside it.

In the latter case, I think, you can be said to “not really believe” that there isn’t a dragon in your garage. You “really believe” when your system 1 and system 2 match up; if they’re mixed up– if you think there’s no dragon in your garage while your gut is like “RUN AWAY RUN AWAY SCARY BIG LIZARD”– you don’t really believe there’s no dragon in your garage. (The former case– system 2 says there’s a dragon, system 1 says no– is typically called “belief in belief”; the idea is that you believe that you believe there’s a dragon, but you don’t actually believe there’s a dragon.)

So, let’s say that you accept the ethical injunction ‘you should never torture anyone, even if torture seems like a good idea, because it’s more likely that you’ve made a mistake than that torture is a good idea’, but you don’t treat it as a sacred value. I think you can be said to believe that you believe that ‘torture is bad’ is an ethical injunction, but you don’t really believe it– any more than the person who expects their garage to be empty really believes there’s a dragon there. If you actually believed it, your system 1 would get with the program. Now, there are certain advantages to believing you believe ethical injunctions without actually believing them: most notably, you get to signal that you’re a tough-minded consequentialist who pushes fat men in front of trolleys and tortures terrorists for the greater good.

But there are also advantages to, you know, actually believing ethical injunctions. For one thing, when you find yourself in a stressful situation, your system 1 often takes precedence over your system 2. When it’s time to open that garage door, if your system 1 is screaming “DRAGON! DRAGON! THERE’S A FIRE-BREATHING DRAGON AND IT’S GOING TO EAT ME!”, it is significantly more difficult to open the damn garage door. Even if you have no problems thinking “it is desirable to open the garage door, so I can get to my car, because dragons don’t exist” when you are in your living room safely away from any potential dragons, when you’re actually there you will quite often find yourself turning around and running instead. Similarly, when your system 2 is going “no torture! ever!” and your system 1 is going “surely if it were JUSTIFIED it would be okay, and I can come up with a dozen justifications, I am so good at coming up with justifications for things I wanted to do anyway”, then it’s a lot harder not to torture people.

You may feel that you would have no difficulty resisting the temptation to torture someone. I would propose that you are potentially falling victim to restraint bias, and ask you to reflect on your no doubt spotless history of drinking exactly as much as you intended to, saying ‘no’ to all high-pressure sales tactics, never cursing out drivers on the freeway, responding in a calm and mature manner to your romantic partner at all times, and turning in all your work a week before deadline. Of course, if you are such a paragon of self-control, you may certainly choose not to have sacred values; but the rest of us mere mortals need all the help we can get.