[Related: Why you should focus more on talent gaps, not on funding gaps.]
Alice is founding a startup. If all goes well, she will be dominant in the Uber-for-kittens space and make millions of dollars and donate all of it to global poverty charities; if all doesn’t go well, she’s already started mulling over Uber for chihuahuas.
Bob works in HR. He made thirty thousand dollars last year. If all goes well, he will eventually top out at about fifty thousand dollars. He has taken the Giving What We Can pledge and donates ten percent of his income to charity.
A lot of articles about earning to give implicitly talk about people earning very high incomes and donating a high percentage of it: the classic example is a Wall Street trader, while the version ridiculously overrepresented in the EA community is programmers. However, within the EA community, both Bob and Alice would probably characterize themselves as “earning to give”. I do not think this is useful.
Alice and Bob are clearly doing two different things. In fact, Alice probably has a lot more in common with people aiming to do direct work than with Bob. For Alice, the potential to do good is one of the most important criteria in choosing a career; for Bob, it’s secondary to the fact that he likes HR. Alice routinely asks herself about whether becoming vegan will be too much altruism and make her burn out, or whether buying a nice suit will be worth the expense because of her increased credibility. Bob rarely thinks about such tradeoffs.
I propose we reserve “earning to give” for people who deliberately seek out high-earning professions in order to donate, such as Alice, and instead refer to Bob as a “softcore EA”.
A softcore EA is someone who has pledged to donate a high percentage of their income, but otherwise lives a pretty normal life. A hardcore EA is someone who is making their career decisions to deliberately maximize the amount of good they can do in the world.
I am not in love with the hardcore/softcore terminology; it’s the best I’ve thought of, but I feel like it suggests that softcore EAs are worse than hardcore EAs, or “not real EAs” somehow, which is not what I mean at all. (I appreciate suggestions for different terminology, and will edit the post with thanks to anyone who thinks of some.)
In reality, the vast majority of EAs are and will always be softcore EAs, and that’s fine.
80,000 Hours has a career recommender. The first question is “Were you good at math, science, or logic when you studied them?”; the second question is “Are you good at writing or speaking?” If you say “no” on both, you will get one suggestion: policy-oriented civil service. Remember that half of people are below average. And the suggested careers for people who are good at math– startup early employee, economics PhD, data science– imply they want significantly above average math skills.
And even if Bob is really good at writing or speaking… well, Bob has a family. Bob has a mortgage. Bob has a long resume filled mostly with HR positions. Bob cannot actually abandon his life to go off and join party politics or become a foundation grantmaker.
I’ve seen a lot of people take the career recommender quiz and start beating themselves up because the jobs it recommends are things they cannot actually do. That’s okay. Hardcore effective altruism is for a relatively narrow segment of the population– people at the beginning of their careers or who were very lucky in early career selection, who happen to be talented in math or speaking rather than music or athletics, and who are generally privileged enough to get into good fields. 80,000 Hours specializes in recommending jobs for hardcore EAs, because they’re the ones who are making career decisions based on effective altruism. If you’re basically an ordinary person, you are probably going to end up a softcore EA, and you should not feel guilty.
First, of course, because it is ridiculous to feel guilty about not being able to do something you can’t do; you could not be Dustin Moskovitz even if you tried, and so no blame is attached to you for not being Dustin Moskovitz. Second, because saving dozens of lives over the course of your life is actually really amazing and something to be proud of. And third, because softcore EAs are vital to the success of the effective altruism movement.
It is true that most people cannot be hardcore EAs. It is also true that to stick to being an effective altruist– to keep donating even when all your friends are going on exciting awesome vacations, to stand up for policies that help people in the developing world in spite of political pressure, to prevent burnout after two years of seventy-hour weeks at a nonprofit and no end in sight– we need a community that reminds people why we’re EAs. We can’t have that community just from hardcore EAs. There aren’t enough of them. What we need is a bunch of ordinary people, keeping their pledges, and creating a community in which the coolest fucking thing you can do is save a life.
Dustin Moskovitz has done as much good as the rest of the EA movement put together. But without us he wouldn’t have done half as much good.
And furthermore think about the PR. Think about showing other people that you’re a normal person, with no exceptional abilities, who is living an ordinary life with no undue self-sacrifice– and you’re saving children’s lives. We can write a lot of articles about how great EA is, but none of it is as persuasive as a friend saying “actually, I give ten percent of my income to Give Directly! It gives my life a lot of meaning to know that I’m not just working for myself, but to help others.” They can see with their own eyes that giving ten percent is possible without giving up your ability to have Starbucks coffee, movie tickets, or a gym membership. And that kind of quiet demonstration is worth a lot.
As a softcore EA: there is nothing to be ashamed of in being a softcore EA. We are doing tremendous good in the world, both directly through our donations and indirectly through working to create a community of giving. We’re not earning to give, but we have a lot to be proud of.
queenshulamit said:
OK but this terminology doesn’t distinguish between someone like Bob and someone who is like me, who would be hardcore except I’m not good at things? I would be Alice if I could be Alice, whereas Bob doesn’t want to be Alice.
queenshulamit said:
Note to self read full post next time before crying and leaving stupid comment
David Barry said:
I think ‘regular EA’ (or something similar) would be a better term than ‘softcore EA’, to avoid negative connotations.
I’m not sure if I helped inspire this post with some of my comments on Tumblr and Jeff K’s Facebook, but in any case I agree that some sort of separation of terminology is needed, especially on ‘earning to give’. I have my own preferences on what ‘earning to give’ should cover, but at this stage I don’t really care as long as it becomes clear what people are talking about (and who they’re talking to).
rageofthedogstar said:
Local optimization EA vs Global optimization EA? Perhaps abbreviated as LO-EA and GO-EA?
sniffnoy said:
Focus EA vs. persistence EA?
systemicinsanity said:
It might be better for Bob to just be an “Effective Altruist”- the 10% Schelling Point is attractive in part because it’s consistent with a typical comfortable life, but if widely adopted would Solve All the Problems. So we want Bob to basically just seem like the normal bread-and-butter Effective Altruist, because his capacity to be normal is precisely what makes his strategy powerful. We don’t *want* Bob to be part of a special EA subgroup with secret code-rings and fancy hats.
Alice, on the other hand, is really going above and beyond- saving many lives, inspiring others, and earning special status in the community. But the cost is so large that most people either lack the innate talent required, or would prefer to *honor* Alice rather than *be* Alice. If we reserve ‘earning to give’ for this sort of person, the phrase will take on a certain amount of weight.
Consider “Christian” versus “Saint”.
Ben Pace said:
Yes, I think this is a very good analogy.
Everyone can be a Christian.
Very few Christians will be Saints.
And being a Christian is absolutely fine; only a few people in Christianity worry intensely about not doing the most good.
Admittedly, given that EA has a more utilitarian/consequentialist bent, it is harder to make that distinction fuzzy, and I would be happy for the community to try to act more narrative driven (i.e. more story like / virtue ethicsy versus straightforward optimisation) to hide that. For example, we could have an actual ritual by which you become an EA ‘Saint’ and get a special name or title or something.
ozymandias said:
That is not theologically sound! A saint is just a dead person who’s in Heaven; canonized saints are people who have been confirmed to be in Heaven. Most of the church militant will eventually wind up in the church triumphant, one hopes.
systemicinsanity said:
Another way to create a distinction and special honor would be to have an awards ceremony every year, singling out the top [five] altruists that have not already won the award, separate categories for percent of income donated and total cash donated. Possibly at both the local and national level, possibly with other categories emphasizing things like success in filling talent gaps, if you can manage it impartially.
Scale the volume of awards such that the top 10% or so of effective altruists can reasonably expect to get in at some point in their lives. Ozy’s “hardcore” EA’s will then be generally understood as the people in competition for the award, and there are enough of them that we’ll think of them as a type rather than as individuals.
Jubilee said:
Leadership altruism versus the altruist constituency? 80,000 Hours in particular values directly EA-focused businesses, and a major element that holds back EA according to the other post you cited is “people with enough capital or talent to do something Big.” The altruist constituency can pour funding into Effective Altruism, but can’t really make a thing happen on its own aside from clamoring for something, or asking for someone to take point on the subject — and then, assuming that person has the talent to do that and the ability to meaningfully give over the rest of their work effort to that for who knows how many years, they’re part of the leadership.
Rob said:
“Ben_Todd 12 October 2015 03:22:56PM 10 points [-]
Just a few remarks about 80,000 Hours.
Our intention is to eventually provide career advice to all graduates.
However, for the next 1-2 years, it seems far better for us to focus on especially talented graduates in their 20s. For startups the usual advice is to start by having strong appeal in a small market, and this audience is the best fit for us (it’s where we’ve had most success in the past, where we have the strongest advantage over existing advice, and where we can have the largest impact with a small number of users).
Unfortunately, this has the negative side effect of making effective altruism look more elitist, and I don’t see any easy way to avoid that.
Another thing to bear in mind is that 2/3 of the sections of our guide apply to everyone: https://80000hours.org/career-guide/basics/ https://80000hours.org/career-guide/how-to-choose/
We also have this article, which we link to in the intro material: https://80000hours.org/articles/how-to-make-a-difference-in-any-career/
The main bit that’s targeted to talented students are the career profiles: https://80000hours.org/career-guide/profiles/ And the career recommender which is based on them. Even here, we’d like to expand these to include a wider range of careers within the next 6 months.
If you’re talking to someone at 80,000 Hours who might be put off by it seeming overly elitist, stress the general principles (‘learn the basics’ section), approach to choosing a career (‘make a decision’ tool) and broad pathways to impact (building skills, etg, direct work, advocacy), since these apply to everyone.”
http://effective-altruism.com/ea/op/eas_image_problem/5aj
sandorzoo said:
“What we need is a bunch of ordinary people”
I sort of disagree with the sentiment here. It’s certainly true that EA shouldn’t (and can’t) just be a group of uber-geniuses who work hundred-hour weeks. Even Einstein wouldn’t have gotten very far without Besso.
However, what we absolutely do need are people with a high tolerance for weirdness. You can’t change the world in significant ways without doing stuff that’s risky and weird. That’s one of the big reasons why there are so many startups (and EAs) in the Bay Area – San Francisco tolerates weirdness better than most other places.
There is a guy I sort of know, I’ll call him Bob. Bob has done a tremendous amount of good in the world. Bob is by no means a scientific or literary genius – he’s never been to grad school, never proved a theorem, never wrote machine learning software, never started a company, and has no audience of adoring fans. Most of his life is pretty normal. But he a) gets his damn job done, and b) did not freak the hell out the first time someone drove to work in a motorized cupcake. Those two things really count for a lot.
If we don’t select for weirdness-tolerance, EA will devolve into “local mobile social digital education green empowering deep mindful gluten-free healthy exponential innovation empowerment integrated Big Data jobs culture disruption acceleration strategic genetics platform”, as “Singularity” did before it.
Josh Morrison said:
Maybe Alice should be called a fundamentalist EA
Robert Liguori said:
What if Bob buys a lottery ticket? If all goes well, he’ll be able to donate millions to charity, and keep his entire modest salary!
Or perhaps we could realize that there is one hell of a dodge in equating “My startup goes from vision in my head to making millions salably.” to “I remain employed for a decade or two and get either one promotion or several COLA salary increases.”
I’m not sure if I’m fighting the hypothetical or looking too closely at it, but if Alice genuinely thought that founding Uber for kittens was the best way to make a lot of money to donate to charity, I’d strongly question her business acumen. I might be being uncharitable, but I’d frankly assume that this was a person who cared about looking good over doing good, and knew that she’d get socially rewarded for the attempt of starting a start-up even if it failed.
I really would like to see what kind of success rate people are assuming on Uber for Kittens, as well as their priors on startup founding actually being chosen as a career because someone did the math instead of picking it for social reasons.
ozymandias said:
The math.
Robert Liguori said:
Are we reading the same page? The expected payouts of people who get seed capital are in the millions per year. (Due largely to unicorns, as the page notes.) The average odds of a viable company that wants seed capital getting it are perhaps 1%.
1% of $5,000,000 a year is $50,000. That’s a whole heck of a lot smaller than Bob the Like-For-Like Corporate IT Guy’s expected salary, and that assumes a 100% conversion rate from “I have an idea!” to “I have a company and am requesting funding and applying to accelerators!”
Now, I see a few big confounders here. The first is that failed startups aren’t zero-value; you can learn things, make connections, and impress people. If you’re genuinely good at your job and have a startup fail due to unavoidable market conditions, you may well have a much higher chance at getting funding next time. Another big one is actual downside risk; effective altruism isn’t just about funding, but doing good. Pouring your last cent into your startup might not just leave you with no effective giving, but could well impoverish you.
Then there’s also the fact that people can try multiple things; they can have a great idea for a startup, try it out, have it fail, shrug, and instead pursue a conventional career and give that way. Alternatively, they can work a conventional career, carefully hoard and invest their money, retire at 40, then have fun knocking out startup ideas at their own pace and in their own way for the next 20 years, while living off of their accumulated investments and basically tithing 10% of their yearly capital gains to EA.
So, I still don’t feel convinced that going the start-up route really does maximize your expected returns unless you are in a very specific situation (went to the right college, made the right friends, have already-existing social contacts in the funding space, etc.), and I definitely don’t think encouraging people to say “I’m working on Uber for Kittens and I will totally donate millions to charity once I’m rich and successful!” will actually produce many more successful Ubers for Kittens relative to the number of people getting into startups for their social cachet.
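[The back-of-envelope comparison in this comment can be sketched in a few lines. All of the numbers below are the commenter’s illustrative assumptions (a ~1% funding rate, ~$5M/year expected payout conditional on funding, a ~$50K salary), not real data:]

```python
# Sketch of the commenter's expected-value comparison.
# Every figure here is an assumption from the comment, not measured data.

funding_odds = 0.01           # assumed chance a viable startup gets seed capital
funded_payout = 5_000_000     # assumed expected annual payout, conditional on funding

# Unconditional expected value of the startup route, generously assuming
# a 100% conversion rate from "idea" to "company seeking funding":
startup_ev = funding_odds * funded_payout   # 50000.0

salary = 50_000               # Bob's expected salary, taken as given

print(f"startup EV: ${startup_ev:,.0f}/yr vs salary: ${salary:,.0f}/yr")
```

[On these assumptions the two come out roughly equal, which is the commenter’s point: the naive expected value of founding does not obviously beat a stable salary before accounting for downside risk or the option of trying again.]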
person said:
“The average odds of a viable company that wants seed capital getting it is perhaps 1%.”
maybe for Y combinator, but other accelerators accept many more startups… also, Y Combinator’s acceptance rate accounts for many startups that are not viable. and just because your odds are only 1% does not mean it’s worth spending a few months on.
also small businesses can be very profitable even in the absence of venture funding.
EA community startups thus far seem to me like they’ve been doing remarkably well all things considered (Health eFilings, Quixey, altspaceVR, etc.)
Julia said:
80,000 Hours has spelled out that they do not consider Bob earning to give. https://80000hours.org/articles/earning-to-give/
“By ‘earning to give’ we mean that: (i) You deliberately pursue a career that is high-earning (given your options) in order to do good through your donations.
AND
(ii) You donate a very significant proportion of your earnings, where for someone earning more than the average in rich countries, ‘very significant’ means at least 20% of income.”
In the early days of this term I was annoyed that giving 50% of my salary but not changing careers did not qualify me, because I didn’t want there to be any categories of EA that I didn’t qualify for. It seemed especially weird that if I doubled my income and gave 20%, I would be giving less in both absolute and relative terms but would now qualify as “earning to give.”
But then I got over it.
As you say, I think there’s a lot of value in having people who are visible examples of having both normal things and also significant donations as part of their lives. And I’m better at being an example of that than I would have been in a high-earning career.
anon said:
I’ll take it a step farther and say I think that if the first place I’d been exposed to EA ideas was 80,000 Hours I’d have come away with a profoundly negative impression of the movement. It’s extremely elitist and unwelcoming to the great lump of humanity like myself who aren’t immediately positioned to start making lots and lots of money. I’ll be lucky if I can even make enough to support myself once I graduate. Not everyone is in a position to build “career capital,” some of us would settle for having a career at all.
It might be a great resource for people who are already committed EAs, but 80,000 Hours is the exact wrong way to sell the movement to normal people:
“Oh, people at cushy Ivy League schools with six figure jobs want to tell me why I’M the one who could be doing more? Get fucked”
Actually, come to think of it, the indifference toward normal people is distressingly common across the EAscape. Peter Singer himself said donations under 100,000 dollars are purely symbolic and don’t have any real value (which calls into question why, as a consequentialist, he would recommend people make them at all, since that money would have very non symbolic value to people if they kept it instead of donating it).
davidmikesimon said:
Source for quote from Peter Singer? I can’t find it offhand.
rossry said:
Late to the comment thread, but:
I don’t think that “hardcore” vs. “softcore” is carving reality apart at its joints here — the distinction feels more like a Gryffindor-EA vs Hufflepuff-EA difference.
Sci said:
And what of those of us who supposedly have the skills to be hardcore EAs, but have no desire/interest to do so, but would rather follow an egoist career/life path but pledge to give a percentage effectively? Who cares about doing some good but not “the most you can do?”
I’ve been researching this in light of my own desires and job searches. The idea of devoting oneself to “doing good” to such a degree for no reason besides rational moral obligation is abhorrent. Also, I can’t help but see EA as rather all or nothing. Say you give your 10%; if you’re careful with money, you can still have your luxuries, maybe just fewer of them. Under EA’s ethics, though, doing this “causes harm” and you should give that money instead. It’s a slippery slope then to a kind of asceticism.
Regarding EA in general I question the obligation in the first place. Why is it our duty, our obligation, rather than simply a preferable thing (this kind of wording, too, seems to encourage the all-or-nothing view).
Ack, sorry for the long post!
apprenticebard said:
Really glad I found this, as I now feel a lot better about my weird violent hatred reaction towards the 80,000 Hours website (in the sense that I now better understand it and can mostly let it go). Like, I knew on some level that the material was primarily directed at a very elite/affluent/capable crowd, even by American/British standards, but it wasn’t necessarily presented that way (I remember coming away with an impression of “anyone who can get a bachelor’s degree can save X many lives!” where X is large enough to make the statement Clearly False), but I also felt like I, a basically middle-class college student with way-above-average SAT scores, should be in that group. The end result was that I felt Terrible, even though it did give me some useful ideas for making a bigger impact within the area I’m already trying to go into (teaching).
At the same time, I am curious about where you’d draw the distinction between hardcore and softcore EA’s, considering that people have different talents and can also have more than one possible source of income. Like, I’m planning to pay my bills with a teaching job, but I’m also planning to become a fiction writer. Being a fiction writer is one of those things where the vast majority of people who attempt it make very little money (like, they could have made significantly more if they’d used the same time to work a part-time minimum-wage job), but where a very talented and fortunate individual can make a lot of money. Becoming a successful novelist is probably one of the only ways that someone with my base abilities could plausibly make large amounts of money (I guess I could probably also have gone into film or marketing or something, though my political views probably make politics a no-go). It also has a high potential for what 80,000 Hours calls “advocacy”, though for some reason they don’t seem to account for the fact that art (and particularly writing) can sometimes be used to convince people of things. (Though this does seem like a weirdly huge oversight to me? I mean, it’s too obvious for it to be one of those things that people miss just because they don’t have any artists on the team, right? Do they know something I don’t? Is it actually not possible to persuade people to do things with words? If this is so, why on Earth did they write a book?)
80,000 Hours discourages work in the arts on the grounds that competition is fierce and most artists don’t make very much money, but this is also true of entrepreneurs. Setting up a new business is a big risk, and it might even result in a net loss in the end. Given this, I don’t really see a lot of difference between Alice vs someone who looks like Bob but who is also pursuing other potential ways of making money and/or gaining a large audience or platform. Alice’s Uber for Kittens is probably not actually going to make her millions of dollars. Bob Except He’s Writing A Book is also probably not going to make millions of dollars, but it’s still a possibility, and Bob also has a much higher probability of eventually having something to show for his total efforts (assuming Alice doesn’t also have another job while she’s trying to get Uber for Kittens off the ground).
80,000 Hours seems to think that people in almost any career can participate in any of their four areas (earning to give, advocacy, research, or direct work), but that some careers will offer better opportunities than others, either due to the nature of the career itself or due to personal ability. This seems like a much healthier way to look at things compared to dividing people into distinct categories of EA’s, especially for people like me who want to maximize impact but who will be in fairly dire situations if they don’t also manage to take care of themselves. You don’t actually have to quit your job and found a startup in order to be doing the best you can with what you have.
I don’t know that it really matters what we call Alice (who, in terms of actual odds, probably will not be earning very much to give) vs what we call Bob. I just find it a little offputting that for a movement about maximizing the good you can do in the world, a lot of EA stuff seems very focused on people without substantial limitations, and occasionally reassuring people with limitations that they are Not Bad People, with relatively little material about helping people do lots of good despite their limitations. Possibly this is because my sources of moral reasoning feature more stories about (hardcore!) widows giving mites than those of the EA leaders. But still, even just looking at earthly consequences, I get that people with lots of resources can obviously do more per person, but there are more people of basically average abilities than there are people in the most capable 1%. Even if the 1% can still do more overall, I feel like it’s also worth it to reach out to people who are closer to average, and that reassurances that they are Not Bad may not always be as useful as helping them find concrete ways to help more (even in terms of the effect on their own self-esteem). Possibly I should stop complaining about this tendency and start a blog or something to begin fixing it.
Again, though, I really am glad I read this, and I do feel a lot better now. Thanks, Ozy.
ozymandias said:
80,000 Hours combines all of the arts into a single category w.r.t. advocacy, which is probably a mistake. Fiction writing has a much higher potential for advocacy than, say, modern dance.
That said, I think it is true that founding software startups currently has a higher expected value than fiction writing. I ran some calculations a while back and it seems like software engineers consistently make ~ten times more than fiction writers at pretty much every level of success from beginner to Rowling/Gates. (But of course if you have the capability to become a fiction writer and do not have the capability to found a startup, this may not be the most relevant consideration.)