21 Comments

My "normal person, enjoys politics, not interested in murder" shirt has people asking a lot of questions already answered by my shirt.

author

lmao

May 24, 2023 · edited May 24, 2023 · Liked by Sam Atis

Laughed out loud reading the passage from the EA Forum about the typical EA.

Meanwhile, I wanted to respond to

>> "Nearly all forecasters are paid more by their day jobs to do something other than forecasting."

I don't think this is right. It might seem that way from talking to side-gig forecasters, but I know people working for hedge funds who make a lot more than the average CFA [1] and who follow principles from the book Superforecasting as part of their work. They don't have a lot of time for side-gig forecasting, but they are professional forecasters.

[1] https://finance.uworld.com/cfa/salary/


I agree with a lot of this. There are also just these weird philosophical memes that really irk me--for example, lots of people seem to be preference utilitarians (which is close to the right view, but still... it seems like they're only that because a few high-ranking EAs are), physicalists about consciousness, moral anti-realists, etc. One thing though: I think there's a lot of good about EA--e.g., the fact that it's saved hundreds of thousands of lives, that it's improved the conditions of tons of animals, that EAs are disproportionately vegan, etc. This article seems to be only about the bad things about EA, so I think maybe the title should reflect that. If I were writing an article about only what I didn't like about the Democrats, I probably wouldn't call it "Notes on the Democratic Party."

author

I started writing all the good stuff too, but it was all so boring and obvious.


Surely there's some stuff about EA that is good and not boring or obvious. Like, I think EAs are really willing to tolerate weird ideas, which is good. If you say to normal people that you're concerned about wild animal suffering, they'll freak out--not EAs, who react rationally. And they tend to be pretty smart and informed.

May 31, 2023 · Liked by Sam Atis

> I would love it if there were more EAs who studied literature or History of Art or the humanities, if more EAs had an eye for aesthetics, and so on. Not exactly for impact reasons, just because it would be nice to have a chat at EAG that involved someone recommending a novel [...]

I totally agree with you on this: I just enjoy talking with interesting people! The only aesthetics-related EA post I've seen was still actually about impact, just in a roundabout way. Oh well. If you find any such EA subgroup, please do let me know!


From what I've seen, EAs seem to care both too much and too little about second-order effects.

e.g., the castle helps productivity/aesthetics --> more money raised, so it's justifiable = too focussed on second-order effects

versus

systemic change issues = not focussed enough on second-order effects

I wonder if this split falls roughly along the lines of the bed nets versus extinction-prevention groups. Longer timelines = higher-order effects become justifiable?


"And as I’ve spoken to more people who are sort of on the edge of Effective Altruism, where they’re not quite EAs but also very sympathetic, I’ve realised they’re much more annoyed about the castle than most dyed-in-the-wool EAs seem to realise. The common sentiment is ‘I thought this was about helping some of the world’s poorest people, why the fuck did you buy yourselves a castle?!’."

The first part basically describes me--I first encountered EA ideas perhaps 5 years ago and find most of them very convincing, I contribute a small percentage of my income to charity and use GiveWell as my main method of analysis, and I was even considering going to work for an EA-adjacent organization in my professional life (basically public policy and lobbying) before the org blew up as a result of the FTX fallout--but I do not keep up with any of the EA gossip (for lack of a better term). I had not heard of the castle purchase until this post, and my visceral reaction was also pretty close to what you outlined. It seems absurd on its face to spend that much money on a castle (or abbey) and then justify it by saying there's something special about being in an inspiring location. Maybe there is, but that also sounds kind of woo, which I'd always thought was the opposite of the EA and rationalist lens on the world.


I think the abbey was probably a reasonable purchase, and I think the quoted rationale is a pretty bad argument for it.

My argument in favor is basically "conference venues are useful, owning a conference venue can be cheaper / easier than always renting, and despite what it looks like the cost is about on par with non-castles in the area." (Also, I think "the money isn't gone, it's just locked up in real estate for a bit" is important.)

I'm least certain about the cost being comparable to non-castles, and it might be the most important (to me, at least). It comes down to either investigating real estate prices around Oxford or having enough implicit trust in the people involved that you think they probably did a reasonable job investigating that question. I wouldn't expect anybody outside of EA to have this level of trust in CEA, so I think it's understandable that they strongly distrust the abbey purchase.
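To make the "owning can be cheaper than renting" argument concrete, here is a minimal break-even sketch. Every number in it (purchase price, upkeep, rental spend, resale discount) is a hypothetical placeholder, not a figure from the actual purchase:

```python
# Hypothetical rent-vs-buy break-even sketch for a conference venue.
# None of these figures are real; they only illustrate the structure of
# "the money isn't gone, it's just locked up in real estate for a bit".

purchase_price = 15_000_000     # assumed purchase cost (GBP)
annual_upkeep = 300_000         # assumed maintenance/staffing per year
annual_rental_cost = 1_200_000  # assumed cost of renting venues for the same events
resale_fraction = 0.90          # assume the property resells at 90% of purchase price

def net_cost_of_owning(years: int) -> float:
    """Cash spent on purchase and upkeep, minus the resale value recovered."""
    return purchase_price + annual_upkeep * years - resale_fraction * purchase_price

def net_cost_of_renting(years: int) -> float:
    """Cumulative rental spend over the same period."""
    return annual_rental_cost * years

for years in (5, 10, 15):
    print(f"{years:>2} years: own ~ £{net_cost_of_owning(years):,.0f}, "
          f"rent ~ £{net_cost_of_renting(years):,.0f}")
```

Under these made-up numbers owning wins within a few years; the input the sketch takes entirely on faith is the purchase price being comparable to non-castle venues, which is exactly the part I said I'm least certain about.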


My thoughts:

I appreciate you writing something. A way to avoid deference culture is for people to give their honest opinions.

"I think one of the worst things about EA is that Effective Altruists (and rationalists) have the awful combination of being extremely susceptible to memes, while believing that they’re especially resistant to them."

Soft agree here. I think that once a false meme gets into EA we are surprisingly weak at dealing with it, e.g. SBF was frugal (he wasn't), CO2 levels are really important (are they?), StrongMinds is 10x+ GiveDirectly (it seems the analysis was weaker and many suspected it).

I guess I think that people like you, Sam, are the solution here. For me the frame isn't "EA sucks because it overrates signalling" but "EA overrates signalling; it's my job, as someone who actually knows about this, to write a forum post to correct them - and corrected they likely will be".

I guess I see this as a recurrent pattern: someone identifies what they see as a systemic problem, where I see them as the systemic solution to that problem.

But there is this huge body of knowledge in EA and it's hard to know what's been triple-checked and what hasn't. In my grouchy opinion, this isn't helped by the lack of attempts to pare down the "things we all think". The vague expectation is that everyone reads the forum, which I think is a terrible, vague expectation.

"‘forecasting is extremely important and will make everything better, and we should also spend loads of money on forecasting’."

So I agree that forecasting is perhaps overrated in vibes. But in terms of resources, EA doesn't actually spend that much on forecasting (what, like $10-100mn?). Let's not pretend that vibes matter more than actual resource allocation. And it does feel to me (a forecaster) like some genuine research is happening here and the field is growing. The success of 538, The Economist pushing forecasting, Matt Yglesias, etc. suggest this isn't some isolated thing either.

And as for "The market message is 'don't forecast'", I think this is curious. Michael Story (who Sam quotes) says this whilst running a forecasting org - he thinks there is untapped alpha there. I don't know why there isn't more forecasting in business: the answer is, as you say, probably not that EAs are the only ones who know the answers, but it doesn't seem to be that forecasting is useless either.

I agree that forecasting could tone down the vibes a bit.

On the views EAs supposedly hold:

"No, we should not sterilize people against their will.

No, we should not murder AI researchers. Murder is generally bad. Martyrs are generally effective. Executing complicated plans is generally more difficult than you think, particularly if failure means getting arrested and massive amounts of bad publicity.

Sex and power are very complicated. If you have a power relationship, consider if you should also have a sexual one. Consider very carefully if you have a power relationship: many forms of power relationship are invisible, or at least transparent, to the person with power. Common forms of power include age, money, social connections, professional connections, and almost anything that correlates with money (race, gender, etc). Some of these will be more important than others. If you're concerned about something, talk to a friend who's on the other side of that from you. If you don't have any, maybe just don't.

And yes, also, don't assault people.

Sometimes deregulation is harmful. "More capitalism" is not the solution to every problem.

Very few people in wild animal suffering think that we should go and deliberately destroy the biosphere today.

Racism continues to be an incredibly negative force in the world. Anti-black racism seems pretty clearly the most harmful form of racism for the minority of the world that lives outside Asia.

Much of the world is inadequate and in need of fixing. That EAs have not prioritized something does not mean that it is fine: it means we're busy."

Ehh, this seems a bit lazy of you. What % of people do you think believe any of these? For the sterilisation one, I'd guess 1-5%. Do you actually think that over 10% of EAs believe any of these? I imagine some of your readers will think that significant chunks of EAs hold these views. I don't think many more do than in the communities EA pulls from. Probably fewer than the general public on many.

I think this is an error on your part. You have quite a big following, and either you think these things are true (in which case we disagree) or you don't, in which case you've laundered the idea that EAs are a bit eugenics-y. What % of EAs do you think believe people should be sterilised, or that it's okay to assault people?

"But why is he posting this on the EA forum?' I dunno, cos he wants to discuss and build concensus. Unsure that's helped by shortforms being shared as if they imply something I'm not sure they do.

"Sam Bowman once made the point that surprisingly few EAs seemed to make much fuss about the UK government’s decision to cut foreign aid spending to below the 0.7% target, which is probably true"

Is it true? Or does it just feel true?

- I know an EA who campaigned on this

- Sam Bowman is a talented political operative; while I wish I had his nous, I don't. I thought a bit about the 0.7% campaign but I didn't know what to do.

- What specifically would you have liked to see?

- https://twitter.com/NathanpmYoung/status/1414924399648690179

- There was a forum post about the EAs involved in the campaign to keep it. https://forum.effectivealtruism.org/posts/jLJPsDb77nBE96dEv/the-0-7-campaign-appears-higher-impact-than-we-expected#4__The_government_won__but_it_was_a_hard_fought_victory

"This ‘EAs love systemic change’ article put out by 80,000 hours seems a bit like cope - if we really loved systemic change, maybe we wouldn’t have to say so all the time." No I think this is a bad heuristic. Let's look at reality. EA has coincided with a shift towards aid focused on impact. My sense is that other large philathropies and governments are more likely to talk about impact and seek to quantify it than they once were. Likewise there are significant EA supported chages in animal welfare policy and AI policy. In terms of looking at the world, I think we should not conclude that EAs are unengaged with systemic change.

My theory would be that either EAs merely look like they aren't engaged in systemic change, or the change the movement supports (global dev, animal welfare, tech policy) isn't the kind that critics are interested in.

The Abbey.

So I agree that the Abbey feels like a low-interest-rate phenomenon. I am probably slightly bearish on it given that Owen Cotton-Barratt seemed to have the vision for it and he's now been removed from the project. But had that not been the case, I can see it as reasonable hits-based giving. If there really were a top-notch x-risk conference there, that might make a big difference. And I don't generally think OpenPhil is stupid with this stuff. What odds would you put on it seeming like a good decision in 10 years' time, judged on what was known at the time versus what we'll know in 10 years? Maybe 60% and 30%? Seems non-terrible.

"I would love it if there were more EAs who studied literature or History of Art or the humanities, if more EAs had an eye for aesthetics, and so on. Not exactly for impact reasons, just because it would be nice to have a chat at EAG that involved someone recommending a novel, or basically anything other than talking about the percentage chance that AI kills everyone."

I sense you'll be the first to say that EA should be about impact, not a social club. I do expect there to be effective, altruistic people in the humanities, and I don't want us to overlook them - that's my reason here. I am open to the idea that we should be more "curious" as a community, but I am wary of the "EAs should read more books" thing. EAs should, in my opinion, decide what proportion of their time they want to use effectively and spend that doing the most effective thing. What they do with the rest of their time is none of my business. Let them go to church, or parties, or have families, or read books, or hang out with EAs or not. There is an edge here that says EA should make people *interesting*, and I dislike that. And I dislike the totalising ideal that EA is all there is.

"EAs are often bad in social situations, and say things that might come across as hurtful despite their friendliness."

I think EA is probably about as socially incompetent as one should expect from a community with so many neurodiverse folks. I strive to do better, and I sense that, yes, there is room for some community-wide workshopping here. Sad this happened to your partner.

I genuinely appreciate you writing this, because I think there is an undersupply of stuff like this, especially from those not totally on board. That said, I do disagree with most of the points.

May 25, 2023 · edited May 25, 2023

Thanks for the insightful article, a lot of the points hit (close to) home.

Even though it's from a quote, I want to respond to this, since it's false:

>He is ethically utilitarian and politically centrist; an atheist, but culturally protestant.

EAs are not centrist; they are generally leftish or left: https://forum.effectivealtruism.org/s/FxFwhFG227F6FgnKk/p/AJDgnPXqZ48eSCjEQ#Politics

author

Seems like a very small majority are either centre-left, centre, or centre-right, so it doesn't seem like 'false' is exactly right here, although maybe it's slightly misleading?


I mean this graph, which has 37% "Left" and 40% "Center Left" - an overwhelming majority. Is there another graph?

https://39669.cdn.cke-cs.com/cgyAlfpLFBBiEjoXacnz/images/5eb44961e4881430df248955940bda73cac7fca6cc58559c.png/w_1999

author
May 25, 2023 · edited May 25, 2023

Sure, but if you sum centre left, centre, and centre-right, you also get a majority, because someone being 'centre-left' is consistent both with them being centrist and with them being leftish.

May 25, 2023 · Liked by Sam Atis

That's discarding specific information: the vast majority of people in your "center" explicitly identify as center left.

If there were not also 37% of people explicitly stating they are "left", that interpretation could be true.

More ways to look at it:

- If we forced the people in the "Center left" group to choose between "center" and "left", I doubt all of them would say "center" - so the "left" is probably bigger than the "center".

- If we assign numbers from -2 to 2, left to right, and calculate the average, we get about -1.1 ("Libertarian" and "other" = 0 for simplicity, though it doesn't make much of a difference; see the sketch after this list).

This is comfortably in the left half.

- Even if we bunch the heavily skewed center into one group, we still have 37% identifying as left, which is not as homogeneous as the quote claims (I wouldn't call a group with 37% women a "group of males").
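A minimal sketch of that -1.1 average: only the 37% "Left" and 40% "Center Left" shares come from the survey graph; the remaining buckets are assumed placeholders chosen to sum to 100%.

```python
# Weighted average of political leaning, scored from -2 (left) to +2 (right).
# Only "Left" (37%) and "Center Left" (40%) are from the survey graph;
# the other shares are assumptions for illustration.
shares = {
    "Left": 0.37,          # from the survey
    "Center Left": 0.40,   # from the survey
    "Center": 0.13,        # assumed
    "Center Right": 0.04,  # assumed
    "Right": 0.01,         # assumed
    "Libertarian": 0.03,   # scored 0 for simplicity, as above
    "Other": 0.02,         # scored 0 for simplicity
}
scores = {"Left": -2, "Center Left": -1, "Center": 0,
          "Center Right": 1, "Right": 2, "Libertarian": 0, "Other": 0}

average = sum(shares[k] * scores[k] for k in shares)
print(f"average leaning: {average:+.2f}")  # about -1.1 under these assumptions
```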

author

These are all fair points, but it just seems slightly pedantic to me - the whole thing is a slightly jokey composite that is not meant to be literally accurate. Much of this stuff won't apply to >70% of EAs - we don't really think a huge majority frequently eat Huel, are both white and male, or are named Sam, do we? But I do take the point: EAs are probably more left-leaning than many (especially those on the left who aren't EAs) realise.


Also worth noting that "left" by American standards (at least economically) is barely centrist by European standards. Most EAs seem to be fairly energetically pro-capitalism, and that's not strictly LEFT in most senses of the term.


Thank you! This helped me understand the EA community a little bit better.


I haven't read the Caplan book and didn't know it was an EA thing, but I do subscribe to the education-signalling idea.

I disagree with the point on forecasting being useful. Plenty of people don't do things that are economically useful, and do do things that aren't. Businesses too. Facebook at work is a thing because Facebook is cognitively easy and desirable. Acting on forecasts is risky, hard to think about, and makes you look silly if you're wrong ("you did it because some randoms on the internet told you to?!").

I also only diverge from the composite in the ways you do, plus my name (I'm counting Australia within the predicted geographies; they only left it out to be succinct). Engineering, then a decade later, education at normal universities.


My take on forecasting is that there are already loads of forecasting markets that are just not visible as such - maybe every transaction is forecasting. I buy groceries for a week even though I am not exactly sure what I will want next week. The supermarket stocks food in advance even though it can't be exactly sure what people will buy. And farmers grow some amount of veggies despite not knowing exactly how much market demand there will be this year. All of it is forecasting.
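One way to make "every transaction is forecasting" concrete is the classic newsvendor model (my framing, not the commenter's): the optimal stock level is a quantile of your demand forecast, so choosing how much to stock commits you to a forecast whether you call it one or not. A minimal sketch with made-up numbers:

```python
# Newsvendor-style sketch: a stocking decision implies a demand forecast.
# All numbers are made up for illustration.
from statistics import NormalDist

unit_cost = 0.50   # assumed wholesale cost per item
sale_price = 1.00  # assumed retail price; unsold items assumed worthless

# Critical ratio: stock up to the demand quantile at which the expected
# profit of stocking one more unit hits zero.
critical_ratio = (sale_price - unit_cost) / sale_price  # 0.5 here

demand = NormalDist(mu=100, sigma=20)  # assumed weekly demand forecast
optimal_stock = demand.inv_cdf(critical_ratio)
print(f"stock about {optimal_stock:.0f} units")  # 100: the median of the forecast
```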
