9 Comments

I think really the most important consideration here is not the hypothetical risk that the industrial revolution will end up causing something really bad that produces a lot of suffering and death, but the fact that it's already causing a lot of suffering and death on industrial factory farms. When you look just at the human-level data, the industrial revolution looks pretty good on net, and it's only hypothetical tail risks that could make it turn out net-negative. But when you add the data point that, e.g., 50 billion broiler chickens are killed every year (which adds up to trillions since the advent of industrial farming), after lives that lasted only a few weeks and were basically pure suffering from the start, it's distinctly possible this has all been net-negative. To be clear, I'm not sure I believe the industrial revolution has actually been negative on net, but I think this is the most relevant consideration.

author

Yeah, this is a great point that I hadn't really thought about properly - shamefully, animal suffering is often a blind spot for me.


I'll admit I've thought about it much less than the people who make it their career, but taking advantage of the "Epistemic confidence: Very Low" context, I've always wondered whether these existential risk scenarios truly wipe out humanity, or 'just' push civilization back to the dark ages. For example, some tiny fraction of people are likely to be naturally immune to a given supervirus. The subreddit r/Preppers has about 290k members. Chuck Norris.

There would be a ton of suffering to get there, of course, but if the alternative was never having come out of the dark ages, it seems like a relevant distinction.

author

Yeah, this is definitely a relevant point that I should've thought about more. I think Ord does say that an existential catastrophe doesn't have to actually wipe everyone out - so it would be interesting to see a breakdown of how often the x-risks he discusses lead to a full wipeout as opposed to just a huge population loss. You're right that this has obvious implications for the argument here.


Worth saying that Bostrom's original definition of 'x-risk' was roughly equivalent to 'a risk that could stop us colonising space' (and this was the point). Obviously there are other definitions in use today, but I think the original one has had an effect on the trajectory of the field and on thinking about what's important about x-risk.


Strongly recommend Tyler Cowen's lecture "Is Economic Growth a Moral Imperative?" (https://www.youtube.com/watch?v=EO5jJFpbJvg), which continues extensively where you left off.

author
Mar 13, 2022 (edited)

Thanks! Is there a timestamp for when he talks about x-risks specifically (if he does)? I'll get round to watching the whole thing at some point, but I'd be interested in whether he addresses this directly.


Clever!

Is this a problem of the industrial revolution, or a problem for utilitarianism? It sounds like an applied version of the repugnant conclusion. And it’s for this reason I’ve always been deeply skeptical of utilitarianism. I’ve never seen a good reply to the repugnant conclusion, and yet it is accepted in a “shut up and calculate” way quite broadly in the EA and rationalist communities. If you’re just adding up people, these are the (often counterintuitive) results you get.

author

I was actually thinking about linking this to the repugnant conclusion at the end - maybe that would have made for a more interesting piece - but I'm not sure the conclusion is actually so repugnant; I need to think about it more. As for your claim that this sounds like a problem for utilitarianism, I don't think it's really comparable to something like Pascal's mugging or a utility monster. It's not some tricky logic that gets you to a weird conclusion: the claim that 'it might be a bad thing to massively increase the chance of us all dying, even though there's much less suffering' seems like it could be a genuine challenge to massive, rapid economic growth.
