EA - The illusion of consensus about EA celebrities by Ben Millwood
Welcome to The Nonlinear Library, where we use Text-to-Speech software to convert the best writing from the Rationalist and EA communities into audio. This is: The illusion of consensus about EA celebrities, published by Ben Millwood on March 17, 2023 on The Effective Altruism Forum.
Epistemic status: speaking for myself and hoping it generalises
I don't like everyone that I'm supposed to like:
I've long thought that [redacted] was focused on all the wrong framings of the issues they discuss,
[redacted] is on the wrong side of their disagreement with [redacted] and often seems to have kind of sloppy thinking about things like this,
[redacted] says many sensible things but has a writing style that I find intensely irritating and struggle to get through; [redacted] is similar, but not as sensible,
[redacted] is working on an important problem, but doing a kind of mediocre job of it, which might be crowding out better efforts.
Why did I redact all those names? Well, my criticisms are often some mixture of:
half-baked; I don't have time to evaluate everyone fairly and deeply, and don't need to in order to make choices about what to focus on,
based on justifications that are not very legible or easy to communicate,
not always totally central to their point or fatal to their work,
kind of upsetting or discouraging to hear,
often not that actionable.
I want to highlight that criticisms like this will usually not surface, and while in individual instances this is sensible, in aggregate it may contribute to a misleading picture of how we view our celebrities and leaders. We end up seeming more deferential and hero-worshipping than we really are. This is bad for two reasons:
it harms our credibility in the eyes of outsiders (or insiders, even) who have negative views of those people,
it projects the wrong expectation to newcomers who trust us and want to learn or adopt our attitudes.
What to do about it?
I think "just criticise people more" in isolation is not a good solution. People, even respected people in positions of leadership, often seem to find posting on the Forum a stressful experience already, and tipping that balance in a more brutal direction seems likely to cost more than it gains.
I think you could imagine major cultural changes around how people give and receive feedback that could make this better: mitigating catastrophising about negative feedback, and ensuring people feel safe to risk making mistakes or exposing their oversights. But those seem to me like heavy, ambitious pieces of cultural engineering that require a lot of buy-in to get going, and even if successful may incur ongoing frictional costs. Here are smaller, simpler things that could help:
Write a forum post about it (this one's taken, sorry),
Make disagreements more visible and more legible, especially among leaders or experts. I really enjoyed the debate between Will MacAskill and Toby Ord in the comments of Are we living at the most influential time in history? – you can't come away from that discussion thinking "oh, whatever the smart, respected people in EA think must be right", because either way at least one of them will disagree with you!
There's a lot of disagreement on the Forum all the time, of course, but I have a (somewhat unfair) impression that the famous people deposit their work into the Forum and leave for higher pursuits, and then we in the peanut gallery argue over it.
I'd love it if there were (say) a document out there that Redwood Research and Anthropic both endorsed, that described how their agendas differ and what underlying disagreements lead to those differences.
Make sure people incoming to the community, or at the periphery of the community, are inoculated against this bias, if you spot it. Point out that people usually have a mix of good and bad ideas. Have some go-to examples of respected people's blind spots or mistakes, at least as they appear to you. (Even if you never end up explaining them to anyone, it's probably goo...