Thanks for the guest spot, buddy - 10/10 would do again.

Jan 31 · Liked by Christopher Brunet

There's just too much to respond to here, so this:

- the folks you describe as "rationalists" have been around forever. They are the true scientists (and heretics), while the rest of the population is much closer to sheep. There's a reason you see "To thine own self be true" carved all over universities.

- there are many Overton Windows. They vary depending on place and time, and are the subject of fierce battles. Such a battle is going on now, with something equivalent to a blitzkrieg having taken place about five years ago. The blitzkrieg has finally been noticed, and the recently installed Wokists are now being challenged. If humanity is lucky they will be soundly defeated.


Very good essay, RC! I think you could extrapolate quite a lot from here about the actual functioning of EA. How well can organizations that care this much about optics, beyond "do we achieve our goal X better or worse?", actually focus on doing things more or less efficiently? Is there a practical difference between standard charitable organizations and EA-branded organizations if EA orgs are this serious about looking good instead of doing good? I always took EA's competitive advantage to be mercilessly measuring and judging interventions, rejecting those that didn't pay off well compared to others, regardless of how nice they sound or look, or how they make you feel. That seems to be entirely off the table these days.

Jan 31 · edited Jan 31 · Liked by Christopher Brunet

Good article.

One way I think you can square the circle is that rationalists are very often "consequentialist" - if I understand correctly, they frequently think that the more rational way to approach an ethical issue is to look at the consequences rather than at the intent or the means.

So assuming the EA people are consequentialist rationalists (and I have no idea), there's an argument that it makes perfect sense to disassociate the movement from Bostrom. Our hypothetical EA poohbahs believe that they personally make better decisions when using a rationalist framework, but that framework has led them to believe that any harm done by putting out an anti-Bostrom statement is outweighed by the extra mosquito nets and water purifiers they'll be able to get placed if they're not known as a bunch of racists.

This is what drives some people crazy about Scott Alexander - they want him to be a crusader for "truth," but if I am reading Scott right, he doesn't hold "everything I say is exactly true" as a value in and of itself, and if he can nudge more people towards what he sees as constructive reasoning and debate processes, he's happy to be nicer than some of his fans would prefer.

(Also, he's naturally nice and it seems to really stress him out not to be nice, but rationalists are allowed to have aesthetic preferences too.)


The description of rationalism here reminds me quite a bit of 4chan back in its heyday. The strategies adopted by the two communities are quite distinct: rationalists sought to remove all emotional inflection from belief by relying on pure logic and evidence, whereas 4chan pursued the same goal by deploying shock humor to batter down emotional programming. The basic ethos, that nothing should be off the table for discussion, was very similar; both communities saw the Overton window as a challenge, rather than a guardrail. Both communities also prioritized the argument over the arguer, with 4chan adopting this principle by default due to the simple fact that it was impossible to identify the arguer - every participant in a thread is simply Anon.

It's no accident that anons referred to their style as weaponized autism. It's also no accident that both communities trend overwhelmingly young, male, and brash. The kinds of smart kids who sit in the back of the class and only raise their hands when they've come up with a way to embarrass the nice lady teacher trying to impart her midwit values in the guise of educating them.

Of course, the big difference is that it would never occur to anons to unperson Bostrom for dropping an n-bomb, since they do it so frequently it's essentially a verbal tic.

Jan 31 · edited Jan 31 · Liked by Christopher Brunet

John Carter and I had a long debate on the second half of the latest Martian Wonderland podcast about the problems with utilitarianism.

ETA podcast link: https://open.substack.com/pub/wonderlandrules/p/martian-wonderland-episode-1?utm_source=direct&r=220r9&utm_campaign=post&utm_medium=web

The main one is that utilitarianism is a philosophy with a particular (and narrow) use case. It's for zero-sum tribal leadership decisions and pretty much nothing else--the plane has crashed, we're starving, do we eat grandma. That kind of thing.

EAs and rationalists have two related psychological problems: rationalists want to apply their shiny new intellectual toy to *everything* because it's cool. EAs go further; they want to *create* situations where we have to decide whether to eat grandma in order to justify making that kind of leadership decision.

Rationalists are *gear queers,* or that guy who whips out his expensive problem-solving gizmo at the slightest provocation.

EAs are a cult of *grandiose narcissists/psychopaths,* who have taken the rationalist movement and applied it to seeking power over others.

Jan 31 · Liked by Christopher Brunet

This probably all qualifies as 'wrong in little granular ways', but:

> Most EAs were rationalists - one movement sprang from personnel acquired from the other.

This seems wrong or at least very misleading. The big rationalist names in EA are broadly 'in EA' in the sense that they identified with the rationalist movement, got involved in rationalist organisations, and because of EAs increasingly taking longtermism and AI seriously, those became viewed as EA organisations.

Most of the big names and shapers in EA who were at explicitly EA organisations from the start (say, Toby Ord, Brian Tomasik, Julia Wise, Will MacAskill, Holden Karnofsky, Elie Hassenfeld, Nick Beckstead, Rob Wiblin, Owen Cotton-Barratt) were involved in rationalism minimally if at all, and inasmuch as they are now, it's mainly because of the same confluence described above.

Contra Ro's comment below (and, perhaps, the description of this post), I would say rationalism *in practice* comprises a much broader set of assumptions than merely 'truth-seeking'. The hyperfocus on Bayes' Theorem is an example - ask a statistician and they'll probably tell you it's one useful tool among many equally important ones. There's a culture of expectation around having 'read the sequences' and internalised their jargon, for example.
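(Editor's aside: for readers who haven't met it, Bayes' Theorem is just the rule for updating a probability on new evidence - the statistician's "one tool among many". A minimal sketch, with made-up numbers purely for illustration:)

```python
# Bayes' theorem: P(H|E) = P(E|H) * P(H) / P(E),
# where P(E) = P(E|H)*P(H) + P(E|not H)*P(not H).

def bayes_update(prior, likelihood, likelihood_if_not):
    """Posterior probability of hypothesis H after observing evidence E."""
    evidence = likelihood * prior + likelihood_if_not * (1 - prior)
    return likelihood * prior / evidence

# A 1% prior; evidence that is 90% likely if H is true, 5% likely otherwise.
posterior = bayes_update(prior=0.01, likelihood=0.90, likelihood_if_not=0.05)
print(round(posterior, 3))  # -> 0.154
```

Even strong evidence moves a small prior only so far - which is the whole point of the rule, and also why it is one tool rather than a worldview.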

There's also a culture of conviction that AI is around the corner and will change everything, and of adherence to certain philosophical schools (preference utilitarianism over hedonistic utilitarianism; value pluralism over value monism; moral antirealism over moral realism; some kind of updateless decision theory over e.g. causal or evidential decision theory; and perhaps the many-worlds interpretation of quantum mechanics over other interpretations; etc.). One might reasonably think any of these things, but it seems to me that their widespread popularity has more to do with selection effects from a certain type of people being drawn to a certain type of people than with 'pure truth-seeking' - since there are many intelligent, truth-seeking philosophers, mathematicians and scientists who reject each of them.


this is a good post

i'm kind of confused what's going on here (this is karl's substack?)

as i assume you know, even a lot of rank-and-file EA ppl are outraged at the way bostrom is being treated

Jan 31 · Liked by Christopher Brunet

I think there's a little danger here of conflating the potential of 'rationalists' to consider propositions that lie outside polite discourse and their taste for considering propositions impolitely (though you did make the distinction).

One of the fundamental features of 'rationalism', I think (and I quote it because I'm used to a philosophical connotation that is very different from what is essentially an identity label for an internet sub-culture) is the denial of context. The denial of the cultural, intellectual and linguistic backgrounds that make belief and discourse possible and meaningful.

On the one hand, there is an unspoken pretence that these individuals are either uniquely immune to ideology, or that they have undertaken the Cartesian project of working out all their beliefs from first principles and have managed to set them all in order. Of course, neither is true, and what you have instead are people with a particularly American brand of edgy utilitarianism pretending that they are not a product of their society's various propaganda forces (as we all are). To an outsider, the emptiness of the label is clear from the highly predictable focus on culture war issues at the expense of more substantive ones.

At the same time, there is a pretence that natural language is or can be a medium of strictly logical discourse. So, when Bostrom makes a statement like the one he made, there's supposed to be a single, literal good faith interpretation, and all its critics are ungenerous ideologues. It's funny that the complaint will often be that condemnations take the statements "out of context" when the root of their problem is that they piggishly choose to never put things in context. There's no attempt at rhetoric, no attempt to anticipate interpretations, no attempt to highlight the social foundations of an idea that can mean that it has very different resonances and consequences for different groups of people.

If their project in the end is just about being able to make pithy, offensive remarks while whining and pointing to reams of secreted caveats, we should welcome their demise and hope their place will be taken by people who can think outside the mainstream but are willing to imagine how the mainstream will respond to ideas that challenge it. In reality, such people do already exist but they constitute a diverse class of writers who don't turn to internet forums to do philosophy role-play.

Feb 6 · edited Feb 6

> places like r/slatestarcodex started actively purging all their wrong-thinkers

That's a rather misleading way to put it, I think. They didn't purge wrong-thinkers; they just decided to stop hosting culture war discussions. Not every forum needs to allow political discussions. It was done because of unmanageable harassment towards Scott. https://slatestarcodex.com/2019/02/22/rip-culture-war-thread/

These discussions were moved to a separate subreddit (/r/themotte), which later moved again (because of Reddit admins) to themotte.org. This was done to disassociate them from Scott; the mods of the new place had been mods on /r/slatestarcodex before.


> I’m really, really interested to see what happens here.

100% they won't defend him

80% they will pretend it didn't happen and quietly downvote people who bring it up
