25 Comments
author

Thanks for the guest spot, buddy - 10/10 would do again.

Jan 31, 2023 · Liked by Christopher Brunet

There's just too much to respond to here, so this:

- the folks you describe as "rationalists" have been around forever. They are the true scientists (and heretics), while the rest of the population is much closer to sheep. There's a reason you see "To thine own self be true" carved all over universities.

- there are many Overton Windows. They vary depending on place and time, and are the subject of fierce battles. Such a battle is going on now, with something equivalent to a blitzkrieg having taken place about five years ago. The blitzkrieg has finally been noticed, and the recently installed Wokists are now being challenged. If humanity is lucky, they will be soundly defeated.


Very good essay, RC! I think you could extrapolate quite a lot from here about the actual functionality of EA. How well can organizations that care so much about optics, beyond "do we achieve our goal X better or worse?", actually focus on doing something better or worse in terms of efficiency? Is there a practical difference between standard charitable organizations and EA-branded organizations if EA orgs are this serious about looking good instead of doing good? I always took EA's competitive advantage to be mercilessly measuring and judging interventions, rejecting those that didn't pay off well compared to others, regardless of how nice they sound or look, or how they make you feel. Seems that is entirely off the table these days.

Jan 31, 2023 · edited Jan 31, 2023 · Liked by Christopher Brunet

Good article.

One way I think you can square the circle is that rationalists are very often "consequentialist" - if I understand correctly, they frequently think that the more rational way to approach an ethical issue is to look at the consequences rather than the intent or the means.

So assuming the EA people are consequentialist rationalists (and I have no idea), there's an argument that it makes perfect sense to disassociate the movement from Bostrom. Our hypothetical EA poohbahs believe that they personally make better decisions when using a rationalist framework, but that framework has led them to believe that any harm done by putting out an anti-Bostrom statement is outweighed by the extra mosquito nets and water purifiers they'll be able to get placed if they're not known as a bunch of racists.

This is what drives some people crazy about Scott Alexander - they want him to be a crusader for "truth," but if I am reading Scott right, he doesn't hold "everything I say is exactly true" as a value in and of itself, and if he can nudge more people towards what he sees as constructive reasoning and debate processes, he's happy to be nicer than some of his fans would prefer.

(Also, he's naturally nice and it seems to really stress him out not to be nice, but rationalists are allowed to have aesthetic preferences too.)


The description of rationalism here reminds me quite a bit of 4chan back in its heyday. The strategies adopted by the two communities are quite distinct: rationalists sought to remove all emotional inflection from belief by relying on pure logic and evidence, whereas 4chan pursued the same goal by deploying shock humor to batter down emotional programming. The basic ethos, that nothing should be off the table for discussion, was very similar; both communities saw the Overton window as a challenge, rather than a guardrail. Both communities also prioritized the argument over the arguer, with 4chan adopting this principle by default due to the simple fact that it was impossible to identify the arguer - every participant in a thread is simply Anon.

It's no accident that anons referred to their style as weaponized autism. It's also no accident that both communities trend overwhelmingly young, male, and brash. The kinds of smart kids who sit in the back of the class and only raise their hands when they've come up with a way to embarrass the nice lady teacher trying to impart her midwit values in the guise of educating them.

Of course, the big difference is that it would never occur to anons to unperson Bostrom for dropping an n-bomb, since they do it so frequently it's essentially a verbal tic.

Jan 31, 2023 · Liked by Christopher Brunet

This probably all qualifies as 'wrong in little granular ways', but:

> Most EAs were rationalists - one movement sprang from personnel acquired from the other.

This seems wrong, or at least very misleading. The big rationalist names in EA are 'in EA' mainly in the sense that they identified with the rationalist movement and got involved in rationalist organisations, and because EAs increasingly took longtermism and AI seriously, those organisations came to be viewed as EA organisations.

Most of the big names and shapers in EA who were at explicitly EA organisations from the start (say, Toby Ord, Brian Tomasik, Julia Wise, Will MacAskill, Holden Karnofsky, Elie Hassenfeld, Nick Beckstead, Rob Wiblin, Owen Cotton-Barratt) were involved in rationalism minimally, if at all, and inasmuch as they are now, it's mainly because of the same confluence described above.

Contra Ro's comment below (and, perhaps, the description of this post), I would say rationalism *in practice* comprises a much broader set of assumptions than merely 'truth-seeking'. The hyperfocus on Bayes' theorem is an example - ask a statistician and they'll probably tell you it's one useful tool among many equally important ones. There's a culture of expectation around having 'read the Sequences' and internalised their jargon, for example.

There's also a culture of conviction that AI is around the corner and will change everything, and of adherence to certain philosophical schools (preference utilitarianism over hedonistic utilitarianism; value pluralism over value monism; moral antirealism over moral realism; some kind of updateless decision theory over, e.g., causal or evidential decision theory; and perhaps the many-worlds interpretation of quantum mechanics over other interpretations; etc.). One might reasonably think any of these things, but it seems to me that their widespread popularity has more to do with selection effects from a certain type of people being drawn to a certain type of people than with 'pure truth-seeking' - since there are many intelligent, truth-seeking philosophers, mathematicians and scientists who reject each of them.


this is a good post

i'm kind of confused what's going on here (this is karl's substack?)

as i assume you know, even a lot of rank-and-file EA ppl are outraged at the way bostrom is being treated

Feb 6, 2023 · edited Feb 6, 2023

> places like r/slatestarcodex started actively purging all their wrong-thinkers

That's a rather misleading way to put it, I think. They didn't purge wrong-thinkers; they just decided to stop hosting culture war discussions. Not every forum needs to allow political discussions. It was done because of unmanageable harassment towards Scott. https://slatestarcodex.com/2019/02/22/rip-culture-war-thread/

These discussions were moved to a separate subreddit (/r/themotte), which later moved again (because of Reddit admins) to themotte.org. This was done to disassociate them from Scott; the mods of the new place had been mods on /r/slatestarcodex before.


> I’m really, really interested to see what happens here.

100% they won't defend him

80% they will pretend it didn't happen and quietly downvote people who bring it up
