--------
-[TODO:
+I got a chance to talk to Yudkowsky in person at the 2021 Event Horizon[^event-horizon] Fourth of July party. In accordance with the privacy norms I'm adhering to while telling this Whole Dumb Story, I don't think I should elaborate on what was said. (It felt like a private conversation, even if most of it was outdoors at a party. No one joined in, and if anyone was listening, I didn't notice them.)
- * depressed after talking to him at Independence Day party 2021 (I can mention that, because it was outdoors and probably lots of other people saw us, even if I can't talk about content)
+[^event-horizon]: Event Horizon was the name of a group house in Berkeley.
- * It wouldn't be so bad if he weren't trying to sell himself as a religious leader, and profiting from the conflation of rationalist-someone-who-cares-about-reasoning, and rationalist-member-of-robot-cult
+I will say that it finalized my sense that the vision of rationalism he had preached in the Sequences was dead as a cultural force. I was somewhat depressed for months afterwards.
- * But he does, in fact, seem to actively encourage this conflation (contrast to how the Sequences had a [Litany Against Gurus](https://www.lesswrong.com/posts/t6Fe2PsEwb3HhcBEr/the-litany-against-gurus) these days, with the way he sneers as Earthlings and post-rats)
+It wouldn't be so bad if Yudkowsky weren't trying to sell himself as a _de facto_ religious leader,[^religious-leader] profiting from the conflation of _rationalist_ in the sense of "one who aspires to systematically correct reasoning" and _rationalist_ in the sense of a member of his fan-club/personality-cult.
- * a specific example that made me very angry in September 2021—
+[^religious-leader]: "Religious leader" continues to seem like an apt sociological description, even if [no supernatural claims are being made](https://www.lesswrong.com/posts/u6JzcFtPGiznFgDxP/excluding-the-supernatural).
-https://twitter.com/ESYudkowsky/status/1434906470248636419
-> Anyways, Scott, this is just the usual division of labor in our caliphate: we're both always right, but you cater to the crowd that wants to hear it from somebody too modest to admit that, and I cater to the crowd that wants somebody out of that closet.
-
-Okay, I get that it was meant as humorous exaggeration. But I think it still has the effect of discouraging people from criticizing Scott or Eliezer because they're the leaders of the Caliphate. I spent three and a half years of my life explaining in exhaustive, exhaustive detail, with math, how Scott was wrong about something, no one serious actually disagrees, and Eliezer is still using his social power to boost Scott's right-about-everything (!!) reputation. That seems really unfair, in a way that isn't dulled by "it was just a joke."
+But he does seem to actively encourage this conflation. Contrast the [Litany Against Gurus](https://www.lesswrong.com/posts/t6Fe2PsEwb3HhcBEr/the-litany-against-gurus) from the Sequences with the way he sneers at "post-rationalists"—or even "Earthlings" in general (in contrast to his fictional world of dath ilan). The framing is optimized to delegitimize dissent. [Motte](https://slatestarcodex.com/2014/11/03/all-in-all-another-brick-in-the-motte/): someone who's critical of central "rationalists" like Yudkowsky or Alexander; bailey: someone who's moved beyond reason itself.
-Or [as Yudkowsky put it](https://www.facebook.com/yudkowsky/posts/10154981483669228)—
+One example that made me furious came in September 2021. Yudkowsky, replying to Scott Alexander on Twitter, [wrote](https://twitter.com/ESYudkowsky/status/1434906470248636419):
-> I know that it's a bad sign to worry about which jokes other people find funny. But you can laugh at jokes about Jews arguing with each other, and laugh at jokes about Jews secretly being in charge of the world, and not laugh at jokes about Jews cheating their customers. Jokes do reveal conceptual links and some conceptual links are more problematic than others.
+> Anyways, Scott, this is just the usual division of labor in our caliphate: we're both always right, but you cater to the crowd that wants to hear it from somebody too modest to admit that, and I cater to the crowd that wants somebody out of that closet.
-It's totally understandable to not want to get involved in a political scuffle because xrisk reduction is astronomically more important! But I don't see any plausible case that metaphorically sucking Scott's dick in public reduces xrisk. It would be so easy to just not engage in this kind of cartel behavior!
+I understand, of course, that it was meant as humorous exaggeration. But I think it still has the effect of discouraging people from criticizing Yudkowsky or Alexander because they're the leaders of the Caliphate. I had just spent more than three and a half years of my life[^years-of-my-life] [explaining in](/2018/Feb/the-categories-were-made-for-man-to-make-predictions/) [exhaustive](https://www.lesswrong.com/posts/esRZaPXSHgWzyB2NL/where-to-draw-the-boundaries), [exhaustive](https://www.lesswrong.com/posts/onwgTH6n8wxRSo2BJ/unnatural-categories-are-optimized-for-deception) [detail](https://www.lesswrong.com/posts/vhp2sW6iBhNJwqcwP/blood-is-thicker-than-water), with math, how Alexander was wrong about something that no one serious actually disagrees about, and Yudkowsky was still using his social power to boost Alexander's right-about-everything (!!) reputation. That seemed egregiously unfair, in a way that wasn't dulled by "it was just a joke."
-An analogy: racist jokes are also just jokes. Alice says, "What's the difference between a black dad and a boomerang? A boomerang comes back." Bob says, "That's super racist! Tons of African-American fathers are devoted parents!!" Alice says, "Chill out, it was just a joke." In a way, Alice is right. It was just a joke; no sane person could think that Alice was literally claiming that all black men are deadbeat dads. But, the joke only makes sense in the first place in context of a culture where the black-father-abandonment stereotype is operative. If you thought the stereotype was false, or if you were worried about it being a self-fulfilling prophecy, you would find it tempting to be a humorless scold and get angry at the joke-teller.
+[^years-of-my-life]: I started outlining ["The Categories Were Made for Man to Make Predictions"](/2018/Feb/the-categories-were-made-for-man-to-make-predictions/) in January 2018. I would finally finish ["Blood Is Thicker Than Water"](https://www.lesswrong.com/posts/vhp2sW6iBhNJwqcwP/blood-is-thicker-than-water), following up on the "dolphins are fish" claim, later that same month of September 2021.
-Similarly, the "Caliphate" humor _only makes sense in the first place_ in the context of a celebrity culture where deferring to Yudkowsky and Alexander is expected behavior. (In a way that deferring to Julia Galef or John S. Wentworth is not expected behavior, even if Galef and Wentworth also have a track record as good thinkers.) I think this culture is bad. _Nullius in verba_.
+Or [as Yudkowsky had once put it](https://www.facebook.com/yudkowsky/posts/10154981483669228)—
- * the fact that David Xu interpreted criticism of the robot cult as me going "full post-rat" suggests that Yudkowsky's framing had spilled onto others. (The framing is optimized to delegitimize dissent. Motte: someone who's critical of central rationalists; bailey: someone who's moved beyond reason.)
+> I know that it's a bad sign to worry about which jokes other people find funny. But you can laugh at jokes about Jews arguing with each other, and laugh at jokes about Jews secretly being in charge of the world, and not laugh at jokes about Jews cheating their customers. Jokes do reveal conceptual links and some conceptual links are more problematic than others.
-sneering at post-rats; David Xu interprets criticism of Eliezer as me going "full post-rat"?! 6 September 2021
+It's totally understandable to not want to get involved in a political scuffle because xrisk reduction is astronomically more important! But I don't see any plausible case that metaphorically sucking Scott's dick in public reduces xrisk. It would be so easy to just not engage in this kind of cartel behavior!
-> Also: speaking as someone who's read and enjoyed your LW content, I do hope this isn't a sign that you're going full post-rat. It was bad enough when QC did it (though to his credit QC still has pretty decent Twitter takes, unlike most post-rats).
+An analogy: racist jokes are also just jokes. Alice says, "What's the difference between a black dad and a boomerang? A boomerang comes back." Bob says, "That's super racist! Tons of African-American fathers are devoted parents!!" Alice says, "Chill out, it was just a joke." In a way, Alice is right. It was just a joke; no sane person could think that Alice was literally claiming that all black men are deadbeat dads. But the joke only makes sense in the first place in the context of a culture where the black-father-abandonment stereotype is operative. If you thought the stereotype was false, or if you were worried about it being a self-fulfilling prophecy, you would find it tempting to be a humorless scold and get angry at the joke-teller.
-https://twitter.com/davidxu90/status/1435106339550740482
+Similarly, the "Caliphate" humor only makes sense in the first place in the context of a celebrity culture where deferring to Yudkowsky and Alexander is expected behavior. (In a way that deferring to Julia Galef or John S. Wentworth is not expected behavior, even if Galef and Wentworth also have a track record as good thinkers.) I think this culture is bad. _Nullius in verba_.
-https://twitter.com/zackmdavis/status/1435856644076830721
-> The error in "Not Man for the Categories" is not subtle! After the issue had been brought to your attention, I think you should have been able to condemn it: "Scott's wrong; you can't redefine concepts in order to make people happy; that's retarded." It really is that simple! 4/6
+I don't think the motte-and-bailey concern is hypothetical, either. When I [indignantly protested](https://twitter.com/zackmdavis/status/1435059595228053505) the "we're both always right" remark, one David Xu [commented](https://twitter.com/davidxu90/status/1435106339550740482): "speaking as someone who's read and enjoyed your LW content, I do hope this isn't a sign that you're going full post-rat"—as if my criticism of Yudkowsky's self-serving bluster itself marked me as siding with the "post-rats"!
I once wrote [a post whimsically suggesting that trans women should owe cis women royalties](/2019/Dec/comp/) for copying the female form (as "intellectual property"). In response to a reader who got offended, I [ended up adding](/source?p=Ultimately_Untrue_Thought.git;a=commitdiff;h=03468d274f5) an "epistemic status" line to clarify that it was not a serious proposal.
From my perspective, such advice would be missing the point. [I'm not trying to force through some particular policy.](/2021/Sep/i-dont-do-policy/) Rather, I think I _know some things_ about the world, things I wish someone had told me earlier. So I'm trying to tell others, to help them live in _a world that makes sense_.
-]
-
+------
[David Xu writes](https://twitter.com/davidxu90/status/1436007025545125896) (with Yudkowsky ["endors[ing] everything [Xu] just said"](https://twitter.com/ESYudkowsky/status/1436025983522381827)):
I don't doubt Yudkowsky could come up with some clever casuistry why, _technically_, the text he wrote in 2007 and the text he endorsed in 2021 don't contradict each other. But _realistically_ ... again, no.
-[TODO: elaborate on how 2007!Yudkowsky and 2021!Xu are saying the opposite things if you just take a plain-language reading and consider, not whether individual sentences can be interpreted as "true", but what kind of _optimization_ the text is doing to the behavior of receptive readers]
-
I don't, actually, expect people to spontaneously blurt out everything they believe to be true that Stalin would find offensive. "No comment" would be fine. Even selective argumentation that's _clearly labeled as such_ would be fine. (There's no shame in being an honest specialist who says, "I've mostly thought about these issues through the lens of ideology _X_, and therefore can't claim to be comprehensive; if you want other perspectives, you'll have to read other authors and think it through for yourself.")
What's _not_ fine is selective argumentation while claiming "confidence in [your] own ability to independently invent everything important that would be on the other side of the filter and check it [yourself] before speaking" when you _very obviously have done no such thing_.
✓ "Agreeing With Stalin" intro recap
✓ recap of crimes, cont'd
✓ Dolphin War finish
-_ lead-in to Sept. 2021 Twitter altercation
+✓ lead-in to Sept. 2021 Twitter altercation
_ Michael Vassar and the Theory of Optimal Gossip
_ plan to reach out to Rick / Michael on creepy men/crazy men
_ reaction to Ziz
-Notes from pt. 3 readthrough—
+pt. 3 edit tier—
_ fullname Taylor and Hoffman at start of pt. 3
_ footnote clarifying that "Riley" and Sarah weren't core members of the group, despite being included on some emails?
_ be more specific about Ben's anti-EA and Jessica's anti-MIRI things, perhaps in footnotes
_ do I have a better identifier than "Vassarite"?
_ maybe I do want to fill in a few more details about the Sasha disaster, conditional on what I end up writing regarding Scott's prosecution?—and conditional on my separate retro email—also the Zolpidem thing
-Notes from pt. 4 readthrough—
+pt. 4 edit tier—
_ mention Nick Bostrom email scandal (and his not appearing on the one-sentence CAIS statement)
_ revise and cut words from "bad faith" section since can link to "Assume Bad Faith"
_ cut words from January 2020 Twitter exchange (after war criminal defenses)
_ revise reply to Xu
_ cut lots of words from Scotts comments on Jessica's MIRI post (keep: "attempting to erase the agency", Scott blaming my troubles on Michael being absurd)
-Notes pt. 5—
+pt. 5 edit tier—
+_ quote the specific exchange where I mentioned the 10,000 words of philosophy about how Scott was wrong—obviously the wrong play
_ "as Soares pointed out" needs link
_ can I rewrite to not bury the lede on "intent doesn't matter"?
_ also reference "No such thing as a tree" in Dolphin War section
_ better brief explanation of dark side epistemology
_ "deep causal structure" argument needs to be crystal clear, not sloopy
+_ it's a relevant detail whether the optimization is coming from Nate
+_ probably cut the vaccine polarization paragraphs? (overheard at a party is not great sourcing, even if technically admissible)
+_ elaborate on how 2007!Yudkowsky and 2021!Xu are saying the opposite things if you just take a plain-language reading and consider, not whether individual sentences can be interpreted as "true", but what kind of _optimization_ the text is doing to the behavior of receptive readers
+_ Scott got comas right in the same year as "Categories"
+_ cite Earthling/postrat sneers
+_ cite postYud Tweet
------
With internet available—
+_ Earthling/postrat sneers
_ Is http://www.overcomingbias.com/2011/01/be-a-charity-angel.html the best link for after-the-fact prize funding?
_ P(doom)
_ Michael on OB in 'aught-eight on smart kids internalizing rules meant for the norm of reaction of a dumber population
* Arguing with him resulted in my backing away from pure BBL ("Useful Approximation")
* Later, he became disillusioned with "Blanchardians" and went to war against them. I kept telling him he _is_ a "Blanchardian", insofar as he largely agrees with the main findings (about AGP as a major cause). He corresponded with Bailey and became frustrated with Bailey's rigidity. Blanchardians market themselves as disinterested truthseekers, but a lot of what they're actually doing is providing a counternarrative to social justice.
* There's an analogy between Tail's antipathy for Bailey and my antipathy for Yudkowsky: I still largely agree with "the rationalists", but I object to the way Yudkowsky in particular markets himself as a uniquely sane thinker.
+
+Something he said made me feel spooked that he knew something about risks of future suffering that he wouldn't talk about, but in retrospect, I don't think that's what he meant.
+
+https://twitter.com/zackmdavis/status/1435856644076830721
+> The error in "Not Man for the Categories" is not subtle! After the issue had been brought to your attention, I think you should have been able to condemn it: "Scott's wrong; you can't redefine concepts in order to make people happy; that's retarded." It really is that simple! 4/6