-There are a number of things that could be said to this,[^number-of-things] but most importantly: the battle that matters—the battle with a Right Side and a Wrong Side—isn't "pro-trans" _vs._ "anti-trans". (The central tendency of the contemporary trans rights movement is firmly on the Wrong Side, but that's not the same thing as all trans people as individuals.) That's why Jessica joined our posse to try to argue with Yudkowsky in early 2019. (She wouldn't have, if my objection had been, "trans is Wrong; trans people Bad".) That's why Somni—one of the trans women who [infamously protested the 2019 CfAR reunion](https://www.ksro.com/2019/11/18/new-details-in-arrests-of-masked-camp-meeker-protesters/) for (among other things) CfAR allegedly discriminating against trans women—[understands what I've been saying](https://somnilogical.tumblr.com/post/189782657699/legally-blind).
-
-[^number-of-things]: Note the striking contrast between ["A Rational Argument"](https://www.lesswrong.com/posts/9f5EXt8KNNxTAihtZ/a-rational-argument), in which the Yudkowsky of 2007 wrote that a campaign manager "crossed the line [between rationality and rationalization] at the point where you considered whether the questionnaire was favorable or unfavorable to your candidate, before deciding whether to publish it"; and these 2021 Tweets, in which Yudkowsky seems completely nonchalant about "not have been as willing to tweet a truth helping" one side of a cultural dispute, because "this battle just isn't that close to the top of [his] priority list". Well, sure! Any hired campaign manager could say the same: helping the electorate make an optimally informed decision just isn't that close to the top of their priority list, compared to getting paid.
-
- Yudkowsky's claim to have been focused on nudging people's cognition towards sanity seems incredibly dubious: if you're focused on sanity, you should be spontaneously noticing sanity errors on both sides. (Moreover, if you're living in what you yourself describe as a "half-Stalinist environment", you should expect your social environment to contain proportionately _more_ errors on the "pro-Stalin" side.) As for the rationale that "those people might matter to AGI someday", judging by local demographics, it seems much more likely to apply to trans women themselves, than their critics!
-
-The battle that matters—and I've been _very_ explicit about this, for years—is over this proposition eloquently [stated by Scott Alexander in November 2014](https://slatestarcodex.com/2014/11/21/the-categories-were-made-for-man-not-man-for-the-categories/) (redacting the irrelevant object-level example):
-
-> I ought to accept an unexpected [X] or two deep inside the conceptual boundaries of what would normally be considered [Y] if it'll save someone's life. There's no rule of rationality saying that I shouldn't, and there are plenty of rules of human decency saying that I should.
-
-This is a battle between Feelings and Truth, between Politics and Truth.
-
-In order to take the side of Truth, you need to be able to tell Joshua Norton that he's not actually Emperor of the United States (even if it hurts him). You need to be able to tell a prideful autodidact that the fact that he's failing quizzes in community college differential equations class is evidence that his study methods aren't doing what he thought they were (even if it hurts him). And you need to be able to say, in public, that trans women are male and trans men are female _with respect to_ a female/male "sex" concept that encompasses the many traits that aren't affected by contemporary surgical and hormonal interventions (even if it hurts someone who does not like to be tossed into a Male Bucket or a Female Bucket as it would be assigned by their birth certificate, and—yes—even if it probabilistically contributes to that person's suicide).
-
-If you don't want to say those things because hurting people is wrong, then you have chosen Feelings.
-
-Scott Alexander chose Feelings, but I can't really hold that against him, because Scott is [very explicit about only speaking in the capacity of some guy with a blog](https://slatestarcodex.com/2019/07/04/some-clarifications-on-rationalist-blogging/). You can tell from his writings that he never wanted to be a religious leader; it just happened to him by accident because he writes faster than everyone else. I like Scott. Scott is great. I feel sad that such a large fraction of my interactions with him over the years have taken such an adversarial tone.
-
-Eliezer Yudkowsky ... did not _unambiguously_ choose Feelings. He's been very careful with his words to strategically mood-affiliate with the side of Feelings, without consciously saying anything that he knows to be unambiguously false.
-
-But Eliezer Yudkowsky does not identify as just some guy with a blog. Eliezer Yudkowsky is _absolutely_ trying to be a religious leader. He markets himself as a master of the hidden Bayesian structure of cognition, who ["aspires to make sure [his] departures from perfection aren't noticeable to others"](https://twitter.com/ESYudkowsky/status/1384671335146692608).
-
-In making such boasts, I think Yudkowsky is opting in to being held to higher standards than other mortals. If Scott Alexander gets something wrong when I was trusting him to be right, that's disappointing, but I'm not the victim of false advertising, because Scott Alexander doesn't _claim_ to be anything more than some guy with a blog. If I trusted him more than that, that's on me.
-
-If Eliezer Yudkowsky gets something wrong when I was trusting him to be right, _and_ refuses to acknowledge corrections _and_ keeps inventing new galaxy-brained ways to be wrong in the service of his political agenda of being seen to agree with Stalin without technically lying, then I think I _am_ the victim of false advertising. His marketing bluster was optimized to trick people like me, even if it's my fault for being _dumb enough to believe him_.
-
-Because, I did, actually, trust him. Back in 'aught-nine when _Less Wrong_ was new, we had a thread of hyperbolic ["Eliezer Yudkowsky Facts"](https://www.lesswrong.com/posts/Ndtb22KYBxpBsagpj/eliezer-yudkowsky-facts) (in the style of [Chuck Norris facts](https://en.wikipedia.org/wiki/Chuck_Norris_facts)). And of course, it was a joke, but the hero-worship that made the joke funny was real. (You wouldn't make those jokes for your community college physics teacher, even if he was a decent teacher.)
-
-["Never go in against Eliezer Yudkowsky when anything is on the line."](https://www.lesswrong.com/posts/Ndtb22KYBxpBsagpj/eliezer-yudkowsky-facts?commentId=Aq9eWJmK6Liivn8ND), said one of the facts—and back then, I didn't think I would _need_ to.
-
-[Yudkowsky writes](https://twitter.com/ESYudkowsky/status/1096769579362115584):
-
-> When an epistemic hero seems to believe something crazy, you are often better off questioning "seems to believe" before questioning "crazy", and both should be questioned before shaking your head sadly about the mortal frailty of your heroes.
-
-I notice that this advice leaves out a possibility: that the "seems to believe" is a deliberate show, rather than a misperception on your part. I am left in a [weighted average of](https://www.lesswrong.com/posts/y4bkJTtG3s5d6v36k/stupidity-and-dishonesty-explain-each-other-away) shaking my head sadly about the mortal frailty of my former hero, and shaking my head in disgust at his craven duplicity. If Eliezer Yudkowsky can't _unambiguously_ choose Truth over Feelings, _then Eliezer Yudkowsky is a fraud_.
-
--------
-
-[TODO section existential stakes, cooperation
- * so far, I've been writing this from the perspective of caring about _rationality_ and wanting there to be a rationality movement, the common interest of many causes
- * Singularity stuff scares me
- * e.g., as recently as 2020 I was daydreaming about working for an embryo selection company as part of the "altruistic" (about optimizing the future, rather than about my own experiences) component of my actions
- * if you have short timelines, and want to maintain influence over what big state-backed corporations are doing, self-censoring about contradicting the state religion makes sense
- * you could tell a story in which I'm the villain for undermining Team Singularity with my petty temporal concerns
-]
-
-/2020/Aug/memento-mori/#if-we-even-have-enough-time
-
-> [_Perhaps_, replied the cold logic](https://www.yudkowsky.net/other/fiction/the-sword-of-good). _If the world were at stake._
->
-> _Perhaps_, echoed the other part of himself, _but that is not what was actually happening._
-
-[TODO: social justice and defying threats
- * There's _no story_ in which misleading people about this is on Yudkowsky's critical path!! I can cooperate with censorship that doesn't actively interfere with my battle, but Yudkowsky was interfering
- * I don't pick fights with Paul Christiano, because Paul Christiano doesn't take a shit on my Something to Protect
- * back in 'aught-nine, SingInst had made a point of prosecuting Tyler Emerson, citing decision theory
- * there's a lot of naive misinterpretations of timeless decision theory out there that don't understand the counterfactual dependence thing, but the parsing of social justice as an agentic "threat" to be avoided rather than a rock to be dodged does seem to line up with the fact that people punish heretics more than infidels
- * But it matters where you draw the zero point: is being excluded from the coalition a "punishment" to threaten you out of bad behavior, or is being included a "reward" for good behavior?
- * at least Sabbatai Zevi had an excuse: his choices were to convert to Islam or be impaled https://en.wikipedia.org/wiki/Sabbatai_Zevi#Conversion_to_Islam
-]