### Yudkowsky Doubles Down (February 2021)
-I eventually explained what was wrong with Yudkowsky's new arguments at the length of 12,000 words in March 2022's ["Challenges to Yudkowsky's Pronoun Reform Proposal"](/2022/Mar/challenges-to-yudkowskys-pronoun-reform-proposal/),[^challenges-title]. Briefly: given a conflict over pronoun conventions, there's not going to be a "right answer", but we can at least be objective in describing what the conflict is about, and Yudkowsky wasn't doing that. Given that we can't coordinate a switch to universal singular _they_, the pronouns _she_ and _he_ continue to have different meanings in the minds of native English speakers, in the sense that your mind forms different probabilistic expectations of someone taking feminine or masculine pronouns. That's _why_ trans people want to be referred to by the pronoun corresponding to their chosen gender; if there were literally no difference in meaning, there would be no reason to care. Thus, making the distinction on the basis of gender identity rather than sex [has consequences](https://www.lesswrong.com/posts/veN86cBhoe7mBxXLk/categorizing-has-consequences); by proclaiming his "simplest and best protocol" without acknowledging the ways in which it's _not_ simple and not _unambiguously_ the best, Yudkowsky was failing to heed that [policy debates should not appear one-sided](https://www.lesswrong.com/posts/PeSzc9JTBxhaYRp9b/policy-debates-should-not-appear-one-sided).
+I eventually explained what was wrong with Yudkowsky's new arguments at the length of 12,000 words in March 2022's ["Challenges to Yudkowsky's Pronoun Reform Proposal"](/2022/Mar/challenges-to-yudkowskys-pronoun-reform-proposal/).[^challenges-title] Briefly: given a conflict over pronoun conventions, there's not going to be a "right answer", but we can at least be objective in describing what the conflict is about, and Yudkowsky wasn't doing that. Given that we can't coordinate a switch to universal singular _they_, the pronouns _she_ and _he_ continue to have different meanings in the minds of native English speakers, in the sense that your mind forms different probabilistic expectations of someone taking feminine or masculine pronouns. That's _why_ trans people want to be referred to by the pronoun corresponding to their chosen gender; if there were literally no difference in meaning, there would be no reason to care. Thus, making the distinction on the basis of gender identity rather than sex [has consequences](https://www.lesswrong.com/posts/veN86cBhoe7mBxXLk/categorizing-has-consequences); by proclaiming his "simplest and best protocol" without acknowledging the ways in which it's _not_ simple and not _unambiguously_ the best, Yudkowsky was [falsely portraying the policy debate as one-sided](https://www.lesswrong.com/posts/PeSzc9JTBxhaYRp9b/policy-debates-should-not-appear-one-sided).
[^challenges-title]: The title is an allusion to Yudkowsky's ["Challenges to Christiano's Capability Amplification Proposal"](https://www.lesswrong.com/posts/S7csET9CgBtpi7sCh/challenges-to-christiano-s-capability-amplification-proposal).
At this point, some readers might protest that I'm being too uncharitable in harping on the "not liking to be tossed into a [...] Bucket" paragraph. The same post also explicitly says that "[i]t's not that no truth-bearing propositions about these issues can possibly exist." I agree that there are some interpretations of "not lik[ing] to be tossed into a Male Bucket or Female Bucket" that make sense, even though biological sex denialism does not make sense. Given that the author is Eliezer Yudkowsky, should I not give him the benefit of the doubt and assume that he meant to communicate the reading that does make sense, rather than the reading that doesn't make sense?
-I reply: _given that the author is Eliezer Yudkowsky_—no, obviously not. I have been ["trained in a theory of social deception that says that people can arrange reasons, excuses, for anything"](https://www.glowfic.com/replies/1820866#reply-1820866), such that it's informative ["to look at what _ended up_ happening, assume it was the _intended_ result, and ask who benefited."](http://www.hpmor.com/chapter/47) If Yudkowsky just wanted to post about how gendered pronouns are unnecessary and bad as an apolitical matter of language design, he could have written a post just making that narrow point. What ended up happening is that he wrote a post featuring sanctimonious flag-waving about the precious feelings of people "not lik[ing] to be tossed into a Male Bucket or Female Bucket", and concluding with a policy proposal that gives the trans activist coalition everything they want, proclaiming this "the simplest and best protocol" without so much as acknowledging the real arguments on [the other side of the policy debate](https://www.lesswrong.com/posts/PeSzc9JTBxhaYRp9b/policy-debates-should-not-appear-one-sided). I don't think it's crazy to assume this was the intended result, and to ask who benefitted.
+I reply: _given that the author is Eliezer Yudkowsky_—no, obviously not. I have been ["trained in a theory of social deception that says that people can arrange reasons, excuses, for anything"](https://www.glowfic.com/replies/1820866#reply-1820866), such that it's informative ["to look at what _ended up_ happening, assume it was the _intended_ result, and ask who benefited."](http://www.hpmor.com/chapter/47) If Yudkowsky just wanted to post about how gendered pronouns are unnecessary and bad as an apolitical matter of language design, he could have written a post just making that narrow point. What ended up happening is that he wrote a post featuring sanctimonious flag-waving about the precious feelings of people "not lik[ing] to be tossed into a Male Bucket or Female Bucket", and concluding with a policy proposal that gives the trans activist coalition everything they want, proclaiming this "the simplest and best protocol" without so much as acknowledging the real arguments on [the other side of the policy debate](https://www.lesswrong.com/posts/PeSzc9JTBxhaYRp9b/policy-debates-should-not-appear-one-sided). I don't think it's crazy for me to assume this was the intended result, and to ask who benefited.
When smart people act dumb, it's often wise to conjecture that their behavior represents [_optimized_ stupidity](https://www.lesswrong.com/posts/sXHQ9R5tahiaXEZhR/algorithmic-intent-a-hansonian-generalized-anti-zombie)—apparent "stupidity" that achieves a goal through some channel other than their words straightforwardly reflecting reality. Someone who was actually stupid wouldn't be able to generate text so carefully fine-tuned to reach a gender-politically convenient conclusion without explicitly invoking any controversial gender-political reasoning. Where the text is ambiguous about whether biological sex is a real thing that people should be able to talk about, I think the point is to pander to biological sex denialists without technically saying anything unambiguously false that someone could call out as a "lie."
I got offended. I felt like a devout Catholic watching the Pope say, "Jesus sucks; I hate God; I never should have told people about God."
-Later, I felt the need to write another message clarifying exactly what I found offensive. The problem wasn't the condescension of the suggestion that other people couldn't reason. People being annoyed at the condescension was fine. The problem was that "just learn[ing] to persuade people of things instead" was giving up on the principle that the arguments you use to convince others should be the same as the ones you used to decide which conclusion to argue for. Giving up on that amounted to giving up on the _concept_ of intellectual honesty, choosing instead to become a propaganda AI that calculates what signals to output in order to manipulate an agentless world.
+Later, I felt the need to write another message clarifying exactly what I found offensive. The problem wasn't the condescension of the suggestion that other people couldn't reason. The problem was that "just learn[ing] to persuade people of things instead" was giving up on the principle that the arguments you use to convince others should be the same as the ones you used to decide which conclusion to argue for. Giving up on that amounted to giving up on the _concept_ of intellectual honesty, choosing instead to become a propaganda AI that calculates what signals to output in order to manipulate an agentless world.
[He put a check-mark emoji reaction on it](/images/davis-amounts-to-giving-up-on-the-concept-of-intellectual-honesty.png), indicating agreement or approval.
-If the caliph has lost his [belief in](https://www.lesswrong.com/posts/duvzdffTzL3dWJcxn/believing-in-1) the power of intellectual honesty, I can't necessarily say he's wrong on the empirical merits. It is written that our world is [beyond the reach of God](https://www.lesswrong.com/posts/sYgv4eYH82JEsTD34/beyond-the-reach-of-god); there's no law of physics that says honesty must yield better results than propaganda.
+If the caliph has lost his [belief in](https://www.lesswrong.com/posts/duvzdffTzL3dWJcxn/believing-in-1) the power of intellectual honesty, I can't necessarily say he's wrong on the empirical merits. It is written that our world is [beyond the reach of God](https://www.lesswrong.com/posts/sYgv4eYH82JEsTD34/beyond-the-reach-of-god); there's no law of physics that says honesty must yield better consequences than propaganda.
-But since I haven't relinquished my belief in honesty, I have the responsibility to point out that the formerly rightful caliph has relinquished his Art and lost his powers.
+But since I haven't lost my belief in honesty, I have the responsibility to point out that the formerly rightful caliph has relinquished his Art and lost his powers.
The modern Yudkowsky [writes](https://twitter.com/ESYudkowsky/status/1096769579362115584):
(Additionally, I would have hoped that my two previous mentions in the thread of supporting keeping nuclear, bioweapon, and AI secrets had already made it clear that I wasn't against _all_ cases of Society hiding information, but to further demonstrate my ability to generate counterexamples, I mentioned that I would also admit _threats_ as a class of legitimate infohazard: if I'm not a perfect decision theorist and can't trust myself to calculate when I "should" ignore Tony Soprano's demands, I'm better off if he just doesn't have my email address to begin with.)
-----
+
+https://discord.com/channels/936151692041400361/954750671280807968/1210280210730061854
+> I wouldn't say [the history Screen is] just a plot device and I can see the real dath ilan doing it; Earth definitely shouldn't bother considering it, though.
✓ make sure I'm summarizing "policy debates" moral from "Challenges"
✓ revise "too good a writer" to be more explicit "someone could be that naive"
✓ footnote about how I could be blamed for being too credulous?
+_ Stephen Jay Gould
_ edit post to clarify "nudging the cognition"
_ Tail's objection to FFS example
_ Brennan "everyone else should participate" needs more wording adjustments
_ Sept. 2020 clarification noted that a distinction should be made between
_ emphasize that 2018 thread was policing TERF-like pronoun usage, not just disapproving of gender-based pronouns
+_ emphasize that the philosophy-of-language thing was MUCH worse
+_ note the "larger than protons" concession
_ look for a place to link http://benjaminrosshoffman.com/discursive-warfare-and-faction-formation/
_ parenthetical defending literal fraud?
_ link https://thingofthings.substack.com/p/why-callout-posts-often-include-trivial
_ the mailing list post noted it as a "common sexual fantasy"
+_ Feynman, "pretend that the master truthseekers of any age of history"
+_ Dawkins (https://www.theguardian.com/books/2021/apr/20/richard-dawkins-loses-humanist-of-the-year-trans-comments) and Jerry Coyne (https://whyevolutionistrue.com/2023/08/27/on-helen-joyces-trans/) and Hooven (https://www.thefp.com/p/carole-hooven-why-i-left-harvard)
+_ it's gotten worse in the past 10–20 years
+_ social gender, hair color, and "believing in"
_ cite more sneers; use a footnote to pack in as many as possible
-_ Stephen Jay Gould
-_ Dawkins and Jerry Coyne and https://www.thefp.com/p/carole-hooven-why-i-left-harvard
-
+_ "if he decided after all that" exact clause
time-sensitive globals TODOs—
✓ consult Said
Still citing it (13 Feb 24): https://www.lesswrong.com/posts/kSq5qiafd6SqQoJWv/technologies-and-terminology-ai-isn-t-software-it-s-deepware
+Still citing it (22 Feb 24): https://twitter.com/mlbaggins/status/1760710932047577282
+
At least I don't have to link the rebuttal myself every time:
https://www.datasecretslox.com/index.php/topic,1553.msg38755.html
https://old.reddit.com/r/slatestarcodex/comments/10vx6gk/the_categories_were_made_for_man_not_man_for_the/j7k8fjc/
https://jdpressman.com/2023/08/28/agi-ruin-and-the-road-to-iconoclasm.html
-https://www.lesswrong.com/posts/BahoNzY2pzSeM2Dtk/beware-of-stephen-j-gould
-> there comes a point in self-deception where it becomes morally indistinguishable from lying. Consistently self-serving scientific "error", in the face of repeated correction and without informing others of the criticism, blends over into scientific fraud.
-
https://time.com/collection/time100-ai/6309037/eliezer-yudkowsky/
> "I expected to be a tiny voice shouting into the void, and people listened instead. So I doubled down on that."
> "Study science, not just me!" is probably the most important piece of advice Ayn Rand should've given her followers and didn't. There's no one human being who ever lived, whose shoulders were broad enough to bear all the weight of a true science with many contributors.
https://www.lesswrong.com/posts/96TBXaHwLbFyeAxrg/guardians-of-ayn-rand
+
+He's still dunking instead of engaging—
+https://twitter.com/ESYudkowsky/status/1760701916739194949
+> Every time I've raised an inscrutable alien baby to hyperintelligence by giving it shots of heroin whenever it correctly predicts the exact next word spoken by fictional good characters, it's learned to be a genuinely good person inside!
+
+
+-----
+
+> I recently advised somebody to distinguish firmly in her mind between "X is actually true" and "X is the politic thing to say"; I advised drawing a great line and the creation of separate mental buckets. The words you write, taken at face value, seem to be missing some...
+
+https://twitter.com/ESYudkowsky/status/1356493665988829186
+> ...similar distinctions. There's a distinction between honesty in the form of blurting out the whole truth, and honesty in the form of not uttering lies, and a related thing that's not making public confusions *worse* even if you aren't trying to unravel them. There's...
+
+https://twitter.com/ESYudkowsky/status/1356493883094441984
+> ...being honest in the privacy of your own mind, and being honest with your friends, and being honest in public on the Internet, and even if these things are not perfectly uncorrelated, they are also not the same. Seeking truth is the first one. It's strange and disingenuous...
+
+https://twitter.com/ESYudkowsky/status/1356494097511370752
+> ...to pretend that the master truthseekers of any age of history, must all have been blurting out everything they knew in public, at all times, on pain of not possibly being able to retain their Art otherwise. I doubt Richard Feynman was like that. More likely is that, say, ...
+
+https://twitter.com/ESYudkowsky/status/1356494399945854976
+> ...he tried to avoid telling outright lies or making public confusions worse, but mainly got by on having a much-sharper-than-average dividing line in his mind between peer pressure against saying something, and that thing being *false*. That's definitely most of how I do it.
+
+-------
+
+> Anyone who's worked with me on public comms knows that among my first instructions is "We only use valid arguments here." (Which makes hiring writers difficult; they have to know the difference.) I've never called for lying to the public. Label the shit you make up as made-up.
+https://twitter.com/ESYudkowsky/status/1760133310024671583
+
+https://www.lesswrong.com/posts/BahoNzY2pzSeM2Dtk/beware-of-stephen-j-gould
+> there comes a point in self-deception where it becomes morally indistinguishable from lying. Consistently self-serving scientific "error", in the face of repeated correction and without informing others of the criticism, blends over into scientific fraud.
02/19/2024,118811,0\r
02/20/2024,118902,91\r
02/21/2024,118965,63\r
-02/22/2024,,
\ No newline at end of file
+02/22/2024,118965,0\r
+02/23/2024,118965,0\r
+02/24/2024,118965,0\r
+02/25/2024,118965,0\r
+02/26/2024,118965,0\r
+02/27/2024,
\ No newline at end of file
-
-Eliezer Yudkowsky has not been consistently candid in his communications with his followers, hindering their ability to exercise their responsibilities. 1/
-
-
+I've now told enough of my Whole Dumb Story that it's time for the part where I explain how @ESYudkowsky has not been consistently candid in his communications with his followers, hindering their ability to exercise their responsibilities: 1/
+If the world is ending either way, I prefer to die with my commitment to public reason intact. It's been heartbreaking coming to terms with the apparent reality that the person who long ago wrote the Sequences doesn't feel the same way. I thought you deserved to know.
--------
-> Anyone who's worked with me on public comms knows that among my first instructions is "We only use valid arguments here." (Which makes hiring writers difficult; they have to know the difference.) I've never called for lying to the public. Label the shit you make up as made-up.
-https://twitter.com/ESYudkowsky/status/1760133310024671583
----------
-So, I'm almost ready to publish my loud public denunciation of Yudkowsky for intellectual dishonesty (as pt. 4 of my memoir sequence). Is anyone interested in offering advice or "hostile advice" (trying to talk me out of something you see as destructive, _e.g._ infighting while the world is about to end)?
-
-I'm eager for advice because this is a high-stakes political move and needs to be a _flawless performance_. (When you strike at a king, _you must kill him_.) My ideal outcome is for Eliezer to actually learn something, but since that's probably not going to happen (by the Law of Continued Failure), I'll settle for dealing targeted reputational damage.
+So, I'm almost ready to publish pt. 4 of my memoir sequence, which features a loud public denunciation of Yudkowsky for intellectual dishonesty. Is anyone interested in offering advice or "hostile advice" (trying to talk me out of something you see as destructive, _e.g._ kicking up intra-cult infighting while the world is about to end)?
-It's unpleasant for it to come to this, but at this point, I don't think I have any other options besides "lay down and die." I tried the good-faith object-level argument thing for years, and he made it very clear that he reserves the right to _ignore counterarguments on political grounds_ (because that's where his political incentives point), and that he thinks it's insane (his word choice) to get angry at people who are just following their political incentives. At that point, _my_ incentive is to cry "Fraud!" for the benefit of people who still erroneously trust him not to think that intellectual honesty is insane.
+My ideal outcome is for Eliezer to actually learn something, but since that's probably not going to happen (by the Law of Continued Failure), I'll settle for dealing reputational damage.
-(It's really striking how, despite sneering about the lost of art of perspective taking, he acts as if he's incapable of entertaining the perspective under which the published text of the Sequences might have led someone to form higher expectations of him. Oli Habryka gets it! (<https://www.greaterwrong.com/posts/juZ8ugdNqMrbX7x2J/challenges-to-yudkowsky-s-pronoun-reform-proposal/comment/he8dztSuBBuxNRMSY>) Vaniver gets it! (<https://www.greaterwrong.com/posts/yFZH2sBsmmqgWm4Sp/if-clarity-seems-like-death-to-them/comment/dSiBGRGziEffJqN2B>) Eliezer Yudkowsky either doesn't get it, or is pretending not to get it. I almost suspect it's the first one, which is far worse.)
+I thought about taking out a Manifold market for "Will Yudkowsky reply to [post title] in a way that an _Overcoming Bias_ reader in 2008 would consider non-evasive, as assessed by [third party judge]?" and buying some NO. (I think Ben Pace is credibly neutral and would agree to judge.) The idea being that the existence of the market incentivizes honesty in a potential reply, because it would look very bad for him if he tries the kind of high-verbal-IQ ass-covering I've seen from him in the past and the judge rules that a 2008 _Overcoming Bias_ reader wouldn't buy it.
-An key aspect of this situation from my perspective is that it's very obviously a conflict and not an honest disagreement. It's prudent for me to strategize about what his defense is going to be—if any. He _could_ just ignore it. But he does occasionally respond to critics, and I think my voice carries enough intra-cult weight that he'll plausibly want to defend against the reputational damage. We've seen that he's _very_ skilled at high-verbal-IQ ass-covering. Is there anything I can do to preëmpt the ass-covering maneuvers, separately from what's already in the post?
+But I'm leaning against the Manifold gambit because I don't want it to look like I'm expecting or demanding a reply. I've more than used up my lifetime supply of Eliezer-bandwidth. The point is for me to explain to _everyone else_ why I think he's a phony and I don't respect him anymore. If he actively _wants_ to contest my claim that he's a phony—or try to win back my respect—he's welcome to do so. But given that he doesn't give a shit what people like me think of his intellectual integrity, I'm just as happy to prosecute him _in absentia_.
-I thought about taking out a Manifold market for "Will Yudkowsky reply to [post tile] in a way that an _Overcoming Bias_ reader in 2008 would consider non-evasive, as assessed by [third party judge]?" and buying some NO. (I think Ben Pace is credibly neutral and would agree to judge.) The idea being that the existence of the market incentivizes honesty, because it would look very bad for him if he tries to ass-cover and the judge rules that a 2008 _Overcoming Bias_ reader wouldn't buy it.
-
-But I'm leaning against the Manifold gambit because I don't want it look like I'm expecting or demanding a reply. I've more than used up my lifetime supply of Eliezer-bandwidth. The point is for me to explain to _everyone else_ why he's a phony and I don't respect him anymore. If he actively _wants_ to contest my claim that he's a phony—or try to win back my respect—he's welcome to do so. But given that he doesn't give a shit what people like me think of his intellectual integrity (if he wanted to be honest, he could have done it seven years ago), I'm just as happy to prosecute him _in absentia_.
[TODO: reply to message in question]
I do quote this November 2022 message in the post, which I argue doesn't violate consensus privacy norms, due to the conjunction of (a) it not being particularly different-in-character from things he's said in more public venues, and (b) there being _more than 100 people in this server_; I argue that he can't have had a reasonable expectation of privacy (of the kind that would prohibit sharing a personal email, even if the email didn't say anything particularly different-in-character from things the author said in a more public venue). But I'm listening if someone wants to argue that I'm misjudging the consensus privacy norms.
-
------
+My guess is that that's what the mutual best response looks like: I deal reputational damage to him in the minds of people who care about the intellectual standards I'm appealing to, and he ignores it, because those people aren't a sufficiently valuable political resource to him. If there's a Pareto improvement over that, I'm not seeing it?
+
[TODO: at this point, the entire debate tree has been covered so thoroughly that Caliphate loyalists don't have anything left other than, "accusing people of bad faith is mean". E.g., Xu and Kelsey. Did I stutter?]
[TODO: maybe he'll try to spin complaints about the personality cult into more evidence for the personality cult]