>
> But if somebody's hair color is halfway between two central points? If their civilization has developed stereotypes about hair color they're not comfortable with, such that they feel that the pronoun corresponding to their outward hair color is something they're not comfortable with because they don't fit key aspects of the rest of the stereotype and they feel strongly about that? If they have dyed their hair because of that, or **plan to get hair surgery, or would get hair surgery if it were safer but for now are afraid to do so?** Then it's stupid to try to force people to take complicated positions about those social topics _before they are allowed to utter grammatical sentences_.
-I agree that a language convention in which pronouns map to hair color doesn't seem great. The people in this world should probably coordinate on switching to a better convention, if they can figure out how.
+I agree that a language convention in which pronouns map to hair color seems pretty bad. The people in this world should probably coordinate on switching to a better convention, if they can figure out how.
But taking the convention as given, a demand to be referred to as having a hair color _that one does not have_ seems outrageous to me!
It makes sense to object to the convention forcing a binary choice in the "halfway between two central points" case. That's an example of genuine nuance brought on by a genuine complication to a system that _falsely_ assumes discrete hair colors.
-But "plan to get hair surgery"? "Would get hair surgery if it were safer but for now are afraid to do so"? In what sense do these cases present a challenge to the discrete system and therefore call for complication and nuance? There's nothing ambiguous about these cases: if you haven't, in fact, changed your hair color, then your hair is, in fact, its original color. The decision to get hair surgery does not _propagate backwards in time_. The decision to get hair surgery cannot be _imported from a counterfactual universe in which it is safer_. People who, today, do not have the hair color that they would prefer are, today, going to have to deal with that fact _as a fact_.
+But "plan to get hair surgery"? "Would get hair surgery if it were safer but for now are afraid to do so"? In what sense do these cases present a challenge to the discrete system and therefore call for complication and nuance? There's nothing ambiguous about these cases: if you haven't, in fact, changed your hair color, then your hair is, in fact, its original color. The decision to get hair surgery does not _propagate backwards in time_. The decision to get hair surgery cannot be _imported from a counterfactual universe in which it is safer_. People who, today, do not have the hair color that they would prefer are, today, going to have to deal with that fact _as a fact_.[^pronoun-roles]
+
+[^pronoun-roles]: If the problem is with the pronoun implying stereotypes and social roles in the language as spoken, such that another pronoun should be considered more correct despite the lack of corresponding hair color, you should be making that case on the empirical merits, not appealing to hypothetical surgeries.
Is the idea that we want to use the same pronouns for the same person over time, so that if we know someone is going to get hair surgery—they have an appointment with the hair surgeon at this-and-such date—we can go ahead and switch their pronouns in advance? Okay, I can buy that.
But this potential unification seemed dubious to me, especially if trans women were purported to be "at least 20% of the ones with penises" (!) in some population. After it's been pointed out, it should be a pretty obvious hypothesis that "guy on the Extropians mailing list in 2004 who fantasizes about having a female but 'otherwise identical' copy of himself" and "guy in 2016 Berkeley who identifies as a trans woman" are the _same guy_. So in October 2016, [I wrote to Yudkowsky noting the apparent reversal and asking to talk about it](/2023/Jul/blanchards-dangerous-idea-and-the-plight-of-the-lucid-crossdreamer/#cheerful-price). Because of the privacy rules I'm adhering to in telling this Whole Dumb Story, I can't confirm or deny whether any such conversation occurred.
-Then, in November 2018, while criticizing people who refuse to use trans people's preferred pronouns, Yudkowsky proclaimed that "Using language in a way _you_ dislike, openly and explicitly and with public focus on the language and its meaning, is not lying" and that "you're not standing in defense of truth if you insist on a word, brought explicitly into question, being used with some particular meaning". But _that_ seemed like a huge and surprising reversal from the position articulated in ["37 Ways Words Can Be Wrong"](https://www.lesswrong.com/posts/FaJaCgqBKphrDzDSj/37-ways-that-words-can-be-wrong). After attempts to clarify via email failed, I eventually wrote ["Where to Draw the Boundaries?"](https://www.lesswrong.com/posts/esRZaPXSHgWzyB2NL/where-to-draw-the-boundaries) to explain the relevant error in general terms, and Yudkowsky eventually [clarified his position in September 2020](https://www.facebook.com/yudkowsky/posts/10158853851009228).
+Then, in November 2018, while criticizing people who refuse to use trans people's preferred pronouns, Yudkowsky proclaimed that "Using language in a way _you_ dislike, openly and explicitly and with public focus on the language and its meaning, is not lying" and that "you're not standing in defense of truth if you insist on a word, brought explicitly into question, being used with some particular meaning". But _that_ seemed like a huge and surprising reversal from the position articulated in ["37 Ways Words Can Be Wrong"](https://www.lesswrong.com/posts/FaJaCgqBKphrDzDSj/37-ways-that-words-can-be-wrong).
+
+(And this November 2018 reversal on the philosophy of language was much, much worse than the March 2016 reversal on the psychology of sex, because the latter is a complicated empirical question about which reasonable people might read new evidence differently and change their minds; in contrast, there's no plausible good reason for him to have reversed course on whether words can be wrong.)
+
+After attempts to clarify via email failed, I eventually wrote ["Where to Draw the Boundaries?"](https://www.lesswrong.com/posts/esRZaPXSHgWzyB2NL/where-to-draw-the-boundaries) to explain the relevant error in general terms, and Yudkowsky eventually [clarified his position in September 2020](https://www.facebook.com/yudkowsky/posts/10158853851009228).
But then in February 2021, he reopened the discussion to proclaim that "the simplest and best protocol is, '_He_ refers to the set of people who have asked us to use _he_, with a default for those-who-haven't-asked that goes by gamete size' and to say that this just _is_ the normative definition"; I explained the problems with that post in March 2022's ["Challenges to Yudkowsky's Pronoun Reform Proposal"](/2022/Mar/challenges-to-yudkowskys-pronoun-reform-proposal/) and above.
On "his turn", he comes up with some pompous proclamation that's obviously optimized to make the "pro-trans" faction look smart and good and the "anti-trans" faction look dumb and bad, "in ways that exhibit generally rationalist principles."
-On "my turn", I put in an absurd amount of effort explaining in exhaustive, _exhaustive_ detail why Yudkowsky's pompous proclamation, while [not technically making any unambiguously false atomic statements](https://www.lesswrong.com/posts/MN4NRkMw7ggt9587K/firming-up-not-lying-around-its-edge-cases-is-less-broadly), was substantively misleading compared to what any serious person would say if they were trying to make sense of the world without worrying what progressive activists would think of them.
+On "my turn", I put in an absurd amount of effort explaining in exhaustive, _exhaustive_ detail why Yudkowsky's pompous proclamation, while [perhaps not technically making any unambiguously false atomic statements](https://www.lesswrong.com/posts/MN4NRkMw7ggt9587K/firming-up-not-lying-around-its-edge-cases-is-less-broadly), was substantively misleading compared to what any serious person would say if they were trying to make sense of the world without worrying what progressive activists would think of them.
At the start, I never expected to end up arguing about the minutiæ of pronoun conventions, which no one would care about if contingencies of the English language hadn't made them a Schelling point for things people do care about. The conversation only ended up here after a series of derailings. At the start, I was trying to say something substantive about the psychology of straight men who wish they were women.
What makes all of this especially galling is that _all of my heretical opinions are literally just Yudkowsky's opinions from the 'aughts!_ My thing about how changing sex isn't possible with existing or foreseeable technology because of how complicated humans (and therefore human sex differences) are? Not original to me! I [filled in a few technical details](/2021/May/sexual-dimorphism-in-the-sequences-in-relation-to-my-gender-problems/#changing-sex-is-hard), but again, this was in the Sequences as ["Changing Emotions"](https://www.lesswrong.com/posts/QZs4vkC7cbyjL9XA9/changing-emotions). My thing about how you can't define concepts any way you want because there are mathematical laws governing which category boundaries [compress](https://www.lesswrong.com/posts/mB95aqTSJLNR9YyjH/message-length) your [anticipated experiences](https://www.lesswrong.com/posts/a7n8GdKiAZRX86T5A/making-beliefs-pay-rent-in-anticipated-experiences)? Not original to me! I [filled in](https://www.lesswrong.com/posts/esRZaPXSHgWzyB2NL/where-to-draw-the-boundaries) [a few technical details](https://www.lesswrong.com/posts/onwgTH6n8wxRSo2BJ/unnatural-categories-are-optimized-for-deception), but [_we had a whole Sequence about this._](https://www.lesswrong.com/posts/FaJaCgqBKphrDzDSj/37-ways-that-words-can-be-wrong)
-Seriously, do you think I'm smart enough to come up with all of this independently? I'm not! I ripped it all off from Yudkowsky back in the 'aughts _when he still gave a shit about telling the truth_. (Actively telling the truth, and not just technically not lying.) The things I'm hyperfocused on that he thinks are politically impossible to say in the current year are almost entirely things he already said, that anyone could just look up!
+Seriously, do you think I'm smart enough to come up with all of this independently? I'm not! I ripped it all off from Yudkowsky back in the 'aughts _when he still cared about telling the truth_. (Actively telling the truth, and not just technically not lying.) The things I'm hyperfocused on that he thinks are politically impossible to say in the current year are almost entirely things he already said, that anyone could just look up!
I guess the egregore doesn't have the reading comprehension for that?—or rather, the egregore has no reason to care about the past; if you get tagged by the mob as an Enemy, your past statements will get dug up as evidence of foul present intent, but if you're playing the part well enough today, no one cares what you said in 2009?
### The Battle That Matters
-Yudkowsky [defended his behavior in February 2021](https://twitter.com/ESYudkowsky/status/1356812143849394176):
+In February 2021, Yudkowsky [defended his behavior](https://twitter.com/ESYudkowsky/status/1356812143849394176) (referring back to [his November 2018 "hill of meaning in defense of validity" Twitter statement](https://twitter.com/ESYudkowsky/status/1067183500216811521)):
> I think that some people model civilization as being in the middle of a great battle in which this tweet, even if true, is giving comfort to the Wrong Side, where I would not have been as willing to tweet a truth helping the Right Side. From my perspective, this battle just isn't that close to the top of my priority list. I rated nudging the cognition of the people-I-usually-respect, closer to sanity, as more important; who knows, those people might matter for AGI someday. And the Wrong Side part isn't as clear to me either.
Such readers may have a point. If _you_ [already knew](https://www.lesswrong.com/posts/tSgcorrgBnrCH8nL3/don-t-revere-the-bearer-of-good-info) that Yudkowsky's pose of epistemic superiority was phony (because everyone knows), then you are wiser than I was. But I think there are a lot of people in the "rationalist" subculture who didn't know (because we weren't anyone). This post is for their benefit.
-Because, I did, actually, trust him. Back in 2009 when _Less Wrong_ was new, we had a thread of hyperbolic ["Eliezer Yudkowsky Facts"](https://www.lesswrong.com/posts/Ndtb22KYBxpBsagpj/eliezer-yudkowsky-facts) (in the style of [Chuck Norris facts](https://en.wikipedia.org/wiki/Chuck_Norris_facts)). ["Never go in against Eliezer Yudkowsky when anything is on the line"](https://www.greaterwrong.com/posts/Ndtb22KYBxpBsagpj/eliezer-yudkowsky-facts/comment/Aq9eWJmK6Liivn8ND), said one of the facts—and back then, I didn't think I would _need_ to.
+Perhaps he thinks it's unreasonable for someone to hold him to higher standards. As he [wrote](https://twitter.com/ESYudkowsky/status/1356493883094441984) [on](https://twitter.com/ESYudkowsky/status/1356494097511370752) [Twitter](https://twitter.com/ESYudkowsky/status/1356494399945854976) in February 2021:
+
+> It's strange and disingenuous to pretend that the master truthseekers of any age of history, must all have been blurting out everything they knew in public, at all times, on pain of not possibly being able to retain their Art otherwise. I doubt Richard Feynman was like that. More likely is that, say, he tried to avoid telling outright lies or making public confusions worse, but mainly got by on having a much-sharper-than-average dividing line in his mind between peer pressure against saying something, and that thing being _false_.
+
+I've read _Surely You're Joking, Mr. Feynman_. I cannot imagine Richard Feynman trying to get away with the "sometimes personally prudent and not community-harmful" line. (On the other hand, I couldn't have imagined Yudkowsky doing so in 2009.)
+
+Other science educators in the current year, such as [Richard Dawkins](https://www.theguardian.com/books/2021/apr/20/richard-dawkins-loses-humanist-of-the-year-trans-comments), University of Chicago professor [Jerry Coyne](https://whyevolutionistrue.com/2023/08/27/on-helen-joyces-trans/), and ex-Harvard professor [Carole Hooven](https://www.thefp.com/p/carole-hooven-why-i-left-harvard), have been willing to pay political costs to stand up for the scientific truth that biological sex continues to be real even when it hurts people's feelings.
+
+If Yudkowsky thinks he's too important for that (because his popularity with progressives has much greater impact on the history of Earth-originating intelligent life than Carole Hooven's), that might be the right act-consequentialist decision, but one of the consequences he should be tracking is that he's forfeiting the trust of everyone who expected him to live up to the epistemic standards successfully upheld by UChicago or Harvard biology professors.
+
+It looks foolish in retrospect, but I did trust him much more than that. Back in 2009 when _Less Wrong_ was new, we had a thread of hyperbolic ["Eliezer Yudkowsky Facts"](https://www.lesswrong.com/posts/Ndtb22KYBxpBsagpj/eliezer-yudkowsky-facts) (in the style of [Chuck Norris facts](https://en.wikipedia.org/wiki/Chuck_Norris_facts)). ["Never go in against Eliezer Yudkowsky when anything is on the line"](https://www.greaterwrong.com/posts/Ndtb22KYBxpBsagpj/eliezer-yudkowsky-facts/comment/Aq9eWJmK6Liivn8ND), said one of the facts—and back then, I didn't think I would _need_ to.
Part of what made him so trustworthy back then was that he wasn't asking for trust. He clearly _did_ think it was [unvirtuous to just shut up and listen to him](https://www.lesswrong.com/posts/t6Fe2PsEwb3HhcBEr/the-litany-against-gurus): "I'm not sure that human beings realistically _can_ trust and think at the same time," [he wrote](https://www.lesswrong.com/posts/wustx45CPL5rZenuo/no-safe-defense-not-even-science). He was always arrogant, but it was tempered by the expectation of being held to account by arguments rather than being deferred to as a social superior. "I try in general to avoid sending my brain signals which tell it that I am high-status, just in case that causes my brain to decide it is no longer necessary," [he wrote](https://www.lesswrong.com/posts/cgrvvp9QzjiFuYwLi/high-status-and-stupidity-why).
Second, it's a conditional: _if_ Yudkowsky can't unambiguously choose Truth over Feelings, _then_ he's a fraud. If he wanted to come clean, he could do so at any time.
-He probably won't. We've already seen from his behavior that he doesn't give a shit what people like me think of his intellectual integrity. Why would that change?
+He probably won't. We've already seen from his behavior that he doesn't care what people like me think of his intellectual integrity. Why would that change?
Third, given that "fraud" is a semantically meaningful description rather than an emotive negative evaluation, I should stress that evaluation is a separate step. If being a fraud were necessary for saving the world, maybe being a fraud would be the right thing to do? More on this in the next post. (To be continued.)
✓ revise "too good a writer" to be more explicit "someone could be that naive"
✓ footnote about how I could be blamed for being too credulous?
✓ Stephen Jay Gould
-_ emphasize that the philosophy-of-language thing is much worse
-_ edit post to clarify "nudging the cognition"
+✓ social gender, hair color, and "believing in"
+✓ emphasize that the philosophy-of-language thing was much worse
+✓ Feynman, "pretend that the master truthseekers of any age of history"
+✓ Dawkins and Coyne and Hooven
+✓ edit post to clarify "nudging the cognition"
_ Tail's objection to FFS example
_ Brennan "everyone else should participate" needs more wording adjustments
_ Sept. 2020 clarification noted that a distinction should be made between
_ emphasize that 2018 thread was policing TERF-like pronoun usage, not just disapproving of gender-based pronouns
-_ emphasize that the philosophy-of-language thing was MUCH worse
_ note the "larger than protons" concession
_ look for a place to link http://benjaminrosshoffman.com/discursive-warfare-and-faction-formation/
-_ parenthetical defending literal fraud?
_ the mailing list post noted it as a "common sexual fantasy"
-_ Feynman, "pretend that the master truthseekers of any age of history"
-_ Dawkins (https://www.theguardian.com/books/2021/apr/20/richard-dawkins-loses-humanist-of-the-year-trans-comments) and Jerry Coyne (https://whyevolutionistrue.com/2023/08/27/on-helen-joyces-trans/) and Hooven (https://www.thefp.com/p/carole-hooven-why-i-left-harvard)
-_ it's gotten worse in the past 10–20 years
-_ social gender, hair color, and "believing in"
_ cite more sneers; use a footnote to pack in as many as possible
-
+_ add headers to pt. 2 and link back?
time-sensitive global TODOs—
✓ consult Said
https://twitter.com/ESYudkowsky/status/1356494399945854976
> ...he tried to avoid telling outright lies or making public confusions worse, but mainly got by on having a much-sharper-than-average dividing line in his mind between peer pressure against saying something, and that thing being *false*. That's definitely most of how I do it.
-
--------
-
-> Anyone who's worked with me on public comms knows that among my first instructions is "We only use valid arguments here." (Which makes hiring writers difficult; they have to know the difference.) I've never called for lying to the public. Label the shit you make up as made-up.
-https://twitter.com/ESYudkowsky/status/1760133310024671583
-
-
-> there comes a point in self-deception where it becomes morally indistinguishable from lying.
-
-
-
-I've now told enough of my Whole Dumb Story that it's time for the part where I explain how @ESYudkowsky has not been consistently candid in his communications with his followers, hindering their ability to exercise their responsibilities: 1/
+### Option A (just a link, with just the meme denunciation)
+I've now told enough of my Whole Dumb Story that it's time for the part where I explain how @ESYudkowsky has not been consistently candid in his communications with his followers, hindering their ability to exercise their responsibilities: [link]
-If the world is ending either way, I prefer to die with my committment to public reason intact. It's been heartbreaking coming to terms with the apparent reality that the person who long ago once wrote the Sequences doesn't feel the same way. I thought you deserved to know.
+### Option B (thread with more explicit denunciation)
---------
+I've now told enough of my Whole Dumb Story that it's time for the part where I explain how @ESYudkowsky has not been consistently candid in his communications with his followers, hindering their ability to exercise their responsibilities: [link] 1/7
+The Whole Dumb Story is 87K words so far, which few will read, so in this thread I'll briefly summarize why I think @ESYudkowsky has relinquished his Art and lost his powers (with the disclaimer that this is only a summary & the full Story covers nuances that don't fit here). 2/7
+
+Since 2016, I've been frustrated that Society has apparently decided that men can be women by means of saying so. There's a lot of nuance that I've covered elsewhere, but briefly, in less than 280 characters, my objection is that this just isn't true. 3/7
+
+I know that Yudkowsky knows that it isn't true, because I learned it from him in 2008. But as I document in the post, since 2016, he's repeatedly made public statements that obfuscate and prevaricate on this point, switching to new arguments after I've critiqued the old ones. 4/7
+
+Coming from any other public intellectual, this might not be a big deal. But Yudkowsky makes a lot of grandiose claims to authority, that he's an "epistemic hero", that "too many people think it's unvirtuous to shut up and listen to [him]", &c. https://twitter.com/ESYudkowsky/status/1509944888376188929 5/7
+
+I consider these authority claims to be morally fraudulent. Someone who behaves the way @ESYudkowsky has (as I describe thoroughly in the post) is not an epistemic hero, and I think he knows that. 6/7
+
+If the world is ending either way, I prefer to die with my commitment to public reason intact. It's been heartbreaking coming to terms with the realization that the person who wrote the Sequences apparently doesn't feel the same way. I thought you deserved to know. 7/7
----------
So, I'm almost ready to publish pt. 4 of my memoir sequence, which features a loud public denunciation of Yudkowsky for intellectual dishonesty. Is anyone interested in offering advice or "hostile advice" (trying to talk me out of something you see as destructive, _e.g._ kicking up intra-cult infighting while the world is about to end)?
-My ideal outcome is for Eliezer to actually learn something, but since that's probably not going to happen (by the Law of Continued Failure), I'll settle for dealing reputational damage.
+(This is unpleasant, but at this point, it's my only other option besides lying down and dying. I tried making object-level arguments _first_, for years, and he made it very, very clear that he doesn't see any problem with marketing himself as an epistemic hero while reserving the right to ignore counterarguments on political grounds. What is there left for me to do but cry "Fraud!" at the top of my lungs? Does anyone want to make a case that I _should_ lie down and die, for some reason?)
+
+My ideal outcome is for Eliezer to actually learn something, but since that's probably not going to happen (by the Law of Continued Failure), I'll settle for dealing justified reputational damage.
I thought about taking out a Manifold market for "Will Yudkowsky reply to [post title] in a way that an _Overcoming Bias_ reader in 2008 would consider non-evasive, as assessed by [third-party judge]?" and buying some NO. (I think Ben Pace is credibly neutral and would agree to judge.) The idea being that the existence of the market incentivizes honesty in a potential reply, because it would look very bad for him if he tries the kind of high-verbal-IQ ass-covering I've seen from him in the past and the judge rules that a 2008 _Overcoming Bias_ reader wouldn't buy it.
But I'm leaning against the Manifold gambit because I don't want it to look like I'm expecting or demanding a reply. I've more than used up my lifetime supply of Eliezer-bandwidth. The point is for me to explain to _everyone else_ why I think he's a phony and I don't respect him anymore. If he actively _wants_ to contest my claim that he's a phony—or try to win back my respect—he's welcome to do so. But given that he doesn't give a shit what people like me think of his intellectual integrity, I'm just as happy to prosecute him _in absentia_.
+As for my Twitter marketing strategy, I tried drafting a seven-Tweet thread summary of the reputational attack (because no one is going to read a 16K-word post), but I'm unhappy with how it came out and am leaning towards just doing two Tweets (option C: <https://gist.github.com/zackmdavis/7395e1978c42e0251cd8ae7add406ebc>) rather than trying to summarize in a thread. That's possibly cowardly (pulling my punches because I'm scared), but I think it's classy (because it's better to not try to do complicated things on Twitter; the intellectual and literary qualities that make my punches _hit hard_ for people who have read the Sequences don't easily compress to the 280-character format).
[TODO: reply to message in question]
-I do quote this November 2022 message in the post, which I argue doesn't violate consensus privacy norms, due to the conjunction of (a) it not being particularly different-in-character from things he's said in more public venues, and (b) there bring _more than 100 people in this server_; I argue that he can't have had a reasonable expectation of privacy (of the kind that would prohibit sharing a personal email, even if the email didn't say anything particularly different-in-character from things the author said in a more public venue). But I'm listening if someone wants to argue that I'm misjudging the consensus privacy norms.
+I do quote this November 2022 message in the post, which I argue doesn't violate consensus privacy norms, due to the conjunction of (a) it not being particularly different-in-character from things he's said in more public venues, and (b) there being _more than 100 people in this server_ (not sure about this channel particularly); I argue that he can't have had a reasonable expectation of privacy (of the kind that would prohibit sharing a personal email, even if the email didn't say anything particularly different-in-character from things the author said in a more public venue). But I'm listening if someone wants to argue that I'm misjudging the consensus privacy norms.
------
[TODO: maybe he'll try to spin complaints about the personality cult into more evidence for the personality cult]
+It's really striking how, despite sneering about the lost art of perspective-taking, he acts as if he's incapable of entertaining the perspective under which the published text of the Sequences might have led someone to form higher expectations of him. Oli Habryka gets it! (<https://www.greaterwrong.com/posts/juZ8ugdNqMrbX7x2J/challenges-to-yudkowsky-s-pronoun-reform-proposal/comment/he8dztSuBBuxNRMSY>) Vaniver gets it! (<https://www.greaterwrong.com/posts/yFZH2sBsmmqgWm4Sp/if-clarity-seems-like-death-to-them/comment/dSiBGRGziEffJqN2B>) Eliezer Yudkowsky either doesn't get it, or is pretending not to get it. I almost suspect it's the first one, which is far worse.
+
----------------
Post later (can't afford to spend more Twitter time now)—