+[^gullible]: Perhaps some readers will consider this post to be more revealing about my character rather than Yudkowsky's: that [everybody knows](https://thezvi.wordpress.com/2019/07/02/everybody-knows/) his bluster wasn't supposed to be taken seriously, so I have no more right to complain about "false advertising" than purchasers of a ["World's Best"](https://en.wikipedia.org/wiki/Puffery) ice-cream who are horrified (or pretending to be) that it may not objectively be the best in the world.
+
+ Such readers may have a point. If _you_ [already knew](https://www.lesswrong.com/posts/tSgcorrgBnrCH8nL3/don-t-revere-the-bearer-of-good-info) that Yudkowsky's pose of epistemic superiority was phony (because everyone knows), then you are wiser than I was. But I think there are a lot of people in the "rationalist" subculture who didn't know (because we weren't anyone). This post is for their benefit.
+
+Perhaps he thinks it's unreasonable for someone to hold him to higher standards. As he [wrote](https://twitter.com/ESYudkowsky/status/1356493883094441984) [on](https://twitter.com/ESYudkowsky/status/1356494097511370752) [Twitter](https://twitter.com/ESYudkowsky/status/1356494399945854976) in February 2021:
+
+> It's strange and disingenuous to pretend that the master truthseekers of any age of history, must all have been blurting out everything they knew in public, at all times, on pain of not possibly being able to retain their Art otherwise. I doubt Richard Feynman was like that. More likely is that, say, he tried to avoid telling outright lies or making public confusions worse, but mainly got by on having a much-sharper-than-average dividing line in his mind between peer pressure against saying something, and that thing being _false_.
+
+I've read _Surely You're Joking, Mr. Feynman_. I cannot imagine Richard Feynman trying to get away with the "sometimes personally prudent and not community-harmful" line. (On the other hand, I couldn't have imagined Yudkowsky doing so in 2009.)
+
+Other science educators in the current year such as [Richard Dawkins](https://www.theguardian.com/books/2021/apr/20/richard-dawkins-loses-humanist-of-the-year-trans-comments), University of Chicago professor [Jerry Coyne](https://whyevolutionistrue.com/2023/08/27/on-helen-joyces-trans/), or ex-Harvard professor [Carole Hooven](https://www.thefp.com/p/carole-hooven-why-i-left-harvard) have been willing to pay political costs to stand up for the scientific truth that biological sex continues to be real even when it hurts people's feelings.
+
+If Yudkowsky thinks he's too important for that (because his popularity with progressives has a much greater impact on the history of Earth-originating intelligent life than Carole Hooven's), that might be the right act-consequentialist decision, but one of the consequences he should be tracking is that he's forfeiting the trust of everyone who expected him to live up to the epistemic standards successfully upheld by UChicago or Harvard biology professors.
+
+It looks foolish in retrospect, but I did trust him much more than that. Back in 2009 when _Less Wrong_ was new, we had a thread of hyperbolic ["Eliezer Yudkowsky Facts"](https://www.lesswrong.com/posts/Ndtb22KYBxpBsagpj/eliezer-yudkowsky-facts) (in the style of [Chuck Norris facts](https://en.wikipedia.org/wiki/Chuck_Norris_facts)). ["Never go in against Eliezer Yudkowsky when anything is on the line"](https://www.greaterwrong.com/posts/Ndtb22KYBxpBsagpj/eliezer-yudkowsky-facts/comment/Aq9eWJmK6Liivn8ND), said one of the facts—and back then, I didn't think I would _need_ to.
+
+Part of what made him so trustworthy back then was that he wasn't asking for trust. He clearly _did_ think it was [unvirtuous to just shut up and listen to him](https://www.lesswrong.com/posts/t6Fe2PsEwb3HhcBEr/the-litany-against-gurus): "I'm not sure that human beings realistically _can_ trust and think at the same time," [he wrote](https://www.lesswrong.com/posts/wustx45CPL5rZenuo/no-safe-defense-not-even-science). He was always arrogant, but it was tempered by the expectation of being held to account by arguments rather than being deferred to as a social superior. "I try in general to avoid sending my brain signals which tell it that I am high-status, just in case that causes my brain to decide it is no longer necessary," [he wrote](https://www.lesswrong.com/posts/cgrvvp9QzjiFuYwLi/high-status-and-stupidity-why).
+
+He visibly [cared about other people being in touch with reality](https://www.lesswrong.com/posts/anCubLdggTWjnEvBS/your-rationality-is-my-business). "I've informed a number of male college students that they have large, clearly detectable body odors. In every single case so far, they say nobody has ever told them that before," [he wrote](https://www.greaterwrong.com/posts/kLR5H4pbaBjzZxLv6/polyhacking/comment/rYKwptdgLgD2dBnHY). (I can testify that this is true: while sharing a car ride with Anna Salamon in 2011, he told me I had B.O.)[^bo-heroism]
+
+[^bo-heroism]: A lot of the epistemic heroism here is just in [noticing](https://www.lesswrong.com/posts/SA79JMXKWke32A3hG/original-seeing) the conflict between Feelings and Truth, between Politeness and Truth, rather than necessarily acting on it. If telling a person they smell bad would predictably meet harsh social punishment, I couldn't blame someone for consciously choosing silence and safety over telling the truth.
+
+ What I can and do blame someone for is actively fighting for Feelings while misrepresenting himself as the rightful caliph of epistemic rationality. There are a lot of trans people who would benefit from feedback that they don't pass but aren't getting that feedback by default. I wouldn't necessarily expect Yudkowsky to provide it. (I don't, either.) I _would_ expect the person who wrote the Sequences not to proclaim that the important thing is the feelings of people who do not like to be tossed into a Smells Bad bucket, which don't bear on the factual question of whether someone smells bad.
+
+That person is dead now, even if his body is still breathing.
+
+I think he knows it. In a November 2022 Discord discussion, [he remarked](/images/yudkowsky-i_might_have_made_a_fundamental_mistake.png):
+
+> I might have made a fundamental mistake when I decided, long ago, that I was going to try to teach people how to reason so that they'd be able to process my arguments about AGI and AGI alignment through a mechanism that would discriminate true from false statements.
+>
+> maybe I should've just learned to persuade people of things instead
+
+I got offended. I felt like a devout Catholic watching the Pope say, "Jesus sucks; I hate God; I never should have told people about God."
+
+Later, I felt the need to write another message clarifying exactly what I found offensive. The problem wasn't the condescension of the suggestion that other people couldn't reason. The problem was that "just learn[ing] to persuade people of things instead" was giving up on the principle that the arguments you use to convince others should be the same as the ones you used to decide which conclusion to argue for. Giving up on that amounted to giving up on the _concept_ of intellectual honesty, choosing instead to become a propaganda AI that calculates what signals to output in order to manipulate an agentless world.
+
+[He put a check-mark emoji reaction on it](/images/davis-amounts-to-giving-up-on-the-concept-of-intellectual-honesty.png), indicating agreement or approval.
+
+If the caliph has lost his [belief in](https://www.lesswrong.com/posts/duvzdffTzL3dWJcxn/believing-in-1) the power of intellectual honesty, I can't necessarily say he's wrong on the empirical merits. It is written that our world is [beyond the reach of God](https://www.lesswrong.com/posts/sYgv4eYH82JEsTD34/beyond-the-reach-of-god); there's no law of physics that says honesty must yield better consequences than propaganda.
+
+But since I haven't lost my belief in honesty, I have the responsibility to point out that the formerly rightful caliph has relinquished his Art and lost his powers.
+
+The modern Yudkowsky [writes](https://twitter.com/ESYudkowsky/status/1096769579362115584):