+I _would_ expect the person who wrote the Sequences not to insist that the important thing is the feelings of human beings who are people, describing reasons someone does not like to be tossed into a Smells Bad bucket, which don't bear on the factual question of whether someone smells bad.
+
+That person is dead now, even if his body is still breathing.
+
+I think he knows it. In a November 2022 Discord discussion, [he remarked](yudkowsky-i_might_have_made_a_fundamental_mistake.png):
+
+> I might have made a fundamental mistake when I decided, long ago, that I was going to try to teach people how to reason so that they'd be able to process my arguments about AGI and AGI alignment through a mechanism that would discriminate true from false statements.
+>
+> maybe I should've just learned to persuade people of things instead
+
+I got offended. I said that I felt like a devout Catholic watching the Pope say, "Jesus sucks; I hate God; I never should have told people about God."
+
+Later, I felt the need to write another message clarifying exactly what I found offensive. The problem wasn't the condescension of the suggestion that other people couldn't reason. People being annoyed at the condescension was fine. The _problem_ was that just learning to persuade people of things instead was giving up on the deep hidden-structure-of-normative-reasoning principle that the arguments you use to convince others should be the same as the ones you used to decide which conclusion to argue for. Giving up on that amounted to giving up on the _concept_ of intellectual honesty, choosing instead to become a propaganda AI that calculates what signals to output in order to manipulate an agentless world.
+
+[He put a check-mark emoji on it](davis-amounts-to-giving-up-on-the-concept-of-intellectual-honesty.png), indicating agreement or approval.
+
+If the caliph has lost his faith in the power of intellectual honesty, I can't necessarily say he's wrong on the empirical merits. It is written that our world is [beyond the reach of God](https://www.lesswrong.com/posts/sYgv4eYH82JEsTD34/beyond-the-reach-of-god); there's no law of physics that says honesty must yield better results than propaganda.
+
+But since I haven't relinquished my faith, I have the responsibility to point it out when he attempts to wield his priestly authority as the author of the Sequences while not being consistently candid in his communications with his followers, hindering their ability to exercise their responsibilities. The modern Yudkowsky [writes](https://twitter.com/ESYudkowsky/status/1096769579362115584):
+
+> When an epistemic hero seems to believe something crazy, you are often better off questioning "seems to believe" before questioning "crazy", and both should be questioned before shaking your head sadly about the mortal frailty of your heroes.
+
+I notice that this advice leaves out a possibility: that the "seems to believe" is a deliberate show (judged to be personally prudent and not community-harmful), rather than a misperception on your part. I am left shaking my head in a [weighted average of](https://www.lesswrong.com/posts/y4bkJTtG3s5d6v36k/stupidity-and-dishonesty-explain-each-other-away) sadness about the mortal frailty of my former hero, and disgust at his craven duplicity. **If Eliezer Yudkowsky can't _unambiguously_ choose Truth over Feelings, _then Eliezer Yudkowsky is a fraud_.**