-> What if self-deception helps us be happy? What if just running out and overcoming bias will make us—gasp!—_unhappy?_ Surely, _true_ wisdom would be _second-order_ rationality, choosing when to be rational. That way you can decide which cognitive biases should govern you, to maximize your happiness.
->
-> Leaving the morality aside, I doubt such a lunatic dislocation in the mind could really happen.
->
-> [...]
->
-> For second-order rationality to be genuinely _rational_, you would first need a good model of reality, to extrapolate the consequences of rationality and irrationality. If you then chose to be first-order irrational, you would need to forget this accurate view. And then forget the act of forgetting. I don't mean to commit the logical fallacy of generalizing from fictional evidence, but I think Orwell did a good job of extrapolating where this path leads.
->
-> You can't know the consequences of being biased, until you have already debiased yourself. And then it is too late for self-deception.
->
-> The other alternative is to choose blindly to remain biased, without any clear idea of the consequences. This is not second-order rationality. It is willful stupidity.
->
-> [...]
+I wasn't sure what my wordcount and "diplomacy" "budget limits" for the server were, but I couldn't let go; I kept the thread going on subsequent days. There was something I felt I should be able to convey, if I could just find the right words.
+
+When [Word of God](https://tvtropes.org/pmwiki/pmwiki.php/Main/WordOfGod) says, "trying to prevent most [_X_] from discovering what they are [...] continues to strike me as a basically reasonable policy option", then, separately from the particular value of _X_, I expected people to jump out of their chairs and say, "No! This is wrong! Morally wrong! People can stand what is true about themselves, because they are already doing so!"
+
+And to the extent that I was the only person jumping out of my chair, and there was a party-line response of the form, "Ah, but if it's been decreed by authorial fiat that these-and-such probabilities and utilities take such-and-these values, then in this case, self-knowledge is actually bad under the utilitarian calculus," I wasn't disputing the utilitarian calculus. I was wondering—here I used the "🐛" bug emoji customarily used in Glowfic culture to indicate uncertainty about the right words to use—_who destroyed your souls?_
+
+Yudkowsky replied:
+
+> it feels powerfully relevant to me that the people of whom I am saying this _are eliezera_. I get to decide what they'd want because, unlike with Earth humans, I get to put myself in their shoes. it's plausible to me that the prediction markets say that I'd be sadder if I was exposed to the concept of sadism in a world with no masochists. if so, while I wouldn't relinquish my Art and lose my powers by trying to delude myself about that once I'd been told, I'd consider it a friendly act to keep the info from me—_because_ I have less self-delusional defenses than a standard Earthling, really—and a hostile act to tell me; and if you are telling me I don't get to make that decision for myself because it's evil, and if you go around shouting it from the street corners in dath ilan, then yeah I think most cities don't let you in.
+
+I wish I had thought to ask if he'd have felt the same way in 2008.
+
+Ajvermillion was still baffled at my skepticism: if the author specifies that the world of the story is simple in this-and-such direction, on what grounds could I _disagree_?
+
+I admitted, again, that there was a sense in which I couldn't argue with authorial fiat. But I thought that an author's choice of assumptions reveals something about what they think is true in our world, and commenting on that should be fair game for literary critics. Suppose someone wrote a story and said, "in the world portrayed in this story, everyone is super-great at _kung fu_, and they could beat up everyone from our Earth, but they never have to practice at all."
+
+(Yudkowsky retorted, "...you realize you're describing like half the alien planets in comic books? when did Superman ever get depicted as studying kung fu?" I wish I had thought to admit that, yes, I _did_ hold Eliezer Yudkowsky to a higher standard of consilient worldbuilding than DC Comics. Would he rather I _didn't_?)
+
+Something about innate _kung fu_ world seems fake in a way that seems like a literary flaw. It's not just about plausibility. Fiction often incorporates unrealistic elements in order to tell a story that has relevance to real human lives. Innate _kung fu_ skills are scientifically plausible[^instinct] in a way that faster-than-light travel is not, but throwing faster-than-light travel into the universe so that you can do a [space opera](https://tvtropes.org/pmwiki/pmwiki.php/Main/SpaceOpera) doesn't make the _people_ fake in the way that Superman's fighting skills are fake.
+
[^instinct]: All sorts of other instinctual behaviors exist in animals; I don't see why skills humans have to study for years as a "martial art" couldn't be coded into the genome.
+
Maybe it was okay for Superman's fighting skills to be fake from a literary perspective (because realism along that dimension is not what Superman is _about_), but if the Yudkowskian ethos exalted intelligence as ["the power that cannot be removed without removing you"](https://www.lesswrong.com/posts/SXK87NgEPszhWkvQm/mundane-magic), readers had grounds to demand that the dath ilani's thinking skills be real, and a world that's claimed by authorial fiat to be super-great at epistemic rationality, but where the people don't have a will-to-truth stronger than their will-to-happiness, felt fake to me. I couldn't _prove_ that it was fake. I agreed with Harmless's case that, _technically_, as far as the Law went, you could build a Civilization or a Friendly AI to see all the ugly things that you preferred not to see.
+
+But if you could—would you? And more importantly, if you would—could you?
+
+It was possible that the attitude I was evincing here was just a difference between the eliezera out of dath ilan and the Zackistani from my medianworld, and that there was nothing more to be said about it. But I didn't think the thing was a _genetic_ trait of the Zackistani! _I_ got it from spending my early twenties obsessively re-reading blog posts that said things like, ["I believe that it is right and proper for me, as a human being, to have an interest in the future [...] One of those interests is the human pursuit of truth [...] I wish to strengthen that pursuit further, in this generation."](https://www.lesswrong.com/posts/anCubLdggTWjnEvBS/your-rationality-is-my-business)
+
+There were definitely communities on Earth where I wasn't allowed in because of my tendency to shout things from street corners, and I respected those people's right to have a safe space for themselves.
+
But those communities ... didn't call themselves _rationalists_, weren't _pretending_ to be inheritors of the great tradition of E. T. Jaynes and Robyn Dawes and Richard Feynman. And if they _did_, I think I would have a false advertising complaint against them.
+
+"[The eleventh virtue is scholarship. Study many sciences and absorb their power as your own](https://www.yudkowsky.net/rational/virtues) ... unless a prediction market says that would make you less happy," just didn't have the same ring to it. Neither did "The first virtue is curiosity. A burning itch to know is higher than a solemn vow to pursue truth. But higher than both of those, is trusting your Society's institutions to tell you which kinds of knowledge will make you happy"—even if you stipulated by authorial fiat that your Society's institutions are super-competent, such that they're probably right about the happiness thing.
+
+Attempting to illustrate [the mood I thought dath ilan was missing](https://www.econlib.org/archives/2016/01/the_invisible_t.html), I quoted (with Discord's click-to-reveal spoiler blocks around the more plot-relevant sentences) the scene from _Atlas Shrugged_ where our heroine Dagny expresses a wish to be kept ignorant for the sake of her own happiness, and gets shut down by John Galt—and Dagny _thanks_ him.[^atlas-shrugged-ref]
+
+> "[...] Oh, if only I didn't have to hear about it! If only I could stay here and never know what they're doing to the railroad, and never learn when it goes!"