+In the #dath-ilan channel of the server, Yudkowsky elaborated on the reasoning for the masochism coverup:
+
+> altruistic sadists would if-counterfactually-fully-informed prefer not to know, because Civilization is capped on the number of happy sadists. even if you can afford a masochist, which requires being very rich, you're buying them away from the next sadist to whom masochists were previously just barely affordable
+
+In response to a question about how frequent sadism is among Keepers, Yudkowsky wrote:
+
+> I think they're unusually likely to be aware, nonpracticing potential sexual sadists. Noticing that sort of thing about yourself, and then not bidding against the next sadist over for the limited masochist supply, and instead just operating your brain so that it doesn't hurt much to know what you can't have, is exactly the kind of cost you're volunteering to take on when you say you wanna be a Keeper.
+> that's archetypally exactly The Sort Of Thing Keepers Do And Are
+
+> They choose not to, not just out of consideration for the next person in line, but because not harming the next person in line is part of the explicit bargain of becoming a Keeper.
+> Like, this sort of thing is exactly what you're signing up for when you throw yourself on the bounded rationality grenade.
+> Let the truth destroy what it can—but in you, not in other people.
+
+I objected (to the room, I told myself, not technically violating my prior intent to not bother Yudkowsky himself anymore) that "Let the truth destroy what it can—in yourself, not in other people" is such an _incredibly_ infantilizing philosophy. It's a meme that optimizes for shaping people (I know, _other_ people) into becoming weak, stupid, and unreflective, like Thellim's impression of Jane Austen characters. I expect people on Earth—not even "rationalists", just ordinary adults—to be able to cope with ... learning facts about psychology that imply that there are desirable sexual experiences they won't get to have.
+
+A user called Numendil insightfully pointed out that dath ilani might be skeptical of an Earthling saying that an unpleasant aspect of our existence is actually fine, for the same reason we would be skeptical of a resident of Golarion saying that; it makes sense for people from richer civilizations to look "spoiled" to people from poorer ones.
+
+Other replies were more disturbing. One participant wrote:
+
+> I think of "not in other people" not as "infantilizing", but as recognizing independent agency. You don't get to do harm to other people without their consent, whether that is physical or psychological.
+
+I pointed out that this obviously applies to, say, religion. Was it wrong to advocate for atheism in a religious Society, where robbing someone of their belief in God might be harming them?
+
+"Every society strikes a balance between protectionism and liberty," someone said. "This isn't news."
+
+It's not news about _humans_, I conceded. It was just—I thought people who were fans of Yudkowsky's writing in 2008 had a reasonable expectation that the dominant messaging in the local subculture would continue in 2022 to be _in favor_ of telling the truth and _against_ benevolently intended Noble Lies. It ... would be interesting to know why that changed.
+
+Someone else said:
+
+> dath ilan is essentially a paradise world. In a paradise world, people have the slack to make microoptimisations like that, to allow themselves Noble Lies and not fear for what could be hiding in the gaps. Telling the truth is a heuristic for this world where Noble Lies are often less Noble than expected and trust is harder to come by.
+
+I said that I thought people were missing this idea that the reason "truth is better than lies; knowledge is better than ignorance" is such a well-performing injunction in the real world (despite the fact that there's no law of physics preventing lies and ignorance from having beneficial consequences), is because it protects against unknown unknowns. Of course an author who wants to portray an ignorance-maintaining conspiracy as being for the greater good, can assert by authorial fiat whatever details are needed to make it all turn out for the greater good, but _that's not how anything works in real life_.
+
+I started a new thread to complain about the attitude I was seeing (Subject: "Noble Secrets; Or, Conflict Theory of Optimization on Shared Maps"). When fiction in this world, _where I live_, glorifies Noble Lies, that's a cultural force optimizing for making shared maps less accurate, I explained. As someone trying to make shared maps _more_ accurate, this force was hostile to me and mine. I understood that secrets and lies are different, but if you're a consequentialist thinking in terms of what kinds of optimization pressures are being applied to shared maps, it's the same issue: I'm trying to steer _towards_ states of the world where people know things, and the Keepers of Noble Secrets are trying to steer _away_ from states of the world where people know things. That's a conflict. I was happy to accept Pareto-improving deals to make the conflict less destructive, but I wasn't going to pretend the pro-ignorance forces were my friends just because they self-identify as "rationalists" or "EA"s. I was willing to accept secrets around nuclear or biological weapons, or AGI, on "better ignorant than dead" grounds, but the "protect sadists from being sad" thing was _just_ coddling people who can't handle the truth, which makes _my_ life worse.
+
+And just—back in the 'aughts, Robin Hanson had this really great blog called _Overcoming Bias_. (You probably haven't heard of it.) I wanted that _vibe_ back, of Robin Hanson's blog in 2008.
+
+[TODO: Eliezerfic fight, cont'd]