* Counterargument: as a reductio, am I opposed to movie and homework exercise spoilers, too? Reply: No, I recognize spoiler protection as a legitimate class of infohazard. (And AGI/nuclear as a social exfohazard, and threats as a legitimate infohazard.)
* Counterargument: the text says that regular dath ilani do believe "that which can be destroyed" eventually, just not immediately. Reply: this would be more believable with a timetable; I can understand "I don't want to deal with this yet", but there's no indication that Merrin or Keltham is supposed to figure it out.
* Counterargument: but the gaslighting is supposed to inculcate distrust of authority. Reply: clearly it doesn't work, given how much everyone trusts the government that is lying to them all the time?
+ * Numendil's point: dath ilani should look spoiled to us, for the same reason we look spoiled to Golarionites
]
### Conclusion
-
--------
-
-[OUTLINE—
- * the race of dath ilani humans are called the "eliezera" in Word of God canon
- presenting an eliezera racial supremacy narrative. (It's still a racial supremacy narrative even if he doesn't _use the verbatim phrase_ "racial supremacy.")
- * Bluntly, this is not a culture that gives a shit about people being well-informed. This is a culture that has explicitly
- * In more detail: the algorithm that designed dath ilani Civilization is one that systematically favors plans that involve deception, over than plans that involve being honest.
- * This is not a normative claim or a generic slur that dath ilani are "evil" or "bad"; it's a positve claim about systematic deception. If you keep seeing plans for which social-deception-value exceeds claimed-social-benefit value, you should infer that the plans are being generated by a process that "values" (is optimizing for) deception, whether it's a person or a conscious mind.
- * Watsonian rationale: with smarter people, knowledge actually is dangerous. I'm more interested in a Doylist interpretation, that this reflects authoritarian tendencies in later Yudkowsky's thought.
-
-Perhaps for lack of any world-saving research to do, Yudkowsky started writing fiction again, largely in the form of Glowfic (a genre of collaborative storytelling pioneered by Alicorn) featuring the world of dath ilan .
-
-The bulk of the dath ilan Glowfic canon was an epic titled [_Planecrash_](https://www.glowfic.com/boards/215)[^planecrash-title] coauthored with Lintamande, in which Keltham, an unusually selfish teenage boy from dath ilan, apparently dies in a freak aviation accident, and [wakes up in the world of](https://en.wikipedia.org/wiki/Isekai) Golarion, setting of the _Dungeons-&-Dragons_–alike _Pathfinder_ role-playing game. A [couple](https://www.glowfic.com/posts/4508) of [other](https://glowfic.com/posts/6263) Glowfic stories with different coauthors further flesh out the setting of dath ilan, which inspired a new worldbuilding trope,
-
-[^planecrash-title]: The title is a triple pun, referring to the airplane crash leading to Keltham's death in dath ilan, and how his resurrection in Golarion collides dath ilan with [the "planes" of existence of the _Pathfinder_ universe](https://pathfinderwiki.com/wiki/Great_Beyond), and Keltham's threat to destroy (crash) the _Pathfinder_ reality if mortals aren't given better afterlife conditions. (I use the word "threat" colloquially here; the work itself goes into some detail distinguishing between bargaining and decision-theoretic threats that should be defied.)
-
-You can, of course, make up a sensible [Watsonian](https://tvtropes.org/pmwiki/pmwiki.php/Main/WatsonianVersusDoylist) rationale for this. A world with much smarter people is more "volatile"; with more ways for criminals and terrorists to convert knowledge into danger, maybe you _need_ more censorship just to prevent Society from blowing itself up.
-
-I'm more preoccupied by a [Doylistic](https://tvtropes.org/pmwiki/pmwiki.php/Main/WatsonianVersusDoylist) interpretation—that dath ilan's obsessive secret-Keeping reflects something deep about how the Yudkowsky of the current year relates to speech and information, in contrast to the Yudkowsky who wrote the Sequences. The Sequences had encouraged you—yes, _you_, the reader—to be as rational as possible. In contrast, the dath ilan mythos seems to portray advanced rationality as dangerous knowledge that people need to be protected from.
-
-As another notable example of dath ilan hiding information for the alleged greater good, in Golarion, Keltham discovers that he's a sexual sadist, and deduces that Civilization has deliberately prevented him from realizing this, because there aren't enough corresponding masochists to go around in dath ilan. Having concepts for "sadism" and "masochism" as variations in human psychology would make sadists like Keltham sad about the desirable sexual experiences they'll never get to have, so Civilization arranges for them to _not be exposed to knowledge that would make them sad, because it would make them sad_ (!!).
-
-What happens, I asked, to the occasional dath ilani free speech activists, with their eloquent manifestos arguing that Civilization would be better off coordinating on maps that reflect the territory, rather than coordinating to be a Keeper-managed zoo? (They _had_ to exist: in a medianworld centered on Yudkowsky, there are going to be a few weirdos who are +2.5 standard deviations on "speak the truth, even if your voice trembles" and −2.5 standard deivations on love of clever plots; this seems less weird than negative utilitarians, who were [established to exist](https://www.glowfic.com/replies/1789623#reply-1789623).) I _assumed_ they get dealt with somehow in the end (exiled from most cities? ... involuntarily cryopreserved?), but there had to be an interesting story about someone who starts out whistleblowing small lies (which Exception Handling allows; they think it's cute, and it's "priced in" to the game they're playing), and then just keeps _escalating and escalating and escalating_ until Governance decides to unperson him.
-
-[...]
-
-If we believe that [IQ research validates the "Jews are clever" stereotype](https://web.mit.edu/fustflum/documents/papers/AshkenaziIQ.jbiosocsci.pdf), I wondered if there's a distinct (albeit probably correlated) "enjoying deception" trait that validates the "Jews are sneaky" stereotype? If dath ilan is very high in this "sneakiness" trait (relative to Earth Jews), that would help explain all the conspiracies!
-
-The existence of such a widespread sneakiness/"taste for deception" trait among the eliezera, in conjunction with their culture just not particularly valuing public knowledge (because they assume everything important is being handled by the Keepers), explains the recurring conspiracies and coverups, like the Ordinary Merrin Conspiracy, Exception Handling's fabrication of evidence for Sparashki being real, the sadism/masochism coverup, and [the village that deliberately teaches anti-redhead bigotry to children in order to test the robustness of dath ilan's general humanism indoctrination](https://www.lesswrong.com/posts/uyBeAN5jPEATMqKkX/lies-told-to-children-1).
-
-I stress that this hypothesis _doesn't_ require dath ilani to be cartoon villains who hate knowledge and want people to be ignorant. Just that, as a result of the widespread sneakiness trait and their outsourcing information-process to the Keepers, in the course of trying to accomplish other things, plans-that-involve-conspiracies are often higher in their search ordering than plans-that-involve-keeping-people-informed.
-
-I claimed that there was a hidden-core-of-rationality thing about a culture that values living in truth, that the dath ilani didn't have. In previous discussion of the Sparashki example, a user called lc had written, "If you see someone wearing an elf costume at work and conclude elves are real and make disastrous decisions based on that conclusion you are mentally deranged". And indeed, you would be mentally deranged if you did that _on Earth_, because we don't have an elves-are-real conspiracy on Earth.
-
-In elves-are-real conspiracy-world, you (Whistleblower) see someone (Conspirator) wearing an elf costume at work and say, "Nice costume." They say, "What costume?" You say, "I see that you're dressed like an elf, but elves aren't real." They say, "What do you mean? Of course elves are real. I'm right here." You say, "You know exactly what I mean."
-
-It would appear that there's a conflict between Conspirator (who wants to maintain a social reality in which they're an elf, because it's fun, and the conspiracy is sufficiently outlandish that it's assumed that no one is "really" being deceived) and Whistleblower (who wants default social reality to map to actual reality; make-believe is fine at a designated fandom convention which has designated boundaries, but let's be serious at work, where your coworkers are trying to make a living and haven't opted-in to this false social reality).
-
-I was skeptical that a culture where people collude to maintain a fake social reality at their job in a hospital, and everyone else is expected to play along because it's fun, really has this living-in-truth thing. People play those social-reality games on Earth, too, and when _they_ say no one is being deceived, they're _definitely_ lying about that, and I doubted that the eliezera were actually built that differently.
-
-"Natural History of Ashkenazi Intelligence"
-
-(I was tempted to tag that as "epistemic status: low-confidence speculation", but that's _frequentist_ thinking—as if "Jews and gentiles are equally sneaky" were a "null hypothesis" that could only be rejected by data that would be sufficiently unlikely assuming that the null was true. Ha ha, that would be _crazy!_ Obviously, I should have a _prior_ on the effect size difference between the Jew and gentile sneakiness distributions, that can be updated as sneakiness data comes in. I think the mean of my prior distribution is at, like, _d_ ≈ 0.1? So it's not "low confidence"; it's "low confidence of the effect size being large enough to be of much practical significance".)
-
-For context on why I have no sense of humor about this, on Earth (which _actually exists_, unlike dath ilan), when someone says "it's not lying, because no one _expected_ me to tell the truth in that situation", what's usually going on, [as Zvi Mowshowitz explains](https://thezvi.wordpress.com/2019/07/02/everybody-knows/), is that is that conspirators benefit from deceiving outsiders, and the claim that "everyone knows" is them lying to _themselves_ about the fact that they're lying.
-
-(If _you_ got hurt by not knowing, well, it's not like anyone got hurt, because if you didn't know, then you weren't anyone.)
-
-Okay, but if it were _actually true_ that everyone knew, what would be _function_ of saying the false thing? On dath ilan (if not in Earth boardrooms), I suppose the answer is "Because it's fun"? Okay, but what is the function of your brain giving out a "fun" reward in this context? It seems like at _some_ point, there has to be the expectation of _some_ cognitive system (although possibly not an entire "person") taking the signals literally.
-
-That's why, when I _notice_ myself misrepresenting my actual beliefs or motivations because I think it's funny or rhetorically powerful (and it takes a special act of noticing; humans aren't built to be honest by default), I often take care to disclaim it immediately (as was observed in the message this is one a reply to), precisely because I _don't_ think that "everybody knows"; I'm not going to give up on humor or powerful rhetoric, but I'm also not going to delude myself into thinking it's "zero-calorie" (people who don't "get the joke" _are_ going to be misled, and I don't think it's unambigously "their fault" for not being able to read my "intent" to arbitrary precision)
-
-But maybe dath ilan is sufficiently good at achieving common knowledge in large groups that they _can_ pull off a zero-calorie "everyone knows" conspiracy without damaging shared maps??
-
-I'm still skeptical, especially given that we see them narratizing it as "not lying" (in the same words that corrupt executives on Earth use!), rather than _explicitly_ laying out the evopysch logic of sneakiness superstimuli, and the case that they know how to pull it off in a zero-calorie (trivial damage to shared maps) way.
-
-In general, I think that "it's not lying because no one expected the truth" is something you would say as part of an attempted nearest-unblocked-strategy end run around a deontological constraint against "lying" (https://www.lesswrong.com/posts/MN4NRkMw7ggt9587K/firming-up-not-lying-around-its-edge-cases-is-less-broadly); I don't think it's something you would say if you _actually cared_ about shared maps being accurate
-
-What I see on Earth (which, again, _actually exists_, unlike dath ilan, which _does not exist_), is that people mostly drink their own Kool-Aid; lying to the world without lying to yourself is just not psychologically sustainable.
-
-I had some insightful discussion with someone in 2017, in which I was saying that I wanted something to be public knowledge, that was frequently denied for political reasons. (The object-level topic doesn't matter in this context.) This person said, "I don't particularly care about this being commonly recognized. C'mon! It's fun to have some secrets that are not public knowledge".
-
-In 2021, _the same person_ says she feels "disgusted and complicit" for having talked to me.
-
-> At some point during this time I started treating this as a hidden truth that I was proud of myself for being able to see, which I in retrospect I feel disgusted and complicit to have accepted
-
-> above exchanges did in retrospect cause me emotional pain, stress, and contributed to internalizing sexism and transphobia.
-
-I think it's very significant that she _didn't_ say that she encountered _new evidence_ that made her _change her mind_, and decided that she was _actually wrong_ in 2017. The claim is just that the things we said in 2017 are "harmful."
-
-wrap-up dath ilan is fictional
-
-the reason I'm paranoid and humorless about
-
-positive-valence fictional depictions of
-
-"but, but, information can _hurt people_, and hurting people is wrong" and "it's not lying if 'everybody knows'" memes, is because I _actually see this stuff destroying people's minds in real life_
-
-I certainly don't think Yudkowsky is consciously "lying"
-
-(natural language is very flexible, you can _come up with_ some interpretation)
-
-You can cosplay an elf _at a designated fandom convention_, where people have temporarily _opted in_ to that false social reality. But I don't think you can cosplay an elf _at work_ (in "real life") and have everyone play along for extended periods of time, without dealing damage to real-life shared maps.
-
-Similarly, I cosplay female characters at fandom conventions, and that's fun, and I'm glad that conventions exist, but I can't transition in "real life", because I don't expect anyone in real life to believe that I'm female, because it's _very obviously not true_. People will _pretend_ to believe it because they're terrified of being accused of transphobia, but _they are lying_, and the
-
-people who try to claim that no one is being deceived because "everyone knows"
-
-_are also lying_.
-
-Keltham contradicts himself _in the same tag_
-https://www.glowfic.com/replies/1865236#reply-1865236
-
-> The sneakiest thing dath ilan did was covertly shape him to never notice he was a sadist
-> [...]
-> Obviously past-Keltham was shaped in all sorts of ways as a kid, but those shaping-targets are matters of public documentation on the Network. They're not _covert_ intended effects of the alien technology.
-
-I mean, it's worth noting that their concept of a "good reason" literally includes "prediction markets think people will be happier this way". This is not a Society that gives a shit (as a terminal value) about non-Keepers having accurate information (or they wouldn't, _e.g._, gaslight Merrin about how famous she is).
-
-_Of course_ a Society that prizes freedom-from-infohazards as a core value is going to have lots of "good reasons" for the systematically-misleading-representations they make, that will seem genuinely compelling to the people of that Society who are in on it!
-
-One might have hoped that dath ilani would be self-aware enough to notice that things that seem like a "good reason" for a conspiracy _to dath ilani_, would not seem like a "good reason" to people from a Society that prizes freedom-of-speech? But if they've screened off their history (for the greater good, of course), they might not have a concept of what other Societies are like ...
-
-(Yes, I know we've been informed by authorial fiat that dath ilan has a lot of internal diversity, but there are necessarily limits to that if you're going to be a human Society specifically rather than a Solomonff inductor, and it seems clear that any faction that thinks gaslighting Merrin is morally wrong is on the losing end of the counterfactual warfare of democracy.)
-
-Aslan / amputation of destiny
-
- * An ethnographer might note that Americans believe themselves to be "the land of the brave and the home of the free", without being obliged for their ethnography to agree with this description. I'm taking the same stance towards dath ilan: as a literary critic, I don't have to share its Society's beliefs about itself.
-
-spoilers for the pleasure of discovering sex for themselves: https://glowfic.com/replies/1812613#reply-1812613
-
-being told eugenics prospects early as self-fulfilling prophecies, as a legitimate infohazard: https://glowfic.com/replies/1812614#reply-1812614
-
-sex and classical mechanics spoilers (spoilers are not the same thing as conspiracies/secrets; Keltham and Carissa arne't supposed to find out): https://glowfic.com/replies/1718168#reply-1718168
-
-https://glowfic.com/replies/1788890#reply-1788890
-> He likes confusing people. Supposedly it's to train strong minds that don't weakly rely on being told how reality works. I think it may be what dath ilan does with all its repressed sadism.
-
-https://glowfic.com/replies/1801463#reply-1801463
-> "I really think all y'all don't give dath ilan enough credit on some dimensions, if not others. They didn't tell me about my sexual sadism because there would have been no good way for me to satisfy it, Pilar, not because they wanted to deny me my utilityfunction. Or did you have something else in mind?"
-
-(masochism search tags)
-https://glowfic.com/replies/search?board_id=&author_id=366&template_id=&character_id=&subj_content=masochism&sort=created_old&condensed=on&commit=Search
-
-https://glowfic.com/replies/1735044#reply-1735044
-> "But yes, or rather, my suspicion is that not many sadists in dath ilan know what they are and Civilization tries to prevent us from finding out, because dath ilan does not have masochists.
-
-> His Lawful Good world tried to make sure he never found out about that. Keltham thinks that's because dath ilan has no masochists. He thinks masochism itself is unlikely
-https://glowfic.com/replies/1788845#reply-1788845
-
-----
+Perhaps for lack of any world-saving research to do, Yudkowsky started writing fiction again, largely in the form of Glowfic (a genre of collaborative storytelling pioneered by Alicorn) featuring the world of dath ilan.
-[TODO: bridge—link to pulled-out standalone post, "On the Public Anti-Epistemology of dath ilan"]
+The bulk of the dath ilan Glowfic canon was an epic titled [_Planecrash_](https://www.glowfic.com/boards/215)[^planecrash-title] coauthored with Lintamande, in which Keltham, an unusually selfish teenage boy from dath ilan, apparently dies in a freak aviation accident, and [wakes up in the world of](https://en.wikipedia.org/wiki/Isekai) Golarion, setting of the _Dungeons-&-Dragons_–alike _Pathfinder_ role-playing game. A [couple](https://www.glowfic.com/posts/4508) of [other](https://glowfic.com/posts/6263) Glowfic stories with different coauthors further flesh out the setting of dath ilan.
-It did not escape my notice that when "rationalist" authorities _in real life_ considered public knowledge of some paraphilia to be an infohazard (ostensibly for the benefit of people with that paraphilia), I _didn't take it lying down_.
+[^planecrash-title]: The title is a triple pun, referring to the airplane crash leading to Keltham's death in dath ilan, and how his resurrection in Golarion collides dath ilan with [the "planes" of existence of the _Pathfinder_ universe](https://pathfinderwiki.com/wiki/Great_Beyond), and Keltham's threat to destroy (crash) the _Pathfinder_ reality if mortals aren't given better afterlife conditions. (I use the word "threat" colloquially here; the work itself goes into some detail distinguishing between mere bargaining and decision-theoretic threats that should be defied.)
-This parallel between dath ilan's sadism/masochism coverup and the autogynephilia coverup I had fought in real life, was something I was only intending to comment on in passing in the present memoir, rather than devoting any more detailed analysis to, but as I was having trouble focusing on my own writing in September 2022, I ended up posting some critical messages about dath ilan's censorship regime in the "Eliezerfic" Discord server for reader discussion of _Planecrash_, using the sadism/masochism coverup as my central example.
+On the topic of dath ilan's rationality training, I appreciated [this passage about the cognitive function of categorization](https://www.glowfic.com/replies/1779051#reply-1779051):
+
+> Dath ilani kids get told to not get fascinated with the fact that, in principle, 'bounded-agents' with finite memories and finite thinking speeds, have any considerations about mapping that depend on what they want. It doesn't mean that you get to draw in whatever you like on your map, because it's what you want. It doesn't make reality be what you want.
-[...]
+Vindication! This showed that Yudkowsky _did_ understand what was at issue in the dispute over "... Not Man for the Categories", even if he couldn't say "Zack is right and Scott is wrong" for political reasons. Beyond that tidbit, however, the dath ilan mythos still seemed defective to me compared to the Sequences in its attitude toward knowledge.
Someone at the 2021 Event Horizon Independence Day party had told me that I had been misinterpreting the "Speak the truth, even if your voice trembles" slogan from the Sequences. I had interpreted the slogan as suggesting the importance of speaking the truth _to other people_ (which I think is what "speaking" is usually about), but my interlocutor said it was about, for example, being able to speak the truth aloud in your own bedroom, to yourself. I think some textual evidence for my interpretation can be found in Daria's ending to ["A Fable of Science and Politics"](https://www.lesswrong.com/posts/6hfGNLf4Hg5DXqJCF/a-fable-of-science-and-politics), a multiple-parallel-endings story about an underground Society divided into factions over the color of the unseen sky, and one person's reaction when they find a passageway leading aboveground to a view of the sky:
[^other-endings]: Even Eddin's ending, which portrays Eddin as more concerned with consequences than honesty, has him "trying to think of a way to prevent this information from blowing up the world", rather than trying to think of a way to suppress the information, in contrast to how Charles, in his ending, _immediately_ comes up with the idea to block off the passageway leading to the aboveground. Daria and Eddin are clearly written as "rationalists"; the deceptive strategy only comes naturally to the non-rationalist Charles. (Although you could Watsonianly argue that Eddin is just thinking longer-term than Charles: blocking off _this_ passageway and never speaking a word of it to another soul won't prevent someone from finding some other passage to the aboveground, eventually.)
-In contrast, the culture of dath ilan does not seem to particularly value people _standing under the same sky_.
-
-[...]
+In contrast, the culture of dath ilan does not seem to particularly value people _standing under the same sky_. Not only is their Society steered by an order of [Keepers of Highly Unpleasant Things it is Sometimes Necessary to Know](https://www.glowfic.com/replies/1612937#reply-1612937) who safeguard advanced rationality techniques from a population allegedly too psychologically fragile to handle them, but we see many other cases of the dath ilani covering things up for some alleged greater good, with seemingly no regard for the costs of people having less accurate world-models.
-On the topic of dath ilan's rationality training, I did appreciate [this passage about the cognitive function of categorization](https://www.glowfic.com/replies/1779051#reply-1779051):
+In one notable example, Keltham, the protagonist of _Planecrash_, is an obligate sexual sadist, but never discovered this fact about himself during his first life in dath ilan, because dath ilan has arranged to cover up the existence of sadism and masochism—precisely because people like Keltham would be sad if they discovered that there weren't enough masochists to go around.
- > Dath ilani kids get told to not get fascinated with the fact that, in principle, 'bounded-agents' with finite memories and finite thinking speeds, have any considerations about mapping that depend on what they want. It doesn't mean that you get to draw in whatever you like on your map, because it's what you want. It doesn't make reality be what you want.
+It did not escape my notice that when "rationalist" authorities in real life considered public knowledge of some paraphilia to be an infohazard (ostensibly for the benefit of people with that paraphilia), I _didn't take it lying down_.
-Vindication! (This showed that Yudkowsky _does_ understand what was at issue in the "... Not Man for the Categories" dispute, even if I can't be credited with winning the argument for political reasons.)
+I had only intended to comment in passing on this parallel between dath ilan's sadism/masochism coverup and the autogynephilia coverup I had fought in real life, rather than devote any more detailed analysis to it, but as I was having trouble focusing on my own writing in September 2022, I ended up posting some critical messages about dath ilan's censorship regime in the "Eliezerfic" Discord server for reader discussion of _Planecrash_, using the sadism/masochism coverup as my central example.
-----------
+(I would later adapt my complaints into a standalone post, "On the Public Anti-Epistemology of dath ilan".)
Although Yudkowsky participated in the server, I had reasoned that my participation didn't violate my previous intent not to bother him anymore, because it was a publicly-linked Discord server with hundreds of members. Me commenting on the story for the benefit of the _other_ 499 people in the chat room wouldn't generate a notification _for him_, the way it would if I sent him an email or replied to him on Twitter.
-------
-
-
-
-In the #dath-ilan channel of the server, Yudkowsky elaborated on the reasoning for the masochism coverup:
-
-> altruistic sadists would if-counterfactually-fully-informed prefer not to know, because Civilization is capped on the number of happy sadists. even if you can afford a masochist, which requires being very rich, you're buying them away from the next sadist to whom masochists were previously just barely affordable
-
-In response to a question about how frequent sadism is among Keepers, Yudkowsky wrote:
-
-> I think they're unusually likely to be aware, nonpracticing potential sexual sadists. Noticing that sort of thing about yourself, and then not bidding against the next sadist over for the limited masochist supply, and instead just operating your brain so that it doesn't hurt much to know what you can't have, is exactly the kind of cost you're volunteering to take on when you say you wanna be a Keeper.
-> that's archetypally exactly The Sort Of Thing Keepers Do And Are
-
-> They choose not to, not just out of consideration for the next person in line, but because not harming the next person in line is part of the explicit bargain of becoming a Keeper.
-> Like, this sort of thing is exactly what you're signing up for when you throw yourself on the bounded rationality grenade.
-> Let the truth destroy what it can—but in you, not in other people.
-
-I objected (to the room, I told myself, not technically violating my prior intent to not bother Yudkowsky himself anymore) that "Let the truth destroy what it can—in yourself, not in other people" is such an _incredibly_ infantilizing philosophy. It's a meme that optimizes for shaping people (I know, _other_ people) into becoming weak, stupid, and unreflective, like Thellim's impression of Jane Austen characters. I expect people on Earth—not even "rationalists", just ordinary adults—to be able to cope with ... learning facts about psychology that imply that there are desirable sexual experiences they won't get to have.
-
-A user called Numendil insightfully pointed out that dath ilani might be skeptical of an Earthling saying that an unpleasant aspect our of existence is actually fine, for the same reason we would be skeptical of a resident of Golarion saying that; it makes sense for people from richer civilizations to look "spoiled" to people from poorer ones.
+The other chatroom participants mostly weren't buying what I was selling.
-Other replies were more disturbing. One participant wrote:
+When I objected to [Word of God](https://tvtropes.org/pmwiki/pmwiki.php/Main/WordOfGod)'s identification of the Keeper's credo as "Let the truth destroy what it can—in yourself, not in other people", calling it an incredibly infantilizing philosophy, someone replied:
> I think of "not in other people" not as "infantilizing", but as recognizing independent agency. You don't get to do harm to other people without their consent, whether that is physical or psychological.
"Every society strikes a balance between protectionism and liberty," someone said. "This isn't news."
-It's not news about _humans_, I conceded. It was just—I thought people who were fans of Yudkowsky's writing in 2008 had a reasonable expectation that the dominant messaging in the local subculture would continue in 2022 to be _in favor_ of telling the truth and _against_ benevolently intended Noble Lies. It ... would be interesting to know why that changed.
-
-Someone else said:
-
-> dath ilan is essentially a paradise world. In a paradise world, people have the slack to make microoptimisations like that, to allow themselves Noble Lies and not fear for what could be hiding in the gaps. Telling the truth is a heuristic for this world where Noble Lies are often less Noble than expected and trust is harder to come by.
-
-I said that I thought people were missing this idea that the reason "truth is better than lies; knowledge is better than ignorance" is such a well-performing [injunction](https://www.lesswrong.com/posts/dWTEtgBfFaz6vjwQf/ethical-injunctions) in the real world (despite the fact that there's no law of physics preventing lies and ignorance from having beneficial consequences), is because [it protects against unknown unknowns](https://www.lesswrong.com/posts/E7CKXxtGKPmdM9ZRc/of-lies-and-black-swan-blowups). Of course an author who wants to portray an ignorance-maintaining conspiracy as being for the greater good, can assert by authorial fiat whatever details are needed to make it all turn out for the greater good, but _that's not how anything works in real life_.
-
-I started a new thread to complain about the attitude I was seeing (Subject: "Noble Secrets; Or, Conflict Theory of Optimization on Shared Maps"). When fiction in this world, _where I live_, glorifies Noble Lies, that's a cultural force optimizing for making shared maps less accurate, I explained. As someone trying to make shared maps _more_ accurate, this force was hostile to me and mine. I understood that "secrets" and "lies" are not the same thing, but if you're a consequentialist thinking in terms of what kinds of optimization pressures are being applied to shared maps, [it's the same issue](https://www.lesswrong.com/posts/YptSN8riyXJjJ8Qp8/maybe-lying-can-t-exist): I'm trying to steer _towards_ states of the world where people know things, and the Keepers of Noble Secrets are trying to steer _away_ from states of the world where people know things. That's a conflict. I was happy to accept Pareto-improving deals to make the conflict less destructive, but I wasn't going to pretend the pro-ignorance forces were my friends just because they self-identified as "rationalists" or "EA"s. I was willing to accept secrets around nuclear or biological weapons, or AGI, on "better ignorant than dead" grounds, but the "protect sadists from being sad" thing wasn't a threat to anyone's life; it was _just_ coddling people who can't handle reality, which made _my_ life worse.
-
-I wasn't buying the excuse that secret-Keeping practices that wouldn't be okay on Earth were somehow okay on dath ilan, which was asserted by authorial fiat to be sane and smart and benevolent enough to make it work. Alternatively, if I couldn't argue with authorial fiat: the reasons why it would be bad on Earth (even if it wouldn't be bad in the author-assertion paradise of dath ilan) are reasons why _fiction about dath ilan is bad for Earth_.
-
-And just—back in the 'aughts, I said, Robin Hanson had this really great blog called _Overcoming Bias_. (You probably haven't heard of it.) I wanted that _vibe_ back, of Robin Hanson's blog in 2008—the will to _just get the right answer_, without all this galaxy-brained hand-wringing about who the right answer might hurt.
-
-(_Overcoming Bias_ had actually been a group blog then, but I was enjoying the æsthetic of saying "Robin Hanson's blog" (when what I had actually loved about _Overcoming Bias_ was Yudkowsky's Sequences) as a way of signaling contempt for the Yudkowsky of the current year.)
-
-I would have expected a subculture descended from the memetic legacy of Robin Hanson's blog in 2008 to respond to that tripe about protecting people from the truth being a form of "recognizing independent agency" with something like—
-
-"Hi! You must be new here! Regarding your concern about truth doing harm to people, a standard reply is articulated in the post ["Doublethink (Choosing to be Biased)"](https://www.lesswrong.com/posts/Hs3ymqypvhgFMkgLb/doublethink-choosing-to-be-biased). Regarding your concern about recognizing independent agency, a standard reply is articulated in the post ["Your Rationality Is My Business"](https://www.lesswrong.com/posts/anCubLdggTWjnEvBS/your-rationality-is-my-business)."
-
-—or _something like that_. Not that the reply needed to use those particular Sequences links, or _any_ Sequences links; what's important is that someone needed to counter to this very obvious [anti-epistemology](https://www.lesswrong.com/posts/XTWkjCJScy2GFAgDt/dark-side-epistemology).
-
-And what we actually saw in response to the "You don't get to do harm to other people" message was ... it got 5 "+1" emoji-reactions.
-
-Yudkowsky [chimed in to point out that](/images/yudkowsky-it_doesnt_say_tell_other_people.png) "Doublethink" was about _oneself_ not reasonably being in the epistemic position of knowing that one should lie to oneself. It wasn't about telling the truth to _other_ people.
-
-On the one hand, fair enough. My generalization from "you shouldn't want to have false beliefs for your own benefit" to "you shouldn't want other people to have false beliefs for their own benefit" (and the further generalization to it being okay to intervene) was not in the text of the post itself. It made sense for Yudkowsky to refute my misinterpretation of the text he wrote.
-
-On the other hand—given that he was paying attention to this #overflow thread anyway, I might have naïvely hoped that he would appreciate what I was trying to do?—that, after the issue had been pointed out, he would decided that he _wanted_ his chatroom to be a place where we don't want other people to have false beliefs for their own benefit?—a place that approves of "meddling" in the form of _telling people things_.
-
-The other chatroom participants mostly weren't buying what I was selling.
-
-A user called April wrote that "the standard dath ilani has internalized almost everything in the sequences": "it's not that the standards are being dropped[;] it's that there's an even higher standard far beyond what anyone on earth has accomplished". (This received a checkmark emoji-react from Yudkowsky, an indication of his agreement/endorsement.)
-
-Someone else said he was "pretty leery of 'ignore whether models are painful' as a principle, for Earth humans to try to adopt," and went on to offer some thoughts for Earth. I continued to maintain that it was ridiculous that we were talking of "Earth humans" as if there were any other kind—as if rationality in the Yudkowskian tradition wasn't something to aspire to in real life.
-
-Dath ilan [is _fiction_](https://www.lesswrong.com/posts/rHBdcHGLJ7KvLJQPk/the-logical-fallacy-of-generalization-from-fictional), I pointed out. Dath ilan _does not exist_. I thought it was a horrible distraction to try to see our world through Thellim's eyes and feel contempt over how much better things must be on dath ilan (which, to be clear, again, _does not exist_), when one could be looking through the eyes of an ordinary reader of Robin Hanson's blog in 2008 (the _real_ 2008, which _actually happened_), and seeing everything we've lost.
-
-[As it was taught to me then](https://www.lesswrong.com/posts/iiWiHgtQekWNnmE6Q/if-you-demand-magic-magic-won-t-help): if you demand Keepers, _Keepers won't help_. If I'm going to be happy anywhere, or achieve greatness anywhere, or learn true secrets anywhere, or save the world anywhere, or feel strongly anywhere, or help people anywhere—I may as well do it _on Earth_.
-
-The thread died out soon enough. I had some more thoughts about dath ilan's predilection for deception, of which I typed up some notes for maybe adapting into a blog post later, but there was no point in wasting any more time on Discord.
-
-On 29 November 2022 (four years and a day after the "hill of meaning in defense of validity" Twitter performance that had ignited my rationalist civil war), Yudkowsky remarked about the sadism coverup again:
-
-> Keltham is a romantically obligate sadist. This is information that could've made him much happier if masochists had existed in sufficient supply; Civilization has no other obvious-to-me-or-Keltham reason to conceal it from him.
-
-Despite the fact that there was no point in wasting any more time on Discord, I decided not to resist the temptation to open up the thread again and dump some paragraphs from my notes on the conspiracies of dath ilan.
-
----------
-
-A user called ajvermillion asked why I was being so aggressively negative about dath ilan. He compared it to Keltham's remark about how [people who grew up under a Lawful Evil government were disposed to take a more negative view of paternalism](https://www.glowfic.com/replies/1874754#reply-1874754) than they do in dath ilan, where paternalism basically works fine because dath ilan is benevolent.
-
-This question put me in a somewhat awkward position: it was a legitimate question that I felt I had to answer, that I had no way of answering honestly without at least _alluding_ to my prior greviances against Yudkowsky ... which were off-topic for the server. (Again, I had told myself that I was here to comment on the story, not to prosecute my greviances.)
-
-I tried to explain, briefly. Someone who might be even _more_ paranoid about abuses of power than someone who grew up with a Lawful Evil government, is someone who grew up under a power structure that put on a _good show_ of being clean and nice, but was actually corrupt and mean.
-
-Yudkowsky had this whole marketing image of him being uniquely sane and therefore uniquely benevolent, and because his Sequences were so life-changingly good, I _actually fell for it_. There was a long Dumb Story (this Story) that was off-topic and I hadn't then finished writing up, but basically, I had what I claimed were very strong reasons not to trust the guy anymore; I think he cares a lot about not explicitly _lying_, but what made the Sequences special is that they articulated a vastly higher standard than that, that he had no intention of living up to.
-
-And so, yeah, insofar as fiction about dath ilan functioned as marketing material for Yudkowsky's personality cult that I thought was damaging people like me (in some ways, while simultaneously helping us in other ways), I had an incentive to come up with literary criticism that paints dath ilan negatively?
+It's not news about _humans_, I conceded. It was just—I thought people who were fans of Yudkowsky's writing in 2008 had a reasonable expectation that the dominant messaging in the local subculture would continue in 2022 to be _in favor_ of telling the truth and _against_ benevolently intended noble lies. It ... would be interesting to know why that changed.
-It was great for ajvermillion to notice this! It _would_ be bad if my brain were configured to come up with dath-ilan-negative literary criticism, and for me to _simultaneously_ present myself as an authority on dath ilan whom you should trust. But if dath-ilan-negative literary criticism was undersupplied for structural reasons (because people who like a story are selected for not seeing things the story is doing that are Actually Bad), and my brain was configured to generate it anyway (because I disliked the person Yudkowsky had become, in contrast to the person he was in 2008), it seemed pro-social for me to post it, for other people to take or leave according to their own judgement?
+I started a new thread for my topic (Subject: "Noble Secrets; Or, Conflict Theory of Optimization on Shared Maps"). It died out after a couple days, and I reopened it later in response to more discussion of the masochism coverup.
-Yudkowsky soon entered the thread again, initially replying to someone else. (I remarked parenthetically that his appearance made me think I should stop wasting time snarking in his fiction server and just finish my memoir already.) We had a brief back-and-forth:
+Yudkowsky made an appearance. (After he replied to someone else, I remarked parenthetically that his appearance made me think I should stop wasting time snarking in his fiction server and just finish my memoir already.) We had a brief back-and-forth:
> **Eliezer** — 11/29/2022 10:33 PM
> the main thing I'd observe contrary to Zack's take here, is that Keltham thought that not learning about masochists he can never have, was obviously in retrospect what he'd have wanted Civilization to do, or do unless and until Keltham became rich enough to afford a masochist and then he could be told
> **Eliezer** — 11/29/2022 10:36 PM
> I am sorry that some of the insane people I attracted got together and made each other more insane and then extensively meta-gaslit you into believing that everyone generally and me personally was engaging in some kind of weird out-in-the-open gaslighting that you could believe in if you attached least-charitable explanations to everything we were doing
-It was pretty annoying that Yudkowsky was still attributing my greviances to Michael's malign influence—as if the gender identity revolution was something I would otherwise have just _taken lying down_. In the counterfactual where Michael had died in 2015, I think something like my February 2017 breakdown would have likely happened anyway. (Between August 2016 and January 2017, I sent Michael 14 emails, met with him once, and watched 60% of South Park season 19 at his suggestion, so he was _an_ influence on my thinking during that period, but not a disproportionately large one compared to everything else I was doing at the time.) How would I have later reacted to the November 2018 "hill of meaning" Tweets (assuming they weren't butterfly-effected away in this counterfactual)? It's hard to say. Maybe, if that world's analogue of my February 2017 breakdown had gone sufficiently badly (with no Michael to visit me in the psych ward or help me make sense of things afterwards), I would have already been a broken man, and not even sent Yudkowsky an email. In any case, I feel very confident that my understanding of the behavior of "everyone generally and [Yudkowsky] personally" would not have been _better_ without Michael _et al._'s influence.
+It was pretty annoying that Yudkowsky was still attributing my grievances to Michael's malign influence—as if the gender identity revolution was something I would otherwise have just taken lying down. In the counterfactual where Michael had died in 2015, I think something like my February 2017 breakdown would have likely happened anyway. (Between August 2016 and January 2017, I sent Michael 14 emails, met with him once, and watched 60% of South Park season 19 at his suggestion, so he was _an_ influence on my thinking during that period, but not a disproportionately large one compared to everything else I was doing at the time.) How would I have later reacted to the November 2018 "hill of meaning" Tweets (assuming they weren't butterfly-effected away in this counterfactual)? It's hard to say. Maybe, if that world's analogue of my February 2017 breakdown had gone sufficiently badly (with no Michael to visit me in the psych ward or help me make sense of things afterwards), I would have already been a broken man, and not even sent Yudkowsky an email. In any case, I feel very confident that my understanding of the behavior of "everyone generally and [Yudkowsky] personally" would not have been _better_ without Michael _et al._'s influence.
> [cont'd]
> you may recall that this blog included something called the "Bayesian Conspiracy"
> they won't tell you about it, because it interferes with the story they were trying to drive you insaner with, but it's so
> **zackmdavis** — 11/29/2022 10:37 PM
-> it's true that the things I don't like about modern Yudkowsky were still there in Sequences-era Yudkowsky, but I think they've gotten _worse_
+> it's true that the things I don't like about modern Yudkowsky were still there in Sequences-era Yudkowsky, but I think they've gotten _worse_
> **Eliezer** — 11/29/2022 10:39 PM
> well, if your story is that I was always a complicated person, and you selected some of my posts and liked the simpler message you extracted from those, and over time I've shifted in my emphases in a way you don't like, while still having posts like Meta-Honesty and so on... then that's a pretty different story than the one you were telling in this Discord channel, like, just now. today.
(When a literary critic proposes a "dark" interpretation of an author's world, I think it's implied that they're expressing disbelief in the "intended" world; the fact that I was impudently refusing to buy the benevolent interpretation wasn't because I didn't understand it.)
-> Hate-warp like this is bad for truth-perception; my understanding of the situation is that it's harm done to you by the group you say you left. I would read this as being a noninnocent error of that group; that they couldn't get what they wanted from people who still had friends outside their own small microculture, and noninnocently then decided that this outer culture was bad and people needed to be pried loose from it. They tried telling some people that this outer culture was gaslighting them and maliciously lying to them and had to be understood in wholly adversarial terms to break free of the gaslighting; that worked on somebody, and made a new friend for them; so their brain noninnocently learned that it ought to use arguments like that again, so they must be true.
+> Hate-warp like this is bad for truth-perception; my understanding of the situation is that it's harm done to you by the group you say you left. I would read this as being a noninnocent error of that group; that they couldn't get what they wanted from people who still had friends outside their own small microculture, and noninnocently then decided that this outer culture was bad and people needed to be pried loose from it. They tried telling some people that this outer culture was gaslighting them and maliciously lying to them and had to be understood in wholly adversarial terms to break free of the gaslighting; that worked on somebody, and made a new friend for them; so their brain noninnocently learned that it ought to use arguments like that again, so they must be true.
> This is a sort of thing I super did not do because I _understood_ it as a failure mode and Laid My Go Stones Against Ever Actually Being A Cult; I armed people with weapons against it, or tried to, but I was optimistic in my hopes about how much could actually be taught.
> **zackmdavis** — 11/29/2022 11:20 PM
> Without particularly defending Vassar _et al._ or my bad literary criticism (sorry), _modeling the adversarial component of non-innocent errors_ (as contrasted to "had to be understood in wholly adversarial terms") seems very important. (Maybe lying is "worse" than rationalizing, but if you can't hold people culpable for rationalization, you end up with a world that's bad for broadly the same reasons that a world full of liars is bad: we can't steer the world to good states if everyone's map is full of falsehoods that locally benefitted someone.)
> **zackmdavis** — 11/29/2022 11:22 PM
> yeah
> **Eliezer** — 11/29/2022 11:23 PM
-> It remains something that mortals do, and if you cut off anybody who's ever done that, you'll be left with nobody. And also importantly, people making noninnocent errors, if you accuse them of malice, will look inside themselves and correctly see that this is not how they work, and they'll stop listening to the (motivated) lies you're telling them about themselves.
+> It remains something that mortals do, and if you cut off anybody who's ever done that, you'll be left with nobody. And also importantly, people making noninnocent errors, if you accuse them of malice, will look inside themselves and correctly see that this is not how they work, and they'll stop listening to the (motivated) lies you're telling them about themselves.
> This also holds true if you make up overly simplistic stories about 'ah yes well you're doing that because you're part of $woke-concept-of-society' etc.
> **zackmdavis** — 11/29/2022 11:24 PM
> I think there's _also_ a frequent problem where you try to accuse people of non-innocent errors, and they motivatedly interpret _you_ as accusing malice
--- /dev/null
+-------
+
+[OUTLINE—
+ * the race of dath ilani humans is called the "eliezera" in Word of God canon
+ presenting an eliezera racial supremacy narrative. (It's still a racial supremacy narrative even if he doesn't _use the verbatim phrase_ "racial supremacy.")
+ * Bluntly, this is not a culture that gives a shit about people being well-informed. This is a culture that has explicitly
+ * In more detail: the algorithm that designed dath ilani Civilization is one that systematically favors plans that involve deception over plans that involve being honest.
+ * This is not a normative claim or a generic slur that dath ilani are "evil" or "bad"; it's a positive claim about systematic deception. If you keep seeing plans for which social-deception-value exceeds claimed-social-benefit value, you should infer that the plans are being generated by a process that "values" (is optimizing for) deception, whether or not that process is a person or a conscious mind.
+ * Watsonian rationale: with smarter people, knowledge actually is dangerous. I'm more interested in a Doylist interpretation, that this reflects authoritarian tendencies in later Yudkowsky's thought.
+
+You can, of course, make up a sensible [Watsonian](https://tvtropes.org/pmwiki/pmwiki.php/Main/WatsonianVersusDoylist) rationale for this. A world with much smarter people is more "volatile"; with more ways for criminals and terrorists to convert knowledge into danger, maybe you _need_ more censorship just to prevent Society from blowing itself up.
+
+I'm more preoccupied by a [Doylistic](https://tvtropes.org/pmwiki/pmwiki.php/Main/WatsonianVersusDoylist) interpretation—that dath ilan's obsessive secret-Keeping reflects something deep about how the Yudkowsky of the current year relates to speech and information, in contrast to the Yudkowsky who wrote the Sequences. The Sequences had encouraged you—yes, _you_, the reader—to be as rational as possible. In contrast, the dath ilan mythos seems to portray advanced rationality as dangerous knowledge that people need to be protected from.
+
+As another notable example of dath ilan hiding information for the alleged greater good, in Golarion, Keltham discovers that he's a sexual sadist, and deduces that Civilization has deliberately prevented him from realizing this, because there aren't enough corresponding masochists to go around in dath ilan. Having concepts for "sadism" and "masochism" as variations in human psychology would make sadists like Keltham sad about the desirable sexual experiences they'll never get to have, so Civilization arranges for them to _not be exposed to knowledge that would make them sad, because it would make them sad_ (!!).
+
+What happens, I asked, to the occasional dath ilani free speech activists, with their eloquent manifestos arguing that Civilization would be better off coordinating on maps that reflect the territory, rather than coordinating to be a Keeper-managed zoo? (They _had_ to exist: in a medianworld centered on Yudkowsky, there are going to be a few weirdos who are +2.5 standard deviations on "speak the truth, even if your voice trembles" and −2.5 standard deviations on love of clever plots; this seems less weird than negative utilitarians, who were [established to exist](https://www.glowfic.com/replies/1789623#reply-1789623).) I _assumed_ they get dealt with somehow in the end (exiled from most cities? ... involuntarily cryopreserved?), but there had to be an interesting story about someone who starts out whistleblowing small lies (which Exception Handling allows; they think it's cute, and it's "priced in" to the game they're playing), and then just keeps _escalating and escalating and escalating_ until Governance decides to unperson him.
+
+[...]
+
+If we believe that [IQ research validates the "Jews are clever" stereotype](https://web.mit.edu/fustflum/documents/papers/AshkenaziIQ.jbiosocsci.pdf), I wondered if there's a distinct (albeit probably correlated) "enjoying deception" trait that validates the "Jews are sneaky" stereotype? If dath ilan is very high in this "sneakiness" trait (relative to Earth Jews), that would help explain all the conspiracies!
+
+The existence of such a widespread sneakiness/"taste for deception" trait among the eliezera, in conjunction with their culture just not particularly valuing public knowledge (because they assume everything important is being handled by the Keepers), explains the recurring conspiracies and coverups, like the Ordinary Merrin Conspiracy, Exception Handling's fabrication of evidence for Sparashki being real, the sadism/masochism coverup, and [the village that deliberately teaches anti-redhead bigotry to children in order to test the robustness of dath ilan's general humanism indoctrination](https://www.lesswrong.com/posts/uyBeAN5jPEATMqKkX/lies-told-to-children-1).
+
+I stress that this hypothesis _doesn't_ require dath ilani to be cartoon villains who hate knowledge and want people to be ignorant. Just that, as a result of the widespread sneakiness trait and their outsourcing of information-processing to the Keepers, in the course of trying to accomplish other things, plans-that-involve-conspiracies are often higher in their search ordering than plans-that-involve-keeping-people-informed.
+
+I claimed that there was a hidden-core-of-rationality thing about a culture that values living in truth, that the dath ilani didn't have. In previous discussion of the Sparashki example, a user called lc had written, "If you see someone wearing an elf costume at work and conclude elves are real and make disastrous decisions based on that conclusion you are mentally deranged". And indeed, you would be mentally deranged if you did that _on Earth_, because we don't have an elves-are-real conspiracy on Earth.
+
+In elves-are-real conspiracy-world, you (Whistleblower) see someone (Conspirator) wearing an elf costume at work and say, "Nice costume." They say, "What costume?" You say, "I see that you're dressed like an elf, but elves aren't real." They say, "What do you mean? Of course elves are real. I'm right here." You say, "You know exactly what I mean."
+
+It would appear that there's a conflict between Conspirator (who wants to maintain a social reality in which they're an elf, because it's fun, and the conspiracy is sufficiently outlandish that it's assumed that no one is "really" being deceived) and Whistleblower (who wants default social reality to map to actual reality; make-believe is fine at a designated fandom convention which has designated boundaries, but let's be serious at work, where your coworkers are trying to make a living and haven't opted-in to this false social reality).
+
+I was skeptical that a culture where people collude to maintain a fake social reality at their job in a hospital, and everyone else is expected to play along because it's fun, really has this living-in-truth thing. People play those social-reality games on Earth, too, and when _they_ say no one is being deceived, they're _definitely_ lying about that, and I doubted that the eliezera were actually built that differently.
+
+"Natural History of Ashkenazi Intelligence"
+
+(I was tempted to tag that as "epistemic status: low-confidence speculation", but that's _frequentist_ thinking—as if "Jews and gentiles are equally sneaky" were a "null hypothesis" that could only be rejected by data that would be sufficiently unlikely assuming that the null was true. Ha ha, that would be _crazy!_ Obviously, I should have a _prior_ on the effect size difference between the Jew and gentile sneakiness distributions, that can be updated as sneakiness data comes in. I think the mean of my prior distribution is at, like, _d_ ≈ 0.1? So it's not "low confidence"; it's "low confidence of the effect size being large enough to be of much practical significance".)
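+
+(In symbols, a minimal sketch of that framing, where the prior width of 0.5 is a number I'm making up just for illustration:
+
+$$d \sim \mathcal{N}(0.1,\ 0.5^2), \qquad p(d \mid \text{data}) \propto p(\text{data} \mid d)\, p(d)$$
+
+That is, a normal prior over the sneakiness effect size _d_ centered near 0.1, to be updated by the likelihood as sneakiness data comes in, in contrast to testing a point null hypothesis $H_0\colon d = 0$ at some significance threshold.)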
+
+For context on why I have no sense of humor about this, on Earth (which _actually exists_, unlike dath ilan), when someone says "it's not lying, because no one _expected_ me to tell the truth in that situation", what's usually going on, [as Zvi Mowshowitz explains](https://thezvi.wordpress.com/2019/07/02/everybody-knows/), is that conspirators benefit from deceiving outsiders, and the claim that "everyone knows" is them lying to _themselves_ about the fact that they're lying.
+
+(If _you_ got hurt by not knowing, well, it's not like anyone got hurt, because if you didn't know, then you weren't anyone.)
+
+Okay, but if it were _actually true_ that everyone knew, what would be the _function_ of saying the false thing? On dath ilan (if not in Earth boardrooms), I suppose the answer is "Because it's fun"? Okay, but what is the function of your brain giving out a "fun" reward in this context? It seems like at _some_ point, there has to be the expectation of _some_ cognitive system (although possibly not an entire "person") taking the signals literally.
+
+That's why, when I _notice_ myself misrepresenting my actual beliefs or motivations because I think it's funny or rhetorically powerful (and it takes a special act of noticing; humans aren't built to be honest by default), I often take care to disclaim it immediately (as was observed in the message this is a reply to), precisely because I _don't_ think that "everybody knows"; I'm not going to give up on humor or powerful rhetoric, but I'm also not going to delude myself into thinking it's "zero-calorie" (people who don't "get the joke" _are_ going to be misled, and I don't think it's unambiguously "their fault" for not being able to read my "intent" to arbitrary precision).
+
+But maybe dath ilan is sufficiently good at achieving common knowledge in large groups that they _can_ pull off a zero-calorie "everyone knows" conspiracy without damaging shared maps??
+
+I'm still skeptical, especially given that we see them narratizing it as "not lying" (in the same words that corrupt executives on Earth use!), rather than _explicitly_ laying out the evopsych logic of sneakiness superstimuli, and the case that they know how to pull it off in a zero-calorie (trivial damage to shared maps) way.
+
+In general, I think that "it's not lying because no one expected the truth" is something you would say as part of an attempted [nearest-unblocked-strategy end run around a deontological constraint against "lying"](https://www.lesswrong.com/posts/MN4NRkMw7ggt9587K/firming-up-not-lying-around-its-edge-cases-is-less-broadly); I don't think it's something you would say if you _actually cared_ about shared maps being accurate.
+
+What I see on Earth (which, again, _actually exists_, unlike dath ilan, which _does not exist_), is that people mostly drink their own Kool-Aid; lying to the world without lying to yourself is just not psychologically sustainable.
+
+I had some insightful discussion with someone in 2017, in which I was saying that I wanted something to be public knowledge, that was frequently denied for political reasons. (The object-level topic doesn't matter in this context.) This person said, "I don't particularly care about this being commonly recognized. C'mon! It's fun to have some secrets that are not public knowledge".
+
+In 2021, _the same person_ said she felt "disgusted and complicit" for having talked to me:
+
+> At some point during this time I started treating this as a hidden truth that I was proud of myself for being able to see, which I in retrospect I feel disgusted and complicit to have accepted
+
+> above exchanges did in retrospect cause me emotional pain, stress, and contributed to internalizing sexism and transphobia.
+
+I think it's very significant that she _didn't_ say that she encountered _new evidence_ that made her _change her mind_, and decided that she was _actually wrong_ in 2017. The claim is just that the things we said in 2017 are "harmful."
+
+To wrap up: dath ilan is fictional. The reason I'm paranoid and humorless about positive-valence fictional depictions of the "but, but, information can _hurt people_, and hurting people is wrong" and "it's not lying if 'everybody knows'" memes is because I _actually see this stuff destroying people's minds in real life_.
+
+I certainly don't think Yudkowsky is consciously "lying" (natural language is very flexible; you can always _come up with_ some interpretation under which a given statement wasn't a lie).
+
+You can cosplay an elf _at a designated fandom convention_, where people have temporarily _opted in_ to that false social reality. But I don't think you can cosplay an elf _at work_ (in "real life") and have everyone play along for extended periods of time, without dealing damage to real-life shared maps.
+
+Similarly, I cosplay female characters at fandom conventions, and that's fun, and I'm glad that conventions exist, but I can't transition in "real life", because I don't expect anyone in real life to believe that I'm female, because it's _very obviously not true_. People will _pretend_ to believe it because they're terrified of being accused of transphobia, but _they are lying_, and the people who try to claim that no one is being deceived because "everyone knows" _are also lying_.
+
+Keltham contradicts himself [_in the same tag_](https://www.glowfic.com/replies/1865236#reply-1865236):
+
+> The sneakiest thing dath ilan did was covertly shape him to never notice he was a sadist
+> [...]
+> Obviously past-Keltham was shaped in all sorts of ways as a kid, but those shaping-targets are matters of public documentation on the Network. They're not _covert_ intended effects of the alien technology.
+
+I mean, it's worth noting that their concept of a "good reason" literally includes "prediction markets think people will be happier this way". This is not a Society that gives a shit (as a terminal value) about non-Keepers having accurate information (or they wouldn't, _e.g._, gaslight Merrin about how famous she is).
+
+_Of course_ a Society that prizes freedom-from-infohazards as a core value is going to have lots of "good reasons" for the systematically-misleading-representations they make, that will seem genuinely compelling to the people of that Society who are in on it!
+
+One might have hoped that dath ilani would be self-aware enough to notice that things that seem like a "good reason" for a conspiracy _to dath ilani_, would not seem like a "good reason" to people from a Society that prizes freedom-of-speech? But if they've screened off their history (for the greater good, of course), they might not have a concept of what other Societies are like ...
+
+(Yes, I know we've been informed by authorial fiat that dath ilan has a lot of internal diversity, but there are necessarily limits to that if you're going to be a human Society specifically rather than a Solomonoff inductor, and it seems clear that any faction that thinks gaslighting Merrin is morally wrong is on the losing end of the counterfactual warfare of democracy.)
+
+Aslan / amputation of destiny
+
+ * An ethnographer might note that Americans believe themselves to be "the land of the brave and the home of the free", without being obliged for their ethnography to agree with this description. I'm taking the same stance towards dath ilan: as a literary critic, I don't have to share its Society's beliefs about itself.
+
+spoilers for the pleasure of discovering sex for themselves: https://glowfic.com/replies/1812613#reply-1812613
+
+being told eugenics prospects early as self-fulfilling prophecies, as a legitimate infohazard: https://glowfic.com/replies/1812614#reply-1812614
+
+sex and classical mechanics spoilers (spoilers are not the same thing as conspiracies/secrets; Keltham and Carissa aren't supposed to find out): https://glowfic.com/replies/1718168#reply-1718168
+
+https://glowfic.com/replies/1788890#reply-1788890
+> He likes confusing people. Supposedly it's to train strong minds that don't weakly rely on being told how reality works. I think it may be what dath ilan does with all its repressed sadism.
+
+https://glowfic.com/replies/1801463#reply-1801463
+> "I really think all y'all don't give dath ilan enough credit on some dimensions, if not others. They didn't tell me about my sexual sadism because there would have been no good way for me to satisfy it, Pilar, not because they wanted to deny me my utilityfunction. Or did you have something else in mind?"
+
+(masochism search tags)
+https://glowfic.com/replies/search?board_id=&author_id=366&template_id=&character_id=&subj_content=masochism&sort=created_old&condensed=on&commit=Search
+
+https://glowfic.com/replies/1735044#reply-1735044
+> "But yes, or rather, my suspicion is that not many sadists in dath ilan know what they are and Civilization tries to prevent us from finding out, because dath ilan does not have masochists.
+
+https://glowfic.com/replies/1788845#reply-1788845
+> His Lawful Good world tried to make sure he never found out about that. Keltham thinks that's because dath ilan has no masochists. He thinks masochism itself is unlikely
+
+In the #dath-ilan channel of the server, Yudkowsky elaborated on the reasoning for the masochism coverup:
+
+> altruistic sadists would if-counterfactually-fully-informed prefer not to know, because Civilization is capped on the number of happy sadists. even if you can afford a masochist, which requires being very rich, you're buying them away from the next sadist to whom masochists were previously just barely affordable
+
+In response to a question about how frequent sadism is among Keepers, Yudkowsky wrote:
+
+> I think they're unusually likely to be aware, nonpracticing potential sexual sadists. Noticing that sort of thing about yourself, and then not bidding against the next sadist over for the limited masochist supply, and instead just operating your brain so that it doesn't hurt much to know what you can't have, is exactly the kind of cost you're volunteering to take on when you say you wanna be a Keeper.
+> that's archetypally exactly The Sort Of Thing Keepers Do And Are
+
+> They choose not to, not just out of consideration for the next person in line, but because not harming the next person in line is part of the explicit bargain of becoming a Keeper.
+> Like, this sort of thing is exactly what you're signing up for when you throw yourself on the bounded rationality grenade.
+> Let the truth destroy what it can—but in you, not in other people.
+
+I objected (to the room, I told myself, not technically violating my prior intent to not bother Yudkowsky himself anymore) that "Let the truth destroy what it can—in yourself, not in other people" is such an _incredibly_ infantilizing philosophy. It's a meme that optimizes for shaping people (I know, _other_ people) into becoming weak, stupid, and unreflective, like Thellim's impression of Jane Austen characters. I expect people on Earth—not even "rationalists", just ordinary adults—to be able to cope with ... learning facts about psychology that imply that there are desirable sexual experiences they won't get to have.
+
+A user called Numendil insightfully pointed out that dath ilani might be skeptical of an Earthling saying that an unpleasant aspect of our existence is actually fine, for the same reason we would be skeptical of a resident of Golarion saying that; it makes sense for people from richer civilizations to look "spoiled" to people from poorer ones.
+
+Other replies were more disturbing. One participant wrote:
+
+> I think of "not in other people" not as "infantilizing", but as recognizing independent agency. You don't get to do harm to other people without their consent, whether that is physical or pychological.
+
+I pointed out that this obviously applies to, say, religion. Was it wrong to advocate for atheism in a religious Society, where robbing someone of their belief in God might be harming them?
+
+"Every society strikes a balance between protectionism and liberty," someone said. "This isn't news."
+
+Someone else said:
+
+> dath ilan is essentially a paradise world. In a paradise world, people have the slack to make microoptimisations like that, to allow themselves Noble Lies and not fear for what could be hiding in the gaps. Telling the truth is a heuristic for this world where Noble Lies are often less Noble than expected and trust is harder to come by.
+
+I said that I thought people were missing this idea that the reason "truth is better than lies; knowledge is better than ignorance" is such a well-performing [injunction](https://www.lesswrong.com/posts/dWTEtgBfFaz6vjwQf/ethical-injunctions) in the real world (despite the fact that there's no law of physics preventing lies and ignorance from having beneficial consequences), is because [it protects against unknown unknowns](https://www.lesswrong.com/posts/E7CKXxtGKPmdM9ZRc/of-lies-and-black-swan-blowups). Of course an author who wants to portray an ignorance-maintaining conspiracy as being for the greater good, can assert by authorial fiat whatever details are needed to make it all turn out for the greater good, but _that's not how anything works in real life_.
+
+I started a new thread to complain about the attitude I was seeing (Subject: "Noble Secrets; Or, Conflict Theory of Optimization on Shared Maps"). When fiction in this world, _where I live_, glorifies Noble Lies, that's a cultural force optimizing for making shared maps less accurate, I explained. As someone trying to make shared maps _more_ accurate, this force was hostile to me and mine. I understood that "secrets" and "lies" are not the same thing, but if you're a consequentialist thinking in terms of what kinds of optimization pressures are being applied to shared maps, [it's the same issue](https://www.lesswrong.com/posts/YptSN8riyXJjJ8Qp8/maybe-lying-can-t-exist): I'm trying to steer _towards_ states of the world where people know things, and the Keepers of Noble Secrets are trying to steer _away_ from states of the world where people know things. That's a conflict. I was happy to accept Pareto-improving deals to make the conflict less destructive, but I wasn't going to pretend the pro-ignorance forces were my friends just because they self-identified as "rationalists" or "EA"s. I was willing to accept secrets around nuclear or biological weapons, or AGI, on "better ignorant than dead" grounds, but the "protect sadists from being sad" thing wasn't a threat to anyone's life; it was _just_ coddling people who can't handle reality, which made _my_ life worse.
+
+I wasn't buying the excuse that secret-Keeping practices that wouldn't be okay on Earth were somehow okay on dath ilan, which was asserted by authorial fiat to be sane and smart and benevolent enough to make it work. Alternatively, if I couldn't argue with authorial fiat: the reasons why it would be bad on Earth (even if it wouldn't be bad in the author-assertion paradise of dath ilan) are reasons why _fiction about dath ilan is bad for Earth_.
+
+And just—back in the 'aughts, I said, Robin Hanson had this really great blog called _Overcoming Bias_. (You probably haven't heard of it.) I wanted that _vibe_ back, of Robin Hanson's blog in 2008—the will to _just get the right answer_, without all this galaxy-brained hand-wringing about who the right answer might hurt.
+
+(_Overcoming Bias_ had actually been a group blog then, but I was enjoying the æsthetic of saying "Robin Hanson's blog" (when what I had actually loved about _Overcoming Bias_ was Yudkowsky's Sequences) as a way of signaling contempt for the Yudkowsky of the current year.)
+
+I would have expected a subculture descended from the memetic legacy of Robin Hanson's blog in 2008 to respond to that tripe about protecting people from the truth being a form of "recognizing independent agency" with something like—
+
+"Hi! You must be new here! Regarding your concern about truth doing harm to people, a standard reply is articulated in the post ["Doublethink (Choosing to be Biased)"](https://www.lesswrong.com/posts/Hs3ymqypvhgFMkgLb/doublethink-choosing-to-be-biased). Regarding your concern about recognizing independent agency, a standard reply is articulated in the post ["Your Rationality Is My Business"](https://www.lesswrong.com/posts/anCubLdggTWjnEvBS/your-rationality-is-my-business)."
+
+—or _something like that_. Not that the reply needed to use those particular Sequences links, or _any_ Sequences links; what's important is that someone needed to counter this very obvious [anti-epistemology](https://www.lesswrong.com/posts/XTWkjCJScy2GFAgDt/dark-side-epistemology).
+
+And what we actually saw in response to the "You don't get to do harm to other people" message was ... it got 5 "+1" emoji-reactions.
+
+Yudkowsky [chimed in to point out that](/images/yudkowsky-it_doesnt_say_tell_other_people.png) "Doublethink" was about _oneself_ not reasonably being in the epistemic position of knowing that one should lie to oneself. It wasn't about telling the truth to _other_ people.
+
+On the one hand, fair enough. My generalization from "you shouldn't want to have false beliefs for your own benefit" to "you shouldn't want other people to have false beliefs for their own benefit" (and the further generalization to it being okay to intervene) was not in the text of the post itself. It made sense for Yudkowsky to refute my misinterpretation of the text he wrote.
+
+On the other hand—given that he was paying attention to this #overflow thread anyway, I might have naïvely hoped that he would appreciate what I was trying to do?—that, after the issue had been pointed out, he would decide that he _wanted_ his chatroom to be a place where we don't want other people to have false beliefs for their own benefit?—a place that approves of "meddling" in the form of _telling people things_.
+
+The other chatroom participants mostly weren't buying what I was selling.
+
+A user called April wrote that "the standard dath ilani has internalized almost everything in the sequences": "it's not that the standards are being dropped[;] it's that there's an even higher standard far beyond what anyone on earth has accomplished". (This received a checkmark emoji-react from Yudkowsky, an indication of his agreement/endorsement.)
+
+Someone else said he was "pretty leery of 'ignore whether models are painful' as a principle, for Earth humans to try to adopt," and went on to offer some thoughts for Earth. I continued to maintain that it was ridiculous that we were talking of "Earth humans" as if there were any other kind—as if rationality in the Yudkowskian tradition wasn't something to aspire to in real life.
+
+Dath ilan [is _fiction_](https://www.lesswrong.com/posts/rHBdcHGLJ7KvLJQPk/the-logical-fallacy-of-generalization-from-fictional), I pointed out. Dath ilan _does not exist_. I thought it was a horrible distraction to try to see our world through Thellim's eyes and feel contempt over how much better things must be on dath ilan (which, to be clear, again, _does not exist_), when one could be looking through the eyes of an ordinary reader of Robin Hanson's blog in 2008 (the _real_ 2008, which _actually happened_), and seeing everything we've lost.
+
+[As it was taught to me then](https://www.lesswrong.com/posts/iiWiHgtQekWNnmE6Q/if-you-demand-magic-magic-won-t-help): if you demand Keepers, _Keepers won't help_. If I'm going to be happy anywhere, or achieve greatness anywhere, or learn true secrets anywhere, or save the world anywhere, or feel strongly anywhere, or help people anywhere—I may as well do it _on Earth_.
+
+The thread died out soon enough. I had some more thoughts about dath ilan's predilection for deception, of which I typed up some notes for maybe adapting into a blog post later, but there was no point in wasting any more time on Discord.
+
+On 29 November 2022 (four years and a day after the "hill of meaning in defense of validity" Twitter performance that had ignited my rationalist civil war), Yudkowsky remarked about the sadism coverup again:
+
+> Keltham is a romantically obligate sadist. This is information that could've made him much happier if masochists had existed in sufficient supply; Civilization has no other obvious-to-me-or-Keltham reason to conceal it from him.
+
+Despite the fact that there was no point in wasting any more time on Discord, I decided not to resist the temptation to open up the thread again and dump some paragraphs from my notes on the conspiracies of dath ilan.
+
+---------
+
+A user called ajvermillion asked why I was being so aggressively negative about dath ilan. He compared it to Keltham's remark about how [people who grew up under a Lawful Evil government were disposed to take a more negative view of paternalism](https://www.glowfic.com/replies/1874754#reply-1874754) than people do in dath ilan, where paternalism basically works fine because dath ilan is benevolent.
+
+This question put me in a somewhat awkward position: it was a legitimate question that I felt I had to answer, but I had no way of answering honestly without at least _alluding_ to my prior grievances against Yudkowsky ... which were off-topic for the server. (Again, I had told myself that I was here to comment on the story, not to prosecute my grievances.)
+
+I tried to explain, briefly. Someone who might be even _more_ paranoid about abuses of power than someone who grew up with a Lawful Evil government, is someone who grew up under a power structure that put on a _good show_ of being clean and nice, but was actually corrupt and mean.
+
+Yudkowsky had this whole marketing image of himself as being uniquely sane and therefore uniquely benevolent, and because his Sequences were so life-changingly good, I _actually fell for it_. There was a long Dumb Story (this Story) that was off-topic and that I hadn't finished writing up at the time, but basically, I had what I claimed were very strong reasons not to trust the guy anymore; I think he cares a lot about not explicitly _lying_, but what made the Sequences special is that they articulated a vastly higher standard than that, one that he had no intention of living up to.
+
+And so, yeah, insofar as fiction about dath ilan functioned as marketing material for Yudkowsky's personality cult that I thought was damaging people like me (in some ways, while simultaneously helping us in other ways), I had an incentive to come up with literary criticism that paints dath ilan negatively?
+
+It was great for ajvermillion to notice this! It _would_ be bad if my brain were configured to come up with dath-ilan-negative literary criticism, and for me to _simultaneously_ present myself as an authority on dath ilan whom you should trust. But if dath-ilan-negative literary criticism was undersupplied for structural reasons (because people who like a story are selected for not seeing things the story is doing that are Actually Bad), and my brain was configured to generate it anyway (because I disliked the person Yudkowsky had become, in contrast to the person he was in 2008), it seemed pro-social for me to post it, for other people to take or leave according to their own judgement?
✓ complicity and friendship
✓ plan to reach out to "Ethan"
✓ Michael on creepy men/crazy men
+✓ repair pt. 5 dath ilan transition
_ State of Steven
_ reaction to Ziz
-_ repair pt. 5 dath ilan transition
- Eliezerfic fight conclusion
blocks to fit somewhere—
_ mention Nick Bostrom email scandal (and his not appearing on the one-sentence CAIS statement)
_ revise and cut words from "bad faith" section since can link to "Assume Bad Faith"
_ cut words from January 2020 Twitter exchange (after war criminal defenses)
-_ revise reply to Xu
_ everyone *who matters* prefers to stay on the good side
pt. 5 edit tier—
_ probably cut the vaccine polarization paragraphs? (overheard at a party is not great sourcing, even if technically admissible)
_ elaborate on how 2007!Yudkowsky and 2021!Xu are saying the opposite things if you just take a plain-language reading and consider, not whether individual sentences can be interpreted as "true", but what kind of _optimization_ the text is doing to the behavior of receptive readers
_ Scott got comas right in the same year as "Categories"
+_ revise reply to Xu
_ cite Earthling/postrat sneers
_ cite postYud Tweet
_ when EY put a checkmark on my Discord message characterizing his strategy as giving up on intellectual honesty
_ cut lots of words from Scott's comments on Jessica's MIRI post (keep: "attempting to erase the agency", Scott blaming my troubles on Michael being absurd)
_ sucking Scott's dick is helpful because he's now the main gateway instead of HPMOR
_ Sarah's point that Scott gets a lot of undeserved deference, too: https://twitter.com/s_r_constantin/status/1435609950303162370
+_ clarify that Keltham infers there are no masochists, vs. Word of God
+_ "Doublethink" ref in Xu discussion should mention that Word of God Eliezerfic clarification that it's not about telling others
dath ilan ancillary tier—
_ Who are the 9 most important legislators called?
_ collect Earth people sneers
_ psyops don't require people taking them seriously, only that they make people think the government is doing psyops
-
+_ the reason "truth is better than lies; knowledge is better than ignorance" is general; can assert by authorial fiat whatever details are needed to make it all turn out for the greater good, but _that's not how anything works in real life_.
things to discuss with Michael/Ben/Jessica—
_ Anna on Paul Graham
_ Michael's SLAPP against REACH
_ Michael on creepy and crazy men
-
------
With internet available—