-[OUTLINE—
- * Introduction and Thesis
- * Yudkowsky's new fictional universe is dath ilan, a medianworld centered on him (the race of dath ilani humans is called the "eliezera" in Word of God canon); the "joke" is that this is where the rationality tech of the Sequences came from (per the 2014 April Fools' Day post). Yudkowsky even talks this way in other contexts, including a trope of making fun of "Earth people" and presenting an eliezera racial supremacy narrative. (It's still a racial supremacy narrative even if he doesn't _use the verbatim phrase_ "racial supremacy.") One is led to believe that dath ilan represents a canonical rationalist utopia, someplace that readers of the Sequences would be proud of.
- * And yet, for such a supposed rationalist utopia, it's striking that dath ilan is _lying about everything_, seemingly whenever it "sounds like a good idea" to someone: not just keeping AGI secrets (the way we keep nuclear secrets on Earth), but preventing the idea from coming up as science fiction, keeping Merrin in a Truman-show-like reality where she doesn't know that she's famous, hiding info about sexuality on utilitarian grounds—even bizarre trivial stuff like knives. I discuss these examples in detail later in this essay.
- * Bluntly, this is not a culture that gives a shit about people being well-informed. This is a culture that has explicitly chosen otherwise.
- * In more detail: the algorithm that designed dath ilani Civilization is one that systematically favors plans that involve deception over plans that involve being honest.
- * This is not a normative claim or a generic slur that dath ilani are "evil" or "bad"; it's a positive claim about systematic deception. If you keep seeing plans whose social-deception value exceeds their claimed social-benefit value, you should infer that the plans are being generated by a process that "values" (is optimizing for) deception, whether or not that process is a person or a conscious mind.
- * Watsonian rationale: with smarter people, knowledge actually is dangerous. I'm more interested in the Doylist interpretation: that this reflects authoritarian tendencies in the later Yudkowsky's thought.
-
- * Interlude: "I can't argue with authorial fiat"
- * A worldbuilding critic who takes a negative view of some world might be told that they're not allowed to contradict authorial intent. If the narrator says the people of dath ilan are doing something because they're good and smart and cooperative, the critic has to accept that.
- * But what makes the medianworld exercise interesting is that it's about trying to portray a realistic world, given a shifted distribution of psychological traits. We take the text of the story as a given, but we don't have to take dath ilan's self-image literally, if we think a different world could "project" into the same text and explain it better.
- * An ethnographer might note that Americans believe themselves to be "the land of the free and the home of the brave", without being obliged for their ethnography to agree with this description. I'm taking the same stance towards dath ilan: as a literary critic, I don't have to share its Society's beliefs about itself.
-
- * The Merrin Show
- * Merrin: a reverse Emperor Norton case
- * Overcoming Bias readers in 2008 would have found this offensive, not cute; Merrin is living a lie, and everyone is in on it.
- * Rittaen's claim that knowledge of psychology turns you into a psychopath seems dubious.
- * Exception Handling's Sparashki ruse is still kicking up epistemic dust even if few take it literally
- * "Everybody knows"
- * It has to bottom out in someone taking it literally
- * zero-calorie superstimulus theory