From 55cd7204cde09d849cb16863a0125101d0dd828e Mon Sep 17 00:00:00 2001 From: "M. Taylor Saotome-Westlake" Date: Mon, 27 Feb 2023 22:11:41 -0800 Subject: [PATCH] memoir: theory of conspiracies --- ...mments-on-the-conspiracies-of-dath-ilan.md | 6 +-- content/drafts/standing-under-the-same-sky.md | 54 ++++++++++++------- 2 files changed, 36 insertions(+), 24 deletions(-) diff --git a/content/drafts/comments-on-the-conspiracies-of-dath-ilan.md b/content/drafts/comments-on-the-conspiracies-of-dath-ilan.md index aac38ad..4f02ebb 100644 --- a/content/drafts/comments-on-the-conspiracies-of-dath-ilan.md +++ b/content/drafts/comments-on-the-conspiracies-of-dath-ilan.md @@ -4,16 +4,11 @@ Category: commentary Tags: Eliezer Yudkowsky, worldbuilding Status: draft -If we believe that IQ research validates the "Jews are clever" stereotype, I wonder if there's a distinct (albeit probably correlated) "enjoying deception" trait that validates the "Jews are sneaky" stereotype? - "Natural History of Ashkenazi Intelligence" -https://web.mit.edu/fustflum/documents/papers/AshkenaziIQ.jbiosocsci.pdf (I was tempted to tag that as "epistemic status: low-confidence speculation", but that's _frequentist_ thinking—as if "Jews and gentiles are equally sneaky" were a "null hypothesis" that could only be rejected by data that would be sufficiently unlikely assuming that the null was true. Ha ha, that would be _crazy!_ Obviously, I should have a _prior_ on the effect size difference between the Jew and gentile sneakiness distributions, which can be updated as sneakiness data comes in. I think the mean of my prior distribution is at, like, _d_ ≈ 0.1? So it's not "low confidence"; it's "low confidence of the effect size being large enough to be of much practical significance".) -Anyway, if dath ilan is very high in the sneakiness trait (relative to Earth), that would help explain all the conspiracies! -Not-actually-plausible conspiracies that everyone is in on (like "Sparashki are real") are a superstimulus like zero-calorie sweetener: engineered to let everyone enjoy the thrill of lying, without doing any real damage to shared maps. For context on why I have no sense of humor about this, on Earth (which _actually exists_, unlike dath ilan), when someone says "it's not lying, because no one _expected_ me to tell the truth in that situation", what's usually going on (as Zvi Mowshowitz explains: https://thezvi.wordpress.com/2019/07/02/everybody-knows/) is that conspirators benefit from deceiving outsiders, and the claim that "everyone knows" is them lying to _themselves_ about the fact that they're lying. @@ -27,6 +22,7 @@ That's why, when I _notice_ myself misrepresenting my actual beliefs or motivati But maybe dath ilan is sufficiently good at achieving common knowledge in large groups that they _can_ pull off a zero-calorie "everyone knows" conspiracy without damaging shared maps?? + I'm still skeptical, especially given that we see them narratizing it as "not lying" (in the same words that corrupt executives on Earth use!), rather than _explicitly_ laying out the evopsych logic of sneakiness superstimuli, and the case that they know how to pull it off in a zero-calorie (trivial damage to shared maps) way. 
In general, I think that "it's not lying because no one expected the truth" is something you would say as part of an attempted nearest-unblocked-strategy end run around a deontological constraint against "lying" (https://www.lesswrong.com/posts/MN4NRkMw7ggt9587K/firming-up-not-lying-around-its-edge-cases-is-less-broadly); I don't think it's something you would say if you _actually cared_ about shared maps being accurate. diff --git a/content/drafts/standing-under-the-same-sky.md b/content/drafts/standing-under-the-same-sky.md index c63ddb6..771d987 100644 --- a/content/drafts/standing-under-the-same-sky.md +++ b/content/drafts/standing-under-the-same-sky.md @@ -570,11 +570,11 @@ Everyone in dath ilan receives rationality training from childhood, but knowledg Something that annoyed me about the portrayal of dath ilan was their incredibly casual attitude towards hiding information for some alleged greater good, seemingly without considering that [there are benefits and not just costs to people knowing things](http://benjaminrosshoffman.com/humility-argument-honesty/). -You can, of course, make up a sensible [Watsonian](https://tvtropes.org/pmwiki/pmwiki.php/Main/WatsonianVersusDoylist) rationale for this. A world with much smarter people is more "volatile"; with more ways for criminals and terrorists to convert knowledge into danger, maybe you _need_ more censorship just to prevent Society from blowing up. +You can, of course, make up a sensible [Watsonian](https://tvtropes.org/pmwiki/pmwiki.php/Main/WatsonianVersusDoylist) rationale for this. A world with much smarter people is more "volatile"; with more ways for criminals and terrorists to convert knowledge into danger, maybe you _need_ more censorship just to prevent Society from blowing itself up. I'm more preoccupied by a [Doylistic](https://tvtropes.org/pmwiki/pmwiki.php/Main/WatsonianVersusDoylist) interpretation—that dath ilan's obsessive secret-Keeping reflects something deep about how the Yudkowsky of the current year relates to speech and information, in contrast to the Yudkowsky who wrote the Sequences. The Sequences had encouraged you—yes, _you_, the reader—to be as rational as possible. In contrast, the dath ilan mythos seems to portray advanced rationality as dangerous knowledge that people need to be protected from. ["The universe is not so dark a place that everyone needs to become a Keeper to ensure the species's survival,"](https://glowfic.com/replies/1861879#reply-1861879) we're told. "Just dark enough that some people ought to." -Someone at the 2021 Event Horizon Independence Day party had told me that I had been misinterpreting the "Speak the truth, even if your voice trembles" slogan from the Sequences. I had interpreted the slogan as suggesting the importance of speaking the truth _to other people_ (which I think is what "speaking" is usually about), but my interlocutor said it was about, for example, being able to speak the truth aloud in your own bedroom, to yourself. I think some textual evidence for my interpretation can be found in Daria's ending to ["A Fable of Science and Politics"](https://www.lesswrong.com/posts/6hfGNLf4Hg5DXqJCF/a-fable-of-science-and-politics): +Someone at the 2021 Event Horizon Independence Day party had told me that I had been misinterpreting the "Speak the truth, even if your voice trembles" slogan from the Sequences. 
I had interpreted the slogan as suggesting the importance of speaking the truth _to other people_ (which I think is what "speaking" is usually about), but my interlocutor said it was about, for example, being able to speak the truth aloud in your own bedroom, to yourself. I think some textual evidence for my interpretation can be found in Daria's ending to ["A Fable of Science and Politics"](https://www.lesswrong.com/posts/6hfGNLf4Hg5DXqJCF/a-fable-of-science-and-politics), a multiple-parallel-endings story about an underground Society divided into factions over the color of the unseen sky, and one person's reaction when they find a passageway leading aboveground to a view of the sky: > Daria, once Green, tried to breathe amid the ashes of her world. _I will not flinch_, Daria told herself, _I will not look away_. She had been Green all her life, and now she must be Blue. Her friends, her family, would turn from her. _Speak the truth, even if your voice trembles_, her father had told her; but her father was dead now, and her mother would never understand. Daria stared down the calm blue gaze of the sky, trying to accept it, and finally her breathing quietened. _I was wrong_, she said to herself mournfully; _it's not so complicated, after all_. She would find new friends, and perhaps her family would forgive her ... or, she wondered with a tinge of hope, rise to this same test, standing underneath this same sky? "The sky is blue," Daria said experimentally, and nothing dire happened to her; but she couldn't bring herself to smile. Daria the Blue exhaled sadly, and went back into the world, wondering what she would say. @@ -588,21 +588,25 @@ For example, we are told of an Ordinary Merrin Conspiracy centered around a famo But _as_ a rationalist, I condemn the Ordinary Merrin Conspiracy as _morally wrong_, for the same [reasons I condemn the Emperor Norton Conspiracy](/2018/Feb/the-categories-were-made-for-man-to-make-predictions/#emperor-norton). As [it was taught to me on _Overcoming Bias_ back in the 'aughts](https://www.lesswrong.com/posts/HYWhKXRsMAyvRKRYz/you-can-face-reality): what's true is already so. Denying it won't make it better. Acknowledging it won't make it worse. And _because_ it is true, it is what is there to be interacted with. Anything untrue isn't there to be lived. People can stand what is true, _because they are already doing so_. -In [the story about how Merrin came to the attention of dath ilan's bureau of Exception Handling](https://glowfic.com/posts/6263), we see the thoughts of a Keeper, Rittaen, who talks to Merrin. We're told that the discipline of modeling people mechanistically rather than [through empathy](https://www.lesswrong.com/posts/NLMo5FZWFFq652MNe/sympathetic-minds) is restricted to Keepers to prevent the risk of ["turning into an exceptionally dangerous psychopath"](https://glowfic.com/replies/1862201#reply-1862201). Rittaen [uses his person-as-machine Sight](https://glowfic.com/replies/1862204#reply-1862204) to infer that Merrin was biologically predisposed to learn to be afraid of having too much status. +In ["For No Laid Course Prepare"](https://glowfic.com/posts/6263), the story about how Merrin came to the attention of dath ilan's bureau of Exception Handling, we see the thoughts of a Keeper, Rittaen, who talks to Merrin. 
We're told that the discipline of modeling people mechanistically rather than [through empathy](https://www.lesswrong.com/posts/NLMo5FZWFFq652MNe/sympathetic-minds) is restricted to Keepers to prevent the risk of ["turning into an exceptionally dangerous psychopath"](https://glowfic.com/replies/1862201#reply-1862201). Rittaen [uses his person-as-machine Sight](https://glowfic.com/replies/1862204#reply-1862204) to infer that Merrin was biologically predisposed to learn to be afraid of having too much status. Notwithstanding that Rittaen can be Watsonianly assumed to have detailed neuroscience skills that the author Doylistically doesn't know how to write, I am entirely unimpressed by the assertion that this idea is somehow _dangerous_, a secret that only Keepers can bear, rather than something _Merrin herself should be clued into_. "It's not [Rittaen's] place to meddle just because he knows Merrin better than Merrin does," we're told. -In the same story, an agent from Exception Handling [tells Merrin that the bureau's Fake Conspiracy section is running an operation to plant evidence that Sparashki (the fictional alien Merrin happens to be dressed up as) are real](https://glowfic.com/replies/1860952#reply-1860952), and asks Merrin not to contradict this, and Merrin just ... goes along with it. It's in-character for Merrin to go along with it, because she's a pushover. My question is, why is it okay that Exception Handling has a Fake Conspiracies section, any more than it would have been if FTX or Enron explicitly had a Fake Accounting department? (Because dath ilan are the designated good guys? Well, so was FTX.) +In the same story, Merrin is dressed up as a member of a fictional alien species, the Sparashki, due to having been summoned to the hospital from a fan convention with no time to change outfits. An agent from Exception Handling [tells Merrin that the bureau's Fake Conspiracy section is running an operation to plant evidence that Sparashki are real](https://glowfic.com/replies/1860952#reply-1860952), and asks Merrin not to contradict this, and Merrin just ... goes along with it. + +It's in-character for Merrin to go along with it, because she's a pushover. My question is, why is it okay that Exception Handling has a Fake Conspiracies section (!), any more than it would have been if FTX or Enron explicitly had a Fake Accounting department? + +(Is it because dath ilan are the designated good guys? Well, so was FTX.) As another notable example of dath ilan hiding information for the alleged greater good, in Golarion, Keltham discovers that he's a sexual sadist, and deduces that Civilization has deliberately prevented him from realizing this, because there aren't enough corresponding masochists to go around in dath ilan. Having concepts for "sadism" and "masochism" as variations in human psychology would make sadists like Keltham sad about the desirable sexual experiences they'll never get to have, so Civilization arranges for them to _not be exposed to knowledge that would make them sad, because it would make them sad_ (!!). It did not escape my notice that when "rationalist" authorities _in real life_ considered public knowledge of some paraphilia to be an infohazard (ostensibly for the benefit of people with that paraphilia), I _didn't take it lying down_. 
-This parallel between dath ilan's sadism/masochism coverup and the autogynephilia coverup I had fought in real life, was something I was only intending to comment on in passing in the present memoir, rather than devoting any more detailed analysis to, but as I was having trouble focusing on my own writing in September 2022, I ended up posting some critical messages about dath ilan's censorship regime in the "Eliezerfic" Discord server for reader discussion of _Planecrash_, using the masochism coverup as my central example. +This parallel between dath ilan's sadism/masochism coverup and the autogynephilia coverup I had fought in real life was something I was only intending to comment on in passing in the present memoir, rather than devoting any more detailed analysis to, but as I was having trouble focusing on my own writing in September 2022, I ended up posting some critical messages about dath ilan's censorship regime in the "Eliezerfic" Discord server for reader discussion of _Planecrash_, using the sadism/masochism coverup as my central example. -What happens, I asked, to the occasional dath ilani free speech activists, with their eloquent manifestos arguing that Civilization would be better off coordinating on maps that reflect the territory, rather than coordinating to be a Keeper-managed zoo? (They _had_ to exist: in a medianworld centered on Yudkowsky, there are going to a few weirdos who are +2.5 standard deviations on "speak the truth, even if your voice trembles" and −2.5 standard deivations on love of clever plots; this seems less weird than negative utilitarians, who were [established to exist](https://www.glowfic.com/replies/1789623#reply-1789623).) I _assumed_ they get dealt with somehow in the end (exiled from most cities? involuntarily cryopreserved?), but there had got to be an interesting story about someone who starts out whistleblowing small lies (which Exception Handling allows; they think it's cute, and it's "priced in" to the game they're playing), and then just keeps _escalating and escalating and escalating_ until Governance decides to unperson him. +What happens, I asked, to the occasional dath ilani free speech activists, with their eloquent manifestos arguing that Civilization would be better off coordinating on maps that reflect the territory, rather than coordinating to be a Keeper-managed zoo? (They _had_ to exist: in a medianworld centered on Yudkowsky, there are going to be a few weirdos who are +2.5 standard deviations on "speak the truth, even if your voice trembles" and −2.5 standard deviations on love of clever plots; this seems less weird than negative utilitarians, who were [established to exist](https://www.glowfic.com/replies/1789623#reply-1789623).) I _assumed_ they get dealt with somehow in the end (exiled from most cities? ... involuntarily cryopreserved?), but there had to be an interesting story about someone who starts out whistleblowing small lies (which Exception Handling allows; they think it's cute, and it's "priced in" to the game they're playing), and then just keeps _escalating and escalating and escalating_ until Governance decides to unperson him. -Although Yudkowsky participated in the server, I had reasoned that my participation didn't violate my previous intent not to bother him anymore, because it was a publicly-linked Discord server with hundreds of members. 
Me criticizing the story for the benefit of the _other_ 499 people in the chat room wouldn't generate a notification _for him_, the way it would if I sent him an email or replied to him on Twitter. +Although Yudkowsky participated in the server, I had reasoned that my participation didn't violate my previous intent not to bother him anymore, because it was a publicly-linked Discord server with hundreds of members. Me commenting on the story for the benefit of the _other_ 499 people in the chat room wouldn't generate a notification _for him_, the way it would if I sent him an email or replied to him on Twitter. In the #dath-ilan channel of the server, Yudkowsky elaborated on the reasoning for the masochism coverup: @@ -637,33 +641,35 @@ Someone else said: I said that I thought people were missing this idea that the reason "truth is better than lies; knowledge is better than ignorance" is such a well-performing injunction in the real world (despite the fact that there's no law of physics preventing lies and ignorance from having beneficial consequences), is that it protects against unknown unknowns. Of course an author who wants to portray an ignorance-maintaining conspiracy as being for the greater good can assert by authorial fiat whatever details are needed to make it all turn out for the greater good, but _that's not how anything works in real life_. -I started a new thread to complain about the attitude I was seeing (Subject: "Noble Secrets; Or, Conflict Theory of Optimization on Shared Maps"). When fiction in this world, _where I live_, glorifies Noble Lies, that's a cultural force optimizing for making shared maps less accurate, I explained. As someone trying to make shared maps _more_ accurate, this force was hostile to me and mine. I understood that secrets and lies are different, but if you're a consequentialist thinking in terms of what kinds of optimization pressures are being applied to shared maps, it's the same issue: I'm trying to steer _towards_ states of the world where people know things, and the Keepers of Noble Secrets are trying to steer _away_ from states of the world where people know things. That's a conflict. I was happy to accept Pareto-improving deals to make the conflict less destructive, but I wasn't going to pretend the pro-ignorance forces were my friends just because they self-identify as "rationalists" or "EA"s. I was willing to accept secrets around nuclear or biological weapons, or AGI, on "better ignorant than dead" grounds, but the "protect sadists from being sad" thing was _just_ coddling people who can't handle the truth, which made _my_ life worse. +I started a new thread to complain about the attitude I was seeing (Subject: "Noble Secrets; Or, Conflict Theory of Optimization on Shared Maps"). When fiction in this world, _where I live_, glorifies Noble Lies, that's a cultural force optimizing for making shared maps less accurate, I explained. As someone trying to make shared maps _more_ accurate, this force was hostile to me and mine. I understood that "secrets" and "lies" are not the same thing, but if you're a consequentialist thinking in terms of what kinds of optimization pressures are being applied to shared maps, [it's the same issue](https://www.lesswrong.com/posts/YptSN8riyXJjJ8Qp8/maybe-lying-can-t-exist): I'm trying to steer _towards_ states of the world where people know things, and the Keepers of Noble Secrets are trying to steer _away_ from states of the world where people know things. That's a conflict. 
I was happy to accept Pareto-improving deals to make the conflict less destructive, but I wasn't going to pretend the pro-ignorance forces were my friends just because they self-identified as "rationalists" or "EA"s. I was willing to accept secrets around nuclear or biological weapons, or AGI, on "better ignorant than dead" grounds, but the "protect sadists from being sad" thing wasn't a threat to life; it was _just_ coddling people who can't handle reality, which made _my_ life worse. -I wasn't buying the excuse that secret-Keeping practices that wouldn't be OK on Earth were somehow OK on dath ilan, which was asserted by authorial fiat to be sane and smart and benevolent enough to make it work. Or if I couldn't argue with authorial fiat: the reasons why it would be bad on Earth (even if it wouldn't be bad on dath ilan) are reasons why _fiction about dath ilan is bad for Earth_. +I wasn't buying the excuse that secret-Keeping practices that wouldn't be okay on Earth were somehow okay on dath ilan, which was asserted by authorial fiat to be sane and smart and benevolent enough to make it work. Alternatively, if I couldn't argue with authorial fiat: the reasons why it would be bad on Earth (even if it wouldn't be bad in the author-assertion paradise of dath ilan) are reasons why _fiction about dath ilan is bad for Earth_. -And just—back in the 'aughts, I said, Robin Hanson had this really great blog called _Overcoming Bias_. (You probably haven't heard of it.) I wanted that _vibe_ back, of Robin Hanson's blog in 2008—the will to _just get the right answer_, without all this galaxy-brained hand-wringing about who the right answer might hurt. +And just—back in the 'aughts, I said, Robin Hanson had this really great blog called _Overcoming Bias_. (You probably haven't heard of it.) I wanted that _vibe_ back, of Robin Hanson's blog[^overcoming-bias] in 2008—the will to _just get the right answer_, without all this galaxy-brained hand-wringing about who the right answer might hurt. -I would have expected a subculture descended from the memetic legacy of Robin Hanson's blog in 2008 to respond to that tripe about protecting people from being destroyed by the truth as a form of "recognizing independent agency" with something like— +[^overcoming-bias]: _Overcoming Bias_ had actually been a group blog then, but I was enjoying the æsthetic of saying "Robin Hanson's blog" (when what I had actually loved about _Overcoming Bias_ was Yudkowsky's Sequences) as a way of signaling contempt for the Yudkowsky of the current year. + +I would have expected a subculture descended from the memetic legacy of Robin Hanson's blog in 2008 to respond to that tripe about protecting people from the truth being a form of "recognizing independent agency" with something like— "Hi! You must be new here! Regarding your concern about truth doing harm to people, a standard reply is articulated in the post ["Doublethink (Choosing to be Biased)"](https://www.lesswrong.com/posts/Hs3ymqypvhgFMkgLb/doublethink-choosing-to-be-biased). Regarding your concern about recognizing independent agency, a standard reply is articulated in the post ["Your Rationality Is My Business"](https://www.lesswrong.com/posts/anCubLdggTWjnEvBS/your-rationality-is-my-business)." -—or _something like that_. Not that the reply needed to use those particular Sequences links, or _any_ Sequences links; what's important is that someone needs counter to this very obvious [anti-epistemology](https://www.lesswrong.com/posts/XTWkjCJScy2GFAgDt/dark-side-epistemology). 
+—or _something like that_. Not that the reply needed to use those particular Sequences links, or _any_ Sequences links; what's important is that someone needed to counter this very obvious [anti-epistemology](https://www.lesswrong.com/posts/XTWkjCJScy2GFAgDt/dark-side-epistemology). And what we actually saw in response to the "You don't get to do harm to other people" message was ... it got 5 "+1" emoji-reactions. Yudkowsky [chimed in to point out that](/images/yudkowsky-it_doesnt_say_tell_other_people.png) "Doublethink" was about _oneself_ not reasonably being in the epistemic position of knowing that one should lie to oneself. It wasn't about telling the truth to _other_ people. -On the one hand, fair enough. My generalization from "you shouldn't want to have false beliefs for your own benefit" to "you shouldn't want other people to have false beliefs for their own benefit" (and the further generalization to it being OK to intervene) was not in the text of the post itself. It made sense for Yudkowsky to refute my misinterpretation of the text he wrote. +On the one hand, fair enough. My generalization from "you shouldn't want to have false beliefs for your own benefit" to "you shouldn't want other people to have false beliefs for their own benefit" (and the further generalization to it being okay to intervene) was not in the text of the post itself. It made sense for Yudkowsky to refute my misinterpretation of the text he wrote. -On the other hand—given that Yudkowsky was paying attention to this #overflow thread anyway, I might have naïvely hoped that he would appreciate what I was trying to do?—that, after the issue had been pointed out, he would decided that he _wanted_ his chatroom to be a place where we don't want other people to have false beliefs for their own benefit?—a place that approves of "meddling" in the form of _telling people things_. +On the other hand—given that he was paying attention to this #overflow thread anyway, I might have naïvely hoped that he would appreciate what I was trying to do?—that, after the issue had been pointed out, he would decide that he _wanted_ his chatroom to be a place where we don't want other people to have false beliefs for their own benefit?—a place that approves of "meddling" in the form of _telling people things_. -The other participants mostly weren't buying what I was selling. +The other chatroom participants mostly weren't buying what I was selling. A user called April wrote that "the standard dath ilani has internalized almost everything in the sequences": "it's not that the standards are being dropped[;] it's that there's an even higher standard far beyond what anyone on earth has accomplished". (This received a checkmark emoji-react from Yudkowsky, an indication of his agreement.) -Someone else said he was "pretty leery of 'ignore whether models are painful' as a principle, for Earth humans to try to adopt," and went on to offer some thoughts for Earth. I continued to think it was ridiculous that we were talking of "Earth humans" as if there were any other kind—as if rationality in the Yudkowskian tradition wasn't something to aspire to in real life. +Someone else said he was "pretty leery of 'ignore whether models are painful' as a principle, for Earth humans to try to adopt," and went on to offer some thoughts for Earth. I continued to maintain that it was ridiculous that we were talking of "Earth humans" as if there were any other kind—as if rationality in the Yudkowskian tradition wasn't something to aspire to in real life. 
-Dath ilan [is _fiction_](https://www.lesswrong.com/posts/rHBdcHGLJ7KvLJQPk/the-logical-fallacy-of-generalization-from-fictional), I pointed out. Dath ilan _does not exist_. It was a horrible distraction to try to see our world through Thellim's eyes and feel contempt over how much better things must be on dath ilan (which, to be clear, again, _does not exist_), when one could be looking through the eyes of an ordinary reader of Robin Hanson's blog in 2008 (the _real_ 2008, which _actually happened_), and seeing everything we've lost. +Dath ilan [is _fiction_](https://www.lesswrong.com/posts/rHBdcHGLJ7KvLJQPk/the-logical-fallacy-of-generalization-from-fictional), I pointed out. Dath ilan _does not exist_. I thought it was a horrible distraction to try to see our world through Thellim's eyes and feel contempt over how much better things must be on dath ilan (which, to be clear, again, _does not exist_), when one could be looking through the eyes of an ordinary reader of Robin Hanson's blog in 2008 (the _real_ 2008, which _actually happened_), and seeing everything we've lost. [As it was taught to me then](https://www.lesswrong.com/posts/iiWiHgtQekWNnmE6Q/if-you-demand-magic-magic-won-t-help): if you demand Keepers, _Keepers won't help_. If I'm going to be happy anywhere, or achieve greatness anywhere, or learn true secrets anywhere, or save the world anywhere, or feel strongly anywhere, or help people anywhere—I may as well do it _on Earth_. @@ -675,9 +681,19 @@ On 29 November 2022 (four years and a day after the "hill of meaning in defense Despite the fact that there was no point in wasting any more time on Discord, I decided not to resist the temptation to open up the thread again and dump some paragraphs from my notes on the conspiracies of dath ilan. -[TODO: explain my sneakiness theory, shove the anti-semitism into a footnote] +If we believe that [IQ research validates the "Jews are clever" stereotype](https://web.mit.edu/fustflum/documents/papers/AshkenaziIQ.jbiosocsci.pdf), I wondered if there's a distinct (albeit probably correlated) "enjoying deception" trait that validates the "Jews are sneaky" stereotype. If dath ilan is very high in this "sneakiness" trait (relative to Earth Jews), that would help explain all the conspiracies![^edgy-anti-semitism] + +[^edgy-anti-semitism]: It probably would have been possible to bring up the sneakiness-trait hypothesis in a less edgy way, but I didn't care to. + +Not-actually-plausible conspiracies that everyone is in on (like "Sparashki are real") are a [superstimulus](https://www.lesswrong.com/posts/Jq73GozjsuhdwMLEG/superstimuli-and-the-collapse-of-western-civilization) like zero-calorie sweetener: engineered to let everyone enjoy the thrill of lying, without doing any real damage to shared maps. + +In "For No Laid Course Prepare", Merrin's coworkers falsely maintain to outsiders that Merrin always cosplays as a Sparashki while on duty. "This is not considered a lie, in that it would be universally understood and expected that no one in this social circumstance would tell the truth," the narrator tells us. The language used here is strikingly similar to that of one of the corrupt executives in _Moral Mazes_: "We lie all the time, but if everyone knows that we're lying, is a lie really a lie?" + +But if it were true that [everyone knew](https://thezvi.wordpress.com/2019/07/02/everybody-knows/), what would be the _function_ of saying the false thing? On dath ilan (if not in Earth boardrooms), one supposes the answer is "Because it's fun"? 
But that just prompts the followup question: what is the function of the brain giving out a "fun" reward in this context? It seems like at _some_ point, there has to be the expectation of _some_ cognitive system (although possibly not an entire "person") taking the signals literally.[^funny-or-powerful-falsehood] + +[^funny-or-powerful-falsehood]: This is why, when I notice myself misrepresenting my actual beliefs or motivations because I think it's funny or rhetorically powerful, I often take care to disclaim it immediately, precisely because I _don't_ think that "everybody knows"; I'm not going to give up on humor or powerful rhetoric, but I'm also not going to delude myself into thinking it's "zero-calorie": people who don't "get the joke" _are_ going to be misled, and I don't think it's unambiguously "their fault" for not being able to read my "intent" to arbitrary precision. But maybe dath ilan is (by authorial fiat) sufficiently good at achieving common knowledge in large groups that they _can_ pull off a zero-calorie "everyone knows" conspiracy without damaging shared maps? -A user called ajvermillion asked why I was being so aggressively negative about dath ilan. He compared it to Keltham's speech about how [people who grew up under a Lawful Evil government were disposed to take a more negative view of paternalism](https://www.glowfic.com/replies/1874754#reply-1874754) than they do in dath ilan, where paternalism works fine because dath ilan is basically benevolent. +The existence of such a widespread sneakiness/"taste for deception" trait among the eliezera, in conjunction with their culture just not particularly valuing public knowledge (because they assume everything important is being handled by the Keepers), explains the recurring conspiracies and coverups, like the Ordinary Merrin Conspiracy, Exception Handling's fabrication of evidence for real Sparashki, the sadism/masochism coverup, and the village [TODO: regrets and wasted time * Do I have regrets about this Whole Dumb Story? A lot, surely—it's been a lot of wasted time. But it's also hard to say what I should have done differently; I could have listened to Ben more and lost faith in Yudkowsky earlier, but he had earned a lot of benefit of the doubt? -- 2.17.1