From 80f9f5542c25319f6bfe9c138b69792bb43063f0 Mon Sep 17 00:00:00 2001 From: "Zack M. Davis" Date: Sat, 7 Oct 2023 13:01:30 -0700 Subject: [PATCH] memoir: editing pt. 5; Bayesian Conspiracy didn't claim to be altruistic --- content/drafts/standing-under-the-same-sky.md | 46 +++++++++---------- 1 file changed, 21 insertions(+), 25 deletions(-) diff --git a/content/drafts/standing-under-the-same-sky.md b/content/drafts/standing-under-the-same-sky.md index afe683c..b182afa 100644 --- a/content/drafts/standing-under-the-same-sky.md +++ b/content/drafts/standing-under-the-same-sky.md @@ -261,7 +261,7 @@ An analogy: racist jokes are also just jokes. Alice says, "What's the difference Similarly, the "Caliphate" humor only makes sense in the first place in the context of a celebrity culture where deferring to Yudkowsky and Alexander is expected behavior. (In a way that deferring to Julia Galef or John S. Wentworth is not expected behavior, even if Galef and Wentworth also have a track record as good thinkers.) I think this culture is bad. _Nullius in verba_. -I don't think the motte-and-bailey concern is hypothetical, either. When I [indignantly protested](https://twitter.com/zackmdavis/status/1435059595228053505) the "we're both always right" remark, one Mark Xu [commented](https://twitter.com/davidxu90/status/1435106339550740482): "speaking as someone who's read and enjoyed your LW content, I do hope this isn't a sign that you're going full post-rat"—as if my criticism of Yudkowsky's self-serving bluster itself marked me as siding with the "post-rats"! +I don't think the motte-and-bailey concern is hypothetical, either. 
When I [indignantly protested](https://twitter.com/zackmdavis/status/1435059595228053505) the "we're both always right" remark, one David Xu [commented](https://twitter.com/davidxu90/status/1435106339550740482): "speaking as someone who's read and enjoyed your LW content, I do hope this isn't a sign that you're going full post-rat"—as if my criticism of Yudkowsky's self-serving bluster itself marked me as siding with the "post-rats"! I once wrote [a post whimsically suggesting that trans women should owe cis women royalties](/2019/Dec/comp/) for copying the female form (as "intellectual property"). In response to a reader who got offended, I [ended up adding](/source?p=Ultimately_Untrue_Thought.git;a=commitdiff;h=03468d274f5) an "epistemic status" line to clarify that it was not a serious proposal. @@ -497,7 +497,7 @@ Paul Christiano, who has a much more optimistic picture of humanity's chances, n Perhaps for lack of any world-saving research to do, Yudkowsky started writing fiction again, largely in the form of Glowfic (a genre of collaborative storytelling pioneered by Alicorn) featuring the world of dath ilan. -The bulk of the dath ilan Glowfic canon was an epic titled [_Planecrash_](https://www.glowfic.com/boards/215)[^planecrash-title] coauthored with Lintamande, in which Keltham, an unusually selfish teenage boy from dath ilan, apparently dies in a freak aviation accident, and [wakes up in the world of](https://en.wikipedia.org/wiki/Isekai) Golarion, setting of the _Dungeons-&-Dragons_–alike _Pathfinder_ role-playing game. A [couple](https://www.glowfic.com/posts/4508) of [other](https://glowfic.com/posts/6263) Glowfic stories with different coauthors further flesh out the setting of dath ilan. 
+The bulk of the dath ilan Glowfic canon was an epic titled [_Planecrash_](https://www.glowfic.com/boards/215)[^planecrash-title] coauthored with Lintamande, in which Keltham, an unusually selfish teenage boy from dath ilan, apparently dies in a freak aviation accident, and [wakes up in the world of](https://en.wikipedia.org/wiki/Isekai) Golarion, setting of the _Dungeons-&-Dragons_–alike _Pathfinder_ role-playing game. A [couple](https://www.glowfic.com/posts/4508) of [other](https://glowfic.com/posts/6263) Glowfic stories with different coauthors further flesh out the setting of dath ilan, a smarter, better-coordinated alternate version of Earth steered by an order of [Keepers of Highly Unpleasant Things it is Sometimes Necessary to Know](https://www.glowfic.com/replies/1612937#reply-1612937) who safeguard advanced rationality techniques from a population allegedly too psychologically fragile to handle them. [^planecrash-title]: The title is a triple pun, referring to the airplane crash leading to Keltham's death in dath ilan, and how his resurrection in Golarion collides dath ilan with [the "planes" of existence of the _Pathfinder_ universe](https://pathfinderwiki.com/wiki/Great_Beyond), and Keltham's threat to destroy (crash) the _Pathfinder_ reality if mortals aren't given better afterlife conditions. (I use the word "threat" colloquially here; the work itself goes into some detail distinguishing between mere bargaining and decision-theoretic threats that should be defied.) @@ -505,7 +505,9 @@ On the topic of dath ilan's rationality training, I appreciated [this passage ab > Dath ilani kids get told to not get fascinated with the fact that, in principle, 'bounded-agents' with finite memories and finite thinking speeds, have any considerations about mapping that depend on what they want. It doesn't mean that you get to draw in whatever you like on your map, because it's what you want. It doesn't make reality be what you want. -Vindication! 
This showed that Yudkowsky _did_ understand what was at issue in the dispute over "... Not Man for the Categories", even if he couldn't say "Zack is right and Scott is wrong" for political reasons. Beyond that tidbit, however, the dath ilan mythos still seemed defective to me compared to the Sequences regarding its attitudes towards knowledge. +Vindication! This showed that Yudkowsky _did_ understand what was at issue in the dispute over "... Not Man for the Categories", even if he couldn't say "Zack is right and Scott is wrong" for political reasons. + +Despite that tidbit, however, the dath ilan mythos overall seemed visibly defective to me compared to the Sequences—particularly regarding how humans should relate to unpleasant truths. Someone at the 2021 Event Horizon Independence Day party had told me that I had been misinterpreting the "Speak the truth, even if your voice trembles" slogan from the Sequences. I had interpreted the slogan as suggesting the importance of speaking the truth _to other people_ (which I think is what "speaking" is usually about), but my interlocutor said it was about, for example, being able to speak the truth aloud in your own bedroom, to yourself. 
I think some textual evidence for my interpretation can be found in Daria's ending to ["A Fable of Science and Politics"](https://www.lesswrong.com/posts/6hfGNLf4Hg5DXqJCF/a-fable-of-science-and-politics), a multiple-parallel-endings story about an underground Society divided into factions over the color of the unseen sky, and one person's reaction when they find a passageway leading aboveground to a view of the sky: @@ -515,31 +517,29 @@ Daria takes it as a given that she needs to be open about her new blue-sky belie [^other-endings]: Even Eddin's ending, which portrays Eddin as more concerned with consequences than honesty, has him "trying to think of a way to prevent this information from blowing up the world", rather than trying to think of a way to suppress the information, in contrast to how Charles, in his ending, _immediately_ comes up with the idea to block off the passageway leading to the aboveground. Daria and Eddin are clearly written as "rationalists"; the deceptive strategy only comes naturally to the non-rationalist Charles. (Although you could Watsonianly argue that Eddin is just thinking longer-term than Charles: blocking off _this_ passageway and never speaking a word of it to another soul, won't prevent someone from finding some other passage to the aboveground, eventually.) -In contrast, the culture of dath ilan does not seem to particularly value people _standing under the same sky_. Not only is their Society is steered by an order of [Keepers of Highly Unpleasant Things it is Sometimes Necessary to Know](https://www.glowfic.com/replies/1612937#reply-1612937) who safeguard advanced rationality techniques from a population allegedly too psychologically fragile to handle them, but we see many other cases of the dath ilani covering things up for some alleged greater good with seemingly no regard to the costs of people have less accurate world-models. 
+In contrast, the culture of dath ilan does not seem to particularly value people _standing under the same sky_. It's not just the Keepers; we also see many other cases of the dath ilani covering things up for some alleged greater good with seemingly no regard to the costs of people having less accurate world-models. -In one notable example, Keltham, the protagonist of _Planecrash_, is an obligate sexual sadist, but never discovered this fact about himself during his first life in dath ilan, because dath ilan has arranged to cover up the existence of sadism and masochism—precisely because people like Keltham would be sad if they discovered that there weren't enough masochists to go around. +In one notable example, Keltham, the protagonist of _Planecrash_, is an obligate sexual sadist, but never discovered this fact about himself during his first life in dath ilan, because dath ilan has arranged to cover up the existence of sadism and masochism, on the theory that sadists like Keltham would be sad if they discovered that there weren't enough masochists to go around. It did not escape my notice that when "rationalist" authorities in real life considered public knowledge of some paraphilia to be an infohazard (ostensibly for the benefit of people with that paraphilia), I _didn't take it lying down_. -This parallel between dath ilan's sadism/masochism coverup and the autogynephilia coverup I had fought in real life, was something I was only intending to comment on in passing in the present memoir, rather than devoting any more detailed analysis to, but as I was having trouble focusing on my own writing in September 2022, I ended up posting some critical messages about dath ilan's censorship regime in the "Eliezerfic" Discord server for reader discussion of _Planecrash_, using the sadism/masochism coverup as my central example. 
+The parallel between dath ilan's sadism/masochism coverup and the autogynephilia coverup I had fought in real life was something I had intended to comment on only in passing in the present memoir, rather than devote any more detailed analysis to, but as I was having trouble focusing on my own writing in September 2022, I ended up posting some critical messages about dath ilan's censorship regime in the "Eliezerfic" Discord server for reader discussion of _Planecrash_, using the sadism/masochism coverup as my central example. (I would later adapt my complaints into a standalone post, "On the Public Anti-Epistemology of dath ilan".) -Although Yudkowsky participated in the server, I had reasoned that my participation didn't violate my previous intent not to bother him anymore, because it was a publicly-linked Discord server with hundreds of members. Me commenting on the story for the benefit of the _other_ 499 people in the chat room wouldn't generate a notification _for him_, the way it would if I sent him an email or replied to him on Twitter. +Although Yudkowsky participated in the server, I had reasoned that my participation didn't violate my previous intent not to bother him anymore, because it was a publicly-linked Discord server with hundreds of members. Me commenting on the story for the benefit of the _other_ 499 people in the chatroom wouldn't generate a notification _for him_, the way it would if I sent him an email or replied to him on Twitter. The other chatroom participants mostly weren't buying what I was selling. -When I objected to [Word of God](https://tvtropes.org/pmwiki/pmwiki.php/Main/WordOfGod)'s identification of the Keeper's credo as "Let the truth destroy what it can—in yourself, not in other people" as an incredibly infantalizing philosophy, someone replied: - -> I think of "not in other people" not as "infantilizing", but as recognizing independent agency. 
You don't get to do harm to other people without their consent, whether that is physical or pychological. +When I objected to [Word of God](https://tvtropes.org/pmwiki/pmwiki.php/Main/WordOfGod)'s identification of the Keeper's credo as "Let the truth destroy what it can—in yourself, not in other people" as an incredibly infantilizing philosophy, someone replied, "I think of 'not in other people' not as 'infantilizing', but as recognizing independent agency. You don't get to do harm to other people without their consent, whether that is physical or pychological." I pointed out that this obviously applies to, say, religion. Was it wrong to advocate for atheism in a religious Society, where robbing someone of their belief in God might be harming them? "Every society strikes a balance between protectionism and liberty," someone said. "This isn't news." It's not news about _humans_, I conceded. But I thought people who were fans of Yudkowsky's writing in 2008 had a reasonable expectation that the dominant messaging in the local subculture would continue in 2022 to be _in favor_ of telling the truth and _against_ benevolently intended noble lies. It would be interesting to know why that had changed. I started a new thread for my topic (Subject: "Noble Secrets; Or, Conflict Theory of Optimization on Shared Maps"). 
It died out after a couple days, but I reopened it later in response to more discussion of the masochism coverup. Yudkowsky made an appearance. (After he replied to someone else, I remarked parenthetically that his appearance made me think I should stop wasting time snarking in his fiction server and just finish my memoir already.) We had a brief back-and-forth: @@ -560,11 +560,13 @@ It was pretty annoying that Yudkowsky was still attributing my greviances to Mic > **zackmdavis** — 11/29/2022 10:37 PM > it's true that the things I don't like about modern Yudkowsky were still there in Sequences-era Yudkowsky, but I think they've gotten _worse_ > **Eliezer** — 11/29/2022 10:39 PM -> well, if your story is that I was always a complicated person, and you selected some of my posts and liked the simpler message you extracted from those, and over time I've shifted in my emphases in a way you don't like, while still having posts like Meta-Honesty and so on... then that's a pretty different story than the one you were telling in this Discord channel, like, just now. today. +> well, if your story is that I was always a complicated person, and you selected some of my posts and liked the simpler message you extracted from those, and over time I've shifted in my emphases in a way you don't like, while still having posts like Meta-Honesty and so on... then that's a pretty different story than the one you were telling in this Discord channel, like, just now. today. + +Is it, though? The "always a complicated person [who has] shifted in [his] emphases in a way [I] don't like" story was true, of course, but it elided the substantive reasons _why_ I didn't like the new emphases, which I expect other people to be able to see, too. -Is it, though? 
The "always a complicated person [who has] shifted in [his] emphases in a way [I] don't like" story was true, of course, but it elided the substantive reasons _why_ I didn't like the new emphases, which could presumably be evaluated on their own merits. +(As far as [the](https://www.lesswrong.com/posts/fnEWQAYxcRnaYBqaZ/initiation-ceremony) [Bayesian](https://www.lesswrong.com/posts/ZxR8P8hBFQ9kC8wMy/the-failures-of-eld-science) [Conspiracy](https://www.lesswrong.com/posts/xAXrEpF5FYjwqKMfZ/class-project) [stories](https://www.lesswrong.com/posts/kXAb5riiaJNrfR8v8/the-ritual) [went](https://www.lesswrong.com/posts/yffPyiu7hRLyc7r23/final-words), I think there's a significant narrative contrast between Brennan _seeking_ knowledge from the master _beisutsukai_, and Keltham, Merrin, and Thellim being _protected from_ knowledge by the Keepers. Neither the Bayesian Conspiracy nor the Keepers are publishing open-access textbooks, but at least the Conspiracy isn't claiming that their secretiveness is _for others' benefit_.) -It's interesting that Yudkowsky listed "still having posts like Meta-Honesty" as an exculpatory factor here. The thing is, I [wrote a _critique_ of Meta-Honesty](https://www.lesswrong.com/posts/MN4NRkMw7ggt9587K/firming-up-not-lying-around-its-edge-cases-is-less-broadly). It was well-received (being [cited as a good example in the introductory post for the 2019 Less Wrong Review](https://www.lesswrong.com/posts/QFBEjjAvT6KbaA3dY/the-lesswrong-2019-review), for instance). I don't think I could have written a similarly impassioned critique of anything from the Sequences era, because the stuff from the Sequences era still looked _correct_ to me. To me, "Meta-Honesty" was evidence _for_ Yudkowsky having relinquished his Art and lost his powers, not evidence that his powers were still intact. 
+It's notable that Yudkowsky listed "still having posts like [Meta-Honesty](https://www.lesswrong.com/posts/xdwbX9pFEr7Pomaxv/meta-honesty-firming-up-honesty-around-its-edge-cases)" as an exculpatory factor here. The thing is, I [wrote a _critique_ of Meta-Honesty](https://www.lesswrong.com/posts/MN4NRkMw7ggt9587K/firming-up-not-lying-around-its-edge-cases-is-less-broadly). It was well-received (being [cited as a good example in the introductory post for the 2019 Less Wrong Review](https://www.lesswrong.com/posts/QFBEjjAvT6KbaA3dY/the-lesswrong-2019-review), for instance). I don't think I could have written a similarly impassioned critique of anything from the Sequences era, because the stuff from the Sequences era still looked correct to me. To me, "Meta-Honesty" was evidence _for_ Yudkowsky having relinquished his Art and lost his powers, not evidence that his powers were still intact. I didn't have that response thought through in real time. At the time, I just agreed: @@ -595,7 +597,7 @@ I didn't have that response thought through in real time. At the time, I just ag (When a literary critic proposes a "dark" interpretation of an author's world, I think it's implied that they're expressing disbelief in the "intended" world; the fact that I was impudently refusing to buy the benevolent interpretation wasn't because I didn't understand it.) -> Hate-warp like this is bad for truth-perception; my understanding of the situation is that it's harm done to you by the group you say you left. I would read this as being a noninnocent error of that group; that they couldn't get what they wanted from people who still had friends outside their own small microculture, and noninnocently then decided that this outer culture was bad and people needed to be pried loose from it. 
They tried telling some people that this outer culture was gaslighting them and maliciously lying to them and had to be understood in wholly adversarial terms to break free of the gaslighting; that worked on somebody, and made a new friend for them; so their brain noninnocently learned that it ought to use arguments like that again, so they must be true. +> Hate-warp like this is bad for truth-perception; my understanding of the situation is that it's harm done to you by the group you say you left. I would read this as being a noninnocent error of that group; that they couldn't get what they wanted from people who still had friends outside their own small microculture, and noninnocently then decided that this outer culture was bad and people needed to be pried loose from it. They tried telling some people that this outer culture was gaslighting them and maliciously lying to them and had to be understood in wholly adversarial terms to break free of the gaslighting; that worked on somebody, and made a new friend for them; so their brain noninnocently learned that it ought to use arguments like that again, so they must be true. > This is a sort of thing I super did not do because I _understood_ it as a failure mode and Laid My Go Stones Against Ever Actually Being A Cult; I armed people with weapons against it, or tried to, but I was optimistic in my hopes about how much could actually be taught. > **zackmdavis** — 11/29/2022 11:20 PM > Without particularly defending Vassar _et al._ or my bad literary criticism (sorry), _modeling the adversarial component of non-innocent errors_ (as contrasted to "had to be understood in wholly adversarial terms") seems very important. 
(Maybe lying is "worse" than rationalizing, but if you can't hold people culpable for rationalization, you end up with a world that's bad for broadly the same reasons that a world full of liars is bad: we can't steer the world to good states if everyone's map is full of falsehoods that locally benefitted someone.) @@ -624,7 +626,7 @@ The next day, I belatedly pointed out that "Keltham thought that not learning ab In response to someone positing that dath ilani were choosing to be happier but less accurate predictors, I said that I read a blog post once about why you actually didn't want to do that, linking to [an Internet Archive copy of "Doublethink (Choosing to Be Biased)"](https://web.archive.org/web/20080216204229/https://www.overcomingbias.com/2007/09/doublethink-cho.html) from 2008[^hanson-conceit]—at least, that was _my_ attempted paraphrase; it was possible that I'd extracted a simpler message from it than the author intended. -[^hanson-conceit]: I was really enjoying the "Robin Hanson's blog in 2008" conceit. +[^hanson-conceit]: I was enjoying the conceit of referring to Sequences posts as being from "Robin Hanson's blog in 2008", as a way of emphasizing the distinction between my respect for the material and my contempt for the man Yudkowsky had become. A user called Harmless explained the loophole. "Doublethink" was pointing out that decisions that optimize the world for your preferences can't come from nowhere: if you avoid painful thoughts in your map, you damage your ability to steer away from painful outcomes in the territory. However, there was no rule that all the information-processing going into decisions that optimize the world for your preferences had to take place in _your brain_ ... @@ -637,12 +639,6 @@ Yudkowsky clarified his position: I understood the theory, but I was still extremely skpetical of the practice, assuming the eliezera were even remotely human. 
Yudkowsky described the practice of "keeping BDSM secret and trying to prevent most sadists from discovering what they are—informing them only when and if they become rich enough or famous enough that they'd have a high probability of successfully obtaining a very rare masochist" as a "basically reasonable policy option that [he] might vote for, not to help the poor dear other people, but to help [his] own counterfactual self." -The problem I saw with this is that becoming rich and famous isn't a purely random exogenous event. In order to make an informed decision about whether or not to put in the effort to try to _become_ rich and famous (as contrasted to choosing a lower-risk or more laid-back lifestyle), you need accurate beliefs about the perks of being rich and famous. - -The dilemma of whether to make more ambitious economic choices in pusuit of sexual goals was something that _already_ happens to people on Earth, rather than being hypothetical. I once met a trans woman who spent a lot of her twenties and thirties working very hard to get money for various medical procedures. I think she would be worse off under a censorship regime run by self-styled Keepers who thought it was kinder to prevent _poor people_ from learning about the concept of "transsexualism". - -Further discussion established that Yudkowsky was (supposedly) already taking into account that class of distortion on individuals' decisions, but that the empirical setting of probabilities and utilities happened to be such that ignorance came out on top. - I wasn't sure what my wordcount and "diplomacy" "budget limits" for the server were, but I couldn't let go; I kept the thread going on subsequent days. There was something I felt I should be able to convey, if I could just find the right words. When [Word of God](https://tvtropes.org/pmwiki/pmwiki.php/Main/WordOfGod) says, "trying to prevent most [_X_] from discovering what they are [...] 
continues to strike me as a basically reasonable policy option", then, separately from the particular value of _X_, I expected people to jump out of their chairs and say, "No! This is wrong! Morally wrong! People can stand what is true about themselves, because they are already doing so!" @@ -655,7 +651,7 @@ Yudkowsky replied: I wish I had thought to ask if he'd have felt the same way in 2008. -Ajvermillion was still baffled at my skepticism: if the author specifies that the world of the story is simple in this-and-such direction, on what grounds could I _disagree_? +A user called Ajvermillion continued to be baffled at my skepticism: if the author specifies that the world of the story is simple in this-and-such direction, on what grounds could I _disagree_? I admitted, again, that there was a sense in which I couldn't argue with authorial fiat. But I thought that an author's choice of assumptions reveals something about what they think is true in our world, and commenting on that should be fair game for literary critics. Suppose someone wrote a story and said, "in the world portrayed in this story, everyone is super-great at _kung fu_, and they could beat up everyone from our Earth, but they never have to practice at all." @@ -673,7 +669,7 @@ It was possible that the attitude I was evincing here was just a difference betw There were definitely communities on Earth where I wasn't allowed in because of my tendency to shout things from street corners, and I respected those people's right to have a safe space for themselves. -But those communities ... didn't call themselves _rationalists_, weren't _pretending_ be to be inheritors of the great tradition of E. T. Jaynes and Robin Dawes and Richard Feynman. And if they _did_, I think I would have a false advertising complaint against them. +But those communities didn't call themselves _rationalists_, weren't _pretending_ to be inheritors of the great tradition of E. T. Jaynes and Richard Feynman and Robin Hanson. 
And if they _did_, I think I would have a false advertising complaint against them. "[The eleventh virtue is scholarship. Study many sciences and absorb their power as your own](https://www.yudkowsky.net/rational/virtues) ... unless a prediction market says that would make you less happy," just didn't have the same ring to it. Neither did "The first virtue is curiosity. A burning itch to know is higher than a solemn vow to pursue truth. But higher than both of those, is trusting your Society's institutions to tell you which kinds of knowledge will make you happy"—even if you stipulated by authorial fiat that your Society's institutions are super-competent, such that they're probably right about the happiness thing. -- 2.17.1