From a0aa972544aa5b9e1282c21098a2cac1dd50689f Mon Sep 17 00:00:00 2001 From: "Zack M. Davis" Date: Tue, 19 Dec 2023 20:55:07 -0800 Subject: [PATCH] memoir: apply pt. 4 pro edits (lower half) --- ...xhibit-generally-rationalist-principles.md | 150 +++++++++--------- 1 file changed, 75 insertions(+), 75 deletions(-) diff --git a/content/drafts/agreeing-with-stalin-in-ways-that-exhibit-generally-rationalist-principles.md b/content/drafts/agreeing-with-stalin-in-ways-that-exhibit-generally-rationalist-principles.md index ee0c67f..25c214e 100644 --- a/content/drafts/agreeing-with-stalin-in-ways-that-exhibit-generally-rationalist-principles.md +++ b/content/drafts/agreeing-with-stalin-in-ways-that-exhibit-generally-rationalist-principles.md @@ -267,21 +267,21 @@ But if I'm right that (a′) and (b′) should be live hypotheses and that Yudko > > Trying to pack all of that into the pronouns you'd have to use in step 1 is the wrong place to pack it. -Sure, if we were in the position of designing a constructed language from scratch under current social conditions in which a person's "gender" is understood as a contested social construct, rather than their sex being an objective and undisputed fact, then yeah: in that situation _which we are not in_, you definitely wouldn't want to pack sex or gender into pronouns. But it's a disingenuous derailing tactic to grandstand about how people need to alter the semantics of their already existing native language so that we can discuss the real issues under an allegedly superior pronoun convention when by your own admission, you have _no intention whatsoever of discussing the real issues!_ +Sure, if we were designing a constructed language from scratch under current social conditions, in which a person's "gender" is understood as a contested social construct rather than their sex being an objective and undisputed fact, then yeah: in that situation _which we are not in_, you definitely wouldn't want to pack sex or gender into pronouns. But it's a disingenuous derailing tactic to grandstand about how people need to alter the semantics of their existing native language so that we can discuss the real issues under an allegedly superior pronoun convention when by your own admission, you have _no intention whatsoever of discussing the real issues!_ -(Lest the "by your own admission" clause seem too accusatory, I should note that given constant behavior, admitting it is much better than not-admitting it, so huge thanks to Yudkowsky for the transparency on this point!) +(Lest the "by your own admission" clause seem too accusatory, I should note that given constant behavior, admitting it is much better than not admitting it, so huge thanks to Yudkowsky for the transparency on this point!) Again, [as discussed in "Challenges to Yudkowsky's Pronoun Reform Proposal"](/2022/Mar/challenges-to-yudkowskys-pronoun-reform-proposal/#t-v-distinction), there's an instructive comparison to languages that have formality-based second person pronouns, like [_tú_ and _usted_ in Spanish](https://en.wikipedia.org/wiki/Spanish_personal_pronouns#T%C3%BA/vos_and_usted). It's one thing to advocate for collapsing the distinction and just settling on one second-person singular pronoun for the Spanish language. That's principled. 
-It's another thing altogether to simultaneously try to prevent a speaker from using _tú_ to indicate disrespect towards a social superior (on the stated rationale that the _tú_/_usted_ distinction is dumb and shouldn't exist) while also refusing to entertain or address the speaker's arguments explaining why they think their interlocutor is unworthy of the deference that would be implied by _usted_ (because such arguments are "unspeakable" for political reasons). That's just psychologically abusive.
+It's another thing altogether to try to prevent a speaker from using _tú_ to indicate disrespect towards a social superior (on the stated rationale that the _tú_/_usted_ distinction is dumb and shouldn't exist) while also refusing to entertain the speaker's arguments for why their interlocutor is unworthy of the deference that would be implied by _usted_ (because such arguments are "unspeakable" for political reasons).

-If Yudkowsky actually possessed (and felt motivated to use) the "ability to independently invent everything important that would be on the other side of the filter and check it [himself] before speaking", it would be obvious to him that "Gendered Pronouns For Everyone and Asking To Leave The System Is Lying" isn't the hill anyone would care about dying on if it weren't a Schelling point. A lot of TERF-adjacent folk would be overjoyed to concede the (boring, insubstantial) matter of pronouns as a trivial courtesy if it meant getting to address their real concerns of "Biological Sex Actually Exists", and ["Biological Sex Cannot Be Changed With Existing or Foreseeable Technology"](https://www.lesswrong.com/posts/QZs4vkC7cbyjL9XA9/changing-emotions) and "Biological Sex Is Sometimes More Relevant Than Subjective Gender Identity." The reason so many of them are inclined to stand their ground and not even offer the trivial courtesy of pronouns is because they suspect, correctly, that the matter of pronouns is being used as a rhetorical wedge to try to prevent people from talking or thinking about sex.
+If Yudkowsky actually possessed (and felt motivated to use) the "ability to independently invent everything important that would be on the other side of the filter and check it [himself] before speaking", it would be obvious to him that "Gendered Pronouns for Everyone and Asking To Leave the System Is Lying" isn't the hill anyone would care about dying on if it weren't a Schelling point. A lot of TERF-adjacent folk would be overjoyed to concede the (boring, insubstantial) matter of pronouns as a trivial courtesy if it meant getting to address their real concerns of "Biological Sex Actually Exists" and ["Biological Sex Cannot Be Changed With Existing or Foreseeable Technology"](https://www.lesswrong.com/posts/QZs4vkC7cbyjL9XA9/changing-emotions) and "Biological Sex Is Sometimes More Relevant Than Subjective Gender Identity." The reason so many of them are inclined to stand their ground and not even offer the trivial courtesy of pronouns is that they suspect, correctly, that pronouns are being used as a rhetorical wedge to keep people from talking or thinking about sex.

Having analyzed the ways in which Yudkowsky is playing dumb here, what's still not entirely clear is why. Presumably he cares about maintaining his credibility as an insightful and fair-minded thinker. Why tarnish that by putting on this haughty performance? Of course, presumably he doesn't think he's tarnishing it—but why not?
[He graciously explains in the Facebook comments](/images/yudkowsky-personally_prudent_and_not_community-harmful.png): -> I think that in a half-Kolmogorov-Option environment where people like Zack haven't actually been shot and you get get away with attaching explicit disclaimers like this one, it is sometimes personally prudent and not community-harmful to post your agreement with Stalin about things you actually agree with Stalin about, in ways that exhibit generally rationalist principles, especially because people do _know_ they're living in a half-Stalinist environment [...] I think people are better off at the end of that. +> I think that in a half-Kolmogorov-Option environment where people like Zack haven't actually been shot and you get away with attaching explicit disclaimers like this one, it is sometimes personally prudent and not community-harmful to post your agreement with Stalin about things you actually agree with Stalin about, in ways that exhibit generally rationalist principles, especially because people do _know_ they're living in a half-Stalinist environment [...] I think people are better off at the end of that. Ah, _prudence_! He continues: @@ -322,53 +322,53 @@ Yudkowsky's conception of a "rational" argument—at least, Yudkowsky's concepti > > Only in this way can you offer a _rational_ chain of argument, one whose bottom line was written flowing _forward_ from the lines above it. Whatever _actually_ decides your bottom line is the only thing you can _honestly_ write on the lines above. -I remember this being pretty shocking to read back in 'aught-seven. What an alien mindset, that you somehow "can't" argue for something! It's a shockingly high standard for anyone to aspire to live up to—but what made Yudkowsky's Sequences so life-changingly valuable, was that they articulated the existence of such a standard. For that, I will always be grateful. +I remember this being pretty shocking to read back in 'aught-seven. What an alien mindset, that you somehow "can't" argue for something! It's a shockingly high standard for anyone to aspire to—but what made Yudkowsky's Sequences so life-changing was that they articulated the existence of such a standard. For that, I will always be grateful. -... which is why it's so bizarre that the Yudkowsky of the current year acts like he's never heard of it! If your actual bottom line is that it is sometimes personally prudent and not community-harmful to post your agreement with Stalin, then sure, you can _totally_ find something you agree with to write on the lines above! Probably something that "exhibits generally rationalist principles", even! It's just that any rationalist who sees the game you're playing is going to correctly identify you as a partisan hack on this topic and take that into account when deciding whether they can trust you on other topics. +... which is why it's bizarre that the Yudkowsky of the current year acts like he's never heard of it! If your actual bottom line is that it is sometimes personally prudent and not community-harmful to post your agreement with Stalin, then sure, you can _totally_ find something you agree with to write on the lines above! Probably something that "exhibits generally rationalist principles", even! It's just that any rationalist who sees the game you're playing is going to correctly identify you as a partisan hack on this topic and take that into account when deciding whether they can trust you on other topics. 
-"I don't see what the alternative is besides getting shot," Yudkowsky muses (where presumably, 'getting shot' is a generic metaphor for any undesirable consequence, like being unpopular with progressives). Yes, an astute observation. And any other partisan hack could say exactly the same, for the same reason. Why does the campaign manager withhold the results of the 11th question? Because he doesn't see what the alternative is besides getting shot (being fired from the campaign). +"I don't see what the alternative is besides getting shot," Yudkowsky muses (where, presumably, "getting shot" is a generic metaphor for any undesirable consequence, like being unpopular with progressives). Yes, an astute observation. And any other partisan hack could say exactly the same, for the same reason. Why does the campaign manager withhold the results of the 11th question? Because he doesn't see what the alternative is besides getting shot (being fired from the campaign). -Yudkowsky [sometimes](https://www.lesswrong.com/posts/K2c3dkKErsqFd28Dh/prices-or-bindings) [quotes](https://twitter.com/ESYudkowsky/status/1456002060084600832) _Calvin and Hobbes_: "I don't know which is worse, that everyone has his price, or that the price is always so low." If the idea of being fired from the Snodgrass campaign or being unpopular with progressives is so terrifying to you that it seems analogous to getting shot, then, if those are really your true values, then sure—say whatever you need to say to keep your job or your popularity, as is personally prudent. You've set your price. +Yudkowsky [sometimes](https://www.lesswrong.com/posts/K2c3dkKErsqFd28Dh/prices-or-bindings) [quotes](https://twitter.com/ESYudkowsky/status/1456002060084600832) _Calvin and Hobbes_: "I don't know which is worse, that everyone has his price, or that the price is always so low." If the idea of being fired from the Snodgrass campaign or being unpopular with progressives is so terrifying to you that it seems analogous to getting shot, then sure—say whatever you need to say to keep your job or your popularity, as is personally prudent. You've set your price. I just—would have hoped that abandoning the intellectual legacy of his Sequences, would be a price too high for such a paltry benefit? Michael Vassar [said](https://twitter.com/HiFromMichaelV/status/1221771020534788098), "Rationalism starts with the belief that arguments aren't soldiers, and ends with the belief that soldiers are arguments." By accepting that soldiers are arguments ("I don't see what the alternative is besides getting shot"), Yudkowsky is accepting the end of rationalism in this sense. If the price you put on the intellectual integrity of your so-called "rationalist" community is similar to that of the Snodgrass for Mayor campaign, you shouldn't be surprised if intelligent, discerning people accord similar levels of credibility to the two groups' output. -I see the phrase "bad faith" thrown around more than I think people know what it means, which is one of [the reasons I tend to be hesitant to use the term](https://www.lesswrong.com/posts/e4GBj6jxRZcsHFSvP/assume-bad-faith), but I think it fits here. "Bad faith" doesn't mean "with ill intent", and it's more specific than "dishonest": it's [adopting the surface appearance of being moved by one set of motivations, while actually acting from another](https://en.wikipedia.org/wiki/Bad_faith). 
+[I tend to be hesitant to use the term "bad faith"](https://www.lesswrong.com/posts/e4GBj6jxRZcsHFSvP/assume-bad-faith), because I see it thrown around more than I think people know what it means, but it fits here. "Bad faith" doesn't mean "with ill intent", and it's more specific than "dishonest": it's [adopting the surface appearance of being moved by one set of motivations, while acting from another](https://en.wikipedia.org/wiki/Bad_faith).

-For example, an [insurance company employee](https://en.wikipedia.org/wiki/Claims_adjuster) who goes through the motions of investigating your claim while privately intending to deny it, might never consciously tell an explicit "lie", but is acting in bad faith: they're asking you questions, demanding evidence, _&c._ in order to make it look like you'll get paid if you prove the loss occurred—whereas in reality, you're just not going to be paid. Your responses to the claim inspector aren't completely casually inert: if you can make an extremely strong case that the loss occurred as you say, then the claim inspector might need to put some effort into coming up with some ingenious excuse to deny your claim, in ways that exhibit general claim-inspection principles. But at the end of the day, the inspector is going to say what they need to say in order to protect the company's loss ratio, as is sometimes personally prudent.
+For example, an [insurance adjuster](https://en.wikipedia.org/wiki/Claims_adjuster) who goes through the motions of investigating your claim while privately intending to deny it might never consciously tell an explicit "lie" but is acting in bad faith: they're asking you questions, demanding evidence, _&c._ to make it look like you'll get paid if you prove the loss occurred—whereas in reality, you're just not going to be paid. Your responses to the claim inspector aren't causally inert: if you can make an extremely strong case that the loss occurred as you say, then the claim inspector might need to put effort into coming up with an ingenious excuse to deny your claim, in ways that exhibit general claim-inspection principles. But ultimately, the inspector is going to say what they need to say in order to protect the company's loss ratio, as is sometimes personally prudent.

-With this understanding of bad faith, we can read Yudkowsky's "it is sometimes personally prudent [...]" comment as admitting that his behavior on politically-charged topics is in bad faith—where "bad faith" isn't a meaningless dismissal, but [literally refers](http://benjaminrosshoffman.com/can-crimes-be-discussed-literally/) to the pretending-to-have-one-set-of-motivations-while-acting-according-to-another behavior, such that accusations of bad faith can be true or false. Yudkowsky will [take care not to consciously tell an explicit "lie"](https://www.lesswrong.com/posts/xdwbX9pFEr7Pomaxv/meta-honesty-firming-up-honesty-around-its-edge-cases), while going through the motions to make it look like he's genuinely engaging with questions where I need the right answers in order to make extremely impactful social and medical decisions—whereas in reality, he's only going to address a selected subset of the relevant evidence and arguments that won't get him in trouble with progressives.
+With this understanding of bad faith, we can read Yudkowsky's "it is sometimes personally prudent [...]" comment as admitting that his behavior on politically charged topics is in bad faith—where "bad faith" isn't a meaningless dismissal, but [literally refers](http://benjaminrosshoffman.com/can-crimes-be-discussed-literally/) to the behavior of professing one set of motivations while acting from another, such that accusations of bad faith can be true or false. Yudkowsky will [take care not to consciously tell an explicit "lie"](https://www.lesswrong.com/posts/xdwbX9pFEr7Pomaxv/meta-honesty-firming-up-honesty-around-its-edge-cases), while going through the motions to make it look like he's genuinely engaging with questions where I need the right answers in order to make extremely impactful social and medical decisions—whereas in reality, he's only going to address a selected subset of the relevant evidence and arguments that won't get him in trouble with progressives.

-To his credit, he will admit that he's only willing to address a selected subset of arguments—but while doing so, he claims an absurd "confidence in [his] own ability to independently invent everything important that would be on the other side of the filter and check it [himself] before speaking" while simultaneously blatantly mischaracterizing his opponents' beliefs! ("Gendered Pronouns For Everyone and Asking To Leave The System Is Lying" doesn't pass anyone's [ideological Turing test](https://www.econlib.org/archives/2011/06/the_ideological.html).)
+To his credit, he will admit that he's only willing to address a selected subset of arguments—but while doing so, he claims an absurd "confidence in [his] own ability to independently invent everything important that would be on the other side of the filter and check it [himself] before speaking" while blatantly mischaracterizing his opponents' beliefs! ("Gendered Pronouns for Everyone and Asking To Leave the System Is Lying" doesn't pass anyone's [ideological Turing test](https://www.econlib.org/archives/2011/06/the_ideological.html).)

-Counterarguments aren't completely causally inert: if you can make an extremely strong case that Biological Sex Is Sometimes More Relevant Than Subjective Gender Identity (Such That Some People Perceive an Interest in Using Language Accordingly), Yudkowsky will put some effort into coming up with some ingenious excuse for why he _technically_ never said otherwise, in ways that exhibit generally rationalist principles. But at the end of the day, Yudkowsky is going to say what he needs to say in order to protect his reputation with progressives, as is sometimes personally prudent.
+Counterarguments aren't completely causally inert: if you can make an extremely strong case that Biological Sex Is Sometimes More Relevant Than Subjective Gender Identity (Such That Some People Perceive an Interest in Using Language Accordingly), Yudkowsky will put some effort into coming up with some ingenious excuse for why he _technically_ never said otherwise, in ways that exhibit generally rationalist principles. But ultimately, Yudkowsky is going to say what he needs to say in order to protect his reputation with progressives, as is sometimes personally prudent.

-Even if one were to agree with this description of Yudkowsky's behavior, it doesn't immediately follow that Yudkowsky is making the wrong decision. Again, "bad faith" is meant as a literal description that makes predictions about behavior, not a contentless attack—maybe there are circumstances in which engaging some amount of bad faith is the right thing to do, given the constraints one faces! For example, when talking to people on Twitter with a very different ideological background from me, I sometimes anticipate that if my interlocutor knew what I was actually thinking, they wouldn't want to talk to me, so I occasionally engage in a bit of what could be called ["concern trolling"](https://geekfeminism.fandom.com/wiki/Concern_troll): I take care to word my replies in a way that makes it look like I'm more ideologically aligned with my interlocutor than I actually am. (For example, I [never say "assigned female/male at birth" in my own voice on my own platform](/2019/Sep/terminology-proposal-developmental-sex/), but I'll do it in an effort to speak my interlocutor's language.) I think of this as the minimal amount of strategic bad faith needed to keep the conversation going, to get my interlocutor to evaluate my argument on its own merits, rather than rejecting it for coming from an ideological enemy. In cases such as these, I'm willing to defend my behavior. There _is_ a sense in which I'm being deceptive by optimizing my language choice to make my interlocutor make bad guesses about my ideological alignment, but I'm comfortable with that amount and scope of deception in the service of correcting the distortion where I don't think my interlocutor _should_ be paying attention to my personal alignment.
+Even if one were to agree with this description of Yudkowsky's behavior, it doesn't immediately follow that Yudkowsky is making the wrong decision. Again, "bad faith" is meant as a literal description that makes predictions about behavior—maybe there are circumstances in which engaging in some amount of bad faith is the right thing to do, given the constraints one faces! For example, when talking to people on Twitter with a very different ideological background from mine, I sometimes anticipate that if my interlocutor knew what I was thinking, they wouldn't want to talk to me, so I word my replies so that I [seem more ideologically aligned with them than I actually am](https://geekfeminism.fandom.com/wiki/Concern_troll). (For example, I [never say "assigned female/male at birth" in my own voice on my own platform](/2019/Sep/terminology-proposal-developmental-sex/), but I'll do it in an effort to speak my interlocutor's language.) I think of this as the minimal amount of strategic bad faith needed to keep the conversation going—to get my interlocutor to evaluate my argument on its own merits, rather than rejecting it for coming from an ideological enemy. I'm willing to defend this behavior. There _is_ a sense in which I'm being deceptive by optimizing my language choice to make my interlocutor make bad guesses about my ideological alignment, but I'm comfortable with that in the service of correcting the distortion where I don't think my interlocutor _should_ be paying attention to my alignment.

-That is, my bad faith concern-trolling gambit of deceiving people about my ideological alignment in the hopes of improving the discussion seems like something that improves the accuracy of our collective beliefs about the topic being argued about. (And the topic is presumably of greater collective interest than which "side" I personally happen to be on.)
+That is, my bad faith concern-trolling gambit of misrepresenting my ideological alignment to improve the discussion seems beneficial to the accuracy of our collective beliefs about the topic. (And the topic is presumably of greater collective interest than which "side" I personally happen to be on.)

-In contrast, the "it is sometimes personally prudent [...] to post your agreement with Stalin" gambit is the exact reverse: it's _introducing_ a distortion into the discussion in the hopes of correcting people's beliefs about the speaker's ideological alignment. (Yudkowsky is not a right-wing Bad Guy, but people would tar him as a right-wing Bad Guy if he ever said anything negative about trans people.) This doesn't improve our collective beliefs about the topic; it's a _pure_ ass-covering move.
+In contrast, the "it is sometimes personally prudent [...] to post your agreement with Stalin" gambit is the exact reverse: it's _introducing_ a distortion into the discussion in the hopes of correcting people's beliefs about the speaker's ideological alignment. (Yudkowsky is not a right-wing Bad Guy, but people would tar him as one if he ever said anything negative about trans people.) This doesn't improve our collective beliefs about the topic; it's a _pure_ ass-covering move.

-Yudkowsky names the alleged fact that "people do _know_ they're living in a half-Stalinist environment" as a mitigating factor. But the reason censorship is such an effective tool in the hands of dictators like Stalin is because it ensures that many people _don't_ know—and that those who know (or suspect) don't have [game-theoretic common knowledge](https://www.lesswrong.com/posts/9QxnfMYccz9QRgZ5z/the-costly-coordination-mechanism-of-common-knowledge#Dictators_and_freedom_of_speech) that others do too.
+Yudkowsky names the alleged fact that "people do _know_ they're living in a half-Stalinist environment" as a mitigating factor. But the reason censorship is such an effective tool in the hands of dictators like Stalin is that it ensures that many people _don't_ know—and that those who know (or suspect) don't have [game-theoretic common knowledge](https://www.lesswrong.com/posts/9QxnfMYccz9QRgZ5z/the-costly-coordination-mechanism-of-common-knowledge#Dictators_and_freedom_of_speech) that others do too.

-Zvi Mowshowitz has [written about how the false assertion that "everybody knows" something](https://thezvi.wordpress.com/2019/07/02/everybody-knows/) is typically used justify deception: if "everybody knows" that we can't talk about biological sex (the rationalization goes), then no one is being deceived when our allegedly truthseeking discussion carefully steers clear of any reference to the reality of biological sex when it would otherwise be extremely relevant.
+Zvi Mowshowitz has [written about how the false assertion that "everybody knows" something](https://thezvi.wordpress.com/2019/07/02/everybody-knows/) is used to justify deception: if "everybody knows" that we can't talk about biological sex, then no one is being deceived when our allegedly truthseeking discussion carefully steers clear of any reference to the reality of biological sex even when it's extremely relevant.

But if everybody knew, then what would be the point of the censorship? It's not coherent to claim that no one is being harmed by censorship because everyone knows about it, because the appeal of censorship is precisely that _not_ everybody knows and that someone with power wants to keep it that way.
For the savvy people in the know, it would certainly be convenient if everyone secretly knew: then the savvy people wouldn't have to face the tough choice between acceding to Power's demands (at the cost of deceiving their readers) and informing their readers (at the cost of incurring Power's wrath). -[Policy debates should not appear one-sided.](https://www.lesswrong.com/posts/PeSzc9JTBxhaYRp9b/policy-debates-should-not-appear-one-sided) Faced with this kind of dilemma, I can't say that defying Power is necessarily the right choice: if there really were no other options between deceiving your readers with a bad faith performance, and incurring Power's wrath, and Power's wrath would be too terrible to bear, then maybe deceiving your readers with a bad faith performance is the right thing to do. +[Policy debates should not appear one-sided.](https://www.lesswrong.com/posts/PeSzc9JTBxhaYRp9b/policy-debates-should-not-appear-one-sided) Faced with this dilemma, I can't say that defying Power is necessarily the right choice: if there really were no options besides deceiving your readers with a bad-faith performance and incurring Power's wrath, and Power's wrath would be too terrible to bear, then maybe the bad-faith performance is the right thing to do. But if you cared about not deceiving your readers, you would want to be sure that those _really were_ the only two options. You'd [spend five minutes by the clock looking for third alternatives](https://www.lesswrong.com/posts/erGipespbbzdG5zYb/the-third-alternative)—including, possibly, not issuing proclamations on your honor as leader of the so-called "rationalist" community on topics where you _explicitly intend to ignore politically unfavorable counterarguments_. Yudkowsky rejects this alternative on the grounds that it allegedly implies "utter silence about everything Stalin has expressed an opinion on including '2 + 2 = 4' because if that logically counterfactually were wrong you would not be able to express an opposing opinion". -I think he's playing dumb here. In other contexts, he's written about ["attack[s] performed by selectively reporting true information"](https://twitter.com/ESYudkowsky/status/1634338145016909824) and ["[s]tatements which are technically true but which deceive the listener into forming further beliefs which are false"](https://hpmor.com/chapter/97). He's undoubtedly familiar with the motte-and-bailey doctrine as [described by Nicholas Shackel](https://philpapers.org/archive/SHATVO-2.pdf) and [popularized by Scott Alexander](https://slatestarcodex.com/2014/11/03/all-in-all-another-brick-in-the-motte/). I think that _if he wanted to_, Eliezer Yudkowsky could think of some relevant differences between "2 + 2 = 4" and "the simplest and best protocol is, "'He' refers to the set of people who have asked us to use 'he'". +I think he's playing dumb here. In other contexts, he's written about ["attack[s] performed by selectively reporting true information"](https://twitter.com/ESYudkowsky/status/1634338145016909824) and ["[s]tatements which are technically true but which deceive the listener into forming further beliefs which are false"](https://hpmor.com/chapter/97). He's undoubtedly familiar with the motte-and-bailey doctrine as [described by Nicholas Shackel](https://philpapers.org/archive/SHATVO-2.pdf) and [popularized by Scott Alexander](https://slatestarcodex.com/2014/11/03/all-in-all-another-brick-in-the-motte/). 
I think that if he wanted to, Eliezer Yudkowsky could think of some relevant differences between "2 + 2 = 4" and "the simplest and best protocol is, '_He_ refers to the set of people who have asked us to use _he_'".

-If you think it's "sometimes personally prudent and not community-harmful" to go out of your way to say positive things about Republican candidates and never, ever say positive things about Democratic candidates (because you live in a red state and "don't see what the alternative is besides getting shot"), you can see why people might regard you as a Republican shill, even if all the things you said were true. If you tried to defend yourself against the charge of being a Republican shill by pointing out that you've never told any specific individual, "You should vote Republican," that's a nice motte that might work on some people, but you shouldn't expect seasoned and devoted rationalists to fall for it.
+If you think it's "sometimes personally prudent and not community-harmful" to go out of your way to say positive things about Republican candidates and never, ever say positive things about Democratic candidates (because you live in a red state and "don't see what the alternative is besides getting shot"), you can see why people might regard you as a Republican shill, even if all the things you said were true. If you tried to defend yourself against the charge of being a Republican shill by pointing out that you've never told any specific individual, "You should vote Republican," that's a nice motte, but you shouldn't expect devoted rationalists to fall for it.

Similarly, when Yudkowsky [wrote in June 2021](https://twitter.com/ESYudkowsky/status/1404697716689489921), "I have never in my own life tried to persuade anyone to go trans (or not go trans)—I don't imagine myself to understand others that much", it was a great motte. I don't doubt the literal motte stated literally.

-And yet it seems worth noticing that shortly after proclaiming in March 2016 that he was "over 50% probability at this point that at least 20% of the ones with penises are actually women", he made [a followup post gloating about causing someone's transition](https://www.facebook.com/yudkowsky/posts/10154110278349228):
+And yet it seems worth noticing that shortly after proclaiming in March 2016 that he was "over 50% probability at this point that at least 20% of the ones with penises are actually women", he made [a followup post celebrating having caused someone's transition](https://www.facebook.com/yudkowsky/posts/10154110278349228):

> Just checked my filtered messages on Facebook and saw, "Your post last night was kind of the final thing I needed to realize that I'm a girl."
> ==DOES ALL OF THE HAPPY DANCE FOREVER==

@@ -379,7 +379,7 @@ In the comments, he added:

He [later clarified on Twitter](https://twitter.com/ESYudkowsky/status/1404821285276774403), "It is not trans-specific. When people tell me I helped them, I mostly believe them and am happy."

-But if Stalin is committed to convincing gender-dysphoric males that they need to cut their dicks off, and you're committed to not disagree with Stalin, you _shouldn't_ mostly believe it when gender-dysphoric males thank you for providing the the final piece of evidence they needed to realize that they need to cut their dicks off, for the same reason a self-aware Republican shill shouldn't take it literally when people thank him for warning them against Democrat treachery.
We know—he's told us very clearly—that Yudkowsky isn't trying to provide gender-dysphoric people with the full state of information that they would need to decide on the optimal quality-of-life interventions. He's playing on a different chessboard. +But if Stalin is committed to convincing gender-dysphoric males that they need to cut their dicks off, and you're committed to not disagreeing with Stalin, you _shouldn't_ mostly believe it when gender-dysphoric males thank you for providing the final piece of evidence they needed to realize that they need to cut their dicks off, for the same reason a self-aware Republican shill shouldn't uncritically believe it when people thank him for warning them against Democrat treachery. We know—he's told us very clearly—that Yudkowsky isn't trying to provide gender-dysphoric people with the full state of information that they would need to decide on the optimal quality-of-life interventions. He's playing on a different chessboard. "[P]eople do _know_ they're living in a half-Stalinist environment," Yudkowsky claims. "I think people are better off at the end of that," he says. But who are "people", specifically? One of the problems with utilitarianism is that it doesn't interact well with game theory. If a policy makes most people better off, at the cost of throwing a few others under the bus, is enacting that policy the right thing to do? @@ -391,31 +391,31 @@ Depending on the details, maybe—but you probably shouldn't expect the victims When someone else doesn't see the problem with Jane Austen's characters, Thellim [redoubles her determination to explain the problem](https://www.glowfic.com/replies/1592987#reply-1592987): "_She is not giving up that easily. Not on an entire planet full of people._" -Thellim's horror at the fictional world of Jane Austen is basically how I feel about "trans" culture in the current year. It _actively discourages self-modeling!_ People who have cross-sex fantasies are encouraged to reify them into a gender identity which everyone else is supposed to unquestioningly accept. Obvious critical questions about what's actually going on etiologically, what it means for an identity to be true, _&c._ are strongly discouraged as hateful and hurtful. +Thellim's horror at the fictional world of Jane Austen is basically how I feel about "trans" culture in the current year. It actively discourages self-modeling! People who have cross-sex fantasies are encouraged to reify them into a gender identity which everyone else is supposed to unquestioningly accept. Obvious critical questions about what's actually going on etiologically, what it means for an identity to be true, _&c._ are strongly discouraged as hateful and hurtful. -The problem is _not_ that I think there's anything wrong with fantasizing about being the other sex, and wanting the fantasy to become real—just as Thellim's problem with _Pride and Prejudice_ is not there being anything wrong with wanting to marry a suitable bachelor. These are perfectly respectable goals. +The problem is not that I think there's anything wrong with fantasizing about being the other sex and wanting the fantasy to be real—just as Thellim's problem with _Pride and Prejudice_ is not her seeing anything wrong with wanting to marry a suitable bachelor. These are perfectly respectable goals. 
-The _problem_ is that people who are trying to be people, people who are trying to achieve their goals _in reality_, do so in a way that involves having concepts of their own minds, and trying to improve both their self-models and their selves—and that's not possible in a culture that tries to ban as heresy the idea that it's possible for someone's self-model to be wrong. +The _problem_ is that people who are trying to be people, people who are trying to achieve their goals _in reality_, do so in a way that involves having concepts of their own minds, and trying to improve both their self-models and their selves, and that's not possible in a culture that tries to ban as heresy the idea that it's possible for someone's self-model to be wrong. A trans woman I follow on Twitter complained that a receptionist at her workplace said she looked like some male celebrity. "I'm so mad," she fumed. "I look like this right now"—there was a photo attached to the Tweet—"how could anyone ever think that was an okay thing to say?" It is genuinely sad that the author of those Tweets didn't get perceived in the way she would prefer! But the thing I want her to understand, a thing I think any sane adult (on Earth, and not just dath ilan) should understand— -_It was a compliment!_ That receptionist was almost certainly thinking of someone like [David Bowie](https://en.wikipedia.org/wiki/David_Bowie) or [Eddie Izzard](https://en.wikipedia.org/wiki/Eddie_Izzard), rather than being hateful and trying to hurt. The author should have graciously accepted the compliment, and _done something to pass better next time_. The horror of trans culture is that it's impossible to imagine any of these people doing that—noticing that they're behaving like a TERF's [hostile](/2019/Dec/the-strategy-of-stigmatization/) [stereotype](/2022/Feb/link-never-smile-at-an-autogynephile/) of a narcissistic, gaslighting trans-identified man and snapping out of it. +_It was a compliment!_ That receptionist was almost certainly thinking of someone like [David Bowie](https://en.wikipedia.org/wiki/David_Bowie) or [Eddie Izzard](https://en.wikipedia.org/wiki/Eddie_Izzard), rather than being hateful. The author should have graciously accepted the compliment and _done something to pass better next time_. The horror of trans culture is that it's impossible to imagine any of these people doing that—noticing that they're behaving like a TERF's [hostile](/2019/Dec/the-strategy-of-stigmatization/) [stereotype](/2022/Feb/link-never-smile-at-an-autogynephile/) of a narcissistic, gaslighting trans-identified man and snapping out of it. -I want a shared cultural understanding that the correct way to ameliorate the genuine sadness of people not being perceived the way they prefer is through things like better and cheaper facial feminization surgery, not [emotionally blackmailing](/2018/Jan/dont-negotiate-with-terrorist-memeplexes/) people out of their ability to report what they see. I don't _want_ to relinquish [my ability to notice what women's faces look like](/papers/bruce_et_al-sex_discrimination_how_do_we_tell.pdf), even if that means noticing that mine isn't; if I'm sad that it isn't, I can endure the sadness if the alternative is forcing everyone in my life to doublethink around their perceptions of me. 
+I want a shared cultural understanding that the way to ameliorate the sadness of people who aren't being perceived the way they prefer is through things like better and cheaper facial feminization surgery, not [emotionally blackmailing](/2018/Jan/dont-negotiate-with-terrorist-memeplexes/) people out of their ability to report what they see. I don't _want_ to relinquish [my ability to notice what women's faces look like](/papers/bruce_et_al-sex_discrimination_how_do_we_tell.pdf), even if that means noticing that mine isn't one. I can endure being sad about that if the alternative is forcing everyone to doublethink around their perceptions of me.

-In a world where surgery is expensive, but some people desperately want to change sex and other people want to be nice to them, there's an incentive gradient in the direction of re-binding our shared concept of "gender" onto things like [ornamental clothing](http://web.archive.org/web/20210513192331/http://thetranswidow.com/2021/02/18/womens-clothing-is-always-drag-even-on-women/) that are easier to change than secondary sex characteristics.
+In a world where surgery is expensive, but some people desperately want to change sex and other people want to be nice to them, there are incentives to relocate our shared concept of "gender" onto things like [ornamental clothing](http://web.archive.org/web/20210513192331/http://thetranswidow.com/2021/02/18/womens-clothing-is-always-drag-even-on-women/) that are easier to change than secondary sex characteristics.

-But I would have expected people with an inkling of self-awareness and honesty to ... notice the incentives, and notice the problems being created by the incentives, and to talk about the problems in public so that we can coordinate on the best solution, [whatever that turns out to be](/2021/Sep/i-dont-do-policy/)?
+But I would have expected people with an inkling of self-awareness and honesty to notice the incentives, and the problems being created by them, and to talk about the problems in public so that we can coordinate on the best solution, [whatever that turns out to be](/2021/Sep/i-dont-do-policy/)?

And if that's too much to expect of the general public—

And if it's too much to expect garden-variety "rationalists" to figure out on their own without prompting from their superiors—

-Then I would have at least expected Eliezer Yudkowsky to take actions _in favor of_ rather than _against_ his faithful students having these very basic capabilities for reflection, self-observation, and ... _speech_? I would have expected Eliezer Yudkowsky to not _actively exert optimization pressure in the direction of transforming me into a Jane Austen character_.
+Then I would have at least expected Eliezer Yudkowsky to take actions _in favor of_ rather than _against_ his faithful students having these basic capabilities for reflection, self-observation, and ... speech? I would have expected Eliezer Yudkowsky to not _actively exert optimization pressure in the direction of transforming me into a Jane Austen character_.

-This is the part where Yudkowsky or his flunkies accuse me of being uncharitable, of [failing at perspective-taking](https://twitter.com/ESYudkowsky/status/1435617576495714304) and [embracing conspiracy theories](https://twitter.com/ESYudkowsky/status/1708587781424046242). Obviously, Yudkowsky doesn't _think of himself_ as trying to transform his faithful students into Jane Austen characters. One might then ask if it does not therefore follow that I have failed to understand his position?
[As Yudkowsky put it](https://twitter.com/ESYudkowsky/status/1435618825198731270): +This is the part where Yudkowsky or his flunkies accuse me of being uncharitable, of [failing at perspective-taking](https://twitter.com/ESYudkowsky/status/1435617576495714304) and [embracing conspiracy theories](https://twitter.com/ESYudkowsky/status/1708587781424046242). Obviously, Yudkowsky doesn't _think of himself_ as trying to transform his faithful students into Jane Austen characters. Perhaps, then, I have failed to understand his position? [As Yudkowsky put it](https://twitter.com/ESYudkowsky/status/1435618825198731270): > The Other's theory of themselves usually does not make them look terrible. And you will not have much luck just yelling at them about how they must really be doing `terrible_thing` instead. @@ -423,55 +423,55 @@ But the substance of my complaints is [not about Yudkowsky's conscious subjectiv But my complaint is about the work the algorithm is _doing_ in Stalin's service, not about how it feels; I'm talking about a pattern of publicly visible _behavior_ stretching over years, not claiming to be a mind-reader. (Thus, "take actions" in favor of/against, rather than "be"; "exert optimization pressure in the direction of", rather than "try".) I agree that everyone has a story in which they don't look terrible, and that people mostly believe their own stories, but it does not therefore follow that no one ever does anything terrible. -I agree that you won't have much luck yelling at the Other about how they must really be doing `terrible_thing`. (People get very invested in their own stories.) But if you have the _receipts_ of the Other repeatedly doing the thing in public from 2016 to 2021, maybe yelling about it to _everyone else_ might help _them_ stop getting suckered by the Other's empty posturing. +I agree that you won't have much luck yelling at the Other about how they must really be doing `terrible_thing`. But if you have the receipts of the Other repeatedly doing the thing in public from 2016 to 2021, maybe yelling about it to everyone else might help _them_ stop getting suckered by the Other's empty posturing. Let's recap. -In January 2009, Yudkowsky published ["Changing Emotions"](https://www.lesswrong.com/posts/QZs4vkC7cbyjL9XA9/changing-emotions), essentially a revision of [a 2004 mailing list post responding to a man who said that after the Singularity, he'd like to make a female but "otherwise identical" copy of himself](https://archive.is/En6qW). "Changing Emotions" insightfully points out [the deep technical reasons why](/2021/May/sexual-dimorphism-in-the-sequences-in-relation-to-my-gender-problems/#changing-sex-is-hard) men who sexually fantasize about being women can't achieve their dream with foreseeable technology—and not only that, but that the dream itself is conceptually confused: a man's fantasy-about-it-being-fun-to-be-a-woman isn't part of the female distribution; there's a sense in which it _can't_ be fulfilled. +In January 2009, Yudkowsky published ["Changing Emotions"](https://www.lesswrong.com/posts/QZs4vkC7cbyjL9XA9/changing-emotions), essentially a revision of [a 2004 mailing list post responding to a man who said that after the Singularity, he'd like to make a female but "otherwise identical" copy of himself](https://archive.is/En6qW). 
"Changing Emotions" insightfully points out [the deep technical reasons why](/2021/May/sexual-dimorphism-in-the-sequences-in-relation-to-my-gender-problems/#changing-sex-is-hard) men who sexually fantasize about being women can't achieve their dream with foreseeable technology—and not only that, but that the dream itself is conceptually confused: a man's fantasy about it being fun to be a woman isn't part of the female distribution; there's a sense in which it _can't_ be fulfilled. -It was a good post! Though Yudkowsky was merely using the sex change example to illustrate [a more general point about the difficulties of applied transhumanism](https://www.lesswrong.com/posts/EQkELCGiGQwvrrp3L/growing-up-is-hard), "Changing Emotions" was hugely influential on me; I count myself much better off for having understood the argument. +It was a good post! Yudkowsky was merely using the sex change example to illustrate [a more general point about the difficulties of applied transhumanism](https://www.lesswrong.com/posts/EQkELCGiGQwvrrp3L/growing-up-is-hard), but "Changing Emotions" was hugely influential on me; I count myself much better off for having understood the argument. But seven years later, in a March 2016 Facebook post, Yudkowsky [proclaimed that](https://www.facebook.com/yudkowsky/posts/10154078468809228) "for people roughly similar to the Bay Area / European mix, I think I'm over 50% probability at this point that at least 20% of the ones with penises are actually women." -This seemed like a huge and surprising reversal from the position articulated in "Changing Emotions". The two posts weren't _necessarily_ inconsistent, if you assumed gender identity is a real property synonymous with "brain sex", and that the harsh (almost mocking) skepticism of the idea of true male-to-female sex change in "Changing Emotions" was directed at the erotic sex-change fantasies of _cis_ men (with a male gender-identity/brain-sex), whereas the 2016 Facebook post was about _trans women_ (with a female gender-identity/brain-sex), which are a different thing. +This seemed like a huge and surprising reversal from the position articulated in "Changing Emotions". The two posts weren't _necessarily_ inconsistent, if you assumed gender identity is a real property synonymous with "brain sex", and that the harsh (almost mocking) skepticism of the idea of true male-to-female sex change in "Changing Emotions" was directed at the erotic sex-change fantasies of cis men (with a male gender-identity/brain-sex), whereas the 2016 Facebook post was about trans women (with a female gender-identity/brain-sex), which are a different thing. But this potential unification seemed dubious to me, especially if trans women were purported to be "at least 20% of the ones with penises" (!!) in some population. After it's been pointed out, it should be a pretty obvious hypothesis that "guy on the Extropians mailing list in 2004 who fantasizes about having a female but 'otherwise identical' copy of himself" and "guy in 2016 Berkeley who identifies as a trans woman" are the _same guy_. So in October 2016, [I wrote to Yudkowsky noting the apparent reversal and asking to talk about it](/2023/Jul/blanchards-dangerous-idea-and-the-plight-of-the-lucid-crossdreamer/#cheerful-price). Because of the privacy rules I'm adhering to in telling this Whole Dumb Story, I can't confirm or deny whether any such conversation occurred. 
-Then, in November 2018, while criticizing people who refuse to use trans people's preferred pronouns, Yudkowsky proclaimed that "Using language in a way _you_ dislike, openly and explicitly and with public focus on the language and its meaning, is not lying" and that "you're not standing in defense of truth if you insist on a word, brought explicitly into question, being used with some particular meaning". But _that_ seemed like a huge and surprising reversal from the position articulated in ["37 Ways Words Can Be Wrong"](https://www.lesswrong.com/posts/FaJaCgqBKphrDzDSj/37-ways-that-words-can-be-wrong). After attempts to clarify via email failed, I eventually wrote ["Where to Draw the Boundaries?"](https://www.lesswrong.com/posts/esRZaPXSHgWzyB2NL/where-to-draw-the-boundaries) to explain the relevant error in general terms, and Yudkowsky would eventually go on to [clarify his position in September 2020](https://www.facebook.com/yudkowsky/posts/10158853851009228). +Then, in November 2018, while criticizing people who refuse to use trans people's preferred pronouns, Yudkowsky proclaimed that "Using language in a way _you_ dislike, openly and explicitly and with public focus on the language and its meaning, is not lying" and that "you're not standing in defense of truth if you insist on a word, brought explicitly into question, being used with some particular meaning". But _that_ seemed like a huge and surprising reversal from the position articulated in ["37 Ways Words Can Be Wrong"](https://www.lesswrong.com/posts/FaJaCgqBKphrDzDSj/37-ways-that-words-can-be-wrong). After attempts to clarify via email failed, I eventually wrote ["Where to Draw the Boundaries?"](https://www.lesswrong.com/posts/esRZaPXSHgWzyB2NL/where-to-draw-the-boundaries) to explain the relevant error in general terms, and Yudkowsky eventually [clarified his position in September 2020](https://www.facebook.com/yudkowsky/posts/10158853851009228). -But then in February 2021, he reopened the discussion to proclaim that "the simplest and best protocol is, '"He" refers to the set of people who have asked us to use "he", with a default for those-who-haven't-asked that goes by gamete size' and to say that this just _is_ the normative definition", the problems with which post I explained in March 2022's ["Challenges to Yudkowsky's Pronoun Reform Proposal"](/2022/Mar/challenges-to-yudkowskys-pronoun-reform-proposal/) and above. +But then in February 2021, he reopened the discussion to proclaim that "the simplest and best protocol is, '_He_ refers to the set of people who have asked us to use _he_, with a default for those-who-haven't-asked that goes by gamete size' and to say that this just _is_ the normative definition", the problems with which post I explained in March 2022's ["Challenges to Yudkowsky's Pronoun Reform Proposal"](/2022/Mar/challenges-to-yudkowskys-pronoun-reform-proposal/) and above. End recap. -At this point, the nature of the game is very clear. Yudkowsky wants to make sure he's on peaceful terms with the progressive _zeitgeist_, subject to the constraint of [not writing any sentences he knows to be false](https://www.lesswrong.com/posts/xdwbX9pFEr7Pomaxv/meta-honesty-firming-up-honesty-around-its-edge-cases#2__The_law_of_no_literal_falsehood_). Meanwhile, I want to make sense of what's actually going on in the world as regards to sex and gender, because _I need the correct answer to decide whether or not to cut my dick off_. +At this point, the nature of the game is clear. 
Yudkowsky wants to make sure he's on peaceful terms with the progressive _zeitgeist_, subject to the constraint of [not writing any sentences he knows to be false](https://www.lesswrong.com/posts/xdwbX9pFEr7Pomaxv/meta-honesty-firming-up-honesty-around-its-edge-cases#2__The_law_of_no_literal_falsehood_). Meanwhile, I want to make sense of what's actually going on in the world as regarding sex and gender, because _I need the correct answer to decide whether or not to cut my dick off_. -On "his turn", he comes up with some pompous proclamation that's obviously optimized to make the "pro-trans" faction look smart and good and make the "anti-trans" faction look dumb and bad, "in ways that exhibit generally rationalist principles." +On "his turn", he comes up with some pompous proclamation that's obviously optimized to make the "pro-trans" faction look smart and good and the "anti-trans" faction look dumb and bad, "in ways that exhibit generally rationalist principles." -On "my turn", I put in an absurd amount of effort explaining in exhaustive, _exhaustive_ detail why Yudkowsky's pompous proclamation, while [not technically saying making any unambiguously false atomic statements](https://www.lesswrong.com/posts/MN4NRkMw7ggt9587K/firming-up-not-lying-around-its-edge-cases-is-less-broadly), was substantively misleading compared to what any serious person would say if they were trying to make sense of the world without worrying what progressive activists would think of them. +On "my turn", I put in an absurd amount of effort explaining in exhaustive, _exhaustive_ detail why Yudkowsky's pompous proclamation, while [not technically making any unambiguously false atomic statements](https://www.lesswrong.com/posts/MN4NRkMw7ggt9587K/firming-up-not-lying-around-its-edge-cases-is-less-broadly), was substantively misleading compared to what any serious person would say if they were trying to make sense of the world without worrying what progressive activists would think of them. -At the start, I never expected to end up arguing about something so trivial as the minutiae of pronoun conventions (which no one would care about if historical contingencies of the evolution of the English language hadn't made them a Schelling point for things people do care about). The conversation only ended up here after a series of derailings. At the start, I was trying to say something substantive about the psychology of straight men who wish they were women. +At the start, I never expected to end up arguing about the minutiae of pronoun conventions, which no one would care about if contingencies of the English language hadn't made them a Schelling point for things people do care about. The conversation only ended up here after a series of derailings. At the start, I was trying to say something substantive about the psychology of straight men who wish they were women. -In the context of AI alignment theory, Yudkowsky has written about a "nearest unblocked strategy" phenomenon: if you directly prevent an agent from accomplishing a goal via some plan that you find undesirable, the agent will search for ways to route around that restriction, and probably find some plan that you find similarly undesirable for broadly similar reasons. 
+In the context of AI alignment theory, Yudkowsky has written about a "nearest unblocked strategy" phenomenon: if you prevent an agent from accomplishing a goal via some plan that you find undesirable, the agent will search for ways to route around that restriction, and probably find some plan that you find similarly undesirable for similar reasons.

-Suppose you developed an AI to [maximize human happiness subject to the constraint of obeying explicit orders](https://arbital.greaterwrong.com/p/nearest_unblocked#exampleproducinghappiness). It might first try administering heroin to humans. When you order it not to, it might switch to administering cocaine. When you order it to not use any of a whole list of banned happiness-producing drugs, it might switch to researching new drugs, or just _pay_ humans to take heroin, _&c._
+Suppose you developed an AI to [maximize human happiness subject to the constraint of obeying explicit orders](https://arbital.greaterwrong.com/p/nearest_unblocked#exampleproducinghappiness). It might first try forcibly administering heroin to humans. When you order it not to, it might switch to administering cocaine. When you order it not to forcibly administer any kind of drug, it might switch to forcibly implanting electrodes in humans' brains, or just _paying_ the humans to take heroin, _&c._
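+(To make the dynamic concrete, here's a toy sketch in Python, my own illustration rather than anything from the Arbital writeup: a planner maximizes a made-up "happiness" score subject only to a blocklist of individually vetoed plans, where the plan names and scores are invented for the example. Each veto just surfaces the next-highest-scoring plan, because the veto doesn't touch the objective itself.)
+
+```python
+# Toy model of "nearest unblocked strategy" (illustrative only; the plans
+# and scores are made up): the planner maximizes a happiness score subject
+# to nothing but a blocklist of specific plans the overseer has vetoed.
+def best_unblocked_plan(utility, blocked):
+    """Return the highest-scoring plan that hasn't been explicitly vetoed."""
+    return max((plan for plan in utility if plan not in blocked), key=utility.get)
+
+utility = {  # hypothetical scores: direct interventions rate highest
+    "forcibly administer heroin": 100,
+    "forcibly administer cocaine": 99,
+    "forcibly implant electrodes": 98,
+    "pay humans to take heroin": 97,
+    "leave humans alone": 10,
+}
+
+blocked = set()
+for _ in range(4):
+    plan = best_unblocked_plan(utility, blocked)
+    print("planner proposes:", plan)
+    blocked.add(plan)  # Vetoing this one plan leaves the objective intact,
+    # so the next proposal is a near-neighbor, bad for the same reason.
+```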
It's the same thing with Yudkowsky's political risk minimization subject to the constraint of not saying anything he knows to be false. First he comes out with ["I think I'm over 50% probability at this point that at least 20% of the ones with penises are actually women"](https://www.facebook.com/yudkowsky/posts/10154078468809228) (March 2016). When you point out that his own pre–[Great Awokening](https://www.vox.com/2019/3/22/18259865/great-awokening-white-liberals-race-polling-trump-2020) writings [explain why that's not true](https://www.lesswrong.com/posts/QZs4vkC7cbyjL9XA9/changing-emotions), then the next time he revisits the subject, he switches to ["you're not standing in defense of truth if you insist on a word, brought explicitly into question, being used with some particular meaning"](https://archive.is/Iy8Lq) (November 2018). When you point out that his earlier writings also explain why [_that's_ not true either](https://www.lesswrong.com/posts/FaJaCgqBKphrDzDSj/37-ways-that-words-can-be-wrong), he switches to "It is Shenanigans to try to bake your stance on how clustered things are [...] _into the pronoun system of a language and interpretation convention that you insist everybody use_" (February 2021). When you point out that [that's not what's going on](/2022/Mar/challenges-to-yudkowskys-pronoun-reform-proposal/), he switches to ... I don't know, but he's a smart guy; in the unlikely event that he sees fit to respond to this post, I'm sure he'll be able to think of something—but at this point, _I have no reason to care_. Talking to Yudkowsky on topics where getting the right answer would involve acknowledging facts that would make you unpopular in Berkeley is a waste of everyone's time; he has a [bottom line](https://www.lesswrong.com/posts/34XxbRFe54FycoCDw/the-bottom-line) that doesn't involve trying to inform you.

-Accusing one's interlocutor of bad faith is frowned upon for a reason. We would prefer to live in a world where we have intellectually fruitful object-level discussions under the assumption of good faith, rather than risk our fora degenerating into an acrimonious brawl of accusations and name-calling, which is unpleasant and (more importantly) doesn't make any intellectual progress. I, too, would prefer to have a real object-level discussion under the assumption of good faith.
+Accusing one's interlocutor of bad faith is frowned upon for a reason. We would prefer to live in a world where we have intellectually fruitful object-level discussions under the assumption of good faith, rather than risk our fora degenerating into accusations and name-calling, which is unpleasant and (more importantly) doesn't make any intellectual progress.

-Accordingly, I tried the object-level good-faith argument thing _first_. I tried it for _years_. But at some point, I think I should be allowed to notice the nearest-unblocked-strategy game which is obviously happening if you look at the history of what was said. I think there's some number of years and some number of thousands of words[^wordcounts] of litigating the object level (about gender) and the meta level (about the philosophy of categorization) after which there's nothing left for me to do but jump up to the meta-meta level of politics and explain, to anyone capable of hearing it, why I think I've accumulated enough evidence for the assumption of good faith to have been empirically falsified.[^symmetrically-not-assuming-good-faith]
+Accordingly, I tried the object-level good-faith argument thing first. I tried it for _years_. But at some point, I should be allowed to notice the nearest-unblocked-strategy game which is obviously happening. I think there's some number of years and some number of thousands of words[^wordcounts] of litigating the object level (about gender) and the meta level (about the philosophy of categorization) after which there's nothing left to do but jump up to the meta-meta level of politics and explain, to anyone capable of hearing it, why I think I've accumulated enough evidence for the assumption of good faith to have been empirically falsified.[^symmetrically-not-assuming-good-faith]

[^wordcounts]: ["The Categories Were Made for Man to Make Predictions"](/2018/Feb/the-categories-were-made-for-man-to-make-predictions/) (2018), ["Where to Draw the Boundaries?"](https://www.lesswrong.com/posts/esRZaPXSHgWzyB2NL/where-to-draw-the-boundaries) (2019), and ["Unnatural Categories Are Optimized for Deception"](https://www.lesswrong.com/posts/onwgTH6n8wxRSo2BJ/unnatural-categories-are-optimized-for-deception) (2021) total to over 20,000 words.

-[^symmetrically-not-assuming-good-faith]: Obviously, if we're crossing the Rubicon of abandoning the norm of assuming good faith, it needs to be abandoned symmetrically. I _think_ I'm doing a pretty good job of adhering to standards of intellectual conduct and being transparent about my motivations, but I'm definitely not perfect, and, unlike Yudkowsky, I'm not so absurdly mendaciously arrogant to claim "confidence in my own ability to independently invent everything important" (!) about my topics of interest. If Yudkowsky or anyone else thinks they _have a case_ based on my behavior that _I'm_ being culpably intellectually dishonest, they of course have my blessing and encouragement to post it for the audience to evaluate. 
+[^symmetrically-not-assuming-good-faith]: Obviously, if we're abandoning the norm of assuming good faith, it needs to be abandoned symmetrically. I _think_ I'm adhering to standards of intellectual conduct and being transparent about my motivations, but I'm not perfect, and, unlike Yudkowsky, I'm not so absurdly mendaciously arrogant as to claim "confidence in my own ability to independently invent everything important" (!) about my topics of interest. If Yudkowsky or anyone else thinks they have a case that _I'm_ being culpably intellectually dishonest, they of course have my blessing and encouragement to post it for the audience to evaluate.

-What makes all of this especially galling is the fact that _all of my heretical opinions are literally just Yudkowsky's opinions from the 'aughts!_ My whole thing about how changing sex isn't possible with existing or foreseeable technology because of how complicated humans (and therefore human sex differences) are? Not original to me! I [filled in a few technical details](/2021/May/sexual-dimorphism-in-the-sequences-in-relation-to-my-gender-problems/#changing-sex-is-hard), but again, this was in the Sequences as ["Changing Emotions"](https://www.lesswrong.com/posts/QZs4vkC7cbyjL9XA9/changing-emotions). My thing about how you can't define concepts any way you want because there are mathematical laws governing which category boundaries [compress](https://www.lesswrong.com/posts/mB95aqTSJLNR9YyjH/message-length) your [anticipated experiences](https://www.lesswrong.com/posts/a7n8GdKiAZRX86T5A/making-beliefs-pay-rent-in-anticipated-experiences)? Not original to me! I [filled in](https://www.lesswrong.com/posts/esRZaPXSHgWzyB2NL/where-to-draw-the-boundaries) [a few technical details](https://www.lesswrong.com/posts/onwgTH6n8wxRSo2BJ/unnatural-categories-are-optimized-for-deception), but [_we had a whole Sequence about this._](https://www.lesswrong.com/posts/FaJaCgqBKphrDzDSj/37-ways-that-words-can-be-wrong)
+What makes all of this especially galling is that _all of my heretical opinions are literally just Yudkowsky's opinions from the 'aughts!_ My thing about how changing sex isn't possible with existing or foreseeable technology because of how complicated humans (and therefore human sex differences) are? Not original to me! I [filled in a few technical details](/2021/May/sexual-dimorphism-in-the-sequences-in-relation-to-my-gender-problems/#changing-sex-is-hard), but again, this was in the Sequences as ["Changing Emotions"](https://www.lesswrong.com/posts/QZs4vkC7cbyjL9XA9/changing-emotions). My thing about how you can't define concepts any way you want because there are mathematical laws governing which category boundaries [compress](https://www.lesswrong.com/posts/mB95aqTSJLNR9YyjH/message-length) your [anticipated experiences](https://www.lesswrong.com/posts/a7n8GdKiAZRX86T5A/making-beliefs-pay-rent-in-anticipated-experiences)? Not original to me! I [filled in](https://www.lesswrong.com/posts/esRZaPXSHgWzyB2NL/where-to-draw-the-boundaries) [a few technical details](https://www.lesswrong.com/posts/onwgTH6n8wxRSo2BJ/unnatural-categories-are-optimized-for-deception), but [_we had a whole Sequence about this._](https://www.lesswrong.com/posts/FaJaCgqBKphrDzDSj/37-ways-that-words-can-be-wrong)

-Seriously, do you think I'm smart enough to come up with all of this independently? I'm not! I ripped it all off from Yudkowsky back in the 'aughts _when he still gave a shit about telling the truth_. (Actively telling the truth, and not just technically not lying.) 
The things I'm hyperfocused on that he thinks are politically impossible to say in the current year, are almost entirely things he already said, that anyone could just look up! +Seriously, do you think I'm smart enough to come up with all of this independently? I'm not! I ripped it all off from Yudkowsky back in the 'aughts _when he still gave a shit about telling the truth_. (Actively telling the truth, and not just technically not lying.) The things I'm hyperfocused on that he thinks are politically impossible to say in the current year are almost entirely things he already said, that anyone could just look up! -I guess the point is that the egregore doesn't have the reading comprehension for that?—or rather, the egregore has no reason to care about the past; if you get tagged by the mob as an Enemy, your past statements will get dug up as evidence of foul present intent, but if you're doing good enough of playing the part today, no one cares what you said in 2009? +I guess the egregore doesn't have the reading comprehension for that?—or rather, the egregore has no reason to care about the past; if you get tagged by the mob as an Enemy, your past statements will get dug up as evidence of foul present intent, but if you're playing the part well enough today, no one cares what you said in 2009? -Does ... does he expect the rest of us not to _notice_? Or does he think that "everybody knows"? +Does he expect the rest of us not to _notice_? Or does he think that "everybody knows"? But I don't think that everybody knows. And I'm not giving up that easily. Not on an entire subculture full of people. @@ -479,13 +479,13 @@ Yudkowsky [defended his behavior in February 2021](https://twitter.com/ESYudkows > I think that some people model civilization as being in the middle of a great battle in which this tweet, even if true, is giving comfort to the Wrong Side, where I would not have been as willing to tweet a truth helping the Right Side. From my perspective, this battle just isn't that close to the top of my priority list. I rated nudging the cognition of the people-I-usually-respect, closer to sanity, as more important; who knows, those people might matter for AGI someday. And the Wrong Side part isn't as clear to me either. -There are a number of things that could be said to this,[^number-of-things] but most importantly: the battle that matters—the battle with a Right Side and a Wrong Side—isn't "pro-trans" _vs._ "anti-trans". (The central tendency of the contemporary trans rights movement is firmly on the Wrong Side, but that's not the same thing as all trans people as individuals.) That's why Jessica Taylor [joined our posse to try to argue with Yudkowsky in early 2019](/2023/Jul/a-hill-of-validity-in-defense-of-meaning/#jessica-joins). (She wouldn't have, if my objection had been, "trans is Wrong; trans people Bad.") That's why Somni—one of the trans women who [infamously protested the 2019 CfAR reunion](https://www.ksro.com/2019/11/18/new-details-in-arrests-of-masked-camp-meeker-protesters/) for (among other things) CfAR allegedly discriminating against trans women—[understands what I've been saying](https://somnilogical.tumblr.com/post/189782657699/legally-blind). +There are a number of things that could be said to this,[^number-of-things] but most importantly: the battle that matters—the battle with a Right Side and a Wrong Side—isn't "pro-trans" _vs._ "anti-trans". 
(The central tendency of the contemporary trans rights movement is firmly on the Wrong Side, but that's not the same thing as all trans people as individuals.) That's why Jessica Taylor [joined our posse to try to argue with Yudkowsky in early 2019](/2023/Jul/a-hill-of-validity-in-defense-of-meaning/#jessica-joins). (She wouldn't have if my objection had been, "Trans is Wrong; trans people Bad.") That's why Somni—one of the trans women who [infamously protested the 2019 CfAR reunion](https://www.ksro.com/2019/11/18/new-details-in-arrests-of-masked-camp-meeker-protesters/) for (among other things) CfAR allegedly discriminating against trans women—[understands what I've been saying](https://somnilogical.tumblr.com/post/189782657699/legally-blind). -[^number-of-things]: Note the striking contrast between ["A Rational Argument"](https://www.lesswrong.com/posts/9f5EXt8KNNxTAihtZ/a-rational-argument), in which the Yudkowsky of 2007 wrote that a campaign manager "crossed the line [between rationality and rationalization] at the point where you considered whether the questionnaire was favorable or unfavorable to your candidate, before deciding whether to publish it"; and these 2021 Tweets, in which Yudkowsky seems completely nonchalant about "not have been as willing to tweet a truth helping" one side of a cultural dispute, because "this battle just isn't that close to the top of [his] priority list". Well, sure! Any hired campaign manager could say the same: helping the electorate make an optimally informed decision just isn't that close to the top of their priority list, compared to getting paid. +[^number-of-things]: Note the striking contrast between ["A Rational Argument"](https://www.lesswrong.com/posts/9f5EXt8KNNxTAihtZ/a-rational-argument), in which the Yudkowsky of 2007 wrote that a campaign manager "crossed the line [between rationality and rationalization] at the point where [they] considered whether the questionnaire was favorable or unfavorable to [their] candidate, before deciding whether to publish it", and these 2021 Tweets, in which Yudkowsky seems nonchalant about "not hav[ing] been as willing to tweet a truth helping" one side of a cultural dispute, because "this battle just isn't that close to the top of [his] priority list". Well, sure! Any hired campaign manager could say the same: helping the electorate make an optimally informed decision just isn't that close to the top of their priority list, compared to getting paid. - Yudkowsky's claim to have been focused on nudging people's cognition towards sanity seems dubious: if you're focused on sanity, you should be spontaneously noticing sanity errors in both political camps. (Moreover, if you're living in what you yourself describe as a "half-Stalinist environment", you should expect your social environment to make proportionately more errors on the "pro-Stalin" side.) As for the rationale that "those people might matter to AGI someday", [judging by local demographics](/2017/Jan/from-what-ive-tasted-of-desire/), it seems much more likely to apply to trans women themselves, than their critics! + Yudkowsky's claim to have been focused on nudging people's cognition towards sanity seems dubious: if you're focused on sanity, you should be spontaneously noticing sanity errors in both political camps. 
(Moreover, if you're living in what you yourself describe as a "half-Stalinist environment", you should expect your social environment to make proportionately more errors on the "pro-Stalin" side, because Stalinists aren't facing social pressure to avoid errors.) As for the rationale that "those people might matter to AGI someday", [judging by local demographics](/2017/Jan/from-what-ive-tasted-of-desire/), it seems much more likely to apply to trans women themselves than their critics! -The battle that matters—and I've been very explicit about this, for years—is over this proposition eloquently [stated by Scott Alexander in November 2014](https://slatestarcodex.com/2014/11/21/the-categories-were-made-for-man-not-man-for-the-categories/) (redacting the irrelevant object-level example): +The battle that matters—and I've been explicit about this, for years—is over this proposition eloquently [stated by Scott Alexander in November 2014](https://slatestarcodex.com/2014/11/21/the-categories-were-made-for-man-not-man-for-the-categories/) (redacting the irrelevant object-level example): > I ought to accept an unexpected [X] or two deep inside the conceptual boundaries of what would normally be considered [Y] if it'll save someone's life. There's no rule of rationality saying that I shouldn't, and there are plenty of rules of human decency saying that I should. @@ -499,53 +499,53 @@ And you need to be able to say, in public, that trans women are male and trans m If you don't want to say those things because hurting people is wrong, then you have chosen Feelings. -Scott Alexander chose Feelings, but I can't really hold that against him, because Scott is [very explicit about only speaking in the capacity of some guy with a blog](https://slatestarcodex.com/2019/07/04/some-clarifications-on-rationalist-blogging/).[^hexaco] You can tell from his writings that he never wanted to be a religious leader; it just happened to him on accident because he writes faster than everyone else. I like Scott. Scott is alright. I feel sad that such a large fraction of my interactions with him over the years have taken such an adversarial tone. +Scott Alexander chose Feelings, but I can't hold that against him, because Scott is [explicit about only speaking in the capacity of some guy with a blog](https://slatestarcodex.com/2019/07/04/some-clarifications-on-rationalist-blogging/).[^hexaco] You can tell that he never wanted to be a religious leader; it just happened because he writes faster than everyone else. I like Scott. Scott is alright. I feel sad that such a large fraction of my interactions with him over the years have taken such an adversarial tone. [^hexaco]: The authors of the [HEXACO personality model](https://en.wikipedia.org/wiki/HEXACO_model_of_personality_structure) may have gotten something importantly right in [grouping "honesty" and "humility" as a single factor](https://en.wikipedia.org/wiki/Honesty-humility_factor_of_the_HEXACO_model_of_personality). -Eliezer Yudkowsky did not _unambiguously_ choose Feelings. He's been very careful with his words to strategically mood-affiliate with the side of Feelings, without consciously saying anything that he consciously knows to be unambiguously false. And the reason I can hold it against _him_ is because Eliezer Yudkowsky does not identify as just some guy with a blog. Eliezer Yudkowsky is _absolutely_ trying to be a religious leader. 
He markets himself as a master of the hidden Bayesian structure of cognition, who ["aspires to make sure [his] departures from perfection aren't noticeable to others"](https://twitter.com/ESYudkowsky/status/1384671335146692608), who [complains that "too many people think it's unvirtuous to shut up and listen to [him]"](https://twitter.com/ESYudkowsky/status/1509944888376188929). +Eliezer Yudkowsky did not _unambiguously_ choose Feelings. He's been very careful with his words to strategically mood-affiliate with the side of Feelings, without consciously saying anything that he knows to be unambiguously false. And the reason I can hold it against _him_ is because Eliezer Yudkowsky does not identify as just some guy with a blog. Eliezer Yudkowsky is _absolutely_ trying to be a religious leader. He markets himself as a master of the hidden Bayesian structure of cognition, who ["aspires to make sure [his] departures from perfection aren't noticeable to others"](https://twitter.com/ESYudkowsky/status/1384671335146692608), who [complains that "too many people think it's unvirtuous to shut up and listen to [him]"](https://twitter.com/ESYudkowsky/status/1509944888376188929). In making such boasts, I think Yudkowsky is opting in to being held to higher standards than other mortals. If Scott Alexander gets something wrong when I was trusting him to be right, that's disappointing, but I'm not the victim of false advertising, because Scott Alexander doesn't claim to be anything more than some guy with a blog. If I trusted him more than that, that's on me. -If Eliezer Yudkowsky gets something wrong when I was trusting him to be right, and refuses to acknowledge corrections (in the absence of an unsustainable 21-month nagging campaign) and keeps inventing new galaxy-brained ways to be wrong in the service of his political agenda of being seen to agree with Stalin without technically lying, then I think I _am_ the victim of false advertising. His marketing bluster was designed to trick people like me into trusting him, even if my being dumb enough to believe him is on me. +If Eliezer Yudkowsky gets something wrong when I was trusting him to be right, and refuses to acknowledge corrections (in the absence of an unsustainable 21-month nagging campaign), and keeps inventing new galaxy-brained ways to be wrong in the service of his political agenda of being seen to agree with Stalin without technically lying, then I think I _am_ the victim of false advertising. His marketing bluster was designed to trick people like me into trusting him, even if my being dumb enough to believe him is on me. -Because, I did, actually, trust him. Back in 2009 when _Less Wrong_ was new, we had a thread of hyperbolic ["Eliezer Yudkowsky Facts"](https://www.lesswrong.com/posts/Ndtb22KYBxpBsagpj/eliezer-yudkowsky-facts) (in the style of [Chuck Norris facts](https://en.wikipedia.org/wiki/Chuck_Norris_facts)). And of course, it was a joke, but the joke was one of over-the-top exaggeration of a hero worship that was very real. (You wouldn't make those jokes for your community college physics teacher, even if they were a good teacher.) +Because, I did, actually, trust him. Back in 2009 when _Less Wrong_ was new, we had a thread of hyperbolic ["Eliezer Yudkowsky Facts"](https://www.lesswrong.com/posts/Ndtb22KYBxpBsagpj/eliezer-yudkowsky-facts) (in the style of [Chuck Norris facts](https://en.wikipedia.org/wiki/Chuck_Norris_facts)). 
["Never go in against Eliezer Yudkowsky when anything is on the line"](https://www.greaterwrong.com/posts/Ndtb22KYBxpBsagpj/eliezer-yudkowsky-facts/comment/Aq9eWJmK6Liivn8ND), said one of the facts—and back then, I didn't think I would _need_ to. -["Never go in against Eliezer Yudkowsky when anything is on the line"](https://www.greaterwrong.com/posts/Ndtb22KYBxpBsagpj/eliezer-yudkowsky-facts/comment/Aq9eWJmK6Liivn8ND), said one of the facts—and back then, I didn't think I would _need_ to. +Part of what made him so trustworthy back then was that he wasn't asking for trust. He clearly _did_ think it was [unvirtuous to just shut up and listen to him](https://www.lesswrong.com/posts/t6Fe2PsEwb3HhcBEr/the-litany-against-gurus): "I'm not sure that human beings realistically _can_ trust and think at the same time," [he wrote](https://www.lesswrong.com/posts/wustx45CPL5rZenuo/no-safe-defense-not-even-science). He was always arrogant, but it was tempered by the expectation of being held to account by arguments rather than being deferred to as a social superior. "I try in general to avoid sending my brain signals which tell it that I am high-status, just in case that causes my brain to decide it is no longer necessary," [he wrote](https://www.lesswrong.com/posts/cgrvvp9QzjiFuYwLi/high-status-and-stupidity-why). -Part of what made him so trustworthy back then was that he wasn't asking for trust. He clearly _did_ think it was [unvirtuous to just shut up and listen to him](https://www.lesswrong.com/posts/t6Fe2PsEwb3HhcBEr/the-litany-against-gurus): "I'm not sure that human beings realistically _can_ trust and think at the same time," [he wrote](https://www.lesswrong.com/posts/wustx45CPL5rZenuo/no-safe-defense-not-even-science). He was always arrogant, but it was an arrogance tempered by the expectation of being held to account by arguments; he wrote about ["avoid[ing] sending [his] brain signals which tell it that [he was] high-status, just in case that cause[d his] brain to decide it [was] no longer necessary."](https://www.lesswrong.com/posts/cgrvvp9QzjiFuYwLi/high-status-and-stupidity-why). +He visibly [cared about other people being in touch with reality](https://www.lesswrong.com/posts/anCubLdggTWjnEvBS/your-rationality-is-my-business). "I've informed a number of male college students that they have large, clearly detectable body odors. In every single case so far, they say nobody has ever told them that before," [he wrote](https://www.greaterwrong.com/posts/kLR5H4pbaBjzZxLv6/polyhacking/comment/rYKwptdgLgD2dBnHY). (I can testify that this is true: while sharing a car ride with Anna Salamon in 2011, he told me I had B.O.) -He visibly [cared about other people being in touch with reality](https://www.lesswrong.com/posts/anCubLdggTWjnEvBS/your-rationality-is-my-business). Not just by writing the Sequences, but also things like how he [reported](https://www.greaterwrong.com/posts/kLR5H4pbaBjzZxLv6/polyhacking/comment/rYKwptdgLgD2dBnHY), "I've informed a number of male college students that they have large, clearly detectable body odors. In every single case so far, they say nobody has ever told them that before." (I can testify that this is true: while sharing a car ride with Anna Salamon in 2011, he told me I had B.O.) 
+Telling people about their body odor represents an above-and-beyond devotion to truth-telling: it's an area where people would benefit from feedback (if you know, you can invest in deodorant) but aren't getting that feedback by default (because no one wants to be so rude as to tell people they smell bad). -Telling people about their body odor represents an above-and-beyond devotion to truth-telling: it's an area where people would benefit from feedback (if you know, you can invest in deodorant), but aren't getting that feedback by default (because no one wants to be so rude as to tell people they smell bad). +Really, a lot of the epistemic heroism here is just in [noticing](https://www.lesswrong.com/posts/SA79JMXKWke32A3hG/original-seeing) the conflict between Feelings and Truth, between Politeness and Truth, rather than necessarily acting on it. If telling a person they smell bad would predictably meet harsh social punishment, I couldn't blame someone for consciously choosing silence and safety over telling the truth. -Really, a lot of the epistemic heroism here is just in [noticing](https://www.lesswrong.com/posts/SA79JMXKWke32A3hG/original-seeing) the conflict between Feelings and Truth, between Politeness and Truth, rather than necessarily acting on it. If telling someone they smell bad would predictably meet harsh social punishment, I couldn't blame someone for choosing silence and safety over telling the truth, with the awareness that they were so choosing. +What I can and do blame someone for is actively fighting for Feelings while misrepresenting himself as the rightful caliph of epistemic rationality. There are a lot of trans people who would benefit from feedback that they don't pass but aren't getting that feedback by default. I wouldn't necessarily expect Yudkowsky to provide it. (I don't, either.) -What I can and do blame someone for is actively fighting for Feelings while misrepresenting oneself as a soldier for Truth. There are a lot of trans people who would benefit from feedback that they don't pass, but aren't getting that feedback by default. I wouldn't necessarily expect Yudkowsky to provide it. (I don't, either.) - -I _would_ expect the person who wrote the Sequences not to insist that the important thing is the feelings of human beings who are people describing reasons someone does not like to be tossed into a Smells Bad bucket which don't bear on the factual question of whether someone smells bad. +I _would_ expect the person who wrote the Sequences not to publicly proclaim that the important thing is the feelings of people describing reasons someone does not like to be tossed into a Smells Bad bucket which don't bear on the factual question of whether someone smells bad. That person is dead now, even if his body is still breathing. -I think he knows it. In a November 2022 Discord discussion, [he remarked](yudkowsky-i_might_have_made_a_fundamental_mistake.png): +I think he knows it. In a November 2022 Discord discussion, [he remarked](/images/yudkowsky-i_might_have_made_a_fundamental_mistake.png): > I might have made a fundamental mistake when I decided, long ago, that I was going to try to teach people how to reason so that they'd be able to process my arguments about AGI and AGI alignment through a mechanism that would discriminate true from false statements. > > maybe I should've just learned to persuade people of things instead -I got offended. 
I said that I felt like a devout Catholic watching the Pope say, "Jesus sucks; I hate God; I never should have told people about God." +I got offended. I felt like a devout Catholic watching the Pope say, "Jesus sucks; I hate God; I never should have told people about God." -Later, I felt the need to write another message clarifying exactly what I found offensive. The problem wasn't the condescension of the suggestion that other people couldn't reason. People being annoyed at the condescension was fine. The _problem_ was that just learning to persuade people of things instead was giving up on deep hidden-structure-of-normative-reasoning principle, that the arguments you use to convince others should be the same as the ones you used to decide which conclusion to argue for. Giving up on that amounted to giving up on the _concept_ of intellectual honesty, choosing instead to become a propaganda AI that calculates what signals to output in order to manipulate an agentless world. +Later, I felt the need to clarify exactly what I found offensive. The problem wasn't the condescension of the suggestion that other people couldn't reason. People being annoyed at the condescension was fine. The problem was that "just learn[ing] to persuade people of things instead" was giving up on the principle that the arguments you use to convince others should be the same as the ones you used to decide which conclusion to argue for. Giving up on that amounted to giving up on the _concept_ of intellectual honesty, choosing instead to become a propaganda AI that calculates what signals to output in order to manipulate an agentless world. -[He put a check-mark emoji on it](davis-amounts-to-giving-up-on-the-concept-of-intellectual-honesty.png), indicating agreement or approval. +[He put a check-mark emoji on it](/images/davis-amounts-to-giving-up-on-the-concept-of-intellectual-honesty.png), indicating agreement or approval. If the caliph has lost his faith in the power of intellectual honesty, I can't necessarily say he's wrong on the empirical merits. It is written that our world is [beyond the reach of God](https://www.lesswrong.com/posts/sYgv4eYH82JEsTD34/beyond-the-reach-of-god); there's no law of physics that says honesty must yield better results than propaganda. -But since I haven't relinquished my faith, I have the responsibility to point it out when he attempts to wield his priestly authority as the author of the Sequences while not being consistently candid in his communications with his followers, hindering their ability to exercise their responsibilities. The modern Yudkowsky [writes](https://twitter.com/ESYudkowsky/status/1096769579362115584): +But since I haven't relinquished my faith, I have the responsibility to point out that the formerly rightful caliph has relinquished his Art and lost his powers. + +The modern Yudkowsky [writes](https://twitter.com/ESYudkowsky/status/1096769579362115584): > When an epistemic hero seems to believe something crazy, you are often better off questioning "seems to believe" before questioning "crazy", and both should be questioned before shaking your head sadly about the mortal frailty of your heroes. -I notice that this advice leaves out a possibility: that the "seems to believe" is a deliberate show (judged to be personally prudent and not community-harmful), rather than a misperception on your part. 
I am left shaking my head in a [weighted average of](https://www.lesswrong.com/posts/y4bkJTtG3s5d6v36k/stupidity-and-dishonesty-explain-each-other-away) sadness about the mortal frailty of my former hero, and disgust at his craven duplicity. **If Eliezer Yudkowsky can't _unambiguously_ choose Truth over Feelings, _then Eliezer Yudkowsky is a fraud_.** +I notice that this advice fails to highlight the possibility that the "seems to believe" is a deliberate show (judged to be personally prudent and not community-harmful), rather than a misperception on your part. I am left shaking my head in a [weighted average of](https://www.lesswrong.com/posts/y4bkJTtG3s5d6v36k/stupidity-and-dishonesty-explain-each-other-away) sadness about the mortal frailty of my former hero, and disgust at his duplicity. **If Eliezer Yudkowsky can't _unambiguously_ choose Truth over Feelings, _then Eliezer Yudkowsky is a fraud_.** A few clarifications are in order here. First, as with "bad faith", this usage of "fraud" isn't a meaningless [boo light](https://www.lesswrong.com/posts/dLbkrPu5STNCBLRjr/applause-lights). I specifically and literally mean it in [_Merriam-Webster_'s sense 2.a., "a person who is not what he or she pretends to be"](https://www.merriam-webster.com/dictionary/fraud)—and I think I've made my case. Someone who disagrees with my assessment needs to argue that I've gotten some specific thing wrong, [rather than objecting to character attacks on procedural grounds](https://www.lesswrong.com/posts/pkaagE6LAsGummWNv/contra-yudkowsky-on-epistemic-conduct-for-author-criticism). @@ -553,4 +553,4 @@ Second, it's a conditional: _if_ Yudkowsky can't unambiguously choose Truth over He probably won't. We've already seen from his behavior that he doesn't give a shit what people like me think of his intellectual integrity. Why would that change? -Third, given that "fraud" is a semantically meaningful description and not just a emotive negative evaluation, I should stress that the evaluation is a separate step. If being a fraud were necessary for saving the world, maybe being a fraud would be the right thing to do? More on this in the next post. (To be continued.) +Third, given that "fraud" is a semantically meaningful description rather than an emotive negative evaluation, I should stress that evaluation is a separate step. If being a fraud were necessary for saving the world, maybe being a fraud would be the right thing to do? More on this in the next post. (To be continued.) -- 2.17.1