From: Zack M. Davis Date: Sat, 16 Dec 2023 01:14:11 +0000 (-0800) Subject: memoir: pt. 4 belated ispell pass X-Git-Url: http://unremediatedgender.space/source?a=commitdiff_plain;h=4c202c95f33d7fc47de8536985979ffb18abed72;p=Ultimately_Untrue_Thought.git memoir: pt. 4 belated ispell pass (Should've done this before sending it to editor and red team.) --- diff --git a/content/drafts/agreeing-with-stalin-in-ways-that-exhibit-generally-rationalist-principles.md b/content/drafts/agreeing-with-stalin-in-ways-that-exhibit-generally-rationalist-principles.md index 17a0fe1..f6bee70 100644 --- a/content/drafts/agreeing-with-stalin-in-ways-that-exhibit-generally-rationalist-principles.md +++ b/content/drafts/agreeing-with-stalin-in-ways-that-exhibit-generally-rationalist-principles.md @@ -9,7 +9,7 @@ Status: draft > > —_Atlas Shrugged_ by Ayn Rand -Quickly recapping my Whole Dumb Story so far: [ever since puberty, I've had this obsessive sexual fantasy about being magically transformed into a woman, which got contextualized by these life-changing Sequences of blog posts by Eliezer Yudkowsky that taught me (amongst many, many other things) how fundamentally disconnected from reality my fantasy was.](/2021/May/sexual-dimorphism-in-the-sequences-in-relation-to-my-gender-problems/) [So it came as a huge surprise when, around 2016, the "rationalist" community that had formed around the Sequences seemingly unanimously decided that guys like me might actually be women in some unspecified metaphysical sense.](/2023/Jul/blanchards-dangerous-idea-and-the-plight-of-the-lucid-crossdreamer/) [A couple years later, having strenuously argued against the popular misconception that the matter could be resolved by simply redefining the word _woman_ (on the grounds that you can define the word any way you like), I flipped out when Yudkowsky prevaricated about how his own philosophy of language says that you can't define a word any way you like, prompting me to join with allies to persuade him to 
clarify.](/2023/Jul/a-hill-of-validity-in-defense-of-meaning/) [When that failed, my attempts to cope with the "rationalists" being fake led to a series of small misadventures culminating in Yudkowsky eventually clarifying the philosophy-of-lanugage issue after I ran out of patience and yelled at him over email.](/2023/Dec/if-clarity-seems-like-death-to-them/) +Quickly recapping my Whole Dumb Story so far: [ever since puberty, I've had this obsessive sexual fantasy about being magically transformed into a woman, which got contextualized by these life-changing Sequences of blog posts by Eliezer Yudkowsky that taught me (amongst many, many other things) how fundamentally disconnected from reality my fantasy was.](/2021/May/sexual-dimorphism-in-the-sequences-in-relation-to-my-gender-problems/) [So it came as a huge surprise when, around 2016, the "rationalist" community that had formed around the Sequences seemingly unanimously decided that guys like me might actually be women in some unspecified metaphysical sense.](/2023/Jul/blanchards-dangerous-idea-and-the-plight-of-the-lucid-crossdreamer/) [A couple years later, having strenuously argued against the popular misconception that the matter could be resolved by simply redefining the word _woman_ (on the grounds that you can define the word any way you like), I flipped out when Yudkowsky prevaricated about how his own philosophy of language says that you can't define a word any way you like, prompting me to join with allies to persuade him to clarify.](/2023/Jul/a-hill-of-validity-in-defense-of-meaning/) [When that failed, my attempts to cope with the "rationalists" being fake led to a series of small misadventures culminating in Yudkowsky eventually clarifying the philosophy-of-language issue after I ran out of patience and yelled at him over email.](/2023/Dec/if-clarity-seems-like-death-to-them/) Really, that should have been the end of the story—with a relatively happy ending, too: that it's possible to correct 
straightforward philosophical errors, at the cost of almost two years of desperate effort by someone with [Something to Protect](https://www.lesswrong.com/posts/SGR4GxFK7KmW7ckCB/something-to-protect). @@ -63,9 +63,9 @@ So the _New York Times_ implicitly accuses us of being racists, like Charles Mur I didn't speak up at the time of the old-email scandal, either. I had other things to do with my attention and Overton budget. -It works surprisingly well. I fear my love of Truth is not so great that if I didn't have Something to Protect, I would have happily participated in the coverup. +It works surprisingly well. I fear my love of Truth is not so great that if I didn't have Something to Protect, I would have happily participated in the cover-up. -As it happens, in our world, the defensive coverup consists of _throwing me under the bus_. Facing censure from the progressive egregore for being insufficiently progressive, we can't defend ourselves ideologically. (We think we're egalitarians, but progressives won't buy that because we like markets too much.) We can't point to our racial diversity. (Mostly white if not Jewish, with a handful of East and South Asians, exactly as you'd expect from chapters 13 and 14 of _The Bell Curve_.) [Subjectively](https://en.wikipedia.org/wiki/Availability_heuristic), I felt like the sex balance got a little better after we hybridized with Tumblr and Effective Altruism (as [contrasted with the old days](/2017/Dec/a-common-misunderstanding-or-the-spirit-of-the-staircase-24-january-2009/)), but survey data doesn't unambiguously back this up.[^survey-data] +As it happens, in our world, the defensive cover-up consists of _throwing me under the bus_. Facing censure from the progressive egregore for being insufficiently progressive, we can't defend ourselves ideologically. (We think we're egalitarians, but progressives won't buy that because we like markets too much.) We can't point to our racial diversity. 
(Mostly white if not Jewish, with a handful of East and South Asians, exactly as you'd expect from chapters 13 and 14 of _The Bell Curve_.) [Subjectively](https://en.wikipedia.org/wiki/Availability_heuristic), I felt like the sex balance got a little better after we hybridized with Tumblr and Effective Altruism (as [contrasted with the old days](/2017/Dec/a-common-misunderstanding-or-the-spirit-of-the-staircase-24-january-2009/)), but survey data doesn't unambiguously back this up.[^survey-data] [^survey-data]: We go from 89.2% male in the [2011 _Less Wrong_ survey](https://www.lesswrong.com/posts/HAEPbGaMygJq8L59k/2011-survey-results) to a virtually unchanged 88.7% male on the [2020 _Slate Star Codex_ survey](https://slatestarcodex.com/2020/01/20/ssc-survey-results-2020/)—although the [2020 EA survey](https://forum.effectivealtruism.org/posts/ThdR8FzcfA8wckTJi/ea-survey-2020-demographics) says only 71% male, so it depends on how you draw the category boundaries of "we." @@ -73,7 +73,7 @@ But _trans!_ We have plenty of those! In [the same blog post in which Scott Alex The benefit of having plenty of trans people is that high-ranking members of the [progressive stack](https://en.wikipedia.org/wiki/Progressive_stack) can be trotted out as a shield to prove that we're not counterrevolutionary right-wing Bad Guys. Thus, [Jacob Falkovich noted](https://twitter.com/yashkaf/status/1275524303430262790) (on 23 June 2020, just after _Slate Star Codex_ went down), "The two demographics most over-represented in the SlateStarCodex readership according to the surveys are transgender people and Ph.D. holders", and Scott Aaronson [noted (in commentary on the February 2021 _Times_ article) that](https://www.scottaaronson.com/blog/?p=5310) "the rationalist community's legendary openness to alternative gender identities and sexualities" should have "complicated the picture" of our portrayal as anti-feminist. 
-Even the haters grudgingly give Alexander credit for ["The Categories Were Made for Man, Not Man for the Categories"](https://slatestarcodex.com/2014/11/21/the-categories-were-made-for-man-not-man-for-the-categories/): ["I strongly disagree that one good article about accepting transness means you get to walk away from writing that is somewhat white supremacist and quite fascist without at least awknowledging you were wrong"](https://archive.is/SlJo1), wrote one. +Even the haters grudgingly give Alexander credit for ["The Categories Were Made for Man, Not Man for the Categories"](https://slatestarcodex.com/2014/11/21/the-categories-were-made-for-man-not-man-for-the-categories/): ["I strongly disagree that one good article about accepting transness means you get to walk away from writing that is somewhat white supremacist and quite fascist without at least acknowledging you were wrong"](https://archive.is/SlJo1), wrote one. Under these circumstances, dethroning the supremacy of gender identity ideology is politically impossible. All our [Overton margin](https://www.lesswrong.com/posts/DoPo4PDjgSySquHX8/heads-i-win-tails-never-heard-of-her-or-selective-reporting) is already being spent somewhere else; sanity on this topic is our [dump stat](https://tvtropes.org/pmwiki/pmwiki.php/Main/DumpStat). @@ -111,7 +111,7 @@ I'll agree that the problems shouldn't be confused. Psychology is complicated, a But I think it's important to notice both problems, instead of pretending that the only problem was Brennan's disregard for Alexander's privacy. It's one thing to believe that people should keep promises that they, themselves, explicitly made. But instructing commenters not to link to the email seems to imply not just that Brennan should keep _his_ promises, but that everyone else is obligated to participate in a conspiracy to conceal information that Alexander would prefer concealed. 
I can see an ethical case for it, analogous to returning stolen property after it's already been sold and expecting buyers not to buy items that they know have been stolen. (If Brennan had obeyed Alexander's confidentiality demand, we wouldn't have an email to link to, so if we wish Brennan had obeyed, we can just act as if we don't have an email to link to.) -But there's also a non-evil-bully case for wanting to reveal information, rather than participate in a coverup to protect the image of the "rationalists" as non-threatening to the progressive egregore. If the orchestrators of the coverup can't even acknowledge to themselves that they're orchestrating a coverup, they're liable to be confusing themselves about other things, too. +But there's also a non-evil-bully case for wanting to reveal information, rather than participate in a cover-up to protect the image of the "rationalists" as non-threatening to the progressive egregore. If the orchestrators of the cover-up can't even acknowledge to themselves that they're orchestrating a cover-up, they're liable to be confusing themselves about other things, too. As it happened, I had another social media interaction with Yudkowsky that same day, 18 February 2021. Concerning the psychology of people who hate on "rationalists" for alleged sins that don't particularly resemble anything we do or believe, [he wrote](https://twitter.com/ESYudkowsky/status/1362514650089156608): @@ -199,7 +199,7 @@ If you can see why uncritically affirming people's current self-image isn't the In an article titled ["Actually, I Was Just Crazy the Whole Time"](https://somenuanceplease.substack.com/p/actually-i-was-just-crazy-the-whole), FtMtF detransitioner Michelle Alleva contrasts her current beliefs with those when she decided to transition. While transitioning, she accounted for many pieces of evidence about herself ("dislikes attention as a female", "obsessive thinking about gender", "doesn't fit in with the girls", _&c_.) 
in terms of the theory "It's because I'm trans." But now, Alleva writes, she thinks she has a variety of better explanations that, all together, cover the original list: "It's because I'm autistic," "It's because I have unresolved trauma," "It's because women are often treated poorly" ... including "That wasn't entirely true" (!!). -This is a rationality skill. Alleva had a theory about herself, which she revised upon further consideration of the evidence. Beliefs about onesself aren't special and can—must—be updated using the _same_ methods that you would use to reason about anything else—[just as a recursively self-improving AI would reason the same about transistors "inside" the AI and transistors in "the environment."](https://www.lesswrong.com/posts/TynBiYt6zg42StRbb/my-kind-of-reflection)[^the-form-of-the-inference] +This is a rationality skill. Alleva had a theory about herself, which she revised upon further consideration of the evidence. Beliefs about oneself aren't special and can—must—be updated using the _same_ methods that you would use to reason about anything else—[just as a recursively self-improving AI would reason the same about transistors "inside" the AI and transistors in "the environment."](https://www.lesswrong.com/posts/TynBiYt6zg42StRbb/my-kind-of-reflection)[^the-form-of-the-inference] [^the-form-of-the-inference]: Note, I'm specifically praising the _form_ of the inference, not necessarily the conclusion to detransition. If someone else in different circumstances weighed up the evidence about themself, and concluded that they _are_ trans in some specific objective sense on the empirical merits, that would also be exhibiting the skill. For extremely sex-atypical same-natal-sex-attracted transsexuals, you can at least see why the "born in the wrong body" story makes some sense as a handwavy [first approximation](/2022/Jul/the-two-type-taxonomy-is-a-useful-approximation-for-a-more-detailed-causal-model/). 
It's just that for males like me, and separately for females like Michelle Alleva, the story doesn't [pay rent](https://www.lesswrong.com/posts/a7n8GdKiAZRX86T5A/making-beliefs-pay-rent-in-anticipated-experiences). @@ -360,7 +360,7 @@ For the savvy people in the know, it would certainly be convenient if everyone s [Policy debates should not appear one-sided.](https://www.lesswrong.com/posts/PeSzc9JTBxhaYRp9b/policy-debates-should-not-appear-one-sided) Faced with this kind of dilemma, I can't say that defying Power is necessarily the right choice: if there really were no other options between deceiving your readers with a bad faith performance, and incurring Power's wrath, and Power's wrath would be too terrible to bear, then maybe deceiving your readers with a bad faith performance is the right thing to do. -But if you cared about not deceiving your readers, you would want to be sure that those _really were_ the only two options. You'd [spend five minutes by the clock looking for third alternatives](https://www.lesswrong.com/posts/erGipespbbzdG5zYb/the-third-alternative)—including, possibly, not issuing proclamations on your honor as leader of the so-called "rationalist" community on topics where you _explicitly intend to ignore politically unfavorable counteraguments_. Yudkowsky rejects this alternative on the grounds that it allegedly implies "utter silence about everything Stalin has expressed an opinion on including '2 + 2 = 4' because if that logically counterfactually were wrong you would not be able to express an opposing opinion". +But if you cared about not deceiving your readers, you would want to be sure that those _really were_ the only two options. 
You'd [spend five minutes by the clock looking for third alternatives](https://www.lesswrong.com/posts/erGipespbbzdG5zYb/the-third-alternative)—including, possibly, not issuing proclamations on your honor as leader of the so-called "rationalist" community on topics where you _explicitly intend to ignore politically unfavorable counterarguments_. Yudkowsky rejects this alternative on the grounds that it allegedly implies "utter silence about everything Stalin has expressed an opinion on including '2 + 2 = 4' because if that logically counterfactually were wrong you would not be able to express an opposing opinion". I think he's playing dumb here. In other contexts, he's written about ["attack[s] performed by selectively reporting true information"](https://twitter.com/ESYudkowsky/status/1634338145016909824) and ["[s]tatements which are technically true but which deceive the listener into forming further beliefs which are false"](https://hpmor.com/chapter/97). He's undoubtedly familiar with the motte-and-bailey doctrine as [described by Nicholas Shackel](https://philpapers.org/archive/SHATVO-2.pdf) and [popularized by Scott Alexander](https://slatestarcodex.com/2014/11/03/all-in-all-another-brick-in-the-motte/). I think that _if he wanted to_, Eliezer Yudkowsky could think of some relevant differences between "2 + 2 = 4" and "the simplest and best protocol is, '"He" refers to the set of people who have asked us to use "he"'".
-The _problem_ is that people who are trying to be people, people who are trying to acheive their goals _in reality_, do so in a way that involves having concepts of their own minds, and trying to improve both their self-models and their selves—and that's not possible in a culture that tries to ban as heresy the idea that it's possible for someone's self-model to be wrong. +The _problem_ is that people who are trying to be people, people who are trying to achieve their goals _in reality_, do so in a way that involves having concepts of their own minds, and trying to improve both their self-models and their selves—and that's not possible in a culture that tries to ban as heresy the idea that it's possible for someone's self-model to be wrong. A trans woman I follow on Twitter complained that a receptionist at her workplace said she looked like some male celebrity. "I'm so mad," she fumed. "I look like this right now"—there was a photo attached to the Tweet—"how could anyone ever think that was an okay thing to say?" @@ -403,7 +403,7 @@ It is genuinely sad that the author of those Tweets didn't get perceived in the _It was a compliment!_ That receptionist was almost certainly thinking of someone like [David Bowie](https://en.wikipedia.org/wiki/David_Bowie) or [Eddie Izzard](https://en.wikipedia.org/wiki/Eddie_Izzard), rather than being hateful and trying to hurt. The author should have graciously accepted the compliment, and _done something to pass better next time_. The horror of trans culture is that it's impossible to imagine any of these people doing that—noticing that they're behaving like a TERF's [hostile](/2019/Dec/the-strategy-of-stigmatization/) [stereotype](/2022/Feb/link-never-smile-at-an-autogynephile/) of a narcissistic, gaslighting trans-identified man and snapping out of it. 
-I want a shared cultural understanding that the correct way to ameliorate the genuine sadness of people not being perceived the way they prefer is through things like better and cheaper facial feminization surgery, not [emotionally blackmailing](/2018/Jan/dont-negotiate-with-terrorist-memeplexes/) people out of their ability to report what they see. I don't _want_ to reliniqush [my ability to notice what women's faces look like](/papers/bruce_et_al-sex_discrimination_how_do_we_tell.pdf), even if that means noticing that mine isn't; if I'm sad that it isn't, I can endure the sadness if the alternative is forcing everyone in my life to doublethink around their perceptions of me. +I want a shared cultural understanding that the correct way to ameliorate the genuine sadness of people not being perceived the way they prefer is through things like better and cheaper facial feminization surgery, not [emotionally blackmailing](/2018/Jan/dont-negotiate-with-terrorist-memeplexes/) people out of their ability to report what they see. I don't _want_ to relinquish [my ability to notice what women's faces look like](/papers/bruce_et_al-sex_discrimination_how_do_we_tell.pdf), even if that means noticing that mine isn't; if I'm sad that it isn't, I can endure the sadness if the alternative is forcing everyone in my life to doublethink around their perceptions of me. In a world where surgery is expensive, but some people desperately want to change sex and other people want to be nice to them, there's an incentive gradient in the direction of re-binding our shared concept of "gender" onto things like [ornamental clothing](http://web.archive.org/web/20210513192331/http://thetranswidow.com/2021/02/18/womens-clothing-is-always-drag-even-on-women/) that are easier to change than secondary sex characteristics. @@ -427,7 +427,7 @@ I agree that you won't have much luck yelling at the Other about how they must r Let's recap. 
-In January 2009, Yudkowsky published ["Changing Emotions"](https://www.lesswrong.com/posts/QZs4vkC7cbyjL9XA9/changing-emotions), essentially a revision of [a 2004 mailing list post responding to a man who said that after the Singularity, he'd like to make a female but "otherwise identical" copy of himself](https://archive.is/En6qW). "Changing Emotions" insightfully points out [the deep technical reasons why](/2021/May/sexual-dimorphism-in-the-sequences-in-relation-to-my-gender-problems/#changing-sex-is-hard) men who sexually fantasize about being women can't achieve their dream with forseeable technology—and not only that, but that the dream itself is conceptually confused: a man's fantasy-about-it-being-fun-to-be-a-woman isn't part of the female distribution; there's a sense in which it _can't_ be fulfilled. +In January 2009, Yudkowsky published ["Changing Emotions"](https://www.lesswrong.com/posts/QZs4vkC7cbyjL9XA9/changing-emotions), essentially a revision of [a 2004 mailing list post responding to a man who said that after the Singularity, he'd like to make a female but "otherwise identical" copy of himself](https://archive.is/En6qW). "Changing Emotions" insightfully points out [the deep technical reasons why](/2021/May/sexual-dimorphism-in-the-sequences-in-relation-to-my-gender-problems/#changing-sex-is-hard) men who sexually fantasize about being women can't achieve their dream with foreseeable technology—and not only that, but that the dream itself is conceptually confused: a man's fantasy-about-it-being-fun-to-be-a-woman isn't part of the female distribution; there's a sense in which it _can't_ be fulfilled. It was a good post! Though Yudkowsky was merely using the sex change example to illustrate [a more general point about the difficulties of applied transhumanism](https://www.lesswrong.com/posts/EQkELCGiGQwvrrp3L/growing-up-is-hard), "Changing Emotions" was hugely influential on me; I count myself much better off for having understood the argument. 
@@ -435,9 +435,9 @@ But seven years later, in a March 2016 Facebook post, Yudkowsky [proclaimed that This seemed like a huge and surprising reversal from the position articulated in "Changing Emotions". The two posts weren't _necessarily_ inconsistent, if you assumed gender identity is a real property synonymous with "brain sex", and that the harsh (almost mocking) skepticism of the idea of true male-to-female sex change in "Changing Emotions" was directed at the erotic sex-change fantasies of _cis_ men (with a male gender-identity/brain-sex), whereas the 2016 Facebook post was about _trans women_ (with a female gender-identity/brain-sex), which are a different thing. -But this potential unification seemed dubious to me, especially if trans women were purported to be "at least 20% of the ones with penises" (!!) in some population. After it's been pointed out, it should be a pretty obvious hypothesis that "guy on the Extropians mailing list in 2004 who fantasizes about having a female but 'otherwise identical' copy of himself" and "guy in 2016 Berkeley who identifies as a trans woman" are the _same guy_. So in October 2016, [I wrote to Yudkowsky noting the apparent reversal and asking to talk about it](/2023/Jul/blanchards-dangerous-idea-and-the-plight-of-the-lucid-crossdreamer/#cheerful-price). Because of the privacy rules I'm adhering to in telling this Whole Dumb Story, I can't confirm or deny whether any such conversation occured. +But this potential unification seemed dubious to me, especially if trans women were purported to be "at least 20% of the ones with penises" (!!) in some population. After it's been pointed out, it should be a pretty obvious hypothesis that "guy on the Extropians mailing list in 2004 who fantasizes about having a female but 'otherwise identical' copy of himself" and "guy in 2016 Berkeley who identifies as a trans woman" are the _same guy_. 
So in October 2016, [I wrote to Yudkowsky noting the apparent reversal and asking to talk about it](/2023/Jul/blanchards-dangerous-idea-and-the-plight-of-the-lucid-crossdreamer/#cheerful-price). Because of the privacy rules I'm adhering to in telling this Whole Dumb Story, I can't confirm or deny whether any such conversation occurred. -Then, in November 2018, while criticizing people who refuse to use trans people's preferred pronouns, Yudkowsky proclaimed that "Using language in a way _you_ dislike, openly and explicitly and with public focus on the language and its meaning, is not lying" and that "you're not standing in defense of truth if you insist on a word, brought explicitly into question, being used with some particular meaning". But _that_ seemed like a huge and surprising reversal from the position articulated in ["37 Ways Words Can Be Wrong"](https://www.lesswrong.com/posts/FaJaCgqBKphrDzDSj/37-ways-that-words-can-be-wrong). After attempts to clarify via email failed, I eventually wrote ["Where to Draw the Boundaries?"](https://www.lesswrong.com/posts/esRZaPXSHgWzyB2NL/where-to-draw-the-boundaries) to explain the relevant error in general terms, and Yudkowsky would eventually go on to [clarify his position in Septembmer 2020](https://www.facebook.com/yudkowsky/posts/10158853851009228). +Then, in November 2018, while criticizing people who refuse to use trans people's preferred pronouns, Yudkowsky proclaimed that "Using language in a way _you_ dislike, openly and explicitly and with public focus on the language and its meaning, is not lying" and that "you're not standing in defense of truth if you insist on a word, brought explicitly into question, being used with some particular meaning". But _that_ seemed like a huge and surprising reversal from the position articulated in ["37 Ways Words Can Be Wrong"](https://www.lesswrong.com/posts/FaJaCgqBKphrDzDSj/37-ways-that-words-can-be-wrong). 
After attempts to clarify via email failed, I eventually wrote ["Where to Draw the Boundaries?"](https://www.lesswrong.com/posts/esRZaPXSHgWzyB2NL/where-to-draw-the-boundaries) to explain the relevant error in general terms, and Yudkowsky would eventually go on to [clarify his position in September 2020](https://www.facebook.com/yudkowsky/posts/10158853851009228). But then in February 2021, he reopened the discussion to proclaim that "the simplest and best protocol is, '"He" refers to the set of people who have asked us to use "he", with a default for those-who-haven't-asked that goes by gamete size' and to say that this just _is_ the normative definition", the problems with which post I explained in March 2022's ["Challenges to Yudkowsky's Pronoun Reform Proposal"](/2022/Mar/challenges-to-yudkowskys-pronoun-reform-proposal/) and above. @@ -465,9 +465,9 @@ Accordingly, I tried the object-level good-faith argument thing _first_. I tried [^symmetrically-not-assuming-good-faith]: Obviously, if we're crossing the Rubicon of abandoning the norm of assuming good faith, it needs to be abandoned symmetrically. I _think_ I'm doing a pretty good job of adhering to standards of intellectual conduct and being transparent about my motivations, but I'm definitely not perfect, and, unlike Yudkowsky, I'm not so absurdly mendaciously arrogant to claim "confidence in my own ability to independently invent everything important" (!) about my topics of interest. If Yudkowsky or anyone else thinks they _have a case_ based on my behavior that _I'm_ being culpably intellectually dishonest, they of course have my blessing and encouragement to post it for the audience to evaluate. 
-What makes all of this especially galling is the fact that _all of my heretical opinions are literally just Yudkowsky's opinions from the 'aughts!_ My whole thing about how changing sex isn't possible with existing or forseeable technology because of how complicated humans (and therefore human sex differences) are? Not original to me! I [filled in a few technical details](/2021/May/sexual-dimorphism-in-the-sequences-in-relation-to-my-gender-problems/#changing-sex-is-hard), but again, this was in the Sequences as ["Changing Emotions"](https://www.lesswrong.com/posts/QZs4vkC7cbyjL9XA9/changing-emotions). My thing about how you can't define concepts any way you want because there are mathematical laws governing which category boundaries [compress](https://www.lesswrong.com/posts/mB95aqTSJLNR9YyjH/message-length) your [anticipated experiences](https://www.lesswrong.com/posts/a7n8GdKiAZRX86T5A/making-beliefs-pay-rent-in-anticipated-experiences)? Not original to me! I [filled in](https://www.lesswrong.com/posts/esRZaPXSHgWzyB2NL/where-to-draw-the-boundaries) [a few technical details](https://www.lesswrong.com/posts/onwgTH6n8wxRSo2BJ/unnatural-categories-are-optimized-for-deception), but [_we had a whole Sequence about this._](https://www.lesswrong.com/posts/FaJaCgqBKphrDzDSj/37-ways-that-words-can-be-wrong) +What makes all of this especially galling is the fact that _all of my heretical opinions are literally just Yudkowsky's opinions from the 'aughts!_ My whole thing about how changing sex isn't possible with existing or foreseeable technology because of how complicated humans (and therefore human sex differences) are? Not original to me! I [filled in a few technical details](/2021/May/sexual-dimorphism-in-the-sequences-in-relation-to-my-gender-problems/#changing-sex-is-hard), but again, this was in the Sequences as ["Changing Emotions"](https://www.lesswrong.com/posts/QZs4vkC7cbyjL9XA9/changing-emotions). 
My thing about how you can't define concepts any way you want because there are mathematical laws governing which category boundaries [compress](https://www.lesswrong.com/posts/mB95aqTSJLNR9YyjH/message-length) your [anticipated experiences](https://www.lesswrong.com/posts/a7n8GdKiAZRX86T5A/making-beliefs-pay-rent-in-anticipated-experiences)? Not original to me! I [filled in](https://www.lesswrong.com/posts/esRZaPXSHgWzyB2NL/where-to-draw-the-boundaries) [a few technical details](https://www.lesswrong.com/posts/onwgTH6n8wxRSo2BJ/unnatural-categories-are-optimized-for-deception), but [_we had a whole Sequence about this._](https://www.lesswrong.com/posts/FaJaCgqBKphrDzDSj/37-ways-that-words-can-be-wrong) -Seriously, do you think I'm smart enough to come up with all of this indepedently? I'm not! I ripped it all off from Yudkowsky back in the 'aughts _when he still gave a shit about telling the truth_. (Actively telling the truth, and not just technically not lying.) The things I'm hyperfocused on that he thinks are politically impossible to say in the current year, are almost entirely things he already said, that anyone could just look up! +Seriously, do you think I'm smart enough to come up with all of this independently? I'm not! I ripped it all off from Yudkowsky back in the 'aughts _when he still gave a shit about telling the truth_. (Actively telling the truth, and not just technically not lying.) The things I'm hyperfocused on that he thinks are politically impossible to say in the current year, are almost entirely things he already said, that anyone could just look up! I guess the point is that the egregore doesn't have the reading comprehension for that?—or rather, the egregore has no reason to care about the past; if you get tagged by the mob as an Enemy, your past statements will get dug up as evidence of foul present intent, but if you're doing a good enough job of playing the part today, no one cares what you said in 2009?
@@ -545,7 +545,7 @@ But since I haven't relinquished my faith, I have the responsibility to point it > When an epistemic hero seems to believe something crazy, you are often better off questioning "seems to believe" before questioning "crazy", and both should be questioned before shaking your head sadly about the mortal frailty of your heroes. -I notice that this advice leaves out a possibility: that the "seems to believe" is a deliberate show (judged to be personally prudent and not community-harmful), rather than a misperception on your part. I am left shaking my head in a [weighted average of](https://www.lesswrong.com/posts/y4bkJTtG3s5d6v36k/stupidity-and-dishonesty-explain-each-other-away) sadness about the mortal frailty of my former hero, and disgust at his craven duplicity. **If Eliezer Yudkowsky can't _unambigously_ choose Truth over Feelings, _then Eliezer Yudkowsky is a fraud_.** +I notice that this advice leaves out a possibility: that the "seems to believe" is a deliberate show (judged to be personally prudent and not community-harmful), rather than a misperception on your part. I am left shaking my head in a [weighted average of](https://www.lesswrong.com/posts/y4bkJTtG3s5d6v36k/stupidity-and-dishonesty-explain-each-other-away) sadness about the mortal frailty of my former hero, and disgust at his craven duplicity. **If Eliezer Yudkowsky can't _unambiguously_ choose Truth over Feelings, _then Eliezer Yudkowsky is a fraud_.** A few clarifications are in order here. First, as with "bad faith", this usage of "fraud" isn't a meaningless [boo light](https://www.lesswrong.com/posts/dLbkrPu5STNCBLRjr/applause-lights). I specifically and literally mean it in [_Merriam-Webster_'s sense 2.a., "a person who is not what he or she pretends to be"](https://www.merriam-webster.com/dictionary/fraud)—and I think I've made my case. 
Someone who disagrees with my assessment needs to argue that I've gotten some specific thing wrong, [rather than objecting to character attacks on procedural grounds](https://www.lesswrong.com/posts/pkaagE6LAsGummWNv/contra-yudkowsky-on-epistemic-conduct-for-author-criticism).