> If you are silent about your pain, they'll kill you and say you enjoyed it.
> —Zora Neale Hurston
Recapping my Whole Dumb Story so far—in a previous post, ["Sexual Dimorphism in Yudkowsky's Sequences, in Relation to My Gender Problems"](/2021/May/sexual-dimorphism-in-the-sequences-in-relation-to-my-gender-problems/), I told the part about how I've "always" (since puberty) had this obsessive sexual fantasy about being magically transformed into a woman and also thought it was immoral to believe in psychological sex differences, until I got set straight by these really great Sequences of blog posts by Eliezer Yudkowsky, which taught me (incidentally, among many other things) [how absurdly unrealistic my obsessive sexual fantasy was given merely human-level technology](https://www.lesswrong.com/posts/QZs4vkC7cbyjL9XA9/changing-emotions), and that it's actually immoral _not_ to believe in psychological sex differences [given that](https://www.lesswrong.com/tag/litany-of-tarski) psychological sex differences are actually real. In a subsequent post, "Blanchard's Dangerous Idea and the Plight of the Lucid Crossdreamer", I told the part about how, in 2016, everyone in my systematically-correct-reasoning community up to and including Eliezer Yudkowsky suddenly started claiming that guys like me might actually be women in some unspecified metaphysical sense, and insisted on playing dumb when confronted with alternative explanations of the relevant phenomena, until—as described in the subsequent–subsequent post, "People, Evolved Social-Control Mechanisms, and Rocks"—I eventually had a stress- and sleep-deprivation-induced delusional nervous breakdown, got sent to psychiatric jail once, and then went crazy again a couple months later.
That's not the really egregious part of the story. The thing is, psychology is a complicated empirical science: no matter how "obvious" I might think something is, I have to admit that I could be wrong—[not just as an obligatory profession of humility, but _actually_ wrong in the real world](https://www.lesswrong.com/posts/GrDqnMjhqoxiqpQPw/the-proper-use-of-humility). If my fellow rationalists merely weren't sold on the autogynephilia and transgender thing, I would certainly be disappointed, but it definitely wouldn't be grounds to denounce the entire community as a failure or a fraud. And indeed, I _did_ [end up moderating my views quite a bit](/2022/Jul/the-two-type-taxonomy-is-a-useful-approximation-for-a-more-detailed-causal-model/) compared to the extent to which my thinking in 2016–7 took Blanchard–Bailey–Lawrence as received truth. (At the same time, I don't particularly regret saying what I said in 2016–7, because Blanchard–Bailey–Lawrence is still very obviously [_directionally_ correct](/2022/Jul/the-two-type-taxonomy-is-a-useful-approximation-for-a-more-detailed-causal-model/) compared to the nonsense everyone else was telling me.)
This wasn't about direct benefit _vs._ harm. This was about what, substantively, the machine and its operators were doing. They claimed to be cultivating an epistemically rational community, while in fact building an army of loyalists.
Ben compared the whole set-up to that of Eliza the spambot therapist in my story ["Blame Me for Trying"](/2018/Jan/blame-me-for-trying/): regardless of the _initial intent_, scrupulous rationalists were paying rent to something claiming moral authority, which had no concrete specific plan to do anything other than run out the clock, maintaining a facsimile of dialogue in ways well-calibrated to continue to generate revenue. Minds like mine wouldn't survive in this ecosystem over the long run. If we wanted minds that do "naïve" inquiry (instead of playing savvy power games) to live, we needed an interior that justified that level of trust.

(To be continued. Yudkowsky would [eventually clarify his position on the philosophy of categorization in September 2021](https://www.facebook.com/yudkowsky/posts/10158853851009228)—but the story leading up to that will have to wait for another day.)
* I was reluctant to ping Oli (the way I pinged Babcock and Pace) because I still "owed" him for the comment on "Challenges", but ultimately ended up sending a Twitter DM just after the verdict (when I saw that he had very recent reply Tweets and was thus online); I felt a little bit worse about that one (the "FYI I'm at war"), but I think I de-escalated OK, and he didn't seem to take it personally
...

* Said is braver than me along some dimensions; the reason he's in trouble and I'm not, even though we were both fighting with Duncan, is that I was more "dovish"—when Duncan attacked, I focused on defense and withheld my "offensive" thoughts; Said's points about Duncan's blocking psychology were "offensive"
...
------
With internet available—
_ double-check EnyeWord in archive.is?
_ my last interaction with Logan Strohl (when she blocked me for criticizing her trans thing)
_ Eliezerfic fight: where did the super-AIDS example fit in chronologically?
_ Keltham/Peranza exam answer should link specific tags
_ Yudkowsky's LW moderation policy
far editing tier—
_ not doing much psychologizing because implausible to be simultaneously savvy enough to say this, and naive enough to not be doing so knowingly
_ dath ilan as a whining-based community
_ footnote to explain that when I'm summarizing a long Discord conversation to taste, I might move things around into "logical" time rather than "real time"; e.g. Yudkowsky's "powerfully relevant" and "but Superman" comments were actually one right after the other; and, e.g., I'm filling in more details that didn't make it into the chat, like innate kung fu; and that the fact that I can "enhance" the summaries of my part, but the people I'm talking to don't have the same privilege, should be taken into account—actually, this covers a lot of the memoir; maybe I should mention this earlier (and again in a footnote??)
_ 93's alternate theory of eliezera taste for deception