X-Git-Url: http://unremediatedgender.space/source?p=Ultimately_Untrue_Thought.git;a=blobdiff_plain;f=content%2Fdrafts%2Fsexual-dimorphism-in-the-sequences-in-relation-to-my-gender-problems.md;h=235e0f5af4b2f89503a21262173b202b6422661a;hp=8887db1478cb2d24f76cbbb3b7e149e91bf59f4b;hb=67aeedb82a86dc2863ee5becefd13ccb5a65ba18;hpb=d8bb4ace8b61a9225210df2df8ffdb57bdfb94c0

diff --git a/content/drafts/sexual-dimorphism-in-the-sequences-in-relation-to-my-gender-problems.md b/content/drafts/sexual-dimorphism-in-the-sequences-in-relation-to-my-gender-problems.md
index 8887db1..235e0f5 100644
--- a/content/drafts/sexual-dimorphism-in-the-sequences-in-relation-to-my-gender-problems.md
+++ b/content/drafts/sexual-dimorphism-in-the-sequences-in-relation-to-my-gender-problems.md
@@ -1,7 +1,7 @@
 Title: Sexual Dimorphism in Yudkowsky's Sequences, in Relation to My Gender Problems
 Date: 2021-02-14 05:00
 Category: commentary
-Tags: autogynephilia, bullet-biting, cathartic, Eliezer Yudkowsky, epistemic horror, my robot cult, personal, sex differences, Star Trek, Julia Serano
+Tags: autogynephilia, bullet-biting, cathartic, Eliezer Yudkowsky, Scott Alexander, epistemic horror, my robot cult, personal, sex differences, Star Trek, Julia Serano
 Status: draft
 
 > _I'll write my way out
@@ -82,7 +82,7 @@ But it was a different time, then. Of course I had _heard of_ transsexualism as
 At the time, I had _no reason to invent the hypothesis_ that I might somehow literally be a woman in some unspecified psychological sense. I knew I was a boy _because_ boys are the ones with penises. That's what the word _means_. I was a boy who had a weird _sex fantasy_ about being a girl. That was just the obvious ordinary straightforward plain-language description of the situation. It _never occurred to me_ to couch it in the language of "dysphoria", or actually possessing some innate "gender".
 
 The beautiful pure sacred self-identity thing was about identifying _with_ women, not identifying _as_ a woman—[roughly analogous to how](/2017/Jul/interlude-vi/) a cat lover might be said to "identify with" cats, without claiming to somehow _be_ a cat, because _that would be crazy_.
 
-[It was while browsing _Wikipedia_ in 2006 that I encountered the obvious and perfect word for my thing](/2017/Feb/a-beacon-through-the-darkness-or-getting-it-right-the-first-time/)—_autogynephilia_, from the Greek for "[love of](https://en.wiktionary.org/wiki/-philia) [oneself as](https://en.wiktionary.org/wiki/auto-#English) [a woman](https://en.wiktionary.org/wiki/gyno-)." I was actually surprised that it turned out to have been coined in the context of a theory (by clinical psychologist Ray Blanchard) that it was the root cause of one of two types of male-to-female transsexualism.
+[It was while browsing _Wikipedia_ in 2006 that I encountered the obvious and perfect word for my thing](/2017/Feb/a-beacon-through-the-darkness-or-getting-it-right-the-first-time/)—_autogynephilia_, from the Greek for "[love of](https://en.wiktionary.org/wiki/-philia) [oneself as](https://en.wiktionary.org/wiki/auto-#English) [a woman](https://en.wiktionary.org/wiki/gyno-)." I was actually surprised that it turned out to have been coined in the context of a theory (by clinical sexual psychologist Ray Blanchard) that it was the root cause of one of two types of male-to-female transsexualism.
 
 You see, a very important feature of my gender-related thinking at the time was that I was growing very passionate about—well, in retrospect I call it _psychological-sex-differences denialism_, but at the time I called it _antisexism_. Where sometimes people in the culture would make claims about how women and men are psychologically different, and of course I knew this was _bad and wrong_.
 
 Therefore the very idea of transsexualism was somewhat suspect insofar as it necessarily depends on the idea that women and men are psychologically different (in order for it to be possible to be in the "wrong" body).
 
@@ -504,7 +504,7 @@ Peer-reviewed scientific papers aren't enough for you? (They could be cherry-pic
 
 [TODO: another clinical perspective: Dr. Will Powers https://www.facebook.com/strohl89/posts/10157396578969598 ]
 
-Don't trust scientists or clinicians? Me neither! ([Especially](/2017/Mar/fresh-princess/) [not](/2017/Jun/memoirs-of-my-recent-madness-part-i-the-unanswerable-words/) [clinicians](TODO: Dr. W correspondence).) Want first-person accounts from trans women themselves? Me too! And there's lots!
+Don't trust scientists or clinicians? Me neither! (Especially [not clinicians](/2017/Jun/memoirs-of-my-recent-madness-part-i-the-unanswerable-words/).) Want first-person accounts from trans women themselves? Me too! And there's lots!
 
 Consider this passage from Deirdre McCloskey's memoir _Crossing_, writing in the third person about her decades identifying as a heterosexual crossdresser before transitioning at age 53:
 
@@ -570,33 +570,13 @@ Men who fantasize about being women do not particularly resemble actual women! W
 The "discourse algorithm" (the collective generalization of "cognitive algorithm") that can't just _get this shit right_ in 2021 (because being out of step with the reigning Bay Area ideological fashion is deemed too expensive by a consequentialism that counts unpopularity or hurt feelings as costs), also [can't get heliocentrism right in 1633](https://en.wikipedia.org/wiki/Galileo_affair) [_for the same reason_](https://www.lesswrong.com/posts/yaCwW8nPQeJknbCgf/free-speech-and-triskaidekaphobic-calculators-a-reply-to)—and I really doubt it can get AI alignment theory right in 2041.
 
-Or at least—even if there are things we can't talk about in public for consequentialist reasons and there's nothing to be done about it, you would hope that the censorship wouldn't distort our maps of the things we _can_ talk about, or about the laws of mapmaking itself. Yudkowsky had written about the [dark side epistemology](https://www.lesswrong.com/posts/XTWkjCJScy2GFAgDt/dark-side-epistemology) and [contagious lies](https://www.lesswrong.com/posts/wyyfFfaRar2jEdeQK/entangled-truths-contagious-lies): trying to protect a false belief doesn't just mean being wrong about that one thing, it also gives you, on the object level, an incentive to be wrong about anything that would _imply_ the falsity of the protected belief—and, on the meta level, an incentive to be wrong _about epistemology itself_, about how "implying" and "falsity" work.
+Or at least—even if there are things we can't talk about in public for consequentialist reasons and there's nothing to be done about it, you would hope that the censorship wouldn't distort our beliefs about the things we _can_ talk about. Yudkowsky had written about the [dark side epistemology](https://www.lesswrong.com/posts/XTWkjCJScy2GFAgDt/dark-side-epistemology) and [contagious lies](https://www.lesswrong.com/posts/wyyfFfaRar2jEdeQK/entangled-truths-contagious-lies): trying to protect a false belief doesn't just mean being wrong about that one thing, it also gives you, on the object level, an incentive to be wrong about anything that would _imply_ the falsity of the protected belief—and, on the meta level, an incentive to be wrong _about epistemology itself_, about how "implying" and "falsity" work.
 
-[...]
+So, a striking thing about my series of increasingly frustrating private conversations and subsequent public Facebook meltdown (the stress from which soon landed me in psychiatric jail, but that's [another](/2017/Mar/fresh-princess/) [story](/2017/Jun/memoirs-of-my-recent-madness-part-i-the-unanswerable-words/)) was the tendency for some threads of conversation to get _derailed_ on some variation of, "Well, the word _woman_ doesn't necessarily mean that," often with a link to ["The Categories Were Made for Man, Not Man for the Categories"](https://slatestarcodex.com/2014/11/21/the-categories-were-made-for-man-not-man-for-the-categories/) by Scott Alexander, the _second_ most prominent writer in our robot cult.
 
-> ["It is a common misconception that you can define a word any way you like. [...] If you believe that you can 'define a word any way you like', without realizing that your brain goes on categorizing without your conscious oversight, then you won't take the effort to choose your definitions wisely."](https://www.lesswrong.com/posts/3nxs2WYDGzJbzcLMp/words-as-hidden-inferences)
+So, this _really_ wasn't what I was trying to talk about; _I_ thought I was trying to talk about autogynephilia and transsexuality as an _empirical_ issue in psychology.
-
-> ["So that's another reason you can't 'define a word any way you like': You can't directly program concepts into someone else's brain."](https://www.lesswrong.com/posts/HsznWM9A7NiuGsp28/extensions-and-intensions)
-
-> ["When you take into account the way the human mind actually, pragmatically works, the notion 'I can define a word any way I like' soon becomes 'I can believe anything I want about a fixed set of objects' or 'I can move any object I want in or out of a fixed membership test'."](https://www.lesswrong.com/posts/HsznWM9A7NiuGsp28/extensions-and-intensions)
-
-> ["There's an idea, which you may have noticed I hate, that 'you can define a word any way you like'."](https://www.lesswrong.com/posts/i2dfY65JciebF3CAo/empty-labels)
-
-> ["And of course you cannot solve a scientific challenge by appealing to dictionaries, nor master a complex skill of inquiry by saying 'I can define a word any way I like'."](https://www.lesswrong.com/posts/y5MxoeacRKKM3KQth/fallacies-of-compression)
-
-> ["Categories are not static things in the context of a human brain; as soon as you actually think of them, they exert force on your mind. One more reason not to believe you can define a word any way you like."](https://www.lesswrong.com/posts/veN86cBhoe7mBxXLk/categorizing-has-consequences)
-
-> ["And people are lazy. They'd rather argue 'by definition', especially since they think 'you can define a word any way you like'."](https://www.lesswrong.com/posts/yuKaWPRTxZoov4z8K/sneaking-in-connotations)
-
-> ["And this suggests another—yes, yet another—reason to be suspicious of the claim that 'you can define a word any way you like'. When you consider the superexponential size of Conceptspace, it becomes clear that singling out one particular concept for consideration is an act of no small audacity—not just for us, but for any mind of bounded computing power."](https://www.lesswrong.com/posts/82eMd5KLiJ5Z6rTrr/superexponential-conceptspace-and-simple-words)
-
-> ["I say all this, because the idea that 'You can X any way you like' is a huge obstacle to learning how to X wisely. 'It's a free country; I have a right to my own opinion' obstructs the art of finding truth. 'I can define a word any way I like' obstructs the art of carving reality at its joints. And even the sensible-sounding 'The labels we attach to words are arbitrary' obstructs awareness of compactness."](https://www.lesswrong.com/posts/soQX8yXLbKy7cFvy8/entropy-and-short-codes)
-
-> ["One may even consider the act of defining a word as a promise to \[the\] effect [...] \[that the definition\] will somehow help you make inferences / shorten your messages."](https://www.lesswrong.com/posts/yLcuygFfMfrfK8KjF/mutual-information-and-density-in-thingspace)
-
-[...]
-
-[TODO: or at least, even if there are things we can't talk about, we should at least want to avoid dark side epistemology. Briefly tell the story of the Category War?—but try to keep it brief and not-personal; the focus should be on dark side epistemology, rather than re-picking my fight with S.A. or E.Y. (maybe don't name them, but describe the abstract dynamics and link). "Everyone else shot first." Wasn't what I was trying to talk about, but I took the bait. For me, this isn't just a "political" topic—I actually need the right answer in order to decide whether or not to cut my dick off]
+[TODO: or at least, even if there are things we can't talk about, we should at least want to avoid dark side epistemology. Briefly tell the story of the Category War?—but try to keep it brief and not-personal; the focus should be on dark side epistemology, rather than re-picking my fight with S.A. or E.Y. (maybe don't name them, but describe the abstract dynamics, and demur that the full story is robot cult inside baseball bullshit). "Everyone else shot first." Wasn't what I was trying to talk about, but I took the bait. For me, this isn't just a "political" topic—I actually need the right answer in order to decide whether or not to cut my dick off]
 
 Someone asked me: "Wouldn't it be embarrassing if the community solved Friendly AI and went down in history as the people who created Utopia forever, and you had rejected it because of gender stuff?"