+(I _still_ don't want to be blogging about this, but unfortunately, it actually turns out to be central to the intellectual–political project I've been singlemindedly focused on for the past four years because [somebody has to and no one else will](https://unsongbook.com/chapter-6-till-we-have-built-jerusalem/))
+
+—my _favorite_—and basically only—masturbation fantasy has always been some variation on me getting magically transformed into a woman. I ... need to write more about the phenomenology of this. In the meantime, just so you know what I'm talking about, the relevant TVTrope is ["Man, I Feel Like a Woman."](https://tvtropes.org/pmwiki/pmwiki.php/Main/ManIFeelLikeAWoman) Or search "body swap" on PornHub. Or check out my few, circumspect contributions to [the popular genre of](/2016/Oct/exactly-what-it-says-on-the-tin/) captioned-photo female transformation erotica (everyone is wearing clothes, so these might be "safe for work" in a narrow technical sense, if not a moral one): [1](/ancillary/captions/dr-equality-and-the-great-shift/) [2](/ancillary/captions/the-other-side-of-me/) [3](/ancillary/captions/the-impossible-box/) [4](/ancillary/captions/de-gustibus-non-est/).
+
+(The first segment of my pen surname is a legacy of middle-school friends letting me borrow some of the [Ranma ½](https://en.wikipedia.org/wiki/Ranma_%C2%BD) graphic novels, about a young man named Ranma Saotome cursed ("cursed"??) to transform into a woman on exposure to cold water. This was just _before_ puberty kicked in for me, but I have no way of computing the counterfactual to know whether that had a causal influence.)
+
+So, there was that erotic thing, which I was pretty ashamed of at the time, and _of course_ knew that I must never, ever tell a single soul about. (It would have been about three years since the fantasy started that I even worked up the bravery to [tell my Diary about it](/ancillary/diary/53/#first-agp-confession).)
+
+But within a couple years, I also developed this beautiful pure sacred self-identity thing that would persist indefinitely, where I started having a lot of _non_-sexual thoughts about being female. Just—little day-to-day thoughts, little symbolic gestures.
+
+Like when I would [write in my pocket notebook in the persona of my female analogue](/images/crossdreaming_notebook_samples.png).
+
+Or when I would practice swirling the descenders on all the lowercase letters that had descenders [(_g_, _j_, _p_, _y_, _z_)](/images/handwritten_phrase_jazzy_puppy.jpg) because I thought it made my handwriting look more feminine.
+
+Or the time when track and field practice split up into boys and girls, and I ironically muttered under my breath, "Why did I even join this team?—boys, I mean."
+
+Or when it was time to order sheets to fit on the dorm beds at the University in Santa Cruz, and I deliberately picked out the pink-with-flowers design on principle.
+
+Or how I was proud to be the kind of guy who bought Julia Serano's _Whipping Girl: A Transsexual Woman on Sexism and the Scapegoating of Femininity_ when it was new in 2007, and [who would rather read from Evelyn Fox Keller's _Reflections on Gender and Science_ than](http://zackmdavis.net/blog/2013/03/tradition/) watch [Super Bowl XLII](https://en.wikipedia.org/wiki/Super_Bowl_XLII).
+
+Or how, at University, I tried to go by my [first-and-middle-initials](https://en.wikipedia.org/wiki/List_of_literary_initials) because I wanted a gender-neutral [byline](https://en.wikipedia.org/wiki/Byline), and I wanted what people called me in real life to be the same as my byline—even if, obviously, I didn't expect people to not-notice which sex I am in real life because _that would be crazy_.
+
+(This attempted nickname change actually turned out to be a terrible idea that ended up causing me a huge amount of pointless identity-crisis psychological pain—my particular pair of real-life initials never really "felt like a name" even to me (as contrasted to something like "C.J.", which feels like a name because it has a _J_ in it); I turned out to be incredibly uncomfortable with different people knowing me by different names, and didn't have the guts to nag everyone in my life to switch for something that didn't feel like a name even to me; _and_ the "gender-neutral byline" rationale almost certainly never held up in practice because my real-life first initial is a [high-Scrabble-score letter](https://en.wikipedia.org/wiki/Scrabble_letter_distributions#English) that begins one popular boy name and zero popular girl names. But it was the _principle!_)
+
+Or how I stopped getting haircuts and grew my beautiful–beautiful ponytail. (This turned out to be a great idea and I wish I had thought of it sooner.)
+
+Or how one of the [little song-fragments I used to write in my head](/tag/music/) went—
+
+> _Sometimes I sigh because I'll never get rich
+> And there's no magic so I can't be a witch
+> And that I must enjoy the scorn of the world
+> Just 'cause I'm butch and I'm a tranny girl_
+
+Or the time I felt proud when my Normal American Girl coworker at the supermarket in 'aught-nine said that she had assumed I was gay. (I'm not, but the fact that Normal American Girl thought so meant that I was successfully unmasculine.)
+
+And so on _et cetera ad infinitum_. This has been a very persistent _thing_ for me.
+
+The beautiful pure sacred self-identity thing doesn't _feel_ explicitly erotic. The thing I did during the day in class, writing in my notebook about being a girl, was _very different_ from the thing I did in my room at night, _visualizing_ girls with this abstract sense of "But what if that were _me_?" while furiously masturbating. The former activity was my beautiful pure happy romantic daydream, whereas the latter activity was not beautiful or pure at all!
+
+Now I am not a cognitive scientist, and can't claim to _know_ exactly what my beautiful pure sacred self-identity thing is, or where it comes from—that's [not the kind of thing I would expect people to _know_ from introspection alone](/2016/Sep/psychology-is-about-invalidating-peoples-identities/). But it has always seemed like a pretty obvious guess that there must have been _some sort of causal relationship_ between the erotic thing, and the beautiful pure sacred self-identity thing, even if the two things don't _feel_ the same: the overlap in subject matter is too much to be a coincidence. And the erotic thing definitely came _first_.
+
+Maybe this story reads differently in 2021 from how it was to live in the 'aughts? I think that teenage boys in the current year having the kind of feelings I was having then, upon referencing or hinting at the beautiful pure sacred self-identity thing—
+
+(and the beautiful pure sacred self-identity thing is _much_ easier to talk about than the erotic thing)
+
+(I mean, the beautiful pure sacred self-identity thing is much harder to talk about _clearly_, but talking about it _un_-clearly is less shameful and requires much less bravery)
+
+—are immediately provided with "Oh, that means you're not a cis boy; you're a trans girl" as the definitive explanation.
+
+But it was a different time, then. Of course I had _heard of_ transsexualism as a thing, in the form of the "woman trapped in a man's body" trope, but it wasn't something I expected to actually encounter in real life. (I understood my "tranny girl" song to reflect an idle fantasy, not a legitimate life plan.)
+
+At the time, I had _no reason to invent the hypothesis_ that I might somehow literally be a woman in some unspecified psychological sense. I knew I was a boy _because_ boys are the ones with penises. That's what the word _means_. I was a boy who had a weird _sex fantasy_ about being a girl. That was just the obvious ordinary straightforward plain-language description of the situation. It _never occurred to me_ to couch it in the language of "dysphoria", or actually possessing some innate "gender". The beautiful pure sacred self-identity thing was about identifying _with_ women, not identifying _as_ a woman—[roughly analogous to how](/2017/Jul/interlude-vi/) a cat lover might be said to "identify with" cats, without claiming to somehow _be_ a cat, because _that would be crazy_.
+
+[It was while browsing _Wikipedia_ in 2006 that I encountered the obvious and perfect word for my thing](/2017/Feb/a-beacon-through-the-darkness-or-getting-it-right-the-first-time/)—_autogynephilia_, from the Greek for "[love of](https://en.wiktionary.org/wiki/-philia) [oneself as](https://en.wiktionary.org/wiki/auto-#English) [a woman](https://en.wiktionary.org/wiki/gyno-)." I was actually surprised that it turned out to have been coined in the context of a theory (by clinical sexual psychologist Ray Blanchard) that it was the root cause of one of two types of male-to-female transsexualism.
+
+You see, a very important feature of my gender-related thinking at the time was that I was growing very passionate about—well, in retrospect I call it _psychological-sex-differences denialism_, but at the time I called it _antisexism_. Where sometimes people in the culture would make claims about how women and men are psychologically different, and of course I knew this was _bad and wrong_. Therefore the very idea of transsexualism was somewhat suspect insofar as it necessarily depends on the idea that women and men are psychologically different (in order for it to be possible to be in the "wrong" body).
+
+So while I was certainly glad to learn that _there's a word for it_, an obvious and perfect word for _my thing_, I mostly just stole the word (whose referent and meaning I thought was self-explanatory from the common Greek roots) without paying any further attention to this Blanchard theory or the idea that _I_ might somehow be transgender.
+
+So, you know, as part of my antisexism, I read a lot about feminism. I remember checking out [_The Feminine Mystique_](https://en.wikipedia.org/wiki/The_Feminine_Mystique) and [Susan Faludi's _Backlash_](https://en.wikipedia.org/wiki/Backlash:_The_Undeclared_War_Against_American_Women) from the school library. Before I found my internet-home on _Overcoming Bias_, I would read the big feminist blogs—[_Pandagon_](https://web.archive.org/web/20070630211101/http://pandagon.net/), [_Feministe_](https://web.archive.org/web/20080901002058/http://www.feministe.us/blog), [_Feministing_](https://web.archive.org/web/20080605182529/http://www.feministing.com/). The one time I special-ordered a book at the physical Barnes & Noble before I turned 18 and got my own credit card and could order books online, it was [_Feminist Interpretations of Ayn Rand_](https://www.psupress.org/books/titles/0-271-01830-5.html).
+
+(In retrospect, it's notable how _intellectualized_ all of this was—my pro-feminism was an ideological matter between me and my books, rather than arising from any practical need. It's not like I had disproportionately female friends or whatever—I mean, to the extent that I had any friends and not just books.)
+
+It also seems like a pretty obvious guess that there must have been _some sort of causal relationship_ between my antisexism and the erotic and beautiful-pure-sacred-self-identity things. True, the [blank slate doctrine](/2020/Apr/book-review-human-diversity/#blank-slate) has been ideologically fashionable my entire life. In the sense that progressivism has been [likened to a nontheistic state religion](https://www.unqualified-reservations.org/2007/09/how-dawkins-got-pwned-part-1/)—uh, bear with me for a moment—I was a _very_ religious teenager.
+
+I have a vague memory of being in the Crown College library at the University in Santa Cruz in 2007, reading Robert Wright's _The Moral Animal_ (because it had been on [Yudkowsky's old book-recommendations list](https://web.archive.org/web/20200118114912/https://yudkowsky.net/obsolete/bookshelf.html)), and being _aghast_ at how openly, brazenly _sexist_ it was.
+
+(That is, with respect to what I considered _sexist_ at the time. I wish there was some way to know what my teenage self would think of my current self's writing, which is at least as "bad" as Wright and plausibly worse. Maybe if the whole benevolent-superintelligence thing my robot cult always talks about ever works out, I'll be able to kick off a limited-scope [ancestor-simulation](https://www.simulation-argument.com/simulation.html) to find out. In the meantime, if you're offended, I'd love it if you could let me know in the comments exactly how much and why! [Personal identity doesn't actually exist](https://www.lesswrong.com/posts/RLScTpwc5W2gGGrL9/identity-isn-t-in-specific-atoms); humans growing up in the same cultural tradition can be seen as being drawn from a similar _distribution_ as my teenage self.)
+
+That overwhelming feeling of cold horror and hatred at _the enemy revealed_—that, I conjecture, is what religious people feel when encountering a heretical text for the first time. (In _principle_, a sufficiently advanced neuroscience would be able to confirm that it is the same emotion, as a matter of biological fact.) The social–psychological need to [avoid the belief's real weak points](https://www.lesswrong.com/posts/dHQkDNMhj692ayx78/avoiding-your-belief-s-real-weak-points) is why the "religion" characterization makes sense, even if the claim that psychological sex differences are fake isn't a [_supernatural_](https://www.lesswrong.com/posts/u6JzcFtPGiznFgDxP/excluding-the-supernatural) one. But quasi-religious ideological fervor aside, there was presumably a _reason_ I cared so much about being a good pro-feminist _specifically_, and hardly spent any time at all thinking about other dimensions of social justice, like race or class. And I think the reason is because, because ...
+
+Well. The reason I'm blogging this story at all is because I'm scared that in order to finish that sentence in the current year and be understood, I'd have to say, "because I was trans." And with respect to what the words mean in the current year, it's true. But that's not how I think of it, then or now.
+
+It's because I was _straight_. Because I loved women, and wanted to do right by them. It's an _identificatory_ kind of love—loving women as an extension of the self, rather than a mysterious, unfathomable [Other](https://en.wikipedia.org/wiki/The_Second_Sex#Volume_One). But that's not unusual, is it?—or it _shouldn't_ be. I would have assumed that guys who can't relate to this are probably just sexist.
+
+------
+
+Anyway, that's some background about where I was at, personally and ideologically, _before_ I fell in with this robot cult.
+
+My ideological commitment to psychological-sex-differences denialism made me uncomfortable when the topic of sex differences happened to come up on the blog—which wasn't particularly often at all, but in such a _vast_ body of work as the Sequences, it did happen to come up a few times (and those few times are the subject of this blog post).
+
+For example, as part of [an early explanation of why the values we would want to program into an artificial superintelligence don't reduce to any one simple principle](https://www.lesswrong.com/posts/NnohDYHNnKDtbiMyp/fake-utility-functions), Yudkowsky remarks that "the love of a man for a woman, and the love of a woman for a man, have not been cognitively derived from each other or from any other value."
+
+From the perspective of axiomatic antisexism that I held at the time, this assertion is cringe-inducing. Of course most people are straight, but is it not all the _same love_?
+
+I wasn't ready to hear it then, but—I mean, probably not? So, for the _most_ part, all humans are extremely similar: [as Yudkowsky would soon write about](https://www.lesswrong.com/posts/Cyj6wQLW6SeF6aGLy/the-psychological-unity-of-humankind) [(following Leda Cosmides and John Tooby)](https://www.cep.ucsb.edu/papers/pfc92.pdf), complex functional adaptations have to be species-universal in order to not get scrambled during meiosis. As a toy example, if some organelle gets assembled from ten genes, those ten alleles _all_ have to be nearly universal in the population—if each only had a frequency of 0.9, then the probability of getting them all right would only be 0.9<sup>10</sup> ≈ 0.349. If allele H [epistatically](https://en.wikipedia.org/wiki/Epistasis) only confers a fitness advantage when allele G at some other locus is already present, then G has to already be well on its way to fixation in order for there to be appreciable selective pressure for H. Evolution, feeding on variation, uses it up. Complicated functionality that requires multiple genes working in concert can only accrete gradually as each individual piece reaches fixation in the entire population, resulting in an intricate species-universal _design_: just about everyone has 206 bones, a liver, a [parietal lobe](https://en.wikipedia.org/wiki/Parietal_lobe), _&c_.
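The toy arithmetic above is easy to sanity-check in a few lines (a sketch of the hypothetical ten-gene organelle, not a model of any real genome):

```python
# Sanity check of the toy fixation arithmetic in the text: an organelle
# assembled from ten genes, each allele independently at population
# frequency p. (Hypothetical numbers, not real genetics.)
def p_complete(p: float, n_genes: int = 10) -> float:
    """Probability that one genome carries all n_genes required alleles."""
    return p ** n_genes

print(p_complete(0.9))    # at 90% frequency each: only ~0.349 assemble it
print(p_complete(0.999))  # near fixation: ~0.990 do
```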
+
+In this way (contrary to the uninformed suspicions of those still faithful to the blank slate), evolutionary psychology actually turns out to be an impressively antiracist discipline: maybe individual humans can differ in small ways like personality, or [ancestry-groups in small ways](/2020/Apr/book-review-human-diversity/#ancestries) like skin color, but these are, and _have_ to be, "shallow" low-complexity variations on the same basic human design; new _complex_ functionality would require speciation.
+
+This luck does not extend to antisexism. If the genome were a computer program, it would have `if female { /* ... */ } else if male { /* ... */ }` conditional blocks, and inside those blocks, you can have complex sex-specific functionality. By default, selection pressures on one sex tend to drag the other along for the ride—men have nipples because there's no particular reason for them not to—but in those cases where it was advantageous in the environment of evolutionary adaptedness for females and males to do things _differently_, sexual dimorphism can evolve (slowly—[more than one and a half orders of magnitude slower than monomorphic adaptations](/papers/rogers-mukherjee-quantitative_genetics_of_sexual_dimorphism.pdf), in fact).
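The conditional-block metaphor can be spelled out as a toy sketch (purely illustrative; real development is nothing this tidy, and the trait names are just stand-ins):

```python
# Toy version of the genome-as-program metaphor: most of the "code" is
# shared, and sex-specific functionality lives inside the conditional
# branches. (Illustrative only; not a model of real development.)
def develop(sex: str) -> set[str]:
    traits = {"206 bones", "liver", "parietal lobe", "nipples"}  # species-universal
    if sex == "female":
        traits |= {"ovaries"}
    elif sex == "male":
        traits |= {"testes"}
    return traits

# Nipples show up in both branches' output: selection pressure on one sex
# dragged the other along for the ride.
assert "nipples" in develop("female") and "nipples" in develop("male")
```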
+
+The evolutionary theorist Robert Trivers wrote, "One can, in effect, treat the sexes as if they were different species, the opposite sex being a resource relevant to producing maximum surviving offspring" (!). There actually isn't one species-universal design—it's _two_ designs.
+
+If you're willing to admit to the possibility of psychological sex differences _at all_, you have to admit that sex differences in the parts of the mind that are _specifically about mating_ are going to be a prime candidate. (But by no means the only one—different means of reproduction have different implications for [life-history strategies](https://en.wikipedia.org/wiki/Life_history_theory) far beyond the act of mating itself.) Even if there's a lot of "shared code" in how love-and-attachment works in general, there are also going to be specific differences that were [optimized for](https://www.lesswrong.com/posts/8vpf46nLMDYPC6wA4/optimization-and-the-intelligence-explosion) facilitating males impregnating females. In that sense, the claim that "the love of a man for a woman, and the love of a woman for a man, have not been cognitively derived from each other" just seems commonsensically _true_.
+
+I guess if you _didn't_ grow up with a quasi-religious fervor for psychological sex differences denialism, this whole theoretical line of argument about evolutionary psychology doesn't seem world-shatteringly impactful?—maybe it just looks like supplementary Science Details brushed over some basic facts of human existence that everyone knows. But if you _have_ built your identity around [quasi-religious _denial_](/2020/Apr/peering-through-reverent-fingers/) of certain basic facts of human existence that everyone knows (if not everyone [knows that they know](https://www.lesswrong.com/posts/CqyJzDZWvGhhFJ7dY/belief-in-belief)), getting forced out of it by sufficient weight of Science Details [can be a pretty rough experience](https://www.greaterwrong.com/posts/XM9SwdBGn8ATf8kq3/c/comment/Zv5mrMThBkkjDAqv9).
+
+My hair-trigger antisexism was sort of lurking in the background of some of my comments while the Sequences were being published (though, again, it wasn't relevant to _most_ posts, which were just about cool math and science stuff that had no avenue whatsoever for being corrupted by gender politics). The term "social justice warrior" wasn't yet popular, but I definitely had a SJW-alike mindset (nurtured from my time lurking the feminist blogosphere) of being preoccupied with the badness and wrongness of people who are wrong and bad (_i.e._, sexist), rather than trying to [minimize the expected squared error of my probabilistic predictions](https://www.lesswrong.com/posts/onwgTH6n8wxRSo2BJ/unnatural-categories-are-optimized-for-deception).
+
+Another one of the little song-fragments I wrote in my head a few years earlier (which I mention for its being representative of my attitude at the time, rather than it being notable in itself), concerned an advice columnist, [Amy Alkon](http://www.advicegoddess.com/), syndicated in the _Contra Costa Times_ of my youth, who would sometimes give dating advice based on a pop-evopsych account of psychological sex differences—the usual fare about women seeking commitment and men seeking youth and beauty. My song went—
+
+> _I hope Amy Alkon dies tonight
+> So she can't give her bad advice
+> No love or value save for evolutionary psych_
+>
+> _I hope Amy Alkon dies tonight
+> Because the world's not girls and guys
+> Cave men and women fucking 'round the fire in the night_
+
+Looking back with the outlook later acquired from my robot cult, this is abhorrent. You don't _casually wish death_ on someone just because you disagree with their views on psychology! Even if it wasn't in a spirit of personal malice (this was a song I sung to myself, not an actual threat directed to Amy Alkon's inbox), the sentiment just _isn't done_. But at the time, I _didn't notice there was anything wrong with my song_. I hadn't yet been socialized into the refined ethos of "False ideas should be argued with, but heed that we too may have ideas that are false".
+
+[TODO: Me pretending to be dumb about someone not pretending to be dumb about my initials https://www.overcomingbias.com/2008/04/inhuman-rationa.html ; contrast that incident (it's not an accident that he guessed right) to Yudkowsky: "I try to avoid criticizing people when they are right. If they genuinely deserve criticism, I will not need to wait long for an occasion where they are wrong." (https://www.lesswrong.com/posts/MwQRucYo6BZZwjKE7/einstein-s-arrogance)]
+
+[TODO Vassar slapping me down in June 2008 (on the same day that I crossdressed in front of Seanan and Katie!): https://www.overcomingbias.com/2008/06/why-do-psychopa.html#comment-518267438]
+
+Sex differences would come up a couple more times in one of the last Sequences, on "Fun Theory"—speculations on how life could be truly _good_ if the world were superintelligently optimized for human values, in contrast to the cruelty and tragedy of our precarious existence [in a world shaped only by blind evolutionary forces](https://www.lesswrong.com/posts/sYgv4eYH82JEsTD34/beyond-the-reach-of-god).
+
+According to Yudkowsky, one of the ways in which people's thinking about artificial intelligence usually goes wrong is [anthropomorphism](https://www.lesswrong.com/posts/RcZeZt8cPk48xxiQ8/anthropomorphic-optimism)—expecting arbitrary AIs to behave like humans, when really "AI" corresponds to [a much larger space of algorithms](https://www.lesswrong.com/posts/tnWRXkcDi5Tw9rzXw/the-design-space-of-minds-in-general). As a social animal, predicting other humans is one of the things we've evolved to be good at, and the way that works is probably via "empathic inference": [I predict your behavior by imagining what _I_ would do in your situation](https://www.lesswrong.com/posts/Zkzzjg3h7hW5Z36hK/humans-in-funny-suits). Since all humans are very similar, [this appeal-to-black-box](https://www.lesswrong.com/posts/9fpWoXpNv83BAHJdc/the-comedy-of-behaviorism) works pretty well in our lives (though it won't work on AI). And from this empathy, evolution also coughed up the [moral miracle](https://www.lesswrong.com/posts/pGvyqAQw6yqTjpKf4/the-gift-we-give-to-tomorrow) of [_sympathy_, intrinsically caring about what others feel](https://www.lesswrong.com/posts/NLMo5FZWFFq652MNe/sympathetic-minds).
+
+In ["Interpersonal Entanglement"](https://www.lesswrong.com/posts/Py3uGnncqXuEfPtQp/interpersonal-entanglement), Yudkowsky appeals to the complex moral value of sympathy as an argument against the desirability of nonsentient sex partners (_catgirls_ being the technical term). Being emotionally intertwined with another actual person is one of the things that makes life valuable, that would be lost if people just had their needs met by soulless catgirl holodeck characters.
+
+But there's a problem, Yudkowsky argues: women and men aren't designed to make each other optimally happy. The abstract game between the two human life-history strategies in the environment of evolutionary adaptedness had a conflicting-interests as well as a shared-interests component, and human psychology still bears the design signature of that game denominated in inclusive fitness, even though [no one cares about inclusive fitness](https://www.lesswrong.com/posts/XPErvb8m9FapXCjhA/adaptation-executers-not-fitness-maximizers). (Peter Watts: ["And God smiled, for Its commandment had put Sperm and Egg at war with each other, even unto the day they made themselves obsolete."](https://www.rifters.com/real/Blindsight.htm)) The scenario of Total Victory for the ♂ player in the conflicting-interests subgame is not [Nash](https://en.wikipedia.org/wiki/Nash_equilibrium). The design of the entity who _optimally_ satisfied what men want out of women would not be, and _could_ not be, within the design parameters of actual women.
+
+(And _vice versa_ and respectively, but in case you didn't notice, this blog post is all about male needs.)
+
+Yudkowsky dramatized the implications in a short story, ["Failed Utopia #4-2"](https://www.lesswrong.com/posts/ctpkTaqTKbmm6uRgC/failed-utopia-4-2), portraying an almost-aligned superintelligence constructing a happiness-maximizing utopia for humans—except that because of the mismatch in the sexes' desires, and because the AI is prohibited from editing people's minds, the happiness-maximizing solution (according to the story) turns out to be splitting up the human species by sex and giving women and men their own _separate_ utopias (on [Venus and Mars](https://en.wikipedia.org/wiki/Gender_symbol#Origins), ha ha), complete with artificially-synthesized romantic partners.
+
+Of course no one _wants_ that—our male protagonist doesn't _want_ to abandon his wife and daughter for some catgirl-adjacent (if conscious) hussy. But humans _do_ adapt to loss; if the separation were already accomplished by force, people would eventually move on, and post-separation life with companions superintelligently optimized _for you_ would ([_arguendo_](https://en.wikipedia.org/wiki/Arguendo)) be happier than life with your real friends and family, whose goals will sometimes come into conflict with yours because they weren't superintelligently designed _for you_.
+
+The alignment-theory morals are those of [unforeseen maxima](https://arbital.greaterwrong.com/p/unforeseen_maximum) and [edge instantiation](https://arbital.greaterwrong.com/p/edge_instantiation). An AI designed to maximize happiness would kill all humans and tile the galaxy with maximally-efficient happiness-brainware. If this sounds "crazy" to you, that's the problem with anthropomorphism I was telling you about: [don't imagine "AI" as an emotionally-repressed human](https://www.lesswrong.com/posts/zrGzan92SxP27LWP9/points-of-departure), just think about [a machine that calculates what actions would result in what outcomes](https://web.archive.org/web/20071013171416/http://www.singinst.org/blog/2007/06/11/the-stamp-collecting-device/), and does the action that would result in the outcome that maximizes some function. It turns out that picking a function that doesn't kill everyone looks hard. Just tacking on the constraints that you can think of (make the _existing_ humans happy without tampering with their minds) [will tend to produce similar "crazy" outcomes that you didn't think to exclude](https://arbital.greaterwrong.com/p/nearest_unblocked).
+
+At the time, [I expressed horror](https://www.greaterwrong.com/posts/ctpkTaqTKbmm6uRgC/failed-utopia-4-2/comment/PhiGnX7qKzzgn2aKb) at "Failed Utopia #4-2" in the comments section, because my quasi-religious psychological-sex-differences denialism required that I be horrified. But looking back a dozen years later—[or even four years later](https://www.greaterwrong.com/posts/ctpkTaqTKbmm6uRgC/failed-utopia-4-2/comment/D34jhYBcaoE7DEb8d)—my performative horror was missing the point.
+
+_The argument makes sense_. Of course, it's important to notice that you'd need an additional [handwave](https://tvtropes.org/pmwiki/pmwiki.php/Main/HandWave) to explain why the AI in the story doesn't give every _individual_ their separate utopia—if existing women and men aren't optimal partners for each other, so too are individual men not optimal same-sex friends for each other. A faithful antisexist (as I was) might insist that that should be the _only_ moral, as it implies the other [_a fortiori_](https://en.wikipedia.org/wiki/Argumentum_a_fortiori). But if you're trying to _learn about reality_ rather than protect your fixed quasi-religious beliefs, it should be _okay_ for one of the lessons to get a punchy sci-fi short story; it should be _okay_ to think about the hyperplane between two coarse clusters, even while it's simultaneously true that a set of hyperplanes would suffice to [shatter](https://en.wikipedia.org/wiki/Shattered_set) every individual point, without deigning to acknowledge the existence of clusters.
+
+On my reading of the text, it is _significant_ that the AI-synthesized complements for men are given their own name, the _verthandi_ (presumably after [the Norse deity](https://en.wikipedia.org/wiki/Ver%C3%B0andi)), rather than just being referred to as women. The _verthandi_ may _look like_ women, they may be _approximately_ psychologically human, but since the _detailed_ psychology of "superintelligently-engineered optimal romantic partner for a human male" is not going to come out of the distribution of actual human females, judicious exercise of the [tenth virtue of precision](http://yudkowsky.net/rational/virtues/) demands that a _different word_ be coined for this hypothetical science-fictional type of person. Calling the _verthandi_ "women" would be _worse writing_; it would _fail to communicate_ the impact of what has taken place in the story.
+
+Another post in this vein that had a huge impact on me was ["Changing Emotions"](https://www.lesswrong.com/posts/QZs4vkC7cbyjL9XA9/changing-emotions). As an illustration of how [the hope for radical human enhancement is fraught with](https://www.lesswrong.com/posts/EQkELCGiGQwvrrp3L/growing-up-is-hard) technical difficulties, Yudkowsky sketches a picture of just how difficult an actual male-to-female sex change would be.
+
+It would be hard to overstate how much of an impact this post had on me. I've previously linked it on [this](/2016/Nov/reply-to-ozy-on-two-type-mtf-taxonomy/#changing-emotions-link) [blog](/2017/Jan/the-line-in-the-sand-or-my-slippery-slope-anchoring-action-plan/#changing-emotions-link) [five](/2018/Apr/reply-to-the-unit-of-caring-on-adult-human-females/#changing-emotions-link) [different](/2018/Dec/untitled-metablogging-26-december-2018/#changing-emotions-link) [times](/2019/Aug/the-social-construction-of-reality-and-the-sheer-goddamned-pointlessness-of-reason/#changing-emotions-link). In June 2008, half a year before it was published, I encountered the [2004 Extropians mailing list post](http://lists.extropy.org/pipermail/extropy-chat/2004-September/008924.html) that the blog post had clearly been revised from. (The fact that I was trawling through old mailing list archives searching for Yudkowsky content that I hadn't already read, tells you something about what a fanboy I am—if, um, you hadn't already noticed.) I immediately wrote to a friend: "[...] I cannot adequately talk about my feelings. Am I shocked, liberated, relieved, scared, angry, amused?"
+
+The argument goes: it might be easy to _imagine_ changing sex and refer to the idea in a short English sentence, but the real physical world has implementation details, and the implementation details aren't filled in by the short English sentence. The human body, including the brain, is an enormously complex integrated organism; there's no [plug-and-play](https://en.wikipedia.org/wiki/Plug_and_play) architecture by which you can just swap your brain into a new body and have everything Just Work without re-mapping the connections in your motor cortex. And even that's not _really_ a sex change, as far as the whole integrated system is concerned—
+
+> Remapping the connections from the remapped somatic areas to the pleasure center will ... give you a vagina-shaped penis, more or less. That doesn't make you a woman. You'd still be attracted to girls, and no, that would not make you a lesbian; it would make you a normal, masculine man wearing a female body like a suit of clothing.
+>
+> [...]
+>
+> So to actually _become female_ ...
+>
+> We're talking about a _massive_ transformation here, billions of neurons and trillions of synapses rearranged. Not just form, but content—just like a male judo expert would need skills repatterned to become a female judo expert, so too, you know how to operate a male brain but not a female brain. You are the equivalent of a judo expert at one, but not the other. You have _cognitive_ reflexes, and consciously learned cognitive skills as well.
+>
+> [...]
+>
+> What happens when, as a woman, you think back to your memory of looking at Angelina Jolie photos as a man? How do you _empathize_ with your _past self_ of the opposite sex? Do you flee in horror from the person you were? Are all your life's memories distant and alien things? How can you _remember_, when your memory is a recorded activation pattern for neural circuits that no longer exist in their old forms? Do we rewrite all your memories, too?
+
+But, well ... I mean, um ...
+
+(I still really don't want to be blogging about this, but _somebody has to and no one else will_)
+
+From the standpoint of my secret erotic fantasy, "normal, masculine man wearing a female body like a suit of clothing" is actually a _great_ outcome—the _ideal_ outcome. Let me explain.
+
+The main plot of my secret erotic fantasy accommodates many frame stories, but I tend to prefer those that invoke the [literary genre of science](https://www.lesswrong.com/posts/4Bwr6s9dofvqPWakn/science-as-attire), and posit "technology" rather than "spells" or "potions" as the agent of transformation, even if it's all ultimately magic (where ["magic" is a term of art for anything you don't understand how to implement as a computer program](https://www.lesswrong.com/posts/kpRSCH7ALLcb6ucWM/say-not-complexity)).
+
+So imagine having something like [the transporter in _Star Trek_](https://memory-alpha.fandom.com/wiki/Transporter), but you re-materialize with the body of someone else, rather than your original body—a little booth I could walk in, dissolve in a tingly glowy special effect for a few seconds, and walk out looking like (say) [Nana Visitor (circa 1998)](https://memory-alpha.fandom.com/wiki/Kay_Eaton?file=Kay_Eaton.jpg). (In the folklore of [female-transformation erotica](/2016/Oct/exactly-what-it-says-on-the-tin/), this machine is often called the ["morphic adaptation unit"](https://www.cyoc.net/interactives/chapter_115321.html).)
+
+As "Changing Emotions" points out, this high-level description of a hypothetical fantasy technology leaves many details unspecified—not just the _how_, but the _what_. What would the indistinguishable-from-magical transformation booth do to my brain? [As a preference-revealing thought experiment](https://www.lesswrong.com/posts/DdEKcS6JcW7ordZqQ/not-taking-over-the-world), what would I _want_ it to do, if I can't change [the basic nature of reality](https://www.lesswrong.com/posts/tPqQdLCuxanjhoaNs/reductionism), but if engineering practicalities weren't a constraint? (That is, I'm allowed to posit any atom-configuration without having to worry about how you would get all the atoms in the right place, but I'm not allowed to posit tethering my immortal soul to a new body, because [souls](https://www.lesswrong.com/posts/u6JzcFtPGiznFgDxP/excluding-the-supernatural) [aren't](https://www.lesswrong.com/posts/7Au7kvRAPREm3ADcK/psychic-powers) [real](https://www.lesswrong.com/posts/fdEWWr8St59bXLbQr/zombies-zombies).)
+
+The anti-plug-and-play argument makes me confident that it would have to change _something_ about my mind in order to integrate it with a new female body—if nothing else, my unmodified brain doesn't physically _fit_ inside Nana Visitor's skull. ([One meta-analysis puts the sex difference in intracranial volume and brain volume at](https://www.ncbi.nlm.nih.gov/pmc/articles/PMC3969295/) a gaping [Cohen's _d_](/2019/Sep/does-general-intelligence-deflate-standardized-effect-sizes-of-cognitive-sex-differences/) ≈ 3.0 and 2.1, respectively, and Visitor doesn't look like she has an unusually large head.)
+
+Fine—we're assuming that difficulty away and stipulating that the magical transformation booth can make the _minimal_ changes necessary to put my brain in a female body, and have it fit, and have all the motor-connection/body-mapping stuff line up so that I can move and talk normally in a body that feels like mine, without being paralyzed or needing months of physical therapy to re-learn how to walk.
+
+I want this more than I can say. But is that _all_ I want? What about all the _other_ sex differences in the brain? Male brains are more lateralized—doing [relatively more communication within hemispheres rather than between](https://www.pnas.org/content/111/2/823); there are language tasks that women and men perform equally well on, but [men's brains use only the _left_ inferior frontal gyrus, whereas women's use both](/papers/shaywitz-et_al-sex_differences_in_the_functional_organization_of_the_brain_for_language.pdf). Women have a relatively thicker corpus callosum; men have a relatively larger amygdala. Fetal testosterone levels [increase the amount of gray matter in posterior lateral orbitofrontal cortex, but decrease the gray matter in Wernicke's area](https://www.ncbi.nlm.nih.gov/pmc/articles/PMC3306238/) ...
+
+Do I want the magical transformation technology to fix all that, too?
+
+Do I have _any idea_ what it would even _mean_ to fix all that, without spending multiple lifetimes studying neuroscience?
+
+I think I have just enough language to _start_ to talk about what it would mean. Sex isn't an atomic attribute, but rather a high-level statistical regularity such that almost everyone can be cleanly classified as "female" or "male" _in terms of_ lower-level traits (genitals, hormone levels, _&c._). So, abstractly, we're trying to take points from the male distribution and map them onto the female distribution in a way that preserves as much structure (personal identity) as possible. My female analogue doesn't have a penis (because then she wouldn't be female), but she is going to speak American English like me and be [85% Ashkenazi like me](/images/ancestry_report.png), because language and autosomal genes don't have anything to do with sex.
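+
+(For a single continuous trait, the "map points between distributions while preserving structure" idea has a standard one-dimensional special case: quantile matching, where a value keeps its rank within its distribution. The sketch below uses entirely made-up normal distributions for a hypothetical trait; real psychological traits are many-dimensional and correlated, which is exactly why the full problem is so hard.)
+
+```python
+# Toy quantile-matching sketch: map a value from one (hypothetical)
+# trait distribution onto another at the same percentile rank.
+# The distributions and numbers here are illustrative, not real data.
+from statistics import NormalDist
+
+def map_trait(x: float, source: NormalDist, target: NormalDist) -> float:
+    """Map x from the source distribution onto the target distribution
+    at the same quantile, preserving its relative position (rank)."""
+    return target.inv_cdf(source.cdf(x))
+
+# Hypothetical trait: male mean 100, SD 15; female mean 90, SD 15.
+male = NormalDist(mu=100, sigma=15)
+female = NormalDist(mu=90, sigma=15)
+
+x = 115  # one standard deviation above the male mean
+print(map_trait(x, male, female))  # ≈ 105.0, one SD above the female mean
+```
+
+The design choice being illustrated: the mapped value isn't the _same_ number, but it occupies the _same place_ in its distribution, which is one (crude, one-dimensional) way of cashing out "preserving as much structure as possible."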