+<a id="confided-to-thomas"></a>As it happened, I had just messaged him a few days earlier, on 22 March 2016, for the first time in four and a half years. (My opening message: "Ray Blanchard's Twitter feed is kind of disappointing, but I'm not sure what I was expecting".) I confided to him that I was seeing an escort on Saturday the twenty-sixth[^twenty-sixth] because the dating market was looking pretty hopeless, I had more money than I knew what to do with, and three female friends agreed that it was not unethical.
+
+(I didn't _have sex_ with her, obviously. _That_ would be unethical.[^unethical])
+
+[^internship]: The Singularity Institute at the time was not the kind of organization that offered formal _internships_; what I mean is that there was a house in Santa Clara where a handful of people were trying to do Singularity-relevant work, and I was allowed to sleep in the garage and also try to do work, without being paid.
+
+[^siai]: The "for Artificial Intelligence" part was a holdover from the organization's founding, from before Yudkowsky [decided that AI would kill everyone by default (and that this was a bad thing)](https://www.lesswrong.com/s/SXurf2mWFw8LX2mkG). People soon started using "SingInst" as an abbreviation more than "SIAI", until the organization was eventually rebranded as the Machine Intelligence Research Institute in 2013.
+
+[^twenty-sixth]: Writing this up years later, I was surprised to see from the dates (26 March 2016) that my date with the escort was the same day as the "20% of the ones with penises" post (and my comment thereon and following conversation with "Thomas"). They hadn't been stored in my long-term episodic memory as "the same day", likely because the Facebook post only seems overwhelmingly significant in retrospect; at the time, I did not realize what I would be spending the next seven years of my life on.
+
+[^unethical]: Another ethically mitigating factor is that she had a blog where she wrote in detail about how much she liked her job. The blog posts seemed like credible evidence that she wasn't being morally-relevantly coerced into it. Of course all women in that profession have to put up marketing copy that makes it sound like they enjoy their time with their clients even if they privately hate it, but the blog seemed "real", not part of the role.
+
+He had agreed that seeing escorts is ethical—arguably _more_ ethical than casual sex. He had said that he had gotten interested in politics and developed in a socially and sexually conservative direction. "Free love is a lie," he said, noting that in a more traditional Society, our analogues would probably be married with kids by now.
+
+He also said that his gender dysphoria had receded. "At a certain point, I just cut my hair, gave away a lot of clothes, and left it behind. I kept waiting to regret it ... but the regret never came," he said. "It's like my brain got pushed off the fence and subtly re-wired."
+
+I had said that I was happy for him and respected him, even while my own life remained very pro-dysphoria, pro-ponytails, and anti-politics.
+
+After complimenting me on my comment on Yudkowsky's post on the twenty-sixth, "Thomas" elaborated that he thought Yudkowsky's post was really irresponsible, because virtually all of the men in Yudkowsky's audience with gender dysphoria probably had erotic target location errors. "Thomas" went on:
+
+> To get a little paranoid, I think the power to define other people's identities is extremely useful in politics. If a political coalition can convince you that you have a persecuted identity or sexuality and it will support you, then it owns you for life, and can conscript you for culture wars and elections. Moloch would never pass up this level of power, so that means a constant stream of bad philosophy about identity and sexuality (like trans theory).
+>
+> So when I see Eliezer trying to convince nerdy men that they are actually women, I see the hand of Moloch.[^moloch]
+
+[^moloch]: The references to "Moloch" are presumably an allusion to Scott Alexander's ["Meditations on Moloch"](https://slatestarcodex.com/2014/07/30/meditations-on-moloch/), in which Alexander personifies coordination failures as the god Moloch.
+
+ (Incidentally, the quotation in "Meditations on Moloch" of [a poem I wrote in the _Less Wrong_ comments in 2011](https://www.lesswrong.com/posts/bkRpALFAwJQuntHiF/the-gift-we-give-tomorrow-spoken-word-finished?commentId=ZkoKnj4nLbScHHHJC) is probably the most exposure my writing has ever gotten, and will ever get.)
+
+We chatted for a few more minutes. I noted [Samo Burja's comment](/images/burja-shape_of_the_moral_panic.png) on Yudkowsky's post as a "terrible thought" that had also occurred to me: Burja had written that the predicted moral panic may not be along the expected lines, if an explosion of MtFs were to result in trans women dominating previously sex-reserved spheres of social competition. "[F]or signaling reasons, I will not give [the comment] a Like", I added parenthetically.[^signaling-reasons]
+
+[^signaling-reasons]: This brazen cowardice makes for a stark contrast to my current habits of thought. Today, I would notice that if "for signaling reasons", people don't Like comments that make _insightful and accurate predictions_ about contemporary social trends, then subscribers to our collective discourse will be _less prepared_ for a world in which those trends have progressed further.
+
+A few weeks later, I moved out of my mom's house in [Walnut Creek](https://en.wikipedia.org/wiki/Walnut_Creek,_California) to go live with a new roommate in an apartment on the correct side of the [Caldecott tunnel](https://en.wikipedia.org/wiki/Caldecott_Tunnel), in [Berkeley](https://en.wikipedia.org/wiki/Berkeley,_California), closer to other people in the robot-cult scene and with a shorter train ride to my coding dayjob in San Francisco.
+
+(I would later change my mind about which side of the tunnel is the correct one.)
+
+While I was waiting for internet service to be connected in my new apartment, I read a paper copy of _Nevada_ by Imogen Binnie. (I don't recall how I came across the recommendation.) It's about a trans woman in New York City named Maria Griffiths who steals her girlfriend's car to go on a road trip, and ends up meeting an autogynephilic young man named James H. in a Wal-Mart in Star City, Nevada, whereupon she tries to convince James that _autogynephilia_ is a bogus concept and that he's actually trans.
+
+In Berkeley, I met some really interesting people who seemed quite similar to me along a lot of dimensions, but also very different along some other dimensions having to do with how they were currently living their life! (I see where the pattern-matching faculties in Yudkowsky's brain got that 20% figure from.)
+
+This prompted me to do a little bit more reading in some corners of the literature that I had certainly _heard of_ (as discussed above), but hadn't already mastered and taken seriously in the previous twelve years of reading everything I could about sex and gender and transgender and feminism and evopsych. (Kay Brown's blog, [_On the Science of Changing Sex_](https://sillyolme.wordpress.com/), was especially helpful.)
+
+Between the reading and a series of increasingly frustrating private conversations, I gradually became persuaded that Blanchard _wasn't_ dumb and wrong, that his taxonomy of male-to-female transsexualism is _basically_ correct, at least as a first approximation. So far this story has just been about _my_ experience, and not anyone's theory of transsexualism (which I had assumed for years couldn't possibly apply to me), so let me take a moment to explain the theory now.
+
+(With the caveated understanding that psychology is complicated and there's [a lot to be said about what "as a first approximation" is even supposed to mean](/2022/Jul/the-two-type-taxonomy-is-a-useful-approximation-for-a-more-detailed-causal-model/), but I need a few paragraphs to talk about the _simple_ version of the theory that makes _pretty good_ predictions on _average_, as a prerequisite for more complicated theories that might make even better predictions including on cases that diverge from average.)