At this point, any _normal people_ who are (somehow?) reading this might be thinking, isn't that weird and kind of cultish?—some blogger you follow posted something you thought was strange earlier this year, and you want to pay him _one grand_ to talk about it? To the normal person I would explain thusly—
First, in our subculture, we don't have your weird hangups about money: people's time is valuable, and paying people money in exchange for them using their time differently from how they otherwise would is a perfectly ordinary thing for microeconomic agents to do. Upper-middle–class normal people don't blink at paying a licensed therapist $100 to talk for an hour, because their culture designates that as a special ritualized context in which paying money to talk to someone isn't weird. In my culture, we don't need the special ritualized context; Yudkowsky just had a somewhat higher rate than most therapists.
Second, $1000 isn't actually real money to a San Francisco software engineer.
Third—yes. Yes, it _absolutely_ was kind of cultish. There's a sense in which, _sociologically and psychologically speaking_, Yudkowsky is a religious leader, and I was—am—a devout adherent of the religion he made up.

By this I don't mean that the _content_ of Yudkowskian rationalism is much comparable to Christianity or Buddhism. But whether or not there is a God or a Divine (there is not), the _features of human psychology_ that make Christianity or Buddhism adaptive memeplexes are still going to be active; the God-shaped hole in my head can't not be filled by _something_, and Yudkowsky's writings on the hidden Bayesian structure of the universe were a potent way to fill that hole.

It seems fair to compare my tendency to write in Sequences links to a devout Christian's tendency to quote Scripture by chapter and verse; the underlying mental motion of "appeal to the holy text" is probably pretty similar. My only defense is that _my_ religion is _actually true_ (and that my religion says you should read the texts and think it through for yourself, rather than taking anything on "faith").
That's the context in which my happy-price email thread ended up including the sentence, "I feel awful writing _Eliezer Yudkowsky_ about this, because my interactions with you probably have disproportionately more simulation-measure than the rest of my life, and do I _really_ want to spend that on _this topic_?" (Referring to the idea that, in a sufficiently large universe where many subjectively-indistinguishable copies of everyone exist, including inside of future superintelligences running simulations of the past, there would plausibly be _more_ copies of my interactions with Yudkowsky than of other moments of my life, on account of that information being of greater decision-relevance to those superintelligences.)
(I can't confirm or deny whether he accepted the happy-price offer, because if we did talk, it would have been a private conversation.)
(It was also around this time that I snuck a copy of _Men Trapped in Men's Bodies_ into the [MIRI](https://intelligence.org/) office library, which was sometimes possible for community members to visit. It seemed like something Harry Potter-Evans-Verres would do—and ominously, I noticed, not like something Hermione Granger would do.)
Gatekeeping sessions finished, I finally started HRT at the end of December 2016. In an effort to not let my anti–autogynephilia-denialism crusade take over my life, earlier that month, I promised myself (and [published the SHA256 hash of the promise](https://www.facebook.com/zmdavis/posts/10154596054540199)) not to comment on gender issues under my real name through June 2017—_that_ was what my new pseudonymous blog was for.