+[TODO: Boston–New York visit
+
+ * worked on "Challenges" and my Charles Murray review from Boston
+
+ * saw an escort, who I found in the "ebony" section of eros.com
+ * who I'll call ... (which wasn't the name she used, and the name she used was surely not her real name)
+ * she finally got back to me just in time for me to go to the bank and get cash, there not being Wells Fargo branches on the east coast (speaking of decisions my father made for me when I was four years old)
+ * she was very late; we went to an Indian restaurant and then to my hotel
+ * an opportunity to talk to someone I wouldn't ordinarily talk to otherwise (messaging someone like her on match.com would have felt fake; paying for her time felt more "honest")
+ * I explained AGP to her
+ * I didn't have her touch my penis (that seemed "unethical" according to my own sense of ethics, though I'm not super-confident that my "ethics" didn't make things weirder for her); I just wanted to touch
+ * I wasn't coming; she said that for $2K, I definitely deserved to get off
+ * she said I could have her breasts, they were heavy
+ * my comment about how I wished I could have a photograph, but that it would be rude to ask; she said "No", and I wanted to clarify that I didn't ask, I said I wished I _could_ ask—but, you see, her culture didn't support that level of indirection; the claim that I wasn't asking, would seem dishonest
+ * I didn't tell her about the Charles Murray book review I was writing
+
+ * met my NRx Twitter mutual, wore my Quillette shirt
+ * he had been banned from Slate Star Codex "for no reason"
+ * he offered to buy me a drink; I said I didn't drink, but he insisted that getting drunk together was the ritual by which men establish trust, so I had a glass and a half of wine
+ * it was so refreshing—not being constrained
+ * I explained the AI risk case; he mentioned black people having larger wingspans
+
+ * met Ben and his new girlfriend; Jessica wasn't around; he said the psych disaster was a betrayal, but a finite one; Ben's suggestion that if CfAR were serious, they'd hire me
+
+]
+
+------
+
+In October 2021, Jessica [published a memoir about her experiences at MIRI](https://www.lesswrong.com/posts/MnFqyPLqbiKL8nSR7/my-experience-at-and-around-miri-and-cfar-inspired-by-zoe), making analogies between sketchy social pressures she had experienced in the core rationalist community (around short AI timelines, secrecy, deference to community leaders, _&c._) and those reported in [Zoe Curzi's recent account of Leverage Research](https://medium.com/@zoecurzi/my-experience-with-leverage-research-17e96a8e540b).
+
+Scott Alexander posted a comment "add[ing] some context [he thought was] important to this", essentially blaming Jessica's problems on her association with Michael Vassar, describing her psychotic episode as a "Vassar-related phenomenon" (!).
+
+I thought this was unfair, and [said so](https://www.lesswrong.com/posts/MnFqyPLqbiKL8nSR7/my-experience-at-and-around-miri-and-cfar-inspired-by-zoe?commentId=GzqsWxEp8uLcZinTy) (and [offered textual evidence](https://www.lesswrong.com/posts/MnFqyPLqbiKL8nSR7/my-experience-at-and-around-miri-and-cfar-inspired-by-zoe?commentId=yKo2uuCcwJxbwwyBw) against the claim that Michael was _trying_ to drive Jessica crazy).
+
+To me, Scott's behavior looked like raw factional conflict: Jessica had some negative-valence things to say about the Caliphate, so Caliphate leaders moved in to discredit her by association.
+
+It was effective, though. After Alexander's comment (and [a comment from Yudkowsky](https://www.lesswrong.com/posts/MnFqyPLqbiKL8nSR7/my-experience-at-and-around-miri-and-cfar-inspired-by-zoe?commentId=x5ajGhggHky9Moyr8) uncritically accepting Alexander's charge of Vassar "causing psychotic breaks in people"), the karma score on Jessica's post dropped by more than half, while Alexander's comment got voted up to more than 380 karma.
+
+[TODO my conversation with Scott—
+
+> when you had some more minor issues in 2019 I was more in the loop and I ended out emailing the Vassarites (deliberately excluding you from the email, a decision I will defend in private if you ask me) accusing them of making your situation worse and asking them to maybe lay off you until you were maybe feeling slightly better, and obviously they just responded with their "it's correct to be freaking about learning your entire society is corrupt and gaslighting" shtick.
+
+ * Scott interviewed me
+ * I said
+
+]
+
+In December, Jessica published [a followup post explaining the circumstances of her psychotic episode in more detail](https://www.lesswrong.com/posts/pQGFeKvjydztpgnsY/occupational-infohazards).
+
+[TODO: Scott concedes: https://www.lesswrong.com/posts/MnFqyPLqbiKL8nSR7/my-experience-at-and-around-miri-and-cfar-inspired-by-zoe?commentId=RGKkmyvyoeWe2LB7d ]
+
+------
+
+[TODO:
+Is this the hill _he_ wants to die on? If the world is ending either way, wouldn't it be more dignified for him to die _without_ Stalin's dick in his mouth?
+
+> The Kiritsugu shrugged. "When I have no reason left to do anything, I am someone who tells the truth."
+https://www.lesswrong.com/posts/4pov2tL6SEC23wrkq/epilogue-atonement-8-8
+
+ * Maybe not? If "dignity" is a term of art for log-odds of survival, maybe self-censoring to maintain influence over what big state-backed corporations are doing is "dignified" in that sense
+]
+
+At the end of the September 2021 Twitter altercation, I [said that I was upgrading my "mute" of @ESYudkowsky to a "block"](https://twitter.com/zackmdavis/status/1435468183268331525). Better to just leave, rather than continue to hang around in his mentions trying (consciously or otherwise) to pick fights, like a crazy ex-girlfriend. (["I have no underlying issues to address; I'm certifiably cute, and adorably obsessed"](https://www.youtube.com/watch?v=UMHz6FiRzS8) ...)
+
+I still had more things to say—a reply to the February 2021 post on pronoun reform, and the present memoir telling this Whole Dumb Story—but those could be written and published unilaterally. Given that we clearly weren't going to get to clarity and resolution, I didn't need to bid for any more of my ex-hero's attention and waste more of his time (valuable time, _limited_ time); I owed him that much.
+
+Leaving a personality cult is hard. As I struggled to write, I noticed that I was wasting a lot of cycles worrying about what he'd think of me, rather than saying the things I needed to say. I knew it was pathetic that my religion was so bottlenecked on _one guy_—particularly since the holy texts themselves (written by that one guy) [explicitly said not to do that](https://www.lesswrong.com/posts/t6Fe2PsEwb3HhcBEr/the-litany-against-gurus)—but unwinding those psychological patterns was still a challenge.
+
+An illustration of the psychological dynamics at play: on an EA forum post about demandingness objections to longtermism, Yudkowsky [commented that](https://forum.effectivealtruism.org/posts/fStCX6RXmgxkTBe73/towards-a-weaker-longtermism?commentId=Kga3KGx6WAhkNM3qY) he was "broadly fine with people devoting 50%, 25% or 75% of themselves to longtermism, in that case, as opposed to tearing themselves apart with guilt and ending up doing nothing much, which seems to be the main alternative."