The _casus belli_ for the religious civil war happened on 28 November 2018. I was at my new dayjob's company offsite event in Austin, Texas. Coincidentally, I had already spent much of the previous two days (since just before the plane to Austin took off) arguing trans issues with other "rationalists" on Discord.
Just that month, I had started a Twitter account using my real name, inspired in an odd way by the suffocating [wokeness of the Rust open-source software scene](/2018/Oct/sticker-prices/) where I [occasionally contributed diagnostics patches to the compiler](https://github.com/rust-lang/rust/commits?author=zackmdavis). My secret plan/fantasy was to get more famous and established in the Rust world (one of compiler team membership, or conference talk accepted, preferably both), get some corresponding Twitter followers, and _then_ bust out the @BlanchardPhD retweets and links to this blog. In the median case, absolutely nothing would happen (probably because I failed at being famous), but I saw an interesting tail of scenarios in which I'd get to be a test case in [the Code of Conduct wars](https://techcrunch.com/2016/03/05/how-we-may-mesh/).
So, now having a Twitter account, I was browsing Twitter in the bedroom at the rental house for the dayjob retreat when I happened to come across [this thread by @ESYudkowsky](https://twitter.com/ESYudkowsky/status/1067183500216811521):
I think commonsense privacy-norm-adherence intuitions actually say _No_ here: the text of Alice's messages makes it too easy to guess that sometime between 5 and 6, Bob probably said that he couldn't come to the party because he has gout. It would seem that Alice's right to talk about her own actions in her own life _does_ need to take into account some commonsense judgment of whether that leaks "sensitive" information about Bob.
In the substory (of my Whole Dumb Story) that follows, I'm going to describe several times that I and others emailed Yudkowsky to argue with what he said in public, without saying anything about whether Yudkowsky replied or what he might have said if he did reply. I maintain that I'm within my rights here, because I think commonsense judgment will agree that me talking about the arguments _I_ made does not leak any sensitive information about the other side of a conversation that may or may not have happened. I think the story comes off relevantly the same whether Yudkowsky didn't reply at all (_e.g._, because he was too busy with more existentially important things to check his email), or whether he replied in a way that I found sufficiently unsatisfying as to occasion the further emails with followup arguments that I describe. (Talking about later emails _does_ rule out the possible world where Yudkowsky had said, "Please stop emailing me," because I would have respected that, but the fact that he didn't say that isn't "sensitive".)
It seems particularly important to lay out these judgments about privacy norms in connection to my attempts to contact Yudkowsky, because part of what I'm trying to accomplish in telling this Whole Dumb Story is to deal reputational damage to Yudkowsky, which I claim is deserved. (We want reputations to track reality. If you see Carol exhibiting a pattern of intellectual dishonesty, and she keeps doing it even after you talk to her about it privately, you might want to write a blog post describing the pattern in detail—not to _hurt_ Carol, particularly, but so that everyone else can make higher-quality decisions about whether they should believe the things that Carol says.) Given that motivation of mine, it seems important that I only try to hang Yudkowsky with the rope of what he said in public, where you can click the links and read the context for yourself. In the substory that follows, I also describe correspondence with Scott Alexander, but that doesn't seem sensitive in the same way, because I'm not particularly trying to deal reputational damage to Alexander. (Not because Scott performed well, but because one wouldn't really have expected him to in this situation; Alexander's reputation isn't so direly in need of correction.)
Thus, I don't think I should say whether Yudkowsky replied to Michael's and my emails, nor (again) whether he accepted the cheerful-price money, because any conversation that may or may not have occurred would have been private. But what I _can_ say, because it was public, is that we saw [this addition to the Twitter thread](https://twitter.com/ESYudkowsky/status/1068071036732694529):
> I was sent this (by a third party) as a possible example of the sort of argument I was looking to read: [http://unremediatedgender.space/2018/Feb/the-categories-were-made-for-man-to-make-predictions/](/2018/Feb/the-categories-were-made-for-man-to-make-predictions/). Without yet judging its empirical content, I agree that it is not ontologically confused. It's not going "But this is a MAN so using 'she' is LYING."
Look at that! The great Eliezer Yudkowsky said that my position is "not ontologically confused." That's probably high praise, coming from him!
You might think that that should have been the end of the story. Yudkowsky denounced a particular philosophical confusion, I already had a related objection written up, and he publicly acknowledged my objection as not being the confusion he was trying to police. I _should_ be satisfied, right?
I wasn't, in fact, satisfied. This little "not ontologically confused" clarification buried deep in the replies was much less visible than the bombastic, arrogant top-level pronouncement insinuating that resistance to gender-identity claims _was_ confused. (1 Like on this reply, _vs._ 140 Likes/18 Retweets on start of thread.) This little follow-up did not seem likely to disabuse the typical reader of the impression that Yudkowsky thought gender-identity skeptics didn't have a leg to stand on. Was it greedy of me to want something _louder_?
Greedy or not, I wasn't done flipping out. On 1 December 2018, I wrote to Scott Alexander (cc'ing a few other people) to ask if there was any chance of an explicit and loud clarification or partial retraction of ["... Not Man for the Categories"](https://slatestarcodex.com/2014/11/21/the-categories-were-made-for-man-not-man-for-the-categories/) (Subject: "super-presumptuous mail about categorization and the influence graph"). Forget my boring whining about the autogynephilia/two-types thing, I said—that's a complicated empirical claim, and not the key issue.
The _issue_ was that category boundaries are not arbitrary (if you care about intelligence being useful). You want to [draw your category boundaries such that](https://www.lesswrong.com/posts/d5NyJ2Lf6N22AD9PB/where-to-draw-the-boundary) things in the same category are similar in the respects that you care about predicting/controlling, and you want to spend your [information-theoretically limited budget](https://www.lesswrong.com/posts/soQX8yXLbKy7cFvy8/entropy-and-short-codes) of short words on the simplest and most widely useful categories.
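(To make the "limited budget of short words" point concrete, here's a toy calculation in the spirit of the linked posts—a minimal sketch of mine, with invented categories and usage frequencies, not anything from the actual email: expected message length is minimized when the shortest words go to the most frequently used categories, with the Shannon entropy as the lower bound.)

```python
# Toy sketch of the "short words" budget (all numbers invented for
# illustration): expected communication cost is minimized by assigning
# the shortest words to the most frequently used categories, just as an
# optimal code spends about -log2(p) bits on a message of probability p.
from math import log2

usage = {"dog": 0.5, "cat": 0.25, "marmot": 0.15, "axolotl": 0.10}

def expected_length(word_lengths):
    """Average word length, weighted by how often each category is used."""
    return sum(usage[w] * word_lengths[w] for w in usage)

short_words_for_common_things = {"dog": 1, "cat": 2, "marmot": 3, "axolotl": 3}
short_words_for_rare_things = {"dog": 3, "cat": 3, "marmot": 2, "axolotl": 1}

print(expected_length(short_words_for_common_things))  # 1.75
print(expected_length(short_words_for_rare_things))    # 2.65
print(sum(-p * log2(p) for p in usage.values()))       # entropy bound ≈ 1.74
```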
It was true that [the reason _I_ was continuing to freak out about this](/2019/Jul/the-source-of-our-power/) to the extent of sending him this obnoxious email telling him what to write (seriously, who does that?!) had to do with transgender stuff, but that wasn't why _Scott_ should care.
The other year, Alexander had written a post, ["Kolmogorov Complicity and the Parable of Lightning"](http://slatestarcodex.com/2017/10/23/kolmogorov-complicity-and-the-parable-of-lightning/), explaining the consequences of political censorship with an allegory about a Society with the dogma that thunder occurs before lightning.[^kolmogorov-pun] Alexander had explained that the problem with complying with the dictates of a false orthodoxy wasn't the sacred dogma itself (it's not often that you need to _directly_ make use of the fact that lightning comes first), but that [the need to _defend_ the sacred dogma](https://www.lesswrong.com/posts/wyyfFfaRar2jEdeQK/entangled-truths-contagious-lies) [destroys everyone's ability to think](https://www.lesswrong.com/posts/XTWkjCJScy2GFAgDt/dark-side-epistemology).
[^kolmogorov-pun]: The title was a [pun](https://en.wikipedia.org/wiki/Kolmogorov_complexity) referencing computer scientist Scott Aaronson's post advocating ["The Kolmogorov Option"](https://www.scottaaronson.com/blog/?p=3376), serving the cause of Truth by cultivating a bubble that focuses on specific truths that won't get you in trouble with the local political authorities. Named after the Soviet mathematician Andrey Kolmogorov, who knew better than to pick fights he couldn't win.
It was the same thing here. It wasn't that I had any practical need to misgender anyone in particular. It still wasn't okay that talking about the reality of biological sex to so-called "rationalists" got you an endless deluge of—polite! charitable! non-ostracism-threatening!—_bullshit nitpicking_. (What about [complete androgen insensitivity syndrome](https://en.wikipedia.org/wiki/Complete_androgen_insensitivity_syndrome)? Why doesn't this ludicrous misinterpretation of what you said [imply that lesbians aren't women](https://thingofthings.wordpress.com/2018/06/18/man-should-allocate-some-more-categories/)? _&c. ad infinitum_.) With enough time, I thought the nitpicks could and should be satisfactorily answered; any remaining would presumably be fatal criticisms rather than bullshit nitpicks. But while I was in the process of continuing to write all that up, I hoped Alexander could see why I felt somewhat gaslighted.
(I had been told by others that I wasn't using the word "gaslighting" correctly. No one seemed to think I had the right to define _that_ category boundary for my convenience.)
If our vaunted rationality techniques resulted in me having to spend dozens of hours patiently explaining why I didn't think that I was a woman and that [the person in this photograph](https://daniellemuscato.startlogic.com/uploads/3/4/9/3/34938114/2249042_orig.jpg) wasn't, either (where "not a woman" is a convenient rhetorical shorthand for a much longer statement about [naïve Bayes models](https://www.lesswrong.com/posts/gDWvLicHhcMfGmwaK/conditional-independence-and-naive-bayes) and [high-dimensional configuration spaces](https://www.lesswrong.com/posts/WBw8dDkAWohFjWQSk/the-cluster-structure-of-thingspace) and [defensible Schelling points for social norms](https://www.lesswrong.com/posts/Kbm6QnJv9dgWsPHQP/schelling-fences-on-slippery-slopes)), then our techniques were worse than useless.
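(As a gloss on that shorthand: here's a minimal sketch of the kind of computation the longer statement is about—category membership as posterior probability of cluster assignment given observable features. The features, clusters, and numbers are invented for illustration; nothing here is from the actual conversations.)

```python
# Minimal naive-Bayes sketch (features, clusters, and all numbers invented
# for illustration): category membership as the posterior probability that
# an entity came from one cluster rather than another, given
# class-conditionally independent observable features.
from math import exp, pi, sqrt

def gaussian_pdf(x, mean, std):
    return exp(-0.5 * ((x - mean) / std) ** 2) / (std * sqrt(2 * pi))

clusters = {
    "cluster A": {"prior": 0.5, "means": (165.0, 2.0), "stds": (7.0, 1.0)},
    "cluster B": {"prior": 0.5, "means": (178.0, 7.0), "stds": (7.0, 1.5)},
}

def posterior(features):
    """P(cluster | features), multiplying per-feature likelihoods (naive Bayes)."""
    joint = {
        name: c["prior"]
        * gaussian_pdf(features[0], c["means"][0], c["stds"][0])
        * gaussian_pdf(features[1], c["means"][1], c["stds"][1])
        for name, c in clusters.items()
    }
    total = sum(joint.values())
    return {name: j / total for name, j in joint.items()}

# An entity whose features sit near cluster B's center gets a posterior
# heavily favoring cluster B, as a statistical fact about the features.
print(posterior((180.0, 6.5)))
```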
[If Galileo ever muttered "And yet it moves"](https://en.wikipedia.org/wiki/And_yet_it_moves), there's a long and nuanced conversation you could have about the consequences of using the word "moves" in Galileo's preferred sense, as opposed to some other sense that happens to result in the theory needing more epicycles. It may not have been obvious in November 2014 when "... Not Man for the Categories" was published, but in retrospect, maybe it was a _bad_ idea to build a [memetic superweapon](https://archive.is/VEeqX) that says that the number of epicycles _doesn't matter_.
The reason to write this as a desperate email plea to Scott Alexander instead of working on my own blog was that I was afraid that marketing is a more powerful force than argument. Rather than good arguments propagating through the population of so-called "rationalists" no matter where they arose, what actually happened was that people like Alexander and Yudkowsky rose to power on the strength of good arguments and entertaining writing (but mostly the latter), and then everyone else absorbed some of their worldview (plus noise and [conformity with the local environment](https://thezvi.wordpress.com/2017/08/12/what-is-rationalist-berkleys-community-culture/)). So for people who didn't [win the talent lottery](http://slatestarcodex.com/2015/01/31/the-parable-of-the-talents/) but thought they saw a flaw in the _zeitgeist_, the winning move was "persuade Scott Alexander."
Back in 2010, the rationalist community had a shared understanding that the function of language is to describe reality. Now, we didn't. If Scott didn't want to cite my creepy blog about my creepy fetish, that was fine; I liked getting credit, but the important thing was that this "No, the Emperor isn't naked—oh, well, we're not claiming that he's wearing any garments—it would be pretty weird if we were claiming _that!_—it's just that utilitarianism implies that the _social_ property of clothedness should be defined this way because to do otherwise would be really mean to people who don't have anything to wear" maneuver needed to _die_, and he alone could kill it.
Scott didn't get it. We agreed that gender categories based on self-identity, natal sex, and passing each had their own pros and cons, and that it's uninteresting to focus on whether something "really" belongs to a category rather than on communicating what you mean. Scott took this to mean that what convention to use is a pragmatic choice we can make on utilitarian grounds, and that being nice to trans people was worth a little bit of clunkiness—that the mental health benefits to trans people were obviously enough to tip the first-order utilitarian calculus.
I didn't think anything about "mental health benefits to trans people" was obvious. More importantly, I considered myself to be prosecuting not the object-level question of which gender categories to use but the meta-level question of what normative principles govern the use of categories. For this, "whatever, it's a pragmatic choice, just be nice" wasn't an answer, because the normative principles exclude "just be nice" from being a relevant consideration.
-["... Not Man for the Categories"](https://slatestarcodex.com/2014/11/21/the-categories-were-made-for-man-not-man-for-the-categories/) had concluded with a section on [Emperor Norton](https://en.wikipedia.org/wiki/Emperor_Norton), a 19th century San Francisco resident who declared himself Emperor of the United States. Certainly, it's not difficult or costly for the citizens of San Francisco to _address_ Norton as "Your Majesty" as a courtesy or a nickname. But there's more to being Emperor of the United States than people calling you "Your Majesty." Unless we abolish Congress and have the military enforce Norton's decrees, he's not _actually_ functioning in the role of emperor—at least not according to the currently generally-understood meaning of the word "emperor."
+["... Not Man for the Categories"](https://slatestarcodex.com/2014/11/21/the-categories-were-made-for-man-not-man-for-the-categories/) had concluded with a section on [Emperor Norton](https://en.wikipedia.org/wiki/Emperor_Norton), a 19th-century San Francisco resident who declared himself Emperor of the United States. Certainly, it's not difficult or costly for the citizens of San Francisco to address Norton as "Your Majesty". But there's more to being Emperor of the United States than what people call you. Unless we abolish Congress and have the military enforce Norton's decrees, he's not actually emperor—at least not according to the currently generally understood meaning of the word.
What are you going to do if Norton takes you literally? Suppose he says, "I ordered the Imperial Army to invade Canada last week; where are the troop reports? And why do the newspapers keep talking about this so-called 'President' Rutherford B. Hayes? Have this pretender Hayes executed at once and bring his head to me!"
To be sure, words can be used in many ways depending on context, but insofar as Norton _is_ interpreting "emperor" in the traditional sense, and you keep calling him your emperor without caveats or disclaimers, _you are lying to him_.
Scott still didn't get it. But I did soon end up in more conversation with Michael Vassar, Ben Hoffman, and Sarah Constantin, who were game to help me reach out to Yudkowsky again to explain the problem in more detail—and to appeal to the conscience of someone who built their career on [higher standards](https://www.lesswrong.com/posts/DoLQN5ryZ9XkZjq5h/tsuyoku-naritai-i-want-to-become-stronger).
Yudkowsky probably didn't think much of _Atlas Shrugged_ (judging by [an offhand remark by our protagonist in _Harry Potter and the Methods_](http://www.hpmor.com/chapter/20)), but I kept thinking of the scene[^atlas-shrugged-ref] where our heroine, Dagny Taggart, entreats the great Dr. Robert Stadler to denounce [an egregiously deceptive but technically-not-lying statement](https://www.lesswrong.com/posts/MN4NRkMw7ggt9587K/firming-up-not-lying-around-its-edge-cases-is-less-broadly) by the State Science Institute, whose legitimacy derives from its association with his name. Stadler has become cynical in his old age and demurs: "I can't help what people think—if they think at all!" ... "How can one deal in truth when one deals with the public?"
[^atlas-shrugged-ref]: In Part One, Chapter VII, "The Exploiters and the Exploited".
(I was wrong.)
If we had this entire posse, I felt bad and guilty and ashamed about focusing too much on my special interest except insofar as it was genuinely a proxy for "Has Eliezer and/or everyone else [lost the plot](https://thezvi.wordpress.com/2017/08/12/what-is-rationalist-berkleys-community-culture/), and if so, how do we get it back?" But the group seemed to agree that my philosophy-of-language grievance was a useful test case.
At times, it felt like my mind shut down with only the thought, "What am I doing? This is absurd. Why am I running around picking fights about the philosophy of language—and worse, with me arguing for the _Bad_ Guys' position? Maybe I'm wrong and should stop making a fool of myself. After all, using [Aumann-like](https://www.lesswrong.com/tag/aumann-s-agreement-theorem) reasoning, in a dispute of 'me and Michael Vassar vs. _everyone else_', wouldn't I want to bet on 'everyone else'?"
Except ... I had been raised back in the 'aughts to believe that you're supposed to concede arguments on the basis of encountering a superior counterargument, and I couldn't actually point to one. "Maybe I'm making a fool out of myself by picking fights with all these high-status people" is not a counterargument.
Anna continued to be disinclined to take a side in the brewing Category War, and it was beginning to put a strain on our friendship, to the extent that I kept ending up crying during our occasional meetings. She said that my "You have to pass my philosophy-of-language litmus test or I lose all respect for you as a rationalist" attitude was psychologically coercive. I agreed—I was even willing to go up to "violent", in the sense that I'd cop to [trying to apply social incentives toward an outcome rather than merely exchanging information](http://zackmdavis.net/blog/2017/03/an-intuition-on-the-bayes-structural-justification-for-free-speech-norms/). But sometimes you need to use violence in defense of self or property. If we thought of the "rationalist" brand name as intellectual property, maybe it was property worth defending, and if so, then "I can define a word any way I want" wasn't an obviously terrible time to start shooting at the bandits.
My hope was that it was possible to apply just enough "What kind of rationalist are _you_?!" social pressure to cancel out the "You don't want to be a Bad ([Red](https://slatestarcodex.com/2014/09/30/i-can-tolerate-anything-except-the-outgroup/)) person, do you??" social pressure and thereby let people look at the arguments—though I wasn't sure if that even works, and I was growing exhausted from all the social aggression I was doing. (If someone tries to take your property and you shoot at them, you could be said to be the "aggressor" in the sense that you fired the first shot, even if you hope that the courts will uphold your property claim later.)
After some more discussion within the me/Michael/Ben/Sarah posse, on 4 January 2019, I wrote to Yudkowsky a second time, to explain the specific problems with his "hill of meaning in defense of validity" Twitter performance, since that apparently hadn't been obvious from the earlier link to ["... To Make Predictions"](/2018/Feb/the-categories-were-made-for-man-to-make-predictions/). I cc'ed the posse, who chimed in afterwards.
Ben explained what kind of actions we were hoping for from Yudkowsky: that he would (1) notice that he'd accidentally been participating in an epistemic war, (2) generalize the insight (if he hadn't noticed, what were the odds that MIRI had adequate defenses?), and (3) join the conversation about how to _actually_ have a rationality community, while noticing this particular way in which the problem seemed harder than it used to. For my case in particular, something that would help would be either (A) a clear _ex cathedra_ statement that gender categories are not an exception to the general rule that categories are nonarbitrary, _or_ (B) a clear _ex cathedra_ statement that he's been silenced on this matter. If even (B) was too politically expensive, that seemed like important evidence about (1).
Without revealing the other side of any private conversation that may or may not have occurred, I can say that we did not get either of those _ex cathedra_ statements at this time.
It was also around this time that our posse picked up a new member, whom I'll call "Riley".
-----
On 5 January 2019, I met with Michael and his associate Aurora Quinn-Elmore in San Francisco to attempt mediated discourse with [Ziz](https://sinceriously.fyi/) and [Gwen](https://everythingtosaveit.how/), who were considering suing the [Center for Applied Rationality](https://rationality.org/) (CfAR)[^what-is-cfar] for discriminating against trans women. Michael hoped to dissuade them from a lawsuit—not because he approved of CfAR's behavior, but because lawyers make everything worse.
[^what-is-cfar]: CfAR had been spun off from MIRI in 2012 as a dedicated organization for teaching rationality.
Ziz recounted [her](/2019/Oct/self-identity-is-a-schelling-point/) story of how Anna Salamon (in her capacity as President of CfAR and community leader) allegedly engaged in [conceptual warfare](https://sinceriously.fyi/intersex-brains-and-conceptual-warfare/) to falsely portray Ziz as a predatory male. I was unimpressed: in my worldview, I didn't think Ziz had the right to say "I'm not a man," and expect people to just believe that. ([I remember that](https://twitter.com/zackmdavis/status/1081952880649596928) at one point, Ziz answered a question with, "Because I don't run off masochistic self-doubt like you." I replied, "That's fair.") But I did respect that Ziz actually believed in an intersex brain theory: in Ziz and Gwen's worldview, people's genders were a _fact_ of the matter, not a manipulation of consensus categories to make people happy.
Probably the most ultimately consequential part of this meeting was Michael verbally confirming to Ziz that MIRI had settled with a disgruntled former employee, Louie Helm, who had put up [a website slandering them](https://archive.ph/Kvfus). (I don't know the details of the alleged settlement. I'm working off of [Ziz's notes](https://sinceriously.fyi/intersex-brains-and-conceptual-warfare/) rather than remembering that part of the conversation clearly myself; I don't know what Michael knew.) What was significant was that if MIRI _had_ paid Helm as part of an agreement to get the slanderous website taken down, then (whatever the nonprofit best-practice books might have said about whether this was a wise thing to do when facing a dispute from a former employee) that would decision-theoretically amount to a blackmail payout, which seemed to contradict MIRI's advocacy of timeless decision theories (according to which you [shouldn't be the kind of agent that yields to extortion](/2018/Jan/dont-negotiate-with-terrorist-memeplexes/)).
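(A toy model of the decision-theoretic point, with payoffs invented for illustration: a prospective extortionist decides whether to make a threat based on a prediction of the victim's policy, so an agent known to refuse never gets threatened in the first place and comes out ahead in expectation.)

```python
# Toy sketch of why timeless-decision-theory-flavored agents don't pay
# blackmail (all payoffs invented): the blackmailer only bothers to
# threaten when the predicted policy makes threatening profitable.
def blackmailer_threatens(victim_policy, payout=3, execution_cost=1):
    # Threatening a payer nets `payout`; threatening a refuser forces a
    # choice between backing down (gaining 0) and carrying out the threat
    # at a cost—neither of which beats not threatening at all.
    return victim_policy == "pay"

def victim_expected_loss(victim_policy, harm=10, payout=3):
    if blackmailer_threatens(victim_policy):
        return payout if victim_policy == "pay" else harm
    return 0

print(victim_expected_loss("pay"))     # 3: paying invites the threat
print(victim_expected_loss("refuse"))  # 0: refusers aren't worth threatening
```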
-----
Something else Ben had said while chiming in on the second attempt to reach out to Yudkowsky hadn't sat quite right with me. He had written:
> I am pretty worried that if I actually point out the ***physical injuries*** sustained by some of the smartest, clearest-thinking, and kindest people I know in the Rationalist community as a result of this sort of thing, I'll be dismissed as a mean person who wants to make other people feel bad.
I didn't know what he was talking about. My friend "Rebecca"'s 2015 psychiatric imprisonment ("hospitalization") had probably been partially related to her partner's transition and had involved rough handling by the cops. I had been through some Bad Stuff during my psychotic episodes of February and April 2017, but none of it was "physical injuries." What were the other cases, if he could share without telling me Very Secret Secrets With Names?
Ben said that, probabilistically, he expected that some fraction of the trans women he knew who had "voluntarily" had bottom surgery had done so in response to social pressure, even if some of them might well have sought it out in a less weaponized culture.
I said that saying, "I am worried that if I actually point out the physical injuries ..." when the actual example turned out to be sex reassignment surgery seemed dishonest: I had thought he might have more examples of situations like mine or "Rebecca"'s, where gaslighting escalated into more tangible harm in a way that people wouldn't know about by default. In contrast, people already know that bottom surgery is a thing; Ben just had reasons to think it's Actually Bad—reasons that his friends couldn't engage with if _we didn't know what he was talking about_. It was bad enough that Yudkowsky was being so cagey; if _everyone_ did it, then we were really doomed.
Ben said he was more worried that saying politically loaded things in the wrong order would reduce our chances of getting engagement from Yudkowsky than that someone would share his words out of context in a way that caused him distinct harm. And maybe more than both of those, that saying the wrong keywords would cause his correspondents to talk about _him_ using the wrong keywords, in ways that caused illegible, hard-to-trace damage.
-----
There's a view that assumes that as long as everyone is being cordial, our truthseeking public discussion must be basically on track; the discussion is only being warped by the fear of heresy if someone is overtly calling to burn the heretics.
I do not hold this view. I think there's a _subtler_ failure mode where people know what the politically favored [bottom line](https://www.lesswrong.com/posts/34XxbRFe54FycoCDw/the-bottom-line) is, and collude to ignore, nitpick, or just be _uninterested_ in any fact or line of argument that doesn't fit. I want to distinguish between direct ideological conformity enforcement attempts, and people not living up to their usual epistemic standards in response to ideological conformity enforcement.
Especially compared to normal Berkeley, I had to give the Berkeley "rationalists" credit for being very good at free speech norms. (I'm not sure I would be saying this in the possible world where Scott Alexander didn't have a [traumatizing experience with social justice in college](https://slatestarcodex.com/2014/01/12/a-response-to-apophemi-on-triggers/), causing him to dump a ton of [anti-social-justice](https://slatestarcodex.com/tag/things-i-will-regret-writing/), [pro-argumentative-charity](https://slatestarcodex.com/2013/02/12/youre-probably-wondering-why-ive-called-you-here-today/) antibodies into the "rationalist" water supply after he became our subculture's premier writer. But it was true in _our_ world.) I didn't want to fall into the [bravery-debate](http://slatestarcodex.com/2013/05/18/against-bravery-debates/) trap of, "Look at me, I'm so heroically persecuted, therefore I'm right (therefore you should have sex with me)". I wasn't angry at the "rationalists" for silencing me (which they didn't); I was angry at them for making bad arguments and systematically refusing to engage with the obvious counterarguments.
As an illustrative example, in an argument on Discord in January 2019, I said, "I need the phrase 'actual women' in my expressive vocabulary to talk about the phenomenon where, if transition technology were to improve, then the people we call 'trans women' would want to make use of that technology; I need language that _asymmetrically_ distinguishes between the original thing that already exists without having to try, and the artificial thing that's trying to imitate it to the limits of available technology".
Kelsey Piper replied, "the people getting surgery to have bodies that do 'women' more the way they want are mostly cis women [...] I don't think 'people who'd get surgery to have the ideal female body' cuts anything at the joints."
Another woman said, "'the original thing that already exists without having to try' sounds fake to me" (to the acclaim of four "+1" emoji reactions).
The problem with this kind of exchange is not that anyone is being shouted down, nor that anyone is lying. The _problem_ is that people are motivatedly, ["algorithmically"](https://www.lesswrong.com/posts/sXHQ9R5tahiaXEZhR/algorithmic-intent-a-hansonian-generalized-anti-zombie) "playing dumb." I wish we had more standard terminology for this phenomenon, which is ubiquitous in human life. By "playing dumb", I don't mean that Kelsey was consciously thinking, "I'm playing dumb in order to gain an advantage in this argument." I don't doubt that, subjectively, mentioning that cis women also get cosmetic surgery felt like a relevant reply. It's just that, in context, I was obviously trying to talk about the natural category of "biological sex", and Kelsey could have figured that out if she had wanted to.
It's not that anyone explicitly said, "Biological sex isn't real" in those words. ([The elephant in the brain](https://en.wikipedia.org/wiki/The_Elephant_in_the_Brain) knew it wouldn't be able to get away with _that_.) But if everyone correlatedly plays dumb whenever someone tries to talk about sex in clear language in a context where that could conceivably hurt some trans person's feelings, I think what you have is a culture of _de facto_ biological sex denialism. ("'The original thing that already exists without having to try' sounds fake to me"!!) It's not that hard to get people to admit that trans women are different from cis women, but somehow they can't (in public, using words) follow the implication that trans women are different from cis women _because_ trans women are male.
-Ben thought I was wrong to think of this kind of behavior as non-ostracisizing. The deluge of motivated nitpicking _is_ an implied marginalization threat, he explained: the game people were playing when they did that was to force me to choose between doing arbitarily large amounts of [interpretive labor](https://acesounderglass.com/2015/06/09/interpretive-labor/), or being cast as never having answered these construed-as-reasonable objections, and therefore over time losing standing to make the claim, being thought of as unreasonable, not getting invited to events, _&c._
+Ben thought I was wrong to see this behavior as non-ostracizing. The deluge of motivated nitpicking _is_ an implied marginalization threat, he explained: the game people were playing when they did that was to force me to choose between doing arbitrarily large amounts of [interpretive labor](https://acesounderglass.com/2015/06/09/interpretive-labor/) or being cast as never having answered these construed-as-reasonable objections, and therefore over time losing standing to make the claim, being thought of as unreasonable, not getting invited to events, _&c._
-I saw the dynamic he was pointing at, but as a matter of personality, I was more inclined to respond, "Welp, I guess I need to write faster and more clearly", rather than to say, "You're dishonestly demanding arbitrarily large amounts of interpretive labor from me." I thought Ben was far too quick to give up on people whom he modeled as trying not to understand, whereas I continued to have faith in the possibility of _making_ them understand if I just ... never gave up. Not to be _so_ much of a scrub as to play chess with a pigeon (which craps on the board and then struts around like it's won), or wrestle with a pig (which gets you both dirty, and the pig likes it), or dispute [what the Tortoise said to Achilles](https://en.wikipedia.org/wiki/What_the_Tortoise_Said_to_Achilles)—but to hold out hope that people in "the community" could only be _boundedly_ motivatedly dense, and anyway that giving up wouldn't make me a stronger writer.
+I saw the dynamic he was pointing at, but as a matter of personality, I was more inclined to respond, "Welp, I guess I need to write faster and more clearly", rather than, "You're dishonestly demanding arbitrarily large amounts of interpretive labor from me." I thought Ben was far too quick to give up on people whom he modeled as trying not to understand, whereas I continued to have faith in the possibility of _making_ them understand if I just didn't give up. Not to play chess with a pigeon (which craps on the board and then struts around like it's won), or wrestle with a pig (which gets you both dirty, and the pig likes it), or dispute [what the Tortoise said to Achilles](https://en.wikipedia.org/wiki/What_the_Tortoise_Said_to_Achilles)—but to hold out hope that people in "the community" could only be boundedly motivatedly dense, and anyway that giving up wouldn't make me a stronger writer.
-(Picture me playing Hermione Granger in a post-Singularity [holonovel](https://memory-alpha.fandom.com/wiki/Holo-novel_program) adaptation of _Harry Potter and the Methods of Rationality_ (Emma Watson having charged me [the standard licensing fee](/2019/Dec/comp/) to use a copy of her body for the occasion): "[We can do anything if we](https://www.hpmor.com/chapter/30) exert arbitrarily large amounts of interpretive labor!")
+(Picture me playing Hermione Granger in a post-Singularity [holonovel](https://memory-alpha.fandom.com/wiki/Holo-novel_program) adaptation of _Harry Potter and the Methods of Rationality_, Emma Watson having charged me [the standard licensing fee](/2019/Dec/comp/) to use a copy of her body for the occasion: "[We can do anything if we](https://www.hpmor.com/chapter/30) exert arbitrarily large amounts of interpretive labor!")
-Ben thought that making them understand was hopeless and that becoming a stronger writer was a boring goal; it would be a better use of my talents to jump up an additional meta level and explain _how_ people were failing to engage. That is, insofar as I expected arguing to work, I had a model of "the rationalists" that kept making bad predictions. What was going on there? Something interesting might happen if I tried to explain _that_.
+Ben thought that making them understand was hopeless and that becoming a stronger writer was a boring goal; it would be a better use of my talents to jump up a meta level and explain _how_ people were failing to engage. That is, insofar as I expected arguing to work, I had a model of "the rationalists" that kept making bad predictions. What was going on there? Something interesting might happen if I tried to explain _that_.
(I guess I'm only now, after spending an additional four years exhausting every possible line of argument, taking Ben's advice on this by finishing and publishing this memoir. Sorry, Ben—and thanks.)
-----
-One thing I regret about my behavior during this period was the extent to which I was emotionally dependent on my posse, and in some ways particularly Michael, for validation. I remembered Michael as a high-status community elder back in the _Overcoming Bias_ era (to the extent that there was a "community" in those early days).[^overcoming-bias] I had been skeptical of him, then: the guy makes a lot of stridently "out there" assertions by the standards of ordinary social reality, in a way that makes you assume he must be speaking metaphorically. (He always insists that he's being completely literal.) But he had social proof as the President of the Singularity Institute—the "people person" of our world-saving effort, to complement Yudkowsky's anti-social mad scientist personality—which inclined me to take his "crazy"-sounding assertions more charitably than I otherwise would have.
+One thing I regret about my behavior during this period was the extent to which I was emotionally dependent on my posse, and in some ways particularly Michael, for validation. I remembered Michael as a high-status community elder back in the _Overcoming Bias_ era (to the extent that there was a "community" in those early days).[^overcoming-bias] I had been skeptical of him: the guy makes a lot of stridently "out there" assertions, in a way that makes you assume he must be speaking metaphorically. (He always insists he's being completely literal.) But he had social proof as the President of the Singularity Institute—the "people person" of our world-saving effort, to complement Yudkowsky's antisocial mad scientist personality—which inclined me to take his assertions more charitably than I otherwise would have.
[^overcoming-bias]: Yudkowsky's Sequences (except the [last](https://www.lesswrong.com/s/pvim9PZJ6qHRTMqD3)) had originally been published on [_Overcoming Bias_](https://www.overcomingbias.com/) before the creation of _Less Wrong_ in early 2009.
-Now, the memory of that social proof was a lifeline. Dear reader, if you've never been in the position of disagreeing with the entire weight of Society's educated opinion, _including_ your idiosyncratic subculture that tells itself a story about being smarter and more open-minded than the surrounding Society—well, it's stressful. [There was a comment on the /r/slatestarcodex subreddit around this time](https://old.reddit.com/r/slatestarcodex/comments/anvwr8/experts_in_any_given_field_how_would_you_say_the/eg1ga9a/) that cited Yudkowsky, Alexander, Piper, Ozy Frantz, and Rob Bensinger as leaders of the "rationalist" community—just an arbitrary Reddit comment of no significance whatsoever—but it was salient indicator of the _Zeitgeist_ to me, because _[every](https://twitter.com/ESYudkowsky/status/1067183500216811521) [single](https://slatestarcodex.com/2014/11/21/the-categories-were-made-for-man-not-man-for-the-categories/) [one](https://thingofthings.wordpress.com/2018/06/18/man-should-allocate-some-more-categories/) of [those](https://theunitofcaring.tumblr.com/post/171986501376/your-post-on-definition-of-gender-and-woman-and) [people](/images/bensinger-doesnt_unambiguously_refer_to_the_thing.png)_ had tried to get away with some variant on the "word usage is subjective, therefore you have no grounds to object to the claim that trans women are women" _mind game_.
+Now, the memory of that social proof was a lifeline. Dear reader, if you've never been in the position of disagreeing with the entire weight of Society's educated opinion, including your idiosyncratic subculture that tells itself a story about being smarter and more open-minded than the surrounding Society—well, it's stressful. [There was a comment on the /r/slatestarcodex subreddit around this time](https://old.reddit.com/r/slatestarcodex/comments/anvwr8/experts_in_any_given_field_how_would_you_say_the/eg1ga9a/) that cited Yudkowsky, Alexander, Piper, Ozy Brennan, and Rob Bensinger as leaders of the "rationalist" community. Just an arbitrary Reddit comment of no significance whatsoever—but it was a salient indicator of the _zeitgeist_ to me, because _[every](https://twitter.com/ESYudkowsky/status/1067183500216811521) [single](https://slatestarcodex.com/2014/11/21/the-categories-were-made-for-man-not-man-for-the-categories/) [one](https://thingofthings.wordpress.com/2018/06/18/man-should-allocate-some-more-categories/) of [those](https://theunitofcaring.tumblr.com/post/171986501376/your-post-on-definition-of-gender-and-woman-and) [people](/images/bensinger-doesnt_unambiguously_refer_to_the_thing.png)_ had tried to get away with some variant on the "word usage is subjective, therefore you have no grounds to object to the claim that trans women are women" mind game.
-In the face of that juggernaut of received opinion, I was already feeling pretty gaslighted. ("We ... we had a whole Sequence about this. Didn't we? And, and ... [_you_ were there, and _you_ were there](https://tvtropes.org/pmwiki/pmwiki.php/Main/ButYouWereThereAndYouAndYou) ... It—really happened, right? I didn't just imagine it? The [hyperlinks](https://www.lesswrong.com/posts/FaJaCgqBKphrDzDSj/37-ways-that-words-can-be-wrong) [still](https://www.lesswrong.com/posts/d5NyJ2Lf6N22AD9PB/where-to-draw-the-boundary) [work](https://www.lesswrong.com/posts/yLcuygFfMfrfK8KjF/mutual-information-and-density-in-thingspace) ...") I don't know how I would have held up intact if I were just facing it alone; it's hard to imagine what I would have done in that case. I _definitely_ wouldn't have had the impudence to pester Alexander and Yudkowsky the way I did—especially Yudkowsky—if it was just me alone against everyone else.
+In the face of that juggernaut of received opinion, I was already feeling pretty gaslighted. ("We ... we had a whole Sequence about this. And [_you_ were there, and _you_ were there](https://tvtropes.org/pmwiki/pmwiki.php/Main/ButYouWereThereAndYouAndYou) ... It—really happened, right? The [hyperlinks](https://www.lesswrong.com/posts/FaJaCgqBKphrDzDSj/37-ways-that-words-can-be-wrong) [still](https://www.lesswrong.com/posts/d5NyJ2Lf6N22AD9PB/where-to-draw-the-boundary) [work](https://www.lesswrong.com/posts/yLcuygFfMfrfK8KjF/mutual-information-and-density-in-thingspace) ...") I don't know how I would have held up intact if I were facing it alone. I _definitely_ wouldn't have had the impudence to pester Alexander and Yudkowsky—especially Yudkowsky—if it was just me against everyone else.
-But _Michael thought I was in the right_—not just intellectually on the philosophy issue, but morally in the right to be _prosecuting_ the philosophy issue with our leaders, and not accepting stonewalling as an answer. That social proof gave me a lot of bravery that I otherwise wouldn't have been able to muster up—even though it would have been better if I could have internalized that my dependence on him was self-undermining, insofar Michael himself said that the thing that made me valuable was my ability to think independently.
+But _Michael thought I was in the right_—not just intellectually, but _morally_ in the right to be prosecuting the philosophy issue with our leaders. That social proof gave me a lot of bravery that I otherwise wouldn't have been able to muster up—even though it would have been better if I could have internalized that my dependence on him was self-undermining, insofar as Michael himself said that what made me valuable was my ability to think independently.
-The social proof was probably more effective in my own head, than it was with anyone we were arguing with. _I remembered_ Michael as a high-status community elder back in the _Overcoming Bias_ era, but that had been a long time ago. (Luke Muelhauser had taken over leadership of the Singularity Institute in 2011; and apparently, some sort of rift between Michael and Eliezer had widened in recent years, the details of which had never been explained to me.) Michael's status in "the community" of 2019 was much more mixed. He was intensely critical of the rise of the Effective Altruism movement, which he saw as using bogus claims about how to do the most good to prey on the energies of the smartest and most scrupulous people around. (I remember being at a party in 2015 and asking Michael what else I should spend my San Francisco software engineer money on, if not the EA charities I was considering. I was surprised when his answer was, "You.")
+The social proof was probably more effective in my head than with anyone we were arguing with. I remembered Michael as a high-status community elder back in the _Overcoming Bias_ era, but that had been a long time ago. (Luke Muehlhauser had taken over leadership of the Singularity Institute in 2011, and apparently some sort of rift between Michael and Eliezer had widened in recent years.) Michael's status in "the community" of 2019 was much more mixed. He was intensely critical of the rise of the Effective Altruism movement, which he saw as using bogus claims about how to do the most good to prey on the smartest and most scrupulous people around. (I remember being at a party in 2015 and asking Michael what else I should spend my San Francisco software engineer money on, if not the EA charities I was considering. I was surprised when his answer was, "You.")
-Another blow to Michael's "community" reputation was dealt on 27 February 2019, when Anna [published a comment badmouthing Michael and suggesting that talking to him was harmful](https://www.lesswrong.com/posts/u8GMcpEN9Z6aQiCvp/rule-thinkers-in-not-out?commentId=JLpyLwR2afav2xsyD), which I found pretty disappointing—more so as I began to realize the implications.
+Another blow to Michael's reputation was dealt on 27 February 2019, when Anna [published a comment badmouthing Michael and suggesting that talking to him was harmful](https://www.lesswrong.com/posts/u8GMcpEN9Z6aQiCvp/rule-thinkers-in-not-out?commentId=JLpyLwR2afav2xsyD), which I found disappointing—more so as I began to realize the implications.
-I agreed with her point about how "ridicule of obviously-fallacious reasoning plays an important role in discerning which thinkers can (or can't) help" fill the role of vetting and [common knowledge](https://www.lesswrong.com/posts/9QxnfMYccz9QRgZ5z/the-costly-coordination-mechanism-of-common-knowledge) creation. That's exactly why I was so heartbroken about the "categories are arbitrary, therefore trans women are women" thing, which deserved to be _laughed out the room_. Why was she trying to ostracize the guy who was one of the very few to back me up on this incredibly obvious thing!? The reasons to discredit Michael given in the comment seemed incredibly weak. (He ... flatters people? He ... _didn't_ tell people to abandon their careers? What?) And the evidence against Michael she offered in private didn't seem much more compelling (_e.g._, at a CfAR event, he had been insistent on continuing to talk to someone who Anna thought looked near psychosis and needed a break).
+I agreed with her point about how "ridicule of obviously-fallacious reasoning plays an important role in discerning which thinkers can (or can't) help" fill the role of vetting and [common knowledge](https://www.lesswrong.com/posts/9QxnfMYccz9QRgZ5z/the-costly-coordination-mechanism-of-common-knowledge) creation. That's why I was so heartbroken about the "categories are arbitrary, therefore trans women are women" thing, which deserved to be laughed out of the room. Why was she trying to ostracize the guy who was one of the very few to back me up on this incredibly obvious thing!? The reasons given to discredit Michael seemed weak. (He ... flatters people? He ... _didn't_ tell people to abandon their careers? What?) And the evidence against Michael she offered in private didn't seem much more compelling (_e.g._, at a CfAR event, he had been insistent on continuing to talk to someone who Anna thought looked near psychosis and needed a break).
-It made sense for Anna to not like Michael anymore because of his personal conduct, or because of his opposition to EA. (Expecting all of my friends to be friends with _each other_ would be [Geek Social Fallacy #4](http://www.plausiblydeniable.com/opinion/gsf.html).) If she didn't want to invite him to CfAR stuff, fine; that was her business not to invite him. But what did she gain from _escalating_ to publicly denouncing him as someone whose "lies/manipulations can sometimes disrupt [people's] thinking for long and costly periods of time"?! She said that she was trying to undo the effects of her previous endorsements of him, and that the comment seemed like it ought to be okay by Michael's standards (which didn't include an expectation that people should collude to protect each other's reputations).
+It made sense for Anna to not like Michael anymore because of his personal conduct, or because of his opposition to EA. (Expecting all of my friends to be friends with each other would be [Geek Social Fallacy #4](http://www.plausiblydeniable.com/opinion/gsf.html).) If she didn't want to invite him to CfAR stuff, fine. But what did she gain from publicly denouncing him as someone whose "lies/manipulations can sometimes disrupt [people's] thinking for long and costly periods of time"?! She said she was trying to undo the effects of her previous endorsements of him, and that the comment seemed like it ought to be okay by Michael's standards (which didn't include an expectation that people should collude to protect each other's reputations).
-----