+
+In the English language as it is spoken today, third-person singular gender pronouns _do_ have truth conditions. If a stranger crossing your path is rude to you, you'll say, "What's _her_ problem?" or "What's _his_ problem?" depending on your perception of their secondary sex characteristics.
+
+(1) If _x_ is a noun, you can't define _x_ any way you want without negative side-effects on your cognition (for at least 37 different reasons).
+(2) _Woman_ is a noun.
+[From (1), (2), and _modus ponens_] Therefore, you can't define the word _woman_ any way you want without negative side-effects on your cognition.
+
+It's _unhealthy_ to spend this many hours stuck in a loop of, "We had an entire Sequence about this! You lying motherfuckers!"
+
+What are you looking at me like that for? [It's not a cult!](https://www.lesswrong.com/posts/gBma88LH3CLQsqyfS/cultish-countercultishness)
+
+At least, it [_wasn't_](https://www.lesswrong.com/posts/yEjaj7PWacno5EvWa/every-cause-wants-to-be-a-cult) a cult.
+
+(A _secondary_ reason for explaining is that it could _possibly_ function as a useful warning to the next guy to end up in a similar situation: trusting the branded systematically-correct-reasoning community to actually be interested in doing systematically correct reasoning, and incurring a lot of wasted effort and pain [making an extraordinary effort](https://www.lesswrong.com/posts/GuEsfTpSDSbXFiseH/make-an-extraordinary-effort) to [try to](https://www.lesswrong.com/posts/XqvnWFtRD2keJdwjX/the-useful-idea-of-truth) correct the situation. But I don't know how common that is.)
+
+https://thezvi.wordpress.com/2017/08/12/what-is-rationalist-berkleys-community-culture/
+https://srconstantin.wordpress.com/2017/08/08/the-craft-is-not-the-community/
+
+I feel betrayed, but that doesn't
+
+"chromosomes" isn't as dumb as it sounds—it's the "root" of the causal net of all other sex differences
+
+Am I suffering from a "hostile media" effect?
+
+Choosing a gerrymandered or thin-subspace category isn't that dangerous in itself; it's the dark-side epistemology that kills everyone
+
+deconfusion https://intelligence.org/2018/11/22/2018-update-our-new-research-directions/
+
+I want the thing Ozy is doing here to be _socially unacceptable_; I want it to be _laughed out of the room_
+https://thingofthings.wordpress.com/2019/04/10/in-my-culture/
+https://www.lesswrong.com/posts/zGJw9PGhu9e8Z6BEX/fake-norms-or-truth-vs-truth
+
+https://srconstantin.wordpress.com/2018/12/24/contrite-strategies-and-the-need-for-standards/
+
+The elephant in my brain has been using "I'm going to be in incredible emotional pain until I write the story down" as a precommitment device
+
+If an Outer Party member in the world of George Orwell's 1984 says, "Oceania has always been at war with Eastasia," even though they clearly remember events from last week, when Oceania was at war with Eurasia instead [...] even if it's not really their fault
+
+> but not worth starting over over
+
+I mean, this is the part where I do a very un-Effective-Altruist thing and stop talking as if I do anything for the good of the lightcone. (Maybe see Ben on "Against Responsibility" and "The Humility Argument for Honesty".) I internalized a particular vision [...] of what conduct is appropriate to a "rationalist"; I didn't see that standard upheld with respect to my Something to Protect; so I am doing a halt–melt–catch-fire on "the community." It's worth starting over over _for me_. If my actions (implausibly) represent a PR risk to someone else's Singularity strategy, then they're welcome to try to persuade or negotiate with me.