So, a striking thing about my series of increasingly frustrating private conversations and subsequent public Facebook meltdown (the stress from which soon landed me in psychiatric jail, but that's [another](/2017/Mar/fresh-princess/) [story](/2017/Jun/memoirs-of-my-recent-madness-part-i-the-unanswerable-words/)) was the tendency for some threads of conversation to get _derailed_ on some variation of, "Well, the word _woman_ doesn't necessarily mean that," often with a link to ["The Categories Were Made for Man, Not Man for the Categories"](https://slatestarcodex.com/2014/11/21/the-categories-were-made-for-man-not-man-for-the-categories/) by Scott Alexander, the _second_ most prominent writer in our robot cult.
So, this _really_ wasn't what I was trying to talk about; _I_ thought I was trying to talk about autogynephilia as an _empirical_ theory in psychology, the truth or falsity of which obviously cannot be altered by changing the meanings of words.
Psychology is a complicated empirical science: no matter how "obvious" I might think something is, I have to admit that I could be wrong—not just as a formal profession of modesty, but _actually_ wrong in the real world. But this "I can define the word _woman_ any way I want" mind game? _That_ part was _absolutely_ clear-cut. That part of the argument, I knew I could win. [We had a whole Sequence about this](https://www.lesswrong.com/posts/FaJaCgqBKphrDzDSj/37-ways-that-words-can-be-wrong) back in 'aught-eight, in which Yudkowsky pounded home this _exact_ point _over and over and over again_, that word and category definitions are _not_ arbitrary, because there are criteria that make some definitions _perform better_ than others as "cognitive technology"—
> ["One may even consider the act of defining a word as a promise to \[the\] effect [...] \[that the definition\] will somehow help you make inferences / shorten your messages."](https://www.lesswrong.com/posts/yLcuygFfMfrfK8KjF/mutual-information-and-density-in-thingspace)
So, because I trusted people in my robot cult to be dealing in good faith rather than fucking with me because of their political incentives, I took the bait. I ended up spending three years of my life re-explaining the relevant philosophy-of-language issues in exhaustive, _exhaustive_ detail, at first in the object-level context of gender on this blog (in ["The Categories Were Made for Man to Make Predictions"](/2018/Feb/the-categories-were-made-for-man-to-make-predictions/), and the ["Reply on Adult Human Females"](/2018/Apr/reply-to-the-unit-of-caring-on-adult-human-females/)), and later (after [Eliezer Yudkowsky joined in the mind games on Twitter in November 2018](https://twitter.com/ESYudkowsky/status/1067183500216811521), and I _flipped the fuck out_) [strictly abstractly](https://www.lesswrong.com/posts/esRZaPXSHgWzyB2NL/where-to-draw-the-boundaries) [on](https://www.lesswrong.com/posts/edEXi4SpkXfvaX42j/schelling-categories-and-simple-membership-tests) [the](https://www.lesswrong.com/posts/fmA2GJwZzYtkrAKYJ/algorithms-of-deception) [robot](https://www.lesswrong.com/posts/4hLcbXaqudM9wSeor/philosophy-in-the-darkest-timeline-basics-of-the-evolution)-[cult](https://www.lesswrong.com/posts/YptSN8riyXJjJ8Qp8/maybe-lying-can-t-exist) [blog](https://www.lesswrong.com/posts/onwgTH6n8wxRSo2BJ/unnatural-categories-are-optimized-for-deception).
[TODO:
_Everyone else shot first_.
"You can't define a word any way you want" and "You can" are both true in different senses, but if you opportunistically choose which one to emphasize
_I need the right answer in order to decide whether or not to cut my dick off_—if I were dumb enough to believe Yudkowsky's insinuation that pronouns don't have truth conditions, I might have made a worse decision
If rationality is useful for anything, it should be useful for practical life decisions like this
https://www.newsweek.com/we-need-balance-when-it-comes-gender-dysphoric-kids-i-would-know-opinion-1567277
https://trickymothernature.substack.com/p/one-long-cum
"Horrified Mother Gets Front Row Seat to Sex & Gender Indoctrination Strategy Meeting"
https://archive.li/pmuhQ

https://gcacademianetwork.weebly.com/