-------

[TODO: outline this section]

[If Yudkowsky's comments don't pass the laugh test, what's _actually_ going on here? When smart people act stupid, there's usually a reason: it's optimized stupidity.]

[We get a clue in the form of a DISCLAIMER comment]
> A clearer example of a forbidden counterargument would be something like e.g. imagine if there was a pair of experimental studies somehow proving that (a) everybody claiming to experience gender dysphoria was lying, and that (b) they then got more favorable treatment from the rest of society. We wouldn't be able to talk about that. No such study exists to the best of my own knowledge, and in this case we might well hear about it from the other side to whom this is the exact opposite of unspeakable; but that would be an example.

Again, [Yudkowsky's pathological fixation on avoiding "lying"](https://www.lesswrong.com/posts/MN4NRkMw7ggt9587K/firming-up-not-lying-around-its-edge-cases-is-less-broadly)

"the other side"—interesting phrasing

> it is sometimes personally prudent and not community-harmful to post your agreement with Stalin about things you actually agree with Stalin about, in ways that exhibit generally rationalist principles, especially because people do *know* they're living in a half-Stalinist environment

Trying to achieve a political objective "in ways that exhibit generally rationalist principles"

"A Rational Argument"
"Super-Proton Things"

[TODO:
 * Stalin and "A Rational Argument"
 * "If there were unspeakable arguments against, we couldn't talk about them"—okay, then you and your rationalists are frauds
 * I know none of this matters (if any professional alignment researchers are wasting time reading this instead of figuring out how to save the world, get back to work!!), but one would have thought that the _general_ skills of correct argument would matter for saving the world
 * a rationality community that can't think about _practical_ issues that affect our day-to-day lives, but can get existential-risk stuff right, is like asking for self-driving car software that can drive red cars but not blue cars
 * it's a _problem_ if public intellectuals in the current year need to pretend to be dumber than seven-year-olds in 2016
 * nearest unblocked strategy and preview of "Hill of Validity in Defense"
 * it's hollow to say we should talk about the real issues with nouns, not grammar, if you have no intention whatsoever of talking about the real issues!—the reason TERFs go to the dark side and stop respecting pronouns is that they're sick of being fucked with
 * I was going to write a different multi-thousand-word blog post, but I'm sick of Eliezer Yudkowsky fucking with me for years
]