-------
-
-[TODO outline of the section analyzing the political game here—]
-
-[If Yudkowsky's comments don't pass the laugh test, what's _actually_ going on here? When smart people act stupid, there's usually a reason—it's optimized stupidity, stupidity that achieves a goal]
-
-[We get a clue in the form of a DISCLAIMER comment https://www.facebook.com/yudkowsky/posts/10159421750419228?comment_id=10159421833274228 ]
-
-> It unfortunately occurs to me that I must, in cases like these, disclaim that—to the extent there existed sensible opposing arguments against what I have just said—people might be reluctant to speak them in public, in the present social atmosphere. That is, in the logical counterfactual universe where I knew of very strong arguments against freedom of pronouns, I would have probably stayed silent on the issue, as would many other high-profile community members, and only Zack M. Davis would have said anything where you could hear it.
->
-> This is a filter affecting your evidence; it has not to my own knowledge filtered out a giant valid counterargument that invalidates this whole post. I would have kept silent in that case, for to speak then would have been dishonest.
->
-> Personally, I'm used to operating without the cognitive support of a civilization in controversial domains, and have some confidence in my own ability to independently invent everything important that would be on the other side of the filter and check it myself before speaking. So you know, from having read this, that I checked all the speakable and unspeakable arguments I had thought of, and concluded that this speakable argument would be good on net to publish, as would not be the case if I knew of a stronger but unspeakable counterargument in favor of Gendered Pronouns For Everyone and Asking To Leave The System Is Lying.
->
-> But the existence of a wide social filter like that should be kept in mind; to whatever quantitative extent you don't trust your ability plus my ability to think of valid counterarguments that might exist, as a Bayesian you should proportionally update in the direction of the unknown arguments you speculate might have been filtered out.
-
-[On the one hand, kudos to Yudkowsky for pointing out the filter. On the other hand, this claim to have "independently invent[ed] everything important that would be on the other side of the filter" is _laughable_: my point about the appeal of the self-ID pronoun convention rests on the existing meanings of gendered pronouns, such that the hypocrisy of playing dumb about there being existing meanings while defending the self-ID convention is really obvious. If you're persuaded by anything I've said in this post, you should quantitatively downgrade your trust in Yudkowsky]
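-
-[A sketch of the math behind "proportionally update," in my notation, not Yudkowsky's: under a filter, you condition not on an argument _A_ existing, but on _A_ being _spoken_. The posterior odds on a hypothesis _H_ become
-
-$$\frac{P(H \mid A \text{ spoken})}{P(\neg H \mid A \text{ spoken})} = \frac{P(A \mid H)\,P(\text{spoken} \mid A, H)}{P(A \mid \neg H)\,P(\text{spoken} \mid A, \neg H)} \cdot \frac{P(H)}{P(\neg H)}$$
-
-If arguments against _H_ are systematically less likely to be spoken—$P(\text{spoken} \mid A, \neg H)$ is small—then hearing only pro-_H_ arguments is weaker evidence than it looks, and the silence itself carries information.]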
-
-> A clearer example of a forbidden counterargument would be something like e.g. imagine if there was a pair of experimental studies somehow proving that (a) everybody claiming to experience gender dysphoria was lying, and that (b) they then got more favorable treatment from the rest of society. We wouldn't be able to talk about that. No such study exists to the best of my own knowledge, and in this case we might well hear about it from the other side to whom this is the exact opposite of unspeakable; but that would be an example.
-
-[again, [Yudkowsky's pathological fixation on avoiding "lying"](https://www.lesswrong.com/posts/MN4NRkMw7ggt9587K/firming-up-not-lying-around-its-edge-cases-is-less-broadly) is a distraction from anything that anyone actually cares about. Certainly, I don't believe (a) as stated. But something substantively _similar_ to (a), that self-reports are biased by social desirability and we can't take "gender identity" literally, seems _overwhelmingly_ likely to be true, and I have a study to support it— /papers/blanchard-clemmensen-steiner-social_desirability_response_set_and_systematic_distortion.pdf . This should not seem implausible to someone who co-blogged with Robin Hanson for three years! As for (b), I don't think they get more favorable treatment from the rest of Society uniformly, but there absolutely _is_ an ideological subculture that encourages transition.
-
-Can we talk about _that_? (Which is the stuff I actually care about, not pronouns)
-
-https://www.facebook.com/yudkowsky/posts/10159421750419228?comment_id=10159421986539228&reply_comment_id=10159424960909228
-
-> now that we have a truth-bearing sentence, we can admit of the possibility of using our human superpower of language to *debate* whether this sentence is indeed true or false [...] Trying to pack all of that into the pronouns you'd have to use in step 1 is the wrong place to pack it.
-
-It's a hollow stalling tactic to say "Don't pack this into pronouns; let's use nouns to talk about the real issues," when you _motherfuckers_ have _no intention whatsoever_ of actually using the human superpower of language to debate the real issues!
-
-I was going to write a different multi-thousand-word blog post, but I'm sick of Eliezer Yudkowsky fucking with me for years.
-
-All of my heretical opinions are just _his_ opinions from the 'aughts; did he actually change his mind, or does he just think he can get away with pretending he didn't say it?
-
-nearest unblocked strategy
-
-4 levels of intellectual conversation https://rationalconspiracy.com/2017/01/03/four-layers-of-intellectual-conversation/
-
-If we can't talk about that, then your rationality community is a _fraud_. If there's anything at all to your rationality stuff https://www.lesswrong.com/posts/LqjKP255fPRY7aMzw/practical-advice-backed-by-deep-theories it should be _useful_ for high-stakes practical decisions that your community members need to make, like _whether or not to cut my dick off_. If we can't talk about the information I need to make an informed decision about that, then your rationalists are worse than useless. (Worse, because false claims to authority are more misleading than dead air.) You _motherfuckers_ need to rebrand, or disband, or be destroyed.
-
-I know none of this matters (if any professional alignment researchers are wasting time reading this instead of figuring out how to save the world, get back to work!!), but one would have thought that the _general_ skills of correct argument would matter for saving the world.
-A rationality community that can't think about _practical_ issues that affect our day-to-day lives, but can get existential-risk stuff right, is like asking for self-driving car software that can drive red cars but not blue cars.
-It's a _problem_ if public intellectuals in the current year need to pretend to be dumber than seven-year-olds in 2016.
-
-If you don't actually believe in the [common interest of many causes](https://www.lesswrong.com/posts/4PPE6D635iBcGPGRy/rationality-common-interest-of-many-causes) anymore, you could _say_ that—rebrand your thing as a more narrowly-scoped existential-risk-reduction cult, rather than falsely claiming to be about modeling-and-optimizing the world
-
-"We might well hear about it from the other side" is an interesting phrasing—almost admitting that you're a wholly owned subsidary of the Blue Tribe]
-
-> it is sometimes personally prudent and not community-harmful to post your agreement with Stalin about things you actually agree with Stalin about, in ways that exhibit generally rationalist principles, especially because people do *know* they're living in a half-Stalinist environment
-
-> I don't see what the alternative is besides getting shot, or utter silence about everything Stalin has expressed an opinion on including "2 + 2 = 4" because if that logically counterfactually were wrong you would not be able to express an opposing opinion.
-
-[Summarize "A Rational Argument" https://www.lesswrong.com/posts/9f5EXt8KNNxTAihtZ/a-rational-argument
-You could imagine the campaign manager saying the same thing—"I don't see what the alternative is".]