Of course, such speech restrictions aren't necessarily "irrational", depending on your goals! If you just don't think "free speech" should go that far—if you _want_ to suppress atheism or gender-critical feminism—speech codes are a perfectly fine way to do it! And _to their credit_, I think most theocrats and trans advocates are intellectually honest about the fact that this is what they're doing: atheists or transphobes are _bad people_ (the argument goes) and we want to make it harder for them to spread their lies or their hate.
In contrast, by claiming to be "not taking a stand for or against any Twitter policies" while accusing people who oppose the policy of being ontologically confused, Yudkowsky was being less honest than the theocrat or the activist: of _course_ the point of speech codes is to suppress ideas! Given that the distinction between facts and policies is so obviously _not anyone's crux_—the smarter people in the "anti-trans" faction already know that, and the dumber people in the faction wouldn't change their alignment if they were taught—it's hard to see what the _point_ of harping on the fact/policy distinction would be, _except_ to be seen as implicitly taking a stand for the "pro-trans" faction, while [putting on a show of being politically "neutral."](https://www.lesswrong.com/posts/jeyvzALDbjdjjv5RW/pretending-to-be-wise)
It makes sense that Yudkowsky might perceive political constraints on what he might want to say in public—especially when you look at [what happened to the _other_ Harry Potter author](https://en.wikipedia.org/wiki/Politics_of_J._K._Rowling#Transgender_people). (Despite my misgivings—and the fact that at this point it's more of a genre convention or a running joke, rather than any attempt at all to conceal my identity—this blog _is_ still published under a pseudonym; it would be hypocritical of me to accuse someone of cowardice about what they're willing to attach their real name to.)
Especially compared to normal Berkeley, I had to give the Berkeley "rationalists" credit for being _very good_ at free speech norms. (I'm not sure I would be saying this in the world where Scott Alexander didn't have a [traumatizing experience with social justice in college](https://slatestarcodex.com/2014/01/12/a-response-to-apophemi-on-triggers/), causing him to dump a ton of anti-social-justice, pro-argumentative-charity antibodies in the "rationalist" collective "water supply" after he became our subculture's premier writer. But it was true in _our_ world.) I didn't want to fall into the [bravery-debate](http://slatestarcodex.com/2013/05/18/against-bravery-debates/) trap of, "Look at me, I'm so heroically persecuted, therefore I'm right (therefore you should have sex with me)". I wasn't angry at the "rationalists" for being silenced or shouted down (which I wasn't); I was angry at them for _making bad arguments_ and systematically refusing to engage with the obvious counterarguments when they're made.
As an illustrative example, in an argument on Discord in January 2019, I said, "I need the phrase 'actual women' in my expressive vocabulary to talk about the phenomenon where, if transition technology were to improve, then the people we call 'trans women' would want to make use of that technology; I need language that _asymmetrically_ distinguishes between the original thing that already exists without having to try, and the artificial thing that's trying to imitate it to the limits of available technology".
Kelsey Piper replied, "[T]he people getting surgery to have bodies that do 'women' more the way they want are mostly cis women [...] I don't think 'people who'd get surgery to have the ideal female body' cuts anything at the joints."
> I am reminded of someone who I talked with about Zack writing to you and Scott to request that you clarify the category boundary thing. This person had an emotional reaction described as a sense that "Zack should have known that wouldn't work" (because of the politics involved, not because Zack wasn't right). Those who are savvy in high-corruption equilibria maintain the delusion that high corruption is common knowledge, to justify expropriating those who naively don't play along, by narratizing them as already knowing and therefore intentionally attacking people, rather than being lied to and confused.
]
[TODO small section: concern about bad faith nitpicking—
One reason someone might be reluctant to correct mistakes when they're pointed out is the fear that such a policy could be abused by motivated nitpickers. It would be pretty annoying to be obligated to churn out an endless stream of trivial corrections by someone motivated to comb through your entire portfolio and point out every little thing you did imperfectly, ever.
But, well, I thought I had made a pretty convincing case that a lot of people are making a correctable and important rationality mistake, such that the cost of a correction (about the philosophy of language specifically, not any possible implications for gender politics) would actually be justified here. If someone had put _this much_ effort into pointing out an error _I_ had made four months or five years ago and making careful arguments for why it was important to get the right answer, I think I _would_ put some serious thought into it.
]
[TODO: We lost?! How could we lose??!!?!? And, post-war concessions ...
curation hopes ... 22 Jun: I'm expressing a little bit of bitterness that a mole rats post got curated https://www.lesswrong.com/posts/fDKZZtTMTcGqvHnXd/naked-mole-rats-a-case-study-in-biological-weirdness
"Univariate fallacy" also a concession
https://slatestarcodex.com/2019/07/04/some-clarifications-on-rationalist-blogging/
"Yes Requires the Possibility of No" 19 May https://www.lesswrong.com/posts/WwTPSkNwC89g3Afnd/comment-section-from-05-19-2019
scuffle on LessWrong FAQ 31 May https://www.lesswrong.com/posts/MqrzczdGhQCRePgqN/feedback-requested-draft-of-a-new-about-welcome-page-for#iqEEme6M2JmZEXYAk
]
Since arguing at the object level had failed (["... To Make Predictions"](/2018/Feb/the-categories-were-made-for-man-to-make-predictions/), ["Reply on Adult Human Females"](/2018/Apr/reply-to-the-unit-of-caring-on-adult-human-females/)), and arguing at the strictly meta level had failed (["... Boundaries?"](https://www.lesswrong.com/posts/esRZaPXSHgWzyB2NL/where-to-draw-the-boundaries)), the obvious thing to do next was to jump up to the meta-meta level and tell the story about why the "rationalists" were Dead To Me now, that [my price for joining](https://www.lesswrong.com/posts/Q8evewZW5SeidLdbA/your-price-for-joining) was not being met. (Just like Ben had suggested in December and in April.)
I found it difficult to make progress on. I felt—constrained. I didn't know how to tell the story without (as I perceived it) escalating personal conflicts or leaking info from private conversations.
Unable to make progress on the grief-memoir, I instead turned to a combination of writing bitter and insulting comments whenever I saw someone praise "the rationalists" collectively, and—more philosophy-of-language blogging!
[TODO 2019 activities—
"Schelling Categories" Aug 2019, "Maybe Lying Doesn't Exist" Oct 2019, "Algorithms of Deception!" Oct 2019, "Heads I Win" Sep 2019, "Firming Up ..." Dec 2019
"epistemic defense" meeting
]
[TODO section on factional conflict:
mutualist pattern where Michael by himself isn't very useful for scholarship (he just says a lot of crazy-sounding things and refuses to explain them), but people like Sarah and me can write intelligible things that secretly benefited from much less legible conversations with Michael.
]
[TODO: Yudkowsky throwing NRx under the bus; tragedy of recursive silencing
15 Sep Glen Weyl apology
]
[TODO: 5 Jan memoir as nuke]
There's another very important part of the story that would fit around here chronologically, but unfortunately, it's not my story to tell.

[TODO: theorizing about on the margin]
[TODO: "out of patience" email]
[TODO: Sep 2020 categories clarification from EY—victory?!]
[TODO: if he's reading this, win back respect— reply, motherfucker]
[TODO: the Death With Dignity era]
[TODO: regrets]
noncontiguous on deck—
X reluctance to write a memoir during 2019
_ let's recap / being put in a box
_ if he's reading this
_ tie off reply to Xu
_ "duly appreciated"
_ bridge to "Challenges"
_ Christmas party 2019 and state of Church leadership
_ Anna vs. Michael factional conflict
with internet available—
+_ "praise Ba'al" language from "Rationalist Blogging"
_ link simulacrum posts: Zvi (he has a category), Elizabeth, at least one more from Ben
_ more examples of snarky comments about "the rationalists"
_ Discord logs before Austin retreat
_ screenshot Rob's Facebook comment which I link
_ 13th century word meanings
"Maybe Lying Doesn't Exist" Oct 2019
"Algorithms of Deception!" Oct 2019
"Firming Up ..." Dec 2019
"Darkest Timeline" June 2020
"Maybe Lying Can't Exist?!" Aug 2020
"Unnatural Categories" Jan 2021
https://www.glowfic.com/replies/1853001#reply-1853001
> Another reason people go to Hell? Malediction! An Asmodean priest was using that spell on children too! Pharasma apparently doesn't give a shit! At best, it might be a negative weight in Her utility function that She traded to the ancient gods of Evil for something else that She wanted. A tradeable medium-sized negative utility is not the same as Her _really giving a shit_.

I furthermore claim that the following disjunction is true:

> Either the quoted excerpt is a blatant lie on Scott's part because there are rules of rationality governing conceptual boundaries and Scott absolutely knows it, or
> You have no grounds to criticize me for calling it a blatant lie, because there's no rule of rationality that says I shouldn't draw the category boundaries of "blatant lie" that way.

there needs to be _some_ way for _someone_ to invest a _finite_ amount of effort to _correct the mistake_