+> Hate-warp like this is bad for truth-perception; my understanding of the situation is that it's harm done to you by the group you say you left. I would read this as being a noninnocent error of that group; that they couldn't get what they wanted from people who still had friends outside their own small microculture, and then noninnocently decided that this outer culture was bad and people needed to be pried loose from it. They tried telling some people that this outer culture was gaslighting them and maliciously lying to them and had to be understood in wholly adversarial terms to break free of the gaslighting; that worked on somebody, and made a new friend for them; so their brain noninnocently learned that it ought to use arguments like that again, so they must be true.
+> This is a sort of thing I super did not do because I _understood_ it as a failure mode and Laid My Go Stones Against Ever Actually Being A Cult; I armed people with weapons against it, or tried to, but I was optimistic in my hopes about how much could actually be taught.
+> **zackmdavis** — 11/29/2022 11:20 PM
+> Without particularly defending Vassar _et al._ or my bad literary criticism (sorry), _modeling the adversarial component of non-innocent errors_ (as contrasted to "had to be understood in wholly adversarial terms") seems very important. (Maybe lying is "worse" than rationalizing, but if you can't hold people culpable for rationalization, you end up with a world that's bad for broadly the same reasons that a world full of liars is bad: we can't steer the world to good states if everyone's map is full of falsehoods that locally benefited someone.)
+> **Eliezer** — 11/29/2022 11:22 PM
+> Rationalization sure is a huge thing! That's why I considered it important to discourse upon the science of it, as it was then known; and to warn people that there were more complicated tangles than that, which no simple experiment had shown yet.
+> **zackmdavis** — 11/29/2022 11:22 PM
+> yeah
+> **Eliezer** — 11/29/2022 11:23 PM
+> It remains something that mortals do, and if you cut off anybody who's ever done that, you'll be left with nobody. And also importantly, people making noninnocent errors, if you accuse them of malice, will look inside themselves and correctly see that this is not how they work, and they'll stop listening to the (motivated) lies you're telling them about themselves.
+> This also holds true if you make up overly simplistic stories about 'ah yes well you're doing that because you're part of $woke-concept-of-society' etc.
+> **zackmdavis** — 11/29/2022 11:24 PM
+> I think there's _also_ a frequent problem where you try to accuse people of non-innocent errors, and they motivatedly interpret _you_ as accusing malice
+> **Eliezer** — 11/29/2022 11:25 PM
+> Then invent new terminology. I do that all the time when existing terminology fails me.
+> Like I literally invented the term 'noninnocent error' right in this conversation.
+> **zackmdavis** — 11/29/2022 11:27 PM
+> I've tried this, but maybe it wasn't good enough, or I haven't been using it consistently enough: [https://www.lesswrong.com/posts/sXHQ9R5tahiaXEZhR/algorithmic-intent-a-hansonian-generalized-anti-zombie](https://www.lesswrong.com/posts/sXHQ9R5tahiaXEZhR/algorithmic-intent-a-hansonian-generalized-anti-zombie)
+> I should get ready for bed
+> I will endeavor to edit out the hate-warp from my memoir before publishing, and _probably_ not talk in this server
+> **Eliezer** — 11/29/2022 11:31 PM
+> I think you should worry first about editing the hate-warp out of yourself, but editing the memoir might be useful practice for it. Good night.