check in
[Ultimately_Untrue_Thought.git] / notes / a-hill-of-validity-sections.md
index a10278f..8bbd699 100644
@@ -976,3 +976,10 @@ https://www.lesswrong.com/posts/ax695frGJEzGxFBK4/biology-inspired-agi-timelines
 ------
 
 Lightwavers (whom Yudkowsky knew from /r/rational) dissed Charles Murray on Twitter
+
+https://nostalgebraist.tumblr.com/post/686455476984119296/eliezer-yudkowsky-seems-really-depressed-these
+
+> So now my definitely-not-Kelthorkarni have weird mental inhibitions against actually listening to me, even when I clearly do know much better than they do.  In retrospect I think I was guarding against entirely the wrong failure modes.  The problem is not that they're too conformist, it's that they don't understand how to be defiant without diving heedlessly into the seas of entropy.  It's plausible I should've just gone full Kelthorkarnen
+https://www.glowfic.com/replies/1614129#reply-1614129
+
+I was pleading with him in his capacity as a rationality leader, not an AGI alignment leader; I know I have no business talking about the latter