X-Git-Url: http://unremediatedgender.space/source?p=Ultimately_Untrue_Thought.git;a=blobdiff_plain;f=notes%2Fmemoir-sections.md;fp=notes%2Fmemoir-sections.md;h=90c8b511b12f976a6509a730b9c1d8ad1b24627c;hp=79383ad76ad738769ad43ceccbd567592c58a8d6;hb=12ddb8be0f71b38b004a7e890c07b7a1319e7183;hpb=dae287386cebd70c79c87592bd52182ef83a0ef8

diff --git a/notes/memoir-sections.md b/notes/memoir-sections.md
index 79383ad..90c8b51 100644
--- a/notes/memoir-sections.md
+++ b/notes/memoir-sections.md
@@ -10,7 +10,6 @@ With internet available—
 ✓ tussle with Ruby on "Causal vs. Social Reality"
 ✓ The End (of Sequences)
 _ retrieve own-blog links for "futurist-themed delusions"
-_ URL for "Why Quantum?"
 _ double-check that "Changing Emotions" was in January
 _ No such thing as a tree
 _ Yudkowsky on AlphaGo
@@ -1738,7 +1737,15 @@ from "Go Forth and Create the Art"—
 
 contrast the sneering at Earth people with the attitude in "Whining-Based Communities"
 
-from "Why Quantum?"—
+from "Why Quantum?" (https://www.lesswrong.com/posts/gDL9NDEXPxYpDf4vz/why-quantum)
 > But would you believe that I had such strong support, if I had not shown it to you in full detail? Ponder this well. For I may have other strong opinions. And it may seem to you that _you_ don't see any good reason to form such strong beliefs. Except this is _not_ what you will see; you will see simply that there _is_ no good reason for the strong belief, that there _is_ no strong support one way or the other. For our first-order beliefs are how the world seems to _be_. And you may think, "Oh, Eliezer is just opinionated—forming strong beliefs in the absence of lopsided support." And I will not have time to do another couple of months worth of blog posts.
 >
 > I am _very_ far from infallible, but I do not hold strong opinions at random.
+
+Another free speech exchange with S.K. in 2020: https://www.lesswrong.com/posts/YE4md9rNtpjbLGk22/open-communication-in-the-days-of-malicious-online-actors?commentId=QoYGQS52HaTpeF9HB
+
+https://www.lesswrong.com/posts/hAfmMTiaSjEY8PxXC/say-it-loud
+
+Maybe lying is "worse" than rationalizing, but if you can't hold people culpable for rationalization, you end up with a world that's bad for broadly the same reasons that a world full of liars is bad: we can't steer the world to good states if everyone's map is full of falsehoods that locally benefitted someone
+
+http://benjaminrosshoffman.com/bad-faith-behavior-not-feeling/