From de030c8106799b42398da36c049aefd3290db060 Mon Sep 17 00:00:00 2001
From: "M. Taylor Saotome-Westlake"
Date: Wed, 8 Feb 2023 21:05:50 -0800
Subject: [PATCH] memoir: Eliezerfic fight, cont'd

...

---
 content/drafts/standing-under-the-same-sky.md | 15 +++++++++++++--
 1 file changed, 13 insertions(+), 2 deletions(-)

diff --git a/content/drafts/standing-under-the-same-sky.md b/content/drafts/standing-under-the-same-sky.md
index b38f194..6f82f6f 100644
--- a/content/drafts/standing-under-the-same-sky.md
+++ b/content/drafts/standing-under-the-same-sky.md
@@ -569,9 +569,20 @@ I said that I thought people were missing this idea that the reason "truth is be
 I started a new thread to complain about the attitude I was seeing (Subject: "Noble Secrets; Or, Conflict Theory of Optimization on Shared Maps"). When fiction in this world, _where I live_, glorifies Noble Lies, that's a cultural force optimizing for making shared maps less accurate, I explained. As someone trying to make shared maps _more_ accurate, this force was hostile to me and mine. I understood that secrets and lies are different, but if you're a consequentialist thinking in terms of what kinds of optimization pressures are being applied to shared maps, it's the same issue: I'm trying to steer _towards_ states of the world where people know things, and the Keepers of Noble Secrets are trying to steer _away_ from states of the world where people know things. That's a conflict. I was happy to accept Pareto-improving deals to make the conflict less destructive, but I wasn't going to pretend the pro-ignorance forces were my friends just because they self-identify as "rationalists" or "EA"s. I was willing to accept secrets around nuclear or biological weapons, or AGI, on "better ignorant than dead" grounds, but the "protect sadists from being sad" thing was _just_ coddling people who can't handle the truth, which made _my_ life worse.
 
-And just—back in the 'aughts, Robin Hanson had this really great blog called _Overcoming Bias_. (You probably haven't heard of it.) I wanted that _vibe_ back, of Robin Hanson's blog in 2008.
+I wasn't buying the excuse that secret-Keeping practices that wouldn't be OK on Earth were somehow OK on dath ilan (which was asserted by authorial fiat to be sane and smart and benevolent enough to make it work). Or if I couldn't argue with authorial fiat: the reasons why it would be bad on Earth (even if it wouldn't be bad on dath ilan) are reasons why _fiction about dath ilan is bad for Earth_.
+
+And just—back in the 'aughts, Robin Hanson had this really great blog called _Overcoming Bias_. (You probably haven't heard of it.) I wanted that _vibe_ back, of Robin Hanson's blog in 2008—the will to _just get the right answer_, without all this galaxy-brained hand-wringing about who the right answer might hurt.
+
+I would have expected a subculture descended from the memetic legacy of Robin Hanson's blog in 2008 to respond to that tripe about protecting people from being destroyed by the truth as a form of "recognizing independent agency" with something like—
+
+"Hi! You must be new here! Regarding your concern about truth doing harm to people, a standard reply is articulated in the post ["Doublethink (Choosing to be Biased)"](https://www.lesswrong.com/posts/Hs3ymqypvhgFMkgLb/doublethink-choosing-to-be-biased). Regarding your concern about recognizing independent agency, a standard reply is articulated in the post ["Your Rationality Is My Business"](https://www.lesswrong.com/posts/anCubLdggTWjnEvBS/your-rationality-is-my-business)."
+
+—or _something like that_. Not that the reply needed to use those particular Sequences links, or _any_ Sequences links; what's important is that someone needs to counter this very obvious [anti-epistemology](https://www.lesswrong.com/posts/XTWkjCJScy2GFAgDt/dark-side-epistemology).
+
+And what we actually saw in response to the "You don't get to do harm to other people" message was ... it got 5 "+1" emoji-reactions.
+
+Yudkowsky [chimed in to point out that](/images/yudkowsky-it_doesnt_say_tell_other_people.png) "Doublethink" was about _oneself_ not reasonably being in the epistemic position of knowing that one should lie to oneself. It wasn't about telling the truth to _other_ people.
 
-[TODO: Eliezerfic fight, cont'd]
 
 [TODO: regrets and wasted time
 * Do I have regrets about this Whole Dumb Story? A lot, surely—it's been a lot of wasted time. But it's also hard to say what I should have done differently; I could have listened to Ben more and lost faith in Yudkowsky earlier, but he had earned a lot of benefit of the doubt?
 
-- 
2.17.1