From: Zack M. Davis
Date: Sun, 12 Nov 2023 06:10:05 +0000 (-0800)
Subject: memoir: a little more guidance on "he could think of differences if he wanted"
X-Git-Url: http://unremediatedgender.space/source?a=commitdiff_plain;h=1cd552c549d3f5c896361521871130ac3cca7b0c;p=Ultimately_Untrue_Thought.git

memoir: a little more guidance on "he could think of differences if he wanted"
---

diff --git a/content/drafts/agreeing-with-stalin-in-ways-that-exhibit-generally-rationalist-principles.md b/content/drafts/agreeing-with-stalin-in-ways-that-exhibit-generally-rationalist-principles.md
index a21a94f..06d626f 100644
--- a/content/drafts/agreeing-with-stalin-in-ways-that-exhibit-generally-rationalist-principles.md
+++ b/content/drafts/agreeing-with-stalin-in-ways-that-exhibit-generally-rationalist-principles.md
@@ -360,7 +360,7 @@ For the savvy people in the know, it would certainly be convenient if everyone s

[Policy debates should not appear one-sided.](https://www.lesswrong.com/posts/PeSzc9JTBxhaYRp9b/policy-debates-should-not-appear-one-sided) Faced with this kind of dilemma, I can't say that defying Power is necessarily the right choice: if there really were no other options between deceiving your readers with a bad faith performance, and incurring Power's wrath, and Power's wrath would be too terrible to bear, then maybe deceiving your readers with a bad faith performance is the right thing to do.

-But if you care about not deceiving your readers, you would want to be sure that those _really were_ the only two options. You'd [spend five minutes by the clock looking for third alternatives](https://www.lesswrong.com/posts/erGipespbbzdG5zYb/the-third-alternative)—including, possibly, not issuing proclamations on your honor as leader of the so-called "rationalist" community on topics where you _explicitly intend to ignore politically unfavorable counterarguments_. Yudkowsky rejects this alternative on the grounds that it allegedly implies "utter silence about everything Stalin has expressed an opinion on including '2 + 2 = 4' because if that logically counterfactually were wrong you would not be able to express an opposing opinion", but this seems like yet another instance of Yudkowsky motivatedly playing dumb: _if he wanted to_, I'm sure Eliezer Yudkowsky could think of _some relevant differences_ between "2 + 2 = 4" and "the simplest and best protocol is, "'He' refers to the set of people who have asked us to use 'he'"".
+But if you cared about not deceiving your readers, you would want to be sure that those _really were_ the only two options. You'd [spend five minutes by the clock looking for third alternatives](https://www.lesswrong.com/posts/erGipespbbzdG5zYb/the-third-alternative)—including, possibly, not issuing proclamations on your honor as leader of the so-called "rationalist" community on topics where you _explicitly intend to ignore politically unfavorable counterarguments_. Yudkowsky rejects this alternative on the grounds that it allegedly implies "utter silence about everything Stalin has expressed an opinion on including '2 + 2 = 4' because if that logically counterfactually were wrong you would not be able to express an opposing opinion". I think he's playing dumb here. In other contexts, he's written about ["attack[s] performed by selectively reporting true information"](https://twitter.com/ESYudkowsky/status/1634338145016909824) and ["[s]tatements which are technically true but which deceive the listener into forming further beliefs which are false"](https://hpmor.com/chapter/97).
I think that _if he wanted to_, Eliezer Yudkowsky could think of some relevant differences between "2 + 2 = 4" and "the simplest and best protocol is, "'He' refers to the set of people who have asked us to use 'he'"".

"[P]eople do _know_ they're living in a half-Stalinist environment," Yudkowsky says. "I think people are better off at the end of that," he says. But who are "people", specifically? One of the problems with utilitarianism is that it doesn't interact well with game theory. If a policy makes most people better off, at the cost of throwing a few others under the bus, is enacting that policy the right thing to do?

diff --git a/notes/memoir-sections.md b/notes/memoir-sections.md
index 87d8954..c4bf50a 100644
--- a/notes/memoir-sections.md
+++ b/notes/memoir-sections.md
@@ -34,8 +34,8 @@ _ GreaterWrong over Less Wrong for comment links pt. 4

edit tier—
✓ "Ideology Is Not the Movement" mentions not misgendering
✓ mention Nick Bostrom email scandal (and his not appearing on the one-sentence CAIS statement)
+✓ explain why he could think of some relevant differences
_ body odors comment
-_ if he wanted to, I'm sure Eliezer Yudkowsky could think of some relevant differences (I should explain)
_ emphasize that 2018 thread was policing TERF-like pronoun usage, not just disapproving of gender-based pronouns
_ if you only say good things about Republican candidates
_ to-be-continued ending about how being a fraud might be a good idea

@@ -2290,12 +2290,8 @@ https://www.goodreads.com/quotes/38764-what-are-the-facts-again-and-again-and-ag

In October 2016,

- if [...] wrote her own 10,600 draft Document explaining why she thought [...] is actually a girl, that would be really interesting!—but rather that no one else seemed _interested in having a theory_, as opposed to leaping to institute a social convention that, when challenged, is claimed to have no particular consequences and no particular objective truth conditions, even though it's not clear why there would be moral urgency to implement this convention if it weren't for its consequences.

-https://twitter.com/ESYudkowsky/status/1634338145016909824 re "malinformation"
-> If we don't have the concept of an attack performed by selectively reporting true information—or, less pleasantly, an attack on the predictable misinferences of people we think less rational than ourselves—the only socially acceptable counter is to say the info is false.

I said that I couldn't help but be reminded of a really great short story that I remembered reading back in—it must have been 'aught-nine. I thought it was called "Darkness and Light", or something like that. It was about a guy who gets transported to a fantasy world where he has a magic axe that yells at him sometimes, and he's prophesied to defeat the bad guy, and he and his allies have to defeat these ogres to reach the bad guy's lair. And when they get there, the bad guy _accuses them of murder_ for killing the ogres on the way there. (The story was actually Yudkowsky's ["The Sword of Good"](https://www.yudkowsky.net/other/fiction/the-sword-of-good), but I was still enjoying the "Robin Hanson's blog" æsthetic.)

@@ -2307,7 +2303,6 @@ need to fit this in somewhere—

Everyone believed this in 2005! Everyone _still_ believes this!
-

> Dear Totally Excellent Rationalist Friends:
> As a transhumanist and someone with a long, long history of fantasizing about having the property, I am of course strongly in favor of there being social norms and institutions that are carefully designed to help people achieve their lifelong dream of acquiring the property, or rather, the best approximation thereof that is achievable given the marked limitations of existing technology.
> However, it's also kind of important to notice that fantasizing about having the property without having yet sought out interventions to acquire the property is not the same thing as somehow already literally having the property in some unspecified metaphysical sense! The process of attempting to acquire the property does not propagate backwards in time!

@@ -2777,10 +2772,6 @@ https://www.lesswrong.com/posts/qbcuk8WwFnTZcXTd6/thomas-kwa-s-miri-research-exp

Weird tribalist praise for Scott: https://www.greaterwrong.com/posts/GMCs73dCPTL8dWYGq/use-normal-predictions/comment/ez8xrquaXmmvbsYPi

-https://hpmor.com/chapter/97
-> "Or tricks," Harry said evenly. "Statements which are technically true but which deceive the listener into forming further beliefs which are false. I think it's worth making that distinction."

-

-------

I like to imagine that they have a saying out of dath ilan: once is happenstance; twice is coincidence; _three times is hostile optimization_.