From 17877743e48e5ed330876dc580ea415796887e88 Mon Sep 17 00:00:00 2001 From: "M. Taylor Saotome-Westlake" Date: Wed, 11 Jan 2023 22:50:16 -0800 Subject: [PATCH] memoir pokes: critical communism, vaccine polarization --- ...ys-that-exhibit-generally-rationalist-principles.md | 9 +++++++-- content/drafts/if-clarity-seems-like-death-to-them.md | 10 +++++++--- notes/memoir-sections.md | 8 ++++---- notes/memoir_wordcounts.csv | 3 ++- 4 files changed, 20 insertions(+), 10 deletions(-) diff --git a/content/drafts/agreeing-with-stalin-in-ways-that-exhibit-generally-rationalist-principles.md b/content/drafts/agreeing-with-stalin-in-ways-that-exhibit-generally-rationalist-principles.md index 49e18b1..6d1923c 100644 --- a/content/drafts/agreeing-with-stalin-in-ways-that-exhibit-generally-rationalist-principles.md +++ b/content/drafts/agreeing-with-stalin-in-ways-that-exhibit-generally-rationalist-principles.md @@ -657,9 +657,14 @@ https://twitter.com/davidxu90/status/1435106339550740482 https://twitter.com/zackmdavis/status/1435856644076830721 > The error in "Not Man for the Categories" is not subtle! After the issue had been brought to your attention, I think you should have been able to condemn it: "Scott's wrong; you can't redefine concepts in order to make people happy; that's retarded." It really is that simple! 4/6 -I once wrote [a post humorously suggesting that trans women owe cis women royalties](/2019/Dec/comp/) for copying the female form (as "intellectual property"). In response to a reader who got offended, I [ended up adding](/source?p=Ultimately_Untrue_Thought.git;a=commitdiff;h=03468d274f5) an "epistemic status" line to clarify that it was a joke, not a serious proposal. +I once wrote [a post whimsically suggesting that trans women should owe cis women royalties](/2019/Dec/comp/) for copying the female form (as "intellectual property"). 
In response to a reader who got offended, I [ended up adding](/source?p=Ultimately_Untrue_Thought.git;a=commitdiff;h=03468d274f5) an "epistemic status" line to clarify that it was not a serious proposal. + +But if knowing it was a joke partially mollifies the offended reader who thought I might have been serious, I don't think they should be _completely_ mollified, because the joke (while a joke) reflects something about my thinking when I'm being serious: I don't think sex-based collective rights are inherently a crazy idea; I think _something of value has been lost_ when women who want female-only spaces can't have them, and the joke reflects the conceptual link between the idea that something of value has been lost, and the idea that people who have lost something of value are entitled to compensation. + +At Valinor's 2022 Smallpox Eradication Day party, I remember overhearing[^overhearing] Yudkowsky saying that OpenAI should have used GPT-3 to mass-promote the Moderna COVID vaccine to Republicans and the Pfizer vaccine to Democrats (or vice versa). I assume this was not a serious proposal. + +[^overhearing]: Conversations at a party with lots of people are not protected by privacy norms. -But if knowing it was a joke partially mollifies the offended reader who thought I might have been serious, I don't think they should be _completely_ mollified, because the joke (while a joke) reflects something about my thinking when I'm being serious: I don't think sex-based collective rights are inherently a crazy idea; I think _something has been lost_ when women who want them can't have female-only spaces that are actually female-only. 
] diff --git a/content/drafts/if-clarity-seems-like-death-to-them.md b/content/drafts/if-clarity-seems-like-death-to-them.md index 8c60639..66f2dae 100644 --- a/content/drafts/if-clarity-seems-like-death-to-them.md +++ b/content/drafts/if-clarity-seems-like-death-to-them.md @@ -131,9 +131,11 @@ MIRI researcher Scott Garrabrant wrote a post about how ["Yes Requires the Possi On 31 May 2019, a [draft of a new _Less Wrong_ FAQ](https://www.lesswrong.com/posts/MqrzczdGhQCRePgqN/feedback-requested-draft-of-a-new-about-welcome-page-for) included a link to "... Not Man for the Categories" as one of Scott Alexander's best essays. I argued that it would be better to cite _almost literally_ any other _Slate Star Codex_ post (most of which, I agreed, were exemplary). I claimed that the following disjunction was true: _either_ Alexander's claim that "There's no rule of rationality saying that [one] shouldn't" "accept an unexpected [X] or two deep inside the conceptual boundaries of what would normally be considered [Y] if it'll save someone's life" was a blatant lie, _or_ one had no grounds to criticize me for calling it a blatant lie, because there's no rule of rationality that says I shouldn't draw the category boundaries of "blatant lie" that way. The mod [was persuaded on reflection](https://www.lesswrong.com/posts/MqrzczdGhQCRePgqN/feedback-requested-draft-of-a-new-about-welcome-page-for?commentId=oBDjhXgY5XtugvtLT), and "... Not Man for the Categories" was not included in the final FAQ. Another "victory." +But winning "victories" wasn't particularly comforting when I resented this becoming a political slapfight at all. 
+ [TODO: -"victories" weren't comforting when I resented this becoming a political slapfight at all—a lot of the objections in the Vanessa thread were utterly insane -I wrote to Anna and Steven Kaas (who I was trying to "recruit" onto our side of the civil war) ] +a lot of the objections in the Vanessa thread were utterly insane +I wrote to Anna and Steven Kaas (whom I was trying to "recruit" onto our side of the civil war)]
When I read a story, and am interested in reading the comments _about_ a story, it's because _I want to know what other readers were actually thinking about the work_. I don't _want_ other people to self-censor comments on any plot holes or [Fridge Logic](https://tvtropes.org/pmwiki/pmwiki.php/Main/FridgeLogic) they noticed for fear of dampening someone else's enjoyment or hurting the author's feelings. -Yudkowsky claims that criticism should be given in private because then the target "may find it much more credible that you meant only to help them, and weren't trying to gain status by pushing them down in public." I'll buy this as a reason why credibly _altruistic_ unsolicited criticism should be delivered in private. Indeed, meaning _only_ to help the target just doesn't seem like a plausible critic motivation in most cases. But the fact that critics typically have non-altruistic motives, doesn't mean criticism isn't helpful. In order to incentivize good criticism, you _want_ people to be rewarded with status for making good criticisms! You'd have to be some sort of communist to disagree with this. +Yudkowsky claims that criticism should be given in private because then the target "may find it much more credible that you meant only to help them, and weren't trying to gain status by pushing them down in public." I'll buy this as a reason why credibly _altruistic_ unsolicited criticism should be delivered in private. Indeed, meaning _only_ to help the target just doesn't seem like a plausible critic motivation in most cases. But the fact that critics typically have non-altruistic motives, doesn't mean criticism isn't helpful. In order to incentivize good criticism, you _want_ people to be rewarded with status for making good criticisms. 
You'd have to be some sort of communist to disagree with this![^communism-analogy] + +[^communism-analogy]: That is, there's an analogy between economically valuable labor, and intellectually productive criticism: if you accept the necessity of paying workers money in order to get good labor out of them, you should understand the necessity of awarding commenters status in order to get good criticism out of them. There's a striking contrast between the Yudkowsky of 2019 who wrote the "Reducing Negativity" post, and an earlier Yudkowsky (from even before the Sequences) who maintained [a page on Crocker's rules](http://sl4.org/crocker.html): if you declare that you operate under Crocker's rules, you're consenting to other people optimizing their speech for conveying information rather than being nice to you. If someone calls you an idiot, that's not an "insult"; they're just informing you about the fact that you're an idiot, and you should plausibly thank them for the tip. (If you _were_ an idiot, wouldn't you be better off knowing rather than not-knowing?) diff --git a/notes/memoir-sections.md b/notes/memoir-sections.md index d8df44f..b5d6ab2 100644 --- a/notes/memoir-sections.md +++ b/notes/memoir-sections.md @@ -1,16 +1,15 @@ smaller TODO blocks— ✓ previous AI milestones [pt. 4] ✓ short timelines and politics [pt. 4] -_ "victories" weren't comforting [pt. 3] _ vaccine joke [pt. 4] _ Scott linked to Kay Brown [pt. 2] _ posted to /r/gendercritical [pt. 2] _ the Death With Dignity era [pt. 4] _ social justice and defying threats [pt. 4] +_ complicity and friendship [pt. 4] _ "If Clarity" recap intro [pt. 3] _ "Agreeing with Stalin" recap intro [pt. 4] - bigger blocks— _ dath ilan and Eliezerfic fight _ reaction to Ziz @@ -19,6 +18,7 @@ _ Michael Vassar and the Theory of Optimal Gossip _ psychiatric disaster With internet available— +_ May 2019 thread vs. 
Vanessa _ historical accuracy of Galileo _ explain the "if the world were at stake" _Sword of Good_ reference better _ a poem I wrote in the _Less Wrong_ comments in 2011 @@ -57,12 +57,10 @@ _ the function of privacy norms is to protect you from people who want to select _ pull "agreeing with Stalin" quote earlier in ms. to argue that Yudkowsky apparently doesn't disagree with my "deliberately ambiguous" _ is the title of pt. 4 OK? (agreeing with Stalin _is_ correct when Stalin is right; the problem is that Stalin isn't right about gender) _ illustrate "student dysphoria is worse" with anecdote about leaving physics class and going to the counselor to see if I could graduate earlier? -_ as an author and a critic, I expect to lose status when my critics make good points, and expect to gain status when my criticism makes good points; communism analogy: you wouldn't expect credibly helpful work to be unpaid; and I have to make this kind of analysis even he didn't _say_ "I'm a communist" _ hate for secrecy probably correlates with autogynephilia blogging _ mention Said rigor check somewhere, nervousness about Michael's gang being a mini-egregore _ at some point, speculate on causes of brain damage _ the "reducing negativity" post does obliquely hint at the regression point being general ("Let's say that the true level of negativity"), does that seriously undermine my thesis, or does it only merit a footnote? -_ worth footnoting the "some sort of communist" joke?
_ elaborate on why I'm not leaking sensitive bits, by explaining what can be inferred by what was and wasn't said in public _ footnote on "no one would even consider" _ post-Christmas conversation should do a better job of capturing the war, that Jessica thinks Scott is Bad for being a psychiatrist @@ -1967,3 +1965,5 @@ https://glowfic.com/replies/1860952#reply-1860952 the generic experience is that the future is more capable but less aligned, and we basically expect this to continue people from the past would envy our refrigeration, vaccines, infinite food, &c., but that doesn't mean they would regard our value-drifted-with-respect-to-them culture as superior paperclipping is just that turned up to 11 (well, 10¹¹) + +Bostrom's apology for an old email—who is this written for?? Why get ahead, when you could just not comment? diff --git a/notes/memoir_wordcounts.csv b/notes/memoir_wordcounts.csv index f72fcdd..277dd5d 100644 --- a/notes/memoir_wordcounts.csv +++ b/notes/memoir_wordcounts.csv @@ -288,4 +288,5 @@ 01/07/2023,75297 01/08/2023,75830 01/09/2023,76029 -01/10/2023, \ No newline at end of file +01/10/2023,76060 +01/11/2023,76304 -- 2.17.1