From: Zack M. Davis
Date: Sun, 8 Oct 2023 04:51:53 +0000 (-0700)
Subject: check in
X-Git-Url: http://unremediatedgender.space/source?a=commitdiff_plain;h=05b356c2fa4e427080932e391af1dd06eccf3bda;p=Ultimately_Untrue_Thought.git

check in
---

diff --git a/content/drafts/agreeing-with-stalin-in-ways-that-exhibit-generally-rationalist-principles.md b/content/drafts/agreeing-with-stalin-in-ways-that-exhibit-generally-rationalist-principles.md
index 67838f9..e42e59b 100644
--- a/content/drafts/agreeing-with-stalin-in-ways-that-exhibit-generally-rationalist-principles.md
+++ b/content/drafts/agreeing-with-stalin-in-ways-that-exhibit-generally-rationalist-principles.md
@@ -480,7 +480,7 @@ And you need to be able to say, in public, that trans women are male and trans m
 
 If you don't want to say those things because hurting people is wrong, then you have chosen Feelings.
 
-Scott Alexander chose Feelings, but I can't really hold that against him, because Scott is [very explicit about only speaking in the capacity of some guy with a blog](https://slatestarcodex.com/2019/07/04/some-clarifications-on-rationalist-blogging/). You can tell from his writings that he never wanted to be a religious leader; it just happened to him by accident because he writes faster than everyone else. I like Scott. Scott is great. I feel sad that such a large fraction of my interactions with him over the years have taken such an adversarial tone.
+Scott Alexander chose Feelings, but I can't really hold that against him, because Scott is [very explicit about only speaking in the capacity of some guy with a blog](https://slatestarcodex.com/2019/07/04/some-clarifications-on-rationalist-blogging/). You can tell from his writings that he never wanted to be a religious leader; it just happened to him by accident because he writes faster than everyone else. I like Scott. Scott is alright. I feel sad that such a large fraction of my interactions with him over the years have taken such an adversarial tone.
 
 Eliezer Yudkowsky ... did not _unambiguously_ choose Feelings. He's been very careful with his words, strategically mood-affiliating with the side of Feelings without saying anything that he consciously knows to be unambiguously false. And the reason I can hold it against _him_ is because Eliezer Yudkowsky does not identify as just some guy with a blog. Eliezer Yudkowsky is _absolutely_ trying to be a religious leader. He markets himself as a master of the hidden Bayesian structure of cognition, who ["aspires to make sure [his] departures from perfection aren't noticeable to others"](https://twitter.com/ESYudkowsky/status/1384671335146692608).
 
diff --git a/notes/dath_ilan_scrap.md b/notes/dath_ilan_scrap.md
index 43aaa78..1f8b972 100644
--- a/notes/dath_ilan_scrap.md
+++ b/notes/dath_ilan_scrap.md
@@ -208,3 +208,27 @@ Yudkowsky had this whole marketing image of him being uniquely sane and therefor
 And so, yeah, insofar as fiction about dath ilan functioned as marketing material for Yudkowsky's personality cult that I thought was damaging people like me (in some ways, while simultaneously helping us in other ways), I had an incentive to come up with literary criticism that paints dath ilan negatively? It was great for Ajvermillion to notice this! It _would_ be bad if my brain were configured to come up with dath-ilan-negative literary criticism, and for me to _simultaneously_ present myself as an authority on dath ilan whom you should trust.
 
 But if dath-ilan-negative literary criticism was undersupplied for structural reasons (because people who like a story are selected for not seeing things the story is doing that are Actually Bad), and my brain was configured to generate it anyway (because I disliked the person Yudkowsky had become, in contrast to the person he was in 2008), it seemed pro-social for me to post it, for other people to take or leave according to their own judgement?
+
+The problem I saw with this is that becoming rich and famous isn't a purely random exogenous event. In order to make an informed decision about whether or not to put in the effort to try to _become_ rich and famous (as contrasted to choosing a lower-risk or more laid-back lifestyle), you need accurate beliefs about the perks of being rich and famous.
+
+The dilemma of whether to make more ambitious economic choices in pursuit of sexual goals was something that _already_ happens to people on Earth, rather than being hypothetical. I once met a trans woman who spent a lot of her twenties and thirties working very hard to get money for various medical procedures. I think she would be worse off under a censorship regime run by self-styled Keepers who thought it was kinder to prevent _poor people_ from learning about the concept of "transsexualism".
+
+Further discussion established that Yudkowsky was (supposedly) already taking into account that class of distortion on individuals' decisions, but that the empirical setting of probabilities and utilities happened to be such that ignorance came out on top.
+
+
+Even if you specified by authorial fiat that "latent sadists could use the information to decide whether or not to try to become rich and famous" didn't tip the utility calculus in itself, [facts are connected to each other](https://www.lesswrong.com/posts/wyyfFfaRar2jEdeQK/entangled-truths-contagious-lies); there were _more consequences_ to the coverup, more ways in which better-informed people could make better decisions than worse-informed people.
+
+
+Or imagine a world where male homosexuality couldn't be safely practiced due to super-AIDS. (I know very little about BDSM, hence the switch of example.) I still think men with that underlying predisposition would be better off _having a concept_ of "homosexuality" (even if they couldn't practice it), rather than the concept itself being censored. There are also other systematic differences that go along with sexual orientation (the "feminine gays, masculine lesbians" thing); if you censor the _concept_, you're throwing away that knowledge.
+
+(When I had brought up the super-AIDS hypothetical in the chat, Ajvermillion complained that I was trying to bait people into self-cancelling by biting the bullet on suppressing homosexuality. I agreed that the choice of example was engineered to activate people's progressive moral intuitions about gay rights—it was great for him to notice that—but I thought that colliding philosophical intuitions like that was intellectually productive; it wasn't an attempt to gather blackmail material.)
+
+----
+
+It seemed like the rationale for avoiding spoilers of movie plots or homework exercises had to do with the outcome being different if you got spoiled: you have a different æsthetic experience if you encounter the plot twist in the 90th minute of the movie rather than in the fourth paragraph of the _Wikipedia_ article. Dath ilan's sadism/masochism coverup didn't seem to have the same structure: when I try to prove a theorem myself before looking at how the textbook says to do it, it's not because I would be _sad about the state of the world_ if I looked at the textbook; it's because the temporary ignorance of working it out myself results in a stronger state of final knowledge.
+
+That is, the difference between "spoiler protections" (sometimes useful) and "coverups" (bad) had to do with whether the ignorant person is expected to eventually uncover the hidden information, and whether the ignorant person knows that there's hidden information that they're expected to uncover. In the case of the sadism/masochism coverup (in contrast to the cases of movie spoilers or homework exercises), it seemed like neither of these conditions pertained. (Keltham knows that the Keepers are keeping secrets, but he seems to actively hold beliefs about human psychology that imply masochism is implausible; it seems more like he has a false map, rather than a blank spot on his map for the answer to the homework exercise to be filled in.) I thought that was morally relevant.
+
+(Additionally, I would have hoped that my two previous mentions in the thread of supporting keeping nuclear, bioweapon, and AI secrets had already made it clear that I wasn't against _all_ cases of Society hiding information, but to further demonstrate my ability to generate counterexamples, I mentioned that I would also admit _threats_ as a class of legitimate infohazard: if I'm not a perfect decision theorist, I'm better off if Tony Soprano just doesn't have my email address to begin with, if I don't trust myself to calculate when I "should" ignore his demands.)
+
+-----
diff --git a/notes/memoir-sections.md b/notes/memoir-sections.md
index df70f6d..f5aaca0 100644
--- a/notes/memoir-sections.md
+++ b/notes/memoir-sections.md
@@ -82,12 +82,15 @@ _ Sarah's point that Scott gets a lot of undeserved deference, too: https://twit
 _ clarify that Keltham infers there are no masochists, vs. Word of God
 _ "Doublethink" ref in Xu discussion should mention the Word of God Eliezerfic clarification that it's not about telling others
 
-
 dath ilan ancillary tier—
 _ Who are the 9 most important legislators called?
 _ collect Earth people sneers
+_ what kind of person wants to delete history? History would tell you
 _ psyops don't require people taking them seriously, only that they make people think the government is doing psyops
 _ the reason "truth is better than lies; knowledge is better than ignorance" is general; can assert by authorial fiat whatever details are needed to make it all turn out for the greater good, but _that's not how anything works in real life_.
+_ "telling people things would be meddling" moral needs work; obvious objection is that different rules apply to Keepers
+_ "obligate" is only Word of God, right?—I should probably cite this
+
 things to discuss with Michael/Ben/Jessica—
 
 _ Anna on Paul Graham
diff --git a/notes/memoir_wordcounts.csv b/notes/memoir_wordcounts.csv
index c317800..f52d760 100644
--- a/notes/memoir_wordcounts.csv
+++ b/notes/memoir_wordcounts.csv
@@ -535,4 +535,7 @@
 10/03/2023,121738,177
 10/04/2023,122627,889
 10/05/2023,123572,945
-10/06/2023,
\ No newline at end of file
+10/06/2023,119136,-4436
+10/07/2023,118644,-492
+10/08/2023,,
+10/09/2023,,