From: M. Taylor Saotome-Westlake
Date: Sun, 5 Jun 2022 00:36:22 +0000 (-0700)
Subject: Saturday evening retreat 2: simulation hypotheses
X-Git-Url: http://unremediatedgender.space/source?a=commitdiff_plain;h=635a9b5a9de80ae6349853c811294aab8df5f189;p=Ultimately_Untrue_Thought.git

Saturday evening retreat 2: simulation hypotheses
---

diff --git a/content/drafts/a-hill-of-validity-in-defense-of-meaning.md b/content/drafts/a-hill-of-validity-in-defense-of-meaning.md
index a555e12..f111d61 100644
--- a/content/drafts/a-hill-of-validity-in-defense-of-meaning.md
+++ b/content/drafts/a-hill-of-validity-in-defense-of-meaning.md
@@ -240,21 +240,24 @@ Until suddenly, in what was then the current year of 2016, it was now seeming th
 
 The claim that political privileges are inculcating "a culture of worthless, unredeemable scoundrels" in some _other_ group is easy to dismiss as bigotry, but it hits differently when you can see it happening to _people like you_. Notwithstanding whether the progressive story had been right about the travails of blacks and women, I _know_ that straight boys who wish they were girls are not actually as fragile and helpless as we were being portrayed—that we _weren't_ that fragile, if anyone still remembers the world of 2006, when straight boys who wished they were girls knew that they were, in fact, straight boys, and didn't think the world owed them deference for their perversion. And this experience _did_ raise further questions about whether previous iterations of progressive ideology had been entirely honest with me.
 
 (If nothing else, I couldn't help but notice that my update from "Blanchard is probably wrong because trans women's self-reports say it's wrong" to "Self-reports are pretty crazy" probably had implications for "[Red Pill](https://heartiste.org/the-sixteen-commandments-of-poon/) is probably wrong because women's self-reports say it's wrong".)
 
-While I was in this flurry of excitement about my recent updates and the insanity around me, I thought back to that "at least 20% of the ones with penises are actually women" Yudkowsky post from back in March that had been my wake-up call to all this. What _was_ going with that?
+While I was in this flurry of excitement about my recent updates and the insanity around me, I thought back to that "at least 20% of the ones with penises are actually women" Yudkowsky post from back in March that had been my wake-up call to all this. What _was_ going on with that?
 
-I wasn't, like, _friends_ with Yudkowsky, obviously; I didn't have a natural social affordance to _just_ ask him the way you would ask a work buddy or a college friend. But ... he _had_ posted about how he was willing to accept money to do things he otherwise wouldn't in exchange for enough money to feel happy about he trade—a Happy Price, or [Cheerful Price, as the custom was later termed](https://www.lesswrong.com/posts/MzKKi7niyEqkBPnyu/your-cheerful-price)—and his [schedule of happy prices](https://www.facebook.com/yudkowsky/posts/10153956696609228) listed $1,000 as the price for a 2 hour conversation, and I had his email address from previous contract work I had done for MIRI back in '12, so I wrote him offering $1,000 to talk about sometime between [January 2009](https://www.lesswrong.com/posts/QZs4vkC7cbyjL9XA9/changing-emotions) and [March of the current year](https://www.facebook.com/yudkowsky/posts/10154078468809228) what kind of _massive_ update he made on the topics of human psychological sex differences and MtF transsexuality, mentioning that I had been "feeling baffled and disappointed (although I shouldn't be) that the rationality community is getting this _really easy_ scientific question wrong."
+I wasn't, like, _friends_ with Yudkowsky, obviously; I didn't have a natural social affordance to _just_ ask him something, the way you would ask a work buddy or a college friend. But ... he _had_ posted about how he was willing to do things he otherwise wouldn't in exchange for enough money to feel happy about the trade—a Happy Price, or [Cheerful Price, as the custom was later termed](https://www.lesswrong.com/posts/MzKKi7niyEqkBPnyu/your-cheerful-price)—and his [schedule of happy prices](https://www.facebook.com/yudkowsky/posts/10153956696609228) listed $1,000 as the price for a 2-hour conversation, and I had his email address from previous contract work I had done for MIRI back in '12, so I wrote him offering $1,000 to talk about what kind of _massive_ update he had made, sometime between [January 2009](https://www.lesswrong.com/posts/QZs4vkC7cbyjL9XA9/changing-emotions) and [March of the current year](https://www.facebook.com/yudkowsky/posts/10154078468809228), on the topics of human psychological sex differences and MtF transsexuality, mentioning that I had been "feeling baffled and disappointed (although I shouldn't be) that the rationality community is getting this _really easy_ scientific question wrong."
 
 At this point, any _normal people_ who are (somehow?) reading this might be thinking, isn't that weird and a little cultish?—some blogger you follow posted something you thought was strange earlier this year, and you want to pay him _one grand_ to talk about it?
 
-To the normal person I would explain thusly. First, in our subculture, we don't have your weird hangups about money: people's time is valuable, and
+To the normal person I would explain thusly. First, in our subculture, we don't have your weird hangups about money: people's time is valuable, and paying people money in exchange for them using their time differently from how they otherwise would is a perfectly ordinary thing for microeconomic agents to do. Upper-middle-class normal people don't blink at paying a licensed therapist $100 to talk for an hour, because their culture designates that as a special ritualized context in which paying money to talk to someone isn't weird. In my culture, we don't need the special ritualized context; Yudkowsky just had a higher rate than most therapists. Second, $1,000 isn't actually real money to a San Francisco software engineer.
 
-[$1000 isn't actually real money to a San Francisco software engineer]
-[the absurd hero-worship I had]
+Third—yes. Yes, it _absolutely_ was a little cultish.
+
+[TODO: explain religion] One of my emails included the sentence, "I feel awful writing _Eliezer Yudkowsky_ about this, because my interactions with you probably have disproportionately more simulation-measure than the rest of my life, and do I _really_ want to spend that on _this topic_?"
 
+(Referring to the idea that, in a sufficiently large universe where many subjectively-indistinguishable copies of everyone exist, including inside of future superintelligences running simulations of the past, there would plausibly be _more_ copies of my interactions with Yudkowsky than of other moments of my life, on account of that information being of greater decision-relevance to those superintelligences.)
 
 [TODO: I can't actually confirm or deny whether he accepted the happy price offer because if we did talk, it would have been a private conversation]
]
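+
+(As a toy illustration of that simulation-measure arithmetic, with numbers pulled entirely out of the air rather than anything I actually computed:)
+
+```python
+# Toy sketch of the simulation-measure idea; every number here is an
+# illustrative assumption, not a claim about the actual quantities.
+life_moments = 1_000_000   # assumed count of subjectively-distinct moments in a life
+relevant_moments = 10      # assumed count of Yudkowsky-interaction moments
+extra_sim_copies = 1_000   # assumed extra simulated copies per decision-relevant moment
+
+baseline_share = relevant_moments / life_moments
+total_measure = life_moments + relevant_moments * extra_sim_copies
+relevant_measure = relevant_moments * (1 + extra_sim_copies)
+
+print(f"share without simulations: {baseline_share:.3%}")                    # 0.001%
+print(f"share with simulations: {relevant_measure / total_measure:.3%}")     # ~0.991%
+```
+
+(On these made-up numbers, the handful of Yudkowsky moments goes from a thousandth of a percent of my measure to about one percent, a roughly thousandfold amplification, which is the sense in which I didn't want to "spend" it frivolously.)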