From 5785f48f8cba09aa37ee116cd922af51d07a8089 Mon Sep 17 00:00:00 2001
From: "M. Taylor Saotome-Westlake"
Date: Sun, 17 Nov 2019 16:54:11 -0800
Subject: [PATCH] "I Tell Myself" Sunday drafting session 4: "kind of obvious,
 really"

---
 notes/i-tell-myself-notes.txt   | 13 ++-----------
 notes/i-tell-myself-sections.md | 17 +++++++++++++++++
 2 files changed, 19 insertions(+), 11 deletions(-)
 create mode 100644 notes/i-tell-myself-sections.md

diff --git a/notes/i-tell-myself-notes.txt b/notes/i-tell-myself-notes.txt
index c7265fe..1b3b9b4 100644
--- a/notes/i-tell-myself-notes.txt
+++ b/notes/i-tell-myself-notes.txt
@@ -72,9 +72,9 @@ The sheer number of hours we invested in this operation is _nuts_: desperate
 
 all what about _my_ mental health?
 
-Men who wish they were women do not particularly resemble actual women? We just—don't? This seems kind of obvious, really?
-my friend thinks I'm naive to have expected such a community—she was recommending "What You Can't Say" in 2009—but in 2009, we did not expect that _whether or not I should cut my dick off_ would _become_ a politicized issue, which is new evidence about the wisdom of the original vision
+Telling the difference between fantasy and reality is kind of an important life skill
+
 but if my expectations (about the community) were wrong, that's a problem with my model; reality doesn't have to care
 
 
 
@@ -472,8 +472,6 @@ fame: arguing with a Discord server was low-impact compared to getting the leade
 
 https://www.lesswrong.com/posts/CEGnJBHmkcwPTysb7/lonely-dissent
 
-Telling the difference between fantasy and reality is kind of an important life skill
-
 https://slatestarcodex.com/2017/11/07/does-age-bring-wisdom/
 
 
@@ -488,13 +486,6 @@ https://www.lesswrong.com/posts/ctpkTaqTKbmm6uRgC/failed-utopia-4-2
 
-The short story ["Failed Utopia #4-2"](https://www.lesswrong.com/posts/ctpkTaqTKbmm6uRgC/failed-utopia-4-2) portrays an almost-aligned superintelligence constructing a happiness-maximizing utopia for humans—except that because [evolution didn't design women and men to be optimal partners for each other](https://www.lesswrong.com/posts/Py3uGnncqXuEfPtQp/interpersonal-entanglement), and the AI is prohibited from editing people's minds, the happiness-maximizing solution ends up splitting up the human species by sex and giving women and men their own _separate_ utopias, complete with artificially-synthesized romantic partners.
-
-At the time, [I expressed horror](https://www.greaterwrong.com/posts/ctpkTaqTKbmm6uRgC/failed-utopia-4-2/comment/PhiGnX7qKzzgn2aKb) at the idea in the comments section, because my quasi-religious psychological-sex-differences denialism required that I be horrified. But looking back eleven years later (my deconversion from my teenage religion being pretty thorough at this point, I think), the _argument makes sense_ (though you need an additional [handwave](https://tvtropes.org/pmwiki/pmwiki.php/Main/HandWave) to explain why the AI doesn't give every _individual_ their separate utopia—if existing women and men aren't optimal partners for each other, so too are individual men not optimal same-sex friends for each other).
-
-On my reading of the text, it is _significant_ that the AI-synthesized complements for men are given their own name, the _verthandi_, rather than just being referred to as women. The _verthandi_ may _look like_ women, they may be _approximately_ psychologically human, but the _detailed_ psychology of "superintelligently-engineered optimal romantic partner for a human male" is not going to come out of the distribution of actual human females, and judicious exercise of the [tenth virtue of precision](http://yudkowsky.net/rational/virtues/) demands that a _different word_ be coined for this hypothetical science-fictional type of person.
-
-
 https://www.lesswrong.com/posts/QZs4vkC7cbyjL9XA9/changing-emotions
 
 > Jun 18, 2008
 
 

diff --git a/notes/i-tell-myself-sections.md b/notes/i-tell-myself-sections.md
new file mode 100644
index 0000000..e884718
--- /dev/null
+++ b/notes/i-tell-myself-sections.md
@@ -0,0 +1,17 @@
+The short story ["Failed Utopia #4-2"](https://www.lesswrong.com/posts/ctpkTaqTKbmm6uRgC/failed-utopia-4-2) portrays an almost-aligned superintelligence constructing a happiness-maximizing utopia for humans—except that because [evolution didn't design women and men to be optimal partners for each other](https://www.lesswrong.com/posts/Py3uGnncqXuEfPtQp/interpersonal-entanglement), and the AI is prohibited from editing people's minds, the happiness-maximizing solution ends up splitting up the human species by sex and giving women and men their own _separate_ utopias, complete with artificially-synthesized romantic partners.
+
+At the time, [I expressed horror](https://www.greaterwrong.com/posts/ctpkTaqTKbmm6uRgC/failed-utopia-4-2/comment/PhiGnX7qKzzgn2aKb) at the idea in the comments section, because my quasi-religious psychological-sex-differences denialism required that I be horrified. But looking back eleven years later (my deconversion from my teenage religion being pretty thorough at this point, I think), the _argument makes sense_ (though you need an additional [handwave](https://tvtropes.org/pmwiki/pmwiki.php/Main/HandWave) to explain why the AI doesn't give every _individual_ their separate utopia—if existing women and men aren't optimal partners for each other, so too are individual men not optimal same-sex friends for each other).
+
+On my reading of the text, it is _significant_ that the AI-synthesized complements for men are given their own name, the _verthandi_, rather than just being referred to as women. The _verthandi_ may _look like_ women, they may be _approximately_ psychologically human, but the _detailed_ psychology of "superintelligently-engineered optimal romantic partner for a human male" is not going to come out of the distribution of actual human females, and judicious exercise of the [tenth virtue of precision](http://yudkowsky.net/rational/virtues/) demands that a _different word_ be coined for this hypothetical science-fictional type of person. Calling the _verthandi_ "women" would be _worse writing_; it would _fail to communicate_ what, in the story, has been lost.
+
+-----
+
+Men who wish they were women do not particularly resemble actual women! We just—don't? This seems kind of obvious, really?
+
+Okay, I understand that in Berkeley 2020, that probably sounds like some kind of reactionary political statement. But try taking it _literally_, as a _factual claim_ about the world. Men (adult human males) who _fantasize about_ being women (adult human females) are nevertheless drawn from the _male_ multivariate trait distribution, not the female distribution.
+
+Telling the difference between fantasy and reality is kind of an important life skill!
+
"The trait distribution of trans women isn't identical to that of cis women" does not _convey the same meaning_. + +---- -- 2.17.1