X-Git-Url: http://unremediatedgender.space/source?a=blobdiff_plain;f=notes%2Fi-tell-myself-notes.txt;h=c7265fed368b032925a2719b63209494f454eee2;hb=3c496f3c4c6f5046ce4465a60930f6e91226901f;hp=701c9ae9d9b7908e02fe67e29ece622cba59c43b;hpb=4d5e91ded50139b038a61198f0f96e579de62330;p=Ultimately_Untrue_Thought.git diff --git a/notes/i-tell-myself-notes.txt b/notes/i-tell-myself-notes.txt index 701c9ae..c7265fe 100644 --- a/notes/i-tell-myself-notes.txt +++ b/notes/i-tell-myself-notes.txt @@ -481,16 +481,19 @@ https://slatestarcodex.com/2017/11/07/does-age-bring-wisdom/ https://www.greaterwrong.com/posts/FBgozHEv7J72NCEPB/my-way#comment-W4TAp4LuW3Ev6QWSF -> I am skeptical that either sex can ever really model and predict the other’s deep internal life, short of computer-assisted telepathy. These are different brain designs we’re talking about here. +> I am skeptical that either sex can ever really model and predict the other's deep internal life, short of computer-assisted telepathy. These are different brain designs we're talking about here. 
https://www.lesswrong.com/posts/ctpkTaqTKbmm6uRgC/failed-utopia-4-2
-https://www.greaterwrong.com/posts/ctpkTaqTKbmm6uRgC/failed-utopia-4-2/comment/PhiGnX7qKzzgn2aKb
-https://www.lesswrong.com/posts/Py3uGnncqXuEfPtQp/interpersonal-entanglement
-"Failed Utopia #4-2" portrays
-[on my reading, it's an important part of the argument that _verthandi_ are a _separate thing_, not just synthesized women]
+
+The short story ["Failed Utopia #4-2"](https://www.lesswrong.com/posts/ctpkTaqTKbmm6uRgC/failed-utopia-4-2) portrays an almost-aligned superintelligence constructing a happiness-maximizing utopia for humans—except that because [evolution didn't design women and men to be optimal partners for each other](https://www.lesswrong.com/posts/Py3uGnncqXuEfPtQp/interpersonal-entanglement), and the AI is prohibited from editing people's minds, the happiness-maximizing solution ends up splitting up the human species by sex and giving women and men their own _separate_ utopias, complete with artificially-synthesized romantic partners.
+
+At the time, [I expressed horror](https://www.greaterwrong.com/posts/ctpkTaqTKbmm6uRgC/failed-utopia-4-2/comment/PhiGnX7qKzzgn2aKb) at the idea in the comments section, because my quasi-religious psychological-sex-differences denialism required that I be horrified. But looking back eleven years later (my deconversion from my teenage religion being pretty thorough at this point, I think), the _argument makes sense_ (though you need an additional [handwave](https://tvtropes.org/pmwiki/pmwiki.php/Main/HandWave) to explain why the AI doesn't give every _individual_ their own separate utopia—if existing women and men aren't optimal partners for each other, then individual men aren't optimal same-sex friends for each other, either).
+
+On my reading of the text, it is _significant_ that the AI-synthesized complements for men are given their own name, the _verthandi_, rather than just being referred to as women. 
The _verthandi_ may _look like_ women, they may be _approximately_ psychologically human, but the _detailed_ psychology of "superintelligently-engineered optimal romantic partner for a human male" is not going to come out of the distribution of actual human females, and judicious exercise of the [tenth virtue of precision](http://yudkowsky.net/rational/virtues/) demands that a _different word_ be coined for this hypothetical science-fictional type of person.
+

https://www.lesswrong.com/posts/QZs4vkC7cbyjL9XA9/changing-emotions


@@ -505,3 +508,10 @@ https://www.lesswrong.com/posts/QZs4vkC7cbyjL9XA9/changing-emotions

>

>

> Zachary Michael Davis

+Arguing with a specific person's published words is important, because otherwise you can strawman them.
+
+[Am I the asshole?](https://www.reddit.com/r/AmItheAsshole/)
+
+(I told people that my father was coming to pick me up at the end of my 72-hour (== 3 days) evaluation period, but that it wasn't fair that I couldn't rescue everyone.)
+
+blegg commentary: https://www.lesswrong.com/posts/GEJzPwY8JedcNX2qz/blegg-mode#aAgSDZ4ddHpzj9fNN